Text Transfer Learning

* [[Natural Language Processing (NLP)]]
* [http://venturebeat.com/2019/10/24/google-achieves-state-of-the-art-nlp-performance-with-an-enormous-language-model-and-data-set/ Google achieves state-of-the-art NLP performance with an enormous language model and data set | Kyle Wiggers - Venture Beat] researchers at Google developed a new data set, the Colossal Clean Crawled Corpus, and a unified framework and model dubbed the [http://arxiv.org/pdf/1910.10683.pdf Text-to-Text Transformer], which converts language problems into a text-to-text format (a sketch of this framing follows this list). The contents of the Colossal Clean Crawled Corpus were sourced from the Common Crawl project, which scrapes roughly 20 terabytes of English text from the web each month.
* [[Generative Pre-trained Transformer-2 (GPT-2)]]
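A minimal sketch of the text-to-text framing described in the list above, assuming the Hugging Face Transformers library and its public t5-small checkpoint; neither is named in the linked article, which only cites the Text-to-Text Transformer paper. The point of the format is that every task (translation, summarization, acceptability judgment) becomes plain text in and plain text out, selected by a task prefix rather than a task-specific model head.

<syntaxhighlight lang="python">
# Illustrative sketch only: library and checkpoint name are assumptions,
# not something specified by the article or the T5 paper excerpt above.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Each task is selected by a plain-text prefix; input and output are both text.
examples = [
    "translate English to German: The house is wonderful.",
    "summarize: Researchers at Google built the Colossal Clean Crawled Corpus "
    "from Common Crawl data and trained a unified text-to-text model on it.",
    "cola sentence: The book was wrote by the author.",  # acceptability judgment
]

for text in examples:
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=40)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
</syntaxhighlight>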
  
  




Transfer algorithms: Bi-Directional Attention Flow (BIDAF), Document-QA (DOCQA), Reasoning Network (ReasoNet), R-NET, S-NET, and Assertion Based Question Answering (ABQA)

* Transfer Learning for Text using Deep Learning Virtual Machine (DLVM) | Anusua Trivedi and Wee Hyong Tok - Microsoft
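A minimal sketch of the transfer pattern these question-answering models share: a model pre-trained on a large reading-comprehension corpus is reused, as-is or after light fine-tuning, to answer questions over new text. The Hugging Face Transformers pipeline API and the distilbert-base-cased-distilled-squad checkpoint used here are illustrative assumptions; BIDAF, ReasoNet, R-NET, S-NET, and ABQA themselves are not shipped with that library.

<syntaxhighlight lang="python">
# Illustrative sketch only: the pipeline API and checkpoint are assumptions,
# standing in for the transfer algorithms named above.
from transformers import pipeline

# Reuse a model already fine-tuned on SQuAD-style reading comprehension.
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

context = (
    "Transfer learning reuses a model trained on a large source corpus and "
    "adapts it to a smaller target task, such as answering questions over "
    "domain-specific documents."
)

# The pre-trained model answers a question over text it has never seen.
result = qa(question="What does transfer learning reuse?", context=context)
print(result["answer"], round(result["score"], 3))
</syntaxhighlight>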


[[File:042518_1628_TransferLea1.png]]