Text Transfer Learning

 
* [http://venturebeat.com/2019/10/24/google-achieves-state-of-the-art-nlp-performance-with-an-enormous-language-model-and-data-set/ Google achieves state-of-the-art NLP performance with an enormous language model and data set | Kyle Wiggers - Venture Beat] Researchers at Google developed a new data set, the Colossal Clean Crawled Corpus, and a unified framework and model dubbed the [http://arxiv.org/pdf/1910.10683.pdf Text-to-Text Transformer], which converts language problems into a text-to-text format (a minimal sketch of this idea follows the list below). The Colossal Clean Crawled Corpus was sourced from the Common Crawl project, which scrapes roughly 20 terabytes of English text from the web each month.
 
* [[Generative Pre-trained Transformer (GPT)]]
 
* [[Learning Techniques]]
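
The text-to-text idea referenced above can be made concrete with a small sketch. The cited article ships no code, so the following assumes the Hugging Face transformers library and the public t5-small checkpoint purely for illustration: every task (translation, summarization, acceptability judgement) is phrased as an input string with a task prefix, and the answer always comes back as generated text.

<syntaxhighlight lang="python">
# Minimal sketch of the text-to-text format (assumes Hugging Face transformers
# and the public "t5-small" checkpoint; neither is specified by the article).
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Different NLP problems become one problem: prefix the input with a task
# description and read the answer off as generated text.
examples = [
    "translate English to German: The house is wonderful.",
    "summarize: Researchers at Google built the Colossal Clean Crawled Corpus "
    "from Common Crawl data and trained a single text-to-text model on it.",
    "cola sentence: The course is jumping well.",  # grammatical acceptability
]

for text in examples:
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_length=60)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
</syntaxhighlight>

Because inputs and outputs are plain strings for every task, the same model, training objective, and decoding procedure can be reused across tasks; only the task prefix changes.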
 
  
  

Transfer algorithms: Bi-Directional Attention Flow (BIDAF), Document-QA (DOCQA), Reasoning Network (ReasoNet), R-NET, S-NET, and Assertion Based Question Answering (ABQA). Source: Transfer Learning for Text using Deep Learning Virtual Machine (DLVM) | Anusua Trivedi and Wee Hyong Tok - Microsoft.
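
The common pattern behind transfer learning for text is to reuse a model pretrained on one task as the starting point for another. The cited DLVM walkthrough is not reproduced here; the following is a rough, hypothetical sketch of that pattern in which the model name, library, and toy data are all assumptions: a pretrained encoder is frozen and only a small task-specific head is trained.

<syntaxhighlight lang="python">
# Hypothetical sketch of transfer learning for text (not the DLVM tutorial's
# code): freeze a pretrained encoder and train only a new classification head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

# Freeze the pretrained encoder so only the randomly initialised head learns.
for param in model.distilbert.parameters():
    param.requires_grad = False

optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)

# Tiny illustrative batch; a real setup would fine-tune on a labelled dataset.
texts = ["the answer is in paragraph two", "this passage is irrelevant"]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, return_tensors="pt")

model.train()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
print(float(outputs.loss))
</syntaxhighlight>

Freezing the encoder keeps the pretrained linguistic knowledge intact while the small head adapts to the new task, which is the basic trade-off that transfer approaches for text build on.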

