Text Transfer Learning
<youtube>zxJJ0T54HX8</youtube>
<youtube>qN9hHlZKIL4</youtube>
== GPT-2 ==
* [http://medium.com/@ajitrajasekharan/gpt-2-a-promising-but-nascent-transfer-learning-method-that-could-reduce-or-even-eliminate-in-some-48ea3370cc21 GPT-2: a promising but nascent transfer learning method that could reduce or even eliminate supervised learning in some NLP tasks | Ajit Rajasekharan - Medium]
Revision as of 07:50, 13 November 2019
* Image/Video Transfer Learning
* Transfer Learning for Text Mining | Weike Pan, Erheng Zhong, and Qiang Yang
* Natural Language Processing (NLP)
* Google achieves state-of-the-art NLP performance with an enormous language model and data set | Kyle Wiggers - VentureBeat. Researchers at Google developed a new data set, the Colossal Clean Crawled Corpus (C4), and a unified framework and model dubbed the Text-to-Text Transfer Transformer (T5) that converts language problems into a text-to-text format. C4 was sourced from the Common Crawl project, which scrapes roughly 20 terabytes of English text from the web each month.
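The text-to-text idea above can be sketched in a few lines: every task (translation, summarization, classification) is rewritten as a plain-text input with a task prefix, so a single sequence-to-sequence model can serve them all. This is a minimal illustration of the framing only, not T5 itself; the function name and the exact prefix strings here are illustrative assumptions.

```python
# Sketch of T5-style text-to-text framing: every NLP task becomes
# "task prefix + input text" -> "output text". The prefixes below are
# illustrative; a real model is trained with its own fixed prefixes.

def to_text_to_text(task: str, text: str) -> str:
    """Prepend a task prefix so one seq2seq model can handle many tasks."""
    prefixes = {
        "translate_en_de": "translate English to German: ",
        "summarize": "summarize: ",
        "acceptability": "cola sentence: ",  # grammatical-acceptability task
    }
    return prefixes[task] + text

# Different tasks, identical input/output format (text in, text out):
print(to_text_to_text("summarize", "Transfer learning reuses knowledge..."))
print(to_text_to_text("translate_en_de", "The house is wonderful."))
```

Because inputs and targets are both plain strings, new tasks can be added by defining a new prefix rather than a new model head, which is the core of the unified framework described in the article.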
* Transfer algorithms: Bi-Directional Attention Flow (BIDAF), Document-QA (DOCQA), Reasoning Network (ReasoNet), R-NET, S-NET, and Assertion Based Question Answering (ABQA)
* Transfer Learning for Text using Deep Learning Virtual Machine (DLVM) | Anusua Trivedi and Wee Hyong Tok - Microsoft