Text Transfer Learning
YouTube search... ...Google search
- Learning Techniques
- Image/Video Transfer Learning
- Transfer Learning
- Transfer Learning for Text Mining | Weike Pan, Erheng Zhong, and Qiang Yang
- Natural Language Processing (NLP)
- Google achieves state-of-the-art NLP performance with an enormous language model and data set | Kyle Wiggers - Venture Beat. Researchers at Google developed a new data set, the Colossal Clean Crawled Corpus, and a unified framework and model dubbed the Text-to-Text Transformer, which converts language problems into a text-to-text format (a minimal sketch of this framing appears after this list). The Colossal Clean Crawled Corpus was sourced from the Common Crawl project, which scrapes roughly 20 terabytes of English text from the web each month.
- Generative Pre-trained Transformer-2 (GPT-2)
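The text-to-text framing described above can be illustrated with a short sketch. The snippet below assumes the Hugging Face transformers library and the publicly released t5-small checkpoint, neither of which is named in the article itself; the task prefixes ("translate English to German:", "summarize:") are the conventional T5 prompts and are shown only to demonstrate how different language problems are cast as text in, text out.

```python
# Minimal sketch (assumed dependencies: Hugging Face `transformers` and the
# public `t5-small` checkpoint; not taken from the article above).
# Every task is phrased as plain text in, plain text out.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

def text_to_text(prompt: str, max_new_tokens: int = 40) -> str:
    """Run one text-to-text request: encode the prompt, generate, decode."""
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(inputs.input_ids, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Different NLP problems, same interface: only the task prefix changes.
print(text_to_text("translate English to German: The house is wonderful."))
print(text_to_text("summarize: Transfer learning reuses a model trained on "
                   "one task as the starting point for a related task."))
```

Because every task shares the same string-in, string-out interface, the same pretrained checkpoint can be transferred to new text tasks simply by changing the prompt and fine-tuning.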
Transfer algorithms: Bi-Directional Attention Flow (BIDAF), Document-QA (DOCQA), Reasoning Network (ReasoNet), R-NET, S-NET, and Assertion Based Question Answering (ABQA)

- Transfer Learning for Text using Deep Learning Virtual Machine (DLVM) | Anusua Trivedi and Wee Hyong Tok - Microsoft
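The reference above covers transfer learning for text by reusing a pretrained model on a downstream task. The sketch below shows only that general fine-tuning pattern, not the BIDAF, DOCQA, ReasoNet, R-NET, S-NET, or ABQA pipelines named above; PyTorch, the Hugging Face transformers library, the bert-base-uncased checkpoint, and the toy labeled examples are all assumptions for illustration.

```python
# Minimal sketch of the generic text transfer-learning pattern (assumed
# dependencies: PyTorch and Hugging Face `transformers`; the checkpoint name
# and the toy data are illustrative, not from the Microsoft reference above):
# start from a pretrained encoder, add a task head, fine-tune on labeled text.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # pretrained body + new classifier head

texts = ["great read, highly recommended", "a waste of time"]
labels = torch.tensor([1, 0])  # toy sentiment labels
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few fine-tuning steps on the tiny toy batch
    outputs = model(**batch, labels=labels)  # loss computed against labels
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

model.eval()
with torch.no_grad():
    preds = model(**batch).logits.argmax(dim=-1)
print(preds.tolist())  # predicted class per example
```

Only the small classification head is trained from scratch; the encoder weights start from pretraining, which is what makes this transfer learning rather than training from scratch.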