Text Transfer Learning
<youtube>zxJJ0T54HX8</youtube>
<youtube>qN9hHlZKIL4</youtube>
YouTube search... ...Google search
- Learning Techniques
- Video/Image ... Vision ... Enhancement ... Fake ... Reconstruction ... Colorize ... Occlusions ... Predict image ... Image/Video Transfer Learning
- Perspective ... Context ... In-Context Learning (ICL) ... Transfer Learning ... Out-of-Distribution (OOD) Generalization
- Causation vs. Correlation ... Autocorrelation ... Convolution vs. Cross-Correlation (Autocorrelation)
- Large Language Model (LLM) ... Natural Language Processing (NLP) ... Generation ... Classification ... Understanding ... Translation ... Tools & Services
- Google achieves state-of-the-art NLP performance with an enormous language model and data set | Kyle Wiggers - Venture Beat: researchers at Google developed a new data set, the Colossal Clean Crawled Corpus (C4), and a unified framework and model dubbed the Text-to-Text Transformer (T5) that converts every language problem into a text-to-text format. The corpus was sourced from the Common Crawl project, which scrapes roughly 20 terabytes of English text from the web each month. A minimal sketch of the text-to-text framing appears after this list.
- Attention Mechanism ... Transformer ... Generative Pre-trained Transformer (GPT) ... GAN ... BERT
- Artificial Intelligence (AI) ... Generative AI ... Machine Learning (ML) ... Deep Learning ... Neural Network ... Reinforcement ... Learning Techniques
- Conversational AI ... ChatGPT | OpenAI ... Bing/Copilot | Microsoft ... Gemini | Google ... Claude | Anthropic ... Perplexity ... You ... phind ... Ernie | Baidu
- Transfer Learning for Text Mining | Weike Pan, Erheng Zhong, and Qiang Yang
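The text-to-text framing noted above casts every task (translation, summarization, classification) as mapping an input string to an output string, with the task named in a prefix. The following is a minimal sketch of that idea, assuming the Hugging Face transformers library and the public "t5-small" checkpoint; neither is prescribed by the article itself.

<syntaxhighlight lang="python">
# Sketch of text-to-text transfer learning with T5.
# Assumes the Hugging Face "transformers" library and the public
# "t5-small" checkpoint (an illustrative choice, not from the article).
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Every task is text in, text out: the task is named in a prefix.
prompts = [
    "translate English to German: The house is wonderful.",
    "summarize: Transfer learning reuses a model pretrained on a large "
    "corpus and adapts it to a new task with comparatively little data.",
]

for prompt in prompts:
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=40)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
</syntaxhighlight>

Because every task shares the same string-in, string-out interface, the same pretrained weights and training loop can be reused across tasks, which is what makes the transfer straightforward.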
Transfer learning algorithms for text question answering include Bi-Directional Attention Flow (BIDAF), Document-QA (DOCQA), Reasoning Network (ReasoNet), R-NET, S-NET, and Assertion Based Question Answering (ABQA). See Transfer Learning for Text using Deep Learning Virtual Machine (DLVM) | Anusua Trivedi and Wee Hyong Tok - Microsoft.
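The systems listed above all build on representations pretrained on large corpora and then adapted to the target task. The sketch below illustrates only that general recipe, not an implementation of BIDAF, ReasoNet, or the other named models: it freezes a pretrained BERT encoder and trains a small classification head on a downstream text task, assuming the Hugging Face transformers library, PyTorch, and the public "bert-base-uncased" checkpoint.

<syntaxhighlight lang="python">
# Sketch of transfer learning for text: freeze a pretrained encoder and
# train only a new task-specific head. Assumes Hugging Face transformers
# and PyTorch; "bert-base-uncased" is a public checkpoint chosen for
# illustration, not one used by the systems named above.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

# Freeze the pretrained weights so only the new head is updated.
for param in encoder.parameters():
    param.requires_grad = False

num_labels = 2  # e.g. positive / negative sentiment
head = torch.nn.Linear(encoder.config.hidden_size, num_labels)
optimizer = torch.optim.AdamW(head.parameters(), lr=1e-3)
loss_fn = torch.nn.CrossEntropyLoss()

# Tiny illustrative batch; a real task would use a labeled dataset.
texts = ["a delightful read", "dull and repetitive"]
labels = torch.tensor([1, 0])

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():                                   # encoder is frozen
    hidden = encoder(**batch).last_hidden_state[:, 0]   # [CLS] vectors

logits = head(hidden)
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()
print(f"one-step training loss: {loss.item():.4f}")
</syntaxhighlight>

In practice the encoder is often unfrozen and fine-tuned end to end once the head has stabilized; freezing it, as here, keeps the sketch short and cheap to run.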