ULMFiT

* Google's [[Bidirectional Encoder Representations from Transformers (BERT)]] - built on ideas from ULMFiT, [[ELMo]], and [[OpenAI]]
* [[Large Language Model (LLM)]] ... [[Natural Language Processing (NLP)]] ... [[Natural Language Generation (NLG)|Generation]] ... [[Natural Language Classification (NLC)|Classification]] ... [[Natural Language Processing (NLP)#Natural Language Understanding (NLU)|Understanding]] ... [[Language Translation|Translation]] ... [[Natural Language Tools & Services|Tools & Services]]
  
 
* [http://github.com/explosion/spaCy/issues/2342 Adding Universal Language Model Fine-tuning (ULMFiT) pre-trained LM to spaCy and allowing a simple way to train new models]
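The fine-tuning recipe the spaCy issue refers to combines ULMFiT's slanted triangular learning-rate schedule (a short linear warm-up followed by a long linear decay) with discriminative per-layer learning rates. A minimal sketch in plain Python, using the defaults from Howard & Ruder's ULMFiT paper (cut_frac = 0.1, ratio = 32, a 2.6 scaling factor between layers); the lr_max value and layer count below are illustrative, not prescribed:

```python
import math

def slanted_triangular_lr(t, T, lr_max=0.01, cut_frac=0.1, ratio=32):
    """Slanted triangular schedule: rise linearly to lr_max over the
    first cut_frac of training, then decay linearly to lr_max / ratio."""
    cut = math.floor(T * cut_frac)  # step at which the peak is reached
    if t < cut:
        p = t / cut  # warm-up fraction, 0 -> 1
    else:
        p = 1 - (t - cut) / (cut * (1 / cut_frac - 1))  # decay fraction, 1 -> 0
    return lr_max * (1 + p * (ratio - 1)) / ratio

def discriminative_lrs(lr_top, n_layers, factor=2.6):
    """Per-layer learning rates: each layer below the top gets the rate
    above it divided by 2.6, so the earliest layers change the least."""
    return [lr_top / factor ** (n_layers - 1 - i) for i in range(n_layers)]
```

With T = 100 steps, the schedule peaks at step 10 with lr_max and ends at lr_max / 32; discriminative_lrs(0.01, 4) assigns 0.01 to the top layer and divides by 2.6 for each layer beneath it.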
  
 
<youtube>zxJJ0T54HX8</youtube>

Revision as of 14:37, 28 April 2023