ULMFiT
- Google's Bidirectional Encoder Representations from Transformers (BERT) builds on ideas from ULMFiT, ELMo, and OpenAI's GPT
- Large Language Model (LLM) ... Natural Language Processing (NLP) ... Generation ... Classification ... Understanding ... Translation ... Tools & Services
- Adding the Universal Language Model Fine-tuning (ULMFiT) pre-trained LM to spaCy and allowing a simple way to train new models: http://github.com/explosion/spaCy/issues/2342 (see the sketch below)
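A minimal sketch of the ULMFiT recipe using the fastai library, which ships the AWD-LSTM backbone and the fine-tuning procedure from the original paper. The IMDb dataset, the `ft_encoder` file name, and all learning rates and epoch counts are illustrative assumptions, not values taken from this page.

<syntaxhighlight lang="python">
# ULMFiT in three stages (sketch; hyperparameters are illustrative):
#   1. start from a language model pre-trained on a general corpus (AWD-LSTM),
#   2. fine-tune that LM on the target-domain text,
#   3. reuse its encoder in a classifier trained with gradual unfreezing
#      and discriminative (layer-wise) learning rates.
from fastai.text.all import *  # fastai's standard star import for the text API

path = untar_data(URLs.IMDB)   # example corpus (illustrative choice)

# Stage 2: fine-tune the pre-trained AWD-LSTM language model on the target text.
dls_lm = TextDataLoaders.from_folder(path, is_lm=True, valid_pct=0.1)
lm = language_model_learner(dls_lm, AWD_LSTM, drop_mult=0.3,
                            metrics=[accuracy, Perplexity()])
lm.fit_one_cycle(1, 2e-2)      # train only the new LM head first
lm.unfreeze()
lm.fit_one_cycle(3, 2e-3)      # then fine-tune the whole language model
lm.save_encoder('ft_encoder')  # keep the fine-tuned encoder for the classifier

# Stage 3: build a classifier on top of the fine-tuned encoder,
# reusing the language model's vocabulary.
dls_clas = TextDataLoaders.from_folder(path, valid='test',
                                       text_vocab=dls_lm.vocab)
clf = text_classifier_learner(dls_clas, AWD_LSTM, drop_mult=0.5,
                              metrics=accuracy)
clf.load_encoder('ft_encoder')

# Gradual unfreezing plus discriminative learning rates.
clf.fit_one_cycle(1, 2e-2)
clf.freeze_to(-2)
clf.fit_one_cycle(1, slice(1e-2 / (2.6 ** 4), 1e-2))
clf.freeze_to(-3)
clf.fit_one_cycle(1, slice(5e-3 / (2.6 ** 4), 5e-3))
clf.unfreeze()
clf.fit_one_cycle(2, slice(1e-3 / (2.6 ** 4), 1e-3))
</syntaxhighlight>

Gradual unfreezing and the layer-wise learning-rate schedule are the parts of the recipe ULMFiT introduced to keep the fine-tuned classifier from catastrophically forgetting what the pre-trained language model learned.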