Bidirectional Encoder Representations from Transformers (BERT)
[http://www.youtube.com/results?search_query=BERT+Transformer+nlp+language Youtube search...]
[http://www.google.com/search?q=BERT+Transformer+nlp+language ...Google search]
* Google's BERT - built on ideas from [[ULMFiT]], [[ELMo]], and [http://openai.com/ OpenAI]
<youtube>0EtD5ybnh_s</youtube>
<youtube>BhlOGGzC0Q0</youtube>
Revision as of 17:29, 19 January 2019
* [[Transformer-XL]]
* [[Natural Language Processing (NLP)]]
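The links above point to BERT resources without showing what the model actually trains on. As a brief illustration, BERT's pretraining objective is masked language modeling: roughly 15% of input tokens are selected, and of those, 80% are replaced with a [MASK] token, 10% with a random token, and 10% left unchanged, with the model asked to predict the originals. The sketch below is illustrative only; the function name and toy vocabulary are hypothetical, while the 15%/80%/10%/10% split follows the published BERT recipe.

```python
import random

MASK = "[MASK]"
# Hypothetical toy vocabulary used only for the 10% random-replacement case.
VOCAB = ["the", "cat", "dog", "sat", "on", "mat"]

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style masking: select positions with probability mask_prob;
    of selected positions, 80% become [MASK], 10% become a random
    vocabulary token, and 10% are left unchanged. Returns the corrupted
    sequence plus per-position labels (the original token where selected,
    None elsewhere)."""
    rng = random.Random(seed)
    out, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok  # the model must recover this token
            r = rng.random()
            if r < 0.8:
                out[i] = MASK              # 80%: mask it out
            elif r < 0.9:
                out[i] = rng.choice(VOCAB)  # 10%: random replacement
            # else 10%: keep the original token unchanged
    return out, labels
```

For example, `mask_tokens(["the", "cat", "sat", "on", "the", "mat"])` returns a corrupted copy of the sentence together with labels marking which positions the model would be trained to predict.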