Bidirectional Encoder Representations from Transformers (BERT)

 
* Google's BERT - built on ideas from [[ULMFiT]], [[ELMo]], and [http://openai.com/ OpenAI]
 
* [[Attention]] Mechanism/[[Transformer]] Model
 
** [[Generative Pre-trained Transformer (GPT)]]2/3
 
** [[Transformer-XL]]
 
* [http://venturebeat.com/2019/05/16/microsoft-makes-googles-bert-nlp-model-better/ Microsoft makes Google’s BERT NLP model better | Khari Johnson - VentureBeat]  
 
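The [[Attention]] mechanism underlying the [[Transformer]] model (and hence BERT) can be sketched in plain Python. This is a minimal illustration of scaled dot-product attention, softmax(QK^T / sqrt(d_k))V; the toy Q/K/V matrices are made up for the example and are not taken from any of the linked sources:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V,
    with Q, K, V given as lists of row vectors (lists of floats)."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Each output row is a weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Toy example: 2 queries attending over 3 key/value pairs (d_k = 2).
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0], [2.0], [3.0]]
print(scaled_dot_product_attention(Q, K, V))
```

Because the softmax weights sum to 1, each output row stays within the range spanned by the value vectors; BERT stacks many such attention layers (multi-headed, over learned projections) rather than using this single-head form directly.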

Revision as of 14:53, 26 July 2020


BERT Research | Chris McCormick