Bidirectional Encoder Representations from Transformers (BERT)

 
[http://www.google.com/search?q=BERT+Transformer+nlp+language ...Google search]
 
* [[Assistants]] ... [[Hybrid Assistants]] ... [[Agents]] ... [[Negotiation]]
* [[Natural Language Processing (NLP)]] ...[[Natural Language Generation (NLG)|Generation]] ...[[Large Language Model (LLM)|LLM]] ...[[Natural Language Tools & Services|Tools & Services]]
* [[Attention]] Mechanism ...[[Transformer]] Model ...[[Generative Pre-trained Transformer (GPT)]]
 
* [[SMART - Multi-Task Deep Neural Networks (MT-DNN)]]
 
* [[Deep Distributed Q Network Partial Observability]]
 

BERT Research | Chris McCormick
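
A minimal usage sketch, not from this page: it loads a pretrained BERT encoder through the Hugging Face transformers library and prints the shape of the contextual token embeddings. The checkpoint name bert-base-uncased and the example sentence are illustrative assumptions.

<pre>
# Minimal sketch (illustrative): contextual embeddings from a pretrained BERT
# checkpoint via the Hugging Face "transformers" library.
import torch
from transformers import BertModel, BertTokenizer

# "bert-base-uncased" is an assumed example checkpoint (12 layers, hidden size 768).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

# Tokenize a sample sentence and run it through the encoder without gradients.
inputs = tokenizer("BERT produces contextual token embeddings.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state: (batch, sequence_length, hidden_size) contextual vectors, one per token.
print(outputs.last_hidden_state.shape)
</pre>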