Bidirectional Encoder Representations from Transformers (BERT)

 
[https://www.google.com/search?q=BERT+Transformer+nlp+language ...Google search]
 
  
* [[Natural Language Processing (NLP)]]  ...[[Natural Language Generation (NLG)|Generation]]  ...[[Large Language Model (LLM)|LLM]]  ...[[Natural Language Tools & Services|Tools & Services]]
* [[Assistants]]  ... [[Agents]]  ... [[Negotiation]] ... [[Hugging_Face#HuggingGPT|HuggingGPT]] ... [[LangChain]]
* [[Attention]] Mechanism  ...[[Transformer]] Model  ...[[Generative Pre-trained Transformer (GPT)]]
* [[SMART - Multi-Task Deep Neural Networks (MT-DNN)]]
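
Since this page groups BERT with the [[Attention]] mechanism, the [[Transformer]] model, and the [[Hugging_Face|Hugging Face]] ecosystem, a minimal sketch of querying a pretrained BERT encoder is shown below. It assumes the Hugging Face transformers and PyTorch libraries and the standard bert-base-uncased checkpoint; these are illustrative choices, not something this page prescribes.

<syntaxhighlight lang="python">
# Minimal sketch (assumed toolchain: Hugging Face transformers + PyTorch):
# extract contextual token embeddings from a pretrained BERT encoder.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

sentence = "BERT reads text bidirectionally."
# Tokenizer adds [CLS]/[SEP] and returns input ids plus an attention mask.
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per WordPiece token: shape (1, num_tokens, 768) for bert-base.
print(outputs.last_hidden_state.shape)
</syntaxhighlight>

The last_hidden_state tensor holds one 768-dimensional contextual vector per WordPiece token; these bidirectional representations are what downstream tasks fine-tune or probe.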

BERT Research | Chris McCormick