Difference between revisions of "Meta"
Revision as of 17:12, 29 September 2020
- Facebook AI
- Case Studies
- Python
- PyTorch authored by Facebook
- Generative Adversarial Network (GAN)
- Assistants
- Capabilities
- Caffe / Caffe2
- AI Marketplace & Toolkit/Model Interoperability
- Natural Language Tools & Services
- [[Sequence to Sequence (Seq2Seq)#Retrieval Augmented Generation (RAG)|Retrieval Augmented Generation (RAG)]] ...an end-to-end differentiable model that combines an information retrieval component (Facebook AI’s dense-passage retrieval system) with a seq2seq generator (our Bidirectional and Auto-Regressive Transformers [BART] model)
- PyText ...build end-to-end pipelines for training and inference
- Bidirectional Encoder Representations from Transformers (BERT)
- Bot Framework
- Building Your Environment
- Spell
- DialogFlow
- Decentralized: Federated & Distributed
- Differentiable Programming
- Metaverse
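The Retrieval Augmented Generation entry above describes a retrieve-then-generate pipeline: a retriever scores passages against the query, and a seq2seq generator conditions its output on the query plus the retrieved evidence. A minimal toy sketch of that pattern follows; the passages, the bag-of-words scoring, and the `generate` stub are illustrative stand-ins, not Facebook AI's actual DPR or BART components, which use learned dense encoders and a trained transformer generator.

```python
from collections import Counter

# Toy passage store standing in for a dense-passage index.
PASSAGES = [
    "BART is a sequence-to-sequence model pretrained as a denoising autoencoder.",
    "Dense passage retrieval encodes questions and passages into a shared vector space.",
    "Caffe2 was merged into the PyTorch codebase.",
]

def embed(text):
    # Crude bag-of-words "embedding"; a real DPR encoder produces dense vectors.
    return Counter(text.lower().split())

def score(query_vec, passage_vec):
    # Dot product between sparse count vectors.
    return sum(count * passage_vec[token] for token, count in query_vec.items())

def retrieve(query, passages, k=1):
    # Rank all passages by similarity to the query and keep the top k.
    qv = embed(query)
    ranked = sorted(passages, key=lambda p: score(qv, embed(p)), reverse=True)
    return ranked[:k]

def generate(query, retrieved):
    # Stand-in for the seq2seq generator: in RAG, the generator attends
    # over the query concatenated with each retrieved passage.
    return f"Q: {query}\nEvidence: {retrieved[0]}"

query = "What is dense passage retrieval?"
print(generate(query, retrieve(query, PASSAGES)))
```

In the real system both stages are differentiable, so retrieval scores and generation are trained end to end; this sketch only shows the data flow between the two components.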