Toolformer
Revision as of 06:05, 16 March 2023
- Meta
- Toolformer: Language Models Can Teach Themselves to Use Tools | T. Schick, J. Dwivedi-Yu, R. Dessì, R. Raileanu, M. Lomeli, L. Zettlemoyer, N. Cancedda, & T. Scialom ... Language models (LMs) can teach themselves to use external tools via simple APIs and achieve the best of both worlds
- Assistants ... Hybrid Assistants ... Agents ... Negotiation
- Python ... Generative AI with Python ... Javascript ... Generative AI with Javascript ... Game Development with Generative AI
- Generative AI ... OpenAI's ChatGPT ... Perplexity ... Microsoft's BingAI ... You ... Google's Bard
- Attention Mechanism/Transformer Model
- Prompt Engineering (PE) ... PromptBase ... Prompt Injection Attack
- Proximal Policy Optimization (PPO)
- Natural Language Generation (NLG)
- Natural Language Tools & Services
- Meta develops an AI language bot that can use external software tools | Benj Edwards - Ars Technica ... With Toolformer, an LLM can improve its abilities by calling APIs to external programs ...
- Meta AI and UPF Researchers Introduce Toolformer: A Language Model That Learns in a Self-Supervised Way How to Use Different Tools Such as Search Engines via Simple API Calls | Khushboo Gupta - MarketTechPost
Toolformer is a model trained by Meta AI to decide which APIs to call, when to call them, what arguments to pass, and how to best incorporate the results into future token prediction. This is done in a self-supervised way, requiring nothing more than a handful of demonstrations for each API. It incorporates a range of tools, including a calculator, a Q&A system, a search engine, a translation system, and a calendar. Toolformer achieves substantially improved zero-shot performance across a variety of downstream tasks, often competitive with much larger models, without sacrificing its core language modeling abilities.
- Given just a handful of human-written examples of how an API can be used, we let an LM annotate a huge language modeling dataset with potential API calls.
- We then use a self-supervised loss to determine which of these API calls actually help the model in predicting future tokens.
- Finally, we finetune the LM itself on the API calls that it considers useful.
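The filtering step above can be sketched in miniature. This is a hypothetical toy, not the paper's implementation: the `lm_loss` function is a stand-in for the model's actual token-prediction loss, and the `[Calculator(...) -> result]` annotation format, `filter_call` helper, and `threshold` parameter are illustrative assumptions. The idea it demonstrates is the paper's criterion: an API call is kept only if inserting its result makes the continuation easier to predict.

```python
def calculator(expr: str) -> str:
    # Toy tool: evaluate an arithmetic expression, rounded to two decimals.
    return str(round(eval(expr, {"__builtins__": {}}, {}), 2))

def lm_loss(prefix: str, target: str) -> float:
    # Toy stand-in for the LM's negative log-likelihood of `target` given
    # `prefix`: low if the target already appears in the prefix (the model
    # can "copy" it), high otherwise. A real system queries the LM here.
    return 0.5 if target in prefix else 5.0

def filter_call(prefix: str, call_expr: str, target: str, threshold: float = 1.0):
    # Keep the call only if inserting "[Calculator(expr) -> result]" into the
    # context reduces the loss on the continuation by more than `threshold`.
    result = calculator(call_expr)
    annotated = f"{prefix}[Calculator({call_expr}) -> {result}] "
    gain = lm_loss(prefix, target) - lm_loss(annotated, target)
    return annotated if gain > threshold else None

# The calculator result "0.29" helps predict the continuation, so the
# annotated example is kept for finetuning.
prefix = "Out of 1400 participants, 400 passed, a fraction of "
kept = filter_call(prefix, "400/1400", "0.29")
```

If the context already made the continuation easy to predict, the gain would fall below the threshold and the call would be discarded, which is how the self-supervised loss prunes unhelpful annotations.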