Generative Pre-trained Transformer (GPT)
Revision as of 12:33, 30 March 2023
YouTube ... Quora ...Google search ...Google News ...Bing News
- Case Studies
- Natural Language Processing (NLP) ...Generation ...LLM ...Tools & Services
- Assistants ... Hybrid Assistants ... Agents ... Negotiation ... LangChain
- Attention Mechanism ...Transformer Model ...Generative Pre-trained Transformer (GPT)
- Generative AI ... OpenAI's ChatGPT ... Perplexity ... Microsoft's BingAI ... You ...Google's Bard ... Baidu's Ernie
- Sequence to Sequence (Seq2Seq)
- Recurrent Neural Network (RNN)
- Long Short-Term Memory (LSTM)
- ELMo
- Bidirectional Encoder Representations from Transformers (BERT) ... arguably a better model, but with less investment behind it than the larger OpenAI organization
- OpenAI Blog | OpenAI
- Text Transfer Learning
- Generated Image
- SynthPub
- Language Models are Unsupervised Multitask Learners | Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever
- Neural Monkey | Jindřich Libovický, Jindřich Helcl, Tomáš Musil ... Byte Pair Encoding (BPE) enables open-vocabulary neural machine translation by encoding rare and unknown words as sequences of subword units.
- Language Models are Unsupervised Multitask Learners - GitHub
- Microsoft Releases DialoGPT AI Conversation Model | Anthony Alford - InfoQ - trained on over 147M dialogs
- minGPT | Andrej Karpathy - GitHub
- SambaNova Systems ... Dataflow-as-a-Service GPT
- Facebook-owner Meta opens access to AI large language model | Elizabeth Culliford - Reuters ... Facebook 175-billion-parameter language model - Open Pretrained Transformer (OPT-175B)
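The Byte Pair Encoding technique mentioned above (Neural Monkey) can be sketched in a few lines: repeatedly merge the most frequent adjacent symbol pair so that rare words decompose into reusable subword units. A minimal toy version for illustration, not the Neural Monkey implementation:

```python
from collections import Counter

def learn_bpe(word_freqs, num_merges):
    """Learn BPE merge rules from a word -> frequency dict."""
    # Start from character-level symbols, with an end-of-word marker.
    vocab = {tuple(w) + ('</w>',): f for w, f in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        # Count every adjacent symbol pair, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Apply the merge everywhere it occurs.
        merged_vocab = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            merged_vocab[tuple(out)] = freq
        vocab = merged_vocab
    return merges

# Classic toy corpus: frequent suffix "est" gets merged into one unit.
merges = learn_bpe({'low': 5, 'lower': 2, 'newest': 6, 'widest': 3}, 10)
```

With this corpus the first merges build up the shared suffix ("e"+"s", then "es"+"t"), which is exactly how unknown words like "lowest" later decompose into known subwords.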
GPT4All
YouTube ... Quora ...Google search ...Google News ...Bing News
- Github | GPT4All
- Dataset viewer | NOMIC.ai
- Tech report: GPT4All: Training an Assistant-style Chatbot with Large Scale Data Distillation from GPT-3.5-Turbo | Y. Anand, Z. Nussbaum, B. Duderstadt, B. Schmidt, & A. Mulyar - NOMIC.ai
A chatbot trained on a massive collection of clean assistant data including code, stories and dialogue. Demo, data and code to train an assistant-style large language model with ~800k GPT-3.5-Turbo generations based on LLaMA
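The data-distillation recipe above (collect a teacher model's generations, then fine-tune a smaller student on them) can be sketched as a data-preparation step. The `teacher` callable below is a stand-in for a GPT-3.5-Turbo API call, not a real client:

```python
import json

def build_distillation_set(prompts, teacher):
    """Collect (prompt, teacher response) pairs as JSONL lines
    for assistant-style fine-tuning of a smaller model."""
    records = []
    for prompt in prompts:
        response = teacher(prompt)   # placeholder for a GPT-3.5-Turbo call
        if response:                 # drop empty generations during cleaning
            records.append({"prompt": prompt, "response": response})
    return [json.dumps(r) for r in records]

# Stubbed teacher standing in for the real API:
lines = build_distillation_set(
    ["Explain BPE briefly."],
    lambda p: "BPE merges frequent symbol pairs into subword units.",
)
```

The real GPT4All pipeline adds heavy cleaning and deduplication of the ~800k generations; this only shows the shape of the collected data.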
Generative Pre-trained Transformer 4 (GPT-4)
YouTube ... Quora ...Google search ...Google News ...Bing News
- GPT-4 | OpenAI
- Research Paper | OpenAI
- GPT-4: Facts, Rumors and Expectations about next-gen AI model | Nick Babich - Medium
- How does GPT-4 work and how can you start using it in ChatGPT? | Mohammed Haddad - Aljazeera ... Launched on March 14, GPT-4 is the successor to GPT-3 and is the technology behind the viral chatbot ChatGPT.
- OpenAI unveils GPT-4 with new capabilities, Microsoft's Bing is already using it
- Stripe | OpenAI Customer Stories ... 15 of the prototypes were considered strong candidates to be integrated into the platform, including support customization, answering questions about support, and fraud detection
- Morgan Stanley | OpenAI Customer Stories ... access, process and synthesize content almost instantaneously
- The 411 on GPT-4 | The AI Exchange
One of GPT-4’s most dazzling new features is the ability to handle not only words, but pictures too, in what is being called “multimodal” technology. A user will have the ability to submit a picture alongside text — both of which GPT-4 will be able to process and discuss. The ability to input video is also on the horizon. - Everything You Need to Know About ChatGPT-4 | Alex Millson - Bloomberg, Time
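As a sketch of what "text plus image" input looks like in practice, the following builds a chat-style request body with mixed content parts. The field names mirror OpenAI's later-published vision API and are an assumption here, not taken from the article:

```python
def build_multimodal_message(text, image_url):
    """Assemble one user message mixing text and an image reference.
    Field names follow OpenAI's chat format (an assumption, not from the article)."""
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": text},
            {"type": "image_url", "image_url": {"url": image_url}},
        ],
    }

payload = {
    "model": "gpt-4",  # illustrative model id
    "messages": [
        build_multimodal_message(
            "What is in this picture?",
            "https://example.com/cat.png",  # hypothetical image URL
        )
    ],
}
```

The key idea is that a single message carries a list of typed content parts, so the model can attend to the text and the image jointly.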
Generative Pre-trained Transformer 3 (GPT-3 & GPT-3.5)
YouTube ... Quora ...Google search ...Google News ...Bing News
- Language Models are Few-Shot Learners | T. Brown, B. Mann, N. Ryder, M. Subbiah, J. Kaplan, P. Dhariwal, A. Neelakantan, P. Shyam, G. Sastry, A. Askell, S. Agarwal, A. Herbert-Voss, G. Krueger, T. Henighan, R. Child, A. Ramesh, D. Ziegler, J. Wu, C. Winter, C. Hesse, M. Chen, E. Sigler, M. Litwin, S. Gray, B. Chess, J. Clark, C. Berner, S. McCandlish, A. Radford, I. Sutskever, and D. Amodei - arXiv.org
- GPT-3: Demos, Use-cases, Implications | Simon O'Regan - Towards Data Science
- OpenAI API ...today the API runs models with weights from the GPT-3 family with many speed and throughput improvements.
- GPT-3 by OpenAI – Outlook and Examples | Praveen Govindaraj | Medium
- GPT-3 Creative Fiction | Gwern Branwen
- GPT-3: Brown et al., 2020
- Qlik and GPT-3 integration - how to start? | Jacek - Qlik
Try...
- Sushant Kumar's micro-site - Replace 'word' in the following URL with any word to see what GPT-3 generates: https://thoughts.sushant-kumar.com/word
- Serendipity ...an AI powered recommendation engine for anything you want.
- Taglines.ai ... just about every business has a tagline — a short, catchy phrase designed to quickly communicate what it is that they do.
- Simplify.so ...simple, easy-to-understand explanations for everything
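The "few-shot" setup from Brown et al. above amounts to packing labelled demonstrations into the prompt itself, with no gradient updates. A minimal sketch of that prompt construction:

```python
def few_shot_prompt(task, examples, query):
    """Build an in-context few-shot prompt: a task description,
    K demonstrations, then the query for the model to complete."""
    lines = [task]
    for inp, out in examples:
        lines.append(f"Input: {inp}\nOutput: {out}")
    # The final line is left open for the model to fill in.
    lines.append(f"Input: {query}\nOutput:")
    return "\n\n".join(lines)

prompt = few_shot_prompt(
    "Translate English to French.",
    [("cheese", "fromage"), ("cat", "chat")],
    "dog",
)
```

Sent as-is to a GPT-3-style completion endpoint, the model's continuation of the trailing "Output:" is the answer; K=0 (zero-shot) simply omits the demonstration pairs.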
GPT Impact to Development
- Development ... AI Pair Programming Tools ... Analytics ... Visualization ... Diagrams for Business Analysis
- Will The Much-Hyped GPT-3 Impact The Coders? | Analytics India Magazine ... https://analyticsindiamag.com/will-the-much-hyped-gpt-3-impact-the-coders/
- With GPT-3, I built a layout generator where you just describe any layout you want, and it generates the JSX code for you. | Sharif Shameem - debuild
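A description-to-JSX generator like the one above can be framed as a completion request whose prompt pairs descriptions with JSX examples. The prompt wording, model id, and parameters below are illustrative assumptions, not debuild's actual setup:

```python
def layout_request(description):
    """Build a completion-style request asking a GPT-3-family model
    for JSX matching a plain-language layout description.
    Prompt and parameters are illustrative, not debuild's real ones."""
    prompt = (
        "Description: a button that says Subscribe\n"
        "JSX: <button>Subscribe</button>\n\n"
        f"Description: {description}\n"
        "JSX:"
    )
    return {
        "model": "text-davinci-003",  # assumed GPT-3-family model id
        "prompt": prompt,
        "max_tokens": 256,
        "temperature": 0.2,   # low temperature for more deterministic code
        "stop": ["\n\n"],     # stop at the end of the generated snippet
    }

req = layout_request("a red heading that says Hello")
```

The one in-prompt example turns this into the same few-shot pattern as above, just with JSX as the output format.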
Generative Pre-trained Transformer 2 (GPT-2)
- GitHub
- How to Get Started with OpenAI's GPT-2 for Text Generation | Amal Nair - Analytics India Magazine
- GPT-2: It learned on the Internet | Janelle Shane
- Too powerful NLP model (GPT-2): What is Generative Pre-Training | Edward Ma
- GPT-2: A nascent transfer learning method that could eliminate supervised learning in some NLP tasks | Ajit Rajasekharan - Medium
- OpenAI Creates Platform for Generating Fake News. Wonderful | Nick Kolakowski - Dice
- InferKit | Adam D King - completes your text.
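Text completion in GPT-2-style tools boils down to sampling the next token from the model's distribution, and services like InferKit expose knobs such as temperature and top-k. A stdlib-only sketch of temperature + top-k sampling over a toy logit vector (no real model involved):

```python
import math
import random

def sample_top_k(logits, k=2, temperature=1.0, rng=random.random):
    """Sample one token index: keep the k highest logits,
    then draw from a temperature-scaled softmax over them."""
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    scaled = [logits[i] / temperature for i in top]
    m = max(scaled)                              # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    r = rng() * total
    for tok, e in zip(top, exps):
        r -= e
        if r <= 0:
            return tok
    return top[-1]

# As temperature -> 0 this approaches greedy decoding of the best token:
token = sample_top_k([0.1, 2.5, 0.3, 1.9], k=2, temperature=0.01)
```

Higher temperature flattens the distribution (more surprising text); smaller k prunes unlikely tokens, which is why generated text stays coherent.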
Coding Train Late Night 2
r/SubSimulatorGPT2
Subreddit populated entirely by AI personifications of other subreddits -- all posts and comments are generated automatically using GPT-2, resulting in coherent and realistic simulated content.
GetBadNews
- Get Bad News game - Can you beat my score? Play the fake news game! Drop all pretense of ethics and choose the path that builds your persona as an unscrupulous media magnate. Your task is to get as many followers as you can while slowly building up fake credibility as a news site.
Let's build GPT: from scratch, in code, spelled out | Andrej Karpathy