Generative Pre-trained Transformer (GPT)

{| class="wikitable" style="width: 550px;"
||
<youtube>G6Z_S6hs29s</youtube>
<b>14 Cool Apps Built on [[OpenAI]]'s GPT-3 API
</b><br>14 cool applications just built on top of [[OpenAI]]'s GPT-3 (Generative Pre-trained Transformer) API (currently in private beta).
|}

{| class="wikitable" style="width: 550px;"
||
<youtube>8psgEDhT1MM</youtube>
<b>GPT-3 Demo and Explanation - An AI revolution from [[OpenAI]]
</b><br>GPT-3 can write poetry, translate text, chat convincingly, and answer abstract questions. It is being used to code, design, and much more. I'll give you a demo of some of the latest in this technology and some of how it works. GPT-3 comes from a company called [[OpenAI]]. [[OpenAI]] was founded by Elon Musk and Sam Altman (former president of the startup accelerator Y Combinator), with over a billion dollars invested, to collaborate and create human-level AI for the benefit of society. GPT-3 has been in development for a number of years; one of the early papers published was on Generative Pre-Training. The idea behind generative pre-training (GPT) is that while most AIs are trained on labeled data, there is a ton of data that is not labeled. If you can evaluate the words and use them to train and tune the model, it can start to predict future text on the unlabeled data; you repeat the process until the predictions start to converge. The newest GPT is able to do a ton. Some of the demos include:
- GPT-3 demo of how to design a user interface using AI
- GPT-3 demo of how to code a React application using AI
- GPT-3 demo of an Excel plug-in to fill data using AI
- GPT-3 demo of a search engine/answer engine using AI
- GPT-3 demo of command-line auto-complete from English to shell commands
|}
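The generative pre-training idea described in the video — learn to predict the next word from unlabeled text alone — can be sketched with a toy bigram model standing in for the transformer. This is a minimal illustration, not how GPT is implemented; the corpus and function names are hypothetical:

```python
from collections import Counter, defaultdict

# Unlabeled corpus: raw text with no labels at all (hypothetical toy data).
corpus = "the model predicts the next word and the next word again".split()

# "Training" here is just counting next-word statistics; a transformer
# learns the same conditional distribution P(next word | context),
# but with gradient descent over billions of parameters.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict(word):
    """Return the most frequent word observed after `word` in the corpus."""
    return counts[word].most_common(1)[0][0]

print(predict("the"))  # → "next" (seen twice after "the", vs. "model" once)
```

The point of the sketch is only the training signal: every position in unlabeled text supplies its own "label" (the next word), which is why no human annotation is needed.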
Revision as of 21:07, 31 August 2020
YouTube search... ...Google search
- Case Studies
- Text Transfer Learning
- Natural Language Generation (NLG)
- Generated Image
- OpenAI Blog | OpenAI
- Language Models are Unsupervised Multitask Learners | Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever
- Neural Monkey | Jindřich Libovický, Jindřich Helcl, Tomáš Musil ...Byte Pair Encoding (BPE) enables NMT model translation with an open vocabulary by encoding rare and unknown words as sequences of subword units.
- Attention Mechanism/Transformer Model
- Bidirectional Encoder Representations from Transformers (BERT)
- ELMo
- Language Models are Unsupervised Multitask Learners - GitHub
- Microsoft Releases DialoGPT AI Conversation Model | Anthony Alford - InfoQ - trained on over 147M dialogs
- minGPT | Andrej Karpathy - GitHub
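Byte Pair Encoding, mentioned in the Neural Monkey entry above, builds a subword vocabulary by repeatedly merging the most frequent adjacent symbol pair. A minimal sketch of one merge step (the toy word frequencies and helper names are made up for illustration):

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across a vocabulary of space-split symbols."""
    pairs = Counter()
    for word, freq in words.items():
        syms = word.split()
        for a, b in zip(syms, syms[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0]

def merge_pair(words, pair):
    """Fuse every occurrence of the chosen pair into a single new symbol."""
    spaced = " ".join(pair)
    return {word.replace(spaced, "".join(pair)): freq
            for word, freq in words.items()}

# Word frequencies, each word pre-split into characters (toy example).
vocab = {"l o w": 5, "l o w e r": 2, "n e w e s t": 6, "w i d e s t": 3}
pair = most_frequent_pair(vocab)   # ('e', 's'), occurring 9 times
vocab = merge_pair(vocab, pair)    # "n e w e s t" becomes "n e w es t"
print(pair, vocab)
```

Real BPE repeats this loop thousands of times, so rare and unknown words decompose into known subword units instead of a single out-of-vocabulary token.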
Generative Pre-trained Transformer (GPT-3)
- Language Models are Few-Shot Learners | T. Brown, B. Mann, N. Ryder, M. Subbiah, J. Kaplan, P. Dhariwal, A. Neelakantan, P. Shyam, G. Sastry, A. Askell, S. Agarwal, A. Herbert-Voss, G. Krueger, T. Henighan, R. Child, A. Ramesh, D. Ziegler, J. Wu, C. Winter, C. Hesse, M. Chen, E. Sigler, M. Litwin, S. Gray, B. Chess, J. Clark, C. Berner, S. McCandlish, A. Radford, I. Sutskever, and D. Amodei - arXiv.org
- GPT-3: Demos, Use-cases, Implications | Simon O'Regan - Towards Data Science
- OpenAI API ...today the API runs models with weights from the GPT-3 family with many speed and throughput improvements.
- GPT-3 by OpenAI – Outlook and Examples | Praveen Govindaraj | Medium
- GPT-3 Creative Fiction | Gwern Branwen
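"Few-shot" in the Brown et al. paper above means conditioning the model on a handful of worked examples in the prompt itself, with no fine-tuning. A sketch of how such a prompt is assembled (the task and example pairs are hypothetical, and the completion would come from the model, not from this code):

```python
# Few-shot prompt: a task description, K in-context examples, then the query.
examples = [
    ("cheese", "fromage"),
    ("house", "maison"),
    ("cat", "chat"),
]

prompt = "Translate English to French:\n"
for en, fr in examples:
    prompt += f"{en} => {fr}\n"
prompt += "dog => "  # the model is expected to continue with the answer

print(prompt)
```

The prompt string is all the "training" the model receives for the task; the paper's finding is that larger models exploit these in-context examples far more effectively.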
Try...
- Sushant Kumar's micro-site - Replace your 'word' in the following URL to see what GPT-3 generates: http://thoughts.sushant-kumar.com/word
- Serendipity ...an AI powered recommendation engine for anything you want.
- Taglines.ai ... just about every business has a tagline — a short, catchy phrase designed to quickly communicate what it is that they do.
- Simplify.so ...simple, easy-to-understand explanations for everything
GPT Impact to Development
- Development
- [http://analyticsindiamag.com/will-the-much-hyped-gpt-3-impact-the-coders/ Will The Much-Hyped GPT-3 Impact The Coders? | Analytics India Magazine]
- With GPT-3, I built a layout generator where you just describe any layout you want, and it generates the JSX code for you. | Sharif Shameem - debuild
Generative Pre-trained Transformer (GPT-2)
- GitHub
- How to Get Started with OpenAI's GPT-2 for Text Generation | Amal Nair - Analytics India Magazine
- GPT-2: It learned on the Internet | Janelle Shane
- Too powerful NLP model (GPT-2): What is Generative Pre-Training | Edward Ma
- GPT-2: A nascent transfer learning method that could eliminate supervised learning in some NLP tasks | Ajit Rajasekharan - Medium
- OpenAI Creates Platform for Generating Fake News. Wonderful | Nick Kolakowski - Dice
- InferKit | Adam D King ...completes your text.
Coding Train Late Night 2
r/SubSimulator
Subreddit populated entirely by AI personifications of other subreddits -- all posts and comments are generated automatically using GPT-2, resulting in coherent and realistic simulated content.
GetBadNews
- Get Bad News game - Can you beat my score? Play the fake news game! Drop all pretense of ethics and choose the path that builds your persona as an unscrupulous media magnate. Your task is to get as many followers as you can while slowly building up fake credibility as a news site.