Generative Pre-trained Transformer (GPT)
- Case Studies
- Text Transfer Learning
- Natural Language Generation (NLG)
- Generated Image
- OpenAI Blog | OpenAI
- Language Models are Unsupervised Multitask Learners | Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever
- Neural Monkey | Jindřich Libovický, Jindřich Helcl, Tomáš Musil - Byte Pair Encoding (BPE) enables open-vocabulary NMT by encoding rare and unknown words as sequences of subword units (a minimal merge-loop sketch follows this list).
- Attention Mechanism/Transformer Model
- Bidirectional Encoder Representations from Transformers (BERT)
- Language Models are Unsupervised Multitask Learners - GitHub
- Microsoft Releases DialoGPT AI Conversation Model | Anthony Alford - InfoQ - trained on over 147M dialogs
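The subword idea behind the BPE entry above is small enough to sketch: start from character-level symbols, repeatedly merge the most frequent adjacent pair across a word-frequency vocabulary, and keep the merges as the subword inventory. The Python below is a toy illustration with made-up word counts and merge budget, not Neural Monkey's implementation.

```python
import re
from collections import Counter

def get_pair_stats(vocab):
    """Count frequencies of adjacent symbol pairs across the vocabulary."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    """Merge every whole-symbol occurrence of `pair` into a single symbol."""
    merged = {}
    bigram = re.escape(' '.join(pair))
    pattern = re.compile(r'(?<!\S)' + bigram + r'(?!\S)')
    for word, freq in vocab.items():
        merged[pattern.sub(''.join(pair), word)] = freq
    return merged

# Toy word-frequency vocabulary; words are space-separated symbols ending in </w>.
vocab = {'l o w </w>': 5, 'l o w e r </w>': 2,
         'n e w e s t </w>': 6, 'w i d e s t </w>': 3}

num_merges = 10  # illustrative merge budget
for _ in range(num_merges):
    stats = get_pair_stats(vocab)
    if not stats:
        break
    best = max(stats, key=stats.get)
    vocab = merge_pair(best, vocab)
    print(best)  # each learned merge, most frequent first
```

Rare words then decompose into these learned subword units instead of falling out of the vocabulary.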
Generative Pre-trained Transformer (GPT-3)
- Language Models are Few-Shot Learners | T. Brown, B. Mann, N. Ryder, M. Subbiah, J. Kaplan, P. Dhariwal, A. Neelakantan, P. Shyam, G. Sastry, A. Askell, S. Agarwal, A. Herbert-Voss, G. Krueger, T. Henighan, R. Child, A. Ramesh, D. Ziegler, J. Wu, C. Winter, C. Hesse, M. Chen, E. Sigler, M. Litwin, S. Gray, B. Chess, J. Clark, C. Berner, S. McCandlish, A. Radford, I. Sutskever, and D. Amodei - arXiv.org
- GPT-3: Demos, Use-cases, Implications | Simon O'Regan - Towards Data Science
- OpenAI API | OpenAI - ...today the API runs models with weights from the GPT-3 family with many speed and throughput improvements (a few-shot prompting sketch follows this list).
- GPT-3 by OpenAI – Outlook and Examples | Praveen Govindaraj | Medium
- GPT-3 Creative Fiction | Gwern Branwen
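To make "few-shot" concrete: the paper's headline result is that GPT-3 can pick up a task from a handful of worked examples placed in the prompt, with no gradient updates. The sketch below shows that prompt format against the completions-style Python client the early OpenAI API shipped with; the placeholder key, engine name, and sampling parameters are illustrative assumptions, and newer client versions differ.

```python
import openai  # pip install openai; assumes the classic completions-style client

openai.api_key = "YOUR_API_KEY"  # placeholder, not a real key

# Few-shot prompt: a task description plus a few worked examples in context,
# then the unfinished example the model is asked to complete.
prompt = (
    "Translate English to French.\n"
    "sea otter => loutre de mer\n"
    "cheese => fromage\n"
    "plush giraffe =>"
)

response = openai.Completion.create(
    engine="davinci",   # GPT-3 base engine name from the early API docs (assumption)
    prompt=prompt,
    max_tokens=10,
    temperature=0.0,    # low temperature: take the most likely completion
    stop=["\n"],        # stop at the end of the completed line
)
print(response.choices[0].text.strip())
```

Swapping the worked examples swaps the task; nothing about the model itself changes.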
GPT Impact to Development
- Will The Much-Hyped GPT-3 Impact The Coders? | Analytics India Magazine - http://analyticsindiamag.com/will-the-much-hyped-gpt-3-impact-the-coders/
- With GPT-3, I built a layout generator where you just describe any layout you want, and it generates the JSX code for you. | Sharif Shameem - debuild (a prompt-construction sketch follows this list)
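The debuild demo boils down to the same few-shot trick applied to code: pair natural-language layout descriptions with the JSX they should produce, then let GPT-3 complete the pattern for a new description. The sketch below only builds such a prompt; the example pairs are invented for illustration and the resulting string would be sent to a GPT-3 completion endpoint as in the earlier sketch. It is not debuild's actual implementation.

```python
# Hypothetical description -> JSX pairs used as in-context demonstrations.
EXAMPLES = [
    ("a button that says subscribe",
     "<button>Subscribe</button>"),
    ("a large heading that says welcome with a smaller subtitle under it",
     "<div><h1>Welcome</h1><h3>Subtitle</h3></div>"),
]

def build_prompt(description: str) -> str:
    """Assemble a few-shot prompt: worked description/JSX pairs, then the new request."""
    lines = []
    for text, jsx in EXAMPLES:
        lines.append(f"description: {text}")
        lines.append(f"jsx: {jsx}")
    lines.append(f"description: {description}")
    lines.append("jsx:")  # the model is expected to continue from here
    return "\n".join(lines)

if __name__ == "__main__":
    print(build_prompt("a navigation bar with three links: Home, Docs, Contact"))
```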
Generative Pre-trained Transformer (GPT-2)
- How to Get Started with OpenAI's GPT-2 for Text Generation | Amal Nair - Analytics India Magazine (a minimal sampling sketch follows this list)
- GPT-2: It learned on the Internet | Janelle Shane
- Too powerful NLP model (GPT-2): What is Generative Pre-Training | Edward Ma
- GPT-2: A nascent transfer learning method that could eliminate supervised learning for some NLP tasks | Ajit Rajasekharan - Medium
- OpenAI Creates Platform for Generating Fake News. Wonderful | Nick Kolakowski - Dice
- InferKit | Adam D King - completes your text.
- Twenty minutes into the future with OpenAI's Deep Fake Text AI | Sean Gallagher - Ars Technica - a text-generating bot based on a model with 1.5 billion parameters. ...Ultimately, OpenAI's researchers kept the full thing to themselves, only releasing a pared-down 117 million parameter version of the model (which we have dubbed "GPT-2 Junior") as a safer demonstration of what the full GPT-2 model could do.
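For hands-on text generation along the lines of the Analytics India Magazine walkthrough, the publicly released GPT-2 checkpoints are easiest to reach through the Hugging Face transformers library. The sketch below samples a continuation from the small checkpoint; the prompt (echoing OpenAI's unicorn demo) and the sampling settings are illustrative choices, not a recipe from any of the linked articles.

```python
# pip install transformers torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# "gpt2" is the small public release (the ~117M/124M-parameter checkpoint);
# the later full 1.5B-parameter model is published as "gpt2-xl".
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "In a shocking finding, scientists discovered a herd of unicorns"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Top-k / nucleus sampling keeps the continuation varied without the
# repetition loops that greedy decoding tends to fall into.
output = model.generate(
    input_ids,
    max_length=60,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```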
- Coding Train Late Night 2
- Subreddit populated entirely by AI personifications of other subreddits -- all posts and comments are generated automatically using GPT-2, which results in coherent and realistic simulated content.
- Get Bad News game - Can you beat my score? Play the fake news game! Drop all pretense of ethics and choose the path that builds your persona as an unscrupulous media magnate. Your task is to get as many followers as you can while ...