Generative Pre-trained Transformer (GPT)

* [[Natural Language Generation (NLG)]]
* [[Generated Image]]
* [http://openai.com/blog/gpt-2-6-month-follow-up/ OpenAI Blog] | [[OpenAI]]
* [http://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf Language Models are Unsupervised Multitask Learners | Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever]
* [http://neural-monkey.readthedocs.io/en/latest/machine_translation.html Neural Monkey | Jindřich Libovický, Jindřich Helcl, Tomáš Musil] Byte Pair Encoding (BPE) enables open-vocabulary NMT by encoding rare and unknown words as sequences of subword units.
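The BPE idea mentioned above can be sketched in a few lines: repeatedly find the most frequent pair of adjacent symbols in the corpus vocabulary and merge it into a new subword unit. This is a minimal illustration in the style of the original Sennrich et al. reference algorithm, not Neural Monkey's actual code; the toy corpus and merge count are made up for the example.

```python
import re
from collections import Counter

def get_pair_stats(vocab):
    """Count frequencies of adjacent symbol pairs across the vocabulary."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[(symbols[i], symbols[i + 1])] += freq
    return pairs

def merge_pair(pair, vocab):
    """Replace every occurrence of the given symbol pair with one merged symbol."""
    merged = {}
    bigram = re.escape(' '.join(pair))
    pattern = re.compile(r'(?<!\S)' + bigram + r'(?!\S)')
    for word, freq in vocab.items():
        merged[pattern.sub(''.join(pair), word)] = freq
    return merged

# Toy corpus: each word is a space-separated character sequence
# ending in an end-of-word marker, with its corpus frequency.
vocab = {'l o w </w>': 5, 'l o w e r </w>': 2,
         'n e w e s t </w>': 6, 'w i d e s t </w>': 3}

for _ in range(10):  # learn 10 merge operations
    pairs = get_pair_stats(vocab)
    best = max(pairs, key=pairs.get)
    vocab = merge_pair(best, vocab)

# Frequent words collapse into single subword tokens,
# while rarer words remain split into smaller units.
```

After these merges, frequent words such as "low" and "newest" become single tokens, while the rarer "wider"-style words stay segmented — which is exactly how BPE keeps rare words representable without an unbounded vocabulary.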

Revision as of 13:26, 15 August 2020



Generative Pre-trained Transformer (GPT-3)


Sushant Kumar's micro-site - replace 'word' in the following URL with a word of your choice to see what GPT-3 generates: http://thoughts.sushant-kumar.com/word


GPT's Impact on Development

Generative Pre-trained Transformer (GPT-2)

A text-generating bot based on a model with 1.5 billion parameters. ...Ultimately, OpenAI's researchers kept the full thing to themselves, only releasing a pared-down 117-million-parameter version of the model (which we have dubbed "GPT-2 Junior") as a safer demonstration of what the full GPT-2 model could do. - Twenty minutes into the future with OpenAI's Deep Fake Text AI | Sean Gallagher
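The 1.5-billion vs. 117-million figures follow from the model configurations in the GPT-2 paper (48 layers at width 1600 vs. 12 layers at width 768, shared 50,257-token vocabulary). A back-of-envelope sketch, counting only the embedding tables and the main per-layer weight matrices and ignoring biases and layer norms, lands close to both numbers (the small model comes out nearer 124M by this count; OpenAI's quoted 117M uses a slightly different accounting):

```python
def gpt2_param_count(n_layers, d_model, vocab_size=50257, n_ctx=1024):
    """Rough transformer parameter count: token + position embeddings,
    plus per-layer attention (4*d^2) and 4d-wide MLP (8*d^2) matrices."""
    embeddings = vocab_size * d_model + n_ctx * d_model
    per_layer = 12 * d_model ** 2  # 4*d^2 attention + 8*d^2 feed-forward
    return embeddings + n_layers * per_layer

small = gpt2_param_count(12, 768)    # released small model, ~124M
full = gpt2_param_count(48, 1600)    # full GPT-2, ~1.56B
```

This makes the scale gap concrete: the withheld model is roughly 12x the size of the version that was initially released.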

Coding Train Late Night 2

r/SubSimulator

Subreddit populated entirely by AI personifications of other subreddits -- all posts and comments are generated automatically using GPT-2, resulting in coherent and realistic simulated content.


GetBadNews

  • Get Bad News game - Can you beat my score? Play the fake news game! Drop all pretense of ethics and choose the path that builds your persona as an unscrupulous media magnate. Your task is to get as many followers as you can while...