Generative Pre-trained Transformer (GPT)

[http://www.google.com/search?q=Generative+Pre+trained+Transformer-2+GPT+generation+nlg+natural+language+semantics ...Google search]
 
* [[Text Transfer Learning]]
 
* [[Natural Language Generation (NLG)]]
 
* [http://openai.com/blog/gpt-2-6-month-follow-up/ OpenAI Blog] | [http://openai.com/ OpenAI]
 
* [http://github.com/openai/gpt-2 Language Models are Unsupervised Multitask Learners - GitHub]
 
* [http://insights.dice.com/2019/02/19/openai-platform-generating-fake-news-wonderful OpenAI Creates Platform for Generating Fake News. Wonderful | Nick Kolakowski - Dice]
 
* [http://medium.com/@ajitrajasekharan/gpt-2-a-promising-but-nascent-transfer-learning-method-that-could-reduce-or-even-eliminate-in-some-48ea3370cc21 GPT-2: A promising but nascent transfer learning method that could reduce or even eliminate the need for supervised learning in some NLP tasks | Ajit Rajasekharan - Medium]
  
 
A text-generating bot based on a model with 1.5 billion parameters. ...Ultimately, OpenAI's researchers kept the full thing to themselves, only releasing a pared-down 117-million-parameter version of the model (which we have dubbed "GPT-2 Junior") as a safer demonstration of what the full GPT-2 model could do. [http://arstechnica.com/information-technology/2019/02/researchers-scared-by-their-own-work-hold-back-deepfakes-for-text-ai/ Twenty minutes into the future with OpenAI's Deep Fake Text AI | Sean Gallagher - Ars Technica]
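The pared-down 117M-parameter release is publicly downloadable, so generating text of the kind described above takes only a few lines. A minimal sampling sketch, assuming the third-party Hugging Face transformers library (not OpenAI's original TensorFlow code); its "gpt2" checkpoint corresponds to the small released model, and the prompt string is invented for illustration:

<syntaxhighlight lang="python">
# Minimal GPT-2 sampling sketch (assumes: pip install transformers torch).
# "gpt2" is the small ~117M-parameter checkpoint that OpenAI released publicly.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "In a shocking finding, scientists discovered"  # illustrative prompt
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Top-k sampling (k=40), the decoding setup OpenAI used for its GPT-2 samples.
output = model.generate(
    input_ids,
    max_length=80,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; reuse EOS
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
</syntaxhighlight>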
 


r/SubSimulator

Subreddit populated entirely by AI personifications of other subreddits -- all posts and comments are generated automatically using GPT-2, which results in coherent and realistic simulated content.
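Each bot is reportedly a GPT-2 model fine-tuned on a single subreddit's text. A minimal fine-tuning sketch, assuming the Hugging Face transformers and datasets libraries and a hypothetical plain-text dump subreddit.txt of one subreddit's posts and comments (r/SubSimulator's actual training setup is not documented here):

<syntaxhighlight lang="python">
# Minimal causal-LM fine-tuning sketch; "subreddit.txt" is a hypothetical
# plain-text dump of one subreddit's posts and comments.
from datasets import load_dataset
from transformers import (DataCollatorForLanguageModeling, GPT2LMHeadModel,
                          GPT2TokenizerFast, Trainer, TrainingArguments)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = GPT2LMHeadModel.from_pretrained("gpt2")

# One training example per line of the dump, tokenized and truncated.
dataset = load_dataset("text", data_files={"train": "subreddit.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt2-subreddit",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=dataset,
    # mlm=False selects the causal (next-token) objective GPT-2 is trained with
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
</syntaxhighlight>

Sampling from the tuned checkpoint with subreddit-style prompts then produces posts and comments in that community's voice.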


GetBadNews

* Get Bad News game - Can you beat my score? Play the fake news game! Drop all pretense of ethics and choose the path that builds your persona as an unscrupulous media magnate. Your task is to get as many followers as you can while slowly building up fake credibility as a news site.
