Generative Pre-trained Transformer (GPT)

  
 
* [[Natural Language Generation (NLG)]]
 
* [[Attention Mechanism/Model - Transformer Model]]
* [http://neural-monkey.readthedocs.io/en/latest/machine_translation.html Neural Monkey | Jindřich Libovický, Jindřich Helcl, Tomáš Musil] Byte Pair Encoding (BPE) enables open-vocabulary translation in NMT models by encoding rare and unknown words as sequences of subword units; see the sketch after this list.
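
To make the subword idea concrete, below is a minimal sketch of the BPE merge-learning loop in the style of Sennrich et al. (2016). The toy corpus, merge budget, and function names are illustrative assumptions, not Neural Monkey's actual implementation.

<syntaxhighlight lang="python">
import re
from collections import Counter

def get_pair_stats(vocab):
    """Count frequencies of adjacent symbol pairs across the corpus vocabulary."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[(symbols[i], symbols[i + 1])] += freq
    return pairs

def merge_pair(pair, vocab):
    """Rewrite the vocabulary with every occurrence of `pair` fused into one symbol."""
    pattern = re.compile(r"(?<!\S)" + re.escape(" ".join(pair)) + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in vocab.items()}

# Toy corpus (an assumption for illustration): each word is a space-separated
# symbol sequence with an end-of-word marker </w>, weighted by corpus frequency.
vocab = {"l o w </w>": 5, "l o w e r </w>": 2,
         "n e w e s t </w>": 6, "w i d e s t </w>": 3}

for _ in range(10):  # the merge count sets the subword vocabulary budget
    pairs = get_pair_stats(vocab)
    if not pairs:
        break
    best = max(pairs, key=pairs.get)  # most frequent adjacent pair
    vocab = merge_pair(best, vocab)
    print("merged:", best)
</syntaxhighlight>

Each merge adds one new symbol to the vocabulary; after enough merges, frequent words become single tokens while rare or unknown words decompose into familiar subword pieces, which is what lets the NMT model handle an open vocabulary.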
  
 
GPT-2 is a text-generating bot based on a model with 1.5 billion parameters. ...Ultimately, OpenAI's researchers kept the full thing to themselves, only releasing a pared-down 117 million parameter version of the model (which we have dubbed "GPT-2 Junior") as a safer demonstration of what the full GPT-2 model could do. [http://arstechnica.com/information-technology/2019/02/researchers-scared-by-their-own-work-hold-back-deepfakes-for-text-ai/ Twenty minutes into the future with OpenAI’s Deep Fake Text AI | Sean Gallagher]
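
As a concrete illustration of what even the pared-down released model can do, here is a minimal text-generation sketch. It assumes the Hugging Face transformers library and its "gpt2" hub checkpoint (the small released GPT-2, tooling that postdates the article above); the prompt is made up.

<syntaxhighlight lang="python">
from transformers import pipeline

# "gpt2" on the Hugging Face hub is the small released checkpoint
# (~117M/124M parameters), not the withheld 1.5B-parameter model.
generator = pipeline("text-generation", model="gpt2")

result = generator(
    "The researchers were surprised to find",  # hypothetical prompt
    max_new_tokens=40,
    do_sample=True,        # sample rather than decode greedily, for varied output
    num_return_sequences=1,
)
print(result[0]["generated_text"])
</syntaxhighlight>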
 


