Generative Pre-trained Transformer (GPT)
<youtube>0n95f-eqZdw</youtube>
<youtube>H1lncbq8NC0</youtube>
<youtube>2hpB_H_QMRk</youtube>
<youtube>u1_qMdb0kYU</youtube>
Revision as of 21:45, 27 February 2019
GPT+generation+nlg+natural+language+semantics YouTube search... GPT+generation+nlg+natural+language+semantics+machine+learning+ML ...Google search
- Natural Language Generation (NLG)
- Neural Monkey | Jindřich Libovický, Jindřich Helcl, Tomáš Musil. Byte pair encoding (BPE) enables open-vocabulary translation in NMT models by encoding rare and unknown words as sequences of subword units.
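The subword splitting mentioned above can be sketched as the BPE merge-learning procedure: repeatedly find the most frequent adjacent symbol pair in the corpus and fuse it into a new symbol. The toy corpus and function names below are illustrative assumptions, not Neural Monkey's actual API.

```python
# Minimal sketch of byte pair encoding (BPE) vocabulary learning.
# Illustrative only -- the corpus and helper names are invented for this example.
import re
from collections import Counter

def get_pair_counts(vocab):
    """Count adjacent symbol pairs across the corpus, weighted by word frequency."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    """Fuse every occurrence of the pair into a single new symbol."""
    # Lookarounds keep the match aligned to whole symbols, not substrings of them.
    pattern = re.compile(r"(?<!\S)" + re.escape(" ".join(pair)) + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in vocab.items()}

# Toy corpus: each word split into characters, with an end-of-word marker.
vocab = {"l o w </w>": 5, "l o w e r </w>": 2,
         "n e w e s t </w>": 6, "w i d e s t </w>": 3}
merges = []
for _ in range(10):  # learn up to 10 merge operations
    pairs = get_pair_counts(vocab)
    if not pairs:
        break
    best = max(pairs, key=pairs.get)  # most frequent pair wins each round
    vocab = merge_pair(best, vocab)
    merges.append(best)

print(merges[:3])  # → [('e', 's'), ('es', 't'), ('est', '</w>')]
```

Rare words then decompose into learned subword units ("newest" becomes a single merged token here, while an unseen word falls back to smaller pieces), which is what makes the vocabulary open.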
A text-generating bot based on a model with 1.5 billion parameters. "...Ultimately, OpenAI's researchers kept the full thing to themselves, only releasing a pared-down 117 million parameter version of the model (which we have dubbed 'GPT-2 Junior') as a safer demonstration of what the full GPT-2 model could do." Twenty minutes into the future with OpenAI’s Deep Fake Text AI | Sean Gallagher