Generative Pre-trained Transformer (GPT)

 
||
<youtube>5fqxPOaaqi0</youtube>
<b>What is GPT-3? Showcase, possibilities, and implications
</b><br>What is going on in AI research lately? GPT-3 crashed the party; let's see what it is and what it can do, hoping we do not forget how problematic it might also become. GPT-3 paper: Brown, Tom B., Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, Prafulla Dhariwal, Arvind Neelakantan, et al. "Language models are few-shot learners." arXiv preprint arXiv:2005.14165 (2020). http://arxiv.org/pdf/2005.14165.pdf
|}
|<!-- M -->
 
||
<youtube>8psgEDhT1MM</youtube>
<b>GPT 3 Demo and Explanation - An AI revolution from [[OpenAI]]
</b><br>GPT-3 can write poetry, translate text, chat convincingly, and answer abstract questions. It is being used to code, design, and much more. This video demos some of the latest in this technology and explains some of how it works. GPT-3 comes from [[OpenAI]], a company founded by Elon Musk and Sam Altman (former president of the startup accelerator Y Combinator), with over a billion dollars invested to create human-level AI for the benefit of society. GPT-3 has been in development for a number of years; one of the early papers published was on generative pre-training. The idea behind generative pre-training (GPT) is that while most AIs are trained on labeled data, there is a vast amount of data that is not labeled. If you can use that unlabeled text itself as the training signal, the model learns to predict future text, and you repeat the process until the predictions converge. The newest GPT can do a great deal. Demos include: designing a user interface with AI, coding a React application with AI, an Excel plug-in that fills in data with AI, a search engine/answer engine built with AI, and command-line auto-complete from English to shell commands.
|}
|}<!-- B -->
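The generative pre-training idea in the description above (learning from unlabeled text by predicting what comes next) can be sketched with a toy count-based next-character model. This is a minimal illustration of self-supervised next-token prediction, not GPT's actual transformer training; the corpus and function names here are invented for the example.

```python
from collections import Counter, defaultdict

def train_next_token_model(text, n=4):
    """Self-supervised 'training': the labels are simply the next
    characters of the unlabeled text itself, so no annotation is needed."""
    counts = defaultdict(Counter)
    for i in range(len(text) - n):
        context = text[i:i + n]          # n characters of context
        counts[context][text[i + n]] += 1  # count what actually follows
    return counts

def predict(counts, context):
    """Return the most frequently observed next character, or None."""
    if context not in counts:
        return None
    return counts[context].most_common(1)[0][0]

corpus = "the cat sat on the mat. the cat sat on the hat."
model = train_next_token_model(corpus)
print(predict(model, "the "))  # 'c' ("cat" follows "the " most often)
```

A real language model replaces the frequency table with a neural network and the characters with subword tokens, but the supervision signal is the same: predict the next token of raw text.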
 
||
<youtube>lQnLwUfwgyA</youtube>
<b>This text generation AI is INSANE (GPT-3)
</b><br>An overview of the GPT-3 machine learning model, why everyone should understand it, and why some (including its creator, OpenAI) think it is dangerous.
|}
|<!-- M -->
 
||
<youtube>qqbqW4aVvHo</youtube>
<b>GPT-3 Demo Installation -Generative pretrained Transformer model (Third generation of [[OpenAI]])
</b><br>[[Python]] code.
|}
|}<!-- B -->
 
||
<youtube>SY5PvZrJhLE</youtube>
<b>How Artificial Intelligence Changed the Future of Publishing | [[OpenAI]] GPT-3 and the Future of Books
</b><br>Go from content chaos to clear, compelling writing that influences people to act without them realizing it: http://bit.ly/thebestwaytosayit  As Ed Leon Klinger shows in his GPT-3 demo and GPT-3 examples thread.
|}
|<!-- M -->
 
||
<youtube>pXOlc5CBKT8</youtube>
<b>GPT-3: Language Models are Few-Shot Learners (Paper Explained)
</b><br>How far can you go with ONLY language modeling? Can a large enough language model perform [[Natural Language Processing (NLP)]] tasks out of the box? [[OpenAI]] takes on these and other questions by training a transformer an order of magnitude larger than anything built before, and the results are astounding.
|}
|}<!-- B -->
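"Few-shot" in the paper's sense means placing worked examples directly in the prompt rather than fine-tuning the model's weights. A minimal sketch of assembling such a prompt follows; the helper name and formatting are illustrative (the GPT-3 API does not require this exact layout), though the translation pairs are taken from the paper's well-known English-to-French example.

```python
def few_shot_prompt(task, examples, query):
    """Build a few-shot prompt: the 'training' examples live in the
    context window, and the model is never fine-tuned or updated."""
    lines = [task + ":"]
    for source, target in examples:
        lines.append(f"{source} => {target}")  # one demonstration per line
    lines.append(f"{query} =>")  # the model completes this final line
    return "\n".join(lines)

prompt = few_shot_prompt(
    "Translate English to French",
    [("sea otter", "loutre de mer"), ("cheese", "fromage")],
    "mint",
)
print(prompt)
```

Conditioned on a prompt like this, a sufficiently large language model tends to continue the pattern, which is exactly the in-context learning behavior the paper measures across task types and example counts.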
 
||
<youtube>_8yVOC4ciXc</youtube>
<b>GPT3: An Even Bigger Language Model - Computerphile
</b><br>Basic mathematics from a language model? Rob Miles on GPT-3, where it seems like size does matter! More from Rob Miles: http://bit.ly/Rob_Miles_YouTube  This video was filmed and edited by Sean Riley. Computer Science at the University of Nottingham: https://bit.ly/nottscomputer
|}
|<!-- M -->

Revision as of 20:45, 31 August 2020

YouTube search... ...Google search


Generative Pre-trained Transformer (GPT-3)




Try...

  • Serendipity ...an AI-powered recommendation engine for anything you want.
  • Taglines.ai ...just about every business has a tagline: a short, catchy phrase designed to quickly communicate what it is that they do.
  • Simplify.so ...simple, easy-to-understand explanations for everything





GPT Impact to Development

Generative Pre-trained Transformer (GPT-2)

A text-generating bot based on a model with 1.5 billion parameters. ... Ultimately, OpenAI's researchers kept the full thing to themselves, only releasing a pared-down 117 million parameter version of the model (which we have dubbed "GPT-2 Junior") as a safer demonstration of what the full GPT-2 model could do. (Twenty minutes into the future with OpenAI's Deep Fake Text AI | Sean Gallagher)

Coding Train Late Night 2

r/SubSimulator

Subreddit populated entirely by AI personifications of other subreddits; all posts and comments are generated automatically, which results in coherent and realistic simulated content.


GetBadNews

  • Get Bad News game - Can you beat my score? Play the fake news game! Drop all pretense of ethics and choose the path that builds your persona as an unscrupulous media magnate. Your task is to get as many followers as you can while…