Generative Pre-trained Transformer (GPT)
{{#seo:
|title=PRIMO.ai
|titlemode=append
|keywords=artificial, intelligence, machine, learning, models, algorithms, data, singularity, moonshot, TensorFlow, Facebook, Google, Nvidia, Microsoft, Azure, Amazon, AWS
|description=Helpful resources for your journey with artificial intelligence; videos, articles, techniques, courses, profiles, and tools
}}
* [[Natural Language Generation (NLG)]]
* [[Generated Image]]
* [http://openai.com/blog/gpt-2-6-month-follow-up/ OpenAI Blog] | [[OpenAI]]
* [http://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf Language Models are Unsupervised Multitask Learners | Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever]
* [http://neural-monkey.readthedocs.io/en/latest/machine_translation.html Neural Monkey | Jindřich Libovický, Jindřich Helcl, Tomáš Musil] Byte Pair Encoding (BPE) enables open-vocabulary translation in NMT models by encoding rare and unknown words as sequences of subword units (a minimal sketch of BPE appears below).
* [http://github.com/openai/gpt-2 Language Models are Unsupervised Multitask Learners - GitHub]
* [http://www.infoq.com/news/2019/11/microsoft-ai-conversation/ Microsoft Releases DialoGPT AI Conversation Model | Anthony Alford - InfoQ] - trained on over 147M dialogs
* [http://github.com/karpathy/minGPT minGPT | Andrej Karpathy - GitHub]

http://cdn-images-1.medium.com/max/800/1*jbcwhhB8PEpJRk781rML_g.png
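
The Byte Pair Encoding referenced above is simple at its core: starting from a character-level vocabulary, repeatedly merge the most frequent adjacent symbol pair until a target vocabulary size is reached. A minimal sketch of that merge-learning loop; the toy corpus and merge count are illustrative assumptions, not taken from any of the implementations linked above:

<syntaxhighlight lang="python">
import re
from collections import Counter

def get_pair_counts(vocab):
    """Count adjacent symbol pairs across all words, weighted by word frequency."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for pair in zip(symbols, symbols[1:]):
            pairs[pair] += freq
    return pairs

def merge_pair(pair, vocab):
    """Rewrite the vocabulary with every occurrence of `pair` fused into one symbol."""
    pattern = re.compile(r"(?<!\S)" + re.escape(" ".join(pair)) + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in vocab.items()}

# Toy corpus: each word is a space-separated symbol sequence with an end-of-word marker.
vocab = {"l o w </w>": 5, "l o w e r </w>": 2,
         "n e w e s t </w>": 6, "w i d e s t </w>": 3}

for step in range(10):                    # merge count is illustrative
    pairs = get_pair_counts(vocab)
    best = max(pairs, key=pairs.get)      # most frequent adjacent pair
    vocab = merge_pair(best, vocab)
    print(step, best)                     # first merges here: ('e', 's'), then ('es', 't')
</syntaxhighlight>

Encoding a new word at inference time simply replays the learned merges in order, so rare words decompose into known subword units.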
= Generative Pre-trained Transformer (GPT-3) =

* [http://openai.com/blog/openai-api/ OpenAI API] ...today the API runs models with weights from the GPT-3 family with many speed and throughput improvements (a minimal request sketch follows this list).
* [http://medium.com/@praveengovi.analytics/gpt-3-by-openai-outlook-and-examples-f234f9c62c41 GPT-3 by OpenAI – Outlook and Examples | Praveen Govindaraj | Medium]
* [http://www.gwern.net/GPT-3 GPT-3 Creative Fiction | R. Gwern]
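
The API is reached through a simple completions endpoint. A minimal request sketch using the <code>openai</code> Python package as it existed during the private beta; the engine name, prompt, and sampling parameters here are illustrative assumptions, and a beta-issued API key is required:

<syntaxhighlight lang="python">
# Minimal sketch of a GPT-3 completion request with the `openai` Python package
# (pre-1.0 interface, as during the private beta). Engine name, prompt, and
# sampling parameters are illustrative; a beta-issued API key is required.
import openai

openai.api_key = "YOUR_API_KEY"

response = openai.Completion.create(
    engine="davinci",                  # largest GPT-3 family engine at the time
    prompt="Write a tagline for a bakery:",
    max_tokens=32,                     # cap on generated tokens
    temperature=0.7,                   # sampling temperature
)
print(response.choices[0].text.strip())
</syntaxhighlight>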

<hr>

==== <span id="Try"></span>Try... ====

* [http://twitter.com/sushant_kumar Sushant Kumar]'s micro-site - replace 'word' in the following URL with any word to see what GPT-3 generates: http://thoughts.sushant-kumar.com/word (a minimal fetch sketch follows this list)
* [http://serendipityrecs.com/ Serendipity] ...an AI-powered recommendation engine for anything you want.
* [http://www.taglines.ai/ Taglines.ai] ...just about every business has a tagline: a short, catchy phrase designed to quickly communicate what it is that they do.
* [http://www.simplify.so/ Simplify.so] ...simple, easy-to-understand explanations for everything
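
Because the micro-site is just a URL pattern, you can fetch a generated thought programmatically. A minimal sketch, assuming the endpoint returns an HTML page with the generated sentence embedded in it:

<syntaxhighlight lang="python">
# Minimal sketch: fetch a generated "thought" for a chosen word. This assumes
# the micro-site returns an HTML page with the generated sentence embedded in it.
import urllib.request

word = "serendipity"                    # any word you like
url = "http://thoughts.sushant-kumar.com/" + word
with urllib.request.urlopen(url) as resp:
    print(resp.read().decode("utf-8"))  # raw HTML containing the generated sentence
</syntaxhighlight>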

<hr>

{|<!-- T -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>5fqxPOaaqi0</youtube>
<b>What is GPT-3? Showcase, possibilities, and implications
</b><br>What is going on in AI research lately? GPT-3 crashed the party; let's see what it is and what it can do, hoping we do not forget how problematic it might also become. GPT-3 paper: Brown, Tom B., Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, Prafulla Dhariwal, Arvind Neelakantan et al. "Language Models are Few-Shot Learners." arXiv preprint arXiv:2005.14165 (2020). http://arxiv.org/pdf/2005.14165.pdf
|}
|<!-- M -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>G6Z_S6hs29s</youtube>
<b>14 Cool Apps Built on [[OpenAI]]'s GPT-3 API
</b><br>14 cool applications just built on top of [[OpenAI]]'s GPT-3 (generative pre-trained transformer) API (currently in private beta).
|}
|}<!-- B -->
{|<!-- T -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>lQnLwUfwgyA</youtube>
<b>This text generation AI is INSANE (GPT-3)
</b><br>An overview of the GPT-3 machine learning model, why everyone should understand it, and why some (including its creator, [[OpenAI]]) think it's dangerous.
|}
|<!-- M -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>qqbqW4aVvHo</youtube>
<b>GPT-3 Demo Installation - Generative Pre-trained Transformer model (Third generation of [[OpenAI]])
</b><br>[[Python]] code.
|}
|}<!-- B -->
{|<!-- T -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>SY5PvZrJhLE</youtube>
<b>How Artificial Intelligence Changed the Future of Publishing | [[OpenAI]] GPT-3 and the Future of Books
</b><br>Go from content chaos to clear, compelling writing that influences people to act without them realizing it: http://bit.ly/thebestwaytosayit As Ed Leon Klinger shows in his GPT-3 demo and GPT-3 examples thread.
|}
|<!-- M -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>pXOlc5CBKT8</youtube>
<b>GPT-3: Language Models are Few-Shot Learners (Paper Explained)
</b><br>How far can you go with ONLY language modeling? Can a large enough language model perform [[Natural Language Processing (NLP)]] tasks out of the box? [[OpenAI]] takes on these and other questions by training a transformer that is an order of magnitude larger than anything that has ever been built before, and the results are astounding.
|}
|}<!-- B -->
{|<!-- T -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>_8yVOC4ciXc</youtube>
<b>GPT3: An Even Bigger Language Model - Computerphile
</b><br>Basic mathematics from a language model? Rob Miles on GPT-3, where it seems like size does matter! More from Rob Miles: http://bit.ly/Rob_Miles_YouTube This video was filmed and edited by Sean Riley. Computer Science at the University of Nottingham: https://bit.ly/nottscomputer
|}
|<!-- M -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>OznMk5Jexu8</youtube>
<b>GPT-3 from [[OpenAI]] is here and it's a MONSTER!
</b><br>GPT-3 is the largest language model to date, with 175 billion parameters. It is able to do various [[Natural Language Processing (NLP)]] tasks (translation, question answering) without additional fine-tuning.
|}
|}<!-- B -->
{|<!-- T -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>kpiY_LemaTc</youtube>
<b>GPT-3 vs Human Brain
</b><br>GPT-3 has 175 billion parameters/synapses. The human brain has 100 trillion synapses. How much would it cost to train a language model the size of the human brain? REFERENCES:

[1] [http://arxiv.org/abs/2005.14165 GPT-3 paper: Language Models are Few-Shot Learners]

[2] [http://lambdalabs.com/blog/demystifying-gpt-3/ OpenAI's GPT-3 Language Model: A Technical Overview]

[3] [http://arxiv.org/abs/2005.04305 Measuring the Algorithmic Efficiency of Neural Networks]
|}
|<!-- M -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>0ZVOmBp29E0</youtube>
<b>Steve Omohundro on GPT-3
</b><br>In this research meeting, guest Stephen Omohundro gave a fascinating talk on GPT-3, the new massive [[OpenAI]] Natural Language Processing model. He reviewed the network architecture, training process, and results in the context of past work. There was extensive discussion of the implications for [[Natural Language Processing (NLP)]] and for Machine Intelligence / AGI.
|}
|}<!-- B -->
{|<!-- T -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>8psgEDhT1MM</youtube>
<b>GPT 3 Demo and Explanation - An AI revolution from [[OpenAI]]
</b><br>GPT-3 can write poetry, translate text, chat convincingly, and answer abstract questions. It's being used to code, design, and much more. I'll give you a demo of some of the latest in this technology and some of how it works. GPT-3 comes from a company called [[OpenAI]]. [[OpenAI]] was founded by Elon Musk and Sam Altman (former president of Y Combinator, the startup accelerator), with over a billion dollars invested, to collaborate on and create human-level AI for the benefit of society. GPT has been developed for a number of years; one of the early papers published was on Generative Pre-Training. The idea behind generative pre-training (GPT) is that while most AIs are trained on labeled data, there is a ton of data that isn't labeled. If you can evaluate the words and use them to train and tune the AI, it can start to predict future text on the unlabeled data; you repeat the process until the predictions start to converge. The newest GPT can do a ton. Some of the demos include: designing a user interface using AI, coding a React application using AI, an Excel plug-in that fills in data using AI, a search engine/answer engine using AI, and command-line auto-complete from English to shell commands.
|}
|<!-- M -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>aDFLp4A1EmY</youtube>
<b>Panel discussion - GPT-3 and Artificial General Intelligence 27 Aug 2020
</b><br>Is GPT-3 a step towards creating artificial general intelligence? Chair: Associate Professor Kate Devitt - Chief Scientist, Trusted Autonomous Systems

Panel:
• Professor David Chalmers (NYU)
• Professor Susan Schneider (NASA and Florida Atlantic University)
• Professor Marcus Hutter (ANU)

A philosophical discussion on the development of artificial intelligence, and specifically on advances in the Generative Pre-trained Transformer-3 (GPT-3). GPT-3 is an auto-complete algorithm created by OpenAI as part of their endeavour to develop artificial general intelligence, the third in a series of autocomplete tools designed by OpenAI. (GPT stands for "generative pre-trained transformer.") GPT-3 is fed an unimaginably large corpus of human knowledge, including all of Wikipedia, millions of books, websites, and other materials, including philosophy texts. In fact, any type of information uploaded to the internet is possible food for GPT-3's artificial mind to dwell on. The result? Eerily coherent, complex, and interesting thoughts about almost any topic. The sophisticated, nuanced text produced by GPT-3 seems to pass the Turing Test for many, including philosophers. Some of GPT-3's answers are shedding new light on enduring philosophical questions. Is GPT-3 the beginning of an artificial general intelligence? Does it create ideas like a human mind, or even better than a human mind? Is human cognition similarly some sort of autocomplete program in our brains? Is it possible that GPT-3 one day becomes conscious, or is it already conscious, and how could we tell? If an AI passes our tests for consciousness, do we then have an obligation to accord it rights? If so, what sorts of rights might it deserve? Independently of rights, how should humans manage an AI that has access to everything that is posited and known and can trick humans into believing that another rational agent is communicating with them? The panel considers what GPT-3 tells us about the ambition to build an artificial general intelligence, consciousness, human thought, and how we should treat AI in an increasingly digital and disembodied world rife with mis- and disinformation.
|}
|}<!-- B -->

= <span id="GPT Impact to Development"></span>GPT Impact to Development =

* [[Development]]
* [http://analyticsindiamag.com/will-the-much-hyped-gpt-3-impact-the-coders/ Will The Much-Hyped GPT-3 Impact The Coders? | Analytics India Magazine]
* [http://twitter.com/sharifshameem/status/1283322990625607681 With GPT-3, I built a layout generator where you just describe any layout you want, and it generates the JSX code for you. | Sharif Shameem] - [http://debuild.co/ debuild]
  
{|<!-- T -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>fZSFNUT6iY8</youtube>
<b>[[OpenAI]] Model Generates [[Python]] Code
</b><br>Credits: [[Microsoft]], [[OpenAI]]
|}
|<!-- M -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>utuz7wBGjKM</youtube>
<b>[[OpenAI]] Model Generates [[Python]] Code
</b><br>This code completion engine can write an entire function from just the name! [[OpenAI]] demonstrates what happens when you train a language model on thousands of GitHub [[Python]] repositories. Source clip: http://youtu.be/fZSFNUT6iY8
|}
|}<!-- B -->
{|<!-- T -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>y5-wzgIySb4</youtube>
<b>[[OpenAI]] and [[Microsoft]] Can Generate [[Python]] Code
</b><br>The [[OpenAI]] language model was trained on thousands of GitHub repositories using the same unsupervised learning as the GPT models. Build 2020
|}
|<!-- M -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>6SGj2OrTpbI</youtube>
<b>GPT 3 Explanation And Demo Reaction | Should You Be Scared ? Techies Reaction To GPT3 AI [[OpenAI]]
</b><br>In this video we look at some of the demos and reactions across social media on GPT-3. The tweets and demos shown in this video are linked below, so please do reach out if you have any questions.
|}
|}<!-- B -->
{|<!-- T -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>w4JYe-oY4HI</youtube>
<b>Code 10x Faster With This CRAZY New AI Tool (GPT-3)
</b><br>In this FREE LIVE training, Aaron and Naz show you the new cutting-edge machine learning AI, [[OpenAI]]'s GPT-3.
|}
|<!-- M -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>-qizU_v-oqA</youtube>
<b>Build 2 projects using GPT-3 in just a couple of minutes. Bare bones: branding generator, chat bot
</b><br>Co-founded by Elon Musk, [[OpenAI]] wants to make AI safe and accessible. A year ago the startup released GPT-2, a language model that was at the time deemed too powerful to release; eventually [[OpenAI]] made the model available. This year they've trained a much more powerful model, at least one order of magnitude larger than GPT-2. I was one of the lucky 750+ people granted access as beta testers by [[OpenAI]] to see what can be built using the GPT-3 API. The model costs millions of dollars to train, which makes it out of reach for most organizations. This video skims the surface of what you can get done with this amazing new model.
|}
|}<!-- B -->
  
 
= Generative Pre-trained Transformer (GPT-2) =
* [http://inferkit.com/ InferKit | Adam D King] - completes your text.

...a text-generating bot based on a model with 1.5 billion parameters. ...Ultimately, OpenAI's researchers kept the full thing to themselves, only releasing a pared-down 117 million parameter version of the model (which we have dubbed "GPT-2 Junior") as a safer demonstration of what the full GPT-2 model could do. [http://arstechnica.com/information-technology/2019/02/researchers-scared-by-their-own-work-hold-back-deepfakes-for-text-ai/ Twenty minutes into the future with OpenAI's Deep Fake Text AI | Sean Gallagher]
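
To try the released GPT-2 weights directly, one convenient option is the Hugging Face transformers library; this is an assumption of convenience rather than the official route (the openai/gpt-2 repository linked above ships its own TensorFlow sampling scripts). A minimal sampling sketch:

<syntaxhighlight lang="python">
# Minimal sampling sketch for the released GPT-2 weights using the Hugging Face
# `transformers` library (a convenience assumption; the openai/gpt-2 repository
# ships its own TensorFlow scripts). "gpt2" is the small released checkpoint;
# "gpt2-xl" is the full 1.5-billion-parameter model.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

inputs = tokenizer("The future of language models is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=50,    # total length, prompt included
    do_sample=True,   # sample instead of greedy decoding
    top_k=40,         # restrict sampling to the 40 most likely next tokens
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
</syntaxhighlight>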
 
  
== Coding Train Late Night 2 ==

{|<!-- T -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>0LZUSkwCYfU</youtube>
<b>Coding Train Late Night 2: Fetch, GPT-2 and RunwayML
</b><br>The Coding Train
0:00 Live Stream Starts
3:51 Introduction With Dad Jokes
11:29 Coding Late At Night Projects and Notes
16:48 Scraping Dad Jokes With Fetch
50:10 Training a Model With Runway
57:52 Small Break
1:00:15 Controlling Hue Lights
1:20:00 Dad Joke Model
1:32:27 Skip: Audio Glitch (LOUD)
1:35:00 Dad Joke Model
1:49:25 Dad Joke Generator
1:54:25 Goodbyes and End of Stream

Website: http://thecodingtrain.com/
|}
|<!-- M -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>kWsDL-6D-nk</youtube>
<b>Coding Train Late Night 3: GPT-2, Hue Lights, Discord Bot
</b><br>The Coding Train
0:00 Live Stream Starts
3:50 Introduction
9:50 AI Joke Generator
13:30 Live Stream Notes
19:50 Generative Text Training with GPT-2
29:40 Dad Joke Model Training
1:11:27 Using Hue Lights API
1:31:50 More Dad Joke Generator
1:37:33 Discord Bot
2:15:04 Goodbyes and End of Stream

Website: http://thecodingtrain.com/
|}
|}<!-- B -->
  
 
== r/SubSimulator ==

Subreddit populated entirely by AI personifications of other subreddits -- all posts and comments are generated automatically using:

results in coherent and realistic simulated content.
== GetBadNews ==

* [http://getbadnews.com Get Bad News] game - Can you beat my score? Play the fake news game! Drop all pretense of [[ethics]] and choose the path that builds your persona as an unscrupulous media magnate. Your task is to get as many followers as you can while...

<img src="http://www.getbadnews.com/wp-content/uploads/2018/02/share-score.png" width="500" height="250">
