Generative Pre-trained Transformer (GPT)
{{#seo:
|title=PRIMO.ai
|titlemode=append
|keywords=ChatGPT, artificial, intelligence, machine, learning, NLP, NLG, NLC, NLU, models, data, singularity, moonshot, Sentience, AGI, Emergence, Explainable, TensorFlow, Google, Nvidia, Microsoft, Azure, Amazon, AWS, Hugging Face, OpenAI, Meta, LLM, metaverse, assistants, agents, digital twin, IoT, Transhumanism, Immersive Reality, Generative AI, Conversational AI, Perplexity, Bing, You, Bard, Ernie, prompt engineering, LangChain, Video/Image, Vision, End-to-End Speech, Synthesize Speech, Speech Recognition, Stanford, MIT
|description=Helpful resources for your journey with artificial intelligence; videos, articles, techniques, courses, profiles, and tools

<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-4GCWLBVJ7T"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());

  gtag('config', 'G-4GCWLBVJ7T');
</script>
}}
[https://www.youtube.com/results?search_query=Generative+Pre+trained+Transformer+GPT YouTube]
[https://www.quora.com/search?q=Generative%20Pre%20trained%20Transformer%20GPT ... Quora]
[https://www.google.com/search?q=Generative+Pre+trained+Transformer+GPT ...Google search]
[https://news.google.com/search?q=Generative+Pre+trained+Transformer+GPT ...Google News]
[https://www.bing.com/news/search?q=Generative+Pre+trained+Transformer+GPT&qft=interval%3d%228%22 ...Bing News]
* [[Large Language Model (LLM)]] ... [[Large Language Model (LLM)#Multimodal|Multimodal]] ... [[Foundation Models (FM)]] ... [[Generative Pre-trained Transformer (GPT)|Generative Pre-trained]] ... [[Transformer]] ... [[Attention]] ... [[Generative Adversarial Network (GAN)|GAN]] ... [[Bidirectional Encoder Representations from Transformers (BERT)|BERT]]
* [[Conversational AI]] ... [[ChatGPT]] | [[OpenAI]] ... [[Bing/Copilot]] | [[Microsoft]] ... [[Gemini]] | [[Google]] ... [[Claude]] | [[Anthropic]] ... [[Perplexity]] ... [[You]] ... [[phind]] ... [[Ernie]] | [[Baidu]]
* [[Natural Language Processing (NLP)]] ... [[Natural Language Generation (NLG)|Generation (NLG)]] ... [[Natural Language Classification (NLC)|Classification (NLC)]] ... [[Natural Language Processing (NLP)#Natural Language Understanding (NLU)|Understanding (NLU)]] ... [[Language Translation|Translation]] ... [[Summarization]] ... [[Sentiment Analysis|Sentiment]] ... [[Natural Language Tools & Services|Tools]]
* [[Agents]] ... [[Robotic Process Automation (RPA)|Robotic Process Automation]] ... [[Assistants]] ... [[Personal Companions]] ... [[Personal Productivity|Productivity]] ... [[Email]] ... [[Negotiation]] ... [[LangChain]]
* [[What is Artificial Intelligence (AI)? | Artificial Intelligence (AI)]] ... [[Generative AI]] ... [[Machine Learning (ML)]] ... [[Deep Learning]] ... [[Neural Network]] ... [[Reinforcement Learning (RL)|Reinforcement]] ... [[Learning Techniques]]
* [[Sequence to Sequence (Seq2Seq)]]
* [[Recurrent Neural Network (RNN)]]
* [[Long Short-Term Memory (LSTM)]]
* [[ELMo]]
* [[Bidirectional Encoder Representations from Transformers (BERT)]] ... a better model, but with less investment behind it than the larger [[OpenAI]] organization
* [https://openai.com/blog/gpt-2-6-month-follow-up/ OpenAI Blog] | [[OpenAI]]
* [[Text Transfer Learning]]
* [[Video/Image]] ... [[Vision]] ... [[Enhancement]] ... [[Fake]] ... [[Reconstruction]] ... [[Colorize]] ... [[Occlusions]] ... [[Predict image]] ... [[Image/Video Transfer Learning]]
* [[Writing/Publishing#SynthPub|Writing/Publishing - SynthPub]]
* [https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf Language Models are Unsupervised Multitask Learners | Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever]
* [https://neural-monkey.readthedocs.io/en/latest/machine_translation.html Neural Monkey | Jindřich Libovický, Jindřich Helcl, Tomáš Musil] Byte Pair Encoding (BPE) enables open-vocabulary translation in NMT models by encoding rare and unknown words as sequences of subword units; a minimal sketch of the merge loop appears after this list.
* [https://github.com/openai/gpt-2 Language Models are Unsupervised Multitask Learners - GitHub]
* [https://www.infoq.com/news/2019/11/microsoft-ai-conversation/ Microsoft Releases DialogGPT AI Conversation Model | Anthony Alford - InfoQ] - trained on over 147M dialogs
* [https://github.com/karpathy/minGPT minGPT | Andrej Karpathy - GitHub]
* [https://sambanova.ai/solutions/gpt/ SambaNova Systems] ... Dataflow-as-a-Service GPT
* [https://www.reuters.com/technology/facebook-owner-meta-opens-access-ai-large-language-model-2022-05-03/ [[Meta|Facebook]]-owner Meta opens access to AI large language model | Elizabeth Culliford - Reuters] ... [[Meta|Facebook]] 175-billion-parameter language model - Open Pretrained Transformer (OPT-175B)
* [https://lilianweng.github.io/posts/2018-06-24-attention/ Resource on Transformers | Lilian Weng - Lil'Log]
<img src="https://production-media.paperswithcode.com/methods/Screen_Shot_2020-05-27_at_12.41.44_PM.png" width="1000">
* [https://paperswithcode.com/method/gpt GPT | Papers With Code]
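The Neural Monkey entry above leans on Byte Pair Encoding (BPE), whose merge loop is compact enough to sketch. Below is a minimal learner in Python over a toy corpus, following the classic Sennrich et al. formulation; the corpus and merge count are illustrative, and this is not Neural Monkey's actual implementation.

<pre>
import re
from collections import Counter

def get_pair_stats(vocab):
    """Count frequencies of adjacent symbol pairs across the vocabulary."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[(symbols[i], symbols[i + 1])] += freq
    return pairs

def merge_pair(pair, vocab):
    """Rewrite the vocabulary with every occurrence of `pair` fused into one symbol."""
    pattern = re.compile(r'(?<!\S)' + re.escape(' '.join(pair)) + r'(?!\S)')
    return {pattern.sub(''.join(pair), word): freq for word, freq in vocab.items()}

# Toy corpus: words split into characters, with '</w>' marking word ends.
vocab = {'l o w </w>': 5, 'l o w e r </w>': 2,
         'n e w e s t </w>': 6, 'w i d e s t </w>': 3}

for step in range(10):  # 10 merges here; real vocabularies use tens of thousands
    stats = get_pair_stats(vocab)
    if not stats:
        break
    best = max(stats, key=stats.get)  # most frequent adjacent pair wins
    vocab = merge_pair(best, vocab)
    print(step, best)
</pre>

Each pass fuses the most frequent adjacent pair into a new symbol, so frequent words collapse into single tokens while rare and unknown words stay decomposable into subword units.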
<b><span id="Try"></span>Try...</b>
* [https://twitter.com/sushant_kumar Sushant Kumar]'s micro-site - replace 'word' at the end of the following URL with your own word to see what GPT-3 generates: https://thoughts.sushant-kumar.com/word
* [https://serendipityrecs.com/ Serendipity] ... an AI-powered recommendation engine for anything you want.
* [https://www.taglines.ai/ Taglines.ai] ... just about every business has a tagline — a short, catchy phrase designed to quickly communicate what it does.
* [https://www.simplify.so/ Simplify.so] ... simple, easy-to-understand explanations for everything
{|<!-- T -->
| valign="top" |
|}
|}<!-- B -->
== <span id="GPT Impact to Development"></span>GPT Impact to Development ==
* [[Development]] ... [[Development#AI Pair Programming Tools|AI Pair Programming Tools]] ... [[Analytics]] ... [[Visualization]] ... [[Diagrams for Business Analysis]]
* [https://analyticsindiamag.com/will-the-much-hyped-gpt-3-impact-the-coders/ Will The Much-Hyped GPT-3 Impact The Coders? | Analytics India Magazine]
* [https://twitter.com/sharifshameem/status/1283322990625607681 With GPT-3, I built a layout generator where you just describe any layout you want, and it generates the JSX code for you. | Sharif Shameem] - [https://debuild.co/ debuild]
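The demos above share one pattern: describe the artifact, let the model emit the code. Below is a minimal sketch of that describe-then-generate loop using the OpenAI Python package (v1.x assumed); the model name and prompts are illustrative, not what debuild actually runs.

<pre>
# Hedged sketch of description-to-code generation.
# Assumes the `openai` package (v1.x) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any capable code model works here
    messages=[
        {"role": "system",
         "content": "You emit JSX only: no prose, no markdown fences."},
        {"role": "user",
         "content": "A layout with a navbar, a search box, and a 3-column card grid."},
    ],
)
print(response.choices[0].message.content)  # the generated JSX
</pre>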
{| class="wikitable" style="width: 550px;"
||
<youtube>3P3TcKaegbA</youtube>
<b>Generative Python Code with GPT
</b><br>A quest to teach neural networks, via transformers, to write Python code. Project name: Generative Python Transformers!
|}
|<!-- M -->
|}<!-- B -->
= <span id="Custom GPTs"></span>Custom GPTs =
* [[Agents]] ... [[Robotic Process Automation (RPA)|Robotic Process Automation]] ... [[Assistants]] ... [[Personal Companions]] ... [[Personal Productivity|Productivity]] ... [[Email]] ... [[Negotiation]] ... [[LangChain]]

Custom GPTs are personalized versions of AI models like [[ChatGPT]] that can be tailored for specific tasks or projects. They represent a significant advancement in AI implementation, allowing businesses and individuals to customize AI tools to meet unique challenges and operational needs.

== <span id="OpenAI Platform"></span>OpenAI Platform ==
* [https://chat.openai.com/create OpenAI Platform]

[[OpenAI]] allows Plus and Enterprise users to create custom GPTs that can browse the web, create images, and run code. Users can upload knowledge files, modify the GPT's appearance, and define its actions.

=== <span id="OpenAI GPT Store"></span>OpenAI GPT Store ===
* [https://chatgpt.com/gpts GPT Store]

The [[OpenAI]] GPT Store lets [[ChatGPT]] Plus users create and share their own custom chatbots, known as GPTs (Generative Pre-trained Transformers). It gives developers a way to monetize their custom GPTs and offers users a wide range of AI tools and capabilities to explore, expanding what assistants like [[ChatGPT]] can do.

<youtube>2wYcJEcKVPk</youtube>
<youtube>amjnJrfByS0</youtube>
<youtube>VudB3E9tSbc</youtube>
<youtube>SVA-OBl44m4</youtube>
=== <span id="OpenAI GPT Builder"></span>OpenAI GPT Builder ===
With the GPT Builder, users can tailor GPTs for specific tasks or topics by combining instructions, knowledge, and capabilities. It lets users build AI agents without coding skills, making it accessible to a wide range of people, including educators, coaches, and anyone interested in building helpful tools.

To create a GPT with the GPT Builder, users can open the builder interface through the [[OpenAI]] platform at chat.openai.com/gpts/editor or by selecting "My GPTs" after logging in. The builder provides a split screen: a Create panel where users enter prompts and instructions to build their chatbot, and a Preview panel where they interact with the chatbot as they build it, making it easier to refine and customize the GPT.

The GPT Builder also lets users add images to the GPT, either by asking the builder to create one or by uploading custom images. Additionally, GPTs can be granted access to web browsing, [[Video/Image#DALL-E | DALL-E]] (an image generation model), and [[OpenAI]]'s Code Interpreter tool for writing and executing software. The builder also includes a Knowledge section where users can upload custom data to enhance the capabilities of their GPTs.
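The GPT Builder itself is point-and-click, but the same ingredients (instructions, tool capabilities, uploaded knowledge) can be wired up in code. Below is a rough sketch against the Assistants API that [[OpenAI]] shipped alongside custom GPTs, assuming the `openai` Python package v1.x and the original v1 Assistants shape; the name, model, and file are illustrative.

<pre>
from openai import OpenAI

client = OpenAI()

# Analogous to the builder's Knowledge section: upload a reference file.
knowledge = client.files.create(file=open("faq.pdf", "rb"), purpose="assistants")

# Analogous to the Create panel: instructions plus tool capabilities.
assistant = client.beta.assistants.create(
    name="Docs Helper",  # illustrative name
    instructions="Answer questions from the attached FAQ; quote the section you used.",
    model="gpt-4-turbo",  # illustrative model choice
    tools=[{"type": "code_interpreter"}, {"type": "retrieval"}],
    file_ids=[knowledge.id],  # v1 shape; later API versions attach files differently
)
print(assistant.id)
</pre>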

<youtube>f2uPl2MlV24</youtube>
<youtube>SjJsXyBTPUc</youtube>

= <span id="Let's build GPT: from scratch, in code, spelled out - Andrej Karpathy"></span>Let's build GPT: from scratch, in code, spelled out - Andrej Karpathy =
* [[What is Artificial Intelligence (AI)? | Artificial Intelligence (AI)]] ... [[Generative AI]] ... [[Machine Learning (ML)]] ... [[Deep Learning]] ... [[Neural Network]] ... [[Reinforcement Learning (RL)|Reinforcement]] ... [[Learning Techniques]]
* [[Development]] ... [[Notebooks]] ... [[Development#AI Pair Programming Tools|AI Pair Programming]] ... [[Codeless Options, Code Generators, Drag n' Drop|Codeless, Generators, Drag n' Drop]] ... [[Algorithm Administration#AIOps/MLOps|AIOps/MLOps]] ... [[Platforms: AI/Machine Learning as a Service (AIaaS/MLaaS)|AIaaS/MLaaS]]
{|<!-- T -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>kCc8FmEb1nY</youtube>
<b>Let's build GPT: from scratch, in code, spelled out.
</b><br>[[Creatives#Andrej Karpathy|Andrej Karpathy]] ... We build a Generatively Pretrained Transformer (GPT), following the paper "Attention is All You Need" and OpenAI's GPT-2 / GPT-3. We talk about connections to ChatGPT, which has taken the world by storm. We watch GitHub Copilot, itself a GPT, help us write a GPT (meta :D!). I recommend people watch the earlier makemore videos to get comfortable with the autoregressive language modeling framework and the basics of tensors and PyTorch nn, which we take for granted in this video. (A minimal self-attention sketch distilled from this video appears after these tables.)

Links:
* [https://colab.research.google.com/drive/1JMLa53HDuA-i7ZBmqV7ZnA3c_fvtXnx-?usp=sharing Google colab for the video]
* [https://github.com/karpathy/ng-video-lecture GitHub repo for the video]
* [https://www.youtube.com/watch?v=VMj-3S1tku0&list=PLAqhIrjkxbuWI23v9cThsA9GvCAUhRvKZ&index=2 Playlist of the whole Zero to Hero series so far]
* [https://github.com/karpathy/nanoGPT nanoGPT repo]
* [https://karpathy.ai my website]
* [https://twitter.com/karpathy my twitter]
* [https://discord.gg/3zy8kqD9Cp our Discord channel]

Supplementary links:
* [https://arxiv.org/abs/1706.03762 Attention is All You Need paper]
* [https://arxiv.org/abs/2005.14165 OpenAI GPT-3 paper]
* [https://openai.com/blog/chatgpt/ OpenAI ChatGPT blog post]
* The GPU I'm training the model on is from Lambda GPU Cloud, I think the best and easiest way to spin up an on-demand GPU instance in the cloud that you can ssh to: https://lambdalabs.com . If you prefer to work in notebooks, I think the easiest path today is Google Colab.

Suggested exercises:
* EX1: The n-dimensional tensor mastery challenge: Combine the `Head` and `MultiHeadAttention` into one class that processes all the heads in parallel, treating the heads as another batch dimension (answer is in nanoGPT).
* EX2: Train the GPT on your own dataset of choice! What other data could be fun to blabber on about? (A fun advanced suggestion if you like: train a GPT to do addition of two numbers, i.e. a+b=c. You may find it helpful to predict the digits of c in reverse order, as the typical addition algorithm (that you're hoping it learns) would proceed right to left too. You may want to modify the data loader to simply serve random problems and skip the generation of train.bin, val.bin. You may want to mask out the loss at the input positions of a+b that just specify the problem, using y=-1 in the targets (see CrossEntropyLoss ignore_index). Does your Transformer learn to add? Once you have this, swole doge project: build a calculator clone in GPT, for all of +-*/. Not an easy problem. You may need Chain of Thought traces.)
* EX3: Find a dataset that is very large, so large that you can't see a gap between train and val loss. Pretrain the transformer on this data, then initialize with that model and finetune it on tiny shakespeare with a smaller number of steps and lower learning rate. Can you obtain a lower validation loss by the use of pretraining?
* EX4: Read some transformer papers and implement one additional feature or change that people seem to use. Does it improve the performance of your GPT?

Chapters:
* 00:00:00 intro: ChatGPT, Transformers, nanoGPT, Shakespeare
Baseline language modeling, code setup:
* 00:07:52 reading and exploring the data
* 00:09:28 tokenization, train/val split
* 00:14:27 data loader: batches of chunks of data
* 00:22:11 simplest baseline: bigram language model, loss, generation
* 00:34:53 training the bigram model
* 00:38:00 port our code to a script
Building the "self-attention":
* 00:42:13 version 1: averaging past [[context]] with for loops, the weakest form of aggregation
* 00:47:11 the trick in self-attention: matrix multiply as weighted aggregation
* 00:51:54 version 2: using matrix multiply
* 00:54:42 version 3: adding softmax
* 00:58:26 minor code cleanup
* 01:00:18 positional encoding
* 01:02:00 THE CRUX OF THE VIDEO: version 4: self-attention
* 01:11:38 note 1: attention as [[Agents#Communication | communication]]
* 01:12:46 note 2: attention has no notion of space, operates over sets
* 01:13:40 note 3: there is no [[Agents#Communication | communication]] across batch dimension
* 01:14:14 note 4: encoder blocks vs. decoder blocks
* 01:15:39 note 5: attention vs. self-attention vs. cross-attention
* 01:16:56 note 6: "scaled" self-attention. why divide by sqrt(head_size)
Building the Transformer:
* 01:19:11 inserting a single self-attention block to our network
* 01:21:59 multi-headed self-attention
* 01:24:25 feedforward layers of transformer block
* 01:26:48 residual connections
* 01:32:51 layernorm (and its relationship to our previous batchnorm)
* 01:37:49 scaling up the model! creating a few variables. adding dropout
Notes on Transformer:
* 01:42:39 encoder vs. decoder vs. both (?) Transformers
* 01:46:22 super quick walkthrough of nanoGPT, batched multi-headed self-attention
* 01:48:53 back to ChatGPT, GPT-3, pretraining vs. finetuning, RLHF
* 01:54:32 conclusions

Corrections:
* 00:57:00 Oops "tokens from the future cannot communicate", not "past". Sorry! :)
* 01:20:05 Oops I should be using the head_size for the normalization, not C
|}
|<!-- M -->
| valign="top" |
{| class="wikitable" style="width: 550px;"
||
<youtube>9uw3F6rndnA</youtube>
<b>Transformers: The best idea in AI | Andrej Karpathy and Lex Fridman
</b><br>[https://www.youtube.com/watch?v=cdiD-9MMpb0 Lex Fridman Podcast full episode]
Please support this podcast by checking out our sponsors:
* [https://www.eightsleep.com/lex Eight Sleep: to get special savings]
* [https://betterhelp.com/lex BetterHelp: to get 10% off]
* [https://fundrise.com/lex Fundrise]
* [https://athleticgreens.com/lex Athletic Greens: to get 1 month of fish oil]

GUEST BIO:
[[Creatives#Andrej Karpathy|Andrej Karpathy]] is a legendary AI researcher, engineer, and educator. He's the former director of AI at Tesla, a founding member of [[OpenAI]], and an educator at Stanford.

PODCAST INFO:
* [https://lexfridman.com/podcast Podcast website]
* [https://apple.co/2lwqZIr Apple Podcasts]
* [https://spoti.fi/2nEwCF8 Spotify]
* [https://lexfridman.com/feed/podcast/ RSS]
* [https://www.youtube.com/playlist?list=PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4 Full episodes playlist]
* Clips playlist: https://www.youtube.com/playlist?list...

SOCIAL:
* [https://twitter.com/lexfridman Twitter]
* [https://www.linkedin.com/in/lexfridman LinkedIn]
* [https://www.facebook.com/lexfridman [[Meta|Facebook]]]
* [https://www.instagram.com/lexfridman Instagram]
* [https://medium.com/@lexfridman Medium]
* [https://reddit.com/r/lexfridman Reddit]
* [https://www.patreon.com/lexfridman Support on Patreon]
|}
|}<!-- B -->
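As flagged above, the crux of the first video (01:02:00, with the sqrt scaling from note 6) condenses to a few lines. Below is a minimal single-head, causal self-attention sketch in PyTorch, distilled from the video; the batch, time, channel, and head sizes are illustrative.

<pre>
import torch
import torch.nn.functional as F

torch.manual_seed(1337)
B, T, C = 4, 8, 32                 # batch, time, channels
head_size = 16
x = torch.randn(B, T, C)

key   = torch.nn.Linear(C, head_size, bias=False)
query = torch.nn.Linear(C, head_size, bias=False)
value = torch.nn.Linear(C, head_size, bias=False)

k, q, v = key(x), query(x), value(x)              # each (B, T, head_size)
wei = q @ k.transpose(-2, -1) * head_size**-0.5   # scaled affinities, (B, T, T)

tril = torch.tril(torch.ones(T, T))
wei = wei.masked_fill(tril == 0, float('-inf'))   # tokens from the future are masked
wei = F.softmax(wei, dim=-1)                      # weighted aggregation of the past
out = wei @ v                                     # (B, T, head_size)
print(out.shape)                                  # torch.Size([4, 8, 16])
</pre>

Note the normalization uses head_size, not C, applying the video's own correction at 01:20:05.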