Generative Pre-trained Transformer (GPT)
YouTube ... Quora ...Google search ...Google News ...Bing News
- Large Language Model (LLM) ... Multimodal ... Foundation Models (FM) ... Generative Pre-trained ... Transformer ... Attention ... GAN ... BERT
- Conversational AI ... ChatGPT | OpenAI ... Bing/Copilot | Microsoft ... Gemini | Google ... Claude | Anthropic ... Perplexity ... You ... phind ... Ernie | Baidu
- Natural Language Processing (NLP) ... Generation (NLG) ... Classification (NLC) ... Understanding (NLU) ... Translation ... Summarization ... Sentiment ... Tools
- Agents ... Robotic Process Automation ... Assistants ... Personal Companions ... Productivity ... Email ... Negotiation ... LangChain
- Artificial Intelligence (AI) ... Generative AI ... Machine Learning (ML) ... Deep Learning ... Neural Network ... Reinforcement ... Learning Techniques
- Sequence to Sequence (Seq2Seq)
- Recurrent Neural Network (RNN)
- Long Short-Term Memory (LSTM)
- ELMo
- Bidirectional Encoder Representations from Transformers (BERT) ... a better model, but with less investment behind it than the larger OpenAI organization put into GPT
- OpenAI Blog | OpenAI
- Text Transfer Learning
- Video/Image ... Vision ... Enhancement ... Fake ... Reconstruction ... Colorize ... Occlusions ... Predict image ... Image/Video Transfer Learning
- Writing/Publishing - SynthPub
- Language Models are Unsupervised Multitask Learners | Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever
- Neural Monkey | Jindřich Libovický, Jindřich Helcl, Tomáš Musil ... Byte Pair Encoding (BPE) enables open-vocabulary NMT by encoding rare and unknown words as sequences of subword units (see the sketch after this list).
- Language Models are Unsupervised Multitask Learners - GitHub
- Microsoft Releases DialoGPT AI Conversation Model | Anthony Alford - InfoQ - trained on over 147M dialogs
- minGPT | Andrej Karpathy - GitHub
- SambaNova Systems ... Dataflow-as-a-Service GPT
- Facebook-owner Meta opens access to AI large language model | Elizabeth Culliford - Reuters ... Facebook 175-billion-parameter language model - Open Pretrained Transformer (OPT-175B)
- Resource on Transformers | Lilian Weng - Lil'Log
- GPT | Papers With Code
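To make the Byte Pair Encoding idea above concrete, here is a minimal, illustrative sketch of the BPE merge loop in Python. It is not the Neural Monkey or GPT-2 implementation; the toy corpus and merge count are made up. The loop repeatedly merges the most frequent adjacent symbol pair, so rare words end up represented as sequences of learned subword units.

```python
from collections import Counter

def learn_bpe(words, num_merges):
    """Toy BPE: words maps word -> frequency; returns the list of learned merges."""
    # Represent each word as a tuple of characters plus an end-of-word marker.
    vocab = {tuple(w) + ("</w>",): f for w, f in words.items()}
    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Replace every occurrence of the best pair with the merged symbol.
        new_vocab = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges

# Example: "low", "lower", and "lowest" end up sharing the learned subword "low".
print(learn_bpe({"low": 5, "lower": 2, "lowest": 2, "newer": 3}, num_merges=5))
```

GPT-2's actual tokenizer applies the same principle at the byte level over a large corpus; this sketch only shows the mechanism.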
Try...
- Sushant Kumar's micro-site - Replace your 'word' in the following URL to see what GPT-3 generates: https://thoughts.sushant-kumar.com/word
- Serendipity ... an AI-powered recommendation engine for anything you want.
- Taglines.ai ... just about every business has a tagline — a short, catchy phrase designed to quickly communicate what it is that they do.
- Simplify.so ...simple, easy-to-understand explanations for everything
GPT Impact to Development
- Development ... AI Pair Programming Tools ... Analytics ... Visualization ... Diagrams for Business Analysis
- Will The Much-Hyped GPT-3 Impact The Coders? | Analytics India Magazine (https://analyticsindiamag.com/will-the-much-hyped-gpt-3-impact-the-coders/)
- With GPT-3, I built a layout generator where you just describe any layout you want, and it generates the JSX code for you. | Sharif Shameem - debuild
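The debuild demo above is essentially prompt-to-code: describe the layout in plain English, let a GPT model emit JSX, and drop the result into the application. A minimal sketch of that pattern, assuming the openai Python package and an OPENAI_API_KEY environment variable; the model name and prompts are illustrative, not the ones debuild used.

```python
from openai import OpenAI  # assumes: pip install openai, OPENAI_API_KEY set in the environment

client = OpenAI()

def describe_to_jsx(description: str) -> str:
    """Ask a GPT model to turn a plain-English layout description into JSX (illustrative only)."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name; any capable chat model works here
        messages=[
            {"role": "system",
             "content": "You generate a single self-contained React component in JSX. Reply with code only."},
            {"role": "user", "content": description},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(describe_to_jsx("a page with a large title, an email input, and a blue 'Subscribe' button"))
```

Production tools wrap the raw completion with validation and retries, since the model can return malformed or unsafe code.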
[Embedded video: "Generative Python Code with GPT" ... a quest to teach neural networks, via transformers, to write Python code. Project name: Generative Python Transformers!]
Custom GPTs
- Agents ... Robotic Process Automation ... Assistants ... Personal Companions ... Productivity ... Email ... Negotiation ... LangChain
Custom GPTs are personalized versions of AI models like ChatGPT that can be tailored for specific tasks or projects. They represent a significant advancement in AI implementation, allowing businesses and individuals to customize AI tools to meet unique challenges and operational needs.
OpenAI Platform
- OpenAI Platform: https://chat.openai.com/create

OpenAI allows Plus and Enterprise users to create custom GPTs that can browse the web, create images, and run code. Users can upload knowledge files, modify the GPT's appearance, and define its actions.
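Custom GPTs are configured through the ChatGPT web interface rather than through code, but OpenAI's Assistants API offers a roughly analogous programmatic flow. The sketch below assumes the openai Python package and an API key; the file name, assistant name, and model are placeholders, and the exact way files attach to tools can differ between SDK versions.

```python
from openai import OpenAI  # assumes: pip install openai, OPENAI_API_KEY set in the environment

client = OpenAI()

# Upload a knowledge file, the rough equivalent of the "Knowledge" upload in the GPT Builder.
knowledge = client.files.create(file=open("product_faq.pdf", "rb"), purpose="assistants")

# Create an assistant with instructions and a built-in tool, analogous to a custom GPT's
# instructions plus its Code Interpreter capability.
assistant = client.beta.assistants.create(
    name="Product Support Helper",   # placeholder name
    model="gpt-4o-mini",             # placeholder model
    instructions="Answer questions about the product using the attached FAQ; be concise.",
    tools=[{"type": "code_interpreter"}],
    # How files attach depends on the SDK/API version; recent versions use tool_resources.
    tool_resources={"code_interpreter": {"file_ids": [knowledge.id]}},
)
print(assistant.id)
```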
OpenAI GPT Store
- GPT Store: https://chatgpt.com/gpts

The OpenAI GPT Store lets ChatGPT Plus users create, share, and monetize their own custom chatbots, known as GPTs (Generative Pre-trained Transformers), expanding the capabilities and possibilities of AI assistants like ChatGPT. For developers it provides a way to distribute and monetize custom GPTs; for users it offers a wide range of AI tools and capabilities to explore.
OpenAI GPT Builder
With the GPT Builder, users can tailor GPTs for specific tasks or topics by combining instructions, knowledge, and capabilities. It enables users to build AI agents without the need for coding skills, making it accessible to a wide range of individuals, including educators, coaches, and anyone interested in building helpful tools.
To create a GPT using the GPT Builder, users can access the builder interface through the OpenAI platform at chat.openai.com/gpts/editor or by selecting "My GPTs" after logging in. The builder interface provides a split screen with a Create panel where users can enter prompts and instructions to build their chatbot, and a Preview panel that allows users to interact with the chatbot as they build it, making it easier to refine and customize the GPT.
The GPT Builder also offers features such as the ability to add images to the GPT, either by asking the builder to create an image or by uploading custom images. Additionally, GPTs can be granted access to web browsing, DALL-E (an image generation model), and OpenAI's Code Interpreter tool for writing and executing software. The builder interface also includes a Knowledge section where users can upload custom data to enhance the capabilities of their GPTs.
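Custom GPTs can also call external services through Actions, which are declared by pasting an OpenAPI 3 schema into the builder. The schema below is a hypothetical example: the server URL, path, and parameters are invented purely to show the shape of an Action definition.

```python
import json

# Hypothetical OpenAPI 3 schema for a single GPT Action; every name and URL here is made up.
weather_action_schema = {
    "openapi": "3.1.0",
    "info": {"title": "Weather lookup", "version": "1.0.0"},
    "servers": [{"url": "https://api.example.com"}],
    "paths": {
        "/weather": {
            "get": {
                "operationId": "getCurrentWeather",
                "summary": "Current weather for a city",
                "parameters": [{
                    "name": "city",
                    "in": "query",
                    "required": True,
                    "schema": {"type": "string"},
                }],
                "responses": {"200": {"description": "Current conditions"}},
            }
        }
    },
}

# The JSON form of this dict is what gets pasted into the Actions section of the GPT Builder.
print(json.dumps(weather_action_schema, indent=2))
```

When the GPT decides to call getCurrentWeather, ChatGPT issues the HTTP request described by the schema and feeds the response back into the conversation.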
Let's build GPT: from scratch, in code, spelled out - Andrej Karpathy
- Artificial Intelligence (AI) ... Generative AI ... Machine Learning (ML) ... Deep Learning ... Neural Network ... Reinforcement ... Learning Techniques
- Development ... Notebooks ... AI Pair Programming ... Codeless, Generators, Drag n' Drop ... AIOps/MLOps ... AIaaS/MLaaS
[Embedded video: "Let's build GPT: from scratch, in code, spelled out" by Andrej Karpathy]
Chapter timestamps from the lecture include:
- 00:34:53 training the bigram model
- 00:38:00 port our code to a script
Building the "self-attention":
- 00:42:13 version 1: averaging past context with for loops, the weakest form of aggregation
- 00:47:11 the trick in self-attention: matrix multiply as weighted aggregation
- 00:51:54 version 2: using matrix multiply
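The chapter list above builds toward one central trick from the lecture: causal self-attention is a weighted aggregation over earlier tokens, computed with a masked matrix multiply rather than explicit loops. Here is a short PyTorch sketch of a single attention head in that style, assuming the torch package (toy sizes and random data):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(1337)
B, T, C = 4, 8, 32           # batch, time (sequence length), channels
x = torch.randn(B, T, C)

# Single attention head with head size 16.
head_size = 16
key = torch.nn.Linear(C, head_size, bias=False)
query = torch.nn.Linear(C, head_size, bias=False)
value = torch.nn.Linear(C, head_size, bias=False)

k = key(x)                   # (B, T, head_size)
q = query(x)                 # (B, T, head_size)

# Affinities between tokens, scaled by 1/sqrt(head_size).
wei = q @ k.transpose(-2, -1) * head_size ** -0.5   # (B, T, T)

# Causal mask: each position may only attend to itself and earlier positions.
tril = torch.tril(torch.ones(T, T))
wei = wei.masked_fill(tril == 0, float("-inf"))
wei = F.softmax(wei, dim=-1)

# Weighted aggregation of the values: the "matrix multiply as weighted aggregation" trick.
out = wei @ value(x)         # (B, T, head_size)
print(out.shape)             # torch.Size([4, 8, 16])
```

In the full lecture this head is wrapped with multi-head attention, feed-forward blocks, residual connections, and layer norm to form the complete GPT model.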