Difference between revisions of "GPT-5"
Revision as of 17:55, 26 May 2023
YouTube ... Quora ... Google search ... Google News ... Bing News
- Large Language Model (LLM) ... Natural Language Processing (NLP) ... Generation ... Classification ... Understanding ... Translation ... Tools & Services
- Multimodal Language Models ... GPT-4 ... GPT-5
- Attention Mechanism ... Transformer ... Generative Pre-trained Transformer (GPT) ... GAN ... BERT
- Case Studies
- Assistants ... Agents ... Negotiation ... LangChain
- Generative AI ... Conversational AI ... OpenAI's ChatGPT ... Perplexity ... Microsoft's Bing ... You ... Google's Bard ... Baidu's Ernie
- Sequence to Sequence (Seq2Seq)
- Recurrent Neural Network (RNN)
- Long Short-Term Memory (LSTM)
- ELMo
- Bidirectional Encoder Representations from Transformers (BERT) ... a better model, but with less investment behind it than the larger OpenAI organization
- OpenAI Blog | OpenAI
- Text Transfer Learning
- Video/Image
- SynthPub
- Language Models are Unsupervised Multitask Learners | Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever
- Neural Monkey | Jindřich Libovický, Jindřich Helcl, Tomáš Musil ... Byte Pair Encoding (BPE) enables open-vocabulary neural machine translation (NMT) by encoding rare and unknown words as sequences of subword units.
- Language Models are Unsupervised Multitask Learners - GitHub
- Microsoft Releases DialoGPT AI Conversation Model | Anthony Alford - InfoQ - trained on over 147M dialogs
- minGPT | Andrej Karpathy - GitHub
- SambaNova Systems ... Dataflow-as-a-Service GPT
- Facebook-owner Meta opens access to AI large language model | Elizabeth Culliford - Reuters ... Facebook 175-billion-parameter language model - Open Pretrained Transformer (OPT-175B)
- Resource on Transformers | Lilian Weng - Lil'Log
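The Neural Monkey entry above mentions Byte Pair Encoding (BPE), which builds a subword vocabulary by repeatedly merging the most frequent adjacent symbol pair. A minimal sketch of that merge-learning step follows; the toy word-frequency corpus and the function name are invented here for illustration, not taken from Neural Monkey:

```python
from collections import Counter

def learn_bpe_merges(word_freqs, num_merges):
    """Learn BPE merge rules from a {word: frequency} dict."""
    # Start with each word as a tuple of single characters.
    vocab = {tuple(word): freq for word, freq in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent pair
        merges.append(best)
        # Replace every occurrence of the best pair with the merged symbol.
        new_vocab = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges

merges = learn_bpe_merges({"lower": 5, "lowest": 2, "newer": 6, "wider": 3}, 4)
print(merges)  # first learned merge is ('e', 'r'), the most frequent pair
```

At inference time, applying the learned merges in the same order to a new word segments it into the same subword units, which is how rare and unknown words stay representable.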
Generative Pre-trained Transformer 5 (GPT-5)
- Singularity ... Sentience ... AGI ... Curious Reasoning ... Emergence ... Moonshots ... Explainable AI ... Automated Learning
- GPT-5: Everything we know about the next major ChatGPT AI upgrade | Chris Smith - BGR