|
|
| (67 intermediate revisions by the same user not shown) |
| Line 2: |
Line 2: |
| | |title=PRIMO.ai | | |title=PRIMO.ai |
| | |titlemode=append | | |titlemode=append |
| − | |keywords=artificial, intelligence, machine, learning, models, algorithms, data, singularity, moonshot, TensorFlow, Facebook, Google, Nvidia, Microsoft, Azure, Amazon, AWS | + | |keywords=ChatGPT, artificial, intelligence, machine, learning, NLP, NLG, NLC, NLU, models, data, singularity, moonshot, Sentience, AGI, Emergence, Explainable, TensorFlow, Google, Nvidia, Microsoft, Azure, Amazon, AWS, Hugging Face, OpenAI, Meta, LLM, metaverse, assistants, agents, digital twin, IoT, Transhumanism, Immersive Reality, Generative AI, Conversational AI, Perplexity, Bing, You, Bard, Ernie, Prompt Engineering, LangChain, Video/Image, Vision, End-to-End Speech, Synthesize Speech, Speech Recognition, Stanford, MIT |description=Helpful resources for your journey with artificial intelligence; videos, articles, techniques, courses, profiles, and tools |
| − | |description=Helpful resources for your journey with artificial intelligence; Attention, GPT, chat, videos, articles, techniques, courses, profiles, and tools | + | |
| | + | <!-- Google tag (gtag.js) --> |
| | + | <script async src="https://www.googletagmanager.com/gtag/js?id=G-4GCWLBVJ7T"></script> |
| | + | <script> |
| | + | window.dataLayer = window.dataLayer || []; |
| | + | function gtag(){dataLayer.push(arguments);} |
| | + | gtag('js', new Date()); |
| | + | |
| | + | gtag('config', 'G-4GCWLBVJ7T'); |
| | + | </script> |
| | }} | | }} |
| − | [https://www.youtube.com/results?search_query=Generative+Pre+trained+Transformer+GPT+AI YouTube] | + | [https://www.youtube.com/results?search_query=Generative+Pre+trained+Transformer+GPT YouTube] |
| − | [https://www.quora.com/search?q=Generative%20Pre%20trained%20Transformer%20GPT%20AI ... Quora] | + | [https://www.quora.com/search?q=Generative%20Pre%20trained%20Transformer%20GPT ... Quora] |
| − | [https://www.google.com/search?q=Generative+Pre+trained+Transformer+GPT+AI ...Google search] | + | [https://www.google.com/search?q=Generative+Pre+trained+Transformer+GPT ...Google search] |
| − | [https://news.google.com/search?q=Generative+Pre+trained+Transformer+GPT+AI ...Google News] | + | [https://news.google.com/search?q=Generative+Pre+trained+Transformer+GPT ...Google News] |
| − | [https://www.bing.com/news/search?q=Generative+Pre+trained+Transformer+GPT+AI&qft=interval%3d%228%22 ...Bing News] | + | [https://www.bing.com/news/search?q=Generative+Pre+trained+Transformer+GPT&qft=interval%3d%228%22 ...Bing News] |
| | | | |
| − | | + | * [[Large Language Model (LLM)]] ... [[Large Language Model (LLM)#Multimodal|Multimodal]] ... [[Foundation Models (FM)]] ... [[Generative Pre-trained Transformer (GPT)|Generative Pre-trained]] ... [[Transformer]] ... [[Attention]] ... [[Generative Adversarial Network (GAN)|GAN]] ... [[Bidirectional Encoder Representations from Transformers (BERT)|BERT]] |
| − | * [[Case Studies]] | + | * [[Conversational AI]] ... [[ChatGPT]] | [[OpenAI]] ... [[Bing/Copilot]] | [[Microsoft]] ... [[Gemini]] | [[Google]] ... [[Claude]] | [[Anthropic]] ... [[Perplexity]] ... [[You]] ... [[phind]] ... [[Ernie]] | [[Baidu]] |
| − | ** [[Writing/Publishing]] | + | * [[Natural Language Processing (NLP)]] ... [[Natural Language Generation (NLG)|Generation (NLG)]] ... [[Natural Language Classification (NLC)|Classification (NLC)]] ... [[Natural Language Processing (NLP)#Natural Language Understanding (NLU)|Understanding (NLU)]] ... [[Language Translation|Translation]] ... [[Summarization]] ... [[Sentiment Analysis|Sentiment]] ... [[Natural Language Tools & Services|Tools]] |
| − | * [[Natural Language Processing (NLP)]] ...[[Natural Language Generation (NLG)|Generation]] ...[[Large Language Model (LLM)|LLM]] ...[[Natural Language Tools & Services|Tools & Services]] | + | * [[Agents]] ... [[Robotic Process Automation (RPA)|Robotic Process Automation]] ... [[Assistants]] ... [[Personal Companions]] ... [[Personal Productivity|Productivity]] ... [[Email]] ... [[Negotiation]] ... [[LangChain]] |
| − | * [[Assistants]] ... [[Hybrid Assistants]] ... [[Agents]] ... [[Negotiation]] ... [[Hugging_Face#HuggingGPT|HuggingGPT]] ... [[LangChain]]
| + | * [[What is Artificial Intelligence (AI)? | Artificial Intelligence (AI)]] ... [[Generative AI]] ... [[Machine Learning (ML)]] ... [[Deep Learning]] ... [[Neural Network]] ... [[Reinforcement Learning (RL)|Reinforcement]] ... [[Learning Techniques]] |
| − | * [[Attention]] Mechanism ...[[Transformer]] Model ...[[Generative Pre-trained Transformer (GPT)]]
| |
| − | * [[Generative AI]] ... [[OpenAI]]'s [[ChatGPT]] ... [[Perplexity]] ... [[Microsoft]]'s [[Bing]] ... [[You]] ...[[Google]]'s [[Bard]] ... [[Baidu]]'s [[Ernie]]
| |
| | * [[Sequence to Sequence (Seq2Seq)]] | | * [[Sequence to Sequence (Seq2Seq)]] |
| | * [[Recurrent Neural Network (RNN)]] | | * [[Recurrent Neural Network (RNN)]] |
| Line 25: |
Line 32: |
| | * [https://openai.com/blog/gpt-2-6-month-follow-up/ OpenAI Blog] | [[OpenAI]] | | * [https://openai.com/blog/gpt-2-6-month-follow-up/ OpenAI Blog] | [[OpenAI]] |
| | * [[Text Transfer Learning]] | | * [[Text Transfer Learning]] |
| − | * [[Video/Image]] | + | * [[Video/Image]] ... [[Vision]] ... [[Enhancement]] ... [[Fake]] ... [[Reconstruction]] ... [[Colorize]] ... [[Occlusions]] ... [[Predict image]] ... [[Image/Video Transfer Learning]] |
| − | * [[SynthPub]]
| + | * [[Writing/Publishing#SynthPub|Writing/Publishing - SynthPub]]
| | * [https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf Language Models are Unsupervised Multitask Learners | Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever] | | * [https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf Language Models are Unsupervised Multitask Learners | Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever] |
| | * [https://neural-monkey.readthedocs.io/en/latest/machine_translation.html Neural Monkey | Jindřich Libovický, Jindřich Helcl, Tomáš Musil] Byte Pair Encoding (BPE) enables NMT model translation on open-vocabulary by encoding rare and unknown words as sequences of subword units. | | * [https://neural-monkey.readthedocs.io/en/latest/machine_translation.html Neural Monkey | Jindřich Libovický, Jindřich Helcl, Tomáš Musil] Byte Pair Encoding (BPE) enables NMT model translation on open-vocabulary by encoding rare and unknown words as sequences of subword units. |
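The Byte Pair Encoding idea above can be sketched in a few lines of plain Python. This is an illustrative toy learner, not the Neural Monkey implementation:

```python
from collections import Counter

def learn_bpe(words, num_merges):
    """Learn BPE merges from a {word: frequency} dict.
    Words are represented as tuples of symbols (initially characters)."""
    vocab = {tuple(w): f for w, f in words.items()}
    merges = []
    for _ in range(num_merges):
        # Count all adjacent symbol pairs, weighted by word frequency
        pairs = Counter()
        for word, freq in vocab.items():
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Replace every occurrence of the best pair with the merged symbol
        new_vocab = {}
        for word, freq in vocab.items():
            out, i = [], 0
            while i < len(word):
                if i < len(word) - 1 and (word[i], word[i + 1]) == best:
                    out.append(word[i] + word[i + 1])
                    i += 2
                else:
                    out.append(word[i])
                    i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges, vocab

merges, vocab = learn_bpe({"lower": 2, "lowest": 1, "newer": 3}, num_merges=3)
print(merges)  # first merge is ('w', 'e'), the most frequent pair
```

Rare or unseen words then decompose into these learned subword units. GPT-2 and later models apply the same merge idea at the byte level with tens of thousands of merges.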
| Line 39: |
Line 46: |
| | * [https://paperswithcode.com/method/gpt GPT | Papers With Code] | | * [https://paperswithcode.com/method/gpt GPT | Papers With Code] |
| | | | |
| − |
| |
| − | = <span id="Generative Pre-trained Transformer 5 (GPT-5)"></span>Generative Pre-trained Transformer 5 (GPT-5) =
| |
| − | [https://www.youtube.com/results?search_query=Generative+Pre+trained+Transformer+GPT-5+AI YouTube]
| |
| − | [https://www.quora.com/search?q=Generative%20Pre%20trained%20Transformer%20GPT-5%20AI ... Quora]
| |
| − | [https://www.google.com/search?q=Generative+Pre+trained+Transformer+GPT-5+AI ...Google search]
| |
| − | [https://news.google.com/search?q=Generative+Pre+trained+Transformer+GPT-5+AI ...Google News]
| |
| − | [https://www.bing.com/news/search?q=Generative+Pre+trained+Transformer+GPT-5+AI&qft=interval%3d%228%22 ...Bing News]
| |
| − |
| |
| − | * [[Singularity]] ... [[Moonshots]] ... [[Emergence]] ... [[Explainable / Interpretable AI]] ... [[Artificial General Intelligence (AGI)| AGI]] ... [[Inside Out - Curious Optimistic Reasoning]] ... [[Algorithm Administration#Automated Learning|Automated Learning]]
| |
| − | * [https://bgr.com/tech/chatgpt-gpt-5-everything-we-know-about-the-next-major-ai-upgrade/ GPT-5: Everything we know about the next major ChatGPT AI upgrade | Chris Smith - BGR]
| |
| − |
| |
| − |
| |
| − | <youtube>CcnPatOYIgo</youtube>
| |
| − | <youtube>bS88NVwzeig</youtube>
| |
| − | <youtube>c4aR_smQgxY</youtube>
| |
| − | <youtube>LBsy9U0Xwlw</youtube>
| |
| − |
| |
| − | = GPT4All =
| |
| − | [https://www.youtube.com/results?search_query=Generative+Pre+trained+Transformer+GPT4All+AI YouTube]
| |
| − | [https://www.quora.com/search?q=Generative%20Pre%20trained%20Transformer%20GPT4All%20AI ... Quora]
| |
| − | [https://www.google.com/search?q=Generative+Pre+trained+Transformer+GPT4All+AI ...Google search]
| |
| − | [https://news.google.com/search?q=Generative+Pre+trained+Transformer+GPT4All+AI ...Google News]
| |
| − | [https://www.bing.com/news/search?q=Generative+Pre+trained+Transformer+GPT4All+AI&qft=interval%3d%228%22 ...Bing News]
| |
| − |
| |
| − | * [https://github.com/nomic-ai/gpt4all Github | GPT4All]
| |
| − | * [https://atlas.nomic.ai/map/gpt4all_data_clean Dataset viewer | NOMIC.ai]
| |
| − | * [https://s3.amazonaws.com/static.nomic.ai/gpt4all/2023_GPT4All_Technical_Report.pdf Tech report: GPT4All: Training an Assistant-style Chatbot with Large Scale Data Distillation from GPT-3.5-Turbo | Y. Anand, Z. Nussbaum, B. Duderstadt, B. Schmidt, & A. Mulyar - NOMIC.ai]
| |
| − |
| |
| − |
| |
| − | A chatbot trained on a massive collection of clean assistant data, including code, stories, and dialogue. The project provides a demo, data, and code to train an assistant-style large language model on ~800k GPT-3.5-Turbo generations, based on LLaMA.
| |
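The "~800k GPT-3.5-Turbo generations" above are distilled prompt/response pairs. A minimal sketch of how such assistant-style pairs might be flattened into training text; the `### Prompt:`/`### Response:` template here is an illustrative assumption, not GPT4All's exact format:

```python
def format_example(prompt, response):
    """Flatten one distilled prompt/response pair into a single
    training string (illustrative template, not GPT4All's actual one)."""
    return (
        "### Prompt:\n" + prompt.strip() + "\n"
        "### Response:\n" + response.strip() + "\n"
    )

# Each record mimics one GPT-3.5-Turbo generation kept after data cleaning
distilled = [
    {"prompt": "Explain BPE in one sentence.",
     "response": "BPE merges frequent symbol pairs into subword units."},
]
corpus = "".join(format_example(d["prompt"], d["response"]) for d in distilled)
print(corpus)
```

The base LLaMA model is then fine-tuned on text like this so it learns to continue any `### Response:` section in an assistant style.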
| − |
| |
| − |
| |
| − | {|<!-- T -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>DDfUoQWnrfM</youtube>
| |
| − | <b>GPT4ALL: Install '[[ChatGPT]]' Locally (weights & fine-tuning!) - Tutorial
| |
| − | </b><br>Matthew Berman - In this video, I walk you through installing the newly released GPT4ALL large language model on your local computer. This model is brought to you by the fine people at Nomic AI, furthering the open-source LLM mission. GPT4ALL is trained using the same technique as Alpaca, which is an assistant-style large language model with ~800k GPT-3.5-Turbo Generations based on LLaMa. IMO, it works even better than Alpaca and is super fast. This is basically like having ChatGPT on your local computer. Easy install. Nomic AI was also kind enough to include the weights in addition to the quantized model.
| |
| − | |}
| |
| − | |<!-- M -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>GhRNIuTA2Z0</youtube>
| |
| − | <b>Is GPT4All your new personal ChatGPT?
| |
| − | </b><br>In this video we look at the GPT4ALL model, an interesting (though not for commercial use) project that takes a LLaMA model and fine-tunes it on many more instruction tasks than Alpaca.
| |
| − |
| |
| − | * [https://colab.research.google.com/drive/1NWZN15plz8rxrk-9OcxNwwIk1V1MfBsJ?usp=sharing Colab | Sam Witteveen]
| |
| − | |}
| |
| − | |}<!-- B -->
| |
| − |
| |
| − | = <span id="Generative Pre-trained Transformer 4 (GPT-4)"></span>Generative Pre-trained Transformer 4 (GPT-4) =
| |
| − | [https://www.youtube.com/results?search_query=Generative+Pre+trained+Transformer+GPT-4+AI YouTube]
| |
| − | [https://www.quora.com/search?q=Generative%20Pre%20trained%20Transformer%20GPT-4%20AI ... Quora]
| |
| − | [https://www.google.com/search?q=Generative+Pre+trained+Transformer+GPT-4+AI ...Google search]
| |
| − | [https://news.google.com/search?q=Generative+Pre+trained+Transformer+GPT-4+AI ...Google News]
| |
| − | [https://www.bing.com/news/search?q=Generative+Pre+trained+Transformer+GPT-4+AI&qft=interval%3d%228%22 ...Bing News]
| |
| − |
| |
| − |
| |
| − | * [https://openai.com/product/gpt-4 GPT-4 |] [[OpenAI]]
| |
| − | * [https://openai.com/research/gpt-4 Research Paper |] [[OpenAI]]
| |
| − | * [https://uxplanet.org/gpt-4-facts-rumors-and-expectations-about-next-gen-ai-model-52a4ddcd662a GPT-4: Facts, Rumors and Expectations about next-gen AI model | Nick Babich - Medium]
| |
| − | * [https://www.aljazeera.com/news/2023/3/15/how-do-ai-models-like-gpt-4-work-and-how-can-you-start-using-it How does GPT-4 work and how can you start using it in ChatGPT? | Mohammed Haddad - Aljazeera] ... Launched on March 14, 2023, GPT-4 is the successor to GPT-3 and is the technology behind the viral [[Assistants#Chatbot | Chatbot]] [[ChatGPT]].
| |
| − | * [https://www.gsmarena.com/openai_unveils_gpt4_with_new_capabilities_microsofts_bing_is_already_using_it-news-57911.php OpenAI unveils GPT-4 with new capabilities, Microsoft's Bing is already using it]
| |
| − | * [https://openai.com/customer-stories/stripe Stripe | ] [[OpenAI]] Customer Stories ... 15 of the prototypes were considered strong candidates to be integrated into the platform, including support customization, answering questions about support, and fraud detection
| |
| − | * [https://openai.com/customer-stories/morgan-stanley Morgan Stanley |] [[OpenAI]] Customer Stories ... access, process and synthesize content almost instantaneously
| |
| − | * [https://theaiexchange.beehiiv.com/p/gpt-4-cheatsheet The 411 on GPT-4 | The AI Exchange]
| |
| − | * [https://dataconomy.com/2023/03/how-to-use-gpt-4-features-use-cases-example/ OpenAI released GPT-4, the highly anticipated successor to ChatGPT | Eray Eliaçık - Dataconomy]
| |
| − |
| |
| − |
| |
| − |
| |
| − | <hr>
| |
| − |
| |
| − | GPT-4, known as Prometheus, can be used on:
| |
| − | * Microsoft Edge: [[Bing|Microsoft Bing Chat]]
| |
| − | * Chrome Extension: [https://chrome.google.com/webstore/detail/usechatgptai-copilot-on-c/mhnlakgilnojmhinhkckjpncpbhabphi UseChatGPT.AI: Copilot on Chrome (GPT-4 ✓)]
| |
| − | * Android: [https://play.google.com/store/apps/details?id=com.codecandy.aiassistant&hl=en AI Assistant Widget Chat GPT-4 on Google Play Store]
| |
| − |
| |
| − | <hr>
| |
| − |
| |
| − |
| |
| − | One of ChatGPT-4’s most dazzling new features is the ability to handle not only words, but pictures too, in what is being called “multimodal” technology. A user will have the ability to submit a picture alongside text — both of which ChatGPT-4 will be able to process and discuss. The ability to input video is also on the horizon. - [https://time.com/6263022/what-to-know-about-chatgpt-4/ Everything You Need to Know About ChatGPT-4 | Alex Millson - Bloomberg, Time]
| |
| − |
| |
| − | <youtube>W2Ed6w9s5XQ</youtube>
| |
| − | <youtube>qbIk7-JPB2c</youtube>
| |
| − |
| |
| − |
| |
| − | == <span id="Autonomous GPT"></span>Autonomous GPT ==
| |
| − | [https://www.youtube.com/results?search_query=ai+Autonomous+GPT YouTube]
| |
| − | [https://www.quora.com/search?q=AI%20Autonomous%20GPT ... Quora]
| |
| − | [https://www.google.com/search?q=ai+Autonomous+GPT ...Google search]
| |
| − | [https://news.google.com/search?q=ai+Autonomous+GPT ...Google News]
| |
| − | [https://www.bing.com/news/search?q=ai+Autonomous+GPT&qft=interval%3d%228%22 ...Bing News]
| |
| − |
| |
| − | * [[Assistants]] ... [[Hybrid Assistants]] ... [[Agents]] ... [[Negotiation]] ... [[Hugging_Face#HuggingGPT|HuggingGPT]] ... [[LangChain]]
| |
| − | * [https://www.vice.com/en/article/epvdme/developers-are-connecting-multiple-ai-agents-to-make-more-autonomous-ai Developers Are Connecting Multiple AI Agents to Make More ‘Autonomous’ AI | Chloe Xiang - Vice] ... Auto-GPT
| |
| − | * [https://github.com/Torantulino/Auto-GPT Auto-GPT | Toran Bruce Richards] ... driven by GPT-4, chains together LLM "thoughts" to autonomously achieve whatever goal you set, prompting itself to complete each objective.
| |
| − | * [https://yoheinakajima.com/task-driven-autonomous-agent-utilizing-gpt-4-pinecone-and-langchain-for-diverse-applications/ Task-driven Autonomous Agent Utilizing GPT-4], [[AI-Powered Search#Pinecone|Pinecone]], and [[LangChain]] for Diverse Applications | Yohei Nakajima
| |
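The self-prompting loop these autonomous agents run can be sketched with a stubbed model. Everything below (the `fake_llm` stand-in, the `FINISH` convention, the stopping rule) is an illustrative assumption, not Auto-GPT's actual internals:

```python
def fake_llm(goal, history):
    """Stand-in for a real LLM call: proposes the next step
    toward the goal, or FINISH once enough steps have run."""
    if len(history) >= 3:
        return "FINISH"
    return f"step {len(history) + 1} toward: {goal}"

def agent_loop(goal, llm, max_iters=10):
    """Repeatedly prompt the model with the goal plus its own
    prior 'thoughts' until it declares the goal complete."""
    history = []
    for _ in range(max_iters):
        thought = llm(goal, history)
        if thought == "FINISH":
            break
        history.append(thought)  # the next prompt sees this thought
    return history

steps = agent_loop("summarize today's AI news", fake_llm)
print(steps)
```

Real systems add tool use (web search, file I/O) and a memory store such as [[AI-Powered Search#Pinecone|Pinecone]] between iterations, but the chain-of-thoughts loop is the core.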
| − |
| |
| − |
| |
| − | <youtube>7MeHry2pglw</youtube>
| |
| − | <youtube>6NoTuqDAkfg</youtube>
| |
| − |
| |
| − | == <span id="Auto-GPT"></span>Auto-GPT ==
| |
| − | [https://www.youtube.com/results?search_query=ai+AutoGPT YouTube]
| |
| − | [https://www.quora.com/search?q=AI%20AutoGPT ... Quora]
| |
| − | [https://www.google.com/search?q=ai+AutoGPT ...Google search]
| |
| − | [https://news.google.com/search?q=ai+AutoGPT ...Google News]
| |
| − | [https://www.bing.com/news/search?q=ai+AutoGPT&qft=interval%3d%228%22 ...Bing News]
| |
| − |
| |
| − |
| |
| − | * [https://www.zdnet.com/article/what-is-auto-gpt-everything-to-know-about-the-next-powerful-ai-tool/ What is Auto-GPT? Everything to know about the next powerful AI tool | Sabrina Ortiz - ZDnet] ... Auto-GPT can do a lot of things ChatGPT can't do.
| |
| − |
| |
| − | <youtube>wzwAFRaKsB8</youtube>
| |
| − | <youtube>jn8n212l3PQ</youtube>
| |
| − | <youtube>s4w5mGQVxjU</youtube>
| |
| − | <youtube>0m0AbdoFLq4</youtube>
| |
| − |
| |
| − | == <span id="BabyAGI"></span>BabyAGI ==
| |
| − | [https://www.youtube.com/results?search_query=ai+BabyAGI YouTube]
| |
| − | [https://www.quora.com/search?q=AI%20BabyAGI ... Quora]
| |
| − | [https://www.google.com/search?q=ai+BabyAGI ...Google search]
| |
| − | [https://news.google.com/search?q=ai+BabyAGI ...Google News]
| |
| − | [https://www.bing.com/news/search?q=ai+BabyAGI&qft=interval%3d%228%22 ...Bing News]
| |
| − |
| |
| − | * [https://www.fastcompany.com/90880294/auto-gpt-and-babyagi-how-autonomous-agents-are-bringing-generative-ai-to-the-masses Auto-GPT and BabyAGI: How ‘autonomous agents’ are bringing generative AI to the masses | Mark Sullivan - Fast Company] ... Autonomous agents may mark an important step toward a world where AI-driven systems are smart enough to work on their own, without need of human involvement.
| |
| − |
| |
| − |
| |
| − | <youtube>pAtguEz7CBs</youtube>
| |
| − | <youtube>wnM1FBxzTlc</youtube>
| |
| − |
| |
| − | = Generative Pre-trained Transformer 3 (GPT-3 & GPT 3.5) =
| |
| − | [https://www.youtube.com/results?search_query=Generative+Pre+trained+Transformer+GPT-3+GPT-3.5+AI YouTube]
| |
| − | [https://www.quora.com/search?q=Generative%20Pre%20trained%20Transformer%20GPT-3%20GPT-3.5%20AI ... Quora]
| |
| − | [https://www.google.com/search?q=Generative+Pre+trained+Transformer+GPT-3+GPT-3.5+AI ...Google search]
| |
| − | [https://news.google.com/search?q=Generative+Pre+trained+Transformer+GPT-3+GPT-3.5+AI ...Google News]
| |
| − | [https://www.bing.com/news/search?q=Generative+Pre+trained+Transformer+GPT-3+GPT-3.5+AI&qft=interval%3d%228%22 ...Bing News]
| |
| − |
| |
| − | * [https://arxiv.org/abs/2005.14165 Language Models are Few-Shot Learners | T. Brown, B. Mann, N. Ryder, M. Subbiah, J. Kaplan, P. Dhariwal, A. Neelakantan, P. Shyam, G. Sastry, A. Askell, S. Agarwal, A. Herbert-Voss, G. Krueger, T. Henighan, R. Child, A. Ramesh, D. Ziegler, J. Wu, C. Winter, C. Hesse, M. Chen, E. Sigler, M. Litwin, S. Gray, B. Chess, J. Clark, C. Berner, S. McCandlish, A. Radford, I. Sutskever, and D. Amodei - arXiv.org]
| |
| − | * [https://towardsdatascience.com/gpt-3-demos-use-cases-implications-77f86e540dc1 GPT-3: Demos, Use-cases, Implications | Simon O'Regan - Towards Data Science]
| |
| − | * [https://openai.com/blog/openai-api/ OpenAI API] ...today the API runs models with weights from the GPT-3 family with many speed and throughput improvements.
| |
| − | * [https://medium.com/@praveengovi.analytics/gpt-3-by-openai-outlook-and-examples-f234f9c62c41 GPT-3 by OpenAI – Outlook and Examples | Praveen Govindaraj | Medium]
| |
| − | * [https://www.gwern.net/GPT-3 GPT-3 Creative Fiction | R. Gwern]
| |
| − | * [https://arxiv.org/abs/2005.14165 GPT-3: Brown et al., 2020]
| |
| − | * [https://www.linkedin.com/pulse/qlik-chatgpt-api-how-start-jacek-harazin/ Qlik and GPT-3 integration - how to start? | Jacek - Qlik]
| |
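"Few-shot" in the paper's title means the model is conditioned on worked examples placed directly in the prompt, with no weight updates. A sketch of that prompt construction; the Q:/A: layout is a common pattern, not the paper's exact format:

```python
def few_shot_prompt(task, examples, query):
    """Build a few-shot prompt: task description, K worked
    examples, then the new query for the model to complete."""
    lines = [task, ""]
    for x, y in examples:
        lines.append(f"Q: {x}")
        lines.append(f"A: {y}")
    lines.append(f"Q: {query}")
    lines.append("A:")  # the model continues from here
    return "\n".join(lines)

prompt = few_shot_prompt(
    "Translate English to French.",
    [("cheese", "fromage"), ("sea", "mer")],
    "book",
)
print(prompt)
```

The paper's finding is that, at GPT-3's scale, conditioning on a handful of such examples often rivals models fine-tuned for the task.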
| | | | |
| | | | |
| Line 201: |
Line 62: |
| | | | |
| | | | |
| − | {|<!-- T -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>CW5xgCxXwdY</youtube>
| |
| − | <b>Chat GPT 4 Was Just ANNOUNCED (Open AI GPT 4)
| |
| − | </b><br>Get ready for the next generation of AI language technology with GPT-4! In this video, we'll be discussing what to expect from OpenAI's latest language model, including advancements in natural language processing, conversational AI, and language generation.
| |
| | | | |
| − | We'll also be looking at how GPT-4 is set to revolutionize industries such as customer service, content creation, and more. Stay tuned for an exciting look into the future of AI language technology with GPT-4!
| |
| − | |}
| |
| − | |<!-- M -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>8jODDWgqQmc</youtube>
| |
| − | <b>What is Chat GPT 4 (Open AI) ? Parameters: GPT 4 vs GPT 3
| |
| − | </b><br>Softreviewed
| |
| − | |}
| |
| − | |}<!-- B -->
| |
| | {|<!-- T --> | | {|<!-- T --> |
| | | valign="top" | | | | valign="top" | |
| Line 237: |
Line 80: |
| | |} | | |} |
| | |}<!-- B --> | | |}<!-- B --> |
| − | {|<!-- T -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>lQnLwUfwgyA</youtube>
| |
| − | <b>This text generation AI is INSANE (GPT-3)
| |
| − | </b><br>An overview of the GPT-3 machine learning model, why everyone should understand it, and why some (including its creator, OpenAI) think it's dangerous.
| |
| − | |}
| |
| − | |<!-- M -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>qqbqW4aVvHo</youtube>
| |
| − | <b>GPT-3 Demo Installation -Generative pretrained Transformer model (Third generation of [[OpenAI]])
| |
| − | </b><br>[[Python]] code.
| |
| − | |}
| |
| − | |}<!-- B -->
| |
| − | {|<!-- T -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>SY5PvZrJhLE</youtube>
| |
| − | <b>How Artificial Intelligence Changed the Future of Publishing | [[OpenAI]] GPT-3 and the Future of Books
| |
| − | </b><br>Go from content chaos to clear, compelling writing that influences people to act without them realizing it: https://bit.ly/thebestwaytosayit As Ed Leon Klinger shows in his GPT 3 demo and GPT 3 examples thread
| |
| − | |}
| |
| − | |<!-- M -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>pXOlc5CBKT8</youtube>
| |
| − | <b>GPT-3: Language Models are Few-Shot Learners (Paper Explained)
| |
| − | </b><br>How far can you go with ONLY language modeling? Can a large enough language model perform [[Natural Language Processing (NLP)]] task out of the box? [[OpenAI]] take on these and other questions by training a transformer that is an order of magnitude larger than anything that has ever been built before and the results are astounding.
| |
| − | |}
| |
| − | |}<!-- B -->
| |
| − | {|<!-- T -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>_8yVOC4ciXc</youtube>
| |
| − | <b>GPT3: An Even Bigger Language Model - Computerphile
| |
| − | </b><br>Basic mathematics from a language model? Rob Miles on GPT3, where it seems like size does matter! More from Rob Miles: https://bit.ly/Rob_Miles_YouTube This video was filmed and edited by Sean Riley. Computer Science at the University of Nottingham: https://bit.ly/nottscomputer
| |
| − | |}
| |
| − | |<!-- M -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>OznMk5Jexu8</youtube>
| |
| − | <b>GPT-3 from [[OpenAI]] is here and it's a MONSTER!
| |
| − | </b><br>GPT-3 is the largest language model to date with 175 billion parameters. It is able to do various [[Natural Language Processing (NLP)]] tasks (translation, question answering) without additional finetuning.
| |
| − | |}
| |
| − | |}<!-- B -->
| |
| − | {|<!-- T -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>kpiY_LemaTc</youtube>
| |
| − | <b>GPT-3 vs Human Brain
| |
| − | </b><br>GPT-3 has 175 billion parameters/synapses. Human brain has 100 trillion synapses. How much will it cost to train a language model the size of the human brain? REFERENCES:
| |
| − |
| |
| − | [1] [https://arxiv.org/abs/2005.14165 GPT-3 paper: Language Models are Few-Shot Learners]
| |
| − |
| |
| − | [2] [https://lambdalabs.com/blog/demystifying-gpt-3/ OpenAI's GPT-3 Language Model: A Technical Overview]
| |
| − |
| |
| − | [3] [https://arxiv.org/abs/2005.04305 Measuring the Algorithmic Efficiency of Neural Networks]
| |
| − | |}
| |
| − | |<!-- M -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>0ZVOmBp29E0</youtube>
| |
| − | <b>Steve Omohundro on GPT-3
| |
| − | </b><br>In this research meeting, guest Stephen Omohundro gave a fascinating talk on GPT-3, the new massive [[OpenAI]] Natural Language Processing model. He reviewed the network architecture, training process, and results in the context of past work. There was extensive discussion on the implications for [[Natural Language Processing (NLP)]] and for Machine Intelligence / AGI.
| |
| − | |}
| |
| − | |}<!-- B -->
| |
| − | {|<!-- T -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>8psgEDhT1MM</youtube>
| |
| − | <b>GPT 3 Demo and Explanation - An AI revolution from [[OpenAI]]
| |
| − | </b><br>GPT 3 can write poetry, translate text, chat convincingly, and answer abstract questions. It's being used to code, design and much more. I'll give you a demo of some of the latest in this technology and some of how it works. GPT3 comes from a company called [[OpenAI]]. [[OpenAI]] was founded by Elon Musk and Sam Altman (former president of Y Combinator, the startup accelerator), with over a billion dollars invested, to collaborate and create human-level AI for the benefit of society. GPT 3 has been developed for a number of years. One of the early papers published was on Generative Pre-Training. The idea behind generative pre-training (GPT) is that while most AIs are trained on labeled data, there's a ton of data that isn't labeled. If you can evaluate the words and use them to train and tune the AI, it can start to create predictions of future text on the unlabeled data. You repeat the process until predictions start to converge. The newest GPT is able to do a ton. Some of the demos include:
* GPT 3 demo of how to design a user interface using AI
* GPT 3 demo of how to code a React application using AI
* GPT 3 demo of an Excel plug-in to fill data using AI
* GPT 3 demo of a search engine/answer engine using AI
* GPT 3 demo of command line auto-complete from English to shell commands
| |
| − | |}
| |
| − | |<!-- M -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>aDFLp4A1EmY</youtube>
| |
| − | <b>Panel discussion - GPT-3 and Artificial General Intelligence 27 Aug 2020
| |
| − | </b><br>Is GPT-3 a step towards creating artificial general intelligence? Chair: Associate Professor Kate Devitt - Chief Scientist, Trusted Autonomous Systems
| |
| − |
| |
| − | Panel:
| |
| − | • Professor David Chalmers (NYU)
| |
| − | • Professor Susan Schneider (NASA and Florida Atlantic University)
| |
| − | • Professor Marcus Hutter (ANU)
| |
| − |
| |
| − | A philosophical discussion on the [[development]] of artificial intelligence and specifically advances in Generative Pre-trained Transformer-3 (GPT-3). GPT-3 is an auto-complete algorithm created by OpenAI as part of their endeavour to develop artificial general intelligence. GPT-3 is the third in a series of autocomplete tools designed by OpenAI. (GPT stands for “generative pre-trained transformer.”). GPT-3 is fed on an unimaginatively large corpus of human knowledge including all of Wikipedia, millions of books, websites and other materials including philosophy texts. In fact, any type of information uploaded to the internet is possible food for GPT-3's artificial mind to dwell on. The result? Eerily coherent, complex and interesting thoughts about almost any topic. The sophisticated, nuanced text produced by GPT-3 seems to pass the Turing Test for many--including philosophers. Some of GPT-3's answers are shedding new light on enduring philosophical questions. Is GPT-3 the beginnings of an artificial general intelligence. Does it create ideas like a human mind, or even better than a human mind? Is human cognition similarly some sort of autocomplete program in our brains? Is it possible that GPT-3 one day becomes consciousness or is it already conscious?--How could we tell. If an AI passes our tests for consciousness, do we then have an obligation to accord it rights? If so, what sorts of rights might it deserve. Independently of rights, how should humans manage an AI that has access to everything that is posited and known and can trick humans into believing that another rational [[Agents|agent]] is communicating with them? The panel considers what GPT-3 tell us about the ambition to build an artificial general intelligence, consciousness, human thought and how we should treat AI in an increasingly digital and disembodied world rife with mis- and disinformation.
| |
| − | |}
| |
| − | |}<!-- B -->
| |
| − |
| |
| − |
| |
| − |
| |
| | | | |
| | == <span id="GPT Impact to Development"></span>GPT Impact to Development == | | == <span id="GPT Impact to Development"></span>GPT Impact to Development == |
| Line 396: |
Line 138: |
| | |}<!-- B --> | | |}<!-- B --> |
| | | | |
| − | = Generative Pre-trained Transformer 2 (GPT-2) = | + | = <span id="Custom GPTs"></span>Custom GPTs = |
| − | * GitHub | + | * [[Agents]] ... [[Robotic Process Automation (RPA)|Robotic Process Automation]] ... [[Assistants]] ... [[Personal Companions]] ... [[Personal Productivity|Productivity]] ... [[Email]] ... [[Negotiation]] ... [[LangChain]] |
| − | ** [https://github.com/openai/gpt-2/blob/master/README.md (117M parameter) version of GPT-2]
| |
| − | ** [https://github.com/openai/gpt-2 openai/gpt-2 GPT-2]
| |
| − | * [https://analyticsindiamag.com/how-to-get-started-with-openais-gpt-2-for-text-generation/ How to Get Started with OpenAI's GPT-2 for Text Generation | Amal Nair - Analytics India Magazine]
| |
| − | * [https://aiweirdness.com/post/182824715257/gpt-2-it-learned-on-the-internet GPT-2: It learned on the Internet | Janelle Shane]
| |
| − | * [https://towardsdatascience.com/too-powerful-nlp-model-generative-pre-training-2-4cc6afb6655 Too powerful NLP model (GPT-2): What is Generative Pre-Training | Edward Ma]
| |
| − | * [https://medium.com/@ajitrajasekharan/gpt-2-a-promising-but-nascent-transfer-learning-method-that-could-reduce-or-even-eliminate-in-some-48ea3370cc21 GPT-2 A nascent transfer learning method that could eliminate supervised learning some NLP tasks | Ajit Rajasekharan - Medium]
| |
| − | * [https://insights.dice.com/2019/02/19/openai-platform-generating-fake-news-wonderful OpenAI Creates Platform for Generating Fake News. Wonderful | Nick Kolakowski - Dice]
| |
| − | * [https://inferkit.com/ InferKit | Adam D King] ... completes your text.
| |
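The core of generative pre-training is that unlabeled text supplies its own labels: the next token. A bigram-count toy shows the objective in miniature (nothing like GPT-2's transformer, but the same predict-the-next-word idea):

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count next-word frequencies: the 'label' for each word is
    simply the word that follows it in unlabeled text."""
    model = defaultdict(Counter)
    words = text.split()
    for cur, nxt in zip(words, words[1:]):
        model[cur][nxt] += 1
    return model

def complete(model, word, n=3):
    """Greedy autocomplete: repeatedly pick the most frequent
    continuation, a (very) crude stand-in for GPT decoding."""
    out = [word]
    for _ in range(n):
        if word not in model:
            break
        word = model[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

model = train_bigram("the cat sat on the mat the cat ran")
print(complete(model, "the"))  # → "the cat sat on"
```

GPT-2 replaces the counting table with a transformer over subword tokens and greedy lookup with sampling, but the training signal, predicting the next token, is the same.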
| | | | |
| − | == Coding Train Late Night 2 ==
| + | Custom GPTs are personalized versions of AI models like [[ChatGPT]] that can be tailored for specific tasks or projects. They represent a significant advancement in AI implementation, allowing businesses and individuals to customize AI tools to meet unique challenges and operational needs. |
| | | | |
| − | {|<!-- T -->
| + | == <span id="OpenAI Platform"></span>OpenAI Platform == |
| − | | valign="top" |
| + | * [https://chat.openai.com/create OpenAI Platform] |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>0LZUSkwCYfU</youtube>
| |
| − | <b>Coding Train Late Night 2: Fetch, GPT-2 and RunwayML
| |
| − | </b><br>The Coding Train
| |
| − | 0:00 Live Stream Starts
| |
| − | 3:51 Introduction With Dad Jokes
| |
| − | 11:29 Coding Late At Night Projects and Notes
| |
| − | 16:48 Scraping Dad Jokes With Fetch
| |
| − | 50:10 Training a Model With Runway
| |
| − | 57:52 Small Break
| |
| − | 1:00:15 Controlling Hue Lights
| |
| − | 1:20:00 Dad Joke Model
| |
| − | 1:32:27 Skip: Audio Glitch (LOUD)
| |
| − | 1:35:00 Dad Joke Model
| |
| − | 1:49:25 Dad Joke Generator
| |
| − | 1:54:25 Goodbyes and End of Stream
| |
| | | | |
| − | Website: https://thecodingtrain.com/
| + | [[OpenAI]] allows Plus and Enterprise users to create custom GPTs that can browse the web, create images, and run code. Users can upload knowledge files, modify the GPT's appearance, and define its actions. |
| − | |}
| |
| − | |<!-- M -->
| |
| − | | valign="top" |
| |
| − | {| class="wikitable" style="width: 550px;"
| |
| − | ||
| |
| − | <youtube>kWsDL-6D-nk</youtube>
| |
| − | <b>Coding Train Late Night 3: GPT-2, Hue Lights, Discord Bot
| |
| − | </b><br>The Coding Train
| |
| − | 0:00 Live Stream Starts
| |
| − | 3:50 Introduction
| |
| − | 9:50 AI Joke Generator
| |
| − | 13:30 Live Stream Notes
| |
| − | 19:50 Generative Text Training with GPT-2
| |
| − | 29:40 Dad Joke Model Training
| |
| − | 1:11:27 Using Hue Lights API
| |
| − | 1:31:50 More Dad Joke Generator
| |
| − | 1:37:33 Discord Bot
| |
| − | 2:15:04 Goodbyes and End of Stream
| |
| | | | |
| − | Website: https://thecodingtrain.com/
| + | === <span id="OpenAI GPT Store"></span>OpenAI GPT Store === |
| − | |}
| + | * [https://chatgpt.com/gpts GPT Store] |
| − | |}<!-- B -->
| |
| | | | |
| − | === r/SubSimulator ===
| + | The [[OpenAI]] GPT Store is a platform where [[ChatGPT]] Plus users can create, share, and monetize their own custom chatbots, known as GPTs (Generative Pre-trained Transformers). It also offers a catalog of community-built GPTs to explore, extending what AI assistants like [[ChatGPT]] can do. |
| | | | |
| − | Subreddit populated entirely by AI personifications of other subreddits -- all posts and comments are generated automatically using:
| + | <youtube>2wYcJEcKVPk</youtube> |
| | + | <youtube>amjnJrfByS0</youtube> |
| | + | <youtube>VudB3E9tSbc</youtube> |
| | + | <youtube>SVA-OBl44m4</youtube> |
| | | | |
| − | * [https://www.reddit.com/r/SubredditSimulator/ Markov Chain Model]
| + | === <span id="OpenAI GPT Builder"></span>OpenAI GPT Builder === |
| − | * [https://www.reddit.com/r/SubSimulatorGPT2/ GPT-2 Language]
| |
| | | | |
| − | results in coherent and realistic simulated content.
| + | With the GPT Builder, users can tailor GPTs for specific tasks or topics by combining instructions, knowledge, and capabilities. It enables users to build AI agents without the need for coding skills, making it accessible to a wide range of individuals, including educators, coaches, and anyone interested in building helpful tools. |
| | | | |
| − | === GetBadNews ===
| + | To create a GPT using the GPT Builder, users can access the builder interface through the [[OpenAI]] platform at chat.openai.com/gpts/editor or by selecting "My GPTs" after logging in. The builder interface provides a split screen with a Create panel where users can enter prompts and instructions to build their chatbot, and a Preview panel that allows users to interact with the chatbot as they build it, making it easier to refine and customize the GPT. |
| | | | |
| − | * [https://getbadnews.com Get Bad News] game - Can you beat my score? Play the fake news game! Drop all pretense of [[ethics]] and choose the path that builds your persona as an unscrupulous media magnate. Your task is to get as many followers as you can while
| + | The GPT Builder also offers features such as adding images to the GPT, either by asking the builder to create an image or by uploading custom images. Additionally, GPTs can be granted access to web browsing, [[Video/Image#DALL-E | DALL-E]] (an image generation model), and [[OpenAI]]'s Code Interpreter tool for writing and executing software. The builder interface also includes a Knowledge section where users can upload custom data to enhance the capabilities of their GPTs. |
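Custom GPTs themselves are configured through this web UI rather than through code, but [[OpenAI]]'s Assistants API is a rough programmatic analog: a name, instructions, and a list of tools such as Code Interpreter. The sketch below (Python standard library only) assembles such a request without sending it; the endpoint, beta header, model name, and the example GPT's name and instructions are assumptions for illustration, not part of the GPT Builder itself.

```python
import json
import os
import urllib.request


def build_assistant_payload(name: str, instructions: str) -> dict:
    """Mirror the GPT Builder's Create panel: name, instructions, tools."""
    return {
        "model": "gpt-4-turbo",  # assumed model name; any chat model works
        "name": name,
        "instructions": instructions,
        "tools": [{"type": "code_interpreter"}],  # analog of the Code Interpreter toggle
    }


def create_assistant(payload: dict) -> dict:
    """POST the payload to the Assistants endpoint (needs OPENAI_API_KEY set)."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/assistants",  # assumed endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": "Bearer " + os.environ["OPENAI_API_KEY"],
            "Content-Type": "application/json",
            "OpenAI-Beta": "assistants=v2",  # beta header at time of writing
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Assemble (but do not send) a request for an illustrative custom GPT.
payload = build_assistant_payload(
    "Recipe Helper", "Suggest recipes from a list of ingredients.")
```

Calling `create_assistant(payload)` would then create the assistant under your API key; the split between building the payload and sending it keeps the configuration step testable offline.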
| | | | |
| − | <img src="https://www.getbadnews.com/wp-content/uploads/2018/02/share-score.png" width="500" height="250"> | + | <youtube>f2uPl2MlV24</youtube> |
| | + | <youtube>SjJsXyBTPUc</youtube> |
| | | | |
| − | = Let's build GPT: from scratch, in code, spelled out | Andrej Karpathy = | + | = <span id="Let's build GPT: from scratch, in code, spelled out - Andrej Karpathy"></span>Let's build GPT: from scratch, in code, spelled out - Andrej Karpathy = |
| | + | * [[What is Artificial Intelligence (AI)? | Artificial Intelligence (AI)]] ... [[Generative AI]] ... [[Machine Learning (ML)]] ... [[Deep Learning]] ... [[Neural Network]] ... [[Reinforcement Learning (RL)|Reinforcement]] ... [[Learning Techniques]] |
| | + | * [[Development]] ... [[Notebooks]] ... [[Development#AI Pair Programming Tools|AI Pair Programming]] ... [[Codeless Options, Code Generators, Drag n' Drop|Codeless, Generators, Drag n' Drop]] ... [[Algorithm Administration#AIOps/MLOps|AIOps/MLOps]] ... [[Platforms: AI/Machine Learning as a Service (AIaaS/MLaaS)|AIaaS/MLaaS]] |
| | | | |
| | {|<!-- T --> | | {|<!-- T --> |
| Line 507: |
Line 210: |
| | * 00:34:53 training the bigram model | | * 00:34:53 training the bigram model |
| | * 00:38:00 port our code to a script Building the "self-attention" | | * 00:38:00 port our code to a script Building the "self-attention" |
| − | * 00:42:13 version 1: averaging past context with for loops, the weakest form of aggregation | + | * 00:42:13 version 1: averaging past [[context]] with for loops, the weakest form of aggregation |
| | * 00:47:11 the trick in self-attention: matrix multiply as weighted aggregation | | * 00:47:11 the trick in self-attention: matrix multiply as weighted aggregation |
| | * 00:51:54 version 2: using matrix multiply | | * 00:51:54 version 2: using matrix multiply |
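The "version 1" and "version 2" chapters above hinge on one identity: averaging all past context at each position with a loop is the same computation as multiplying by a row-normalized lower-triangular matrix. A minimal NumPy sketch of that equivalence (the video itself works in PyTorch; the shapes here are toy values):

```python
import numpy as np

rng = np.random.default_rng(0)
T, C = 4, 2  # time steps, channels (toy sizes)
x = rng.normal(size=(T, C))

# Version 1: for each position t, average x[0..t] with an explicit loop.
xbow_loop = np.zeros((T, C))
for t in range(T):
    xbow_loop[t] = x[: t + 1].mean(axis=0)

# Version 2: the same aggregation as one matrix multiply.
# A lower-triangular matrix of ones, normalized so each row sums to 1,
# means "average everything up to and including me".
wei = np.tril(np.ones((T, T)))
wei = wei / wei.sum(axis=1, keepdims=True)
xbow_mm = wei @ x

assert np.allclose(xbow_loop, xbow_mm)
```

The matrix form is what makes the trick fast on hardware, and replacing the uniform weights with data-dependent ones is the step toward full self-attention covered later in the video.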