Difference between revisions of "Hugging Face"

|title=PRIMO.ai
|titlemode=append
|keywords=ChatGPT, artificial, intelligence, machine, learning, GPT-4, GPT-5, NLP, NLG, NLC, NLU, models, data, singularity, moonshot, Sentience, AGI, Emergence, Explainable, TensorFlow, Google, Nvidia, Microsoft, Azure, Amazon, AWS, Hugging Face, OpenAI, Meta, LLM, metaverse, assistants, agents, digital twin, IoT, Transhumanism, Immersive Reality, Generative AI, Conversational AI, Perplexity, Bing, You, Bard, Ernie, Prompt Engineering, LangChain, Video/Image, Vision, End-to-End Speech, Synthesize Speech, Speech Recognition, Stanford, MIT
|description=Helpful resources for your journey with artificial intelligence; videos, articles, techniques, courses, profiles, and tools
 
<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-4GCWLBVJ7T"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());

  gtag('config', 'G-4GCWLBVJ7T');
</script>
 
}}
[https://www.youtube.com/results?search_query=ai+Hugging+Face YouTube]
[https://www.quora.com/search?q=ai%20Hugging%20Face...X ... Quora]
[https://www.google.com/search?q=ai+Hugging+Face ...Google search]
[https://news.google.com/search?q=ai+Hugging+Face ...Google News]
[https://www.bing.com/news/search?q=ai+Hugging+Face...X&qft=interval%3d%228%22 ...Bing News]

* [[Development]] ... [[Notebooks]] ... [[Development#AI Pair Programming Tools|AI Pair Programming]] ... [[Codeless Options, Code Generators, Drag n' Drop|Codeless]] ... [[Hugging Face]] ... [[Algorithm Administration#AIOps/MLOps|AIOps/MLOps]] ... [[Platforms: AI/Machine Learning as a Service (AIaaS/MLaaS)|AIaaS/MLaaS]]
* [https://huggingface.co/ Hugging Face] ... The AI community building the future
* [https://huggingface.co/models Models | Hugging Face] ... click on Sort: Trending
* [[Embedding]] ... [[Fine-tuning]] ... [[Retrieval-Augmented Generation (RAG)|RAG]] ... [[Agents#AI-Powered Search|Search]] ... [[Clustering]] ... [[Recommendation]] ... [[Anomaly Detection]] ... [[Classification]] ... [[Dimensional Reduction]].  [[...find outliers]]
* [[Platforms: AI/Machine Learning as a Service (AIaaS/MLaaS)]]
* [https://www.youtube.com/watch?v=00GKzGyWFEs&list=PLo2EIpI_JMQvWfQndUesu0nPBAtZ9gP1o Hugging Face course]
* [[Reinforcement Learning (RL) from Human Feedback (RLHF)]]
* [https://bytexd.com/what-is-hugging-face-beginners-guide What is Hugging Face - A Beginner's Guide | ByteXD] ... allows users to share machine learning models and datasets
Hugging Face is an American company that develops tools for building applications using machine learning. It is most notable for its Transformers library, built for natural language processing applications, and for its platform that allows users to share machine learning models and datasets. Hugging Face is a community and platform for artificial intelligence and data science that aims to democratize the knowledge and assets used in AI models. The platform allows users to build, train, and deploy state-of-the-art models powered by open-source machine learning, and it provides a place where a broad community of data scientists, researchers, and ML engineers can come together to share ideas, get support, and contribute to open-source projects. - [https://en.wikipedia.org/wiki/Hugging_Face Wikipedia]
 
Source: Conversation with Bing, 4/14/2023
 
 
 
<youtube>agwbNgxwkHc</youtube>

<youtube>QEaBAZQCtwE</youtube>
= <span id="Hugging Face's Research"></span>Hugging Face's Research =
<youtube>eqOSQeQNqaw</youtube>
= Hugging Face Community =
* [https://towardsdatascience.com/whats-hugging-face-122f4e7eb11a What's Hugging Face? An AI community for sharing ML models and datasets]
* [[Agents#HuggingGPT|HuggingGPT]] ... in partnership with [[Microsoft]]
* [https://www.topbots.com/pretrain-transformers-models-in-pytorch/ Pretrain Transformers Models in PyTorch Using Hugging Face Transformers | George Mihaila - TOPBOTS]
* [https://www.together.xyz/blog/openchatkit OpenChatKit | TogetherCompute] ... The first open-source [[ChatGPT]] alternative released; a 20B chat-GPT model under the Apache-2.0 license, available for free on Hugging Face.
** [https://laion.ai/ LAION]
** [https://huggingface.co/ontocord Ontocord]
** [[Wolfram]] ChatGPT
** [[FLAN-T5 LLM]]

Their platform is home to a large community of developers and researchers who work together to solve problems in audio, vision, and language with AI.

== <span id="Hugging Face's Open-source Library"></span>Hugging Face's Open-source Library ==
Hugging Face's open-source library, Transformers, is widely used for [[Natural Language Processing (NLP)]] tasks. The company also offers an Inference API that allows developers to serve their models directly from Hugging Face infrastructure and run large-scale [[Natural Language Processing (NLP)|NLP]] models in milliseconds with just a few lines of code. Hugging Face offers a wide range of machine learning models and datasets, as well as tools for building, training, and deploying state-of-the-art models.
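As a minimal sketch of the "few lines of code" workflow (assuming the <code>transformers</code> Python package is installed; the pipeline downloads a default sentiment-analysis checkpoint from the Hub on first use):

```python
# Minimal Transformers usage: load a ready-made NLP pipeline and classify text.
from transformers import pipeline

# Downloads a default sentiment-analysis checkpoint from the Hub on first use.
classifier = pipeline("sentiment-analysis")

print(classifier("Hugging Face makes sharing models easy."))
# → [{'label': 'POSITIVE', 'score': ...}]
```

The same <code>pipeline</code> entry point covers other tasks (translation, summarization, question answering) by changing the task name or passing an explicit <code>model=</code> checkpoint.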
  
= <span id="HuggingGPT"></span>HuggingGPT =
+
== <span id="Private Hub"></span>Private Hub ==
[https://www.youtube.com/results?search_query=ai+HuggingGPT YouTube]
+
* [https://huggingface.co/platform Private Hub]
[https://www.quora.com/search?q=ai%20HuggingGPT ... Quora]
+
** [https://huggingface.co/docs/hub/main Hugging Face Hub documentation]
[https://www.google.com/search?q=ai+HuggingGPT ...Google search]
 
[https://news.google.com/search?q=ai+HuggingGPT ...Google News]
 
[https://www.bing.com/news/search?q=ai+HuggingGPT&qft=interval%3d%228%22 ...Bing News]
 
  
 +
== <span id="LightGPT"></span>LightGPT ==
 +
* [https://huggingface.co/amazon/LightGPT amazon/LightGPT]
 +
* [https://huggingface.co/amazon/LightGPT/blob/main/README.md  README.md · amazon/LightGPT]
 +
* [https://huggingface.co/EleutherAI/gpt-j-6b EleutherAI/gpt-j-6b] 
 +
* [https://en.wikipedia.org/wiki/GPT-J  GPT-J | Wikipedia]
 +
* [https://huggingface.co/blog/gptj-sagemaker  Deploy GPT-J 6B for inference using Hugging Face Transformers]
 +
* [https://betterprogramming.pub/fine-tuning-gpt-j-6b-on-google-colab-or-equivalent-desktop-or-server-gpu-b6dc849cb205 Fine-tuning GPT-J 6B on Google Colab or Equivalent Desktop or Server]
  
LightGPT is a language model developed by AWS Contributors. It is based on GPT-J 6B and was instruction fine-tuned on the high-quality, Apache-2.0 licensed OIG-small-chip instruction dataset with ~200K training examples. The model is designed to generate text from a given instruction, and it can be deployed to [[Amazon]] [[SageMaker]].

GPT-J 6B is a transformer model trained using Ben Wang's Mesh Transformer JAX. "GPT-J" refers to the class of model, while "6B" represents the number of trainable parameters. The model consists of 28 layers with a model dimension of 4096 and a feedforward dimension of 16384. The model dimension is split into 16 heads, each with a dimension of 256. Rotary Position [[Embedding]] (RoPE) is applied to 64 dimensions of each head. The model is trained with a tokenization vocabulary of 50257, using the same set of BPEs as GPT-2/GPT-3. GPT-J learns an inner representation of the English language that can be used to extract features useful for downstream tasks; however, the model is best at what it was pretrained for, which is generating text from a prompt. GPT-J-6B is not intended for deployment without [[fine-tuning]], supervision, and/or moderation. It is not a product in itself and cannot be used for human-facing interactions; for example, the model may generate harmful or offensive text. Evaluate the risks associated with your particular use case.
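The architecture figures above can be sanity-checked with a back-of-the-envelope parameter count. This sketch counts only the attention and feed-forward projection matrices plus the token embedding, omitting biases, layer norms, and GPT-J's separate output head, so it lands a little under the nominal "6B":

```python
# Rough GPT-J-6B parameter count from the architecture figures above.
d_model, n_layers, d_ff, vocab = 4096, 28, 16384, 50257

attn = 4 * d_model * d_model   # Q, K, V, and output projections
ff = 2 * d_model * d_ff        # up- and down-projections of the feed-forward block
embed = vocab * d_model        # token embedding matrix

total = n_layers * (attn + ff) + embed
print(f"~{total / 1e9:.2f}B parameters")  # → ~5.84B parameters
```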
  
== <span id="Whisper"></span>Whisper ==
<youtube>8xYYvO7LGBw</youtube>
 
 
 
 
<img src="https://miro.medium.com/v2/resize:fit:828/format:webp/1*iS4cHbQAXv-43Pi1EJxK8A.png" width="1000">
 
 
 
<youtube>PfY9lVtM_H0</youtube>
 
<youtube>3_5FRLYS-2A</youtube>
 
<youtube>Rog9oHtVmjM</youtube>
 
 
 
= Hugging Face NLP Library - Open Parallel Corpus (OPUS) =
 
* [https://venturebeat.com/2020/05/16/hugging-face-dives-into-machine-translation-with-release-of-1000-models/ Hugging Face dives into machine translation with release of 1,000 models | Khari Johnson - VentureBeat]
 
* [https://towardsdatascience.com/build-your-own-machine-translation-service-with-transformers-d0709df0791b Build Your Own Machine Translation Service with Transformers Using the latest Helsinki NLP models available in the Transformers library to create a standardized machine translation service | Kyle Gallatin - Towards Data Science]
 
 
 
 
OPUS is a growing collection of translated texts from the web. In the OPUS project we try to convert and align free online data, to add linguistic annotation, and to provide the community with a publicly available parallel corpus. OPUS is a project undertaken by the University of Helsinki and global partners to gather and open-source a wide variety of language data sets.
 
OPUS is based on open source products and the corpus is also delivered as an open content package. We used several tools to compile the current collection. All pre-processing is done automatically. No manual corrections have been carried out. The OPUS collection is growing!
 
[https://opus.nlpl.eu/ ... OPUS the open parallel corpus]
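As an illustrative sketch (assuming the <code>transformers</code> package is installed), the Helsinki NLP models trained on OPUS data can be loaded as a translation pipeline; the checkpoint is fetched from the Hugging Face Hub on first use:

```python
# Translate English to French with a Helsinki-NLP OPUS-MT checkpoint.
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-fr")
result = translator("Machine translation is becoming accessible to everyone.")
print(result[0]["translation_text"])
```

Other language pairs follow the same <code>Helsinki-NLP/opus-mt-&lt;src&gt;-&lt;tgt&gt;</code> naming scheme on the Hub.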
 
 
 
<youtube>G3pOvrKkFuk</youtube>
 
<youtube>G5lmya6eKtc</youtube>
 

Latest revision as of 20:15, 26 April 2024
