Transfer Learning
 
 
[http://www.youtube.com/results?search_query=Transfer+Learning+machine+neural+network YouTube search...]

[http://www.google.com/search?q=Transfer+Learning+deep+machine+learning+ML ...Google search]
  
* [[In-Context Learning (ICL)]] ... [[Context]] ... [[Causation vs. Correlation]] ... [[Autocorrelation]] ... [[Out-of-Distribution (OOD) Generalization]] ... [[Transfer Learning]]
* [[Video/Image]] ... [[Vision]] ... [[Enhancement]] ... [[Fake]] ... [[Reconstruction]] ... [[Colorize]] ... [[Occlusions]] ... [[Predict image]] ... [[Image/Video Transfer Learning]]
* [[End-to-End Speech]] ... [[Synthesize Speech]] ... [[Speech Recognition]] ... [[Music]]
* [[Learning Techniques]]
** [[Text Transfer Learning]]
** [[Image/Video Transfer Learning]]
** [[Style Transfer]] (Video/Image)
** [[Imitation Learning (IL)]]
** [[Apprenticeship Learning - Inverse Reinforcement Learning (IRL)]]
* [[Transfer Learning With Keras]]
* [[Artificial General Intelligence (AGI) to Singularity]] ... [[Inside Out - Curious Optimistic Reasoning| Curious Reasoning]] ... [[Emergence]] ... [[Moonshots]] ... [[Explainable / Interpretable AI|Explainable AI]] ... [[Algorithm Administration#Automated Learning|Automated Learning]]
* [[What is Artificial Intelligence (AI)? | Artificial Intelligence (AI)]] ... [[Generative AI]] ... [[Machine Learning (ML)]] ... [[Deep Learning]] ... [[Neural Network]] ... [[Reinforcement Learning (RL)|Reinforcement]] ... [[Learning Techniques]]
* [[Conversational AI]] ... [[ChatGPT]] | [[OpenAI]] ... [[Bing/Copilot]] | [[Microsoft]] ... [[Gemini]] | [[Google]] ... [[Claude]] | [[Anthropic]] ... [[Perplexity]] ... [[You]] ... [[phind]] ... [[Ernie]] | [[Baidu]]
* [[Large Language Model (LLM)]] ... [[Natural Language Processing (NLP)]] ... [[Natural Language Generation (NLG)|Generation]] ... [[Natural Language Classification (NLC)|Classification]] ... [[Natural Language Processing (NLP)#Natural Language Understanding (NLU)|Understanding]] ... [[Language Translation|Translation]] ... [[Natural Language Tools & Services|Tools & Services]]
* [[FLAN-T5 LLM]] ... “Text-To-Text Transfer [[Transformer]]” ... FLAN stands for “Fine-tuned LAnguage Net”
* [http://hal.archives-ouvertes.fr/hal-01575126/document A Review of Transfer Learning Algorithms | Mohsen Kaboli]
* [http://venturebeat.com/2019/10/22/deepmind-transfers-cube-stacking-skills-from-simulation-to-physical-robot/ DeepMind transfers cube-stacking skills from simulation to physical robot | Kyle Wiggers - VentureBeat]
  
 
Who is the predator? Who is the prey? Have you ever seen a live dinosaur? Never having seen a live dinosaur, yet still knowing the predator/prey answers, is 'transfer' learning.

Transfer learning (aka Knowledge Transfer, Learning to Learn) is a machine learning technique where a model trained on one task is re-purposed for a second, related task. Transfer learning is an optimization that allows rapid progress or improved performance when modeling the second task. A Gentle Introduction to Transfer Learning for Deep Learning | Jason Brownlee - Machine Learning Mastery

Transfer learning is a type of learning where a model is first trained on one task, then some or all of the model is used as the starting point for a related task. It is a useful approach for problems where there is a task related to the main task of interest, and that related task has a large amount of data. It differs from multi-task learning in that the tasks are learned sequentially in transfer learning, whereas multi-task learning seeks good performance on all considered tasks with a single model trained in parallel. An example is image classification, where a predictive model, such as an artificial neural network, can be trained on a large corpus of general images, and the weights of the model can be used as a starting point when training on a smaller, more specific dataset, such as dogs and cats. The features already learned by the model on the broader task, such as extracting lines and patterns, will be helpful on the new, related task. As noted, transfer learning is particularly useful with models that are incrementally trained, where an existing model can be used as a starting point for continued training, such as deep learning networks. [http://machinelearningmastery.com/types-of-learning-in-machine-learning/ 14 Different Types of Learning in Machine Learning | Jason Brownlee - Machine Learning Mastery]
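The dogs-and-cats example above maps directly onto a few lines of Keras code (see also [[Transfer Learning With Keras]]). The sketch below is a minimal illustration rather than the page's canonical recipe: the MobileNetV2 base, the 160x160 input size, and the <code>train_ds</code> dataset name are assumptions made for the example.

<pre>
# Minimal transfer-learning sketch with Keras (assumes TensorFlow 2.x and a
# `train_ds` tf.data.Dataset of 160x160 RGB images with binary dog/cat labels).
import tensorflow as tf

# Start from a model pre-trained on a large, general image corpus (ImageNet),
# dropping its original classification head.
base = tf.keras.applications.MobileNetV2(input_shape=(160, 160, 3),
                                         include_top=False,
                                         weights="imagenet")
base.trainable = False  # freeze the already-learned features (edges, textures, patterns)

# Add a small new head for the specific task and train only that part.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary output: dog vs. cat
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, epochs=5)  # continue training on the smaller, specific dataset
</pre>

Once the new head has converged, a common follow-on step is to unfreeze some of the top layers of the base and continue training with a much lower learning rate.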
 
 
  
  
 
<youtube>FQM13HkEfBk</youtube>

<youtube>KHobCqYS4C4</youtube>
 
= [[FLAN-T5 LLM]] =
 
<b>T5</b> stands for “Text-To-Text Transfer [[Transformer]]”. It is a model developed by [[Google]] Research that converts every language problem into a text-to-text format. T5 is an [[Autoencoder (AE) / Encoder-Decoder|encoder-decoder model]] pre-trained on a multi-task mixture of [[unsupervised]] and [[supervised]] tasks, each of which is converted into a text-to-text format. It was presented in a Google paper called "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer". T5 works well on a variety of tasks out-of-the-box by prepending a different prefix to the input corresponding to each task. [[Large Language Model (LLM)|Large Language Models (LLM)]] are AI models that have been trained on large amounts of text data and can generate human-like text. - [https://huggingface.co/docs/transformers/model_doc/t5 T5] | [[Hugging Face]]
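As a rough illustration of the text-to-text interface, the sketch below selects a task simply by prepending a prefix to the input, as described above. It assumes the Hugging Face transformers and sentencepiece packages are installed; the "t5-small" checkpoint and the translation prefix follow the T5 documentation linked above.

<pre>
# T5 text-to-text sketch via Hugging Face transformers (assumes
# `pip install transformers sentencepiece`; weights download on first use).
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# The task is chosen by the prefix; swapping it (e.g. "summarize: ...")
# switches tasks without changing the model.
text = "translate English to German: The house is wonderful."
input_ids = tokenizer(text, return_tensors="pt").input_ids
outputs = model.generate(input_ids, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
</pre>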
 
 
<b>FLAN-T5</b> is an improved version of T5 with some architectural tweaks. FLAN stands for “Fine-tuned LAnguage Net”.  It was developed by [[Google]] Research and is pre-trained on C4 only without mixing in the [[supervised]] tasks. FLAN-T5 is designed to be highly customizable, allowing developers to fine-tune it to meet their specific needs. This means that developers can adjust the model’s parameters and architecture to better fit the data and task at hand. This can result in improved performance and accuracy on specific tasks. For example, a developer could fine-tune FLAN-T5 on a specific dataset to improve its performance on a particular language translation task. This flexibility makes FLAN-T5 a powerful tool for [[Natural Language Processing (NLP)]] tasks.
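A comparable hedged sketch for FLAN-T5, using the same Hugging Face setup: the "google/flan-t5-small" checkpoint and the prompt are illustrative choices, and this inference snippet is the usual starting point before any task-specific fine-tuning of the kind described above.

<pre>
# FLAN-T5 instruction-following sketch (assumes `transformers` and `sentencepiece`).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-small")

# FLAN-T5 responds to plain natural-language instructions rather than fixed task prefixes.
prompt = "Translate to German: The house is wonderful."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
</pre>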
 
 
<b>FLAN-T5-XXL</b> is the 11-billion-parameter variant of FLAN-T5.
 
 
 
<img src="https://s3.amazonaws.com/moonup/production/uploads/1666363435475-62441d1d9fdefb55a0b7d12c.png" width="1000">
 
 
 
<youtube>oCU97mnl494</youtube>
 
<youtube>SHMsdAPo2Ls</youtube>
 
<youtube>_Qf_SiCLzw4</youtube>
 
<youtube>jgKj-7v2UYU</youtube>
 
