Lifelong Learning

 
[http://www.google.com/search?q=lifelong+learning+Catastrophic+Forgetting+Multi+model ...Google search]
 
* [[Learning Techniques]]
** [[Reinforcement Learning (RL)]]
** [[Transfer Learning]]
 
* [http://www.darpa.mil/program/lifelong-learning-machines Lifelong Learning Machines (L2M) | DARPA]
 
 
* [http://www.darpa.mil/news-events/2019-03-12 Progress on Lifelong Learning Machines Shows Potential for Bio-Inspired Algorithms | USC & DARPA]
 
 
* [http://www.darpa.mil/news-events/2018-05-03 Researchers Selected to Develop Novel Approaches to Lifelong Machine Learning | DARPA]
 
In recent years, researchers have developed deep neural networks that can perform a variety of tasks, including visual recognition and natural language processing (NLP). Although many of these models have achieved remarkable results, they typically perform well only on one particular task due to what is referred to as "catastrophic forgetting." Essentially, catastrophic forgetting means that when a model initially trained on task A is later trained on task B, its performance on task A declines significantly. [http://techxplore.com/news/2019-03-approach-multi-model-deep-neural-networks.html A new approach to overcome multi-model forgetting in deep neural networks] and [https://techxplore.com/news/2019-03-memory-approach-enable-lifelong.html A generative memory approach to enable lifelong reinforcement learning] | Ingrid Fadelli
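The effect can be reproduced even with a toy model. The sketch below is a minimal illustration, not the method from the linked articles: two hypothetical regression tasks with conflicting target weights are learned in sequence by plain gradient descent, and the error on task A climbs back up after training on task B.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two toy regression tasks with different (hypothetical) target weights,
# so their optimal solutions conflict with each other.
X_a = rng.normal(size=(200, 5))
X_b = rng.normal(size=(200, 5))
w_a_true = rng.normal(size=5)
w_b_true = rng.normal(size=5)
y_a = X_a @ w_a_true
y_b = X_b @ w_b_true

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

def train(w, X, y, steps=500, lr=0.05):
    # Plain gradient descent on mean squared error.
    for _ in range(steps):
        w = w - lr * 2.0 * X.T @ (X @ w - y) / len(y)
    return w

w = train(np.zeros(5), X_a, y_a)   # train on task A
loss_a_before = mse(w, X_a, y_a)   # low: task A is learned
w = train(w, X_b, y_b)             # then train on task B only
loss_a_after = mse(w, X_a, y_a)    # task A error climbs back up

print(f"task A loss before B: {loss_a_before:.4f}, after B: {loss_a_after:.4f}")
```

Because nothing in the task-B objective rewards remembering task A, the weights drift all the way to task B's optimum and the earlier skill is overwritten.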
 

Revision as of 10:20, 23 February 2020




Forgetting

In the quest to build AI that goes beyond today's single-purpose machines, scientists are developing new tools to help AI remember the right things — and forget the rest. Saving AI from catastrophic forgetting | Kaveh Waddell - Axios
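One simple family of such tools is rehearsal (experience replay): keep a sample of old-task data and mix it into training on the new task. The sketch below is an illustrative assumption, not any specific published method; it reuses the toy linear-regression setup to show that replaying task-A data while learning task B preserves much more of task A than sequential training alone.

```python
import numpy as np

rng = np.random.default_rng(1)

# Same toy setup: two regression tasks with conflicting target weights.
X_a = rng.normal(size=(200, 5))
X_b = rng.normal(size=(200, 5))
y_a = X_a @ rng.normal(size=5)
y_b = X_b @ rng.normal(size=5)

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

def train(w, X, y, steps=500, lr=0.05):
    for _ in range(steps):
        w = w - lr * 2.0 * X.T @ (X @ w - y) / len(y)
    return w

w = train(np.zeros(5), X_a, y_a)          # learn task A first

# Baseline: fine-tune on task B alone (forgets task A).
w_seq = train(w.copy(), X_b, y_b)

# Rehearsal: train on task B plus replayed task-A data.
X_mix = np.vstack([X_b, X_a])
y_mix = np.concatenate([y_b, y_a])
w_reh = train(w.copy(), X_mix, y_mix)

print(f"task A loss, sequential: {mse(w_seq, X_a, y_a):.4f}, "
      f"with rehearsal: {mse(w_reh, X_a, y_a):.4f}")
```

With a single shared weight vector and directly conflicting tasks, rehearsal converges to a compromise between the two optima, so some task-B accuracy is traded for retaining task A; real lifelong-learning systems aim for better trade-offs than this toy can show.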

Watching AI Slowly Forget a Human Face Is Incredibly Creepy