Lifelong Learning

 
|description=Helpful resources for your journey with artificial intelligence; videos, articles, techniques, courses, profiles, and tools
}}

[http://www.youtube.com/results?search_query=Multi-model+Catastrophic+Forgetting YouTube search...]
[http://www.google.com/search?q=Multi-model+Catastrophic+Forgetting ...Google search]
  
* [[Auto Keras]]
* [[Automated Machine Learning (AML) - AutoML]]
 
In recent years, researchers have developed deep neural networks that can perform a variety of tasks, including visual recognition and natural language processing (NLP). Although many of these models achieve remarkable results, each typically performs well on only one particular task because of a phenomenon known as "catastrophic forgetting": when a model that was initially trained on task A is later trained on task B, its performance on task A declines significantly. [http://techxplore.com/news/2019-03-approach-multi-model-deep-neural-networks.html A new approach to overcome multi-model forgetting in deep neural networks | Ingrid Fadelli]
 
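The failure mode is easy to reproduce. Below is a minimal sketch, assuming PyTorch and two made-up synthetic tasks (the tasks, names, and hyperparameters are illustrative, not taken from the article above): a small network is trained on task A, then fine-tuned on task B, after which its accuracy on task A falls back toward chance.

<syntaxhighlight lang="python">
# Minimal sketch of catastrophic forgetting (assumes PyTorch is installed;
# the two synthetic tasks below are illustrative, not from the cited article).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Two binary classification tasks over the same 2-D inputs:
# task A labels points by the sign of the first coordinate,
# task B by the sign of the second, so their decision rules conflict.
def make_task(rule, n=1000):
    X = torch.randn(n, 2)
    return X, rule(X).long()

Xa, ya = make_task(lambda X: X[:, 0] > 0)  # task A
Xb, yb = make_task(lambda X: X[:, 1] > 0)  # task B

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))
loss_fn = nn.CrossEntropyLoss()

def train(X, y, epochs=200):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(X), y).backward()
        opt.step()

def accuracy(X, y):
    with torch.no_grad():
        return (model(X).argmax(dim=1) == y).float().mean().item()

train(Xa, ya)
print(f"task A after training on A: {accuracy(Xa, ya):.2f}")  # near 1.00

train(Xb, yb)  # sequential fine-tuning on task B, no rehearsal of task A
print(f"task B after training on B: {accuracy(Xb, yb):.2f}")  # near 1.00
print(f"task A after training on B: {accuracy(Xa, ya):.2f}")  # drops toward 0.5
</syntaxhighlight>

Nothing in the second training run penalizes moving the weights away from the task-A solution, so the final print shows the forgetting directly; lifelong-learning methods exist precisely to prevent that drop.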
 
http://3c1703fe8d.site.internapcdn.net/newman/csz/news/800/2019/11-anewapproach.jpg

<youtube>6PlvyWQUQu8</youtube>
 
<youtube>QGFlZfflYYg</youtube>

<youtube>ohyznBhxLow</youtube>

<youtube>5uQ0q0x_Xpk</youtube>

<youtube>OBkruvl8ih8</youtube>
