Overfitting Challenge


Youtube search... [http://www.google.com/search?q=Regularization+Dropout+deep+machine+learning+ML ...Google search]

Overfitting is a problem in machine learning generally, not just in neural networks. The problem is inherent in the way machine learning models are developed: a set of training data is used to train the model, and the goal is a model that then performs well on data it has not seen before. Overfitting refers to a model fitting the training data so closely that its performance on unseen data degrades. There are a number of techniques to mitigate or prevent overfitting. [http://wiki.fast.ai/index.php/Over-fitting Deep Learning Course Wiki]
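
A minimal NumPy sketch of the phenomenon (the data and degree values here are illustrative, not from the wiki): a high-degree polynomial fit to a handful of noisy training points drives the training error toward zero while the error on unseen points grows.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Ten noisy training samples of a simple underlying function
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.shape)

# Unseen test points from the same underlying function
x_test = np.linspace(0, 1, 100)
y_test = np.sin(2 * np.pi * x_test)

for degree in (3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    # The degree-9 fit nearly interpolates the training points,
    # yet performs worse on the points it has not seen
    print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
</syntaxhighlight>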

Good practices for addressing overfitting:

* add more data
* use [[Data Augmentation]] (see the sketch after this list)
* use batch normalization (see the sketch after this list)
* use architectures that generalize well
* reduce architecture complexity
* add [[Regularization]]
** L2 and L1 Regularization - update the general cost function by adding a penalty term, known as the regularization term, that discourages large weights (sketch below)
** [[Dropout]] - at every training iteration, randomly select some nodes and temporarily remove them, along with all of their incoming and outgoing connections (sketch below)
** [[Data Augmentation]]
** Early Stopping - stop training once performance on held-out validation data stops improving (sketch below)
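
A minimal [[Data Augmentation]] sketch, assuming a TensorFlow/Keras setup (the transform parameters are illustrative): the model is trained on randomly shifted, rotated, and flipped variants of the images, which effectively enlarges the training set.

<syntaxhighlight lang="python">
import tensorflow as tf

# Random transforms applied on the fly to each training batch
datagen = tf.keras.preprocessing.image.ImageDataGenerator(
    rotation_range=15,       # rotate up to +/- 15 degrees
    width_shift_range=0.1,   # shift horizontally by up to 10% of width
    height_shift_range=0.1,  # shift vertically by up to 10% of height
    horizontal_flip=True,    # random left-right flips
)

# Assuming x_train, y_train and a compiled Keras model already exist:
# model.fit(datagen.flow(x_train, y_train, batch_size=32), epochs=20)
</syntaxhighlight>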
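
Batch normalization drops in as a layer; a minimal Keras sketch (the layer sizes are illustrative). Normalizing each layer's activations over the batch stabilizes training and tends to help generalization.

<syntaxhighlight lang="python">
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.BatchNormalization(),  # normalize activations per batch
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
</syntaxhighlight>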
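
For L2 and L1 [[Regularization]], Keras adds the regularization term to the cost function through per-layer kernel_regularizer arguments; a minimal sketch (the 0.01 coefficients are illustrative).

<syntaxhighlight lang="python">
import tensorflow as tf

# The added term is lambda * sum(w^2) for L2 and lambda * sum(|w|) for L1;
# both penalize large weights in the cost function
model = tf.keras.Sequential([
    tf.keras.layers.Dense(
        64, activation="relu", input_shape=(784,),
        kernel_regularizer=tf.keras.regularizers.l2(0.01),  # L2 penalty
    ),
    tf.keras.layers.Dense(
        64, activation="relu",
        kernel_regularizer=tf.keras.regularizers.l1(0.01),  # L1 penalty
    ),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
</syntaxhighlight>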
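
A minimal [[Dropout]] sketch in Keras (the 0.5 rate is illustrative): during each training step every unit in the preceding layer is removed with the given probability, together with its connections; at inference time all units are kept.

<syntaxhighlight lang="python">
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dropout(0.5),  # each unit dropped with probability 0.5 while training
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
</syntaxhighlight>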
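
A minimal Early Stopping sketch in Keras (the patience value is illustrative, and the training data names are assumed): training halts once validation loss stops improving, and the best weights seen so far are restored.

<syntaxhighlight lang="python">
import tensorflow as tf

# Stop when validation loss has not improved for 5 consecutive epochs
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",
    patience=5,
    restore_best_weights=True,
)

# Assuming a compiled model and x_train, y_train, x_val, y_val already exist:
# model.fit(x_train, y_train, validation_data=(x_val, y_val),
#           epochs=100, callbacks=[early_stop])
</syntaxhighlight>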

http://s3-ap-south-1.amazonaws.com/av-blog-media/wp-content/uploads/2018/04/Screen-Shot-2018-04-03-at-7.52.01-PM-e1522832332857.png

Screen-Shot-2018-04-04-at-2.43.37-PM-768x592.png