Overfitting Challenge


YouTube search... ...Google search

Overfitting is a problem in machine learning in general, not just in neural networks. The problem is inherent in the way machine learning models are developed: a set of "training data" is used to "train" the model, with the goal of producing a model that can then be used on data it has not seen before. Overfitting refers to the situation where the model fits the training data so well that it starts to perform more poorly on data it has not seen before. There are a number of techniques to mitigate or prevent overfitting. | Deep Learning Course Wiki
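The gap between training performance and performance on held-out data can be made concrete with a small experiment. Below is a minimal sketch (assuming NumPy and scikit-learn; the synthetic data and polynomial degrees are illustrative choices, not taken from this wiki): a high-degree polynomial fits the training points almost perfectly but scores worse on unseen data.

<syntaxhighlight lang="python">
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=40)  # noisy sine wave

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for degree in (3, 15):
    poly = PolynomialFeatures(degree)
    model = LinearRegression().fit(poly.fit_transform(X_train), y_train)
    train_mse = mean_squared_error(y_train, model.predict(poly.transform(X_train)))
    test_mse = mean_squared_error(y_test, model.predict(poly.transform(X_test)))
    print(f"degree {degree}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
</syntaxhighlight>

The degree-15 model typically reports a much lower training error than the degree-3 model while doing no better, and often worse, on the test split: exactly the pattern described above.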

Good practices for addressing overfitting:

* add more data
* use [[Data Augmentation]] (a sketch follows this list)
* use [[Batch Normalization]] (see the second sketch below)
* use architectures that generalize well
* reduce architecture complexity
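For the [[Data Augmentation]] item above, here is a minimal sketch (assuming the torchvision package is installed; the specific transforms are illustrative assumptions, not taken from this wiki) of a training-time transform pipeline for 32x32 images:

<syntaxhighlight lang="python">
from torchvision import transforms

train_transform = transforms.Compose([
    transforms.RandomHorizontalFlip(),       # random mirroring
    transforms.RandomCrop(32, padding=4),    # random shifts via padded crops
    transforms.ColorJitter(brightness=0.2),  # small lighting changes
    transforms.ToTensor(),
])
</syntaxhighlight>

Because each epoch sees a slightly different version of every training image, the effective training set is larger and the model has less opportunity to memorize individual examples.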
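For the [[Batch Normalization]] and architecture-complexity items, here is a minimal PyTorch sketch (the layer sizes are illustrative assumptions, not from this wiki) of a deliberately small network with a batch-normalization layer:

<syntaxhighlight lang="python">
import torch
from torch import nn

model = nn.Sequential(
    nn.Linear(32, 64),
    nn.BatchNorm1d(64),  # normalizes activations over each mini-batch
    nn.ReLU(),
    nn.Linear(64, 10),   # keeping the network small also limits overfitting
)

x = torch.randn(8, 32)   # dummy mini-batch of 8 examples
print(model(x).shape)    # torch.Size([8, 10])
</syntaxhighlight>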