Early Stopping

Good practices for addressing the Overfitting Challenge:

* add more data
* use [[Data Augmentation, Data Labeling, and Auto-Tagging|Data Augmentation]]
* use [[Data Quality#Batch Norm(alization) & Standardization|Batch Norm(alization) & Standardization]]
* use architectures that generalize well
* reduce architecture complexity
* add [[Regularization]]
** [[L1 and L2 Regularization]] - updates the general cost function by adding another term known as the regularization term (sketched below)
** [[Dropout]] - at every iteration, randomly selects some nodes and temporarily removes them, along with all of their incoming and outgoing connections (sketched below)
** [[Data Augmentation, Data Labeling, and Auto-Tagging|Data Augmentation]]
** [[Early Stopping]] (sketched below)
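
The regularization term can be made concrete with a short sketch. The following is an illustrative Python/NumPy example, not code from this wiki: the mean-squared-error cost, the array names, and the lambda value are assumptions used only to show how an L2 penalty is added to the general cost function.

<syntaxhighlight lang="python">
import numpy as np

def l2_regularized_cost(y_true, y_pred, weights, lam=0.01):
    """Mean-squared-error cost plus an L2 regularization term (illustrative)."""
    mse = np.mean((y_true - y_pred) ** 2)      # original cost function
    l2_penalty = lam * np.sum(weights ** 2)    # regularization term added to the cost
    return mse + l2_penalty
</syntaxhighlight>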
  
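Dropout can be sketched the same way. This is a minimal mask-based (inverted dropout) illustration; the drop rate and the function name are placeholders, not part of this page.

<syntaxhighlight lang="python">
import numpy as np

def dropout(activations, drop_rate=0.5, training=True):
    """Randomly zero out units during training (inverted dropout, illustrative)."""
    if not training:
        return activations                      # dropout is only applied while training
    keep_prob = 1.0 - drop_rate
    mask = np.random.rand(*activations.shape) < keep_prob   # keep each unit with probability keep_prob
    return activations * mask / keep_prob       # rescale so the expected activation is unchanged
</syntaxhighlight>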
  
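Early Stopping itself amounts to halting training once the validation loss stops improving for a chosen number of epochs. Below is a minimal, framework-agnostic sketch; train_one_epoch, validate, and the patience value are placeholder assumptions, not part of this page.

<syntaxhighlight lang="python">
def train_with_early_stopping(model, train_one_epoch, validate,
                              max_epochs=100, patience=5):
    """Stop training once validation loss has not improved for `patience` epochs (illustrative)."""
    best_loss = float("inf")
    epochs_without_improvement = 0

    for epoch in range(max_epochs):
        train_one_epoch(model)                   # one pass over the training data
        val_loss = validate(model)               # loss on a held-out validation set

        if val_loss < best_loss:
            best_loss = val_loss                 # validation improved: reset the counter
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break                            # stop early to avoid overfitting

    return model, best_loss
</syntaxhighlight>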
 
<youtube>7QfUNxkthq8</youtube>

<youtube>ATuyK_HWZgc</youtube>