Early Stopping
Good practices for addressing the Overfitting Challenge:
- add more data
- use Data Augmentation
- use batch normalization
- use architectures that generalize well
- reduce architecture complexity
- add Regularization
  - L1 and L2 Regularization - add a penalty term, known as the regularization term, to the general cost function so that large weights are discouraged (see the sketch after this list)
  - Dropout - at every training iteration, randomly select some nodes and temporarily remove them, along with all of their incoming and outgoing connections (see the sketch after this list)
  - Data Augmentation
  - Early Stopping - halt training once performance on a held-out validation set stops improving, keeping the best weights seen so far (see the sketch after this list)
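A minimal sketch of the L1/L2 penalty term, assuming the data loss for the current batch has already been computed and the model weights are available as NumPy arrays; the function name and the lambda values are illustrative:

<syntaxhighlight lang="python">
import numpy as np

def regularized_cost(data_loss, weights, l1_lambda=0.0, l2_lambda=0.01):
    """Add L1 and/or L2 penalty terms to an already-computed data loss.

    data_loss : scalar loss from the training batch (assumed given)
    weights   : NumPy array, or list of arrays, of model weights
    """
    if isinstance(weights, np.ndarray):
        weights = [weights]
    l1_penalty = sum(np.abs(w).sum() for w in weights)     # L1 term: sum of |w|
    l2_penalty = sum(np.square(w).sum() for w in weights)  # L2 term: sum of w^2
    return data_loss + l1_lambda * l1_penalty + l2_lambda * l2_penalty
</syntaxhighlight>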
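A minimal NumPy sketch of dropout applied to one layer's activations during training, using the common "inverted dropout" form so that the expected activation is unchanged; the keep probability and function name are illustrative:

<syntaxhighlight lang="python">
import numpy as np

def dropout(activations, keep_prob=0.8, training=True):
    """Randomly zero out nodes during training and rescale the survivors;
    at test time the activations pass through unchanged."""
    if not training:
        return activations
    mask = np.random.rand(*activations.shape) < keep_prob  # which nodes survive this iteration
    return activations * mask / keep_prob                  # drop selected nodes and rescale
</syntaxhighlight>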
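A minimal sketch of the early-stopping loop itself; train_one_epoch, validation_loss, and the model's get_weights/set_weights methods are hypothetical placeholders for whatever training code is in use:

<syntaxhighlight lang="python">
def fit_with_early_stopping(model, max_epochs=100, patience=5):
    """Stop training once validation loss has not improved for `patience`
    epochs, and keep the weights from the best epoch seen so far."""
    best_loss = float("inf")
    best_weights = None
    epochs_without_improvement = 0

    for epoch in range(max_epochs):
        train_one_epoch(model)             # hypothetical: one pass over the training data
        val_loss = validation_loss(model)  # hypothetical: evaluate on the held-out validation set

        if val_loss < best_loss:
            best_loss = val_loss
            best_weights = model.get_weights()  # hypothetical: snapshot the current weights
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break                           # no improvement for `patience` epochs: stop

    if best_weights is not None:
        model.set_weights(best_weights)         # restore the best weights seen during training
    return model
</syntaxhighlight>

In practice most deep learning frameworks provide this behavior as a built-in callback, for example Keras's EarlyStopping callback with its patience parameter.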