Difference between revisions of "Regularization"
{{#seo:
|title=PRIMO.ai
|titlemode=append
|keywords=artificial, intelligence, machine, learning, models, algorithms, data, singularity, moonshot, Tensorflow, Google, Nvidia, Microsoft, Azure, Amazon, AWS
|description=Helpful resources for your journey with artificial intelligence; videos, articles, techniques, courses, profiles, and tools
}}
[http://www.youtube.com/results?search_query=Regularization+Overfitting YouTube search...]
[http://www.google.com/search?q=Regularization+deep+machine+learning+ML ...Google search]
* [[Overfitting Challenge]]
# Regularization
# [[Boosting]]
# [[Multiclassifiers; Ensembles and Hybrids; Bagging, Boosting, and Stacking]]
Good practices for addressing the [[Overfitting Challenge]]:

* add more data
* use [[Data Quality#Batch Norm(alization) & Standardization|Batch Norm(alization) & Standardization]]
* use architectures that generalize well
* reduce architecture complexity
* add [[Regularization]]
** [[L1 and L2 Regularization]] - update the general cost function by adding another term, known as the regularization term, that penalizes large weights.
** Dropout - at every training iteration, randomly select some nodes and temporarily remove them, along with all of their incoming and outgoing connections.
** [[Data Augmentation, Data Labeling, and Auto-Tagging|Data Augmentation]]
** [[Early Stopping]]
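Two of the techniques listed above can be sketched in a few lines of NumPy. This is a minimal illustration, not a fixed API: the function names, the lambda value, and the keep probability are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# L1 and L2 Regularization: the cost function gains an extra term that
# penalizes large weights. For L2 this term is (lambda/2) * sum(w^2);
# the lambda value here is illustrative.
def l2_regularized_loss(data_loss, weights, lam=0.01):
    penalty = 0.5 * lam * sum(np.sum(w ** 2) for w in weights)
    return data_loss + penalty

# Dropout (inverted): at each training iteration, randomly zero out nodes
# with probability 1 - keep_prob, and rescale the survivors so the expected
# activation is unchanged.
def dropout(activations, keep_prob=0.8):
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

weights = [np.ones((3, 3))]
loss = l2_regularized_loss(1.0, weights, lam=0.1)  # 1.0 + 0.5 * 0.1 * 9 = 1.45
dropped = dropout(np.ones(1000), keep_prob=0.8)    # ~20% zeros, rest scaled to 1.25
```

Note that dropout is applied only during training; at inference time all nodes are kept, which is why the inverted form rescales by `keep_prob` during training.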
Regularization is a technique which makes slight modifications to the learning algorithm so that the model generalizes better; this in turn improves the model's performance on unseen data as well. [http://www.analyticsvidhya.com/blog/2018/04/fundamentals-deep-learning-regularization-techniques/ An Overview of Regularization Techniques in Deep Learning (with Python code) | Shubham Jain]
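The [[Early Stopping]] item listed above can be sketched as a simple rule over per-epoch validation losses; the loss values and the `patience` setting below are illustrative stand-ins for a real training loop.

```python
# Early Stopping: monitor validation loss each epoch and halt once it has
# failed to improve for `patience` consecutive epochs.
def early_stop_epoch(val_losses, patience=2):
    best, best_epoch = float("inf"), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch  # stop here; restore the weights saved at best_epoch
    return len(val_losses) - 1

val_losses = [0.9, 0.7, 0.6, 0.61, 0.63, 0.64]
stop = early_stop_epoch(val_losses, patience=2)  # stops at epoch 4
```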
| + | |||
| + | A machine learning model can overcome underfitting by adding more parameters, although its complexity increases and will require more efforts for interpretation. However, a real dilemma of a data scientist is that minimizing the prediction errors which are decomposed due to the bias and/or variance somehow turns into overfitting problems. Lasso, Ridge, and Elastic Net are popular ways of regularized statistical modeling approaches... [http://medium.com/@yongddeng/regression-analysis-lasso-ridge-and-elastic-net-9e65dc61d6d3 Regression Analysis: Lasso, Ridge, and Elastic Net | Sung Kim] | ||
| + | |||
| + | * Regression Models: | ||
| + | ** [[Ridge Regression]] | ||
| + | ** [[Lasso Regression]] | ||
| + | ** [[Elastic Net Regression]] | ||
| + | |||
http://miro.medium.com/max/700/0*kuuC8_3Q2YjoLoqt.png
<youtube>u73PU6Qwl1I</youtube>
<youtube>dEhGM708xUs</youtube>
<youtube>4nqD5TBlOWU</youtube>
<youtube>ctmNq7FgbvI</youtube>
<youtube>KIoz_aa1ed4</youtube>

== Adversarial Regularization (AdvReg) ==
* [[SMART - Multi-Task Deep Neural Networks (MT-DNN)]]
* [http://www.tensorflow.org/neural_structured_learning/tutorials/adversarial_keras_cnn_mnist Adversarial regularization for image classification | Google]

<youtube>53gELTL3ibA</youtube>
<youtube>Tu3FqCD7-BY</youtube>
Latest revision as of 19:18, 19 September 2020