Regularization
* Regression Models:
** [[Lasso Regression]]
** [[Elastic Net Regression]]
http://miro.medium.com/max/700/0*kuuC8_3Q2YjoLoqt.png

<youtube>u73PU6Qwl1I</youtube>

Revision as of 00:36, 13 July 2019


  1. Regularization
  2. Boosting
  3. Multiclassifiers; Ensembles and Hybrids; Bagging, Boosting, and Stacking

Good practices for addressing the Overfitting Challenge:

Regularization is a technique that makes slight modifications to the learning algorithm so that the model generalizes better; this in turn improves the model's performance on unseen data as well. [http://www.analyticsvidhya.com/blog/2018/04/fundamentals-deep-learning-regularization-techniques/ An Overview of Regularization Techniques in Deep Learning (with Python code) | Shubham Jain]
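The "slight modification" is easiest to see in least squares: adding an L2 penalty λ‖w‖² changes the normal equations from XᵀXw = Xᵀy to (XᵀX + λI)w = Xᵀy, which shrinks the learned weights toward zero. A minimal NumPy sketch (the synthetic data and the `lam` value are illustrative assumptions, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 20 samples, 10 features, noisy linear target.
X = rng.normal(size=(20, 10))
true_w = np.zeros(10)
true_w[:3] = [2.0, -1.0, 0.5]
y = X @ true_w + rng.normal(scale=0.5, size=20)

def ols(X, y):
    # Ordinary least squares: solve (X^T X) w = X^T y
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, lam):
    # L2-regularized least squares: solve (X^T X + lam*I) w = X^T y
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

w_ols = ols(X, y)
w_ridge = ridge(X, y, lam=10.0)

# The penalty shrinks the weight vector; the ridge norm is never larger.
print(np.linalg.norm(w_ols), np.linalg.norm(w_ridge))
```

With few samples relative to features, the unpenalized fit chases the noise; the λI term trades a little bias for a large reduction in variance, which is exactly the generalization improvement the paragraph describes.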

A machine learning model can overcome underfitting by adding more parameters, although this increases its complexity and makes it harder to interpret.[1] The data scientist's real dilemma is that minimizing prediction error, which decomposes into bias and variance, can easily turn into an overfitting problem. Lasso, Ridge, and Elastic Net are popular regularized statistical modeling approaches... [http://medium.com/@yongddeng/regression-analysis-lasso-ridge-and-elastic-net-9e65dc61d6d3 Regression Analysis: Lasso, Ridge, and Elastic Net | Sung Kim]
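All three penalties can be fit with one coordinate-descent routine: Lasso and a Ridge-like fit fall out as the `l1_ratio` extremes of Elastic Net. A self-contained NumPy sketch (the synthetic data, `alpha`, and `l1_ratio` values are illustrative assumptions; scikit-learn's Lasso/Ridge/ElasticNet are the usual production implementations of this objective):

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal step for the L1 penalty: shrink toward 0, clip at 0.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def elastic_net(X, y, alpha, l1_ratio, n_iter=200):
    """Coordinate descent for
    (1/2n)||y - Xw||^2 + alpha*l1_ratio*||w||_1 + (alpha/2)*(1-l1_ratio)*||w||^2.
    l1_ratio=1.0 is Lasso; l1_ratio=0.0 is a Ridge-style pure L2 fit."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # Residual with feature j's current contribution removed.
            r_j = y - X @ w + X[:, j] * w[j]
            rho = X[:, j] @ r_j / n
            w[j] = soft_threshold(rho, alpha * l1_ratio) / (
                col_sq[j] + alpha * (1.0 - l1_ratio)
            )
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.0, 0.5]     # 3 informative features, 7 pure noise
y = X @ w_true + rng.normal(scale=0.5, size=50)

w_lasso = elastic_net(X, y, alpha=0.3, l1_ratio=1.0)
w_ridge = elastic_net(X, y, alpha=0.3, l1_ratio=0.0)
w_enet  = elastic_net(X, y, alpha=0.3, l1_ratio=0.5)

print("coefficients Lasso set exactly to zero:", int((w_lasso == 0).sum()))
```

Note the qualitative difference: the L1 term drives some coefficients to exactly zero (built-in feature selection), while the pure L2 fit keeps every coefficient nonzero but shrunken, and Elastic Net interpolates between the two.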
