Elastic Net Regression
Elastic-Net Regression combines Lasso Regression with Ridge Regression to give you the best of both worlds. It works well when there are lots of useless variables that need to be removed from the equation, and it works well when there are lots of useful variables that need to be retained. It also does better than either one when it comes to handling correlated variables. [StatQuest]
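The behavior described above can be sketched with scikit-learn's `ElasticNet`. This is a minimal, illustrative example, not a canonical setup: the data (two highly correlated useful features plus several useless ones) and the `alpha`/`l1_ratio` values are assumptions chosen for demonstration.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
n = 200

# Two correlated useful features (illustrative synthetic data)
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)   # nearly a copy of x1
noise = rng.normal(size=(n, 8))       # eight useless variables
X = np.column_stack([x1, x2, noise])
y = 3 * x1 + 3 * x2 + 0.1 * rng.normal(size=n)

# l1_ratio blends the penalties: 1.0 = pure lasso, 0.0 = pure ridge
model = ElasticNet(alpha=0.1, l1_ratio=0.5)
model.fit(X, y)
coef = model.coef_
```

With these settings the lasso part shrinks the useless coefficients toward zero, while the ridge part keeps weight spread across both correlated features instead of arbitrarily dropping one, which is the "best of both worlds" behavior in the paragraph above.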
- Left plot: Elastic net gradient vs. bridge regression weight along each dimension. The x-axis represents one component of a set of weights w∗ selected by bridge regression; the y-axis represents the corresponding component of the elastic net gradient, evaluated at w∗. Note that the weights are multidimensional, but we are only looking at the weights/gradient along a single dimension.
- Right plot: Elastic net changes to bridge regression weights (2D). Each point represents a set of 2D weights w∗ selected by bridge regression. For each choice of w∗, a vector is plotted pointing in the direction opposite the elastic net gradient, with magnitude proportional to that of the gradient. That is, the plotted vectors show how the elastic net wants to change the bridge regression solution. [Bridge penalty vs. Elastic Net regularization | user20160]
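The elastic net gradient evaluated at a bridge-regression solution, as in the plots above, can be computed directly from the penalty. A minimal sketch, assuming the penalty is written as λ₁‖w‖₁ + λ₂‖w‖₂² (the λ values and the point w∗ below are illustrative, not from the original figure):

```python
import numpy as np

def elastic_net_penalty_grad(w, lam1, lam2):
    """Gradient of lam1*||w||_1 + lam2*||w||_2^2 (a subgradient where w_i = 0)."""
    return lam1 * np.sign(w) + 2.0 * lam2 * w

# Evaluate at a hypothetical bridge-regression solution w*
w_star = np.array([1.0, -2.0])
grad = elastic_net_penalty_grad(w_star, lam1=1.0, lam2=0.5)
```

The right-hand plot draws a vector along −grad at each w∗, showing how the elastic net would pull the bridge regression solution.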