Overfitting Challenge

[http://www.youtube.com/results?search_query=Regularization+Overfitting Youtube search...]
 
 
* [[Gradient Descent Optimization & Challenges]]
 
* [[Objective vs. Cost vs. Loss vs. Error Function]]
 
 
 
A gradient boosting algorithm combines multiple weak learners into a single, more powerful and accurate model: each new learner is trained to correct the errors of the ensemble built so far. Using multiple estimators rather than a single one yields a more stable and robust model, and high accuracy is the hallmark of gradient boosting methods. Several gradient boosting implementations are available. [http://towardsdatascience.com/10-machine-learning-algorithms-you-need-to-know-77fb0055fe0 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium]
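
To make the idea concrete, here is a minimal from-scratch sketch of gradient boosting for squared-error regression: start from a constant baseline, then repeatedly fit a shallow tree to the current residuals and add a damped version of its predictions. The dataset and all hyperparameter values (stage count, tree depth, learning rate) are illustrative assumptions, not values taken from the sources above.

<pre>
# Minimal gradient boosting sketch for squared-error regression.
# Each stage fits a shallow tree (a weak learner) to the residuals
# of the ensemble built so far; hyperparameters are illustrative.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, n_features=5, noise=0.5, random_state=0)

n_stages, learning_rate = 100, 0.1
prediction = np.full(len(y), y.mean())   # stage 0: constant model
trees = []
for _ in range(n_stages):
    residuals = y - prediction           # negative gradient of squared error
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

print("training MSE:", np.mean((y - prediction) ** 2))
</pre>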
 
 
 
* [http://xgboost.readthedocs.io/en/latest/ XGBoost] — supports both linear and tree-based boosters
 
* [http://lightgbm.readthedocs.io/en/latest/ LightGBM] — uses only tree-based learners and offers very high performance as well (both libraries are shown in the sketch after this list)
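
As a rough usage sketch, both libraries expose scikit-learn-style wrapper classes; this assumes the xgboost and lightgbm packages are installed, and all hyperparameters below are illustrative rather than tuned values:

<pre>
# Hedged sketch: fitting the same classification data with XGBoost and
# LightGBM through their scikit-learn wrappers; parameter values are
# illustrative assumptions, not recommendations.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

xgb = XGBClassifier(n_estimators=200, learning_rate=0.1, max_depth=3)
xgb.fit(X_train, y_train)

lgbm = LGBMClassifier(n_estimators=200, learning_rate=0.1)
lgbm.fit(X_train, y_train)

print("XGBoost accuracy: ", xgb.score(X_test, y_test))
print("LightGBM accuracy:", lgbm.score(X_test, y_test))
</pre>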
 
  
 
<youtube>u73PU6Qwl1I</youtube>
 
