Boosting

[http://www.youtube.com/results?search_query=Gradient+Boosting+Algorithms Youtube search...]

* [[Gradient Descent Optimization & Challenges]]
<youtube>q555kfIFUCM</youtube>
Gradient Boosting combines multiple weak learners to build a more powerful, accurate model. Instead of relying on a single estimator, the ensemble of many weak learners is more stable and robust, and gradient boosting methods are known for their high accuracy. Several gradient boosting implementations exist. [http://towardsdatascience.com/10-machine-learning-algorithms-you-need-to-know-77fb0055fe0 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium]
<youtube>Ilg3gGewQ5U</youtube>
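The residual-fitting idea behind gradient boosting can be sketched in a few lines of Python. This is an illustrative NumPy-only sketch, not any particular library's implementation; the decision-stump learner and the toy dataset are invented for the example:

```python
import numpy as np

# A decision stump: the classic weak learner -- one threshold split,
# predicting the mean target value on each side.
def fit_stump(x, r):
    best = None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda x: np.where(x <= t, lv, rv)

# Toy 1-D regression problem (illustrative data).
rng = np.random.RandomState(0)
x = rng.uniform(-3, 3, 200)
y = x ** 2

# Gradient boosting for squared error: start from the mean prediction, then
# repeatedly fit a stump to the residuals (the negative gradient of the loss)
# and add a damped version of its prediction to the ensemble.
learning_rate = 0.1
prediction = np.full_like(y, y.mean())
for _ in range(100):
    stump = fit_stump(x, y - prediction)
    prediction += learning_rate * stump(x)

initial_mse = np.mean((y - y.mean()) ** 2)
final_mse = np.mean((y - prediction) ** 2)
```

Each individual stump is a poor model on its own; the boosting loop is what turns a hundred of them into an accurate ensemble, driving the training error well below the constant-mean baseline.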
* [http://xgboost.readthedocs.io/en/latest/ XGBoost] — supports both linear and tree-based learners
* [http://lightgbm.readthedocs.io/en/latest/ LightGBM] — uses only tree-based learners; also offers very high performance.
<youtube>sRktKszFmSk</youtube>
<youtube>ErDgauqnTHk</youtube>

Revision as of 20:59, 4 June 2018
