Difference between revisions of "Boosting"

 
[http://www.youtube.com/results?search_query=Gradient+Boosting+Algorithms Youtube search...]
 
* [[Boosted Decision Tree Regression]]
* [[Gradient Descent Optimization & Challenges]]
* [[Objective vs. Cost vs. Loss vs. Error Function]]
 
* [http://xgboost.readthedocs.io/en/latest/ XGBoost] — uses linear and tree algorithms
* [http://lightgbm.readthedocs.io/en/latest/ LightGBM] — uses only tree-based algorithms; also has incredibly high performance.
* [http://github.com/Microsoft/LightGBM LightGBM] — a fast, distributed, high-performance gradient boosting (GBDT, GBRT, GBM, or MART) framework based on decision tree algorithms, used for ranking, classification, and many other machine learning tasks. It is under the umbrella of Microsoft's [http://github.com/microsoft/dmtk DMTK] project.
  
 
<youtube>sRktKszFmSk</youtube>

<youtube>ErDgauqnTHk</youtube>

Revision as of 15:38, 12 January 2019


Gradient boosting combines multiple weak learners into a single, more accurate model. Instead of relying on one estimator, aggregating many produces a model that is more stable and robust. Gradient boosting algorithms are noted for their high accuracy, and several implementations are available. 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium
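The idea above can be sketched in plain Python: repeatedly fit a weak learner (here a one-split decision stump) to the current residuals, which for squared-error loss are the negative gradient, and add each fit to the ensemble scaled by a learning rate. This is an illustrative toy, not the code of XGBoost or LightGBM; all function names are made up for the sketch.

```python
# Minimal from-scratch sketch of gradient boosting for regression with
# squared-error loss, using decision stumps on a single feature as the
# weak learners. Illustrative only; names are not from any library.

def fit_stump(xs, residuals):
    """Find the threshold split that best fits the residuals in least squares."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue  # a split must leave points on both sides
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x: lmean if x <= t else rmean

def fit_boosted(xs, ys, n_rounds=300, lr=0.1):
    """Each round fits a stump to the residuals (the negative gradient of
    squared loss) and adds it to the ensemble, scaled by the learning rate."""
    base = sum(ys) / len(ys)          # start from the constant prediction
    preds = [base] * len(ys)
    stumps = []
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: base + lr * sum(s(x) for s in stumps)

# Toy data: y = x^2. No single stump can fit a curve, but the boosted
# ensemble of many stumps approximates it closely on the training points.
xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [x * x for x in xs]
model = fit_boosted(xs, ys)
```

The learning rate trades off against the number of rounds: a smaller `lr` needs more rounds but tends to generalize better, which is why libraries such as XGBoost and LightGBM expose both as core hyperparameters.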