Boosting

 
* [http://en.wikipedia.org/wiki/AdaBoost Adaptive Boosting (AdaBoost)]
* [http://xgboost.readthedocs.io/en/latest/ XGBoost] - uses linear and tree algorithms
* [http://github.com/microsoft/LightGBM LightGBM, Light Gradient Boosting Machine] - a fast, distributed, high-performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks. LightGBM is under the umbrella of the [http://github.com/microsoft/dmtk DMTK project of Microsoft on GitHub]
* [[LightGBM]]  ...Microsoft's gradient boosting framework that uses tree-based learning algorithms (a usage sketch for these frameworks follows this list)
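
A minimal usage sketch for the frameworks listed above, using the scikit-learn-style Python wrappers shipped with the xgboost and lightgbm packages plus scikit-learn's own AdaBoostClassifier; the toy dataset and hyperparameter values are illustrative only and are not taken from this page:

<syntaxhighlight lang="python">
# Illustrative only: train the three boosting approaches listed above on a toy
# dataset and compare test accuracy. Assumes the scikit-learn, xgboost and
# lightgbm packages are installed; hyperparameter values are arbitrary.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "AdaBoost": AdaBoostClassifier(n_estimators=200, learning_rate=0.5),
    "XGBoost": XGBClassifier(n_estimators=200, learning_rate=0.1, max_depth=3),
    "LightGBM": LGBMClassifier(n_estimators=200, learning_rate=0.1, num_leaves=31),
}

for name, model in models.items():
    model.fit(X_train, y_train)                        # fit the boosted ensemble
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: test accuracy = {acc:.3f}")
</syntaxhighlight>

All three expose the same fit/predict interface, so they can be swapped into an existing scikit-learn pipeline without other changes.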
  
 
<youtube>5CWwwtEM2TA</youtube>
 


Youtube search... ...Google search

  1. Regularization
  2. Boosting
  3. Multiclassifiers; Ensembles and Hybrids; Bagging, Boosting, and Stacking


A gradient boosting algorithm combines multiple weak learners to build a single, more powerful and accurate model. Instead of relying on one estimator, an ensemble of many estimators yields a more stable and robust algorithm, and the hallmark of gradient boosting methods is their high accuracy. Several gradient boosting implementations exist. 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium
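
To make the "multiple weak learners" idea concrete, here is a bare-bones sketch of squared-error gradient boosting built from depth-1 regression trees (stumps); the function names, toy data, and parameter values are hypothetical and chosen only for illustration:

<syntaxhighlight lang="python">
# Each weak learner (a tree stump) is fit to the residual errors of the current
# ensemble, and its prediction is added with a small learning rate.
# Names and parameter values are hypothetical, for illustration only.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_rounds=100, learning_rate=0.1):
    """Fit a squared-error gradient boosting ensemble of tree stumps."""
    base = y.mean()                                  # start from a constant model
    prediction = np.full(len(y), base)
    trees = []
    for _ in range(n_rounds):
        residual = y - prediction                    # negative gradient of squared error
        stump = DecisionTreeRegressor(max_depth=1).fit(X, residual)
        prediction += learning_rate * stump.predict(X)
        trees.append(stump)
    return base, trees

def gradient_boost_predict(base, trees, X, learning_rate=0.1):
    prediction = np.full(X.shape[0], base)
    for tree in trees:
        prediction += learning_rate * tree.predict(X)
    return prediction

# Toy usage: the ensemble of weak stumps fits a noisy sine curve far better
# than any single stump could on its own.
rng = np.random.default_rng(0)
X = rng.uniform(0, 6, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)
base, trees = gradient_boost_fit(X, y)
mse = np.mean((gradient_boost_predict(base, trees, X) - y) ** 2)
print(f"training MSE: {mse:.4f}")
</syntaxhighlight>

Production libraries such as XGBoost and LightGBM build on this same additive loop but add regularization, subsampling, and more sophisticated tree-growing strategies.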