Boosting
* [https://en.wikipedia.org/wiki/Boosting_(machine_learning) Boosting | Wikipedia]
 
 
* [https://xgboost.readthedocs.io/en/latest/ XGBoost] — uses linear and tree algorithms
 
* [[LightGBM]]  ...Microsoft's gradient boosting framework that uses tree-based learning algorithms
 
* [[Optimization Methods]]
  
 
<youtube>5CWwwtEM2TA</youtube>
 

Latest revision as of 10:31, 6 August 2023

Youtube search... ...Google search

  1. Regularization
  2. Boosting
  3. Multiclassifiers; Ensembles and Hybrids; Bagging, Boosting, and Stacking

Gradient boosting combines multiple weak learners into a single, more accurate model. Instead of relying on one estimator, an ensemble of many is more stable and robust, and the hallmark of gradient boosting algorithms is their high accuracy. There are several gradient boosting algorithms. 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium
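The idea above can be sketched in a few lines of plain Python. This is an illustrative toy, not code from XGBoost or LightGBM: it assumes squared-error loss and decision stumps as the weak learners, and the helpers `fit_stump` and `gradient_boost` are hypothetical names, not a library API. Each round fits a stump to the residuals (the negative gradient of squared error) of the current ensemble, so many weak learners combine into a stronger model.

```python
def fit_stump(xs, residuals):
    """Find the single threshold split on x that best predicts the residuals."""
    best = None
    for threshold in xs:
        left = [r for x, r in zip(xs, residuals) if x <= threshold]
        right = [r for x, r in zip(xs, residuals) if x > threshold]
        if not left or not right:
            continue  # degenerate split, skip
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        err = sum((r - (lmean if x <= threshold else rmean)) ** 2
                  for x, r in zip(xs, residuals))
        if best is None or err < best[0]:
            best = (err, threshold, lmean, rmean)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def gradient_boost(xs, ys, rounds=20, learning_rate=0.3):
    """Build an ensemble of `rounds` stumps, each fit to the current residuals."""
    base = sum(ys) / len(ys)          # initial prediction: the mean of y
    stumps = []
    preds = [base] * len(xs)
    for _ in range(rounds):
        # residuals = negative gradient of squared-error loss
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + learning_rate * stump(x) for p, x in zip(preds, xs)]
    return lambda x: base + learning_rate * sum(s(x) for s in stumps)

# Toy data: y = x^2. A single stump underfits badly; the ensemble does not.
xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [0, 1, 4, 9, 16, 25, 36, 49]
model = gradient_boost(xs, ys)
mse = sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
```

After 20 rounds the ensemble's mean squared error falls far below that of the single mean predictor it started from, which is the whole point of boosting; libraries like XGBoost and LightGBM apply the same loop with far stronger trees, regularization, and efficient split finding.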