Boosting

 
[https://www.google.com/search?q=Gradient+Boosting+Algorithms+machine+learning+ML+artificial+intelligence ...Google search]
 
* [[Backpropagation]] ... [[Feed Forward Neural Network (FF or FFNN)|FFNN]] ... [[Forward-Forward]] ... [[Activation Functions]] ... [[Softmax]] ... [[Loss]] ... [[Boosting]] ... [[Gradient Descent Optimization & Challenges|Gradient Descent]] ... [[Algorithm Administration#Hyperparameter|Hyperparameter]] ... [[Manifold Hypothesis]] ... [[Principal Component Analysis (PCA)|PCA]]
 
* [[Objective vs. Cost vs. Loss vs. Error Function]]
 
* [[Overfitting Challenge]]
 
 
# Regularization

# Boosting

# [[Multiclassifiers; Ensembles and Hybrids; Bagging, Boosting, and Stacking]]

* [https://en.wikipedia.org/wiki/Boosting_(machine_learning) Boosting | Wikipedia]
 
* [https://xgboost.readthedocs.io/en/latest/ XGBoost] ...uses linear and tree algorithms
 
* [[LightGBM]] ...Microsoft's gradient boosting framework that uses tree-based learning algorithms
 
* [[Optimization Methods]]

<youtube>5CWwwtEM2TA</youtube>
 
Latest revision as of 10:31, 6 August 2023


Gradient boosting combines multiple weak learners into a single, more accurate and powerful model. Instead of relying on a single estimator, an ensemble of estimators yields a more stable and robust algorithm. The specialty of gradient boosting algorithms is their high accuracy, and several gradient boosting implementations exist. 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium
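The idea above can be sketched in a few lines of plain Python: a minimal gradient boosting loop for regression under squared-error loss, where each round fits a decision stump (a weak learner) to the current residuals, i.e. the negative gradient, and adds its shrunken prediction to the ensemble. All names here (`fit_stump`, `boost`) are illustrative, not from XGBoost, LightGBM, or any other library.

```python
# A minimal sketch of gradient boosting for regression (squared-error loss).
# Each round fits a decision stump to the residuals and adds its prediction,
# scaled by a learning rate, to the ensemble. Illustrative only.

def fit_stump(xs, residuals):
    """Find the 1-D threshold split minimizing squared error on residuals."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue  # skip degenerate splits
        lm = sum(left) / len(left)    # mean residual on each side is the
        rm = sum(right) / len(right)  # squared-error-optimal leaf value
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def boost(xs, ys, rounds=50, lr=0.1):
    """Return an ensemble: a base value plus `rounds` shrunken stumps."""
    base = sum(ys) / len(ys)  # initial prediction: the mean of the targets
    stumps = []
    preds = [base] * len(xs)
    for _ in range(rounds):
        # For squared-error loss, the negative gradient is just the residual.
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: base + lr * sum(s(x) for s in stumps)

# Usage: fit y = x^2 on a few points; the combined weak stumps track the
# targets far better than the single base prediction does.
xs = [0, 1, 2, 3, 4, 5]
ys = [x * x for x in xs]
model = boost(xs, ys)
```

Each stump alone is a poor model, but because every round targets what the ensemble still gets wrong, the training error shrinks steadily; the learning rate trades per-round progress for robustness, which is the same shrinkage idea XGBoost and LightGBM expose as a hyperparameter.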