Boosting
- Backpropagation ... FFNN ... Forward-Forward ... Activation Functions ... Softmax ... Loss ... Boosting ... Gradient Descent ... Hyperparameter ... Manifold Hypothesis ... PCA
- Objective vs. Cost vs. Loss vs. Error Function
- Overfitting Challenge
- Boosting | Wikipedia
- A Gentle Introduction to the Gradient Boosting Algorithm for Machine Learning | Jason Brownlee
- Ensemble Learning
Gradient boosting combines multiple weak learners into a single, more accurate model. Instead of relying on one estimator, building an ensemble of many weak learners produces a more stable and robust predictor, and this higher accuracy is the main appeal of gradient boosting. Several gradient boosting algorithms and implementations are listed below; a minimal sketch of the core algorithm follows the list. 10 Machine Learning Algorithms You Need to Know | Sidath Asir @ Medium
- Gradient Boosting Machine (GBM)
- Gradient (Boosted) Decision Tree (GBDT)
- Adaptive Boosting (AdaBoost)
- XGBoost — uses linear and tree algorithms
- LightGBM ...Microsoft's gradient boosting framework that uses tree-based learning algorithms
- Optimization Methods
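At its core, gradient boosting fits each new weak learner to the residual errors of the current ensemble, which for squared-error loss are exactly the negative gradient. Below is a minimal sketch of that loop for regression, assuming scikit-learn's DecisionTreeRegressor as the weak learner; the function names and parameters are illustrative, not the API of any of the libraries listed above.

```python
# Minimal sketch of gradient boosting for least-squares regression.
# Assumes scikit-learn and NumPy are available; gradient_boost_fit and
# gradient_boost_predict are illustrative names, not a library API.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_estimators=100, learning_rate=0.1, max_depth=2):
    """Fit an ensemble of shallow trees, each trained on the residuals of the last."""
    base = y.mean()                            # initial prediction: the mean of y
    prediction = np.full(len(y), base)
    trees = []
    for _ in range(n_estimators):
        residuals = y - prediction             # negative gradient of squared loss
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)                 # weak learner fits the residuals
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)
    return base, trees

def gradient_boost_predict(X, base, trees, learning_rate=0.1):
    """Sum the base prediction and each tree's shrunken contribution."""
    pred = np.full(X.shape[0], base)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred

# Toy usage: learn a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(0, 6, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)
base, trees = gradient_boost_fit(X, y)
print(gradient_boost_predict(X[:5], base, trees))
```

The learning rate shrinks each tree's contribution, trading a larger number of trees for better generalization; GBM, XGBoost, and LightGBM all expose the same knob, typically named learning_rate or eta.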