Boosting
Revision as of 20:36, 13 July 2023
- Backpropagation ... FFNN ... Forward-Forward ... Activation Functions ... Softmax ... Loss ... Boosting ... Gradient Descent ... Hyperparameter ... Manifold Hypothesis ... PCA
- Objective vs. Cost vs. Loss vs. Error Function
- Overfitting Challenge
- Boosting | Wikipedia
- A Gentle Introduction to the Gradient Boosting Algorithm for Machine Learning | Jason Brownlee
- Ensemble Learning
Gradient boosting combines multiple weak learners to build a single, more accurate model. Instead of relying on one estimator, training an ensemble of learners sequentially (each correcting the errors of its predecessors) yields a more stable and robust predictor. Gradient boosting algorithms are noted for their high accuracy, and several implementations exist. 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium
- Gradient Boosting Machine (GBM)
- Gradient (Boosted) Decision Tree (GBDT)
- Adaptive Boosting (AdaBoost)
- XGBoost — uses linear and tree algorithms
- LightGBM ... Microsoft's gradient boosting framework that uses tree-based learning algorithms
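The idea behind the frameworks above can be sketched by hand. Below is a minimal, illustrative gradient boosting loop for regression (not the implementation any of these libraries use): each round fits a shallow tree to the residuals of the current ensemble, which for squared-error loss equal the negative gradient. The dataset and hyperparameter values are arbitrary choices for the sketch; it assumes NumPy and scikit-learn are available.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy regression data: noisy sine wave (arbitrary example data).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

n_rounds, learning_rate = 50, 0.1
prediction = np.full_like(y, y.mean())   # start from the mean prediction
trees = []
for _ in range(n_rounds):
    residuals = y - prediction           # negative gradient of squared error
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    prediction += learning_rate * tree.predict(X)  # shrink each tree's contribution
    trees.append(tree)

# Training error should be far below the variance of y (the error of
# predicting the mean), since each round reduces the residuals.
print(np.mean((y - prediction) ** 2))
```

The learning rate (shrinkage) and shallow tree depth are what keep each learner "weak"; production frameworks such as XGBoost and LightGBM add regularization, second-order gradient information, and optimized tree-growing on top of this same loop.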