Youtube search... ...Google search
- Backpropagation ... FFNN ... Forward-Forward ... Activation Functions ... Softmax ... Loss ... Boosting ... Gradient Descent ... Hyperparameter ... Manifold Hypothesis ... PCA
- Objective vs. Cost vs. Loss vs. Error Function
- Overfitting Challenge
- Boosting | Wikipedia
- A Gentle Introduction to the Gradient Boosting Algorithm for Machine Learning | Jason Brownlee
- Ensemble Learning
Gradient Boosting Algorithms combine multiple weak learners to build a single, more powerful and accurate model. Using an ensemble of estimators rather than a single one produces a more stable and robust algorithm, and the hallmark of Gradient Boosting Algorithms is their higher accuracy. Several Gradient Boosting Algorithms are in common use (a minimal usage sketch follows the list below). 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium
- Gradient Boosting Machine (GBM)
- Gradient (Boosted) Decision Tree (GBDT)
- Adaptive Boosting (AdaBoost)
- XGBoost ...uses linear and tree-based algorithms
- LightGBM ...Microsoft's gradient boosting framework that uses tree-based learning algorithms
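As an illustration of the ensemble idea described above, here is a minimal sketch (not taken from the referenced articles) that trains two of the listed boosters using scikit-learn on a synthetic dataset. The dataset size and hyperparameter values are illustrative assumptions, not recommended settings.

```python
# Minimal sketch: comparing two boosting implementations from scikit-learn.
# Dataset parameters and hyperparameters below are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier, AdaBoostClassifier
from sklearn.metrics import accuracy_score

# Synthetic binary-classification data (assumed sizes, for illustration only)
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Gradient Boosting Machine (GBM): each new tree is fit to the errors
# (loss gradients) of the ensemble built so far.
gbm = GradientBoostingClassifier(
    n_estimators=200, learning_rate=0.1, max_depth=3, random_state=42
)
gbm.fit(X_train, y_train)

# AdaBoost: each new weak learner focuses on the samples the previous
# learners misclassified, via sample re-weighting.
ada = AdaBoostClassifier(n_estimators=200, learning_rate=0.5, random_state=42)
ada.fit(X_train, y_train)

print("GBM accuracy:     ", accuracy_score(y_test, gbm.predict(X_test)))
print("AdaBoost accuracy:", accuracy_score(y_test, ada.predict(X_test)))
```

XGBoost and LightGBM expose a similar scikit-learn-compatible interface (xgboost.XGBClassifier and lightgbm.LGBMClassifier), so the same fit/predict pattern applies once those packages are installed.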