Boosting
[https://www.youtube.com/results?search_query=Gradient+Boosting+Algorithms YouTube search...]
[https://www.google.com/search?q=Gradient+Boosting+Algorithms+machine+learning+ML+artificial+intelligence ...Google search]
* [[Overfitting Challenge]]
* [[Gradient Descent Optimization & Challenges]]
* [[Objective vs. Cost vs. Loss vs. Error Function]]
* [https://en.wikipedia.org/wiki/Boosting_(machine_learning) Boosting | Wikipedia]
* [https://machinelearningmastery.com/gentle-introduction-gradient-boosting-algorithm-machine-learning/ A Gentle Introduction to the Gradient Boosting Algorithm for Machine Learning | Jason Brownlee]
* [[Ensemble Learning]]
Gradient boosting combines multiple weak learners into a single, more accurate model: rather than relying on one estimator, it trains an ensemble whose members successively correct the errors left by the others, which yields a more stable and robust predictor. The distinguishing strength of gradient boosting algorithms is their high accuracy. Several gradient boosting algorithms are in common use (a minimal from-scratch sketch and a library usage example follow the list below). [https://towardsdatascience.com/10-machine-learning-algorithms-you-need-to-know-77fb0055fe0 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium]
* [[Gradient Boosting Machine (GBM)]]
* Gradient [[(Boosted) Decision Tree]] (GBDT)
* [https://en.wikipedia.org/wiki/AdaBoost Adaptive Boosting (AdaBoost)]
* [https://xgboost.readthedocs.io/en/latest/ XGBoost] ...uses linear and tree algorithms
* [[LightGBM]] ...Microsoft's gradient boosting framework that uses tree-based learning algorithms
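
The idea described above can be made concrete with a short, self-contained sketch. The following is a minimal illustration of gradient boosting for squared-error regression, not the implementation used by any of the libraries listed here; it assumes scikit-learn is available for the weak learners, and names such as <code>n_rounds</code> and <code>learning_rate</code> are chosen for this example only.

<syntaxhighlight lang="python">
# Minimal from-scratch sketch of gradient boosting for squared-error regression.
# Illustrative only; parameter names here are chosen for the example.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gradient_boosting(X, y, n_rounds=100, learning_rate=0.1, max_depth=2):
    """Fit an ensemble of shallow trees, each trained on the residuals
    (the negative gradient of squared error) left by the current ensemble."""
    base_prediction = float(np.mean(y))           # start from a constant model
    prediction = np.full(len(y), base_prediction)
    trees = []
    for _ in range(n_rounds):
        residuals = y - prediction                # what the ensemble still gets wrong
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)                    # weak learner fits the residuals
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)
    return base_prediction, trees

def predict_gradient_boosting(model, X, learning_rate=0.1):
    """Sum the base prediction and the scaled contributions of all weak learners.
    Use the same learning_rate that was used during fitting."""
    base_prediction, trees = model
    prediction = np.full(X.shape[0], base_prediction)
    for tree in trees:
        prediction += learning_rate * tree.predict(X)
    return prediction

if __name__ == "__main__":
    # Tiny usage example on synthetic data.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)
    model = fit_gradient_boosting(X, y)
    print("training MSE:", np.mean((predict_gradient_boosting(model, X) - y) ** 2))
</syntaxhighlight>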
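
For practical work the libraries listed above are normally used directly. The snippet below is a usage sketch, assuming scikit-learn, xgboost, and lightgbm are installed; the hyperparameters shown are illustrative and exact defaults can differ between library versions.

<syntaxhighlight lang="python">
# Comparing several boosting implementations on the same synthetic dataset.
# Assumes the xgboost and lightgbm packages are installed alongside scikit-learn.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "AdaBoost": AdaBoostClassifier(n_estimators=200),
    "GBM (scikit-learn)": GradientBoostingClassifier(n_estimators=200, learning_rate=0.1),
    "XGBoost": XGBClassifier(n_estimators=200, learning_rate=0.1),
    "LightGBM": LGBMClassifier(n_estimators=200, learning_rate=0.1),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, "test accuracy:", model.score(X_test, y_test))
</syntaxhighlight>

All four estimators follow the scikit-learn fit/score interface, which is why they can be trained and evaluated in the same loop.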