Difference between revisions of "Boosting"
Revision as of 15:50, 12 January 2019
- Gradient Boosting Machine (GBM)
- Gradient Descent Optimization & Challenges
- Objective vs. Cost vs. Loss vs. Error Function
- Boosting | Wikipedia
- A Gentle Introduction to the Gradient Boosting Algorithm for Machine Learning | Jason Brownlee
The gradient boosting algorithm combines multiple weak learners into a single more powerful, accurate model. Using an ensemble of estimators instead of one produces a more stable and robust predictor, and gradient boosting algorithms are noted in particular for their high accuracy. Several gradient boosting implementations exist. 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium
- XGBoost — uses linear and tree algorithms
- LightGBM — a fast, distributed, high-performance gradient boosting (GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks. It is under the umbrella of Microsoft's DMTK (http://github.com/microsoft/dmtk) project.
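The core idea behind the frameworks above can be illustrated from scratch. The following is a minimal sketch of gradient boosting for squared-error regression, using decision stumps (depth-1 trees) as the weak learners: each round fits a stump to the current residuals, which for squared error are the negative gradient of the loss. Function names here are illustrative, not taken from any library.

```python
import numpy as np

def fit_stump(X, residuals):
    """Fit a depth-1 regression tree (decision stump) to the residuals
    by exhaustive search over (feature, threshold) splits."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            mask = X[:, j] <= t
            left, right = residuals[mask], residuals[~mask]
            if len(left) == 0 or len(right) == 0:
                continue
            lv, rv = left.mean(), right.mean()
            err = ((left - lv) ** 2).sum() + ((right - rv) ** 2).sum()
            if err < best_err:
                best_err, best = err, (j, t, lv, rv)
    return best

def predict_stump(stump, X):
    j, t, lv, rv = stump
    return np.where(X[:, j] <= t, lv, rv)

def gradient_boost(X, y, n_rounds=50, lr=0.1):
    """Start from the mean prediction, then repeatedly fit a weak learner
    to the current residuals (the negative gradient of squared-error loss)
    and add its output, scaled by the learning rate, to the ensemble."""
    pred = np.full(len(y), y.mean())
    stumps = []
    for _ in range(n_rounds):
        stump = fit_stump(X, y - pred)
        pred = pred + lr * predict_stump(stump, X)
        stumps.append(stump)
    return (y.mean(), lr, stumps)

def predict(model, X):
    base, lr, stumps = model
    pred = np.full(len(X), base)
    for stump in stumps:
        pred = pred + lr * predict_stump(stump, X)
    return pred
```

Because each stump contributes only a small (learning-rate-scaled) correction, the ensemble reduces the residuals gradually over many rounds; production libraries such as XGBoost and LightGBM apply the same scheme with deeper trees, regularization, and far faster split finding.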