Boosting
- Gradient Descent Optimization & Challenges
- Objective vs. Cost vs. Loss vs. Error Function
- Boosting | Wikipedia: http://en.wikipedia.org/wiki/Boosting_(machine_learning)
Gradient boosting combines many weak learners into a single, more accurate model: each new learner is fit to the errors of the ensemble built so far, so an ensemble of estimators yields a more stable and robust predictor than any single estimator. Gradient boosting algorithms are noted for their high accuracy, and several implementations exist; a from-scratch sketch of the core idea follows the list below. 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium (http://towardsdatascience.com/10-machine-learning-algorithms-you-need-to-know-77fb0055fe0)
- XGBoost — supports both linear and tree-based weak learners.
- LightGBM — uses only tree-based learners and is known for very fast training; see the usage sketch for both libraries below.
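To make the idea concrete, here is a minimal from-scratch sketch of gradient boosting for squared-error regression. It is illustrative only: the function names (boost_fit, boost_predict) and parameter values are my own choices, and scikit-learn's DecisionTreeRegressor is assumed as the weak learner; it does not reproduce any particular library's exact algorithm.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def boost_fit(X, y, n_rounds=100, learning_rate=0.1, max_depth=2):
        # Start from the best constant prediction for squared error: the mean.
        y = np.asarray(y, dtype=float)
        base = y.mean()
        pred = np.full(y.shape, base)
        trees = []
        for _ in range(n_rounds):
            residuals = y - pred                     # negative gradient of squared error
            tree = DecisionTreeRegressor(max_depth=max_depth)
            tree.fit(X, residuals)                   # weak learner models the remaining error
            pred += learning_rate * tree.predict(X)  # small step toward the new tree
            trees.append(tree)
        return base, trees

    def boost_predict(X, base, trees, learning_rate=0.1):
        # Final model = initial constant + scaled sum of all the weak learners.
        pred = np.full(X.shape[0], base)
        for tree in trees:
            pred += learning_rate * tree.predict(X)
        return pred

Each round fits a shallow tree to what the current ensemble still gets wrong, which is exactly the "many weak learners add up to a strong one" idea described above.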
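Both libraries listed above also expose scikit-learn style wrappers, so a usage sketch looks like the following; the dataset and parameter values are placeholders, assuming the xgboost and lightgbm Python packages are installed.

    from xgboost import XGBRegressor
    from lightgbm import LGBMRegressor
    from sklearn.datasets import make_regression
    from sklearn.model_selection import train_test_split

    X, y = make_regression(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # XGBoost can boost trees ('gbtree') or linear models ('gblinear').
    xgb = XGBRegressor(n_estimators=200, learning_rate=0.1, booster="gbtree")
    xgb.fit(X_train, y_train)
    print("XGBoost R^2:", xgb.score(X_test, y_test))

    # LightGBM is tree-based only; histogram binning and leaf-wise tree
    # growth are the main sources of its training speed.
    lgbm = LGBMRegressor(n_estimators=200, learning_rate=0.1)
    lgbm.fit(X_train, y_train)
    print("LightGBM R^2:", lgbm.score(X_test, y_test))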