A Gradient Boosting Algorithm combines multiple weak learners to build a more powerful, accurate model. Instead of relying on a single estimator, combining many produces a more stable and robust model. Gradient boosting algorithms are known for their high accuracy, and several implementations exist; a minimal sketch of the underlying idea follows the list below. 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium
- XGBoost: uses both linear and tree-based models
- LightGBM: uses only tree-based models and also delivers incredibly high performance.
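To illustrate the idea behind these libraries, here is a minimal from-scratch sketch of gradient boosting for regression with squared-error loss. It is not the XGBoost or LightGBM implementation; the function names, parameter values, and use of scikit-learn decision trees as the weak learners are illustrative assumptions. Each weak learner is fit to the residuals (the negative gradient of the loss) of the ensemble built so far.

```python
# Minimal gradient boosting sketch (squared-error loss).
# Assumptions: shallow scikit-learn decision trees as weak learners,
# illustrative hyperparameter values.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gradient_boosting(X, y, n_estimators=100, learning_rate=0.1, max_depth=2):
    """Return an initial constant prediction and a list of fitted weak learners."""
    f0 = np.mean(y)                      # start from a constant prediction
    pred = np.full(y.shape, f0)
    trees = []
    for _ in range(n_estimators):
        residuals = y - pred             # negative gradient of squared error
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)           # weak learner fits the residuals
        pred += learning_rate * tree.predict(X)
        trees.append(tree)
    return f0, trees

def predict_gradient_boosting(X, f0, trees, learning_rate=0.1):
    """Sum the initial prediction and the scaled contributions of each tree."""
    pred = np.full(X.shape[0], f0)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred

# Usage sketch on synthetic data
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)
    f0, trees = fit_gradient_boosting(X, y)
    print(predict_gradient_boosting(X[:5], f0, trees))
```

XGBoost and LightGBM build on the same residual-fitting loop but add regularization, histogram-based split finding, and other optimizations that make them far faster and more accurate in practice.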