[https://www.youtube.com/results?search_query=Gradient+Boosting+Algorithms Youtube search...]
[https://www.google.com/search?q=Gradient+Boosting+Algorithms+machine+learning+ML+artificial+intelligence ...Google search]

* [[Backpropagation]] ... [[Feed Forward Neural Network (FF or FFNN)|FFNN]] ... [[Forward-Forward]] ... [[Activation Functions]] ... [[Softmax]] ... [[Loss]] ... [[Boosting]] ... [[Gradient Descent Optimization & Challenges|Gradient Descent]] ... [[Algorithm Administration#Hyperparameter|Hyperparameter]] ... [[Manifold Hypothesis]] ... [[Principal Component Analysis (PCA)|PCA]]
* [[Objective vs. Cost vs. Loss vs. Error Function]]
* [[Overfitting Challenge]]
# [[Regularization]]
# Boosting
# [[Multiclassifiers; Ensembles and Hybrids; Bagging, Boosting, and Stacking]]
* [https://en.wikipedia.org/wiki/Boosting_(machine_learning) Boosting | Wikipedia]
* [https://machinelearningmastery.com/gentle-introduction-gradient-boosting-algorithm-machine-learning/ A Gentle Introduction to the Gradient Boosting Algorithm for Machine Learning | Jason Brownlee]
* [[Ensemble Learning]]
Gradient boosting combines multiple weak learners into a single, more powerful and accurate model. Rather than relying on one estimator, using an ensemble of estimators yields a more stable and robust predictor, and gradient boosting algorithms are known for their high accuracy. Several gradient boosting algorithms are in wide use. [https://towardsdatascience.com/10-machine-learning-algorithms-you-need-to-know-77fb0055fe0 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium]

* [[Gradient Boosting Machine (GBM)]]
* Gradient [[(Boosted) Decision Tree]] (GBDT)
* [https://en.wikipedia.org/wiki/AdaBoost Adaptive Boosting (AdaBoost)]
* [https://xgboost.readthedocs.io/en/latest/ XGBoost] ... uses linear and tree-based algorithms
* [[LightGBM]] ... Microsoft's gradient boosting framework that uses tree-based learning algorithms
* [[Optimization Methods]]
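The residual-fitting loop behind gradient boosting can be sketched in a few lines. The example below is a minimal from-scratch illustration of the idea, not any library's implementation; all function names are hypothetical. Decision stumps (single-split trees) serve as the weak learners, and each new stump is fit to the current residuals, which are the negative gradient of the squared-error loss:

```python
import numpy as np

def fit_stump(x, residual):
    """Weak learner: the single threshold split that best fits the
    residuals in the least-squares sense."""
    best_err, best_stump = np.inf, None
    for threshold in np.unique(x):
        left, right = residual[x <= threshold], residual[x > threshold]
        if len(left) == 0 or len(right) == 0:
            continue
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if err < best_err:
            best_err = err
            best_stump = (threshold, left.mean(), right.mean())
    return best_stump

def stump_predict(x, stump):
    threshold, left_value, right_value = stump
    return np.where(x <= threshold, left_value, right_value)

def boost_fit(x, y, n_estimators=100, learning_rate=0.1):
    """Gradient boosting for squared-error regression: each stump is
    fit to what the ensemble built so far still gets wrong."""
    f0 = y.mean()                           # initial constant prediction
    prediction = np.full(len(y), f0)
    stumps = []
    for _ in range(n_estimators):
        residual = y - prediction           # negative gradient of squared error
        stump = fit_stump(x, residual)
        prediction += learning_rate * stump_predict(x, stump)
        stumps.append(stump)
    return f0, stumps

def boost_predict(x, f0, stumps, learning_rate=0.1):
    prediction = np.full(len(x), f0)
    for stump in stumps:
        prediction += learning_rate * stump_predict(x, stump)
    return prediction

# Toy data: a curve no single stump can fit, but the ensemble can
x = np.linspace(-3, 3, 100)
y = x ** 2

f0, stumps = boost_fit(x, y)
baseline_mse = np.mean((y - y.mean()) ** 2)
boosted_mse = np.mean((y - boost_predict(x, f0, stumps)) ** 2)
print(f"baseline MSE: {baseline_mse:.3f}  boosted MSE: {boosted_mse:.3f}")
```

The `learning_rate` shrinks each stump's contribution, which acts as regularization: smaller rates typically need more estimators but generalize better. Production libraries such as XGBoost and LightGBM follow this same additive scheme with far more sophisticated tree construction and regularization.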
<youtube>5CWwwtEM2TA</youtube>
<youtube>UHBmv7qCey4</youtube>
<youtube>sRktKszFmSk</youtube>
<youtube>ErDgauqnTHk</youtube>
Latest revision as of 10:31, 6 August 2023