Gradient Descent Optimization & Challenges
- Gradient Boosting Algorithms
- Backpropagation
- Objective vs. Cost vs. Loss vs. Error Function
- Topology and Weight Evolving Artificial Neural Network (TWEANN)
Gradient Descent - Stochastic (SGD), Batch (BGD) & Mini-Batch
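The three variants differ only in how many training examples feed each weight update: batch (BGD) uses the full dataset, stochastic (SGD) uses a single example, and mini-batch uses a small group. Below is a minimal NumPy sketch on a linear-regression loss; the function and parameter names are illustrative, not from any particular library, and the `batch_size` argument selects the variant.

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, epochs=100, batch_size=None, seed=0):
    """Linear-regression gradient descent (hypothetical helper).
    batch_size=None -> batch (BGD), batch_size=1 -> stochastic (SGD),
    anything in between -> mini-batch."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    bs = n if batch_size is None else batch_size
    for _ in range(epochs):
        idx = rng.permutation(n)              # shuffle once per epoch
        for start in range(0, n, bs):
            batch = idx[start:start + bs]
            pred = X[batch] @ w
            grad = X[batch].T @ (pred - y[batch]) / len(batch)  # MSE gradient
            w -= lr * grad                    # step along the negative gradient
    return w

# Toy usage: recover known weights from noiseless synthetic data
X = np.random.default_rng(1).normal(size=(200, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true
print(gradient_descent(X, y, batch_size=32))  # should approach [2, -1, 0.5]
```

Smaller batches give noisier but cheaper updates; the noise can help escape shallow local minima, which is one reason mini-batch is the common default.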
Vanishing & Exploding Gradients Problems
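Backpropagation chains one Jacobian product per layer, so the gradient that reaches the early layers scales roughly exponentially with depth: weights that are too small make it vanish, weights that are too large make it explode. A minimal NumPy demonstration of this effect, assuming a stack of ReLU layers with randomly drawn weights (the setup is illustrative, not from the page):

```python
import numpy as np

rng = np.random.default_rng(0)
depth, width = 50, 64

def input_grad_norm(weight_scale):
    """Forward through `depth` ReLU layers, then backprop a unit gradient
    to the input; each backward step multiplies by another Jacobian, so
    the norm shrinks or grows roughly exponentially with depth."""
    Ws = [rng.normal(scale=weight_scale, size=(width, width)) for _ in range(depth)]
    x, masks = rng.normal(size=width), []
    for W in Ws:                          # forward pass, caching ReLU masks
        pre = W @ x
        masks.append(pre > 0)
        x = np.maximum(pre, 0.0)
    g = np.ones(width)                    # gradient w.r.t. the last activation
    for W, m in zip(reversed(Ws), reversed(masks)):
        g = W.T @ (g * m)                 # chain rule through ReLU, then W
    return np.linalg.norm(g)

# He-style scale sqrt(2/width) roughly preserves the gradient norm;
# smaller scales vanish, larger ones explode.
for scale in (0.05, np.sqrt(2 / width), 0.5):
    print(f"weight std {scale:.3f}: input-gradient norm {input_grad_norm(scale):.3e}")
```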
Vanishing & Exploding Gradients Challenges with Long Short-Term Memory (LSTM) and Recurrent Neural Networks (RNN)
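In recurrent networks the same recurrent Jacobian is applied once per time step, so even modest weight scales can compound over long sequences. LSTM cells ease the vanishing side with an additive cell-state path; for the exploding side, rescaling the gradients by their global norm is a standard remedy. A sketch of that clipping step, with a hypothetical helper name and an illustrative threshold:

```python
import numpy as np

def clip_by_global_norm(grads, max_norm=5.0):
    """Rescale a list of gradient arrays so their combined L2 norm is at
    most `max_norm`, a common remedy for exploding gradients when
    backpropagating through time in RNNs and LSTMs."""
    total = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    if total > max_norm:
        grads = [g * (max_norm / total) for g in grads]
    return grads, total

# Usage: a burst of large recurrent gradients gets scaled back
grads = [np.full((3, 3), 10.0), np.full(3, 10.0)]
clipped, norm = clip_by_global_norm(grads, max_norm=5.0)
print(norm, np.linalg.norm(clipped[0]))  # pre-clip norm, clipped sub-norm
```

Clipping by the global norm (rather than element-wise) preserves the direction of the update while bounding its size.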