Gradient Descent Optimization & Challenges
* [[Gradient Boosting Algorithms]]
* [[Backpropagation]]
* [[Objective vs. Cost vs. Loss vs. Error Function]]
Revision as of 15:32, 12 January 2019
Gradient Descent - Stochastic (SGD), Batch (BGD) & Mini-Batch
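The three variants differ only in how many training examples contribute to each gradient step: the full dataset (batch), a single example (stochastic), or a small random subset (mini-batch). A minimal sketch of this for linear regression with a mean-squared-error loss, where the function name and all parameters are illustrative assumptions, not from the original page:

```python
import numpy as np

def gradient_descent(X, y, batch_size, lr=0.1, epochs=200, seed=0):
    """Fit linear weights w by minimising mean squared error.

    batch_size == len(X)      -> batch gradient descent (BGD)
    batch_size == 1           -> stochastic gradient descent (SGD)
    1 < batch_size < len(X)   -> mini-batch gradient descent
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        idx = rng.permutation(n)          # reshuffle each epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # Gradient of MSE = (1/m) * sum (Xb w - yb)^2 w.r.t. w
            grad = 2 * Xb.T @ (Xb @ w - yb) / len(batch)
            w -= lr * grad
    return w

# Noiseless synthetic data with true weights [3, -2]
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([3.0, -2.0])

w_bgd = gradient_descent(X, y, batch_size=len(X))       # batch
w_sgd = gradient_descent(X, y, batch_size=1, lr=0.02)   # stochastic
w_mb  = gradient_descent(X, y, batch_size=32)           # mini-batch
```

All three recover roughly the same weights here; in practice mini-batch is the common default because it balances the stable gradients of BGD against the cheap, noisy updates of SGD.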
Vanishing & Exploding Gradients Problems
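The vanishing/exploding behaviour comes from the chain rule: backpropagating through a deep stack multiplies many per-layer Jacobian factors together, so factors consistently below 1 drive the gradient toward 0 while factors above 1 blow it up. A minimal numeric sketch, using an assumed scalar linear chain y = w * w * ... * w * x purely for illustration:

```python
def chain_gradient(w, depth):
    """Backpropagated gradient dy/dx for a depth-layer scalar linear
    chain y = w * w * ... * w * x, i.e. the product w**depth."""
    grad = 1.0
    for _ in range(depth):
        grad *= w  # one chain-rule factor per layer
    return grad

vanish = chain_gradient(0.5, 50)   # shrinks toward zero
explode = chain_gradient(1.5, 50)  # grows without bound
```

With saturating activations such as sigmoid (whose derivative is at most 0.25) the shrinking factors appear even when weights are moderate, which is why depth alone tends to push plain networks into the vanishing regime.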
Vanishing & Exploding Gradients Challenges with Long Short-Term Memory (LSTM) and Recurrent Neural Networks (RNN)
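In an RNN the same recurrent weight is reused at every time step, so the through-time gradient is a long product of near-identical factors and vanishes or explodes with sequence length; the LSTM's additive cell state c_t = f_t * c_{t-1} + ... replaces that product with a chain of forget-gate factors that can stay near 1. A scalar sketch under those assumptions (the function names and constants are illustrative, not from the original page):

```python
import numpy as np

def rnn_grad_norm(w_rec, steps):
    """Through-time gradient magnitude for a scalar vanilla RNN
    h_t = tanh(w_rec * h_{t-1}). Each step contributes a factor
    w_rec * tanh'(pre), and |tanh'| <= 1, so with |w_rec| < 1 the
    product shrinks exponentially in the number of steps."""
    h, grad = 0.5, 1.0
    for _ in range(steps):
        pre = w_rec * h
        h = np.tanh(pre)
        grad *= w_rec * (1.0 - np.tanh(pre) ** 2)  # chain rule through tanh
    return abs(grad)

def lstm_cell_grad_norm(forget_gate, steps):
    """Gradient along the LSTM cell-state path c_t = f * c_{t-1} + ...:
    just a product of forget-gate values, so with f near 1 the gradient
    survives many time steps instead of vanishing."""
    grad = 1.0
    for _ in range(steps):
        grad *= forget_gate
    return grad

rnn_g = rnn_grad_norm(0.9, 100)        # effectively zero after 100 steps
lstm_g = lstm_cell_grad_norm(0.99, 100)  # still a usable magnitude
```

This is the core reason LSTMs (and gated RNNs generally) train on long sequences where plain RNNs stall; gradient clipping handles the exploding side of the same problem.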