Gradient Descent Optimization & Challenges
* [[Gradient Boosting Algorithms]]
* [[Backpropagation]]
* [[Objective vs. Cost vs. Loss vs. Error Function]]
== Gradient Descent - Stochastic (SGD), Batch (BGD) & Mini-Batch ==
Revision as of 14:23, 24 June 2018
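The three variants differ only in how many examples contribute to each weight update: batch gradient descent uses the whole dataset per step, stochastic gradient descent (SGD) uses a single example, and mini-batch gradient descent uses a small chunk in between. A minimal sketch on a toy one-weight regression problem (all names, data values, and hyperparameters here are hypothetical, chosen only for illustration):

```python
import random

random.seed(0)

# Toy data: y = 3x exactly, so the optimal weight is w = 3.
xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [3 * x for x in xs]

def grad(w, batch):
    """Gradient of mean squared error (w*x - y)^2 over a batch."""
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

def train(w, n_batches, lr=0.05, epochs=100):
    data = list(zip(xs, ys))
    for _ in range(epochs):
        random.shuffle(data)            # visit examples in a random order
        size = len(data) // n_batches
        for i in range(n_batches):      # one weight update per batch
            w -= lr * grad(w, data[i * size:(i + 1) * size])
    return w

w_bgd  = train(0.0, n_batches=1)   # batch GD: whole dataset per update
w_mini = train(0.0, n_batches=3)   # mini-batch: 2 examples per update
w_sgd  = train(0.0, n_batches=6)   # SGD: 1 example per update
print(w_bgd, w_mini, w_sgd)        # all three approach 3.0
```

Batch GD gives the exact gradient but one slow step per epoch; SGD gives many cheap, noisy steps; mini-batch is the usual compromise in practice.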
== Vanishing & Exploding Gradients Problems ==
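Backpropagation through a deep stack of layers multiplies one per-layer factor per layer, so the overall gradient shrinks or grows geometrically with depth. A minimal numeric sketch (the function name and the example factors are hypothetical; 0.25 is the maximum slope of the sigmoid, while a recurrent weight a bit above 1 is enough to explode):

```python
def chained_gradient(factor, depth):
    """Product of `depth` identical per-layer gradient factors,
    as in backprop through a chain of identical layers."""
    g = 1.0
    for _ in range(depth):
        g *= factor          # chain rule: one multiplication per layer
    return g

vanish  = chained_gradient(0.25, 50)   # sigmoid's max derivative is 0.25
explode = chained_gradient(1.5, 50)    # weight slightly above 1
print(vanish, explode)                 # ~1e-30 vs ~6e8
```

With 50 layers the vanishing case is effectively zero (early layers stop learning) and the exploding case is astronomically large (training diverges unless gradients are clipped).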
== Vanishing & Exploding Gradients Challenges with Long Short-Term Memory (LSTM) and Recurrent Neural Networks (RNN) ==
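In a plain RNN the gradient flowing back through time is multiplied at every step by the recurrent weight and the activation's derivative, so it typically vanishes over long sequences; the LSTM's cell state instead carries an additive path whose backward factor is just the forget gate, which can stay near 1. A minimal sketch contrasting the two (function names and the toy values are hypothetical, and the LSTM path is reduced to its forget-gate product only):

```python
import math

def rnn_gradient(w, depth):
    """d h_T / d h_0 for h_t = tanh(w * h_{t-1}):
    a product of w * tanh'(a_t) factors, one per time step."""
    h, g = 0.5, 1.0
    for _ in range(depth):
        h = math.tanh(w * h)
        g *= w * (1 - h * h)   # chain rule through tanh
    return g

def lstm_cell_gradient(forget, depth):
    """Along the cell state c_t = f * c_{t-1} + ..., the backward
    factor per step is just the forget gate f, so
    d c_T / d c_0 = f ** depth - near 1 when f is near 1."""
    return forget ** depth

print(rnn_gradient(2.0, 50))        # vanishes despite w > 1
print(lstm_cell_gradient(1.0, 50))  # stays exactly 1
```

This is the core reason LSTMs (and GRUs) handle long-range dependencies better than vanilla RNNs, though their gradients can still explode through the other paths, so gradient clipping remains common.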