Gradient Descent Optimization & Challenges
* [[Gradient Boosting Algorithms]]
== Gradient Descent - Stochastic (SGD), Batch (BGD) & Mini-Batch ==
[http://www.youtube.com/results?search_query=gradient+descent+%28sgd%29+neural+network Youtube search...]
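The three variants named in the heading differ only in how much data each update sees: batch gradient descent uses the whole dataset per step, stochastic gradient descent uses a single sample, and mini-batch uses a small subset. The sketch below is an assumed minimal NumPy illustration (not from this article), using mean-squared error on a linear model:

```python
import numpy as np

def gradient_descent(X, y, lr=0.1, epochs=100, batch_size=None, seed=0):
    """Minimize mean-squared error of a linear model y ~ X @ w.

    batch_size=None     -> Batch GD (whole dataset per update)
    batch_size=1        -> Stochastic GD (one sample per update)
    1 < batch_size < n  -> Mini-batch GD
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    bs = n if batch_size is None else batch_size
    for _ in range(epochs):
        order = rng.permutation(n)  # reshuffle samples each epoch
        for start in range(0, n, bs):
            b = order[start:start + bs]
            Xb, yb = X[b], y[b]
            # Gradient of mean-squared error over the current batch
            grad = 2.0 / len(b) * Xb.T @ (Xb @ w - yb)
            w -= lr * grad
    return w
```

Smaller batches give noisier but cheaper updates; mini-batch is the usual compromise because it keeps some of SGD's regularizing noise while still vectorizing well on modern hardware.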
Revision as of 20:52, 4 June 2018
== Vanishing & Exploding Gradients Problems ==
Vanishing and exploding gradients are challenges that arise when training Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM) networks.
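The root cause is the chain rule applied across time steps: the gradient at step t is a product of per-step Jacobians, so its norm shrinks or grows roughly geometrically with sequence length. A small assumed NumPy sketch (linear recurrence h_t = W h_{t-1}; with tanh, the derivative factor 1 - h^2 <= 1 only shrinks gradients further) makes this visible:

```python
import numpy as np

def backprop_through_time_norms(w_scale, steps=50, seed=0):
    """Norm of dL/dh_t after backpropagating through `steps`
    linear recurrent steps h_t = W h_{t-1} with recurrent
    weights scaled by w_scale (illustrative, 8 hidden units)."""
    rng = np.random.default_rng(seed)
    W = w_scale * rng.normal(size=(8, 8)) / np.sqrt(8)
    grad = np.ones(8)
    norms = []
    for _ in range(steps):
        grad = W.T @ grad  # one chain-rule step back through time
        norms.append(float(np.linalg.norm(grad)))
    return norms
```

With small recurrent weights (e.g. `w_scale=0.3`) the gradient norm decays toward zero after a few dozen steps (vanishing); with large weights (e.g. `w_scale=2.0`) it blows up (exploding). LSTM gates and techniques such as gradient clipping are standard responses to these two failure modes.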