Gradient Descent Optimization & Challenges
<youtube>qhXZsFVxGKo</youtube>
<youtube>qO_NLVjD6zE</youtube>

== Vanishing & Exploding Gradients Challenges with Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM) ==
[http://www.youtube.com/results?search_query=Difficult+to+Train+an+RNN+LSTM+vanishing YouTube Search]

<youtube>Pp4oKq4kCYs</youtube>
<youtube>2GNbIKTKCfE</youtube>
<youtube>A7poQbTrhxc</youtube>
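The difficulty discussed in the videos above can be illustrated numerically: backpropagation through time multiplies one recurrent Jacobian per time step, so gradient norms shrink or grow roughly geometrically with the spectral radius of the recurrent weight matrix. A minimal NumPy sketch (linear activations assumed for simplicity; the function name and sizes are illustrative, not from any library):

```python
import numpy as np

def bptt_gradient_norms(spectral_radius, steps=50, size=8, seed=0):
    """Norms of the product of `steps` identical Jacobians, as in
    backpropagation through time with linear activations."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((size, size))
    # Rescale W so its spectral radius (largest eigenvalue modulus)
    # equals the requested value.
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    grad = np.eye(size)
    norms = []
    for _ in range(steps):
        grad = W @ grad          # one backprop-through-time step
        norms.append(np.linalg.norm(grad))
    return norms

vanish = bptt_gradient_norms(0.9)   # radius < 1: norms decay toward 0
explode = bptt_gradient_norms(1.1)  # radius > 1: norms grow without bound
```

LSTMs mitigate this by routing the gradient through an additively updated cell state, so the relevant Jacobian stays close to the identity instead of being repeatedly scaled by a weight matrix.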
Revision as of 23:13, 5 May 2018
Gradient Descent - Stochastic (SGD), Batch (BGD) & Mini-Batch
Vanishing & Exploding Gradients Problems
Vanishing & Exploding Gradients Challenges with Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM)
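The three gradient descent variants listed above differ only in how much data each update sees: batch GD uses the whole training set per step, SGD a single example, and mini-batch GD a small random slice. A minimal NumPy sketch of mini-batch updates on a least-squares problem (all names, sizes, and values are hypothetical):

```python
import numpy as np

def gd_step(w, X, y, lr):
    """One gradient descent step for mean-squared error on (X, y).
    Pass the full dataset for batch GD, one row for SGD,
    or a random slice for mini-batch GD."""
    grad = 2 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w                      # noiseless synthetic targets

w = np.zeros(3)
for step in range(100):
    idx = rng.choice(len(y), size=32, replace=False)  # mini-batch of 32
    w = gd_step(w, X[idx], y[idx], lr=0.1)
```

Smaller batches make each step cheaper and noisier; batch GD gives the exact gradient at the cost of a full pass per step, and mini-batch sits between the two, which is why it is the common default.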