Gradient Descent Optimization & Challenges

== Gradient Descent - Stochastic (SGD), Batch (BGD) & Mini-Batch ==

YouTube search...

<youtube>qhXZsFVxGKo</youtube>
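The three variants named above differ only in how many training samples feed each weight update: batch uses the whole set, stochastic uses one, and mini-batch uses a small slice. A minimal NumPy sketch on synthetic linear-regression data (the learning rate, sizes, and seed are illustrative assumptions, not taken from the videos):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # 200 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=200)

def grad(w, Xb, yb):
    # Gradient of the mean squared error on the batch (Xb, yb)
    return Xb.T @ (Xb @ w - yb) / len(yb)

def descend(batch_size, lr=0.1, epochs=50):
    w = np.zeros(3)
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)         # shuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            w -= lr * grad(w, X[b], y[b])
    return w

w_bgd  = descend(batch_size=len(X))      # batch: all samples per update
w_sgd  = descend(batch_size=1)           # stochastic: one sample per update
w_mini = descend(batch_size=32)          # mini-batch: the usual compromise
```

All three recover weights near `true_w`; SGD takes many cheap noisy steps, BGD few exact ones, and mini-batch trades between the two.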
== Vanishing & Exploding Gradients Problems ==

YouTube Search

<youtube>qO_NLVjD6zE</youtube>
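The problem the videos cover can be reproduced in a few lines: backpropagation multiplies one per-layer derivative per layer, so the gradient scales like a product of |w · f′(z)| factors. The sigmoid derivative never exceeds 0.25, which drives that product toward zero, while weights above 1 with a near-linear activation blow it up. A toy chain-rule sketch (depths, weights, and the starting input are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def deep_gradient(depth, weight, activation="sigmoid"):
    """d(output)/d(input) of x -> f(weight * x) composed `depth` times,
    accumulated with the chain rule along the forward pass."""
    x, grad = 0.5, 1.0
    for _ in range(depth):
        z = weight * x
        if activation == "sigmoid":
            s = sigmoid(z)
            grad *= weight * s * (1.0 - s)   # sigmoid'(z) never exceeds 0.25
            x = s
        else:                                # identity activation
            grad *= weight
            x = z
    return grad

g_vanish = deep_gradient(depth=30, weight=1.0)   # shrinks many orders below 1
g_explode = deep_gradient(depth=30, weight=1.5, activation="linear")  # 1.5**30, ~1.9e5
```

Thirty layers are enough to push the sigmoid-chain gradient far below machine-meaningful scale while the linear chain with w = 1.5 grows into the hundreds of thousands.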
== Vanishing & Exploding Gradients Challenges with Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM) ==
[http://www.youtube.com/results?search_query=Difficult+to+Train+an+RNN+LSTM+vanishing YouTube Search]

<youtube>Pp4oKq4kCYs</youtube>

<youtube>2GNbIKTKCfE</youtube>

<youtube>A7poQbTrhxc</youtube>
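Backpropagation through time compounds the same effect: the gradient reaching a step k positions back carries k copies of the recurrent Jacobian, which is why plain RNNs are hard to train and why the LSTM's additive cell state helps gradients survive long sequences. For the exploding side, one standard remedy is clipping the global gradient norm before each update — a minimal sketch (the function name, threshold, and sample values are illustrative assumptions):

```python
import numpy as np

def clip_by_global_norm(grads, max_norm=5.0):
    """Rescale a list of gradient arrays so their combined L2 norm is at
    most max_norm; returns the rescaled list and the pre-clip norm."""
    total = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    scale = min(1.0, max_norm / (total + 1e-12))
    return [g * scale for g in grads], total

# A burst of large BPTT gradients (illustrative values):
grads = [np.full((10, 10), 2.0), np.full(10, 2.0)]   # global norm ~ 21
clipped, before = clip_by_global_norm(grads, max_norm=5.0)
```

Clipping rescales the whole gradient rather than truncating components, so the update direction is preserved while its magnitude is bounded; small gradients pass through unchanged.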
 
Revision as of 23:13, 5 May 2018
