Gradient Descent Optimization & Challenges

== Gradient Descent - Stochastic (SGD), Batch (BGD) & Mini-Batch ==

YouTube search...
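
The three variants named above differ only in how many training examples feed each weight update: batch gradient descent (BGD) averages the gradient over the full dataset, stochastic gradient descent (SGD) uses a single example, and mini-batch gradient descent uses a small random subset, trading gradient noise against per-update cost. A minimal NumPy sketch of the difference (all function and variable names here are illustrative, not from any particular library):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))                 # 1,000 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=1000)   # noisy linear targets

def gradient(w, X_batch, y_batch):
    """Gradient of the mean-squared-error loss on one batch."""
    return X_batch.T @ (X_batch @ w - y_batch) / len(y_batch)

def train(batch_size, lr=0.1, epochs=20):
    """Same loop for all three variants; only batch_size changes."""
    w = np.zeros(3)
    n = len(y)
    for _ in range(epochs):
        idx = rng.permutation(n)               # reshuffle each epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            w -= lr * gradient(w, X[batch], y[batch])
    return w

w_bgd  = train(batch_size=len(y))   # BGD: one update per epoch, full data
w_sgd  = train(batch_size=1)        # SGD: one update per example, noisy
w_mini = train(batch_size=32)       # Mini-batch: the usual compromise
print(w_bgd, w_sgd, w_mini)         # each should approach [2.0, -1.0, 0.5]
</syntaxhighlight>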
 
== Vanishing & Exploding Gradients Challenges with Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM) ==
 
[http://www.youtube.com/results?search_query=Difficult+to+Train+an+RNN+LSTM+vanishing YouTube Search]
 
* [[Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), and Recurrent Neural Network (RNN)]]
  
 
<youtube>Pp4oKq4kCYs</youtube>
 
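
When an RNN is trained with backpropagation through time, the gradient flowing back to an early time step is multiplied by the recurrent Jacobian once per step. If that Jacobian's largest singular value sits below 1, the product shrinks exponentially (vanishing gradients); above 1, it grows exponentially (exploding gradients). Gradient clipping is the standard remedy for the exploding case, while the additive cell-state path maintained by LSTM and GRU gating is the usual remedy for the vanishing case. A minimal NumPy sketch of the effect, simplifying the recurrent Jacobian to a fixed matrix W (all names here are illustrative):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

def final_grad_norm(scale, steps=100, dim=8):
    """Gradient norm after `steps` multiplications by the recurrent Jacobian W."""
    W = scale * rng.normal(size=(dim, dim)) / np.sqrt(dim)  # spectral radius ~ scale
    grad = np.ones(dim)
    for _ in range(steps):
        grad = W.T @ grad              # one step of backprop through time
    return np.linalg.norm(grad)

print(final_grad_norm(0.5))   # radius < 1: effectively zero, gradient has vanished
print(final_grad_norm(2.0))   # radius > 1: astronomically large, gradient has exploded

def clip_gradient(grad, max_norm=1.0):
    """Norm-based gradient clipping, the common fix for exploding gradients."""
    norm = np.linalg.norm(grad)
    return grad * (max_norm / norm) if norm > max_norm else grad
</syntaxhighlight>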
