Gradient Descent Optimization & Challenges

 
* [[Backpropagation]]
 
 
* [[Objective vs. Cost vs. Loss vs. Error Function]]
 
* [[Topology and Weight Evolving Artificial Neural Network (TWEANN)]]
  
 
== Gradient Descent - Stochastic (SGD), Batch (BGD) & Mini-Batch ==
 

Revision as of 21:28, 19 January 2019


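The page lists the three variants by name only; the distinction is the number of training examples used per update step. A minimal NumPy sketch (the linear model, learning rate, and data here are illustrative assumptions, not from the page) showing that all three are the same loop with a different batch size:

```python
import numpy as np

def minibatch_gd(X, y, lr=0.1, batch_size=16, epochs=100, seed=0):
    """Minimize mean squared error of a linear model y = X @ w.

    batch_size=1      -> stochastic gradient descent (SGD)
    batch_size=len(X) -> batch gradient descent (BGD)
    anything between  -> mini-batch gradient descent
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    n = len(X)
    for _ in range(epochs):
        order = rng.permutation(n)                      # reshuffle each epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)  # gradient of MSE on the batch
            w -= lr * grad                              # descent step
    return w

# Illustrative data: recover the true weights [2.0, -3.0] from noiseless samples
X = np.random.default_rng(1).normal(size=(200, 2))
y = X @ np.array([2.0, -3.0])
w = minibatch_gd(X, y)
```

BGD computes an exact gradient but one expensive update per pass; SGD updates cheaply but noisily; mini-batch trades between the two and vectorizes well on modern hardware.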

== Vanishing & Exploding Gradients Problems ==

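The mechanism behind the vanishing-gradient problem can be shown with a single number: backpropagation multiplies one derivative factor per layer, and the sigmoid's derivative never exceeds 0.25, so the product shrinks geometrically with depth. A tiny illustrative sketch (scalar activations with unit weights, an assumption for clarity):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def grad_magnitude(depth, a=0.0):
    """Magnitude of a gradient backpropagated through `depth` sigmoid
    layers with scalar unit weights: each layer contributes a factor
    sigmoid'(a), which is at most sigmoid'(0) = 0.25."""
    s = sigmoid(a)
    deriv = s * (1.0 - s)        # 0.25 at a = 0, smaller everywhere else
    return deriv ** depth

print(grad_magnitude(5))         # 0.25**5 = 0.0009765625
print(grad_magnitude(50))        # ~1e-30: effectively no learning signal
```

The exploding case is the mirror image: repeated factors greater than 1 (large weights) grow geometrically instead of shrinking.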

== Vanishing & Exploding Gradients Challenges with Long Short-Term Memory (LSTM) and Recurrent Neural Networks (RNN) ==

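RNNs backpropagate through time, so the per-step factors compound over the whole sequence length, which makes exploding gradients especially common. A standard mitigation is clipping the gradients by their global norm before the update step; a minimal sketch in NumPy (the function name and threshold are illustrative assumptions):

```python
import numpy as np

def clip_by_global_norm(grads, max_norm=1.0):
    """Rescale a list of gradient arrays so their joint L2 norm does not
    exceed max_norm -- a common remedy for exploding gradients when
    backpropagating through time in an RNN. Returns the clipped
    gradients and the pre-clipping global norm."""
    total = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    scale = min(1.0, max_norm / (total + 1e-12))   # no-op when already small
    return [g * scale for g in grads], total

# Illustrative oversized gradients for two parameter tensors
grads = [np.full(3, 10.0), np.full(2, -10.0)]
clipped, norm = clip_by_global_norm(grads, max_norm=1.0)
```

Clipping bounds the exploding direction but does nothing for vanishing gradients; that side is what LSTM gating addresses, by giving the cell state a near-additive path through time.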