Backpropagation

* [[Objective vs. Cost vs. Loss vs. Error Function]]
* [http://en.wikipedia.org/wiki/Backpropagation Wikipedia]
* [http://neuralnetworksanddeeplearning.com/chap2.html How the backpropagation algorithm works]
* [http://hmkcode.github.io/ai/backpropagation-step-by-step/ Backpropagation Step by Step]
* [[Other Challenges]]
 
[http://developers.google.com/machine-learning/glossary/ The primary algorithm for performing gradient descent on neural networks. First, the output values of each node are calculated (and cached) in a forward pass. Then, the partial derivative of the error with respect to each parameter is calculated in a backward pass through the graph. Machine Learning Glossary | Google]
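Below is a minimal sketch of the two passes described in the glossary entry: a forward pass that computes and caches each node's output, then a backward pass that applies the chain rule to get the partial derivative of the error with respect to each weight. The network shape, the NumPy usage, and all concrete values are illustrative assumptions, not taken from any of the sources above.

<syntaxhighlight lang="python">
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny one-hidden-layer network; all sizes and values are illustrative.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 3))   # input (2 units) -> hidden (3 units)
W2 = rng.normal(size=(3, 1))   # hidden (3 units) -> output (1 unit)

x = np.array([[0.5, 0.1]])     # one training example
y = np.array([[1.0]])          # its target value

# Forward pass: compute and cache the output of every node.
h = sigmoid(x @ W1)            # hidden activations (cached)
o = sigmoid(h @ W2)            # network output (cached)
error = 0.5 * np.sum((o - y) ** 2)

# Backward pass: partial derivative of the error w.r.t. each parameter,
# reusing the cached activations via the chain rule.
delta_o = (o - y) * o * (1 - o)           # error signal at the output node
grad_W2 = h.T @ delta_o                   # dE/dW2
delta_h = (delta_o @ W2.T) * h * (1 - h)  # error signal at the hidden nodes
grad_W1 = x.T @ delta_h                   # dE/dW1

# One gradient-descent step using the computed gradients.
lr = 0.5
W2 -= lr * grad_W2
W1 -= lr * grad_W1
</syntaxhighlight>

Repeating these two passes over many examples, and stepping each weight against its gradient every time, is the gradient-descent loop the glossary definition refers to.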
http://hmkcode.github.io/images/ai/backpropagation.png
 
<youtube>q555kfIFUCM</youtube>