Backpropagation




The primary algorithm for performing gradient descent on neural networks. First, the output values of each node are calculated (and cached) in a forward pass. Then, the partial derivative of the error with respect to each parameter is calculated in a backward pass through the graph. [http://developers.google.com/machine-learning/glossary/ Machine Learning Glossary | Google]
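
The sketch below illustrates the two passes described above on a tiny two-layer network, assuming a mean-squared-error loss and sigmoid hidden units; all names (W1, W2, forward, backward) are illustrative and not taken from the glossary entry.

<pre>
# Minimal backpropagation sketch: forward pass caches node outputs,
# backward pass computes partial derivatives of the error w.r.t. each parameter.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 input features, 1 output value.
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Parameters of a 3 -> 5 -> 1 network (illustrative sizes).
W1 = rng.normal(size=(3, 5)) * 0.1
W2 = rng.normal(size=(5, 1)) * 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    """Forward pass: compute and cache the output of each node."""
    z1 = X @ W1          # pre-activation of the hidden layer
    h = sigmoid(z1)      # hidden activations (cached for the backward pass)
    y_hat = h @ W2       # network output
    return y_hat, (X, h, y_hat)

def backward(cache, y):
    """Backward pass: partial derivative of the error w.r.t. each parameter."""
    X, h, y_hat = cache
    n = X.shape[0]
    # Mean-squared-error loss: L = (1/n) * sum((y_hat - y)^2)
    d_y_hat = 2.0 * (y_hat - y) / n    # dL/dy_hat
    dW2 = h.T @ d_y_hat                # dL/dW2
    d_h = d_y_hat @ W2.T               # dL/dh
    d_z1 = d_h * h * (1.0 - h)         # dL/dz1 via the sigmoid derivative
    dW1 = X.T @ d_z1                   # dL/dW1
    return dW1, dW2

# One gradient-descent step using the computed partial derivatives.
y_hat, cache = forward(X)
dW1, dW2 = backward(cache, y)
learning_rate = 0.1
W1 -= learning_rate * dW1
W2 -= learning_rate * dW2
</pre>

In practice, automatic-differentiation frameworks perform this caching and backward traversal over the full computation graph, but the structure is the same: one forward pass to store intermediate values, one backward pass to accumulate gradients.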


[[File:backpropagation.png]]