Backpropagation
- Fast-Forward ... Geoffrey Hinton's proposed alternative to backpropagation
- Gradient Descent Optimization & Challenges
- Objective vs. Cost vs. Loss vs. Error Function
- Wikipedia
- Manifold Hypothesis
- How the backpropagation algorithm works
- Backpropagation Step by Step
- What is Backpropagation? | Daniel Nelson - Unite.ai
- Other Challenges in Artificial Intelligence
- A Beginner's Guide to Backpropagation in Neural Networks | Chris Nicholson - A.I. Wiki pathmind
Backpropagation is the primary algorithm for performing gradient descent on neural networks. First, the output value of each node is calculated (and cached) in a forward pass. Then, the partial derivative of the error with respect to each parameter is calculated in a backward pass through the graph. (Machine Learning Glossary | Google)
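A minimal sketch of that forward-pass/backward-pass pattern for a tiny one-hidden-layer network. The layer sizes, sigmoid activation, and squared-error loss are illustrative assumptions, not part of the glossary definition above.

```python
# Minimal backpropagation sketch (assumed: sigmoid activations, squared-error loss).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2):
    """Forward pass: compute and cache the output of each node."""
    z1 = W1 @ x          # hidden pre-activation
    a1 = sigmoid(z1)     # hidden activation (cached for the backward pass)
    z2 = W2 @ a1         # output pre-activation
    a2 = sigmoid(z2)     # network output
    return a2, (x, a1, a2)

def backward(y, cache, W2):
    """Backward pass: partial derivative of the error w.r.t. each weight."""
    x, a1, a2 = cache
    # Error E = 0.5 * ||a2 - y||^2
    delta2 = (a2 - y) * a2 * (1 - a2)          # dE/dz2
    dW2 = np.outer(delta2, a1)                 # dE/dW2
    delta1 = (W2.T @ delta2) * a1 * (1 - a1)   # dE/dz1 via the chain rule
    dW1 = np.outer(delta1, x)                  # dE/dW1
    return dW1, dW2

# One gradient-descent step on a single example (shapes chosen arbitrarily).
rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(3, 2)), rng.normal(size=(1, 3))
x, y = np.array([0.5, -1.0]), np.array([1.0])
lr = 0.1
out, cache = forward(x, W1, W2)
dW1, dW2 = backward(y, cache, W2)
W1 -= lr * dW1
W2 -= lr * dW2
```

Repeating the forward/backward/update step over many examples is ordinary gradient descent; deep learning frameworks automate the backward pass with automatic differentiation rather than hand-written derivatives as above.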