Backpropagation
http://hmkcode.github.io/images/ai/backpropagation.png

<youtube>Ilg3gGewQ5U</youtube>
<youtube>An5z8lR8asY</youtube>
<youtube>WZDMNM36PsM</youtube>
<youtube>g9V-MHxSCcs</youtube>
<youtube>q555kfIFUCM</youtube>
<youtube>FaHHWdsIYQg</youtube>
YouTube search... ...Google search
- Gradient Descent Optimization & Challenges
- Objective vs. Cost vs. Loss vs. Error Function
- Wikipedia
- Manifold Hypothesis
- How the backpropagation algorithm works
- Backpropagation Step by Step
- What is Backpropagation? | Daniel Nelson - Unite.ai
- Other Challenges in Artificial Intelligence
- A Beginner's Guide to Backpropagation in Neural Networks | Chris Nicholson - A.I. Wiki pathmind
Backpropagation is the primary algorithm for performing gradient descent on neural networks. First, the output value of each node is calculated (and cached) in a forward pass. Then, the partial derivative of the error with respect to each parameter is calculated in a backward pass through the graph. Machine Learning Glossary | Google
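A minimal sketch of that forward-then-backward pattern, written in Python with NumPy. The network here is an illustrative assumption, not part of the glossary definition: a tiny two-layer net with sigmoid hidden units, a linear output, squared error, and an arbitrary learning rate. The point is only to show the cached forward values being reused by the chain rule in the backward pass.

<syntaxhighlight lang="python">
# Sketch of backpropagation on an assumed tiny 2-layer network:
# forward pass caches each node's output, backward pass computes
# the partial derivative of the error w.r.t. each parameter.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 1))          # one input example (3 features)
y = np.array([[1.0]])                # its target output
W1 = rng.normal(size=(4, 3)) * 0.1   # hidden-layer weights (sizes assumed)
W2 = rng.normal(size=(1, 4)) * 0.1   # output-layer weights

# Forward pass: compute and cache the output of each node.
h = sigmoid(W1 @ x)                  # hidden activations (cached for backward)
y_hat = W2 @ h                       # linear output node
error = 0.5 * np.sum((y_hat - y) ** 2)

# Backward pass: chain rule over the graph, using the cached values.
d_y_hat = y_hat - y                  # dE/dy_hat
dW2 = d_y_hat @ h.T                  # dE/dW2
d_h = W2.T @ d_y_hat                 # dE/dh
dW1 = (d_h * h * (1 - h)) @ x.T      # dE/dW1; sigmoid'(z) = h * (1 - h)

# One gradient-descent step (learning rate is an assumption).
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2
print(f"error before step: {error:.4f}")
</syntaxhighlight>

Repeating the forward pass, backward pass, and update in a loop drives the error down; real frameworks automate exactly this bookkeeping over much larger graphs.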