Backpropagation
|description=Helpful resources for your journey with artificial intelligence; videos, articles, techniques, courses, profiles, and tools
}}
[https://www.youtube.com/results?search_query=backpropagation YouTube search...]
[https://www.google.com/search?q=Backpropagation+deep+machine+learning+ML ...Google search]
* [[Backpropagation]] ...[[Gradient Descent Optimization & Challenges]] ...[[Feed Forward Neural Network (FF or FFNN)]] ...[[Forward-Forward]]
* [[Objective vs. Cost vs. Loss vs. Error Function]]
* [https://en.wikipedia.org/wiki/Backpropagation Wikipedia]
* [[Manifold Hypothesis]]
* [https://neuralnetworksanddeeplearning.com/chap2.html How the backpropagation algorithm works]
* [https://hmkcode.github.io/ai/backpropagation-step-by-step/ Backpropagation Step by Step]
* [https://www.unite.ai/what-is-backpropagation/ What is Backpropagation? | Daniel Nelson - Unite.ai]
* [[Other Challenges]] in Artificial Intelligence
* [https://pathmind.com/wiki/backpropagation A Beginner's Guide to Backpropagation in Neural Networks | Chris Nicholson - A.I. Wiki pathmind]
Backpropagation is the primary algorithm for performing gradient descent on neural networks. First, the output values of each node are calculated (and cached) in a forward pass. Then, the partial derivative of the error with respect to each parameter is calculated in a backward pass through the graph. [https://developers.google.com/machine-learning/glossary/ Machine Learning Glossary | Google]
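To make the forward and backward passes concrete, here is a minimal Python/NumPy sketch (not taken from any of the linked articles; the 2-2-1 network shape, sigmoid activations, squared-error loss, random initialization, and learning rate are all illustrative assumptions):

<syntaxhighlight lang="python">
# Minimal illustrative sketch: backpropagation on a tiny 2-2-1 network
# with sigmoid activations and squared error. All sizes and constants
# below are assumptions chosen for the example, not from the source.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = np.array([0.5, 0.1])       # one training example (2 inputs)
y = np.array([1.0])            # target output
W1 = rng.normal(size=(2, 2))   # hidden-layer weights (random init)
W2 = rng.normal(size=(1, 2))   # output-layer weights
lr = 0.5                       # learning rate (illustrative value)

for step in range(100):
    # Forward pass: compute and cache each node's output.
    h = sigmoid(W1 @ x)                 # hidden activations
    out = sigmoid(W2 @ h)               # network output
    loss = 0.5 * np.sum((out - y) ** 2)

    # Backward pass: the chain rule gives the partial derivative of
    # the error with respect to each weight.
    delta_out = (out - y) * out * (1 - out)     # error at output pre-activation
    grad_W2 = np.outer(delta_out, h)            # dLoss/dW2
    delta_h = (W2.T @ delta_out) * h * (1 - h)  # error propagated to hidden layer
    grad_W1 = np.outer(delta_h, x)              # dLoss/dW1

    # Gradient descent step on each parameter.
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

print(f"final loss: {loss:.6f}")
</syntaxhighlight>

Deep learning frameworks automate the backward pass with automatic differentiation, but the chain-rule structure they apply is the same as in this hand-written version.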
https://hmkcode.github.io/images/ai/backpropagation.png
<youtube>Ilg3gGewQ5U</youtube>