Feed Forward Neural Network (FF or FFNN)
YouTube search... ...Google search
- AI Solver
- Capabilities
- Backpropagation ... Feed Forward Neural Network (FF or FFNN) ... Forward-Forward
- Neural Network Zoo | Fjodor Van Veen
Feed forward neural networks (FF or FFNN) and perceptrons (P) are very straightforward: they feed information from the front to the back (from input to output). Neural networks are often described as having layers, where each layer consists of input, hidden, or output cells in parallel. A layer on its own never has connections, and in general two adjacent layers are fully connected (every neuron in one layer connects to every neuron in the next). The simplest somewhat practical network has two input cells and one output cell, which can be used to model logic gates. FFNNs are usually trained through backpropagation, giving the network paired datasets of "what goes in" and "what we want to have coming out". This is called supervised learning, as opposed to unsupervised learning, where we only give the network input and let it fill in the blanks. The error being back-propagated is often some variation of the difference between the network's output and the desired output (like MSE or just the linear difference). Given enough hidden neurons, the network can theoretically always model the relationship between the input and the output. In practice their use is far more limited, but they are popularly combined with other networks to form new networks.

Rosenblatt, Frank. "The perceptron: a probabilistic model for information storage and organization in the brain." Psychological Review 65.6 (1958): 386.
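The training loop described above can be sketched in a few lines of NumPy. This is only an illustrative sketch: the 2-4-1 layer layout, sigmoid activations, mean-squared-error loss, learning rate, and iteration count are assumptions not specified in the text. It trains a small feed forward network with backpropagation on the XOR logic gate, a case that needs the hidden layer.

```python
# Minimal feed forward network (2-4-1) trained with backpropagation on XOR.
# Sketch only: architecture, activations, loss, and hyperparameters are
# illustrative choices, not prescribed by the article.
import numpy as np

rng = np.random.default_rng(0)

# Paired dataset: "what goes in" and "what we want to have coming out".
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Two fully connected layers: input -> hidden, hidden -> output.
W1 = rng.normal(size=(2, 4));  b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1));  b2 = np.zeros((1, 1))

lr = 1.0
for epoch in range(10000):
    # Forward pass: feed information from the input layer to the output layer.
    h = sigmoid(X @ W1 + b1)        # hidden activations
    out = sigmoid(h @ W2 + b2)      # network output

    # Error: difference between the network's output and the desired output.
    err = out - y

    # Backward pass: propagate the error and update the weights.
    d_out = err * out * (1 - out)           # gradient at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)      # gradient at the hidden layer
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

# Predictions should approach the XOR targets [[0], [1], [1], [0]].
print(np.round(out, 2))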