Back Propagation
Back-propagation is the [see page 16, application] of a neural network's previous outputs (from the forward pass) to refine the parameters of the network.
In a neural network we feed information forwards to make a prediction, but for training we need to [see page 14, feed] the local gradients backwards ([see page 16, back-propagation]).
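A minimal sketch of the two passes, assuming a one-hidden-layer network with sigmoid activations and a squared-error loss (all names, shapes, and the learning rate are illustrative assumptions, not anything fixed by the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)          # input vector (assumed size)
W1 = rng.normal(size=(4, 3))    # hidden-layer weights
W2 = rng.normal(size=(2, 4))    # output-layer weights
t = np.array([0.0, 1.0])        # target (assumed)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass: information flows input -> hidden -> output.
h = sigmoid(W1 @ x)
y = sigmoid(W2 @ h)

# Backward pass: local gradients flow output -> hidden.
delta2 = (y - t) * y * (1 - y)          # output-layer local gradients
delta1 = (W2.T @ delta2) * h * (1 - h)  # hidden-layer local gradients

# Use the gradients to refine the parameters (gradient descent).
lr = 0.1
W2 -= lr * np.outer(delta2, h)
W1 -= lr * np.outer(delta1, x)
```

Note that the backward pass reuses the forward-pass activations `h` and `y`, which is why the outputs of the forward pass are needed before the gradients can be propagated.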
Back-propagation is not very biologically plausible.
The two passes mirror each other:
- In [see page 16, forward-propagation], the outputs of every neuron in the previous layer are accumulated together as the input to a single output-layer neuron.
- In back-propagation, the local gradients of all the output-layer neurons are accumulated together into a single hidden-layer neuron.
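This symmetry can be sketched with a single weight matrix: the forward accumulation uses the weights directly, while the backward accumulation uses the same weights transposed (the matrix, vectors, and values below are illustrative assumptions):

```python
import numpy as np

W = np.arange(6.0).reshape(2, 3)   # 3 hidden neurons -> 2 output neurons
h = np.array([1.0, 2.0, 3.0])      # hidden-layer outputs (assumed values)
delta_out = np.array([0.5, -1.0])  # output-layer local gradients (assumed)

# Forward accumulation: output neuron j sums w[j, i] * h[i] over all i.
z = W @ h

# Backward accumulation: hidden neuron i sums w[j, i] * delta_out[j]
# over all j -- every output-layer local gradient reaches it.
delta_hidden = W.T @ delta_out
```

So the fan-in of the forward pass becomes the fan-out of the backward pass, using the same connections in reverse.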