Backward pass in backpropagation
The phase of the error backpropagation learning algorithm when the weights are updated, using the delta rule or some modification of it.
The backward pass starts at the output layer of the feedforward network, updating the incoming weights of the units in that layer with the delta rule. It then works backward through the network, beginning with the penultimate layer (the last hidden layer), updating the incoming weights of the units in each layer in turn.
Statistics collected during the forward pass (in particular, each unit's activation) are used during the backward pass when updating the weights.
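The procedure above can be sketched for a small network. The following is a minimal illustration, not a reference implementation: it assumes a hypothetical 2-input, 2-hidden-unit, 1-output network with sigmoid units, arbitrary weights, and a made-up learning rate. The forward pass caches each unit's activation, and the backward pass uses those cached activations, starting at the output layer and then moving to the hidden layer.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Forward pass: cache each unit's activation, since the
# backward pass needs these statistics to update the weights.
def forward(x, w_hid, w_out):
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x))) for ws in w_hid]
    y = sigmoid(sum(w * hj for w, hj in zip(w_out, h)))
    return h, y

# Backward pass: start at the output layer, then work backward
# to the hidden layer, applying the delta rule at each layer.
def backward(x, target, w_hid, w_out, lr=0.5):
    h, y = forward(x, w_hid, w_out)
    # Output-unit delta (delta rule; y*(1-y) is the sigmoid derivative).
    delta_out = (target - y) * y * (1.0 - y)
    # Hidden-unit deltas: propagate delta_out back through the
    # outgoing weights of each hidden unit.
    delta_hid = [hj * (1.0 - hj) * w_out[j] * delta_out
                 for j, hj in enumerate(h)]
    # Update incoming weights: first the output unit's, then each
    # hidden unit's, scaling each delta by the sending activation.
    new_w_out = [w + lr * delta_out * hj for w, hj in zip(w_out, h)]
    new_w_hid = [[w + lr * dj * xi for w, xi in zip(ws, x)]
                 for ws, dj in zip(w_hid, delta_hid)]
    return new_w_hid, new_w_out, y
```

Repeatedly applying `backward` to a single training example should move the network's output toward the target, which is one quick way to sanity-check such a sketch.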