Generalized delta rule
under construction
Definition
A modification of the delta rule used in error backpropagation learning: a momentum term is added to each weight update, which lets training converge faster without the instability caused by a large learning rate.
French
règle delta généralisée
English
generalized delta rule An improvement on the delta rule used in error backpropagation learning. If the learning rate (often denoted by η) is small, the backpropagation algorithm proceeds slowly but accurately follows the path of steepest descent on the error surface. If η is too large, the algorithm may "bounce off the canyon walls of the error surface", i.e. oscillate and fail to converge. This can largely be avoided by modifying the delta rule to include a momentum term:
$$\Delta w_{ji}(n) = \alpha\,\Delta w_{ji}(n-1) + \eta\,\delta_j(n)\,y_i(n)$$
in the notation of Haykin's text (Neural Networks: A Comprehensive Foundation). The constant α is termed the momentum constant and can be adjusted to achieve the best effect. The second term corresponds to the standard delta rule, while the first term says "add α times the previous change to this weight."
This new rule is called the generalized delta rule. The effect is that if the basic delta rule consistently pushes a weight in the same direction, the weight change gradually gathers "momentum" in that direction. Unrolling the recurrence (with $\Delta w_{ji}(0) = 0$) makes this explicit: $\Delta w_{ji}(n) = \eta \sum_{t=1}^{n} \alpha^{\,n-t}\, \delta_j(t)\, y_i(t)$, an exponentially weighted sum of all past correction terms.
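Below is a minimal Python sketch of this update rule (not part of the original entry): a single linear neuron trained with the generalized delta rule, where `eta` stands for η and `alpha` for the momentum constant α; the function name and the training data are illustrative assumptions.

```python
import numpy as np

def train_with_momentum(X, d, eta=0.05, alpha=0.9, epochs=50):
    """Train a single linear neuron with the generalized delta rule.

    X: (n_samples, n_inputs) input patterns
    d: (n_samples,) desired outputs
    eta: learning rate (η); alpha: momentum constant (α)
    """
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])
    prev_dw = np.zeros_like(w)  # Δw(n-1), zero before the first update

    for _ in range(epochs):
        for x, target in zip(X, d):
            y = w @ x                # linear neuron output
            delta = target - y       # local error signal δ(n)
            # Generalized delta rule: Δw(n) = α·Δw(n-1) + η·δ(n)·x(n)
            dw = alpha * prev_dw + eta * delta * x
            w += dw
            prev_dw = dw
    return w

# Illustrative data: learn the mapping d = 2·x1 - 3·x2 from noisy samples.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
d = X @ np.array([2.0, -3.0]) + 0.01 * rng.normal(size=200)
print(train_with_momentum(X, d))  # converges to approximately [2, -3]
```

With α = 0 the update reduces to the standard delta rule; values of α closer to 1 let successive updates in the same direction reinforce one another.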
Contributors: Imane Meziani, wiki