Surface d'erreur
Version of 24 October 2021 at 11:21
under construction
Definition
Surface obtained when the total error of a neural network is expressed, and plotted, as a function of its weights.
French
surface d'erreur
English
error surface
When the total error of a backpropagation-trained neural network is expressed as a function of the weights and graphed (to the extent that this is possible with a large number of weights), the result is a surface termed the error surface. The course of learning can be traced on this surface: since learning is supposed to reduce error, each time the learning algorithm changes the weights, the current point on the error surface should descend into a valley of the error surface.
The "point" defined by the current set of weights is termed a point in weight space. Thus weight space is the set of all possible values of the weights.
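The idea of sampling an error surface over weight space can be sketched directly. The snippet below is an illustrative example, not from the original article: it uses a hypothetical single sigmoid neuron with two weights and a made-up four-point dataset, and evaluates the total squared error at each point of a coarse grid in weight space.

```python
import math

# Hypothetical tiny dataset for illustration: inputs (x1, x2) and targets.
data = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
        ((1.0, 0.0), 1.0), ((1.0, 1.0), 1.0)]

def total_error(w1, w2):
    """Sum of squared errors of a single sigmoid neuron with weights (w1, w2).

    Each pair (w1, w2) is one point in weight space; total_error gives the
    height of the error surface at that point.
    """
    err = 0.0
    for (x1, x2), t in data:
        y = 1.0 / (1.0 + math.exp(-(w1 * x1 + w2 * x2)))
        err += (y - t) ** 2
    return err

# Sample the error surface on a coarse integer grid of weight-space points.
grid = [(w1, w2, total_error(w1, w2))
        for w1 in range(-3, 4)
        for w2 in range(-3, 4)]

# The grid point with the smallest error is the lowest sampled valley.
lowest = min(grid, key=lambda p: p[2])
print(lowest)
```

With more than two weights the surface can no longer be drawn, but the same idea holds: every weight configuration is one point, and the error value is the height of the surface at that point.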
See also local minimum and gradient descent.
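The descent into a valley of the error surface can be illustrated with gradient descent on a simple surface. The example below is a sketch, not from the article: it assumes a hypothetical quadratic error surface with a single valley at (1, -0.5) and follows the negative gradient from an arbitrary starting point.

```python
# Hypothetical quadratic error surface E(w1, w2) = (w1 - 1)^2 + 2*(w2 + 0.5)^2,
# whose single valley (global minimum) lies at (1, -0.5).
def error(w1, w2):
    return (w1 - 1.0) ** 2 + 2.0 * (w2 + 0.5) ** 2

def gradient(w1, w2):
    # Analytic partial derivatives of E with respect to each weight.
    return 2.0 * (w1 - 1.0), 4.0 * (w2 + 0.5)

w1, w2 = 3.0, 2.0      # starting point in weight space
lr = 0.1               # learning rate (step size)
for _ in range(200):   # each step moves the current point downhill
    g1, g2 = gradient(w1, w2)
    w1 -= lr * g1
    w2 -= lr * g2

print(round(w1, 3), round(w2, 3))  # converges near the valley at (1, -0.5)
```

On a surface this simple the valley is also the global minimum; on the rugged error surfaces of real networks, the same procedure can instead settle into a local minimum.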
Contributors: Claire Gorjux, wiki