Minimum local
<small>
[http://www.cse.unsw.edu.au/~billw/dictionaries/mldict.html Source: UNSW machine learning dictionary]
Version of 13 September 2019 at 10:45
under construction
Definition
xxxxxxx
French
xxxxxxx
English
local minimum
Understanding this term depends to some extent on the error surface metaphor.
When an artificial neural network learning algorithm causes the total error of the net to descend into a valley of the error surface, that valley may or may not lead to the lowest point on the entire error surface. If it does not, the minimum into which the total error eventually falls is termed a local minimum. In this case the learning algorithm is sometimes said to be "trapped" in a local minimum.
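The valley metaphor above can be made concrete with a small sketch. The one-dimensional "error surface" f(x) below is invented for illustration (it is not from the dictionary entry): it has a global minimum near x = -1.30 and a shallower local minimum near x = 1.13, so plain gradient descent ends up in whichever valley its starting point belongs to.

```python
def f(x):
    # Toy "error surface": global minimum near x = -1.30,
    # local minimum near x = 1.13, local maximum near x = 0.17.
    return x**4 - 3*x**2 + x

def grad(x):
    # Derivative of f, used as the descent direction.
    return 4*x**3 - 6*x + 1

def descend(x, lr=0.01, steps=2000):
    # Plain gradient descent from starting point x.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

x_right = descend(2.0)   # starts in the right-hand valley, gets trapped near x = 1.13
x_left = descend(-2.0)   # starts in the left-hand valley, reaches x = -1.30

print(x_right, f(x_right))
print(x_left, f(x_left))
```

Both runs stop at a point of (near-)zero gradient, but the errors differ: only the run that happened to start in the left basin finds the lower valley. This is exactly the "trapped in a local minimum" situation.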
In such cases, it usually helps to restart the algorithm with a new, randomly chosen set of initial weights, i.e. from a new random point in weight space. Since this gives a new starting point on the error surface, descent is likely to follow a different valley, which may lead to the true (absolute) minimum error, or at least to a better minimum error.
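The random-restart strategy described above can be sketched in a few lines. This is a minimal illustration on an invented one-dimensional surface (the function f, the restart count, and the sampling range are all assumptions, not part of the dictionary entry): descent is re-run from several random starting points and the best minimum found is kept.

```python
import random

def f(x):
    # Same toy error surface: local minimum near x = 1.13, global near x = -1.30.
    return x**4 - 3*x**2 + x

def descend(x, lr=0.01, steps=2000):
    # Plain gradient descent; 4x^3 - 6x + 1 is the derivative of f.
    for _ in range(steps):
        x -= lr * (4*x**3 - 6*x + 1)
    return x

def random_restarts(n=20, seed=0):
    # Restart descent from n random points in "weight space"
    # and keep the candidate with the lowest error.
    rng = random.Random(seed)
    candidates = [descend(rng.uniform(-3.0, 3.0)) for _ in range(n)]
    return min(candidates, key=f)

best = random_restarts()
print(best, f(best))  # ends near the global minimum at x = -1.30
```

With enough restarts, at least one starting point is likely to fall in the basin of the global minimum, so the best run escapes the trap that any single run might fall into; there is still no guarantee, only an improved chance.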
Contributors: Imane Meziani, wiki, Sihem Kouache