« Minimum local »: difference between revisions


Revision as of 13 September 2019, 10:45

under construction


Definition

xxxxxxx

French

minimum local

English

local minimum

Understanding this term depends to some extent on the error surface metaphor.

When an artificial neural network learning algorithm causes the total error of the net to descend into a valley of the error surface, that valley may or may not lead to the lowest point on the entire surface. If it does not, the minimum into which the total error eventually settles is termed a local minimum, and the learning algorithm is then said to be "trapped in a local minimum."
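To make the valley metaphor concrete, here is a minimal sketch (not from the dictionary entry; the function and all names are illustrative assumptions) of plain gradient descent on a one-dimensional error surface with two valleys. Started on one slope it settles in the shallower valley, a local minimum; started on the other it reaches the absolute minimum.

<syntaxhighlight lang="python">
# Illustrative sketch (not from the source): gradient descent on a
# one-dimensional "error surface" f(x) = x^4 - 3x^2 + x, which has a
# shallow valley near x ≈ 1.13 and a deeper one near x ≈ -1.30.

def f(x):
    return x**4 - 3 * x**2 + x

def grad_f(x):
    return 4 * x**3 - 6 * x + 1   # derivative of f

def gradient_descent(x, lr=0.01, steps=1000):
    for _ in range(steps):
        x -= lr * grad_f(x)       # step downhill on the error surface
    return x

# Starting on the right-hand slope, descent settles in the shallow
# valley: it is "trapped in a local minimum".
x = gradient_descent(2.0)
print(x, f(x))    # ≈ 1.13, f ≈ -1.07 (local minimum)

# Starting on the left-hand slope reaches the lowest point instead.
x = gradient_descent(-2.0)
print(x, f(x))    # ≈ -1.30, f ≈ -3.51 (absolute minimum)
</syntaxhighlight>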

In such cases, it usually helps to restart the algorithm with a new, randomly chosen set of initial weights, i.e. at a new random point in weight space. Since this gives a new starting point on the error surface, it is likely to lead into a different valley, which will hopefully end in the true (absolute) minimum error, or at least in a better minimum error.
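A minimal sketch of that restart strategy, under the same illustrative assumptions as the example above (the helper names are ours, not the dictionary's): rerun gradient descent from several random starting points and keep the deepest valley found.

<syntaxhighlight lang="python">
import random

# Illustrative sketch (not from the source): random restarts on the
# same two-valley error surface as in the previous example.

def f(x):
    return x**4 - 3 * x**2 + x

def gradient_descent(x, lr=0.01, steps=1000):
    for _ in range(steps):
        x -= lr * (4 * x**3 - 6 * x + 1)   # derivative of f
    return x

def best_of_restarts(n_restarts=10, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(n_restarts):
        x0 = rng.uniform(-3.0, 3.0)         # new random point in "weight space"
        x = gradient_descent(x0)
        if best is None or f(x) < f(best):  # keep the deepest valley found
            best = x
    return best

x = best_of_restarts()
print(x, f(x))   # with enough restarts, ≈ -1.30 (the absolute minimum)
</syntaxhighlight>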

[http://www.cse.unsw.edu.au/~billw/dictionaries/mldict.html Source: UNSW machine learning dictionary]

Contributors: Imane Meziani, wiki, Sihem Kouache