Quickprop


Revision dated 18 December 2020 at 17:10 by Pitpitt (discussion | contributions)

under construction

Definition

XXXXXXXXX

French

XXXXXXXXX

English

Quickprop

Quickprop is an iterative method for finding the minimum of the loss function of an artificial neural network[1], following an algorithm inspired by Newton's method. It is sometimes classified among second-order learning methods. It builds a quadratic approximation from the previous gradient step and the current gradient, which is expected to lie close to the minimum of the loss function, under the assumption that the loss function is locally approximately quadratic, modeling it as an upward-opening parabola. The minimum is sought at the vertex of that parabola. The procedure requires only local information about the artificial neuron to which it is applied.
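The parabola-vertex idea above can be sketched as a secant-style update: each step scales the previous weight change by the ratio of the current gradient to the difference between the two most recent gradients. The following is a minimal one-parameter sketch; the function names and the test function are illustrative, not part of any particular library.

```python
def quickprop(grad, w0, lr=0.1, steps=20, eps=1e-12):
    """Minimize a 1-D function given its gradient, Quickprop-style.

    grad : callable returning the gradient at a weight value
    w0   : starting weight
    lr   : learning rate for the initial plain gradient step
    """
    # First step: plain gradient descent to obtain an initial
    # (previous step, previous gradient) pair.
    g_prev = grad(w0)
    dw = -lr * g_prev
    w = w0 + dw
    for _ in range(steps):
        g = grad(w)
        denom = g_prev - g
        if abs(denom) < eps:
            break  # gradients equal: parabola is degenerate, stop
        # Secant update: jump toward the vertex of the parabola
        # fitted through the two most recent gradient observations.
        dw = dw * g / denom
        w += dw
        g_prev = g
    return w

# Example: f(w) = (w - 3)^2 has gradient 2*(w - 3) and minimum at w = 3.
# Because f is exactly quadratic, the vertex jump lands on the minimum.
w_star = quickprop(lambda w: 2.0 * (w - 3.0), w0=0.0)
```

For a truly quadratic loss the parabola fit is exact, so the method reaches the minimum after the first secant step; on general losses the quadratic assumption only holds locally, which is why practical Quickprop implementations bound the step size.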


Source : Wikipedia

Source : Wikipedia, Machine learning algorithms

Contributors: Imane Meziani, wiki