« Backpropagation »: difference between versions


(Page redirected to Rétropropagation)
Tag: New redirect
 
==under construction==
#REDIRECTION[[Rétropropagation]]


== Definition ==
[[Catégorie:ENGLISH]]
XXXXXXXXX
 
== French ==
''' XXXXXXXXX '''
 
== English ==
''' Backpropagation'''
 
In machine learning, backpropagation (backprop,[1] BP) is a widely used algorithm for training feedforward neural networks. Generalizations of backpropagation exist for other artificial neural networks (ANNs), and for functions generally. These classes of algorithms are all referred to generically as "backpropagation".[2] In fitting a neural network, backpropagation computes the gradient of the loss function with respect to the weights of the network for a single input–output example, and does so efficiently, unlike a naive direct computation of the gradient with respect to each weight individually. This efficiency makes it feasible to use gradient methods for training multilayer networks, updating weights to minimize loss; gradient descent, or variants such as stochastic gradient descent, are commonly used. The backpropagation algorithm works by computing the gradient of the loss function with respect to each weight by the chain rule, computing the gradient one layer at a time, iterating backward from the last layer to avoid redundant calculations of intermediate terms in the chain rule; this is an example of dynamic programming.[3]
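The layer-by-layer chain-rule computation described above can be sketched in a few lines of NumPy. This is an illustrative example, not part of the original entry: the network shape (one sigmoid hidden layer, a linear output, squared-error loss) and all names (`forward`, `backprop`, `W1`, `W2`) are our own choices for clarity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2):
    """Forward pass: sigmoid hidden layer, then a linear output layer."""
    h = sigmoid(W1 @ x)   # hidden activations
    y = W2 @ h            # network output
    return h, y

def backprop(x, t, W1, W2):
    """Gradient of the loss L = 0.5 * ||y - t||^2 with respect to W1 and W2,
    computed one layer at a time, iterating backward from the output layer
    (the chain rule, reusing the intermediate error term delta2)."""
    h, y = forward(x, W1, W2)
    delta2 = y - t                           # dL/dy at the output layer
    gW2 = np.outer(delta2, h)                # dL/dW2
    delta1 = (W2.T @ delta2) * h * (1 - h)   # error pushed back through the sigmoid
    gW1 = np.outer(delta1, x)                # dL/dW1
    return gW1, gW2

# Tiny example: one input-output pair, random weights.
rng = np.random.default_rng(0)
x = rng.normal(size=3)           # input example
t = rng.normal(size=2)           # target output
W1 = rng.normal(size=(4, 3))     # input -> hidden weights
W2 = rng.normal(size=(2, 4))     # hidden -> output weights
gW1, gW2 = backprop(x, t, W1, W2)
```

A single gradient-descent step would then update the weights against these gradients, e.g. `W1 -= lr * gW1; W2 -= lr * gW2`. The backward pass reuses `delta2` when computing `delta1`, which is the dynamic-programming saving the paragraph refers to; a correctness check is to compare each entry of `gW1` against a finite-difference estimate of the loss.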
 
 
 
<small>
 
[https://en.wikipedia.org/wiki/Backpropagation  Source : Wikipedia  ]
 
[https://en.wikipedia.org/wiki/Outline_of_machine_learning#Machine_learning_algorithms  Source : Wikipedia Machine learning algorithms  ]
 
[[Catégorie:vocabulary]]
[[Catégorie:Wikipedia-IA‏‎]]

Latest version of 21 April 2021 at 19:51

Contributors: wiki