Asynchronous Stochastic Gradient Descent
Under construction
Definition
A distributed variant of stochastic gradient descent in which several workers compute gradients in parallel and apply their updates to a shared set of model parameters without waiting for one another. Because workers do not synchronize, an update may be computed from parameters that other workers have already modified (stale gradients), trading some statistical efficiency for better hardware utilization.
See also stochastic gradient descent and artificial neural network
French
Descente de gradient stochastique asynchrone
English
Asynchronous Stochastic Gradient Descent
Asynchronous SGD
ASGD
Deep neural networks have been shown to achieve state-of-the-art performance on several machine learning tasks. Stochastic Gradient Descent (SGD) is the preferred optimization algorithm for training these networks, and asynchronous SGD (ASGD) has been widely adopted to accelerate the training of large-scale deep networks in distributed computing environments.
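The sketch below illustrates the idea in Python on a toy least-squares problem: several worker threads share one parameter vector and each applies its mini-batch updates without waiting for the others (a Hogwild!-style setup). The data, learning rate, batch size, and worker count are illustrative assumptions, not part of any reference implementation.

    # Minimal Hogwild!-style asynchronous SGD sketch (illustrative assumptions only).
    import threading
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 10))                  # synthetic features
    true_w = rng.normal(size=10)                     # ground-truth weights
    y = X @ true_w + 0.01 * rng.normal(size=1000)    # noisy targets

    w = np.zeros(10)        # shared parameters, read and written by all workers
    lr = 0.01               # learning rate

    def worker(seed: int, num_steps: int) -> None:
        """Sample mini-batches and update the shared parameters without locking."""
        local_rng = np.random.default_rng(seed)
        for _ in range(num_steps):
            idx = local_rng.integers(0, len(X), size=32)
            xb, yb = X[idx], y[idx]
            grad = 2.0 * xb.T @ (xb @ w - yb) / len(idx)  # gradient of the mean squared error
            w[:] -= lr * grad   # in-place update; other workers may read stale values

    threads = [threading.Thread(target=worker, args=(seed, 500)) for seed in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    print("parameter error:", float(np.linalg.norm(w - true_w)))

Note that CPython's global interpreter lock limits true parallelism in this sketch; it only illustrates the update pattern. Production ASGD systems run workers on separate devices or machines, usually coordinated through a parameter server that receives gradients and returns updated parameters.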
Source
Contributors: Arianne