« Problème de l'explosion du gradient » (Exploding gradient problem)




== Domain ==

Vocabulary, Deep learning

== Definition ==

== Preferred terms ==

== English ==

=== Exploding Gradient Problem ===

The Exploding Gradient Problem is the opposite of the Vanishing Gradient Problem. In deep neural networks, gradients may explode during backpropagation, resulting in numeric overflow. A common technique for dealing with exploding gradients is Gradient Clipping, sketched below.

• Pascanu, R., Mikolov, T., & Bengio, Y. (2013). On the difficulty of training recurrent neural networks. ICML 2013.
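
The sketch below illustrates one common form of Gradient Clipping, clipping by global norm, in plain Python with NumPy. The function name, the threshold value and the sample gradients are illustrative assumptions, not taken from any particular library.

<syntaxhighlight lang="python">
import numpy as np

def clip_gradients_by_global_norm(gradients, max_norm=1.0):
    """Rescale a list of gradient arrays so that their global L2 norm
    does not exceed max_norm (clipping by global norm)."""
    # Compute the global L2 norm over all parameter gradients.
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in gradients))
    if global_norm > max_norm:
        # Shrink every gradient by the same factor, preserving direction.
        scale = max_norm / global_norm
        gradients = [g * scale for g in gradients]
    return gradients

# Illustrative example: exploding gradients are rescaled to a global norm of 5.0.
grads = [np.array([1e6, -2e6]), np.array([[3e6, 4e6]])]
clipped = clip_gradients_by_global_norm(grads, max_norm=5.0)
</syntaxhighlight>

Clipping by global norm preserves the direction of the overall gradient vector and only limits its magnitude, which is why it is often preferred over clipping each component independently.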