« Extinction de neurone »: difference between versions


== Domain ==


[[Category:Vocabulary]] Vocabulary<br />
[[Category:Claude]] Claude<br />
[[Catégorie:Apprentissage profond]] Apprentissage profond


== Definition ==
Injection of noise to make the network more robust; equivalent to randomly sampling from an exponential number of different networks.<br />
<br />
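The noise here is multiplicative. A standard way to state it (given for illustration; the entry itself stops short of a formula) is that each activation <math>h_i</math> is multiplied by an independent Bernoulli mask and rescaled,

<math>\tilde{h}_i = \frac{m_i\,h_i}{1-p}, \qquad m_i \sim \mathrm{Bernoulli}(1-p), \qquad \mathbb{E}[\tilde{h}_i] = h_i,</math>

so the corrupted activations equal the clean ones in expectation.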


== French ==

<poll>
Choose from among these proposed terms:
extinction de neurone
maille de réseau
point mort
trou de neurone
trou de mémoire
trou de réseau
</poll>
<br />


== English ==

=== Dropout ===

Dropout is a regularization technique for neural networks that prevents overfitting. It keeps neurons from co-adapting by randomly setting a fraction of them to zero at each training iteration. Dropout can be interpreted in various ways, for example as randomly sampling from an exponential number of different networks. Dropout layers first gained popularity through their use in CNNs, but they have since been applied to other layers, including input embeddings and recurrent networks.
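As a concrete sketch of the procedure just described — the widely used ''inverted dropout'' variant, which rescales the surviving activations during training so that inference needs no adjustment — consider the following minimal NumPy example. The function name and the rate <code>p = 0.5</code> are illustrative choices, not taken from the papers listed below.

<syntaxhighlight lang="python">
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each unit with
    probability p and rescale the survivors by 1 / (1 - p); at
    inference, return the activations unchanged."""
    if not training or p == 0.0:
        return x
    if rng is None:
        rng = np.random.default_rng()
    # Bernoulli mask: a unit is kept with probability (1 - p).
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

# Each training step silences a different random subset of units,
# which is what discourages neurons from co-adapting.
h = np.array([0.5, -1.2, 0.8, 2.0])
print(dropout(h, p=0.5, training=True))   # e.g. [ 1. -0.  0.  4.]
print(dropout(h, p=0.5, training=False))  # unchanged at inference
</syntaxhighlight>

Because each call draws a fresh mask, every training step effectively trains a different thinned sub-network, which is the ensemble interpretation mentioned above.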
* Dropout: A Simple Way to Prevent Neural Networks from Overfitting
* Recurrent Neural Network Regularization

<br />
