« Connexion saute-couche »: difference between versions


Version of 25 March 2018 at 13:51

Domain

Vocabulary
Artificial intelligence



Definition

A skip connection (connexion saute-couche) lets the input of a block of layers bypass those layers and be added directly to their output, so the block only has to learn the residual F(x) rather than the full mapping H(x).

Preferred terms

connexion saute-couche


English

residual connection

skip connection


ResNet and its constituent residual blocks take their name from the "residual": the difference between the predicted and target values. The authors of ResNet used residual learning of the form H(x) = F(x) + x. Even in the case of no residual, F(x) = 0, the block still preserves an identity mapping of the input x. The learned residual therefore lets the network, in theory, do no worse than it would without the extra layers.
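The identity-preserving behaviour of H(x) = F(x) + x can be sketched in a few lines of Python (a minimal illustration, not the original ResNet implementation; the function names here are our own):

```python
import numpy as np

def residual_block(x, F):
    """Compute H(x) = F(x) + x: the skip connection adds the input back."""
    return F(x) + x

# A hypothetical residual function that has learned nothing, i.e. F(x) = 0:
def zero_residual(x):
    return np.zeros_like(x)

x = np.array([1.0, 2.0, 3.0])
out = residual_block(x, zero_residual)
# With F(x) = 0, the block reduces to the identity mapping and returns x unchanged.
print(out)
```

When F(x) = 0 the output equals the input exactly, which is why stacking such blocks cannot make the network worse than a shallower one: at worst, each extra block passes its input through untouched.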