Batch Normalization


Domain

Vocabulary: Deep learning

Definition

Preferred terms

English

Batch Normalization

Batch Normalization is a technique that normalizes layer inputs per mini-batch. It speeds up training, allows the use of higher learning rates, and can act as a regularizer. Batch Normalization has been found to be very effective for convolutional and feedforward neural networks but hasn't been successfully applied to recurrent neural networks.

• Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
• Batch Normalized Recurrent Neural Networks
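As a minimal sketch of the computation: for each feature, the layer input is standardized using the mini-batch mean and variance, then rescaled by a learned scale gamma and shift beta. The NumPy example below illustrates the forward pass in training mode only; the function name, the epsilon constant, and the toy data are illustrative choices, not part of the source.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Sketch of a Batch Normalization forward pass (training mode).

    x:           mini-batch of shape (batch_size, num_features)
    gamma, beta: learned per-feature scale and shift, shape (num_features,)
    eps:         small constant for numerical stability (illustrative value)
    """
    mu = x.mean(axis=0)                    # per-feature mean over the mini-batch
    var = x.var(axis=0)                    # per-feature variance over the mini-batch
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalize to zero mean, unit variance
    return gamma * x_hat + beta            # apply learned scale and shift

# Toy usage: a mini-batch of 4 examples with 3 features
x = np.random.randn(4, 3) * 5.0 + 2.0
out = batch_norm_forward(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0))  # approximately 0 per feature
print(out.std(axis=0))   # approximately 1 per feature
```

Note that at inference time, running averages of the mean and variance accumulated during training are typically used in place of the per-batch statistics, so the output no longer depends on the composition of the mini-batch.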