Deep Learning Vocabulary
Batch Normalization is a technique that normalizes layer inputs per mini-batch. It speeds up training, allows the use of higher learning rates, and can act as a regularizer. Batch Normalization has been found to be very effective for Convolutional and Feedforward Neural Networks, but has not yet been applied as successfully to Recurrent Neural Networks.
• Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
• Batch Normalized Recurrent Neural Networks
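As a rough illustration, the per-mini-batch normalization described above can be sketched in NumPy as follows (training-time forward pass only; the names `gamma`, `beta`, and `eps` are illustrative, standing for the learnable scale, learnable shift, and numerical-stability constant):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature across the mini-batch dimension (axis 0),
    # then scale and shift with the learnable parameters gamma and beta.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Mini-batch of 4 examples with 3 features each.
x = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [3.0, 6.0, 9.0],
              [4.0, 8.0, 12.0]])
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
```

After normalization, each feature column of `out` has approximately zero mean and unit variance over the batch. At inference time, running averages of the batch statistics are typically used instead of the per-batch mean and variance.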