Régression de crête



Under construction

Definition

Ridge regression is a regularization technique that adds an L2 penalty, i.e. the sum of the squared weights scaled by a regularization parameter, to the loss function being minimized; in the neural network setting it is also known as weight decay.

French

Régression de crête

English

Ridge Regression

Ridge regression for neural networks performs regularization during the training phase with the L2 norm, i.e. it adds a term equal to the sum of the squares of the weights to the objective (loss) function being minimized. Ridge regression therefore minimizes the following during training:

Objective = base_loss(weights) + alpha * (sum of squares of the weights)

The base_loss depends on the underlying task (e.g. cross-entropy loss for classification), and alpha, called the regularization parameter, is generally tuned during model validation. Ridge regression is also called weight decay.
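As an illustration of the objective above, here is a minimal sketch in Python (assuming NumPy is available) for a simple linear model with a mean-squared-error base loss; the function name ridge_objective, the toy data, and the value of alpha are assumptions made for the example, not part of the source.

import numpy as np

def ridge_objective(weights, X, y, alpha):
    # base_loss: mean squared error of the linear predictions (task-dependent in general)
    predictions = X @ weights
    base_loss = np.mean((predictions - y) ** 2)
    # L2 penalty: alpha times the sum of the squared weights
    l2_penalty = alpha * np.sum(weights ** 2)
    return base_loss + l2_penalty

# Toy example: 5 samples, 3 features (illustrative data only)
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
y = rng.normal(size=5)
w = rng.normal(size=3)
print(ridge_objective(w, X, y, alpha=0.1))

For a classification task, base_loss would be replaced by a cross-entropy loss, as noted above, while the L2 penalty term stays the same.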



Source: DeepAI.org

Contributors: Imane Meziani, wiki