« Mixture d'experts »: difference between revisions
Revision as of 10 November 2021, 21:42
under construction
Definition
XXXXXXXXX
French
XXXXXXXXX
English
Mixture of experts
Mixture of experts refers to a machine learning technique where multiple experts (learners) are used to divide the problem space into homogeneous regions.[1] An example from the computer vision domain is combining a neural network model for human detection with another for pose estimation. If the output is conditioned on multiple levels of probabilistic gating functions, the mixture is called a hierarchical mixture of experts.[2]
A gating network decides which expert to use for each input region. Learning thus consists of (1) learning the parameters of the individual learners and (2) learning the parameters of the gating network.
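The division of labor described above can be sketched in a few lines of NumPy. This is an illustrative toy, not a reference implementation: the choice of linear experts, a linear-softmax gate, and random parameter initialization are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_experts, d_in, d_out = 3, 4, 2

# Each expert is a simple linear model (a hypothetical choice; any learner works).
expert_weights = [rng.standard_normal((d_in, d_out)) for _ in range(n_experts)]
# The gating network is a linear map followed by a softmax over experts.
gate_weights = rng.standard_normal((d_in, n_experts))

def softmax(z):
    z = z - z.max()          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def moe_forward(x):
    # Gating network: a probability distribution over experts for this input.
    gates = softmax(x @ gate_weights)                     # shape (n_experts,)
    # Each expert produces its own prediction for the same input.
    outputs = np.stack([x @ w for w in expert_weights])   # shape (n_experts, d_out)
    # Final output: gate-weighted combination of the expert outputs.
    return gates @ outputs, gates

x = rng.standard_normal(d_in)
y, gates = moe_forward(x)
```

Because the gate is a softmax, its weights are non-negative and sum to 1, so the model interpolates between experts; training would fit both the expert and gate parameters, typically by gradient descent on a joint loss.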
Contributors: Claude Coulombe, Imane Meziani, wiki