Pondération attentionnelle
Definition
Technique by which a model learns which parts of the input sequence it should focus on, by assigning each part a weight reflecting its relevance.
French
pondération attentionnelle
English
Attention-Weighting
Attention-weighting is a technique by which the model learns which parts of the incoming sequence it needs to focus on. Think of it as the ‘Eye of Sauron’ scanning everything at all times and shining light on the parts that are relevant.
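
The idea can be illustrated with a minimal sketch of scaled dot-product attention, the weighting scheme popularized by Transformers. The function name, shapes, and toy data below are illustrative assumptions, not taken from the entry's source.

# Minimal sketch of attention weighting (scaled dot-product attention).
# Illustrative only: names, shapes, and data are assumptions, not from the source.
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_weighting(queries, keys, values):
    # queries: (n_q, d), keys: (n_k, d), values: (n_k, d_v)
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)   # similarity of each query to each position
    weights = softmax(scores, axis=-1)       # how much "light" each position receives
    return weights, weights @ values         # weighted combination of the sequence

# Toy example: one query attending over a 4-token sequence.
rng = np.random.default_rng(0)
q = rng.normal(size=(1, 8))
k = rng.normal(size=(4, 8))
v = rng.normal(size=(4, 8))
w, out = attention_weighting(q, k, v)
print(np.round(w, 3))  # attention weights sum to 1 across the sequence
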
Source
Contributors: Arianne, Patrick Drouin, wiki