Attention-Weighting


Definition

XXXXXXXXX

French

XXXXXXXXX

English

Attention-Weighting

Attention-weighting is a technique by which the model learns which parts of the input sequence to focus on. Think of it as the "Eye of Sauron" scanning everything at all times and throwing light on the parts that are relevant.
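A minimal NumPy sketch (not part of the original entry) can make the idea concrete: with scaled dot-product attention, a softmax over query-key similarity scores produces the weights that decide how strongly each position of the input sequence is attended to. The vectors and function names below are illustrative assumptions, not from the source.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_weights(query, keys):
    # Similarity of one query vector to every key vector,
    # scaled by sqrt(d) as in scaled dot-product attention.
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)
    return softmax(scores)  # weights sum to 1 over the sequence

# Toy sequence of 4 key vectors; the query is most similar to key 2,
# so that position receives the largest attention weight.
keys = np.array([[1.0, 0.0], [0.0, 1.0], [3.0, 3.0], [0.5, 0.5]])
query = np.array([1.0, 1.0])
w = attention_weights(query, keys)
```

The resulting `w` is a probability distribution over positions: the model "throws light" on position 2 because its key aligns best with the query.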

Source

arxiv



Contributors: wiki