Autoattention multitêtes


Definition

Attention mechanism that runs several attention heads in parallel over the same sequence; the independent head outputs are concatenated and linearly projected back to the expected dimension, which lets the model attend to different aspects of the sequence (e.g. longer-term versus shorter-term dependencies).

French

Autoattention multitêtes

Autoattention multi-têtes

English

Multi-Head Attention

Multi-Head Self-Attention


Multi-head attention is a module for attention mechanisms that runs an attention mechanism several times in parallel. The independent attention outputs are then concatenated and linearly transformed into the expected dimension. Intuitively, multiple attention heads allow the model to attend to parts of the sequence differently (e.g. longer-term dependencies versus shorter-term dependencies).
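
A minimal sketch in NumPy, assuming the usual scaled dot-product attention inside each head; the function multi_head_attention and the weight matrices W_q, W_k, W_v, W_o are hypothetical names introduced for this illustration, not taken from the sources below. Each head runs attention in parallel, and the head outputs are concatenated and linearly projected to the expected dimension.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, W_q, W_k, W_v, W_o, num_heads):
    # Multi-head self-attention over a sequence x of shape (seq_len, d_model).
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Project the input into queries, keys and values, then split into heads.
    q = (x @ W_q).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ W_k).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ W_v).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    # Each head runs scaled dot-product attention independently (in parallel).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)     # (heads, seq, seq)
    weights = softmax(scores, axis=-1)
    heads = weights @ v                                      # (heads, seq, d_head)

    # Concatenate the head outputs and project back to the expected dimension.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ W_o

# Toy usage with random (untrained) weights.
rng = np.random.default_rng(0)
d_model, num_heads, seq_len = 8, 2, 5
x = rng.normal(size=(seq_len, d_model))
W_q, W_k, W_v, W_o = [rng.normal(size=(d_model, d_model)) for _ in range(4)]
out = multi_head_attention(x, W_q, W_k, W_v, W_o, num_heads)
print(out.shape)  # (5, 8)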


Source

Source: Cordonnier, J.-B. (2023), Transformer Models for Vision.

Source: Punyakeerthi (2024), Difference between Self-Attention and Multi-head Self-Attention.

Source: paperswithcode