BERT
Definition
A Transformer-based machine learning technique for natural language processing (NLP).
Introduced by Google in 2018, BERT is a large pre-trained, deeply bidirectional, encoder-only Transformer model that comes in two variants: BERT-Base, with 110 million parameters, and BERT-Large, with 340 million parameters.
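The two published parameter counts follow directly from the models' architecture hyperparameters (BERT-Base: 12 layers, hidden size 768; BERT-Large: 24 layers, hidden size 1024; both use a 30,522-token WordPiece vocabulary, 512 positions, and a feed-forward size of 4× the hidden size). A minimal sketch of the arithmetic, assuming the standard BERT configuration:

```python
def bert_param_count(layers, hidden, vocab=30522, max_pos=512, segments=2):
    """Approximate parameter count of a standard BERT encoder (with pooler)."""
    inter = 4 * hidden  # feed-forward intermediate size, 4x hidden in BERT
    # Token, position, and segment embeddings, plus embedding LayerNorm
    emb = (vocab + max_pos + segments) * hidden + 2 * hidden
    # Per layer: Q, K, V, and output projections (with biases) + LayerNorm
    attn = 4 * (hidden * hidden + hidden) + 2 * hidden
    # Per layer: two feed-forward projections (with biases) + LayerNorm
    ffn = (hidden * inter + inter) + (inter * hidden + hidden) + 2 * hidden
    pooler = hidden * hidden + hidden  # dense layer over the [CLS] token
    return emb + layers * (attn + ffn) + pooler

print(bert_param_count(12, 768))    # BERT-Base:  109,482,240  (~110M)
print(bert_param_count(24, 1024))   # BERT-Large: 335,141,888  (~340M)
```

The exact totals land at roughly 109.5M and 335M; the paper rounds these to 110M and 340M.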
French
BERT
English
BERT
Bidirectional Encoder Representations from Transformers
Contributors: Claude Coulombe, Imane Meziani, Jean Benoît Morel, wiki