BERT


Under construction

Definition

A transformer-based machine learning technique for natural language processing (NLP).

Introduced by Google in 2018, BERT is a large pre-trained, deeply bidirectional, encoder-only transformer model that comes in two standard sizes: BERT-Base has 110 million parameters, and BERT-Large has 340 million parameters.
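The parameter counts above can be roughly reproduced from the published architecture sizes (BERT-Base: 12 layers, hidden size 768; BERT-Large: 24 layers, hidden size 1024). The sketch below is illustrative, assuming the standard WordPiece vocabulary of 30,522 tokens, 512 position embeddings, 2 segment embeddings, and a feed-forward inner size of 4x the hidden size:

```python
def bert_params(layers, hidden, vocab=30522, max_pos=512, segments=2):
    """Approximate trainable-parameter count of a BERT-style encoder."""
    ffn = 4 * hidden  # feed-forward inner size is 4x hidden in BERT
    # Embeddings: token + position + segment, plus one LayerNorm (gain + bias)
    emb = (vocab + max_pos + segments) * hidden + 2 * hidden
    # One encoder layer:
    attn = 3 * (hidden * hidden + hidden)   # Q, K, V projections
    attn += hidden * hidden + hidden        # attention output projection
    attn += 2 * hidden                      # attention LayerNorm
    mlp = hidden * ffn + ffn                # intermediate dense layer
    mlp += ffn * hidden + hidden            # output dense layer
    mlp += 2 * hidden                       # feed-forward LayerNorm
    pooler = hidden * hidden + hidden       # [CLS] pooler head
    return emb + layers * (attn + mlp) + pooler

base = bert_params(layers=12, hidden=768)    # BERT-Base, roughly 110 million
large = bert_params(layers=24, hidden=1024)  # BERT-Large, roughly 335 million
print(f"BERT-Base:  {base / 1e6:.1f}M parameters")
print(f"BERT-Large: {large / 1e6:.1f}M parameters")
```

This recovers about 109.5M and 335M parameters, matching the commonly cited figures of 110M and 340M once rounded.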

French

BERT

English

BERT

Bidirectional Encoder Representations from Transformers



Source: Wikipedia (BERT - Language model)

Source: googleblog.com

Source: kdnuggets.com