BERT


==under construction==
== Definition ==
A transformer-based machine learning technique for natural language processing (NLP).

Introduced by Google in 2018, BERT is a massive pre-trained, deeply bidirectional, encoder-based transformer model that comes in two variants: BERT-Base, with 110 million parameters, and BERT-Large, with 340 million parameters.
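The rounded figures of 110 million and 340 million parameters can be recovered from the architecture itself. A minimal sketch in Python, assuming the vocabulary size (30 522 WordPiece tokens) and maximum sequence length (512) of the released English checkpoints:

```python
def bert_param_count(layers: int, hidden: int,
                     vocab: int = 30522, max_positions: int = 512) -> int:
    """Count learnable parameters (weights + biases) in a BERT encoder."""
    ffn = 4 * hidden  # feed-forward inner size is 4x the hidden size
    # Embeddings: token + position + segment tables, plus one LayerNorm
    embeddings = (vocab + max_positions + 2) * hidden + 2 * hidden
    # Self-attention: Q, K, V and output projections (weight + bias each)
    attention = 4 * (hidden * hidden + hidden)
    # Feed-forward: up- and down-projection (weight + bias each)
    feed_forward = (hidden * ffn + ffn) + (ffn * hidden + hidden)
    # Two LayerNorms per encoder layer (gain + bias each)
    layer_norms = 2 * 2 * hidden
    per_layer = attention + feed_forward + layer_norms
    # Pooler: one dense layer applied to the [CLS] token
    pooler = hidden * hidden + hidden
    return embeddings + layers * per_layer + pooler

base = bert_param_count(layers=12, hidden=768)    # ~109.5 million
large = bert_param_count(layers=24, hidden=1024)  # ~335 million
```

The exact totals (about 109.5M and 335M) round to the 110M and 340M figures usually quoted for BERT-Base and BERT-Large.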


== French ==
'''BERT'''


== English ==
'''BERT'''

'''Bidirectional Encoder Representations from Transformers'''


BERT belongs to a class of NLP models known as transformers. It is a massive pre-trained, deeply bidirectional, encoder-based transformer model that comes in two variants: BERT-Base, with 110 million parameters, and BERT-Large, with 340 million parameters.


<small>


[https://en.wikipedia.org/wiki/BERT_(language_model) Source : Wikipedia (BERT - Language model)]

[https://ai.googleblog.com/2018/11/open-sourcing-bert-state-of-art-pre.html Source : googleblog]

[https://www.kdnuggets.com/2021/11/guide-word-embedding-techniques-nlp.html Source : kdnuggets]
</small>




[[Catégorie:vocabulary]]

Revision as of 11:11, 18 November 2021
