Word2vec
Domain
Demo category, Deep learning
Definition
Preferred terms
English
word2vec
word2vec is an algorithm and tool for learning word embeddings by trying to predict the context of words in a document. The resulting word vectors have some interesting properties, for example vector('queen') ~= vector('king') - vector('man') + vector('woman'). Two different objectives can be used to learn these embeddings: the Skip-Gram objective tries to predict a context from a word, and the CBOW objective tries to predict a word from its context.
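
A minimal sketch of how this is typically used in practice, assuming the gensim library and a purely illustrative toy corpus (neither appears in the original entry): it trains word2vec with the Skip-Gram objective and then queries an analogy of the vector('king') - vector('man') + vector('woman') kind.

from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens (a real corpus would be far larger).
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks", "in", "the", "city"],
    ["the", "woman", "walks", "in", "the", "city"],
]

# sg=1 selects the Skip-Gram objective (predict context from a word);
# sg=0 would select CBOW (predict a word from its context).
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

# Analogy query: vector('king') - vector('man') + vector('woman')
print(model.wv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))

On a realistic corpus, the top result of such a query would be expected to include "queen"; on this toy corpus the output is only meant to show the API shape.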
Contributors: Claude Coulombe, Imane Meziani, wiki