wordEmbedding
Word embedding model to map words to vectors and back
Description
A word embedding, popularized by the word2vec, GloVe, and fastText libraries, maps words in a vocabulary to real vectors.
The vectors attempt to capture the semantics of the words, so that similar words have similar vectors. Some embeddings also capture relationships between words, such as "king is to queen as man is to woman". In vector form, this relationship is king – man + woman = queen.
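For example, a minimal sketch of this analogy with a wordEmbedding object, assuming a pretrained English embedding loaded with fastTextWordEmbedding (this requires the fastText support package, and the nearest word returned depends on the embedding used):

% Load a pretrained word embedding.
emb = fastTextWordEmbedding;

% Map the analogy words to their vectors.
king  = word2vec(emb,"king");
man   = word2vec(emb,"man");
woman = word2vec(emb,"woman");

% The vector king - man + woman maps back to a word close to "queen".
vec2word(emb,king - man + woman)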
Creation
Create a word embedding by loading a pretrained embedding using fastTextWordEmbedding, by reading an embedding from a file using readWordEmbedding, or by training an embedding using trainWordEmbedding.
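A minimal sketch of each approach follows; the file name "embedding.vec" and the training text are placeholders, and fastTextWordEmbedding requires the fastText support package:

% Load a pretrained embedding.
emb = fastTextWordEmbedding;

% Read an embedding from a word2vec or GloVe format text file.
emb = readWordEmbedding("embedding.vec");

% Train an embedding from a collection of tokenized documents.
documents = tokenizedDocument([
    "an embedding maps words to vectors"
    "similar words map to similar vectors"]);
emb = trainWordEmbedding(documents,"Dimension",50,"MinCount",1);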
Properties
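As a rough guide only (verify the full property list against your release), a wordEmbedding object exposes the dimension of the word vectors and the words in its vocabulary:

emb = fastTextWordEmbedding;
emb.Dimension    % number of elements in each word vector
emb.Vocabulary   % string vector of the words in the embedding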
Object Functions
vec2word           | Map embedding vector to word
word2vec           | Map word to embedding vector
isVocabularyWord   | Test if word is member of word embedding or encoding
writeWordEmbedding | Write word embedding file
Examples
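Explore a word embedding with the object functions above. This is a hedged sketch assuming the pretrained fastText English embedding is installed; "myEmbedding.vec" is a placeholder file name.

% Load a pretrained word embedding.
emb = fastTextWordEmbedding;

% Test which words are members of the vocabulary.
tf = isVocabularyWord(emb,["Italy" "Rome" "notaword"])

% Map words to vectors, and map the vectors back to the nearest words.
V = word2vec(emb,["Italy" "Rome" "Paris"]);
words = vec2word(emb,V)

% Write the embedding to a text file.
writeWordEmbedding(emb,"myEmbedding.vec")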
Version History
Introduced in R2017b