wordEmbeddingLayer
Word embedding layer for deep learning neural networks
Description
A word embedding layer maps word indices to vectors.
Use a word embedding layer in a deep learning long short-term memory (LSTM) network. An LSTM network is a type of recurrent neural network (RNN) that can learn long-term dependencies between time steps of sequence data. A word embedding layer maps a sequence of word indices to embedding vectors and learns the word embedding during training.
This layer requires Deep Learning Toolbox™.
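For instance, a sketch of the typical preprocessing pipeline that produces such word indices (the example text and variable names are hypothetical):

% Tokenize text, create a word encoding, and convert the documents to
% sequences of word indices that a word embedding layer can consume.
documents = tokenizedDocument([
    "an example of a short sentence"
    "a second short sentence"]);
enc = wordEncoding(documents);
sequences = doc2sequence(enc,documents);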
Creation
Syntax

layer = wordEmbeddingLayer(dimension,numWords)
layer = wordEmbeddingLayer(dimension,numWords,Name,Value)

Description

layer = wordEmbeddingLayer(dimension,numWords) creates a word embedding layer and specifies the embedding dimension and vocabulary size.

layer = wordEmbeddingLayer(dimension,numWords,Name,Value) sets optional properties using one or more name-value pairs. Enclose each property name in single quotes.
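For example, a minimal sketch of both syntaxes (the embedding dimension, vocabulary size, and layer name below are illustrative values, not defaults):

% Create a word embedding layer that maps a vocabulary of 5000 words
% to 300-dimensional embedding vectors.
layer = wordEmbeddingLayer(300,5000);

% The same layer with an optional property set by a name-value pair.
layer = wordEmbeddingLayer(300,5000,'Name','emb');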
Properties
Examples
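A minimal sketch of a text classification architecture that uses a word embedding layer followed by an LSTM layer (all hyperparameter values here are illustrative assumptions):

% Define a network for classifying sequences of word indices.
inputSize = 1;
embeddingDimension = 100;
numWords = 5000;
numHiddenUnits = 180;
numClasses = 4;

layers = [
    sequenceInputLayer(inputSize)
    wordEmbeddingLayer(embeddingDimension,numWords)
    lstmLayer(numHiddenUnits,'OutputMode','last')
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

To initialize the layer with a pretrained embedding rather than learning the weights from scratch, one approach is to set the 'Weights' property from a pretrained word embedding such as fastTextWordEmbedding (this sketch assumes the weights matrix is dimension-by-numWords and requires the fastText support package):

% Initialize the embedding weights from a pretrained fastText embedding.
emb = fastTextWordEmbedding;
words = emb.Vocabulary;
dimension = emb.Dimension;
numWords = numel(words);

% word2vec returns one row vector per word, so transpose to get a
% dimension-by-numWords weight matrix.
layer = wordEmbeddingLayer(dimension,numWords, ...
    'Weights',word2vec(emb,words)');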
Extended Capabilities
Version History
Introduced in R2018b

See Also
trainNetwork (Deep Learning Toolbox) | doc2sequence | trainWordEmbedding | wordEncoding | lstmLayer (Deep Learning Toolbox) | sequenceInputLayer (Deep Learning Toolbox) | fastTextWordEmbedding | tokenizedDocument | word2vec