Why is my LSTM network taking 3 hours to train on a single CPU?
I have been training an LSTM network on 5000 samples of numerical data (each containing 100 time steps). I am perplexed as to why this network trains so much more slowly than other networks I have experimented with in patternnet. Is this just the nature of the LSTM architecture, or could something in my code be causing the network to run slower than normal? Thanks. My slow code is below:
RTrain = Sample5000; % 100x5000 double; each of the 5000 columns is one
% sample with 100 time steps. There are 4 classes represented in this
% sample (1250 of each).
% Assign a class label to each of the 4 cases
m = zeros(1, 5000);
m(1, 1:1250) = 1;    % samples 1-1250 get label 1,
m(1, 1251:2500) = 2; % samples 1251-2500 get label 2,
m(1, 2501:3750) = 3; % and so on
m(1, 3751:5000) = 4;
m = transpose(m);
Categories = categorical(m);
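For reference, the sequence-classification examples in the documentation store the training data as an N-by-1 cell array of inputSize-by-numTimeSteps sequences rather than one big matrix; here is how I understand that conversion would look for my data (RTrainSeq and CategoriesAlt are just illustrative names, assuming each column of Sample5000 is one sample):
% Illustrative sketch (not what I originally ran): trainNetwork's
% sequence-to-label workflow expects an N-by-1 cell array where each
% cell is an inputSize-by-numTimeSteps matrix, i.e. 1x100 here.
RTrainSeq = num2cell(Sample5000', 2); % 5000x1 cell, each cell a 1x100 double
% The labels can also be built in one line:
CategoriesAlt = categorical(repelem((1:4)', 1250)); % 5000x1, 1250 per class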
%% Define LSTM Network Architecture
inputSize = 1;
outputSize = 100; % number of hidden units in the LSTM layer (not the number of time steps)
outputMode = 'last'; % output only the last time step, for sequence-to-label classification
numClasses = 4;
layers = [ ...
    sequenceInputLayer(inputSize)
    lstmLayer(outputSize,'OutputMode',outputMode)
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer]
% Adjust training options
maxEpochs = 150;
options = trainingOptions('sgdm', ...
    'MaxEpochs',maxEpochs);
%% Train LSTM Network
net = trainNetwork(RTrain,Categories,layers,options);
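In case it matters for the timing, these are the trainingOptions settings I understand to affect training speed; the values below are only illustrative (optionsFast is a hypothetical name), and the GPU path assumes a supported GPU plus Parallel Computing Toolbox:
% Illustrative sketch: larger mini-batches mean fewer iterations per
% epoch, and a GPU is usually much faster than a single CPU for LSTMs.
optionsFast = trainingOptions('sgdm', ...
    'MaxEpochs',maxEpochs, ...
    'MiniBatchSize',250, ... % default is 128
    'ExecutionEnvironment','auto', ... % uses a supported GPU automatically if present
    'Plots','training-progress'); % monitor per-iteration progress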