Sparse Autoencoder with Adam optimization
Hello!
I have the NSL-KDD dataset, split into four parts: 1) train attributes (121x125973 double), 2) train labels (1x125973 double), 3) test attributes (121x22544 double), and 4) test labels (1x22544 double). It is ready for the algorithm.
I applied a sparse autoencoder, and it works without any problem:
% Train the sparse autoencoder with L-BFGS via minFunc
options.Method = 'lbfgs';
options.maxIter = maxIter;
options.useMex = 0;
[opttheta, cost] = minFunc(@(p) sparseAutoencoderCost(p, inputSize, ...
    hs, l1, sp, beta, trainAttr), theta, options);
% Extract the learned features for the train and test sets
trainFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, trainAttr);
testFeatures  = feedForwardAutoencoder(opttheta, hs, inputSize, testAttr);
But when I try to optimize the result using the Adam optimizer, I get this error: "Unrecognized property 'GRADIENTDECAYFACTOR' for class 'nnet.cnn.TrainingOptionsADAM'."
This is my code:
options = trainingOptions('adam', ...
'InitialLearnRate',3e-4, ...
'SquaredGradientDecayFactor',0.99, ...
'MaxEpochs',20, ...
'MiniBatchSize',64, ...
'Plots','training-progress');
[opttheta, cost] = minFunc(@(p) sparseAutoencoderCost(p, inputSize, ...
    hs, l1, sp, beta, trainAttr), theta, options);
trainFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, trainAttr);
testFeatures  = feedForwardAutoencoder(opttheta, hs, inputSize, testAttr);
I wonder how I can apply a sparse autoencoder with Adam optimization?
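One likely source of the error: the object returned by trainingOptions('adam', ...) is only understood by trainNetwork (Deep Learning Toolbox); minFunc expects its own plain options struct and does not support Adam. A possible workaround, sketched below under the assumption that sparseAutoencoderCost returns [cost, grad] as in the L-BFGS code above, is to write the Adam update rule by hand against the same cost function. The step size and iteration count here are illustrative, not tuned; beta1/beta2 are named to avoid clashing with the existing sparsity weight beta.

```matlab
% Hand-rolled Adam loop over the same cost/gradient used with minFunc.
% Assumes theta, inputSize, hs, l1, sp, beta, trainAttr are defined as above.
alpha = 3e-4;           % learning rate (same as InitialLearnRate above)
beta1 = 0.9;            % first-moment decay
beta2 = 0.999;          % second-moment decay
epsAdam = 1e-8;         % numerical stabilizer
numIter = 2000;         % illustrative; pick to taste

m = zeros(size(theta)); % first-moment estimate
v = zeros(size(theta)); % second-moment estimate
for t = 1:numIter
    [cost, grad] = sparseAutoencoderCost(theta, inputSize, ...
        hs, l1, sp, beta, trainAttr);
    m = beta1*m + (1 - beta1)*grad;        % update biased first moment
    v = beta2*v + (1 - beta2)*grad.^2;     % update biased second moment
    mhat = m / (1 - beta1^t);              % bias correction
    vhat = v / (1 - beta2^t);
    theta = theta - alpha * mhat ./ (sqrt(vhat) + epsAdam);
end
opttheta = theta;
```

After this loop, opttheta can be passed to feedForwardAutoencoder exactly as in the L-BFGS version. This is a full-batch sketch; a mini-batch variant would slice trainAttr inside the loop.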