How to deal with an out-of-memory error when training a large data set or a large neural network (NARX)?

How do I deal with an out-of-memory error when training a large data set or a large neural network (stacked autoencoder)? I have just started analyzing deep neural networks for time series prediction. My guess is to use batch training, but can anyone help me with the details?
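The batch-training idea the poster guesses at can be sketched as follows. This is an illustrative mini-batch loop written in Python/NumPy (the thread itself concerns MATLAB's Deep Learning Toolbox); all names, sizes, and the simple least-squares model are assumptions for demonstration. The key point is that only one small slice of the data is touched per update, so peak memory no longer scales with the full data set.

```python
import numpy as np

# Hypothetical mini-batch training sketch (not the poster's actual setup):
# instead of presenting all samples at once, update the model on small slices.
rng = np.random.default_rng(0)
X = rng.standard_normal((2795, 4))        # 2795 samples, 4 features (illustrative)
true_w = np.array([1.0, -2.0, 0.5, 3.0])  # synthetic ground truth
y = X @ true_w + 0.01 * rng.standard_normal(2795)

w = np.zeros(4)        # model parameters
lr = 0.01              # learning rate
batch_size = 64        # only this many samples are in play per update

for epoch in range(50):
    perm = rng.permutation(len(X))              # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]                 # one small batch
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)  # mean-squared-error gradient
        w -= lr * grad                          # gradient step on the batch
```

In MATLAB's Deep Learning Toolbox the same effect is obtained by setting a mini-batch size in the training options rather than writing the loop by hand.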

Answers (2)

What are
1. input and target
2. size(input), size(target)
3. Significant lags of
a. the target autocorrelation function
b. the input/target cross-correlation function
I am extremely sorry, I wrote NARX but I actually tried it with a stacked autoencoder with hidden layers [100 150]. Even on an HPC server I was getting the same out-of-memory error.
The inputs are 4x2795 and the output is 1x2795.
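The answer's item 3 (significant lags of the target autocorrelation function) can be sketched like this. This is a hedged Python/NumPy illustration on a synthetic series, not the poster's data; the 95% bound 1.96/sqrt(N) is the standard white-noise approximation for judging which lags are significant.

```python
import numpy as np

# Sketch: compute the target autocorrelation and flag significant lags.
# The series below is synthetic (period-50 sine plus noise), for illustration only.
rng = np.random.default_rng(1)
N = 2795
t = np.arange(N)
target = np.sin(2 * np.pi * t / 50) + 0.5 * rng.standard_normal(N)

x = target - target.mean()
# Normalized autocorrelation for lags 0..N-1 (lag 0 sits at index N-1 of 'full').
acf = np.correlate(x, x, mode="full")[N - 1:] / (x @ x)

bound = 1.96 / np.sqrt(N)                             # 95% white-noise bound
sig_lags = np.flatnonzero(np.abs(acf[:100]) > bound)  # significant lags below 100
```

The same quantities guide input delay and feedback delay choices for NARX; a cross-correlation between input and target (item 3b) is computed analogously with `np.correlate(input, target, ...)`.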


Asked: SAM on 15 Apr 2015
Edited: SAM on 15 Apr 2015
