Out of memory during neural network training

I know this is a common problem, but all the solutions I have tried have failed.
Basically, I want to train a big neural network and I get an 'Out of memory' error.
My training set is a 729x3456 matrix of doubles, and the network is a so-called 'autoencoder' with layers of these sizes:
3456 - 4000 - 2000 - 1000 - 300 - 1000 - 2000 - 4000 - 3456
In my code, first of all I do
net = feedforwardnet([layer1, layer2, layer3, layer4, layer3, layer2, layer1], 'trainscg');
net = configure(net, Dtrain', Dtrain');
where I use the 'trainscg' training function because I read that it is the one that uses the least memory. Then I initialize the weights and biases to some values I have already calculated, set the 'transferFcn', and start training.
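For concreteness, here is a minimal sketch of that setup, assuming layer1..layer4 are 4000, 2000, 1000 and 300 (the sizes listed above), that Dtrain is the 729x3456 training matrix, and that 'logsig' is the transfer function (the exact values and weights I use may differ):

hidden = [4000 2000 1000 300 1000 2000 4000];
net = feedforwardnet(hidden, 'trainscg');        % scaled conjugate gradient
net = configure(net, Dtrain', Dtrain');          % autoencoder: targets = inputs
for i = 1:numel(net.layers)
    net.layers{i}.transferFcn = 'logsig';        % assumed transfer function
end
% pre-computed weights and biases would be assigned to net.IW, net.LW, net.b here
[net, tr] = train(net, Dtrain', Dtrain');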
I tried clearing the workspace as much as possible, and I also tried putting
net.efficiency.memoryReduction = 4;
before training, since I read it can help. I still get 'Out of memory', even if I increase the value to 60.
Here is the output of the command 'memory', executed when the workspace contains just the training set and four numbers (the sizes of the layers):
>> memory
Maximum possible array: 4508 MB (4.727e+09 bytes) *
Memory available for all arrays: 4508 MB (4.727e+09 bytes) *
Memory used by MATLAB: 1927 MB (2.020e+09 bytes)
Physical Memory (RAM): 8080 MB (8.472e+09 bytes)
* Limited by System Memory (physical + swap file) available.
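For scale, a rough estimate of how big this network is (weights only, ignoring biases and the optimizer's working copies):

layers = [3456 4000 2000 1000 300 1000 2000 4000 3456];
nWeights = sum(layers(1:end-1) .* layers(2:end))   % about 4.8e7 weights
nWeights * 8 / 1e9                                 % about 0.39 GB per double-precision copy

Since training needs several arrays of that size (weights, gradients, search directions), the roughly 4.5 GB that MATLAB can address here gets used up very quickly.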
What else can I do to solve the problem?

 Accepted Answer

You will never be able to solve a problem of that size. I suggest:
1. Using feature extraction to SUBSTANTIALLY reduce the input dimensionality (see the sketch below).
2. Using no more than 1 or 2 hidden layers.
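As a minimal sketch of suggestion 1 using PCA, assuming Dtrain has one observation per row and the Statistics and Machine Learning Toolbox is available (the 300 retained components and the smaller layer sizes are only illustrative):

[coeff, score] = pca(Dtrain, 'NumComponents', 300);   % keep the top 300 principal components
Dreduced = score;                                     % 729x300 reduced training set
net = feedforwardnet([150 50 150], 'trainscg');       % a much smaller autoencoder
net = configure(net, Dreduced', Dreduced');
net = train(net, Dreduced', Dreduced');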

4 Comments

Thank you for your answer, but I am afraid I will have to solve that problem.
Using feature extraction is not feasible, since what I am trying to build is precisely a feature extractor.
The number of hidden layers has to be high as well, since this approach is called "deep learning" and relies on a large number of hidden layers.
Is there anything else I can try, to "split" the training into several small "pieces" that the PC can handle without errors?
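One way to split the job into smaller pieces that I could try is greedy layer-wise pretraining: train one shallow autoencoder at a time and use its codes as the input of the next one. A minimal sketch, assuming the layer sizes above and 'logsig' as transfer function:

X = Dtrain';                              % 3456x729, one sample per column
sizes = [4000 2000 1000 300];
for k = 1:numel(sizes)
    ae = feedforwardnet(sizes(k), 'trainscg');
    ae = configure(ae, X, X);             % shallow autoencoder: targets = inputs
    ae.layers{1}.transferFcn = 'logsig';
    ae = train(ae, X, X);
    % propagate through the hidden layer to obtain the codes for the next stage
    X = logsig(bsxfun(@plus, ae.IW{1,1} * X, ae.b{1}));
end

Whether the stacked result would still need a full fine-tuning pass is another question.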
minomic on 20 Apr 2015
Edited: minomic on 20 Apr 2015
Maybe I have found a way: I increased the size of the paging file, and now when I run 'memory' MATLAB says I have more than 21 GB available...
I will try this and report back whether the problem is solved.

