NaN in Neural network training and simulation; tonndata
Hello,
I have two questions. Thank you very much for any input and ideas!
1) How do I deal with NaN in neural network training and simulation? The datasets I am using are as follows: the input dataset is a 6x204 matrix with several NaNs, the target (output) dataset is a 6x204 matrix with many NaNs, and my simulation dataset is a 6x864000 matrix with many NaNs. I used the nntool GUI to train the network and run the simulation, but the simulation results contain numbers even for samples that are all NaN. Is there a way to keep NaN as NaN, rather than having it replaced by anything, during training and simulation?
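In other words, do I have to handle this myself with something like the rough sketch below (x, t, and xsim are just placeholders for my 6x204 input, 6x204 target, and 6x864000 simulation matrices), or can the toolbox keep the NaNs in place for me?
 % Rough sketch of a manual workaround: train only on NaN-free columns,
 % then put NaN back into the simulation output for the bad columns.
 goodTrain = ~any(isnan(x),1) & ~any(isnan(t),1);  % samples (columns) without NaN
 net = fitnet(10);
 net = train(net, x(:,goodTrain), t(:,goodTrain)); % train on the clean samples only
 goodSim = ~any(isnan(xsim),1);                    % NaN-free simulation columns
 ysim = nan(size(t,1), size(xsim,2));              % start from an all-NaN output
 ysim(:,goodSim) = net(xsim(:,goodSim));           % fill in only the valid columns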
2) When I use the cell array generated by tonndata, the neural network treats the cell array as one sample, so there is no way to separate the samples into training, validation, and test data. Can anyone explain why cell arrays are used in the neural network toolbox?
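For a static fitting problem like mine, I assume I can just pass the plain matrices and let the divide parameters split the columns (samples), roughly like this (x and t again stand in for my 6x204 matrices):
 % Sketch: matrices in, default random division by sample (column)
 net = fitnet(10);
 net.divideFcn = 'dividerand';
 net.divideParam.trainRatio = 0.70;
 net.divideParam.valRatio   = 0.15;
 net.divideParam.testRatio  = 0.15;
 [net, tr] = train(net, x, t);   % tr.trainInd, tr.valInd, tr.testInd give the split
 % (As far as I understand, cell arrays such as those produced by tonndata are
 % the toolbox's time-series form, with each cell column a time step, which is
 % why my converted matrix ends up looking like a single sample.)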
I have searched but could not find a good answer or documentation on these two issues. Thank you very much for any input and help!
Accepted Answer
Greg Heath on 22 Jun 2015
Do not refer to NaN as "No value". It stands for "Not a Number" and is just referred to as NaN, pronounced "en-ay-en".
 close all, clear all, clc
 [ I, N ]      = size(x)                % [ 6 5 ]  (small example data set)
 [ O, N ]      = size(t)                % [ 6 5 ]
 net           = fitnet;                % default H = 10
 net.divideFcn = 'dividetrain';         % not much data, so no val/test split
 net.trainFcn  = 'trainbr';             % Bayesian regularization mitigates overtraining
 Hub = -1 + ceil((N*O-O)/(I+O+1))       % Hub = 1, so H = 10 is overfitting: need overtraining mitigation
 rng('default')
 for i = 1:20
    net           = configure(net,x,t);
    [ net, tr ]   = train(net,x,t);     % train with trainbr
    y             = net(x);
    stopcrit{i,1} = tr.stop;
    MSE(i,1)      = mse(t-y);
 end
 lasty = y
% = [  2.4050    2.4050    2.4050    2.4050    2.4050
%      1.2000    1.2000    1.2000    1.2000    1.2000
%      1.6819       NaN    0.6760       NaN    1.5590
%     -1.6050   -1.6050   -1.6050   -1.6050   -1.6050
%      1.5000    1.5000    1.5000    1.5000    1.5000
%      1.0670       NaN    1.9648       NaN    1.1768 ]
 stopcrit1 = stopcrit{1}
% = Minimum gradient reached.
stopcrit = stopcrit   % all 20 entries are the same, i.e. repmat(stopcrit1,20,1)
 MSEp     = MSE'
% MSEp = 1e-17 * [ ...
%   104.51     6.77     8.44   103.22   200.35     0.30     0.14
%   131.41   177.22   195.06   832.83     0.55     0.22     0.89
%     0.87     0.85   583.46   430.18     0.54     0.33  ]
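If you want to keep the best of the 20 designs rather than just the last one, a small variation of the loop (only a sketch; bestnet and bestMSE are new names) tracks it as you go:
 % Sketch: keep whichever of the 20 designs has the lowest training MSE
 bestMSE = Inf;
 for i = 1:20
    net       = configure(net,x,t);
    [net, tr] = train(net,x,t);
    y         = net(x);
    MSE(i,1)  = mse(t-y);
    if MSE(i,1) < bestMSE
       bestMSE = MSE(i,1);             % best performance so far
       bestnet = net;                  % and the corresponding network
    end
 end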
Greg Heath on 23 Jun 2015
Don't skip the calculation. It helps set an upper bound on the search for an optimal H. For example, it is desirable to have H << Hub.
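Applying the same formula to the 6x204 training set in the question (I = 6, O = 6, N = 204):
 I = 6; O = 6; N = 204;                 % dimensions from the question
 Hub = -1 + ceil((N*O-O)/(I+O+1))       % = -1 + ceil(1218/13) = 93
 % so the search for an optimal H should stay well below 93 (H << Hub)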