Performance of test data Neural Network
Hi, I'm using Greg Heath's k-fold cross-validation (CV) code for neural networks from this thread: <http://www.mathworks.com/matlabcentral/newsreader/view_thread/340857>. I used the Bayesian regularization training function and 3 folds, averaged the 3 fold results to represent the network's performance, repeated this procedure 10 times, and then compared the 10 networks to choose the one with the best performance. I expected the network with the best training performance to also produce the best testing performance, because with k-fold CV every sample is used for both training and testing, so the averaged result should represent all of the training and test sets. Unfortunately, the results were not what I expected: sometimes the network with the best training performance produced the worst testing performance. I also repeated the code for hidden-layer sizes from 3 to 10, but the same problem persisted and I could not find a network that produced both the best training and the best testing performance. Is there anything wrong in my code that could affect the performance results? Can anyone help?
This is the code.
k = 3
X = x'; Y = y';
[ I N ] = size(X) %[ 5 1792 ]
[O N ] = size(Y) %[ 1 1792 ]
MSE00 = mean(var(Y',1)) % 30.5996 Biased Reference
MSE00a = mean(var(Y')) % 30.6167 Unbiased Reference
rng(0)
ind0 = randperm(N);
M = floor(N/k) % length(valind & tstind)
Ntrn = N-2*M % length(trnind)
Ntrneq = Ntrn*O % No. training equations
H =3 % default No. hidden nodes
Nw = (I+1)*H+(H+1)*O % No. unknown weights
Ndof = Ntrneq-Nw % No. of estimation degrees of freedom
MSEgoal = 0.01*Ndof*MSE00a/Ntrneq
MinGrad = MSEgoal/10
nuNN=10; netoutp=cell(1,nuNN);
for j = 1:nuNN
clear perform Aveperfor net nets
net = feedforwardnet(H);
net=init(net);
nets = cell(1,k);
for i = 1:k
net.trainParam.goal = MSEgoal; % set on net, not nets{i}: train(net,...) reads net's
net.trainParam.min_grad = MinGrad; % parameters, and train overwrites nets{i} below
net.divideFcn = 'divideind';
valind = 1 + M*(i-1) : M*i;
if i==k
tstind = 1:M;
trnind = [ M+1:M*(k-1) , M*k+1:N ];
else
tstind = valind + M;
trnind = [ 1:valind(1)-1 , tstind(end)+1:N ];
end
trnInd = ind0(trnind); %Note upper & lower case "i"
valInd = ind0(valind);
tstInd = ind0(tstind);
net.divideParam.trainInd = trnInd;
net.divideParam.valInd = valInd;
net.divideParam.testInd = tstInd;
[ nets{i} tr{i} yy e ] = train( net, X,Y);
yt = nets{i}(X);
e = gsubtract(Y,yt);
bestepoch(i,1) = tr{i}.best_epoch;
R2trn(i,1) = 1 - tr{i}.best_perf/MSE00;
R2trna(i,1) = 1 -(Ntrneq/Ndof)* tr{i}.best_perf/MSE00a;
R2val(i,1) = 1 - tr{i}.best_vperf/MSE00;
R2tst(i,1) = 1 - tr{i}.best_tperf/MSE00;
performance(i,1) = perform(nets{i},Y,yt);
trainTargets = Y .* tr{i}.trainMask{1};
testTargets = Y .* tr{i}.testMask{1};
trainPerformance(i,1) = perform(nets{i},trainTargets,yt);
testPerformance(i,1) = perform(nets{i},testTargets,yt);
end
perfAll = [performance trainPerformance testPerformance]; % renamed from "perform" to avoid shadowing the perform() function
Aveperfor=[mean(performance,1) mean(trainPerformance,1) mean(testPerformance,1)];
netoutp{j}=Aveperfor;
end
D = cell2mat(netoutp');
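For what it's worth, once D holds one row per repetition, model selection is normally done on the held-out (test) column rather than the training column, since the training average rewards exactly the overfitting you are seeing. A minimal sketch, assuming D's columns are ordered [overall, train, test] as built by Aveperfor above (the variable names jBest and bestTestMSE are mine, not from the original code):

```matlab
% D(j,:) = [avgOverallMSE avgTrainMSE avgTestMSE] for repetition j
% (assumed column order, matching Aveperfor in the loop above).
[bestTestMSE, jBest] = min(D(:,3));   % rank by held-out average,
                                      % not by D(:,2), the training average
fprintf('best repetition: %d (avg test MSE %.4g)\n', jBest, bestTestMSE)
```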
Many thanks in advance.