Meaning of tr.best_perf
I'm training multiple ANNs, using different parameters for each (I may use two hidden layers in one case, three in another, also changing the number of neurons, and so on).
I am capturing all the tr.best_perf values for them. I understand that the lowest tr.best_perf gives me the best ANN configuration. But one question arises: what exactly is the meaning of tr.best_perf? Is tr.best_perf the RMS error function?

Thank you
Answers (1)
Greg Heath
on 13 Sep 2017
> I'm training multiple ANNs using different parameters for each (I may use two hidden layers in one case, three in another, also changing the number of neurons and so on).
It is usually sufficient to
1. Only use one hidden layer.
2. If I,H,O are the sizes of the Input, Hidden and Output layers,
only vary the number of hidden nodes (default H = 10)
3. Implicitly vary the values of the initial random weights via
changing the state of the random number generator RNG. However,
since the RNG automatically changes states whenever it generates
a new random value, it is only necessary to initialize the RNG once.
 [ x, t ] = simplefit_dataset;
 MSEref = mse( t - mean(t,2) )        % 8.3378
 rng(0)
 Ntrials = 10;
 for i = 1:Ntrials
     [ net, tr, y, e ] = train( fitnet, x, t );
     NMSE(i) = mse(e) / MSEref;
 end
 % Undeclared goal: NMSEgoal = 0.01
 NMSE = NMSE
 minmaxNMSE = minmax(NMSE)
 % NMSE = 1e-4 * [ 0.1756  0.2511  0.0074  0.0844  0.0080 ...
 %                 0.0250  0.0090  0.1906  0.6936  0.1255 ]
 %
 % minmax(NMSE) = [ 7.4e-7  6.94e-5 ]
Since all 10 designs are better than my undeclared goal of 0.01, the next step would be to increase stability by minimizing H below the default H = 10 subject to the constraint NMSE <= 0.01.
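That minimization can be sketched as a simple search over H. This is an illustrative sketch, not Greg's exact procedure: it assumes the x, t and MSEref from the snippet above, and the goal NMSEgoal = 0.01 is the undeclared goal mentioned there.

```matlab
% Sketch: find the smallest number of hidden nodes H that still
% meets the NMSE goal. Assumes x, t, MSEref are already defined.
NMSEgoal = 0.01;
rng(0)
for H = 1:10
    [ net, tr, y, e ] = train( fitnet(H), x, t );   % fitnet(H): H hidden nodes
    if mse(e)/MSEref <= NMSEgoal
        Hmin = H        % smallest H with acceptable performance
        break
    end
end
```

In practice you would repeat each H for several random initializations (as in the Ntrials loop above) before declaring it acceptable, since a single trial can fail by bad luck in the initial weights.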
> I am capturing all tr.best_perf values for them. I do understand that the least tr.best_perf gives me the best possible ANN configuration. But one question arises: what exactly is the meaning of tr.best_perf? Is tr.best_perf the RMS error function?
1. You cannot get unbiased estimates by using training subset performance.
2. That is why the MATLAB default is to also use nontraining validation and test subsets.
3. The test subset is assumed to be unavailable to the designer. Therefore, the test subset performance is assumed to be unbiased and is the only performance measure a savvy sponsor ($) will accept.
4. Although the validation subset is not used to directly calculate weights, it is used in design to prevent unacceptable performance on nontraining data. Therefore, the validation subset performance is assumed to be only slightly biased and can be used by the designer to impartially rank multiple designs.
5. It helps to think of the following decompositions:
 total     = design + nondesign
 design    = train + val
 nondesign = test
and, correspondingly:
 unbiased        = test
 slightly biased = val
 totally biased  = train
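Those three subset performances are recorded separately in the training record. A minimal sketch, assuming the default 0.70/0.15/0.15 train/val/test split of the toolbox:

```matlab
% The training record tr holds one best-epoch performance per subset.
[ net, tr ] = train( fitnet, x, t );

tr.best_perf    % training subset performance   (totally biased)
tr.best_vperf   % validation subset performance (slightly biased)
tr.best_tperf   % test subset performance       (unbiased)
```

It is tr.best_tperf, not tr.best_perf, that gives an honest estimate of performance on new data.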
%%%%%%%%%%%%%% END OF LECTURE !!! %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
LOOK AT YOUR OUTPUT:
 tr.performFcn = 'mse'
 tr.best_perf  = 5.09e-7   % the lowest training-subset mse
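So, to answer the question directly: tr.best_perf is the best (lowest) value of the network's performance function on the training subset, and by default that performance function is mse, not the RMS error. If you want an RMS figure, take the square root yourself:

```matlab
% tr.best_perf is a mean-squared error; the RMS error is its square root.
RMSE = sqrt( tr.best_perf )   % training-subset RMS error
```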
If I were you, I would
1. Obtain Ntrials designs using MATLAB defaults.
2. Determine the minimum number of hidden nodes that will yield satisfactory performance.
Hope this helps
Thank you for formally accepting my answer
Greg