Minimizing the Parameters used in the neural net using the Genetic Algorithm

I am trying to reproduce the results of a research paper that minimizes the number of parameters used as input to a neural net. The original input has 18 parameters, and the author of the paper reduces them to 3. I generated the MATLAB code from the Neural Network Toolbox and then used the example for optimizing a neural net with genetic algorithms posted on MATLAB Answers as a reference to reach my goal.

As I am new to genetic algorithms and neural nets, I am not sure how to determine which input parameters are sufficient for the neural net to compute its output correctly. To be more precise: which output argument of the genetic algorithm gives me those parameters? (These parameters should be produced as an output of the genetic algorithm.) Secondly, what is the concept of "population" in a genetic algorithm, and how should I generate it? The fitness function I am using is MSE.

For now I am using a sample data set shipped with MATLAB to keep things simple. This input has 13 features, which might be reduced to 2 or 3 depending on the optimization.
inputs = wineInputs;
targets = wineTargets;

% Create a Pattern Recognition Network
hiddenLayerSize = 10;
net = patternnet(hiddenLayerSize);

% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'};

% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';     % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;

% For help on training function 'trainlm' type: help trainlm
% For a list of all training functions type: help nntrain
net.trainFcn = 'trainlm';  % Levenberg-Marquardt

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
    'plotregression', 'plotfit'};

% Train the Network
[net,tr] = train(net,inputs,targets);

% Test the Network
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs);

% Recalculate Training, Validation and Test Performance
trainTargets = targets .* tr.trainMask{1};
valTargets = targets .* tr.valMask{1};
testTargets = targets .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,outputs);
valPerformance = perform(net,valTargets,outputs);
testPerformance = perform(net,testTargets,outputs);

h = @(x) mse(x, net, inputs, targets);
ga_opts = gaoptimset('TolFun', 1e-8, 'Display', 'iter');
% [x_ga_opt, err_ga] = ga(h, 10, ga_opts);
[x,fval,exitflag,output,population] = ga(h, 10, ga_opts);
% returns the matrix, population, whose rows are the final population

% View the Network
view(net)

Thanks for help in advance.
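To make clearer what I am after, here is a rough sketch of how I imagine the feature-selection step could look. This is my own attempt, not the paper's code; the names `featureMSE`, `mask`, and `selectedFeatures` are mine, and I am not sure this is the right way to wire `ga` to the network.

```matlab
% Sketch: GA-based feature selection (my own guess, not the paper's method).
% Each individual ("chromosome") is a binary row vector of length nFeatures;
% a 1 means that input feature is kept. The population is simply a matrix
% whose rows are such vectors; ga generates and evolves it for you, with
% 'PopulationSize' individuals per generation.
nFeatures = size(inputs, 1);

fitness = @(mask) featureMSE(mask, inputs, targets);

ga_opts = gaoptimset('PopulationType', 'bitstring', ...
                     'PopulationSize', 30, ...
                     'Display', 'iter');

% x is the best individual found: a 0/1 vector saying which features to keep.
[x, fval] = ga(fitness, nFeatures, ga_opts);
selectedFeatures = find(x);   % indices of the chosen input parameters

function err = featureMSE(mask, inputs, targets)
    % Train a small pattern net on the selected features only and
    % return its MSE as the fitness value (lower is better).
    mask = logical(mask);
    if ~any(mask), err = Inf; return; end   % empty feature set is invalid
    net = patternnet(10);
    net.trainParam.showWindow = false;      % suppress the training GUI
    net = train(net, inputs(mask, :), targets);
    outputs = net(inputs(mask, :));
    err = perform(net, targets, outputs);   % net.performFcn is 'mse'
end
```

If this is roughly right, then the first output argument of `ga` (the best individual `x`) would be the thing that tells me which parameters are sufficient, rather than the `population` output. Is that correct?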

Answers (0)


Asked: 13 Mar 2016
Edited: 13 Mar 2016
