Problem with ANN output values!

Hi there,
I have a problem with my ANN. I have a bunch of audio files; I extracted MFCC features from them and built a numeric matrix as the input, then labeled each sample with a class number from 1 to 4 and trained my ANN. It gives good results according to MSE. Now I want to use another audio sample: I extract its features and store them as another numeric matrix, but when I do that and call
output = net(Sample1)
X = round(output)
it gives me the result 9, even though I only have 4 classes. I thought the problem was with the sample, so I tried another one, but it gave the same result. Below I will post the code I used; please help if you can.
This is the code for MFCC extraction:
function mfccs = msf_mfcc(speech,fs,varargin)
% Compute MFCC features from a speech signal sampled at fs Hz.
p = inputParser;
addOptional(p,'winlen',      0.025, @(x)gt(x,0));  % window length (s)
addOptional(p,'winstep',     0.01,  @(x)gt(x,0));  % window step (s)
addOptional(p,'nfilt',       26,    @(x)ge(x,1));  % number of mel filters
addOptional(p,'lowfreq',     0,     @(x)ge(x,0));  % lowest filterbank edge (Hz)
addOptional(p,'highfreq',    fs/2,  @(x)ge(x,0));  % highest filterbank edge (Hz)
addOptional(p,'nfft',        512,   @(x)gt(x,0));  % FFT size
addOptional(p,'ncep',        13,    @(x)ge(x,1));  % number of cepstral coefficients
addOptional(p,'liftercoeff', 22,    @(x)ge(x,0));  % liftering coefficient
addOptional(p,'appendenergy',true,  @(x)ismember(x,[true,false]));
addOptional(p,'preemph',     0,     @(x)ge(x,0));  % pre-emphasis coefficient
parse(p,varargin{:});
in = p.Results;

H = msf_filterbank(in.nfilt, fs, in.lowfreq, in.highfreq, in.nfft);
pspec = msf_powspec(speech, fs, 'winlen', in.winlen, 'winstep', in.winstep, 'nfft', in.nfft);
en = sum(pspec,2);                        % total energy per frame
feat = dct(log(H*pspec'))';               % log mel filterbank energies -> DCT
mfccs = lifter(feat(:,1:in.ncep), in.liftercoeff);
if in.appendenergy
    mfccs(:,1) = log10(en);               % replace first coefficient with log energy
end
end

function lcep = lifter(cep,L)
% Apply sinusoidal liftering to the cepstral coefficients.
[N,D] = size(cep);
n = 0:D-1;
lift = 1 + (L/2)*sin(pi*n/L);
lcep = cep .* repmat(lift,N,1);
end
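For context, this is roughly how I call the function on one file (the file name is just an illustration; msf_powspec and msf_filterbank come from the same feature-extraction toolbox and must be on the path):

```matlab
% Illustrative call; 'sample1.wav' is a placeholder file name.
[speech, fs] = audioread('sample1.wav');
mfccs = msf_mfcc(speech, fs, 'winlen', 0.025, 'nfilt', 26, 'ncep', 13);
% mfccs is numFrames-by-13; transpose so each column is one feature
% vector, matching the column-per-sample convention used by train().
input = mfccs';
```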
And this is the code for my ANN:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created 07-Jun-2019 17:51:17
%
% This script assumes these variables are defined:
%
% input - input data.
% target - target data.
x = input;
t = target;
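% The variables above are assumed to have this layout (illustrative;
% this mirrors how I build the matrices from the msf_mfcc output):
%   input  : features-by-samples matrix (each column one feature vector)
%   target : 1-by-samples row vector of class labels in 1..4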
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainlm'; % Levenberg-Marquardt backpropagation.
% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);
% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
net.output.processFcns = {'removeconstantrows','mapminmax'};
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivision
net.divideFcn = 'dividerand'; % Divide data randomly
net.divideMode = 'sample'; % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse'; % Mean Squared Error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
'plotregression', 'plotfit'};
% Train the Network
[net,tr] = train(net,x,t);
% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)
% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
%figure, plotregression(t,y)
%figure, plotfit(net,x,t)
% Deployment
% Change the (false) values to (true) to enable the following code blocks.
% See the help for each generation function for more information.
if (false)
    % Generate a MATLAB function for the neural network, for application
    % deployment in MATLAB scripts or with MATLAB Compiler and Builder
    % tools, or simply to examine the calculations your trained neural
    % network performs.
    genFunction(net,'myNeuralNetworkFunction');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a matrix-only MATLAB function for neural network code
    % generation with MATLAB Coder tools.
    genFunction(net,'myNeuralNetworkFunction','MatrixOnly','yes');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a Simulink diagram for simulation or deployment with
    % Simulink Coder tools.
    gensim(net);
end
Best regards!

Answers (0)

Release: R2018a
Asked on 7 Jun 2019
