How can I increase classification accuracy after feature extraction
I need to get a higher CA than the current value, which comes out randomly between 60 and 70 because AEK_C1/C2 and VK_C1/C2 are assigned randomly.
clear
close all
clc
load TrainData.mat
load LabelTrain.mat
% Divide the training set into classes C1 and C2
ndx1=(LabelTrain==1);
TrainDataC1=TrainData(:,:,ndx1);
ndx2=(LabelTrain==2);
TrainDataC2=TrainData(:,:,ndx2);
% Assign half of C1 to AEK (training half) and the other half to VK (validation half), RANDOMLY
% randomsayi appears to be a custom helper returning a random permutation of
% 1..70 as a column vector (the built-in randperm(70).' would do the same)
r1=randomsayi(1,70,70);
AEK_C1=TrainDataC1(:,:,r1(1:35,1));
VK_C1=TrainDataC1(:,:,r1(36:70,1));
%Assigning half of C2 to AEK and the other half to VK, RANDOMLY
r2=randomsayi(1,70,70);
AEK_C2=TrainDataC2(:,:,r2(1:35,1));
VK_C2=TrainDataC2(:,:,r2(36:70,1));
AEK=cat(3,AEK_C1,AEK_C2);
AEK_Label= [ones(1,35) 2*ones(1,35)];
VK=cat(3,VK_C1,VK_C2);
VK_Label= [ones(1,35) 2*ones(1,35)];
%clearvars -except AEK VK AEK_Label VK_Label
clear LabelTrain
E=2; % use only channel (column) 2 of each trial
FV_AEK=zeros(70,2); % preallocate: one [std kurtosis] row per trial
for i=1:70
Trial=AEK(:,E,i);
FV_AEK(i,:)=[std(Trial) kurtosis(Trial)];
end
FV_VK=zeros(70,2);
for i=1:70
Trial=VK(:,E,i);
FV_VK(i,:)=[std(Trial) kurtosis(Trial)];
end
%Validation process
for k=1:35
class = knnclassify(FV_VK,FV_AEK,AEK_Label,k,'euclidean','nearest'); % predict labels for the VK set
cp=classperf(VK_Label,class); % compare predicted and actual labels
CA(k)=cp.CorrectRate*100; % classification accuracy in percent
end
[BestCA,k]=max(CA)
FV_Train=[FV_VK;FV_AEK];
TrainLabel2= [VK_Label AEK_Label];
load TestData
FV_Test=zeros(140,2); % preallocate
for i=1:140
Trial=TestData(:,E,i);
FV_Test(i,:)=[std(Trial) kurtosis(Trial)];
end
TestLabel_MG = knnclassify(FV_Test,FV_Train,TrainLabel2,k,'euclidean','nearest'); % predict labels for the test set using the best k
% plot(FV(1:70,1),FV(1:70,2),'+')
% hold on
% plot(FV(71:140,1),FV(71:140,2),'ro')
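Side note: knnclassify and classperf came from the old Bioinformatics Toolbox and have been removed from current MATLAB releases. A minimal sketch of the same validation loop using fitcknn/predict from the Statistics and Machine Learning Toolbox (this is an assumed equivalent, not part of the base code):

CA = zeros(1,35);
for k = 1:35
    % Train a k-NN model on the AEK half, then predict the VK half
    mdl   = fitcknn(FV_AEK, AEK_Label, 'NumNeighbors', k, 'Distance', 'euclidean');
    class = predict(mdl, FV_VK);
    CA(k) = mean(class(:) == VK_Label(:)) * 100; % accuracy in percent
end
[BestCA, k] = max(CA)

The logic is identical: for each candidate k, classify the held-out VK features and keep the k that scores highest.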
This is the base code given to me. It uses feature extraction, and if I add another feature like skewness or remove one of the existing ones, the output doesn't change. So I thought that after extraction I could use feature selection to get a better result, but either I couldn't apply it properly or it doesn't change anything — probably the former. If feature selection would improve the results, which data should I use in it (e.g. idx = fscmrmr(X,y) — what should I pass in place of X and y)? I tried feature selection because we study that in school, but if there is a way to improve accuracy without selection I'm open to that too. I have also attached the data and code so you can get a better idea. Thank you.
Answers (1)
Image Analyst
on 30 Dec 2023
To get better classification accuracy you can either pick better features to measure, use more training samples, or pick a better classification algorithm. There are ways to determine which features are the most important ones, but it involves partial least squares regression and can get pretty tricky.
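On the fscmrmr question in the post: a minimal sketch of how it is typically called, assuming the variable names from the code above (X is the observations-by-features matrix, y the matching labels; fscmrmr needs the Statistics and Machine Learning Toolbox, R2019b or later). Note that with only two features (std, kurtosis) there is nothing meaningful to select — you would first need to extract a wider feature set (more statistics, more channels) for ranking to help:

FV_Train = [FV_VK; FV_AEK];         % 140 observations x p features -> this is X
TrainLabel2 = [VK_Label AEK_Label]; % 1x140 class labels            -> this is y
[idx, scores] = fscmrmr(FV_Train, TrainLabel2);
% idx ranks features from most to least relevant; keep e.g. the top 2:
FV_Selected = FV_Train(:, idx(1:2));

Selection should be run on training data only, then the same columns applied to FV_Test, so the test set does not leak into the ranking.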
7 Comments
Image Analyst
on 30 Dec 2023
How many features for each observation do you have? StDev, kurtosis, and what else?
Have you tried using the Classification Learner app on the App tab of the tool ribbon?
Muhammed
on 30 Dec 2023
Image Analyst
on 30 Dec 2023
What does each of the three dimensions represent?
Muhammed
on 30 Dec 2023
Image Analyst
on 30 Dec 2023
What do each of the columns represent?
Muhammed
on 31 Dec 2023
