Grid search in SVM
Hi,
I have training data (train.mat) and testing data (test.mat), and I need to perform a grid search over SVM hyperparameters. How can I do the SVM training with this? Kindly help me with this. Thanks in advance.
Accepted Answer
Walter Roberson on 20 Jul 2022
Nbox = 15;    %number of BoxConstraint grid points - change as desired
Nkern = 10;   %number of KernelScale grid points - change as desired
bcvec = logspace(-3, 3, Nbox);    %BoxConstraint candidates, 1e-3 to 1e3
ksvec = logspace(-3, 3, Nkern);   %KernelScale candidates, 1e-3 to 1e3
goodness = zeros(Nbox, Nkern);    %score for each (BoxConstraint, KernelScale) pair
for ksidx = 1 : Nkern
    ks = ksvec(ksidx);
    for bcidx = 1 : Nbox
        bc = bcvec(bcidx);
        mdl = fitcsvm(Training_Data, Training_Class, ...
            'Standardize', true, ...
            'KernelFunction','rbf', ...
            'BoxConstraint', bc, ...
            'KernelScale', ks);
        predictions = predict(mdl, Test_Data);
        correct_matches = nnz(predictions == Test_Class);
        goodness(bcidx, ksidx) = correct_matches;
    end
end
best_score = max(goodness(:));
[bcidx, ksidx] = find(goodness == best_score);
best_bc = bcvec(bcidx).';    %transpose to columns so ties list one per table row
best_ks = ksvec(ksidx).';
best_fits = table(best_bc, best_ks, 'VariableNames', {'BoxConstraint', 'KernelScale'});
The output will be a table, best_fits, listing all of the combinations of BoxConstraint and KernelScale that together lead to the best score discovered, where "score" here is determined [in this code] solely by the number of correct matches.
You would have to change how the goodness is calculated if you wanted to be concerned about aspects such as balancing the accuracies of the various classes. With the current code, if the model were (for example) 100% successful in matching the largest class and 0% successful on all of the other classes, then the goodness calculated by this code could still be the highest found, even though the model is terrible for the other classes. You would probably be better off calculating the goodness some other way; one possibility is sketched below.
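For instance, a minimal sketch of a balanced-accuracy goodness (the mean of the per-class recalls), intended to replace the correct_matches lines inside the inner loop above; the variable names are the ones already used there:
classes = unique(Test_Class);
recall = zeros(numel(classes), 1);
for cidx = 1 : numel(classes)
    mask = (Test_Class == classes(cidx));    %test rows belonging to this class
    recall(cidx) = nnz(predictions(mask) == classes(cidx)) / nnz(mask);
end
goodness(bcidx, ksidx) = mean(recall);   %a large class can no longer dominate the score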
6 Comments
Walter Roberson on 22 Jul 2022
temp = load('train.mat');
%the .mat files apparently store the two classes side by side:
%the left half of the columns holds class 1, the right half class 2
Training_Data1 = temp.finaltrain(:,1:end/2);
Training_Data2 = temp.finaltrain(:,end/2+1:end);
Training_Data = [Training_Data1; Training_Data2];
Training_Class = [ones(size(Training_Data1, 1), 1); 2*ones(size(Training_Data2,1),1)];
temp = load('test.mat');
Test_Data1 = temp.finaltest(:,1:end/2);
Test_Data2 = temp.finaltest(:,end/2+1:end);
Test_Data = [Test_Data1; Test_Data2];
Test_Class = [ones(size(Test_Data1, 1), 1); 2*ones(size(Test_Data2,1),1)];
Nbox = 25;    %change these as desired
Nkern = 20;   %change these as desired
bcvec = logspace(-3, 3, Nbox);
ksvec = logspace(-3, 3, Nkern);
goodness = zeros(Nbox, Nkern);
wb = waitbar(0, 'please wait, processing kernel scales');
cleanMe = onCleanup(@() delete(wb));
for ksidx = 1 : Nkern
    waitbar(ksidx/Nkern, wb);
    ks = ksvec(ksidx);
    for bcidx = 1 : Nbox
        bc = bcvec(bcidx);
        mdl = fitcsvm(Training_Data, Training_Class, ...
            'Standardize', true, ...
            'KernelFunction','rbf', ...
            'BoxConstraint', bc, ...
            'KernelScale', ks);
        predictions = predict(mdl, Test_Data);
        correct_matches = nnz(predictions == Test_Class);
        goodness(bcidx, ksidx) = correct_matches;
    end
end
clear cleanMe  %get rid of waitbar
best_score = max(goodness(:));
[bcidx, ksidx] = find(goodness == best_score);
best_bc = bcvec(bcidx).';    %transpose to columns so ties list one per table row
best_ks = ksvec(ksidx).';
best_fits = table(best_bc, best_ks, 'VariableNames', {'BoxConstraint', 'KernelScale'});
best_fits
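As an aside: if your release is new enough (R2016b or later, if I remember correctly), fitcsvm can run the grid search itself, using cross-validation on the training data instead of scoring against the test set. A sketch, assuming the Statistics and Machine Learning Toolbox hyperparameter-optimization support is available:
mdl = fitcsvm(Training_Data, Training_Class, ...
    'Standardize', true, ...
    'KernelFunction', 'rbf', ...
    'OptimizeHyperparameters', {'BoxConstraint', 'KernelScale'}, ...
    'HyperparameterOptimizationOptions', ...
        struct('Optimizer', 'gridsearch', 'NumGridDivisions', 20));
The cross-validated approach avoids tuning the hyperparameters to the test set, which matters if the test set is meant to estimate generalization.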