fitrsvm fails if epsilon is generated using a for loop
I wanted to run a grid search to find suitable parameters for my SVM model, but I have discovered that fitrsvm gives inconsistent results if the value of the Epsilon parameter is generated using a for loop. For example, the RMSE for my model with epsilon = 0.8 is different if I use the loop:
for epsilon = 0.8:.1:1.2
than if I use the loop:
for epsilon = 0.1:.1:1.2
The RMSEs are 2.6868 and 2.7020, respectively.
I thought this might be a floating-point error, so I tried to ensure that the epsilon value passed to fitrsvm was exactly 0.8. I did this by creating the variable d_epsilon (see the commented-out line in the code below) and passing its value to fitrsvm (i.e. by changing the name-value pair to 'Epsilon', d_epsilon), but this did not work. By contrast, using c_epsilon, which is completely independent of the for loop, does work.
In my real project, I use nested loops to search for values of Epsilon, BoxConstraint, and KernelScale. The inconsistencies in my results are about 10%. (I am using a grid search because the parameters returned by OptimizeHyperparameters perform worse than some of the parameters cited in journal articles for my dataset, UCI's auto-mpg.)
clear all
%% read in auto-mpg.csv. This is a cleaned version of UCI dataset auto-mpg
data = readtable('auto-mpg.csv','ReadVariableNames',false);
VarNames = {'mpg' 'cylinders' 'displacement' 'horsepower' 'weight' 'acceleration' ...
    'modelYear' 'origin' 'carName'};
data.Properties.VariableNames = VarNames;
data = [data(:,2:9) data(:,1)];
data.carName = [];
%% carry out 10-fold cross-validation with different epsilon values
testResults_SVM = [];
testActual_SVM = [];
rng('default')
c = cvpartition(data.mpg,'KFold',10);
for epsilon = 0.1:0.1:1.2
    %c_epsilon = 0.80000;
    %d_epsilon = str2double(string(round(epsilon,2)))
    for fold = 1:10
        cv_trainingData = data(c.training(fold), :);
        cv_testData = data(c.test(fold), :);
        AutoSVM = fitrsvm(cv_trainingData,'mpg', ...
            'KernelFunction', 'gaussian', ...
            'PolynomialOrder', [], ...
            'KernelScale', 5.5, ...
            'BoxConstraint', 100, ...
            'Epsilon', epsilon, ...
            'Standardize', true);
        convergenceChk(fold) = AutoSVM.ConvergenceInfo.Converged;
        testResults_SVM = [testResults_SVM; predict(AutoSVM,cv_testData)];
        testActual_SVM = [testActual_SVM; cv_testData.mpg];
    end
    %% generate summary statistics and plots
    residual_SVM = testResults_SVM - testActual_SVM;
    AutoMSE_SVM = sum(residual_SVM.^2)/size(testResults_SVM,1);
    AutoRMSE_SVM = sqrt(AutoMSE_SVM);
    if round(epsilon,4) == 0.8
        AutoRMSE_SVM
    end
end
A copy of my dataset and code is attached, or can be accessed via: https://drive.google.com/open?id=1ph1KwdGgFbmNVSwI63LREcXEDN3hkP_Q
Does anyone know a workaround for this? I am using MATLAB R2017b.
Accepted Answer
Walter Roberson on 24 Mar 2018
Observe:
>> V1 = 0:.1:1
V1 =
Columns 1 through 5
0 0.1 0.2 0.3 0.4
Columns 6 through 10
0.5 0.6 0.7 0.8 0.9
Column 11
1
>> V2 = (0:10)/10
V2 =
Columns 1 through 5
0 0.1 0.2 0.3 0.4
Columns 6 through 10
0.5 0.6 0.7 0.8 0.9
Column 11
1
>> V1-V2
ans =
Columns 1 through 5
0 0 0 5.55111512312578e-17 0
Columns 6 through 10
0 0 0 0 0
Column 11
0
The colon operator works by starting at the lowest value and repeatedly adding the closest floating-point representation of the increment. If the increment is not exactly representable in floating point, or the range spans powers of 2, then you are going to have cumulative floating-point round-off problems.
Except... if you check carefully, the bit pattern does not exactly match this description. I am not sure how, precisely, colon is currently implemented.
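The drift from repeated addition can be reproduced directly. A minimal sketch (not the asker's code) showing that eight additions of 0.1 do not land exactly on 0.8:

```matlab
% Repeated addition of 0.1 accumulates round-off, because the
% double closest to 0.1 is slightly larger than 1/10.
s = 0;
for k = 1:8
    s = s + 0.1;
end
s == 0.8               % false: s is 0.7999999999999999...
abs(s - 0.8) < 1e-12   % true: a tolerance-based comparison succeeds
```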
The take-away lesson here is that numbers generated by the colon operator are subject to floating point round-off and should not be used for comparison by equality.
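One workaround, sketched below with illustrative variable names, is to loop over integers and derive epsilon by a single division per iteration, so there is no cumulative round-off, and to compare against target values with a tolerance rather than equality:

```matlab
% Drive the grid search with an integer counter; each epsilon is the
% result of one division, not of repeated addition.
for k = 1:12
    epsilon = k/10;    % 0.1, 0.2, ..., 1.2
    % ... fit the model with 'Epsilon', epsilon ...
    % Report results for a target value using a tolerance, never ==
    if abs(epsilon - 0.8) < 1e-9
        % display RMSE for this epsilon
    end
end
```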