Dear all,
I want to use forward feature selection (sequentialfs) with an SVM, but the criterion values it reports are very small and very different from the loss I get when I train the SVM directly on the selected features. The selected features do work well for my model; I just do not understand the output.
    classifierfun = @(train_data,train_labels,test_data,test_labels) ...
        loss(fitcsvm(train_data,train_labels,'KernelFunction','gaussian', ...
        'Standardize',true),test_data,test_labels,'LossFun','classiferror');

    [fs,history] = sequentialfs(classifierfun,table2array(TableFeaturesNormalized),Y, ...
        'cv',c,'nfeatures',min(size(TableFeaturesNormalized,2),max_its_fs),'options',opts)
                        Step 1, added column 5, criterion value 0.00873988
                        Step 2, added column 9, criterion value 0.00812571
                        Step 3, added column 1, criterion value 0.00839142
                        Step 4, added column 2, criterion value 0.00785281
                        Step 5, added column 3, criterion value 0.00792138
                        Step 6, added column 4, criterion value 0.00827403
                        Step 7, added column 7, criterion value 0.00872569
                        Step 8, added column 6, criterion value 0.00859294
                        Step 9, added column 8, criterion value 0.00879047
If I replace it with

    classifierfun = @(train_data,train_labels,test_data,test_labels) ...
        sum(predict(fitcsvm(train_data,train_labels,'KernelFunction','gaussian', ...
        'Standardize',true),test_data) ~= test_labels);
the criterion makes sense (around 0.30), but the selected features are not as good as with the loss-based version. Any help?
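One thing worth checking, per the sequentialfs documentation: the criterion function is expected to return a summed loss measure (e.g. the number of misclassified test observations), because sequentialfs divides the criterion values summed over all folds by the total number of test observations. If the function instead returns a rate, as loss with 'classiferror' does, that rate gets divided by the fold size a second time, which would explain the tiny values. A minimal sketch of the arithmetic (the fold count and fold size below are made-up numbers, not taken from the question):

```matlab
% How sequentialfs forms its criterion:
%   criterion = sum(fun over all folds) / total number of test observations
nFolds   = 10;     % illustrative number of CV folds
foldSize = 35;     % illustrative test-set size per fold
errRate  = 0.30;   % illustrative per-fold misclassification rate

% If fun returns a COUNT (sum(predict(...) ~= test_labels)),
% the criterion is the overall error rate:
countCriterion = (nFolds * errRate * foldSize) / (nFolds * foldSize)   % 0.30

% If fun returns a RATE (loss with 'classiferror'), the rate is
% divided by the fold size again, giving a much smaller number:
rateCriterion  = (nFolds * errRate) / (nFolds * foldSize)              % 0.30/35 ~ 0.0086
```

Under these assumed fold sizes, 0.30/35 lands in the same ballpark as the ~0.0087 criterion values printed above, so the two versions may be consistent after all, just on different scales.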
Thanks