In case anybody else is looking for a solution: I used the crossval function to wrap the training of the decision tree. This way, implementing other loss functions is straightforward.
function [trainedClassifier, qualityMeasures] = trainDTwCrossVal(data, predictorNames, MaxNumSplits)
% stratified k-fold cross-validation
numberOfFolds = 5;
cp = cvpartition(data.typeBehavior, 'KFold', numberOfFolds); % random partition for a stratified k-fold cross-validation
vals = crossval(@trainDT2, data, 'partition', cp); % one row of quality measures per fold
qualityMeasures = mean(vals, 1); % average over the folds (assumption: this line is omitted from the original post)
trainedClassifier = fitctree(data, 'typeBehavior', ... % final tree on all data (assumption: also omitted from the original post)
    'PredictorNames', predictorNames, 'MaxNumSplits', MaxNumSplits);
function testval = trainDT2(trainingData, testingData) % nested function: train one tree on trainingData, evaluate it on testingData
testval contains the quality measures of the prediction, derived from the confusion matrix C. The measures are computed inside the nested function, after the decision tree has been trained and applied to the test fold.
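The post does not show the lines inside trainDT2 that actually train and apply the tree; a minimal sketch of that missing step, assuming fitctree, predict, and confusionmat from the Statistics and Machine Learning Toolbox (as a nested function, trainDT2 can read predictorNames and MaxNumSplits from the enclosing workspace):

tree = fitctree(trainingData, 'typeBehavior', ... % fit one tree on the training folds
    'PredictorNames', predictorNames, 'MaxNumSplits', MaxNumSplits);
predictions = predict(tree, testingData); % classify the held-out fold
C = confusionmat(testingData.typeBehavior, predictions); % 2x2 confusion matrix; pass 'Order' if the positive class must come first
% Note: confusionmat puts true classes in rows, so strictly C = [TP FN; FP TN];
% the three measures below are symmetric in FP and FN, so the values come out the same.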
% C = [TP FP
%      FN TN]
TP = C(1,1); FP = C(1,2); FN = C(2,1); TN = C(2,2);
% Matthews correlation coefficient, worst value = -1, best value = 1
if (TP+FP)*(TP+FN)*(TN+FP)*(TN+FN) == 0
    MCC = 0; % set MCC to zero if the denominator is zero
else
    MCC = (TP*TN - FP*FN) / ...
        sqrt( (TP+FP)*(TP+FN)*(TN+FP)*(TN+FN) );
end
accuracy = (TP+TN)/(TP+TN+FP+FN); % accuracy, worst value = 0, best value = 1
F1score = 2*TP/(2*TP+FP+FN); % F1 score, worst value = 0, best value = 1
testval = [accuracy F1score MCC]; % one row of measures per fold
end % trainDT2
end % trainDTwCrossVal
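
A hypothetical call, for completeness (the table data, its two-class variable typeBehavior, and the predictor names are stand-ins for your own data):

[tree, qm] = trainDTwCrossVal(data, {'meanSpeed','maxAccel'}, 20); % hypothetical predictor names and split limit
fprintf('accuracy = %.3f, F1 = %.3f, MCC = %.3f\n', qm(1), qm(2), qm(3));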
