ClassificationECOC

Multiclass model for support vector machines (SVMs) and other classifiers

Description

ClassificationECOC is an error-correcting output codes (ECOC) classifier for multiclass learning, where the classifier consists of multiple binary learners such as support vector machines (SVMs). Trained ClassificationECOC classifiers store training data, parameter values, prior probabilities, and coding matrices. Use these classifiers to perform tasks such as predicting labels or posterior probabilities for new data (see predict).

Creation

Create a ClassificationECOC object by using fitcecoc.

If you specify linear or kernel binary learners without specifying cross-validation options, then fitcecoc returns a CompactClassificationECOC object instead.
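
For example, the following sketch (using the fisheriris sample data shipped with the toolbox) contrasts the two cases; the variable names are illustrative.

load fisheriris
MdlFull = fitcecoc(meas,species);   % Default SVM learners: full ClassificationECOC model
MdlLinear = fitcecoc(meas,species,'Learners',templateLinear());   % Linear learners: compact model
class(MdlFull)      % 'ClassificationECOC'
class(MdlLinear)    % 'CompactClassificationECOC'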

Properties


After you create a ClassificationECOC model object, you can use dot notation to access its properties. For an example, see Train Multiclass Model Using SVM Learners.

ECOC Properties

Trained binary learners, specified as a cell vector of model objects. The number of binary learners depends on the number of classes in Y and the coding design.

The software trains BinaryLearners{j} according to the binary problem specified by CodingMatrix(:,j). For example, for multiclass learning using SVM learners, each element of BinaryLearners is a CompactClassificationSVM classifier.

Data Types: cell
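
For example, one quick way to check the classifier type of every binary learner in a trained model (Mdl here is any trained ClassificationECOC model):

cellfun(@class,Mdl.BinaryLearners,'UniformOutput',false)   % e.g., 'CompactClassificationSVM' for SVM learners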

Binary learner loss function, specified as a character vector representing the loss function name.

This table identifies the default BinaryLoss value, which depends on the score ranges returned by the binary learners.

Assumption | Default Value
All binary learners are any of the following: classification decision trees, discriminant analysis models, k-nearest neighbor models, linear or kernel classification models of logistic regression learners, or naive Bayes models. | "quadratic"
All binary learners are SVMs or linear or kernel classification models of SVM learners. | "hinge"
All binary learners are ensembles trained by AdaBoostM1 or GentleBoost. | "exponential"
All binary learners are ensembles trained by LogitBoost. | "binodeviance"
You specify to predict class posterior probabilities by setting FitPosterior=true in fitcecoc. | "quadratic"
Binary learners are heterogeneous and use different loss functions. | "hamming"

To check the default value, use dot notation to display the BinaryLoss property of the trained model at the command line.

To potentially increase accuracy, specify a binary loss function other than the default during a prediction or loss computation by using the BinaryLoss name-value argument of predict or loss. For more information, see Binary Loss.

Data Types: char
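
For instance, a minimal sketch of overriding the default at prediction time (Mdl, X, and Y are a hypothetical trained model and data):

labels = predict(Mdl,X,'BinaryLoss','hamming');   % Override the default binary loss for prediction
L = loss(Mdl,X,Y,'BinaryLoss','quadratic');       % The loss function accepts the same argument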

Binary learner class labels, specified as a numeric matrix. BinaryY is a NumObservations-by-L matrix, where L is the number of binary learners (length(Mdl.BinaryLearners)).

Elements of BinaryY are –1, 0, or 1, and the value corresponds to a dichotomous class assignment. This table describes how learner j assigns observation k to a dichotomous class corresponding to the value of BinaryY(k,j).

Value | Dichotomous Class Assignment
–1 | Learner j assigns observation k to a negative class.
0 | Before training, learner j removes observation k from the data set.
1 | Learner j assigns observation k to a positive class.

Data Types: double
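
Because the dichotomous assignment of an observation is determined by its class, each row of BinaryY equals the CodingMatrix row of that observation's class. A sketch of verifying this for a hypothetical trained model Mdl:

% Map each training observation to its class index, then compare coding rows.
[~,classIdx] = ismember(Mdl.Y,Mdl.ClassNames);
isequal(Mdl.BinaryY,Mdl.CodingMatrix(classIdx,:))   % Expected to return logical 1 (true)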

This property is read-only.

Bin edges for numeric predictors, specified as a cell array of p numeric vectors, where p is the number of predictors. Each vector includes the bin edges for a numeric predictor. The element in the cell array for a categorical predictor is empty because the software does not bin categorical predictors.

The software bins numeric predictors only if you specify the 'NumBins' name-value argument as a positive integer scalar when training a model with tree learners. The BinEdges property is empty if the 'NumBins' value is empty (default).

You can reproduce the binned predictor data Xbinned by using the BinEdges property of the trained model mdl.

X = mdl.X; % Predictor data
Xbinned = zeros(size(X));
edges = mdl.BinEdges;
% Find indices of binned predictors.
idxNumeric = find(~cellfun(@isempty,edges));
if iscolumn(idxNumeric)
    idxNumeric = idxNumeric';
end
for j = idxNumeric 
    x = X(:,j);
    % Convert x to array if x is a table.
    if istable(x) 
        x = table2array(x);
    end
    % Group x into bins by using the discretize function.
    xbinned = discretize(x,[-inf; edges{j}; inf]); 
    Xbinned(:,j) = xbinned;
end
Xbinned contains the bin indices, ranging from 1 to the number of bins, for numeric predictors. Xbinned values are 0 for categorical predictors. If X contains NaNs, then the corresponding Xbinned values are NaNs.

Data Types: cell

Class assignment codes for the binary learners, specified as a numeric matrix. CodingMatrix is a K-by-L matrix, where K is the number of classes and L is the number of binary learners.

The elements of CodingMatrix are –1, 0, and 1, and the values correspond to dichotomous class assignments. This table describes how learner j assigns observations in class i to a dichotomous class corresponding to the value of CodingMatrix(i,j).

Value | Dichotomous Class Assignment
–1 | Learner j assigns observations in class i to a negative class.
0 | Before training, learner j removes observations in class i from the data set.
1 | Learner j assigns observations in class i to a positive class.

Data Types: double | single | int8 | int16 | int32 | int64

Coding design name, specified as a character vector. For more details, see Coding Design.

Data Types: char
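
For example, a sketch of selecting a coding design at training time and reading it back from the property (the one-versus-all choice is arbitrary):

load fisheriris
Mdl = fitcecoc(meas,species,'Coding','onevsall');
Mdl.CodingName   % 'onevsall'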

Binary learner weights, specified as a numeric row vector. The length of LearnerWeights is equal to the number of binary learners (length(Mdl.BinaryLearners)).

LearnerWeights(j) is the sum of the observation weights that binary learner j uses to train its classifier.

The software uses LearnerWeights to fit posterior probabilities by minimizing the Kullback-Leibler divergence. The software ignores LearnerWeights when it uses the quadratic programming method of estimating posterior probabilities.

Data Types: double | single
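
For context, a small sketch of training with posterior fitting enabled and then requesting class posterior probabilities at prediction time (the data are illustrative):

load fisheriris
Mdl = fitcecoc(meas,species,'FitPosterior',true);
[labels,~,~,posterior] = predict(Mdl,meas);   % Fourth output holds class posterior probabilities
posterior(1:3,:)                              % Posteriors for the first three observations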

Other Classification Properties

Categorical predictor indices, specified as a vector of positive integers. CategoricalPredictors contains index values indicating that the corresponding predictors are categorical. The index values are between 1 and p, where p is the number of predictors used to train the model. If none of the predictors are categorical, then this property is empty ([]).

Data Types: single | double

This property is read-only.

Unique class labels used in training, specified as a categorical or character array, logical or numeric vector, or cell array of character vectors. ClassNames has the same data type as the class labels Y. (The software treats string arrays as cell arrays of character vectors.) ClassNames also determines the class order.

Data Types: categorical | char | logical | single | double | cell

This property is read-only.

Misclassification costs, specified as a square numeric matrix. Cost has K rows and columns, where K is the number of classes.

Cost(i,j) is the cost of classifying a point into class j if its true class is i. The order of the rows and columns of Cost corresponds to the order of the classes in ClassNames.

Data Types: double
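
For example, a sketch of supplying a custom misclassification cost matrix when training (the penalty values are arbitrary):

load fisheriris
% Cost(i,j) is the cost of classifying into class j when the true class is i.
% Here, misclassifying 'virginica' (row 3) as 'versicolor' (column 2) is penalized most.
C = [0 1 1; 1 0 1; 1 10 0];
Mdl = fitcecoc(meas,species,'ClassNames',{'setosa','versicolor','virginica'},'Cost',C);
Mdl.Cost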

Expanded predictor names, specified as a cell array of character vectors.

If the model uses encoding for categorical variables, then ExpandedPredictorNames includes the names that describe the expanded variables. Otherwise, ExpandedPredictorNames is the same as PredictorNames.

Data Types: cell

Parameter values, such as the name-value pair argument values, used to train the ECOC classifier, specified as an object. ModelParameters does not contain estimated parameters.

Access properties of ModelParameters using dot notation. For example, list the templates containing parameters of the binary learners by using Mdl.ModelParameters.BinaryLearner.

Number of observations in the training data, specified as a positive numeric scalar.

Data Types: double

This property is read-only.

Predictor names in order of their appearance in the predictor data X, specified as a cell array of character vectors. The length of PredictorNames is equal to the number of columns in X.

Data Types: cell

This property is read-only.

Prior class probabilities, specified as a numeric vector. Prior has as many elements as the number of classes in ClassNames, and the order of the elements corresponds to the order of the classes in ClassNames.

fitcecoc incorporates misclassification costs differently among different types of binary learners.

Data Types: double
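
For example, a sketch of specifying prior class probabilities at training time (the values are arbitrary and follow the order in ClassNames):

load fisheriris
Mdl = fitcecoc(meas,species,'ClassNames',{'setosa','versicolor','virginica'}, ...
    'Prior',[0.5 0.25 0.25]);
Mdl.Prior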

Response variable name, specified as a character vector.

Data Types: char

Rows of the original training data used in fitting the ClassificationECOC model, specified as a logical vector. This property is empty if all rows are used.

Data Types: logical

This property is read-only.

Score transformation function to apply to the predicted scores, specified as 'none'. An ECOC model does not support score transformation.

Observation weights used to train the ECOC classifier, specified as a numeric vector. W has NumObservations elements.

The software normalizes the weights used for training so that sum(W,'omitnan') is 1.

Data Types: single | double

Unstandardized predictor data used to train the ECOC classifier, specified as a numeric matrix or table.

Each row of X corresponds to one observation, and each column corresponds to one variable.

Data Types: single | double | table

Observed class labels used to train the ECOC classifier, specified as a categorical or character array, logical or numeric vector, or cell array of character vectors. Y has NumObservations elements and has the same data type as the input argument Y of fitcecoc. (The software treats string arrays as cell arrays of character vectors.)

Each row of Y represents the observed classification of the corresponding row of X.

Data Types: categorical | char | logical | single | double | cell

Hyperparameter Optimization Properties

This property is read-only.

Cross-validation optimization of hyperparameters, specified as a BayesianOptimization object or a table of hyperparameters and associated values. This property is nonempty if the 'OptimizeHyperparameters' name-value pair argument is nonempty when you create the model. The value of HyperparameterOptimizationResults depends on the setting of the Optimizer field in the HyperparameterOptimizationOptions structure when you create the model.

Value of Optimizer Option | Value of HyperparameterOptimizationResults
"bayesopt" (default) | Object of class BayesianOptimization
"gridsearch" or "randomsearch" | Table of the hyperparameters used, observed objective function values (cross-validation loss), and rank of observations from lowest (best) to highest (worst)

Object Functions

compact | Reduce size of multiclass error-correcting output codes (ECOC) model
compareHoldout | Compare accuracies of two classification models using new data
crossval | Cross-validate machine learning model
discardSupportVectors | Discard support vectors of linear SVM binary learners in ECOC model
edge | Classification edge for multiclass error-correcting output codes (ECOC) model
gather | Gather properties of Statistics and Machine Learning Toolbox object from GPU
incrementalLearner | Convert multiclass error-correcting output codes (ECOC) model to incremental learner
lime | Local interpretable model-agnostic explanations (LIME)
loss | Classification loss for multiclass error-correcting output codes (ECOC) model
margin | Classification margins for multiclass error-correcting output codes (ECOC) model
partialDependence | Compute partial dependence
plotPartialDependence | Create partial dependence plot (PDP) and individual conditional expectation (ICE) plots
predict | Classify observations using multiclass error-correcting output codes (ECOC) model
resubEdge | Resubstitution classification edge for multiclass error-correcting output codes (ECOC) model
resubLoss | Resubstitution classification loss for multiclass error-correcting output codes (ECOC) model
resubMargin | Resubstitution classification margins for multiclass error-correcting output codes (ECOC) model
resubPredict | Classify observations in multiclass error-correcting output codes (ECOC) model
shapley | Shapley values
testckfold | Compare accuracies of two classification models by repeated cross-validation

Examples


Train a multiclass error-correcting output codes (ECOC) model using support vector machine (SVM) binary learners.

Load Fisher's iris data set. Specify the predictor data X and the response data Y.

load fisheriris
X = meas;
Y = species;

Train a multiclass ECOC model using the default options.

Mdl = fitcecoc(X,Y)
Mdl = 
  ClassificationECOC
             ResponseName: 'Y'
    CategoricalPredictors: []
               ClassNames: {'setosa'  'versicolor'  'virginica'}
           ScoreTransform: 'none'
           BinaryLearners: {3x1 cell}
               CodingName: 'onevsone'


Mdl is a ClassificationECOC model. By default, fitcecoc uses SVM binary learners and a one-versus-one coding design. You can access Mdl properties using dot notation.

Display the class names and the coding design matrix.

Mdl.ClassNames
ans = 3x1 cell
    {'setosa'    }
    {'versicolor'}
    {'virginica' }

CodingMat = Mdl.CodingMatrix
CodingMat = 3×3

     1     1     0
    -1     0     1
     0    -1    -1

A one-versus-one coding design for three classes yields three binary learners. The columns of CodingMat correspond to the learners, and the rows correspond to the classes. The class order is the same as the order in Mdl.ClassNames. For example, CodingMat(:,1) is [1; –1; 0] and indicates that the software trains the first SVM binary learner using all observations classified as 'setosa' and 'versicolor'. Because 'setosa' corresponds to 1, it is the positive class; 'versicolor' corresponds to –1, so it is the negative class.

You can access each binary learner using cell indexing and dot notation.

Mdl.BinaryLearners{1}   % The first binary learner
ans = 
  CompactClassificationSVM
             ResponseName: 'Y'
    CategoricalPredictors: []
               ClassNames: [-1 1]
           ScoreTransform: 'none'
                     Beta: [4x1 double]
                     Bias: 1.4505
         KernelParameters: [1x1 struct]


Compute the resubstitution classification error.

error = resubLoss(Mdl)
error = 
0.0067

The classification error on the training data is small, but the classifier might be an overfitted model. You can cross-validate the classifier using crossval and compute the cross-validation classification error instead.

Train an ECOC classifier using SVM binary learners. Then, access properties of the binary learners, such as estimated parameters, by using dot notation.

Load Fisher's iris data set. Specify the petal dimensions as the predictors and the species names as the response.

load fisheriris
X = meas(:,3:4);
Y = species;

Train an ECOC classifier using SVM binary learners and the default coding design (one-versus-one). Standardize the predictors and save the support vectors.

t = templateSVM('Standardize',true,'SaveSupportVectors',true);
predictorNames = {'petalLength','petalWidth'};
responseName = 'irisSpecies';
classNames = {'setosa','versicolor','virginica'}; % Specify class order
Mdl = fitcecoc(X,Y,'Learners',t,'ResponseName',responseName,...
    'PredictorNames',predictorNames,'ClassNames',classNames)
Mdl = 
  ClassificationECOC
           PredictorNames: {'petalLength'  'petalWidth'}
             ResponseName: 'irisSpecies'
    CategoricalPredictors: []
               ClassNames: {'setosa'  'versicolor'  'virginica'}
           ScoreTransform: 'none'
           BinaryLearners: {3x1 cell}
               CodingName: 'onevsone'


t is a template object that contains options for SVM classification. The function fitcecoc uses default values for the empty ([]) properties. Mdl is a ClassificationECOC classifier. You can access properties of Mdl using dot notation.

Display the class names and the coding design matrix.

Mdl.ClassNames
ans = 3x1 cell
    {'setosa'    }
    {'versicolor'}
    {'virginica' }

Mdl.CodingMatrix
ans = 3×3

     1     1     0
    -1     0     1
     0    -1    -1

The columns correspond to SVM binary learners, and the rows correspond to the distinct classes. The row order is the same as the order in the ClassNames property of Mdl. For each column:

  • 1 indicates that fitcecoc trains the SVM using observations in the corresponding class as members of the positive group.

  • –1 indicates that fitcecoc trains the SVM using observations in the corresponding class as members of the negative group.

  • 0 indicates that the SVM does not use observations in the corresponding class.

In the first SVM, for example, fitcecoc assigns all observations to 'setosa' or 'versicolor', but not 'virginica'.

Access properties of the SVMs using cell subscripting and dot notation. Store the standardized support vectors of each SVM. Unstandardize the support vectors.

L = size(Mdl.CodingMatrix,2); % Number of SVMs
sv = cell(L,1); % Preallocate for support vector indices
for j = 1:L
    SVM = Mdl.BinaryLearners{j};
    sv{j} = SVM.SupportVectors;
    sv{j} = sv{j}.*SVM.Sigma + SVM.Mu;
end

sv is a cell array of matrices containing the unstandardized support vectors for the SVMs.

Plot the data, and identify the support vectors.

figure
gscatter(X(:,1),X(:,2),Y);
hold on
markers = {'ko','ro','bo'}; % Should be of length L
for j = 1:L
    svs = sv{j};
    plot(svs(:,1),svs(:,2),markers{j},...
        'MarkerSize',10 + (j - 1)*3);
end
title('Fisher''s Iris -- ECOC Support Vectors')
xlabel(predictorNames{1})
ylabel(predictorNames{2})
legend([classNames,{'Support vectors - SVM 1',...
    'Support vectors - SVM 2','Support vectors - SVM 3'}],...
    'Location','Best')
hold off

(Figure: gscatter plot titled "Fisher's Iris -- ECOC Support Vectors", with petalLength on the x-axis and petalWidth on the y-axis, showing the setosa, versicolor, and virginica observations and the support vectors of each of the three SVMs.)

You can pass Mdl to these functions (a brief predict sketch follows this list):

  • predict, to classify new observations

  • resubLoss, to estimate the classification error on the training data

  • crossval, to perform 10-fold cross-validation
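
For instance, a quick sketch of the first item: classify two hypothetical observations (petal length and petal width, matching the predictors used to train Mdl).

newX = [1.4 0.2; 5.0 1.7];   % Two made-up petal measurements
labels = predict(Mdl,newX)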

Cross-validate an ECOC classifier with SVM binary learners, and estimate the generalized classification error.

Load Fisher's iris data set. Specify the predictor data X and the response data Y.

load fisheriris
X = meas;
Y = species;
rng(1); % For reproducibility

Create an SVM template, and standardize the predictors.

t = templateSVM('Standardize',true)
t = 
Fit template for SVM.
    Standardize: 1

t is an SVM template. Most of the template object properties are empty. When training the ECOC classifier, the software sets the applicable properties to their default values.

Train the ECOC classifier, and specify the class order.

Mdl = fitcecoc(X,Y,'Learners',t,...
    'ClassNames',{'setosa','versicolor','virginica'});

Mdl is a ClassificationECOC classifier. You can access its properties using dot notation.

Cross-validate Mdl using 10-fold cross-validation.

CVMdl = crossval(Mdl);

CVMdl is a ClassificationPartitionedECOC cross-validated ECOC classifier.

Estimate the generalized classification error.

genError = kfoldLoss(CVMdl)
genError = 
0.0400

The generalized classification error is 4%, which indicates that the ECOC classifier generalizes fairly well.

More About


Algorithms


Alternative Functionality

You can use these alternative algorithms to train a multiclass model:


Extended Capabilities

Version History

Introduced in R2014b
