kfoldEdge

Classification edge for cross-validated classification model

    Description

    E = kfoldEdge(CVMdl) returns the classification edge obtained by the cross-validated classification model CVMdl. For every fold, kfoldEdge computes the classification edge for validation-fold observations using a classifier trained on training-fold observations. CVMdl.X and CVMdl.Y contain both sets of observations.

    E = kfoldEdge(CVMdl,Name,Value) returns the classification edge with additional options specified by one or more name-value arguments. For example, specify the folds to use or specify to compute the classification edge for each individual fold.

    Examples

    Compute the k-fold edge for a model trained on Fisher's iris data.

    Load Fisher's iris data set.

    load fisheriris

    Train a classification tree classifier.

    tree = fitctree(meas,species);

    Cross-validate the classifier using 10-fold cross-validation.

    cvtree = crossval(tree);

    Compute the k-fold edge.

    edge = kfoldEdge(cvtree)
    edge = 0.8578
    

    Compute the k-fold edge for an ensemble trained on the Fisher iris data.

    Load the sample data set.

    load fisheriris

    Train an ensemble of 100 boosted classification trees.

    t = templateTree('MaxNumSplits',1); % Weak learner template tree object
    ens = fitcensemble(meas,species,'Learners',t);

    Create a cross-validated ensemble from ens and find the classification edge.

    rng(10,'twister') % For reproducibility
    cvens = crossval(ens);
    E = kfoldEdge(cvens)
    E = 3.2033
    

    Input Arguments

    CVMdl - Cross-validated partitioned classifier

    Cross-validated partitioned classifier, specified as a ClassificationPartitionedModel, ClassificationPartitionedEnsemble, or ClassificationPartitionedGAM object. You can create the object in two ways:

    • Pass a trained classification model listed in the following table to its crossval object function.

    • Train a classification model using a function listed in the following table and specify one of the cross-validation name-value arguments for the function.

    Classification Model                                   Function
    Discriminant analysis classifier                       fitcdiscr
    Ensemble classifier                                    fitcensemble
    Generalized additive model classifier                  fitcgam
    k-nearest neighbor classifier                          fitcknn
    Naive Bayes classifier                                 fitcnb
    Neural network classifier                              fitcnet
    Support vector machine classifier                      fitcsvm
    Binary decision tree for multiclass classification    fitctree
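
    For instance, the following minimal sketch shows both approaches with a classification tree on the Fisher iris data; the learner and the five-fold choice are illustrative, not requirements.

    load fisheriris
    Mdl = fitctree(meas,species);                       % train a classifier ...
    CVMdl1 = crossval(Mdl,'KFold',5);                   % ... then cross-validate it
    CVMdl2 = fitctree(meas,species,'CrossVal','on');    % or cross-validate while training (10 folds by default)
    E = kfoldEdge(CVMdl1)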

    Name-Value Arguments

    Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

    Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

    Example: kfoldEdge(CVMdl,'Folds',[1 2 3 5]) specifies to use the first, second, third, and fifth folds to compute the classification edge, but to exclude the fourth fold.

    Folds - Fold indices to use

    Fold indices to use, specified as a positive integer vector. The elements of Folds must be within the range from 1 to CVMdl.KFold.

    The software uses only the folds specified in Folds.

    Example: 'Folds',[1 4 10]

    Data Types: single | double
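
    For example, this sketch (reusing cvtree from the first example, which has 10 folds by default) computes the edge using only three of the folds:

    edgeSubset = kfoldEdge(cvtree,'Folds',[1 4 10])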

    IncludeInteractions - Flag to include interaction terms

    Flag to include interaction terms of the model, specified as true or false. This argument is valid only for a generalized additive model (GAM). That is, you can specify this argument only when CVMdl is a ClassificationPartitionedGAM object.

    The default value is true if the models in CVMdl (CVMdl.Trained) contain interaction terms. The value must be false if the models do not contain interaction terms.

    Example: 'IncludeInteractions',false

    Data Types: logical
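
    A minimal sketch of the GAM case, using the ionosphere sample data set and an illustrative number of interaction terms (neither appears in the examples above):

    load ionosphere                                            % binary classification data: X, Y
    Mdl = fitcgam(X,Y,'Interactions',5);                       % GAM with five interaction terms
    CVMdl = crossval(Mdl);
    eLinear = kfoldEdge(CVMdl,'IncludeInteractions',false)     % intercept and linear terms only
    eFull = kfoldEdge(CVMdl,'IncludeInteractions',true)        % also include the interaction terms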

    Mode - Aggregation level for the output

    Aggregation level for the output, specified as 'average', 'individual', or 'cumulative'.

    Value           Description
    'average'       The output is a scalar average over all folds.
    'individual'    The output is a vector of length k containing one value per fold, where k is the number of folds.
    'cumulative'    The output depends on the type of CVMdl; see the note and list that follow.

    Note

    If you want to specify this value, CVMdl must be a ClassificationPartitionedEnsemble object or ClassificationPartitionedGAM object.

    • If CVMdl is ClassificationPartitionedEnsemble, then the output is a vector of length min(CVMdl.NumTrainedPerFold). Each element j is an average over all folds that the function obtains by using ensembles trained with weak learners 1:j.

    • If CVMdl is ClassificationPartitionedGAM, then the output value depends on the IncludeInteractions value.

      • If IncludeInteractions is false, then the output is a (1 + min(CVMdl.NumTrainedPerFold.PredictorTrees))-by-1 numeric column vector. The first element is an average over all folds obtained using only the intercept (constant) term. The (j + 1)th element is an average obtained using the intercept term and the first j predictor trees per linear term.

      • If IncludeInteractions is true, then the output is a (1 + min(CVMdl.NumTrainedPerFold.InteractionTrees))-by-1 numeric column vector. The first element is an average over all folds obtained using the intercept (constant) term and all predictor trees per linear term. The (j + 1)th element is an average obtained using the intercept term, all predictor trees per linear term, and the first j interaction trees per interaction term.

    Example: 'Mode','individual'
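
    For example, reusing the cross-validated ensemble cvens from the second example (10 folds by default), the two aggregation levels below differ only in the shape of the output:

    eAvg = kfoldEdge(cvens)                          % scalar: average edge over all folds
    eFold = kfoldEdge(cvens,'Mode','individual')     % 10-by-1 vector: one edge per fold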

    Output Arguments

    E - Classification edge

    Classification edge, returned as a numeric scalar or numeric column vector.

    • If Mode is 'average', then E is the average classification edge over all folds.

    • If Mode is 'individual', then E is a k-by-1 numeric column vector containing the classification edge for each fold, where k is the number of folds.

    • If Mode is 'cumulative' and CVMdl is ClassificationPartitionedEnsemble, then E is a min(CVMdl.NumTrainedPerFold)-by-1 numeric column vector. Each element j is the average classification edge over all folds that the function obtains by using ensembles trained with weak learners 1:j.

    • If Mode is 'cumulative' and CVMdl is ClassificationPartitionedGAM, then the output value depends on the IncludeInteractions value.

      • If IncludeInteractions is false, then E is a (1 + min(CVMdl.NumTrainedPerFold.PredictorTrees))-by-1 numeric column vector. The first element of E is the average classification edge over all folds obtained using only the intercept (constant) term. The (j + 1)th element of E is the average edge obtained using the intercept term and the first j predictor trees per linear term.

      • If IncludeInteractions is true, then E is a (1 + min(CVMdl.NumTrainedPerFold.InteractionTrees))-by-1 numeric column vector. The first element of E is the average classification edge over all folds obtained using the intercept (constant) term and all predictor trees per linear term. The (j + 1)th element of E is the average edge obtained using the intercept term, all predictor trees per linear term, and the first j interaction trees per interaction term.
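
    As an illustration of the 'cumulative' case for an ensemble (again reusing cvens from the example above), you can inspect how the edge grows as each fold's ensemble uses more weak learners; the plot labels are illustrative.

    E = kfoldEdge(cvens,'Mode','cumulative');    % min(cvens.NumTrainedPerFold)-by-1 vector
    plot(E)
    xlabel('Number of weak learners')
    ylabel('k-fold classification edge')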

    More About

    Classification Edge

    The classification edge is the weighted mean of the classification margins.

    One way to choose among multiple classifiers, for example to perform feature selection, is to choose the classifier that yields the greatest edge.
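
    As a sketch of this relationship, reusing cvtree from the first example: with the default uniform observation weights, the k-fold edge reduces to the plain mean of the k-fold margins.

    m = kfoldMargin(cvtree);    % per-observation classification margins
    e = mean(m)                 % matches kfoldEdge(cvtree) when the weights are uniform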

    Classification Margin

    The classification margin for binary classification is, for each observation, the difference between the classification score for the true class and the classification score for the false class. The classification margin for multiclass classification is the difference between the classification score for the true class and the maximal score for the false classes.

    If the margins are on the same scale (that is, the score values are based on the same score transformation), then they serve as a classification confidence measure. Among multiple classifiers, those that yield greater margins are better.
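
    The following sketch reproduces this definition by hand from the cross-validated scores of cvtree (first example); the intermediate variable names are illustrative.

    [~,score] = kfoldPredict(cvtree);                    % one score column per class in cvtree.ClassNames
    [~,trueCol] = ismember(species,cvtree.ClassNames);   % column index of each observation's true class
    idxTrue = sub2ind(size(score),(1:numel(trueCol))',trueCol);
    sTrue = score(idxTrue);                              % score for the true class
    sFalse = score;
    sFalse(idxTrue) = -Inf;                              % exclude the true class ...
    m = sTrue - max(sFalse,[],2);                        % ... margin = true score minus best false-class score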

    Algorithms

    kfoldEdge computes the classification edge as described in the corresponding edge object function. For a model-specific description, see the appropriate edge function reference page in the following table.

    Model Type                                             edge Function
    Discriminant analysis classifier                       edge
    Ensemble classifier                                    edge
    Generalized additive model classifier                  edge
    k-nearest neighbor classifier                          edge
    Naive Bayes classifier                                 edge
    Neural network classifier                              edge
    Support vector machine classifier                      edge
    Binary decision tree for multiclass classification    edge

    Extended Capabilities

    Version History

    Introduced in R2011a
