
ClassificationGAM

Generalized additive model (GAM) for binary classification

Since R2021a

    Description

    A ClassificationGAM object is a generalized additive model (GAM) object for binary classification. It is an interpretable model that explains class scores (the logit of class probabilities) using a sum of univariate and bivariate shape functions.

    You can classify new observations by using the predict function, and plot the effect of each shape function on the prediction (class score) for an observation by using the plotLocalEffects function. For the full list of object functions for ClassificationGAM, see Object Functions.

    Creation

    Create a ClassificationGAM object by using fitcgam. You can specify both linear terms and interaction terms for predictors to include univariate shape functions (predictor trees) and bivariate shape functions (interaction trees) in a trained model, respectively.

    You can update a trained model by using resume or addInteractions.

    • The resume function resumes training for the existing terms in a model.

    • The addInteractions function adds interaction terms to a model that contains only linear terms.
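
    For instance, a minimal sketch of this workflow, assuming that a predictor matrix X and a class label vector Y are already in the workspace:

    Mdl = fitcgam(X,Y);                  % univariate GAM with one set of trees per linear term
    UpdatedMdl = addInteractions(Mdl,3); % add the three most important interaction terms
    UpdatedMdl = resume(UpdatedMdl,50);  % train up to 50 more trees per existing term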

    Properties

    GAM Properties

    BinEdges

    This property is read-only.

    Bin edges for numeric predictors, specified as a cell array of p numeric vectors, where p is the number of predictors. Each vector includes the bin edges for a numeric predictor. The element in the cell array for a categorical predictor is empty because the software does not bin categorical predictors.

    The software bins numeric predictors only if you specify the 'NumBins' name-value argument as a positive integer scalar when training a model with tree learners. The BinEdges property is empty if the 'NumBins' value is empty (default).

    You can reproduce the binned predictor data Xbinned by using the BinEdges property of the trained model mdl.

    X = mdl.X; % Predictor data
    Xbinned = zeros(size(X));
    edges = mdl.BinEdges;
    % Find indices of binned predictors.
    idxNumeric = find(~cellfun(@isempty,edges));
    if iscolumn(idxNumeric)
        idxNumeric = idxNumeric';
    end
    for j = idxNumeric 
        x = X(:,j);
        % Convert x to array if x is a table.
        if istable(x) 
            x = table2array(x);
        end
        % Group x into bins by using the discretize function.
        xbinned = discretize(x,[-inf; edges{j}; inf]); 
        Xbinned(:,j) = xbinned;
    end

    Xbinned contains the bin indices, ranging from 1 to the number of bins, for numeric predictors. Xbinned values are 0 for categorical predictors. If X contains NaNs, then the corresponding Xbinned values are NaNs.

    Data Types: cell

    Interactions

    This property is read-only.

    Interaction term indices, specified as a t-by-2 matrix of positive integers, where t is the number of interaction terms in the model. Each row of the matrix represents one interaction term and contains the column indexes of the predictor data X for the interaction term. If the model does not include an interaction term, then this property is empty ([]).

    The software adds interaction terms to the model in the order of importance based on the p-values. Use this property to check the order of the interaction terms added to the model.

    Data Types: double

    Intercept

    This property is read-only.

    Intercept (constant) term of the model, which is the sum of the intercept terms in the predictor trees and interaction trees, specified as a numeric scalar.

    Data Types: single | double

    ModelParameters

    This property is read-only.

    Parameters used to train the model, specified as a model parameter object. ModelParameters contains parameter values such as those for the name-value arguments used to train the model. ModelParameters does not contain estimated parameters.

    Access the fields of ModelParameters by using dot notation. For example, access the maximum number of decision splits per interaction tree by using Mdl.ModelParameters.MaxNumSplitsPerInteraction.
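
    For example, a short sketch (assuming a trained model Mdl for which interaction terms were specified):

    params = Mdl.ModelParameters;                  % model parameter object
    maxSplits = params.MaxNumSplitsPerInteraction; % maximum number of decision splits per interaction tree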

    PairDetectionBinEdges

    This property is read-only.

    Bin edges for interaction term detection for numeric predictors, specified as a cell array of p numeric vectors, where p is the number of predictors. Each vector includes the bin edges for a numeric predictor. The element in the cell array for a categorical predictor is empty because the software does not bin categorical predictors.

    To speed up the interaction term detection process, the software bins numeric predictors into at most 8 equiprobable bins. The number of bins can be less than 8 if a predictor has fewer than 8 unique values.
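
    Conceptually, equiprobable bin edges correspond to quantiles of a predictor. A rough sketch for a single numeric predictor x (an illustration only, not the software's internal implementation):

    nbins = min(8,numel(unique(x)));       % at most 8 bins
    edges = quantile(x,(1:nbins-1)/nbins); % interior edges at equiprobable quantiles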

    Data Types: cell

    ReasonForTermination

    This property is read-only.

    Reason training the model stops, specified as a structure with two fields, PredictorTrees and InteractionTrees.

    Use this property to check if the model contains the specified number of trees for each linear term ('NumTreesPerPredictor') and for each interaction term ('NumTreesPerInteraction'). If the fitcgam function terminates training before adding the specified number of trees, this property contains the reason for the termination.

    Data Types: struct

    Other Classification Properties

    CategoricalPredictors

    This property is read-only.

    Categorical predictor indices, specified as a vector of positive integers. CategoricalPredictors contains index values indicating that the corresponding predictors are categorical. The index values are between 1 and p, where p is the number of predictors used to train the model. If none of the predictors are categorical, then this property is empty ([]).

    Data Types: double

    ClassNames

    This property is read-only.

    Unique class labels used in training, specified as a categorical or character array, logical or numeric vector, or cell array of character vectors. ClassNames has the same data type as the class labels Y. (The software treats string arrays as cell arrays of character vectors.) ClassNames also determines the class order.

    Data Types: single | double | logical | char | cell | categorical

    Cost

    Misclassification costs, specified as a 2-by-2 numeric matrix.

    Cost(i,j) is the cost of classifying a point into class j if its true class is i. The order of the rows and columns of Cost corresponds to the order of the classes in ClassNames.

    The software uses the Cost value for prediction, but not training. You can change the value by using dot notation.

    Example: Mdl.Cost = C;
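
    For instance, a hypothetical cost matrix that makes misclassifying the first class in ClassNames twice as costly as misclassifying the second class:

    Mdl.Cost = [0 2; 1 0]; % Cost(1,2) = 2 and Cost(2,1) = 1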

    Data Types: double

    ExpandedPredictorNames

    This property is read-only.

    Expanded predictor names, specified as a cell array of character vectors.

    ExpandedPredictorNames is the same as PredictorNames for a generalized additive model.

    Data Types: cell

    NumObservations

    This property is read-only.

    Number of observations in the training data stored in X and Y, specified as a numeric scalar.

    Data Types: double

    PredictorNames

    This property is read-only.

    Predictor variable names, specified as a cell array of character vectors. The order of the elements in PredictorNames corresponds to the order in which the predictor names appear in the training data.

    Data Types: cell

    Prior

    This property is read-only.

    Prior class probabilities, specified as a numeric vector with two elements. The order of the elements corresponds to the order of the elements in ClassNames.

    Data Types: double

    ResponseName

    This property is read-only.

    Response variable name, specified as a character vector.

    Data Types: char

    RowsUsed

    This property is read-only.

    Rows of the original training data used in fitting the ClassificationGAM model, specified as a logical vector. This property is empty if all rows are used.

    Data Types: logical

    ScoreTransform

    Score transformation, specified as a character vector or function handle. ScoreTransform represents a built-in transformation function or a function handle for transforming predicted classification scores.

    To change the score transformation function to function, for example, use dot notation.

    • For a built-in function, enter a character vector.

      Mdl.ScoreTransform = 'function';

      This table describes the available built-in functions.

      Value                  Description
      'doublelogit'          1/(1 + e^(–2x))
      'invlogit'             log(x / (1 – x))
      'ismax'                Sets the score for the class with the largest score to 1, and sets the scores for all other classes to 0
      'logit'                1/(1 + e^(–x))
      'none' or 'identity'   x (no transformation)
      'sign'                 –1 for x < 0, 0 for x = 0, 1 for x > 0
      'symmetric'            2x – 1
      'symmetricismax'       Sets the score for the class with the largest score to 1, and sets the scores for all other classes to –1
      'symmetriclogit'       2/(1 + e^(–x)) – 1

    • For a MATLAB® function or a function that you define, enter its function handle.

      Mdl.ScoreTransform = @function;

      function must accept a matrix (the original scores) and return a matrix of the same size (the transformed scores).

    This property determines the output score computation for object functions such as predict, margin, and edge. Use 'logit' to compute posterior probabilities, and use 'none' to compute the logit of posterior probabilities.
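
    For example, a sketch of a custom function handle (assuming a trained model Mdl) that reproduces the built-in 'logit' transformation:

    Mdl.ScoreTransform = @(s) 1./(1 + exp(-s)); % map logit scores to posterior probabilities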

    Data Types: char | function_handle

    W

    This property is read-only.

    Observation weights used to train the model, specified as an n-by-1 numeric vector. n is the number of observations (NumObservations).

    The software normalizes the observation weights specified in the 'Weights' name-value argument so that the elements of W within a particular class sum up to the prior probability of that class.
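
    As a sketch of what this normalization implies (assuming a trained model Mdl whose labels are stored as a cell array, categorical array, or numeric vector), the within-class weight sums should match the prior probabilities in Mdl.Prior:

    [~,classIdx] = ismember(Mdl.Y,Mdl.ClassNames); % map each label to its class index (1 or 2)
    classWeightSums = accumarray(classIdx,Mdl.W)   % should equal Mdl.Prior(:)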

    Data Types: double

    X

    This property is read-only.

    Predictors used to train the model, specified as a numeric matrix or table.

    Each row of X corresponds to one observation, and each column corresponds to one variable.

    Data Types: single | double | table

    Y

    This property is read-only.

    Class labels used to train the model, specified as a categorical or character array, logical or numeric vector, or cell array of character vectors. Y has the same data type as the response variable used to train the model. (The software treats string arrays as cell arrays of character vectors.)

    Each row of Y represents the observed classification of the corresponding row of X.

    Data Types: single | double | logical | char | cell | categorical

    Hyperparameter Optimization Properties

    HyperparameterOptimizationResults

    This property is read-only.

    Description of the cross-validation optimization of hyperparameters, specified as a BayesianOptimization object or a table of hyperparameters and associated values. This property is nonempty if you set the 'OptimizeHyperparameters' name-value argument of fitcgam to a value other than 'none' (the default) when you create the object. The value of HyperparameterOptimizationResults depends on the setting of the Optimizer field in the HyperparameterOptimizationOptions structure of fitcgam when the object is created.

    Value of Optimizer Option          Value of HyperparameterOptimizationResults
    "bayesopt" (default)               Object of class BayesianOptimization
    "gridsearch" or "randomsearch"     Table of hyperparameters used, observed objective function values (cross-validation loss), and rank of observations from lowest (best) to highest (worst)

    Object Functions

    compact - Reduce size of machine learning model
    crossval - Cross-validate machine learning model
    addInteractions - Add interaction terms to univariate generalized additive model (GAM)
    resume - Resume training of generalized additive model (GAM)
    lime - Local interpretable model-agnostic explanations (LIME)
    partialDependence - Compute partial dependence
    plotLocalEffects - Plot local effects of terms in generalized additive model (GAM)
    plotPartialDependence - Create partial dependence plot (PDP) and individual conditional expectation (ICE) plots
    shapley - Shapley values
    predict - Classify observations using generalized additive model (GAM)
    loss - Classification loss for generalized additive model (GAM)
    margin - Classification margins for generalized additive model (GAM)
    edge - Classification edge for generalized additive model (GAM)
    resubPredict - Classify training data using trained classifier
    resubLoss - Resubstitution classification loss
    resubMargin - Resubstitution classification margin
    resubEdge - Resubstitution classification edge
    compareHoldout - Compare accuracies of two classification models using new data
    testckfold - Compare accuracies of two classification models by repeated cross-validation

    Examples

    Train a univariate generalized additive model, which contains linear terms for predictors. Then, interpret the prediction for a specified data instance by using the plotLocalEffects function.

    Load the ionosphere data set. This data set has 34 predictors and 351 binary responses for radar returns, either bad ('b') or good ('g').

    load ionosphere

    Train a univariate GAM that identifies whether the radar return is bad ('b') or good ('g').

    Mdl = fitcgam(X,Y)
    Mdl = 
      ClassificationGAM
                 ResponseName: 'Y'
        CategoricalPredictors: []
                   ClassNames: {'b'  'g'}
               ScoreTransform: 'logit'
                    Intercept: 2.2715
              NumObservations: 351
    
    
    

    Mdl is a ClassificationGAM model object. The model display shows a partial list of the model properties. To view the full list of properties, double-click the variable name Mdl in the Workspace. The Variables editor opens for Mdl. Alternatively, you can display the properties in the Command Window by using dot notation. For example, display the class order of Mdl.

    classOrder = Mdl.ClassNames
    classOrder = 2x1 cell
        {'b'}
        {'g'}
    
    

    Classify the first observation of the training data, and plot the local effects of the terms in Mdl on the prediction.

    label = predict(Mdl,X(1,:))
    label = 1x1 cell array
        {'g'}
    
    
    plotLocalEffects(Mdl,X(1,:))

    Figure: "Local Effects Plot" bar graph showing the local effect (x-axis) of each term (y-axis).

    The predict function classifies the first observation X(1,:) as 'g'. The plotLocalEffects function creates a horizontal bar graph that shows the local effects of the 10 most important terms on the prediction. Each local effect value shows the contribution of each term to the classification score for 'g', which is the logit of the posterior probability that the classification is 'g' for the observation.

    Train a generalized additive model that contains linear and interaction terms for predictors in three different ways:

    • Specify the interaction terms using the formula input argument.

    • Specify the 'Interactions' name-value argument.

    • Build a model with linear terms first and add interaction terms to the model by using the addInteractions function.

    Load Fisher's iris data set. Create a table that contains observations for versicolor and virginica.

    load fisheriris
    inds = strcmp(species,'versicolor') | strcmp(species,'virginica');
    tbl = array2table(meas(inds,:),'VariableNames',["x1","x2","x3","x4"]);
    tbl.Y = species(inds,:);

    Specify formula

    Train a GAM that contains the four linear terms (x1, x2, x3, and x4) and two interaction terms (x1*x2 and x2*x3). Specify the terms using a formula in the form 'Y ~ terms'.

    Mdl1 = fitcgam(tbl,'Y ~ x1 + x2 + x3 + x4 + x1:x2 + x2:x3');

    The function adds interaction terms to the model in the order of importance. You can use the Interactions property to check the interaction terms in the model and the order in which fitcgam adds them to the model. Display the Interactions property.

    Mdl1.Interactions
    ans = 2×2
    
         2     3
         1     2
    
    

    Each row of Interactions represents one interaction term and contains the column indexes of the predictor variables for the interaction term.

    Specify 'Interactions'

    Pass the training data (tbl) and the name of the response variable in tbl to fitcgam, so that the function includes the linear terms for all the other variables as predictors. Specify the 'Interactions' name-value argument using a logical matrix to include the two interaction terms, x1*x2 and x2*x3.

    Mdl2 = fitcgam(tbl,'Y','Interactions',logical([1 1 0 0; 0 1 1 0]));
    Mdl2.Interactions
    ans = 2×2
    
         2     3
         1     2
    
    

    You can also specify 'Interactions' as the number of interaction terms or as 'all' to include all available interaction terms. Among the specified interaction terms, fitcgam identifies those whose p-values are not greater than the 'MaxPValue' value and adds them to the model. The default 'MaxPValue' is 1 so that the function adds all specified interaction terms to the model.

    Specify 'Interactions','all' and set the 'MaxPValue' name-value argument to 0.01.

    Mdl3 = fitcgam(tbl,'Y','Interactions','all','MaxPValue',0.01);
    Mdl3.Interactions
    ans = 5×2
    
         3     4
         2     4
         1     4
         2     3
         1     3
    
    

    Mdl3 includes five of the six available pairs of interaction terms.

    Use addInteractions Function

    Train a univariate GAM that contains linear terms for predictors, and then add interaction terms to the trained model by using the addInteractions function. Specify the second input argument of addInteractions in the same way you specify the 'Interactions' name-value argument of fitcgam. You can specify the list of interaction terms using a logical matrix, the number of interaction terms, or 'all'.

    Specify the number of interaction terms as 5 to add the five most important interaction terms to the trained model.

    Mdl4 = fitcgam(tbl,'Y');
    UpdatedMdl4 = addInteractions(Mdl4,5);
    UpdatedMdl4.Interactions
    ans = 5×2
    
         3     4
         2     4
         1     4
         2     3
         1     3
    
    

    Mdl4 is a univariate GAM, and UpdatedMdl4 is an updated GAM that contains all the terms in Mdl4 and five additional interaction terms.

    Train a univariate classification GAM (which contains only linear terms) for a small number of iterations. Then resume training for more iterations, and compare the resubstitution losses.

    Load the ionosphere data set. This data set has 34 predictors and 351 binary responses for radar returns, either bad ('b') or good ('g').

    load ionosphere

    Train a univariate GAM that identifies whether the radar return is bad ('b') or good ('g'). Specify the number of trees per linear term as 2. fitcgam iterates the boosting algorithm for the specified number of iterations. For each boosting iteration, the function adds one tree per linear term. Specify 'Verbose' as 2 to display diagnostic messages at every iteration.

    Mdl = fitcgam(X,Y,'NumTreesPerPredictor',2,'Verbose',2);
    |========================================================|
    | Type | NumTrees |  Deviance  |   RelTol   | LearnRate  |
    |========================================================|
    |    1D|         0|      486.59|      -     |      -     |
    |    1D|         1|      166.71|         Inf|           1|
    |    1D|         2|      78.336|     0.58205|           1|
    

    To check whether fitcgam trains the specified number of trees, display the ReasonForTermination property of the trained model and view the displayed message.

    Mdl.ReasonForTermination
    ans = struct with fields:
          PredictorTrees: 'Terminated after training the requested number of trees.'
        InteractionTrees: ''
    
    

    Compute the classification loss for the training data.

    resubLoss(Mdl)
    ans = 
    0.0142
    

    Resume training the model for another 100 iterations. Because Mdl contains only linear terms, the resume function resumes training for the linear terms and adds more trees for them (predictor trees). Specify 'Verbose' and 'NumPrint' to display diagnostic messages at every 10 iterations.

    UpdatedMdl = resume(Mdl,100,'Verbose',1,'NumPrint',10);
    |========================================================|
    | Type | NumTrees |  Deviance  |   RelTol   | LearnRate  |
    |========================================================|
    |    1D|         0|      78.336|      -     |      -     |
    |    1D|         1|      38.364|     0.17429|           1|
    |    1D|        10|     0.16311|    0.011894|           1|
    |    1D|        20|  0.00035693|   0.0025178|           1|
    |    1D|        30|  8.1191e-07|   0.0011006|           1|
    |    1D|        40|  1.7978e-09|  0.00074607|           1|
    |    1D|        50|  3.6113e-12|  0.00034404|           1|
    |    1D|        60|  1.7497e-13|  0.00016541|           1|
    
    UpdatedMdl.ReasonForTermination
    ans = struct with fields:
          PredictorTrees: 'Unable to improve the model fit.'
        InteractionTrees: ''
    
    

    resume terminates training when adding more trees does not improve the deviance of the model fit.

    Compute the classification loss using the updated model.

    resubLoss(UpdatedMdl)
    ans = 
    0
    

    The classification loss decreases after resume updates the model with more iterations.

    Version History

    Introduced in R2021a