
# resubMargin

Margin of k-nearest neighbor classifier by resubstitution

## Syntax

``m = resubMargin(mdl)``

## Description


`m = resubMargin(mdl)` returns the classification margins (`m`) of the data used to train `mdl`. `m` is returned as a numeric vector of length `size(mdl.X,1)`, where `mdl.X` is the training data for `mdl`. Each entry in `m` represents the margin for the corresponding row of `mdl.X` and the corresponding true class label in `mdl.Y`.

## Examples


Create a k-nearest neighbor classifier for the Fisher iris data, where $k$ = 5.

Load the Fisher iris data set.

```
load fisheriris
X = meas;
Y = species;
```

Create a classifier for five nearest neighbors.

`mdl = fitcknn(X,Y,'NumNeighbors',5);`

Examine some statistics of the resubstitution margin of the classifier.

```
m = resubMargin(mdl);
[max(m) min(m) mean(m)]
```

```
ans = 1×3

    1.0000   -0.6000    0.9253
```

The mean margin is over 0.9, indicating fairly high classification accuracy for resubstitution. For a more reliable assessment of model accuracy, consider cross-validating the model, for example by using `kfoldLoss` on a cross-validated classifier.
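As a rough cross-check (not part of the MATLAB workflow above), the same resubstitution margins can be approximated in Python with scikit-learn's `KNeighborsClassifier` on the iris data. Exact values may differ from MATLAB's because of different defaults for tie-breaking and distance handling.

```python
# Approximate analogue of the MATLAB example: fit a 5-nearest-neighbor
# classifier to the iris data and compute resubstitution margins, i.e.
# posterior of the true class minus the largest posterior of any other class.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
mdl = KNeighborsClassifier(n_neighbors=5).fit(X, y)

scores = mdl.predict_proba(X)             # n-by-3 posterior probabilities
idx = np.arange(len(y))
true_score = scores[idx, y]               # score of each observation's true class
others = scores.copy()
others[idx, y] = -np.inf                  # mask out the true class
margin = true_score - others.max(axis=1)  # one margin per training observation

print(margin.max(), margin.min(), margin.mean())
```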

## Input Arguments


`mdl` — k-nearest neighbor classifier model, specified as a `ClassificationKNN` object.

## More About


### Margin

The classification margin for each observation is the difference between the classification score for the true class and the maximal classification score for the false classes.
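This definition can be sketched in a few lines of Python (the score vector and true-class index below are hypothetical, not taken from the page):

```python
# Minimal sketch of the margin definition: score of the true class minus
# the maximal score among the remaining (false) classes.
import numpy as np

scores = np.array([0.6, 0.4, 0.0])   # posterior probabilities for classes 0, 1, 2
true_class = 0

others = np.delete(scores, true_class)
margin = scores[true_class] - others.max()   # 0.6 - 0.4
print(margin)
```

A positive margin means the true class outscores every other class; a negative margin means the observation is misclassified.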

### Score

The score of a classification is the posterior probability of the classification. The posterior probability is the number of neighbors with that classification divided by the number of neighbors. For a more detailed definition that includes weights and prior probabilities, see Posterior Probability.
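The unweighted case can be sketched as follows (the neighbor labels are hypothetical; weights and prior probabilities are ignored here):

```python
# Sketch of the unweighted k-NN score: the posterior for a class is the
# fraction of the k nearest neighbors carrying that class label.
import numpy as np

neighbor_labels = np.array([1, 1, 0, 1, 2])   # labels of the k = 5 nearest neighbors
k = len(neighbor_labels)
classes = np.array([0, 1, 2])

posterior = np.array([(neighbor_labels == c).sum() / k for c in classes])
print(posterior)   # [0.2 0.6 0.2]
```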
