
Loss of *k*-nearest neighbor classifier

`L = loss(mdl,tbl,ResponseVarName)`

`L = loss(mdl,tbl,Y)`

`L = loss(mdl,X,Y)`

`L = loss(___,Name,Value)`

`L = loss(mdl,tbl,ResponseVarName)` returns a scalar representing how well `mdl` classifies the data in `tbl` when `tbl.ResponseVarName` contains the true classifications. If `tbl` contains the response variable used to train `mdl`, then you do not need to specify `ResponseVarName`.

When computing the loss, the `loss` function normalizes the class probabilities in `tbl.ResponseVarName` to the class probabilities used for training, which are stored in the `Prior` property of `mdl`.

The meaning of the classification loss (`L`) depends on the loss function and weighting scheme, but, in general, better classifiers yield smaller classification loss values. For more details, see Classification Loss.
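As a sketch of this syntax (assuming the Statistics and Machine Learning Toolbox and the `fisheriris` sample data set are available; the variable names are illustrative):

```matlab
% Hypothetical example: train a 4-nearest-neighbor classifier on a table
% and evaluate its classification loss on the same data.
load fisheriris                          % meas (150x4), species (150x1 cell)
tbl = array2table(meas, ...
    'VariableNames',{'SL','SW','PL','PW'});
tbl.Species = species;                   % add the response variable
mdl = fitcknn(tbl,'Species','NumNeighbors',4);
L = loss(mdl,tbl,'Species')              % mean misclassification rate by default
```

Because `tbl` contains the response variable used to train `mdl`, `L = loss(mdl,tbl)` would also work here.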

`L = loss(mdl,tbl,Y)` returns a scalar representing how well `mdl` classifies the data in `tbl` when `Y` contains the true classifications.

When computing the loss, the `loss` function normalizes the class probabilities in `Y` to the class probabilities used for training, which are stored in the `Prior` property of `mdl`.
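A sketch of this syntax, again assuming the `fisheriris` sample data set: here the true labels are passed separately as `Y` rather than stored in the table.

```matlab
% Hypothetical example: predictors in a table, response supplied as Y.
load fisheriris
tbl = array2table(meas);                 % predictor columns only
mdl = fitcknn(tbl,species,'NumNeighbors',4);
L = loss(mdl,tbl,species)                % species supplies the true labels
```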

`L = loss(mdl,X,Y)` returns a scalar representing how well `mdl` classifies the data in `X` when `Y` contains the true classifications.

When computing the loss, the `loss` function normalizes the class probabilities in `Y` to the class probabilities used for training, which are stored in the `Prior` property of `mdl`.
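The matrix form can be sketched the same way (assuming `fisheriris`): predictors as a numeric matrix `X` and labels as `Y`.

```matlab
% Hypothetical example: predictors as a numeric matrix, labels as Y.
load fisheriris
mdl = fitcknn(meas,species,'NumNeighbors',4);
L = loss(mdl,meas,species)               % classification error on X = meas
```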

`L = loss(___,Name,Value)` specifies options using one or more name-value pair arguments in addition to the input arguments in previous syntaxes. For example, you can specify the loss function and the classification weights.
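For instance, a sketch using name-value pair arguments (assuming `fisheriris`; the uniform weight vector `w` is purely illustrative):

```matlab
% Hypothetical example: specify the loss function and observation
% weights via name-value pair arguments.
load fisheriris
mdl = fitcknn(meas,species,'NumNeighbors',4);
w = ones(size(species));                 % uniform weights, for illustration
L = loss(mdl,meas,species,'LossFun','classiferror','Weights',w)
```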

`ClassificationKNN` | `edge` | `fitcknn` | `margin`