Train Nearest Neighbor Classifiers Using Classification Learner App

This example shows how to construct nearest neighbor classifiers in the Classification Learner app.

  1. In MATLAB®, read the fisheriris data set into a table to use for classification.

    % Read the Fisher iris measurements and species labels into a table
    fishertable = readtable('fisheriris.csv');
    
  2. On the Apps tab, in the Machine Learning and Deep Learning group, click Classification Learner.

  3. On the Classification Learner tab, in the File section, click New Session > From Workspace.

    Classification Learner tab

    In the New Session from Workspace dialog box, select the table fishertable from the Data Set Variable list (if necessary). Observe that the app has selected response and predictor variables based on their data type. Petal and sepal length and width are predictors, and species is the response that you want to classify. For this example, do not change the selections.

  4. Click Start Session.

    The app creates a scatter plot of the data.

  5. Use the scatter plot to investigate which variables are useful for predicting the response. To visualize the distribution of species and measurements, select different options on the Variable on X axis and Variable on Y axis menus. Observe which variables separate the species colors most clearly.
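    If you prefer to explore the predictors at the command line, the following is a minimal sketch using gscatter (assuming the default variable names in fisheriris.csv):

    % Plot two predictors against each other, colored by species,
    % similar to the app's scatter plot
    gscatter(fishertable.PetalLength, fishertable.PetalWidth, fishertable.Species)
    xlabel('Petal length'); ylabel('Petal width')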

  6. To create a selection of nearest neighbors models, on the Classification Learner tab, on the far right of the Model Type section, click the arrow to expand the list of classifiers, and under Nearest Neighbor Classifiers, click All KNNs.

  7. In the Training section, click Train.

    Tip

    If you have Parallel Computing Toolbox™, you can train all the models (All KNNs) simultaneously by selecting the Use Parallel button in the Training section before clicking Train. After you click Train, the Opening Parallel Pool dialog box opens and remains open while the app opens a parallel pool of workers. During this time, you cannot interact with the software. After the pool opens, the app trains the models simultaneously.

    Classification Learner trains one of each nonoptimizable nearest neighbor classification option in the gallery and highlights the best model by outlining its Accuracy (Validation) score in a box. Classification Learner also displays a validation confusion matrix for the first KNN model (Fine KNN).

    Validation confusion matrix of the iris data modeled by a KNN classifier. Blue values indicate correct classifications, and red values indicate incorrect classifications.

    Note

    Validation introduces some randomness into the results. Your model validation results can vary from the results shown in this example.
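
    At the command line, the gallery presets correspond roughly to fitcknn calls with different name-value settings. Here is a minimal sketch of a fine KNN (one neighbor) estimated with 5-fold cross-validation; the settings shown are illustrative, not the app's exact configuration:

    % Train a 1-nearest-neighbor classifier and estimate validation accuracy
    knnModel = fitcknn(fishertable, 'Species', 'NumNeighbors', 1);
    cvModel = crossval(knnModel, 'KFold', 5);
    validationAccuracy = 1 - kfoldLoss(cvModel)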

  8. To view the results for a model, select the model in the Models pane, and inspect the Current Model Summary pane, which displays the Training Results metrics calculated on the validation set.

  9. For the selected model, inspect the accuracy of the predictions in each class. On the Classification Learner tab, in the Plots section, click the arrow to open the gallery, and then click Confusion Matrix (Validation) in the Validation Results group. View the matrix of true class and predicted class results.
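
    A command-line analog of this plot, reusing the cross-validated model from the earlier sketch:

    % Compare true classes with the cross-validated predictions
    predictedSpecies = kfoldPredict(cvModel);
    confusionchart(fishertable.Species, predictedSpecies)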

  10. Select the other models in the Models pane, open the validation confusion matrix for each of the models, and then compare the results.

  11. Choose the best model (its score is outlined in a box). To try to improve it, include different features in the model; see whether you can improve accuracy by removing features with low predictive power.

    On the Classification Learner tab, in the Features section, click Feature Selection. In the Feature Selection dialog box, select predictors to remove from the model, and click OK. In the Training section, click Train to train a new model using the new options. Compare results among the classifiers in the Models pane.
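
    A minimal command-line sketch of the same idea, keeping only the petal measurements (the predictor names assume the default fisheriris.csv variable names):

    % Train on a subset of predictors and compare validation accuracy
    reducedModel = fitcknn(fishertable, 'Species', ...
        'PredictorNames', {'PetalLength','PetalWidth'}, 'NumNeighbors', 1);
    reducedAccuracy = 1 - kfoldLoss(crossval(reducedModel, 'KFold', 5))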

  12. To investigate features to include or exclude, use the parallel coordinates plot. On the Classification Learner tab, in the Plots section, click the arrow to open the gallery, and click Parallel Coordinates in the Validation Results group.
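
    You can produce a similar plot at the command line with parallelcoords (assuming the four measurement columns come before the Species column in the table):

    % Parallel coordinates of the four measurements, grouped by species
    parallelcoords(fishertable{:,1:4}, 'Group', fishertable.Species, ...
        'Labels', fishertable.Properties.VariableNames(1:4))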

  13. Choose the best model in the Models pane. To try to improve it further, change its settings. On the Classification Learner tab, in the Model Type section, click Advanced. In the Advanced KNN Options dialog box, change a setting and click OK. Train the new model by clicking Train in the Training section. For information on settings and the strengths of different nearest neighbor model types, see Nearest Neighbor Classifiers.
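
    For comparison, a hedged command-line sketch of changing the same kinds of settings with fitcknn (the values are illustrative):

    % Try more neighbors, a cosine distance metric, and inverse distance weighting
    cosineKNN = fitcknn(fishertable, 'Species', 'NumNeighbors', 10, ...
        'Distance', 'cosine', 'DistanceWeight', 'inverse');
    cosineAccuracy = 1 - kfoldLoss(crossval(cosineKNN, 'KFold', 5))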

  14. You can export a full version of the trained model to the workspace. On the Classification Learner tab, in the Export section, click Export Model and select either Export Model or Export Compact Model. Note that either option exports a full version of the trained model because nearest neighbor models always store training data. See Export Classification Model to Predict New Data.
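
    The exported variable (named trainedModel by default) is a structure with a predictFcn field. A minimal usage sketch, where newTable is a hypothetical table containing the same predictor variables as the training data:

    % Predict species for new observations with the exported model
    yfit = trainedModel.predictFcn(newTable)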

  15. To examine the code for training this classifier, click Generate Function.

Use the same workflow to evaluate and compare the other classifier types you can train in Classification Learner.

To try all the nonoptimizable classifier model presets available for your data set:

  1. Click the arrow on the far right of the Model Type section to expand the list of classifiers.

  2. Click All, then click Train.

    Option selected for training all available classifier types

To learn about other classifier types, see Train Classification Models in Classification Learner App.
