Train Classification Models in Classification Learner App
You can use Classification Learner to train models of these classifiers: decision trees, discriminant analysis, support vector machines, logistic regression, nearest neighbors, naive Bayes, kernel approximation, ensembles, and neural networks. In addition to training models, you can explore your data, select features, specify validation schemes, and evaluate results. You can export a model to the workspace to use the model with new data or generate MATLAB® code to learn about programmatic classification.
Training a model in Classification Learner consists of two parts:
Validated Model: Train a model with a validation scheme. By default, the app protects against overfitting by applying cross-validation. Alternatively, you can choose holdout validation. The validated model is visible in the app.
Full Model: Train a model on full data without validation. The app trains this model simultaneously with the validated model. However, the model trained on full data is not visible in the app. When you choose a classifier to export to the workspace, Classification Learner exports the full model.
The app displays the results of the validated model. Diagnostic measures, such as model accuracy, and plots, such as a scatter plot or the confusion matrix chart, reflect the validated model results. You can automatically train one or more classifiers, compare validation results, and choose the best model that works for your classification problem. When you choose a model to export to the workspace, Classification Learner exports the full model. Because Classification Learner creates a model object of the full model during training, you experience no lag time when you export the model. You can use the exported model to make predictions on new data.
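Programmatically, the two-part training described above resembles the following sketch, using `fitctree` on the Fisher iris data that ships with Statistics and Machine Learning Toolbox. This is an illustrative analogy, not the app's actual implementation:

```matlab
% Load a sample data set (ships with Statistics and Machine Learning Toolbox).
load fisheriris                      % predictors: meas, response: species

% Full model: trained on all the data, analogous to the model the app exports.
fullModel = fitctree(meas, species);

% Validated model: a cross-validated copy used only to estimate accuracy.
cvModel = crossval(fullModel, 'KFold', 5);
validationAccuracy = 1 - kfoldLoss(cvModel);
fprintf('Validation accuracy: %.3f\n', validationAccuracy)

% Predictions on new data come from the full model.
label = predict(fullModel, [5.1 3.5 1.4 0.2]);
```

The validation accuracy printed here plays the role of the Accuracy (Validation) score shown in the app, while `fullModel` corresponds to the model you export to the workspace.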
Automated Classifier Training
You can use Classification Learner to automatically train a selection of different classification models on your data.
Get started by automatically training multiple models at once. You can quickly try a selection of models, then explore promising models interactively.
If you already know what classifier type you want, train individual classifiers instead. See Manual Classifier Training.
On the Apps tab, in the Machine Learning and Deep Learning group, click Classification Learner.
Click New Session and select data from the workspace or from a file. Specify a response variable and variables to use as predictors. See Select Data and Validation for Classification Problem.
On the Classification Learner tab, in the Model Type section, click All Quick-To-Train. This option trains all the model presets available for your data set that are fast to fit.
If you have Parallel Computing Toolbox™, you can train the models in parallel. See Parallel Classifier Training.
A selection of model types appears in the Models pane. When the models finish training, the best percentage Accuracy (Validation) score is highlighted in a box.
Click models in the Models pane and open the corresponding plots to explore results.
To try all the nonoptimizable classifier model presets available for your data set, click All, then click Train.
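Outside the app, a comparable workflow fits several quick-to-train classifier types and compares their cross-validation accuracy. This sketch uses a few standard fitting functions; the app's preset list is longer and its presets use tuned option values:

```matlab
load fisheriris
fitters = {@fitctree, @fitcdiscr, @fitcknn, @fitcnb};   % a few quick-to-train types
names   = {'Tree', 'Discriminant', 'KNN', 'Naive Bayes'};

for k = 1:numel(fitters)
    mdl = fitters{k}(meas, species);
    acc = 1 - kfoldLoss(crossval(mdl, 'KFold', 5));
    fprintf('%-12s validation accuracy: %.3f\n', names{k}, acc)
end
```

Comparing the printed scores mirrors comparing the Accuracy (Validation) values in the Models pane.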
Manual Classifier Training
If you want to explore individual model types, or if you already know what classifier type you want, you can train classifiers one at a time or train a group of the same type.
Choose a classifier. On the Classification Learner tab, in the Model Type section, click a classifier type. To see all available classifier options, click the arrow on the far right of the Model Type section to expand the list of classifiers. The nonoptimizable model options in the Model Type gallery are preset starting points with different settings, suitable for a range of different classification problems.
To read a description of each classifier, switch to the details view.
For more information on each option, see Choose Classifier Options.
After selecting a classifier, click Train.
Repeat to try different classifiers.
Try decision trees and discriminants first. If the models are not accurate enough at predicting the response, try other classifiers with higher flexibility. To avoid overfitting, look for a model of lower flexibility that provides sufficient accuracy.

If you want to try all nonoptimizable models of the same or different types, then select one of the All options in the Model Type gallery.
Alternatively, if you want to automatically tune hyperparameters of a specific model type, select the corresponding Optimizable model and perform hyperparameter optimization. For more information, see Hyperparameter Optimization in Classification Learner App.
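At the command line, the Optimizable models correspond to the 'OptimizeHyperparameters' name-value argument of the fitting functions. A minimal sketch for a tree (the app's optimization settings may differ):

```matlab
load fisheriris
rng default   % for reproducibility of the Bayesian optimization

mdl = fitctree(meas, species, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', struct('ShowPlots', false, 'Verbose', 0));
```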
For next steps, see Compare and Improve Classification Models.
Parallel Classifier Training
You can train models in parallel using Classification Learner if you have Parallel Computing Toolbox. Parallel training allows you to train multiple classifiers at once and continue working.
To control parallel training, toggle the Use Parallel button on the app toolstrip. The Use Parallel button is available only if you have Parallel Computing Toolbox.
The first time you click Train after clicking the Use Parallel button, a dialog box is displayed while the app opens a parallel pool of workers. After the pool opens, you can train multiple classifiers at once.
When classifiers are training in parallel, progress indicators appear on each training and queued model in the Models pane. You can cancel individual models, if you want. During training, you can examine results and plots from models, and initiate training of more classifiers.
If you have Parallel Computing Toolbox, then parallel training is available in Classification Learner, and you do not need to set the UseParallel option of the statset function.
You cannot perform hyperparameter optimization in parallel. The app disables the Use Parallel button when you select an optimizable model. If you then select a nonoptimizable model, the button is off by default.
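In code, training several classifiers in parallel corresponds to opening a pool of workers and distributing the fits yourself, for example with parfor. This is a sketch of the idea; the app manages its own pool and scheduling:

```matlab
load fisheriris
fitters = {@fitctree, @fitcdiscr, @fitcknn};
acc = zeros(1, numel(fitters));

% Open a parallel pool if one is not already running (Parallel Computing Toolbox).
if isempty(gcp('nocreate'))
    parpool;
end

parfor k = 1:numel(fitters)
    mdl = fitters{k}(meas, species);            % each worker trains one model
    acc(k) = 1 - kfoldLoss(crossval(mdl, 'KFold', 5));
end
disp(acc)
```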
Compare and Improve Classification Models
Examine the Accuracy (Validation) score reported in the Models pane for each model. Click models in the Models pane and open the corresponding plots to explore the results. Compare model performance by inspecting results in the plots. You can rearrange the layout of the plots to compare results across multiple models: use the options in the Layout button, drag and drop plots, or select the options provided by the Document Actions arrow located to the right of the model plot tabs.
Additionally, you can compare the models by using the Sort by options in the Models pane. Delete any unwanted model by selecting the model and clicking the Delete selected model button in the upper right of the pane, or right-clicking the model and selecting Delete model.
Select the best model in the Models pane and then try including and excluding different features in the model. Click Feature Selection.
Try the parallel coordinates plot to help you identify features to remove. See if you can improve the model by removing features with low predictive power. Specify predictors to include in the model, and train new models using the new options. Compare results among the models in the Models pane.
You can also try transforming features with PCA to reduce dimensionality.
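Programmatically, PCA-based feature transformation resembles calling pca and keeping enough components to explain most of the variance. A sketch, assuming a 95% explained-variance threshold (the app lets you set this threshold in its PCA options):

```matlab
load fisheriris
[coeff, score, ~, ~, explained] = pca(meas);

% Keep the fewest components that explain at least 95% of the variance.
numComp = find(cumsum(explained) >= 95, 1);
reduced = score(:, 1:numComp);

% Train on the reduced predictors.
mdl = fitctree(reduced, species);
```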
To improve the model further, you can try changing classifier parameter settings in the Advanced dialog box, and then train using the new options. To learn how to control model flexibility, see Choose Classifier Options. For information on how to tune model parameter settings automatically, see Hyperparameter Optimization in Classification Learner App.
If feature selection, PCA, or new parameter settings improve your model, try training All model types with the new settings. See if another model type does better with the new settings.
To avoid overfitting, look for a model of lower flexibility that provides sufficient accuracy. For example, look for simple models such as decision trees and discriminants that are fast and easy to interpret. If the models are not accurate enough at predicting the response, choose other classifiers with higher flexibility, such as ensembles. To learn about model flexibility, see Choose Classifier Options.
This figure shows the app with a Models pane containing various classifier types.
For a step-by-step example comparing different classifiers, see Train Decision Trees Using Classification Learner App.
For next steps, generate code to train the model with different data, or export trained models to the workspace to make predictions using new data. See Export Classification Model to Predict New Data.
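An exported model arrives in the workspace as a structure (named trainedModel by default) whose predictFcn field makes predictions on new data supplied in the same form as the training data:

```matlab
% Assumes you exported a model from the app as trainedModel, and that
% newData contains the same predictor variables used during training.
yfit = trainedModel.predictFcn(newData);
```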
- Select Data and Validation for Classification Problem
- Choose Classifier Options
- Feature Selection and Feature Transformation Using Classification Learner App
- Assess Classifier Performance in Classification Learner
- Export Classification Model to Predict New Data
- Train Decision Trees Using Classification Learner App
- Machine Learning in MATLAB