Compute performance metrics for average receiver operating characteristic (ROC) curve in multiclass problem

[FPR,TPR,Thresholds,AUC] = average(rocObj,type) computes the averages of performance metrics stored in the rocmetrics object rocObj for a multiclass classification problem, using the averaging method specified in type. The function returns the average false positive rate (FPR) and the average true positive rate (TPR) for each threshold value in Thresholds. The function also returns AUC, the area under the ROC curve composed of FPR and TPR.
Find Average ROC Curve
Compute the performance metrics for a multiclass classification problem by creating a rocmetrics object, and then compute the average values for the metrics by using the average function. Plot the average ROC curve using the outputs of average.
Load a sample of true labels and the prediction scores for a classification problem. For this example, there are five classes: daisy, dandelion, roses, sunflowers, and tulips. The class names are stored in classNames. The scores are softmax prediction scores. scores is an N-by-K array, where N is the number of observations and K is the number of classes. The column order of scores follows the class order stored in classNames.
load('flowersDataResponses.mat')
scores = flowersData.scores;
trueLabels = flowersData.trueLabels;
classNames = flowersData.classNames;
Create a rocmetrics object by using the true labels in trueLabels and the classification scores in scores. Specify the column order of scores using classNames.
rocObj = rocmetrics(trueLabels,scores,classNames);
rocmetrics computes the FPR and TPR at different thresholds and finds the AUC value for each class.
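As a sketch (assuming rocObj was created as above), you can inspect the per-class results through the object's Metrics table and AUC property:

```matlab
% Per-class performance metrics (thresholds, FPR, TPR) stored as a table;
% display the first few rows.
head(rocObj.Metrics)

% Per-class AUC values, one per class.
rocObj.AUC
```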
Compute the average performance metric values, including the FPR and TPR at different thresholds and the AUC value, using the macro-averaging method.
[FPR,TPR,Thresholds,AUC] = average(rocObj,"macro");
Plot the average ROC curve and display the average AUC value. Include (0,0) so that the curve starts from the origin.
plot([0;FPR],[0;TPR])
xlabel("False Positive Rate")
ylabel("True Positive Rate")
title("Average ROC Curve")
hold on
plot([0,1],[0,1],"k--")
legend(join(["Macro-average (AUC =",AUC,")"]), ...
    Location="southeast")
axis padded
hold off
Alternatively, you can create the average ROC curve by using the
plot function. Specify
AverageROCType="macro" to compute the metrics for the average ROC curve using the macro-averaging method.
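For example, a minimal sketch (assuming rocObj was created as above):

```matlab
% Plot the ROC curves directly from the rocmetrics object,
% including the macro-average ROC curve.
plot(rocObj,AverageROCType="macro")
```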
type — Averaging method
Averaging method, specified as "micro", "macro", or "weighted".

"micro" (micro-averaging) — average finds the average performance metrics by treating all one-versus-all binary classification problems as one binary classification problem. The function computes the confusion matrix components for the combined binary classification problem, and then computes the average FPR and TPR using the values of the confusion matrix.

"macro" (macro-averaging) — average computes the average values for FPR and TPR by averaging the values of all one-versus-all binary classification problems.

"weighted" (weighted macro-averaging) — average computes the weighted average values for FPR and TPR using the macro-averaging method and using the prior class probabilities (the Prior property of rocObj) as weights.
The averaging method determines the length of the vectors for the output arguments (FPR, TPR, and Thresholds). For more details, see Average of Performance Metrics.
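As a sketch (assuming rocObj was created as in the example above), you can compare the averaging methods side by side; note that the output vector lengths can differ between methods:

```matlab
% Compute the average metrics with each averaging method.
[microFPR,microTPR,microT,microAUC] = average(rocObj,"micro");
[macroFPR,macroTPR,macroT,macroAUC] = average(rocObj,"macro");
[wFPR,wTPR,wT,wAUC] = average(rocObj,"weighted");

% The threshold vectors (and therefore FPR and TPR) can have
% different lengths depending on the method.
numel(microT)
numel(macroT)
```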
FPR — Average false positive rates
Average false positive rates, returned as a numeric vector.
TPR — Average true positive rates
Average true positive rates, returned as a numeric vector.
AUC — Area under average ROC curve
Area under the average ROC curve composed of FPR and TPR, returned as a numeric scalar.
Receiver Operating Characteristic (ROC) Curve
Area Under ROC Curve (AUC)
One-Versus-All (OVA) Coding Design
Adjusted Scores for Multiclass Classification Problem
You can use the plot function to create the average ROC curve. The function returns a ROCCurve object containing the XData, YData, Thresholds, and AUC properties, which correspond to the output arguments FPR, TPR, Thresholds, and AUC of the average function, respectively. For an example, see Plot ROC Curve.
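As a sketch (assuming rocObj was created as in the example above), you can capture the returned object and read the averaged metrics from its properties; note that plot may return one ROCCurve object per curve drawn, and this sketch assumes you select the average curve's object:

```matlab
% Plot the curves and capture the returned ROCCurve object(s).
curves = plot(rocObj,AverageROCType="macro");

% Assumption for illustration: pick the object for the average curve,
% then read the averaged metrics from its properties.
avgCurve = curves(end);
avgFPR = avgCurve.XData;   % average false positive rates
avgTPR = avgCurve.YData;   % average true positive rates
avgAUC = avgCurve.AUC;     % area under the average ROC curve
```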
Introduced in R2022b