
evaluateDetectionPrecision

Evaluate precision metric for object detection

Syntax

averagePrecision = evaluateDetectionPrecision(detectionResults,trainingData)
[averagePrecision,recall,precision] = evaluateDetectionPrecision(___)
[___] = evaluateDetectionPrecision(___,threshold)

Description

averagePrecision = evaluateDetectionPrecision(detectionResults,trainingData) returns the average precision of the detection results in detectionResults, compared to the ground truth trainingData. You can use the average precision to measure the performance of an object detector. For a multiclass detector, the function returns averagePrecision as a vector of scores for each object class in the order specified by trainingData.


[averagePrecision,recall,precision] = evaluateDetectionPrecision(___) returns data points for plotting the precision–recall curve, using input arguments from the previous syntax.

[___] = evaluateDetectionPrecision(___,threshold) specifies the overlap threshold for assigning a detection to a ground truth box.

Examples


Train an ACF-based detector using preloaded ground truth information. Run the detector on the training images. Evaluate the detector and display the precision-recall curve.

Load the ground truth table.

load('stopSignsAndCars.mat')
stopSigns = stopSignsAndCars(:,1:2);
stopSigns.imageFilename = fullfile(toolboxdir('vision'),'visiondata', ...
    stopSigns.imageFilename);

Train an ACF-based detector.

detector = trainACFObjectDetector(stopSigns,'NumStages',3);
ACF Object Detector Training
The training will take 3 stages. The model size is 34x31.
Sample positive examples(~100% Completed)
Compute approximation coefficients...Completed.
Compute aggregated channel features...Completed.
--------------------------------------------
Stage 1:
Sample negative examples(~100% Completed)
Compute aggregated channel features...Completed.
Train classifier with 42 positive examples and 210 negative examples...Completed.
The trained classifier has 19 weak learners.
--------------------------------------------
Stage 2:
Sample negative examples(~100% Completed)
Found 210 new negative examples for training.
Compute aggregated channel features...Completed.
Train classifier with 42 positive examples and 210 negative examples...Completed.
The trained classifier has 38 weak learners.
--------------------------------------------
Stage 3:
Sample negative examples(~100% Completed)
Found 210 new negative examples for training.
Compute aggregated channel features...Completed.
Train classifier with 42 positive examples and 210 negative examples...Completed.
The trained classifier has 71 weak learners.
--------------------------------------------
ACF object detector training is completed. Elapsed time is 45.159 seconds.

Create a table to store the results.

numImages = height(stopSigns);
results(numImages) = struct('Boxes',[],'Scores',[]);

Run the detector on the training images. Store the results as a table.

for i = 1:numImages
    I = imread(stopSigns.imageFilename{i});
    [bboxes,scores] = detect(detector,I);
    results(i).Boxes = bboxes;
    results(i).Scores = scores;
end

results = struct2table(results);

Evaluate the results against the ground truth data. Get the precision statistics.

[ap,recall,precision] = evaluateDetectionPrecision(results,stopSigns(:,2));

Plot the precision-recall curve.

figure
plot(recall,precision)
grid on
title(sprintf('Average Precision = %.1f',ap))

Input Arguments


detectionResults

Object locations and scores, specified as a two-column table containing the bounding boxes and scores for each detected object. For multiclass detection, a third column contains the predicted label for each detection.

When detecting objects, you can create the detection results table by using struct2table to combine the bboxes and scores outputs:

for i = 1:numImages
    I = imread(imageFilename{i});
    [bboxes,scores] = detect(detector,I);
    results(i).Boxes = bboxes;
    results(i).Scores = scores;
end
results = struct2table(results);

Data Types: table

trainingData

Training data, specified as a table with one or more columns. The table contains one column for single-class data and multiple columns for multiclass data. Each column contains M-by-4 matrices of [x,y,width,height] bounding boxes that specify object locations. The format specifies the upper-left corner location and the size of the object. The column name specifies the class label.
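For reference, a single-class training table might be constructed as in the sketch below. The file names and box coordinates are hypothetical; the name of the box column ('stopSign' here) doubles as the class label.

```matlab
% Hypothetical ground truth for one class of object
imageFilename = {'image1.jpg'; 'image2.jpg'};
stopSign = {[100 50 30 30]; ...              % one box in image 1
            [80 40 25 25; 10 10 20 20]};     % two boxes in image 2
trainingData = table(imageFilename,stopSign);
```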

threshold

Overlap threshold for assigning a detection to a ground truth box, specified as a numeric scalar. The overlap ratio is computed as the intersection of the two bounding boxes over their union.
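As a sketch, the overlap ratio between a detection and a ground truth box can be computed with bboxOverlapRatio; the boxes here are hypothetical:

```matlab
gtBox  = [10 10 50 50];    % ground truth, [x y width height]
detBox = [20 20 50 50];    % detection

% Intersection over union; the detection is assigned to the
% ground truth box when this value is at least the threshold.
iou = bboxOverlapRatio(detBox,gtBox);
```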

Output Arguments


averagePrecision

Average precision over all the detection results, returned as a numeric scalar or vector. Precision is the ratio of true positive instances to all positive instances reported by the detector (true positives plus false positives), based on the ground truth. For a multiclass detector, the average precision is a vector of average precision scores for each object class.

recall

Recall values from each detection, returned as a vector of numeric scalars or as a cell array. Recall is the ratio of true positive instances to the sum of true positives and false negatives (all positive instances in the ground truth). For a multiclass detector, recall and precision are cell arrays, where each cell contains the data points for each object class.

precision

Precision values from each detection, returned as a vector of numeric scalars or as a cell array. Precision is the ratio of true positive instances to all positive instances reported by the detector (true positives plus false positives), based on the ground truth. For a multiclass detector, recall and precision are cell arrays, where each cell contains the data points for each object class.
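In terms of true positives (TP), false positives (FP), and false negatives (FN) counted against the ground truth, the two metrics reduce to the following; the counts below are hypothetical and the notation is ours, not from this page:

```matlab
% Hypothetical counts from matching detections to ground truth boxes
TP = 8;   % detections matched to a ground truth box
FP = 2;   % detections with no matching ground truth box
FN = 4;   % ground truth boxes with no matching detection

precision = TP / (TP + FP);   % fraction of detections that are correct
recall    = TP / (TP + FN);   % fraction of ground truth objects found
```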

Introduced in R2017a