Precision Recall Plot given the ground truth, predicted label, and predicted score

How can I get the precision-recall plot for this? I know about perfcurve (http://www.mathworks.com/help/stats/perfcurve.html) and the File Exchange submission at http://www.mathworks.com/matlabcentral/fileexchange/21528-precision-recall-and-roc-curves, but the issue is that their inputs are the true class labels and the predicted scores.
For example (I have edited my question; this is my actual example, with every detection predicted as the positive class):
true_labels = [ 0 1 0 0 1 1 ]
predicted_labels = [ 1 1 1 1 1 1 ]
predicted_scores = [ 10 9 8 7 6 5 ] (scores for corresponding label)
If I set the threshold at 6, I get 3 false positives and 2 true positives:
true_labels = [ 0 1 0 0 1 1 ]
predicted_labels = [ 1 1 1 1 1 0 ]
If I set the threshold at 8, I get 2 false positives and 1 true positive:
true_labels = [ 0 1 0 0 1 1 ]
predicted_labels = [ 1 1 1 0 0 0 ]
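The sweep described above is exactly what perfcurve performs internally: given the true labels and the scores, every distinct score acts as its own threshold, so the fixed predicted_labels vector is not needed at all. A minimal sketch of that logic (written in Python for illustration; the same loop translates directly to MATLAB) using the example numbers from this question:

```python
# Build precision-recall points from true labels and scores alone.
# Each distinct score is used as a decision threshold (predict positive
# when score >= threshold), reproducing the manual counts above.
true_labels      = [0, 1, 0, 0, 1, 1]
predicted_scores = [10, 9, 8, 7, 6, 5]

def pr_points(labels, scores):
    points = []
    for t in sorted(set(scores), reverse=True):
        preds = [1 if s >= t else 0 for s in scores]
        tp = sum(p == 1 and y == 1 for p, y in zip(preds, labels))
        fp = sum(p == 1 and y == 0 for p, y in zip(preds, labels))
        fn = sum(p == 0 and y == 1 for p, y in zip(preds, labels))
        precision = tp / (tp + fp) if tp + fp else 1.0
        recall = tp / (tp + fn)
        points.append((t, precision, recall))
    return points

for t, p, r in pr_points(true_labels, predicted_scores):
    print(f"threshold >= {t}: precision={p:.2f}, recall={r:.2f}")
```

At threshold 6 this gives 2 TP, 3 FP, 1 FN (precision 0.40, recall 0.67), and at threshold 8 it gives 1 TP, 2 FP, 2 FN (precision 0.33, recall 0.33), matching the counts in the question. Plotting recall against precision over all thresholds yields the precision-recall curve.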

3 Comments

What do the predicted "scores" have to do with precision recall plot? Are they some measure of certainty, and just by chance they happen to be arranged in descending order?
Yes, they are a measure of certainty, and they happen to be arranged in descending order.
And aren't the precision and recall plots based on the scores? A higher threshold would lead to fewer false positives but at the same time fewer true positives. So the precision-recall plot indirectly shows the performance of the detector at varied thresholds.


Answers (0)

Asked on 8 Jul 2016 · Edited on 8 Jul 2016
