Precision Recall Plot given the ground truth, predicted label, and predicted score
How can I get a precision-recall plot for this? I know about perfcurve (http://www.mathworks.com/help/stats/perfcurve.html) and http://www.mathworks.com/matlabcentral/fileexchange/21528-precision-recall-and-roc-curves, but both of those take only the true class labels and the predicted scores as inputs.
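For what it's worth, perfcurve does not need the predicted labels at all: it sweeps a threshold over the scores itself, and its axis criteria can be overridden to get precision vs. recall instead of the ROC defaults. A minimal sketch, assuming the positive class is labeled 1:

```matlab
true_labels      = [0 1 0 0 1 1];
predicted_scores = [10 9 8 7 6 5];

% Override the ROC defaults: recall ('reca') on X, precision ('prec') on Y.
[recall, precision] = perfcurve(true_labels, predicted_scores, 1, ...
                                'XCrit', 'reca', 'YCrit', 'prec');

plot(recall, precision, '-o');
xlabel('Recall');
ylabel('Precision');
```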
For example (I have edited my question; this is my actual example, in which every detection is predicted as the positive class):
true_labels = [ 0 1 0 0 1 1 ]
predicted_labels = [ 1 1 1 1 1 1 ]
predicted_scores = [ 10 9 8 7 6 5 ] (scores for corresponding label)
If I set the threshold at 6 (detections with a score of 6 or higher count as positive), I get 3 false positives and 2 true positives:
true_labels = [ 0 1 0 0 1 1 ]
predicted_labels = [ 1 1 1 1 1 0 ]
If I set the threshold at 8, I get 2 false positives and 1 true positive:
true_labels = [ 0 1 0 0 1 1 ]
predicted_labels = [ 1 1 1 0 0 0 ]
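The manual thresholding above is exactly what a precision-recall curve automates: sweep a threshold over every distinct score, count TP/FP/FN at each one, and plot the resulting (recall, precision) pairs. A sketch that reproduces the counts in the example (e.g. at threshold 6 it finds 2 TP and 3 FP, i.e. precision 2/5, recall 2/3):

```matlab
true_labels      = [0 1 0 0 1 1];
predicted_scores = [10 9 8 7 6 5];

% One operating point per distinct score, from strictest to loosest threshold.
thresholds = sort(unique(predicted_scores), 'descend');
precision  = zeros(size(thresholds));
recall     = zeros(size(thresholds));

for k = 1:numel(thresholds)
    pred = predicted_scores >= thresholds(k);  % detections at or above threshold
    tp = sum(pred  & true_labels == 1);
    fp = sum(pred  & true_labels == 0);
    fn = sum(~pred & true_labels == 1);
    precision(k) = tp / (tp + fp);
    recall(k)    = tp / (tp + fn);
end

plot(recall, precision, '-o');
xlabel('Recall');
ylabel('Precision');
```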
3 Comments
Walter Roberson
on 8 Jul 2016
What do the predicted "scores" have to do with the precision-recall plot? Are they some measure of certainty, and do they just happen to be arranged in descending order by chance?
Answers (0)