Computer Vision MT25, Precision and recall


Flashcards

Explain what is being plotted on a graph like this.


The precision versus recall of a classifier, with one point per value of the decision threshold as it is swept over its range.
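As a sketch of how those points are generated, the snippet below sweeps a decision threshold over a classifier's scores and computes precision and recall at each value; the scores, labels, and thresholds are made-up example data, not from any particular model.

```python
def pr_curve(scores, labels, thresholds):
    """Return (recall, precision) points, one per threshold."""
    points = []
    for t in thresholds:
        preds = [s >= t for s in scores]
        tp = sum(p and y for p, y in zip(preds, labels))
        fp = sum(p and not y for p, y in zip(preds, labels))
        fn = sum((not p) and y for p, y in zip(preds, labels))
        # Convention: precision is 1.0 when nothing is predicted positive.
        precision = tp / (tp + fp) if tp + fp else 1.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        points.append((recall, precision))
    return points

# Made-up scores and ground-truth labels for five detections.
scores = [0.9, 0.8, 0.7, 0.4, 0.3]
labels = [True, True, False, True, False]
print(pr_curve(scores, labels, [0.2, 0.5, 0.85]))
```

Lowering the threshold moves right along the curve (higher recall, typically lower precision); raising it moves left.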

What is the AP metric?


The area under the precision-recall curve for a given classifier.

One way to evaluate an object detection model is to compute the average precision by calculating the area under the precision-recall curve. Typically, this depends on a choice of IoU threshold, which determines whether a predicted box counts as a true positive or a false positive.

Given this, can you define the $\text{AP}$ metric in full generality and explain how it is an average, average, average precision?


\[\text{AP} = \frac{1}{\# \text{classes}} \sum _ {\text{class} \in \text{classes}} \text{AP} (\text{class})\]

where

\[\text{AP}(\text{class}) = \frac{1}{\#\text{IoU thresholds}} \sum _ {\text{iou} \in \text{IoU thresholds}} \text{AP}(\text{class}, \text{iou})\]

and $\text{AP}(\text{class}, \text{iou})$ is the area under the corresponding precision-recall curve.

It is an:

  • average over the classes, of an
  • average over the IoU thresholds, of an
  • average precision: the area under the curve averages the precision over recall levels
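The nested averaging above can be sketched as follows; this is a minimal illustration that assumes a hypothetical helper `ap_for(class_name, iou)` returning the area under the precision-recall curve for one class at one IoU threshold (stubbed here with made-up numbers).

```python
def mean_ap(classes, iou_thresholds, ap_for):
    total = 0.0
    for c in classes:
        # AP(class): average over the IoU thresholds.
        ap_class = sum(ap_for(c, iou) for iou in iou_thresholds) / len(iou_thresholds)
        total += ap_class
    # AP: average over the classes.
    return total / len(classes)

# Made-up per-(class, IoU) AP values for illustration.
table = {("cat", 0.5): 0.8, ("cat", 0.75): 0.6,
         ("dog", 0.5): 0.7, ("dog", 0.75): 0.5}
print(mean_ap(["cat", "dog"], [0.5, 0.75], lambda c, t: table[(c, t)]))
# cat: (0.8 + 0.6)/2 = 0.7; dog: (0.7 + 0.5)/2 = 0.6; mean ≈ 0.65
```

The third average, over recall levels, is hidden inside `ap_for`: the area under the precision-recall curve is itself an average of precision across recall.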



