mAP (mean Average Precision) for Object Detection

Author: kimifjc | Published 2018-09-04 17:12

mAP is the standard metric for measuring the accuracy of object detectors. It is computed as the average of the maximum precisions obtained at different recall values.

Precision & recall

Precision measures how accurate your predictions are, i.e. the percentage of your positive predictions that are correct. If you have a precision score close to 1.0, then whatever the classifier predicts as a positive detection is very likely a correct prediction.
Recall measures how well you find all the positives. For example, we may find 80% of the possible positive cases among our top K predictions. If you have a recall score close to 1.0, then almost all objects in your dataset will be detected by the model.
Here are their mathematical definitions (TP = true positives, FP = false positives, FN = false negatives):

Precision = TP / (TP + FP)
Recall = TP / (TP + FN)
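
As a concrete sketch, here is how the running precision and recall are computed over a ranked list of detections. The TP/FP flags and ground-truth count below are assumed for illustration, but they reproduce the example curves used in the AP calculation later in this post:

import numpy as np

# 10 detections ranked by confidence; 1 = true positive, 0 = false positive.
# There are 5 ground-truth objects in total (hypothetical numbers).
tp = np.array([1, 1, 0, 0, 0, 1, 1, 0, 0, 1])
num_gt = 5

cum_tp = np.cumsum(tp)               # true positives among the top-k detections
k = np.arange(1, len(tp) + 1)        # number of predictions considered so far
precision = cum_tp / k               # TP / (TP + FP) over the top-k predictions
recall = cum_tp / num_gt             # TP / (TP + FN) over all ground truth
# precision ≈ [1.0, 1.0, 0.67, 0.5, 0.4, 0.5, 0.57, 0.5, 0.44, 0.5]
# recall    = [0.2, 0.4, 0.4, 0.4, 0.4, 0.6, 0.8, 0.8, 0.8, 1.0]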

IoU (Intersection over Union)

IoU measures how much two regions overlap: it is the area of their intersection divided by the area of their union. In object detection, it measures how well the predicted bounding box matches the ground-truth box; a detection is usually counted as correct only when its IoU with the ground truth exceeds a threshold (commonly 0.5).
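
A minimal IoU sketch for axis-aligned boxes in [x1, y1, x2, y2] format (the box format is an assumption; the post does not fix one):

def iou(box_a, box_b):
    # intersection rectangle (empty if the boxes do not overlap)
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    # union = area A + area B - intersection
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# e.g. iou([0, 0, 10, 10], [5, 5, 15, 15]) = 25 / 175 ≈ 0.143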


AP (Average Precision)

AP calculation with precision-recall curve

[Figure: precision-recall curve for an example detector.]
The AP score is calculated by taking the average value of the precision across all recall values (see the explanation in section 4.2 of the PASCAL VOC challenge paper).
However, the actual calculation is a little different. From section 4.2 of the paper:

We set the precision for recall r to the maximum precision obtained for any recall r' ≥ r. Then we compute the AP as the area under this curve (shown in light blue) by numerical integration. No approximation is involved since the curve is piecewise constant.

Code for AP calculation

import numpy as np

def compute_ap(recall, precision):
    """ Compute the average precision, given the recall and precision curves.
    Code originally from https://github.com/rbgirshick/py-faster-rcnn.

    # Arguments
        recall:    The recall curve (list).
        precision: The precision curve (list).
    # Returns
        The average precision as computed in py-faster-rcnn.
    """
    # correct AP calculation
    # first append sentinel values at the end
    mrec = np.concatenate(([0.], recall, [1.]))
    mpre = np.concatenate(([0.], precision, [0.]))

    # compute the precision envelope
    for i in range(mpre.size - 1, 0, -1):
        mpre[i - 1] = np.maximum(mpre[i - 1], mpre[i])

    # to calculate area under PR curve, look for points
    # where X axis (recall) changes value
    i = np.where(mrec[1:] != mrec[:-1])[0]

    # and sum (\Delta recall) * prec
    ap = np.sum((mrec[i + 1] - mrec[i]) * mpre[i + 1])
    return ap

recall = [0.2,0.4,0.4,0.4,0.4,0.6,0.8,0.8,0.8,1.0]
precision = [1.0,1.0,0.67,0.5,0.4,0.5,0.57,0.5,0.44,0.5]
AP = compute_ap(recall, precision)
print('AP = {}'.format(AP))
# AP = 0.728
# If we instead calculate AP using the 11 equally spaced recall values
# 0.0, 0.1, ..., 1.0 (the older PASCAL VOC method),
# AP = (5 × 1.0 + 4 × 0.57 + 2 × 0.5)/11 = 0.753.
# See this blog post: mAP for Object Detection
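
A minimal sketch of that 11-point interpolation (the function name compute_ap_11point is ours, not from py-faster-rcnn):

def compute_ap_11point(recall, precision):
    """11-point interpolated AP, as used in PASCAL VOC before 2010."""
    recall = np.asarray(recall)
    precision = np.asarray(precision)
    ap = 0.0
    for t in np.linspace(0.0, 1.0, 11):
        # interpolated precision at recall level t: the maximum
        # precision obtained at any recall >= t (0 if none exists)
        mask = recall >= t
        p = precision[mask].max() if mask.any() else 0.0
        ap += p / 11.0
    return ap

print('11-point AP = {:.3f}'.format(compute_ap_11point(recall, precision)))
# 11-point AP = 0.753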

# Intermediate results: the same compute_ap as above, annotated with
# the values each step produces for the example input
def compute_ap(recall, precision):
    # correct AP calculation
    # first append sentinel values at the end
    mrec = np.concatenate(([0.], recall, [1.]))
    mpre = np.concatenate(([0.], precision, [0.]))
    # mrec = [0.  0.2 0.4 0.4 0.4 0.4 0.6 0.8 0.8 0.8 1.  1. ]
    # mpre = [0.   1.   1.   0.67 0.5  0.4  0.5  0.57 0.5  0.44 0.5  0.  ]
    
    # compute the precision envelope
    for i in range(mpre.size - 1, 0, -1):
        mpre[i - 1] = np.maximum(mpre[i - 1], mpre[i])
    # mpre = [1.   1.   1.   0.67 0.57 0.57 0.57 0.57 0.5  0.5  0.5  0.  ]

    # to calculate area under PR curve, look for points
    # where X axis (recall) changes value
    i = np.where(mrec[1:] != mrec[:-1])[0]
    # i = [0 1 5 6 9]
    # mrec[1:] = [0.2 0.4 0.4 0.4 0.4 0.6 0.8 0.8 0.8 1.  1. ]
    # mrec[:-1] = [0.  0.2 0.4 0.4 0.4 0.4 0.6 0.8 0.8 0.8 1. ]

    # and sum (\Delta recall) * prec
    ap = np.sum((mrec[i + 1] - mrec[i]) * mpre[i + 1])
    return ap     
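
Finally, mAP is just the mean of the per-class AP values. A minimal sketch, assuming ap_per_class is a dict mapping each class name to its AP computed as above (the helper and the example numbers are ours):

def compute_map(ap_per_class):
    # mAP: the mean of the AP values over all object classes
    return float(np.mean(list(ap_per_class.values())))

# e.g. compute_map({'car': 0.728, 'person': 0.654}) ≈ 0.691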

References

mean Average Precision (GitHub)
mAP for Object Detection
Evaluation 12: mean average precision
Understanding the mAP Evaluation Metric for Object Detection
Discussion on fast.ai
