Explanation of indicators

Aryalfrat edited this page Nov 16, 2022 · 4 revisions

mAP (mean Average Precision): In object detection, mAP is a key metric for measuring the performance of detection algorithms. In general, mAP is the weighted average of the average precision (AP) over all detected classes, and there are several ways to compute it. YMIR provides a single-point calculation, i.e. the mAP at a specified IoU threshold (a prediction is counted as a true positive if its IoU with the ground truth exceeds this threshold). YMIR also provides an interpolated calculation, a common evaluation criterion for object detection (the standard measure on the COCO dataset), which averages the mAP over multiple IoU thresholds: ten thresholds from 0.5 to 0.95 in increments of 0.05. This is a more comprehensive evaluation criterion.
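The two building blocks above can be sketched in a few lines: an IoU function for matching predictions to ground truth, and an average over the COCO-style threshold range. This is a minimal illustration, not YMIR's actual implementation; `ap_at_threshold` stands in for whatever per-threshold AP computation is used.

```python
def iou(box_a, box_b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# COCO-style IoU thresholds: 0.50, 0.55, ..., 0.95 (10 values).
THRESHOLDS = [0.5 + 0.05 * i for i in range(10)]

def averaged_map(ap_at_threshold):
    """Average a per-threshold AP callable over the 0.50:0.95 range."""
    return sum(ap_at_threshold(t) for t in THRESHOLDS) / len(THRESHOLDS)
```

For example, a prediction with IoU 0.6 against its ground-truth box counts as a true positive at threshold 0.5 but as a false positive at threshold 0.75; averaging over the ten thresholds rewards both coarse and tight localization.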

PR curve: In a binary classification problem such as logistic regression, the model first outputs the probability that the current sample belongs to the positive class, and then decides whether it is positive according to a specified threshold, usually 0.5 by default. This threshold can be adjusted to obtain better predictions. Computing the precision and recall at each possible threshold and plotting them yields the Precision-Recall Curve (PR curve). The PR curve lets us clearly observe how precision and recall vary and choose a reasonable threshold. In general, as recall increases, precision tends to decrease, so the PR curve demonstrates the trade-off between precision and recall across thresholds. Ideally, precision stays the same or even increases as recall grows.
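The threshold sweep described above can be sketched as follows: sort predictions by descending confidence, and treat each prefix as the set of predictions accepted at one threshold, recording a (recall, precision) point for each. This is a minimal sketch for binary labels, not YMIR's evaluation code.

```python
def pr_curve(scores, labels):
    """Return (recall, precision) points for every decision threshold.

    scores: model confidences; labels: 1 for positive, 0 for negative.
    """
    points = []
    total_pos = sum(labels)
    # Sort indices by descending score; each prefix = one threshold setting.
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    tp = fp = 0
    for i in order:
        if labels[i]:
            tp += 1
        else:
            fp += 1
        points.append((tp / total_pos, tp / (tp + fp)))
    return points
```

For instance, `pr_curve([0.9, 0.8, 0.7, 0.6], [1, 1, 0, 1])` shows precision dropping from 1.0 to 2/3 when the threshold is lowered past the false positive at score 0.7, then recall reaching 1.0 at precision 0.75.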

P@R: The precision corresponding to a specified recall on the PR curve, i.e. a specific point on the curve. By default, values are reported at recall levels incremented in 5% steps over the configured range.

R@P: The recall corresponding to a specified precision on the PR curve, i.e. a specific point on the curve. By default, values are reported at precision levels incremented in 5% steps over the configured range.
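Both lookups can be sketched as simple queries over a list of (recall, precision) points such as a PR-curve sweep would produce. This is an illustrative reading of the definitions above (taking the best achievable value at or beyond the target), not YMIR's exact interpolation.

```python
def precision_at_recall(points, target_recall):
    """P@R: best precision among points whose recall meets the target."""
    candidates = [p for r, p in points if r >= target_recall]
    return max(candidates) if candidates else 0.0

def recall_at_precision(points, target_precision):
    """R@P: best recall among points whose precision meets the target."""
    candidates = [r for r, p in points if p >= target_precision]
    return max(candidates) if candidates else 0.0
```

Reporting these at 5% increments of the target (0.50, 0.55, ...) then amounts to calling the lookup once per step over the configured range.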
