[Fix] fix eval_recalls error in voc_metric (#10770)
gotjd709 authored Aug 14, 2023
1 parent a066ef4 commit 635c4bf
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions mmdet/evaluation/metrics/voc_metric.py
@@ -157,11 +157,11 @@ def compute_metrics(self, results: list) -> dict:
             eval_results['mAP'] = sum(mean_aps) / len(mean_aps)
             eval_results.move_to_end('mAP', last=False)
         elif self.metric == 'recall':
-            # TODO: Currently not checked.
-            gt_bboxes = [ann['bboxes'] for ann in self.annotations]
+            gt_bboxes = [gt['bboxes'] for gt in gts]
+            pr_bboxes = [pred[0] for pred in preds]
             recalls = eval_recalls(
                 gt_bboxes,
-                results,
+                pr_bboxes,
                 self.proposal_nums,
                 self.iou_thrs,
                 logger=logger,
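The fix stops passing the raw `results` list to `eval_recalls` and instead extracts per-image proposal arrays. The sketch below uses hypothetical data and assumes, based on the identifiers in the diff, that `compute_metrics` unpacks `results` as `gts, preds = zip(*results)` earlier in the method; it only illustrates the shapes involved, not the actual mmdetection implementation.

```python
import numpy as np

# Hypothetical per-image results: each entry is a (gt, pred) pair, where
# gt is an annotation dict and pred is a per-class tuple of detection arrays.
results = [
    ({'bboxes': np.array([[10., 10., 50., 50.]])},          # one GT box (x1, y1, x2, y2)
     (np.array([[12., 11., 49., 52., 0.9]]),)),             # class-0 detections (x1, y1, x2, y2, score)
]

# Assumed unpacking done earlier in compute_metrics.
gts, preds = zip(*results)

# After the fix: eval_recalls receives one ndarray per image, not the raw
# list of (gt, pred) tuples it cannot interpret.
gt_bboxes = [gt['bboxes'] for gt in gts]    # list of (n, 4) arrays
pr_bboxes = [pred[0] for pred in preds]     # list of (k, 5) arrays
```

Before the fix, `results` itself (a list of tuples) was handed to `eval_recalls`, which mixes ground truth into the proposal argument and breaks the expected list-of-ndarrays shape.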