Mean-Average-Precision (mAP)¶
Module Interface¶
- class torchmetrics.detection.mean_ap.MeanAveragePrecision(box_format='xyxy', iou_type='bbox', iou_thresholds=None, rec_thresholds=None, max_detection_thresholds=None, class_metrics=False, **kwargs)[source]
Computes the Mean-Average-Precision (mAP) and Mean-Average-Recall (mAR) for object detection predictions. Optionally, the mAP and mAR values can be calculated per class.
Predicted boxes and targets have to be in Pascal VOC format (xmin-top left, ymin-top left, xmax-bottom right, ymax-bottom right). See the update() method for more information about the input format to this metric. For an example of how to use this metric, check the torchmetrics examples.
Note
This metric follows the mAP implementation of pycocotools, a standard implementation of the mAP metric for object detection.
Note
This metric requires torchvision version 0.8.0 or newer (with the corresponding torch version 1.7.0 or newer). It also requires pycocotools when iou_type is segm. Install with pip install torchvision or pip install torchmetrics[detection].
- Parameters
  - box_format (str) – Input format of given boxes. Supported formats are [`xyxy`, `xywh`, `cxcywh`].
  - iou_type (str) – Type of input (either masks or bounding boxes) used for computing IoU. Supported IoU types are [`bbox`, `segm`].
  - iou_thresholds (Optional[List[float]]) – IoU thresholds for evaluation. If set to None, corresponds to the stepped range [0.5, ..., 0.95] with step 0.05. Else, provide a list of floats.
  - rec_thresholds (Optional[List[float]]) – Recall thresholds for evaluation. If set to None, corresponds to the stepped range [0, ..., 1] with step 0.01. Else, provide a list of floats.
  - max_detection_thresholds (Optional[List[int]]) – Thresholds on max detections per image. If set to None, the thresholds [1, 10, 100] are used. Else, provide a list of ints.
  - class_metrics (bool) – Option to enable per-class metrics for mAP and mAR_100. Has a performance impact.
  - kwargs (Dict[str, Any]) – Additional keyword arguments; see Advanced metric settings for more info.
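The three supported box formats describe the same rectangle with different parameterizations. As a rough plain-Python illustration (the helper names here are our own, not part of torchmetrics):

```python
def xywh_to_xyxy(box):
    """Convert [xmin, ymin, width, height] to [xmin, ymin, xmax, ymax]."""
    x, y, w, h = box
    return [x, y, x + w, y + h]

def cxcywh_to_xyxy(box):
    """Convert [center_x, center_y, width, height] to [xmin, ymin, xmax, ymax]."""
    cx, cy, w, h = box
    return [cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2]

# The same box expressed in each format:
print(xywh_to_xyxy([258.0, 41.0, 348.0, 244.0]))    # -> [258.0, 41.0, 606.0, 285.0]
print(cxcywh_to_xyxy([432.0, 163.0, 348.0, 244.0])) # -> [258.0, 41.0, 606.0, 285.0]
```

For tensor inputs, torchvision.ops.box_convert performs the same conversions natively.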
Example
>>> import torch
>>> from torchmetrics.detection.mean_ap import MeanAveragePrecision
>>> preds = [
...     dict(
...         boxes=torch.tensor([[258.0, 41.0, 606.0, 285.0]]),
...         scores=torch.tensor([0.536]),
...         labels=torch.tensor([0]),
...     )
... ]
>>> target = [
...     dict(
...         boxes=torch.tensor([[214.0, 41.0, 562.0, 285.0]]),
...         labels=torch.tensor([0]),
...     )
... ]
>>> metric = MeanAveragePrecision()
>>> metric.update(preds, target)
>>> from pprint import pprint
>>> pprint(metric.compute())
{'map': tensor(0.6000),
 'map_50': tensor(1.),
 'map_75': tensor(1.),
 'map_large': tensor(0.6000),
 'map_medium': tensor(-1.),
 'map_per_class': tensor(-1.),
 'map_small': tensor(-1.),
 'mar_1': tensor(0.6000),
 'mar_10': tensor(0.6000),
 'mar_100': tensor(0.6000),
 'mar_100_per_class': tensor(-1.),
 'mar_large': tensor(0.6000),
 'mar_medium': tensor(-1.),
 'mar_small': tensor(-1.)}
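The map_50 and map_75 values of 1.0 in this output follow from the overlap of the two boxes: their IoU exceeds both thresholds. A quick plain-Python check, independent of torchmetrics:

```python
def iou_xyxy(a, b):
    """Intersection-over-union of two [xmin, ymin, xmax, ymax] boxes."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # intersection width
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # intersection height
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

pred = [258.0, 41.0, 606.0, 285.0]   # predicted box from the example
gt = [214.0, 41.0, 562.0, 285.0]     # ground-truth box from the example
print(round(iou_xyxy(pred, gt), 3))  # -> 0.776, above both the 0.5 and 0.75 thresholds
```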
- Raises
  - ModuleNotFoundError – If torchvision is not installed or the installed version is lower than 0.8.0
  - ModuleNotFoundError – If iou_type is equal to segm and pycocotools is not installed
  - ValueError – If class_metrics is not a boolean
Initializes internal Module state, shared by both nn.Module and ScriptModule.
- compute()[source]
Compute the Mean-Average-Precision (mAP) and Mean-Average-Recall (mAR) scores.
Note
The map score is calculated with @[ IoU=self.iou_thresholds | area=all | max_dets=max_detection_thresholds ].
Caution
If the initialization parameters are changed, dictionary keys for mAR can change as well. The default properties are also accessible via fields and will raise an AttributeError if not available.
- Return type
  dict
- Returns
  dict containing
  - map: torch.Tensor
  - map_small: torch.Tensor
  - map_medium: torch.Tensor
  - map_large: torch.Tensor
  - mar_1: torch.Tensor
  - mar_10: torch.Tensor
  - mar_100: torch.Tensor
  - mar_small: torch.Tensor
  - mar_medium: torch.Tensor
  - mar_large: torch.Tensor
  - map_50: torch.Tensor (-1 if 0.5 not in the list of iou thresholds)
  - map_75: torch.Tensor (-1 if 0.75 not in the list of iou thresholds)
  - map_per_class: torch.Tensor (-1 if class metrics are disabled)
  - mar_100_per_class: torch.Tensor (-1 if class metrics are disabled)
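The -1 entries act as sentinel values for metrics that were not computed (for example, no boxes of a given size, or class metrics disabled). A minimal sketch of filtering them out of a result; the results dict below is a hand-written stand-in using plain floats in place of the real tensor-valued output:

```python
# Hand-written stand-in for the dict returned by compute(),
# with plain floats instead of torch tensors for illustration.
results = {
    "map": 0.6, "map_50": 1.0, "map_75": 1.0,
    "map_small": -1.0, "map_medium": -1.0, "map_large": 0.6,
    "map_per_class": -1.0, "mar_100_per_class": -1.0,
}

# Keep only the metrics that were actually computed (-1 marks "not available").
available = {k: v for k, v in results.items() if v != -1.0}
print(sorted(available))  # -> ['map', 'map_50', 'map_75', 'map_large']
```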
- update(preds, target)[source]
Add detections and ground truth to the metric.
- Parameters
  - preds (List[Dict[str, Tensor]]) – A list consisting of dictionaries, each corresponding to a single image and containing the key-values:
    - boxes: torch.FloatTensor of shape [num_boxes, 4] containing num_boxes detection boxes of the format specified in the constructor. By default, this method expects [xmin, ymin, xmax, ymax] in absolute image coordinates.
    - scores: torch.FloatTensor of shape [num_boxes] containing detection scores for the boxes.
    - labels: torch.IntTensor of shape [num_boxes] containing 0-indexed detection classes for the boxes.
  - target (List[Dict[str, Tensor]]) – A list consisting of dictionaries, each corresponding to a single image and containing the key-values:
    - boxes: torch.FloatTensor of shape [num_boxes, 4] containing num_boxes ground truth boxes of the format specified in the constructor. By default, this method expects [xmin, ymin, xmax, ymax] in absolute image coordinates.
    - labels: torch.IntTensor of shape [num_boxes] containing 0-indexed ground truth classes for the boxes.
- Raises
  - ValueError – If preds is not of type List[Dict[str, Tensor]]
  - ValueError – If target is not of type List[Dict[str, Tensor]]
  - ValueError – If preds and target are not of the same length
  - ValueError – If any of preds.boxes, preds.scores and preds.labels are not of the same length
  - ValueError – If any of target.boxes and target.labels are not of the same length
  - ValueError – If any box is not type float and of length 4
  - ValueError – If any class is not type int and of length 1
  - ValueError – If any score is not type float and of length 1
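The length-consistency checks above can be sketched in a few lines of plain Python. This simplified validator is our own illustration (with plain lists standing in for tensors), not the library's internal code:

```python
def check_inputs(preds, target):
    """Simplified sketch of update()'s consistency checks (lists stand in for tensors)."""
    if len(preds) != len(target):
        raise ValueError("preds and target are not of the same length")
    for p in preds:
        if not (len(p["boxes"]) == len(p["scores"]) == len(p["labels"])):
            raise ValueError("preds boxes, scores and labels are not of the same length")
    for t in target:
        if len(t["boxes"]) != len(t["labels"]):
            raise ValueError("target boxes and labels are not of the same length")

preds = [{"boxes": [[258.0, 41.0, 606.0, 285.0]], "scores": [0.536], "labels": [0]}]
target = [{"boxes": [[214.0, 41.0, 562.0, 285.0]], "labels": [0]}]
check_inputs(preds, target)  # passes silently; mismatched lengths would raise ValueError
```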
- Return type
  None