AveragePrecision

class mmeval.metrics.AveragePrecision(average: Optional[str] = 'macro', **kwargs)

Calculate the average precision with respect to classes.

Parameters
  • average (str, optional) – The average method. It supports two modes:

    • "macro": Calculate metrics for each category, and calculate the mean value over all categories.

    • None: Return the scores of all categories.

    Defaults to "macro". (See the sketch after this parameter list.)
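
The sketch below is a minimal NumPy illustration of the textbook class-wise average precision (rank samples by score, then average the precision at the rank of each true positive). It is not mmeval's actual implementation, which also handles PyTorch and OneFlow inputs, but it reproduces the numbers in the examples further down: average="macro" reports the mean of these class-wise scores as "mAP", while average=None returns the class-wise scores themselves.

>>> import numpy as np
>>> def classwise_average_precision(scores, onehot):
...     # Textbook AP per class: mean precision at the rank of each positive.
...     # scores: (N, C) prediction scores; onehot: (N, C) binary ground truth.
...     aps = []
...     for c in range(scores.shape[1]):
...         order = np.argsort(-scores[:, c])    # sort samples by descending score
...         hits = onehot[order, c]              # 1 where a ranked sample is a positive
...         if hits.sum() == 0:                  # class never appears: define its AP as 0
...             aps.append(0.0)
...             continue
...         precision = np.cumsum(hits) / np.arange(1, len(hits) + 1)
...         aps.append(float((precision * hits).sum() / hits.sum()))
...     return np.array(aps) * 100               # reported as percentages
>>> scores = np.array([[0.9, 0.8, 0.3, 0.2],
...                    [0.1, 0.2, 0.2, 0.1],
...                    [0.7, 0.5, 0.9, 0.3],
...                    [0.8, 0.1, 0.1, 0.2]])
>>> onehot = np.array([[1, 1, 0, 0],
...                    [0, 1, 0, 0],
...                    [0, 0, 1, 0],
...                    [1, 0, 0, 0]])
>>> classwise_average_precision(scores, onehot).round(2).tolist()  # cf. average=None
[100.0, 83.33, 100.0, 0.0]
>>> round(float(classwise_average_precision(scores, onehot).mean()), 3)  # cf. 'mAP'
70.833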

References

1. Wikipedia entry for Average precision

Examples

>>> from mmeval import AveragePrecision
>>> average_precision = AveragePrecision()

Use the builtin implementation with label-format labels:

>>> preds = [[0.9, 0.8, 0.3, 0.2],
             [0.1, 0.2, 0.2, 0.1],
             [0.7, 0.5, 0.9, 0.3],
             [0.8, 0.1, 0.1, 0.2]]
>>> labels = [[0, 1], [1], [2], [0]]
>>> average_precision(preds, labels)
{'mAP': 70.833..}

Use the builtin implementation with one-hot encoded labels:

>>> preds = [[0.9, 0.8, 0.3, 0.2],
             [0.1, 0.2, 0.2, 0.1],
             [0.7, 0.5, 0.9, 0.3],
             [0.8, 0.1, 0.1, 0.2]]
>>> labels = [[1, 1, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 0]]
>>> average_precision(preds, labels)
{'mAP': 70.833..}

Use the NumPy implementation with label-format labels:

>>> import numpy as np
>>> preds = np.array([[0.9, 0.8, 0.3, 0.2],
                      [0.1, 0.2, 0.2, 0.1],
                      [0.7, 0.5, 0.9, 0.3],
                      [0.8, 0.1, 0.1, 0.2]])
>>> labels = [np.array([0, 1]), np.array([1]), np.array([2]), np.array([0])]
>>> average_precision(preds, labels)
{'mAP': 70.833..}

Use the PyTorch implementation with one-hot encoded labels:

>>> import torch
>>> preds = torch.Tensor([[0.9, 0.8, 0.3, 0.2],
                          [0.1, 0.2, 0.2, 0.1],
                          [0.7, 0.5, 0.9, 0.3],
                          [0.8, 0.1, 0.1, 0.2]])
>>> labels = torch.Tensor([[1, 1, 0, 0],
                           [0, 1, 0, 0],
                           [0, 0, 1, 0],
                           [1, 0, 0, 0]])
>>> average_precision(preds, labels)
{'mAP': 70.833..}

Computing with average=None:

>>> preds = np.array([[0.9, 0.8, 0.3, 0.2],
                      [0.1, 0.2, 0.2, 0.1],
                      [0.7, 0.5, 0.9, 0.3],
                      [0.8, 0.1, 0.1, 0.2]])
>>> labels = [np.array([0, 1]), np.array([1]), np.array([2]), np.array([0])]
>>> average_precision = AveragePrecision(average=None)
>>> average_precision(preds, labels)
{'AP_classwise': [100.0, 83.33, 100.00, 0.0]}  # rounded results

Accumulate batch:

>>> for i in range(10):
...     preds = torch.randint(0, 4, size=(100, 10))
...     labels = torch.randint(0, 4, size=(100, ))
...     average_precision.add(preds, labels)
>>> average_precision.compute()  

add(preds: Sequence, labels: Sequence) → None

Add the intermediate results to self._results.

Parameters
  • preds (Sequence) – Predictions from the model. It should be scores of every class, with shape (N, C).

  • labels (Sequence) – The ground truth labels. It should have shape (N, ) for label-format labels, or (N, C) for one-hot encoded labels.
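
For reference, a minimal accumulation sketch with NumPy inputs (hypothetical random data, so the result itself is not meaningful); it assumes NumPy label-format labels follow the same (N, ) convention as the torch example above:

>>> import numpy as np
>>> from mmeval import AveragePrecision
>>> metric = AveragePrecision()
>>> for _ in range(5):
...     batch_preds = np.random.rand(16, 4)              # (N, C) class scores
...     batch_labels = np.random.randint(0, 4, (16, ))   # (N, ) label-format labels
...     metric.add(batch_preds, batch_labels)
>>> metric.compute()  # result depends on the random data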

compute_metric(results: List[Union[Tuple[Union[numpy.ndarray, numpy.number], Union[numpy.ndarray, numpy.number]], Tuple[torch.Tensor, torch.Tensor], Tuple[oneflow.Tensor, oneflow.Tensor], Tuple[Union[int, Sequence[Union[int, float]]], Union[int, Sequence[int]]]]]) → Dict[str, float]

Compute the metric.

Currently, this method has three implementations: NumPy, PyTorch, and OneFlow. Which implementation is used is determined by the type of the inputs, e.g. numpy.ndarray, torch.Tensor, or oneflow.Tensor.

This method is invoked by BaseMetric.compute after distributed synchronization.

Parameters
  • results (List[Union[NUMPY_IMPL_HINTS, TORCH_IMPL_HINTS, ONEFLOW_IMPL_HINTS]]) – A list of tuples, each consisting of a prediction and a label. This list has already been synced across all ranks.

Returns

The computed metric.

Return type

Dict[str, float]
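
In typical use this method is not called directly: batches are fed through add, and BaseMetric.compute gathers the results and dispatches to the matching implementation. A minimal sketch reusing the NumPy example data from above (the expected value matches the direct-call example):

>>> import numpy as np
>>> from mmeval import AveragePrecision
>>> metric = AveragePrecision()
>>> preds = np.array([[0.9, 0.8, 0.3, 0.2],
...                   [0.1, 0.2, 0.2, 0.1],
...                   [0.7, 0.5, 0.9, 0.3],
...                   [0.8, 0.1, 0.1, 0.2]])
>>> labels = [np.array([0, 1]), np.array([1]), np.array([2]), np.array([0])]
>>> metric.add(preds, labels)  # NumPy inputs, so the NumPy implementation is used
>>> metric.compute()           # invokes compute_metric on the gathered results
{'mAP': 70.833..}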
