
Improve precision recall metric issue 2571 #2573

Merged

Conversation

sadra-barikbin
Collaborator

Fixes #2571

Check list:

  • New tests are added (if a new feature is added)
  • New doc strings: description and/or example code are in RST format
  • Documentation is updated (if required)

@github-actions github-actions bot added the module: metrics Metrics module label May 14, 2022
@sadra-barikbin sadra-barikbin force-pushed the improve-precision-recall-metric-issue-2571 branch from 1eb3773 to aa3fc9b Compare May 14, 2022 19:29
@sadra-barikbin sadra-barikbin force-pushed the improve-precision-recall-metric-issue-2571 branch from 45447eb to db91bf2 Compare May 14, 2022 21:44
Collaborator

@vfdev-5 vfdev-5 left a comment


@sadra-barikbin thanks a lot for the PR!
I left a few comments to clarify the API. I haven't checked the details yet, but the feature looks cool!
Please also check the docs build failure.

Review comments on ignite/metrics/precision.py and ignite/metrics/recall.py (outdated; resolved)
@sadra-barikbin sadra-barikbin force-pushed the improve-precision-recall-metric-issue-2571 branch from 5edfca5 to bebacd4 Compare May 16, 2022 19:25

_test(average=True)
_test(average=False)
assert pr._updated is False


def _test_distrib_integration_multiclass(device):
Collaborator Author


Is there any reason that this test has not been parametrized?

Collaborator


This test and other similar ones are used in the test_distrib_hvd, test_distrib_gloo_cpu_or_gpu, and test_distrib_nccl_gpu tests. IMO, it is not obvious how to parametrize them properly. However, there is a task to rewrite the distributed tests so that we no longer need to write all of test_distrib_hvd, test_distrib_gloo_cpu_or_gpu, and test_distrib_nccl_gpu, but only _test_distrib_integration_multiclass, which could then be executed with hvd, gloo, and nccl configurations.

@sadra-barikbin
Copy link
Collaborator Author

Scikit-Learn has another data type called multilabel-multioutput, which is like multilabel except that its labels are not binary. It has yet to be implemented here. We can do it if you'd like.
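To make the distinction concrete, here is a minimal sketch in plain Python (no sklearn or ignite dependency); the shapes and values are made-up examples, and the helper function is illustrative only.

```python
# Multilabel: an (n_samples, n_labels) matrix where each entry is a
# binary on/off indicator for that label.
y_multilabel = [
    [1, 0, 1],  # sample 0: labels 0 and 2 are active
    [0, 1, 0],  # sample 1: only label 1 is active
]

# Multilabel-multioutput: the same (n_samples, n_outputs) layout, but
# each output holds a class index rather than a 0/1 indicator.
y_multioutput = [
    [2, 0, 1],  # sample 0: output 0 -> class 2, output 1 -> class 0, ...
    [1, 2, 2],
]


def has_binary_targets(y):
    """Return True if every entry of the target matrix is 0 or 1."""
    return all(v in (0, 1) for row in y for v in row)


print(has_binary_targets(y_multilabel))   # True
print(has_binary_targets(y_multioutput))  # False
```

The existing multilabel support in Precision/Recall assumes the first layout; supporting the second would require treating each output column as its own multiclass problem.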

undo calling item() on result when data is binary and average=True due to resulting Fbeta failure
Review comments on ignite/metrics/precision.py, ignite/metrics/recall.py, and ignite/metrics/fbeta.py (outdated; resolved)
@sadra-barikbin sadra-barikbin force-pushed the improve-precision-recall-metric-issue-2571 branch from 1edab48 to d450903 Compare June 6, 2022 21:13
@sadra-barikbin sadra-barikbin force-pushed the improve-precision-recall-metric-issue-2571 branch from 76e1508 to 485e4e4 Compare June 7, 2022 06:37
Review comments on ignite/metrics/precision.py and tests/ignite/metrics/test_metrics_lambda.py (outdated; resolved)
Collaborator

@vfdev-5 vfdev-5 left a comment


Thanks a lot for this great PR @sadra-barikbin!
I'm checking it on GPU, and if CI is green let's merge it.
In a follow-up PR, let's clean up the docstrings:

- torch.Tensor(...)
+ torch.tensor(...)
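For context on why this docstring cleanup matters: `torch.Tensor(...)` is the legacy class constructor and always produces float32, while `torch.tensor(...)` is the recommended factory function and infers the dtype from its data. A minimal illustration, assuming torch is installed:

```python
import torch

a = torch.Tensor([1, 2, 3])   # legacy constructor: dtype forced to float32
b = torch.tensor([1, 2, 3])   # factory function: dtype inferred as int64

print(a.dtype)  # torch.float32
print(b.dtype)  # torch.int64
```

Using `torch.tensor` in docstring examples keeps the shown dtypes consistent with what users actually get when they pass integer targets.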

@vfdev-5 vfdev-5 merged commit 9947e7d into pytorch:master Jun 10, 2022
@sadra-barikbin sadra-barikbin deleted the improve-precision-recall-metric-issue-2571 branch June 11, 2022 08:16
Labels
module: metrics Metrics module
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Add new options to Precision and Recall metrics
2 participants