
Fix how auc scores are calculated in PrecisionRecallCurve.plot methods #2437

Merged
merged 5 commits into master from bugfix/score_in_plotting
Mar 15, 2024

Conversation

@SkafteNicki (Member) commented Mar 7, 2024

What does this PR do?

Fixes #2405
Because the scores for this metric are in descending rather than ascending order, the direction argument should be set to -1.0 when calculating the AUC. Additionally, this PR adds a short description of how the value is calculated.
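To illustrate the sign issue, here is a minimal sketch. It is not the library's internal implementation; the helper name and tensor values are made up for the example, assuming the plotted score is a trapezoidal integral of precision over recall.

```python
import torch

def auc_with_direction(x: torch.Tensor, y: torch.Tensor, direction: float = 1.0) -> torch.Tensor:
    """Trapezoidal area under y(x); `direction` compensates for a descending x."""
    return direction * torch.trapezoid(y, x)

# Recall returned by PrecisionRecallCurve is monotonically decreasing, so
# integrating it as-is yields a negative area; direction=-1.0 restores the
# correct value.
recall = torch.tensor([1.0, 0.8, 0.5, 0.0])
precision = torch.tensor([0.5, 0.6, 0.9, 1.0])

print(auc_with_direction(recall, precision))                  # tensor(-0.8100), wrong sign
print(auc_with_direction(recall, precision, direction=-1.0))  # tensor(0.8100), correct area
```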

Before submitting
  • Was this discussed/agreed via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?
PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃


📚 Documentation preview 📚: https://torchmetrics--2437.org.readthedocs.build/en/2437/

@SkafteNicki added the bug / fix label Mar 7, 2024
@SkafteNicki added this to the v1.3.x milestone Mar 7, 2024

codecov bot commented Mar 7, 2024

Codecov Report

Merging #2437 (5e3ed6a) into master (4ed43e6) will not change coverage.
The diff coverage is 75%.

Additional details and impacted files
@@          Coverage Diff           @@
##           master   #2437   +/-   ##
======================================
  Coverage      69%     69%           
======================================
  Files         307     307           
  Lines       17363   17363           
======================================
  Hits        11961   11961           
  Misses       5402    5402           

@mergify bot added the ready label Mar 7, 2024
@justusschock merged commit 0a6ad01 into master Mar 15, 2024
61 checks passed
@justusschock deleted the bugfix/score_in_plotting branch March 15, 2024 14:03
Borda pushed a commit that referenced this pull request Mar 16, 2024
Fix how auc scores are calculated in PrecisionRecallCurve.plot methods (#2437)

Co-authored-by: Jirka Borovec <6035284+Borda@users.noreply.github.com>
(cherry picked from commit 0a6ad01)
Borda pushed a commit that referenced this pull request Mar 18, 2024
Fix how auc scores are calculated in PrecisionRecallCurve.plot methods (#2437)

Co-authored-by: Jirka Borovec <6035284+Borda@users.noreply.github.com>
(cherry picked from commit 0a6ad01)
Successfully merging this pull request may close these issues:
  • Incorrect value of AUROC when plotting a PrecisionRecallCurve metric with score=True (#2405)
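For context, a minimal reproduction sketch of the linked issue (the random inputs are illustrative; `score=True` asks the plot method to display the AUC value on the figure, and with this fix that value is computed with the correct direction):

```python
import torch
from torchmetrics.classification import BinaryPrecisionRecallCurve

preds = torch.rand(100)                 # predicted probabilities
target = torch.randint(0, 2, (100,))    # binary ground truth

metric = BinaryPrecisionRecallCurve()
metric.update(preds, target)

# Plot the curve and display its AUC score; before this PR the shown value
# could be incorrect because the descending recall values were integrated
# without direction=-1.0.
fig, ax = metric.plot(score=True)
```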