Adds average argument to AveragePrecision metric #477
Conversation
@SkafteNicki seems
Codecov Report
@@            Coverage Diff            @@
##           master     #477     +/-   ##
=========================================
- Coverage       95%      95%      -0%
=========================================
  Files          132      132
  Lines         4652     4681      +29
=========================================
+ Hits          4435     4459      +24
- Misses         217      222       +5
Force-pushed from de7bfa2 to f9e3a17
Force-pushed from c0192f0 to fb7e88c
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
nice!
fine with the wording choice here, but isn't
good point, mAP is mean Average Precision, but it is not a simple statistical mean; it is some aggregated mean, if I am correct 🐰
This is the confusing part with the naming of these metrics:
Makes sense, thanks for the clarification guys :)
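For context on the naming discussed above: for the classification metric in this PR, the macro average is just an unweighted mean over the per-class AP scores, whereas detection-style mAP typically aggregates further (for example over IoU thresholds). A small illustrative sketch of that macro aggregation; the helper name and the NaN handling are assumptions for illustration, not taken from this PR:

```python
import torch

def macro_average(per_class_ap: list) -> torch.Tensor:
    # Unweighted mean over per-class average-precision scores.
    # Skipping NaN entries (e.g. classes with no positive targets) is an
    # illustrative assumption, not something this PR page confirms.
    scores = torch.stack([ap for ap in per_class_ap if not torch.isnan(ap)])
    return scores.mean()

per_class = [torch.tensor(0.9), torch.tensor(0.7), torch.tensor(0.5)]
print(macro_average(per_class))  # tensor(0.7000)
```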
Before submitting
What does this PR do?
Fixes #471
Adds average argument to AveragePrecision metric as requested by users. This will change the default output for multiclass and multilabel input from a list with the score per class to instead output the macro average.
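A minimal usage sketch of the behavior this PR describes; apart from the average argument itself, the parameter names shown (num_classes, and average=None for per-class scores) are assumptions for illustration, not confirmed by this page:

```python
import torch
from torchmetrics import AveragePrecision

preds = torch.softmax(torch.randn(20, 3), dim=-1)  # probabilities for 3 classes
target = torch.randint(0, 3, (20,))

# New default described in this PR: a single macro-averaged score
# instead of a list with one score per class.
ap_macro = AveragePrecision(num_classes=3, average="macro")
print(ap_macro(preds, target))

# Per-class scores are assumed to remain available by turning averaging off.
ap_per_class = AveragePrecision(num_classes=3, average=None)
print(ap_per_class(preds, target))
```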
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃