
[skipci] [Doctest] added contrib metrics AveragePrecision, PrecisionRecallCurve, ROC_AUC and RocCurve #2341

Merged
merged 6 commits into from
Dec 6, 2021

Conversation

Ishan-Kumar2
Contributor

Addresses #2265

Description:
added doctests for contrib metrics AveragePrecision, PrecisionRecallCurve, ROC_AUC and RocCurve
I'm not sure about the list-of-floats outputs; please let me know if there is a better way to handle those.
@ydcjeff @sdesrozis
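
For context on what the doctests verify: AveragePrecision delegates the actual computation to scikit-learn (to my understanding, `sklearn.metrics.average_precision_score` via `EpochMetric`). The sketch below is a stdlib-only simplification of that formula, `AP = sum_n (R_n - R_{n-1}) * P_n`, without sklearn's tie grouping; the function name is illustrative, not part of ignite:

```python
def average_precision(y_true, y_scores):
    """Average precision: sum_n (R_n - R_{n-1}) * P_n, visiting samples
    in order of descending score (step-wise, no interpolation, no tie
    grouping -- a simplification vs. sklearn)."""
    order = sorted(range(len(y_scores)), key=lambda i: -y_scores[i])
    total_pos = sum(y_true)
    tp = fp = 0
    prev_recall = 0.0
    ap = 0.0
    for i in order:
        if y_true[i]:
            tp += 1
        else:
            fp += 1
        precision = tp / (tp + fp)
        recall = tp / total_pos
        ap += (recall - prev_recall) * precision
        prev_recall = recall
    return ap

# e.g. average_precision([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]) -> 0.8333...
```

A metric like RocCurve, by contrast, returns whole arrays of thresholds/rates rather than one scalar, which is where the list-of-floats doctest formatting question comes from.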

Check list:

  • New tests are added (if a new feature is added)
  • New doc strings: description and/or example code are in RST format
  • Documentation is updated (if required)

@github-actions github-actions bot added docs module: contrib Contrib module labels Dec 5, 2021
docs/source/conf.py (review thread, resolved)
@sdesrozis
Contributor

@Ishan-Kumar2 Thanks a lot for your help! I left some minor comments.


avg_precision = AveragePrecision(activated_output_transform)
avg_precision = AveragePrecision()
# The ``output_transform`` arg of the metric can be used to perform a softmax on the ``y_pred``.
Contributor

@sdesrozis sdesrozis Dec 5, 2021


Thinking about it, it is worth keeping the current example to show how to respect the data format, maybe as a note or as an example following the one you added. What do you think?

Note:
    AveragePrecision expects ``y`` to be comprised of 0's and 1's. ``y_pred`` must either be probability estimates or confidence values. To apply an activation to ``y_pred``, use ``output_transform`` as shown below:

        .. code-block:: python

            def activated_output_transform(output):
                y_pred, y = output
                y_pred = torch.softmax(y_pred, dim=1)
                return y_pred, y

            avg_precision = AveragePrecision(activated_output_transform)
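
As a quick sanity check of the transform above, here is a stdlib-only sketch of the row-wise softmax that ``torch.softmax(y_pred, dim=1)`` computes on a 2-D tensor (the helper name is hypothetical, not ignite API):

```python
import math

def softmax_rows(rows):
    """Row-wise softmax over a list of lists, mirroring
    torch.softmax(y_pred, dim=1) on a 2-D tensor."""
    out = []
    for row in rows:
        m = max(row)                           # shift by the row max for numerical stability
        exps = [math.exp(v - m) for v in row]
        s = sum(exps)
        out.append([e / s for e in exps])
    return out

probs = softmax_rows([[0.0, 0.0], [1.0, 3.0]])
# each row sums to 1; probs[0] == [0.5, 0.5]
```

After the transform, each row of ``y_pred`` is a valid probability distribution, which is the format AveragePrecision expects.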

Contributor Author


Yup, makes sense to have it in the docs somewhere, I have added it now!

Contributor

@sdesrozis sdesrozis left a comment


@Ishan-Kumar2 Thank you! LGTM

@sdesrozis sdesrozis merged commit 71bbe2f into pytorch:master Dec 6, 2021
@sdesrozis sdesrozis mentioned this pull request Dec 6, 2021
Ishan-Kumar2 added a commit to Ishan-Kumar2/ignite that referenced this pull request Dec 26, 2021
…isionRecallCurve``, ``ROC_AUC`` and ``RocCurve`` (pytorch#2341)