
Iteration over a metric never ends #1319

Closed
dakinggg opened this issue Nov 7, 2022 · 0 comments · Fixed by #1320
Labels
bug / fix (Something isn't working), help wanted (Extra attention is needed)

Comments


dakinggg commented Nov 7, 2022

🐛 Bug

If you try to iterate over a metric (by mistake, in my case), it spins forever, making it appear that the program has hung. I'm not sure why torchmetrics behaves this way internally, or whether it is intentional, but it would be nice if it raised an error instead, unless there is a use for iterating over a metric that I am not aware of.

from torchmetrics.classification import Accuracy

acc = Accuracy()
# This loop never terminates; each iteration yields a new CompositionalMetric
for i, item in enumerate(acc):
    print(i, item)

The above code spins forever, printing output like this:

8144 CompositionalMetric(
  <lambda>(
    Accuracy(),
    None
  )
)
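
My guess at the cause (I have not confirmed this in the torchmetrics source): Metric does not define __iter__, but the printed CompositionalMetric(<lambda>(Accuracy(), None)) looks like the result of indexing, i.e. Metric.__getitem__ returns a new composed metric instead of ever raising IndexError. In that case Python's legacy iteration protocol keeps calling acc[0], acc[1], ... forever. A minimal sketch of that Python behavior, with no torchmetrics involved (the class name is made up for illustration):

class IndexableForever:
    # No __iter__ defined; Python falls back to calling __getitem__ with
    # 0, 1, 2, ... and only stops when IndexError is raised, which never
    # happens here.
    def __getitem__(self, idx):
        return f"item {idx}"

for i, item in enumerate(IndexableForever()):
    print(i, item)
    if i > 5:  # break manually so this demo terminates
        break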

Expected behavior

I would expect an error to be raised if I try to iterate over a metric.
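
One possible fix (just a sketch of what I mean by "raise an error", not necessarily how #1320 handles it): define __iter__ on the metric base class so the mistake fails immediately and explicitly.

class Metric:  # stand-in for the real base class, for illustration only
    def __iter__(self):
        raise NotImplementedError("Metric objects are not iterable")

for x in Metric():  # raises NotImplementedError right away
    pass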

Environment

  • TorchMetrics version (and how you installed TM, e.g. conda, pip, build from source): 0.10.2
dakinggg added the bug / fix and help wanted labels on Nov 7, 2022