
Allow FID with torch.float64 #1628

Merged: 8 commits from feature/fid64 into master on Mar 28, 2023
Conversation

SkafteNicki (Member) commented:
What does this PR do?

Fixes #1620
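As a usage sketch (not quoted from the diff, but assuming the public `FrechetInceptionDistance` metric and the generic `Metric.set_dtype` hook in torchmetrics), double-precision FID can be enabled like this:

```python
import torch
from torchmetrics.image.fid import FrechetInceptionDistance

# FID's covariance and matrix-square-root computations are numerically
# sensitive; in float32 the result can drift with the update batch size.
# This PR makes it possible to keep the metric states in float64 instead.
fid = FrechetInceptionDistance(feature=2048)
fid.set_dtype(torch.float64)

# Dummy uint8 images in the (N, 3, 299, 299) layout the metric expects.
real = torch.randint(0, 255, (16, 3, 299, 299), dtype=torch.uint8)
fake = torch.randint(0, 255, (16, 3, 299, 299), dtype=torch.uint8)

fid.update(real, real=True)
fid.update(fake, real=False)
print(fid.compute())  # scalar tensor, accumulated in double precision
```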

Before submitting
  • Was this discussed/agreed via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?
PR review

Anyone in the community is free to review the PR once the tests have passed.
If your PR was not discussed in GitHub issues beforehand, there is a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@SkafteNicki SkafteNicki added the bug / fix Something isn't working label Mar 16, 2023
@SkafteNicki SkafteNicki added this to the v0.12 milestone Mar 16, 2023
@SkafteNicki SkafteNicki marked this pull request as ready for review March 17, 2023 11:24
codecov bot commented on Mar 17, 2023

Codecov Report

Merging #1628 (3e5f4ea) into master (685391d) will decrease coverage by 48%.
The diff coverage is 91%.

❗ Current head 3e5f4ea differs from the pull request's most recent head 839ccc6. Consider uploading reports for commit 839ccc6 to get more accurate results.

Additional details and impacted files
@@           Coverage Diff            @@
##           master   #1628     +/-   ##
========================================
- Coverage      89%     41%    -48%     
========================================
  Files         228     228             
  Lines       12363   12432     +69     
========================================
- Hits        10945    5067   -5878     
- Misses       1418    7365   +5947     

@mergify mergify bot added the ready label Mar 17, 2023
@mergify mergify bot requested a review from a team March 17, 2023 17:53
@Borda Borda requested a review from stancld March 21, 2023 07:36
@Borda Borda enabled auto-merge (squash) March 21, 2023 07:36
@Borda Borda disabled auto-merge March 28, 2023 10:09
@Borda Borda merged commit 7015b94 into master Mar 28, 2023
@Borda Borda deleted the feature/fid64 branch March 28, 2023 10:09
toshas added a commit to toshas/torch-fidelity that referenced this pull request Apr 30, 2023
…lp numerical issues with inception feature extractor and its output variation due to the batch size.

fix #43, related in torchmetrics:
- Lightning-AI/torchmetrics#1620
- Lightning-AI/torchmetrics#1628
add explicit eval in the inception fe to help a case if someone copies just that file for metrics evaluation
add explicit require_grad(False) to clip feature extractor
add test cases to troubleshoot batch size dependence of metrics values
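As an illustration of the hardening steps listed above (a generic PyTorch sketch; torchvision's `inception_v3` stands in for torch-fidelity's own feature extractor, so the model choice is an assumption):

```python
import torch
from torchvision.models import inception_v3

# eval() pins batch norm to its running statistics and disables dropout,
# so per-image features no longer depend on what else is in the batch;
# requires_grad_(False) makes the metric-only freeze explicit.
extractor = inception_v3(weights="DEFAULT")
extractor.eval()
extractor.requires_grad_(False)

with torch.inference_mode():
    features = extractor(torch.rand(4, 3, 299, 299))
```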
rustoneee added a commit to rustoneee/Pytorch-Generative-models-GAN- that referenced this pull request on Nov 6, 2023, with the same commit message as above.
Labels: bug / fix (Something isn't working), ready, topic: Image
Projects: none yet
Development: successfully merging this pull request may close the issue "Batch size dependent FID" (#1620)
3 participants