
[bugfix] Probabilities do not sum to one with Torch #5462

Merged: 21 commits into master, Apr 8, 2024

Conversation

PietropaoloFrisoni
Contributor

@PietropaoloFrisoni PietropaoloFrisoni commented Apr 2, 2024

Context: When computing the expval of an operator on a quantum device with the torch interface and default_dtype set to torch.float32, an error is raised because the computed probabilities do not sum to one. The error does not occur if default_dtype is set to torch.float64.
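The underlying effect is ordinary float32 rounding: a probability that is exact (or nearly exact) in float64 picks up an error of order 1e-8 when stored in float32, so a vector of such probabilities need not sum exactly to one. A minimal stdlib-only sketch, independent of Torch and PennyLane, shown purely to illustrate the dtype effect:

```python
import struct

def as_float32(x: float) -> float:
    """Round a Python float (float64) to the nearest float32 value."""
    return struct.unpack("f", struct.pack("f", x))[0]

# Three equal outcome probabilities of 1/3, stored at float32 precision
third = as_float32(1.0 / 3.0)

# Accumulating them at higher precision (as a strict sum-to-one check
# effectively does) exposes the drift
total = third * 3
print(total)  # slightly above 1.0, off by roughly 3e-8
```

The deviation here (about 3e-8) is well below the 1e-07 cutoff discussed in this PR, which is the kind of drift the fix is meant to absorb.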

Description of the Change: A renormalization of the probabilities is introduced to overcome the issue. The renormalization is applied only when both of the following conditions are satisfied: 1) at least one probability vector does not sum exactly to one, and 2) for every such vector, the deviation of the sum from one lies between 0 and 1e-07.
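The conditional renormalization described above can be sketched as follows. This is a plain-Python illustration over a list of floats; the actual change operates on interface tensors inside PennyLane, and the helper name here is invented for the sketch:

```python
CUTOFF = 1e-7  # matches the 1e-07 cutoff described in this PR

def renormalize_probs(probs):
    """Rescale `probs` to sum to one, but only when the deviation is tiny.

    Condition 1): the sum deviates from one.
    Condition 2): the deviation is at most CUTOFF.
    """
    total = sum(probs)
    deviation = abs(total - 1.0)
    if 0.0 < deviation <= CUTOFF:
        # Small rounding drift: divide through by the total.
        return [p / total for p in probs]
    # Either already exact, or too far off to fix silently.
    return probs

# A vector whose sum drifts slightly below one is rescaled...
q = renormalize_probs([0.5, 0.5 - 5e-8])

# ...while a vector that is badly off is left untouched.
r = renormalize_probs([0.5, 0.4])
```

Keeping the cutoff small is what limits the drawback noted below: a genuinely unnormalized distribution (deviation larger than 1e-07) is returned unchanged, so the original error surface is preserved for real bugs.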

Benefits: The error is no longer raised.

Possible Drawbacks: The main drawback is that renormalization could occur in cases where it should not. This is unlikely, however, since the cutoff of 1e-07 is small enough to prevent such cases while still being large enough to absorb the float32 rounding drift that triggers the error.

Related GitHub Issues: #5444

[sc-59957]


codecov bot commented Apr 2, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 99.67%. Comparing base (3b0c1b6) to head (d11f2ba).
Report is 1 commit behind head on master.

@@            Coverage Diff             @@
##           master    #5462      +/-   ##
==========================================
- Coverage   99.68%   99.67%   -0.01%     
==========================================
  Files         402      402              
  Lines       37540    37287     -253     
==========================================
- Hits        37420    37166     -254     
- Misses        120      121       +1     


@PietropaoloFrisoni PietropaoloFrisoni marked this pull request as ready for review April 3, 2024 12:54
@PietropaoloFrisoni PietropaoloFrisoni requested a review from a team April 3, 2024 15:55
Contributor

@albi3ro albi3ro left a comment


Test?

Contributor

@albi3ro albi3ro left a comment


Looks great! Thanks for taking this on 🚀

@PietropaoloFrisoni PietropaoloFrisoni enabled auto-merge (squash) April 8, 2024 19:27
@albi3ro albi3ro disabled auto-merge April 8, 2024 20:10
@albi3ro albi3ro merged commit c416ad6 into master Apr 8, 2024
37 of 38 checks passed
@albi3ro albi3ro deleted the bugfix/probs_do_not_sum_to_one_torch branch April 8, 2024 20:10
3 participants