
BugFix: TorchLayer now supports shot batching #5492

Merged 33 commits into master on Apr 22, 2024
Conversation

@astralcai (Contributor) commented Apr 10, 2024

Context:
TorchLayer raises an error when used with a shot vector.

Description of the Change:

  • Adds support for shot batching to TorchLayer
  • Fixes a bug in `_to_qfunc_output_type` in `qnode.py` that converted the entire shot-batched output, rather than each output in the batch, into the qfunc output type.
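The fix described above can be sketched in plain Python. The function name, signature, and shapes below are assumptions for illustration only, not the actual `qnode.py` implementation:

```python
def to_qfunc_output_type(results, qfunc_output, has_partitioned_shots):
    """Convert raw execution results into the container type the qfunc returned.

    Hypothetical sketch: `results` is the raw execution output, `qfunc_output`
    is what the user's quantum function returned (e.g. a list of measurements),
    and `has_partitioned_shots` indicates a shot vector was used.
    """
    if has_partitioned_shots:
        # The buggy version converted `results` as a whole into the qfunc
        # output type; the fix converts each shot copy's result individually
        # and keeps the batch dimension as a tuple.
        return tuple(
            to_qfunc_output_type(r, qfunc_output, False) for r in results
        )
    if isinstance(qfunc_output, (list, tuple)):
        # Preserve the container type (list vs tuple) the qfunc used.
        return type(qfunc_output)(results)
    return results

# Shot vector with two copies, qfunc returning a list of one measurement:
batched = ((0.5,), (0.6,))
print(to_qfunc_output_type(batched, ["measurement"], True))  # ([0.5], [0.6])
```

With the per-copy conversion, each shot copy keeps the list wrapper of the qfunc return, while the batch itself stays a tuple.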

Related GitHub Issues:
Fixes #5319
[sc-58099]


codecov bot commented Apr 11, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 99.67%. Comparing base (d03216e) to head (7409dbe).
Report is 4 commits behind head on master.

Additional details and impacted files
@@           Coverage Diff            @@
##           master    #5492    +/-   ##
========================================
  Coverage   99.67%   99.67%            
========================================
  Files         407      409     +2     
  Lines       38019    37867   -152     
========================================
- Hits        37895    37745   -150     
+ Misses        124      122     -2     


@astralcai astralcai marked this pull request as ready for review April 18, 2024 02:31
@mudit2812 mudit2812 self-requested a review April 18, 2024 19:10
@dwierichs dwierichs self-requested a review April 19, 2024 07:14
@dwierichs (Contributor) left a comment

Thanks @astralcai !
Mostly looks good to me.
Before approving, we should figure out what is going on in finite-diff and SPSA with the permuted axes.
Also, I had a question about a missing `usefixtures` that does not seem to affect whether the tests pass. Am I confused about how this works? 😅

pennylane/workflow/qnode.py (resolved)
pennylane/qnn/torch.py (resolved)
tests/qnn/test_qnn_torch.py (resolved)
@mudit2812 (Contributor) left a comment

Left a couple of initial comments. Seems to me that this is currently blocked by the expected shape of the returns.

@astralcai (Contributor, Author) commented Apr 19, 2024

> what is going on in finite diff and SPSA with the permuted axes

@dwierichs Can you elaborate?

@trbromley trbromley added this to the v0.36 milestone Apr 19, 2024
@dwierichs (Contributor) commented
@astralcai I was just referring to the output shape question for these gradient methods.
I do agree that there seems to be a discrepancy between the output shapes produced by PennyLane and the technical specification doc. But I am a bit surprised that this only surfaces in two specific gradient method tests and it is a bug fix unrelated to the original PR. Might be nice to split this off and e.g. add forward pass tests that detect this difference.

@astralcai (Contributor, Author) commented Apr 19, 2024

> @astralcai I was just referring to the output shape question for these gradient methods. I do agree that there seems to be a discrepancy between the output shapes produced by PennyLane and the technical specification doc. But I am a bit surprised that this only surfaces in two specific gradient method tests and it is a bug fix unrelated to the original PR. Might be nice to split this off and e.g. add forward pass tests that detect this difference.

This only happens when the measurements are returned in a list. If you change the list to a tuple, the error does not appear, which is the case for most existing tests. More specifically, it only occurs when a QNode returns a list containing a single measurement, which I assume rarely happens elsewhere in the test suite.
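The shape difference discussed above can be mimicked in plain Python. The values and shapes below are assumptions inferred from the discussion, not actual PennyLane output:

```python
# Raw results for a shot vector with two copies, where the QNode body
# returns a list containing a single expectation value (assumed shapes).
raw = ((0.5,), (0.6,))

# Buggy conversion: the whole shot batch is cast to the qfunc's list type,
# so the batch dimension becomes the list.
buggy = list(raw)

# Fixed conversion: each shot copy is cast individually; the batch
# dimension stays a tuple of per-copy lists.
fixed = tuple(list(r) for r in raw)

print(buggy)  # [(0.5,), (0.6,)]
print(fixed)  # ([0.5], [0.6])
```

A tuple return avoids the mismatch because the raw per-copy results are already tuples, which is why most existing tests never triggered the bug.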

@dwierichs (Contributor) commented
Ahh, I understand now, thank you! 🙏 Sorry, I overlooked that distinction.

I still think we should test the forward-pass output shape outside of the differentiation tests :) Would you mind adding a tiny test for this? If the code is ever changed, it would be confusing to see two differentiation tests fail but no forward-pass tests.

tests/test_return_types_qnode.py (outdated, resolved)
@dwierichs (Contributor) left a comment

Thanks @astralcai 🎉
Just small suggestions to wrap up, but looks good to me!

tests/qnn/test_qnn_torch.py (outdated, resolved)
tests/test_return_types_qnode.py (outdated, resolved)
doc/releases/changelog-dev.md (resolved)
@astralcai astralcai enabled auto-merge (squash) April 22, 2024 16:15
@astralcai astralcai merged commit be8a22e into master Apr 22, 2024
38 checks passed
@astralcai astralcai deleted the torch-shot-batching branch April 22, 2024 19:06
Successfully merging this pull request may close these issues.

[BUG] Shot-Batching does not work with TorchLayers
4 participants