Hadamard gradient transform #3625
Conversation
Hello. You may have forgotten to update the changelog!
Codecov Report

@@            Coverage Diff            @@
##           master    #3625     +/-   ##
=========================================
  Coverage   99.74%   99.74%
=========================================
  Files         326      327       +1
  Lines       28491    28646     +155
=========================================
+ Hits        28417    28572     +155
  Misses         74       74
[sc-31942]
Looking great @rmoyard 💯
I left a few more super tiny language suggestions and 2 small code suggestions I didn't notice before.
The only slightly bigger question I have is why we can't make use of the generator
properties of the operations that are being differentiated. Sorry this only crossed my mind now.
Co-authored-by: David Wierichs <david.wierichs@xanadu.ai>
Co-authored-by: Isaac De Vlugt <34751083+isaacdevlugt@users.noreply.github.com>
… into lcu_gradient
🥳
Co-authored-by: Matthew Silverman <matthews@xanadu.ai>
@dwierichs we add operations to implement the generators for different reasons: some generators need to be decomposed because they are not unitary (PhaseShift, CRX, ...); for Rot we prefer to use CZ instead of its several generators; and some generators are tensor products for which the ctrl function fails, so we use prod instead (IsingXX, ...).
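For illustration only (this is not code from this PR), the generators of two of the gates mentioned above can be inspected directly with PennyLane's generator API, which shows why they cannot always be used as-is in the Hadamard test:

```python
import pennylane as qml

# PhaseShift is generated by a projector, which is not unitary,
# so it has to be decomposed before it can appear as a gate.
print(qml.PhaseShift(0.3, wires=0).generator())

# IsingXX is generated by a tensor product of Paulis; controlling it
# is the case where qml.ctrl runs into trouble, so a product of
# controlled operations is used instead.
print(qml.IsingXX(0.5, wires=[0, 1]).generator())

# Rot has no single generator at all; it is handled through its
# decomposition (and CZ) as described above.
```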
I just added one final suggestion to document the behaviour for Rot a little bit.
Otherwise looks very nice, looking forward to having the Hadamard test gradient available!
🎉
Co-authored-by: David Wierichs <david.wierichs@xanadu.ai>
* Draft
* Trainable params
* Working prototype for RX RY RZ
* Add op_id
* Change tape p idx
* Multiple measurements support
* Probs
* Rot added
* Rename
* Indep params
* Add tests
* probs working
* Test expval and probs
* CR Ising and black
* Typos
* Tests aux wires
* Decomposition
* Small changes
* Small changes
* More black
* Pylint tests
* Pylint tests
* Fix test
* Typo
* Update
* Apply suggestions from code review
  Co-authored-by: David Wierichs <david.wierichs@xanadu.ai>
  Co-authored-by: Matthew Silverman <matthews@xanadu.ai>
* Apply black
* From review
* From review
* Test PauliRot
* pylint tests
* Fix multi measurement and multi tape
* Doc
* Changelog
* Change test name
* Fix doc
* Apply suggestions from code review
  Co-authored-by: David Wierichs <david.wierichs@xanadu.ai>
* Review
* Update doc/releases/changelog-dev.md
  Co-authored-by: Isaac De Vlugt <34751083+isaacdevlugt@users.noreply.github.com>
* Pylint
* Add to doc
* Apply suggestions from code review
  Co-authored-by: Matthew Silverman <matthews@xanadu.ai>
* Note
* Add init expand
* Update pennylane/gradients/hadamard_gradient.py
  Co-authored-by: David Wierichs <david.wierichs@xanadu.ai>
* Black

---------

Co-authored-by: David Wierichs <david.wierichs@xanadu.ai>
Co-authored-by: Matthew Silverman <matthews@xanadu.ai>
Co-authored-by: Isaac De Vlugt <34751083+isaacdevlugt@users.noreply.github.com>
Context:
We currently do not have the Hadamard gradient implemented.
Description of the Change:
This PR adds the hadamard_grad transform; it supports any mix of expval and probs measurements.
Benefits:
A new gradient transform.
Example:
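A minimal usage sketch (the device, circuit, and parameter values below are illustrative assumptions, not taken from this PR); the device is given one extra wire so the transform can use it as the auxiliary wire for the Hadamard test:

```python
import pennylane as qml
from pennylane import numpy as np

# Three wires: two for the circuit, one spare that hadamard_grad can
# use as the auxiliary wire for the Hadamard test.
dev = qml.device("default.qubit", wires=3)

@qml.qnode(dev)
def circuit(params):
    qml.RX(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0))

params = np.array([0.1, 0.2], requires_grad=True)

# Apply the transform to the QNode; probs (or a mix of expval and probs)
# is supported as well.
grad = qml.gradients.hadamard_grad(circuit)(params)
print(grad)
```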
Upcoming work