
Hadamard gradient transform #3625

Merged

rmoyard merged 57 commits into master from lcu_gradient on Feb 3, 2023

Conversation

rmoyard
Contributor

@rmoyard rmoyard commented Jan 11, 2023

Context:
We currently do not have the Hadamard gradient implemented.

Description of the Change:

This PR adds the hadamard_grad transform, which supports any mix of expval and probs measurements.

Benefits:

A new gradient transform.

Example:

import pennylane as qml

qml.enable_return()

# Three wires: two for the circuit and one auxiliary wire for the Hadamard test.
dev = qml.device("default.qubit", wires=3)
x = 0.543
y = -0.654

# Build the tape to differentiate.
with qml.queuing.AnnotatedQueue() as q:
    qml.RX(x, wires=[0])
    qml.RY(y, wires=[1])
    qml.CNOT(wires=[0, 1])
    qml.expval(qml.PauliZ(0))
    qml.probs(wires=[0, 1])

tape = qml.tape.QuantumScript.from_queue(q)

# Generate the Hadamard-test tapes and the post-processing function,
# then execute the tapes and post-process to obtain the gradient.
tapes, fn = qml.gradients.hadamard_grad(tape)
res = fn(dev.batch_execute(tapes))
print(res)
((-0.5167068002272901, -0.0), (tensor([-0.23169865, -0.02665475,  0.02665475,  0.23169865], requires_grad=True), tensor([ 0.28230648, -0.28230648, -0.02187647,  0.02187647], requires_grad=True)))
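
As a quick sanity check (not part of the PR itself), the result can be compared against the parameter-shift rule on the same tape; this sketch reuses dev and tape from the example above:

# Sketch: cross-check hadamard_grad against the parameter-shift rule.
ps_tapes, ps_fn = qml.gradients.param_shift(tape)
ps_res = ps_fn(dev.batch_execute(ps_tapes))
print(ps_res)  # should agree with the hadamard_grad result up to numerical precision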

Upcoming work

  1. Add variance.
  2. Add QNode integration.

@github-actions
Contributor

Hello. You may have forgotten to update the changelog!
Please edit doc/releases/changelog-dev.md with:

  • A one-to-two sentence description of the change. You may include a small working example for new features.
  • A link back to this PR.
  • Your name (or GitHub username) in the contributors section.
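
For instance, a minimal entry for this PR might look like the following (a sketch that assumes the repository's usual changelog format):

* Added the `qml.gradients.hadamard_grad` transform for computing gradients with the Hadamard test.
  [(#3625)](https://github.com/PennyLaneAI/pennylane/pull/3625)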

@codecov

codecov bot commented Jan 11, 2023

Codecov Report

Merging #3625 (4000424) into master (30b58bf) will increase coverage by 0.00%.
The diff coverage is 100.00%.

@@           Coverage Diff            @@
##           master    #3625    +/-   ##
========================================
  Coverage   99.74%   99.74%            
========================================
  Files         326      327     +1     
  Lines       28491    28646   +155     
========================================
+ Hits        28417    28572   +155     
  Misses         74       74            
Impacted Files Coverage Δ
pennylane/gradients/gradient_transform.py 100.00% <ø> (ø)
pennylane/gradients/hamiltonian_grad.py 100.00% <ø> (ø)
pennylane/transforms/__init__.py 100.00% <ø> (ø)
pennylane/gradients/__init__.py 100.00% <100.00%> (ø)
pennylane/gradients/hadamard_gradient.py 100.00% <100.00%> (ø)
pennylane/transforms/tape_expand.py 100.00% <100.00%> (ø)


@rmoyard
Contributor Author

rmoyard commented Jan 13, 2023

[sc-31942]

@rmoyard rmoyard changed the title from "[WIP] Linear combination unitaries for gradient" to "[WIP] Hadamard gradient transform" on Jan 23, 2023
Contributor

@dwierichs dwierichs left a comment


Looking great @rmoyard 💯
I left a few more super tiny language suggestions and 2 small code suggestions I didn't notice before.
The only slightly bigger question I have is why we can't make use of the generator properties of the operations that are being differentiated. Sorry this only crossed my mind now.

(Inline suggestions on doc/releases/changelog-dev.md and pennylane/gradients/hadamard_gradient.py; resolved.)
rmoyard and others added 3 commits February 2, 2023 11:53
rmoyard and others added 3 commits February 2, 2023 13:46
Co-authored-by: Isaac De Vlugt <34751083+isaacdevlugt@users.noreply.github.com>
Contributor

@timmysilv timmysilv left a comment


🥳

(Inline suggestions on pennylane/gradients/hadamard_gradient.py; resolved.)
rmoyard and others added 2 commits February 2, 2023 16:30
Co-authored-by: Matthew Silverman <matthews@xanadu.ai>
@rmoyard
Contributor Author

rmoyard commented Feb 2, 2023

@dwierichs we add operations instead of using the generators directly for a few reasons: some generators are not unitary and need to be decomposed (PhaseShift, CRX, ...); for Rot we prefer to use CZ rather than several different generators; and some generators are tensors for which the ctrl function fails, so we use prod instead (IsingXX, ...).
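
For context, here is a small illustrative snippet inspecting those generators (the exact representation returned depends on the PennyLane version):

import pennylane as qml

# Illustrative only: look at the generators that motivate the special cases above.
ops = [
    qml.PhaseShift(0.1, wires=0),    # projector-valued generator, not unitary
    qml.CRX(0.1, wires=[0, 1]),      # projector-based generator, needs a decomposition
    qml.IsingXX(0.1, wires=[0, 1]),  # tensor-product generator
]
for op in ops:
    print(op.name, "->", op.generator())

# Rot defines no single generator at all, so it is differentiated via its decomposition.
try:
    qml.Rot(0.1, 0.2, 0.3, wires=0).generator()
except qml.operation.GeneratorUndefinedError:
    print("Rot: no generator defined")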

Contributor

@dwierichs dwierichs left a comment


I just added one final suggestion to document the behaviour for Rot a little bit.
Otherwise looks very nice, looking forward to having the Hadamard test gradient available!
🎉

(Inline suggestions on pennylane/gradients/hadamard_gradient.py; resolved.)
@rmoyard rmoyard enabled auto-merge (squash) February 3, 2023 14:36
@rmoyard rmoyard merged commit e412162 into master Feb 3, 2023
@rmoyard rmoyard deleted the lcu_gradient branch February 3, 2023 14:54
mudit2812 pushed a commit that referenced this pull request Apr 13, 2023
* Draft

* Trainable params

* Working prototype for RX RY RZ

* Add op_id

* Change tape p idx

* Multiple measurements support

* Probs

* Rot added

* Rename

* Indep params

* Add tests

* probs working

* Test expval and probs

* CR Ising and black

* Typos

* Tests aux wires

* Decomposition

* Small changes

* Small changes

* More black

* Pylint tests

* Pylint tests

* Fix test

* Typo

* Update

* Apply suggestions from code review

Co-authored-by: David Wierichs <david.wierichs@xanadu.ai>
Co-authored-by: Matthew Silverman <matthews@xanadu.ai>

* Apply black

* From review

* From review

* Test PauliRot

* pylint tests

* Fix multi measurement and multi tape

* Doc

* Changelog

* Change test name

* Fix doc

* Apply suggestions from code review

Co-authored-by: David Wierichs <david.wierichs@xanadu.ai>

* Review

* Update doc/releases/changelog-dev.md

Co-authored-by: Isaac De Vlugt <34751083+isaacdevlugt@users.noreply.github.com>

* Pylint

* Add to doc

* Apply suggestions from code review

Co-authored-by: Matthew Silverman <matthews@xanadu.ai>

* Note

* Add init expand

* Update pennylane/gradients/hadamard_gradient.py

Co-authored-by: David Wierichs <david.wierichs@xanadu.ai>

* Black

---------

Co-authored-by: David Wierichs <david.wierichs@xanadu.ai>
Co-authored-by: Matthew Silverman <matthews@xanadu.ai>
Co-authored-by: Isaac De Vlugt <34751083+isaacdevlugt@users.noreply.github.com>