
Revert device gradients fix #2595

Merged 23 commits on Jun 2, 2022
6 changes: 4 additions & 2 deletions doc/releases/changelog-dev.md
@@ -178,6 +178,9 @@
instead of the controlled version of the diagonal unitary.
[(#2525)](https://github.com/PennyLaneAI/pennylane/pull/2525)

+* Reverted the device-defined gradients fix [(#2485)](https://github.com/PennyLaneAI/pennylane/pull/2485), since
+  the code change was breaking some devices. [(#2595)](https://github.com/PennyLaneAI/pennylane/pull/2595)

<h3>Deprecations</h3>

<h3>Documentation</h3>
@@ -197,5 +200,4 @@
This release contains contributions from (in alphabetical order):

 Amintor Dusko, Chae-Yeun Park, Christian Gogolin, Christina Lee, David Wierichs, Edward Jiang, Guillermo Alonso-Linaje,
-Jay Soni, Juan Miguel Arrazola, Maria Schuld, Mikhail Andrenkov, Soran Jahangiri, Utkarsh Azad
-
+Jay Soni, Juan Miguel Arrazola, Maria Schuld, Mikhail Andrenkov, Samuel Banning, Soran Jahangiri, Utkarsh Azad
7 changes: 3 additions & 4 deletions pennylane/interfaces/autograd.py
@@ -234,10 +234,9 @@ def grad_fn(dy):
         return_vjps = [
             qml.math.to_numpy(v, max_depth=_n) if isinstance(v, ArrayBox) else v for v in vjps
         ]
-        if device.capabilities().get("provides_jacobian", False):
-            # in the case where the device provides the jacobian,
-            # the output of grad_fn must be wrapped in a tuple in
-            # order to match the input parameters to _execute.
+        if device.short_name == "strawberryfields.gbs":
+            # TODO: remove this exceptional case once the source of this issue
+            # https://github.com/PennyLaneAI/pennylane-sf/issues/89 is determined
             return (return_vjps,)
         return return_vjps

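The tuple wrapping in the hunk above comes down to a shape contract: autograd pairs the cotangents returned by `grad_fn` one-to-one with the positional arguments of the wrapped execute function, and since all trainable parameters arrive as a single list argument, the bare vjp list must be wrapped in a one-element tuple. A minimal pure-Python sketch of that contract (illustrative names only, not PennyLane internals):

```python
# Sketch of the shape contract behind the tuple wrapping: the wrapped
# execute function receives all trainable parameters as ONE positional
# argument (a list), so its custom gradient must return ONE cotangent --
# the vjp list wrapped in a one-element tuple.
# All names here are illustrative, not PennyLane internals.

def check_cotangent_shape(num_positional_args, cotangents):
    """One cotangent is expected per positional argument."""
    return len(cotangents) == num_positional_args

vjps = [1.0, 2.0, 3.0]  # one vjp entry per trainable parameter

# The execute function takes a single positional argument, so:
assert not check_cotangent_shape(1, vjps)   # bare list: 3 cotangents != 1 arg
assert check_cotangent_shape(1, (vjps,))    # wrapped: 1 cotangent == 1 arg
```

This is why a device that returns the bare list needs the special case: without the wrap, the cotangent count no longer matches the argument count.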
26 changes: 26 additions & 0 deletions tests/interfaces/test_autograd.py
Expand Up @@ -1141,3 +1141,29 @@ def circuit(v):

d_out = d_circuit(params)
assert np.allclose(d_out, np.array([1.0, 2.0, 3.0, 4.0]))

    def test_custom_jacobians_2(self):
        """Test that QNode gradients still work on a device that advertises
        provides_jacobian but does not implement it."""

        class MyQubit(DefaultQubit):
            @classmethod
            def capabilities(cls):
                capabilities = super().capabilities().copy()
                capabilities.update(
                    # the failure only occurs when this capability is advertised
                    provides_jacobian=True,
                )
                return capabilities

            def jacobian(self, *args, **kwargs):
                raise NotImplementedError()

        dev = MyQubit(wires=2)

        @qml.qnode(dev, diff_method="parameter-shift", mode="backward")
        def qnode(params):
            qml.RY(params[0], wires=[0])
            return qml.expval(qml.PauliZ(0))

        params = np.array([0.2])

        assert np.isclose(qnode(params), 0.9800665778412417)
        assert np.isclose(qml.jacobian(qnode)(params), -0.19866933)
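The expected values in the test follow directly from the circuit: `RY(theta)` on `|0>` gives `<Z> = cos(theta)`, and the parameter-shift rule recovers the exact derivative `-sin(theta)`. A plain NumPy sketch checking both numbers (no PennyLane required):

```python
import numpy as np

# <Z> for RY(theta)|0> is cos(theta); the parameter-shift rule with
# shifts of +/- pi/2 gives the exact derivative:
#   d<Z>/dtheta = [f(theta + pi/2) - f(theta - pi/2)] / 2 = -sin(theta)
theta = 0.2
f = np.cos
shift = np.pi / 2
ps_grad = (f(theta + shift) - f(theta - shift)) / 2

print(f(theta))    # 0.9800665778412416... -- the asserted expectation value
print(ps_grad)     # -0.1986693307950612... -- the asserted Jacobian entry
```

This confirms the test's literals `0.9800665778412417` and `-0.19866933` are just `cos(0.2)` and `-sin(0.2)`.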