
Fix qml.grad so that the returned gradient always matches the cost function return type if only a single argument is differentiated #1067

Merged
merged 11 commits into master from qml-grad-argnum on Feb 9, 2021

Conversation

@josh146 (Member) commented on Feb 5, 2021

Context:

I realised yesterday that autograd.grad behaves differently depending on whether the argnum argument is a sequence or an integer:

>>> import autograd
>>> from autograd import numpy as np
>>> autograd.grad(np.sin, argnum=0)(0.6)
0.8253356149096783
>>> autograd.grad(np.sin, argnum=[0])(0.6)
(array(0.82533561),)

We never take this into account in PennyLane, which means gradients are often returned as a tuple of size 1.
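For illustration, here is a hedged sketch of how this surfaces in PennyLane; the circuit, device, and values are hypothetical and chosen only to show the shape of the return value:

import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def circuit(x):
    qml.RX(x, wires=0)
    return qml.expval(qml.PauliZ(0))

x = np.array(0.6, requires_grad=True)

# Before this change: the single inferred argnum was passed as the sequence
# [0], so the gradient came back wrapped in a length-1 tuple,
#     qml.grad(circuit)(x)  ->  (array(-0.56464247),)
# After this change it matches the cost function's return type,
#     qml.grad(circuit)(x)  ->  array(-0.56464247)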

Description of the Change:

  • If the inferred argnum argument is a sequence of length 1, only the contained integer is passed to autograd.grad (a sketch of the idea follows below).
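A minimal sketch of the idea — not the actual PennyLane source; the helper name is made up, and only the argnum handling mirrors this change:

import autograd
from autograd import numpy as np

def _grad_with_unwrapped_argnum(fun, argnum):
    # Illustrative only: if the inferred argnum is a length-1 sequence,
    # pass the bare integer so autograd.grad returns the gradient directly
    # rather than wrapped in a length-1 tuple.
    if isinstance(argnum, (list, tuple)) and len(argnum) == 1:
        argnum = argnum[0]
    return autograd.grad(fun, argnum=argnum)

>>> _grad_with_unwrapped_argnum(np.sin, [0])(0.6)
0.8253356149096783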

Benefits:

  • The return value of qml.grad should now always match the return type of the function being differentiated.

  • A lot of superfluous [0] indexing can be removed from the tests (e.g., qml.grad(circuit)(x)[0]).

Possible Drawbacks:

  • The optimizers have to re-add the redundant tuple where needed so that the gradient remains iterable (see the sketch after this list). But this is an easy change and leads to a more intuitive UI, so arguably a net improvement.

  • Some of the demos might need to be updated to remove unneeded [0] after gradient computations. But I think very few demos actually show the gradients.
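As flagged in the first drawback, here is a hedged sketch of the optimizer-side adjustment; _compute_grad is a hypothetical stand-in for the optimizer internals, and only the re-tupling mirrors the diff discussed further down:

def _compute_grad(grad_fn, *args):
    # Hypothetical helper: the optimizer logic iterates over one gradient
    # entry per differentiated argument, so when only a single argument is
    # present, the now-unwrapped gradient is re-wrapped in a length-1 tuple.
    grad = grad_fn(*args)
    if len(args) == 1:
        grad = (grad,)
    return grad

For example, _compute_grad(qml.grad(circuit), x) would again yield a one-element tuple that the update step can iterate over, without any other optimizer code needing to change.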

Related GitHub Issues: n/a

@josh146 josh146 added the bug 🐛 Something isn't working label Feb 5, 2021
@josh146 josh146 changed the title from "Qml grad argnum" to "Fix qml.grad so that the returned gradient always matches the cost function return type if only a single argument is differentiated" on Feb 5, 2021
codecov bot commented on Feb 5, 2021

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 97.74%. Comparing base (812e9d4) to head (de65756).

Additional details and impacted files
@@           Coverage Diff           @@
##           master    #1067   +/-   ##
=======================================
  Coverage   97.74%   97.74%           
=======================================
  Files         153      153           
  Lines       11584    11590    +6     
=======================================
+ Hits        11323    11329    +6     
  Misses        261      261           


tests/tape/interfaces/test_qnode_autograd.py: three review comments (outdated, resolved)
Comment on lines +130 to +132
if len(args) == 1:
    grad = (grad,)

Contributor commented:

We have to turn grad into a tuple because we're reversing what we did to argnum earlier, right?

josh146 (Member, Author) replied:

Yep, the optimizer logic currently assumes grad is always iterable with respect to argument number.

I figured it was easier to leave this logic intact and simply re-tuple the grad.

Contributor replied:

Agreed.

@josh146 josh146 merged commit cfdb6f8 into master Feb 9, 2021
@josh146 josh146 deleted the qml-grad-argnum branch February 9, 2021 06:08
josh146 added a commit that referenced this pull request Feb 11, 2021
… cost function return type if only a single argument is differentiated (#1067)"

This reverts commit cfdb6f8.
josh146 added a commit that referenced this pull request Feb 11, 2021
… cost function return type if only a single argument is differentiated (#1067)" (#1080)

This reverts commit cfdb6f8.
josh146 added a commit that referenced this pull request Mar 2, 2021
…nction return type if only a single argument is differentiated (#1081)

* Revert "Fix qml.grad so that the returned gradient always matches the cost function return type if only a single argument is differentiated (#1067)"

This reverts commit cfdb6f8.

* qml fix

* changelog

* more