added custom return object #2808
Conversation
Hello. You may have forgotten to update the changelog!
Codecov Report
@@            Coverage Diff            @@
##            master    #2808   +/-  ##
=======================================
  Coverage    99.63%   99.63%
=======================================
  Files          252      252
  Lines        20986    20989     +3
=======================================
+ Hits         20910    20913     +3
  Misses          76       76
Continue to review full report at Codecov.
This is looking great overall @puzzleshark! 🎉 Some minor comments.
Also, don't forget to update the changelog! :)
Co-authored-by: antalszava <antalszava@gmail.com>
When running this test with the latest master merged in, I get an additional error. To fix this:
* fix problem from #2808 (comment)
* also test device gradient with custom return type
Looking good overall @puzzleshark, thank you for looking into this! 💯
The main change to do from my side would be moving the test cases into another more relevant file.
Co-authored-by: antalszava <antalszava@gmail.com>
Looks good! 🎉 Great one 🙂
Restores the behavior of PennyLane returning custom items, as it was here: #1109 (comment)
In this PR, the return object is still an autograd NumPy array (which we require for differentiation) with a single entry (the custom object), so it requires .item() to unpack (see the discussion linked above).
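A minimal sketch of the unpacking described above. `CustomResult` is a hypothetical stand-in for the custom return object (not part of PennyLane's API), and plain NumPy is used in place of autograd's NumPy wrapper so the example is self-contained:

```python
import numpy as np


class CustomResult:
    """Hypothetical custom return object, standing in for whatever a
    device returns (not part of PennyLane's API)."""

    def __init__(self, value):
        self.value = value


# The return object is a zero-dimensional object array with a single
# entry: the custom object itself. (PennyLane uses autograd's NumPy
# here, which this plain-NumPy array approximates.)
wrapped = np.array(CustomResult(0.5), dtype=object)

# .item() unpacks the zero-dimensional array back into the custom object.
result = wrapped.item()
assert isinstance(result, CustomResult)
assert result.value == 0.5
```

The array wrapper is what makes the return value differentiable through autograd; the `.item()` call is only needed when the caller wants the bare custom object back.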