added grad_fn kwarg to QNGOptimizer.step_and_cost and .step #1487
Conversation
Codecov Report
@@            Coverage Diff            @@
##            master    #1487    +/-   ##
==========================================
  Coverage    98.32%    98.32%
==========================================
  Files          180       180
  Lines        12705     12709     +4
==========================================
+ Hits         12492     12496     +4
  Misses         213       213

Continue to review the full report at Codecov.
Thanks for this @dwierichs! The code itself is looking 💯; my main comment is to split up the test into distinct tests (one for grouped input, one for separate input, one for a custom fn, etc.) 🙂
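For illustration, a hedged sketch of what such a split could look like; the test names, circuit, and parameter values below are hypothetical and not taken from this PR:

```python
# Hypothetical sketch only: test names, circuit, and values are illustrative.
import pytest
import pennylane as qml
from pennylane import numpy as pnp


def _circuit_and_params():
    """Build a small QNode and trainable parameters shared by the tests."""
    dev = qml.device("default.qubit", wires=1)

    @qml.qnode(dev)
    def circuit(params):
        qml.RX(params[0], wires=0)
        qml.RY(params[1], wires=0)
        return qml.expval(qml.PauliZ(0))

    return circuit, pnp.array([0.1, 0.2], requires_grad=True)


def test_step_and_cost_grouped_input():
    """Parameters passed as a single array."""
    circuit, params = _circuit_and_params()
    opt = qml.QNGOptimizer(stepsize=0.01)
    new_params, cost = opt.step_and_cost(circuit, params)
    assert cost == pytest.approx(circuit(params))


def test_step_and_cost_custom_grad_fn():
    """A user-supplied gradient function is forwarded to compute_grad."""
    circuit, params = _circuit_and_params()
    opt = qml.QNGOptimizer(stepsize=0.01)
    new_params, cost = opt.step_and_cost(circuit, params, grad_fn=qml.grad(circuit))
    assert cost == pytest.approx(circuit(params))
```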
recompute_tensor (bool): Whether or not the metric tensor should
    be recomputed. If not, the metric tensor from the previous
    optimization step is used.
metric_tensor_fn (function): Optional metric tensor function
    with respect to the variables ``x``.
    If ``None``, the metric tensor function is computed automatically.
**kwargs : variable length of keyword arguments for the qnode
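As a hedged illustration of these keyword arguments (the circuit, data value, and step size below are hypothetical, not from this PR), a QNode keyword argument might be passed through like this:

```python
# Hypothetical sketch; circuit and values are illustrative only.
import pennylane as qml
from pennylane import numpy as pnp

dev = qml.device("default.qubit", wires=1)


@qml.qnode(dev)
def circuit(params, data=0.0):
    qml.RX(data, wires=0)      # non-trainable input, supplied via **kwargs
    qml.RY(params[0], wires=0)
    return qml.expval(qml.PauliZ(0))


opt = qml.QNGOptimizer(stepsize=0.05)
params = pnp.array([0.2], requires_grad=True)

# `data=0.5` is forwarded to the QNode through **kwargs, and the metric
# tensor is rebuilt at every call since recompute_tensor defaults to True.
params = opt.step(circuit, params, data=0.5)
```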
Good catch. Was this needed because you were using keyword arguments in your notebook?
No, this is to make `QNGOptimizer` more similar to `GradientDescentOptimizer` :)
Co-authored-by: Josh Izaac <josh146@gmail.com>
💯 Changes look good @dwierichs, happy for this to be merged in!
The `step_and_cost` and `step` methods of `QNGOptimizer` currently do not accept the `grad_fn` argument like other optimizers. In particular, this prevents a straightforward use of JAX with QNG.
Changes
The kwarg `grad_fn` is accepted as an explicit kwarg in the two methods above and passed on to `compute_grad`.
Benefits
Unifying the optimizer interfaces and enabling the use of a custom gradient function with QNG.
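As a hedged sketch of the new keyword in use (circuit and values below are illustrative, not from this PR), a custom gradient function can now be supplied directly:

```python
# Hypothetical sketch; any callable returning the gradient of the cost can
# be passed as grad_fn, e.g. one built with JAX. Here qml.grad stands in.
import pennylane as qml
from pennylane import numpy as pnp

dev = qml.device("default.qubit", wires=2)


@qml.qnode(dev)
def circuit(params):
    qml.RY(params[0], wires=0)
    qml.RX(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))


params = pnp.array([0.4, -0.3], requires_grad=True)
opt = qml.QNGOptimizer(stepsize=0.05)

# The custom gradient function is forwarded to compute_grad internally.
params, cost = opt.step_and_cost(circuit, params, grad_fn=qml.grad(circuit))
```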
Possible drawbacks
None. (The order of the existing kwargs was not changed; `grad_fn` was added as the last kwarg, so even when the existing kwargs are used as positional args there is no breaking change.)