[BUG] Operator matrices should be defined using holomorphic functions #1819
Comments
@josh146 what do you mean when you say backprop simulators?
@puzzleshark I mean the simulators which natively support backpropagation 🙂 This includes the `default.qubit.(torch|tf|autograd|jax)` devices.

Note that when you create a QNode using

```python
dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev, diff_method="best")
def circuit(...):
    ...
```

then a backprop device will be automatically chosen. For more info, check out this tutorial: https://pennylane.ai/qml/demos/tutorial_backprop.html
@josh146 gotcha. Seems like all simulators use the same code, but just swap out the tensor operations, i.e. `tf.math.conj` vs `torch.conj`. Looking through, I don't see any particular issue with it w.r.t. complex differentiability (all instances of `conj` are applied to the state vector), but I'm not sure how I could test that it's all in working order? Maybe create some tests with simple circuits and compare with the same calculation via one of the auto-diff frameworks?
Yep! I think that is the way to go. Perhaps create a simple circuit, such as

```python
def circuit(x):
    qml.Hadamard(wires=0)
    qml.RZ(x, wires=0)
    return qml.state()
```

and at the same time analytically compute what the state looks like by applying the analytic RZ matrix to the state. You can then differentiate the output state in Torch/TF/Autograd/JAX, and double check it agrees with the analytic result :)
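A minimal sketch of that comparison, using plain NumPy with a centered finite difference standing in for the framework gradient (the helper names and the test point are my own, not from the issue):

```python
import numpy as np

def circuit_state(x):
    # State of the circuit Hadamard(0); RZ(x, 0) applied to |0>:
    # H|0> = (1/sqrt(2)) [1, 1] and RZ(x) = diag(exp(-ix/2), exp(ix/2))
    return np.array([np.exp(-1j * x / 2), np.exp(1j * x / 2)]) / np.sqrt(2)

def analytic_jacobian(x):
    # d/dx of the state above, computed by hand from the RZ matrix
    return np.array([-0.5j * np.exp(-1j * x / 2),
                      0.5j * np.exp(1j * x / 2)]) / np.sqrt(2)

x, h = 0.37, 1e-6
# Centered finite difference of the (complex) output state
numeric = (circuit_state(x + h) - circuit_state(x - h)) / (2 * h)
assert np.allclose(numeric, analytic_jacobian(x), atol=1e-8)
print("finite-difference and analytic state derivatives agree")
```

The same analytic derivative can then be compared against `torch.autograd`, `tf.GradientTape`, Autograd, or JAX applied to the simulator output.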
In #1749, the `Operator.matrix` method was updated to use differentiable logic when constructing the matrix of parametric operations, allowing them to be differentiable in all frameworks. This change worked well with existing workflows, that is, differentiating a real-valued variational quantum algorithm cost function.

However, when differentiating the operation matrix itself, we are dealing with a complex-valued cost function and, unlike in the case above, now need to ensure that all tensor operations are holomorphic.
Unfortunately, `qml.math.conj()` is a non-holomorphic function. Operations which use this function will output incorrect complex matrix gradients. Implementing the RZ gate without using `conj()`, we can compare to the expected gradient.

Note that this bug will only affect the gradients of complex-valued cost functions in PennyLane, but will arise wherever non-holomorphic functions are used. This includes:
- `qml.state()` on one of the backprop devices, `default.qubit.(torch|tf|autograd|jax)`.

To solve this, all operator matrices and backprop simulators should be rewritten to avoid non-holomorphic functions.
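To see concretely why `conj` is non-holomorphic, here is a small NumPy check (my own illustration, not code from the issue): the defining limit of the complex derivative must be independent of the direction from which `h` approaches zero, and for `conj` it is not.

```python
import numpy as np

z = 0.3 + 0.4j
h = 1e-8

# Difference quotient of f(z) = conj(z) along the real axis...
d_real = (np.conj(z + h) - np.conj(z)) / h              # ≈ +1
# ...and along the imaginary axis
d_imag = (np.conj(z + 1j * h) - np.conj(z)) / (1j * h)  # ≈ -1

# A holomorphic function would give the same limit in both directions;
# conj does not, so gradient rules that assume holomorphy go wrong.
print(d_real, d_imag)
assert not np.isclose(d_real, d_imag)
```

This is why rewriting the matrices with explicit `exp(±ix/2)`-style entries, rather than conjugating one entry to obtain the other, gives correct complex gradients.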