[Templates] Rewrite embeddings as operations (#1156)
* Remove old core

* more

* more tests

* more tests

* more tests

* fixed more tests

* more tests passing

* update jax test

* tests passing

* fix

* fix

* linting

* fix docs

* fix

* fix

* fix

* Update pennylane/tape/operation_recorder.py

Co-authored-by: Chase Roberts <chase@xanadu.ai>

* fix tests after changing observable underline

* Update tests/test_queuing.py

Co-authored-by: antalszava <antalszava@gmail.com>

* Update pennylane/circuit_graph.py

Co-authored-by: antalszava <antalszava@gmail.com>

* Update pennylane/tape/tape.py

* merge master

* Update pennylane/measure.py

Co-authored-by: Maria Schuld <mariaschuld@gmail.com>

* Update doc/code/qml_tape.rst

* Update doc/code/qml_tape.rst

* Update tests/interfaces/test_qnode_torch.py

Co-authored-by: Christina Lee <christina@xanadu.ai>

* Update tests/interfaces/test_qnode_torch.py

Co-authored-by: Christina Lee <christina@xanadu.ai>

* Update tests/interfaces/test_qnode_autograd.py

Co-authored-by: Christina Lee <christina@xanadu.ai>

* Update tests/interfaces/test_qnode_autograd.py

Co-authored-by: Christina Lee <christina@xanadu.ai>

* Update tests/interfaces/test_qnode_tf.py

Co-authored-by: Christina Lee <christina@xanadu.ai>

* Update tests/interfaces/test_tape_tf.py

Co-authored-by: Christina Lee <christina@xanadu.ai>

* Update tests/interfaces/test_qnode_tf.py

Co-authored-by: Christina Lee <christina@xanadu.ai>

* Update tests/interfaces/test_tape_autograd.py

Co-authored-by: Christina Lee <christina@xanadu.ai>

* Update tests/interfaces/test_tape_torch.py

Co-authored-by: Christina Lee <christina@xanadu.ai>

* Update pennylane/interfaces/torch.py

Co-authored-by: Christina Lee <christina@xanadu.ai>

* suggested changes

* Update pennylane/qnode.py

Co-authored-by: Theodor <theodor@xanadu.ai>

* suggested changes

* fix test

* rewrite amplitudeembedding

* add tests

* finish amplitude embedding

* finish angle embedding

* finish basis embedding

* finished basis embedding

* finished displacement

* finished squeezing

* finished all embeddings

* polish

* black

* removed merge conflict

* polish docstrings

* add interface tests beyond gradients

* typo

* improve angle emb tests

* improve tests further

* add list/tuple input test

* update docstring of basisembedding

* black

* fix bug

* Update .github/CHANGELOG.md

Co-authored-by: Tom Bromley <49409390+trbromley@users.noreply.github.com>

* typos

* Update tests/templates/test_embeddings/test_angle.py

Co-authored-by: Tom Bromley <49409390+trbromley@users.noreply.github.com>

* applied review suggestions

* black

Co-authored-by: Josh Izaac <josh146@gmail.com>
Co-authored-by: Chase Roberts <chase@xanadu.ai>
Co-authored-by: antalszava <antalszava@gmail.com>
Co-authored-by: Christina Lee <christina@xanadu.ai>
Co-authored-by: Theodor <theodor@xanadu.ai>
Co-authored-by: Tom Bromley <49409390+trbromley@users.noreply.github.com>
7 people committed Apr 6, 2021
1 parent 2972913 commit fd99be7
Showing 16 changed files with 2,062 additions and 1,214 deletions.
8 changes: 5 additions & 3 deletions .github/CHANGELOG.md
@@ -537,11 +537,13 @@
   1: ──RY(1.35)──╰X──RY(0.422)──╰X──┤
   ```
 
-- The `QAOAEmbedding` and `BasicEntanglerLayers` are now classes inheriting
+- The embedding templates, as well as `BasicEntanglerLayers`, are now classes inheriting
   from `Operation`, and define the ansatz in their `expand()` method. This
   change does not affect the user interface.
   [(#1138)](https://github.com/PennyLaneAI/pennylane/pull/1138)
+  [(#1156)](https://github.com/PennyLaneAI/pennylane/pull/1156)
 
-  For convenience, the class has a method that returns the shape of the
+  For convenience, `BasicEntanglerLayers` has a method that returns the shape of the
   trainable parameter tensor, i.e.,
 
   ```python
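To illustrate the class-based behaviour this changelog entry describes, here is a minimal sketch (not part of the diff; the `qml.templates.BasicEntanglerLayers` path, its `shape()` static method, and the printed output are assumptions based on the API referenced above). A template instance is now an `Operation` whose decomposition is returned by `expand()`, while the call inside a QNode is unchanged.

```python
import pennylane as qml
from pennylane import numpy as np

# Query the expected shape of the trainable parameter tensor from the class ...
shape = qml.templates.BasicEntanglerLayers.shape(n_layers=2, n_wires=3)
weights = np.random.random(size=shape)

# ... and inspect the decomposition: the instance is an Operation, and
# expand() returns a tape containing the constituent gates.
op = qml.templates.BasicEntanglerLayers(weights, wires=range(3))
print(op.expand().operations)

# Usage inside a QNode is unchanged by the refactor.
dev = qml.device("default.qubit", wires=3)

@qml.qnode(dev)
def circuit(weights):
    qml.templates.BasicEntanglerLayers(weights, wires=range(3))
    return qml.expval(qml.PauliZ(0))

print(circuit(weights))
```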
174 changes: 82 additions & 92 deletions pennylane/templates/embeddings/amplitude.py
@@ -12,82 +12,22 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 r"""
-Contains the ``AmplitudeEmbedding`` template.
+Contains the AmplitudeEmbedding template.
 """
 # pylint: disable-msg=too-many-branches,too-many-arguments,protected-access
 import warnings
 import numpy as np
 
 import pennylane as qml
-from pennylane.templates.decorator import template
+from pennylane.operation import Operation, AnyWires
 from pennylane.ops import QubitStateVector
 from pennylane.wires import Wires
 
 # tolerance for normalization
 TOLERANCE = 1e-10
 
 
-def _preprocess(features, wires, pad_with, normalize):
-    """Validate and pre-process inputs as follows:
-
-    * Check that the features tensor is one-dimensional.
-    * If pad_with is None, check that the first dimension of the features tensor
-      has length :math:`2^n` where :math:`n` is the number of qubits. Else check that the
-      first dimension of the features tensor is not larger than :math:`2^n` and pad features with value if necessary.
-    * If normalize is false, check that first dimension of features is normalised to one. Else, normalise the
-      features tensor.
-
-    Args:
-        features (tensor_like): input features to pre-process
-        wires (Wires): wires that template acts on
-        pad_with (float): constant used to pad the features tensor to required dimension
-        normalize (bool): whether or not to normalize the features vector
-
-    Returns:
-        tensor: pre-processed features
-    """
-
-    shape = qml.math.shape(features)
-
-    # check shape
-    if len(shape) != 1:
-        raise ValueError(f"Features must be a one-dimensional tensor; got shape {shape}.")
-
-    n_features = shape[0]
-    if pad_with is None and n_features != 2 ** len(wires):
-        raise ValueError(
-            f"Features must be of length {2 ** len(wires)}; got length {n_features}. "
-            f"Use the 'pad' argument for automated padding."
-        )
-
-    if pad_with is not None and n_features > 2 ** len(wires):
-        raise ValueError(
-            f"Features must be of length {2 ** len(wires)} or "
-            f"smaller to be padded; got length {n_features}."
-        )
-
-    # pad
-    if pad_with is not None and n_features < 2 ** len(wires):
-        padding = [pad_with] * (2 ** len(wires) - n_features)
-        features = qml.math.concatenate([features, padding], axis=0)
-
-    # normalize
-    norm = qml.math.sum(qml.math.abs(features) ** 2)
-
-    if not qml.math.allclose(norm, 1.0, atol=TOLERANCE):
-        if normalize or pad_with:
-            features = features / np.sqrt(norm)
-        else:
-            raise ValueError(
-                f"Features must be a vector of length 1.0; got length {norm}."
-                "Use 'normalize=True' to automatically normalize."
-            )
-
-    return features
-
-
-@template
-def AmplitudeEmbedding(features, wires, pad_with=None, normalize=False, pad=None):
+class AmplitudeEmbedding(Operation):
     r"""Encodes :math:`2^n` features into the amplitude vector of :math:`n` qubits.
 
     By setting ``pad_with`` to a real or complex number, ``features`` is automatically padded to dimension
@@ -108,9 +48,8 @@ def AmplitudeEmbedding(features, wires, pad_with=None, normalize=False, pad=None
         gradients with respect to the features cannot be computed by PennyLane.
 
     Args:
-        features (tensor-like): input vector of length ``2^n``, or less if `pad_with` is specified
-        wires (Iterable or :class:`.wires.Wires`): Wires that the template acts on.
-            Accepts an iterable of numbers or strings, or a Wires object.
+        features (tensor_like): input tensor of dimension ``(2^n,)``, or less if `pad_with` is specified
+        wires (Iterable): wires that the template acts on
         pad_with (float or complex): if not None, the input is padded with this constant to size :math:`2^n`
         normalize (bool): whether to automatically normalize the features
         pad (float or complex): same as `pad`, to be deprecated
@@ -142,21 +81,7 @@ def circuit(f=None):
 
     **Differentiating with respect to the features**
 
     Due to non-trivial classical processing to construct the state preparation circuit,
-    the features argument is **not always differentiable**.
-
-    .. code-block:: python
-
-        from pennylane import numpy as np
-
-        @qml.qnode(dev)
-        def circuit(f):
-            AmplitudeEmbedding(features=f, wires=range(2))
-            return qml.expval(qml.PauliZ(0))
-
-    >>> g = qml.grad(circuit, argnum=0)
-    >>> f = np.array([1, 1, 1, 1], requires_grad=True)
-    >>> g(f)
-    ValueError: Cannot differentiate wrt parameter(s) {0, 1, 2, 3}.
+    the features argument is in general **not differentiable**.
 
     **Normalization**
@@ -216,17 +141,82 @@ def circuit(f=None):
     """
 
-    wires = Wires(wires)
-
-    # pad is replaced with the more verbose pad_with
-    if pad is not None:
-        warnings.warn(
-            "The pad argument will be replaced by the pad_with option in future versions of PennyLane.",
-            PendingDeprecationWarning,
-        )
-        if pad_with is None:
-            pad_with = pad
-
-    features = _preprocess(features, wires, pad_with, normalize)
-
-    QubitStateVector(features, wires=wires)
+    num_params = 1
+    num_wires = AnyWires
+    par_domain = "A"
+
+    def __init__(self, features, wires, pad_with=None, normalize=False, pad=None, do_queue=True):
+
+        # pad is replaced with the more verbose pad_with
+        if pad is not None:
+            warnings.warn(
+                "The pad argument will be replaced by the pad_with option in future versions of PennyLane.",
+                PendingDeprecationWarning,
+            )
+            if pad_with is None:
+                pad_with = pad
+
+        wires = Wires(wires)
+        self.pad_with = pad_with
+        self.normalize = normalize
+
+        features = self._preprocess(features, wires, pad_with, normalize)
+        super().__init__(features, wires=wires, do_queue=do_queue)
+
+    def expand(self):
+
+        with qml.tape.QuantumTape() as tape:
+            QubitStateVector(self.parameters[0], wires=self.wires)
+
+        return tape
+
+    @staticmethod
+    def _preprocess(features, wires, pad_with, normalize):
+        """Validate and pre-process inputs as follows:
+
+        * Check that the features tensor is one-dimensional.
+        * If pad_with is None, check that the first dimension of the features tensor
+          has length :math:`2^n` where :math:`n` is the number of qubits. Else check that the
+          first dimension of the features tensor is not larger than :math:`2^n` and pad features with value if necessary.
+        * If normalize is false, check that first dimension of features is normalised to one. Else, normalise the
+          features tensor.
+        """
+
+        shape = qml.math.shape(features)
+
+        # check shape
+        if len(shape) != 1:
+            raise ValueError(f"Features must be a one-dimensional tensor; got shape {shape}.")
+
+        n_features = shape[0]
+        if pad_with is None and n_features != 2 ** len(wires):
+            raise ValueError(
+                f"Features must be of length {2 ** len(wires)}; got length {n_features}. "
+                f"Use the 'pad' argument for automated padding."
+            )
+
+        if pad_with is not None and n_features > 2 ** len(wires):
+            raise ValueError(
+                f"Features must be of length {2 ** len(wires)} or "
+                f"smaller to be padded; got length {n_features}."
+            )
+
+        # pad
+        if pad_with is not None and n_features < 2 ** len(wires):
+            padding = [pad_with] * (2 ** len(wires) - n_features)
+            features = qml.math.concatenate([features, padding], axis=0)
+
+        # normalize
+        norm = qml.math.sum(qml.math.abs(features) ** 2)
+
+        if not qml.math.allclose(norm, 1.0, atol=TOLERANCE):
+            if normalize or pad_with:
+                features = features / np.sqrt(norm)
+            else:
+                raise ValueError(
+                    f"Features must be a vector of length 1.0; got length {norm}."
+                    "Use 'normalize=True' to automatically normalize."
+                )
+
+        features = qml.math.cast(features, np.complex128)
+        return features
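For reference, a minimal usage sketch of the rewritten `AmplitudeEmbedding` (not part of the diff; the device and feature values are illustrative, and the `qml.templates` path is assumed). `pad_with` fills the feature vector up to length 2^n and `normalize=True` rescales it to unit norm before `QubitStateVector` is queued by `expand()`.

```python
import pennylane as qml

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def circuit(f=None):
    # Three features are padded with 0.0 to length 2**2 = 4 and normalized
    # before being encoded as amplitudes.
    qml.templates.AmplitudeEmbedding(features=f, wires=range(2), pad_with=0.0, normalize=True)
    return qml.expval(qml.PauliZ(0))

print(circuit(f=[15, 15, 15]))
```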
79 changes: 34 additions & 45 deletions pennylane/templates/embeddings/angle.py
@@ -16,40 +16,14 @@
 """
 # pylint: disable-msg=too-many-branches,too-many-arguments,protected-access
 import pennylane as qml
-from pennylane.templates.decorator import template
-from pennylane.templates import broadcast
-from pennylane.wires import Wires
-
-
-def _preprocess(features, wires):
-    """Validate and pre-process inputs as follows:
-
-    * Check that the features tensor is one-dimensional.
-    * Check that the first dimension of the features tensor
-      has length :math:`n` or less, where :math:`n` is the number of qubits.
-
-    Args:
-        features (tensor_like): input features to pre-process
-        wires (Wires): wires that template acts on
-
-    Returns:
-        int: number of features
-    """
-    shape = qml.math.shape(features)
-
-    if len(shape) != 1:
-        raise ValueError(f"Features must be a one-dimensional tensor; got shape {shape}.")
-
-    n_features = shape[0]
-    if n_features > len(wires):
-        raise ValueError(
-            f"Features must be of length {len(wires)} or less; got length {n_features}."
-        )
-    return n_features
-
-
-@template
-def AngleEmbedding(features, wires, rotation="X"):
+from pennylane.ops import RX, RY, RZ
+from pennylane.operation import Operation, AnyWires
+
+ROT = {"X": RX, "Y": RY, "Z": RZ}
+
+
+class AngleEmbedding(Operation):
     r"""
     Encodes :math:`N` features into the rotation angles of :math:`n` qubits, where :math:`N \leq n`.
@@ -66,26 +40,41 @@ def AngleEmbedding(features, wires, rotation="X"):
     ``features`` than rotations, the circuit does not apply the remaining rotation gates.
 
     Args:
-        features (array): input array of shape ``(N,)``, where N is the number of input features to embed,
+        features (tensor_like): input tensor of shape ``(N,)``, where N is the number of input features to embed,
             with :math:`N\leq n`
-        wires (Iterable or Wires): Wires that the template acts on. Accepts an iterable of numbers or strings, or
-            a Wires object.
+        wires (Iterable): wires that the template acts on
         rotation (str): type of rotations used
 
     """
 
-    wires = Wires(wires)
-    n_features = _preprocess(features, wires)
-    wires = wires[:n_features]
-
-    if rotation == "X":
-        broadcast(unitary=qml.RX, pattern="single", wires=wires, parameters=features)
-
-    elif rotation == "Y":
-        broadcast(unitary=qml.RY, pattern="single", wires=wires, parameters=features)
-
-    elif rotation == "Z":
-        broadcast(unitary=qml.RZ, pattern="single", wires=wires, parameters=features)
-
-    else:
-        raise ValueError(f"Rotation option {rotation} not recognized.")
+    num_params = 1
+    num_wires = AnyWires
+    par_domain = "A"
+
+    def __init__(self, features, wires, rotation="X", do_queue=True):
+
+        if rotation not in ROT:
+            raise ValueError(f"Rotation option {rotation} not recognized.")
+        self.rotation = ROT[rotation]
+
+        shape = qml.math.shape(features)
+        if len(shape) != 1:
+            raise ValueError(f"Features must be a one-dimensional tensor; got shape {shape}.")
+        n_features = shape[0]
+        if n_features > len(wires):
+            raise ValueError(
+                f"Features must be of length {len(wires)} or less; got length {n_features}."
+            )
+
+        wires = wires[:n_features]
+        super().__init__(features, wires=wires, do_queue=do_queue)
+
+    def expand(self):
+
+        features = self.parameters[0]
+
+        with qml.tape.QuantumTape() as tape:
+
+            for i in range(len(self.wires)):
+                self.rotation(features[i], wires=self.wires[i])
+
+        return tape
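And a corresponding usage sketch for the rewritten `AngleEmbedding` (again illustrative and not part of the diff; the `qml.templates` path is assumed). The `rotation` argument picks the gate from the `ROT` map, and when fewer features than wires are passed, only the leading wires receive a rotation.

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=3)

@qml.qnode(dev)
def circuit(f):
    # rotation="Y" selects RY gates; with two features on three wires,
    # only wires 0 and 1 receive a rotation.
    qml.templates.AngleEmbedding(features=f, wires=range(3), rotation="Y")
    return [qml.expval(qml.PauliZ(w)) for w in range(3)]

print(circuit(np.array([0.5, 1.0])))
```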