[Templates] Rewrite layer templates as operations #1163

Merged: 26 commits on Apr 9, 2021
Changes from 16 commits

Commits
8a919b6
converted CVQNN
mariaschuld Mar 24, 2021
6809572
backup
mariaschuld Mar 24, 2021
f291126
rewrite particle_conserving
mariaschuld Mar 25, 2021
0acca6a
add tests for simplified and stronglyent
mariaschuld Mar 25, 2021
4d62e4f
add particle1 tests
mariaschuld Mar 25, 2021
21827db
finished all tests - 2 external bugs remaining
mariaschuld Mar 25, 2021
1431e35
update changelog
mariaschuld Mar 25, 2021
892256c
black
mariaschuld Mar 25, 2021
558d008
resolve conflict in changelog
mariaschuld Mar 25, 2021
2b1e639
delete old tests that are now failing
mariaschuld Mar 25, 2021
73252db
Merge branch 'master' into rewrite-layers
mariaschuld Mar 29, 2021
d983b80
Merge branch 'master' into rewrite-layers
mariaschuld Apr 6, 2021
28c3f7e
implement some review comments
mariaschuld Apr 6, 2021
9f47a4d
Merge branch 'rewrite-layers' of github.com:PennyLaneAI/pennylane int…
mariaschuld Apr 6, 2021
d3e748b
reverse changes to interferometer
mariaschuld Apr 6, 2021
292f8fe
improve codecov
mariaschuld Apr 6, 2021
2e5aaf0
Update tests/templates/test_layers/test_basic_entangler.py
mariaschuld Apr 8, 2021
7ae4d57
Update tests/templates/test_layers/test_basic_entangler.py
mariaschuld Apr 8, 2021
22358f1
implement review comments
mariaschuld Apr 8, 2021
4fcf7c0
Merge branch 'master' into rewrite-layers
mariaschuld Apr 8, 2021
481d0b8
solved merge conflict
mariaschuld Apr 8, 2021
71b6679
more merge conflicts
mariaschuld Apr 8, 2021
1df440c
fix changelog
mariaschuld Apr 8, 2021
23caff8
delete old integration tests
mariaschuld Apr 8, 2021
7d19cc1
Merge branch 'master' into rewrite-layers
mariaschuld Apr 8, 2021
ace1680
Merge branch 'master' into rewrite-layers
mariaschuld Apr 9, 2021
7 changes: 4 additions & 3 deletions .github/CHANGELOG.md
@@ -537,11 +537,12 @@
1: ──RY(1.35)──╰X──RY(0.422)──╰X──┤
```

- The `QAOAEmbedding` and `BasicEntanglerLayers` are now classes inheriting
- The embedding and layer templates are now classes inheriting
from `Operation`, and define the ansatz in their `expand()` method. This
change does not affect the user interface.

For convenience, the class has a method that returns the shape of the
[(#1163)](https://github.com/PennyLaneAI/pennylane/pull/1163)

For convenience, some templates now provide a method that returns the shape of the
trainable parameter tensor, i.e.,

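The code block that follows this sentence in the changelog is collapsed in the diff above. As a rough sketch of the pattern it describes, assuming `BasicEntanglerLayers.shape(n_layers, n_wires)` is one of the new shape methods and returns the weight shape `(n_layers, n_wires)`:

```python
import numpy as np
import pennylane as qml

# Query the expected weight shape from the template class (assumed API) and
# use it to create randomly initialised weights.
n_layers, n_wires = 2, 3
shape = qml.templates.BasicEntanglerLayers.shape(n_layers=n_layers, n_wires=n_wires)
weights = np.random.random(size=shape)

dev = qml.device("default.qubit", wires=n_wires)

@qml.qnode(dev)
def circuit(weights):
    # The template is called exactly as before; only its implementation changed.
    qml.templates.BasicEntanglerLayers(weights, wires=range(n_wires))
    return qml.expval(qml.PauliZ(0))

print(circuit(weights))
```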
6 changes: 2 additions & 4 deletions pennylane/templates/layers/basic_entangler.py
@@ -12,12 +12,11 @@
# See the License for the specific language governing permissions and
# limitations under the License.
r"""
Contains the ``BasicEntanglerLayers`` template.
Contains the BasicEntanglerLayers template.
"""
# pylint: disable=consider-using-enumerate
import pennylane as qml
from pennylane.operation import Operation, AnyWires
from pennylane.wires import Wires


class BasicEntanglerLayers(Operation):
@@ -50,8 +49,7 @@ class BasicEntanglerLayers(Operation):
Args:
weights (tensor_like): Weight tensor of shape ``(L, len(wires))``. Each weight is used as a parameter
for the rotation.
wires (Iterable or Wires): Wires that the template acts on. Accepts an iterable of numbers or strings, or
a Wires object.
wires (Iterable): wires that the template acts on
rotation (pennylane.ops.Operation): one-parameter single-qubit gate to use,
if ``None``, :class:`~pennylane.ops.RX` is used as default
Raises:
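A short usage sketch for the arguments documented above, assuming the behaviour described there (a weight tensor of shape `(L, len(wires))`, and `rotation` replacing the default `RX` gate):

```python
import numpy as np
import pennylane as qml

dev = qml.device("default.qubit", wires=3)

@qml.qnode(dev)
def circuit(weights):
    # weights has shape (L, len(wires)) = (2, 3); RY is used instead of the default RX
    qml.templates.BasicEntanglerLayers(weights, wires=range(3), rotation=qml.RY)
    return [qml.expval(qml.PauliZ(w)) for w in range(3)]

weights = np.random.random((2, 3))
print(circuit(weights))
```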
213 changes: 129 additions & 84 deletions pennylane/templates/layers/cv_neural_net.py
Original file line number Diff line number Diff line change
@@ -12,79 +12,14 @@
# See the License for the specific language governing permissions and
# limitations under the License.
r"""
Contains the ``CVNeuralNetLayers`` template.
Contains the CVNeuralNetLayers template.
"""
# pylint: disable-msg=too-many-branches,too-many-arguments,protected-access
import pennylane as qml
from pennylane.templates.decorator import template
from pennylane.ops import Squeezing, Displacement, Kerr
from pennylane.templates.subroutines import Interferometer
from pennylane.templates import broadcast
from pennylane.wires import Wires
from pennylane.operation import Operation, AnyWires


def _preprocess(theta_1, phi_1, varphi_1, r, phi_r, theta_2, phi_2, varphi_2, a, phi_a, k, wires):
"""Validate and pre-process inputs as follows:

* Check that the first dimensions of all weight tensors match
* Check that the other dimensions of all weight tensors are correct for the number of qubits.

Args:
theta_1 (tensor_like): shape :math:`(L, K)` tensor of transmittivity angles for first interferometer
phi_1 (tensor_like): shape :math:`(L, K)` tensor of phase angles for first interferometer
varphi_1 (tensor_like): shape :math:`(L, M)` tensor of rotation angles to apply after first interferometer
r (tensor_like): shape :math:`(L, M)` tensor of squeezing amounts for :class:`~pennylane.ops.Squeezing` operations
phi_r (tensor_like): shape :math:`(L, M)` tensor of squeezing angles for :class:`~pennylane.ops.Squeezing` operations
theta_2 (tensor_like): shape :math:`(L, K)` tensor of transmittivity angles for second interferometer
phi_2 (tensor_like): shape :math:`(L, K)` tensor of phase angles for second interferometer
varphi_2 (tensor_like): shape :math:`(L, M)` tensor of rotation angles to apply after second interferometer
a (tensor_like): shape :math:`(L, M)` tensor of displacement magnitudes for :class:`~pennylane.ops.Displacement` operations
phi_a (tensor_like): shape :math:`(L, M)` tensor of displacement angles for :class:`~pennylane.ops.Displacement` operations
k (tensor_like): shape :math:`(L, M)` tensor of kerr parameters for :class:`~pennylane.ops.Kerr` operations
wires (Wires): wires that template acts on

Returns:
int: number of times that the ansatz is repeated
"""

n_wires = len(wires)
n_if = n_wires * (n_wires - 1) // 2

# check that first dimension is the same
weights_list = [theta_1, phi_1, varphi_1, r, phi_r, theta_2, phi_2, varphi_2, a, phi_a, k]
shapes = [qml.math.shape(w) for w in weights_list]

first_dims = [s[0] for s in shapes]
if len(set(first_dims)) > 1:
raise ValueError(
f"The first dimension of all parameters needs to be the same, got {first_dims}"
)
repeat = shapes[0][0]

second_dims = [s[1] for s in shapes]
expected = [
n_if,
n_if,
n_wires,
n_wires,
n_wires,
n_if,
n_if,
n_wires,
n_wires,
n_wires,
n_wires,
]
if not all(e == d for e, d in zip(expected, second_dims)):
raise ValueError("Got unexpected shape for one or more parameters.")

return repeat
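For concreteness, a minimal sketch of the shapes these checks expect, following the shape conventions in the docstring above, for `M = 3` modes and `L = 2` layers:

```python
# Illustration only: expected parameter shapes for L layers on M modes.
# K = M * (M - 1) // 2 is the number of interferometer parameters per layer.
L, M = 2, 3
K = M * (M - 1) // 2  # = 3

expected_shapes = (
    [(L, K)] * 2    # theta_1, phi_1
    + [(L, M)] * 3  # varphi_1, r, phi_r
    + [(L, K)] * 2  # theta_2, phi_2
    + [(L, M)] * 4  # varphi_2, a, phi_a, k
)
print(expected_shapes)
```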


@template
def CVNeuralNetLayers(
theta_1, phi_1, varphi_1, r, phi_r, theta_2, phi_2, varphi_2, a, phi_a, k, wires
):
class CVNeuralNetLayers(Operation):
r"""A sequence of layers of a continuous-variable quantum neural network,
as specified in `arXiv:1806.06871 <https://arxiv.org/abs/1806.06871>`_.

@@ -124,27 +59,137 @@ def CVNeuralNetLayers(
a (tensor_like): shape :math:`(L, M)` tensor of displacement magnitudes for :class:`~pennylane.ops.Displacement` operations
phi_a (tensor_like): shape :math:`(L, M)` tensor of displacement angles for :class:`~pennylane.ops.Displacement` operations
k (tensor_like): shape :math:`(L, M)` tensor of kerr parameters for :class:`~pennylane.ops.Kerr` operations
wires (Iterable or Wires): Wires that the template acts on. Accepts an iterable of numbers or strings, or
a Wires object.
Raises:
ValueError: if inputs do not have the correct format
wires (Iterable): wires that the template acts on

.. UsageDetails::

**Parameter shapes**

A list of shapes for the 11 input parameter tensors can be computed by the static method
:meth:`~.CVNeuralNetLayers.shape` and used when creating randomly
initialised weights:

.. code-block:: python

shapes = CVNeuralNetLayers.shape(n_layers=2, n_wires=2)
weights = [np.random.random(shape) for shape in shapes]

def circuit():
CVNeuralNetLayers(*weights, wires=[0, 1])
return qml.expval(qml.X(0))

"""

-    wires = Wires(wires)
-    repeat = _preprocess(
-        theta_1, phi_1, varphi_1, r, phi_r, theta_2, phi_2, varphi_2, a, phi_a, k, wires
-    )
num_params = 11
num_wires = AnyWires
par_domain = "A"

def __init__(
self,
theta_1,
phi_1,
varphi_1,
r,
phi_r,
theta_2,
phi_2,
varphi_2,
a,
phi_a,
k,
wires,
do_queue=True,
):

n_wires = len(wires)
n_if = n_wires * (n_wires - 1) // 2

# check that first dimension is the same
weights_list = [theta_1, phi_1, varphi_1, r, phi_r, theta_2, phi_2, varphi_2, a, phi_a, k]
shapes = [qml.math.shape(w) for w in weights_list]
first_dims = [s[0] for s in shapes]
if len(set(first_dims)) > 1:
raise ValueError(
f"The first dimension of all parameters needs to be the same, got {first_dims}"
)

# check second dimensions
second_dims = [s[1] for s in shapes]
expected = [n_if] * 2 + [n_wires] * 3 + [n_if] * 2 + [n_wires] * 4
if not all(e == d for e, d in zip(expected, second_dims)):
raise ValueError("Got unexpected shape for one or more parameters.")

self.n_layers = shapes[0][0]

super().__init__(
theta_1,
phi_1,
varphi_1,
r,
phi_r,
theta_2,
phi_2,
varphi_2,
a,
phi_a,
k,
wires=wires,
do_queue=do_queue,
)
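A minimal sketch of how this validation behaves when the first dimensions disagree (a hypothetical misuse, relying on the `shape` helper defined further down):

```python
import numpy as np
import pennylane as qml

# Hypothetical misuse: give theta_1 two layers while every other tensor has one,
# so the first-dimension check above should raise a ValueError.
n_wires = 2
n_if = n_wires * (n_wires - 1) // 2  # = 1
shapes = qml.templates.CVNeuralNetLayers.shape(n_layers=1, n_wires=n_wires)
weights = [np.random.random(s) for s in shapes]
weights[0] = np.random.random((2, n_if))  # break the first dimension of theta_1

try:
    qml.templates.CVNeuralNetLayers(*weights, wires=range(n_wires))
except ValueError as err:
    print(err)  # "The first dimension of all parameters needs to be the same, got [2, 1, ...]"
```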

def expand(self):

with qml.tape.QuantumTape() as tape:

for l in range(self.n_layers):

qml.templates.Interferometer(
theta=self.parameters[0][l],
phi=self.parameters[1][l],
varphi=self.parameters[2][l],
wires=self.wires,
)

for i in range(len(self.wires)):
qml.Squeezing(
self.parameters[3][l, i], self.parameters[4][l, i], wires=self.wires[i]
)

-    for l in range(repeat):
qml.templates.Interferometer(
theta=self.parameters[5][l],
phi=self.parameters[6][l],
varphi=self.parameters[7][l],
wires=self.wires,
)

-        Interferometer(theta=theta_1[l], phi=phi_1[l], varphi=varphi_1[l], wires=wires)
for i in range(len(self.wires)):
qml.Displacement(
self.parameters[8][l, i], self.parameters[9][l, i], wires=self.wires[i]
)

-        r_and_phi_r = qml.math.stack([r[l], phi_r[l]], axis=1)
-        broadcast(unitary=Squeezing, pattern="single", wires=wires, parameters=r_and_phi_r)
for i in range(len(self.wires)):
qml.Kerr(self.parameters[10][l, i], wires=self.wires[i])

-        Interferometer(theta=theta_2[l], phi=phi_2[l], varphi=varphi_2[l], wires=wires)
return tape

-        a_and_phi_a = qml.math.stack([a[l], phi_a[l]], axis=1)
-        broadcast(unitary=Displacement, pattern="single", wires=wires, parameters=a_and_phi_a)
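As a rough sketch of the new operation view (relying on the `shape` helper defined just below), the template can be instantiated like any other operation, and `expand()` returns the tape it queues:

```python
import numpy as np
import pennylane as qml

# Illustration only: build the operation from randomly initialised weights and
# inspect the expansion defined by expand() above.
shapes = qml.templates.CVNeuralNetLayers.shape(n_layers=1, n_wires=2)
weights = [np.random.random(s) for s in shapes]

op = qml.templates.CVNeuralNetLayers(*weights, wires=[0, 1])
tape = op.expand()

# The tape contains the interferometers, squeezers, displacements and Kerr gates
# queued for a single layer.
print(len(tape.operations))
```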
@staticmethod
def shape(n_layers, n_wires):
r"""Returns a list of shapes for the 11 parameter tensors.

Args:
n_layers (int): number of layers
n_wires (int): number of wires

Returns:
list[tuple[int]]: list of shapes
"""
n_if = n_wires * (n_wires - 1) // 2

shapes = (
[(n_layers, n_if)] * 2
+ [(n_layers, n_wires)] * 3
+ [(n_layers, n_if)] * 2
+ [(n_layers, n_wires)] * 4
)

-        broadcast(unitary=Kerr, pattern="single", wires=wires, parameters=k[l])
return shapes
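For reference, a small sketch of what `shape` returns for two modes, following the formula above (with `n_wires = 2` there is a single interferometer parameter per layer, `n_if = 1`):

```python
import pennylane as qml

shapes = qml.templates.CVNeuralNetLayers.shape(n_layers=2, n_wires=2)
print(shapes)
# [(2, 1), (2, 1), (2, 2), (2, 2), (2, 2), (2, 1), (2, 1), (2, 2), (2, 2), (2, 2), (2, 2)]
```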