Implement Effective Dimension Algorithm #318

Closed
2 of 7 tasks
ElePT opened this issue Feb 16, 2022 · 0 comments
Labels
type: feature request 💡 New feature or request


What should we add?

The goal of this issue is to document and keep track of the project of creating a Qiskit Runtime program for QNN effective dimension calculations.

What do we have?

The explanation of the effective dimension algorithm and its relevance can be found in the following research paper, and the code to reproduce the original experiments has been published in this repo. The original implementation is clear and well documented, but it is outdated in several respects and is missing certain features needed to become a runtime program. This implementation is based on two main classes:

  1. QuantumNeuralNetwork class - Computes batched forward and backward passes and returns the Fisher information matrix. It uses the Python multiprocessing library to speed up the Monte Carlo estimates needed to calculate the Fisher information matrix. Main methods:
  • forward()
  • get_gradients()
  • get_fisher()
  2. EffectiveDimension class - Receives a model, a number of parameter sets and a number of inputs, and computes the normalized Fisher information (f_hat) as well as the effective dimension for a specified number of data samples (n). The distributions the inputs and parameters are drawn from are fixed (e.g. a standard normal distribution for the data); only the number of inputs/parameters can be changed. Main methods (a rough sketch of these quantities follows the list):
  • get_fhat()
  • eff_dim()
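
For orientation, below is a minimal numpy sketch of the two quantities these methods compute, based on my reading of the referenced paper; the function names are illustrative, and the normalization and the constant gamma*n/(2*pi*log n) should be double-checked against the original code.

```python
import numpy as np
from scipy.special import logsumexp


def normalized_fisher(fishers):
    """fishers: array of shape (k, d, d), one Fisher matrix per sampled parameter set.
    Returns f_hat, normalized so that the average trace equals d."""
    d = fishers.shape[-1]
    avg_trace = np.trace(fishers, axis1=1, axis2=2).mean()
    return d * fishers / avg_trace


def effective_dimension(f_hat, n, gamma=1.0):
    """Effective dimension for n data samples, as a Monte Carlo average over the k parameter sets."""
    k, d, _ = f_hat.shape
    kappa = gamma * n / (2 * np.pi * np.log(n))
    # log sqrt(det(I + kappa * f_hat)) for each parameter set, via slogdet for numerical stability
    _, logdet = np.linalg.slogdet(np.eye(d) + kappa * f_hat)
    log_avg = logsumexp(0.5 * logdet) - np.log(k)  # log of the Monte Carlo average
    return 2 * log_avg / np.log(kappa)
```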

This implementation only works with statevector simulation, not with real backends.

What do we want?

We ultimately want to be able to use a runtime program to compute the effective dimension of any given QNN, input set and parameter set (not necessarily drawn from a standard normal distribution). In other words, we want a runtime program that adapts the pre-existing effective dimension code and adds the following (a small sketch of the input/parameter handling follows the list):

  • Customizable QNNs (from the qiskit-machine-learning API)
  • Customizable input data (not only # inputs)
  • Customizable parameters (not only # parameters)
  • Access to backends (not only statevector simulator)
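
Purely as an illustration of the "customizable inputs/parameters" bullets above, the class could accept either a sample count (keeping the current behaviour of drawing from a fixed distribution) or an explicit array of user-provided samples. The helper below is a hypothetical sketch, not existing API:

```python
import numpy as np


def _as_samples(value, num_features, rng=None):
    """If `value` is an integer, draw that many samples from a standard normal
    distribution (the currently fixed behaviour); otherwise treat it as an
    explicit array of user-provided samples."""
    rng = rng or np.random.default_rng()
    if np.isscalar(value):
        return rng.standard_normal((int(value), num_features))
    return np.asarray(value)
```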

Proposed implementation

With this proposed implementation, I try to avoid redundancy and to take advantage of pre-existing qiskit-machine-learning code.

  1. The QuantumNeuralNetwork class used in that project mostly overlaps with qiskit-machine-learning's CircuitQNN class with dense output, except for the get_fisher() method. My proposal is to move the Fisher information matrix calculation into the EffectiveDimension class, so that it encapsulates all the necessary computations and the custom QNN class is no longer needed. This would immediately give us access to backends, since the custom QNN class only works with statevector simulation (see the sketch after this list).
  2. Once this first step is completed, I propose to add the customizable input data and parameters feature to the EffectiveDimension class. This class can be introduced as a standalone contribution to qiskit-machine-learning.
  3. Finally, I propose to write a runtime program that uses this EffectiveDimension class.
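
To make step 1 more concrete, here is a hedged sketch of how the Fisher information matrices could be estimated inside EffectiveDimension using only the forward()/backward() interface of qiskit-machine-learning's NeuralNetwork base class. The shapes assume a CircuitQNN with dense probability output, and the function name is hypothetical:

```python
import numpy as np


def monte_carlo_fishers(qnn, inputs, param_sets):
    """Estimate one Fisher information matrix per parameter set.

    Assumes qnn.forward(inputs, params) returns probabilities of shape
    (num_inputs, num_outputs) and qnn.backward(inputs, params) returns
    (input_grads, weight_grads) with weight_grads of shape
    (num_inputs, num_outputs, num_weights), as a dense CircuitQNN does.
    """
    fishers = []
    for params in param_sets:
        probs = np.asarray(qnn.forward(inputs, params))
        _, weight_grads = qnn.backward(inputs, params)
        weight_grads = np.asarray(weight_grads)
        # grad log p = grad p / p (clipped to avoid division by zero)
        grad_log = weight_grads / np.clip(probs[..., None], 1e-12, None)
        # F(theta) = E_x[ sum_y p(y|x) * grad_log p(y|x) grad_log p(y|x)^T ]
        fisher = np.einsum("io,iow,iov->wv", probs, grad_log, grad_log) / len(inputs)
        fishers.append(fisher)
    return np.stack(fishers)
```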

Things to keep in mind

  1. I believe that the same code can be used for the global/local effective dimension (if I am not mistaken, the only change is the number of parameter sets used), but it might be interesting to explicitly add a LocalEffectiveDimension class. In that case, it would be nice if it also included model training.
  2. By following this implementation, we will not be using multiprocessing for the Fisher information matrix estimation. In my opinion, multiprocessing is not a platform-agnostic library and will likely become troublesome (for example, it currently does not work well with Python 3.10 on macOS). It will also most likely not be an advantage inside the runtime environment. However, it might still be interesting to look a bit further into this to confirm these ideas.
  3. The custom QNN and the CircuitQNN class stack outputs and implement post-processing differently. I believe that this is a minor inconvenience, but it must be taken into account for step 1 above.
  4. This implementation assumes the use of CircuitQNN to define the quantum neural networks. Would it be interesting to extend this implementation to any qiskit-machine-learning QNN class (e.g. OpflowQNN and its subclasses)? A rough sketch of points 1 and 4 follows this list.
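
As a sketch for points 1 and 4 above: the local variant could simply restrict the computation to a single (e.g. trained) parameter set, and validating against the NeuralNetwork base class would open the door to QNNs other than CircuitQNN. The class below is hypothetical and only illustrates the idea:

```python
import numpy as np
from qiskit_machine_learning.neural_networks import NeuralNetwork


class LocalEffectiveDimension:  # hypothetical name and constructor
    """Local effective dimension: same computation as the global case,
    but evaluated at exactly one (e.g. trained) parameter set."""

    def __init__(self, qnn, trained_params, inputs):
        if not isinstance(qnn, NeuralNetwork):
            raise TypeError("Expected a qiskit-machine-learning NeuralNetwork.")
        self.qnn = qnn
        self.param_sets = np.asarray(trained_params).reshape(1, -1)  # a single parameter set
        self.inputs = np.asarray(inputs)
```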

My proposal is to turn this issue into an epic and create issues for the three steps mentioned above:

  • Create new EffectiveDimension class
  • Extend EffectiveDimension class functionality (input/param. customization)
  • Create EffectiveDimension runtime program