
continuiti

Learning function operators with neural networks.

continuiti is a Python package for deep learning on function operators with a focus on elegance and generality. It provides a unified interface for neural operators (such as DeepONet or FNO) to be used in a plug-and-play fashion. As operator learning is particularly useful in scientific machine learning, continuiti also includes physics-informed loss functions and a collection of relevant benchmarks.

Installation

Install the package using pip:

pip install continuiti

Or install the latest development version from the repository:

git clone https://github.com/aai-institute/continuiti.git
cd continuiti
pip install -e .[dev]
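
To confirm the installation succeeded, you can print the installed version from the package metadata. This one-liner only assumes a Python >= 3.8 environment, where importlib.metadata is part of the standard library:

python -c "from importlib.metadata import version; print(version('continuiti'))"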

Usage

Our Documentation contains a detailed introduction to operator learning, a collection of examples using continuiti, and the class documentation.

In general, the operator syntax in continuiti is

v = operator(x, u(x), y)

mapping a function u (evaluated at positions x) to a function v (evaluated at positions y).

For more details, see Learning Operators.
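
To make the call convention concrete, here is a minimal sketch using a plain PyTorch stand-in for a neural operator. The ToyOperator module, its architecture, and the tensor shapes are illustrative assumptions only, not the continuiti API; see the documentation for the actual operator classes (such as DeepONet or FNO) and dataset interfaces.

import torch

class ToyOperator(torch.nn.Module):
    """Illustrative stand-in for a neural operator mapping (x, u(x), y) to v(y)."""

    def __init__(self, num_sensors: int, width: int = 32):
        super().__init__()
        # Encode the input function u, sampled at the sensor positions x.
        self.encoder = torch.nn.Linear(2 * num_sensors, width)
        # Decode the output function v at the query positions y.
        self.decoder = torch.nn.Linear(width + 1, 1)

    def forward(self, x, u, y):
        # x, u: (batch, num_sensors), y: (batch, num_queries)
        latent = torch.tanh(self.encoder(torch.cat([x, u], dim=-1)))  # (batch, width)
        latent = latent.unsqueeze(1).expand(-1, y.shape[-1], -1)      # (batch, num_queries, width)
        return self.decoder(torch.cat([latent, y.unsqueeze(-1)], dim=-1)).squeeze(-1)

# Sample u(x) = sin(x) at 16 sensor positions and query v at 8 random positions.
x = torch.linspace(0, 1, 16).repeat(4, 1)   # (batch=4, num_sensors=16)
u = torch.sin(x)
y = torch.rand(4, 8)                        # (batch=4, num_queries=8)

operator = ToyOperator(num_sensors=16)
v = operator(x, u, y)                       # (batch=4, num_queries=8)

A trained continuiti operator is called with the same (x, u, y) signature, which is what makes different architectures interchangeable.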

Examples

A collection of examples using continuiti can be found in the Documentation.

Contributing

Contributions are welcome from anyone in the form of pull requests, bug reports and feature requests. If you find a bug or have a feature request, please open an issue on GitHub. If you want to contribute code, please fork the repository and submit a pull request. See CONTRIBUTING.md for details on local development.

License

This project is licensed under the GNU LGPLv3 license; see the LICENSE file for details.
