
# Neural Conservation Laws: A Divergence-Free Perspective

This repository contains code for the NeurIPS 2022 paper [*Neural Conservation Laws: A Divergence-Free Perspective*](https://arxiv.org/abs/2210.01741).

Building divergence-free neural networks
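For context, the snippet below is a minimal JAX sketch, not taken from this repository and independent of the paper's specific architecture, of one standard way to obtain an exactly divergence-free vector field: parameterize an antisymmetric matrix field A(x) with a small MLP and take its row-wise divergence, v_i(x) = Σ_j ∂A_ij/∂x_j. The dimension `DIM`, layer sizes, and helper names (`init_mlp`, `antisym_matrix`, `div_free_field`) are illustrative choices only.

```python
# Minimal sketch (assumptions noted above): a divergence-free field built as the
# row-wise divergence of an antisymmetric matrix field parameterized by an MLP.
import jax
import jax.numpy as jnp

DIM = 3  # spatial dimension (illustrative choice)

def init_mlp(key, sizes):
    """Initialize a small MLP that outputs the strict upper-triangular entries of A(x)."""
    params = []
    keys = jax.random.split(key, len(sizes) - 1)
    for k, m, n in zip(keys, sizes[:-1], sizes[1:]):
        w = jax.random.normal(k, (m, n)) / jnp.sqrt(m)
        b = jnp.zeros(n)
        params.append((w, b))
    return params

def mlp(params, x):
    h = x
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return h @ w + b  # DIM * (DIM - 1) // 2 outputs

def antisym_matrix(params, x):
    """Assemble an antisymmetric matrix A(x) from the MLP's outputs."""
    tri = mlp(params, x)
    iu = jnp.triu_indices(DIM, k=1)
    A = jnp.zeros((DIM, DIM)).at[iu].set(tri)
    return A - A.T  # antisymmetric by construction

def div_free_field(params, x):
    """v_i(x) = sum_j dA_ij/dx_j.

    div v = sum_{i,j} d^2 A_ij / dx_i dx_j = 0 identically, because A is
    antisymmetric and mixed partial derivatives commute.
    """
    jac = jax.jacfwd(antisym_matrix, argnums=1)(params, x)  # shape (DIM, DIM, DIM)
    return jnp.einsum('ijj->i', jac)

key = jax.random.PRNGKey(0)
params = init_mlp(key, [DIM, 64, 64, DIM * (DIM - 1) // 2])
x = jnp.array([0.1, -0.3, 0.7])
v = div_free_field(params, x)
# Numerical check: the divergence (trace of the Jacobian of v) should be ~0
# up to floating-point error.
div_v = jnp.trace(jax.jacfwd(div_free_field, argnums=1)(params, x))
print(v, div_v)
```

The point of this construction is that the divergence constraint is satisfied exactly by design rather than penalized in a loss; the actual experiment code lives in the subdirectories described below.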

## Experiments

Experiments on training Neural Conservation Laws (and baselines) for fluid simulation are available in the `jax` subdirectory.

Experiments on training Neural Conservation Laws (and baselines) for dynamical optimal transport are available in the `pytorch` subdirectory.

## Citations

If you find this repository helpful in your work, please consider citing our paper:

@inproceedings{
    richter-powell2022neural,
    title={Neural Conservation Laws: A Divergence-Free Perspective},
    author={Jack Richter-Powell and Yaron Lipman and Ricky T. Q. Chen},
    booktitle={Advances in Neural Information Processing Systems},
    year={2022},
}

## License

This repository is licensed under the CC BY-NC 4.0 License.