
Faster Neural A* Implemented in JAX

This is a third-party implementation of Neural A* search [Yonetani+, ICML 2021]. I have reimplemented the Neural A* model and training scripts in JAX and Flax, and expect an overall speedup thanks to JAX's JIT compilation and vmap-based batching.
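
To illustrate the batching pattern (not the actual Neural A* model), the following is a minimal, self-contained JAX sketch of how jit and vmap can amortize a planner-like computation over a batch of maps. The function and variable names here are hypothetical stand-ins and are not part of this codebase.

import jax
import jax.numpy as jnp

# Toy stand-in for a planner forward pass (NOT the actual Neural A* model):
# it just scores a map against a goal, to illustrate the jit + vmap pattern.
def plan_cost(occupancy_map, goal_xy):
    h, w = occupancy_map.shape
    ys, xs = jnp.meshgrid(jnp.arange(h), jnp.arange(w), indexing="ij")
    # Manhattan distance to the goal, masked by free space.
    dist = jnp.abs(ys - goal_xy[0]) + jnp.abs(xs - goal_xy[1])
    return jnp.sum(dist * (1.0 - occupancy_map))

# vmap over a batch of (map, goal) pairs, then jit-compile the batched function.
batched_plan = jax.jit(jax.vmap(plan_cost))

maps = jnp.zeros((8, 32, 32))                  # 8 obstacle-free 32x32 maps
goals = jnp.tile(jnp.array([16, 16]), (8, 1))  # one goal per map
print(batched_plan(maps, goals).shape)         # (8,)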

This codebase is under active development; quantitative performance is still limited, likely due to remaining bugs and implementation differences between the PyTorch and JAX versions.

Getting started

The code has been tested on Ubuntu >=18.04 as well as WSL2 (Ubuntu 20.04) on Windows 11, with Python 3 (>=3.8). Planning can be performed on the CPU alone, while GPUs are supported for training and evaluating Neural A* models. We also provide a Dockerfile and docker-compose.yaml to replicate our setup.

Installation (venv)

$ git clone --recursive https://github.com/yonetaniryo/neural-astar-jax.git
$ cd neural-astar-jax
$ python -m venv .venv
$ source .venv/bin/activate
(.venv) $ pip install .[dev]
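
After installation, you can check which devices JAX can see; a GPU is listed only if a CUDA-enabled jaxlib is installed. This uses the standard JAX API and is independent of this codebase.

import jax

print(jax.devices())          # e.g. [CpuDevice(id=0)], or a CUDA device with a GPU build
print(jax.default_backend())  # "cpu" or "gpu"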

Citation

@InProceedings{pmlr-v139-yonetani21a,
  title     = {Path Planning using Neural A* Search},
  author    = {Ryo Yonetani and Tatsunori Taniai and Mohammadamin Barekatain and Mai Nishimura and Asako Kanezaki},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {12029--12039},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/yonetani21a/yonetani21a.pdf},
  url       = {http://proceedings.mlr.press/v139/yonetani21a.html},
}
