PeRF: Panoramic Neural Radiance Field from a Single Panorama
TPAMI 2024
S-Lab, Nanyang Technological University¹, The University of Hong Kong², Texas A&M University³
* denotes equal contribution
Step 1: Clone this repository and install the Python dependencies
git clone https://github.com/perf-project/PeRF.git
cd PeRF
pip install -r requirements.txt
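Optionally, you may prefer to install the requirements into a fresh virtual environment rather than your system Python; a minimal sketch (the environment name perf-env is arbitrary, and the pip install step above should then be run inside the activated environment):
python -m venv perf-env
source perf-env/bin/activate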
Step 2: Install tiny-cuda-nn
pip install ninja
pip install git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
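To verify that the bindings built correctly and that a CUDA device is visible, a quick optional check (assuming the package exposes the tinycudann module, which is the default name of the PyTorch bindings) is:
python -c "import torch, tinycudann; print(torch.cuda.is_available())"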
Step 3: Download checkpoints as shown here.
Here is a command to train a PeRF on the example data:
python core_exp_runner.py --config-name nerf dataset.image_path=$(pwd)/example_data/kitchen/image.png device.base_exp_dir=$(pwd)/exp
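The same overrides should also work for your own scene; a hedged sketch, where your_data/pano.png is a placeholder path to your own equirectangular panorama:
python core_exp_runner.py --config-name nerf dataset.image_path=$(pwd)/your_data/pano.png device.base_exp_dir=$(pwd)/exp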
After training is done, you can render a traverse video by running the following command:
python core_exp_runner.py --config-name nerf dataset.image_path=$(pwd)/example_data/kitchen/image.png device.base_exp_dir=$(pwd)/exp mode=render_dense is_continue=true
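Assuming the placeholder scene from the sketch above has finished training, rendering it should follow the same pattern:
python core_exp_runner.py --config-name nerf dataset.image_path=$(pwd)/your_data/pano.png device.base_exp_dir=$(pwd)/exp mode=render_dense is_continue=true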
If you find PeRF helpful to your research, please cite it as below.
@article{perf2023,
  title={PERF: Panoramic Neural Radiance Field from a Single Panorama},
  author={Guangcong Wang and Peng Wang and Zhaoxi Chen and Wenping Wang and Chen Change Loy and Ziwei Liu},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)},
  year={2024}
}