Here are some very brief instructions for synthetic scene generation.
To work with a new synthetic scene, it is necessary to write the simulation script for the scene (Step 1), use Blender to render the scene (Step 2), write code to properly output the camera poses (Step 3), and modify code (like in repo_dir/load_pinf.py) to load the image sequences with camera poses (a minimal loader sketch is given at the end of these notes).
Step 1. We use Mantaflow to simulate synthetic scenes.
An example script (used for the Game scene) can be found here: ./stairsWaveletTurbObs.py
After Mantaflow installation, this scene file can be run with something like:
./build/manta ./stairsWaveletTurbObs.py
It will save 3D density sequences as npz files (np.float16 is used to save space).
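For reference, the density export inside a scene script typically boils down to copying the simulation grid into a NumPy array and saving it as compressed float16. The sketch below is illustrative only; the grid-to-array call (copyGridToArrayReal) and the array layout follow common Mantaflow numpy-coupling examples and may differ from the provided stairsWaveletTurbObs.py, which remains the authoritative version.

```python
# Illustrative per-frame density dump (run via ./build/manta, which provides
# the manta module). The grid-to-array call and array layout follow common
# Mantaflow numpy-coupling examples and may differ from the provided script.
import numpy as np
from manta import *

def save_density_frame(density, gs, out_path):
    # Copy the simulation grid into a dense NumPy array (z, y, x ordering).
    arr = np.zeros([int(gs.z), int(gs.y), int(gs.x)], dtype=np.float32)
    copyGridToArrayReal(source=density, target=arr)
    # Store as compressed float16 to keep the sequence small on disk.
    np.savez_compressed(out_path, data=arr.astype(np.float16))
```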
Step 2. Convert npz files to vdb files for rendering with Blender
After OpenVDB installation, vdb files can be obtained using the following command.
python ./manta2vdb.py
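Conceptually, the conversion reads each npz frame, copies the density array into an OpenVDB FloatGrid, and writes one .vdb file per frame. The sketch below assumes the pyopenvdb Python bindings and an npz layout with the density stored under the key 'data'; the file naming and the key are illustrative, so check the provided manta2vdb.py for the actual conventions.

```python
# Minimal npz -> vdb conversion sketch (assumes pyopenvdb bindings; the file
# naming and the 'data' key are illustrative, not the repo's exact convention).
import glob
import numpy as np
import pyopenvdb as vdb

for npz_path in sorted(glob.glob('./data/density_*.npz')):
    den = np.load(npz_path)['data'].astype(np.float32)  # vdb expects float32
    grid = vdb.FloatGrid()
    grid.copyFromArray(den)            # dense array -> sparse VDB tree
    grid.name = 'density'              # Blender looks up grids by name
    grid.gridClass = vdb.GridClass.FOG_VOLUME
    vdb.write(npz_path.replace('.npz', '.vdb'), grids=[grid])
```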
The vdb sequences can then be loaded in Blender, where arbitrary rendering parameters and lighting conditions can be applied. We modify this code (a zip file provided here) to render the images and output the camera poses.
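For outputting camera poses, a small bpy script run inside Blender can dump the camera-to-world matrix and field of view for each frame. The sketch below writes a NeRF-style transforms file; the exact json layout expected by the loading code is an assumption here, so adapt the keys to whatever your loader reads.

```python
# Run inside Blender (scripting tab or `blender -b scene.blend -P this_file.py`).
# Dumps per-frame camera-to-world matrices; the json layout is an assumption,
# not necessarily what the provided loading code expects.
import json
import bpy

scene = bpy.context.scene
cam = scene.camera
frames = []
for f in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(f)
    frames.append({
        'file_path': './train/r_{:04d}'.format(f),
        'transform_matrix': [list(row) for row in cam.matrix_world],
    })

meta = {
    'camera_angle_x': cam.data.angle_x,   # horizontal field of view in radians
    'frames': frames,
}
with open(bpy.path.abspath('//transforms_train.json'), 'w') as fp:
    json.dump(meta, fp, indent=2)
```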
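Finally, the loading code has to pair each rendered image with its camera pose. Below is a minimal sketch, assuming the NeRF-style transforms_train.json written above; the actual metadata format used by repo_dir/load_pinf.py may differ, so treat this only as an outline of the required inputs (images, 4x4 poses, and intrinsics).

```python
# Minimal image + pose loader sketch, assuming a NeRF-style transforms_train.json.
# The metadata layout actually used by load_pinf.py may differ.
import json
import os
import imageio.v2 as imageio
import numpy as np

def load_scene(basedir):
    with open(os.path.join(basedir, 'transforms_train.json')) as fp:
        meta = json.load(fp)
    images, poses = [], []
    for frame in meta['frames']:
        img_path = os.path.join(basedir, frame['file_path'] + '.png')
        images.append(imageio.imread(img_path) / 255.0)     # HxWxC in [0, 1]
        poses.append(np.array(frame['transform_matrix']))   # 4x4 camera-to-world
    images = np.stack(images).astype(np.float32)
    poses = np.stack(poses).astype(np.float32)
    H, W = images.shape[1:3]
    focal = 0.5 * W / np.tan(0.5 * meta['camera_angle_x'])  # pinhole focal length
    return images, poses, (H, W, focal)
```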