Code for Neural-GIF: Neural Generalized Implicit Functions for Animating People in Clothing (ICCV 2021)
We present Neural Generalized Implicit Functions (Neural-GIF) to animate people in clothing as a function of body pose. Neural-GIF learns directly from scans, models complex clothing, and produces pose-dependent details for realistic animation. For four different characters, we show the query input pose on the left (illustrated with a skeleton) and our output animation on the right.
https://nextcloud.mpi-klsb.mpg.de/index.php/s/FweAP5Js58Q9tsq
Installation:
1. Install kaolin: https://github.com/NVIDIAGameWorks/kaolin
2. conda env create -f neuralgif.yml
3. conda activate neuralgif
Training:
1. Edit configs/*.yaml with the correct paths:
a. data/data_dir: <path to data directory>
b. data/split_file: <path to train/test split file> (see example in dataset folder)
c. experiment/root_dir: <training output dir>
d. experiment/exp_name: <exp_name>
2. python trainer_shape.py --config=<path to config file>
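A minimal config sketch matching the fields listed above (the file name, nesting, and all values are illustrative placeholders, not taken from the repository):

```yaml
# configs/example.yaml -- hypothetical sketch; only the field names above
# come from this README, the values are placeholders to be replaced
data:
  data_dir: /path/to/processed/data
  split_file: dataset/train_test_split.pkl
experiment:
  root_dir: /path/to/training/output
  exp_name: neuralgif_subject1
```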
Generation:
1. python generator.py --config=<path to config file>
Data preparation:
1. Obtain SMPL pose and shape parameters: https://github.com/bharat-b7/IPNet
2. Save the registration data and original scan data as:
a. data_dir/scan_dir: contains the original scans
b. data_dir/beta.npy: SMPL beta parameters of the subject
c. data_dir/pose.npz: SMPL pose parameters for all frames of the scan
3. Prepare training data:
python prepare_data/clothseq_data.py -data_dir=<path to data directory>
Visualization:
python visualisation/render_meshes.py -mesh_path=<folder containing meshes> -out_dir=<output dir>
Citation:

@inproceedings{tiwari21neuralgif,
title = {Neural-GIF: Neural Generalized Implicit Functions for Animating People in Clothing},
author = {Tiwari, Garvita and Sarafianos, Nikolaos and Tung, Tony and Pons-Moll, Gerard},
booktitle = {International Conference on Computer Vision ({ICCV})},
month = {October},
year = {2021},
}