Daiheng Gao* · Yuliang Xiu* · Kailin Li* · Lixin Yang* · Feng Wang · Peng Zhang · Bang Zhang · Cewu Lu · Ping Tan
- [2024.04.01] DARTset can now be cloned directly from HuggingFace Datasets at DARTset
- [2022.10.07] DART's raw textures+accessories are released at RAW
- [2022.09.29] DART Unity GUI's source code is publicly available at GUI
- numpy
- cv2
- imageio
- PyTorch
- PyTorch3D (>= 0.6)
- manotorch
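A possible one-shot setup for these dependencies is sketched below; the package names and install sources (opencv-python for cv2, GitHub installs for PyTorch3D and manotorch) are assumptions, and each project's official install guide takes precedence:

```bash
# Assumed package names/sources; prefer each project's official install instructions.
pip install numpy opencv-python imageio torch
pip install "git+https://github.com/facebookresearch/pytorch3d.git"   # PyTorch3D >= 0.6
pip install "git+https://github.com/lixiny/manotorch.git"             # manotorch
```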
Please download the data from HuggingFace or Baidu Pan (extraction code: 4w3r) and put it in the data/DARTset folder.
git clone https://huggingface.co/datasets/Yuliang/DART data/DARTset
Then download MANO from the official website and put it in the assets folder.
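As a rough sketch (the archive name mano_v1_2.zip is an assumption based on the MANO release naming), the placement could look like:

```bash
# Assumes the MANO download is an archive that unpacks to a mano_v1_2 folder.
unzip mano_v1_2.zip -d assets/
```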
Your directory should look like this:
.
├── DARTset.py
├── DARTset_utils.py
├── assets
│   └── mano_v1_2
├── data
│   └── DARTset
│       ├── train
│       │   ├── 0
│       │   ├── 0_wbg
│       │   ├── part_0.pkl
│       │   └── ...
│       └── test
python DARTset.py
You can modify this line in DARTset.py to change the train/test data split.
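For reference, a minimal loading sketch is given below; it assumes DARTset.py exposes a DARTset class whose constructor accepts a data_split argument, so check the file for the exact signature and sample fields:

```python
# Minimal usage sketch (assumption: DARTset.py defines a DARTset class
# taking a data_split argument; verify the actual constructor in the file).
from DARTset import DARTset

train_set = DARTset(data_split="train")  # or data_split="test"
print(len(train_set))                    # number of samples in this split
sample = train_set[0]                    # inspect one sample's contents
```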
Please check the postprocess folder to learn how to generate intermediate outputs using DART's Unity GUI.
If you find our work useful in your research, please cite:
@inproceedings{gao2022dart,
  title={{DART: Articulated Hand Model with Diverse Accessories and Rich Textures}},
  author={Daiheng Gao and Yuliang Xiu and Kailin Li and Lixin Yang and Feng Wang and Peng Zhang and Bang Zhang and Cewu Lu and Ping Tan},
  booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year={2022},
}