A PyTorch version of the Deep3DFaceReconstruction repo.
This repo contains only the reconstruction part; to train the network, use the Deep3DFaceReconstruction-pytorch repo, which is also where the pretrained model comes from.
I use MTCNN to crop the raw images and detect 5 facial landmarks. Most of the MTCNN code comes from FaceNet-pytorch.
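The cropping step can be sketched as follows. This is a minimal illustration, not the repo's exact code: `square_crop_box` is a hypothetical helper, and `landmarks` is assumed to be the 5×2 array of (x, y) points that MTCNN returns.

```python
import numpy as np

def square_crop_box(landmarks, scale=2.0):
    """Compute a square crop box centered on the landmark centroid.

    landmarks: (5, 2) array of (x, y) points from MTCNN
    scale: how much larger the box is than the landmark spread
    Returns (x0, y0, x1, y1) as integers.
    """
    pts = np.asarray(landmarks, dtype=np.float64)
    center = pts.mean(axis=0)
    # Half-size: half the larger landmark extent, enlarged by `scale`
    extent = pts.max(axis=0) - pts.min(axis=0)
    half = 0.5 * scale * extent.max()
    x0, y0 = (center - half).astype(int)
    x1, y1 = (center + half).astype(int)
    return x0, y0, x1, y1
```

The resulting box would then be clipped to the image bounds and used to crop a fixed-size face region before feeding the network.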
In this repo, I use PyTorch3D 0.3.0 to render the reconstructed images.
In the original repo (Deep3DFaceReconstruction-pytorch), the rendered image does not match the input image because of the preprocessing, so I added estimate_intrinsic to recover the camera intrinsic parameters.
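The idea behind the intrinsic estimation can be sketched as a linear least-squares fit under a pinhole camera model. This is a simplified illustration, not the repo's actual estimate_intrinsic; it assumes the 3D landmarks are already expressed in the camera frame.

```python
import numpy as np

def estimate_intrinsics(pts3d, pts2d):
    """Estimate pinhole intrinsics (fx, fy, cx, cy) by linear least squares.

    Assumes camera-frame points, so the projection is
        u = fx * X / Z + cx,   v = fy * Y / Z + cy.
    pts3d: (N, 3) camera-space points; pts2d: (N, 2) pixel coordinates.
    """
    pts3d = np.asarray(pts3d, dtype=float)
    pts2d = np.asarray(pts2d, dtype=float)
    x = pts3d[:, 0] / pts3d[:, 2]
    y = pts3d[:, 1] / pts3d[:, 2]
    ones = np.ones_like(x)
    # Two independent 2-unknown systems: [x 1] @ (fx, cx) = u, [y 1] @ (fy, cy) = v
    fx, cx = np.linalg.lstsq(np.stack([x, ones], 1), pts2d[:, 0], rcond=None)[0]
    fy, cy = np.linalg.lstsq(np.stack([y, ones], 1), pts2d[:, 1], rcond=None)[0]
    return fx, fy, cx, cy
```

With intrinsics fitted this way, the rendered reconstruction can be projected onto the same pixel grid as the cropped input.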
Here are some examples:
| Origin Images | Cropped Images | Rendered Images |
|---|---|---|
## Installation
- Download the `BFM` folder from the original TensorFlow repository and place it in `D3DFR/BFM`.
- Download the Basel Face Model. Due to its license agreement, you have to download the BFM09 model after submitting an application on its home page. After getting access to the BFM data, download "01_MorphableModel.mat" and put it into the `D3DFR/BFM/` subfolder.
- Download the Expression Basis provided by Guo et al. You can find a link named `CoarseData` in the first row of the Introduction section of their repository. Download and unzip `Coarse_Dataset.zip`, then put `Exp_Pca.bin` into the `D3DFR/BFM/` subfolder. The expression basis is constructed using FaceWarehouse data and transferred to the BFM topology.
- Create the `D3DFR/network` folder, then download the pre-trained reconstruction network, unzip it, and put `FaceReconModel.pb` into `D3DFR/network`.
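After downloading, you can sanity-check that everything landed in the right place. This is a hypothetical helper, not part of the repo; the file names follow the steps above.

```python
from pathlib import Path

# Required files, relative to the repo root (see the installation steps above)
REQUIRED = [
    "BFM/01_MorphableModel.mat",
    "BFM/Exp_Pca.bin",
    "network/FaceReconModel.pb",
]

def missing_assets(root="."):
    """Return the required files that are absent under `root`."""
    root = Path(root)
    return [p for p in REQUIRED if not (root / p).is_file()]
```

An empty return value means all required model files are in place.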
├─BFM         same as Deep3DFaceReconstruction
├─dataset     stores the cropped images
│  └─Vladimir_Putin
├─examples    example results
├─facebank    stores the raw/original images
│  └─Vladimir_Putin
├─models      stores the pretrained models
├─output      stores the output files (.mat, .png)
│  └─Vladimir_Putin
└─preprocess  crops images and detects landmarks
   ├─data     stores the MTCNN model weights
   └─utils
This repo can also generate the UV map. For that, you need to download the UV coordinates from the STN website: https://github.com/anilbas/3DMMasSTN/blob/master/util/BFM_UV.mat
Then copy `BFM_UV.mat` to the `BFM` folder.
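With the UV coordinates in place, unwrapping per-vertex attributes into UV space can be sketched as below. This is a hypothetical nearest-texel scatter for illustration only; a real UV renderer rasterizes the mesh triangles so every texel gets covered.

```python
import numpy as np

def vertex_colors_to_uv_map(uv_coords, colors, size=256):
    """Scatter per-vertex colors into a UV-space image (nearest texel).

    uv_coords: (N, 2) UV coordinates in [0, 1] (as provided by BFM_UV.mat)
    colors:    (N, 3) per-vertex RGB values
    Returns a (size, size, 3) image; texels hit by no vertex stay zero.
    """
    uv = np.clip(np.asarray(uv_coords, dtype=float), 0.0, 1.0)
    px = np.minimum((uv * size).astype(int), size - 1)
    img = np.zeros((size, size, 3), dtype=np.asarray(colors).dtype)
    # Flip v so the map is image-oriented (v = 0 at the bottom of UV space)
    img[size - 1 - px[:, 1], px[:, 0]] = colors
    return img
```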
The pretrained models can be downloaded from Google Drive.