A script to generate a texture for a 3D mesh using Stable Diffusion and ControlNet. It's designed for the outputs of TripoSR, an image-to-3D model by Stability AI and Tripo AI, but might work on other meshes as well. TripoSR outputs coarse-grained color in the form of per-vertex colors rather than a texture image, so this script is helpful for (re)applying fine detail to the models.
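To see why per-vertex colors are coarse, consider what baking them into a texture image looks like: each vertex contributes only one color sample at its UV coordinate, leaving most texels empty. The sketch below is a hypothetical illustration of that mapping, not code from text2texture.py:

```python
def bake_vertex_colors(uvs, colors, size):
    """Splat per-vertex RGB colors into a size x size texture grid.

    uvs:    list of (u, v) pairs in [0, 1]
    colors: list of (r, g, b) tuples, one per vertex
    Returns a size x size grid of RGB tuples; texels with no vertex
    stay black, which is exactly the sparsity a real texture fills in.
    """
    tex = [[(0, 0, 0) for _ in range(size)] for _ in range(size)]
    for (u, v), rgb in zip(uvs, colors):
        # Clamp to the last texel so u == 1.0 or v == 1.0 stays in bounds.
        x = min(int(u * size), size - 1)
        y = min(int(v * size), size - 1)
        tex[y][x] = rgb
    return tex
```

With only a handful of vertices, nearly every texel is left black; a diffusion model can instead synthesize plausible detail across the whole texture.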
Create a virtualenv and install requirements:
python3 -m virtualenv venv
venv/bin/pip install -r requirements.txt
Run the text2texture.py script with the output mesh from TripoSR along with a textual description of the desired appearance:
venv/bin/python text2texture.py ~/TripoSR/output/0/mesh.obj 'a chair that looks like an avocado'
The first time this runs, it will download a Stable Diffusion model (by default, Lykon/dreamshaper-8) and a ControlNet model. The image model is configurable, for example to a model you've already fetched from Hugging Face.
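For orientation, loading a Stable Diffusion checkpoint together with a ControlNet in the diffusers library typically looks like the sketch below. This is an assumption about the general shape of the loading code, not text2texture.py's actual implementation, and the depth ControlNet checkpoint named here is a guess at a plausible choice:

```python
def load_pipeline(sd_model="Lykon/dreamshaper-8",
                  controlnet_model="lllyasviel/control_v11f1p_sd15_depth"):
    """Sketch: build a ControlNet-conditioned Stable Diffusion pipeline.

    The default checkpoint names are assumptions for illustration; the
    ControlNet checkpoint in particular is not stated in this README.
    """
    # Imports are deferred so the sketch can be read (and this file loaded)
    # without torch/diffusers installed; calling the function downloads models.
    import torch
    from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

    controlnet = ControlNetModel.from_pretrained(
        controlnet_model, torch_dtype=torch.float16)
    pipe = StableDiffusionControlNetPipeline.from_pretrained(
        sd_model, controlnet=controlnet, torch_dtype=torch.float16)
    return pipe
```

Swapping in a locally cached model is then a matter of passing its directory path in place of the Hub model ID, since `from_pretrained` accepts either.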
Copyright © 2024 Evan Jones