A PyTorch implementation of the experiments from the paper
"Compositional Sculpting of Iterative Generative Processes" (NeurIPS 2023)
by Timur Garipov, Sebastiaan De Peuter, Ge Yang, Vikas Garg, Samuel Kaski, and Tommi Jaakkola.
High training costs of generative models and the need to fine-tune them for specific tasks have created a strong interest in model reuse and composition.
A key challenge in composing iterative generative processes, such as GFlowNets and diffusion models, is that realizing the desired target distribution requires all steps of the generative process to be coordinated and to satisfy delicate balance conditions.
In this work, we propose Compositional Sculpting: a general approach for defining compositions of iterative generative processes.
We then introduce a method for sampling from these compositions built on classifier guidance.
We showcase ways to accomplish compositional sculpting in both GFlowNets and diffusion models.
We highlight two binary operations, the harmonic mean and the contrast between pairs of distributions, and the generalization of these operations to multiple component distributions.
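As a quick illustration of the harmonic mean operation (a minimal sketch, not the paper's implementation: it evaluates the composition p1(x) p2(x) / (p1(x) + p2(x)) directly on a 1-D grid for two Gaussian components, whereas the paper samples from such compositions via classifier guidance):

```python
import numpy as np

def gaussian_pdf(x, mean, std):
    """Density of N(mean, std^2) evaluated pointwise."""
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

# Evaluation grid and two component distributions.
x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]
p1 = gaussian_pdf(x, mean=-1.5, std=1.0)
p2 = gaussian_pdf(x, mean=+1.5, std=1.0)

# Harmonic-mean composition: proportional to p1 * p2 / (p1 + p2).
# It concentrates mass where BOTH components assign high probability.
unnormalized = p1 * p2 / (p1 + p2)
hm = unnormalized / (unnormalized.sum() * dx)  # normalize on the grid

# The mode of the composition lies between the two component means.
mode = x[np.argmax(hm)]
```

Because the composition multiplies the component densities (and only divides by their sum), it behaves like a soft intersection: regions supported by only one of the two components are down-weighted.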
Please cite our paper if you find it helpful in your work:
```
@article{garipov2023compositional,
  title={Compositional Sculpting of Iterative Generative Processes},
  author={Garipov, Timur and De Peuter, Sebastiaan and Yang, Ge and Garg, Vikas and Kaski, Samuel and Jaakkola, Tommi},
  journal={Advances in Neural Information Processing Systems},
  volume={36},
  year={2023}
}
```
For the diffusion model experiments, see diffusion/README.md and diffusion_chaining/README.md.
- Code for 2D grid GFlowNets (training and sampling) was adapted from https://gist.github.com/malkin1729/9a87ce4f19acdc2c24225782a8b81c15
- Code for fragment-based molecular graph GFlowNets (training and sampling) was adapted from https://github.com/recursionpharma/gflownet (last synced on 2023-04-06, at commit 3d311c3)
- Code for MNIST diffusion models (training and sampling) was adapted from https://colab.research.google.com/drive/120kYYBOVa1i0TD85RjlEkFjaWDxSFUx3?usp=sharing