

Poster in Workshop: NeurIPS 2023 Workshop on Diffusion Models

Importance-Guided Diffusion

Paris Flood · Pietro Liò


Abstract:

Conditional generative modeling via diffusion processes has emerged as an indispensable tool for advancing the fidelity and diversity of sample generation, pushing the boundaries of applications such as image synthesis, style transfer, and adaptive content generation. We present a plug-and-play approach to conditional diffusion which integrates seamlessly with existing unconditional diffusion architectures. Our method, derived from relative entropy coding, recasts diffusion as an auxiliary-variable importance sampling procedure and can influence the generative process without requiring gradient information or any modification to the network. Furthermore, this approach offers a principled mechanism to both quantify and adjust the degree of conditioning, enabling precise navigation across a large spectrum of generative outputs. Experimental results indicate that this technique produces meaningful conditional outputs while maintaining a relatively minimal increase in computational burden.
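To make the high-level idea concrete, the sketch below illustrates one way gradient-free, importance-sampling-based guidance of an unconditional reverse-diffusion step could look. This is a minimal illustration under assumptions, not the authors' relative-entropy-coding derivation: the function names (`importance_guided_step`, `toy_denoiser`, `toy_log_weight`) and parameters (`num_candidates`, `sigma`) are hypothetical stand-ins. At each step, several candidate next states are drawn from the unconditional transition kernel, scored by an external (possibly non-differentiable) conditioning signal, and one is resampled in proportion to its importance weight.

```python
import torch

def importance_guided_step(x_t, t, denoise_fn, log_weight_fn,
                           num_candidates=8, sigma=1.0):
    """One reverse-diffusion step with gradient-free importance-sampling guidance.

    Illustrative sketch only: draws candidates from the unconditional kernel,
    weights them with an external scorer, and resamples one candidate.
    The denoiser is treated as a black box; no gradients of the weight
    function are needed.
    """
    mean = denoise_fn(x_t, t)  # unconditional posterior mean (assumed interface)
    # K candidate next states from the unconditional kernel N(mean, sigma^2 I)
    candidates = mean.unsqueeze(0) + sigma * torch.randn(num_candidates, *x_t.shape)
    # Importance weights supplied by an arbitrary, non-differentiable scorer
    log_w = torch.stack([log_weight_fn(c, t) for c in candidates])
    probs = torch.softmax(log_w, dim=0)
    idx = torch.multinomial(probs, 1).item()
    return candidates[idx]

# --- toy usage with stand-in components (purely illustrative) ---
def toy_denoiser(x, t):
    # pretend denoiser: shrink the sample toward the origin
    return x * (1.0 - 1.0 / (t + 1))

def toy_log_weight(x, t):
    # pretend conditioning signal: prefer samples whose mean is close to +1
    return -((x.mean() - 1.0) ** 2)

x = torch.randn(16)  # start from pure noise
for t in reversed(range(1, 50)):
    x = importance_guided_step(x, t, toy_denoiser, toy_log_weight,
                               num_candidates=8, sigma=0.1)
print(x.mean())
```

In this toy setup, increasing `num_candidates` strengthens the conditioning pressure, which loosely mirrors the paper's claim that the degree of conditioning can be quantified and adjusted.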
