Poster

Diffusion Hyperfeatures: Searching Through Time and Space for Semantic Correspondence

Grace Luo · Lisa Dunlap · Dong Huk Park · Aleksander Holynski · Trevor Darrell

Great Hall & Hall B1+B2 (level 1) #607
[ Project Page ]
Wed 13 Dec 3 p.m. PST — 5 p.m. PST

Abstract:

Diffusion models have been shown to be capable of generating high-quality images, suggesting that they could contain meaningful internal representations. Unfortunately, the feature maps that encode a diffusion model's internal information are spread not only over layers of the network, but also over diffusion timesteps, making it challenging to extract useful descriptors. We propose Diffusion Hyperfeatures, a framework for consolidating multi-scale and multi-timestep feature maps into per-pixel feature descriptors that can be used for downstream tasks. These descriptors can be extracted for both synthetic and real images using the generation and inversion processes. We evaluate the utility of our Diffusion Hyperfeatures on the task of semantic keypoint correspondence: our method achieves superior performance on the SPair-71k real image benchmark. We also demonstrate that our method is flexible and transferable: our feature aggregation network trained on the inversion features of real image pairs can be used on the generation features of synthetic image pairs with unseen objects and compositions. Our code is available at https://diffusion-hyperfeatures.github.io.
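The core idea of the abstract, consolidating feature maps from many layers and timesteps into one per-pixel descriptor, can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the nearest-neighbor upsampling, and the single scalar mixing weight per map are simplifying assumptions (the actual aggregation network is learned and more expressive), but the structure shows how maps of different resolutions are brought to a common grid and combined.

```python
import numpy as np

def upsample_nearest(fmap, size):
    # fmap: (C, H, W) -> (C, size, size) by nearest-neighbor repetition.
    # Assumes size is an integer multiple of H and W, as is typical for
    # the power-of-two resolutions of diffusion U-Net feature maps.
    c, h, w = fmap.shape
    assert size % h == 0 and size % w == 0
    return fmap.repeat(size // h, axis=1).repeat(size // w, axis=2)

def aggregate_hyperfeatures(feature_maps, mixing_logits, out_size=64):
    # feature_maps: list of (C, H_i, W_i) arrays gathered across
    # layers and diffusion timesteps (hypothetical inputs here).
    # mixing_logits: one scalar per map; learned in the real method,
    # random in this sketch.
    weights = np.exp(mixing_logits) / np.exp(mixing_logits).sum()  # softmax
    maps = [upsample_nearest(f, out_size) for f in feature_maps]
    # Weighted sum yields one (C, out_size, out_size) descriptor map,
    # i.e. a C-dimensional descriptor per pixel.
    return sum(w * m for w, m in zip(weights, maps))

rng = np.random.default_rng(0)
feats = [rng.standard_normal((8, s, s)) for s in (16, 32, 64)]
logits = rng.standard_normal(3)
hyper = aggregate_hyperfeatures(feats, logits)
print(hyper.shape)  # (8, 64, 64)
```

Semantic correspondence then reduces to comparing these per-pixel descriptors between two images, e.g. matching each source keypoint to its nearest neighbor in the target's descriptor map.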
