

Poster

Unveiling the Power of Diffusion Features For Personalized Segmentation and Retrieval

Dvir Samuel · Rami Ben-Ari · Matan Levy · Nir Darshan · Gal Chechik


Abstract:

Personalized retrieval and segmentation aim to locate specific instances within a dataset based on an input image and a short description of the reference instance. While supervised methods are effective, they require extensive labeled data for training. Recently, self-supervised foundation models have been applied to these tasks, showing results comparable to supervised methods. However, these models exhibit a significant flaw: they struggle to locate the desired instance when other instances of the same class are present. In this paper, we explore text-to-image diffusion models for these tasks. Specifically, we propose a novel approach called PDM (Personalized Diffusion Features Matching) that leverages intermediate diffusion features for personalization tasks without any additional training. PDM demonstrates superior performance on popular retrieval and segmentation benchmarks, outperforming even supervised methods. We also highlight notable shortcomings in current instance retrieval and segmentation benchmarks and propose new benchmarks for these tasks.
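
The sketch below is not the authors' implementation; it only illustrates the general idea of matching intermediate diffusion features without training, assuming a Stable Diffusion backbone from the diffusers library. The choice of UNet block, timestep, and plain cosine-similarity matching are illustrative assumptions.

```python
# Hypothetical sketch: extract intermediate UNet features from Stable Diffusion
# and score a target image against a masked reference instance.
import torch
import torch.nn.functional as F
from diffusers import StableDiffusionPipeline

device = "cuda" if torch.cuda.is_available() else "cpu"
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5").to(device)

features = {}

def hook(module, inputs, output):
    # Cache the spatial feature map produced by this UNet block.
    features["map"] = output

# Assumed layer choice: tap one decoder (up) block of the UNet.
handle = pipe.unet.up_blocks[1].register_forward_hook(hook)

@torch.no_grad()
def diffusion_features(image, prompt="", t=50):
    """image: (1, 3, H, W) tensor in [-1, 1]. Encode, add noise at an
    assumed timestep t, run one UNet pass, and return (h, w, C) features."""
    latents = pipe.vae.encode(image).latent_dist.mean * pipe.vae.config.scaling_factor
    timestep = torch.tensor([t], device=device)
    noisy = pipe.scheduler.add_noise(latents, torch.randn_like(latents), timestep)
    text_emb = pipe.encode_prompt(prompt, device, 1, False)[0]
    pipe.unet(noisy, timestep, encoder_hidden_states=text_emb)
    return features["map"][0].permute(1, 2, 0)  # (h, w, C)

@torch.no_grad()
def match(ref_image, ref_mask, tgt_image):
    """Score each target location by cosine similarity to the mean
    feature of the reference instance given by ref_mask (H, W)."""
    ref = diffusion_features(ref_image)
    tgt = diffusion_features(tgt_image)
    mask = F.interpolate(ref_mask[None, None].float(), size=ref.shape[:2])[0, 0] > 0.5
    query = ref[mask].mean(dim=0)  # (C,) mean feature of the reference instance
    sim = F.cosine_similarity(tgt.flatten(0, 1), query[None], dim=-1)
    return sim.view(tgt.shape[:2])  # similarity heat map over the target image
```

The heat map returned by `match` could then be thresholded for a personalized segmentation mask, or pooled into a single score for ranking images in a retrieval setting.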
