

Poster

Simplified and Generalized Masked Diffusion for Discrete Data

Jiaxin Shi · Kehang Han · Zhe Wang · Arnaud Doucet · Michalis Titsias

West Ballroom A-D #7307
Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Masked (or absorbing) diffusion is actively explored as an alternative to autoregressive models for generative modeling of discrete data. However, existing work in this area has been hindered by 1) unnecessarily complex model formulations and 2) unclear relationships between different perspectives, leading to suboptimal parameterization, training objectives, and ad hoc adjustments to counteract these issues. In this work, we aim to provide a simple and general framework that unlocks the full potential of masked diffusion models. We provide the first theoretical result showing that the continuous-time variational objective of masked diffusion models is a simple weighted integral of cross-entropy losses. Our framework also enables training generalized masked diffusion models with state-dependent masking schedules. We apply our models to text and image tasks and demonstrate state-of-the-art likelihood and zero-shot transfer results for discrete diffusion models.
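As a rough illustration of the kind of objective the abstract describes (the notation below is assumed for exposition and is not taken from the paper): with a masking schedule alpha_t giving the probability that a token is still unmasked at time t, a "weighted integral of cross-entropy losses" has the general form

\[
\mathcal{L} \;=\; \int_0^1 w(t)\; \mathbb{E}_{q(x_t \mid x_0)}\!\Big[ \sum_{i \,:\, x_t^{(i)} = \texttt{[MASK]}} -\log p_\theta\big(x_0^{(i)} \mid x_t\big) \Big]\, dt,
\]

where the model p_\theta predicts the original value of each masked token, and w(t) is a weight determined by the masking schedule (for example, a function of alpha_t and its time derivative; the exact weighting here is a hypothetical placeholder). In practice such an integral is typically estimated by Monte Carlo sampling of t during training.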
