Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow
Xingchao Liu · Chengyue Gong · Qiang Liu
Event URL: https://openreview.net/forum?id=gWxpdtQpiYV
We present rectified flow, a surprisingly simple approach to learning (neural) ordinary differential equation (ODE) models to transport between two empirically observed distributions $\pi_0$ and $\pi_1$, hence providing a unified solution to generative modeling and domain transfer, among various other tasks involving distribution transport. The idea of rectified flow is to learn the ODE to follow the straight paths connecting the points drawn from $\pi_0$ and $\pi_1$ as much as possible. This is achieved by solving a straightforward nonlinear least squares optimization problem, which can be easily scaled to large models without introducing extra parameters beyond standard supervised learning. The straight paths are special and preferred because they are the shortest paths between two points, and can be simulated exactly without time discretization, hence yielding computationally efficient models. We show that the procedure of learning a rectified flow from data, called rectification, turns an arbitrary coupling of $\pi_0$ and $\pi_1$ into a new deterministic coupling with provably non-increasing convex transport costs. In addition, recursively applying rectification allows us to obtain a sequence of flows with increasingly straight paths, which can be simulated accurately with coarse time discretization in the inference phase. In empirical studies, we show that rectified flow performs superbly on image generation and image-to-image translation. In particular, on image generation and translation, our method yields nearly straight flows that give high-quality results even with \emph{a single Euler discretization step}.
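For readers who want the mechanics, the objective described in the abstract reduces to a least-squares regression of a velocity field onto the slopes of straight interpolation paths: given a pair $(X_0, X_1)$ from any coupling of $\pi_0$ and $\pi_1$ and $t \sim \mathrm{Uniform}[0,1]$, set $X_t = t X_1 + (1-t) X_0$ and fit $v$ by minimizing $\mathbb{E}\big[\lVert (X_1 - X_0) - v(X_t, t)\rVert^2\big]$. Below is a minimal PyTorch sketch of this loss and the Euler sampler, assuming a hypothetical velocity network `v_net(x, t)`; it illustrates the idea rather than reproducing the authors' released implementation.

```python
import torch

def rectified_flow_loss(v_net, x0, x1):
    """Nonlinear least-squares loss of rectified flow (a sketch).

    v_net: hypothetical velocity network taking (x_t, t) and returning a
    tensor shaped like x_t; x0, x1 are paired batches from pi_0 and pi_1.
    """
    # Sample one t ~ Uniform[0, 1] per example, broadcastable over x's shape.
    t = torch.rand(x0.shape[0], *([1] * (x0.dim() - 1)), device=x0.device)
    xt = t * x1 + (1.0 - t) * x0   # point on the straight path
    target = x1 - x0               # slope of the straight path
    return ((v_net(xt, t.flatten()) - target) ** 2).mean()

@torch.no_grad()
def euler_sample(v_net, x0, n_steps=1):
    """Simulate dX_t = v(X_t, t) dt from t = 0 to t = 1 with forward Euler.

    A well-rectified (nearly straight) flow stays accurate even at
    n_steps=1, the single-step generation highlighted in the abstract.
    """
    x, dt = x0, 1.0 / n_steps
    for i in range(n_steps):
        t = torch.full((x.shape[0],), i * dt, device=x.device)
        x = x + dt * v_net(x, t)
    return x
```

Under this reading, recursive rectification ("reflow") amounts to regenerating pairs $(X_0, \hat{X}_1)$ with the learned flow and retraining on them, which straightens the paths further and makes coarse discretization increasingly accurate.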
Author Information
Xingchao Liu (University of Texas at Austin)
Chengyue Gong (University of Texas at Austin)
Qiang Liu (Dartmouth College)
More from the Same Authors
- 2021 Spotlight: Profiling Pareto Front With Multi-Objective Stein Variational Gradient Descent » Xingchao Liu · Xin Tong · Qiang Liu
- 2022: BOME! Bilevel Optimization Made Easy: A Simple First-Order Approach » Mao Ye · Bo Liu · Stephen Wright · Peter Stone · Qiang Liu
- 2022: Diffusion-based Molecule Generation with Informative Prior Bridges » Chengyue Gong · Lemeng Wu · Xingchao Liu · Mao Ye · Qiang Liu
- 2022: HotProtein: A Novel Framework for Protein Thermostability Prediction and Editing » Tianlong Chen · Chengyue Gong · Daniel Diaz · Xuxi Chen · Jordan Wells · Qiang Liu · Zhangyang Wang · Andrew Ellington · Alex Dimakis · Adam Klivans
- 2022: First hitting diffusion models » Mao Ye · Lemeng Wu · Qiang Liu
- 2022: Neural Volumetric Mesh Generator » Yan Zheng · Lemeng Wu · Xingchao Liu · Zhen Chen · Qiang Liu · Qixing Huang
- 2022: Let us Build Bridges: Understanding and Extending Diffusion Generative Models » Xingchao Liu · Lemeng Wu · Mao Ye · Qiang Liu
- 2022 Poster: First Hitting Diffusion Models for Generating Manifold, Graph and Categorical Data » Mao Ye · Lemeng Wu · Qiang Liu
- 2022 Poster: Sampling in Constrained Domains with Orthogonal-Space Variational Gradient Descent » Ruqi Zhang · Qiang Liu · Xin Tong
- 2022 Poster: BOME! Bilevel Optimization Made Easy: A Simple First-Order Approach » Bo Liu · Mao Ye · Stephen Wright · Peter Stone · Qiang Liu
- 2022 Poster: Diffusion-based Molecule Generation with Informative Prior Bridges » Lemeng Wu · Chengyue Gong · Xingchao Liu · Mao Ye · Qiang Liu
- 2021 Poster: Conflict-Averse Gradient Descent for Multi-task learning » Bo Liu · Xingchao Liu · Xiaojie Jin · Peter Stone · Qiang Liu
- 2021 Poster: Sampling with Trustworthy Constraints: A Variational Gradient Framework » Xingchao Liu · Xin Tong · Qiang Liu
- 2021 Poster: Automatic and Harmless Regularization with Constrained and Lexicographic Optimization: A Dynamic Barrier Approach » Chengyue Gong · Xingchao Liu · Qiang Liu
- 2021 Poster: argmax centroid » Chengyue Gong · Mao Ye · Qiang Liu
- 2021 Poster: Profiling Pareto Front With Multi-Objective Stein Variational Gradient Descent » Xingchao Liu · Xin Tong · Qiang Liu
- 2020 Poster: Implicit Regularization and Convergence for Weight Normalization » Xiaoxia Wu · Edgar Dobriban · Tongzheng Ren · Shanshan Wu · Zhiyuan Li · Suriya Gunasekar · Rachel Ward · Qiang Liu