Generalised Flow Maps on Riemannian Manifolds
Abstract
Recent advances in generative modelling on Euclidean spaces have shown how to train models from scratch that achieve state-of-the-art sample quality while requiring only a few function evaluations. Meanwhile, generative modelling of data supported on Riemannian manifolds, such as protein backbones or geological data, has lagged behind: inference and training on such manifolds remain computationally challenging and numerically unstable due to the need for manifold-specific operations, and obtaining high-quality samples requires numerous evaluations of a potentially expensive model. In this paper, we propose Riemannian Flow Maps, a new class of few-step generative models that generalise the flow map framework~\citep{boffi2024flow} to arbitrary Riemannian manifolds. We port and design three self-distillation-based training methods: Riemannian Lagrange Flow Maps, Riemannian Eulerian Flow Maps, and Riemannian Semi-Group Flow Maps, each of which recovers its Euclidean counterpart. Empirically, we test Riemannian Flow Maps on a host of standard datasets and achieve, amongst Riemannian generative models, state-of-the-art sample quality in single- and few-step evaluations, as well as state-of-the-art log-likelihoods.