Energy-Based Physics-Informed Diffusion Transformer Sampling for Time Series Forecasting
Abstract
Scientific time series data, particularly in climate science, present unique challenges at the intersection of probabilistic inference and physical constraints. While large pre-trained Time Series Diffusion Transformers excel at capturing complex data distributions, they lack mechanisms to enforce physical consistency. To address this gap, we present a novel framework for adapting pre-trained generative models to scientific tasks. Our contribution is a model-agnostic physics-injection module that employs Langevin dynamics at inference time to steer predictions toward physically consistent solutions without costly retraining. We provide theoretical guarantees for convergence under physical constraints and empirically validate our method across multiple synthetic partial differential equation systems and climate systems, offering insights into the synergy between machine learning and physics-based sampling for scientific applications.
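To make the inference-time steering concrete, the following is a minimal sketch of physics-guided Langevin dynamics, not the paper's actual implementation. It assumes a hypothetical physics energy `physics_energy` (here, a simple conservation-style constraint that the trajectory mean match a target value); the step size, iteration count, and noise scale are illustrative choices. In the full method this correction would be interleaved with the diffusion model's own denoising steps.

```python
import numpy as np

def physics_energy(x, target_mean=0.0):
    """Hypothetical physics constraint: squared violation of a
    conservation-style condition (trajectory mean == target)."""
    return (x.mean() - target_mean) ** 2

def energy_grad(x, target_mean=0.0):
    """Analytic gradient of physics_energy w.r.t. each element of x."""
    return np.full_like(x, 2.0 * (x.mean() - target_mean) / x.size)

def langevin_correct(x, step=0.5, n_steps=200, noise_scale=1e-3, seed=0):
    """Steer a sample x toward low physics energy via Langevin dynamics:
    x <- x - step * grad E(x) + sqrt(2 * step) * noise."""
    rng = np.random.default_rng(seed)
    for _ in range(n_steps):
        noise = noise_scale * rng.standard_normal(x.shape)
        x = x - step * energy_grad(x) + np.sqrt(2.0 * step) * noise
    return x

# Demo: a "forecast" drawn off-constraint, then corrected at inference time.
rng = np.random.default_rng(42)
forecast = rng.standard_normal(50) + 1.0      # mean far from the target 0.0
e_before = physics_energy(forecast)
corrected = langevin_correct(forecast)
e_after = physics_energy(corrected)
```

Because the correction only needs gradients of the physics energy with respect to the sample, it is model-agnostic: the diffusion transformer's weights are never touched, which is what makes the approach retraining-free.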