Particle Monte Carlo methods for Lattice Field Theory
Abstract
High-dimensional, multimodal sampling problems from lattice field theory (LFT) are an increasingly important proving ground for machine learning sampling methods. Recent activity has focused largely on developing sophisticated neural samplers at significant computational cost; we demonstrate that this complexity is often unnecessary. We show that classic particle-based methods, specifically Sequential Monte Carlo (SMC) and nested sampling, are remarkably effective when implemented in a batched, GPU-accelerated framework. Tuned with only a single, data-driven particle covariance, our methods match or outperform state-of-the-art neural samplers in both sample quality and wall-clock time on a standard LFT benchmark, while also reliably estimating the partition function. These results establish a new, strong performance baseline and suggest that the high training cost of learned samplers must be more carefully justified. We advocate for a path forward through hybrid models and provide open-source implementations to facilitate future comparisons.
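To make the idea concrete, the following is a minimal NumPy sketch of tempered SMC in which the only tuned quantity is the empirical particle covariance, used to scale random-walk Metropolis moves, and the tempering weights yield a partition-function estimate as a by-product. The double-well target, the linear tempering schedule, and all names (`smc_sample`, `log_double_well`) are illustrative assumptions, not the paper's benchmark or implementation.

```python
import numpy as np

def smc_sample(log_gamma, d, n=2000, n_steps=20, n_mh=5, prior_std=3.0, seed=0):
    """Tempered SMC from N(0, prior_std^2 I) to the unnormalized density
    exp(log_gamma), with random-walk MH moves whose proposal covariance is
    the empirical covariance of the current particle cloud."""
    rng = np.random.default_rng(seed)
    x = prior_std * rng.standard_normal((n, d))
    log_p0 = lambda y: (-0.5 * np.sum(y**2, axis=-1) / prior_std**2
                        - 0.5 * d * np.log(2 * np.pi * prior_std**2))
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    log_Z = 0.0  # accumulates log of the normalizing constant
    for b0, b1 in zip(betas[:-1], betas[1:]):
        # Incremental importance weights for the tempering step b0 -> b1.
        logw = (b1 - b0) * (log_gamma(x) - log_p0(x))
        log_Z += np.log(np.mean(np.exp(logw - logw.max()))) + logw.max()
        # Multinomial resampling proportional to the weights.
        w = np.exp(logw - logw.max())
        w /= w.sum()
        x = x[rng.choice(n, size=n, p=w)]
        # Data-driven proposal: empirical covariance of the particles,
        # with the classic 2.38/sqrt(d) random-walk scaling.
        cov = np.atleast_2d(np.cov(x.T)) + 1e-8 * np.eye(d)
        L = np.linalg.cholesky(cov)
        step = 2.38 / np.sqrt(d)
        log_pi = lambda y: (1.0 - b1) * log_p0(y) + b1 * log_gamma(y)
        lp = log_pi(x)
        for _ in range(n_mh):
            prop = x + step * rng.standard_normal((n, d)) @ L.T
            lp_prop = log_pi(prop)
            acc = np.log(rng.uniform(size=n)) < lp_prop - lp
            x[acc] = prop[acc]
            lp[acc] = lp_prop[acc]
    return x, log_Z

def log_double_well(x):
    # Toy bimodal target with modes near +/-2 in each coordinate
    # (an illustrative stand-in for a multimodal LFT action).
    return -0.25 * np.sum((x**2 - 4.0)**2, axis=-1)

particles, log_Z = smc_sample(log_double_well, d=2)
print(f"log Z estimate: {log_Z:.3f}")
print(f"fraction in right-hand mode (coord 0): {np.mean(particles[:, 0] > 0):.2f}")
```

Because the proposal covariance is re-estimated from the particles at every temperature, no per-problem step-size tuning is needed, and the same vectorized update maps directly onto batched GPU array operations.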