Generalizing PDE Emulation with Equation-Aware Neural Operators
Abstract
Solving partial differential equations (PDEs) with traditional numerical methods can be prohibitively expensive. Deep learning-based surrogate models (emulators) typically specialize in a single PDE with fixed parameters. We present a framework for equation-aware emulation that generalizes to unseen PDEs by conditioning a neural model on a vector encoding of the PDE's terms and their coefficients, learning a mapping that transfers to unseen physical systems. We establish a baseline across four distinct modeling techniques, trained on a family of 1D PDEs from the APEBench suite. Our approach achieves strong performance on parameter sets held out from the training distribution, remains stable when rolled out beyond the training horizon, and generalizes to an entirely unseen PDE.
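To make the idea of equation-aware conditioning concrete, the sketch below shows one plausible form it could take: a coefficient vector describing the PDE's terms modulates the hidden features of a simple 1D convolutional emulator via FiLM-style scale and shift. This is a minimal illustration with hypothetical names and sizes, not the architecture used in the paper.

```python
# Minimal sketch (hypothetical): condition a 1D convolutional emulator step
# on a PDE coefficient vector via FiLM-style feature modulation.
import jax
import jax.numpy as jnp

def init_params(key, hidden=32, kernel=5, coeff_dim=3):
    k1, k2, k3 = jax.random.split(key, 3)
    return {
        "conv_in":  jax.random.normal(k1, (hidden, 1, kernel)) * 0.1,
        "conv_out": jax.random.normal(k2, (1, hidden, kernel)) * 0.1,
        "film_w":   jax.random.normal(k3, (coeff_dim, 2 * hidden)) * 0.1,
        "film_b":   jnp.zeros((2 * hidden,)),
    }

def conv1d(x, w):
    # x: (channels_in, nx), w: (channels_out, channels_in, kernel)
    return jax.lax.conv_general_dilated(
        x[None], w, window_strides=(1,), padding="SAME"
    )[0]

def step(params, u, coeffs):
    """One emulator step: state u (nx,) -> next state, conditioned on PDE coeffs."""
    h = jax.nn.gelu(conv1d(u[None, :], params["conv_in"]))     # (hidden, nx)
    gamma_beta = coeffs @ params["film_w"] + params["film_b"]  # (2*hidden,)
    gamma, beta = jnp.split(gamma_beta, 2)
    h = gamma[:, None] * h + beta[:, None]                     # FiLM modulation
    return u + conv1d(h, params["conv_out"])[0]                # residual update

key = jax.random.PRNGKey(0)
params = init_params(key)
u0 = jnp.sin(jnp.linspace(0.0, 2.0 * jnp.pi, 128, endpoint=False))
coeffs = jnp.array([0.5, 0.01, 0.0])  # e.g. advection, diffusion, dispersion weights
u1 = step(params, u0, coeffs)
```

Autoregressive rollout then amounts to applying `step` repeatedly with the same coefficient vector, and generalization to an unseen PDE corresponds to supplying a coefficient vector outside the training distribution.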