Learning Data-Efficient and Generalizable Neural Operators via Fundamental Physics Knowledge
Abstract
Recent advances in scientific machine learning (SciML) have established neural operators (NOs) as powerful surrogates for dynamics governed by partial differential equations (PDEs). However, existing methods largely ignore the fundamental physical principles underlying these equations. We propose a multiphysics training framework that jointly learns from both full PDEs and their simplified basic forms. This approach improves data efficiency, reduces predictive error, and enhances out-of-distribution (OOD) generalization, including under parameter shifts and synthetic-to-real transfer. Our method is architecture-agnostic and achieves improvements in nRMSE of over 11.5\% and up to 64\% across diverse PDEs. Through extensive experiments, we show that explicitly incorporating fundamental physics knowledge substantially strengthens the generalization of neural operators. Models and data will be released upon acceptance. Our code is anonymously available at https://anonymous.4open.science/r/SciML-PDE-7BF7.