

Poster

Boosting Generalization in Parametric PDE Neural Solvers through Adaptive Conditioning

Armand Kassaï Koupaï (Sorbonne Université) · Jorge Mifsut Benet · Yuan Yin · Jean-Noël Vittaut · Patrick Gallinari

Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Solving parametric partial differential equations (PDEs) presents significant challenges for data-driven methods, particularly because spatio-temporal dynamics are highly sensitive to variations in PDE parameters. Machine learning approaches often struggle to capture this variability, especially when data are scarce. Neural PDE solvers have sought to address this by jointly sampling from the distributions of PDE parameters and trajectories, or by using meta-learning to adapt to new parameters. However, direct sampling typically fails to cover the full diversity of solution behaviours, while meta-learning approaches often scale poorly with dataset size and complexity. We propose a first-order optimisation method with low-rank rapid adaptation to unseen environments, adjusting only a small set of context parameters. We demonstrate the versatility of our approach for both fully data-driven and physics-aware neural solvers. Validation on a broad range of spatio-temporal forecasting problems shows excellent generalisation to unseen conditions, including initial conditions, PDE coefficients, forcing terms and solution domains.
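The core idea described above, adapting to a new environment through a small set of context parameters that modulate a shared model via a low-rank update, can be illustrated with a minimal sketch. The layer parameterisation (`W0 + U diag(c) V`), the variable names, and the simple squared-error objective below are illustrative assumptions, not the paper's actual architecture; only the per-environment context vector `c` is updated by first-order gradient descent, while the shared weights stay frozen.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, r = 8, 8, 2  # r: rank of the adaptation, i.e. the context size

# Shared weights, assumed frozen after training on seen environments
# (random here purely for illustration).
W0 = rng.normal(size=(d_out, d_in)) / np.sqrt(d_in)
U = rng.normal(size=(d_out, r)) / np.sqrt(r)
V = rng.normal(size=(r, d_in)) / np.sqrt(d_in)

def forward(c, x):
    """Context-conditioned linear map: (W0 + U diag(c) V) @ x."""
    return W0 @ x + U @ (c * (V @ x))

# A new, unseen "environment": targets generated with an unknown context.
c_true = rng.normal(size=r)
x = rng.normal(size=d_in)
y = forward(c_true, x)

# Rapid adaptation: plain gradient descent on the r context parameters only.
c = np.zeros(r)
losses = []
for _ in range(500):
    z = V @ x
    residual = forward(c, x) - y
    losses.append(float(residual @ residual))
    grad = 2.0 * (U.T @ residual) * z  # dL/dc for the squared error
    c -= 0.01 * grad

print(losses[0], losses[-1])  # adaptation loss before vs. after
```

Because only `r` scalars are optimised per environment, adaptation is cheap and data-efficient; the shared weights `W0`, `U`, `V` encode what is common across environments.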
