Scaling Laws and Pathologies of Single-Layer PINNs: Network Width and PDE Nonlinearity
Faris Chaudhry
Abstract
We establish empirical scaling laws for Single-Layer Physics-Informed Neural Networks (PINNs) on canonical nonlinear PDEs (KdV, Sine-Gordon, Allen-Cahn) and quantify how solution error scales with network width and a nonlinearity parameter. We find that the benefit of increased width systematically degrades with nonlinearity, often resulting in pathological scaling where wider networks perform worse. This work provides quantitative evidence that optimization, not approximation, is the primary bottleneck, and introduces a methodology for measuring scaling performance in more complex architectures.
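To make the setup concrete, the following is a minimal sketch (not the authors' code) of a single-layer PINN residual loss for the Allen-Cahn equation u_t = eps * u_xx + lam * (u - u^3), assuming a PyTorch implementation; the width `W` and the cubic coefficient `lam` are illustrative stand-ins for the width and nonlinearity parameters studied in the paper.

```python
# Hypothetical sketch of a single-layer PINN on Allen-Cahn; `W` (width) and
# `lam` (nonlinearity strength) stand in for the paper's scaling parameters.
import torch

W, eps, lam = 64, 1e-4, 5.0  # assumed values for illustration

net = torch.nn.Sequential(
    torch.nn.Linear(2, W),   # input: (t, x)
    torch.nn.Tanh(),
    torch.nn.Linear(W, 1),   # output: u(t, x)
)

def pde_residual(tx):
    """Residual of u_t - eps*u_xx - lam*(u - u^3) at collocation points tx."""
    tx = tx.requires_grad_(True)
    u = net(tx)
    grads = torch.autograd.grad(u, tx, torch.ones_like(u), create_graph=True)[0]
    u_t, u_x = grads[:, :1], grads[:, 1:]
    u_xx = torch.autograd.grad(u_x, tx, torch.ones_like(u_x),
                               create_graph=True)[0][:, 1:]
    return u_t - eps * u_xx - lam * (u - u**3)

# One optimization step on randomly sampled collocation points.
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
tx = torch.rand(256, 2)  # (t, x) sampled in [0, 1]^2 for illustration
loss = pde_residual(tx).pow(2).mean()
opt.zero_grad()
loss.backward()
opt.step()
print(float(loss))
```

Sweeping `W` and `lam` in a loop over such a training run, and recording the resulting solution error against a reference solution, is one way to reproduce the kind of width-versus-nonlinearity scaling measurement the abstract describes.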