Physics-Informed Learning Near Critical Transitions: A Comparative Study of UDEs and Neural ODEs
Abstract
Neural systems exhibit rich computational behavior near critical transitions between ordered and chaotic dynamics. Learning these transitions poses unique challenges due to slow dynamics, sensitivity to parameters, and multi-scale temporal structure. We systematically compare Universal Differential Equations (UDEs) and Neural ODEs for learning a two-dimensional neural dynamical system across stability regimes. Through Lyapunov landscape analysis, we demonstrate that the choice of activation function fundamentally shapes the bifurcation structure: Swish enables smooth order-to-chaos transitions, unlike ReLU or sigmoid. Our evaluation shows that UDEs consistently outperform Neural ODEs, achieving 2–10× lower RMSE across all coupling strengths and superior robustness under external perturbations. Critically, both methods struggle near transition points (λ ∼ 0), though UDEs degrade less. Surprisingly, while UDEs excel at dynamics prediction, they fail to accurately reconstruct the underlying activation function, revealing a fundamental trade-off between system-level learning and component interpretability in physics-informed approaches.