A Discrete-Continuous Curriculum Learning (DCCL) Framework for Stable Long-Horizon PDE Surrogates
Lalit Ghule · Akanksh Shetty · Morgane Bourgeois
Abstract
Deep surrogates for transient dynamics often diverge over long horizons due to compounding error. Curriculum learning mitigates exposure bias by gradually replacing teacher forcing with self-rollouts, but discrete switching can induce optimization instabilities. We propose a Discrete-Continuous Curriculum Learning (DCCL) framework that blends model predictions with ground truth via a convex combination. On a 2D vorticity dataset, DCCL consistently reduces long-horizon rollout error compared to self-rollout and discrete curriculum baselines for both UNet and Fourier Neural Operator (FNO) backbones, with pronounced gains in extrapolation.
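The convex-combination idea in the abstract can be sketched in a few lines. The schedule and helper names below (`blend_alpha`, `rollout_training_inputs`, a linear anneal of the mixing weight) are illustrative assumptions, not the paper's exact formulation: each next rollout input mixes the ground-truth state and the model's own prediction, so alpha = 1 recovers teacher forcing and alpha = 0 recovers a pure self-rollout.

```python
import numpy as np

def blend_alpha(epoch, total_epochs):
    """Linearly anneal the teacher-forcing weight from 1 to 0.
    Illustrative schedule; the paper's actual schedule may differ."""
    return max(0.0, 1.0 - epoch / total_epochs)

def rollout_training_inputs(model_step, u0, targets, alpha):
    """Unroll a one-step surrogate, feeding each step a convex blend
    of the ground-truth next state and the model's own prediction."""
    inputs, u = [], u0
    for target in targets:
        inputs.append(u)
        pred = model_step(u)
        # alpha = 1: pure teacher forcing; alpha = 0: pure self-rollout.
        u = alpha * target + (1.0 - alpha) * pred
    return inputs
```

Discrete curricula instead switch abruptly between the two regimes; the continuous blend keeps the training inputs changing smoothly as alpha decays, which is the stability mechanism the abstract attributes to DCCL.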