

Poster in Workshop: Mathematics of Modern Machine Learning (M3L)

Generalization Guarantees of Deep ResNets in the Mean-Field Regime

Yihang Chen · Fanghui Liu · Yiping Lu · Grigorios Chrysos · Volkan Cevher


Abstract:

Despite the widespread empirical success of ResNets, the generalization ability of deep ResNets has rarely been explored beyond the lazy-training regime. In this work, we study ResNets in the limit of infinitely deep and wide networks, where the gradient flow is described by a partial differential equation, i.e., the mean-field regime. To derive generalization bounds in this setting, our analysis requires a shift from the conventional time-invariant Gram matrix used in the lazy-training regime to a time-variant, distribution-dependent version tailored to the mean-field regime. To this end, we provide a lower bound on the minimum eigenvalue of the Gram matrix in the mean-field regime. In addition, the analysis requires tracking the dynamics of the Kullback-Leibler (KL) divergence in the mean-field regime. We therefore establish linear convergence of the empirical error and derive an upper bound on the KL divergence of the parameter distribution. These two results are then combined to obtain a uniform-convergence generalization bound via Rademacher complexity. Our results offer new insights into the generalization ability of deep ResNets beyond the lazy-training regime and advance the understanding of the fundamental properties of deep neural networks.
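For context, the following schematic sketches the mean-field ResNet formulation the abstract refers to; the notation ($x_\ell$, $f$, $\theta$, $\rho_s$) is illustrative and need not match the paper's. A residual block with $1/L$ scaling becomes, in the joint infinite-depth and infinite-width limit, a forward pass governed by a depth-indexed parameter distribution $\rho_s$:

$$x_{\ell+1} = x_\ell + \frac{1}{L}\, f(x_\ell, \theta_\ell) \quad\longrightarrow\quad \frac{\mathrm{d}x_s}{\mathrm{d}s} = \int f(x_s, \theta)\, \rho_s(\mathrm{d}\theta), \qquad s \in [0, 1].$$

Training then evolves the distributions $\{\rho_s\}_{s \in [0,1]}$ via a gradient flow described by a partial differential equation, which is the sense in which the abstract uses the term mean-field regime.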
