

Poster in Workshop: Bayesian Deep Learning

Power-law asymptotics of the generalization error for GP regression under power-law priors and targets

Hui Jin · Pradeep Kr. Banerjee · Guido Montufar


Abstract: We study the power-law asymptotics of learning curves for Gaussian process regression (GPR). When the eigenspectrum of the prior decays at rate $\alpha$ and the eigenexpansion coefficients of the target function decay at rate $\beta$, we show that the Bayesian generalization error behaves as $\tilde{O}\big(n^{\max\{\frac{1}{\alpha}-1,\;\frac{1-2\beta}{\alpha}\}}\big)$ with high probability over the draw of $n$ input samples. Infinitely wide neural networks can be related to GPR with the Neural Network Gaussian Process (NNGP) kernel, which in several cases is known to have a power-law spectrum. Hence our methods can be applied to study the generalization error of infinitely wide neural networks. We present toy experiments demonstrating the theory.
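Below is a minimal sketch of the kind of toy experiment the abstract describes, not the authors' code. It assumes a kernel with orthonormal Fourier eigenfunctions on $[0, 2\pi)$, eigenvalues $\lambda_k = k^{-\alpha}$, and target coefficients $c_k = k^{-\beta}$, and it uses the test MSE of the noiseless GPR posterior mean as a proxy for the Bayesian generalization error; the fitted log-log slope of the learning curve is compared against the predicted exponent $\max\{\frac{1}{\alpha}-1, \frac{1-2\beta}{\alpha}\}$.

```python
import numpy as np

# Hypothetical toy setup (an illustration, not the paper's experiment):
# power-law prior eigenvalues k^{-alpha} and target coefficients k^{-beta}.
rng = np.random.default_rng(0)
alpha, beta = 2.0, 1.5
K_MAX = 2000  # truncation of the eigenexpansion

ks = np.arange(1, K_MAX + 1)
lam = ks ** (-alpha)   # prior eigenvalues lambda_k = k^{-alpha}
coef = ks ** (-beta)   # target coefficients c_k = k^{-beta}

def features(x):
    """Orthonormal Fourier eigenfunctions sqrt(2)*cos(k x) on [0, 2*pi)."""
    return np.sqrt(2.0) * np.cos(np.outer(x, ks))

def kernel(x1, x2):
    """k(x, x') = sum_k lambda_k phi_k(x) phi_k(x')."""
    return features(x1) @ (lam * features(x2)).T

def target(x):
    """f*(x) = sum_k c_k phi_k(x)."""
    return features(x) @ coef

def gpr_test_mse(n, n_test=2000, jitter=1e-6):
    """Test MSE of the GPR posterior mean trained on n uniform samples."""
    x_tr = rng.uniform(0.0, 2.0 * np.pi, size=n)
    x_te = rng.uniform(0.0, 2.0 * np.pi, size=n_test)
    K = kernel(x_tr, x_tr) + jitter * np.eye(n)  # jitter for stability
    mean = kernel(x_te, x_tr) @ np.linalg.solve(K, target(x_tr))
    return np.mean((mean - target(x_te)) ** 2)

ns = [50, 100, 200, 400, 800]
errs = [np.mean([gpr_test_mse(n) for _ in range(5)]) for n in ns]
slope = np.polyfit(np.log(ns), np.log(errs), 1)[0]
predicted = max(1.0 / alpha - 1.0, (1.0 - 2.0 * beta) / alpha)
print(f"fitted log-log slope: {slope:.2f}  (theory: {predicted:.2f})")
```

For $\alpha = 2$ and $\beta = 1.5$ the predicted exponent is $\max\{-0.5, -1\} = -0.5$, so the fitted slope should come out close to $-0.5$ up to sampling noise and logarithmic factors.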
