Poster in Workshop: Bayesian Deep Learning
Power-law asymptotics of the generalization error for GP regression under power-law priors and targets
Hui Jin · Pradeep Kr. Banerjee · Guido Montufar
Abstract:
We study the power-law asymptotics of learning curves for Gaussian process regression (GPR). When the eigenspectrum of the prior decays with rate $\alpha$ and the eigenexpansion coefficients of the target function decay with rate $\beta$, we show that the Bayesian generalization error behaves as $\tilde{O}\!\left(n^{\max\{\frac{1}{\alpha}-1,\,\frac{1-2\beta}{\alpha}\}}\right)$ with high probability over the draw of $n$ input samples. Infinitely wide neural networks can be related to GPR with respect to the Neural Network Gaussian Process (NNGP) kernel, which in several cases is known to have a power-law spectrum. Hence our methods can be applied to study the generalization error of infinitely wide neural networks. We present toy experiments demonstrating the theory.
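As a rough illustration of the claimed rate, the following minimal sketch (not the authors' code; the cosine basis, the truncation level M, the noise level, and all parameter values are assumptions chosen for illustration) simulates GPR with prior eigenvalues $\lambda_i = i^{-\alpha}$ and target coefficients $c_i = i^{-\beta}$, then compares the empirical log-log slope of the test error against $\max\{\frac{1}{\alpha}-1, \frac{1-2\beta}{\alpha}\}$.

```python
import numpy as np

# Assumed toy setup: orthonormal cosine basis on [0, 1] under the
# uniform measure, prior spectrum lambda_i = i^{-alpha}, target
# eigenexpansion coefficients c_i = i^{-beta}.
alpha, beta = 2.0, 1.0
M = 500                         # basis truncation (assumption)
idx = np.arange(1, M + 1)
lam = idx.astype(float) ** -alpha   # prior eigenvalues
coef = idx.astype(float) ** -beta   # target coefficients

def phi(x):
    # Orthonormal cosine features: phi_i(x) = sqrt(2) cos(pi i x).
    return np.sqrt(2.0) * np.cos(np.pi * np.outer(x, idx))

def target(x):
    return phi(x) @ coef

rng = np.random.default_rng(0)
noise = 1e-2                    # observation noise variance (assumption)
x_test = rng.uniform(size=2000)
y_test = target(x_test)

ns, errors = [50, 100, 200, 400, 800], []
for n in ns:
    x = rng.uniform(size=n)
    y = target(x) + np.sqrt(noise) * rng.normal(size=n)
    Phi, Phi_t = phi(x), phi(x_test)
    K = (Phi * lam) @ Phi.T               # kernel matrix K(x, x')
    k_star = (Phi_t * lam) @ Phi.T        # test/train cross-kernel
    mean = k_star @ np.linalg.solve(K + noise * np.eye(n), y)
    errors.append(np.mean((mean - y_test) ** 2))  # test error of posterior mean

slope = np.polyfit(np.log(ns), np.log(errors), 1)[0]
theory = max(1.0 / alpha - 1.0, (1.0 - 2.0 * beta) / alpha)
print(f"empirical exponent ~ {slope:.2f}, theoretical exponent = {theory:.2f}")
```

With these assumed values ($\alpha = 2$, $\beta = 1$) the predicted exponent is $-1/2$; the observation noise puts a floor on the achievable error, so the fitted slope is only indicative at moderate sample sizes.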