Poster

Asymptotic Properties for Bayesian Neural Network in Besov Space

Kyeongwon Lee · Jaeyong Lee

Hall J (level 1) #714

Keywords: [ Sparsity ] [ Bayesian neural network ] [ optimal rate ] [ posterior consistency ] [ Deep Learning ]


Abstract:

Neural networks have shown great predictive power when applied to unstructured data such as images and natural language. A Bayesian neural network captures the uncertainty of its predictions by computing the posterior distribution of the model parameters. In this paper, we show that the Bayesian neural network with a spike-and-slab prior achieves posterior consistency with a near-minimax optimal convergence rate when the true regression function belongs to a Besov space. The spike-and-slab prior is adaptive to the smoothness of the regression function, and the posterior convergence rate does not change even when the smoothness is unknown. We also consider a shrinkage prior, which is computationally more feasible than the spike-and-slab prior, and show that it attains the same posterior convergence rate as the spike-and-slab prior.
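For orientation, a spike-and-slab prior places each network parameter in a two-component mixture of a point mass at zero and a continuous "slab" density, which is what induces sparsity. The display below is a generic sketch of this form; the symbols $\lambda$ (prior inclusion probability), $g$ (slab density), and $T$ (number of parameters) are illustrative notation, not necessarily the paper's.

\[
\pi(\theta_j) \;=\; (1-\lambda)\,\delta_0(\theta_j) \;+\; \lambda\, g(\theta_j), \qquad j = 1,\dots,T,
\]

where $\delta_0$ denotes the point mass at zero, $\lambda \in (0,1)$ governs how many parameters are set exactly to zero, and $g$ is a continuous density (for example, uniform or Gaussian). A shrinkage prior replaces the point mass with a continuous density sharply peaked at zero, which avoids the combinatorial model search and is why it is computationally more feasible.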
