

Poster in Workshop: Meta-Learning

Meta-Learning Bayesian Neural Network Priors Based on PAC-Bayesian Theory

Jonas Rothfuss


Abstract:

Bayesian Neural Networks (BNNs) are a promising approach to improved uncertainty quantification and sample efficiency. Due to their complex parameter space, choosing informative priors for BNNs is challenging; thus, a naive zero-centered Gaussian prior is often used, resulting in both poor generalization and poor uncertainty estimates when training data is scarce. Meta-learning, in contrast, aims to extract such prior knowledge from a set of related learning tasks. We propose a principled and scalable algorithm for meta-learning BNN priors based on PAC-Bayesian bounds. Whereas previous approaches require optimizing the prior and multiple variational posteriors in an interdependent manner, our method avoids difficult nested optimization problems. Our experiments show that the proposed method is not only computationally more efficient but also yields better predictions and uncertainty estimates than previous meta-learning methods and BNNs with standard priors.
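For intuition, below is a minimal PyTorch sketch of the general PAC-Bayes-style recipe the abstract alludes to: a diagonal Gaussian prior over BNN weights is meta-learned by directly optimizing Monte Carlo estimates of each task's generalized marginal log-likelihood ln Z_beta, plus a simple regularizer, so no per-task variational posterior has to be trained. All names, hyper-parameters, the toy sinusoid tasks, and the quadratic regularizer (standing in for the bound's KL complexity term) are illustrative assumptions, not the paper's actual algorithm or constants.

```python
import math
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# --- Toy meta-training data: a few sinusoid regression tasks (illustrative only) ---
n_tasks, m_points, in_dim, hidden = 8, 20, 1, 16
tasks = []
for _ in range(n_tasks):
    amp, phase = 0.5 + 2.0 * torch.rand(1), math.pi * torch.rand(1)
    X = 8.0 * torch.rand(m_points, in_dim) - 4.0
    tasks.append((X, amp * torch.sin(X + phase)))

# Flat parameter vector for a one-hidden-layer BNN: W1, b1, W2, b2
n_params = in_dim * hidden + hidden + hidden + 1

def bnn_forward(theta, X):
    """Evaluate the MLP for a batch of K flat weight samples theta: [K, n_params]."""
    i = 0
    W1 = theta[:, i:i + in_dim * hidden].view(-1, in_dim, hidden); i += in_dim * hidden
    b1 = theta[:, i:i + hidden].view(-1, 1, hidden); i += hidden
    W2 = theta[:, i:i + hidden].view(-1, hidden, 1); i += hidden
    b2 = theta[:, i:].view(-1, 1, 1)
    h = torch.tanh(X.unsqueeze(0) @ W1 + b1)  # [K, m, hidden]
    return h @ W2 + b2                        # [K, m, 1]

# Learnable diagonal Gaussian prior over BNN weights: N(mu, softplus(rho)^2)
mu = torch.zeros(n_params, requires_grad=True)
rho = torch.full((n_params,), -1.0, requires_grad=True)

beta, K, reg = 10.0, 64, 1e-3  # inverse temperature, MC samples, regularizer strength
opt = torch.optim.Adam([mu, rho], lr=1e-2)

for step in range(1000):
    sigma = F.softplus(rho)
    theta = mu + sigma * torch.randn(K, n_params)  # reparameterized prior samples
    meta_loss = 0.0
    for X, y in tasks:
        # Per-sample empirical loss (squared error) under each weight sample
        mse = ((bnn_forward(theta, X) - y.unsqueeze(0)) ** 2).mean(dim=(1, 2))  # [K]
        # Monte Carlo estimate of the generalized marginal log-likelihood:
        #   ln Z_beta(S_i, P) = ln E_{theta ~ P}[exp(-beta * L_hat(theta, S_i))]
        log_Z = torch.logsumexp(-beta * mse, dim=0) - math.log(K)
        meta_loss = meta_loss - log_Z / (beta * m_points)
    # Average over tasks; the quadratic term stands in for a KL to a hyper-prior
    meta_loss = meta_loss / n_tasks + reg * (mu.pow(2).sum() + sigma.pow(2).sum())
    opt.zero_grad(); meta_loss.backward(); opt.step()
```

Note that the only parameters updated are those of the prior itself (mu, rho): the per-task fit enters solely through the sampled marginal-likelihood estimates, which is what removes the nested prior/posterior optimization the abstract contrasts against.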
