Liberty or Depth: Deep Bayesian Neural Nets Do Not Need Complex Weight Posterior Approximations
Sebastian Farquhar · Lewis Smith · Yarin Gal

Thu Dec 10 09:00 AM -- 11:00 AM (PST) @ Poster Session 5 #1480

We challenge the longstanding assumption that the mean-field approximation for variational inference in Bayesian neural networks is severely restrictive, and show this is not the case in deep networks. We prove several results indicating that deep mean-field variational weight posteriors can induce similar distributions in function-space to those induced by shallower networks with complex weight posteriors. We validate our theoretical contributions empirically, both through examination of the weight posterior using Hamiltonian Monte Carlo in small models and by comparing diagonal- to structured-covariance in large settings. Since complex variational posteriors are often expensive and cumbersome to implement, our results suggest that using mean-field variational inference in a deeper model is both a practical and theoretically justified alternative to structured approximations.
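To make the contrast concrete, below is a minimal sketch (not the authors' released code) of the mean-field setup the abstract refers to: each weight gets an independent Gaussian posterior trained with the reparameterization trick. The layer name, initialisation values, and network sizes are illustrative assumptions, written here in PyTorch.

```python
# Hypothetical sketch of a mean-field (fully factorised Gaussian) variational layer.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MeanFieldLinear(nn.Module):
    """Linear layer with a diagonal Gaussian posterior over weights and biases."""

    def __init__(self, in_features, out_features, prior_std=1.0):
        super().__init__()
        self.weight_mu = nn.Parameter(torch.randn(out_features, in_features) * 0.05)
        self.weight_rho = nn.Parameter(torch.full((out_features, in_features), -5.0))
        self.bias_mu = nn.Parameter(torch.zeros(out_features))
        self.bias_rho = nn.Parameter(torch.full((out_features,), -5.0))
        self.prior_std = prior_std

    def forward(self, x):
        # sigma = softplus(rho) keeps posterior standard deviations positive.
        w_sigma = F.softplus(self.weight_rho)
        b_sigma = F.softplus(self.bias_rho)
        # Reparameterization trick: sample weights as mu + sigma * eps.
        w = self.weight_mu + w_sigma * torch.randn_like(w_sigma)
        b = self.bias_mu + b_sigma * torch.randn_like(b_sigma)
        return F.linear(x, w, b)

    def kl_divergence(self):
        # KL between each factorised Gaussian posterior and a N(0, prior_std^2) prior.
        def kl(mu, sigma):
            return (torch.log(self.prior_std / sigma)
                    + (sigma ** 2 + mu ** 2) / (2 * self.prior_std ** 2) - 0.5).sum()
        return (kl(self.weight_mu, F.softplus(self.weight_rho))
                + kl(self.bias_mu, F.softplus(self.bias_rho)))


# A deeper mean-field network of the kind the paper argues can match the
# function-space flexibility of shallower models with structured posteriors.
net = nn.Sequential(MeanFieldLinear(784, 256), nn.ReLU(),
                    MeanFieldLinear(256, 256), nn.ReLU(),
                    MeanFieldLinear(256, 10))
```

The appeal of this parameterisation is its cost: the posterior adds only one variance parameter per weight, whereas structured-covariance approximations scale worse and are harder to implement, which is the trade-off the abstract highlights.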

Author Information

Sebastian Farquhar (University of Oxford)
Lewis Smith (University of Oxford)

Lewis Smith is a DPhil student supervised by Yarin Gal. His main interests are in the reliability and robustness of machine learning algorithms, Bayesian methods, and the utilisation of structure (such as invariances in the data). He is also a member of the [AIMS CDT](www.aims.robots.ox.ac.uk). Before joining OATML, he received his master's degree in physics from the University of Manchester.

Yarin Gal (University of Oxford)
