Poster
Fast Bayesian Inference with Batch Bayesian Quadrature via Kernel Recombination
Masaki Adachi · Satoshi Hayakawa · Martin Jørgensen · Harald Oberhauser · Michael A Osborne

Tue Nov 29 02:00 PM -- 04:00 PM (PST) @ Hall J #719

Calculation of Bayesian posteriors and model evidences typically requires numerical integration. Bayesian quadrature (BQ), a surrogate-model-based approach to numerical integration, is capable of superb sample efficiency, but its lack of parallelisation has hindered its practical applications. In this work, we propose a parallelised (batch) BQ method, employing techniques from kernel quadrature, that possesses an empirically exponential convergence rate. Additionally, just as with Nested Sampling, our method permits simultaneous inference of both posteriors and model evidence. Samples from our BQ surrogate model are re-selected to give a sparse set of samples via a kernel recombination algorithm, requiring negligible additional time to increase the batch size. Empirically, we find that our approach significantly outperforms both state-of-the-art BQ techniques and Nested Sampling in terms of sampling efficiency on various real-world datasets, including lithium-ion battery analytics.
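
To make the batch-BQ idea concrete, here is a minimal, hypothetical Python sketch, not the authors' implementation: it assumes a Gaussian prior and an RBF kernel, estimates the model evidence with standard BQ weights computed from a Monte Carlo approximation of the kernel mean, and replaces the kernel recombination step with a simple greedy farthest-point batch-selection rule; all function names and parameters are illustrative.

# Minimal sketch of batch Bayesian quadrature for model evidence.
# Assumptions: Gaussian prior, RBF kernel; the sparse batch is chosen by a
# greedy farthest-point rule as a simplified stand-in for kernel recombination.
import numpy as np

def rbf(X, Y, ls=1.0):
    # Squared-exponential kernel matrix between point sets X (n,d) and Y (m,d).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def bq_evidence(X, y, prior_samples, ls=1.0, jitter=1e-8):
    # BQ estimate of Z = E_prior[likelihood(theta)]:  Z ~= z^T K^{-1} y,
    # where z_i = E_prior[k(theta, x_i)] is approximated with prior samples.
    K = rbf(X, X, ls) + jitter * np.eye(len(X))
    z = rbf(prior_samples, X, ls).mean(axis=0)
    weights = np.linalg.solve(K, z)          # BQ quadrature weights
    return weights @ y

def select_sparse_batch(candidates, selected, batch_size, ls=1.0):
    # Greedy farthest-point selection: repeatedly pick the candidate with the
    # lowest maximum kernel similarity to points already chosen.
    chosen, current = [], list(selected)
    for _ in range(batch_size):
        if current:
            scores = rbf(candidates, np.array(current), ls).max(axis=1)
        else:
            scores = np.zeros(len(candidates))
        idx = int(np.argmin(scores))          # most novel candidate
        chosen.append(candidates[idx])
        current.append(candidates[idx])
    return np.array(chosen)

# Hypothetical usage on a toy 1-D problem with a standard normal prior.
rng = np.random.default_rng(0)
prior_samples = rng.normal(size=(2000, 1))
likelihood = lambda th: np.exp(-0.5 * ((th - 0.5) ** 2).sum(-1) / 0.1)

X = rng.normal(size=(5, 1))                   # initial design
y = likelihood(X)
for _ in range(4):                            # a few parallel batches
    cands = rng.normal(size=(500, 1))         # candidates drawn from the prior
    batch = select_sparse_batch(cands, X, batch_size=8)
    X = np.vstack([X, batch])                 # likelihood evaluated in parallel
    y = np.concatenate([y, likelihood(batch)])
print("BQ evidence estimate:", bq_evidence(X, y, prior_samples))

In the paper, the batch is instead chosen by kernel recombination, which selects a small weighted subset of candidate points that matches the surrogate's kernel mean embedding; the farthest-point rule above is only a compact placeholder for that step.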

Author Information

Masaki Adachi (University of Oxford)
Satoshi Hayakawa (University of Oxford)
Martin Jørgensen (University of Oxford)

I am a postdoctoral researcher at the University of Oxford and a Junior Research Fellow at Linacre College. Since April 2021, I have worked in Michael Osborne's Bayesian Exploration Lab. Before that, I was a PhD student under Søren Hauberg at the Technical University of Denmark. Research Interests: Gaussian Processes, Variational Bayesian Inference, Differential Geometry for Representation Learning, Bayesian Quadrature and Optimisation, Sample-efficient methods

Harald Oberhauser (University of Oxford)
Michael A Osborne (U Oxford)