We introduce the Gaussian Process Convolution Model (GPCM), a two-stage nonparametric generative procedure that models stationary signals as the convolution between a continuous-time white-noise process and a continuous-time linear filter drawn from a Gaussian process. The GPCM is a continuous-time nonparametric-window moving average process and, conditionally, is itself a Gaussian process with a nonparametric kernel defined in a probabilistic fashion. The generative model can be equivalently considered in the frequency domain, where the power spectral density of the signal is specified using a Gaussian process. One of the main contributions of the paper is a novel variational free-energy approach based on inter-domain inducing variables that efficiently learns the continuous-time linear filter and infers the driving white-noise process. In turn, this scheme provides closed-form probabilistic estimates of the covariance kernel and of the noise-free signal in both denoising and prediction scenarios. Additionally, the variational inference procedure yields closed-form expressions for the approximate posterior of the spectral density given the observed data, leading to new Bayesian nonparametric approaches to spectrum estimation. The proposed GPCM is validated on synthetic and real-world signals.
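The generative view described above can be sketched numerically: draw a filter from a GP prior whose kernel includes a decay window (so the filter vanishes at long lags), then convolve it with discretised white noise to obtain a stationary signal. This is an illustrative sketch only, not the authors' implementation; the kernel choice, lengthscale, and decay constant here are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretised filter support (the continuous filter is truncated at tau = 5).
dt = 0.05
tau = np.arange(0.0, 5.0, dt)  # 100 points

# GP prior over the filter h(tau): a squared-exponential kernel multiplied by
# an exponential decay window so h(tau) -> 0 for large tau. The lengthscale
# (0.3) and decay constant (1.0) are illustrative assumptions.
ell = 0.3
K = np.exp(-0.5 * (tau[:, None] - tau[None, :]) ** 2 / ell ** 2)
window = np.exp(-tau)
K = window[:, None] * K * window[None, :]

# Sample one filter realisation via a Cholesky factor (jitter for stability).
L = np.linalg.cholesky(K + 1e-6 * np.eye(len(tau)))
h = L @ rng.standard_normal(len(tau))

# Continuous-time white noise, discretised: each sample has variance 1/dt.
T = 2000
w = rng.standard_normal(T) / np.sqrt(dt)

# The signal f = h * w (discretised convolution). Conditioned on h, f is a
# GP whose stationary kernel is the autocorrelation of the filter,
# k(t) = \int h(s) h(s + t) ds.
f = dt * np.convolve(w, h, mode="valid")
print(f.shape)  # (1901,) = (T - len(tau) + 1,)
```

Conditioning on the filter makes the implied kernel explicit: averaging over the GP prior on h instead induces the distribution over kernels (and, via the Fourier transform, over spectral densities) that the paper's variational scheme infers from data.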
Author Information
Felipe Tobar (Universidad de Chile)
Thang Bui (University of Cambridge)
Richard Turner (University of Cambridge)
More from the Same Authors
- 2020 Poster: Efficient Low Rank Gaussian Variational Inference for Neural Networks
  Marcin Tomczak · Siddharth Swaroop · Richard Turner
- 2020 Poster: Meta-Learning Stationary Stochastic Process Prediction with Convolutional Neural Processes
  Andrew Foong · Wessel Bruinsma · Jonathan Gordon · Yann Dubois · James Requeima · Richard Turner
- 2020 Poster: On the Expressiveness of Approximate Inference in Bayesian Neural Networks
  Andrew Foong · David Burt · Yingzhen Li · Richard Turner
- 2020 Poster: VAEM: a Deep Generative Model for Heterogeneous Mixed Type Data
  Chao Ma · Sebastian Tschiatschek · Richard Turner · José Miguel Hernández-Lobato · Cheng Zhang
- 2020 Poster: Continual Deep Learning by Functional Regularisation of Memorable Past
  Pingbo Pan · Siddharth Swaroop · Alexander Immer · Runa Eschenhagen · Richard Turner · Mohammad Emtiyaz Khan
- 2020 Oral: Continual Deep Learning by Functional Regularisation of Memorable Past
  Pingbo Pan · Siddharth Swaroop · Alexander Immer · Runa Eschenhagen · Richard Turner · Mohammad Emtiyaz Khan
- 2019 Poster: Band-Limited Gaussian Processes: The Sinc Kernel
  Felipe Tobar
- 2019 Poster: Icebreaker: Element-wise Efficient Information Acquisition with a Bayesian Deep Latent Gaussian Model
  Wenbo Gong · Sebastian Tschiatschek · Sebastian Nowozin · Richard Turner · José Miguel Hernández-Lobato · Cheng Zhang
- 2019 Poster: Practical Deep Learning with Bayesian Principles
  Kazuki Osawa · Siddharth Swaroop · Mohammad Emtiyaz Khan · Anirudh Jain · Runa Eschenhagen · Richard Turner · Rio Yokota
- 2018 Poster: Infinite-Horizon Gaussian Processes
  Arno Solin · James Hensman · Richard Turner
- 2018 Poster: Geometrically Coupled Monte Carlo Sampling
  Mark Rowland · Krzysztof Choromanski · François Chalus · Aldo Pacchiano · Tamas Sarlos · Richard Turner · Adrian Weller
- 2018 Spotlight: Geometrically Coupled Monte Carlo Sampling
  Mark Rowland · Krzysztof Choromanski · François Chalus · Aldo Pacchiano · Tamas Sarlos · Richard Turner · Adrian Weller
- 2018 Poster: Bayesian Nonparametric Spectral Estimation
  Felipe Tobar
- 2018 Spotlight: Bayesian Nonparametric Spectral Estimation
  Felipe Tobar
- 2017 Poster: Streaming Sparse Gaussian Process Approximations
  Thang Bui · Cuong Nguyen · Richard Turner
- 2017 Poster: Spectral Mixture Kernels for Multi-Output Gaussian Processes
  Gabriel Parra · Felipe Tobar
- 2017 Poster: Interpolated Policy Gradient: Merging On-Policy and Off-Policy Gradient Estimation for Deep Reinforcement Learning
  Shixiang (Shane) Gu · Timothy Lillicrap · Richard Turner · Zoubin Ghahramani · Bernhard Schölkopf · Sergey Levine
- 2016 Poster: Rényi Divergence Variational Inference
  Yingzhen Li · Richard Turner
- 2015 Poster: Neural Adaptive Sequential Monte Carlo
  Shixiang (Shane) Gu · Zoubin Ghahramani · Richard Turner
- 2015 Poster: Learning Stationary Time Series using Gaussian Processes with Nonparametric Kernels
  Felipe Tobar · Thang Bui · Richard Turner
- 2015 Poster: Stochastic Expectation Propagation
  Yingzhen Li · José Miguel Hernández-Lobato · Richard Turner
- 2015 Spotlight: Stochastic Expectation Propagation
  Yingzhen Li · José Miguel Hernández-Lobato · Richard Turner
- 2014 Poster: Tree-structured Gaussian Process Approximations
  Thang Bui · Richard Turner
- 2014 Spotlight: Tree-structured Gaussian Process Approximations
  Thang Bui · Richard Turner
- 2011 Poster: Probabilistic amplitude and frequency demodulation
  Richard Turner · Maneesh Sahani
- 2011 Spotlight: Probabilistic amplitude and frequency demodulation
  Richard Turner · Maneesh Sahani
- 2009 Poster: Occlusive Components Analysis
  Jörg Lücke · Richard Turner · Maneesh Sahani · Marc Henniges
- 2007 Workshop: Beyond Simple Cells: Probabilistic Models for Visual Cortical Processing
  Richard Turner · Pietro Berkes · Maneesh Sahani
- 2007 Poster: Modeling Natural Sounds with Modulation Cascade Processes
  Richard Turner · Maneesh Sahani
- 2007 Poster: On Sparsity and Overcompleteness in Image Models
  Pietro Berkes · Richard Turner · Maneesh Sahani