In standard Gaussian Process regression, input locations are assumed to be noise-free. We present a simple yet effective GP model for training on input points corrupted by i.i.d. Gaussian noise. To make computations tractable, we use a local linear expansion about each input point, which allows the input noise to be recast as output noise proportional to the squared gradient of the GP posterior mean. The input noise variances are inferred from the data as extra hyperparameters, trained alongside the other hyperparameters by the usual maximisation of the marginal likelihood. Training uses an iterative scheme that alternates between optimising the hyperparameters and calculating the posterior gradient. Analytic predictive moments can then be found for Gaussian-distributed test points. We compare our model to others over a range of regression problems and show that it improves on current methods.
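The iterative scheme in the abstract can be sketched in a few lines: fit a GP, compute the gradient of the posterior mean at the training inputs, and refit with the input-noise variance referred to the output as (slope)² × (input variance). This is a minimal illustration, assuming a squared-exponential kernel and treating the input-noise variance as known rather than learned by marginal-likelihood optimisation as in the paper; all function names are illustrative.

```python
import numpy as np

def rbf(X1, X2, ell, sf2):
    # Squared-exponential kernel with lengthscales ell and signal variance sf2.
    d = X1[:, None, :] - X2[None, :, :]
    return sf2 * np.exp(-0.5 * np.sum((d / ell) ** 2, axis=-1))

def posterior_mean_grad(Xs, X, alpha, ell, sf2):
    # Gradient of the GP posterior mean m(x) = sum_j alpha_j k(x, x_j)
    # evaluated at the points Xs; for the RBF kernel,
    # dk(x, x_j)/dx = k(x, x_j) * (x_j - x) / ell^2.
    K = rbf(Xs, X, ell, sf2)                         # (m, n)
    d = (X[None, :, :] - Xs[:, None, :]) / ell**2    # (m, n, D)
    return np.einsum('mn,mnd,n->md', K, d, alpha)    # (m, D)

def nigp_fit(X, y, sx2, sy2, ell, sf2, iters=5):
    """Alternate between fitting the GP and recomputing the
    gradient-corrected output noise: extra_i = g_i^T diag(sx2) g_i,
    where g_i is the posterior-mean gradient at training input i."""
    n, D = X.shape
    extra = np.zeros(n)
    for _ in range(iters):
        K = rbf(X, X, ell, sf2) + np.diag(sy2 + extra)
        alpha = np.linalg.solve(K, y)
        g = posterior_mean_grad(X, X, alpha, ell, sf2)
        extra = np.sum(g**2 * sx2, axis=1)  # squared slope times input variance
    return alpha, extra
```

In steep regions of the latent function the corrected noise term grows, so those points are automatically down-weighted, which is the key effect of the linearisation.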
Author Information
Andrew McHutchon (University of Cambridge)
Carl Edward Rasmussen (University of Cambridge)
More from the Same Authors
- 2022: Gaussian Process parameterized Covariance Kernels for Non-stationary Regression
  Vidhi Lalchand · Talay Cheema · Laurence Aitchison · Carl Edward Rasmussen
- 2022 Poster: Sparse Gaussian Process Hyperparameters: Optimize or Integrate?
  Vidhi Lalchand · Wessel Bruinsma · David Burt · Carl Edward Rasmussen
- 2021 Poster: Kernel Identification Through Transformers
  Fergus Simpson · Ian Davies · Vidhi Lalchand · Alessandro Vullo · Nicolas Durrande · Carl Edward Rasmussen
- 2021 Poster: Marginalised Gaussian Processes with Nested Sampling
  Fergus Simpson · Vidhi Lalchand · Carl Edward Rasmussen
- 2020: Combining variational autoencoder representations with structural descriptors improves prediction of docking scores
  Miguel Garcia-Ortegon · Carl Edward Rasmussen · Hiroshi Kajino
- 2020 Poster: Ensembling geophysical models with Bayesian Neural Networks
  Ushnish Sengupta · Matt Amos · Scott Hosking · Carl Edward Rasmussen · Matthew Juniper · Paul Young
- 2017 Poster: Convolutional Gaussian Processes
  Mark van der Wilk · Carl Edward Rasmussen · James Hensman
- 2017 Oral: Convolutional Gaussian Processes
  Mark van der Wilk · Carl Edward Rasmussen · James Hensman
- 2017 Poster: Data-Efficient Reinforcement Learning in Continuous State-Action Gaussian-POMDPs
  Rowan McAllister · Carl Edward Rasmussen
- 2016 Poster: Understanding Probabilistic Sparse Gaussian Process Approximations
  Matthias Bauer · Mark van der Wilk · Carl Edward Rasmussen
- 2014 Poster: Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models
  Yarin Gal · Mark van der Wilk · Carl Edward Rasmussen
- 2014 Poster: Variational Gaussian Process State-Space Models
  Roger Frigola · Yutian Chen · Carl Edward Rasmussen
- 2013 Poster: Bayesian Inference and Learning in Gaussian Process State-Space Models with Particle MCMC
  Roger Frigola · Fredrik Lindsten · Thomas Schön · Carl Edward Rasmussen
- 2012 Poster: Active Learning of Model Evidence Using Bayesian Quadrature
  Michael A Osborne · David Duvenaud · Roman Garnett · Carl Edward Rasmussen · Stephen J Roberts · Zoubin Ghahramani
- 2011 Poster: Additive Gaussian Processes
  David Duvenaud · Hannes Nickisch · Carl Edward Rasmussen
- 2009 Workshop: Probabilistic Approaches for Control and Robotics
  Marc Deisenroth · Hilbert J Kappen · Emo Todorov · Duy Nguyen-Tuong · Carl Edward Rasmussen · Jan Peters
- 2006 Tutorial: Advances in Gaussian Processes
  Carl Edward Rasmussen