Gaussian Process parameterized Covariance Kernels for Non-stationary Regression
Vidhi Lalchand · Talay Cheema · Laurence Aitchison · Carl Edward Rasmussen

A large cross-section of the Gaussian process literature uses universal kernels like the squared exponential (SE) kernel along with automatic relevance determination (ARD) in high dimensions. The ARD framework in covariance kernels operates by pruning away extraneous dimensions through contracting their inverse-lengthscales. This work considers probabilistic inference in the factorised Gibbs kernel (FGK) [Gibbs, 1998] and the multivariate Gibbs kernel (MGK) [Paciorek, 2003] with input-dependent lengthscales. These kernels allow for non-stationary modelling, where samples from the posterior function space "adapt" to the varying smoothness structure inherent in the ground truth. We propose parameterizing the lengthscale function of the factorised and multivariate Gibbs covariance functions with a latent Gaussian process defined on the same inputs. For large datasets, we show how these non-stationary constructions are compatible with sparse inducing-variable formulations for regression. Experiments on synthetic and real-world spatial datasets for precipitation modelling and temperature trends demonstrate the feasibility and utility of the approach.
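To make the construction concrete, the following is a minimal sketch of the factorised Gibbs kernel described in the abstract. The kernel formula follows Gibbs [1998]; the lengthscale function `ell` below is a hypothetical deterministic stand-in for the latent Gaussian process the paper places over lengthscales, and all function names are illustrative rather than taken from the authors' code.

```python
import numpy as np

def factorised_gibbs_kernel(X1, X2, lengthscale_fn):
    """Factorised Gibbs kernel (Gibbs, 1998) with input-dependent
    lengthscales.

    X1: (n, d) inputs, X2: (m, d) inputs.
    lengthscale_fn: maps an (n, d) array of inputs to an (n, d)
    array of positive per-dimension lengthscales.
    """
    L1 = lengthscale_fn(X1)  # (n, d)
    L2 = lengthscale_fn(X2)  # (m, d)
    # Pairwise sums of squared lengthscales per dimension: (n, m, d)
    S = L1[:, None, :] ** 2 + L2[None, :, :] ** 2
    # Per-dimension prefactor sqrt(2 l(x) l(x') / (l(x)^2 + l(x')^2)),
    # which guarantees positive semi-definiteness of the product kernel.
    pref = np.sqrt(2.0 * L1[:, None, :] * L2[None, :, :] / S)
    # Squared per-dimension differences: (n, m, d)
    D2 = (X1[:, None, :] - X2[None, :, :]) ** 2
    return np.prod(pref, axis=-1) * np.exp(-np.sum(D2 / S, axis=-1))

# Hypothetical lengthscale function: smoothness varies with |x|,
# standing in for the latent-GP parameterization in the paper.
def ell(X):
    return 0.5 + np.abs(X)

X = np.linspace(-2.0, 2.0, 5).reshape(-1, 1)
K = factorised_gibbs_kernel(X, X, ell)
```

With a constant lengthscale function this reduces to the ordinary ARD squared exponential kernel; the input dependence is what gives the non-stationary behaviour the abstract describes.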

Author Information

Vidhi Lalchand (University of Cambridge)

Ph.D. student in Machine Learning at Cambridge. I work on Bayesian non-parametrics, Gaussian processes, and kernel learning. Application areas: high-energy physics, astronomy, science!

Talay Cheema (University of Cambridge)
Laurence Aitchison (University of Bristol)
Carl Edward Rasmussen (University of Cambridge)
