We propose a class of hierarchical Gaussian process priors in which each layer of the hierarchy controls the lengthscales of the next. While such hierarchies have been explored before, our proposal extends previous work on the Mahalanobis-distance kernel, yielding an alternative construction of non-stationary RBF-style kernels. This alternative construction has more desirable theoretical properties, restoring one of the interpretations of input-dependent lengthscales. More specifically, our model can be interpreted as a GP that performs non-linear dimensionality reduction through locally linear mappings. We directly compare it with compositional deep Gaussian processes, a popular family of models that applies successive mappings to latent spaces to alleviate the burden of choosing a kernel function. Our experiments show promising results on both synthetic and real-world datasets.
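To make the idea of input-dependent lengthscales concrete, below is a minimal NumPy sketch of the classic Gibbs construction of a non-stationary RBF-style kernel, in which a lengthscale function, standing in for a lower GP layer, modulates the kernel pointwise. This is not the paper's Mahalanobis-distance-based construction; the 1-D setting, the function names, and the fixed lengthscale function are all illustrative assumptions.

```python
import numpy as np

def gibbs_kernel(x1, x2, lengthscale_fn):
    """Gibbs (non-stationary RBF-style) kernel for 1-D inputs.

    lengthscale_fn maps each input to a positive lengthscale; in a
    hierarchical GP prior, it would itself be a (transformed) GP layer.
    """
    l1 = lengthscale_fn(x1)[:, None]          # shape (n1, 1)
    l2 = lengthscale_fn(x2)[None, :]          # shape (1, n2)
    sq_sum = l1**2 + l2**2
    # Normalizing prefactor that keeps the kernel positive semi-definite.
    prefactor = np.sqrt(2.0 * l1 * l2 / sq_sum)
    dist2 = (x1[:, None] - x2[None, :])**2    # squared distances, (n1, n2)
    return prefactor * np.exp(-dist2 / sq_sum)

# Hypothetical lengthscale layer: a fixed smooth function standing in for a
# sample from the lengthscale GP, so the snippet runs on its own.
lengthscale_fn = lambda x: 0.5 + 0.4 * np.sin(x)

x = np.linspace(0.0, 10.0, 50)
K = gibbs_kernel(x, x, lengthscale_fn)
# Draw one sample from the resulting non-stationary GP prior (jitter for
# numerical stability of the covariance).
f = np.random.default_rng(0).multivariate_normal(np.zeros(len(x)), K + 1e-8 * np.eye(len(x)))
```

In the hierarchical priors the abstract describes, the lengthscale function would carry its own GP prior and be inferred jointly with the output layer; it is fixed here purely to keep the sketch self-contained.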
Author Information
Daniel Augusto de Souza (University College London)
Diego Mesquita (Aalto University)
César Lincoln Mattos (Federal University of Ceará)
César Lincoln Cavalcante Mattos is an associate professor at the Department of Computer Science of the Federal University of Ceará (UFC), Brazil. He is also an associate researcher at the Logics and Artificial Intelligence Group (LOGIA). His research interests span the broad fields of machine learning and probabilistic modeling, including Gaussian processes, deep (probabilistic) learning, approximate inference, and system identification. He has applied learning methods in several research and development collaborations in areas such as dynamical system modeling, health risk analysis, software repository mining, and anomaly detection.
João Paulo Gomes (Federal University of Ceará)
More from the Same Authors
- 2023 Poster: Thin and deep Gaussian processes
  Daniel Augusto de Souza · Alexander Nikitin · ST John · Magnus Ross · Mauricio A Álvarez · Marc Deisenroth · João Paulo Gomes · Diego Mesquita · César Lincoln Mattos
- 2021: Concluding Remarks
  Felipe Tobar · César Lincoln Mattos
- 2021: From GPLVM to Deep GPs
  César Lincoln Mattos
- 2021: Current Trends on Kernel Design
  César Lincoln Mattos
- 2021: Sparse Approximations
  César Lincoln Mattos
- 2021: Beyond Gaussian Likelihood
  César Lincoln Mattos
- 2021 Tutorial: The Art of Gaussian Processes: Classical and Contemporary
  César Lincoln Mattos · Felipe Tobar
- 2021: Live Intro
  Felipe Tobar · César Lincoln Mattos
- 2020 Poster: Rethinking pooling in graph neural networks
  Diego Mesquita · Amauri Souza · Samuel Kaski
- 2019: Coffee/Poster session 1
  Shiro Takagi · Khurram Javed · Johanna Sommer · Amr Sharaf · Pierluca D'Oro · Ying Wei · Sivan Doveh · Colin White · Santiago Gonzalez · Cuong Nguyen · Mao Li · Tianhe Yu · Tiago Ramalho · Masahiro Nomura · Ahsan Alvi · Jean-Francois Ton · W. Ronny Huang · Jessica Lee · Sebastian Flennerhag · Michael Zhang · Abram Friesen · Paul Blomstedt · Alina Dubatovka · Sergey Bartunov · Subin Yi · Iaroslav Shcherbatyi · Christian Simon · Zeyuan Shang · David MacLeod · Lu Liu · Liam Fowl · Diego Mesquita · Deirdre Quillen