Deep Mahalanobis Gaussian Process
Daniel Augusto de Souza · Diego Mesquita · César Lincoln Mattos · João Paulo Gomes

We propose a class of hierarchical Gaussian process priors in which each layer of the hierarchy controls the lengthscales of the next. While this idea has been explored before, our proposal extends previous work on the Mahalanobis distance kernel, providing an alternative construction of non-stationary RBF-style kernels. This alternative has more desirable theoretical properties, restoring one of the interpretations of input-dependent lengthscales. More specifically, we interpret our model as a GP that performs non-linear dimensionality reduction through locally linear mappings. We directly compare it with compositional deep Gaussian processes, a popular class of models that uses successive mappings to latent spaces to alleviate the burden of choosing a kernel function. Our experiments show promising results on both synthetic and real-world datasets.
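To make the kernel construction concrete, here is a minimal NumPy sketch of a Mahalanobis RBF kernel and a non-stationary variant with an input-dependent projection. This is an illustrative assumption, not the paper's exact formulation: the names `mahalanobis_rbf`, `nonstationary_mahalanobis_rbf`, and `W_fn` are hypothetical, and the symmetrisation by averaging the two local metrics is one common device for building non-stationary RBF-style kernels, which may differ from the authors' construction.

```python
import numpy as np

def mahalanobis_rbf(x1, x2, W):
    """Stationary Mahalanobis RBF kernel:
    k(x, x') = exp(-0.5 * d^T (W W^T) d), with d = x - x'.
    With a (D x Q) projection W, this equals an ordinary RBF kernel
    applied to the linearly projected inputs W^T x, i.e. the kernel
    performs linear dimensionality reduction."""
    d = x1 - x2
    z = W.T @ d  # project the difference into the Q-dim latent space
    return np.exp(-0.5 * z @ z)

def nonstationary_mahalanobis_rbf(x1, x2, W_fn):
    """Hypothetical non-stationary variant: the projection W(x) is
    input-dependent (e.g. produced by a previous GP layer), so the
    model behaves like locally linear dimensionality reduction.
    The two local metrics are averaged to keep the kernel symmetric."""
    M = 0.5 * (W_fn(x1) @ W_fn(x1).T + W_fn(x2) @ W_fn(x2).T)
    d = x1 - x2
    return np.exp(-0.5 * d @ M @ d)
```

When `W_fn` returns the same matrix everywhere, the non-stationary variant collapses back to the stationary Mahalanobis kernel, which is the sense in which it generalises the earlier construction.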

Author Information

Daniel Augusto de Souza (University College London)
Diego Mesquita (Aalto University)
César Lincoln Mattos (Federal University of Ceará)

César Lincoln Cavalcante Mattos is an associate professor at the Department of Computer Science of the Federal University of Ceará (UFC), Brazil. He is also an associate researcher at the Logics and Artificial Intelligence Group (LOGIA). His research interests span the broad fields of machine learning and probabilistic modeling, including Gaussian processes, deep (probabilistic) learning, approximate inference, and system identification. He has applied learning methods in several research and development collaborations in areas such as dynamical system modeling, health risk analysis, software repository mining, and anomaly detection.

João Paulo Gomes (Federal University of Ceará)
