Poster

Collapsed variational Bayes for Markov jump processes

Boqian Zhang · Jiangwei Pan · Vinayak Rao

Pacific Ballroom #187

Keywords: [ Stochastic Methods ] [ Variational Inference ] [ Unsupervised Learning ] [ Hierarchical Models ] [ Latent Variable Models ] [ Web Applications and Internet Data ]


Abstract:

Markov jump processes are continuous-time stochastic processes widely used in statistical applications in the natural sciences, and more recently in machine learning. Inference for these models typically proceeds via Markov chain Monte Carlo, and can suffer from various computational challenges. In this work, we propose a novel collapsed variational inference algorithm to address these challenges. Our work leverages ideas from discrete-time Markov chains, and exploits a connection between the two through a construction called uniformization. Our algorithm proceeds by marginalizing out the parameters of the Markov jump process, and then approximating the distribution over the trajectory with a factored distribution over segments of a piecewise-constant function. Unlike MCMC schemes that marginalize out transition times of a piecewise-constant process, our scheme optimizes the discretization of time, resulting in significant computational savings. We apply our ideas to synthetic data as well as a dataset of check-in recordings, where we demonstrate superior performance over state-of-the-art MCMC methods.
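The uniformization construction mentioned in the abstract links a Markov jump process with rate matrix $A$ to a discrete-time chain: pick a rate $\Omega \geq \max_i |A_{ii}|$, draw candidate jump times from a Poisson process with rate $\Omega$, and evolve states at those times under the transition matrix $B = I + A/\Omega$; discarding self-transitions leaves an exact MJP sample. The sketch below illustrates this standard construction (it is not the paper's collapsed variational algorithm); the function name and argument conventions are this sketch's own.

```python
import numpy as np

def uniformize_sample(A, pi0, T, omega=None, rng=None):
    """Sample an MJP trajectory on [0, T] via uniformization.

    A: generator (rate) matrix with rows summing to 0; pi0: initial
    state distribution. Candidate jump times come from a homogeneous
    Poisson(omega) process; states evolve under the discrete-time
    transition matrix B = I + A/omega. Virtual (self-) jumps are
    thinned out, leaving an exact draw from the MJP.
    """
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    if omega is None:
        # Any omega >= max_i |A_ii| is valid; larger omega means more
        # candidate times but the same marginal law over trajectories.
        omega = 2.0 * np.max(-np.diag(A))
    B = np.eye(n) + A / omega
    # Candidate transition times: Poisson(omega * T) many, uniform on [0, T].
    num = rng.poisson(omega * T)
    times = np.sort(rng.uniform(0.0, T, size=num))
    states = [rng.choice(n, p=pi0)]
    jump_times = [0.0]
    for t in times:
        s = rng.choice(n, p=B[states[-1]])
        if s != states[-1]:  # keep only real state changes
            states.append(s)
            jump_times.append(t)
    return np.array(jump_times), np.array(states)
```

The returned piecewise-constant trajectory (jump times plus the state held after each jump) is exactly the kind of object whose segment structure the paper's variational scheme approximates with a factored distribution, optimizing the time discretization rather than resampling it.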
