Poster
Markov Chain Score Ascent: A Unifying Framework of Variational Inference with Markovian Gradients
Kyurae Kim · Jisu Oh · Jacob Gardner · Adji Bousso Dieng · Hongseok Kim

Thu Dec 01 02:00 PM -- 04:00 PM (PST) @ Hall J #434

Minimizing the inclusive Kullback-Leibler (KL) divergence with stochastic gradient descent (SGD) is challenging since its gradient is defined as an integral over the posterior. Recently, multiple methods have been proposed to run SGD with biased gradient estimates obtained from a Markov chain. This paper provides the first non-asymptotic convergence analysis of these methods by establishing their mixing rate and gradient variance. To do this, we demonstrate that these methods—which we collectively refer to as Markov chain score ascent (MCSA) methods—can be cast as special cases of the Markov chain gradient descent framework. Furthermore, by leveraging this new understanding, we develop a novel MCSA scheme, parallel MCSA (pMCSA), that achieves a tighter bound on the gradient variance. We demonstrate that this improved theoretical result translates to superior empirical performance.
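For intuition, the gradient of the inclusive KL is ∇λ KL(π ‖ qλ) = −E_{z∼π}[∇λ log qλ(z)], so SGD amounts to ascending the expected score of qλ under the posterior π, with the expectation approximated by the states of a Markov chain targeting π. The Python sketch below is a minimal illustration of this single-chain score-ascent idea under assumptions not taken from the paper: a 1-D Gaussian target, a Gaussian variational family qλ = N(μ, exp(log σ)²), and a random-walk Metropolis-Hastings kernel. It is not the authors' implementation, and it does not implement the pMCSA estimator, which (roughly) tightens the gradient variance by averaging the score over several independently advanced chains rather than one.

# Minimal sketch of Markov chain score ascent (MCSA); all concrete
# choices here (target, variational family, MH kernel, step sizes)
# are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def log_pi(z):                      # unnormalized log density of the target
    return -0.5 * (z - 3.0) ** 2    # pi = N(3, 1)

def mh_step(z, step=1.0):
    """One random-walk Metropolis-Hastings transition targeting pi."""
    prop = z + step * rng.normal()
    if np.log(rng.uniform()) < log_pi(prop) - log_pi(z):
        return prop
    return z

def score(z, mu, log_sigma):
    """Gradient of log q_lambda(z) w.r.t. (mu, log_sigma)."""
    sigma = np.exp(log_sigma)
    d_mu = (z - mu) / sigma ** 2
    d_log_sigma = ((z - mu) / sigma) ** 2 - 1.0
    return np.array([d_mu, d_log_sigma])

mu, log_sigma = 0.0, 0.0
z = 0.0                             # a single chain; pMCSA would run several in parallel
lr = 5e-3
for t in range(20_000):
    z = mh_step(z)                  # advance the chain; its law approaches pi
    # Ascend the score along the chain: a biased, Markovian estimate of
    # E_pi[grad log q], i.e. SGD on the inclusive KL.
    mu, log_sigma = np.array([mu, log_sigma]) + lr * score(z, mu, log_sigma)

print(f"fitted q: mean={mu:.2f}, std={np.exp(log_sigma):.2f}")  # roughly N(3, 1)

Because each gradient estimate is computed from a dependent, non-stationary chain state rather than an exact posterior draw, the estimate is biased and Markovian; the paper's contribution is precisely to bound the mixing rate and gradient variance of such schemes within the Markov chain gradient descent framework.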

Author Information

Kyurae Kim (University of Pennsylvania)

I am a Ph.D. student at the University of Pennsylvania, advised by Professor Jacob R. Gardner, working on Bayesian machine learning, Bayesian inference, and Bayesian optimization. I received my Bachelor of Engineering degree from Sogang University, South Korea. Previously, I worked as an undergraduate researcher at Samsung Medical Center and as a visiting researcher at Kangbuk Samsung Hospital, both in South Korea, and as a research associate at the University of Liverpool. I have also worked part-time as an embedded software engineer at Hansono, South Korea. I am a member of both the ACM and the IEEE.

Jisu Oh (Sogang University)
Jacob Gardner (University of Pennsylvania)
Adji Bousso Dieng (Princeton University & Google AI)
Hongseok Kim (Sogang University)
