Invited Talk in Workshop: Machine Learning with New Compute Paradigms

Low-precision Sampling for Probabilistic Deep Learning

Ruqi Zhang

Sat 16 Dec 9:05 a.m. PST — 9:25 a.m. PST

Abstract:

Sampling from a probability distribution is a ubiquitous challenge in machine learning, arising in settings from generative AI to approximate Bayesian inference. This talk will show how to leverage low-precision compute to accelerate Markov chain Monte Carlo (MCMC) sampling with theoretical guarantees on convergence. First, I will introduce a general and theoretically grounded framework that enables low-precision sampling, with applications to Stochastic Gradient Langevin Dynamics and Stochastic Gradient Hamiltonian Monte Carlo. Then I will present an approach for binary sampling, which operates at 1-bit precision. Finally, I will present experimental results of low-precision sampling on a variety of deep learning tasks.
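To make the idea concrete, below is a minimal sketch of Stochastic Gradient Langevin Dynamics with the chain state stored at reduced precision, run on a toy standard-Gaussian target. The fixed-point quantizer, the target, and all parameter choices here are illustrative assumptions for this sketch, not the specific framework from the talk.

```python
import numpy as np

def quantize(x, bits=8, scale=4.0):
    # Illustrative fixed-point quantizer: round to a uniform grid of
    # 2**bits levels over [-scale, scale]. Not the talk's exact scheme.
    levels = 2 ** (bits - 1)
    step = scale / levels
    return np.clip(np.round(x / step), -levels, levels - 1) * step

def grad_log_p(theta):
    # Gradient of the log-density of a standard Gaussian target:
    # log p(theta) = -theta**2 / 2, so its gradient is -theta.
    return -theta

def low_precision_sgld(steps=20000, lr=0.05, bits=8, seed=0):
    # Langevin update: theta <- theta + lr * grad_log_p(theta) + N(0, 2*lr),
    # with the iterate quantized to low precision after every step.
    rng = np.random.default_rng(seed)
    theta = 0.0
    samples = np.empty(steps)
    for t in range(steps):
        noise = rng.normal(0.0, np.sqrt(2.0 * lr))
        theta = theta + lr * grad_log_p(theta) + noise
        theta = quantize(theta, bits=bits)  # store chain state in low precision
        samples[t] = theta
    return samples

samples = low_precision_sgld()
print(samples.mean(), samples.std())  # both should be close to 0 and 1
```

Because the quantization grid spacing (here 4/128 ≈ 0.03) is small relative to the Langevin noise, the empirical mean and standard deviation of the chain stay close to those of the Gaussian target, which is the kind of behavior the convergence guarantees in the talk formalize.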
