Efficient Variational Gaussian Processes Initialization via Kernel-based Least Squares Fitting
Xinran Zhu · David Bindel · Jacob Gardner

Stochastic variational Gaussian processes (SVGP) scale Gaussian process inference to large datasets through inducing points and stochastic training. However, training involves a hard multimodal optimization problem, and when inducing points are initialized directly from the training data, it often converges slowly to suboptimal solutions. We provide a better initialization of inducing points based on kernel-based least squares fitting. We show empirically that our approach consistently reaches better prediction performance with far fewer training epochs. Our initialization saves up to 38% of the total time cost compared to standard SVGP training.
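To make the idea of a kernel-based least squares initialization concrete, the sketch below shows one plausible instance (not necessarily the paper's exact procedure): given fixed inducing locations Z, the variational mean at the inducing points is initialized by least-squares fitting the training targets under the standard sparse GP predictor K_xz K_zz^{-1} m. The RBF kernel, lengthscale, and inducing-point placement are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel matrix between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def init_inducing_mean(X, y, Z, jitter=1e-6):
    # Hypothetical sketch: choose the variational mean m at inducing
    # points Z by minimizing ||Kxz Kzz^{-1} m - y||_2 in closed form.
    Kzz = rbf_kernel(Z, Z) + jitter * np.eye(len(Z))
    Kxz = rbf_kernel(X, Z)
    A = Kxz @ np.linalg.inv(Kzz)          # sparse-GP predictive weights
    m, *_ = np.linalg.lstsq(A, y, rcond=None)
    return m

# Toy 1-D regression problem.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
Z = np.linspace(-3, 3, 15)[:, None]       # inducing locations on a grid
m = init_inducing_mean(X, y, Z)

# Residual of the fitted sparse-GP mean on the training data.
Kzz = rbf_kernel(Z, Z) + 1e-6 * np.eye(len(Z))
pred = rbf_kernel(X, Z) @ np.linalg.solve(Kzz, m)
resid = np.linalg.norm(pred - y)
```

Starting stochastic training from such an m (instead of the usual zero-mean initialization) gives the optimizer a variational distribution that already fits the data, which is consistent with the faster convergence the abstract reports.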

Author Information

Xinran Zhu (Cornell University)
David Bindel (Cornell University)

David Bindel received BS degrees in mathematics and computer science from the University of Maryland in 1999, and a PhD in computer science from UC Berkeley in 2006. After three years as a Courant Instructor of mathematics at NYU, he joined the Department of Computer Science at Cornell University, where he is currently an associate professor of computer science, the director of the Center for Applied Mathematics (CAM), and associate dean of diversity and inclusion for the Cornell Ann S. Bowers College of Computing and Information Science. His research focuses on applied numerical linear algebra and scientific computing, with applications to a variety of science and engineering problems. He is the recipient of the Householder Prize in numerical linear algebra, a Sloan Research Fellowship, and best paper awards from the KDD and ASPLOS conferences and from the SIAM Activity Group on Linear Algebra.

Jacob Gardner (University of Pennsylvania)
