Spotlight
Bayesian Exponential Family PCA
Shakir Mohamed · Katherine Heller · Zoubin Ghahramani

Wed Dec 10 05:27 PM -- 05:28 PM (PST)

Principal Components Analysis (PCA) has become established as one of the key tools for dimensionality reduction when dealing with real-valued data. Approaches such as exponential family PCA and non-negative matrix factorisation have successfully extended PCA to non-Gaussian data types, but these techniques fail to take advantage of Bayesian inference and can suffer from problems of overfitting and poor generalisation. This paper presents a fully probabilistic approach to PCA, generalised to the exponential family and based on Hybrid Monte Carlo sampling. We describe the model, which is based on a factorisation of the observed data matrix, and demonstrate its performance on both synthetic and real data.
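To make the setup concrete, the sketch below illustrates the kind of model the abstract describes: a Bernoulli (binary-data) exponential family likelihood whose natural parameters come from a low-rank factorisation Theta = V W, Gaussian priors on the factors V and loadings W, and inference by Hybrid (Hamiltonian) Monte Carlo. This is a minimal illustrative example, not the authors' implementation; the data sizes, step size, and number of leapfrog steps are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, K = 100, 20, 3                      # observations, observed dims, latent rank

# Synthetic binary data generated from a ground-truth low-rank logit model.
V_true, W_true = rng.normal(size=(N, K)), rng.normal(size=(K, D))
X = (rng.uniform(size=(N, D)) < 1.0 / (1.0 + np.exp(-V_true @ W_true))).astype(float)

def unpack(z):
    """Split the flat parameter vector into factors V (N x K) and loadings W (K x D)."""
    return z[:N * K].reshape(N, K), z[N * K:].reshape(K, D)

def log_joint(z):
    """Bernoulli log-likelihood of X under natural parameters Theta = V W, plus N(0,1) priors."""
    V, W = unpack(z)
    theta = V @ W
    loglik = np.sum(X * theta - np.logaddexp(0.0, theta))
    logprior = -0.5 * np.sum(V ** 2) - 0.5 * np.sum(W ** 2)
    return loglik + logprior

def grad_log_joint(z):
    """Gradient of the log joint with respect to the flattened (V, W)."""
    V, W = unpack(z)
    resid = X - 1.0 / (1.0 + np.exp(-(V @ W)))   # d loglik / d Theta
    dV = resid @ W.T - V                          # prior contributes -V
    dW = V.T @ resid - W                          # prior contributes -W
    return np.concatenate([dV.ravel(), dW.ravel()])

def hmc_step(z, eps=0.01, n_leapfrog=20):
    """One Hybrid Monte Carlo update: leapfrog simulation plus Metropolis accept/reject."""
    p = rng.normal(size=z.shape)
    z_new, p_new = z.copy(), p.copy()
    p_new += 0.5 * eps * grad_log_joint(z_new)
    for _ in range(n_leapfrog - 1):
        z_new += eps * p_new
        p_new += eps * grad_log_joint(z_new)
    z_new += eps * p_new
    p_new += 0.5 * eps * grad_log_joint(z_new)
    log_accept = (log_joint(z_new) - 0.5 * p_new @ p_new) - (log_joint(z) - 0.5 * p @ p)
    return z_new if np.log(rng.uniform()) < log_accept else z

# Run a short chain from a small random initialisation.
z = rng.normal(size=N * K + K * D) * 0.1
for _ in range(500):
    z = hmc_step(z)
print("final log joint:", log_joint(z))
```

Other exponential family likelihoods (e.g. Poisson counts) would follow the same pattern, with only the log-likelihood and its gradient with respect to Theta changing.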

Author Information

Shakir Mohamed (DeepMind)

Shakir Mohamed is a senior staff scientist at DeepMind in London. Shakir's main interests lie at the intersection of approximate Bayesian inference, deep learning and reinforcement learning, and the role that machine learning systems at this intersection have in the development of more intelligent and general-purpose learning systems. Before moving to London, Shakir held a Junior Research Fellowship from the Canadian Institute for Advanced Research (CIFAR), based in Vancouver at the University of British Columbia with Nando de Freitas. Shakir completed his PhD with Zoubin Ghahramani at the University of Cambridge, where he was a Commonwealth Scholar to the United Kingdom. Shakir is from South Africa and completed his previous degrees in Electrical and Information Engineering at the University of the Witwatersrand, Johannesburg.

Katherine Heller (Google)
Zoubin Ghahramani (Uber and University of Cambridge)

Zoubin Ghahramani is Professor of Information Engineering at the University of Cambridge, where he leads the Machine Learning Group. He studied computer science and cognitive science at the University of Pennsylvania, obtained his PhD from MIT in 1995, and was a postdoctoral fellow at the University of Toronto. His academic career includes concurrent appointments as one of the founding members of the Gatsby Computational Neuroscience Unit in London, and as a faculty member of CMU's Machine Learning Department for over 10 years. His current research interests include statistical machine learning, Bayesian nonparametrics, scalable inference, probabilistic programming, and building an automatic statistician. He has held a number of leadership roles as programme and general chair of the leading international conferences in machine learning, including AISTATS (2005), ICML (2007, 2011), and NIPS (2013, 2014). In 2015 he was elected a Fellow of the Royal Society.
