Random function priors for exchangeable graphs and arrays
James R Lloyd · Daniel Roy · Peter Orbanz · Zoubin Ghahramani

Tue Dec 04 07:00 PM -- 12:00 AM (PST) @ Harrah’s Special Events Center 2nd Floor

A fundamental problem in the analysis of relational data---graphs, matrices or higher-dimensional arrays---is to extract a summary of the common structure underlying relations between individual entities. A successful approach is latent variable modeling, which summarizes this structure as an embedding into a suitable latent space. Results in probability theory, due to Aldous, Hoover and Kallenberg, show that relational data satisfying an exchangeability property can be represented in terms of a random measurable function. In a Bayesian model, this function constitutes the natural model parameter, and we discuss how available latent variable models can be classified according to how they implicitly approximate this parameter. We obtain a flexible yet simple model for relational data by representing the parameter function as a Gaussian process. Efficient inference draws on the large available arsenal of Gaussian process algorithms; sparse approximations prove particularly useful. We demonstrate applications of the model to network data and clarify its relation to models in the literature, several of which emerge as special cases.
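The representation result mentioned above says an exchangeable array can be generated as W_ij = f(U_i, U_j) for i.i.d. uniform latent variables U_i and a random function f, which the paper models as a Gaussian process. A minimal generative sketch of that idea (an illustration only, not the paper's implementation; the RBF kernel, lengthscale, and sigmoid link are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 15
# Latent variables U_i ~ Uniform[0, 1], as in the Aldous-Hoover representation
U = rng.uniform(size=n)

# All pairs (U_i, U_j) at which the random function is evaluated
pairs = np.array([(U[i], U[j]) for i in range(n) for j in range(n)])

def rbf(X, Y, lengthscale=0.2):
    """Squared-exponential kernel on R^2 (an assumed choice of covariance)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

# One draw of the random function f from a zero-mean GP, evaluated on all pairs
K = rbf(pairs, pairs) + 1e-6 * np.eye(n * n)  # jitter for numerical stability
f = rng.multivariate_normal(np.zeros(n * n), K).reshape(n, n)

# Map function values to edge probabilities and sample a binary adjacency matrix
# (symmetry constraints for undirected graphs are omitted for brevity)
P = 1.0 / (1.0 + np.exp(-f))
A = rng.binomial(1, P)
```

Because f is drawn from a GP rather than restricted to, say, a piecewise-constant or bilinear form, models such as stochastic block models and latent feature models correspond to particular constrained choices of this function.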

Author Information

James R Lloyd (University of Cambridge)
Daniel Roy (U of Toronto; Vector)
Peter Orbanz (Columbia University)

Peter Orbanz is a research fellow at the University of Cambridge. He holds a PhD degree from ETH Zurich and will join the Statistics Faculty at Columbia University as an Assistant Professor in 2012. He is interested in the mathematical and algorithmic aspects of Bayesian nonparametric models and of related learning technologies.

Zoubin Ghahramani (Uber and University of Cambridge)

Zoubin Ghahramani is Professor of Information Engineering at the University of Cambridge, where he leads the Machine Learning Group. He studied computer science and cognitive science at the University of Pennsylvania, obtained his PhD from MIT in 1995, and was a postdoctoral fellow at the University of Toronto. His academic career includes concurrent appointments as one of the founding members of the Gatsby Computational Neuroscience Unit in London, and as a faculty member of CMU's Machine Learning Department for over 10 years. His current research interests include statistical machine learning, Bayesian nonparametrics, scalable inference, probabilistic programming, and building an automatic statistician. He has held a number of leadership roles as programme and general chair of the leading international conferences in machine learning including: AISTATS (2005), ICML (2007, 2011), and NIPS (2013, 2014). In 2015 he was elected a Fellow of the Royal Society.
