
 
Workshop
Representations and Inference on Probability Distributions
Kenji Fukumizu · Arthur Gretton · Alexander Smola

Sat Dec 08 07:30 AM -- 06:30 PM (PST) @ Hilton: Diamond Head
Event URL: http://nips2007.kyb.tuebingen.mpg.de

When dealing with distributions, it is in general infeasible to estimate them explicitly in high-dimensional settings, since the associated learning rates are quite slow. On the other hand, a great variety of applications in machine learning and computer science require distribution estimation and/or comparison. Examples include testing for homogeneity (the "two-sample problem"), independence, and conditional independence, where the last two can be used to infer causality; data set squashing / data sketching / data anonymisation; domain adaptation (the transfer of knowledge learned on one domain to solving problems on another, related domain) and the related problem of covariate shift; message passing in graphical models (EP and related algorithms); compressed sensing; and links between divergence measures and loss functions. The purpose of this workshop is to bring together statisticians, machine learning researchers, and computer scientists working on representations of distributions for various inference and testing problems, to discuss the compromises necessary in obtaining useful results from finite data. In particular, what are the capabilities and weaknesses of different distribution estimates and comparison strategies, and what negative results apply?
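As a concrete illustration of the two-sample problem mentioned above, the sketch below computes a kernel-based two-sample statistic, a biased estimate of the squared maximum mean discrepancy with a Gaussian kernel. The function names, the fixed bandwidth, and the toy data are illustrative assumptions, not anything prescribed by the workshop description.

```python
# Minimal sketch: biased estimate of the squared maximum mean discrepancy (MMD^2)
# between two samples, using a Gaussian kernel. Bandwidth and data are toy choices.
import numpy as np


def gaussian_kernel(X, Y, bandwidth):
    """Gaussian kernel matrix k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2))."""
    sq_dists = (
        np.sum(X ** 2, axis=1)[:, None]
        + np.sum(Y ** 2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))


def mmd2_biased(X, Y, bandwidth=1.0):
    """Biased estimate of squared MMD between samples X and Y."""
    Kxx = gaussian_kernel(X, X, bandwidth)
    Kyy = gaussian_kernel(Y, Y, bandwidth)
    Kxy = gaussian_kernel(X, Y, bandwidth)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(0.0, 1.0, size=(200, 2))  # sample from P
    Y = rng.normal(0.5, 1.0, size=(200, 2))  # sample from Q (shifted mean)
    print("MMD^2 estimate:", mmd2_biased(X, Y))
```

A large value of this statistic relative to its distribution under the null (estimated, for example, by permuting the pooled sample) suggests the two samples were drawn from different distributions; this is the homogeneity-testing setting the abstract refers to.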

Author Information

Kenji Fukumizu (Institute of Statistical Mathematics / Preferred Networks / RIKEN AIP)
Arthur Gretton (Google Deepmind / UCL)

Arthur Gretton is a Professor with the Gatsby Computational Neuroscience Unit at UCL. He received degrees in Physics and Systems Engineering from the Australian National University, and a PhD with Microsoft Research and the Signal Processing and Communications Laboratory at the University of Cambridge. He previously worked at the MPI for Biological Cybernetics and at the Machine Learning Department, Carnegie Mellon University. Arthur's recent research interests in machine learning include the design and training of generative models, both implicit (e.g. GANs) and explicit (high/infinite-dimensional exponential family models), nonparametric hypothesis testing, and kernel methods. He was an associate editor at IEEE Transactions on Pattern Analysis and Machine Intelligence from 2009 to 2013, has been an Action Editor for JMLR since April 2013, was an Area Chair for NeurIPS in 2008 and 2009, a Senior Area Chair for NeurIPS in 2018, an Area Chair for ICML in 2011 and 2012, and a member of the COLT Program Committee in 2013. Arthur was program chair for AISTATS in 2016 (with Christian Robert), tutorials chair for ICML 2018 (with Ruslan Salakhutdinov), workshops chair for ICML 2019 (with Honglak Lee), program chair for the DALI workshop in 2019 (with Krikamol Muandet and Shakir Mohamed), and co-organiser of the Machine Learning Summer School 2019 in London (with Marc Deisenroth).

Alexander Smola (Amazon)

**AWS Machine Learning**
