Adaptive and Scalable Nonparametric Methods in Machine Learning
Aaditya Ramdas · Arthur Gretton · Bharath Sriperumbudur · Han Liu · John Lafferty · Samory Kpotufe · Zoltán Szabó

Fri Dec 11:00 PM -- 09:30 AM PST @ Room 120 + 121
Event URL: https://sites.google.com/site/nips2016adaptive/

Large amounts of high-dimensional data are routinely acquired in scientific fields ranging from biology, genomics and the health sciences to astronomy and economics, thanks to improvements in engineering and data acquisition techniques. Compared to the linear and parametric models traditionally used, nonparametric methods allow for better modelling of the complex systems underlying data-generating processes. From a statistical point of view, scientists now have enough data to reliably fit nonparametric models. From a computational point of view, however, nonparametric methods often do not scale well to big data problems.

The aim of this workshop is to bring together practitioners, who are interested in developing and applying nonparametric methods in their domains, and theoreticians, who are interested in providing sound methodology. We hope to communicate advances in the development of computational tools for fitting nonparametric models, and to discuss the open challenges that currently prevent nonparametric methods from being applied to big data problems.

We encourage submissions on a variety of topics, including but not limited to:
- Randomized procedures for fitting nonparametric models, e.g. sketching, random projections, coreset selection
- Nonparametric probabilistic graphical models
- Scalable nonparametric methods
- Multiple kernel learning
- Random feature expansion
- Novel applications of nonparametric methods
- Bayesian nonparametric methods
- Nonparametric network models
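
Several of these topics (random feature expansion, scalable nonparametric methods) concern making kernel methods tractable on large datasets. As a minimal illustrative sketch, not drawn from any workshop submission, the classic random Fourier feature construction replaces an n × n Gaussian kernel matrix with an explicit finite-dimensional feature map whose inner products approximate the kernel:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(X, n_features=500, gamma=1.0):
    """Map rows of X into a space where inner products approximate
    the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    # Frequencies drawn from the kernel's spectral density, N(0, 2*gamma*I).
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare the approximation against the exact kernel on toy data.
X = rng.standard_normal((5, 3))
Z = random_fourier_features(X, n_features=20000)
K_approx = Z @ Z.T
K_exact = np.exp(-1.0 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
print(np.abs(K_approx - K_exact).max())  # small for this many features
```

In practice one would draw W and b once and reuse the same map for training and test points; linear models fit on these features then cost O(n) in the sample size rather than the O(n^2) or worse of exact kernel methods.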

This workshop is the fourth in a series of NIPS workshops on modern nonparametric methods in machine learning. Previous workshops focused on time/accuracy tradeoffs, high dimensionality and dimension reduction strategies, and automating the learning pipeline.

11:30 PM Richard Samworth. Adaptation in log-concave density estimation. (Invited talk)
12:00 AM Ming Yuan. Functional nuclear norm and low rank function estimation. (Invited talk)
12:30 AM Mladen Kolar. Post-Regularization Inference for Dynamic Nonparanormal Graphical Models. (Invited talk)
02:00 AM Debarghya Ghoshdastidar, Ulrike von Luxburg. Do Nonparametric Two-sample Tests work for Small Sample Size? A Study on Random Graphs. (Contributed talk)
02:20 AM Diana Cai, Trevor Campbell, Tamara Broderick. Paintboxes and probability functions for edge-exchangeable graphs. (Contributed talk)
02:40 AM Alessandro Rudi, Raffaello Camoriano, Lorenzo Rosasco. Generalization Properties of Learning with Random Features. (Contributed talk)
03:00 AM Makoto Yamada, Yuta Umezu, Kenji Fukumizu, Ichiro Takeuchi. Post Selection Inference with Kernels. (Contributed talk)
03:20 AM Yunpeng Pan, Xinyan Yan, Evangelos Theodorou, Byron Boots. Solving the Linear Bellman Equation via Kernel Embeddings and Stochastic Gradient Descent. (Contributed talk)
03:40 AM Lunch break
05:30 AM Francis Bach. Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression. (Invited talk)
06:00 AM Richard (Fangjian) Guo. Boosting Variational Inference. (Invited talk)
06:30 AM Break
06:45 AM Olga Klopp. Network models and sparse graphon estimation. (Invited talk)
07:15 AM Emily Fox. Sparse Graphs via Exchangeable Random Measures. (Invited talk)
07:45 AM Coffee break + posters

Author Information

Aaditya Ramdas (UC Berkeley)
Arthur Gretton (Gatsby Unit, UCL)

Arthur Gretton is a Professor with the Gatsby Computational Neuroscience Unit at UCL. He received degrees in Physics and Systems Engineering from the Australian National University, and a PhD with Microsoft Research and the Signal Processing and Communications Laboratory at the University of Cambridge. He previously worked at the MPI for Biological Cybernetics and at the Machine Learning Department, Carnegie Mellon University. Arthur's recent research interests in machine learning include the design and training of generative models, both implicit (e.g. GANs) and explicit (high/infinite dimensional exponential family models), nonparametric hypothesis testing, and kernel methods. He was an associate editor at IEEE Transactions on Pattern Analysis and Machine Intelligence from 2009 to 2013, has been an Action Editor for JMLR since April 2013, was an Area Chair for NeurIPS in 2008 and 2009, a Senior Area Chair for NeurIPS in 2018, an Area Chair for ICML in 2011 and 2012, and a member of the COLT Program Committee in 2013. Arthur was program chair for AISTATS in 2016 (with Christian Robert), tutorials chair for ICML 2018 (with Ruslan Salakhutdinov), workshops chair for ICML 2019 (with Honglak Lee), program chair for the Dali workshop in 2019 (with Krikamol Muandet and Shakir Mohamed), and co-organiser of the Machine Learning Summer School 2019 in London (with Marc Deisenroth).

Bharath Sriperumbudur (Penn State University)
Han Liu (Tencent AI Lab)
John Lafferty (University of Chicago)
Samory Kpotufe (Princeton University)
Zoltán Szabó (École Polytechnique)
