Workshop
Adaptive and Scalable Nonparametric Methods in Machine Learning
Aaditya Ramdas · Arthur Gretton · Bharath Sriperumbudur · Han Liu · John Lafferty · Samory Kpotufe · Zoltán Szabó

Sat Dec 10th 08:00 AM -- 06:30 PM @ Room 120 + 121
Event URL: https://sites.google.com/site/nips2016adaptive/

Thanks to improvements in engineering and data-acquisition techniques, large amounts of high-dimensional data are routinely acquired in scientific fields ranging from biology, genomics and health sciences to astronomy and economics. Compared to the traditionally used linear and parametric models, nonparametric methods allow for better modelling of the complex systems underlying data-generating processes. From a statistical point of view, scientists have enough data to reliably fit nonparametric models. From a computational point of view, however, nonparametric methods often do not scale well to big data problems.

The aim of this workshop is to bring together practitioners who are interested in developing and applying nonparametric methods in their domains, and theoreticians who are interested in providing sound methodology. We hope to communicate advances in the development of computational tools for fitting nonparametric models, and to discuss the open challenges that still prevent nonparametric methods from being applied to big data problems.

We encourage submissions on a variety of topics, including but not limited to:
- Randomized procedures for fitting nonparametric models (e.g., sketching, random projections, coreset selection)
- Nonparametric probabilistic graphical models
- Scalable nonparametric methods
- Multiple kernel learning
- Random feature expansion (a brief illustrative sketch follows this list)
- Novel applications of nonparametric methods
- Bayesian nonparametric methods
- Nonparametric network models
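
As a pointer for readers new to the area, here is a minimal sketch of one topic from the list above: random feature expansion via random Fourier features (Rahimi and Recht, 2007), which approximate a Gaussian kernel so that linear methods can substitute for exact kernel methods at scale. The function and parameter names (`rff_features`, `n_features`, `sigma`) are illustrative and not part of the workshop materials.

```python
import numpy as np

def rff_features(X, n_features=500, sigma=1.0, seed=0):
    """Map X (n_samples, d) to a randomized feature space whose inner
    products approximate the Gaussian kernel
    k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies sampled from the Fourier transform of the Gaussian kernel.
    W = rng.normal(scale=1.0 / sigma, size=(d, n_features))
    # Random phases let a single cosine per frequency suffice.
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Usage: Z = rff_features(X); in expectation Z @ Z.T recovers the kernel
# matrix, so a linear model trained on Z scales linearly in the sample
# size, unlike exact kernel methods.
```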

This workshop is the fourth in a series of NIPS workshops on modern nonparametric methods in machine learning. Previous workshops focused on time/accuracy tradeoffs, high dimensionality and dimension reduction strategies, and automating the learning pipeline.

08:30 AM Richard Samworth. Adaptation in log-concave density estimation. (Invited talk)
09:00 AM Ming Yuan. Functional nuclear norm and low rank function estimation. (Invited talk)
09:30 AM Mladen Kolar. Post-Regularization Inference for Dynamic Nonparanormal Graphical Models. (Invited talk)
11:00 AM Debarghya Ghoshdastidar, Ulrike von Luxburg. Do Nonparametric Two-sample Tests work for Small Sample Size? A Study on Random Graphs. (Contributed talk)
11:20 AM Diana Cai, Trevor Campbell, Tamara Broderick. Paintboxes and probability functions for edge-exchangeable graphs. (Contributed talk)
11:40 AM Alessandro Rudi, Raffaello Camoriano, Lorenzo Rosasco. Generalization Properties of Learning with Random Features. (Contributed talk)
12:00 PM Makoto Yamada, Yuta Umezu, Kenji Fukumizu, Ichiro Takeuchi. Post Selection Inference with Kernels. (Contributed talk)
12:20 PM Yunpeng Pan, Xinyan Yan, Evangelos Theodorou, Byron Boots. Solving the Linear Bellman Equation via Kernel Embeddings and Stochastic Gradient Descent. (Contributed talk)
12:40 PM Lunch break
02:30 PM Francis Bach. Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression. (Invited talk)
03:00 PM Richard (Fangjian) Guo. Boosting Variational Inference. (Invited talk)
03:30 PM Break
03:45 PM Olga Klopp. Network models and sparse graphon estimation. (Invited talk)
04:15 PM Emily Fox. Sparse Graphs via Exchangeable Random Measures. (Invited talk)
04:45 PM Coffee break + posters

Author Information

Aaditya Ramdas (UC Berkeley)
Arthur Gretton (Gatsby Unit, UCL)

Arthur Gretton is a Professor with the Gatsby Computational Neuroscience Unit at UCL. He received degrees in Physics and Systems Engineering from the Australian National University, and a PhD with Microsoft Research and the Signal Processing and Communications Laboratory at the University of Cambridge. He previously worked at the MPI for Biological Cybernetics and at the Machine Learning Department, Carnegie Mellon University. Arthur's recent research interests in machine learning include the design and training of generative models, both implicit (e.g., GANs) and explicit (high/infinite-dimensional exponential family models), nonparametric hypothesis testing, and kernel methods. He was an associate editor at IEEE Transactions on Pattern Analysis and Machine Intelligence from 2009 to 2013, has been an Action Editor for JMLR since April 2013, and served as an Area Chair for NeurIPS in 2008 and 2009, a Senior Area Chair for NeurIPS in 2018, an Area Chair for ICML in 2011 and 2012, and a member of the COLT Program Committee in 2013. Arthur was program chair for AISTATS in 2016 (with Christian Robert), tutorials chair for ICML 2018 (with Ruslan Salakhutdinov), workshops chair for ICML 2019 (with Honglak Lee), program chair for the DALI workshop in 2019 (with Krikamol Muandet and Shakir Mohamed), and co-organiser of the Machine Learning Summer School 2019 in London (with Marc Deisenroth).

Bharath Sriperumbudur (Penn State University)
Han Liu (Tencent AI Lab)
John Lafferty (University of Chicago)
Samory Kpotufe (Princeton University)
Zoltán Szabó (École Polytechnique)

Homepage: http://www.cmap.polytechnique.fr/~zoltan.szabo/
