NIPS 2013


Workshop

Randomized Methods for Machine Learning

David Lopez-Paz · Quoc V Le · Alexander Smola

Harvey's Emerald Bay 5

As we enter the era of “big-data”, Machine Learning algorithms that resort to heavy optimization routines rapidly become prohibitive. Perhaps surprisingly, randomization (Raghavan and Motwani, 1995) arises as a computationally cheaper, simpler alternative to optimization that in many cases leads to smaller and faster models with little or no loss in performance. Although randomized algorithms date back to the probabilistic method (Erdős, 1947; Alon & Spencer, 2000), these techniques have only recently started finding their way into Machine Learning. The most notable exceptions are stochastic methods for optimization and Markov Chain Monte Carlo methods, both of which have become well established over the past two decades. This workshop aims to accelerate this process by bringing together researchers in this area and exposing them to recent developments.

The target audience is researchers and practitioners looking for scalable, compact and fast solutions for learning in the large-scale setting.

Specific questions of interest include, but are not limited to:

- Randomized projections: locality sensitive hashing, hash kernels, counter braids, count sketches, optimization.
- Randomized function classes: random Fourier features, Random Kitchen Sinks, Nyström methods, Fastfood, random-basis neural networks.
- Sparse reconstructions: compressed sensing, error correcting output codes, reductions of inference problems to binary.
- Compressive approximations: min-hash, shingles, Bloom filters, coresets, random subsampling from streams.
- Randomized dependence measures, component analysis, dimensionality reduction.
- Extensions to less exploited tasks: density estimation, multitask and semi-supervised learning, deep and hierarchical models, feature learning, control, causality.
- Hybrid strategies that combine optimization and randomization.
- Sampling algorithms for Bayesian inference.
- Random matrices and graphs.
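To make the "randomized function classes" topic concrete, here is a minimal sketch of random Fourier features in the style of Rahimi and Recht's Random Kitchen Sinks: an explicit random feature map whose inner products approximate a Gaussian kernel, so that linear methods on the features stand in for kernel methods. The function names and parameter choices below are illustrative, not from any workshop material.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Exact Gaussian (RBF) kernel: k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def random_fourier_features(X, n_features=2000, gamma=0.5, seed=0):
    # Draw frequencies W ~ N(0, 2*gamma*I) and phases b ~ U[0, 2*pi];
    # then z(x) = sqrt(2/D) * cos(W^T x + b) satisfies
    # E[z(x) . z(y)] = k(x, y), the Gaussian kernel above.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare the exact kernel matrix with its random-feature approximation.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 5))
Z = random_fourier_features(X)          # 50 x 2000 feature matrix
K_exact = rbf_kernel(X, X)
K_approx = Z @ Z.T                      # inner products of random features
err = np.abs(K_exact - K_approx).max()
```

With a few thousand features the entrywise error is typically a few percent, while downstream training touches only the explicit feature matrix `Z` rather than an n-by-n kernel matrix.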

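On the "compressive approximations" side, min-hash is perhaps the simplest example: a short random signature per set whose collision rate estimates Jaccard similarity. The sketch below uses illustrative random affine hashes over a large prime; it is a toy implementation, not a reference one.

```python
import numpy as np

def minhash_signature(items, n_hashes=200, seed=0):
    # Simulate n_hashes random permutations with affine hashes
    # h_i(x) = (a_i * x + b_i) mod p; the signature stores, for each
    # hash, the minimum value over the set's elements.
    rng = np.random.default_rng(seed)
    p = 2**61 - 1
    a = rng.integers(1, p, size=n_hashes)
    b = rng.integers(0, p, size=n_hashes)
    x = np.array([hash(it) % p for it in items], dtype=object)
    return ((a[:, None] * x[None, :] + b[:, None]) % p).min(axis=1)

A = set("the quick brown fox jumps over the lazy dog".split())
B = set("the quick brown fox sleeps under the lazy dog".split())
jaccard = len(A & B) / len(A | B)        # exact similarity: 6/10 = 0.6
sa = minhash_signature(A)
sb = minhash_signature(B)
# Pr[min-hashes collide] = Jaccard(A, B), so the collision rate
# across the 200 hashes is an unbiased estimate of it.
estimate = float((sa == sb).mean())
```

The signatures are fixed-size regardless of how large the sets are, which is what makes min-hash attractive for near-duplicate detection over streams of documents.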
This one-day workshop will feature invited tutorials and contributed short talks. Poster sessions, coffee breaks and a closing panel will encourage discussion among attendees. We plan to compile a tightly edited collection of papers from the workshop in the form of a special issue or a book, allowing faster dissemination of randomized methods in machine learning.

More information will be available at the official website www.randomizedmethods.org.
