NIPS 2015 Workshop: Probabilistic Integration


Integration is the central numerical operation required for Bayesian machine learning (in the form of marginalization and conditioning). Sampling algorithms still abound in this area, even though it has long been known that Monte Carlo methods are fundamentally sub-optimal: their root-mean-square error decays as sigma(f)/sqrt(N) in the number N of samples, no matter how smooth or structured the integrand f is. The challenges for the development of better-performing integration methods are mostly algorithmic. Indeed, recent algorithms have begun to outperform MCMC and its siblings, in wall-clock time, on realistic problems from machine learning.

The workshop will review the existing, by now quite strong, theoretical case against the use of random numbers for integration; discuss recent algorithmic developments and the relationships between conceptual approaches; and highlight central research challenges going forward.

Among the questions to be addressed by the workshop are:
* How fast can a practical integral estimate on a deterministic function converge (polynomially, super-polynomially, not just "better than sqrt(N)")? A toy numerical comparison follows this list.
* How are these rates related, precisely, to prior assumptions about the integrand, and to the design rules of the integrator?
* To what degree can the source code of an integration problem be parsed to choose informative priors?
* Are random numbers necessary and helpful for efficient multivariate integration, or are they a conceptual crutch that causes inefficiencies?
* What are the practical challenges in the design of efficient multivariate integration methods that use such prior information?
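
As a concrete, purely illustrative reference point for the first question, the sketch below contrasts plain Monte Carlo, whose root-mean-square error decays as O(N^(-1/2)), with a simple Bayesian quadrature rule under a Gaussian-process prior. The integrand, RBF kernel, length-scale, and evenly spaced design are all assumptions made for this example, not methods prescribed by the workshop.

    import numpy as np

    rng = np.random.default_rng(0)

    def f(x):
        # Smooth test integrand (an arbitrary choice for this sketch).
        return np.exp(np.sin(2.0 * x))

    # Target: I = E[f(x)] under x ~ N(0, 1).
    # Ground truth via 100-point Gauss-Hermite quadrature.
    nodes, weights = np.polynomial.hermite.hermgauss(100)
    truth = weights @ f(np.sqrt(2.0) * nodes) / np.sqrt(np.pi)

    def mc_estimate(n):
        # Plain Monte Carlo: RMSE decays as O(n^{-1/2}), however smooth f is.
        return f(rng.standard_normal(n)).mean()

    def bq_estimate(n, ell=0.7, jitter=1e-8):
        # Bayesian quadrature with RBF kernel k(x, y) = exp(-(x-y)^2 / (2 ell^2)).
        # Against N(0, 1), the kernel mean z(x) = E_y[k(x, y)] has the closed
        # form below; the posterior-mean estimate is then z' K^{-1} f(X).
        X = np.linspace(-3.0, 3.0, n)    # simple evenly spaced design (assumed)
        K = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2.0 * ell ** 2))
        K += jitter * np.eye(n)          # stabilise the linear solve
        z = ell / np.sqrt(ell ** 2 + 1.0) * np.exp(-X ** 2 / (2.0 * (ell ** 2 + 1.0)))
        return z @ np.linalg.solve(K, f(X))

    for n in (5, 10, 20, 40):
        mc_rmse = np.sqrt(np.mean([(mc_estimate(n) - truth) ** 2 for _ in range(200)]))
        bq_err = abs(bq_estimate(n) - truth)
        print(f"n = {n:3d}   MC RMSE = {mc_rmse:.2e}   BQ error = {bq_err:.2e}")

The deterministic rule exploits the smoothness encoded in its prior, and its error drops far faster than N^(-1/2); how far this can be pushed, and how precisely the rate depends on the kernel and the design points X, is exactly the subject of the first two questions above.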

The workshop builds upon the growing field of probabilistic numerics, of which Probabilistic Integration is a core component. A community website for probabilistic numerics can be found at http://probabilistic-numerics.org.
