We demonstrate Monte, a massively parallel implementation of the probabilistic programming language Church. Using Monte, a probabilistic modeler or machine learning researcher can specify state-of-the-art nonparametric Bayesian models in just 10-20 lines of code and automatically perform inference on them in the presence of arbitrary datasets. Monte also allows a probabilistic modeler to exploit Amazon's Elastic Compute Cloud (EC2) to perform massively parallel inference, significantly increasing the model complexity and dataset scale that Monte can handle relative to serial implementations of probabilistic programming languages.

Church is a probabilistic programming language, based on Lisp, that enables compact specification of all computable probabilistic models and supports universal Bayesian inference via Markov chain Monte Carlo. Probabilistic programming languages have gained substantial attention in the NIPS community, serving as a representation language for probabilistic machine learning and as the basis of a widely attended NIPS 2008 workshop. However, very few implementations of probabilistic programming languages have been mature enough to support reliable interactive use. The Monte implementation of Church, developed at Navia Systems, Inc., is one of the first such systems. The best evidence of its robustness and interactivity lies in the compactness of Monte representations of state-of-the-art probabilistic models (see Figure 1) and in the simplicity of its use on a single-processor workstation (see Figure 2).

Users of our demonstration system will be able to implement state-of-the-art probabilistic models from the current NIPS conference and immediately perform inference on them in real time using Monte's engine, collecting and analyzing posterior samples of interest. Figure 2 shows one example of this use pattern in the context of a Dirichlet process mixture model with binary data.
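To give a concrete sense of what a 10-20 line Church model looks like, below is a rough sketch of a Dirichlet process mixture model over binary data in Church-style syntax, queried via Metropolis-Hastings with mh-query. The object names, features, hyperparameters, and conditioning data are illustrative assumptions, not the actual Monte program from Figures 1 and 2.

    ; Sketch: Dirichlet process mixture over binary features (illustrative).
    (define samples
      (mh-query
       100 100                           ; 100 posterior samples, lag of 100 MH steps

       ; Assign each object to a (possibly new) class drawn from a
       ; Dirichlet process, via stochastic memoization.
       (define sample-class (DPmem 1.0 gensym))
       (define object->class (mem (lambda (object) (sample-class))))

       ; Each class has an independent Beta-distributed coin weight
       ; for every binary feature.
       (define class->weight
         (mem (lambda (class feature) (beta 1.0 1.0))))

       ; Generative process for a single binary observation.
       (define (datum object feature)
         (flip (class->weight (object->class object) feature)))

       ; Query: do two objects belong to the same latent class ...
       (equal? (object->class 'obj-1) (object->class 'obj-2))

       ; ... conditioned on a handful of observed binary features?
       (and (datum 'obj-1 'f1) (datum 'obj-2 'f1) (not (datum 'obj-3 'f1)))))

Repeatedly running a query of this kind yields the posterior samples that demonstration users collect and analyze; Monte's role is to execute such programs with massively parallel inference on EC2.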
Author Information
Vikash Mansinghka (Massachusetts Institute of Technology)
Vikash Mansinghka is a research scientist at MIT, where he leads the Probabilistic Computing Project. Vikash holds S.B. degrees in Mathematics and in Computer Science from MIT, as well as an M.Eng. in Computer Science and a PhD in Computation. He also held graduate fellowships from the National Science Foundation and MIT’s Lincoln Laboratory. His PhD dissertation on natively probabilistic computation won the MIT George M. Sprowls dissertation award in computer science, and his research on the Picture probabilistic programming language won an award at CVPR. He served on DARPA’s Information Science and Technology advisory board from 2010 to 2012, and currently serves on the editorial boards of the Journal of Machine Learning Research and the journal Statistics and Computing. He was an advisor to Google DeepMind and has co-founded two AI-related startups, one acquired and one currently operational.
More from the Same Authors
- 2021 : Towards Denotational Semantics of AD for Higher-Order, Recursive, Probabilistic Languages
  Alexander Lew · Mathieu Huot · Vikash Mansinghka
- 2021 Poster: 3DP3: 3D Scene Perception via Probabilistic Programming
  Nishad Gothoskar · Marco Cusumano-Towner · Ben Zinberg · Matin Ghavamizadeh · Falk Pollok · Austin Garrett · Josh Tenenbaum · Dan Gutfreund · Vikash Mansinghka
- 2020 Poster: Online Bayesian Goal Inference for Boundedly Rational Planning Agents
  Tan Zhi-Xuan · Jordyn Mann · Tom Silver · Josh Tenenbaum · Vikash Mansinghka
- 2019 : Posters
  Colin Graber · Yuan-Ting Hu · Tiantian Fang · Jessica Hamrick · Giorgio Giannone · John Co-Reyes · Boyang Deng · Eric Crawford · Andrea Dittadi · Peter Karkus · Matthew Dirks · Rakshit Trivedi · Sunny Raj · Javier Felip Leon · Harris Chan · Jan Chorowski · Jeff Orchard · Aleksandar Stanić · Adam Kortylewski · Ben Zinberg · Chenghui Zhou · Wei Sun · Vikash Mansinghka · Chun-Liang Li · Marco Cusumano-Towner
- 2017 Poster: AIDE: An algorithm for measuring the accuracy of probabilistic inference algorithms
  Marco Cusumano-Towner · Vikash Mansinghka
- 2017 Tutorial: Engineering and Reverse-Engineering Intelligence Using Probabilistic Programs, Program Induction, and Deep Learning
  Josh Tenenbaum · Vikash Mansinghka
- 2016 Poster: A Probabilistic Programming Approach To Probabilistic Data Analysis
  Feras Saad · Vikash Mansinghka
- 2014 Workshop: 3rd NIPS Workshop on Probabilistic Programming
  Daniel Roy · Josh Tenenbaum · Thomas Dietterich · Stuart J Russell · YI WU · Ulrik R Beierholm · Alp Kucukelbir · Zenna Tavares · Yura Perov · Daniel Lee · Brian Ruttenberg · Sameer Singh · Michael Hughes · Marco Gaboardi · Alexey Radul · Vikash Mansinghka · Frank Wood · Sebastian Riedel · Prakash Panangaden
- 2013 Poster: Approximate Bayesian Image Interpretation using Generative Probabilistic Graphics Programs
  Vikash Mansinghka · Tejas D Kulkarni · Yura N Perov · Josh Tenenbaum
- 2013 Oral: Approximate Bayesian Image Interpretation using Generative Probabilistic Graphics Programs
  Vikash Mansinghka · Tejas D Kulkarni · Yura N Perov · Josh Tenenbaum
- 2012 Workshop: Probabilistic Programming: Foundations and Applications (2 day)
  Vikash Mansinghka · Daniel Roy · Noah Goodman
- 2009 Demonstration: The IID: A Natively Probabilistic Reconfigurable Computer
  Vikash Mansinghka
- 2008 Workshop: Probabilistic Programming: Universal Languages, Systems and Applications
  Daniel Roy · John Winn · David A McAllester · Vikash Mansinghka · Josh Tenenbaum
- 2006 Poster: Learning annotated hierarchies from relational data
  Daniel Roy · Charles Kemp · Vikash Mansinghka · Josh Tenenbaum
- 2006 Talk: Learning annotated hierarchies from relational data
  Daniel Roy · Charles Kemp · Vikash Mansinghka · Josh Tenenbaum
- 2006 Demonstration: Blaise: A System for Interactive Development of High Performance Inference Algorithms
  Keith Bonawitz · Vikash Mansinghka