An intensive, two-day workshop on PROBABILISTIC PROGRAMMING, with contributed and invited talks, poster sessions, demos, and discussions.
Probabilistic models and inference algorithms have become standard tools for interpreting ambiguous, noisy data and building systems that learn from their experience. However, even simple probabilistic models can require significant effort and specialized expertise to develop and use, frequently involving custom mathematics, algorithm design and software development. State-of-the-art models from Bayesian statistics, artificial intelligence and cognitive science - especially those involving distributions over infinite data structures, relational structures, worlds with unknown numbers of objects, rich causal simulations of physics and psychology, and the reasoning processes of other agents - can be difficult even to specify formally, let alone in a machine-executable fashion.
PROBABILISTIC PROGRAMMING aims to close this gap, making variations on commonly used probabilistic models far easier to develop and use, and pointing the way towards entirely new types of models and inference. The central idea is to represent probabilistic models using ideas from programming, including functional, imperative, and logic-based languages. Most probabilistic programming systems represent distributions algorithmically, in terms of a programming language plus primitives for stochastic choice; some even support inference over Turing-universal languages. Compared with representations of models in terms of their graphical-model structure, these representation languages are often significantly more flexible, yet still support the development of general-purpose inference algorithms.
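The algorithmic view described above can be illustrated with a minimal sketch in plain Python, without assuming any particular probabilistic programming system: the model is an ordinary function built from stochastic primitives, and a generic rejection sampler conditions it on observed data by rerunning the program. The model, data, and sampler here are all illustrative.

```python
import random

# A toy generative model written as an ordinary program: an unknown coin
# weight is drawn from a uniform prior, then ten flips are simulated.
def model():
    weight = random.random()                               # weight ~ Uniform(0, 1)
    flips = [random.random() < weight for _ in range(10)]  # each flip ~ Bernoulli(weight)
    return weight, flips

# Generic inference by rejection sampling: rerun the program until the
# simulated flips exactly match the observed data, and keep the latent weight.
def rejection_infer(observed, n_samples):
    accepted = []
    while len(accepted) < n_samples:
        weight, flips = model()
        if flips == observed:
            accepted.append(weight)
    return accepted

random.seed(0)
data = [True] * 9 + [False]                 # observed: nine heads, one tail
posterior = rejection_infer(data, n_samples=200)
posterior_mean = sum(posterior) / len(posterior)
```

Because the sampler only reruns the model and compares outputs, it works unchanged for any generative program, which is the sense in which inference here is general-purpose; the cost is efficiency, and much probabilistic programming research concerns smarter inference strategies for exactly this setting. For this conjugate example the exact posterior is Beta(10, 2), so the estimated mean should land near 10/12.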
The workshop will cover, and welcomes submissions about, all aspects of probabilistic programming. Some questions of particular interest include:
1. What real-world problems can be solved with probabilistic programming systems today? How much problem-specific customization/optimization is needed? Where is general-purpose inference effective?
2. What does the probabilistic programming perspective, and in particular the representation of probabilistic models and inference procedures as algorithmic processes, reveal about the computability and complexity of Bayesian inference? When can theory guide the design and use of probabilistic programming systems?
3. How can we teach people to write probabilistic programs that work well, without having to teach them how to build an inference engine first? What programming styles support tractability of inference?
4. How can central ideas from software engineering - including debuggers, validation tools, style checkers, program analyses, reusable libraries, and profilers - help probabilistic programmers and modelers? Which of these tools can be built for probabilistic programs, or help us build probabilistic programming systems?
5. What new directions in AI, statistics, and cognitive science would be enabled if we could handle models that took hundreds or thousands of lines of probabilistic code to write?
Confirmed Keynote Speakers:
 Josh Tenenbaum (MIT)
 Stuart Russell (UC Berkeley)
 Christopher Bishop (Microsoft Research; University of Edinburgh)
Author Information
Vikash Mansinghka (Massachusetts Institute of Technology)
Vikash Mansinghka is a research scientist at MIT, where he leads the Probabilistic Computing Project. Vikash holds S.B. degrees in Mathematics and in Computer Science from MIT, as well as an M.Eng. in Computer Science and a PhD in Computation. He held graduate fellowships from the National Science Foundation and MIT’s Lincoln Laboratory. His PhD dissertation on natively probabilistic computation won the MIT George M. Sprowls dissertation award in computer science, and his research on the Picture probabilistic programming language won an award at CVPR. He served on DARPA’s Information Science and Technology advisory board from 2010 to 2012, and currently serves on the editorial boards of the Journal of Machine Learning Research and the journal Statistics and Computing. He was an advisor to Google DeepMind and has co-founded two AI-related startups, one acquired and one currently operational.
Dan Roy (Univ of Toronto & Vector)
Noah Goodman (Stanford University)
More from the Same Authors

2019 Workshop: Machine Learning with Guarantees »
Ben London · Gintare Karolina Dziugaite · Daniel Roy · Thorsten Joachims · Aleksander Madry · John Shawe-Taylor
2019 Poster: Information-Theoretic Generalization Bounds for SGLD via Data-Dependent Estimates »
Jeffrey Negrea · Mahdi Haghifam · Gintare Karolina Dziugaite · Ashish Khisti · Daniel Roy 
2019 Poster: Fast-rate PAC-Bayes Generalization Bounds via Shifted Rademacher Processes »
Jun Yang · Shengyang Sun · Daniel Roy 
2019 Poster: Variational Bayesian Optimal Experimental Design »
Adam Foster · Martin Jankowiak · Elias Bingham · Paul Horsfall · Yee Whye Teh · Thomas Rainforth · Noah Goodman 
2019 Spotlight: Variational Bayesian Optimal Experimental Design »
Adam Foster · Martin Jankowiak · Elias Bingham · Paul Horsfall · Yee Whye Teh · Thomas Rainforth · Noah Goodman 
2018 Poster: Data-dependent PAC-Bayes priors via differential privacy »
Gintare Karolina Dziugaite · Daniel Roy 
2018 Poster: Bias and Generalization in Deep Generative Models: An Empirical Study »
Shengjia Zhao · Hongyu Ren · Arianna Yuan · Jiaming Song · Noah Goodman · Stefano Ermon 
2018 Spotlight: Bias and Generalization in Deep Generative Models: An Empirical Study »
Shengjia Zhao · Hongyu Ren · Arianna Yuan · Jiaming Song · Noah Goodman · Stefano Ermon 
2018 Poster: Multimodal Generative Models for Scalable Weakly-Supervised Learning »
Mike Wu · Noah Goodman 
2017 Poster: AIDE: An algorithm for measuring the accuracy of probabilistic inference algorithms »
Marco Cusumano-Towner · Vikash Mansinghka
2017 Poster: Learning Disentangled Representations with Semi-Supervised Deep Generative Models »
Siddharth Narayanaswamy · Brooks Paige · Jan-Willem van de Meent · Alban Desmaison · Noah Goodman · Pushmeet Kohli · Frank Wood · Philip Torr
2017 Tutorial: Engineering and Reverse-Engineering Intelligence Using Probabilistic Programs, Program Induction, and Deep Learning »
Josh Tenenbaum · Vikash Mansinghka 
2016 Poster: A Probabilistic Programming Approach To Probabilistic Data Analysis »
Feras Saad · Vikash Mansinghka 
2016 Poster: Measuring the reliability of MCMC inference with bidirectional Monte Carlo »
Roger Grosse · Siddharth Ancha · Daniel Roy 
2016 Poster: Neurally-Guided Procedural Models: Amortized Inference for Procedural Graphics Programs using Neural Networks »
Daniel Ritchie · Anna Thomas · Pat Hanrahan · Noah Goodman 
2015 Workshop: Bounded Optimality and Rational Metareasoning »
Samuel J Gershman · Falk Lieder · Tom Griffiths · Noah Goodman 
2014 Workshop: 3rd NIPS Workshop on Probabilistic Programming »
Daniel Roy · Josh Tenenbaum · Thomas Dietterich · Stuart J Russell · Yi Wu · Ulrik R Beierholm · Alp Kucukelbir · Zenna Tavares · Yura Perov · Daniel Lee · Brian Ruttenberg · Sameer Singh · Michael Hughes · Marco Gaboardi · Alexey Radul · Vikash Mansinghka · Frank Wood · Sebastian Riedel · Prakash Panangaden
2014 Poster: Gibbs-type Indian Buffet Processes »
Creighton Heaukulani · Daniel Roy 
2014 Poster: Mondrian Forests: Efficient Online Random Forests »
Balaji Lakshminarayanan · Daniel Roy · Yee Whye Teh 
2013 Poster: Approximate Bayesian Image Interpretation using Generative Probabilistic Graphics Programs »
Vikash Mansinghka · Tejas D Kulkarni · Yura N Perov · Josh Tenenbaum 
2013 Oral: Approximate Bayesian Image Interpretation using Generative Probabilistic Graphics Programs »
Vikash Mansinghka · Tejas D Kulkarni · Yura N Perov · Josh Tenenbaum 
2013 Poster: Learning and using language via recursive pragmatic reasoning about other agents »
Nathaniel J Smith · Noah Goodman · Michael C Frank 
2013 Poster: Learning Stochastic Inverses »
Andreas Stuhlmüller · Jacob Taylor · Noah Goodman 
2013 Session: Session Chair »
Daniel Roy 
2013 Session: Tutorial Session B »
Daniel Roy 
2012 Workshop: Probabilistic Programming: Foundations and Applications (2 day) »
Vikash Mansinghka · Daniel Roy · Noah Goodman 
2012 Poster: Random function priors for exchangeable graphs and arrays »
James R Lloyd · Daniel Roy · Peter Orbanz · Zoubin Ghahramani 
2012 Poster: Burn-in, bias, and the rationality of anchoring »
Falk Lieder · Tom Griffiths · Noah Goodman 
2011 Poster: Complexity of Inference in Latent Dirichlet Allocation »
David Sontag · Daniel Roy 
2011 Spotlight: Complexity of Inference in Latent Dirichlet Allocation »
David Sontag · Daniel Roy 
2011 Poster: Nonstandard Interpretations of Probabilistic Programs for Efficient Inference »
David Wingate · Noah Goodman · Andreas Stuhlmueller · Jeffrey M Siskind 
2009 Demonstration: Monte: An Interactive System for Massively Parallel Probabilistic Programming »
Vikash Mansinghka 
2009 Demonstration: The IID: A Natively Probabilistic Reconfigurable Computer »
Vikash Mansinghka 
2008 Workshop: Probabilistic Programming: Universal Languages, Systems and Applications »
Daniel Roy · John Winn · David A McAllester · Vikash Mansinghka · Josh Tenenbaum 
2008 Oral: The Mondrian Process »
Daniel Roy · Yee Whye Teh 
2008 Poster: The Mondrian Process »
Daniel Roy · Yee Whye Teh 
2007 Poster: Bayesian Agglomerative Clustering with Coalescents »
Yee Whye Teh · Hal Daumé III · Daniel Roy 
2007 Oral: Bayesian Agglomerative Clustering with Coalescents »
Yee Whye Teh · Hal Daumé III · Daniel Roy 
2006 Poster: Learning annotated hierarchies from relational data »
Daniel Roy · Charles Kemp · Vikash Mansinghka · Josh Tenenbaum 
2006 Talk: Learning annotated hierarchies from relational data »
Daniel Roy · Charles Kemp · Vikash Mansinghka · Josh Tenenbaum 
2006 Demonstration: Blaise: A System for Interactive Development of High Performance Inference Algorithms »
Keith Bonawitz · Vikash Mansinghka