Workshop
Sat Dec 14 08:00 AM -- 06:00 PM (PST) @ West 114 + 115
Program Transformations for ML
Pascal Lamblin · Atilim Gunes Baydin · Alexander Wiltschko · Bart van Merriënboer · Emily Fertig · Barak Pearlmutter · David Duvenaud · Laurent Hascoet

Machine learning researchers often express complex models as programs, relying on program transformations to add functionality. New languages and transformations (e.g., TorchScript and TensorFlow AutoGraph) are becoming core capabilities of ML libraries. However, existing transformations, such as automatic differentiation (AD), inference in probabilistic programming languages (PPL), and optimizing compilers, are often built in isolation and limited in scope. This workshop aims to view program transformations in ML in a unified light, to make these capabilities more accessible, and to build entirely new ones.
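As a concrete illustration of such a transformation, a minimal sketch using torch.jit.script, which turns a plain Python function, control flow included, into a TorchScript program with an inspectable intermediate representation. The function clipped_scale is an invented example, not part of the workshop material:

```python
import torch

def clipped_scale(x: torch.Tensor, limit: float) -> torch.Tensor:
    # Data-dependent control flow, preserved by scripting
    peak = float(x.abs().max())
    if peak > limit:
        return x * (limit / peak)
    return x

# torch.jit.script transforms the Python function into a TorchScript
# program whose IR can be inspected, optimized, and run without Python.
scripted = torch.jit.script(clipped_scale)
print(scripted.graph)                            # the TorchScript IR
print(scripted(torch.tensor([1.0, -5.0]), 2.0))  # behaves like the original
```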
Program transformations are an area of active study. AD transforms a program that performs numerical computation into one that computes the gradient of that computation. In PPL, a program describing a sampling procedure can be transformed to perform inference on model parameters given observations. Other examples include vectorizing a program expressed on a single data point, and learned transformations in which ML models take programs as inputs or outputs.
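A minimal sketch of two such transformations, assuming JAX (one of the systems presented at this workshop): jax.grad derives a gradient program from a numerical one, and jax.vmap vectorizes a per-example program over a batch. The loss function here is a hypothetical example:

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # A scalar-valued program on a single data point
    return (jnp.dot(w, x) - y) ** 2

# AD as a program transformation: grad(loss) is a new program
# computing d(loss)/dw.
grad_loss = jax.grad(loss)

# Vectorization as a program transformation: vmap lifts the
# per-example gradient program to a whole batch without rewriting it.
batched_grad = jax.vmap(grad_loss, in_axes=(None, 0, 0))

w = jnp.array([1.0, -2.0])
xs = jnp.ones((4, 2))
ys = jnp.zeros(4)
print(batched_grad(w, xs, ys).shape)  # (4, 2): one gradient per example
```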
This workshop will bring together researchers in the fields of AD, programming languages, compilers, and ML, with the goal of understanding the commonalities between disparate approaches and views, and sharing ways to make these techniques broadly available. This would enable ML practitioners to iterate faster on novel models and architectures (e.g., those naturally expressed through high-level constructs like recursion).
Topics:
—Abstractions and syntax (beyond meta-programming and operator overloading) to naturally express a program (expression, or procedure) as an object to be manipulated (see the sketch after this list)
—Techniques from AD and PPL that the ML community could adopt to enable research on new models
—How to overcome challenges posed by ML's specific hardware (GPUs, specialized chips) and software (Python) stacks, and the particular demands practitioners place on their tools
—Greater collaboration between the ML and programming languages communities
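On the first topic, a minimal sketch of a program reified as a manipulable object, again assuming JAX: jax.make_jaxpr stages a hypothetical function f into a jaxpr, an explicit intermediate representation that later transformations can traverse and rewrite.

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * x + 1.0

# make_jaxpr reifies f as a data structure (a jaxpr) rather than
# running it: the program itself becomes an object to inspect,
# analyze, or transform.
jaxpr = jax.make_jaxpr(f)(2.0)
print(jaxpr)  # prints the jaxpr IR, roughly: { lambda ; a. let b = sin a ... }
```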

Opening statements (Introduction)
Compositional Methods for Learning and Inference in Deep Probabilistic Programs (Talk)
Jan-Willem van de Meent
Applications of a disintegration transformation (Talk)
Praveen Narayanan
Coffee break (Break)
Semantics of Functional Probabilistic Programs (Talk)
Christine Tasson
The Differentiable Curry (Talk)
Dimitrios Vytiniotis
Functional Tensors for Probabilistic Programming (Talk)
Fritz Obermeyer
Lunch break & Poster session (Poster Session)
Breandan Considine, Mike Innes, Du Phan, Dougal Maclaurin, Robin Manhaeve, Alexey Radul, Shashi Gowda, Ekansh Sharma, Eli Sennesh, Maxim K Kochurov, Gordon Plotkin, Thomas Wiecki, Navjot Kukreja, Chung-chieh Shan, Matthew Johnson, Dan Belov, Neeraj Pradhan, Wannes Meert, Angelika Kimmig, Luc De Raedt, Brian Patton, Matthew Hoffman, Rif A. Saurous, Dan Roy, Eli Bingham, Martin Jankowiak, Colin Carroll, Junpeng Lao, Liam Paull, Martin Abadi, Angel Rojas Jimenez, JP Chen
Optimized execution of PyTorch programs with TorchScript (Talk)
Zachary DeVito
JAX: accelerated machine-learning research via composable function transformations in Python (Talk)
Skye Wanderman-Milne
Coffee break (Break)
Generalized Abs-Linear Learning (Talk)
Andreas Griewank
Towards Polyhedral Automatic Differentiation (Talk)
Jan Hueckelheim
Taylor-Mode Automatic Differentiation for Higher-Order Derivatives in JAX (Talk)
Jesse Bettencourt
Panel and general discussion (Panel Discussion)