Workshop
Program Transformations for ML
Pascal Lamblin · Atilim Gunes Baydin · Alexander Wiltschko · Bart van Merriënboer · Emily Fertig · Barak Pearlmutter · David Duvenaud · Laurent Hascoet

Sat Dec 14th 08:00 AM -- 06:00 PM @ West 114 + 115
Event URL: https://program-transformations.github.io/

Machine learning researchers often express complex models as programs, relying on program transformations to add functionality. New languages and transformations (e.g., TorchScript and TensorFlow AutoGraph) are becoming core capabilities of ML libraries. However, existing transformations, such as automatic differentiation (AD), inference in probabilistic programming languages (PPL), and optimizing compilers, are often built in isolation and limited in scope. This workshop aims to view program transformations in ML in a unified light, to make these capabilities more accessible, and to build entirely new ones.
Program transformations are an area of active study. AD transforms a program performing numerical computations into one that computes the gradient of those computations. In PPL, a program describing a sampling procedure can be modified to perform inference on model parameters given observations. Other examples include vectorizing a program written for a single data point, and learned transformations in which ML models take programs as inputs or produce them as outputs.
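As a concrete illustration of the first and third transformations, here is a minimal sketch using JAX (the loss function, weights, and data below are illustrative, not drawn from any talk): jax.grad derives a gradient program from a numerical program, and jax.vmap derives a batched program from a per-example one.

    import jax
    import jax.numpy as jnp

    # A program computing a scalar loss for a single data point.
    def loss(w, x, y):
        return (jnp.dot(w, x) - y) ** 2

    # AD: derive a new program computing the gradient with respect to w.
    grad_loss = jax.grad(loss)

    # Vectorization: derive a program operating on a whole batch at once.
    batched_loss = jax.vmap(loss, in_axes=(None, 0, 0))

    w = jnp.array([1.0, -2.0])
    xs = jnp.ones((8, 2))
    ys = jnp.zeros(8)
    print(grad_loss(w, xs[0], ys[0]))  # gradient for one example
    print(batched_loss(w, xs, ys))     # per-example losses for the batch

In both cases the user writes an ordinary per-example program, and the library mechanically produces a transformed program with new capabilities.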
This workshop will bring together researchers from the AD, programming languages, compilers, and ML communities, with the goal of understanding the commonalities between disparate approaches and sharing ways to make these techniques broadly available. Doing so would enable ML practitioners to iterate faster on novel models and architectures (e.g., those naturally expressed through high-level constructs like recursion).
Topics:
—Abstractions and syntax (beyond meta-programming and operator overloading) to naturally express a program (expression or procedure) as an object to be manipulated (see the sketch after this list)
—Techniques from AD and PPL that the ML community could adopt to enable research on new models
—How to overcome the challenges posed by ML's specific hardware (GPUs, specialized chips) and software (Python) stacks, and by the particular demands practitioners place on their tools
—Greater collaboration between the ML and programming-languages communities
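To make the first topic concrete, the following minimal sketch (hypothetical, written for this summary; the Tracer class and function f are illustrative, not from any particular library) shows how operator overloading turns an ordinary procedure into an expression object that a transformation could then manipulate:

    # Hypothetical sketch of tracing via operator overloading: evaluating
    # ordinary code on Tracer objects records an expression tree.
    class Tracer:
        def __init__(self, op, *args):
            self.op, self.args = op, args

        def __add__(self, other):
            return Tracer("add", self, other)

        def __mul__(self, other):
            return Tracer("mul", self, other)

        def __repr__(self):
            if self.op == "var":
                return self.args[0]
            return "(%s %s)" % (self.op, " ".join(map(repr, self.args)))

    def f(x, y):
        return x * x + y  # ordinary-looking numerical code

    # Tracing turns the procedure f into an object that a transformation
    # could differentiate, vectorize, or compile.
    expr = f(Tracer("var", "x"), Tracer("var", "y"))
    print(expr)  # prints: (add (mul x x) y)

Source-to-source systems reach the same kind of program object by analyzing the program text directly; finding abstractions that go beyond both approaches is part of what this topic asks.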

08:30 AM Opening statements (Introduction)
08:40 AM TBA (Talk) Jan-Willem van de Meent
09:30 AM Applications of a disintegration transformation (Talk) Praveen Narayanan
09:50 AM Coffee break (Break)
10:30 AM TBA (Talk) Christine Tasson
11:20 AM The Differentiable Curry (Talk) Dimitrios Vytiniotis
11:40 AM Functional Tensors for Probabilistic Programming (Talk) Fritz Obermeyer
12:00 PM Lunch break & Poster session (Poster Session)
Breandan Considine, Mike Innes, Du Phan, Dougal Maclaurin, Robin Manhaeve, Alexey Radul, Shashi Gowda, Ekansh Sharma, Eli Sennesh, Maxim K Kochurov, Gordon Plotkin, Thomas Wiecki, Navjot Kukreja, Chung-chieh Shan, Matthew Johnson, Dan Belov, Neeraj Pradhan, Wannes Meert, Angelika Kimmig, Luc De Raedt, Brian Patton, Matthew Hoffman, Rif A. Saurous, Dan Roy, Eli Bingham, Martin Jankowiak, Colin Carroll, Junpeng Lao, Liam Paull, Martin Abadi, Angel Rojas Jimenez, JP Chen
02:00 PM Optimized execution of PyTorch programs with TorchScript (Talk) Zachary DeVito
02:50 PM TBA (Talk) Skye Wanderman-Milne
03:40 PM Coffee break (Break)
04:20 PM Generalized Abs-Linear Learning (Talk) Andreas Griewank
04:40 PM Towards Polyhedral Automatic Differentiation (Talk) Jan Hueckelheim
05:00 PM Taylor-Mode Automatic Differentiation for Higher-Order Derivatives in JAX (Talk) Jesse Bettencourt
05:20 PM Panel and general discussion (Panel Discussion)

Author Information

Pascal Lamblin (Google)
Atilim Gunes Baydin (University of Oxford)
Alexander Wiltschko (Google Brain)
Bart van Merriënboer (Google)
Emily Fertig (Google Research)
Barak Pearlmutter (Maynooth University)
David Duvenaud (University of Toronto)
Laurent Hascoet (INRIA)
