Workshop
Sat Dec 14 08:00 AM -- 06:00 PM (PST) @ West 114 + 115
Program Transformations for ML
Pascal Lamblin · Atilim Gunes Baydin · Alexander Wiltschko · Bart van Merriënboer · Emily Fertig · Barak Pearlmutter · David Duvenaud · Laurent Hascoet

Machine learning researchers often express complex models as programs, relying on program transformations to add functionality. New languages and transformations (e.g., TorchScript and TensorFlow AutoGraph) are becoming core capabilities of ML libraries. However, existing transformations, such as automatic differentiation (AD), inference in probabilistic programming languages (PPLs), and optimizing compilers, are often built in isolation and limited in scope. This workshop aims to view program transformations in ML in a unified light, to make these capabilities more accessible, and to build entirely new ones.
Program transformations are an area of active study. AD transforms a program performing numerical computation into one that computes the gradient of that computation. In a PPL, a program describing a sampling procedure can be transformed into one that performs inference on model parameters given observations. Other examples include vectorizing a program expressed on a single data point, and learned transformations in which ML models take programs as inputs or produce them as outputs.
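As an illustrative sketch (not part of the workshop materials), the AD transformation described above can be realized with dual numbers and operator overloading; the minimal `Dual` class and `grad` wrapper below are hypothetical names chosen for this example:

```python
class Dual:
    """Dual number carrying a value and its derivative (forward-mode AD)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def grad(f):
    """Transform a scalar function f into one returning df/dx at x."""
    return lambda x: f(Dual(x, 1.0)).dot

f = lambda x: 3 * x * x + 2 * x   # f'(x) = 6x + 2
print(grad(f)(4.0))               # 26.0
```

The same "program in, program out" idea underlies systems like JAX, which compose such transformations (differentiation, vectorization, compilation) as higher-order functions.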
This workshop will bring together researchers in the fields of AD, programming languages, compilers, and ML, with the goal of understanding the commonalities between disparate approaches and views, and of sharing ways to make these techniques broadly available. Doing so would enable ML practitioners to iterate faster on novel models and architectures (e.g., those naturally expressed through high-level constructs like recursion).
Topics:
—Abstractions and syntax (beyond meta-programming and operator overloading) to naturally express a program (an expression or a procedure) as an object to be manipulated.
—Techniques from AD and PPLs that the ML community could adopt to enable research on new models.
—How to overcome challenges posed by ML's specific hardware (GPUs, specialized chips) and software (Python) stacks, and by the particular demands practitioners place on their tools.
—Greater collaboration between the ML and programming languages communities.
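To illustrate the first topic, a common way to turn a program into a manipulable object is tracing via operator overloading: running the function on symbolic inputs records an expression graph instead of computing numbers. The `Expr` class and `var` helper below are a hypothetical minimal sketch, not an API from any of the libraries named above:

```python
class Expr:
    """Node in an expression graph captured by tracing."""
    def __init__(self, op, args):
        self.op, self.args = op, args

    def __add__(self, other):
        return Expr('add', [self, other])

    def __mul__(self, other):
        return Expr('mul', [self, other])

    def __repr__(self):
        if self.op == 'var':
            return self.args[0]
        return f"({self.op} {' '.join(map(repr, self.args))})"

def var(name):
    """Create a symbolic input variable."""
    return Expr('var', [name])

# Calling an ordinary Python function on symbolic inputs yields a
# program object that a transformation (AD, vectorization, a compiler
# pass) can then walk and rewrite.
def f(x, y):
    return x * y + x

graph = f(var('x'), var('y'))
print(graph)   # (add (mul x y) x)
```

This is essentially how tracing-based systems capture programs; language-level approaches (e.g., TorchScript and AutoGraph) instead work on the source or AST, which also captures control flow that tracing misses.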

Schedule:
Opening statements (Introduction)
Jan-Willem van de Meent - Compositional Methods for Learning and Inference in Deep Probabilistic Programs (Talk)
Applications of a disintegration transformation (Talk)
Coffee break (Break)
Christine Tasson - Semantics of Functional Probabilistic Programs (Talk)
The Differentiable Curry (Talk)
Functional Tensors for Probabilistic Programming (Talk)
Lunch break & Poster session (Poster Session)
Optimized execution of PyTorch programs with TorchScript (Talk)
Skye Wanderman-Milne - JAX: accelerated machine-learning research via composable function transformations in Python (Talk)
Coffee break (Break)
Generalized Abs-Linear Learning (Talk)
Towards Polyhedral Automatic Differentiation (Talk)
Taylor-Mode Automatic Differentiation for Higher-Order Derivatives in JAX (Talk)
Panel and general discussion (Panel Discussion)