Workshop
Mon Dec 13 06:00 AM -- 03:00 PM (PST)
Differentiable Programming Workshop
Ludger Paehler · William Moses · Maria I Gorinova · Assefaw H. Gebremedhin · Jan Hueckelheim · Sri Hari Krishna Narayanan

Differentiable programming allows derivatives of functions written in a high-level language to be computed automatically. It has become increasingly popular within the machine learning (ML) community: differentiable programming underlies backpropagation in neural networks, probabilistic programming, and Bayesian inference. Fundamentally, differentiable programming frameworks empower machine learning and its applications: the availability of efficient and composable automatic differentiation (AD) tools has led to advances in optimization, differentiable simulators, engineering, and science.
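As a concrete, self-contained illustration of the style of programming described above, the sketch below uses JAX (one of the frameworks appearing in the program) to differentiate an ordinary numerical Python function. The function, data, and names are illustrative assumptions, not taken from any workshop submission.

```python
# Minimal sketch of differentiable programming with JAX (illustrative only).
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # A simple least-squares objective written as an ordinary Python function.
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# jax.grad composes with the rest of the language: the derivative of `loss`
# with respect to its first argument is itself just another callable.
grad_loss = jax.grad(loss)

w = jnp.ones(3)
x = jnp.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
y = jnp.array([1.0, 2.0])
print(grad_loss(w, x, y))  # gradient with respect to w, shape (3,)
```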

While AD tools have greatly increased the productivity of ML scientists and practitioners, many problems remain unsolved. Crucially, there is little communication between the broad group of AD users, programming-languages researchers, and differentiable programming developers, so these groups largely work in isolation. We propose the Differentiable Programming workshop as a forum to narrow the gaps between the design of differentiable and probabilistic languages, efficient automatic differentiation engines, and higher-level applications of differentiable programming. We hope this workshop will foster closer collaboration between language designers and domain scientists by bringing together a diverse cross-section of the differentiable programming community, including people working on core automatic differentiation tools, higher-level frameworks that rely on AD (such as probabilistic programming and differentiable simulators), and applications that use differentiable programs to solve scientific problems.

The explicit goals of the workshop are to:
1. Foster closer collaboration and synergies between the individual communities;
2. Evaluate the merits of differentiable design constructs and the impact they have on the algorithm design space and usability of the language;
3. Highlight differentiable techniques of individual domains, and the potential they hold for other fields.

Welcome (Short Introduction & Welcome to the Workshop)
Parallel-Friendly Automatic Differentiation in Dex and JAX (Invited Talk)
SYMPAIS: SYMbolic Parallel Adaptive Importance Sampling for Probabilistic Program Analysis (Invited Talk)
Differentiable Scripting (Oral)
A research framework for writing differentiable PDE discretizations in JAX (Oral)
Break
Differentiable Programming in Molecular Physics (Invited Talk)
Diffractor.jl: High Level, High Performance AD for Julia (Invited Talk)
Equinox: neural networks in JAX via callable PyTrees and filtered transformations (Oral)
A fully-differentiable compressible high-order computational fluid dynamics solver (Oral)
Short Break (Break)
Poster Session
AbstractDifferentiation.jl: Backend-Agnostic Differentiable Programming in Julia (Poster)
GPU Accelerated Automatic Differentiation with Clad (Poster)
Gradients of the Big Bang: Solving the Einstein-Boltzmann Equations with Automatic Differentiation (Poster)
On automatic differentiation for the Matérn covariance (Poster)
Neural Differentiable Predictive Control (Poster)
Aggregated type handling in AD tape implementations (Poster)
Backpropagation through Back substitution with a Backslash (Poster)
Extended Abstract – Enzyme.jl: Low-level auto-differentiation meets high-level language (Poster)
Unbiased Reparametrisation Gradient via Smoothing and Diagonalisation (Poster)
Differentiable Parametric Optimization Approach to Power System Load Modeling (Poster)
Short Break (Break)
Learning from Data through the Lens of Ocean Models, Surrogates, and their Derivatives (Invited Talk)
Learnable Physics Models (Invited Talk)
Escaping the abstraction: a foreign function interface for the Unified Form Language [UFL] (Oral)
Towards Denotational Semantics of AD for Higher-Order, Recursive, Probabilistic Languages (Oral)
Break
Differentiable Programming for Protein Sequences and Structure (Invited Talk)
Approximate High Performance Computing Guided by Automatic Differentiation (Invited Talk)
A Complete Axiomatization of Forward Differentiation (Oral)
Generalizability of density functionals learned from differentiable programming on weakly correlated spin-polarized systems (Oral)
Social