Workshop
The future of gradient-based machine learning software & techniques
Alex Wiltschko · Bart van Merriënboer · Pascal Lamblin

Sat Dec 09 08:00 AM -- 06:30 PM (PST) @ 104 C
Event URL: https://autodiff-workshop.github.io/

Many algorithms in machine learning, computer vision, physical simulation, and other fields require the calculation of gradients and other derivatives. Manual derivation of gradients can be time-consuming and error-prone. Automatic differentiation comprises a set of techniques to calculate the derivative of a numerical computation expressed as a computer program. These techniques are commonly used in atmospheric sciences and computational fluid dynamics, and have more recently also been adopted by machine learning researchers.
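As a minimal illustration of the idea (not taken from any of the tools presented at the workshop), the Python sketch below implements forward-mode automatic differentiation with dual numbers: each value carries its derivative alongside it, and overloaded arithmetic propagates derivatives exactly via the sum and product rules, rather than approximating them with finite differences.

```python
# Minimal forward-mode automatic differentiation via dual numbers.
# Each Dual holds a primal value and its derivative with respect to
# one chosen input; arithmetic propagates both together.

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value = value  # primal value
        self.deriv = deriv  # derivative w.r.t. the seeded input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate df/dx at x by seeding the input's derivative with 1."""
    return f(Dual(x, 1.0)).deriv


print(derivative(lambda x: 3 * x * x + 2 * x + 1, 4.0))  # 26.0, i.e. 6*4 + 2
```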

Practitioners across many fields have built a wide range of automatic differentiation tools, using different programming languages, computational primitives, and intermediate compiler representations. Each of these choices comes with trade-offs in usability, flexibility, and performance for specific domains.

This workshop will bring together researchers in the fields of automatic differentiation and machine learning to discuss ways in which advanced automatic differentiation frameworks and techniques can enable more sophisticated machine learning models, run large-scale machine learning on accelerators with better performance, and increase the usability of machine learning frameworks for practitioners. Topics for discussion will include:

* What abstractions (languages, kernels, interfaces, instruction sets) do we need to develop advanced automatic differentiation frameworks for the machine learning ecosystem?
* What different use cases exist in machine learning, from large-scale performance-critical models to small prototypes, and how should our toolsets reflect these needs?
* What advanced techniques from the automatic differentiation literature, such as checkpointing (sketched briefly after this list), differentiating through iterative processes or chaotic systems, cross-country elimination, etc., could be adopted by the ML community to enable research on new models?
* How can we foster greater collaboration between the fields of machine learning and automatic differentiation?
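To make the checkpointing item above concrete: gradient checkpointing stores only a subset of intermediate values during the forward pass and recomputes the rest during the backward pass, trading compute for memory. The sketch below is a hedged illustration using PyTorch's torch.utils.checkpoint API (assuming PyTorch is installed; expensive_segment and the tensor sizes are made-up stand-ins, not anything from the workshop itself).

```python
import torch
from torch.utils.checkpoint import checkpoint

def expensive_segment(x):
    # Stand-in for a deep sub-network whose intermediate activations
    # we would rather not keep in memory.
    for _ in range(10):
        x = torch.tanh(x @ x.t())
    return x

x = torch.randn(64, 64, requires_grad=True)

# Plain evaluation: every intermediate tensor in the segment is kept
# alive so that backward() can reuse it.
expensive_segment(x).sum().backward()
x.grad = None  # reset before the checkpointed run

# Checkpointed evaluation: only the segment's input is saved; the
# intermediates are recomputed when backward() reaches the segment,
# trading extra forward compute for lower peak memory.
checkpoint(expensive_segment, x).sum().backward()
```

Divide-and-conquer variants of this idea (cf. the Siskind talk in the schedule below) apply the same recompute-instead-of-store trade-off recursively across an arbitrary program.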

Sat 9:00 a.m. - 9:10 a.m.
Introduction and opening remarks (Talk)
Alex Wiltschko
Sat 9:10 a.m. - 9:50 a.m.
Beyond backprop: automatic differentiation in machine learning (Talk)
Atilim Gunes Baydin
Sat 9:50 a.m. - 10:30 a.m.
Automatic differentiation in PyTorch (Talk)
Adam Paszke
Sat 10:30 a.m. - 11:00 a.m.
Morning Coffee Break (Break)
Sat 11:00 a.m. - 11:40 a.m.
Optimal Smoothing for Pathwise Adjoints (Talk)
Jonathan Hüser
Sat 11:40 a.m. - 1:40 p.m.
Poster session (Poster Session)
Uwe Naumann, Lane Schwartz, Richard Wei, Eric Meissner, Jeff Druce, Zeming Lin, Alex Pothen, Edward Yang
Sat 1:40 p.m. - 2:20 p.m.
Algorithmic differentiation techniques in the deep learning context (Talk)
Jean Utke
Sat 2:20 p.m. - 3:00 p.m.
Some highlights on Source-to-Source Adjoint AD (Talk)
Laurent Hascoet
Sat 3:00 p.m. - 3:30 p.m.
Afternoon Coffee Break (Break)
Sat 3:30 p.m. - 4:10 p.m.
Divide-and-Conquer Checkpointing for Arbitrary Programs with No User Annotation (Talk)
Jeffrey M Siskind
Sat 4:10 p.m. - 4:50 p.m.
Automatic Differentiation of Parallelised Convolutional Neural Networks - Lessons from Adjoint PDE Solvers (Talk)
Jan Hueckelheim
Sat 4:50 p.m. - 6:00 p.m.
Panel discussion (Discussion Panel)
Atilim Gunes Baydin, Adam Paszke, Jonathan Hüser, Jean Utke, Laurent Hascoet, Jeffrey M Siskind, Jan Hueckelheim, Andreas Griewank

Author Information

Alex Wiltschko (Google)
Bart van Merriënboer (MILA / Google Brain)
Pascal Lamblin (Google)