The future of gradient-based machine learning software & techniques
Alex Wiltschko · Bart van Merriënboer · Pascal Lamblin

Sat Dec 9th 08:00 AM -- 06:30 PM @ 104 C

Many algorithms in machine learning, computer vision, physical simulation, and other fields require the calculation of gradients and other derivatives. Manual derivation of gradients can be time-consuming and error-prone. Automatic differentiation comprises a set of techniques for calculating the derivative of a numerical computation expressed as a computer program. These techniques have long been used in atmospheric sciences and computational fluid dynamics, and have more recently been adopted by machine learning researchers.
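To make the idea concrete, here is a minimal sketch of forward-mode automatic differentiation using dual numbers. The `Dual` class and `derivative` helper are illustrative names, not part of any library: each arithmetic operation propagates both a value and its derivative via the chain rule, so the derivative of a whole program emerges from its elementary operations.

```python
class Dual:
    """A number paired with its derivative, propagated by the chain rule."""

    def __init__(self, val, dot=0.0):
        self.val = val   # primal value
        self.dot = dot   # derivative w.r.t. the chosen input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u * v)' = u' * v + u * v'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate f(x) and df/dx in a single forward pass."""
    out = f(Dual(x, 1.0))  # seed the input derivative with 1
    return out.val, out.dot


# d/dx (x*x + 3*x) at x = 2 is 2*2 + 3 = 7
val, grad = derivative(lambda x: x * x + 3 * x, 2.0)
```

Reverse mode, the generalization of backpropagation, instead records the operations and sweeps backward through them; it is preferred in machine learning because one pass yields gradients with respect to all inputs at once.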

Practitioners across many fields have built a wide range of automatic differentiation tools, using different programming languages, computational primitives, and intermediate compiler representations. Each of these choices involves trade-offs in usability, flexibility, and performance in specific domains.

This workshop will bring together researchers in the fields of automatic differentiation and machine learning to discuss ways in which advanced automatic differentiation frameworks and techniques can enable more expressive machine learning models, run large-scale machine learning on accelerators with better performance, and increase the usability of machine learning frameworks for practitioners. Topics for discussion will include:

* What abstractions (languages, kernels, interfaces, instruction sets) do we need to develop advanced automatic differentiation frameworks for the machine learning ecosystem?
* What different use cases exist in machine learning, from large-scale performance-critical models to small prototypes, and how should our toolsets reflect these needs?
* What advanced techniques from the automatic differentiation literature, such as checkpointing, differentiating through iterative processes or chaotic systems, cross-country elimination, etc., could be adopted by the ML community to enable research on new models?
* How can we foster greater collaboration between the fields of machine learning and automatic differentiation?
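One of the techniques listed above, checkpointing, can be sketched in a few lines. The code below is an illustrative toy, not any framework's implementation: reverse-mode AD through a long loop normally stores every intermediate state, but with checkpointing only every k-th state is stored, and the states in between are recomputed during the backward pass, trading compute for memory. The names `step`, `step_grad`, and `grad_with_checkpoints` are assumptions made for this example.

```python
def step(x):
    return 0.5 * x + 1.0      # one loop iteration


def step_grad(x):
    return 0.5                # local derivative d step(x) / dx


def grad_with_checkpoints(x0, n_steps, k):
    """Gradient of x_n w.r.t. x_0, storing only every k-th state."""
    # Forward pass: keep only every k-th intermediate state.
    checkpoints = {0: x0}
    x = x0
    for i in range(n_steps):
        x = step(x)
        if (i + 1) % k == 0 and (i + 1) < n_steps:
            checkpoints[i + 1] = x

    # Backward pass: for each segment between checkpoints, restore the
    # checkpoint, recompute the segment's states, then apply the chain
    # rule through them in reverse order.
    boundaries = sorted(checkpoints) + [n_steps]
    adjoint = 1.0             # d x_n / d x_n
    for si in range(len(boundaries) - 2, -1, -1):
        seg_start, seg_end = boundaries[si], boundaries[si + 1]
        xs = [checkpoints[seg_start]]
        for _ in range(seg_start, seg_end - 1):
            xs.append(step(xs[-1]))
        for x in reversed(xs):
            adjoint *= step_grad(x)
    return adjoint            # d x_n / d x_0


# With step' = 0.5 everywhere, the gradient after 10 steps is 0.5**10.
g = grad_with_checkpoints(2.0, n_steps=10, k=3)
```

Divide-and-conquer schemes such as the one discussed in the afternoon talks place checkpoints recursively rather than uniformly, which gives logarithmic memory growth in the number of steps.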

09:00 AM Introduction and opening remarks (Talk) Alex Wiltschko
09:10 AM Beyond backprop: automatic differentiation in machine learning (Talk) Atilim Gunes Baydin
09:50 AM Automatic differentiation in PyTorch (Talk) Adam Paszke
10:30 AM Morning Coffee Break (Break)
11:00 AM Optimal Smoothing for Pathwise Adjoints (Talk) Jonathan Hüser
11:40 AM Poster session (Poster Session) Uwe Naumann, Lane Schwartz, Richard Wei, Eric Meissner, Jeff Druce, Zeming Lin, Alex Pothen, Edward Yang
01:40 PM Algorithmic differentiation techniques in the deep learning context (Talk) Jean Utke
02:20 PM Some highlights on Source-to-Source Adjoint AD (Talk) Laurent Hascoet
03:00 PM Afternoon Coffee Break (Break)
03:30 PM Divide-and-Conquer Checkpointing for Arbitrary Programs with No User Annotation (Talk) Jeffrey M Siskind
04:10 PM Automatic Differentiation of Parallelised Convolutional Neural Networks - Lessons from Adjoint PDE Solvers (Talk) Jan Hueckelheim
04:50 PM Panel discussion (Discussion Panel) Atilim Gunes Baydin, Adam Paszke, Jonathan Hüser, Jean Utke, Laurent Hascoet, Jeffrey M Siskind, Jan Hueckelheim, Andreas Griewank

Author Information

Alex Wiltschko (Google)
Bart van Merriënboer (MILA / Google Brain)
Pascal Lamblin (Google)
