

Poster in Workshop: Differentiable Programming Workshop

GPU Accelerated Automatic Differentiation with Clad

Vassil Vassilev · David Lange


Abstract:

Automatic Differentiation (AD) is a fundamental method that empowers computational algorithms across a range of fields, including Machine Learning, Robotics and High Energy Physics. We present methods enabling well-behaved C++ functions to be automatically differentiated on a GPU without the need for code modification. This work opens up a new layer of optimisation and a proportional speed-up when computing gradients. The aim of this effort is to provide an AD tool that integrates easily into existing frameworks as a compiler plugin extending the Clang compiler, and that can also be used interactively, for example as a Jupyter kernel extension or a plugin for an interactive environment. It will provide researchers with the means to reuse pre-existing models and have their workloads scheduled on parallel processors without needing to hand-optimise their computational kernels.
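
To make the "no code modification" claim concrete, here is a minimal sketch of how Clad is typically invoked on an ordinary C++ function using its clad::differentiate API; the build flags and plugin path in the comments are illustrative assumptions that depend on the local Clad installation, and GPU offloading of the generated derivative is not shown.

// example.cpp
// Compile with something like (paths are assumptions):
//   clang++ -fplugin=/path/to/libclad.so -I /path/to/clad/include example.cpp
#include "clad/Differentiator/Differentiator.h"
#include <cstdio>

// An ordinary C++ function; no annotations or rewrites are required.
double square(double x) { return x * x; }

int main() {
  // Clad synthesises the derivative at compile time inside the Clang plugin.
  auto d_square = clad::differentiate(square, "x");
  // Evaluate d(square)/dx at x = 3; prints 6.0.
  printf("d(square)/dx at x=3: %f\n", d_square.execute(3.0));
  return 0;
}

Because the derivative is generated during compilation rather than by tracing at runtime, the resulting code is ordinary C++ that the compiler can optimise and, in principle, schedule on a parallel processor like any other kernel.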