

Poster

Differentiable Spline Approximations

Minsu Cho · Aditya Balu · Ameya Joshi · Anjana Deva Prasad · Biswajit Khara · Soumik Sarkar · Baskar Ganapathysubramanian · Adarsh Krishnamurthy · Chinmay Hegde

Keywords: [ Optimization ] [ Machine Learning ]


Abstract:

The paradigm of differentiable programming has significantly enhanced the scope of machine learning via the judicious use of gradient-based optimization. However, standard differentiable programming methods (such as autodiff) typically require that the machine learning models be differentiable, limiting their applicability. Our goal in this paper is to use a new, principled approach to extend gradient-based optimization to functions well modeled by splines, which encompass a large family of piecewise polynomial models. We derive the form of the (weak) Jacobian of such functions and show that it exhibits a block-sparse structure that can be computed implicitly and efficiently. Overall, we show that leveraging this redesigned Jacobian in the form of a differentiable "layer" in predictive models leads to improved performance in diverse applications such as image segmentation, 3D point cloud reconstruction, and finite element analysis. We also open-source the code at https://github.com/idealab-isu/DSA.
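To make the key idea concrete, below is a minimal sketch (not the authors' released implementation; see the repository above for that) of a differentiable spline-evaluation layer in PyTorch, assuming cubic B-splines on a fixed uniform knot vector. For a spline f(x) = Σ_i c_i B_i(x), the Jacobian with respect to the control points c is the basis matrix B(x); because each basis function has local support, this matrix is banded, which is the kind of sparse structure the abstract refers to. The function and variable names here are illustrative only.

```python
import torch

def bspline_basis(x, knots, degree=3):
    """Evaluate all B-spline basis functions at points x via the Cox-de Boor recursion."""
    n = len(knots) - degree - 1                      # number of basis functions
    B = torch.zeros(len(x), len(knots) - 1)
    # degree-0 (piecewise-constant) basis
    for i in range(len(knots) - 1):
        B[:, i] = ((x >= knots[i]) & (x < knots[i + 1])).float()
    # raise the degree recursively
    for d in range(1, degree + 1):
        B_new = torch.zeros(len(x), len(knots) - d - 1)
        for i in range(len(knots) - d - 1):
            left = (x - knots[i]) / (knots[i + d] - knots[i] + 1e-12)
            right = (knots[i + d + 1] - x) / (knots[i + d + 1] - knots[i + 1] + 1e-12)
            B_new[:, i] = left * B[:, i] + right * B[:, i + 1]
        B = B_new
    return B[:, :n]

class SplineEval(torch.autograd.Function):
    """Differentiable spline evaluation: forward computes B(x) @ c; backward
    applies the (banded) basis matrix as the Jacobian w.r.t. the control points."""
    @staticmethod
    def forward(ctx, coeffs, x, knots, degree):
        B = bspline_basis(x, knots, degree)
        ctx.save_for_backward(B)
        return B @ coeffs

    @staticmethod
    def backward(ctx, grad_out):
        (B,) = ctx.saved_tensors
        # chain rule through the basis; no gradients for x, knots, degree
        return B.t() @ grad_out, None, None, None

# Usage: fit the control points of a spline to noisy samples by gradient descent.
knots = torch.linspace(0.0, 1.0, 12)
x = torch.rand(200)
y = torch.sin(6.0 * x)
coeffs = torch.zeros(8, requires_grad=True)          # len(knots) - degree - 1 = 8
opt = torch.optim.Adam([coeffs], lr=0.1)
for _ in range(300):
    opt.zero_grad()
    loss = ((SplineEval.apply(coeffs, x, knots, 3) - y) ** 2).mean()
    loss.backward()
    opt.step()
```

Because the backward pass only multiplies by the basis matrix, the gradient computation inherits the sparsity of the spline's local support rather than requiring a dense Jacobian, which is what allows such a layer to be embedded efficiently inside larger predictive models.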
