Poster
Learning Differential Equations that are Easy to Solve
Jacob Kelly · Jesse Bettencourt · Matthew Johnson · David Duvenaud

Tue Dec 08 09:00 AM -- 11:00 AM (PST) @ Poster Session 1 #176

Differential equations parameterized by neural networks become expensive to solve numerically as training progresses. We propose a remedy that encourages learned dynamics to be easier to solve. Specifically, we introduce a differentiable surrogate for the time cost of standard numerical solvers, using higher-order derivatives of solution trajectories. These derivatives are efficient to compute with Taylor-mode automatic differentiation. Optimizing this additional objective trades model performance against the time cost of solving the learned dynamics. We demonstrate our approach by training models that are substantially faster to solve, while nearly as accurate, on supervised classification, density estimation, and time-series modelling tasks.
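To make the idea concrete, below is a minimal sketch in JAX of the two ingredients the abstract names: higher-order derivatives of an ODE solution computed with Taylor-mode automatic differentiation (jax.experimental.jet), and a differentiable penalty on their magnitude as a surrogate for solver cost. The function names and toy dynamics are illustrative, not the authors' released code; the sketch assumes an autonomous ODE dz/dt = f(z).

import jax.numpy as jnp
from jax.experimental.jet import jet

def taylor_derivatives(f, z, order):
    """Time derivatives [z', z'', ..., z^(order)] of the solution of dz/dt = f(z).

    Taylor-mode recursion: if the input series holds the first k time
    derivatives of z(t), the output series of f(z(t)) holds derivatives
    2..k+1, so each pass extends the series by one correct term.
    """
    # First pass: the primal output f(z) = z' is exact even with a
    # dummy input series; the series output is a placeholder.
    (y0, [y1]) = jet(f, (z,), ((jnp.ones_like(z),),))
    series = [y0, y1]                    # [z', placeholder]
    for _ in range(order - 1):
        (y0, ys) = jet(f, (z,), (tuple(series),))
        series = [y0] + list(ys)         # [z', ..., z^(k+1), placeholder]
    return series[:-1]

def solver_cost_surrogate(f, z, order=3):
    # Differentiable proxy for solver cost: mean squared norm of the
    # order-th trajectory derivative at state z. In training, this would
    # be averaged along solution trajectories and added to the task loss
    # with a regularization weight.
    z_k = taylor_derivatives(f, z, order)[-1]
    return jnp.mean(z_k ** 2)

# Toy usage with fixed dynamics (a real model would learn the weights):
W = jnp.array([[0.5, -1.0], [1.2, 0.3]])
f = lambda z: jnp.tanh(W @ z)
z0 = jnp.array([1.0, -0.5])
print(solver_cost_surrogate(f, z0, order=3))

The appeal of Taylor mode here is cost: it yields all K derivatives of the trajectory in a single expansion, rather than nesting forward-mode differentiation K times, which is what makes the surrogate cheap enough to add to the training objective.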

Author Information

Jacob Kelly (University of Toronto)

I'm an undergrad in CS, Math, and Stats at the University of Toronto. I'm currently an ML Research Intern at Deep Genomics. I'm also very fortunate to work with David Duvenaud at the Vector Institute. I'm interested in latent variable models, neural ODEs, variational inference, and genomics. My long-term research goal is to combine machine learning with novel sources of data to develop new tools for improved diagnosis and treatment of patients.

Jesse Bettencourt (University of Toronto)
Matthew Johnson (Google Brain)
David Duvenaud (University of Toronto)
