This talk will give a gentle introduction to Dex, an experimental programming language designed to combine the clarity and safety of high-level functional languages with the efficiency of low-level numerical languages. For example, Dex lets one move much of the informal type and shape information normally kept in comments into compile-time-checked types, while omitting unambiguous details to keep code terse. It also supports in-place updates and stateful, loopy code that can automatically exploit fine-grained parallelism. We'll demonstrate these features on standard deep learning architectures such as attention and graph neural nets.
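The central claim above, that shape information can live in compile-time-checked types rather than in comments, can be illustrated with a small sketch. The snippet below is written in Haskell, not Dex, since no Dex code appears in this abstract; the `Nat`, `Vec`, and `vdot` names are invented purely for this illustration. In Dex itself, typed index sets and table types play the analogous role, and its `for` loops are what the compiler parallelizes.

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- Illustrative sketch only: this is Haskell, not Dex. `Nat`, `Vec`, and
-- `vdot` are invented here to show the idea of shape information living
-- in compile-time-checked types.

data Nat = Z | S Nat

-- A vector whose length is tracked in its type.
data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

-- Dot product: both arguments must share the same length `n`.
-- A length mismatch is a type error, not a runtime failure.
vdot :: Num a => Vec n a -> Vec n a -> a
vdot VNil         VNil         = 0
vdot (VCons x xs) (VCons y ys) = x * y + vdot xs ys

main :: IO ()
main = print (vdot v v)  -- prints 14
  where
    v :: Vec ('S ('S ('S 'Z))) Int
    v = VCons 1 (VCons 2 (VCons 3 VNil))
```

The point of the sketch is that calling `vdot` on two vectors of different lengths is rejected by the type checker, whereas a NumPy-style function whose expected shapes live in a comment would only fail (or silently broadcast) at run time.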
Author Information
David Duvenaud (University of Toronto)
David Duvenaud is an assistant professor of computer science at the University of Toronto. His research focuses on continuous-time models, latent-variable models, and deep learning. He did his postdoc at Harvard University and his Ph.D. at the University of Cambridge. David also co-founded Invenia, an energy forecasting and trading company.
AIPLANS 2021 (NeurIPS)
More from the Same Authors
- 2021: Program Synthesis (AIPLANS 2021)
- 2021: Automatic Differentiation (AIPLANS 2021)
- 2021: Neurosymbolic Systems and Reasoning (AIPLANS 2021)
- 2021: Theorem Proving and Formal Mathematics (AIPLANS 2021)
- 2021: Programming Languages Theory (AIPLANS 2021)
- 2022 Workshop: The Symbiosis of Deep Learning and Differential Equations II (Michael Poli · Winnie Xu · Estefany Kelly Buchanan · Maryam Hosseini · Luca Celotti · Martin Magill · Ermal Rrapaj · Qiyao Wei · Stefano Massaroli · Patrick Kidger · Archis Joglekar · Animesh Garg · David Duvenaud)
- 2021: Poster Session (AIPLANS 2021)
- 2021: Type Inference as Optimization - Eirene V. Pandi (AIPLANS 2021)
- 2021: Learning Adaptive Control Flow in Transformers for Improved Systematic Generalization - Róbert Csordás (AIPLANS 2021)
- 2021: Learning Rules with Stratified Negation in Differentiable ILP - Giri Krishnan (AIPLANS 2021)
- 2021: LazyPPL: laziness and types in non-parametric probabilistic programs - Hugo Paquet (AIPLANS 2021)
- 2021: Meta-Learning an Inference Algorithm for Probabilistic Programs - Gwonsoo Che (AIPLANS 2021)
- 2021: When Gödel discovered Automatic Differentiation - Marie Kerjean, Centre national de la recherche scientifique (AIPLANS 2021)
- 2021 Poster: Meta-learning to Improve Pre-training (Aniruddh Raghu · Jonathan Lorraine · Simon Kornblith · Matthew McDermott · David Duvenaud)
- 2020: Panel discussion 2 (Danielle S Bassett · Yoshua Bengio · Cristina Savin · David Duvenaud · Anna Choromanska · Yanping Huang)
- 2020: Invited Talk (David Duvenaud)
- 2020 Tutorial: (Track3) Deep Implicit Layers: Neural ODEs, Equilibrium Models, and Differentiable Optimization Q&A (David Duvenaud · J. Zico Kolter · Matthew Johnson)
- 2020 Poster: What went wrong and when? Instance-wise feature importance for time-series black-box models (Sana Tonekaboni · Shalmali Joshi · Kieran Campbell · David Duvenaud · Anna Goldenberg)
- 2020 Poster: Learning Differential Equations that are Easy to Solve (Jacob Kelly · Jesse Bettencourt · Matthew Johnson · David Duvenaud)
- 2020 Tutorial: (Track3) Deep Implicit Layers: Neural ODEs, Equilibrium Models, and Differentiable Optimization (David Duvenaud · J. Zico Kolter · Matthew Johnson)
- 2019 Workshop: Program Transformations for ML (Pascal Lamblin · Atilim Gunes Baydin · Alexander Wiltschko · Bart van Merriënboer · Emily Fertig · Barak Pearlmutter · David Duvenaud · Laurent Hascoet)
- 2019: Molecules and Genomes (David Haussler · Djork-Arné Clevert · Michael Keiser · Alan Aspuru-Guzik · David Duvenaud · David Jones · Jennifer Wei · Alexander D'Amour)
- 2019 Poster: Latent Ordinary Differential Equations for Irregularly-Sampled Time Series (Yulia Rubanova · Tian Qi Chen · David Duvenaud)
- 2019 Poster: Residual Flows for Invertible Generative Modeling (Tian Qi Chen · Jens Behrmann · David Duvenaud · Joern-Henrik Jacobsen)
- 2019 Spotlight: Residual Flows for Invertible Generative Modeling (Tian Qi Chen · Jens Behrmann · David Duvenaud · Joern-Henrik Jacobsen)
- 2019 Poster: Efficient Graph Generation with Graph Recurrent Attention Networks (Renjie Liao · Yujia Li · Yang Song · Shenlong Wang · Will Hamilton · David Duvenaud · Raquel Urtasun · Richard Zemel)
- 2019 Poster: Neural Networks with Cheap Differential Operators (Tian Qi Chen · David Duvenaud)
- 2019 Spotlight: Neural Networks with Cheap Differential Operators (Tian Qi Chen · David Duvenaud)
- 2018: Software Panel (Ben Letham · David Duvenaud · Dustin Tran · Aki Vehtari)
- 2018 Poster: Isolating Sources of Disentanglement in Variational Autoencoders (Tian Qi Chen · Xuechen (Chen) Li · Roger Grosse · David Duvenaud)
- 2018 Oral: Isolating Sources of Disentanglement in Variational Autoencoders (Tian Qi Chen · Xuechen (Chen) Li · Roger Grosse · David Duvenaud)
- 2018 Poster: Neural Ordinary Differential Equations (Tian Qi Chen · Yulia Rubanova · Jesse Bettencourt · David Duvenaud)
- 2018 Oral: Neural Ordinary Differential Equations (Tian Qi Chen · Yulia Rubanova · Jesse Bettencourt · David Duvenaud)
- 2017 Workshop: Aligned Artificial Intelligence (Dylan Hadfield-Menell · Jacob Steinhardt · David Duvenaud · David Krueger · Anca Dragan)
- 2017: Automatic Chemical Design Using a Data-driven Continuous Representation of Molecules (David Duvenaud)
- 2017 Poster: Sticking the Landing: Simple, Lower-Variance Gradient Estimators for Variational Inference (Geoffrey Roeder · Yuhuai Wu · David Duvenaud)
- 2016: Generating Class-conditional Images with Gradient-based Inference (David Duvenaud)
- 2016: No more mini-languages: The power of autodiffing full-featured Python (David Duvenaud)
- 2016 Workshop: Reliable Machine Learning in the Wild (Dylan Hadfield-Menell · Adrian Weller · David Duvenaud · Jacob Steinhardt · Percy Liang)
- 2016 Poster: Composing graphical models with neural networks for structured representations and fast inference (Matthew Johnson · David Duvenaud · Alex Wiltschko · Ryan Adams · Sandeep R Datta)
- 2016 Poster: Probing the Compositionality of Intuitive Functions (Eric Schulz · Josh Tenenbaum · David Duvenaud · Maarten Speekenbrink · Samuel J Gershman)
- 2015: Automatic Differentiation: The most criminally underused tool in probabilistic numerics (David Duvenaud)
- 2015 Poster: Convolutional Networks on Graphs for Learning Molecular Fingerprints (David Duvenaud · Dougal Maclaurin · Jorge Iparraguirre · Rafael Bombarell · Timothy Hirzel · Alan Aspuru-Guzik · Ryan Adams)
- 2014 Poster: Probabilistic ODE Solvers with Runge-Kutta Means (Michael Schober · David Duvenaud · Philipp Hennig)
- 2014 Oral: Probabilistic ODE Solvers with Runge-Kutta Means (Michael Schober · David Duvenaud · Philipp Hennig)
- 2012 Poster: Active Learning of Model Evidence Using Bayesian Quadrature (Michael A Osborne · David Duvenaud · Roman Garnett · Carl Edward Rasmussen · Stephen J Roberts · Zoubin Ghahramani)
- 2011 Poster: Additive Gaussian Processes (David Duvenaud · Hannes Nickisch · Carl Edward Rasmussen)