
Parametric Complexity Bounds for Approximating PDEs with Neural Networks
Tanya Marwah · Zachary Lipton · Andrej Risteski

Recent experiments have shown that deep networks can approximate solutions to high-dimensional PDEs, seemingly escaping the curse of dimensionality. However, questions regarding the theoretical basis for such approximations, including the required network size, remain open. In this paper, we investigate the representational power of neural networks for approximating solutions to linear elliptic PDEs with Dirichlet boundary conditions. We prove that when a PDE's coefficients are representable by small neural networks, the number of parameters required to approximate its solution scales polynomially with the input dimension $d$ and proportionally to the parameter counts of the coefficient networks. To this end, we develop a proof technique that simulates gradient descent (in an appropriate Hilbert space) by growing a neural network architecture whose iterates each participate as sub-networks in their (slightly larger) successors, and converge to the solution of the PDE.
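The core idea, gradient descent in function space whose iterates converge to the PDE solution, can be illustrated with a toy sketch. The example below is not the paper's construction: it discretizes the one-dimensional Poisson problem $-u'' = f$ with zero Dirichlet boundary conditions on a grid (the grid size `n`, step size `eta`, and right-hand side `f` are all illustrative choices) and runs the functional gradient update $u_{t+1} = u_t + \eta\,(f + u_t'')$, where each iterate is built from the previous one, mirroring how the paper's iterates appear as sub-networks of their successors.

```python
import numpy as np

# Toy illustration (not the paper's construction): gradient descent on the
# Dirichlet energy for -u'' = f on [0, 1] with u(0) = u(1) = 0.
n = 20                             # interior grid points (illustrative choice)
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
f = np.pi**2 * np.sin(np.pi * x)   # chosen so the exact solution is sin(pi x)

def laplacian(u):
    """Discrete second derivative with zero Dirichlet boundary values."""
    padded = np.concatenate(([0.0], u, [0.0]))
    return (padded[:-2] - 2.0 * padded[1:-1] + padded[2:]) / h**2

u = np.zeros(n)    # initial iterate
eta = 1e-3         # step size kept below 2 / lambda_max for stability
for _ in range(5000):
    # Each new iterate is a small update applied to the previous one,
    # analogous to wrapping the previous network inside a slightly larger one.
    u = u + eta * (f + laplacian(u))

exact = np.sin(np.pi * x)
print(np.max(np.abs(u - exact)))   # small: dominated by the O(h^2) grid error
```

The iteration is a contraction on the discrete problem, so the error decays geometrically until it hits the discretization floor; the paper's contribution is showing that the analogous updates in Hilbert space can be implemented by neural networks whose size grows only polynomially with the dimension.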

Author Information

Tanya Marwah (Carnegie Mellon University)
Zachary Lipton (Carnegie Mellon University)
Andrej Risteski (CMU)

Assistant Professor in the ML department at CMU. Before that, I was a Wiener Fellow at MIT, and before that I finished my PhD at Princeton University.
