Neural network distillation of orbital-dependent density functional theory
Abstract
Density functional theory (DFT) offers a desirable balance between quantitative accuracy and computational efficiency in practical many-electron calculations. Its central component, the exchange-correlation energy functional, has been approximated with increasing levels of complexity, ranging from strictly local approximations to nonlocal and orbital-dependent expressions with many tuned parameters. In this paper, we formulate a general approach for rewriting complex density functionals as deep neural networks, which simplifies the computation of Kohn-Sham potentials as well as higher functional derivatives through automatic differentiation, enabling access to highly nonlinear response functions and forces. These goals are achieved using a recently developed class of robust neural network models capable of modeling functionals, as opposed to functions, with explicitly enforced spatial symmetries. Functionals treated in this way are called \textit{global density approximations} and can be seamlessly integrated with existing DFT workflows. Tests are performed on a dataset featuring a large variety of molecular structures and popular meta-generalized gradient approximation density functionals; we successfully eliminate orbital dependencies arising from the kinetic energy density and find a high degree of transferability to a wide range of physical systems. The presented framework is general and could be extended to more complex orbital- and energy-dependent functionals, as well as refined with specialized datasets.
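To make the automatic-differentiation route to Kohn-Sham potentials concrete, the sketch below shows how a neural-network exchange-correlation energy, evaluated on a density sampled on a real-space grid, can be differentiated to obtain the potential $v_{\mathrm{xc}} = \delta E_{\mathrm{xc}}/\delta n$ and the kernel $f_{\mathrm{xc}} = \delta^2 E_{\mathrm{xc}}/\delta n\,\delta n'$. This is a minimal JAX sketch under a simple discretization where functional derivatives reduce to gradients with respect to density samples divided by the volume element; the toy functional \texttt{exc\_nn}, its parameters, and the grid setup are illustrative placeholders and not the models used in this work.
\begin{verbatim}
import jax
import jax.numpy as jnp

def exc_nn(params, density, dv):
    # Toy stand-in for a learned exchange-correlation functional: a pointwise
    # MLP applied to the density samples, integrated over the grid. (A real
    # global density approximation would couple grid points nonlocally.)
    h = jnp.tanh(jnp.outer(density, params["w1"]) + params["b1"])  # (N, H)
    eps = h @ params["w2"]                                         # (N,)
    return jnp.sum(density * eps) * dv                             # scalar E_xc

def vxc(params, density, dv):
    # Kohn-Sham potential v_xc(r_i) = dE_xc/dn(r_i): on the grid, the gradient
    # with respect to the density samples divided by the volume element.
    return jax.grad(exc_nn, argnums=1)(params, density, dv) / dv

def fxc(params, density, dv):
    # Exchange-correlation kernel f_xc(r_i, r_j) for response functions:
    # one more functional derivative, i.e. the Jacobian of v_xc.
    return jax.jacfwd(vxc, argnums=1)(params, density, dv) / dv

# Example usage on a 64-point grid with random placeholder parameters.
key = jax.random.PRNGKey(0)
params = {
    "w1": jax.random.normal(key, (8,)),
    "b1": jnp.zeros(8),
    "w2": jax.random.normal(key, (8,)),
}
n = jnp.abs(jax.random.normal(key, (64,)))  # non-negative density samples
dv = 0.1                                    # grid volume element
print(vxc(params, n, dv).shape)  # (64,)
print(fxc(params, n, dv).shape)  # (64, 64)
\end{verbatim}
The same pattern extends to higher derivatives and to nuclear forces, since the network output can be differentiated with respect to any input the functional depends on.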