Geometric Considerations for Normalization Layers in Equivariant Neural Networks
Max Aalto · Ekdeep S Lubana · Hidenori Tanaka
Event URL: https://openreview.net/forum?id=p9fKD1sFog8

In recent years, neural networks that incorporate physical symmetry in their architecture have become indispensable tools for overcoming the scarcity of molecular and material data. However, despite their critical importance in deep learning, the design and selection of normalization layers have often been treated as a side issue. In this study, we first review the unique challenges that batch normalization (BatchNorm) faces in its application to materials science and provide an overview of alternative normalization layers that can address the geometric considerations required by physical systems and tasks. While the challenges are diverse, we find that a geometric-match of a normalization layer can be achieved by ensuring that the normalization preserves not only invariance and equivariance, but also covariance of the task and dataset. Overall, our survey provides a coherent overview of normalization layers for practitioners and presents open challenges for further developments.
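To make the equivariance requirement concrete, here is a minimal, hypothetical sketch (not the survey's own method) of a rotation-equivariant normalization for 3D vector features: the scale factor is computed only from rotation-invariant norms, so normalizing commutes with any orthogonal transformation of the inputs. The function name `equivariant_norm` and the batch-RMS scaling rule are illustrative assumptions, not from the paper.

```python
import numpy as np

def equivariant_norm(x, eps=1e-5):
    """Normalize 3D vector features by the batch RMS of their L2 norms.

    x: array of shape (batch, n_features, 3).
    Because the scale depends only on rotation-invariant norms,
    the layer commutes with any orthogonal matrix Q:
    equivariant_norm(x @ Q.T) == equivariant_norm(x) @ Q.T.
    (Illustrative sketch; not the normalization scheme from the survey.)
    """
    sq_norms = np.sum(x**2, axis=-1, keepdims=True)   # (B, N, 1), invariant
    rms = np.sqrt(np.mean(sq_norms) + eps)            # scalar, invariant
    return x / rms

# Check equivariance under a random orthogonal transformation.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8, 3))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))          # random orthogonal matrix
out_rotated_input = equivariant_norm(x @ Q.T)
out_rotated_after = equivariant_norm(x) @ Q.T
assert np.allclose(out_rotated_input, out_rotated_after, atol=1e-8)
```

A plain BatchNorm applied per coordinate would break this property, since its per-channel means and variances change under rotation; this is one instance of the geometric mismatch the abstract refers to.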

Author Information

Max Aalto (NTT Research)

I am an industry machine learning engineer transitioning to academia; most recently, I worked at Dropbox building large-scale graph models for the detection and mitigation of criminal activity on the platform. I have a prior background in physics, and I currently work at NTT Research, where I study methods for the most efficient training of neural networks for use in physical simulations.

Ekdeep S Lubana (University of Michigan; CBS, Harvard University)
Hidenori Tanaka (Harvard University)
