Poster in Workshop: Symmetry and Geometry in Neural Representations (NeurReps)

Towards architectural optimization of equivariant neural networks over subgroups

Kaitlin Maile · Dennis Wilson · Patrick Forré

Keywords: [ Neural Architecture Search ] [ Geometric Deep Learning ] [ Equivariance ]


Abstract: Incorporating equivariance to symmetry groups in artificial neural networks (ANNs) can improve performance on tasks exhibiting those symmetries, but such symmetries are often only approximate and not explicitly known. This motivates algorithmically optimizing the architectural constraints imposed by equivariance. We propose the equivariance relaxation morphism, which reparameterizes a group-equivariant layer to operate with equivariance constraints on a subgroup while preserving its functionality, and the $[G]$-mixed equivariant layer, which mixes operations, each constrained to equivariance to a different group, to enable within-layer equivariance optimization. These two architectural tools can be used within neural architecture search (NAS) algorithms for equivariance-aware architectural optimization.
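The $[G]$-mixed equivariant layer lends itself to a continuous relaxation in the style of differentiable NAS, where learnable architecture parameters weight candidate operations that are each equivariant to a different subgroup. Below is a minimal PyTorch sketch of that idea under stated assumptions: the softmax mixing and the `GMixedEquivariantLayer` class are illustrative choices, and the plain `Conv2d` candidates are placeholders for actual group convolutions whose outputs would need to share a common representation space. This is a sketch of one plausible realization, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GMixedEquivariantLayer(nn.Module):
    """Hypothetical sketch: a weighted mixture of candidate operations,
    each equivariant to a different group in a chain of subgroups."""

    def __init__(self, candidate_layers):
        super().__init__()
        # One candidate per group, e.g. C4-, C2-, and trivially-equivariant convs.
        self.candidates = nn.ModuleList(candidate_layers)
        # Architecture parameters: one scalar per group, normalized via softmax
        # (an assumed DARTS-style parameterization, not necessarily the paper's).
        self.alpha = nn.Parameter(torch.zeros(len(candidate_layers)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        # Convex combination of the per-group operations on the same input.
        return sum(w * layer(x) for w, layer in zip(weights, self.candidates))

# Usage with plain convolutions standing in for group-equivariant ops:
layer = GMixedEquivariantLayer([
    nn.Conv2d(16, 32, 3, padding=1),  # stand-in for a C4-equivariant conv
    nn.Conv2d(16, 32, 3, padding=1),  # stand-in for a C2-equivariant conv
    nn.Conv2d(16, 32, 3, padding=1),  # stand-in for an unconstrained conv
])
x = torch.randn(8, 16, 28, 28)
y = layer(x)  # shape (8, 32, 28, 28)
```

Because the mixing weights are differentiable, the preferred equivariance group for each layer can be optimized jointly with the network weights, and after search the layer can be discretized to the candidate with the largest weight.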
