Residual Pathway Priors for Soft Equivariance Constraints
Marc Finzi · Gregory Benton · Andrew Wilson

Tue Dec 07 04:30 PM -- 06:00 PM (PST)

Models such as convolutional neural networks restrict the hypothesis space to a set of functions satisfying equivariance constraints, improving generalization by capturing relevant symmetries. However, symmetries are often only partially respected, preventing models with such restrictive biases from fitting the data. We introduce Residual Pathway Priors (RPPs), a method for converting hard architectural constraints into soft priors that guide models towards structured solutions while retaining the ability to capture additional complexity. RPPs are resilient to approximate or misspecified symmetries, and remain as effective as fully constrained models even when symmetries are exact. We show that RPPs provide compelling performance on both model-free and model-based reinforcement learning problems, where contact forces and directional rewards violate the assumptions of equivariant networks. Finally, we demonstrate that RPPs have broad applicability, including dynamical systems, regression, and classification.
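The core idea in the abstract can be sketched in a few lines: the layer output is the sum of a constrained, equivariant path and a flexible, unconstrained path, and a Gaussian prior with a much larger precision on the flexible path softly biases the model toward the symmetric solution. This is a minimal illustrative sketch, not the authors' implementation; the class name, the choice of circular convolution as the equivariant path, and the specific prior precisions are all assumptions made for the example.

```python
import numpy as np

class ResidualPathwayLayer:
    """Hypothetical sketch of a residual-pathway layer: the output is the
    sum of a translation-equivariant path (circular convolution) and an
    unconstrained dense path. An L2 penalty with a much larger coefficient
    on the dense path acts as a soft prior favoring the equivariant
    solution, while still allowing deviations when the data demand them."""

    def __init__(self, dim, rng, prior_equiv=1.0, prior_flex=100.0):
        # Equivariant path: a circular convolution kernel. Circular
        # convolution commutes with cyclic shifts of the input.
        self.kernel = rng.normal(size=dim) * 0.1
        # Flexible path: an unconstrained dense matrix.
        self.W = rng.normal(size=(dim, dim)) * 0.1
        self.prior_equiv = prior_equiv  # weak penalty on the equivariant path
        self.prior_flex = prior_flex    # strong penalty on the residual path

    def forward(self, x):
        # Circular convolution via the convolution theorem.
        equiv = np.real(np.fft.ifft(np.fft.fft(self.kernel) * np.fft.fft(x)))
        flex = self.W @ x
        return equiv + flex

    def prior_penalty(self):
        # Negative log of independent zero-mean Gaussian priors (up to a
        # constant): the flexible path pays a much higher price per unit
        # of weight norm, so structure is preferred but not enforced.
        return (self.prior_equiv * np.sum(self.kernel ** 2)
                + self.prior_flex * np.sum(self.W ** 2))
```

With the flexible path zeroed out, the layer is exactly shift-equivariant; training with `prior_penalty()` added to the loss recovers the soft version of that constraint.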

Author Information

Marc Finzi (NYU)
Gregory Benton (New York University)
Andrew Wilson (New York University)

I am a professor of machine learning at New York University.