Poster
Symmetry Teleportation for Accelerated Optimization
Bo Zhao · Nima Dehmamy · Robin Walters · Rose Yu

Wed Nov 30 09:00 AM -- 11:00 AM (PST) @ Hall J #503

Existing gradient-based optimization methods update parameters locally, in a direction that minimizes the loss function. We study a different approach, symmetry teleportation, that allows parameters to travel a large distance on the loss level set, in order to improve the convergence speed in subsequent steps. Teleportation exploits symmetries in the loss landscape of optimization problems. We derive loss-invariant group actions for test functions in optimization and multi-layer neural networks, and prove a necessary condition for teleportation to improve the convergence rate. We also show that our algorithm is closely related to second-order methods. Experimentally, we show that teleportation improves the convergence speed of gradient descent and AdaGrad for several optimization problems, including test functions, multi-layer regression, and MNIST classification.
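The idea can be illustrated on a toy problem. The sketch below is not the authors' code; it assumes the scalar loss L(u, v) = (uv - 1)^2, which is invariant under the rescaling (u, v) -> (g*u, v/g), so each orbit of this group action lies inside a level set. Teleportation searches along the orbit for a point with larger gradient norm (here by gradient ascent on the group parameter, following the abstract's description), then resumes ordinary gradient descent. All hyperparameters and the clipping bound are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy loss (not from the paper): L(u, v) = (u*v - 1)^2.
# The rescaling (u, v) -> (g*u, v/g), g > 0, leaves the product u*v and
# hence the loss unchanged, so each orbit lies inside a level set of L.

def loss(u, v):
    return (u * v - 1.0) ** 2

def grad(u, v):
    r = 2.0 * (u * v - 1.0)
    return np.array([r * v, r * u])

def teleport(u, v, ascent_steps=10, eta=0.1):
    # Parametrize the group element as g = exp(t) and take a few
    # gradient-ascent steps on the squared gradient norm along the orbit:
    #   ||grad L||^2 = r^2 * ((g*u)^2 + (v/g)^2),  with r fixed on the orbit,
    # so d/dt ||grad L||^2 = 2 * r^2 * ((g*u)^2 - (v/g)^2).
    t = 0.0
    for _ in range(ascent_steps):
        g = np.exp(t)
        gu, gv = g * u, v / g
        r = 2.0 * (gu * gv - 1.0)
        t += eta * 2.0 * r**2 * (gu**2 - gv**2)
        t = np.clip(t, -2.0, 2.0)  # keep the teleport bounded (illustrative choice)
    g = np.exp(t)
    return g * u, v / g

u, v = 0.5, 0.1  # badly scaled initialization
lr = 0.02
for step in range(200):
    if step % 20 == 0:
        u, v = teleport(u, v)  # loss is unchanged; gradient norm grows
    du, dv = grad(u, v)
    u, v = u - lr * du, v - lr * dv
print(f"final loss: {loss(u, v):.3e}")
```

The paper derives analogous loss-invariant group actions for multi-layer networks, where a group element acts on consecutive weight matrices, and likewise selects the group element by optimizing the gradient norm over the level set before continuing descent.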

Author Information

Bo Zhao (University of California, San Diego)
Nima Dehmamy (IBM Research)

I obtained my PhD in physics on complex systems from Boston University in 2016. I did a postdoc at Northeastern University, working on 3D embedded graphs and graph neural networks. My current research is on physics-informed machine learning and computational social science.

Robin Walters (Northeastern University)
Rose Yu (UC San Diego)