Poster
Connecting Optimization and Regularization Paths
Arun Suggala · Adarsh Prasad · Pradeep Ravikumar

Thu Dec 06 07:45 AM -- 09:45 AM (PST) @ Room 517 AB #113

We study the implicit regularization properties of optimization techniques by explicitly connecting their optimization paths to the regularization paths of "corresponding" regularized problems. This surprising connection shows that iterates of optimization techniques such as gradient descent and mirror descent are pointwise close to solutions of appropriately regularized objectives. While such a tight connection between optimization and regularization is of independent intellectual interest, it also has important implications for machine learning: we can port results from regularized estimators to optimization, and vice versa. We investigate one key consequence, which borrows from the well-studied analysis of regularized estimators to obtain tight excess risk bounds on the iterates generated by optimization techniques.
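As a rough illustration of the kind of correspondence the abstract describes, the minimal sketch below compares the gradient-descent path for least squares against the ridge regularization path, pairing iteration t with penalty strength lam = 1/(eta * t). This pairing is a common heuristic and an assumption of this sketch, not necessarily the paper's precise construction.

import numpy as np

# Minimal numerical sketch: gradient descent on 0.5*||Xw - y||^2 versus
# ridge solutions along a matched regularization path.
rng = np.random.default_rng(0)
n, d = 200, 20
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(n)

eta = 1.0 / np.linalg.norm(X, 2) ** 2    # step size below 1/L for the quadratic loss
w = np.zeros(d)                          # start gradient descent at the origin
XtX, Xty = X.T @ X, X.T @ y

for t in range(1, 501):
    w = w - eta * (XtX @ w - Xty)        # one gradient-descent step
    lam = 1.0 / (eta * t)                # heuristic pairing of iteration and penalty (assumption)
    w_ridge = np.linalg.solve(XtX + lam * np.eye(d), Xty)  # ridge solution at that penalty
    if t % 100 == 0:
        print(f"t={t:4d}  ||w_t - w_ridge(1/(eta*t))|| = {np.linalg.norm(w - w_ridge):.4f}")

Running the sketch shows the distance between the two paths staying small along the trajectory, which is the pointwise closeness the abstract refers to, here only for the simple quadratic case.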

Author Information

Arun Suggala (Carnegie Mellon University)
Adarsh Prasad (Carnegie Mellon University)
Pradeep Ravikumar (Carnegie Mellon University)
