Poster

Simple steps are all you need: Frank-Wolfe and generalized self-concordant functions

Alejandro Carderera · Mathieu Besançon · Sebastian Pokutta

Keywords: [ Optimization ] [ Machine Learning ]


Abstract: Generalized self-concordance is a key property present in the objective function of many important learning problems. We establish the convergence rate of a simple Frank-Wolfe variant that uses the open-loop step size strategy $\gamma_t = 2/(t+2)$, obtaining a $\mathcal{O}(1/t)$ convergence rate for this class of functions in terms of primal gap and Frank-Wolfe gap, where $t$ is the iteration count. This avoids the use of second-order information or the need to estimate local smoothness parameters required by previous work. We also show improved convergence rates for various common cases, e.g., when the feasible region under consideration is uniformly convex or polyhedral.
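A minimal sketch, not the authors' implementation, of the Frank-Wolfe variant described above: standard Frank-Wolfe with the open-loop step size $\gamma_t = 2/(t+2)$, applied to a logistic-regression loss (a generalized self-concordant function) over an $\ell_1$-ball feasible region. The objective, ball radius, data, and iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def logistic_loss_grad(X, y, w):
    """Gradient of f(w) = sum_i log(1 + exp(-y_i <x_i, w>)),
    a generalized self-concordant objective (assumed here for illustration)."""
    margins = y * (X @ w)
    sigma = 1.0 / (1.0 + np.exp(margins))          # = 1 - sigmoid(margins)
    return -(X.T @ (y * sigma))

def lmo_l1_ball(grad, radius):
    """Linear minimization oracle over the l1-ball of the given radius:
    argmin_{||v||_1 <= radius} <grad, v> is a signed vertex along the
    coordinate with the largest absolute gradient entry."""
    i = np.argmax(np.abs(grad))
    v = np.zeros_like(grad)
    v[i] = -radius * np.sign(grad[i])
    return v

def frank_wolfe_open_loop(X, y, radius=5.0, iterations=200):
    """Frank-Wolfe with the open-loop step size gamma_t = 2/(t+2)."""
    w = np.zeros(X.shape[1])                        # any feasible starting point
    for t in range(iterations):
        grad = logistic_loss_grad(X, y, w)
        v = lmo_l1_ball(grad, radius)
        fw_gap = grad @ (w - v)                     # Frank-Wolfe gap <grad, w - v>
        gamma = 2.0 / (t + 2.0)                     # open-loop step size
        w = (1.0 - gamma) * w + gamma * v           # convex combination stays feasible
    return w, fw_gap

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))
    y = np.sign(X @ rng.standard_normal(50) + 0.1 * rng.standard_normal(200))
    w_hat, gap = frank_wolfe_open_loop(X, y)
    print(f"final Frank-Wolfe gap: {gap:.4e}")
```

Note that the step size requires no knowledge of smoothness or self-concordance parameters and no line search; each iterate remains feasible because it is a convex combination of feasible points.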