

Poster

Single Loop Gaussian Homotopy Method for Non-convex Optimization

Hidenori Iwakiri · Yuhang Wang · Shinji Ito · Akiko Takeda

Hall J (level 1) #617

Keywords: [ Non-Convex Optimization ] [ Gaussian smoothing ] [ Zeroth-order optimization ] [ Worst-case iteration complexity ] [ Gaussian homotopy ]


Abstract: The Gaussian homotopy (GH) method is a popular approach to finding better stationary points of non-convex optimization problems by gradually reducing a smoothing parameter t, which transforms the problem to be solved from an almost convex one into the original target one. Existing GH-based methods repeatedly call an iterative optimization solver to find a stationary point every time t is updated, which incurs high computational costs. We propose a novel single-loop framework for GH methods (SLGH) that updates the parameter t and the optimization decision variables at the same time. We analyze the computational complexity of SLGH in several settings: with either a gradient or a gradient-free (zeroth-order) oracle of the GH function, in both the deterministic and stochastic cases. With a tuned hyperparameter for decreasing t, the convergence rate of SLGH matches that of gradient descent, even though the problem being solved changes gradually as t decreases. In numerical experiments, our SLGH algorithms converge faster than an existing double-loop GH method while finding better solutions than gradient-descent-based methods.
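To make the single-loop idea concrete, here is a minimal sketch of a generic single-loop Gaussian homotopy update, not the authors' exact SLGH algorithm. It takes one zeroth-order gradient step on the Gaussian-smoothed objective f_t(x) = E_u[f(x + t·u)], u ~ N(0, I), and then shrinks t, rather than solving each smoothed problem to stationarity in an inner loop. All function and parameter names (slgh_sketch, gamma, t_min, etc.) are hypothetical choices for illustration.

```python
import numpy as np

def slgh_sketch(f, x0, t0=1.0, gamma=0.999, t_min=1e-3,
                step=0.05, n_samples=20, iters=2000, seed=0):
    """Illustrative single-loop Gaussian homotopy (not the paper's
    exact SLGH): one zeroth-order descent step on the smoothed
    objective per iteration, then a geometric decrease of t."""
    rng = np.random.default_rng(seed)
    x, t = np.asarray(x0, dtype=float).copy(), t0
    for _ in range(iters):
        # Two-point Gaussian-smoothing estimator of grad f_t(x):
        # E[ (f(x + t*u) - f(x)) / t * u ] = grad f_t(x).
        u = rng.standard_normal((n_samples, x.size))
        fd = (np.apply_along_axis(f, 1, x + t * u) - f(x)) / t
        g = (fd[:, None] * u).mean(axis=0)
        x -= step * g              # single descent step on the smoothed problem
        t = max(gamma * t, t_min)  # single loop: shrink t every iteration
    return x

# Usage on a toy non-convex objective (hypothetical example):
f = lambda x: np.sum(x**2) + np.sin(5 * x).sum()
x_star = slgh_sketch(f, x0=np.array([2.0, -1.5]))
```

A double-loop GH method would instead run the inner descent loop until (approximate) stationarity for each fixed t before decreasing it; the single-loop variant interleaves the two updates, which is the source of the computational savings the abstract describes.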
