Poster
On the Second-order Convergence Properties of Random Search Methods
Aurelien Lucchi · Antonio Orvieto · Adamos Solomou

Wed Dec 08 04:30 PM -- 06:00 PM (PST)

We study the theoretical convergence properties of random-search methods when optimizing non-convex objective functions without access to derivatives. We prove that standard random-search methods that do not rely on second-order information converge to a second-order stationary point. However, they suffer from a complexity that is exponential in the input dimension of the problem. To address this issue, we propose a novel variant of random search that exploits negative curvature while relying only on function evaluations. We prove that this approach converges to a second-order stationary point at a much faster rate than vanilla methods: namely, its complexity in terms of the number of function evaluations is only linear in the problem dimension. We test our algorithm empirically and find good agreement with our theoretical results.
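To make the setting concrete, here is a minimal sketch of vanilla derivative-free random search, the baseline the abstract refers to: at each step a random direction is sampled and a move is accepted only if it decreases the objective, so no gradients are ever computed. This is an illustrative implementation, not the authors' negative-curvature variant; the test function and all parameters (`sigma`, `iters`) are assumptions chosen for the example.

```python
import numpy as np

def random_search(f, x0, sigma=0.1, iters=500, seed=0):
    """Vanilla random search: sample a random unit direction each step
    and move by +/- sigma along it if the function value improves.
    Uses only function evaluations, never derivatives."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    fx = f(x)
    for _ in range(iters):
        d = rng.standard_normal(x.shape)
        d /= np.linalg.norm(d)                  # unit-norm direction
        for step in (sigma * d, -sigma * d):    # try both signs
            f_new = f(x + step)
            if f_new < fx:                      # accept only improvements
                x, fx = x + step, f_new
                break
    return x, fx

# Example: a simple non-convex objective with minima at (+/-1, 0)
f = lambda x: (x[0] ** 2 - 1) ** 2 + x[1] ** 2
x_star, f_star = random_search(f, [2.0, 2.0])
```

Note that the fixed step size `sigma` caps the attainable precision, and near a saddle point almost all sampled directions may fail to improve the objective, which is the dimension-dependent bottleneck that motivates exploiting negative curvature.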

Author Information

Aurelien Lucchi (EPFL)
Antonio Orvieto (ETH Zurich)

PhD student at ETH Zurich. I'm interested in the design and analysis of adaptive stochastic momentum optimization algorithms for non-convex machine learning problems. Publications: 2 NeurIPS, 1 ICML, 1 AISTATS, 1 UAI; 1 patent on a learning algorithm for an impact screwdriver. Besides my PhD research, I am involved in several computational systems biology projects at ETH Zurich, such as SignalX. I also work as a biomedical data analyst (genetic rare disease research) at the University of Padua.

Adamos Solomou (Swiss Federal Institute of Technology)
