We develop a convergence analysis of a modified line-search method for objective functions whose values are computed with noise and whose gradients are not directly available. The noise is assumed only to be bounded in absolute value, with no additional assumptions. Gradient approximations must therefore be constructed, for example via interpolation or via sample average approximation of smoothed gradients, and are consequently always inexact and possibly random. We extend the framework based on stochastic methods, originally developed to analyze a standard line-search method with exact function values and random gradients, to the case of noisy function values. We introduce a condition on the gradient estimate which, when satisfied with sufficiently high probability at each iteration, guarantees convergence properties of the line-search method. We derive expected complexity bounds for convex, strongly convex, and nonconvex functions. We motivate these results with several recent papers related to policy optimization.
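To make the ingredients concrete, below is a minimal Python sketch, not the paper's exact algorithm: it pairs a sample-average approximation of a Gaussian-smoothed gradient with a backtracking line search whose Armijo sufficient-decrease test is relaxed by twice the noise bound. All names, parameter values (theta, gamma, sigma, sample counts), and the step-size update rule are illustrative assumptions.

```python
import numpy as np

def smoothed_grad(f, x, sigma, n_samples, rng):
    """Sample-average approximation of the Gaussian-smoothed gradient:
    forward differences of f along random Gaussian directions."""
    fx = f(x)
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.standard_normal(x.shape)
        g += (f(x + sigma * u) - fx) / sigma * u
    return g / n_samples

def noisy_line_search(f, x0, eps_f, theta=0.25, gamma=0.5,
                      sigma=1e-3, n_samples=32, n_iters=100, seed=0):
    """Backtracking line search in which the Armijo test is relaxed
    by 2*eps_f (the bound on the absolute noise), so that noise alone
    cannot systematically reject genuinely good steps."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    alpha = 1.0
    for _ in range(n_iters):
        g = smoothed_grad(f, x, sigma, n_samples, rng)
        trial = x - alpha * g
        # Relaxed Armijo condition: require decrease only up to the noise level.
        if f(trial) <= f(x) - theta * alpha * (g @ g) + 2 * eps_f:
            x = trial
            alpha = min(2 * alpha, 1.0)   # cautiously expand the step
        else:
            alpha *= gamma                # backtrack on failure
    return x

# Usage: minimize a quadratic observed with noise bounded by eps_f.
if __name__ == "__main__":
    eps_f = 1e-3
    noise_rng = np.random.default_rng(1)
    f = lambda z: z @ z + eps_f * (2 * noise_rng.random() - 1)
    print(noisy_line_search(f, x0=np.ones(5), eps_f=eps_f))
```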
Author Information
Katya Scheinberg (Lehigh University)
More from the Same Authors
- 2021 : High Probability Step Size Lower Bound for Adaptive Stochastic Optimization » Katya Scheinberg · Miaolan Xie
- 2022 : Stochastic Adaptive Regularization Method with Cubics: A High Probability Complexity Bound » Katya Scheinberg · Miaolan Xie
- 2022 : Katya Scheinberg, Stochastic Oracles and Where to Find Them » Katya Scheinberg
- 2021 Workshop: OPT 2021: Optimization for Machine Learning » Courtney Paquette · Quanquan Gu · Oliver Hinder · Katya Scheinberg · Sebastian Stich · Martin Takac
- 2021 Poster: High Probability Complexity Bounds for Line Search Based on Stochastic Oracles » Billy Jin · Katya Scheinberg · Miaolan Xie
- 2010 Poster: Sparse Inverse Covariance Selection via Alternating Linearization Methods » Katya Scheinberg · Shiqian Ma · Donald Goldfarb