Analysis of line search methods for various gradient approximation schemes for noisy derivative-free optimization.
Katya Scheinberg

Fri Dec 13 04:15 PM -- 05:00 PM (PST)

We develop a convergence analysis of a modified line search method for objective functions whose values are computed with noise and whose gradients are not directly available. The noise is assumed only to be bounded in absolute value, with no further assumptions. In this setting, gradient approximations can be constructed via interpolation or via sample average approximation of smoothing gradients, and thus are always inexact and possibly random. We extend a framework, originally developed to analyze a standard line search method with exact function values and random gradients, to the case of noisy function values. We introduce a condition on the gradient estimates which, when satisfied with sufficiently large probability at each iteration, guarantees convergence properties of the line search method. We derive expected complexity bounds for convex, strongly convex, and nonconvex functions, and we motivate these results with several recent papers related to policy optimization.
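To make the two ingredients concrete, here is a minimal Python sketch, not the method analyzed in the talk: a sample average approximation of the Gaussian-smoothing gradient built from noisy function values, and a backtracking line search whose sufficient-decrease test is relaxed by the noise bound eps_f so that bounded noise cannot cause every step to be rejected. All function names, constants, and the 2*eps_f relaxation are illustrative assumptions, not the authors' exact algorithm.

import numpy as np

def smoothing_gradient(f, x, sigma=1e-2, num_samples=32, rng=None):
    """Sample average approximation of the Gaussian-smoothing gradient.

    Uses only (noisy) function values, as in derivative-free settings:
        g ~ (1/N) sum_i [f(x + sigma*u_i) - f(x)] / sigma * u_i,  u_i ~ N(0, I).
    """
    rng = rng or np.random.default_rng()
    fx = f(x)
    g = np.zeros_like(x)
    for _ in range(num_samples):
        u = rng.standard_normal(x.shape)
        g += (f(x + sigma * u) - fx) / sigma * u
    return g / num_samples

def noisy_line_search(f, x0, eps_f, alpha0=1.0, theta=0.5, gamma=0.5,
                      max_iters=200, tol=1e-6):
    """Backtracking line search with an Armijo test relaxed by 2*eps_f,
    so that noise bounded by eps_f cannot make every trial step fail."""
    x, alpha = np.asarray(x0, dtype=float), alpha0
    for _ in range(max_iters):
        g = smoothing_gradient(f, x)
        if np.linalg.norm(g) < tol:
            break
        x_trial = x - alpha * g
        # Relaxed sufficient decrease: the 2*eps_f slack absorbs the noise.
        if f(x_trial) <= f(x) - theta * alpha * np.dot(g, g) + 2 * eps_f:
            x = x_trial
            alpha = min(alpha / gamma, alpha0)  # successful step: grow alpha
        else:
            alpha *= gamma                      # unsuccessful step: shrink alpha
    return x

if __name__ == "__main__":
    eps_f = 1e-3  # assumed bound on the absolute value of the noise
    rng = np.random.default_rng(0)
    # Noisy quadratic: true minimizer at the origin.
    f = lambda x: float(np.dot(x, x)) + rng.uniform(-eps_f, eps_f)
    print(noisy_line_search(f, np.ones(5), eps_f))

In this sketch the step acceptance test can only distinguish decrease above the noise level, which is why the accuracy requirement on the gradient estimate is probabilistic rather than deterministic.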

Author Information

Katya Scheinberg (Lehigh University)
