Poster

Computing the Bias of Constant-step Stochastic Approximation with Markovian Noise

Sebastian Allmeier · Nicolas Gast


Abstract: We study stochastic approximation algorithms with Markovian noise and a constant step-size α. We develop a method based on infinitesimal generator comparisons to study the bias of the algorithm, i.e., the expected difference between θ_n (the value at iteration n) and θ* (the unique equilibrium of the corresponding ODE). We show that, under some smoothness conditions, this bias is of order O(α). Furthermore, we show that the time-averaged bias is equal to αV + O(α²), where V is a constant characterized by a Lyapunov equation, which implies E[θ̄_n] ≈ θ* + αV + O(α²), where θ̄_n is the Polyak-Ruppert average. We also show that θ̄_n concentrates with high probability around θ* + αV. We illustrate how to combine this with Richardson-Romberg extrapolation to derive an iterative scheme with a bias of order O(α²).
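To make the extrapolation scheme concrete, here is a minimal sketch (not the authors' code; the Markov chain, drift function, and all constants are illustrative assumptions). It runs a constant-step stochastic approximation driven by a two-state Markov chain at step sizes α and 2α, forms the Polyak-Ruppert average of each run, and combines them as 2·θ̄(α) − θ̄(2α), which cancels the leading αV bias term:

```python
import numpy as np

# Illustrative setup (assumed, not from the paper): Markovian noise X_k is a
# two-state chain with transition matrix P; the drift is h(θ, x) = v[x] - θ³.
# The ODE is θ' = E_π[v] - θ³ with E_π[v] = 1, so its unique equilibrium is θ* = 1.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])   # stationary distribution π = (2/3, 1/3)
v = np.array([0.0, 3.0])     # E_π[v] = (2/3)*0.0 + (1/3)*3.0 = 1.0
THETA_STAR = 1.0

def polyak_average(alpha, n_iters, seed=0):
    """Run θ_{k+1} = θ_k + α·h(θ_k, X_k) and return the Polyak-Ruppert
    average taken over the second half of the trajectory (first half
    discarded as burn-in)."""
    rng = np.random.default_rng(seed)
    state, theta = 0, 0.0
    total, count = 0.0, 0
    for k in range(n_iters):
        # One step of the Markovian noise chain.
        state = 0 if rng.random() < P[state, 0] else 1
        # Constant-step stochastic approximation update.
        theta += alpha * (v[state] - theta ** 3)
        if k >= n_iters // 2:
            total += theta
            count += 1
    return total / count

alpha, n = 0.05, 200_000
avg_a  = polyak_average(alpha,     n, seed=1)   # biased: ≈ θ* + αV
avg_2a = polyak_average(2 * alpha, n, seed=2)   # biased: ≈ θ* + 2αV
rr = 2 * avg_a - avg_2a                         # Richardson-Romberg: bias O(α²)
print(f"avg(α)={avg_a:.4f}  avg(2α)={avg_2a:.4f}  extrapolated={rr:.4f}")
```

The two runs can reuse the same noise trajectory in practice; here they use independent seeds for simplicity. The key point is only the linear combination: since each average carries a bias ≈ (step size)·V, doubling the step and subtracting removes the first-order term.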
