Poster

Better Mini-Batch Algorithms via Accelerated Gradient Methods

Andrew Cotter · Ohad Shamir · Nati Srebro · Karthik Sridharan


Abstract:

Mini-batch algorithms have recently received significant attention as a way to speed up stochastic convex optimization problems. In this paper, we study how such algorithms can be improved using accelerated gradient methods. We provide a novel analysis, which shows how standard gradient methods may sometimes be insufficient to obtain a significant speed-up. We propose a novel accelerated gradient algorithm, which addresses this deficiency and enjoys a uniformly superior guarantee. We conclude the paper with experiments on real-world datasets, which validate our algorithm and substantiate our theoretical insights.
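To make the setting concrete, below is a minimal sketch of a generic mini-batch, Nesterov-style accelerated gradient method applied to a synthetic least-squares problem. It is not the specific algorithm proposed in the paper; the objective, step size, momentum schedule, and batch size are illustrative assumptions chosen only to show how accelerated updates can be driven by mini-batch stochastic gradients.

```python
# Hedged sketch: generic mini-batch Nesterov-accelerated gradient descent on a
# synthetic least-squares objective. This is an illustration of the general
# technique, not the paper's proposed algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: n examples, d features, squared-loss objective.
n, d = 2000, 50
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(n)


def minibatch_grad(w, batch_size):
    """Stochastic gradient of the average squared loss over a random mini-batch."""
    idx = rng.choice(n, size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (Xb @ w - yb) / batch_size


def accelerated_minibatch_sgd(batch_size=100, iters=200, step=0.01):
    """Nesterov-style accelerated updates driven by mini-batch gradients."""
    w = np.zeros(d)       # current iterate
    w_prev = np.zeros(d)  # previous iterate (used for the momentum term)
    for t in range(1, iters + 1):
        momentum = (t - 1) / (t + 2)        # standard Nesterov momentum schedule
        v = w + momentum * (w - w_prev)     # look-ahead (extrapolated) point
        g = minibatch_grad(v, batch_size)   # mini-batch gradient at the look-ahead point
        w_prev, w = w, v - step * g         # gradient step taken from v
    return w


w_hat = accelerated_minibatch_sgd()
print("distance to w_true:", np.linalg.norm(w_hat - w_true))
```

The key structural point, shared with accelerated mini-batch methods in general, is that the stochastic gradient is evaluated at an extrapolated point rather than at the current iterate, which is what allows larger mini-batches to translate into fewer iterations.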
