
Better Mini-Batch Algorithms via Accelerated Gradient Methods
Andrew Cotter · Ohad Shamir · Nati Srebro · Karthik Sridharan

Tue Dec 13 08:45 AM -- 02:59 PM (PST)

Mini-batch algorithms have recently received significant attention as a way to speed up stochastic convex optimization. In this paper, we study how such algorithms can be improved using accelerated gradient methods. We provide a novel analysis, which shows how standard gradient methods may sometimes be insufficient to obtain a significant speed-up. We propose a novel accelerated gradient algorithm that addresses this deficiency and enjoys a uniformly superior guarantee. We conclude with experiments on real-world datasets, which validate our algorithm and substantiate our theoretical insights.
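To illustrate the general idea the abstract describes, here is a minimal sketch of Nesterov-style accelerated gradient descent driven by mini-batch gradient estimates, on a synthetic least-squares problem. This is a generic illustration of combining acceleration with mini-batching, not the specific algorithm, step sizes, or guarantees analyzed in the paper; the problem setup, batch size, and learning rate are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem: minimize f(w) = (1/2n) ||X w - y||^2.
# (Assumed setup for illustration; not from the paper.)
n, d = 1000, 20
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.01 * rng.standard_normal(n)

def minibatch_grad(w, idx):
    """Stochastic gradient of f estimated on the mini-batch rows `idx`."""
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (Xb @ w - yb) / len(idx)

def accelerated_minibatch_gd(batch_size=50, steps=200, lr=0.05):
    """Nesterov-style acceleration with mini-batch gradients (generic sketch)."""
    w = np.zeros(d)
    w_prev = np.zeros(d)
    for t in range(1, steps + 1):
        # Extrapolate with momentum, then take a gradient step at the
        # extrapolated point, using a noisy mini-batch gradient estimate.
        momentum = (t - 1) / (t + 2)
        v = w + momentum * (w - w_prev)
        idx = rng.choice(n, size=batch_size, replace=False)
        w_prev, w = w, v - lr * minibatch_grad(v, idx)
    return w

w_hat = accelerated_minibatch_gd()
print(np.linalg.norm(w_hat - w_true))
```

The momentum schedule `(t - 1) / (t + 2)` is the standard choice for Nesterov's method on smooth convex problems; larger mini-batches reduce gradient variance, which is what makes acceleration attractive in this regime.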

Author Information

Andy Cotter (TTI Chicago)
Ohad Shamir (Weizmann Institute of Science)
Nati Srebro (TTI-Chicago)
Karthik Sridharan (University of Pennsylvania)