Poster
Memory-Constrained Algorithms for Convex Optimization
Moise Blanchard · Junhui Zhang · Patrick Jaillet
Great Hall & Hall B1+B2 (level 1) #1904
Abstract:
We propose a family of recursive cutting-plane algorithms to solve feasibility problems with constrained memory, which can also be used for first-order convex optimization. Precisely, in order to find a point within a ball of radius $\epsilon$ with a separation oracle in dimension $d$ (or, equivalently, to minimize $1$-Lipschitz convex functions to accuracy $\epsilon$ over the unit ball), our algorithms use $\mathcal{O}(\frac{d^2}{p}\ln\frac{1}{\epsilon})$ bits of memory and make $\mathcal{O}((C\frac{d}{p}\ln\frac{1}{\epsilon})^p)$ oracle calls, for some universal constant $C \geq 1$. The family is parametrized by $p \in [d]$ and provides an oracle-complexity/memory trade-off in the sub-polynomial regime $\ln\frac{1}{\epsilon} \gg \ln d$. Several works gave lower-bound trade-offs (impossibility results); we make explicit here their dependence on $\ln\frac{1}{\epsilon}$, showing that these also hold in any sub-polynomial regime. To the best of our knowledge, this is the first class of algorithms that provides a positive trade-off between gradient descent and cutting-plane methods in any regime with $\epsilon \leq 1/\sqrt{d}$. The algorithms divide the $d$ variables into $p$ blocks and optimize over the blocks sequentially, with approximate separation vectors constructed using a variant of Vaidya's method. In the regime $\epsilon \leq d^{-\Omega(d)}$, our algorithm with $p = d$ achieves the information-theoretically optimal memory usage and improves the oracle-complexity of gradient descent.
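To make the trade-off concrete, here is a minimal sketch (not from the paper) that evaluates the two bounds above for a few choices of the block count $p$. The dimension, accuracy, and the value of the universal constant $C$ are illustrative assumptions, and constant factors hidden by the $\mathcal{O}(\cdot)$ notation are ignored.

```python
import math

# Hypothetical example values; the abstract only fixes the asymptotic forms.
d = 1000      # dimension (assumed)
eps = 1e-6    # target accuracy (assumed)
C = 1.0       # stand-in for the universal constant C >= 1 (assumed)

log_term = math.log(1 / eps)  # ln(1/eps)

for p in (1, 2, 4, 8):
    memory_bits = d**2 / p * log_term           # O((d^2/p) * ln(1/eps)) bits
    oracle_calls = (C * d / p * log_term) ** p  # O((C*(d/p) * ln(1/eps))^p) calls
    print(f"p={p}: ~{memory_bits:.2e} bits of memory, ~{oracle_calls:.2e} oracle calls")
```

Increasing $p$ shrinks the memory bound linearly while the oracle-call bound grows exponentially in $p$, which is the trade-off the family exposes.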