

Poster

Accelerated Zeroth-Order and First-Order Momentum Methods from Mini to Minimax Optimization

Feihu Huang · Shangqian Gao · Jian Pei · Heng Huang

Hall J (level 1) #1007

Keywords: [ JMLR ] [ Journal Track ]


Abstract: In this paper, we propose a class of accelerated zeroth-order and first-order momentum methods for both nonconvex mini-optimization and minimax optimization. Specifically, we propose a new accelerated zeroth-order momentum (Acc-ZOM) method for black-box mini-optimization, where only function values can be obtained. We prove that our Acc-ZOM method achieves a lower query complexity of $\tilde{O}(d^{3/4}\epsilon^{-3})$ for finding an $\epsilon$-stationary point, which improves the best known result by a factor of $O(d^{1/4})$, where $d$ denotes the variable dimension. In particular, our Acc-ZOM does not require the large batches needed in existing zeroth-order stochastic algorithms. Meanwhile, we propose an accelerated zeroth-order momentum descent ascent (Acc-ZOMDA) method for black-box minimax optimization, where again only function values can be obtained. Our Acc-ZOMDA obtains a low query complexity of $\tilde{O}((d_1+d_2)^{3/4}\kappa_y^{4.5}\epsilon^{-3})$ without requiring large batches for finding an $\epsilon$-stationary point, where $d_1$ and $d_2$ denote the variable dimensions and $\kappa_y$ is the condition number. Moreover, we propose an accelerated first-order momentum descent ascent (Acc-MDA) method for minimax optimization in which explicit gradients are accessible. Our Acc-MDA achieves a low gradient complexity of $\tilde{O}(\kappa_y^{4.5}\epsilon^{-3})$ without requiring large batches for finding an $\epsilon$-stationary point. In particular, our Acc-MDA can obtain a lower gradient complexity of $\tilde{O}(\kappa_y^{2.5}\epsilon^{-3})$ with a batch size of $O(\kappa_y^4)$, which improves the best known result by a factor of $O(\kappa_y^{1/2})$. Extensive experimental results on black-box adversarial attacks on deep neural networks and poisoning attacks on logistic regression demonstrate the efficiency of our algorithms.
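For intuition about the zeroth-order setting described in the abstract, the sketch below shows how a gradient can be estimated from function values alone and combined with a momentum step. It is a minimal illustration, not the paper's Acc-ZOM: the two-point Gaussian-smoothing estimator is a standard construction, the names `zo_grad_estimate` and `zo_momentum_descent` and all constants are hypothetical, and the plain exponential-moving-average momentum stands in for the variance-reduced momentum estimator the actual method uses.

```python
import numpy as np

def zo_grad_estimate(f, x, mu=1e-3, rng=None):
    # Two-point zeroth-order gradient estimate via Gaussian smoothing:
    # only function values of f are queried, as in the black-box setting.
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u

def zo_momentum_descent(f, x0, steps=2000, eta=0.01, alpha=0.9, mu=1e-3, seed=0):
    # Hypothetical sketch: exponential-moving-average momentum over
    # zeroth-order gradient estimates (NOT the paper's variance-reduced
    # Acc-ZOM update; step size and momentum weight are illustrative).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # momentum buffer
    for _ in range(steps):
        g = zo_grad_estimate(f, x, mu=mu, rng=rng)
        m = alpha * m + (1.0 - alpha) * g
        x = x - eta * m
    return x

# Example: drive a simple quadratic toward its minimizer using only function values.
quadratic = lambda z: float(np.sum(z ** 2))
x_final = zo_momentum_descent(quadratic, x0=np.ones(10))
```

Each iteration issues two function queries, which is why query complexity (rather than gradient complexity) is the relevant measure for the zeroth-order methods above.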
