Oral
Optimal Algorithms for Continuous Non-monotone Submodular and DR-Submodular Maximization
Rad Niazadeh · Tim Roughgarden · Joshua Wang

Thu Dec 06 07:05 AM -- 07:20 AM (PST) @ Room 517 CD

In this paper we study the fundamental problems of maximizing a continuous non-monotone submodular function over a hypercube, with and without coordinate-wise concavity. This family of optimization problems has several applications in machine learning, economics, and communication systems. Our main result is the first 1/2-approximation algorithm for continuous submodular function maximization; this approximation factor of 1/2 is the best possible for algorithms that use only polynomially many queries. For the special case of DR-submodular maximization, we provide a faster 1/2-approximation algorithm that runs in (almost) linear time. Both of these results improve upon prior work [Bian et al., 2017, Soma and Yoshida, 2017, Buchbinder et al., 2012].
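As a concrete illustration of the setting (a hypothetical example, not taken from the paper), a quadratic f(x) = hᵀx − ½ xᵀAx with entrywise nonnegative A is DR-submodular over the hypercube, and with mixed-sign h it is non-monotone. The sketch below builds such a function and numerically checks the diminishing-returns property; all names and constants are illustrative.

```python
# Minimal sketch (assumption: a quadratic toy instance, not from the paper).
# f(x) = h^T x - 0.5 x^T A x with A entrywise nonnegative has Hessian -A,
# which is entrywise nonpositive -- the DR-submodularity condition for
# twice-differentiable functions. Mixed-sign h makes f non-monotone.
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.uniform(0.0, 1.0, size=(n, n))
A = (A + A.T) / 2                     # symmetric, entrywise nonnegative
h = rng.uniform(-1.0, 1.0, size=n)    # mixed signs -> non-monotone

def f(x):
    """Non-monotone DR-submodular quadratic on [0,1]^n."""
    return h @ x - 0.5 * x @ A @ x

# Diminishing returns: for x <= y coordinate-wise, the marginal gain of
# increasing coordinate i by delta is at least as large at x as at y.
delta, i = 0.1, 2
x = rng.uniform(0.0, 0.4, size=n)
y = x + rng.uniform(0.0, 0.4, size=n)  # y >= x coordinate-wise
e_i = np.eye(n)[i]
gain_at_x = f(x + delta * e_i) - f(x)
gain_at_y = f(y + delta * e_i) - f(y)
assert gain_at_x >= gain_at_y - 1e-12
print(gain_at_x, gain_at_y)
```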

Our first algorithm is a single-pass algorithm that uses novel ideas such as reducing the guaranteed approximation problem to analyzing a zero-sum game for each coordinate, and then using the geometry of this zero-sum game to fix the value of that coordinate. Our second algorithm is a faster single-pass algorithm that exploits coordinate-wise concavity to identify a monotone equilibrium condition sufficient for obtaining the required approximation guarantee, and finds the equilibrium point using binary search. We further run experiments to verify the performance of our proposed algorithms in related machine learning applications.
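To give a feel for the second ingredient, the sketch below shows how, under coordinate-wise concavity, a sign-based binary search on a coordinate's (finite-difference) partial derivative can fix that coordinate's value in a single pass. This is a simplified stand-in for intuition only, not the paper's equilibrium condition or its guarantee; the names coordinate_binary_search, coordinate_pass, and toy_f are hypothetical.

```python
# Simplified sketch (assumption: plain coordinate-wise maximization, not the
# paper's equilibrium-based update). With coordinate-wise concavity, the
# restriction g(z) = f(x_1, ..., z, ..., x_n) is concave in z, so its
# derivative is non-increasing and a sign-based binary search finds an
# (approximate) maximizer with O(log(1/eps)) evaluations per coordinate.
import numpy as np

def coordinate_binary_search(f, x, i, eps=1e-6, lo=0.0, hi=1.0):
    """Binary search on the sign of the finite-difference partial derivative
    of the concave restriction z -> f(x with x[i] = z)."""
    def dfi(z):
        xp, xm = x.copy(), x.copy()
        xp[i], xm[i] = min(z + eps, hi), max(z - eps, lo)
        return (f(xp) - f(xm)) / (xp[i] - xm[i])

    if dfi(lo) <= 0:           # restriction is non-increasing on [lo, hi]
        return lo
    if dfi(hi) >= 0:           # restriction is non-decreasing on [lo, hi]
        return hi
    while hi - lo > eps:       # derivative changes sign: bisect
        mid = (lo + hi) / 2
        if dfi(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def coordinate_pass(f, n, **kw):
    """One single-pass sweep that fixes coordinates one by one."""
    x = np.zeros(n)
    for i in range(n):
        x[i] = coordinate_binary_search(f, x, i, **kw)
    return x

# Example usage on a toy coordinate-wise concave (and DR-submodular) function:
toy_f = lambda x: float(np.sum(x) - 0.5 * np.sum(x * x) - 0.3 * x[0] * x[1])
print(coordinate_pass(toy_f, n=3))
```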

Author Information

Rad Niazadeh (Stanford University)

Rad Niazadeh is a Motwani postdoctoral fellow at Stanford University, Department of Computer Science, where he is hosted by Tim Roughgarden, Amin Saberi, and Moses Charikar. Prior to Stanford, he obtained his Ph.D. in Computer Science from Cornell University under Bobby Kleinberg. During his graduate studies, he was a research intern at Microsoft Research (Redmond), Microsoft Research (New England), and Yahoo! Research. He has also been awarded the Google PhD Fellowship (in market algorithms), the INFORMS Revenue Management and Pricing Dissertation Award Honorable Mention (runner-up), the Stanford Motwani fellowship, and the Cornell Irwin Jacobs fellowship. His research interests are broadly at the intersection of algorithms, game theory, and optimization, with a focus on applications in market design, machine learning, and operations research.

Tim Roughgarden (Stanford University)
Joshua Wang (Google)
