

Poster

Oracle-Efficient Online Learning for Smoothed Adversaries

Nika Haghtalab · Yanjun Han · Abhishek Shetty · Kunhe Yang

Hall J (level 1) #824

Keywords: [ Online Learning ] [ Computational Efficiency ] [ Smoothed Analysis ]


Abstract: We study the design of computationally efficient online learning algorithms under smoothed analysis. In this setting, at every step, an adversary generates a sample from an adaptively chosen distribution whose density is upper bounded by $1/\sigma$ times the uniform density. Given access to an offline optimization (ERM) oracle, we give the first computationally efficient online algorithms whose sublinear regret depends only on the pseudo/VC dimension $d$ of the class and the smoothness parameter $\sigma$. In particular, we achieve \emph{oracle-efficient} regret bounds of $O(\sqrt{Td\sigma^{-1}})$ for learning real-valued functions and $O(\sqrt{Td\sigma^{-1/2}})$ for learning binary-valued functions. Our results establish that online learning is computationally as easy as offline learning under the smoothed analysis framework. This contrasts with the computational separation between online learning with worst-case adversaries and offline learning established by [HK16]. Our algorithms also achieve improved bounds for some settings with binary-valued functions and worst-case adversaries. These include an oracle-efficient algorithm with $O(\sqrt{T(d|\mathcal{X}|)^{1/2}})$ regret that refines the earlier $O(\sqrt{T|\mathcal{X}|})$ bound of [DS16] for finite domains, and an oracle-efficient algorithm with $O(T^{3/4}d^{1/2})$ regret for the transductive setting.
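For concreteness, the smoothness condition described in the abstract can be written out formally. The following is a minimal sketch using assumed notation ($\mathcal{D}_t$ for the adversary's round-$t$ distribution, $\mu$ for the uniform distribution over the domain $\mathcal{X}$); it paraphrases the setting rather than quoting the paper:

$$\mathcal{D}_t \text{ is } \sigma\text{-smooth} \;\iff\; \frac{d\mathcal{D}_t}{d\mu}(x) \le \frac{1}{\sigma} \quad \text{for all } x \in \mathcal{X}.$$

Under this reading, $\sigma = 1$ forces $\mathcal{D}_t$ to be uniform, while $\sigma \to 0$ allows increasingly concentrated distributions and approaches the worst-case adversary; the stated guarantees, e.g. $O(\sqrt{Td\sigma^{-1}})$ expected regret for real-valued classes, degrade gracefully as $\sigma$ shrinks.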
