Poster
Learning Bounds for Importance Weighting
Corinna Cortes · Yishay Mansour · Mehryar Mohri

Tue Dec 7th

This paper presents an analysis of importance weighting for learning from finite samples and gives a series of theoretical and algorithmic results. We point out simple cases where importance weighting can fail, which suggests the need for an analysis of the properties of this technique. We then give both upper and lower bounds for generalization with bounded importance weights and, more significantly, give learning guarantees for the more common case of unbounded importance weights under the weak assumption that the second moment of the weights is bounded, a condition related to the Rényi divergence of the training and test distributions. These results are based on a series of novel and general bounds we derive for unbounded loss functions, which are of independent interest. We use these bounds to guide the definition of an alternative reweighting algorithm and report the results of experiments demonstrating its benefits. Finally, we analyze the properties of normalized importance weights, which are also commonly used.
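To make the setting concrete, the following is a minimal sketch (not code from the paper) of importance-weighted empirical risk: losses on training points are reweighted by the density ratio w(x) = p_test(x) / p_train(x), and the empirical second moment of the weights is the quantity the abstract's bounded-second-moment assumption concerns. All function names and the Gaussian densities are illustrative assumptions.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x; used here only to build example distributions."""
    return math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)) / (sigma * math.sqrt(2.0 * math.pi))

def importance_weighted_loss(xs, losses, p_test, p_train):
    """Importance-weighted empirical loss (1/m) * sum_i w(x_i) * loss_i,
    where w = p_test / p_train and xs are drawn from the training distribution."""
    weights = [p_test(x) / p_train(x) for x in xs]
    return sum(w * l for w, l in zip(weights, losses)) / len(xs)

def weight_second_moment(xs, p_test, p_train):
    """Empirical second moment (1/m) * sum_i w(x_i)^2 of the importance weights;
    a large value signals that the reweighted estimate may be unreliable."""
    weights = [p_test(x) / p_train(x) for x in xs]
    return sum(w * w for w in weights) / len(xs)
```

When the training and test distributions coincide, every weight is 1 and the weighted loss reduces to the plain empirical average; as the two distributions separate (e.g., two Gaussians with shifted means), individual weights and their second moment grow, which is the failure mode the paper's bounds quantify.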

Author Information

Corinna Cortes (Google Research)
Yishay Mansour (Tel-Aviv University)
Mehryar Mohri (Courant Inst. of Math. Sciences & Google Research)
