On Convergence of Nearest Neighbor Classifiers over Feature Transformations
Luka Rimanic · Cedric Renggli · Bo Li · Ce Zhang

Tue Dec 08 09:00 AM -- 11:00 AM (PST) @ Poster Session 1 #336

The k-Nearest Neighbors (kNN) classifier is a fundamental non-parametric machine learning algorithm. However, it is well known that it suffers from the curse of dimensionality, which is why in practice one often applies a kNN classifier on top of a (pre-trained) feature transformation. From a theoretical perspective, most, if not all, theoretical results aimed at understanding the kNN classifier are derived for the raw feature space. This leads to an emerging gap between our theoretical understanding of kNN and its practical applications. In this paper, we take a first step towards bridging this gap. We provide a novel analysis of the convergence rates of a kNN classifier over transformed features. This analysis requires an in-depth understanding of the properties that connect both the transformed space and the raw feature space. More precisely, we build our convergence bound upon two key properties of the transformed space: (1) safety -- how well one can recover the raw posterior from the transformed space, and (2) smoothness -- how complex this recovery function is. Based on our result, we are able to explain why some (pre-trained) feature transformations are better suited for a kNN classifier than others. We empirically validate that both properties have an impact on kNN convergence, using 30 feature transformations and 6 benchmark datasets spanning the vision and text domains.
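The practical setup the paper studies, applying a kNN classifier in a transformed feature space rather than on raw features, can be sketched as follows. This is a minimal illustration, not the paper's method: the transformation `f` here is a hypothetical stand-in for a pre-trained feature extractor, and the kNN implementation is a plain majority vote over Euclidean nearest neighbors.

```python
import numpy as np

def knn_predict(train_X, train_y, query, k=3):
    """Predict the label of `query` by majority vote among its k nearest
    training points under the Euclidean distance."""
    d = np.linalg.norm(train_X - query, axis=1)
    nearest = np.argsort(d)[:k]
    votes = np.bincount(train_y[nearest])
    return int(np.argmax(votes))

# Toy data in the raw feature space, with a linearly separable label.
rng = np.random.default_rng(0)
raw_X = rng.normal(size=(100, 2))
y = (raw_X[:, 0] + raw_X[:, 1] > 0).astype(int)

def f(X):
    """Hypothetical feature transformation, standing in for a pre-trained
    embedding; here a fixed linear map followed by tanh."""
    return np.tanh(X @ np.array([[1.0, 0.5], [0.5, 1.0]]))

# kNN is applied in the transformed space f(X), as in the practice the
# paper analyzes, rather than on the raw features.
pred = knn_predict(f(raw_X), y, f(np.array([[1.0, 1.0]]))[0], k=5)
```

In the paper's terms, this `f` is safe (the raw posterior is recoverable from `f(x)`, since `f` is injective here) and smooth (the recovery map is well-behaved), which is the regime in which the transformed-space kNN converges well.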

Author Information

Luka Rimanic (ETH Zurich)
Cedric Renggli (ETH Zurich)
Bo Li (UIUC)
Ce Zhang (ETH Zurich)