Extrapolation Towards Imaginary 0-Nearest Neighbour and Its Improved Convergence Rate
Akifumi Okuno · Hidetoshi Shimodaira

Wed Dec 09 09:00 PM -- 11:00 PM (PST) @ Poster Session 4 #1188
$k$-nearest neighbour ($k$-NN) is one of the simplest and most widely used methods for supervised classification; it predicts a query's label by taking a weighted ratio of the observed labels of the $k$ objects nearest to the query. The weights and the parameter $k \in \mathbb{N}$ regulate its bias-variance trade-off, which in turn determines the convergence rate of the excess risk for the $k$-NN classifier; several existing studies have considered selecting the optimal $k$ and weights to obtain a faster convergence rate. Whereas $k$-NN with non-negative weights has been studied extensively, negative weights have been proved essential for eliminating the bias terms and attaining the optimal convergence rate. In this paper, we propose a novel multiscale $k$-NN (MS-$k$-NN) that extrapolates unweighted $k$-NN estimators computed for several values of $k \ge 1$ down to $k = 0$, thus yielding an imaginary 0-NN estimator. Our method implicitly computes optimal real-valued weights that adapt to the query and its neighbour points. We theoretically prove that MS-$k$-NN attains an improved rate, which coincides with the existing optimal rate under some conditions.
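To make the extrapolation idea concrete, here is a minimal sketch, assuming the MS-$k$-NN extrapolation is realized as a least-squares polynomial fit of unweighted $k$-NN estimates against the squared distance to the $k$-th neighbour, evaluated at distance zero (the "imaginary 0-NN" point). The function name, the choice of $k$ values, and the polynomial degree below are illustrative assumptions, not the paper's reference implementation.

```python
# Sketch of multiscale k-NN (MS-k-NN): compute unweighted k-NN estimates for
# several k, then extrapolate them to k = 0 via a polynomial fit in the
# squared k-th neighbour distance. Names and parameters are illustrative.
import numpy as np

def ms_knn_predict(X_train, y_train, query, ks=(4, 8, 16, 32), degree=2):
    """Extrapolate unweighted k-NN class-probability estimates at `query`
    to an imaginary 0-NN estimate."""
    dists = np.linalg.norm(X_train - query, axis=1)
    order = np.argsort(dists)

    # Unweighted k-NN estimates and the matching squared k-th neighbour distances.
    r2 = np.array([dists[order[k - 1]] ** 2 for k in ks])
    p_hat = np.array([y_train[order[:k]].mean() for k in ks])

    # Least-squares fit p_hat ~ b0 + b1*r2 + b2*r2^2 + ...; the intercept b0
    # is the value extrapolated to r2 = 0, i.e., the imaginary 0-NN estimate.
    V = np.vander(r2, N=degree + 1, increasing=True)
    coef, *_ = np.linalg.lstsq(V, p_hat, rcond=None)
    return coef[0]

# Toy usage: two Gaussian blobs with labels 0 and 1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (200, 2)), rng.normal(1, 1, (200, 2))])
y = np.concatenate([np.zeros(200), np.ones(200)])
print(ms_knn_predict(X, y, query=np.array([0.5, 0.5])))
```

Because the extrapolated intercept is a linear combination of the $k$-NN estimates, this construction implicitly assigns real-valued (possibly negative) weights to the neighbours, in line with the abstract's claim that negative weights are needed to cancel the bias terms.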

Author Information

Akifumi Okuno (The Institute of Statistical Mathematics)
Hidetoshi Shimodaira (Kyoto University / RIKEN AIP)