Abstract:
$k$-nearest neighbour ($k$-NN) is one of the simplest and most widely used methods for supervised classification: it predicts a query's label by taking a weighted ratio of the observed labels of the objects nearest to the query. The weights and the parameter $k \in \mathbb{N}$ regulate its bias-variance trade-off, and the trade-off implicitly affects the convergence rate of the excess risk for the $k$-NN classifier; several existing studies considered selecting optimal $k$ and weights to obtain a faster convergence rate. Whereas $k$-NN with non-negative weights has been developed widely, it was also proved that negative weights are essential for eradicating the bias terms and attaining the optimal convergence rate. In this paper, we propose a novel multiscale $k$-NN (MS-$k$-NN) that extrapolates unweighted $k$-NN estimators from several values $k \ge 1$ to $k = 0$, thus giving an imaginary 0-NN estimator. Our method implicitly computes optimal real-valued weights that are adaptive to the query and its neighbour points. We theoretically prove that MS-$k$-NN attains an improved rate, which coincides with the existing optimal rate under some conditions.
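To illustrate the extrapolation idea, here is a minimal sketch in Python. It computes unweighted $k$-NN estimates at several values of $k$, regresses them against a scale variable, and reads off the intercept as the imaginary 0-NN estimate. The function name `ms_knn_predict`, the use of the mean squared neighbour distance as the scale variable, the grid of $k$ values, and the polynomial degree are all illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def ms_knn_predict(X_train, y_train, x_query, ks=(5, 10, 20, 40), degree=2):
    """Sketch of multiscale k-NN: extrapolate unweighted k-NN estimates
    at several k down to an imaginary 0-NN estimate.

    Assumptions (illustrative, not the paper's exact construction):
    the scale variable is the mean squared distance to the k nearest
    neighbours, and the extrapolation is a polynomial fit in that scale.
    """
    # Distances from the query to all training points.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    order = np.argsort(dists)

    estimates, scales = [], []
    for k in ks:
        idx = order[:k]
        estimates.append(y_train[idx].mean())    # unweighted k-NN estimate
        scales.append((dists[idx] ** 2).mean())  # scale variable (bias proxy)

    # Fit a polynomial in the scale variable and evaluate it at scale 0,
    # i.e. extrapolate the k-NN estimates to the imaginary "0-NN" limit.
    coeffs = np.polyfit(scales, estimates, deg=degree)
    return np.polyval(coeffs, 0.0)
```

For binary classification with labels in {0, 1}, thresholding the returned estimate at 1/2 gives the predicted class. Because the extrapolated intercept is a linear combination of the individual $k$-NN estimates with coefficients determined by the fit, the implied per-neighbour weights are real-valued and can be negative, matching the role of negative weights described in the abstract.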