
Poster

Gradient Weights help Nonparametric Regressors

Samory Kpotufe · Abdeslam Boularias

Harrah’s Special Events Center 2nd Floor

Abstract: In regression problems over $\mathbb{R}^d$, the unknown function $f$ often varies more in some coordinates than in others. We show that weighting each coordinate $i$ with the estimated norm of the $i$th derivative of $f$ is an efficient way to significantly improve the performance of distance-based regressors, e.g., kernel and $k$-NN regressors. We propose a simple estimator of these derivative norms and prove its consistency. Moreover, the proposed estimator is efficiently learned online.
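For illustration, here is a minimal sketch of the gradient-weighting idea in Python, assuming scikit-learn is available. The derivative norms are approximated with a simple central finite difference of a pilot k-NN fit (a hypothetical stand-in for the paper's estimator, chosen only to keep the example short); each coordinate is then rescaled by its estimated weight before running a standard k-NN regressor.

```python
# Sketch of gradient-weighted k-NN regression.
# Hypothetical derivative-norm estimator: a central finite difference
# of a pilot k-NN fit, not the paper's exact estimator.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def gradient_weights(X, y, k=10, t=0.1):
    """Estimate per-coordinate weights W_i ~ average |df/dx_i|."""
    pilot = KNeighborsRegressor(n_neighbors=k).fit(X, y)
    n, d = X.shape
    W = np.empty(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = t  # perturbation along coordinate i
        # central finite difference of the pilot estimate at each sample
        W[i] = np.mean(np.abs(pilot.predict(X + e) - pilot.predict(X - e)) / (2 * t))
    return W

# Usage: f below varies mostly in coordinate 0, so W should emphasize it.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 5))
y = np.sin(3 * X[:, 0]) + 0.1 * X[:, 1] + 0.05 * rng.normal(size=500)

W = gradient_weights(X, y)
knn = KNeighborsRegressor(n_neighbors=10).fit(X * W, y)
X_test = rng.uniform(-1, 1, size=(100, 5))
pred = knn.predict(X_test * W)  # distances now weight informative coordinates
```

Rescaling the inputs by the weights is equivalent to replacing the Euclidean metric with a weighted one, so the same trick applies unchanged to any distance-based regressor, e.g., a kernel regressor.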
