Poster
Gradient Weights help Nonparametric Regressors
Samory Kpotufe · Abdeslam Boularias
Harrah’s Special Events Center 2nd Floor
Abstract:
In regression problems over R^d, the unknown function f often varies more in some coordinates than in others. We show that weighting each coordinate i with the estimated norm of the i'th derivative of f is an efficient way to significantly improve the performance of distance-based regressors, e.g. kernel and k-NN regressors. We propose a simple estimator of these derivative norms and prove its consistency. Moreover, the proposed estimator is efficiently learned online.
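The sketch below illustrates the gradient-weighting idea from the abstract, not the authors' exact estimator: coordinate weights are taken as averaged finite differences of a pilot kernel estimate and then used to rescale the inputs before running an ordinary k-NN regressor. The helper names (`kernel_regress`, `estimate_gradient_weights`) and the bandwidth/step-size choices are illustrative assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def kernel_regress(X_train, y_train, X_query, bandwidth):
    """Nadaraya-Watson pilot estimate with a Gaussian kernel."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return (K @ y_train) / np.maximum(K.sum(axis=1), 1e-12)

def estimate_gradient_weights(X, y, bandwidth=0.5, step=0.1):
    """One weight per coordinate: averaged absolute central differences of the pilot fit."""
    n, d = X.shape
    weights = np.empty(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = step
        f_plus = kernel_regress(X, y, X + e, bandwidth)
        f_minus = kernel_regress(X, y, X - e, bandwidth)
        # average |central difference| approximates the norm of the i'th partial derivative
        weights[i] = np.mean(np.abs(f_plus - f_minus) / (2.0 * step))
    return weights

# Toy usage: y depends strongly on x_0, weakly on x_1, and not at all on x_2.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 3))
y = 5.0 * X[:, 0] + 0.5 * np.sin(X[:, 1]) + 0.05 * rng.normal(size=500)

w = estimate_gradient_weights(X, y)
knn_plain = KNeighborsRegressor(n_neighbors=10).fit(X, y)
knn_weighted = KNeighborsRegressor(n_neighbors=10).fit(X * w, y)

# Queries must be rescaled with the same coordinate weights.
x_new = np.array([[0.2, -0.3, 0.7]])
print(knn_plain.predict(x_new), knn_weighted.predict(x_new * w))
```

Rescaling the inputs by the estimated derivative norms stretches informative coordinates and shrinks uninformative ones, so nearest neighbors are selected mostly along the directions where f actually varies.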