

Poster

Error Analysis of Generalized Nyström Kernel Regression

Hong Chen · Haifeng Xia · Heng Huang · Weidong Cai

Area 5+6+7+8 #63

Keywords: [ Learning Theory ] [ Kernel Methods ]


Abstract: The Nyström method has been used successfully to improve the computational efficiency of kernel ridge regression (KRR). Recently, theoretical analyses of Nyström KRR, including generalization bounds and convergence rates, have been established in the reproducing kernel Hilbert space (RKHS) associated with a symmetric positive semi-definite kernel. However, in real-world applications the RKHS is not always optimal, and the kernel function need not be symmetric or positive semi-definite. In this paper, we consider generalized Nyström kernel regression (GNKR) with $\ell_2$ coefficient regularization, where the kernel is only required to be continuous and bounded. Error analysis is provided to characterize its generalization performance, and column norm sampling is introduced to construct the refined hypothesis space. In particular, a fast learning rate with polynomial decay is derived for GNKR. Experimental analysis demonstrates the satisfactory performance of GNKR with column norm sampling.
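Below is a minimal sketch, not the authors' implementation, of the setup the abstract describes: a Nyström-style regression with $\ell_2$ coefficient regularization over landmarks chosen by column norm sampling. The Gaussian kernel, the parameters `m`, `lam`, `sigma`, and the function names `gnkr_fit` / `gnkr_predict` are illustrative assumptions; the sketch also computes the full kernel matrix to obtain exact column norms, whereas a practical implementation would approximate them.

```
# Sketch of generalized Nystroem kernel regression (GNKR) with l2
# coefficient regularization and column-norm landmark sampling.
# Assumptions: Gaussian kernel, small dense data; not the paper's code.
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2)); in the GNKR setting
    # any continuous, bounded kernel (not necessarily symmetric or PSD)
    # could be substituted here.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def gnkr_fit(X, y, m=50, lam=1e-3, sigma=1.0, seed=None):
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)            # full n x n kernel matrix
    # Column-norm sampling: pick m landmark columns with probability
    # proportional to their Euclidean column norms.
    p = np.linalg.norm(K, axis=0)
    p = p / p.sum()
    idx = rng.choice(n, size=m, replace=False, p=p)
    K_nm = K[:, idx]                            # restricted hypothesis space
    # l2 coefficient regularization: minimize
    #   (1/n) * ||K_nm @ alpha - y||^2 + lam * ||alpha||^2
    # whose closed-form solution is the regularized normal equation below.
    A = K_nm.T @ K_nm + n * lam * np.eye(m)
    alpha = np.linalg.solve(A, K_nm.T @ y)
    return alpha, X[idx]

def gnkr_predict(alpha, landmarks, X_new, sigma=1.0):
    # Predictor f(x) = sum_j alpha_j * K(x, landmark_j).
    return gaussian_kernel(X_new, landmarks, sigma) @ alpha
```

Because the regularizer penalizes the coefficient vector directly rather than an RKHS norm, the estimator remains well defined when the kernel is merely continuous and bounded, which is the regime the abstract targets.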
