Poster

A Comprehensive Analysis on the Learning Curve in Kernel Ridge Regression

Tin Sum Cheng · Aurelien Lucchi · Anastasis Kratsios · David Belius

East Exhibit Hall A-C #2207
Fri 13 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

This paper conducts a comprehensive study of the learning curves of kernel ridge regression (KRR) under minimal assumptions. Our contributions are three-fold: 1) we analyze the role of key properties of the kernel, such as its spectral eigen-decay and the characteristics of the eigenfunctions; 2) we demonstrate the validity of the Gaussian Equivalent Property (GEP), which states that the generalization performance of KRR remains the same when the whitened features are replaced by standard Gaussian vectors, thereby shedding light on the success of previous analyses made under the Gaussian Design Assumption; 3) we derive novel bounds that improve upon existing bounds across various settings.
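The sketch below is not from the paper; it is a minimal numerical illustration of the two objects named in the abstract: KRR on features whose covariance follows a polynomial spectral eigen-decay, and the GEP-style comparison in which whitened non-Gaussian features are swapped for standard Gaussian vectors with the same covariance spectrum. The eigen-decay exponent, sample sizes, and ridge parameter are assumptions chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def krr_test_error(X_train, y_train, X_test, y_test, lam):
    """Kernel ridge regression with a linear kernel on explicit features."""
    K = X_train @ X_train.T                           # Gram matrix
    n = K.shape[0]
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y_train)
    y_pred = X_test @ X_train.T @ alpha
    return np.mean((y_pred - y_test) ** 2)

p = 500                                               # truncated number of features (assumed)
spectrum = np.arange(1, p + 1, dtype=float) ** -1.5   # polynomial eigen-decay (exponent assumed)
beta = rng.normal(size=p) * np.sqrt(spectrum)         # target aligned with the spectrum
lam = 1e-3                                            # ridge parameter (illustrative)

for n in [50, 100, 200, 400]:
    # Whitened non-Gaussian features (random signs), scaled by the spectrum.
    Z_tr = rng.choice([-1.0, 1.0], size=(n, p)) * np.sqrt(spectrum)
    Z_te = rng.choice([-1.0, 1.0], size=(1000, p)) * np.sqrt(spectrum)
    # Gaussian-equivalent features: standard Gaussians with the same covariance spectrum.
    G_tr = rng.normal(size=(n, p)) * np.sqrt(spectrum)
    G_te = rng.normal(size=(1000, p)) * np.sqrt(spectrum)

    err_z = krr_test_error(Z_tr, Z_tr @ beta, Z_te, Z_te @ beta, lam)
    err_g = krr_test_error(G_tr, G_tr @ beta, G_te, G_te @ beta, lam)
    print(f"n={n:4d}  non-Gaussian features: {err_z:.4f}   Gaussian features: {err_g:.4f}")
```

Under the GEP, the two printed test errors should track each other closely as the sample size grows, which is the behavior the (hypothetical) setup above is meant to make visible; it is a toy check, not the paper's proof technique.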
