Poster
Regularization Path of Cross-Validation Error Lower Bounds
Atsushi Shibagaki · Yoshiki Suzuki · Masayuki Karasuyama · Ichiro Takeuchi

Tue Dec 8th 07:00 -- 11:59 PM @ 210 C #69

Careful tuning of a regularization parameter is indispensable in many machine learning tasks because it has a significant impact on generalization performance. Nevertheless, current practice of regularization parameter tuning is more of an art than a science; e.g., it is hard to tell how many grid points would be needed in cross-validation (CV) for obtaining a solution with sufficiently small CV error. In this paper we propose a novel framework for computing a lower bound of the CV error as a function of the regularization parameter, which we call the regularization path of CV error lower bounds. The proposed framework provides a theoretical approximation guarantee on a set of solutions, in the sense that it tells how far the CV error of the current best solution could be from the best possible CV error over the entire range of the regularization parameter. We demonstrate through numerical experiments that a choice of regularization parameter that is theoretically guaranteed in the above sense is possible with reasonable computational cost.
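To illustrate the high-level idea, the following is a minimal sketch (not the authors' actual bound, which is derived in the paper): given some interval-wise lower bound on the CV error, one can search the regularization range and prune any interval whose lower bound already rules out improving on the current best CV error by more than a tolerance eps. Both `cv_error` and `interval_lower_bound` below are hypothetical toy stand-ins chosen only so the pruning logic can run.

```python
def cv_error(lam):
    # Hypothetical stand-in for the k-fold CV error at regularization
    # parameter lam: a smooth toy curve with its minimum 0.2 at lam = 1.0.
    return 0.1 * (lam - 1.0) ** 2 + 0.2

def interval_lower_bound(lo, hi):
    # Hypothetical stand-in for an interval-wise lower bound on the CV
    # error over [lo, hi] (the paper derives such bounds from the model;
    # here we simply minimize the toy quadratic over the interval).
    if lo <= 1.0 <= hi:
        return 0.2
    edge = lo if lo > 1.0 else hi
    return 0.1 * (edge - 1.0) ** 2 + 0.2

def guaranteed_search(lo=1e-3, hi=10.0, eps=0.01):
    """Return a lam whose CV error is provably within eps of the best
    achievable CV error over the whole range [lo, hi]."""
    best_lam, best_err = lo, cv_error(lo)
    stack = [(lo, hi)]
    while stack:
        a, b = stack.pop()
        # Prune: no lam in [a, b] can beat best_err by more than eps.
        if interval_lower_bound(a, b) >= best_err - eps:
            continue
        mid = 0.5 * (a + b)
        err = cv_error(mid)
        if err < best_err:
            best_lam, best_err = mid, err
        if b - a > 1e-4:  # keep subdividing until intervals are tiny
            stack.extend([(a, mid), (mid, b)])
    return best_lam, best_err
```

On the toy curve above, the search terminates after evaluating only a handful of grid points, yet returns a solution whose CV error is certified to be within eps of the global minimum over the range; the paper's contribution is constructing valid lower bounds of this kind for real CV error curves.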

Author Information

Atsushi Shibagaki (Nagoya Institute of Technology)
Yoshiki Suzuki (Nagoya Institute of Technology)
Masayuki Karasuyama (Nagoya Institute of Technology)
Ichiro Takeuchi (Nagoya Institute of Technology)
