

Poster

PAC-Bayes bounds for stable algorithms with instance-dependent priors

Omar Rivasplata · Emilio Parrado-Hernandez · John Shawe-Taylor · Shiliang Sun · Csaba Szepesvari

Room 210 #56

Keywords: [ Kernel Methods ] [ Learning Theory ] [ Model Selection and Structure Learning ] [ Theory ] [ Classification ]


Abstract:

PAC-Bayes bounds have been proposed for obtaining risk estimates based on a training sample. In this paper, the PAC-Bayes approach is combined with the stability of the hypothesis learned by a Hilbert-space-valued algorithm. The PAC-Bayes setting is used with a Gaussian prior centered at the expected output. A novelty of our paper is thus the use of priors defined in terms of the data-generating distribution. Our main result estimates the risk of the randomized algorithm in terms of the hypothesis stability coefficients. We also provide a new bound for the SVM classifier, which we compare experimentally to other known bounds. Ours appears to be the first uniform hypothesis stability-based bound that evaluates to non-trivial values.
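For context, the following is a minimal sketch of how a bound of this flavor typically looks: the standard PAC-Bayes-kl inequality, with the KL term specialized to unit-variance Gaussian prior and posterior on the Hilbert space. This is a generic illustration of the setting described in the abstract, not the paper's exact theorem; the symbols $Q_0$, $Q$, $w_0$, $w$, $n$, $\delta$ are assumed notation introduced here.

% Illustrative PAC-Bayes-kl bound (generic form; not the paper's exact statement).
% With probability at least 1 - \delta over an i.i.d. sample of size n,
% simultaneously for all posteriors Q over hypotheses:
\[
  \mathrm{kl}\!\left(\widehat{R}(Q) \,\middle\|\, R(Q)\right)
  \;\le\;
  \frac{\mathrm{KL}(Q \,\|\, Q_0) + \ln\frac{2\sqrt{n}}{\delta}}{n},
  \qquad
  \mathrm{kl}(p \,\|\, q) = p\ln\frac{p}{q} + (1-p)\ln\frac{1-p}{1-q}.
\]
% If the prior and posterior are unit-variance Gaussians on the Hilbert space,
% Q_0 = N(w_0, I) and Q = N(w, I), the KL term reduces to a squared distance:
\[
  \mathrm{KL}\big(\mathcal{N}(w, I) \,\|\, \mathcal{N}(w_0, I)\big)
  = \tfrac{1}{2}\,\lVert w - w_0 \rVert^2 .
\]

Under this illustrative form, centering the prior at the expected output of the algorithm (as the abstract describes) means that a stability coefficient controlling the deviation of the learned hypothesis from its expectation directly controls the KL term, and hence the bound.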
