Kernel methods are among the most powerful tools in machine learning for tackling problems expressed in terms of function values and derivatives, owing to their capability to represent and model complex relations. While these methods are highly versatile, they are computationally intensive and scale poorly to large data, as they require operations on Gram matrices. To mitigate this serious computational limitation, randomized constructions have recently been proposed in the literature that allow the application of fast linear algorithms. Random Fourier features (RFF) are among the most popular and widely applied constructions: they provide an easily computable, low-dimensional feature representation for shift-invariant kernels. Despite the popularity of RFFs, very little is understood theoretically about their approximation quality. In this paper, we provide a detailed finite-sample theoretical analysis of the approximation quality of RFFs by (i) establishing optimal (in terms of the RFF dimension and growing set size) performance guarantees in uniform norm, and (ii) presenting guarantees in L^r (1 ≤ r < ∞) norms. We also propose an RFF approximation to derivatives of a kernel, with a theoretical study of its approximation quality.
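For readers unfamiliar with the construction analyzed here, the following is a minimal NumPy sketch (not the authors' code) of the standard RFF map for the Gaussian kernel, whose spectral measure under Bochner's theorem is a Gaussian. The function name rff_features, the bandwidth sigma, and the feature count are illustrative choices only; the sanity check at the end compares the feature inner products to the exact Gram matrix.

```python
import numpy as np

def rff_features(X, num_features, sigma=1.0, seed=None):
    """Random Fourier feature map z(x) with z(x)·z(y) ≈ exp(-||x-y||^2 / (2*sigma^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Bochner's theorem: the Gaussian kernel's spectral distribution is N(0, sigma^{-2} I).
    W = rng.normal(scale=1.0 / sigma, size=(d, num_features))
    # Random phases make a single cosine per frequency an unbiased kernel estimator.
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

# Sanity check: the RFF Gram matrix approximates the exact Gaussian Gram matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Z = rff_features(X, num_features=2000, sigma=1.0, seed=1)
K_approx = Z @ Z.T
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K_exact = np.exp(-sq_dists / 2.0)
print("max abs error:", np.abs(K_approx - K_exact).max())
```

The paper's contribution concerns exactly the error printed above: how fast the (uniform and L^r) discrepancy between the approximate and exact kernel decays as the number of random features grows.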
Author Information
Bharath Sriperumbudur (The Pennsylvania State University)
Zoltan Szabo (Gatsby Unit, UCL)
More from the Same Authors
- 2015 Poster: Bayesian Manifold Learning: The Locally Linear Latent Variable Model (LL-LVM)
  Mijung Park · Wittawat Jitkrittum · Ahmad Qamar · Zoltan Szabo · Lars Buesing · Maneesh Sahani
- 2015 Poster: Gradient-free Hamiltonian Monte Carlo with Efficient Kernel Exponential Families
  Heiko Strathmann · Dino Sejdinovic · Samuel Livingstone · Zoltan Szabo · Arthur Gretton
- 2015 Poster: Optimal Rates for Random Fourier Features
  Bharath Sriperumbudur · Zoltan Szabo
- 2014 Workshop: Modern Nonparametrics 3: Automating the Learning Pipeline
  Eric Xing · Mladen Kolar · Arthur Gretton · Samory Kpotufe · Han Liu · Zoltán Szabó · Alan Yuille · Andrew G Wilson · Ryan Tibshirani · Sasha Rakhlin · Damian Kozbur · Bharath Sriperumbudur · David Lopez-Paz · Kirthevasan Kandasamy · Francesco Orabona · Andreas Damianou · Wacha Bounliphone · Yanshuai Cao · Arijit Das · Yingzhen Yang · Giulia DeSalvo · Dmitry Storcheus · Roberto Valerio
- 2014 Poster: Kernel Mean Estimation via Spectral Filtering
  Krikamol Muandet · Bharath Sriperumbudur · Bernhard Schölkopf
- 2012 Poster: Optimal kernel choice for large-scale two-sample tests
  Arthur Gretton · Bharath Sriperumbudur · Dino Sejdinovic · Heiko Strathmann · Sivaraman Balakrishnan · Massimiliano Pontil · Kenji Fukumizu
- 2011 Poster: Learning in Hilbert vs. Banach Spaces: A Measure Embedding Viewpoint
  Bharath Sriperumbudur · Kenji Fukumizu · Gert Lanckriet
- 2009 Poster: Kernel Choice and Classifiability for RKHS Embeddings of Probability Distributions
  Bharath Sriperumbudur · Kenji Fukumizu · Arthur Gretton · Gert Lanckriet · Bernhard Schölkopf
- 2009 Oral: Kernel Choice and Classifiability for RKHS Embeddings of Probability Distributions
  Bharath Sriperumbudur · Kenji Fukumizu · Arthur Gretton · Gert Lanckriet · Bernhard Schölkopf
- 2009 Poster: On the Convergence of the Concave-Convex Procedure
  Bharath Sriperumbudur · Gert Lanckriet
- 2009 Poster: A Fast, Consistent Kernel Two-Sample Test
  Arthur Gretton · Kenji Fukumizu · Zaid Harchaoui · Bharath Sriperumbudur
- 2009 Spotlight: A Fast, Consistent Kernel Two-Sample Test
  Arthur Gretton · Kenji Fukumizu · Zaid Harchaoui · Bharath Sriperumbudur
- 2008 Poster: Characteristic Kernels on Groups and Semigroups
  Kenji Fukumizu · Bharath Sriperumbudur · Arthur Gretton · Bernhard Schölkopf
- 2008 Oral: Characteristic Kernels on Groups and Semigroups
  Kenji Fukumizu · Bharath Sriperumbudur · Arthur Gretton · Bernhard Schölkopf