Many problems in machine learning, statistics, and other fields rely on computing eigenvectors. For large-scale problems, these eigenvectors are typically computed via iterative schemes such as subspace iteration or Krylov methods. While classical and comprehensive analyses provide subspace convergence guarantees with respect to the spectral norm, in many modern applications other notions of subspace distance are more appropriate. Recent theoretical work has focused on perturbations of subspaces measured in the ℓ2→∞ norm, but does not consider the actual computation of eigenvectors. Here we address the convergence of subspace iteration when distances are measured in the ℓ2→∞ norm and provide deterministic bounds. We complement our analysis with a practical stopping criterion and demonstrate its applicability via numerical experiments. Our results show that one can obtain comparable performance on downstream tasks while requiring fewer iterations, thereby saving substantial computational time.
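To make the setup concrete, here is a minimal NumPy sketch of subspace iteration with a stopping test measured in the ℓ2→∞ norm (the largest Euclidean norm among a matrix's rows). The function names and the specific test used here (comparing consecutive iterates after orthogonal Procrustes alignment) are illustrative assumptions, not the paper's exact stopping criterion.

```python
import numpy as np

def two_to_inf(A):
    """ℓ2→∞ norm of a matrix: the largest Euclidean norm among its rows."""
    return np.max(np.linalg.norm(A, axis=1))

def subspace_iteration(A, k, tol=1e-6, max_iter=500, seed=0):
    """Orthogonal (subspace) iteration for the dominant k-dimensional
    invariant subspace of a symmetric matrix A.

    Stops when consecutive iterates, aligned by the orthogonal Procrustes
    rotation, agree to within `tol` in the ℓ2→∞ norm. (Illustrative
    criterion; see the paper for the criterion it proposes.)
    """
    rng = np.random.default_rng(seed)
    V, _ = np.linalg.qr(rng.standard_normal((A.shape[0], k)))
    for it in range(1, max_iter + 1):
        W, _ = np.linalg.qr(A @ V)  # power step, then re-orthonormalization
        # Best orthogonal alignment Q of W to V: argmin_Q ||W Q - V||_F,
        # given by Q = U Vt where W.T @ V = U diag(s) Vt.
        U, _, Vt = np.linalg.svd(W.T @ V)
        err = two_to_inf(W @ (U @ Vt) - V)
        V = W
        if err < tol:
            break
    return V, it

# Usage: top-3 invariant subspace of a random symmetric matrix.
rng = np.random.default_rng(1)
B = rng.standard_normal((200, 200))
A = (B + B.T) / 2
V, iters = subspace_iteration(A, k=3)
print(f"stopped after {iters} iterations; residual = "
      f"{two_to_inf(A @ V - V @ (V.T @ A @ V)):.2e}")
```

Procrustes alignment is needed because an invariant subspace only determines its orthonormal basis up to a k×k rotation, so raw differences between iterates are not meaningful; the ℓ2→∞ norm then controls the worst-case error in any single row (e.g., any single node's embedding), which is the notion of accuracy the abstract targets.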
Author Information
Vasileios Charisopoulos (Cornell University)
Austin Benson (Cornell University)
Anil Damle (Cornell University)
More from the Same Authors
- 2022: A Stochastic Prox-Linear Method for CVaR Minimization (Si Yi Meng · Vasileios Charisopoulos · Robert Gower)
- 2022 Poster: Model Preserving Compression for Neural Networks (Jerry Chee · Megan Flynn (née Renz) · Anil Damle · Christopher De Sa)
- 2022 Poster: Communication-efficient distributed eigenspace estimation with arbitrary node failures (Vasileios Charisopoulos · Anil Damle)
- 2022 Poster: Understanding Non-linearity in Graph Neural Networks from the Bayesian-Inference Perspective (Rongzhe Wei · Haoteng YIN · Junteng Jia · Austin Benson · Pan Li)
- 2021 Poster: Approximate Decomposable Submodular Function Minimization for Cardinality-Based Components (Nate Veldt · Austin Benson · Jon Kleinberg)
- 2020 Poster: Better Set Representations For Relational Reasoning (Qian Huang · Horace He · Abhay Singh · Yan Zhang · Ser Nam Lim · Austin Benson)
- 2020 Poster: Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization (Geoff Pleiss · Martin Jankowiak · David Eriksson · Anil Damle · Jacob Gardner)
- 2019 Poster: Neural Jump Stochastic Differential Equations (Junteng Jia · Austin Benson)
- 2018 Poster: Found Graph Data and Planted Vertex Covers (Austin Benson · Jon Kleinberg)
- 2016 Poster: General Tensor Spectral Co-clustering for Higher-Order Data (Tao Wu · Austin Benson · David Gleich)
- 2015: Beyond Nodes and Edges: Multiresolution Models of Complex Networks (Austin Benson)
- 2014 Poster: Scalable Methods for Nonnegative Matrix Factorizations of Near-separable Tall-and-skinny Matrices (Austin Benson · Jason D Lee · Bartek Rajwa · David F Gleich)
- 2014 Spotlight: Scalable Methods for Nonnegative Matrix Factorizations of Near-separable Tall-and-skinny Matrices (Austin Benson · Jason D Lee · Bartek Rajwa · David F Gleich)