Poster
The Scaling Limit of High-Dimensional Online Independent Component Analysis
Chuang Wang · Yue Lu
Pacific Ballroom #217
Keywords: [ Learning Theory ] [ Online Learning ] [ Stochastic Methods ] [ Large Deviations and Asymptotic Analysis ] [ Unsupervised Learning ] [ Non-Convex Optimization ] [ Signal Processing ] [ Source Separation ] [ Statistical Physics of Learning ]
We analyze the dynamics of an online algorithm for independent component analysis in the high-dimensional scaling limit. As the ambient dimension tends to infinity, and with proper time scaling, we show that the time-varying joint empirical measure of the target feature vector and the estimates provided by the algorithm converges weakly to a deterministic measure-valued process that can be characterized as the unique solution of a nonlinear PDE. Numerical solutions of this PDE, which involves two spatial variables and one time variable, can be obtained efficiently. These solutions provide detailed information about the performance of the ICA algorithm, since many practical performance metrics are functionals of the joint empirical measure. Numerical simulations show that our asymptotic analysis is accurate even for moderate dimensions. In addition to providing a tool for predicting the performance of the algorithm, our PDE analysis also yields a useful insight: in the high-dimensional limit, the original coupled dynamics associated with the algorithm asymptotically “decouple”, with each coordinate independently solving a 1-D effective minimization problem via stochastic gradient descent. Exploiting this insight to design new algorithms that achieve optimal trade-offs between computational and statistical efficiency is an interesting direction for future research.
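To make the “decoupling” picture concrete, the following is a minimal sketch of a single-component online ICA iteration in a rank-one spiked model. The data model (y = c·ξ + Gaussian noise, with a binary ±1 source c), the cubic nonlinearity f(u) = u³, and the step size τ are illustrative assumptions rather than the paper's exact setup; what the sketch shares with the analysis is the scaling regime, in which each streaming sample triggers one stochastic gradient step of size τ/n, so that k steps correspond to rescaled time t = k/n.

import numpy as np

# Illustrative sketch (assumed model, not the paper's exact algorithm):
# each sample is y = c*xi + a, with c = +/-1 a non-Gaussian source, xi the
# hidden unit-norm feature vector, and a standard Gaussian noise. The
# estimate x takes one stochastic gradient step per sample on the
# fourth-moment contrast F(u) = u^4/4 (so f(u) = u^3), then is projected
# back onto the unit sphere.

rng = np.random.default_rng(0)
n, tau, T = 2000, 0.5, 10.0                         # dimension, step size, rescaled time horizon

xi = rng.choice([-1.0, 1.0], size=n) / np.sqrt(n)   # hidden feature, ||xi|| = 1
x = rng.standard_normal(n)
x /= np.linalg.norm(x)                              # initial estimate on the unit sphere

for k in range(int(n * T)):                         # k steps <-> rescaled time t = k/n
    c = rng.choice([-1.0, 1.0])                     # non-Gaussian source sample
    y = c * xi + rng.standard_normal(n)             # one streaming observation
    x += (tau / n) * (y @ x) ** 3 * y               # SGD step with f(u) = u^3
    x /= np.linalg.norm(x)                          # project back to the sphere
    if (k + 1) % (2 * n) == 0:
        print(f"t = {(k + 1) / n:4.1f}   overlap xi.x = {xi @ x:+.3f}")

Under these scalings the printed overlap ξ·x first escapes its O(1/√n) initial value and then saturates near ±1; such performance metrics are exactly the kind of functionals of the joint empirical measure that the limiting PDE characterizes.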