A multilinear subspace regression model based on so-called latent variable decomposition is introduced. Unlike standard regression methods, which typically employ matrix (2D) data representations followed by vector subspace transformations, the proposed approach uses tensor subspace transformations to model latent variables common to both the independent and the dependent data. The approach aims to maximize the correlation between the derived latent variables and is shown to be suitable for predicting multidimensional dependent data from multidimensional independent data; the latent variables are estimated by an algorithm based on the Multilinear Singular Value Decomposition (MSVD) of a specially defined cross-covariance tensor. It is then shown that this framework also unifies the existing Partial Least Squares (PLS) and N-way PLS regression algorithms. Simulations on benchmark synthetic data confirm the advantages of the proposed approach in terms of predictive ability and robustness, especially for small sample sizes. The potential of the technique is further illustrated on a real-world task: decoding human intracranial electrocorticogram (ECoG) from simultaneously recorded scalp electroencephalogram (EEG).
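The following is a minimal sketch of the recipe described in the abstract, not the authors' reference implementation: form a cross-covariance tensor between tensor-valued independent and dependent data, take a truncated multilinear SVD (HOSVD) of it to obtain orthogonal mode loadings, project both data tensors onto the resulting subspaces, and regress in the latent space. The shapes, ranks, synthetic data, and the plain least-squares step in the latent space are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): MSVD of a cross-covariance tensor
# to obtain tensor-subspace loadings, then regression between latent variables.
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd_factors(C, ranks):
    """Leading left singular vectors of each mode unfolding (truncated HOSVD)."""
    return [np.linalg.svd(unfold(C, n), full_matrices=False)[0][:, :r]
            for n, r in enumerate(ranks)]

rng = np.random.default_rng(0)
N, I1, I2, J1, J2 = 50, 8, 6, 5, 4          # samples and tensor mode sizes (assumed)
X = rng.standard_normal((N, I1, I2))        # independent tensor data
W = rng.standard_normal((I1 * I2, J1 * J2)) # hidden linear map used to synthesize Y
Y = (X.reshape(N, -1) @ W).reshape(N, J1, J2) + 0.1 * rng.standard_normal((N, J1, J2))

# Cross-covariance tensor over the sample mode, shape (I1, I2, J1, J2).
C = np.einsum('nab,ncd->abcd', X - X.mean(0), Y - Y.mean(0)) / (N - 1)

# Orthogonal loadings for the X modes and the Y modes from the MSVD of C.
ranks = (3, 3, 3, 3)                        # latent subspace ranks (assumed)
P1, P2, Q1, Q2 = hosvd_factors(C, ranks)

# Latent variables: project each sample onto the tensor subspaces.
TX = np.einsum('nab,ai,bj->nij', X, P1, P2).reshape(N, -1)
TY = np.einsum('ncd,ck,dl->nkl', Y, Q1, Q2).reshape(N, -1)

# Regress the dependent latent variables on the independent ones,
# then map back to the original Y space for prediction.
B, *_ = np.linalg.lstsq(TX, TY, rcond=None)
Y_hat = np.einsum('nkl,ck,dl->ncd', (TX @ B).reshape(N, ranks[2], ranks[3]), Q1, Q2)
print('relative prediction error:', np.linalg.norm(Y_hat - Y) / np.linalg.norm(Y))
```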
Author Information
Qibin Zhao (RIKEN AIP)
Cesar F Caiafa (CONICET/UBA)
Danilo Mandic (Imperial College London)
Liqing Zhang (Shanghai Jiao Tong University)
Tonio Ball (Albert-Ludwigs-University)
Andreas Schulze-Bonhage (Albert-Ludwigs-University)
Andrzej S CICHOCKI (RIKEN Brain Science Institute)
Related Events (a corresponding poster, oral, or spotlight)
- 2011 Poster: A Multilinear Subspace Regression Method Using Orthogonal Tensors Decompositions
  Wed Dec 14th 04:45 -- 10:59 PM
More from the Same Authors
- 2020 Workshop: First Workshop on Quantum Tensor Networks in Machine Learning
  Xiao-Yang Liu · Qibin Zhao · Jacob Biamonte · Cesar F Caiafa · Paul Pu Liang · Nadav Cohen · Stefan Leichenauer
- 2020 Poster: Reciprocal Adversarial Learning via Characteristic Functions
  Shengxi Li · Zeyang Yu · Min Xiang · Danilo Mandic
- 2020 Spotlight: Reciprocal Adversarial Learning via Characteristic Functions
  Shengxi Li · Zeyang Yu · Min Xiang · Danilo Mandic
- 2020 Poster: Understanding Anomaly Detection with Deep Invertible Networks through Hierarchies of Distributions and Features
  Robin Schirrmeister · Yuxuan Zhou · Tonio Ball · Dan Zhang
- 2019 Poster: Deep Multimodal Multilinear Fusion with High-order Polynomial Pooling
  Ming Hou · Jiajia Tang · Jianhai Zhang · Wanzeng Kong · Qibin Zhao
- 2019 Poster: Learning Macroscopic Brain Connectomes via Group-Sparse Factorization
  Farzane Aminmansour · Andrew Patterson · Lei Le · Yisu Peng · Daniel Mitchell · Franco Pestilli · Cesar F Caiafa · Russell Greiner · Martha White
- 2017 Spotlight: Tensor encoding and decomposition of brain connectomes with application to tractography evaluation
  Cesar F Caiafa · Olaf Sporns · Andrew Saykin · Franco Pestilli
- 2017 Poster: Unified representation of tractography and diffusion-weighted MRI data using sparse multidimensional arrays
  Cesar F Caiafa · Olaf Sporns · Andrew Saykin · Franco Pestilli
- 2011 Demonstration: Contour-Based Large Scale Image Retrieval Platform
  Rong Zhou · Liqing Zhang
- 2008 Poster: Dynamic Visual Attention: Searching for coding length increments
  Xiaodi Hou · Liqing Zhang
- 2008 Spotlight: Dynamic Visual Attention: Searching for coding length increments
  Xiaodi Hou · Liqing Zhang
- 2007 Spotlight: Measuring Neural Synchrony by Message Passing
  Justin Dauwels · François Vialatte · Tomasz M Rutkowski · Andrzej S CICHOCKI
- 2007 Poster: Measuring Neural Synchrony by Message Passing
  Justin Dauwels · François Vialatte · Tomasz M Rutkowski · Andrzej S CICHOCKI