Tensor Wheel Decomposition and Its Tensor Completion Application
Zhong-Cheng Wu · Ting-Zhu Huang · Liang-Jian Deng · Hong-Xia Dou · Deyu Meng


Recently, tensor network (TN) decompositions have gained prominence in computer vision and delivered promising results in high-order data recovery tasks. However, current TN models tend toward increasingly intricate structures in pursuit of incremental improvements, which sharply increases the number of ranks and makes hyper-parameter selection laborious, especially in higher-order cases. In this paper, we propose a novel TN decomposition, dubbed tensor wheel (TW) decomposition, in which a high-order tensor is represented by a set of latent factors mapped onto a specific wheel topology. The decomposition is constructed by analyzing the graph structure, aiming to characterize the complex interactions within the target tensor more accurately while keeping the hyper-parameter scale low, thereby theoretically alleviating the above deficiencies. Furthermore, to investigate the potential of TW decomposition, we present one numerical application, tensor completion (TC), and develop an efficient proximal alternating minimization-based solving algorithm with guaranteed convergence. Experimental results show that the proposed method significantly outperforms other state-of-the-art tensor decomposition-based methods on synthetic and real-world data, demonstrating the merits of TW decomposition. The code is available at: https://github.com/zhongchengwu/code_TWDec.
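To make the wheel topology concrete, the sketch below reconstructs a third-order tensor from a tensor-wheel-style network using NumPy. It is an illustration under our reading of the abstract, not the authors' implementation: the factor shapes, the ring ranks `R`, the hub ranks `L`, and the specific `einsum` contraction are assumptions for a toy third-order case.

```python
import numpy as np

# Illustrative sketch (not the authors' code): a wheel network has N "rim"
# factors arranged in a ring plus one central "hub" core connected to every
# rim factor. Assumed shapes: each rim factor G_k is (R, I_k, L, R), where
# the R-edges close the ring and the L-edges attach to the hub core C of
# shape (L, L, L). All sizes here are made-up toy values.
rng = np.random.default_rng(0)
I, R, L = (4, 5, 6), 2, 2  # mode sizes, ring rank, hub rank (toy values)

G1 = rng.standard_normal((R, I[0], L, R))
G2 = rng.standard_normal((R, I[1], L, R))
G3 = rng.standard_normal((R, I[2], L, R))
C = rng.standard_normal((L, L, L))  # central hub core

# Contract the wheel: indices a/c/e close the ring between neighboring
# rim factors; indices b/d/f connect each rim factor to the hub core.
T = np.einsum('aibc,cjde,ekfa,bdf->ijk', G1, G2, G3, C)
print(T.shape)  # (4, 5, 6)
```

Compared with a plain tensor-ring contraction, the extra hub indices let every pair of rim factors interact through the central core, which is one way to read the abstract's claim of "more accurately characterizing the complex interactions" while the per-edge ranks stay small.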

Author Information

Zhong-Cheng Wu (University of Electronic Science and Technology of China)
Ting-Zhu Huang (School of Mathematical Sciences, University of Electronic Science and Technology of China)
Liang-Jian Deng (University of Electronic Science and Technology of China)
Hong-Xia Dou (Xihua University)
Deyu Meng (Xi'an Jiaotong University)