Is Rank Minimization of the Essence to Learn Tensor Network Structure?
Chao Li · Qibin Zhao

Structure learning for tensor network (TN) representation is the task of selecting the optimal network topology for TN contraction to fit a given tensor. In the existing literature, and in the view of many tensor researchers, this task is widely considered identical to learning the tensor network (model) ranks. In this manuscript, we briefly analyze the relationship between these two critical tasks in a rigorous fashion, showing that rank minimization is in fact a subtopic of TN structure learning, one in which the graph essence of TN structures is ignored. On the one hand, we agree that the two tasks are identical if the TN structure is not constrained on the graph. On the other hand, we propose an open problem in structure learning called permutation learning, i.e., learning the optimal matching between tensor modes and vertices in the TN, and point out that rank minimization fails at structure learning in this case, owing to its inability to explore graph spaces. Finally, we focus on permutation learning and give several preliminary results that help in understanding this open problem.
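To make the permutation-learning point concrete, here is a minimal NumPy sketch (an illustration of the idea, not code from the paper). It builds a small 3-mode tensor whose entries do not depend on the middle mode, then checks the rank of the matricization obtained under each permutation of the modes. The achievable rank depends on which mode is matched to the "row" position, so a method that only searches over ranks for one fixed mode ordering cannot discover the favorable matching between modes and network vertices:

```python
import itertools

import numpy as np

# A 3-mode tensor with hidden structure: T[i, j, k] = A[i, k],
# i.e. T is constant along mode j.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
T = np.stack([A] * 4, axis=1)  # shape (4, 4, 4)

# For each permutation of the modes, unfold the tensor along its (new)
# first mode and record the matrix rank of that unfolding.
ranks = {}
for perm in itertools.permutations(range(3)):
    M = np.transpose(T, perm).reshape(T.shape[perm[0]], -1)
    ranks[perm] = int(np.linalg.matrix_rank(M))

# Permutations that put mode j first yield identical rows, hence rank 1;
# the remaining permutations yield rank equal to rank(A) = 4.
print(ranks)
```

The two groupings carry very different ranks for the *same* tensor, which is exactly why searching over graph structures (here, over mode permutations) is a distinct problem from minimizing ranks under a fixed structure.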

Author Information

Chao Li (RIKEN Center for Advanced Intelligence Project)
Qibin Zhao (RIKEN AIP)
