
Poster
Fast Tucker Rank Reduction for Non-Negative Tensors Using Mean-Field Approximation
Kazu Ghalamkari · Mahito Sugiyama

Thu Dec 09 12:30 AM -- 02:00 AM (PST) @ Virtual

We present an efficient low-rank approximation algorithm for non-negative tensors. The algorithm is derived from our two findings: First, we show that rank-1 approximation for tensors can be viewed as a mean-field approximation by treating each tensor as a probability distribution. Second, we theoretically provide a sufficient condition for distribution parameters to reduce Tucker ranks of tensors; interestingly, this sufficient condition can be achieved by iterative application of the mean-field approximation. Since the mean-field approximation is always given as a closed formula, our findings lead to a fast low-rank approximation algorithm without using a gradient method. We empirically demonstrate that our algorithm is faster than the existing non-negative Tucker rank reduction methods and achieves competitive or better approximation of given tensors.
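The first finding can be illustrated concretely: when a non-negative tensor is normalized into a joint probability distribution, its mean-field (fully factorized) approximation is the outer product of the mode-wise marginal distributions, which is a rank-1 tensor available in closed form with no gradient steps. The following is a minimal sketch of that idea; the function name and the NumPy formulation are my own illustration, not the authors' reference implementation.

```python
import numpy as np
from functools import reduce

def rank1_mean_field(T):
    """Closed-form rank-1 approximation of a non-negative tensor T.

    Sketch of the mean-field view: normalize T into a probability
    distribution, take the marginal distribution along each mode, and
    form their outer product, rescaled by the original total mass.
    (Illustrative code, not the paper's reference implementation.)
    """
    total = T.sum()
    P = T / total  # treat the tensor as a joint probability distribution
    # marginal distribution of each mode i (sum out all other modes)
    marginals = [P.sum(axis=tuple(j for j in range(P.ndim) if j != i))
                 for i in range(P.ndim)]
    # mean-field approximation: outer product of the marginals
    return total * reduce(np.multiply.outer, marginals)
```

Note that the approximation preserves every mode-wise marginal of the input tensor, and any mode unfolding of the result is a rank-1 matrix.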

Author Information

Kazu Ghalamkari (National Institute of Informatics / SOKENDAI)

I am engaged in research that develops machine learning theory based on information geometry. My master's background is in physics, so I am intensely interested in the interdisciplinary field between intelligent informatics and physics, and I am thrilled whenever I can describe ideas in machine learning in physics terms. At NeurIPS 2021, I will present a novel tensor decomposition method called Legendre Tucker Rank Reduction (LTR). In this paper, I point out that tensor rank-1 approximation is a mean-field approximation, a widely known physics methodology for reducing many-body problems to one-body problems. More details are on my personal page: gkazu.info

Mahito Sugiyama (National Institute of Informatics)
