Minimax Optimal Alternating Minimization for Kernel Nonparametric Tensor Learning
Taiji Suzuki · Heishiro Kanagawa · Hayato Kobayashi · Nobuyuki Shimizu · Yukihiro Tagami

Wed Dec 07 09:00 AM -- 12:30 PM (PST) @ Area 5+6+7+8 #103

We investigate the statistical performance and computational efficiency of the alternating minimization procedure for nonparametric tensor learning. Tensor modeling has been widely used to capture higher-order relations between multimodal data sources. Beyond the linear model, nonlinear tensor models have received much attention recently because of their high flexibility. We consider an alternating minimization procedure for a general nonlinear model in which the true function consists of components in a reproducing kernel Hilbert space (RKHS). In this paper, we show that the alternating minimization method achieves linear convergence as an optimization algorithm and that the generalization error of the resultant estimator attains minimax optimality. We apply our algorithm to several multitask learning problems and show that the method performs favorably in practice.
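The core idea of the procedure can be illustrated on a toy problem: for a rank-one, two-mode model y ≈ g(u)·h(v) with g and h in an RKHS, one alternately fixes one component and refits the other, where each refit reduces to a weighted kernel ridge regression. The sketch below is only illustrative, assuming a Gaussian kernel, a rank-one model, and hand-picked bandwidth and regularization; it is not the paper's exact algorithm or experimental setting.

```python
import numpy as np

def gauss_kernel(a, b, gamma=1.0):
    # Gaussian (RBF) kernel matrix between 1-D sample vectors a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-gamma * d ** 2)

rng = np.random.default_rng(0)
n = 200
u = rng.uniform(-1, 1, n)
v = rng.uniform(-1, 1, n)
# Hypothetical ground-truth components (kept positive in h to avoid sign flips).
g_true = np.sin(np.pi * u)
h_true = np.cos(np.pi * v) + 2.0
y = g_true * h_true + 0.1 * rng.standard_normal(n)

Ku = gauss_kernel(u, u)
Kv = gauss_kernel(v, v)
lam = 1e-2  # ridge regularization, chosen by hand for this toy example

# Alternating minimization: with h fixed, g solves a weighted kernel ridge
# problem  min_g  sum_i (y_i - h_i g(u_i))^2 + lam ||g||^2.  By the
# representer theorem g = Ku @ alpha with
#   alpha = (diag(h)^2 @ Ku + lam I)^(-1) diag(h) y,
# and symmetrically for h with g fixed.
h = np.ones(n)  # initialization
for _ in range(20):
    alpha = np.linalg.solve((h ** 2)[:, None] * Ku + lam * np.eye(n), h * y)
    g = Ku @ alpha
    beta = np.linalg.solve((g ** 2)[:, None] * Kv + lam * np.eye(n), g * y)
    h = Kv @ beta

pred = g * h  # the product g*h is identifiable even though scale is not
```

Each update is a convex subproblem solved in closed form, which is what makes the per-step cost low; the paper's analysis concerns the linear convergence of such alternating updates and the minimax optimality of the resulting estimator.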

Author Information

Taiji Suzuki (The University of Tokyo/JST-PRESTO/RIKEN)
Heishiro Kanagawa (Gatsby Unit, University College London)
Hayato Kobayashi (Yahoo Japan Corporation)
Nobuyuki Shimizu (Yahoo Japan Corporation)
Yukihiro Tagami (Yahoo Japan Corporation)

More from the Same Authors