In this paper, we propose a matrix-variate normal penalty with sparse inverse covariances to couple multiple tasks. Learning multiple (parametric) models can be viewed as estimating a matrix of parameters, where the rows and columns of the matrix correspond to tasks and features, respectively. Following the matrix-variate normal density, we design a penalty that decomposes the full covariance of matrix elements into the Kronecker product of a row (task) covariance and a column (feature) covariance, which characterizes both task relatedness and feature representation. Several recently proposed methods are variants or special cases of this formulation. To address overfitting and to select meaningful task and feature structures, we incorporate sparse covariance selection into our matrix-normal regularization via L1 penalties on the task and feature inverse covariances. We empirically study the proposed method and compare it with related models on two real-world problems: detecting landmines in multiple fields and recognizing faces across different subjects. Experimental results show that the proposed framework provides an effective and flexible way to model various structures of multiple tasks.
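For readers who want the regularizer in symbols, the following is a minimal LaTeX sketch consistent with the description above; the notation (W, Omega, Lambda, the tuning weights lambda and gamma) and the exact constants are assumptions for illustration and may differ from the paper.

% Sketch (assumed notation): W is the m-by-p parameter matrix (m tasks, p features);
% Sigma_r and Sigma_c are the row (task) and column (feature) covariances;
% Omega = Sigma_r^{-1} and Lambda = Sigma_c^{-1}; lambda, gamma_1, gamma_2 are tuning weights.
\begin{align*}
  \mathrm{vec}(W) &\sim \mathcal{N}\!\big(0,\; \Sigma_c \otimes \Sigma_r\big)
    \quad\text{(Kronecker-structured full covariance)},\\
  \mathrm{penalty}(W,\Omega,\Lambda)
    &= \lambda\Big[\operatorname{tr}\!\big(\Omega\, W\, \Lambda\, W^{\top}\big)
       - p\,\log\lvert\Omega\rvert - m\,\log\lvert\Lambda\rvert\Big]
       + \gamma_1 \lVert\Omega\rVert_1 + \gamma_2 \lVert\Lambda\rVert_1 .
\end{align*}

The first line records the Kronecker decomposition of the full covariance mentioned in the abstract; the bracketed term is (up to constants) the negative log matrix-normal density, and the final L1 terms perform sparse covariance selection on the task and feature inverse covariances.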
Author Information
Yi Zhang (Carnegie Mellon University)
Jeff Schneider (Carnegie Mellon University)
More from the Same Authors
- 2019 Poster: Offline Contextual Bayesian Optimization (Ian Char · Youngseog Chung · Willie Neiswanger · Kirthevasan Kandasamy · Oak Nelson · Mark Boyer · Egemen Kolemen · Jeff Schneider)
- 2018 Poster: Neural Architecture Search with Bayesian Optimisation and Optimal Transport (Kirthevasan Kandasamy · Willie Neiswanger · Jeff Schneider · Barnabas Poczos · Eric Xing)
- 2018 Spotlight: Neural Architecture Search with Bayesian Optimisation and Optimal Transport (Kirthevasan Kandasamy · Willie Neiswanger · Jeff Schneider · Barnabas Poczos · Eric Xing)
- 2016 Poster: The Multi-fidelity Multi-armed Bandit (Kirthevasan Kandasamy · Gautam Dasarathy · Barnabas Poczos · Jeff Schneider)
- 2016 Poster: Gaussian Process Bandit Optimisation with Multi-fidelity Evaluations (Kirthevasan Kandasamy · Gautam Dasarathy · Junier B Oliva · Jeff Schneider · Barnabas Poczos)
- 2014 Poster: Flexible Transfer Learning under Support and Model Shift (Xuezhi Wang · Jeff Schneider)
- 2013 Poster: Learning Hidden Markov Models from Non-sequence Data via Tensor Decomposition (Tzu-Kuo Huang · Jeff Schneider)
- 2013 Poster: Σ-Optimality for Active Learning on Gaussian Random Fields (Yifei Ma · Roman Garnett · Jeff Schneider)
- 2011 Poster: Group Anomaly Detection using Flexible Genre Models (Liang Xiong · Barnabas Poczos · Jeff Schneider)
- 2011 Poster: Learning Auto-regressive Models from Sequence and Non-sequence Data (Tzu-Kuo Huang · Jeff Schneider)
- 2008 Poster: Learning the Semantic Correlation: An Alternative Way to Gain from Unlabeled Text (Yi Zhang · Jeff Schneider · Artur Dubrawski)