Variational Multi-Task Learning with Gumbel-Softmax Priors
Jiayi Shen · Xiantong Zhen · Marcel Worring · Ling Shao

Thu Dec 09 12:30 AM -- 02:00 AM (PST)

Multi-task learning aims to exploit task relatedness to improve individual tasks, which is of particular significance in the challenging scenario where only limited data are available for each task. To tackle this challenge, we propose variational multi-task learning (VMTL), a general probabilistic inference framework for learning multiple related tasks. We cast multi-task learning as a variational Bayesian inference problem, in which task relatedness is explored in a unified manner by specifying priors. To incorporate shared knowledge into each task, we design the prior of a task to be a learnable mixture of the variational posteriors of other related tasks, with the mixing weights learned by the Gumbel-Softmax technique. In contrast to previous methods, VMTL can exploit task relatedness for both representations and classifiers in a principled way by jointly inferring their posteriors. This enables individual tasks to fully leverage the inductive biases provided by related tasks, thereby improving the overall performance of all tasks. Experimental results demonstrate that the proposed VMTL effectively tackles a variety of challenging multi-task learning settings with limited training data, for both classification and regression. Our method consistently surpasses previous methods, including strong Bayesian approaches, and achieves state-of-the-art performance on five benchmark datasets.
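The core mechanism described in the abstract is the construction of each task's prior as a Gumbel-Softmax-weighted mixture of the other tasks' variational posteriors. The sketch below illustrates this idea in PyTorch; it is a minimal illustration under stated assumptions, not the authors' implementation. The function names are hypothetical, and the diagonal-Gaussian posteriors and the moment-matched Gaussian approximation of the mixture prior are choices made here for concreteness.

import torch
import torch.nn.functional as F

def mixture_prior(post_means, post_logvars, logits, tau=1.0):
    # Build a Gaussian prior for one task by mixing the variational
    # posteriors of the other T-1 tasks with Gumbel-Softmax weights.
    # post_means, post_logvars: (T-1, D) posterior parameters of the
    # other tasks; logits: (T-1,) learnable mixing logits.
    # Differentiable, near-one-hot mixing weights (Gumbel-Softmax trick).
    w = F.gumbel_softmax(logits, tau=tau, hard=False)  # shape (T-1,)
    # Moment-matched Gaussian approximation of the mixture (an
    # assumption of this sketch, not necessarily the paper's choice).
    prior_mean = (w[:, None] * post_means).sum(dim=0)
    second_moment = (w[:, None] * (post_logvars.exp() + post_means ** 2)).sum(dim=0)
    prior_var = second_moment - prior_mean ** 2
    return prior_mean, prior_var.clamp_min(1e-8).log()

def gaussian_kl(mean_q, logvar_q, mean_p, logvar_p):
    # KL(q || p) between diagonal Gaussians; the per-task regularizer
    # in a variational objective of this kind.
    return 0.5 * (
        logvar_p - logvar_q
        + (logvar_q.exp() + (mean_q - mean_p) ** 2) / logvar_p.exp()
        - 1.0
    ).sum()

# Example: 4 tasks, 16-dim posteriors; build task 0's prior from tasks 1-3.
means, logvars = torch.randn(3, 16), torch.zeros(3, 16)
logits = torch.zeros(3, requires_grad=True)
p_mean, p_logvar = mixture_prior(means, logvars, logits, tau=0.5)
kl = gaussian_kl(torch.randn(16), torch.zeros(16), p_mean, p_logvar)

Annealing the temperature tau toward zero pushes the mixing weights toward a hard selection of a single related task while keeping them differentiable, which is what makes the mixture learnable end to end.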

Author Information

Jiayi Shen (University of Amsterdam)
Xiantong Zhen (University of Amsterdam)
Marcel Worring (University of Amsterdam)
Ling Shao (Inception Institute of Artificial Intelligence)
