Multi-task learning aims to exploit task relatedness to improve individual tasks, which is of particular significance in the challenging scenario where only limited data are available for each task. To tackle this challenge, we propose variational multi-task learning (VMTL), a general probabilistic inference framework for learning multiple related tasks. We cast multi-task learning as a variational Bayesian inference problem, in which task relatedness is explored in a unified manner by specifying priors. To incorporate shared knowledge into each task, we design the prior of a task to be a learnable mixture of the variational posteriors of the other related tasks, with mixture weights learned by the Gumbel-Softmax technique. In contrast to previous methods, VMTL exploits task relatedness for both representations and classifiers in a principled way by jointly inferring their posteriors. This enables individual tasks to fully leverage the inductive biases provided by related tasks, thereby improving the overall performance of all tasks. Experimental results demonstrate that the proposed VMTL effectively tackles a variety of challenging multi-task learning settings with limited training data, for both classification and regression. Our method consistently surpasses previous methods, including strong Bayesian approaches, and achieves state-of-the-art performance on five benchmark datasets.
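The core mechanism in the abstract, a task's prior built as a learnable mixture of the other tasks' variational posteriors with Gumbel-Softmax mixture weights, can be sketched as follows. This is a minimal NumPy illustration under assumed Gaussian posteriors, not the authors' implementation; the helper names `gumbel_softmax` and `mixture_prior` are hypothetical:

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Sample relaxed mixture weights from the Gumbel-Softmax (Concrete)
    distribution; lower temperature tau pushes the weights toward one-hot."""
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise via the inverse-CDF transform of uniform samples
    gumbels = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbels) / tau
    y = y - y.max()  # subtract max for numerical stability
    expy = np.exp(y)
    return expy / expy.sum()

def mixture_prior(posterior_means, posterior_vars, logits, tau=1.0, rng=None):
    """Form one task's Gaussian prior by moment-matching a weighted mixture
    of the other tasks' Gaussian variational posteriors.

    posterior_means, posterior_vars: (num_other_tasks, dim) arrays.
    """
    w = gumbel_softmax(logits, tau, rng)           # weights over other tasks
    mean = (w[:, None] * posterior_means).sum(0)   # mixture mean
    # mixture variance = weighted second moment minus squared mixture mean
    second = (w[:, None] * (posterior_vars + posterior_means ** 2)).sum(0)
    var = second - mean ** 2
    return mean, var, w
```

Because the Gumbel-Softmax relaxation is differentiable in the logits, the mixture weights (i.e., how strongly each related task informs the prior) can be learned end-to-end alongside the variational posteriors.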
Author Information
Jiayi Shen (University of Amsterdam)
Xiantong Zhen (University of Amsterdam)
Marcel Worring (University of Amsterdam)
Ling Shao (Inception Institute of Artificial Intelligence)