

Poster

Consistent Multitask Learning with Nonlinear Output Relations

Carlo Ciliberto · Alessandro Rudi · Lorenzo Rosasco · Massimiliano Pontil

Pacific Ballroom #5

Keywords: [ Multitask and Transfer Learning ] [ Structured Prediction ] [ Kernel Methods ]


Abstract:

Key to multitask learning is exploiting the relationships between different tasks to improve prediction performance. Most previous methods have focused on the case where task relations can be modeled as linear operators, for which regularization approaches can be applied successfully. In practice, however, assuming the tasks to be linearly related is often restrictive, and allowing for nonlinear structures is a challenge. In this paper, we tackle this issue by casting the problem within the framework of structured prediction. Our main contribution is a novel algorithm for learning multiple tasks that are related by a system of nonlinear equations which their joint outputs need to satisfy. We show that our algorithm can be efficiently implemented and study its generalization properties, proving universal consistency and learning rates. Our theoretical analysis highlights the benefits of nonlinear multitask learning over learning the tasks independently. Encouraging experimental results show the benefits of the proposed method in practice.
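The core idea of outputs jointly constrained by nonlinear equations can be illustrated with a minimal sketch. This is not the paper's algorithm (which is derived within the structured prediction framework); it only shows, under an assumed toy constraint y1² + y2² = 1, how enforcing a nonlinear output relation differs from learning each task independently:

```python
import numpy as np

# Toy illustration (not the paper's method): two regression tasks whose
# joint outputs must satisfy the nonlinear relation y1^2 + y2^2 = 1,
# i.e. predictions must lie on the unit circle.
rng = np.random.default_rng(0)

# Synthetic data: noisy points on the unit circle.
t = rng.uniform(0, 2 * np.pi, 200)
X = np.column_stack([np.cos(t), np.sin(t)])                     # inputs
Y = X + 0.05 * rng.standard_normal((200, 2))                    # noisy targets

# Step 1: learn each task independently (ridge regression per output),
# ignoring the relation between the two outputs.
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ Y)
Y_hat = X @ W                                                   # unconstrained predictions

# Step 2: enforce the nonlinear output relation by projecting each joint
# prediction onto the constraint set {y : y1^2 + y2^2 = 1}
# (the closest point on the circle is the normalized prediction).
norms = np.linalg.norm(Y_hat, axis=1, keepdims=True)
Y_proj = Y_hat / np.clip(norms, 1e-12, None)

# The projected predictions satisfy the relation exactly, up to float error.
residual = np.abs(np.sum(Y_proj ** 2, axis=1) - 1.0).max()
print(residual < 1e-9)
```

The projection step is only one simple way to couple the tasks' outputs; the paper's contribution is an estimator for which such nonlinear coupling comes with consistency guarantees and learning rates.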
