Workshop
Transfer and Multi-Task Learning: Trends and New Perspectives
Anastasia Pentina · Christoph Lampert · Sinno Jialin Pan · Mingsheng Long · Judy Hoffman · Baochen Sun · Kate Saenko

Sat Dec 12th 08:30 AM -- 06:30 PM @ 514 bc
Event URL: https://sites.google.com/site/tlworkshop2015

This workshop aims to bring together researchers and practitioners from machine learning, computer vision, natural language processing and related fields to discuss and document recent advances in transfer and multi-task learning. This includes the core topics of transfer and multi-task learning, related variants such as domain adaptation and dataset bias, and new discoveries and directions in deep learning-based approaches.

Transfer and multi-task learning methods aim to better exploit the available data during training and adapt previously learned knowledge to new domains or tasks. This mitigates the burden of human labeling for emerging applications and enables learning from very few labeled examples.

In recent years there has been increasing activity in these areas, driven mainly by practical applications (e.g. object recognition, sentiment analysis) as well as state-of-the-art deep learning frameworks (e.g. CNNs). Most recently proposed solutions, especially the deep learning-based approaches, lack theoretical justification. Conversely, most of the theoretically justified approaches are rarely used in practice.

This NIPS 2015 workshop will focus on closing the gap between theory and practice by providing an opportunity for researchers and practitioners to come together, share ideas, and debate current theories and empirical results. The goal is to promote a fruitful exchange of ideas across different communities, advancing the field as a whole.

Tentative topics:
New perspectives or theories on transfer and multi-task learning
Dataset bias and concept drift
Domain adaptation
Multi-task learning
Zero-shot or one-shot learning
Feature based approaches
Instance based approaches
Deep architectures for transfer and multi-task learning
Transferability of deep representations
Transfer across different architectures, e.g. CNN to RNN
Transfer across different modalities, e.g. image to text
Transfer across different tasks, e.g. recognition and detection
Transfer from weakly labeled or noisy data, e.g. Web data
Transfer in practical settings, e.g. online, active, and large-scale learning
Innovative applications, e.g. machine translation, computational biology
Datasets, benchmarks, and open-source packages

08:50 AM Intro and Adapting Deep Networks Across Domains, Modalities, and Tasks (Talk) Trevor Darrell
09:00 AM Learning Shared Representations in MDPs (Talk) Diana Borsa
09:05 AM On Weight Ratio Estimation for Covariate Shift (Talk) Ruth Urner
09:30 AM The Benefit of Multitask Representation Learning (Talk) Massimiliano Pontil
10:30 AM A Theory of Multiple Source Adaptation (Talk) Mehryar Mohri
11:40 AM Learning Representations for Unsupervised and Transfer Learning (Talk) Yoshua Bengio
02:30 PM Domain Adaptation for Binary Classification (Talk) Shai Ben-David
03:00 PM Multitask Generalized Eigenvalue Program (Talk) Boyu Wang
03:30 PM Actor-Mimic (Talk) Emilio Parisotto
05:00 PM Sharing the "How" (and not the "What") (Talk) Percy Liang
05:30 PM Transitive Transfer Learning (Talk) Qiang Yang

Author Information

Anastasia Pentina (IST Austria)
Christoph Lampert (IST Austria)
Sinno Jialin Pan (Nanyang Technological University)
Mingsheng Long (Tsinghua University)
Judy Hoffman (UC Berkeley)
Baochen Sun (University of Massachusetts Lowell)
Kate Saenko (UMass Lowell)