

Poster

Transfer Learning via Minimizing the Performance Gap Between Domains

Boyu Wang · Jorge Mendez · Mingbo Cai · Eric Eaton

East Exhibition Hall B, C #61

Keywords: [ Algorithms ] [ Multitask and Transfer Learning ] [ Boosting and Ensemble Methods ]


Abstract:

We propose a new principle for transfer learning, based on a straightforward intuition: if two domains are similar to each other, the model trained on one domain should also perform well on the other domain, and vice versa. To formalize this intuition, we define the performance gap as a measure of the discrepancy between the source and target domains. We derive generalization bounds for the instance weighting approach to transfer learning, showing that the performance gap can be viewed as an algorithm-dependent regularizer, which controls the model complexity. Our theoretical analysis provides new insight into transfer learning and motivates a set of general, principled rules for designing new instance weighting schemes for transfer learning. These rules lead to gapBoost, a novel and principled boosting approach for transfer learning. Our experimental evaluation on benchmark data sets shows that gapBoost significantly outperforms previous boosting-based transfer learning algorithms.
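The core intuition above — that for similar domains, a model trained on one should also perform well on the other — can be illustrated with a toy empirical proxy. The sketch below is a hypothetical illustration only, not the paper's formal definition of the performance gap: it trains a trivial nearest-centroid classifier on each of two made-up domains and measures how much cross-domain accuracy falls short of in-domain accuracy, symmetrized over both directions.

```python
# Hypothetical sketch of the "performance gap" intuition (not the paper's
# formal measure): compare in-domain vs. cross-domain accuracy of a trivial
# nearest-centroid classifier trained on each domain.

def train_centroids(data):
    # data: list of (features, label); returns per-class mean vectors
    sums, counts = {}, {}
    for x, y in data:
        s = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(centroids, x):
    # assign x to the class with the nearest centroid (squared distance)
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(c, x))
    return min(centroids, key=lambda y: dist2(centroids[y]))

def accuracy(centroids, data):
    return sum(predict(centroids, x) == y for x, y in data) / len(data)

# Two toy domains with the same labels and slightly shifted features
# (entirely made-up data for illustration).
source = [([0.0, 0.0], 0), ([0.2, 0.1], 0), ([1.0, 1.0], 1), ([0.9, 1.1], 1)]
target = [([0.1, 0.0], 0), ([0.3, 0.2], 0), ([1.1, 0.9], 1), ([1.0, 1.2], 1)]

src_model = train_centroids(source)
tgt_model = train_centroids(target)

# Symmetrized drop of cross-domain accuracy relative to in-domain accuracy:
# near zero when the two domains are similar, large when they differ.
gap = ((accuracy(src_model, source) - accuracy(src_model, target))
       + (accuracy(tgt_model, target) - accuracy(tgt_model, source))) / 2
print("empirical gap:", gap)
```

Because the two toy domains here are nearly identical, both classifiers transfer perfectly and the empirical gap comes out to zero, matching the intuition that similar domains yield a small performance gap.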
