Transfer Learning via Minimizing the Performance Gap Between Domains
Boyu Wang · Jorge Mendez · Mingbo Cai · Eric Eaton

Thu Dec 12 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #61

We propose a new principle for transfer learning, based on a straightforward intuition: if two domains are similar to each other, the model trained on one domain should also perform well on the other domain, and vice versa. To formalize this intuition, we define the performance gap as a measure of the discrepancy between the source and target domains. We derive generalization bounds for the instance weighting approach to transfer learning, showing that the performance gap can be viewed as an algorithm-dependent regularizer, which controls the model complexity. Our theoretical analysis provides new insight into transfer learning and motivates a set of general, principled rules for designing new instance weighting schemes for transfer learning. These rules lead to gapBoost, a novel and principled boosting approach for transfer learning. Our experimental evaluation on benchmark data sets shows that gapBoost significantly outperforms previous boosting-based transfer learning algorithms.
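The core intuition can be illustrated with a toy sketch. Note this is not the paper's gapBoost algorithm or its formal definition of the performance gap; it is a hypothetical numpy example that uses a simple nearest-centroid classifier to show how cross-domain performance degrades as two synthetic domains drift apart, which is the discrepancy signal the abstract describes.

```python
# Toy illustration (NOT the paper's gapBoost): estimate a "performance
# gap" between two synthetic domains as the drop from in-domain to
# cross-domain accuracy of a simple nearest-centroid classifier.
import numpy as np

rng = np.random.default_rng(0)

def make_domain(shift, n=500):
    """Two-class 2-D Gaussian data; both class means are offset by `shift`."""
    X0 = rng.normal(shift, 1.0, size=(n, 2))
    X1 = rng.normal(shift + 2.0, 1.0, size=(n, 2))
    return np.vstack([X0, X1]), np.array([0] * n + [1] * n)

def fit_centroids(X, y):
    """'Train' a nearest-centroid classifier: one mean vector per class."""
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def accuracy(centroids, X, y):
    """Fraction of points whose nearest centroid matches their label."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return float((d.argmin(axis=1) == y).mean())

def performance_gap(src, tgt):
    """Average in-domain accuracy minus average cross-domain accuracy."""
    (Xs, ys), (Xt, yt) = src, tgt
    cs, ct = fit_centroids(Xs, ys), fit_centroids(Xt, yt)
    in_dom = (accuracy(cs, Xs, ys) + accuracy(ct, Xt, yt)) / 2
    cross = (accuracy(cs, Xt, yt) + accuracy(ct, Xs, ys)) / 2
    return in_dom - cross  # larger gap => the domains look more dissimilar

source = make_domain(shift=0.0)
near_target = make_domain(shift=0.2)  # nearly the same distribution
far_target = make_domain(shift=3.0)   # heavily shifted distribution

gap_near = performance_gap(source, near_target)
gap_far = performance_gap(source, far_target)
print(f"gap(near) = {gap_near:.3f}, gap(far) = {gap_far:.3f}")
```

When the target barely shifts, models transfer in both directions and the gap stays near zero; under a large shift, cross-domain accuracy collapses and the gap grows. The paper's contribution is to turn this kind of quantity into an algorithm-dependent regularizer and derive instance weighting rules from it, which this sketch does not attempt.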

Author Information

Boyu Wang (University of Western Ontario)
Jorge Mendez (University of Pennsylvania)
Mingbo Cai (University of Tokyo)
Eric Eaton (University of Pennsylvania)