On the Role of Pretraining in Domain Adaptation in an Infant-Inspired Distribution Shift Task
Deepayan Sanyal · Joel Michelson · Maithilee Kunda
Abstract
We study a novel distribution shift inspired by infant visual experience, which involves a tradeoff between viewpoint diversity and instance diversity. To analyze this shift, we apply domain adaptation using Joint Adaptation Networks (JAN) under varying pretraining conditions. Our results show that JAN's performance is highly sensitive to the pretraining scheme, with notable drops when semantic information about the target dataset is absent during pretraining. To investigate this dependence, we introduce a metric that measures target category separability in the pretrained feature space. Using this metric, we demonstrate a strong correlation between target separability before domain adaptation and JAN's eventual performance on the target dataset.
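The abstract's separability metric is not specified here, but the idea of scoring how well target categories separate in a pretrained feature space can be sketched with a simple Fisher-style ratio of between-class to within-class scatter. This is an illustrative assumption, not the paper's actual metric; the function name `class_separability` and the toy Gaussian features are hypothetical.

```python
import numpy as np

def class_separability(features, labels):
    """Fisher-style separability score: ratio of between-class to
    within-class scatter. Higher values mean the target categories
    are more cleanly separated in the feature space."""
    classes = np.unique(labels)
    overall_mean = features.mean(axis=0)
    between, within = 0.0, 0.0
    for c in classes:
        fc = features[labels == c]          # features of class c
        mu = fc.mean(axis=0)                # class mean
        between += len(fc) * np.sum((mu - overall_mean) ** 2)
        within += np.sum((fc - mu) ** 2)
    return between / within

# Toy check: two well-separated Gaussian clusters should score
# higher than two heavily overlapping ones.
rng = np.random.default_rng(0)
sep = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(5, 1, (50, 8))])
mix = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(0.5, 1, (50, 8))])
y = np.array([0] * 50 + [1] * 50)
assert class_separability(sep, y) > class_separability(mix, y)
```

Under the paper's framing, a score like this would be computed on target-dataset features extracted from the pretrained (pre-adaptation) network, then correlated with JAN's post-adaptation accuracy.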