Ebb: Progressive Optimization For Partial Domain Adaptation
Cheng Feng, Chaoliang Zhong, Jie Wang, Jun Sun, Yasuto Yokota
Unsupervised domain adaptation (UDA) methods are generally built on the assumption that the source domain and the target domain share an identical set of classes. In real-world transfer learning tasks, however, the target domain often contains less data and lacks some of the source classes. Partial domain adaptation (PDA) allows the source domain to contain categories that are absent from the target domain. We use anchor points to describe the target samples that are easy to identify, and we observe that the shared classes tend to have more anchor points than the unshared classes. Based on this observation, we introduce a novel progressive optimization method named Ebb. Ebb resists the negative transfer caused by the category gap and can be applied to any domain adaptation model. Ebb uses the class-wise distribution of anchor points to estimate the category gap, then minimizes the errors on the shared classes and corrects the erroneous samples caused by blind alignment. To verify the effectiveness of the method, we apply Ebb to three widely used image classification benchmarks, i.e., Office-Home, Office-31, and ImageCLEF-DA. The results show that Ebb brings a significant improvement on all tasks, and the models optimized by Ebb perform stably under a wide range of category gaps.
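The abstract only sketches the mechanism, so the following is a minimal, hypothetical illustration of how a class-wise anchor-point distribution could be turned into class weights that down-weight likely-unshared source classes. It assumes anchor points are target samples whose prediction confidence exceeds a threshold; the function name, the threshold tau, and the weighting scheme are assumptions for illustration, not the paper's actual implementation.

```python
# Hypothetical sketch: estimate the category gap from anchor points.
# Assumption: an "anchor point" is a target sample whose softmax confidence
# exceeds a threshold tau; classes with few anchors are treated as unshared.
import torch
import torch.nn.functional as F

def estimate_class_weights(target_logits: torch.Tensor, num_classes: int,
                           tau: float = 0.9) -> torch.Tensor:
    """Return a (num_classes,) weight vector from the class-wise anchor distribution."""
    probs = F.softmax(target_logits, dim=1)
    conf, pred = probs.max(dim=1)
    anchors = pred[conf >= tau]                      # easily identified target samples
    counts = torch.bincount(anchors, minlength=num_classes).float()
    weights = counts / counts.max().clamp(min=1.0)   # classes with few anchors get small weights
    return weights

# Possible usage: down-weight the source classification loss for classes
# that receive few anchor points (likely outside the target label set).
# w = estimate_class_weights(model(target_batch), num_classes)
# per_sample = F.cross_entropy(model(source_batch), source_labels, reduction='none')
# loss = (w[source_labels] * per_sample).mean()
```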