SPS
21 Sep 2021

In traditional unsupervised domain adaptation problems, the target domain is assumed to share the same set of classes as the source domain. In practice, there exist situations where the target-domain data come from only a subset of the source-domain classes, and it is not known which classes they belong to since they are unlabeled. This problem has been formulated as Partial Domain Adaptation (PDA) in the literature and is challenging due to the negative transfer issue (i.e., source-domain data belonging to irrelevant classes harm the domain adaptation). We address the PDA problem by progressively detecting the outlier classes in the source domain. As a result, PDA boils down to an easier unsupervised domain adaptation problem that can be solved without negative transfer. Specifically, we employ locality preserving projection to learn a latent common subspace, in which a label propagation algorithm is used to label the target-domain data. Outlier classes are detected when no target-domain data are labeled as those classes. We remove the detected outlier classes from the source domain and repeat the process for multiple iterations until convergence. Experimental results on the commonly used Office-31 and Office-Home datasets demonstrate that our proposed method achieves state-of-the-art performance, with average accuracies of 98.1% and 75.4%, respectively.
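The iterative outlier-class detection described above can be sketched in a few lines. This is a simplified illustration, not the paper's method: it replaces the locality preserving projection and label propagation steps with plain 1-nearest-neighbor pseudo-labeling in the input space (an assumption made here for brevity), but it preserves the core loop — pseudo-label the target data, drop source classes that attract no target samples, and repeat until the surviving class set stabilizes.

```python
import numpy as np

def detect_outlier_classes(Xs, ys, Xt, n_iters=5):
    """Progressively detect source-only (outlier) classes.

    Simplified sketch: instead of locality preserving projection
    plus label propagation, each target sample is pseudo-labeled by
    its nearest source neighbor (illustrative assumption). Classes
    that attract no target samples are removed, and the process
    repeats until the surviving class set stops changing.
    """
    classes = set(np.unique(ys))
    for _ in range(n_iters):
        # Keep only source samples from classes that survived so far.
        mask = np.isin(ys, list(classes))
        Xs_k, ys_k = Xs[mask], ys[mask]
        # Pseudo-label each target point with its 1-NN source label.
        dists = ((Xt[:, None, :] - Xs_k[None, :, :]) ** 2).sum(axis=-1)
        pseudo = ys_k[dists.argmin(axis=1)]
        survivors = set(np.unique(pseudo))
        if survivors == classes:  # converged: no class was dropped
            break
        classes = survivors
    return classes
```

On synthetic data where the source has three well-separated classes but the target draws from only two of them, the loop returns exactly the two shared classes; the third is flagged as an outlier class and would be excluded from subsequent domain alignment.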

