07 Oct 2022

Positive and Unlabeled learning (PU learning) trains a binary classifier from only positive (P) and unlabeled (U) data, where the unlabeled data contains both positive and negative samples. Previous importance-reweighting approaches treat all unlabeled samples as weighted negative samples and achieve state-of-the-art performance. However, in this paper we find, surprisingly, that under weight adjustment the classifier can misclassify negative samples in the U data as positive at the late training stage. Motivated by this discovery, we leverage Semi-Supervised Learning (SSL) to address this performance degradation and propose a novel SSL-based framework for PU learning. First, we introduce a dynamic increasing sampling strategy that progressively selects both negative and positive samples from the U data. Second, we adopt MixMatch to take full advantage of the unchosen samples in the U data. Finally, we propose a Co-learning strategy that iteratively trains two independent networks on the selected samples to avoid confirmation bias. Experimental results on four benchmark datasets demonstrate the effectiveness and superiority of our approach compared with other state-of-the-art methods.
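The "dynamic increasing sampling" step can be sketched roughly as follows: from the classifier's scores on the U data, pick the most confident samples as pseudo-positives and pseudo-negatives, with the selected fraction growing over training. This is a minimal illustration only; the linear schedule, the `base_frac` default, and the function name are assumptions, since the abstract does not give the paper's exact schedule.

```python
def dynamic_select(scores, epoch, max_epochs, base_frac=0.1):
    """Pick confident pseudo-positive/negative indices from unlabeled scores.

    The selected fraction grows linearly with the epoch, sketching the
    "dynamic increasing sampling" idea. The linear schedule and the
    base_frac default are illustrative assumptions, not the paper's choices.
    """
    # Fraction of U to select at this epoch, growing from base_frac to 1.0.
    frac = min(1.0, base_frac + (1.0 - base_frac) * epoch / max_epochs)
    k = max(1, int(frac * len(scores) / 2))
    # Sort indices by predicted positive-class score, ascending.
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    neg_idx = order[:k]    # lowest scores -> pseudo-negatives
    pos_idx = order[-k:]   # highest scores -> pseudo-positives
    return pos_idx, neg_idx
```

In the Co-learning setting described in the abstract, each of the two networks would train on samples selected from the *other* network's scores, so that one network's mistakes are not directly reinforced in its own training set.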
