Poster 11 Oct 2023

Despite the impressive performance of deep neural networks, they are prone to over-fitting on labeled points owing to the scarcity of annotated data. Applying mixup regularization during training provides an effective mechanism to improve generalization. Semi-supervised learning (SSL), on the other hand, leverages abundant unlabeled data alongside a small amount of labeled data during training. In this paper, we introduce mixup regularization into SSL, together with an exploration-utilization training scheme, to enhance performance. In addition, because of the large volume imbalance between labeled and unlabeled data and the noise introduced by unlabeled samples, we apply a balancing ratio between the labeled and unlabeled loss terms. Specifically, we devise a novel Sharp Entropy loss for model optimization with large-scale unlabeled samples and employ an uncertainty estimation technique to weight the unlabeled loss. Extensive experiments on image classification show that SEMixup with the uncertainty-based balancing ratio achieves state-of-the-art performance and outperforms the baselines.
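The abstract outlines the general training scheme: mixup on labeled data, an entropy-sharpening objective on unlabeled data, and a balancing ratio between the two loss terms. The following is a minimal PyTorch sketch of that scheme, not the authors' implementation; the exact Sharp Entropy loss, the exploration-utilization schedule, and the uncertainty estimation are not specified in the abstract, so sharp_entropy_loss (a temperature-sharpened entropy) and the fixed weight lambda_u below are hypothetical stand-ins.

import torch
import torch.nn.functional as F

def mixup(x, y, alpha=0.75):
    # Standard mixup: convexly combine a batch with a shuffled copy of itself.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    lam = max(lam, 1.0 - lam)  # keep the mix biased toward the original sample
    idx = torch.randperm(x.size(0))
    x_mix = lam * x + (1.0 - lam) * x[idx]
    y_mix = lam * y + (1.0 - lam) * y[idx]
    return x_mix, y_mix

def sharp_entropy_loss(logits, temperature=0.5):
    # Hypothetical stand-in for the Sharp Entropy loss: sharpen the softmax with a
    # temperature and penalize high-entropy (uncertain) predictions on unlabeled data.
    p = F.softmax(logits / temperature, dim=1)
    return -(p * torch.log(p.clamp_min(1e-8))).sum(dim=1).mean()

def ssl_loss(model, x_l, y_l, x_u, num_classes, lambda_u=0.5):
    # Supervised mixup cross-entropy plus a weighted unlabeled term;
    # lambda_u plays the role of the labeled/unlabeled balancing ratio.
    y_l_onehot = F.one_hot(y_l, num_classes).float()
    x_mix, y_mix = mixup(x_l, y_l_onehot)
    sup_loss = -(y_mix * F.log_softmax(model(x_mix), dim=1)).sum(dim=1).mean()
    unsup_loss = sharp_entropy_loss(model(x_u))
    return sup_loss + lambda_u * unsup_loss

In the paper's setting, lambda_u would not be a constant: the uncertainty estimation technique mentioned in the abstract adapts the weight on the unlabeled term to suppress noise from unreliable unlabeled samples.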
