SIMPLE SELF-DISTILLATION LEARNING FOR NOISY IMAGE CLASSIFICATION

Tenta Sasaya, Takashi Watanabe, Takashi Ida, Toshiyuki Ono

Poster 11 Oct 2023

In computer vision, image classification has achieved considerable success. However, real-world applications are often degraded by noise corruption, so a method for handling noisy images is crucial. Knowledge distillation, which uses the knowledge of a teacher model trained on clean images to train a student model on noisy images, is a promising technique because it requires no special modification of the classifier. However, clean images are typically unavailable in practice. To address this issue, we propose a novel knowledge distillation method that requires no clean images. Leveraging the property that the feature extractor in a classifier naturally discards features irrelevant to classification, we simply train the teacher model on noisy images, under the assumption that such a teacher can still provide pseudo-clean features. Our experiments demonstrate that the proposed method achieves classification performance comparable to conventional methods, even without clean images.
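The abstract does not spell out the training procedure, but the idea of distilling a student from a teacher trained on the same noisy images can be illustrated with a minimal sketch. The sketch below assumes PyTorch, a shared backbone for teacher and student, a plain cross-entropy pre-training stage for the teacher (not shown), and a feature-matching (L2) distillation loss on the penultimate features; the class and function names, and the weighting factor alpha, are illustrative assumptions rather than the authors' actual implementation.

# Hedged sketch of feature-level self-distillation for noisy images.
# Assumptions (not specified in the abstract): the teacher and student share
# the same backbone, the teacher was already trained on the noisy images with
# cross-entropy, and the student is trained on the same noisy images with
# cross-entropy plus an L2 loss matching the teacher's "pseudo-clean" features.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SmallClassifier(nn.Module):
    """Toy CNN: a feature extractor followed by a linear classification head."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, num_classes)

    def forward(self, x):
        feats = self.features(x)          # penultimate features
        logits = self.head(feats)
        return feats, logits


def distillation_step(student, teacher, images, labels, optimizer, alpha=0.5):
    """One student update: cross-entropy on noisy images plus feature matching."""
    teacher.eval()
    with torch.no_grad():
        teacher_feats, _ = teacher(images)    # treated as pseudo-clean features

    student_feats, student_logits = student(images)
    ce_loss = F.cross_entropy(student_logits, labels)
    feat_loss = F.mse_loss(student_feats, teacher_feats)
    loss = ce_loss + alpha * feat_loss        # alpha is an assumed weighting

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    # Random tensors stand in for a batch of noisy images and labels.
    teacher, student = SmallClassifier(), SmallClassifier()
    optimizer = torch.optim.SGD(student.parameters(), lr=0.01, momentum=0.9)
    images = torch.randn(8, 3, 32, 32)
    labels = torch.randint(0, 10, (8,))
    print(distillation_step(student, teacher, images, labels, optimizer))

The feature-matching loss is one plausible reading of "pseudo-clean features"; the paper itself may combine it with logit distillation or a different distance, so treat this only as a conceptual illustration.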
