Lecture 10 Oct 2023

Knowledge distillation based unsupervised anomaly detection (KD-UAD), which detects anomalies from the differences between the features that a student network and a teacher network extract from a sample, has attracted wide attention from researchers. However, the strong learning ability of the student network can make these differences small even on abnormal samples, making it impossible to distinguish anomalies from normal samples. To alleviate this problem, we propose an improved KD-UAD approach that enhances the network's ability to perceive anomalies. First, we propose an abnormal-aware loss (AAL) that gives the student network the ability to repair anomalies; AAL enlarges the differences between the features extracted by the student and teacher networks on abnormal samples. Second, we design a full distillation loss (FDL) that strengthens the distillation, allowing the student network to learn the feature distribution more comprehensively. Experimental results show that our method outperforms current state-of-the-art methods on the MVTec AD and BTAD datasets.
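The core KD-UAD scoring idea described above — flagging a sample as anomalous when the student's features diverge from the teacher's — can be sketched as follows. This is a minimal illustration with synthetic feature vectors, not the authors' implementation; the function name and the toy data are ours, and the real method operates on deep feature maps from trained networks.

```python
import numpy as np

def anomaly_score(teacher_feats, student_feats):
    """Per-sample anomaly score: mean squared difference between the
    teacher's and the student's features (the basic KD-UAD signal)."""
    diff = teacher_feats - student_feats
    # Average over all feature dimensions, keeping the sample axis.
    return np.mean(diff ** 2, axis=tuple(range(1, diff.ndim)))

# Toy illustration: on normal samples the trained student imitates the
# teacher closely, so the feature gap (and the score) stays small; on
# abnormal samples the student fails to match, so the gap grows.
rng = np.random.default_rng(0)
teacher = rng.normal(size=(3, 8))                        # 3 samples, 8-dim features
student_on_normal = teacher + 0.01 * rng.normal(size=(3, 8))   # small mismatch
student_on_anomalous = teacher + 1.0 * rng.normal(size=(3, 8)) # large mismatch

scores_normal = anomaly_score(teacher, student_on_normal)
scores_anomalous = anomaly_score(teacher, student_on_anomalous)
```

Thresholding such scores separates anomalies from normal samples only when the gap on anomalies stays large — which is exactly the property the proposed AAL is designed to reinforce.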
