
NOISE SUPPRESSION FOR IMPROVED FEW-SHOT LEARNING

Zhikui Chen, Tiandong Ji, Suhua Zhang, Fangming Zhong

09 May 2022

Few-shot learning (FSL) aims to generalize from only a few labeled samples. Recently, metric-based methods have achieved surprisingly strong classification performance on many FSL benchmarks. However, these methods ignore the impact of noise, which keeps few-shot learning challenging. In this work, we identify noise suppression as important for improving the performance of FSL algorithms. Hence, we propose a novel attention-based contrastive learning model with discrete cosine transform input (ACL-DCT), which suppresses noise in input images, image labels, and learned features, respectively. ACL-DCT takes the frequency-domain representations produced by the DCT as input and removes the high-frequency part to suppress input noise. In addition, an attention-based alignment of the feature maps and a supervised contrastive loss are used to mitigate feature and label noise. We evaluate ACL-DCT against previous methods on two widely used few-shot classification datasets (miniImageNet and CUB). The results indicate that our proposed method outperforms the state-of-the-art methods.
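
The abstract only sketches the DCT-based input denoising, so the snippet below is a minimal, hypothetical illustration of the general idea: transform an image with the 2-D DCT, zero the high-frequency coefficients, and invert the transform. The function name dct_lowpass, the keep_ratio cutoff, and the per-channel handling are assumptions made for illustration and are not taken from the paper.

```python
# Hypothetical sketch of low-pass filtering in the DCT domain, as a stand-in
# for the "remove the high-frequency part" step described in the abstract.
# The block size, cutoff, and channel handling used by ACL-DCT are not given
# here, so everything below is an assumption for illustration only.
import numpy as np
from scipy.fft import dctn, idctn

def dct_lowpass(image: np.ndarray, keep_ratio: float = 0.25) -> np.ndarray:
    """Suppress input noise by zeroing high-frequency 2-D DCT coefficients.

    image: H x W (grayscale) or H x W x C array.
    keep_ratio: fraction of the lowest frequencies kept along each axis
                (hypothetical parameter, not from the paper).
    """
    def filter_channel(ch: np.ndarray) -> np.ndarray:
        coeffs = dctn(ch, norm="ortho")            # 2-D DCT of one channel
        h, w = ch.shape
        kh, kw = max(1, int(h * keep_ratio)), max(1, int(w * keep_ratio))
        mask = np.zeros_like(coeffs)
        mask[:kh, :kw] = 1.0                        # keep only low frequencies
        return idctn(coeffs * mask, norm="ortho")   # back to the spatial domain

    if image.ndim == 2:
        return filter_channel(image)
    return np.stack(
        [filter_channel(image[..., c]) for c in range(image.shape[-1])], axis=-1
    )

# Example: filter an 84x84 RGB few-shot input before feeding it to the backbone.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    noisy = rng.random((84, 84, 3))
    filtered = dct_lowpass(noisy, keep_ratio=0.25)
    print(filtered.shape)  # (84, 84, 3)
```

In such a setup, the filtered images would replace the raw inputs to the few-shot backbone; the cutoff trades noise suppression against loss of fine detail, and the attention-based feature alignment and supervised contrastive loss described above are not shown here.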
