  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 13:07
04 May 2020

Convolutional neural networks (CNNs) are easily over-fitted when they are over-parameterized. The popular dropout, which drops feature units at random, does not always work well for CNNs because of the under-dropping problem: neighboring activations in a feature map are highly correlated, so the information of a dropped unit can still flow through its neighbors. To mitigate this, structural dropout methods such as SpatialDropout, Cutout, and DropBlock drop feature units in contiguous regions. However, because these methods still choose the regions at random, they run the risk of over-dropping discriminative regions and degrading performance. To address these issues, we propose a novel structural dropout method, Correlation-based Dropout (CorrDrop), which regularizes CNNs by dropping feature units according to feature correlation, a signal that reflects the discriminative information in the feature maps. Specifically, the proposed method first computes a correlation map from the activations in the feature maps and then adaptively masks out the regions with small average correlation. In this way, CorrDrop regularizes CNNs by discarding part of the contextual regions. Extensive experiments on image classification demonstrate the superiority of our method over these counterparts.
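The two steps described in the abstract, scoring spatial positions by feature correlation and then masking the regions whose average correlation is smallest, can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the correlation of each position is approximated here by the cosine similarity between its channel vector and the spatially averaged channel vector, and the block size, drop fraction, and rescaling are all illustrative assumptions.

```python
import numpy as np

def corrdrop(feat, block=2, drop_frac=0.25):
    """Hypothetical sketch of correlation-based dropout on one feature map.

    feat      : (C, H, W) array, a single sample's feature map.
    block     : side length of the square regions considered for dropping.
    drop_frac : fraction of regions to mask out (assumed hyperparameter).
    """
    C, H, W = feat.shape
    vecs = feat.reshape(C, -1)                       # (C, H*W) channel vectors
    mean = vecs.mean(axis=1, keepdims=True)          # (C, 1) average vector

    # Per-position correlation score: cosine similarity with the mean vector
    # (an assumption; the paper's exact correlation measure may differ).
    num = (vecs * mean).sum(axis=0)
    den = np.linalg.norm(vecs, axis=0) * np.linalg.norm(mean) + 1e-8
    corr = (num / den).reshape(H, W)

    # Average the correlation over non-overlapping block x block regions.
    hb, wb = H // block, W // block
    blocks = corr[:hb * block, :wb * block].reshape(hb, block, wb, block)
    avg = blocks.mean(axis=(1, 3))                   # (hb, wb) region scores

    # Mask out the regions with the smallest average correlation.
    k = max(1, int(drop_frac * hb * wb))
    thresh = np.sort(avg.ravel())[k - 1]
    mask = np.ones((H, W), dtype=feat.dtype)
    for i in range(hb):
        for j in range(wb):
            if avg[i, j] <= thresh:
                mask[i * block:(i + 1) * block,
                     j * block:(j + 1) * block] = 0.0

    # Rescale surviving units to preserve the expected magnitude, as in dropout.
    keep = mask.mean()
    return feat * mask / max(keep, 1e-8)
```

A real implementation would operate on batched tensors inside the network and apply the mask only at training time; the sketch above just shows how a correlation map turns into an adaptive, region-wise drop mask rather than a random one.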

