10 May 2022

Attention mechanisms have been widely explored to build long-range connections beyond the reach of convolutions. The two families of attention, unary and pairwise, appear incompatible because they rely on entirely different operations. In this paper, we propose a Group Attention (GA) block that bridges the gap between them: exploiting the implicit group clustering of light-weight CNNs, it uses only unary attention to achieve the effect of pairwise attention at a fraction of the cost. Compared with conventional pairwise attention, i.e., Non-Local networks, our method bypasses the expensive pixel-pair affinity computation and thus saves substantial computation. Experiments on image classification demonstrate the effectiveness and efficiency of the GA block in enhancing light-weight models. Code will be released in the future.
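
The abstract does not spell out the GA block's exact formulation, so the following PyTorch sketch is only an illustrative assumption of how group-wise unary attention can approximate a pairwise effect: channels are split into groups, and each spatial position is reweighted by its similarity to its group's global descriptor, keeping the cost linear in the number of pixels rather than quadratic as in Non-Local blocks. The class name `GroupAttention`, the group count, and the normalization scheme are hypothetical choices for this sketch, not the paper's verified design.

```python
import torch
import torch.nn as nn

class GroupAttention(nn.Module):
    """Illustrative group-wise unary attention (an assumption, not the paper's exact GA block).

    Channels are split into groups; within each group, every spatial position
    is reweighted by its dot-product similarity to the group's global
    (average-pooled) descriptor. The cost is unary, O(C*H*W), whereas a
    Non-Local block builds an O((H*W)^2) pixel-pair affinity matrix.
    """

    def __init__(self, groups: int = 8):
        super().__init__()
        self.groups = groups
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Learnable per-group scale and shift for the normalized attention logits.
        self.weight = nn.Parameter(torch.ones(1, groups, 1, 1))
        self.bias = nn.Parameter(torch.zeros(1, groups, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        assert c % self.groups == 0
        x = x.view(b * self.groups, c // self.groups, h, w)
        # Similarity of each position to its group's global descriptor (one scalar per pixel).
        logits = (x * self.pool(x)).sum(dim=1, keepdim=True)  # (b*G, 1, h, w)
        # Normalize logits within each group to zero mean, unit variance.
        flat = logits.view(b * self.groups, -1)
        flat = (flat - flat.mean(dim=1, keepdim=True)) / (flat.std(dim=1, keepdim=True) + 1e-5)
        logits = flat.view(b, self.groups, h, w) * self.weight + self.bias
        attn = torch.sigmoid(logits).view(b * self.groups, 1, h, w)
        return (x * attn).view(b, c, h, w)

# Usage: shape-preserving, so it can drop in after any light-weight CNN stage.
x = torch.randn(2, 64, 32, 32)
y = GroupAttention(groups=8)(x)
assert y.shape == x.shape
```

Because the attention weight at each position is derived only from that position and a single pooled descriptor, no pixel-pair matrix is ever materialized, which is the computational saving the abstract claims over Non-Local networks.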
