21 Apr 2023

In medical image segmentation, fusing multi-scale feature maps via skip connections has proven effective for boosting performance. However, any single feature map is sufficient to predict a segmentation result, and the predictions made by feature maps of different scales diverge from one another. To better exploit this disagreement, in this paper we propose a plug-and-play module named div-attention. It can be inserted into any deep neural network with skip connections: placed at the junction between adjacent skip-connection paths, it re-segments the regions where predictions diverge and fuses the results to improve segmentation performance. In this sense, the module is divergence-aware, exploiting the disagreement between predictions made by feature maps of diverse scales within a single model. Experimental results show that this plug-and-play module significantly boosts segmentation performance across different deep neural networks and diverse CT segmentation tasks.
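The abstract does not give the module's internals, but the idea it describes (each scale produces its own prediction, and the per-pixel disagreement between those predictions guides a re-segmentation at the skip-connection junction) can be sketched as follows. This is a speculative PyTorch sketch, not the authors' implementation: the class name `DivAttention`, the L1 divergence measure, and the residual attention form are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DivAttention(nn.Module):
    """Illustrative divergence-aware fusion module (a sketch, not the paper's code).

    Each scale's feature map makes its own coarse prediction; the per-pixel
    disagreement between the two predictions serves as an attention map that
    focuses re-segmentation on the divergent regions.
    """

    def __init__(self, ch_coarse, ch_fine, n_classes=2):
        super().__init__()
        # 1x1 heads let each scale predict on its own
        self.head_coarse = nn.Conv2d(ch_coarse, n_classes, 1)
        self.head_fine = nn.Conv2d(ch_fine, n_classes, 1)
        # fusion conv re-segments the attended, concatenated features
        self.fuse = nn.Conv2d(ch_coarse + ch_fine, n_classes, 3, padding=1)

    def forward(self, f_coarse, f_fine):
        # upsample the coarse path to the fine path's resolution
        f_up = F.interpolate(f_coarse, size=f_fine.shape[-2:],
                             mode="bilinear", align_corners=False)
        p1 = self.head_coarse(f_up).softmax(dim=1)
        p2 = self.head_fine(f_fine).softmax(dim=1)
        # divergence map: per-pixel L1 disagreement between the two predictions
        div = (p1 - p2).abs().sum(dim=1, keepdim=True)
        # residual attention: divergent regions are amplified, agreed-upon
        # regions pass through unchanged
        attended = torch.cat([f_up, f_fine], dim=1) * (1.0 + div)
        return self.fuse(attended)
```

Because the module only consumes the two feature maps at a skip-connection junction and emits a prediction at the fine resolution, it can be dropped between adjacent decoder stages of, say, a U-Net without altering the backbone, which is what "plug-and-play" suggests here.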