Multi-Scale Gridded Gabor Attention For Cirrus Segmentation
Felix Richards, Xianghua Xie, Adeline Paiement, Elisabeth Sola, Pierre-Alain Duc
The effective receptive field of a fully convolutional neural network is an important consideration when designing an architecture, as it defines the portion of the input visible to each convolutional kernel. We propose a neural network module, the translated skip connection, which extends traditional skip connections. Translated skip connections geometrically increase the receptive field of an architecture with negligible impact on both parameter count and computational complexity. By embedding translated skip connections into a benchmark architecture, we demonstrate that our module matches or outperforms four other approaches to expanding the effective receptive fields of fully convolutional neural networks. We confirm this result across five contemporary image segmentation datasets from disparate domains, including the detection of COVID-19 infection, segmentation of aerial imagery, common object segmentation, and segmentation for self-driving cars.
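To make the idea of a translated skip connection concrete, below is a minimal PyTorch sketch under stated assumptions: the skip-path features are spatially translated by a small set of fixed offsets before being merged back into the decoder, so each decoder unit sees shifted copies of the encoder features and the effective receptive field grows without adding learned parameters. The module name, the offsets, and the additive merge are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class TranslatedSkip(nn.Module):
    """Illustrative skip connection whose skip-path features are spatially
    translated before being merged into the decoder. Offsets and the
    additive merge are assumptions; the paper's exact formulation may differ."""

    def __init__(self, offsets=((0, 0), (8, 0), (0, 8), (8, 8))):
        super().__init__()
        self.offsets = offsets  # (dy, dx) pixel shifts; no learned parameters

    def forward(self, skip_features, decoder_features):
        # Accumulate translated copies of the encoder (skip) features.
        merged = decoder_features
        for dy, dx in self.offsets:
            shifted = torch.roll(skip_features, shifts=(dy, dx), dims=(2, 3))
            merged = merged + shifted
        return merged


# Usage: merging a 64-channel encoder feature map into a decoder stage.
skip = torch.randn(1, 64, 128, 128)
dec = torch.randn(1, 64, 128, 128)
out = TranslatedSkip()(skip, dec)
print(out.shape)  # torch.Size([1, 64, 128, 128])
```

Because the translation is a parameter-free tensor shift, the memory and compute overhead is a few extra additions per skip connection, which is consistent with the abstract's claim of negligible impact on model size and complexity.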