IMPROVING TRANSLATION INVARIANCE IN CONVOLUTIONAL NEURAL NETWORKS WITH PERIPHERAL PREDICTION PADDING
Kensuke Mukai, Takao Yamanaka
Zero padding is often used in convolutional neural networks to prevent the feature-map size from decreasing with each layer. However, recent studies have shown that zero padding promotes the encoding of absolute positional information, which may adversely affect performance on some tasks. In this work, a novel padding method called Peripheral Prediction Padding (PP-Pad) is proposed, which enables end-to-end training of padding values suitable for each task instead of zero padding. Moreover, novel metrics for quantitatively evaluating the translation invariance of a model are presented. Evaluation with these metrics confirmed that the proposed method achieves higher accuracy and better translation invariance than previous methods in a semantic segmentation task.
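Since the abstract only outlines the idea, the following is a minimal sketch of what a learnable, peripherally predicted padding layer could look like, assuming a small trainable convolutional predictor over the outermost rows of the feature map. The module name PeripheralPredictionPad2d, the strip width, and the four-way rotation scheme are illustrative assumptions, not the paper's exact PP-Pad architecture.

```python
import torch
import torch.nn as nn

class PeripheralPredictionPad2d(nn.Module):
    """Illustrative learnable padding: predicts a one-pixel-wide border
    from a strip of peripheral feature-map values (hypothetical layout;
    the paper's actual PP-Pad design may differ)."""

    def __init__(self, channels, strip=3):
        super().__init__()
        self.strip = strip
        # Small trainable predictor mapping a peripheral strip to one padding row.
        self.predict = nn.Conv2d(channels, channels, kernel_size=(strip, 1))

    def _pad_top(self, x):
        # Predict one padding row from the outermost `strip` rows of the top side.
        edge = x[:, :, : self.strip, :]        # (N, C, strip, W)
        pad_row = self.predict(edge)           # (N, C, 1, W)
        return torch.cat([pad_row, x], dim=2)  # prepend the predicted row

    def forward(self, x):
        # Pad each of the four sides with the shared predictor by rotating
        # the feature map 90 degrees between passes (4 x 90 = original pose).
        for _ in range(4):
            x = self._pad_top(x)
            x = torch.rot90(x, 1, dims=(2, 3))
        return x

# Usage: replace zero padding of a 3x3 convolution with the learned padding.
pad = PeripheralPredictionPad2d(channels=64)
conv = nn.Conv2d(64, 64, kernel_size=3, padding=0)
y = conv(pad(torch.randn(2, 64, 32, 32)))  # spatial size is preserved (32x32)
```

Because the padding values are produced by a trainable module rather than fixed at zero, they can be optimized end-to-end together with the rest of the network for the target task, which is the property the abstract attributes to PP-Pad.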