Arbitrary style transfer has attracted widespread attention from the research community owing to its broad real-world applications. However, existing methods fail to strike a proper trade-off between flexibility and capability: they synthesize style textures indiscriminately over the whole content image, which leads to local distortions when different regions of the image differ markedly in appearance. In this paper, we propose a novel Saliency-Aware Instance Normalization (SAIN) module that accounts for the visual distinction between salient and non-salient areas in arbitrary style transfer. To extract region-specific stylized features, SAIN performs local feature alignment separately in each region. In addition, to fully integrate global and local features, we design a global branch in which content features are adjusted using the globally computed statistics of the style features. Comprehensive evaluations demonstrate our method’s superiority over six state-of-the-art arbitrary style transfer methods. Source code is available at https://github.com/yitongli123/SAIN.
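The region-wise alignment described above can be sketched as follows. This is a minimal NumPy illustration of the idea, not the paper's implementation: it assumes precomputed feature maps and binary saliency masks, and the function names (`adain`, `saliency_aware_align`) are hypothetical. Feature statistics of the salient content region are aligned to those of the salient style region, and likewise for the non-salient regions, mirroring AdaIN-style mean/variance matching per region.

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Global alignment: match per-channel mean/std of content features
    to those of the style features (computed over the whole map)."""
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    return (content - c_mean) / (c_std + eps) * s_std + s_mean

def saliency_aware_align(content, style, c_mask, s_mask, eps=1e-5):
    """Region-wise alignment: the salient content region adopts the
    statistics of the salient style region, and the non-salient region
    adopts the non-salient statistics.

    content, style: (C, H, W) feature maps.
    c_mask, s_mask: (H, W) boolean saliency masks (True = salient).
    """
    out = np.empty_like(content)
    for salient in (True, False):
        cm = (c_mask == salient)               # pixels of this region in content
        sm = (s_mask == salient)               # pixels of this region in style
        c_vals = content[:, cm]                # (C, n_content_pixels)
        s_vals = style[:, sm]                  # (C, n_style_pixels)
        c_mean = c_vals.mean(axis=1, keepdims=True)
        c_std = c_vals.std(axis=1, keepdims=True)
        s_mean = s_vals.mean(axis=1, keepdims=True)
        s_std = s_vals.std(axis=1, keepdims=True)
        # Normalize the content region, then re-scale with style statistics.
        out[:, cm] = (c_vals - c_mean) / (c_std + eps) * s_std + s_mean
    return out
```

In this sketch, the salient region of the output inherits the channel-wise mean and standard deviation of the salient style region, so textures are matched per region rather than over the whole image; the paper's global branch corresponds to the whole-map alignment in `adain`.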