UT-GAN: A NOVEL UNPAIRED TEXTUAL-ATTENTION GENERATIVE ADVERSARIAL NETWORK FOR LOW-LIGHT TEXT IMAGE ENHANCEMENT
Minglong Xue, ZhengYang He, Yanyi He, Peiqi Xie, Xin Feng
How to balance lighting and texture details to achieve the desired visual effect remains a bottleneck for existing low-light image enhancement methods. In this paper, we propose a novel Unpaired Textual-attention Generative Adversarial Network (UT-GAN) for the low-light text image enhancement task. UT-GAN first applies the Zero-DCE net for initial illumination recovery, and our proposed TAM module translates text information into a textual attention mechanism for the overall network, emphasizing the details of text regions. Moreover, the method constructs an AGM-Net module to mitigate noise and fine-tune the illumination. Experiments show that UT-GAN outperforms existing methods in both qualitative and quantitative evaluation on the widely used low-light datasets LOL and SID.
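The initial illumination recovery stage mentioned above builds on Zero-DCE, which brightens an image by iteratively applying learned quadratic adjustment curves. The following is a minimal sketch of that curve iteration, not of UT-GAN itself; in Zero-DCE the per-pixel curve parameter maps are predicted by a CNN, whereas here they are fixed constants for illustration.

```python
import numpy as np

def enhance(image, alpha_maps):
    """Iteratively apply the Zero-DCE light-enhancement curve:
    LE(x) = LE(x) + A(x) * LE(x) * (1 - LE(x)),
    which monotonically brightens values in [0, 1] while keeping
    them in range (x = 1 is a fixed point of the curve)."""
    out = image
    for alpha in alpha_maps:
        out = out + alpha * out * (1.0 - out)
    return out

rng = np.random.default_rng(0)
low_light = rng.uniform(0.0, 0.3, size=(4, 4))  # a dim input in [0, 1]
# Eight iterations, as in the original Zero-DCE; alpha = 0.8 is an
# illustrative constant standing in for the CNN-predicted maps.
alphas = [np.full((4, 4), 0.8)] * 8
bright = enhance(low_light, alphas)
```

Each pass pushes dark pixels toward brighter values without clipping, which is why a lightweight refinement stage (such as the AGM-Net described above) is still needed to suppress the noise that enhancement amplifies.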