Modular and Lightweight Networks For Bi-Scale Style Transfer
Thibault Durand, Julien Rabin, David Tschumperlé
Neural image and video codecs are significantly more power-efficient when their weights and activations are quantized to low-precision integers. While general-purpose techniques exist for mitigating quantization effects, large losses can occur when specific properties of entropy coding are not taken into account. This work analyzes how entropy coding is affected by parameter quantization and provides a method to minimize the resulting losses. It is shown that, by learning a suitable form of the coding parameters, uniform quantization becomes practically optimal, which also simplifies minimizing the code's memory requirements. The mathematical properties of the new representation are presented, and its effectiveness is demonstrated by coding experiments, showing that good results can be obtained with precision as low as 4 bits per network output, and practically no loss with 8 bits.
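To make the precision settings mentioned above concrete, the following is a minimal sketch (not the authors' method) of uniform quantization of a network output tensor to b-bit integers, the kind of operation whose 4-bit and 8-bit behavior the abstract refers to. The function name, tensor shape, and NumPy-based setup are illustrative assumptions, not part of the original work.

```python
# Hypothetical sketch: uniform quantization of a network output to b-bit integers.
import numpy as np

def uniform_quantize(x: np.ndarray, bits: int):
    """Quantize x onto 2**bits uniformly spaced levels over its observed range.

    Returns the integer codes and the dequantized (reconstructed) values.
    """
    levels = 2 ** bits
    lo, hi = float(x.min()), float(x.max())
    step = (hi - lo) / (levels - 1) if hi > lo else 1.0
    codes = np.clip(np.round((x - lo) / step), 0, levels - 1).astype(np.int32)
    x_hat = lo + codes * step  # reconstructed values, e.g. as seen by an entropy coder
    return codes, x_hat

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(1, 64, 16, 16)).astype(np.float32)  # dummy network output
    for bits in (8, 4):
        _, x_hat = uniform_quantize(x, bits)
        mse = float(np.mean((x - x_hat) ** 2))
        print(f"{bits}-bit uniform quantization, reconstruction MSE: {mse:.6f}")
```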