
Network Pruning Using Linear Dependency Analysis On Feature Maps

Hao Pan, Zhongdi Chao, Jiang Qian, Bojin Zhuang, Shaojun Wang, Jing Xiao

Length: 00:06:49
09 Jun 2021

Network pruning can be achieved by removing redundant channels. In this paper, we regard a channel as 'redundant' if its output is linearly dependent on the outputs of other channels. Inspired by this, we propose an efficient pruning method, named LDFM, based on linear dependency analysis of all the feature maps in each individual layer. Specifically, for each layer, we apply QR decomposition with column pivoting (PQR) to the matrix formed by all feature maps; the channels corresponding to small absolute diagonal elements of the resulting R matrix are identified as redundant and pruned. Although pruning these channels causes a loss of information and hence degrades accuracy, the accuracy of the pruned network can be easily recovered by fine-tuning, since the information lost in the pruned channels can be reconstructed from that in the retained channels. Extensive experiments demonstrate that LDFM yields substantial accuracy improvements at parameter counts and FLOPs similar to those of other methods, and achieves state-of-the-art results on several benchmarks and networks.
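
As a rough illustration of the channel-selection step described in the abstract, the sketch below applies SciPy's pivoted QR to a matrix whose columns are the flattened feature maps of one layer and treats late-pivoted channels (small |R[i,i]|) as redundant. The tensor shapes, the keep_ratio criterion, and the helper name select_channels_pqr are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np
from scipy.linalg import qr


def select_channels_pqr(feature_maps, keep_ratio=0.5):
    """Rank one layer's channels by pivoted-QR diagonal magnitudes.

    feature_maps: array of shape (N, C, H, W), activations collected over a
                  small calibration batch (hypothetical input).
    keep_ratio:   fraction of channels to retain (illustrative criterion; the
                  paper may use a different threshold or budget).
    """
    n, c, h, w = feature_maps.shape
    # Build the feature-map matrix: one column per channel, rows are the
    # channel's responses over all batch samples and spatial positions.
    M = feature_maps.transpose(1, 0, 2, 3).reshape(c, -1).T  # (N*H*W, C)

    # QR with column pivoting permutes columns so that
    # |R[0,0]| >= |R[1,1]| >= ... ; perm maps pivot order to channel indices.
    _, R, perm = qr(M, mode='economic', pivoting=True)
    diag = np.abs(np.diag(R))

    # Channels pivoted early are the most linearly independent; channels
    # pivoted late are (nearly) linear combinations of the retained ones.
    n_keep = max(1, int(round(keep_ratio * c)))
    keep = np.sort(perm[:n_keep])    # channel indices to retain
    prune = np.sort(perm[n_keep:])   # channels considered redundant
    return keep, prune, diag


if __name__ == "__main__":
    # Random activations stand in for a real layer's calibration outputs.
    fmaps = np.random.randn(8, 64, 14, 14)
    keep, prune, diag = select_channels_pqr(fmaps, keep_ratio=0.5)
    print(len(keep), "channels kept,", len(prune), "pruned")
```

In a full pruning pipeline, the retained indices would then be used to slice the layer's weights (and the next layer's input channels) before fine-tuning, as the abstract describes.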

Chairs:
Simone Milani
