  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:11:47
06 Oct 2022

Sparse coding methods are iterative and typically rely on proximal gradient methods. While the commonly used sparsity-promoting penalty is the L1 norm, alternatives such as the Minimax Concave Penalty (MCP) and the Smoothly Clipped Absolute Deviation (SCAD) penalty have also been employed to obtain superior results. Combining various penalties to achieve robust sparse recovery is possible, but the challenge lies in parameter tuning. Given the connection between deep networks and the unrolling of iterative algorithms, it is possible to unify the unfolded networks arising from different formulations. We propose an ensemble of proximal networks for sparse recovery, where the ensemble weights are learnt in a data-driven fashion. We found that the proposed network performs on par with or better than the individual networks in the ensemble on synthetic data under various noise levels and sparsity conditions. We also demonstrate an application to image denoising based on the convolutional sparse coding formulation.
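The ingredients described above can be illustrated in a minimal NumPy sketch: proximal (ISTA-style) gradient iterations with two different penalties (L1 and MCP), whose estimates are then combined with ensemble weights. This is an illustrative toy, not the paper's network: the weights here are fixed constants rather than learnt from data, the MCP proximal operator uses a hypothetical default `gamma=3.0`, and the SCAD branch is omitted for brevity.

```python
import numpy as np

def prox_l1(x, lam):
    """Soft-thresholding: proximal operator of the L1 penalty."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def prox_mcp(x, lam, gamma=3.0):
    """Proximal operator of the Minimax Concave Penalty (requires gamma > 1).
    Equals scaled soft-thresholding below gamma*lam, identity above it."""
    return np.where(
        np.abs(x) <= gamma * lam,
        (gamma / (gamma - 1.0)) * prox_l1(x, lam),
        x,
    )

def ista_step(z, A, y, lam, step, prox):
    """One proximal-gradient iteration for min_z 0.5*||Az - y||^2 + penalty(z)."""
    grad = A.T @ (A @ z - y)
    return prox(z - step * grad, step * lam)

def ensemble_recover(A, y, lam=0.1, n_iter=100, weights=(0.5, 0.5)):
    """Run proximal gradient descent with each penalty, then combine the
    estimates with ensemble weights (learnt from data in the paper;
    fixed constants in this sketch)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the data-fidelity term
    estimates = []
    for prox in (prox_l1, prox_mcp):
        z = np.zeros(A.shape[1])
        for _ in range(n_iter):
            z = ista_step(z, A, y, lam, step, prox)
        estimates.append(z)
    return sum(w * z for w, z in zip(weights, estimates))
```

In the unrolled-network view, each `ista_step` becomes one layer with learnable thresholds, and the scalar `weights` become trainable parameters fitted alongside them.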
