11 Jun 2021

Large-scale convex optimization problems can be efficiently addressed using the Fast Gradient Method (FGM) and the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA). FGM requires that the objective be finite and differentiable with a known gradient Lipschitz constant, whereas FISTA can be applied to composite objectives and features a line-search. Nonetheless, FISTA cannot increase the step size and is unable to take advantage of strong convexity. FGM and FISTA are very similar in form but appear to have vastly differing convergence analyses. In this work we unite the two analyses. We generalize the previously introduced augmented estimate sequence framework as well as the related notion of the gap sequence. We use our tools to construct a Generalized Accelerated Composite Gradient Method that encompasses FGM and FISTA, along with their most popular variants. The Lyapunov property of the generalized gap sequence used in deriving our method implies that both FGM and FISTA are amenable to a Lyapunov analysis. Moreover, our method features monotonically decreasing objective function values alongside a versatile line-search procedure, and is thus able to surpass both FGM and FISTA in robustness and usability. We support our findings with simulation results on an extensive benchmark of composite problems.
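For readers unfamiliar with the baseline methods the abstract builds on, the following is a minimal sketch of standard FISTA applied to a LASSO-type composite problem (smooth least-squares term plus an l1 regularizer). It uses a fixed step size 1/L rather than the line-search or the generalized method discussed in the talk; all function names and the test problem are illustrative, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (the "shrinkage-thresholding" step)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista(A, b, lam, num_iters=500):
    # Standard FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    # with fixed step 1/L (no line-search, for simplicity).
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()                           # extrapolated (momentum) point
    t = 1.0
    for _ in range(num_iters):
        grad = A.T @ (A @ y - b)           # gradient of the smooth part at y
        x_next = soft_threshold(y - grad / L, lam / L)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # Nesterov extrapolation
        x, t = x_next, t_next
    return x
```

Note that, as the abstract points out, this classical scheme never increases the step size and ignores any strong convexity of the smooth term; those are precisely the limitations the generalized method is designed to remove.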

Chairs:
Wenwu Wang
