
Parallel Gradient Blend for Class Incremental Learning

Yunlong Zhao, XiaoHeng Deng, Xinjun Pei, Xuechen Chen, Deng Li

Lecture, 11 Oct 2023

In class incremental learning (class-IL), a neural network's performance on a sequence of incremental class tasks degrades over time. Gradient-based IL methods can adapt to both new and previous tasks simultaneously by steering model updates in the correct direction. However, existing methods consider the previous-task and new-task gradients only separately. In this paper, we propose the Parallel Gradient Blend (PGB) paradigm. On the one hand, PGB mixes previous and new samples in equal proportions and uses the gradients this mixed batch produces through the Batch Normalization layers to adjust the model update direction: by comparing gradient similarities, the model selects either the previous-task gradient or the mixed gradient for the update. On the other hand, PGB uses the difference between the gradient distributions of sample features to construct a regularized gradient. Finally, we experimentally demonstrate that PGB outperforms state-of-the-art methods on class-IL benchmarks.
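The gradient-selection step in the abstract can be made concrete. The sketch below is a minimal PyTorch illustration of that idea, not the paper's exact algorithm: the function names (`flat_grad`, `pgb_step`), the `sim_threshold` parameter, the use of cosine similarity, and the selection rule (apply the mixed gradient when it aligns with the previous-task gradient, otherwise fall back to the previous-task gradient) are all assumptions made for illustration, as is the use of a replay memory holding previous-task samples.

```python
import torch
import torch.nn.functional as F

def flat_grad(loss, params):
    """Flatten the gradient of `loss` w.r.t. `params` into one vector."""
    grads = torch.autograd.grad(loss, params)
    return torch.cat([g.reshape(-1) for g in grads])

def pgb_step(model, criterion, new_batch, memory_batch, optimizer,
             sim_threshold=0.0):
    """One hypothetical PGB-style update: compare the previous-task
    gradient with the gradient of an equally mixed batch and apply the
    better-aligned one. The threshold and selection rule are assumptions."""
    params = [p for p in model.parameters() if p.requires_grad]

    # Gradient on previous-task (memory) samples only.
    x_m, y_m = memory_batch
    g_prev = flat_grad(criterion(model(x_m), y_m), params)

    # Gradient on a 50/50 mix of previous and new samples; the mixed
    # batch passes through the model's BatchNorm layers, so both old and
    # new samples shape the normalization statistics.
    x_n, y_n = new_batch
    k = min(len(x_m), len(x_n))
    x_mix = torch.cat([x_m[:k], x_n[:k]])
    y_mix = torch.cat([y_m[:k], y_n[:k]])
    g_mix = flat_grad(criterion(model(x_mix), y_mix), params)

    # Select the update direction by gradient (cosine) similarity.
    sim = F.cosine_similarity(g_prev, g_mix, dim=0)
    g = g_mix if sim > sim_threshold else g_prev

    # Write the chosen gradient back into the parameters and step.
    optimizer.zero_grad()
    offset = 0
    for p in params:
        n = p.numel()
        p.grad = g[offset:offset + n].view_as(p).clone()
        offset += n
    optimizer.step()
    return sim.item()
```

Comparing flattened gradients by cosine similarity follows common practice in gradient-based continual learning for detecting interference between tasks; here it decides which of the two candidate directions is applied, which matches the abstract's description of selecting between the previous-task gradient and the mixed gradient.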
