TriCL: Triplet Continual Learning

Xianchao Zhang (Dalian University of Technology); Guanglu Wang (Dalian University of Technology); Xiaotong Zhang (School of Software, Dalian University of Technology); Han Liu (Dalian University of Technology); Zhengxi Yin (Huawei Technologies Co. Ltd); Wentao Yang (Dalian University of Technology)

06 Jun 2023

An online class-incremental learning agent learns from a never-ending stream of data in a single training epoch. In this setting, the agent suffers severe catastrophic forgetting because data from previously seen classes is unavailable once it starts learning new classes. Moreover, the class prototypes rapidly become outdated as the agent adapts to new data sequentially, and the embeddings of earlier examples drift in unforeseen ways, which exacerbates forgetting (i.e., concept drift). Based on this observation, we propose a replay-based method, called TriCL, which pulls embeddings toward the prototype of their own class and pushes them away from the prototypes of other classes. TriCL leverages an improved triplet loss that requires no specially arranged input triplets. To speed convergence among same-class samples, we design a memory update algorithm that decreases the variance of the buffered samples.
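The abstract gives no implementation details, so the following is only a minimal PyTorch-style sketch of the two ideas it describes: a prototype-based triplet loss that needs no explicitly arranged (anchor, positive, negative) input triplets, and a variance-reducing replay-buffer update. All names here (`prototype_triplet_loss`, `update_buffer`, `embed_fn`, the margin value, the keep-nearest-to-mean criterion) are assumptions for illustration, not TriCL's actual implementation.

```python
# Hypothetical sketch of the two components described in the abstract.
# Everything below is an assumption; the paper's actual method may differ.
import torch
import torch.nn.functional as F

def prototype_triplet_loss(embeddings, labels, prototypes, margin=0.5):
    """Pull each embedding toward its own class prototype and push it away
    from the nearest other-class prototype, so no explicit input triplets
    need to be arranged.

    embeddings: (B, D) batch of embeddings
    labels:     (B,)   integer class labels
    prototypes: (C, D) one prototype per class
    """
    # Distance from every embedding to every class prototype: (B, C).
    dists = torch.cdist(embeddings, prototypes)
    # Distance to the embedding's own prototype (the "positive").
    pos = dists.gather(1, labels.unsqueeze(1)).squeeze(1)
    # Mask out the true class, then take the closest wrong prototype
    # (the "negative").
    masked = dists.scatter(1, labels.unsqueeze(1), float('inf'))
    neg = masked.min(dim=1).values
    # Standard triplet hinge on the prototype distances.
    return F.relu(pos - neg + margin).mean()

def update_buffer(buffer_x, buffer_y, new_x, new_y, embed_fn, per_class):
    """Variance-reducing replay-buffer update (assumed criterion): for each
    class, keep the `per_class` samples whose embeddings lie closest to the
    class mean, so the buffered samples for a class stay tightly clustered."""
    xs = torch.cat([buffer_x, new_x])
    ys = torch.cat([buffer_y, new_y])
    with torch.no_grad():
        z = embed_fn(xs)
    keep = []
    for c in ys.unique():
        idx = (ys == c).nonzero(as_tuple=True)[0]
        center = z[idx].mean(dim=0, keepdim=True)        # class mean embedding
        order = torch.cdist(z[idx], center).squeeze(1).argsort()
        keep.append(idx[order[:per_class]])              # nearest-to-mean samples
    keep = torch.cat(keep)
    return xs[keep], ys[keep]
```

In this sketch, "decreasing the variance of the buffered samples" is realized by retaining the samples whose embeddings are nearest their class mean; the paper's actual selection rule is not specified in the abstract and may differ.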
