
HIERARCHICAL MULTI-TASK LEARNING VIA TASK AFFINITY GROUPINGS

Siddharth Srivastava, Swati Bhugra, Vinay Kaushik, Brejesh Lall

Poster 11 Oct 2023

Multi-task learning (MTL) permits joint task learning based on a shared deep learning architecture and multiple loss functions. Despite recent advances in MTL, when tasks are unrelated one loss often dominates the optimization, which frequently results in poorer performance than the corresponding single-task learning. To overcome this "negative transfer", we propose a novel hierarchical framework that leverages task relations via inter-task affinity to supervise multi-task learning. Specifically, the task sets generated via inter-task affinity, with the low-level task set at the bottom layer and the complex task set at the top, enable iterative multi-task information sharing. In addition, this alleviates the need for simultaneous image annotations across multiple tasks. The proposed framework achieves state-of-the-art results on classification, detection, semantic segmentation and depth estimation across three standard benchmarks. Furthermore, with state-of-the-art results on two image retrieval benchmarks, we also demonstrate that the embeddings learned with such a framework generalize well and provide robust representations.
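
The abstract does not spell out how inter-task affinity is computed; the sketch below is only an illustration of one common way to estimate it (a one-step "lookahead" update on a shared encoder driven by task i's loss, measuring the resulting change in task j's loss), not the authors' implementation. The names `shared`, `heads`, `losses`, and `batch` are hypothetical.

    import torch

    def inter_task_affinity(shared, heads, losses, batch, lr=1e-3):
        """Sketch: affinity[(i, j)] = relative improvement of task j's loss
        after a hypothetical one-step update of the shared encoder using
        task i's gradient. Positive values suggest task i helps task j."""
        x, targets = batch                       # targets: {task_name: labels}
        params = list(shared.parameters())

        # Baseline loss of every task with the current shared parameters.
        with torch.no_grad():
            feats = shared(x)
            base = {j: losses[j](heads[j](feats), targets[j]).item()
                    for j in heads}

        affinity = {}
        for i in heads:
            # Gradient of task i's loss w.r.t. the shared encoder only.
            loss_i = losses[i](heads[i](shared(x)), targets[i])
            grads = torch.autograd.grad(loss_i, params, allow_unused=True)

            backup = [p.detach().clone() for p in params]
            with torch.no_grad():
                # One-step lookahead update of the shared parameters.
                for p, g in zip(params, grads):
                    if g is not None:
                        p.sub_(lr * g)
                feats_new = shared(x)
                for j in heads:
                    if j == i:
                        continue
                    after = losses[j](heads[j](feats_new), targets[j]).item()
                    affinity[(i, j)] = 1.0 - after / max(base[j], 1e-12)
                # Restore the shared parameters before the next task.
                for p, old in zip(params, backup):
                    p.copy_(old)
        return affinity

Pairwise scores of this kind can then be averaged over training and used to group tasks, e.g. placing mutually helpful low-level tasks in one set and the more complex tasks in another, which is the kind of hierarchy the abstract describes.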
