FUNCTIONAL KNOWLEDGE TRANSFER WITH SELF-SUPERVISED REPRESENTATION LEARNING
Prakash Chandra Chhipa, Muskaan Chopra, Gopal Mengi, Varun Gupta, Richa Upadhyay, Meenakshi Subhash Chippa, Kanjar De, Rajkumar Saini, Seiichi Uchida, Marcus Liwicki
This work investigates the unexplored usability of self-supervised representation learning in the direction of functional knowledge transfer. Recent progress in self-supervised learning relies on large volumes of data, which constrains its application to small-scale datasets. Here, functional knowledge transfer is achieved on small-scale datasets by jointly optimizing the self-supervised pretext task and the supervised downstream task, which improves downstream task performance. This work presents a simple yet effective joint training framework in which learning self-supervised representations just-in-time reinforces human-supervised task learning, and vice versa. Experiments on three public datasets from different visual domains, Intel Image, CIFAR, and APTOS, show consistent performance improvements on classification tasks under joint optimization. Qualitative analysis also supports the robustness of the learnt representations. Source code and trained models are available on GitHub.
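To make the joint-optimization idea concrete, the following is a minimal PyTorch sketch, not the paper's implementation: it assumes a SimCLR-style contrastive pretext task (NT-Xent loss) combined with a supervised cross-entropy loss on the same batch through a shared encoder. The backbone choice (ResNet-18), head sizes, the loss weight alpha, and all names are illustrative assumptions.

# Minimal sketch of joint SSL + supervised optimization (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision

class JointModel(nn.Module):
    """Shared encoder with a projection head (pretext) and a classifier head."""
    def __init__(self, num_classes=10, proj_dim=128):
        super().__init__()
        self.encoder = torchvision.models.resnet18(weights=None)
        feat_dim = self.encoder.fc.in_features
        self.encoder.fc = nn.Identity()              # expose 512-d features
        self.projector = nn.Sequential(              # SSL projection head
            nn.Linear(feat_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, proj_dim))
        self.classifier = nn.Linear(feat_dim, num_classes)

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent (SimCLR) loss over two augmented views of the same batch."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2]), dim=1)
    sim = z @ z.t() / temperature
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))       # drop self-similarity
    # Positive pair for row i is its other augmented view.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def joint_step(model, view1, view2, labels, alpha=1.0):
    """One step of the joint objective: supervised loss + weighted SSL loss."""
    h1, h2 = model.encoder(view1), model.encoder(view2)
    ssl_loss = nt_xent(model.projector(h1), model.projector(h2))
    sup_loss = F.cross_entropy(model.classifier(h1), labels)
    return sup_loss + alpha * ssl_loss

# Usage: two augmented views of each labelled image feed both objectives.
model = JointModel(num_classes=6)                    # e.g. 6 Intel Image classes
v1, v2 = torch.randn(8, 3, 224, 224), torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 6, (8,))
loss = joint_step(model, v1, v2, labels)
loss.backward()

Because both losses backpropagate through the same encoder in one step, the pretext task regularizes the supervised features just-in-time rather than in a separate pre-training stage, which is what distinguishes this functional transfer from the usual pretrain-then-finetune pipeline.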