
Functional Knowledge Transfer with Self-supervised Representation Learning

DOI:
10.60864/kdcz-6w85
Citation Author(s):
Prakash Chandra Chhipa, Muskaan Chopra, Gopal Mengi, Varun Gupta, Richa Upadhyay, Meenakshi Subhash Chippa, Kanjar De, Rajkumar Saini, Seiichi Uchida, Marcus Liwicki
Submitted by:
Prakash Chandra...
Last updated:
17 November 2023 - 12:05pm
Document Type:
Presentation Slides
Document Year:
2023
Presenters:
Prakash Chandra Chhipa
Paper Code:
3230

This work investigates the previously unexplored use of self-supervised representation learning for functional knowledge transfer. Functional knowledge transfer is achieved by jointly optimizing a self-supervised pretext task and a supervised learning task, improving performance on the supervised task. Recent progress in self-supervised learning relies on large volumes of data, which constrains its application to small-scale datasets. This work presents a simple yet effective joint training framework in which human-supervised task learning is reinforced by self-supervised representations learned just-in-time, and vice versa. Experiments on three public datasets from different visual domains, Intel Image, CIFAR, and APTOS, show consistent performance improvements on classification tasks under joint optimization. Qualitative analysis also supports the robustness of the learnt representations. Source code and trained models are available on GitHub.
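The joint-optimization idea described above can be sketched as follows: a shared encoder feeds both a supervised classification head and a self-supervised projection head, and the two losses are summed and back-propagated together. This is only an illustrative sketch under common SimCLR-style assumptions — the names (`SharedEncoder`, `nt_xent`, `joint_step`), the tiny linear backbone, and the loss weighting `alpha` are hypothetical stand-ins, not the authors' actual architecture or losses.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedEncoder(nn.Module):
    """Hypothetical shared encoder with a supervised head and an SSL head."""
    def __init__(self, dim=32, num_classes=10):
        super().__init__()
        self.backbone = nn.Linear(64, dim)             # stands in for a CNN backbone
        self.classifier = nn.Linear(dim, num_classes)  # supervised task head
        self.projector = nn.Linear(dim, dim)           # SSL projection head

    def forward(self, x):
        h = F.relu(self.backbone(x))
        return self.classifier(h), F.normalize(self.projector(h), dim=1)

def nt_xent(z1, z2, tau=0.5):
    """Simplified contrastive (NT-Xent-style) loss between two augmented views."""
    logits = z1 @ z2.t() / tau                 # pairwise cosine similarities
    targets = torch.arange(z1.size(0))         # matching index is the positive pair
    return F.cross_entropy(logits, targets)

def joint_step(model, opt, x1, x2, labels, alpha=1.0):
    """One joint step: supervised cross-entropy plus SSL loss on the same batch."""
    logits, z1 = model(x1)
    _, z2 = model(x2)                          # second augmented view
    loss = F.cross_entropy(logits, labels) + alpha * nt_xent(z1, z2)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

model = SharedEncoder()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(8, 64)                         # toy batch; x + noise mimics augmentation
loss = joint_step(model, opt, x, x + 0.05 * torch.randn(8, 64),
                  torch.randint(0, 10, (8,)))
```

Because both heads share the backbone, gradients from the self-supervised loss regularize the representation used by the supervised task — the "vice versa" reinforcement the abstract refers to.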
