Functionally Similar Multi-Label Knowledge Distillation

Bibliographic Details
Published in: ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 7210 - 7214
Main Authors: Chen, Binghan, Hu, Jianlong, Zheng, Xiawu, Lin, Wei, Chao, Fei, Ji, Rongrong
Format: Conference Proceeding
Language: English
Published: IEEE 14-04-2024
Description
Summary: Existing multi-label knowledge distillation methods simply apply regression or single-label classification techniques without fully exploiting the essence of multi-label classification, resulting in student models with inadequate performance and poor functional similarity to their teacher models. In this paper, we reinterpret multi-label classification as multiple intra-class ranking tasks, with each class corresponding to one ranking task. Furthermore, we define the knowledge of a multi-label classification model as the ranking of intra-class samples. On the one hand, we propose to evaluate the functional similarity between multi-label classification models with Kendall's tau and rank-biased overlap, which are common metrics for evaluating ranking similarity. On the other hand, we propose a new functionally similar multi-label knowledge distillation method, called FSD, which enables student models to learn the ranking of intra-class samples from teacher models. Finally, experimental results validate that FSD outperforms existing methods, especially in functional similarity. Specifically, we achieve a mAP of 73.38% and an mKDT of 0.686 on COCO, which are 2.22% and 0.19 better than existing methods, respectively.
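To illustrate the ranking-based view of functional similarity described in the summary, here is a minimal sketch (not the authors' code) that scores teacher-student agreement as the mean per-class Kendall's tau over intra-class sample rankings. The function name, the per-class averaging scheme, and the synthetic data are assumptions for illustration; the paper's exact mKDT definition may differ.

```python
import numpy as np
from scipy.stats import kendalltau

def mean_kendall_tau(teacher_scores: np.ndarray, student_scores: np.ndarray) -> float:
    """Average Kendall's tau across classes.

    teacher_scores, student_scores: arrays of shape (num_samples, num_classes)
    holding each model's predicted score for every sample and class.
    """
    taus = []
    for c in range(teacher_scores.shape[1]):
        # Compare how the two models rank all samples for class c.
        tau, _ = kendalltau(teacher_scores[:, c], student_scores[:, c])
        if not np.isnan(tau):
            taus.append(tau)
    return float(np.mean(taus))

# Illustration with random scores only (80 classes, as in COCO).
rng = np.random.default_rng(0)
teacher = rng.random((128, 80))
student = teacher + 0.1 * rng.standard_normal(teacher.shape)  # student roughly mimics teacher
print(mean_kendall_tau(teacher, student))
```

A higher value indicates that the student orders intra-class samples more like the teacher, which is the notion of functional similarity the summary attributes to FSD; rank-biased overlap could be substituted for Kendall's tau in the same loop.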
ISSN: 2379-190X
DOI: 10.1109/ICASSP48485.2024.10447660