Learning with Structural Labels for Learning with Noisy Labels
Published in: 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 27600-27610
Main Authors: , ,
Format: Conference Proceeding
Language: English
Published: IEEE, 16-06-2024
Summary: Deep Neural Networks (DNNs) have demonstrated remarkable performance across diverse domains and tasks with large-scale datasets. To reduce labeling costs for large-scale datasets, semi-automated and crowdsourcing labeling methods have been developed, but their labels are inevitably noisy. Learning with Noisy Labels (LNL) approaches aim to train DNNs despite the presence of noisy labels. These approaches utilize the memorization effect to select correct labels and refine noisy ones, which are then used for subsequent training. However, these methods suffer a significant decrease in the model's generalization performance due to the noisy labels that inevitably remain. To overcome this limitation, we propose a new approach that enhances learning with noisy labels by incorporating additional distribution information: structural labels. To leverage this additional distribution information for generalization, we employ a reverse k-NN, which helps the model achieve a better feature manifold and mitigates overfitting to noisy labels. The proposed method outperforms prior work on multiple benchmark datasets with instance-dependent noise (IDN) and on real-world noisy datasets.
ISSN: 2575-7075
DOI: 10.1109/CVPR52733.2024.02607
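
The abstract's central mechanism is the reverse k-NN: the reverse neighbors of a sample i are all samples that count i among their own k nearest neighbors. As a rough illustration of that generic idea only (the paper's actual structural-label construction is not reproduced in this record), the sketch below computes reverse k-NN sets over a batch of feature vectors; the function names and the choice of k are hypothetical.

```python
# Minimal sketch of reverse k-NN over feature vectors (illustrative only;
# not the paper's implementation). Names and parameters are assumptions.
import numpy as np

def knn_indices(features: np.ndarray, k: int) -> np.ndarray:
    """Indices of each sample's k nearest neighbors (excluding itself)."""
    sq = np.sum(features ** 2, axis=1)
    # Squared Euclidean distance matrix via the expansion |x-y|^2.
    dists = sq[:, None] + sq[None, :] - 2.0 * features @ features.T
    np.fill_diagonal(dists, np.inf)          # exclude self-matches
    return np.argsort(dists, axis=1)[:, :k]  # k closest per row

def reverse_knn_sets(features: np.ndarray, k: int) -> list[set[int]]:
    """Reverse k-NN of i: all samples j that include i among their k-NN."""
    nn = knn_indices(features, k)
    rknn = [set() for _ in range(len(features))]
    for j, neighbors in enumerate(nn):
        for i in neighbors:
            rknn[int(i)].add(j)
    return rknn

# Toy usage: random features standing in for a DNN's penultimate layer.
feats = np.random.randn(100, 16).astype(np.float32)
rknn = reverse_knn_sets(feats, k=5)
print([len(s) for s in rknn[:10]])  # reverse-neighbor counts per sample
```

Intuitively, samples with many reverse neighbors sit in dense, mutually consistent regions of the feature manifold, while samples with few are isolated and more likely to carry noisy labels; a structural label could plausibly be derived from such sets, though the paper's exact formulation is not given here.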