A Large-Class Few-Shot Learning Method Based on High-Dimensional Features

Bibliographic Details
Published in: Applied Sciences, Vol. 13, No. 23, p. 12843
Main Authors: Dang, Jiawei; Zhou, Yu; Zheng, Ruirui; He, Jianjun
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01-12-2023
Description
Summary: Large-class few-shot learning (FSL) has a wide range of applications in many fields, such as the medical, power, security, and remote sensing fields. At present, many few-shot learning methods have been proposed for fewer-class scenarios, but little research has been performed on large-class scenarios. In this paper, we propose a large-class few-shot learning method called HF-FSL, which is based on high-dimensional features. Recent theoretical research shows that if the distribution of samples in a high-dimensional feature space satisfies the conditions of intra-class compactness and inter-class dispersion, a large-class few-shot learning method has better generalization ability. Inspired by this theory, the basic idea is to use a deep neural network to extract high-dimensional features and unitize them to project the samples onto a hypersphere. A global orthogonal regularization strategy can then be used to make the samples of different classes on the hypersphere as orthogonal as possible, thereby achieving intra-class compactness and inter-class dispersion in the high-dimensional feature space. Experiments on Omniglot, Fungi, and ImageNet demonstrate that the proposed method effectively improves recognition accuracy on large-class FSL problems.
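
The two mechanisms named in the abstract, unitizing features onto the hypersphere and globally regularizing different-class features toward orthogonality, can be illustrated with a short sketch. What follows is a minimal, assumed PyTorch rendering, not the paper's implementation: the function names unitize and global_orthogonal_regularizer are hypothetical, and the moment-matching form of the penalty is borrowed from the spread-out regularization literature rather than taken from this article.

    import torch
    import torch.nn.functional as F

    def unitize(features: torch.Tensor) -> torch.Tensor:
        """L2-normalize each feature vector so it lies on the unit hypersphere."""
        return F.normalize(features, p=2, dim=1)

    def global_orthogonal_regularizer(features: torch.Tensor,
                                      labels: torch.Tensor) -> torch.Tensor:
        """Push unit features of *different* classes toward orthogonality.

        Assumed (spread-out style) form: over different-class pairs, drive the
        mean cosine similarity toward 0 and its second moment toward 1/d, the
        value attained by directions spread uniformly over the d-sphere.
        """
        z = unitize(features)                              # (n, d) unit vectors
        sims = z @ z.t()                                   # pairwise cosine similarities
        neg = labels.unsqueeze(0) != labels.unsqueeze(1)   # mask of different-class pairs
        s = sims[neg]                                      # similarities of those pairs
        d = z.size(1)
        first_moment = s.mean().pow(2)                     # -> 0 when classes are orthogonal
        second_moment = torch.clamp(s.pow(2).mean() - 1.0 / d, min=0.0)
        return first_moment + second_moment

In training, such a term would typically be added to the ordinary classification loss, e.g. loss = F.cross_entropy(logits, labels) + lam * global_orthogonal_regularizer(feats, labels), where lam is a hypothetical weighting hyperparameter; the article itself does not specify this combination.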
ISSN: 2076-3417
DOI: 10.3390/app132312843