An Improved ResNet-50 for Garbage Image Classification

Bibliographic Details
Published in:Tehnički vjesnik Vol. 29; no. 5; pp. 1552 - 1559
Main Authors: Ma, Xiaoxuan, Li, Zhiwen, Zhang, Lei
Format: Journal Article
Language:English
Published: Slavonski Brod: University of Osijek, 01-10-2022
Josipa Jurja Strossmayer University of Osijek
Faculty of Mechanical Engineering in Slavonski Brod; Faculty of Electrical Engineering, Computer Science and Information Technology Osijek; Faculty of Civil Engineering and Architecture Osijek
Description
Summary: To address the shortcomings of existing classification models, this study proposes a new garbage classification model built by modifying the structure of the ResNet-50 network. The improvement has two parts. The first is a modified residual block: an attention module is inserted into the residual block to filter the input features, and the downsampling operation within the block is changed to reduce information loss. The second is multi-scale feature fusion: horizontal and vertical multi-scale feature fusion is integrated into the main network structure to make fuller use of the extracted features. Because image features are filtered and reused, the enhanced model achieves higher classification performance than existing models on small datasets with few samples. Experimental results show that the modified model outperforms the original ResNet-50 model on the TrashNet dataset by 7.62% and is more robust. The model is also more accurate than other advanced methods.
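The summary does not specify which attention module the authors insert into the residual block. As an illustrative sketch only (not the paper's actual method), a squeeze-and-excitation-style channel attention placed on the residual branch before the skip connection could look like the following NumPy fragment; the weights here are random placeholders standing in for learned parameters:

```python
import numpy as np

def channel_attention(x, reduction=4):
    """SE-style channel attention: squeeze (global average pool over
    spatial dims), excite (two small dense layers with ReLU/sigmoid),
    then rescale each channel. Weights are random placeholders."""
    c, h, w = x.shape
    rng = np.random.default_rng(0)
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    s = x.mean(axis=(1, 2))                  # squeeze: (c,)
    z = np.maximum(w1 @ s, 0.0)              # excite with ReLU
    a = 1.0 / (1.0 + np.exp(-(w2 @ z)))      # sigmoid gate in (0, 1)
    return x * a[:, None, None]              # rescale channels

def residual_block_with_attention(x, branch):
    """Residual block in which the branch output is filtered by
    channel attention before being added to the skip connection."""
    y = branch(x)                # stand-in for the conv stack
    y = channel_attention(y)     # filter the branch features
    return x + y                 # identity skip connection
```

The downsampling change the summary mentions is likewise unspecified; a common variant of this idea (e.g. in ResNet-D-style designs) replaces the strided 1×1 convolution on the shortcut with average pooling followed by a 1×1 convolution, so that no activations are skipped over, but whether the authors use that exact scheme is not stated in this record.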
ISSN:1330-3651
1848-6339
DOI:10.17559/TV-20220420124810