Identification of tea leaf diseases by using an improved deep convolutional neural network

Bibliographic Details
Published in: Sustainable Computing: Informatics and Systems, Vol. 24, p. 100353
Main Authors: Hu, Gensheng, Yang, Xiaowei, Zhang, Yan, Wan, Mingzhu
Format: Journal Article
Language:English
Published: Elsevier Inc., 01-12-2019
Description
Summary:
•A method for identifying tea leaf diseases with low cost and high identification accuracy is proposed.
•Multiscale feature extraction is introduced to distinguish the features of different tea leaf diseases.
•Depthwise separable convolution is used instead of standard convolution to reduce the number of model parameters.
Accurate and rapid identification of tea leaf diseases is beneficial to their prevention and control. This study proposes a method based on an improved deep convolutional neural network (CNN) for identifying tea leaf diseases. A multiscale feature extraction module is added to an improved deep CNN based on the CIFAR10-quick model to strengthen its ability to automatically extract image features of different tea leaf diseases. Depthwise separable convolution is used in place of standard convolution to reduce the number of model parameters and accelerate model computation. Experimental results show that the average identification accuracy of the proposed method is 92.5%, higher than that of traditional machine learning methods and classical deep learning methods. The number of parameters and the number of iterations to convergence of the improved model are significantly lower than those of the VGG16 and AlexNet deep learning models.
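The parameter saving from replacing standard convolution with depthwise separable convolution, as described in the summary, can be sketched with a simple count. This is an illustrative calculation with assumed kernel and channel sizes, not the layer configuration of the paper's model; biases are omitted.

```python
def standard_conv_params(k, c_in, c_out):
    """Parameters of a standard k x k convolution: one k x k x c_in filter per output channel."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Depthwise k x k convolution (one filter per input channel)
    followed by a 1 x 1 pointwise convolution mixing channels."""
    return k * k * c_in + c_in * c_out

# Hypothetical layer: 3x3 kernel, 64 input channels, 128 output channels.
std = standard_conv_params(3, 64, 128)        # 3*3*64*128 = 73728
sep = depthwise_separable_params(3, 64, 128)  # 3*3*64 + 64*128 = 8768
print(std, sep, round(std / sep, 1))          # roughly an 8x reduction
```

In general the ratio is about 1/c_out + 1/k², so the saving grows with the number of output channels, which is consistent with the summary's claim that the improved model has far fewer parameters than VGG16 or AlexNet.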
ISSN:2210-5379
DOI:10.1016/j.suscom.2019.100353