EfficientMaize: A Lightweight Dataset for Maize Classification on Resource-Constrained Devices

Bibliographic Details
Published in: Data in Brief, Vol. 54, p. 110261
Main Authors: Asante, Emmanuel, Appiah, Obed, Appiahene, Peter, Adu, Kwabena
Format: Journal Article
Language: English
Published: Netherlands: Elsevier Inc., 01-06-2024
Description
Summary: Hyperspectral imaging, combined with deep learning techniques, has been employed to classify maize. However, these automated methods often require substantial processing and computing resources, which presents a significant challenge for deployment on embedded devices due to high GPU power consumption. Access to local Ghanaian maize data for such classification tasks is also extremely limited. To address these challenges, this research creates a simple dataset comprising three distinct types of local maize seeds in Ghana. The goal is to facilitate the development of an efficient maize classification tool that minimizes computational cost and reduces human involvement in grading seeds for marketing and production. The dataset is presented in two parts. The raw set contains 4,846 images categorized as bad or good: 2,211 images belong to the bad class and 2,635 to the good class. The augmented set contains 28,910 images, with 13,250 in the bad class and 15,660 in the good class. All images have been validated by experts from Heritage Seeds Ghana and are freely available for use within the research community.
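
The record describes only the dataset's composition, not its on-disk layout. As a minimal sketch, assuming a hypothetical directory structure in which each split has "bad" and "good" subfolders (the path "EfficientMaize/raw" and the 224x224 resize are illustrative assumptions, not taken from the record), the raw images could be loaded for binary classification with torchvision's ImageFolder:

from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Preprocessing: resizing to 224x224 is an assumption, not stated in the record.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# ImageFolder maps each subfolder name ("bad", "good") to a class index.
# "EfficientMaize/raw" is a hypothetical path for the 4,846 raw images.
raw_set = datasets.ImageFolder("EfficientMaize/raw", transform=preprocess)
loader = DataLoader(raw_set, batch_size=32, shuffle=True)

print(raw_set.classes)   # expected: ['bad', 'good']
print(len(raw_set))      # expected: 4,846 for the raw split

The same loading pattern would apply to the augmented split (28,910 images), with only the directory path changed.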
ISSN: 2352-3409
DOI: 10.1016/j.dib.2024.110261