Performance Analysis of YOLO and Detectron2 Models for Detecting Corn and Soybean Pests Employing Customized Dataset

Bibliographic Details
Published in: Agronomy (Basel), Vol. 14, No. 10, p. 2194
Main Authors: de Almeida, Guilherme Pires Silva; dos Santos, Leonardo Nazário Silva; da Silva Souza, Leandro Rodrigues; da Costa Gontijo, Pablo; de Oliveira, Ruy; Teixeira, Matheus Cândido; De Oliveira, Mario; Teixeira, Marconi Batista; do Carmo França, Heyde Francielle
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01-10-2024
Description
Summary: One of the most challenging aspects of agricultural pest control is the accurate detection of insects in crops. Inadequate control measures for insect pests can seriously impact the production of corn and soybean plantations. In recent years, artificial intelligence (AI) algorithms have been used extensively to detect insect pests in the field. In this line of research, this paper introduces a method to detect four key insect species that are predominant in Brazilian agriculture. Our model relies on computer vision techniques, including You Only Look Once (YOLO) and Detectron2, and adapts them to lightweight formats—TensorFlow Lite (TFLite) and Open Neural Network Exchange (ONNX)—for resource-constrained devices. Our method leverages two datasets: a comprehensive one and a smaller sample for comparison purposes. With this setup, we used the two datasets to evaluate the performance of the computer vision models and then converted the best-performing models into TFLite and ONNX formats, facilitating their deployment on edge devices. The results are promising. Even in the worst-case scenario, where the ONNX model trained on the reduced dataset was compared to the YOLOv9-gelan model trained on the full dataset, precision reached 87.3% and accuracy reached 95.0%.
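
The record does not include code, but the export step described in the summary (converting a trained YOLO detector to ONNX and TFLite for edge deployment) is commonly performed with the Ultralytics Python API. The sketch below is an illustration only, assuming a hypothetical weights file "best.pt" from training on the pest dataset; it is not the authors' implementation.

```python
# Minimal sketch: exporting a trained YOLO detector to ONNX and TFLite
# for edge deployment, as described in the abstract. Assumes the
# Ultralytics package is installed and a hypothetical weights file
# "best.pt" exists (not the authors' code).
from ultralytics import YOLO

# Load the trained detection weights (hypothetical path).
model = YOLO("best.pt")

# Export to ONNX, usable with runtimes such as ONNX Runtime on edge devices.
onnx_path = model.export(format="onnx", imgsz=640)

# Export to TensorFlow Lite for mobile/embedded targets.
tflite_path = model.export(format="tflite", imgsz=640)

print("ONNX model:", onnx_path)
print("TFLite model:", tflite_path)
```

On-device inference would then rely on a lightweight runtime such as ONNX Runtime or the TFLite interpreter rather than the full PyTorch stack, which is the motivation for the conversion step in the paper.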
ISSN: 2073-4395
DOI: 10.3390/agronomy14102194