Enhancement of YOLOv5 for automatic weed detection through backbone optimization
Abstract
In the context of our research project, which involves developing a robotic system capable of eliminating weeds using deep learning techniques, the selection of a powerful object detection model is essential. Object detectors typically consist of three components: a backbone, a neck, and a prediction head. In this study, we propose an enhancement to the you only look once version 5 (YOLOv5) network by using the most popular convolutional neural network (CNN) architectures (such as DarkNet and MobileNet) as backbones. The objective of this study is to identify the backbone that best improves YOLOv5's performance in detecting and precisely localizing pea crops while preserving its other layers (neck and head). Additionally, we compared their results with those of the most commonly used object detectors. Our findings indicate that the fastest models among the networks studied were MobileNet, YOLO-tiny, and YOLOv5, with speeds ranging from 5 to 14 milliseconds per image. Among these models, MobileNetv1 demonstrated the highest accuracy, achieving an average precision (AP) score of 89.3% at an intersection over union (IoU) threshold of 0.5. However, the accuracy of this model decreased as the threshold increased, suggesting that it does not provide perfect crop delineation. On the other hand, while YOLOv5 had a lower AP score than MobileNetv1 at an IoU threshold of 0.5, it exhibited greater stability when faced with variations in this threshold.
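To make the evaluation criterion above concrete, the following is a minimal sketch (not the authors' code) of how the IoU between a predicted and a ground-truth bounding box can be computed and compared against a threshold such as 0.5 to decide whether a detection counts as a true positive for AP; the function names and the (x1, y1, x2, y2) box format are illustrative assumptions.

```python
# Illustrative sketch: IoU between two axis-aligned boxes given as (x1, y1, x2, y2),
# and a threshold check as used when computing AP at IoU = 0.5.

def iou(box_a, box_b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle (empty if the boxes do not overlap)
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    # Union = sum of the two areas minus the intersection
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def is_true_positive(pred_box, gt_box, threshold=0.5):
    """A detection is counted as correct when its IoU with the ground truth reaches the threshold."""
    return iou(pred_box, gt_box) >= threshold

if __name__ == "__main__":
    pred = (10, 10, 60, 60)
    gt = (15, 12, 65, 58)
    print(f"IoU = {iou(pred, gt):.3f}, TP@0.5 = {is_true_positive(pred, gt)}")
```

Raising the threshold (e.g. from 0.5 to 0.75) demands tighter overlap between prediction and ground truth, which is why a model whose boxes are only roughly aligned, such as MobileNetv1 in this study, loses AP faster than one with more stable localization.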
Keywords
Computer vision; Convolutional neural networks backbone; Deep learning; Object detectors; Smart farming; Weed detection
DOI: http://doi.org/10.11591/ijai.v14.i1.pp658-666
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
IAES International Journal of Artificial Intelligence (IJ-AI)
ISSN/e-ISSN 2089-4872/2252-8938
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).