
Novel Assessment of Region-Based CNNs for Detecting Monocot/Dicot Weeds in Dense Field Environments

Publication: Contribution to journal › Journal article › Research › peer review

  • Nima Teimouri, Agro Intelligence Aps
  • Rasmus Nyholm Jørgensen, Agro Intelligence Aps
  • Ole Green, Agro Intelligence Aps

Weeding operations are an effective way to increase crop yields. Reliable and precise weed detection is a prerequisite for high-precision weed monitoring and control in precision agriculture. To develop an effective approach for detecting weeds in red, green, and blue (RGB) images, two state-of-the-art object detection models, EfficientDet (coefficient 3) and YOLOv5m, were trained on more than 26,000 in situ labeled images with monocot/dicot classes, recorded in more than 200 different fields in Denmark. The dataset was collected with a high-velocity camera (HVCAM) equipped with a xenon ring flash that overpowers sunlight and minimizes shadows, enabling the camera to record images at a horizontal velocity of over 50 km h-1. On the software side, a novel image-processing algorithm was developed and used to generate synthetic images for testing model performance on difficult, heavily occluded scenes containing weeds. Both deep-learning networks were trained on in situ images and then evaluated on both synthetic and new, unseen in situ images to assess their performance. On 6625 synthetic images, the average precision (AP) of the EfficientDet and YOLOv5 models was 64.27% and 63.23%, respectively, for the monocot class and 45.96% and 37.11% for the dicot class. These results confirmed that both deep-learning networks can detect weeds with high performance. However, it is essential to verify each model's robustness on in situ images with heavy occlusion and complicated backgrounds. Therefore, 1149 in-field images were recorded in 5 different fields in Denmark and used to evaluate the robustness of both proposed models. Running both models on these 1149 in situ images yielded monocot/dicot APs of 27.43%/42.91% for EfficientDet and 30.70%/51.50% for YOLOv5. Furthermore, this paper provides information on the challenges of monocot/dicot weed detection by publicly releasing the 1149 in situ test images with their corresponding labels (RoboWeedMap) to facilitate research on weed detection in precision agriculture.
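As an illustration of the synthetic-image idea described in the abstract, the sketch below composites transparent weed cutouts onto a field background and emits YOLO-style bounding-box labels. This is a minimal, hypothetical sketch, not the paper's algorithm: the file-naming convention (class id encoded in the cutout file name), the number of pasted weeds, and all paths are assumptions made for the example.

# Hypothetical sketch of synthetic image generation by compositing weed
# cutouts onto field backgrounds. NOT the authors' algorithm; all names,
# paths, and parameters are illustrative assumptions.
import random
from pathlib import Path

from PIL import Image  # pip install pillow


def composite_synthetic_image(background_path, cutout_paths, n_weeds=8, seed=None):
    """Paste transparent weed cutouts onto a background image and return the
    composite plus YOLO-style labels (class, x_center, y_center, w, h),
    normalised to the background size."""
    rng = random.Random(seed)
    bg = Image.open(background_path).convert("RGB")
    W, H = bg.size
    labels = []
    for _ in range(n_weeds):
        cutout_path = rng.choice(cutout_paths)
        cutout = Image.open(cutout_path).convert("RGBA")
        # Assumed convention: class id prefixed in the file name,
        # e.g. "0_grass.png" (monocot = 0, dicot = 1).
        cls = int(Path(cutout_path).stem.split("_")[0])
        w, h = cutout.size
        x = rng.randint(0, max(0, W - w))
        y = rng.randint(0, max(0, H - h))
        # Alpha-masked paste; overlapping pastes produce occluded weeds.
        bg.paste(cutout, (x, y), mask=cutout)
        labels.append((cls, (x + w / 2) / W, (y + h / 2) / H, w / W, h / H))
    return bg, labels

A training or evaluation script would then write each composite image and one label line per tuple in the YOLO text format; average precision on such composites can be computed with any standard PASCAL-VOC or COCO AP implementation.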

Original language: English
Article number: 1167
Journal: Agronomy
Volume: 12
Issue: 5
Number of pages: 24
ISSN: 2073-4395
DOI
Status: Published - May 2022

ID: 271244132