pith · machine review for the scientific record

arxiv: 2604.23442 · v1 · submitted 2026-04-25 · 💻 cs.CV

Recognition: unknown

Resource-Constrained UAV-Based Weed Detection for Site-Specific Management on Edge Devices

Authors on Pith: no claims yet

Pith reviewed 2026-05-08 08:22 UTC · model grok-4.3

classification 💻 cs.CV
keywords: weed detection · UAV · edge computing · object detection · YOLO · RT-DETR · site-specific management · real-time inference

The pith

YOLOv11s and RT-DETRv2-R50-M offer the best balance between accuracy and speed for real-time UAV weed detection on edge devices.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The authors build a complete pipeline for acquiring UAV images of crop fields, training object detection models, and running inference directly on edge hardware to identify weeds in real time. They systematically compare convolution-based YOLO models from versions 8 to 12 against transformer-based RT-DETR models from versions 1 to 2, measuring both detection accuracy and processing speed on three different Jetson computers. High-performing models reach 86.9 percent mean average precision but run too slowly for flight use, while faster models drop to 66-71 percent accuracy. The evaluation singles out YOLOv11s and RT-DETRv2-R50-M as the models that best combine usable accuracy with the speed needed for practical deployment.
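
The paper's timing harness is not spelled out here, but the standard pattern for measuring per-image latency on a CUDA-capable device such as a Jetson is warm-up iterations, explicit GPU synchronization, and averaged timed runs. A minimal PyTorch sketch of that pattern; the dummy network and the warm-up/run counts are illustrative placeholders, not the authors' configuration:

```python
import time
import torch

def benchmark_latency_ms(model, input_shape=(1, 3, 640, 640),
                         warmup=10, runs=100, device="cuda"):
    """Mean single-image inference latency in milliseconds."""
    model = model.to(device).eval()
    x = torch.randn(*input_shape, device=device)
    with torch.no_grad():
        for _ in range(warmup):          # warm-up: stabilize clocks and caches
            model(x)
        if device == "cuda":
            torch.cuda.synchronize()     # flush queued GPU work before timing
        start = time.perf_counter()
        for _ in range(runs):
            model(x)
        if device == "cuda":
            torch.cuda.synchronize()     # ensure all timed runs have finished
        elapsed = time.perf_counter() - start
    return elapsed / runs * 1000.0

# Stand-in backbone; a YOLO or RT-DETR checkpoint would be timed the same way.
dummy = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, 3, stride=2, padding=1),
    torch.nn.ReLU(),
    torch.nn.Conv2d(16, 32, 3, stride=2, padding=1),
)
device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"{benchmark_latency_ms(dummy, device=device):.2f} ms/image")
```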

Core claim

The paper presents a deployment-oriented framework for UAV-based weed detection that combines data collection, model development, and on-device inference. Experiments across Jetson Orin Nano, AGX Xavier, and AGX Orin devices reveal that while some models attain up to 86.9% mAP50, their high latency prevents real-time operation. In comparison, RT-DETRv2-R50-M achieves 79% mAP50 with improved efficiency, YOLOv10n offers the quickest inference, and both YOLOv11s and RT-DETRv2-R50-M deliver the most favorable trade-off between accuracy and speed for real-time UAV deployment.

What carries the argument

The benchmarking process that tests multiple YOLO and RT-DETR object detection models for their mean average precision at 50% IoU (mAP50) and inference latency when executed on resource-limited Jetson edge processors.
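
For concreteness, mAP50 averages per-class average precision, counting a detection as correct when its IoU with a not-yet-matched ground-truth box is at least 0.5. A simplified single-class sketch with greedy confidence-ordered matching; the paper presumably uses a standard COCO/VOC-style implementation, so this is only illustrative:

```python
import numpy as np

def iou(a, b):
    """IoU of two axis-aligned boxes in (x1, y1, x2, y2) format."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / (union + 1e-9)

def ap50(preds, gts):
    """Average precision at IoU >= 0.5 for one class.
    preds: list of (image_id, confidence, box); gts: {image_id: [box, ...]}."""
    preds = sorted(preds, key=lambda p: -p[1])     # highest confidence first
    used = {img: [False] * len(b) for img, b in gts.items()}
    n_gt = sum(len(b) for b in gts.values())
    tp, fp = np.zeros(len(preds)), np.zeros(len(preds))
    for i, (img, _, box) in enumerate(preds):
        ious = [iou(box, g) for g in gts.get(img, [])]
        j = int(np.argmax(ious)) if ious else -1
        if j >= 0 and ious[j] >= 0.5 and not used[img][j]:
            tp[i], used[img][j] = 1, True          # first hit on this GT box
        else:
            fp[i] = 1                              # miss or duplicate detection
    rec = np.cumsum(tp) / max(n_gt, 1)
    prec = np.cumsum(tp) / np.maximum(np.cumsum(tp) + np.cumsum(fp), 1)
    env = np.maximum.accumulate(prec[::-1])[::-1]  # interpolated precision
    r = np.concatenate(([0.0], rec))
    return float(np.sum((r[1:] - r[:-1]) * env))   # area under the PR curve

# mAP50 is ap50 averaged over all weed classes.
gts = {"img1": [(0, 0, 10, 10), (20, 20, 30, 30)]}
preds = [("img1", 0.9, (1, 1, 10, 10)), ("img1", 0.6, (21, 19, 30, 31))]
print(f"AP50 = {ap50(preds, gts):.2f}")  # 1.00 for this toy example
```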

If this is right

  • Real-time on-device processing becomes feasible for UAV weed detection, eliminating the need for data transmission to remote servers.
  • Site-specific weed management can be implemented by using model outputs to guide precise herbicide application during flights.
  • Model selection can be tailored to available hardware, with lighter models prioritizing speed and balanced ones prioritizing accuracy (see the Pareto-front sketch after this list).
  • The identified models provide candidates ready for integration into commercial UAV systems for agricultural monitoring.
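
A minimal sketch of that hardware-tailored selection: keep only models that are not dominated on the (mAP50, latency) plane. The 86.9%, 79%, and 66-71% mAP50 figures echo the abstract; the latencies and the YOLOv11s score below are placeholders, not values from the paper.

```python
def pareto_front(models):
    """Return the models not dominated on (higher mAP50, lower latency)."""
    front = []
    for name, (acc, lat) in models.items():
        dominated = any(
            a >= acc and l <= lat and (a > acc or l < lat)
            for other, (a, l) in models.items() if other != name
        )
        if not dominated:
            front.append(name)
    return front

# Placeholder numbers for illustration only.
candidates = {
    "high-capacity model": (86.9, 120.0),   # (mAP50 %, latency ms)
    "RT-DETRv2-R50-M": (79.0, 45.0),
    "YOLOv11s": (78.0, 30.0),
    "YOLOv10n": (68.0, 12.0),
}
print(pareto_front(candidates))  # all four lie on this illustrative front
```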

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the authors make directly.

  • The same evaluation methodology could be applied to other precision agriculture tasks such as disease detection or crop counting on similar edge platforms.
  • Performance in varied real-world fields may require additional validation, as the current results are tied to the authors' chosen datasets and test conditions.
  • Combining these detections with drone-based spraying mechanisms could lead to fully autonomous weed control systems.

Load-bearing premise

Performance figures measured on the study's weed datasets and the three specific Jetson devices will translate reliably to other crop types, lighting conditions, weed densities, and flight scenarios.

What would settle it

Deploying YOLOv11s or RT-DETRv2-R50-M on a UAV in a new agricultural setting with different crops or environmental variables and measuring whether the observed accuracy and latency support real-time weed detection as claimed.
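
Whether a given latency "supports real-time detection" can be made operational with a simple throughput check: frames must be processed at least as fast as the UAV covers new ground. A back-of-envelope sketch; the flight-profile numbers are hypothetical, not from the paper.

```python
def realtime_feasible(latency_ms, ground_speed_mps, footprint_m, overlap=0.2):
    """Does per-frame latency keep up with the UAV's ground coverage?
    Each frame must be processed before the UAV flies past (1 - overlap)
    of the along-track image footprint."""
    seconds_per_frame = footprint_m * (1.0 - overlap) / ground_speed_mps
    required_fps = 1.0 / seconds_per_frame
    achieved_fps = 1000.0 / latency_ms
    return achieved_fps >= required_fps, required_fps, achieved_fps

# Hypothetical profile: 5 m/s ground speed, 10 m along-track footprint.
ok, need, got = realtime_feasible(latency_ms=45.0, ground_speed_mps=5.0,
                                  footprint_m=10.0)
print(f"feasible={ok}, need {need:.2f} fps, got {got:.1f} fps")
```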

Figures

Figures reproduced from arXiv: 2604.23442 by Dong Chen, Haibo Yao, Hanbo Huang, Kelvin Betitame, Linyuan Wang, Te-Ming Tseng, Xin Sun.

Figure 1: Class distribution of weed species in the dataset.

Figure 2: Overview of the experimental pipeline for evaluating object detection models on edge computing platforms.

Figure 3: Training accuracy curves of YOLO and RT-DETR models with data augmentation.

Figure 4: Visual comparison of multi-class weed detection results using YOLOv12n and RT-DETRv1-R18.

Figure 5: Weed infestation map and weed species identification generated from detected weed coordinates.

Figure 6: Inference speed and detection accuracy comparison of object detection models on edge devices.
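
The pipeline shown in Figure 2 randomly splits the annotated dataset 80/10/10 into training, validation, and test subsets, with four augmentations (flipping, shearing, cropping, and mosaic) applied only to the training set. A minimal sketch of that split; the seed and integer item IDs are illustrative, not the authors' code:

```python
import random

def split_dataset(items, seed=0, train_frac=0.8, val_frac=0.1):
    """Random 80/10/10 train/val/test split, as in the paper's pipeline."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n_train = int(len(items) * train_frac)
    n_val = int(len(items) * val_frac)
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

train_set, val_set, test_set = split_dataset(range(1000))
# Flipping, shearing, cropping, and mosaic augmentation would then be
# applied to train_set only, before model training.
print(len(train_set), len(val_set), len(test_set))  # 800 100 100
```
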
Original abstract

Weeds compete with crops for light, water, and nutrients, reducing yield and crop quality. Efficient weed detection is essential for site-specific weed management (SSWM). Although deep learning models have been deployed on UAV-based edge systems, a systematic understanding of how different model architectures perform under real-world resource constraints is still lacking. To address this gap, this study proposes a deployment-oriented framework for real-time UAV-based weed detection on resource-constrained edge platforms. The framework integrates UAV data acquisition, model development, and on-device inference, with a focus on balancing detection accuracy and computational efficiency. A diverse set of state-of-the-art object detection models is evaluated, including convolution-based YOLO models (v8-v12) and transformer-based RT-DETR models (v1-v2). Experiments on three edge devices (Jetson Orin Nano, Jetson AGX Xavier, and Jetson AGX Orin) demonstrate clear trade-offs between accuracy and inference latency across models and hardware configurations. Results show that high-capacity models achieve up to 86.9% mAP50 but suffer from high latency, limiting real-time deployment. In contrast, lightweight models achieve 66%-71% mAP50 with significantly lower latency, enabling real-time performance. Among all models, RT-DETRv2-R50-M achieves competitive accuracy (79% mAP50) with improved efficiency, while YOLOv10n provides the fastest inference speed. YOLOv11s and RT-DETRv2-R50-M offer the best balance between accuracy and speed, making them strong candidates for real-time UAV deployment.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

3 major / 2 minor

Summary. The paper proposes a deployment-oriented framework for real-time UAV-based weed detection on resource-constrained edge platforms. It evaluates convolution-based YOLO models (v8-v12) and transformer-based RT-DETR models (v1-v2) on three Jetson devices (Orin Nano, AGX Xavier, AGX Orin), reporting concrete mAP50 accuracy and inference latency figures. The central claim is that high-capacity models reach up to 86.9% mAP50 but with high latency, while YOLOv11s and RT-DETRv2-R50-M (at 79% mAP50) offer the best accuracy-speed balance and are strong candidates for real-time UAV deployment in site-specific weed management.

Significance. If the reported accuracy-latency trade-offs prove robust, the work supplies practical empirical benchmarks that could guide model selection for precision agriculture on edge hardware. The concrete mAP50 and latency numbers across models and devices are a clear strength, as is the systematic comparison of recent YOLO and RT-DETR variants. However, the absence of real-flight validation and statistical robustness checks limits the strength of the deployment recommendations.

major comments (3)
  1. [Abstract] The abstract supplies no dataset details, training protocol, statistical tests, or error bars, preventing verification of whether the central performance claims hold.
  2. [Experimental evaluation] The claim that YOLOv11s and RT-DETRv2-R50-M offer the best balance between accuracy and speed relies on mAP50 vs. latency trade-offs measured only on static Jetson benchmarks, without quantifying real UAV flight overheads such as camera streaming, altitude-induced scale variation, vibration, power draw, or motion blur.
  3. [Results] No multiple random seeds, cross-validation folds, or significance tests are mentioned to establish that one model is statistically superior in the accuracy-latency plane; this makes the ranking sensitive to dataset splits or training variability and undermines the recommendation of specific models as strongest candidates.
minor comments (2)
  1. [Abstract] It is unclear which specific high-capacity model achieves the reported 86.9% mAP50 maximum; specifying this would better contextualize the trade-offs with the lightweight models (66%-71% mAP50).
  2. [Abstract] The statement that YOLOv10n provides the fastest inference speed is presented alongside the recommendation of YOLOv11s for best balance; a summary table clarifying exact metrics for all evaluated models would improve readability.

Simulated Author's Rebuttal

3 responses · 1 unresolved

We thank the referee for the constructive feedback, which highlights important aspects of clarity, experimental design, and statistical robustness. We address each major comment point by point below, indicating revisions where feasible while maintaining the manuscript's focus on deployment-oriented benchmarks.

Point-by-point responses
  1. Referee: [Abstract] The abstract supplies no dataset details, training protocol, statistical tests, or error bars, preventing verification of whether the central performance claims hold.

    Authors: We agree that the abstract would benefit from greater specificity. We have revised the abstract to include a concise description of the dataset (a publicly available UAV weed imagery collection) and the training protocol (standard supervised fine-tuning with the same hyperparameters across models). Due to abstract length constraints, full details on statistical aspects remain in the methods and results sections, where we clarify the single-run reporting approach. revision: yes

  2. Referee: [Experimental evaluation] The claim that YOLOv11s and RT-DETRv2-R50-M offer the best balance between accuracy and speed relies on mAP50 vs. latency trade-offs measured only on static Jetson benchmarks, without quantifying real UAV flight overheads such as camera streaming, altitude-induced scale variation, vibration, power draw, or motion blur.

    Authors: We acknowledge that our benchmarks isolate model inference on the target Jetson platforms to enable reproducible hardware-specific comparisons. Real-flight factors such as motion blur and vibration are indeed additional variables that depend on specific UAV configurations and conditions. We have added a dedicated limitations paragraph in the discussion section that explicitly addresses these overheads and their potential impact on the reported trade-offs, while noting that the core latency figures remain directly relevant for edge deployment feasibility. revision: partial

  3. Referee: [Results] No multiple random seeds, cross-validation folds, or significance tests are mentioned to establish that one model is statistically superior in the accuracy-latency plane; this makes the ranking sensitive to dataset splits or training variability and undermines the recommendation of specific models as strongest candidates.

    Authors: We recognize the merit of statistical validation for model rankings. Our experiments used fixed random seeds and standard training protocols to ensure fair, reproducible comparisons across architectures, with latency being a deterministic hardware metric. The accuracy gaps (e.g., ~79% mAP50 for the recommended models versus lower for lighter variants) are large enough to support practical recommendations. We have inserted a clarifying statement in the results section explaining the single-run design and its implications, while suggesting multi-seed analysis as future work. revision: partial
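
The multi-seed analysis the rebuttal defers to future work would amount to training each model under several random seeds and reporting the mean and spread of mAP50 rather than a single run. A minimal aggregation sketch; the per-seed scores are hypothetical, not values from the paper:

```python
import statistics

def summarize_map50(scores):
    """Mean and sample standard deviation of mAP50 across training seeds."""
    mean = statistics.mean(scores)
    std = statistics.stdev(scores) if len(scores) > 1 else 0.0
    return mean, std

# Hypothetical per-seed scores for one model.
mean, std = summarize_map50([78.6, 79.3, 78.9])
print(f"mAP50 = {mean:.1f} ± {std:.1f}")
```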

standing simulated objections not resolved
  • Quantification of real UAV flight overheads (camera streaming, vibration, motion blur, power draw) via actual flight experiments, as this would require new data collection and hardware setups outside the scope of the current revision.

Circularity Check

0 steps flagged

No circularity in empirical model benchmarking study

full rationale

The paper conducts an empirical benchmark of object detection models (YOLO variants and RT-DETR) for UAV weed detection, reporting measured mAP50 accuracy and inference latency on the authors' datasets across three Jetson edge devices. The central claim that YOLOv11s and RT-DETRv2-R50-M offer the best accuracy-speed balance follows directly from these experimental results without any intervening derivations, equations, fitted parameters, or predictions. No self-citations, ansatzes, uniqueness theorems, or renamings of known results are invoked to support the performance rankings or deployment recommendations. The study contains no derivation chain that could reduce to its own inputs by construction, rendering it self-contained as a comparative experiment.

Axiom & Free-Parameter Ledger

1 free parameter · 1 axiom · 0 invented entities

The central claims rest on standard deep-learning training and evaluation assumptions plus the implicit premise that mAP50 and inference latency are the decisive metrics for deployment success.

free parameters (1)
  • model hyperparameters and training settings
    Standard DL training involves many fitted or chosen hyperparameters whose specific values are not reported in the abstract.
axioms (1)
  • domain assumption: Standard assumptions of deep learning model training, evaluation metrics, and hardware benchmarking hold for the reported comparisons.
    Implicit throughout the abstract when stating mAP50 and latency results.

pith-pipeline@v0.9.0 · 5612 in / 1185 out tokens · 52198 ms · 2026-05-08T08:22:37.333415+00:00 · methodology


Reference graph

Works this paper leans on

52 extracted references · 4 canonical work pages · 2 internal anchors


  2. [2] B. Adhikari, J. Li, E. S. Michel, J. Dykes, T.-M. Tseng, M. L. Tagert, and D. Chen. A comprehensive evaluation of YOLO-based deer detection performance on edge devices. Electronics, 15(5):1026, 2026.

  3. [3] A. Ahmad, D. Saraswat, V. Aggarwal, A. Etienne, and B. Hancock. Performance of deep learning models for classifying and detecting common weeds in corn and soybean production systems. Computers and Electronics in Agriculture, 184:106081, 2021.

  4. [4] A. Allmendinger, A. O. Saltık, G. G. Peteinatos, A. Stein, and R. Gerhards. Assessing the capability of YOLO- and transformer-based object detectors for real-time weed detection. Precision Agriculture, 26(3):52, 2025.

  5. [5] V. N. Arsenoaia, D. C. Topa, R. N. Ratu, and I. Tenu. From sensing to intervention: A critical review of agricultural drones for precision agriculture, data-driven decision making, and sustainable intensification. Agronomy, 16(5):564, 2026.

  6. [6] D. Balasingham, S. Samarathunga, G. G. Arachchige, A. Bandara, S. Wellalage, D. Pandithage, M. M. Hansika, and R. De Silva. Sparrow: Smart precision agriculture robot for ridding of weeds. In 2024 5th International Conference for Emerging Technology (INCET), pages 1–6. IEEE, 2024.

  7. [7] K. Betitame, C. Igathinathane, K. Howatt, J. Mettler, C. Koparan, and X. Sun. A practical guide to UAV-based weed identification in soybean: Comparing RGB and multispectral sensor performance. Journal of Agriculture and Food Research, 20:101784, 2025.

  8. [8] A. Bochkovskiy, C.-Y. Wang, and H.-Y. M. Liao. YOLOv4: Optimal speed and accuracy of object detection. arXiv preprint arXiv:2004.10934, 2020.

  9. [9] S. Candiago, F. Remondino, M. De Giglio, M. Dubbini, and M. Gattelli. Evaluating multispectral images and vegetation indices for precision farming applications from UAV images. Remote Sensing, 7(4):4026–4047, 2015.

  10. [10] N. Carion, F. Massa, G. Synnaeve, N. Usunier, A. Kirillov, and S. Zagoruyko. End-to-end object detection with transformers. In European Conference on Computer Vision, pages 213–229. Springer, 2020.

  11. [11] M. Everingham, L. Van Gool, C. K. Williams, J. Winn, and A. Zisserman. The Pascal Visual Object Classes (VOC) challenge. International Journal of Computer Vision, 88(2):303–338, 2010.

  12. [12] O. L. García-Navarrete, J. H. Camacho-Tamayo, A. B. Bregon, J. Martín-García, and L. M. Navas-Gracia. Performance analysis of real-time detection transformer and You Only Look Once models for weed detection in maize cultivation. Agronomy, 15(4):796, 2025.

  13. [13] A. Gautam, H. Kaur, and P. Kashyap. Real-time weed detection using YOLOv8: A lightweight vision system for smart farming. In International Conference on Data Science and Applications, pages 307–317. Springer, 2025.

  14. [14] H. Gauttam, V. Chauhan, K. Pattanaik, A. Trivedi, H. Ghosh, et al. A comprehensive review of edge computing empowered smart agriculture: Trends, opportunities and future directions. Computers and Electronics in Agriculture, 241:111252, 2026.

  15. [15] R. Gerhards, D. Andújar Sanchez, P. Hamouz, G. G. Peteinatos, S. Christensen, and C. Fernandez-Quintanilla. Advances in site-specific weed management in agriculture: A review. Weed Research, 62(2):123–133, 2022.

  16. [16] A. Gómez, H. Moreno, and D. Andújar. Intelligent inter- and intra-row early weed detection in commercial maize crops. Plants, 14(6):881, 2025.

  17. [17] M. D. Islam, W. Liu, P. Izere, P. Singh, C. Yu, B. Riggan, K. Zhang, A. J. Jhala, S. Knezevic, Y. Ge, et al. Towards real-time weed detection and segmentation with lightweight CNN models on edge devices. Computers and Electronics in Agriculture, 237:110600, 2025.

  18. [18] G. Jocher, J. Qiu, and A. Chaurasia. Ultralytics YOLO, Jan. 2023. URL https://github.com/ultralytics/ultralytics.

  19. [19] G. Jocher, J. Qiu, and A. Chaurasia. Ultralytics YOLO, 2024. URL https://github.com/ultralytics/ultralytics.

  20. [20] A. T. Khan, S. M. Jensen, and A. R. Khan. Advancing precision agriculture: A comparative analysis of YOLOv8 for multi-class weed detection in cotton cultivation. Artificial Intelligence in Agriculture, 15(2):182–191, 2025.

  21. [21] M. Lei, S. Li, Y. Wu, H. Hu, Y. Zhou, X. Zheng, G. Ding, S. Du, Z. Wu, and Y. Gao. YOLOv13: Real-time object detection with hypergraph-enhanced adaptive visual perception. arXiv preprint arXiv:2506.17733, 2025.

  22. [22] J. Li, D. Chen, X. Yin, and Z. Li. Performance evaluation of semi-supervised learning frameworks for multi-class weed detection. Frontiers in Plant Science, 15:1396568, 2024.

  23. [23] T.-Y. Lin, M. Maire, S. Belongie, J. Hays, P. Perona, D. Ramanan, P. Dollár, and C. L. Zitnick. Microsoft COCO: Common objects in context. In European Conference on Computer Vision, pages 740–755. Springer, 2014.

  24. [24] P. Lottes, J. Behley, A. Milioto, and C. Stachniss. Fully convolutional networks with sequential information for robust crop and weed detection in precision farming. IEEE Robotics and Automation Letters, 3(4):2870–2877, 2018.

  25. [25] W. Lv, Y. Zhao, Q. Chang, K. Huang, G. Wang, and Y. Liu. RT-DETRv2: Improved baseline with bag-of-freebies for real-time detection transformer. arXiv preprint arXiv:2407.17140, 2024.

  26. [26] N. Ma, X. Zhang, H.-T. Zheng, and J. Sun. ShuffleNet V2: Practical guidelines for efficient CNN architecture design. In Proceedings of the European Conference on Computer Vision (ECCV), pages 116–131, 2018.

  27. [27] E. Mavridou, E. Vrochidou, G. A. Papakostas, T. Pachidis, and V. G. Kaburlasos. Machine vision systems in precision agriculture for crop farming. Journal of Imaging, 5(12):89, 2019.

  28. [28] B. Ni, L. Xiao, D. Lin, T.-L. Zhang, Q. Zhang, Y. Liu, Q. Chen, D. Zhu, H. Qian, M. C. Rillig, et al. Increasing pesticide diversity impairs soil microbial functions. Proceedings of the National Academy of Sciences, 122(2):e2419917122, 2025.

  29. [29] N. Rai, Y. Zhang, M. Villamil, K. Howatt, M. Ostlie, and X. Sun. Agricultural weed identification in images and videos by integrating optimized deep learning architecture on an edge computing technology. Computers and Electronics in Agriculture, 216:108442, 2024.

  30. [30] B. Ram, J. Joy, N. Marcotte, S. Urlacher, D. McDonald, J. K. Amundson, G. Silewski, A. Jhala, and X. Sun. An edge-AI enabled UAV system for site-specific application targeting Palmer amaranth in corn and soybean fields. Journal of Agriculture and Food Research, page 102577, 2025.

  31. [31] A. O. Saltık, A. Allmendinger, and A. Stein. Comparative analysis of YOLOv9, YOLOv10 and RT-DETR for real-time weed detection. In European Conference on Computer Vision, pages 177–193. Springer, 2024.

  32. [32] Y. Shuai, J. Shi, Y. Li, S. Zhou, L. Zhang, and J. Mu. YOLO-SW: A real-time weed detection model for soybean fields using Swin Transformer and RT-DETR. Agronomy, 15(7):1712, 2025.

  33. [33] N. Soltani, J. A. Dille, I. C. Burke, W. J. Everman, M. J. VanGessel, V. M. Davis, and P. H. Sikkema. Perspectives on potential soybean yield losses from weeds in North America. Weed Technology, 31(1):148–154, 2017.

  34. [34] A. Srinivas, T.-Y. Lin, N. Parmar, J. Shlens, P. Abbeel, and A. Vaswani. Bottleneck transformers for visual recognition. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pages 16519–16529, 2021.

  35. [35] E. Sulzbach, I. Scheeren, M. S. T. Veras, M. C. Tosin, W. A. E. Kroth, A. Merotto Jr, and C. Markus. Deep learning model optimization methods and performance evaluation of YOLOv8 for enhanced weed detection in soybeans. Computers and Electronics in Agriculture, 232:110117, 2025.

  36. [36] H. Sun, T. Liu, J. Wang, D. Zhai, and J. Yu. Evaluation of two deep learning-based approaches for detecting weeds growing in cabbage. Pest Management Science, 80(6):2817–2826, 2024.

  37. [37] J. Terven, D.-M. Córdova-Esparza, and J.-A. Romero-González. A comprehensive review of YOLO architectures in computer vision: From YOLOv1 to YOLOv8 and YOLO-NAS. Machine Learning and Knowledge Extraction, 5(4):1680–1716, 2023.

  38. [38] K. Thorp and L. Tian. A review on remote sensing of weeds in agriculture. Precision Agriculture, 5(5):477–508, 2004.

  39. [39] Y. Tian, Q. Ye, and D. Doermann. YOLOv12: Attention-centric real-time object detectors. arXiv preprint arXiv:2502.12524, 2025.

  40. [40] Z. Ugljic, A. Mobli, M. C. Oliveira, C. A. Proctor, J. A. Dille, and R. Werle. Stakeholder assessment of weed management practices and perceptions of targeted spraying technologies in corn-soybean systems. Frontiers in Agronomy, 7:1601328, 2025.

  41. [41] A. Upadhyay, G. Sunil, Y. Zhang, C. Koparan, and X. Sun. Development and evaluation of a machine vision and deep learning-based smart sprayer system for site-specific weed management in row crops: An edge computing approach. Journal of Agriculture and Food Research, 18:101331, 2024.

  42. [42] A. Vijayakumar and S. Vairavasundaram. YOLO-based object detection models: A review and its applications. Multimedia Tools and Applications, 83(35):83535–83574, 2024.

  43. [43] A. Wang, W. Zhang, and X. Wei. A review on weed detection using ground-based machine vision and image processing techniques. Computers and Electronics in Agriculture, 158:226–240, 2019.

  44. [44] A. Wang, H. Chen, L. Liu, K. Chen, Z. Lin, J. Han, and G. Ding. YOLOv10: Real-time end-to-end object detection. Advances in Neural Information Processing Systems, 37:107984–108011, 2024a.

  45. [45] C.-Y. Wang, I.-H. Yeh, and H.-Y. Mark Liao. YOLOv9: Learning what you want to learn using programmable gradient information. In European Conference on Computer Vision, pages 1–21. Springer, 2024b.

  46. [46] S. Wang, C. Xia, F. Lv, and Y. Shi. RT-DETRv3: Real-time end-to-end object detection with hierarchical dense positive supervision. In WACV, pages 1628–1636, 2025.

  47. [47] P.-P. Win, H.-H. Park, and Y.-I. Kuk. Control efficacy of natural products on broadleaf and grass weeds using various application methods. Agronomy, 13(9):2262, 2023.

  48. [48] X. Xue, V. Thakur, H. Dhumras, R. H. Jhaveri, and T. R. Gadekallu. Real-time pest detection using ResNet-50 and vision transformer: An IoT-enabled mobile application for smart agriculture. IEEE Transactions on Consumer Electronics, 2025.

  49. [49] Z. Yang, W.-Z. Liang, N. Lawrence, X. Qiao, B. Riggan, R. Harveson, C.-E. Chiang, J. Oboamah, and D. P. Andjawo. WeedCam: An edge-computing camera system for multi-species weed detection in sugar beet production fields. Computers and Electronics in Agriculture, 244:111498, 2026.

  50. [50] W. Zhang, H. Huang, Y. Sun, and X. Wu. AgriPest-YOLO: A rapid light-trap agricultural pest detection method based on deep learning. Frontiers in Plant Science, 13:1079384, 2022.

  51. [51] Y. Zhao, W. Lv, S. Xu, J. Wei, G. Wang, Q. Dang, Y. Liu, and J. Chen. DETRs beat YOLOs on real-time object detection. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pages 16965–16974, 2024.

  52. [52] Q. Zhou, Z. Wang, Y. Zhong, F. Zhong, and L. Wang. Efficient optimized YOLOv8 model with extended vision. Sensors, 24(20):6506, 2024.