Terrain Perception for Agricultural UAVs in Complex Farmland via Rotating mmWave Radar
Pith reviewed 2026-05-09 14:52 UTC · model grok-4.3
The pith
Rotating mmWave radar with pose-consistent reconstruction enables accurate terrain perception for agricultural UAVs.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
The authors introduce a low-cost rotating mmWave radar system for agricultural UAVs that enlarges spatial coverage beyond fixed field-of-view limits, paired with a pose-consistent terrain reconstruction pipeline suited to sparse, noisy radar data; together these enable reliable ground extraction and continuous surface estimation in real farmland conditions.
What carries the argument
The mechanically rotating sensing design combined with the pose-consistent terrain reconstruction pipeline, which processes radar returns under dynamic flight to extract ground and estimate terrain surfaces.
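The paper's pipeline details are not reproduced on this page; as a minimal sketch, pose-consistent accumulation amounts to transforming each radar scan into a common world frame using the UAV pose at capture time. The function and variable names below are illustrative assumptions, not the paper's API; poses are assumed to come from onboard state estimation as a rotation `R` and translation `t` per scan.

```python
import numpy as np

def accumulate_scans(scans, poses):
    """Transform per-scan radar points into a common world frame.

    scans: list of (N_i, 3) arrays of points in the sensor frame.
    poses: list of (R, t) pairs, R a 3x3 rotation matrix and t a
           3-vector, mapping sensor coordinates into the world frame.
    Returns a single (sum N_i, 3) array of world-frame points.
    """
    world_points = []
    for pts, (R, t) in zip(scans, poses):
        # x_world = R @ x_sensor + t, applied row-wise
        world_points.append(pts @ R.T + t)
    return np.vstack(world_points)
```

Because the radar head also rotates mechanically, the per-scan pose here would fold together the UAV body pose and the instantaneous mount angle; keeping those two consistent is exactly what "pose-consistent" reconstruction has to get right.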
If this is right
- Improved terrain coverage and observability during low-altitude dynamic flights in farmland.
- Higher ground-segmentation accuracy, with an F1 score of 94.42 versus 90.48 for the closest rival.
- More robust terrain-following flight capabilities for agricultural UAVs.
- Low-cost alternative to denser sensors like LiDAR for complex environments.
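For reference, the reported numbers are F1 scores for ground segmentation. A quick sketch of how an F1 score is computed from true-positive, false-positive, and false-negative counts (the counts used below are illustrative only, not the paper's data):

```python
def f1_score(tp, fp, fn):
    """F1 as the harmonic mean of precision and recall, in percent."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 200 * precision * recall / (precision + recall)

# Illustrative counts: equal precision and recall of 0.944
# give an F1 of 94.4, the same ballpark as the reported 94.42.
print(f1_score(944, 56, 56))
```

A roughly 4-point F1 gap at this level means meaningfully fewer misclassified ground points, which matters when the segmentation directly drives terrain-following altitude.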
Where Pith is reading between the lines
- The approach may generalize to other UAV tasks requiring surface estimation in unstructured settings.
- Integration with existing flight controllers could reduce reliance on GPS or visual odometry in adverse weather.
- Testing on different crop types or slopes would reveal limits of the sparse-data handling.
Load-bearing premise
The combination of mechanical rotation and the reconstruction pipeline reliably produces accurate continuous terrain surfaces from the sparse and noisy mmWave data collected during real dynamic flights over farmland.
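The specific ground-extraction algorithm is the paper's own and is not detailed on this page; as a generic stand-in, a RANSAC plane fit illustrates how a dominant ground surface can be pulled out of sparse, outlier-laden 3D points. Thresholds, iteration counts, and the flat-plane assumption below are illustrative, not the paper's parameters.

```python
import numpy as np

def ransac_ground_plane(points, n_iters=200, inlier_thresh=0.1, seed=0):
    """Fit a ground plane n.x + d = 0 to noisy 3D points via RANSAC.

    points: (N, 3) array. inlier_thresh is the maximum point-to-plane
    distance (metres) for a point to count as ground.
    Returns (unit normal, offset d, boolean inlier mask).
    """
    rng = np.random.default_rng(seed)
    best_mask, best_model = None, None
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:  # degenerate (collinear) sample, skip
            continue
        n = n / norm
        d = -n @ sample[0]
        dist = np.abs(points @ n + d)
        mask = dist < inlier_thresh
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_model = mask, (n, d)
    return best_model[0], best_model[1], best_mask
```

A single global plane is clearly too crude for the complex farmland geometry the paper targets; its pipeline would need something piecewise or continuous over the accumulated cloud, which is precisely where the sparsity and noise of mmWave returns make the problem hard.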
What would settle it
Deployment in a new farmland site with different terrain features where the ground segmentation F1 score drops below 90 or the UAV fails to maintain safe terrain-following distance.
read the original abstract
Accurate terrain perception is essential for terrain-following flight of agricultural unmanned aerial vehicles (UAVs), yet remains challenging in real-world farmland due to occlusions, complex terrain geometry, and environmental disturbances. Millimeter-wave (mmWave) radar is a promising sensing modality for this task due to its robustness to adverse conditions; however, existing UAV-mounted radar systems rely on fixed field of view (FoV) and terrain extraction methods designed for dense LiDAR data, leading to incomplete and unreliable terrain estimation. To address these limitations, we present a low-cost rotating mmWave radar-enabled terrain perception framework for agricultural UAVs operating in complex farmland environments. Specifically, a mechanically rotating sensing design is introduced to enlarge spatial coverage and improve terrain observability beyond the limitations of fixed-view radar under dynamic low-altitude flight. Building upon this sensing capability, we further design a pose-consistent terrain reconstruction pipeline tailored for sparse, noisy, and partially observable radar data, enabling reliable ground extraction and continuous terrain surface estimation in challenging agricultural scenarios. The complete system is deployed on a real agricultural UAV platform and comprehensively evaluated through extensive field experiments. Experimental results demonstrate improved terrain coverage and estimation accuracy, achieving an F1 score of 94.42 for ground segmentation, while the closest rival only achieves 90.48. Thus, leading to more robust terrain following flight.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper presents a terrain perception framework for agricultural UAVs that uses a mechanically rotating mmWave radar to increase spatial coverage beyond fixed-FoV limitations during low-altitude flight in complex farmland. It introduces a pose-consistent reconstruction pipeline to extract ground points and estimate continuous surfaces from sparse, noisy radar returns, and reports field experiments on a real UAV platform showing an F1 score of 94.42 for ground segmentation versus 90.48 for the closest baseline.
Significance. If the results hold, the work offers a practical, low-cost sensing solution for robust terrain-following in agricultural UAVs where LiDAR fails due to occlusions or weather. Credit is due for the real-platform deployment, quantitative field comparison, and explicit linkage between the rotating design and measured coverage/accuracy gains.
minor comments (3)
- [Abstract] The sentence 'Thus, leading to more robust terrain following flight' is grammatically incomplete and should be rephrased for clarity.
- [Abstract] The abstract refers to 'the closest rival' without naming the method or providing a citation; specifying the baseline (e.g., fixed-view mmWave with LiDAR-style extraction) would help readers assess fairness of the 4-point F1 gain.
- [Experimental results] While field trials and qualitative maps are described, adding a table with per-flight F1 values, number of runs, and environmental conditions would improve reproducibility and allow assessment of variance.
Simulated Author's Rebuttal
We thank the referee for the positive assessment of our terrain perception framework using rotating mmWave radar for agricultural UAVs. The recognition of the practical advantages in real farmland settings, the real-platform experiments, and the quantitative gains in coverage and F1 score (94.42 vs. 90.48) is appreciated. The recommendation for minor revision is noted.
Circularity Check
No significant circularity; the experimental results are independent of the paper's own inputs.
full rationale
The paper's core contribution is a mechanically rotating mmWave radar design plus a pose-consistent reconstruction pipeline for terrain perception on agricultural UAVs. All load-bearing claims (F1 score of 94.42 for ground segmentation, improved coverage over fixed-view baselines) are obtained from physical hardware deployment and quantitative field trials on real farmland, not from any self-referential definitions, fitted parameters presented as predictions, or self-citation chains. No equations or uniqueness theorems reduce to the paper's own inputs by construction; the evaluation protocol directly measures the claimed coverage and accuracy gains against external rivals. The derivation chain is therefore self-contained and externally falsifiable via the reported experiments.
Axiom & Free-Parameter Ledger
axioms (1)
- Domain assumption: mmWave radar returns can be processed to extract ground points and estimate terrain surfaces despite sparsity, noise, and partial observability in dynamic low-altitude flight.