pith. machine review for the scientific record.

arXiv: 2604.03747 · v1 · submitted 2026-04-04 · 💻 cs.RO


CT-VoxelMap: Efficient Continuous-Time LiDAR-Inertial Odometry with Probabilistic Adaptive Voxel Mapping

Authors on Pith: no claims yet

Pith reviewed 2026-05-13 17:07 UTC · model grok-4.3

classification 💻 cs.RO
keywords continuous-time odometry · LiDAR-inertial fusion · B-spline representation · matrix Lie groups · voxel mapping · robot localization · IMU propagation

The pith

Representing B-spline control-point increments on matrix Lie groups yields simpler analytical Jacobians for continuous-time LiDAR-inertial odometry.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper seeks to deliver stable localization for robots during fast motion or on rough terrain by fusing LiDAR and IMU data in a continuous-time framework. It replaces earlier quaternion-based or variable-control-point spline methods with increments expressed directly on matrix Lie groups, exploiting the cumulative B-spline form to obtain compact Jacobians that avoid boundary-condition handling. IMU forward propagation supplies an online estimate of spline-to-trajectory fitting error, while a hybrid feature-driven voxel map and a re-estimation policy together raise accuracy, robustness, and speed. Experiments on public datasets report better results than prior continuous-time approaches on most sequences.
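The representation at the heart of the claim can be made concrete. Below is a minimal numpy sketch of a cumulative cubic B-spline on SO(3), with control-point increments d_j = log(R_{j-1}^T R_j) blended by the standard cumulative basis for uniform knots. This is an editorial illustration of the cumulative form only, not the paper's implementation: the paper works on general matrix Lie groups and derives the analytical Jacobians, which this sketch omits.

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def so3_exp(w):
    """Rodrigues formula: exponential map from so(3) to SO(3)."""
    th = np.linalg.norm(w)
    W = hat(w)
    if th < 1e-8:
        return np.eye(3) + W
    return np.eye(3) + (np.sin(th) / th) * W \
        + ((1.0 - np.cos(th)) / th**2) * (W @ W)

def so3_log(R):
    """Logarithm map from SO(3) to a rotation vector."""
    c = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    th = np.arccos(c)
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    if th < 1e-8:
        return v / 2.0
    return (th / (2.0 * np.sin(th))) * v

# Cumulative blending matrix for a uniform cubic B-spline.
C = np.array([[6, 0, 0, 0],
              [5, 3, -3, 1],
              [1, 3, 3, -2],
              [0, 0, 0, 1]]) / 6.0

def spline_rotation(ctrl, u):
    """Rotation on the spline at normalized time u in [0, 1), given four
    control rotations. Increments d_j = log(R_{j-1}^T R_j) are scaled by
    the cumulative basis and composed via the exponential map."""
    lam = C @ np.array([1.0, u, u * u, u**3])
    R = ctrl[0].copy()
    for j in range(1, 4):
        d = so3_log(ctrl[j - 1].T @ ctrl[j])
        R = R @ so3_exp(lam[j] * d)
    return R
```

Because the variables are the increments d_j rather than the control points themselves, perturbing one d_j touches a single exponential factor, which is what makes the Jacobians compact.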

Core claim

By treating increments of B-spline control points as variables on matrix Lie groups and using the cumulative spline formulation, the method obtains simpler analytical Jacobians without extra boundary considerations. IMU measurements are propagated forward to estimate fitting errors online; these estimates feed a hybrid feature-based voxel map whose management is adapted probabilistically. A re-estimation policy further reduces compute while preserving robustness. The resulting system shows superior trajectory accuracy on most evaluated public sequences.

What carries the argument

Cumulative B-spline representation of pose increments on matrix Lie groups, paired with IMU-driven online fitting-error estimation and probabilistic adaptive voxel mapping.

If this is right

  • Simpler Jacobians reduce the cost of each Gauss-Newton iteration, enabling higher update rates on embedded hardware.
  • Online fitting-error correction keeps the estimated trajectory closer to the true continuous motion even when control points are sparsely placed.
  • Hybrid voxel management improves map consistency when the robot transitions between flat and uneven surfaces.
  • The re-estimation policy allows the system to drop low-information features without restarting the entire optimization.
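The fitting-error mechanism behind the second bullet can be sketched in a few lines: propagate the IMU forward and score the gap to the spline prediction. The Euler integration, the assumption of gravity-compensated world-frame accelerations, and the residual definition are illustrative simplifications, not the paper's estimator.

```python
import numpy as np

def imu_propagate(p0, v0, acc, dt):
    """Forward-propagate position by Euler integration of (assumed
    gravity-compensated, world-frame) accelerometer samples."""
    p, v = p0.copy(), v0.copy()
    traj = [p.copy()]
    for a in acc:
        v = v + a * dt
        p = p + v * dt
        traj.append(p.copy())
    return np.array(traj)

def fitting_error(spline_pos, imu_pos):
    """Per-sample distance between spline-predicted and IMU-propagated
    positions -- the online proxy used to weight map updates and trigger
    re-estimation (the weighting scheme itself is not shown here)."""
    return np.linalg.norm(np.asarray(spline_pos) - np.asarray(imu_pos),
                          axis=1)
```

A large per-sample gap flags stretches where the spline under-fits the true motion, which is where denser knots or a re-estimation pass would pay off.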

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same Lie-group increment trick could be applied to other spline-based estimators that currently operate in quaternion space.
  • Because fitting errors are estimated rather than ignored, the method may tolerate lower spline knot density, freeing memory on long trajectories.
  • The probabilistic voxel rule might be replaced by learned feature importance without changing the rest of the pipeline.
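As a rough picture of the kind of probabilistic voxel rule the last bullet refers to, a voxel can be scored by the eigenvalue spectrum of its point scatter and then kept as a planar feature, subdivided, or discarded. The scoring function, the noise floor, and every threshold below are hypothetical illustrations, not the paper's tuned values.

```python
import numpy as np

def plane_probability(points, noise_var=1e-4):
    """Crude planarity score for one voxel: one minus the ratio of the
    smallest to largest scatter eigenvalue, with a floor for sensor
    noise. Returns 0 when too few points are available."""
    pts = np.asarray(points)
    if len(pts) < 5:
        return 0.0
    cov = np.cov(pts.T)
    evals = np.sort(np.linalg.eigvalsh(cov))  # ascending
    return float(1.0 - evals[0] / max(evals[2], noise_var))

def classify_voxel(points, keep_thresh=0.95, min_split=20):
    """Keep the voxel as a plane, subdivide it, or discard it
    (illustrative decision rule)."""
    p = plane_probability(points)
    if p >= keep_thresh:
        return "plane"
    return "subdivide" if len(points) >= min_split else "discard"
```

A learned importance score, as the bullet suggests, would simply replace `plane_probability` while leaving the keep/split/discard logic intact.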

Load-bearing premise

The IMU forward-propagation estimate of B-spline fitting error remains unbiased and the hybrid voxel-map selection does not favor particular terrains or motion profiles.

What would settle it

Run the estimator on a high-speed rough-terrain sequence, measure the difference between the spline trajectory and the IMU-integrated ground-truth trajectory at many interior knots, and check whether the reported fitting-error correction actually reduces that residual below the level achieved by earlier spline methods.
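Once both trajectories are sampled at the interior knots, the settling experiment reduces to a residual comparison with the correction on versus off. The RMSE metric below is an editorial choice, not necessarily what the paper reports.

```python
import numpy as np

def knot_residual_rmse(spline_pos, ref_pos):
    """RMSE between spline-predicted positions and a reference
    (e.g. IMU-integrated ground truth) sampled at interior knots.
    Comparing this value with and without the fitting-error
    correction is the proposed settling test."""
    d = np.linalg.norm(np.asarray(spline_pos) - np.asarray(ref_pos),
                       axis=1)
    return float(np.sqrt(np.mean(d**2)))
```

If the corrected run does not push this number below the level of earlier spline methods on a high-speed rough-terrain sequence, the central claim weakens.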

Figures

Figures reproduced from arXiv: 2604.03747 by Chuan Cao, Han Zhang, Lei Zhao, Tianchen Deng, Weidong Chen, Xingyi Li.

Figure 1: Dataset platforms and mapping performance. Despite …
Figure 2: The LiDAR coordinate frame is denoted as …
Figure 3: The black line denotes a continuous trajectory, repre…
Figure 4: Schematic diagram of representation error of continu…
Figure 5: Voxel representation and decomposition diagram.
Figure 6: The schematic diagram illustrates three prediction and …
Figure 7: The experiments are conducted on four public datasets with diverse platforms and environmental conditions. M2UD is …
Figure 8: Performance comparison of different methods on the 'aggressive04' and 'AMtown03' sequences.
Figure 10: Performance comparison of CT-VLO with …
Figure 11: Performance comparison of CT-VLO with different re-estimation times on the 'AMtown03' sequence.
Figure 12: Description of the experimental wheel-legged robot.
Figure 13: Comparison of trajectories across different scenes in the dataset.
original abstract

Maintaining stable and accurate localization during fast motion or on rough terrain remains highly challenging for mobile robots with onboard resources. Currently, multi-sensor fusion methods based on continuous-time representation offer a potential and effective solution to this challenge. Among these, spline-based methods provide an efficient and intuitive approach for continuous-time representation. Previous continuous-time odometry works based on B-splines either treat control points as variables to be estimated or perform estimation in quaternion space, which introduces complexity in deriving analytical Jacobians and often overlooks the fitting error between the spline and the true trajectory over time. To address these issues, we first propose representing the increments of control points on matrix Lie groups as variables to be estimated. Leveraging the feature of the cumulative form of B-splines, we derive a more compact formulation that yields simpler analytical Jacobians without requiring additional boundary condition considerations. Second, we utilize forward propagation information from IMU measurements to estimate fitting errors online and further introduce a hybrid feature-based voxel map management strategy, enhancing system accuracy and robustness. Finally, we propose a re-estimation policy that significantly improves system computational efficiency and robustness. The proposed method is evaluated on multiple challenging public datasets, demonstrating superior performance on most sequences. Detailed ablation studies are conducted to analyze the impact of each module on the overall pose estimation system.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The paper proposes CT-VoxelMap, a continuous-time LiDAR-inertial odometry framework. It represents B-spline control-point increments on matrix Lie groups to obtain compact analytical Jacobians, estimates B-spline fitting errors online via IMU forward propagation to weight a hybrid feature-based probabilistic voxel map, and applies a re-estimation policy to improve efficiency. The method is claimed to achieve superior accuracy and robustness on challenging public datasets relative to prior spline-based approaches, with supporting ablation studies.

Significance. If the central performance claims are substantiated, the work would provide a practical, resource-efficient advance for real-time localization on mobile robots in fast-motion or rough-terrain scenarios by combining Lie-group spline representations with adaptive voxel management.

major comments (2)
  1. [§3.2] §3.2 (IMU forward-propagation error estimation): the claim that IMU-propagated residuals provide an unbiased online estimate of B-spline fitting error is load-bearing for the hybrid voxel-map weighting and re-estimation policy. IMU bias, scale-factor errors, and limited spline bandwidth can produce systematic residuals on high-frequency motion; without explicit bias analysis or sensitivity experiments on rough-terrain sequences, the reported accuracy gains remain unverified.
  2. [Experiments] Experiments section, Table/Figure of per-sequence results: the superiority claim is stated for 'most sequences' but lacks reported standard deviations, full ablation tables with numerical deltas, or controls for post-hoc dataset selection. This weakens the statistical support for the central performance assertion.
minor comments (2)
  1. [Abstract] Abstract: the phrase 'detailed ablation studies' should briefly enumerate the modules tested to improve reader orientation.
  2. [Method] Notation: ensure the Lie-group cumulative B-spline formulation is cross-referenced to the exact equation numbers used in the Jacobian derivation for clarity.

Simulated Authors' Rebuttal

2 responses · 0 unresolved

Thank you for the constructive feedback on our manuscript. We address each major comment point by point below and will revise the paper accordingly to strengthen the validation of the IMU-based error estimation and the statistical presentation of results.

point-by-point responses
  1. Referee: [§3.2] §3.2 (IMU forward-propagation error estimation): the claim that IMU-propagated residuals provide an unbiased online estimate of B-spline fitting error is load-bearing for the hybrid voxel-map weighting and re-estimation policy. IMU bias, scale-factor errors, and limited spline bandwidth can produce systematic residuals on high-frequency motion; without explicit bias analysis or sensitivity experiments on rough-terrain sequences, the reported accuracy gains remain unverified.

    Authors: We appreciate the referee's observation on the assumptions underlying the IMU forward-propagation residuals. In our framework, IMU biases are jointly estimated within the optimization state and corrected online, which reduces systematic drift in the propagated residuals. The spline bandwidth is chosen to balance computational cost and motion fidelity based on typical LiDAR rates. Nevertheless, we agree that an explicit sensitivity study would better substantiate the unbiased claim under high-frequency or rough-terrain conditions. In the revised manuscript we will add a dedicated subsection with bias-perturbation experiments and results on additional rough-terrain sequences, reporting accuracy with and without bias correction to quantify robustness. revision: yes

  2. Referee: [Experiments] Experiments section, Table/Figure of per-sequence results: the superiority claim is stated for 'most sequences' but lacks reported standard deviations, full ablation tables with numerical deltas, or controls for post-hoc dataset selection. This weakens the statistical support for the central performance assertion.

    Authors: We agree that stronger statistical reporting would improve the presentation. The current results use standard public benchmarks and report all sequences without omission, yet we acknowledge the value of standard deviations and explicit numerical deltas. In the revision we will expand the Experiments section to include standard deviations (computed over repeated runs where feasible), full ablation tables showing per-module accuracy and runtime deltas, and a brief clarification on dataset usage confirming that no post-hoc selection occurred. These additions will directly support the superiority claims with transparent quantitative evidence. revision: yes

Circularity Check

0 steps flagged

No significant circularity; derivation remains self-contained

full rationale

The paper introduces a Lie-group representation for B-spline control-point increments, derives compact Jacobians from the cumulative B-spline form, and treats IMU forward propagation for online fitting-error estimation as an independent module feeding a hybrid voxel-map policy and re-estimation rule. These steps are presented as design choices whose performance is assessed on external public datasets. No equation reduces a claimed improvement to a quantity defined by the same fitted parameters, no load-bearing premise rests solely on self-citation, and no uniqueness theorem is imported from prior author work to force the formulation. The central accuracy claims therefore rest on empirical comparison rather than tautological re-labeling of inputs.

Axiom & Free-Parameter Ledger

1 free parameters · 2 axioms · 0 invented entities

The method rests on standard Lie-group properties for B-spline cumulative form and conventional IMU integration assumptions; no new physical entities are introduced. Free parameters such as voxel resolution thresholds and feature-probability cutoffs are implied but not quantified in the abstract.

free parameters (1)
  • voxel resolution and probability thresholds
    Adaptive voxel map management requires choosing or tuning resolution and probability cutoffs that affect feature retention and map density.
axioms (2)
  • domain assumption B-spline cumulative form allows compact Jacobian derivation without boundary conditions
    Invoked when representing control-point increments on Lie groups.
  • domain assumption IMU forward propagation provides unbiased estimates of spline fitting error
    Central to the hybrid error-correction step.

pith-pipeline@v0.9.0 · 5544 in / 1515 out tokens · 39096 ms · 2026-05-13T17:07:38.703730+00:00 · methodology

