RLI-SLAM: Fast Robust Ranging-LiDAR-Inertial Tightly-Coupled Localization and Mapping
Abstract
1. Introduction
- We tightly fuse high-accuracy UWB ranging measurements with the inertial sensor, which effectively eliminates the initial bias and long-term drift of the inertial sensor. As a result, the point-cloud distortion of a fast-moving LiDAR can be compensated in real time, both in the initial phase and during long-term operation, even with ranging measurements from a single anchor (see the sketch after this list).
- We introduce an efficient, low-complexity loop closure detection module based on an incremental smoothing factor graph. The module integrates seamlessly into our RLI-SLAM system and enables high-precision mapping in challenging environments.
- We conduct extensive benchmark comparisons and validate that, compared with other state-of-the-art systems, our approach delivers highly accurate, robust, flexible, and fast state estimation and mapping in long-term, fast-motion scenarios. Specifically, there is no limitation on the number of tightly coupled ranging measurements, and the loop closure detection module can be seamlessly integrated to further improve accuracy. As for flexibility, even without any ranging measurement, the tightly coupled LiDAR and inertial sensors alone maintain high-accuracy state estimation. Moreover, our approach matches the low computational complexity of FAST-LIO2 [4], the fastest LiDAR-based odometry available.
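To make the single-anchor claim concrete, the following is a minimal Python sketch of how one UWB range measurement can correct drift in an inertial state estimate via an EKF-style update. The state layout, the anchor position `anchor`, and the noise variance `r` are illustrative assumptions for exposition, not the paper's implementation:

```python
# Minimal sketch (not the authors' implementation) of a tightly coupled
# UWB range update on an inertial state. Assumptions: the anchor position
# is known, the first three state entries are the tag position, and the
# range noise is Gaussian with variance r.
import numpy as np

def uwb_range_update(x, P, z, anchor, r=0.01):
    """One EKF-style update from a single anchor's range measurement.

    x : (n,) state vector, x[:3] = tag position in the world frame
    P : (n, n) state covariance
    z : measured range (m) from tag to anchor
    anchor : (3,) known anchor position in the world frame
    r : range measurement noise variance (m^2)
    """
    d = x[:3] - anchor
    z_hat = np.linalg.norm(d)          # predicted range
    H = np.zeros((1, x.size))
    H[0, :3] = d / z_hat               # Jacobian of range w.r.t. position
    S = H @ P @ H.T + r                # innovation covariance (1x1)
    K = P @ H.T / S                    # Kalman gain
    x = x + (K * (z - z_hat)).ravel()  # corrected state
    P = (np.eye(x.size) - K @ H) @ P   # corrected covariance
    return x, P
```

A single range only constrains the radial direction toward the anchor, but combined with the inertial prior encoded in P, repeated updates are enough to bound long-term drift, which is why even one anchor helps in practice.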
2. Related Works
2.1. UWB-LiDAR-Inertial Odometry
2.2. Loop-Closure Detection
3. System Architecture
4. Methodology
4.1. Preliminaries
4.1.1. State Estimates
4.1.2. Synchronization
4.2. UWB-LiDAR-Inertial Odometry
4.2.1. Motion Compensation
4.2.2. UWB Constraint and Drift Correction
4.2.3. Observation Model
4.3. Loop Closure Detection
4.3.1. Odometry Factor
4.3.2. Loop Closure Factor
Algorithm 1: Loop Closure Detection Based on Point Cloud Descriptors
Input: the index of the current keyframe; L: the set of loop closure frame indices; d: the loop closure search radius; T: the loop closure search time difference threshold
Algorithm 2: Loop Closure Validity Check
Input: the index of the current keyframe; the index of the loop closure keyframe; S: the loop detection similarity threshold
Algorithm 3: Loop Closure Detection Based on SCD
Input: the index of the current keyframe; L: the set of loop closure frame indices; d: the loop closure search radius; T: the loop closure search time difference threshold
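The inputs shared by Algorithms 1 and 3 (radius d, time difference threshold T) and the similarity threshold S of Algorithm 2 suggest the following candidate-search structure. This is a hedged sketch under assumed data structures (`Keyframe`, a plain cosine-similarity metric), not the authors' code; Scan Context, for instance, scores candidates with a column-shift-invariant distance rather than a single cosine similarity:

```python
# Hedged sketch of the candidate search implied by Algorithms 1-3:
# a keyframe within radius d of the current pose, older than T seconds,
# is accepted as a loop closure only if its descriptor similarity
# exceeds S. The names (Keyframe, similarity) are illustrative.
import numpy as np
from dataclasses import dataclass

@dataclass
class Keyframe:
    index: int
    position: np.ndarray    # (3,) world-frame position
    stamp: float            # acquisition time (s)
    descriptor: np.ndarray  # e.g., a flattened point cloud descriptor

def similarity(a, b):
    """Cosine similarity between two descriptors (illustrative metric)."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def detect_loop(current, keyframes, d, T, S):
    """Return the index of the best loop-closure keyframe, or None."""
    best, best_sim = None, S
    for kf in keyframes:
        if current.stamp - kf.stamp < T:
            continue                      # too recent: skip near neighbors
        if np.linalg.norm(current.position - kf.position) > d:
            continue                      # outside the search radius
        sim = similarity(current.descriptor, kf.descriptor)
        if sim >= best_sim:               # validity check (cf. Algorithm 2)
            best, best_sim = kf.index, sim
    return best
```

In the full system, an accepted candidate contributes a loop closure factor to the incremental smoothing factor graph (Section 4.3.2), alongside the odometry factors of Section 4.3.1.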
5. Experiments and Results
5.1. Benchmark Dataset
5.1.1. UWB Anchor Configuration
5.1.2. Accuracy Evaluation
5.1.3. Processing Time Evaluation
5.2. Real-World Test
5.2.1. Experimental Environment
5.2.2. Experimental Analysis
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Cadena, C.; Carlone, L.; Carrillo, H.; Latif, Y.; Scaramuzza, D.; Neira, J.; Reid, I.; Leonard, J.J. Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age. IEEE Trans. Robot. 2016, 32, 1309–1332.
2. Shan, T.; Englot, B. LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 4758–4765.
3. Zhang, J.; Singh, S. Visual-lidar odometry and mapping: Low-drift, robust, and fast. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 2174–2181.
4. Xu, W.; Cai, Y.; He, D.; Lin, J.; Zhang, F. FAST-LIO2: Fast Direct LiDAR-Inertial Odometry. IEEE Trans. Robot. 2022, 38, 2053–2073.
5. Shan, T.; Englot, B.; Meyers, D.; Wang, W.; Ratti, C.; Rus, D. LIO-SAM: Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24–30 October 2020; pp. 5135–5142.
6. Liu, S.; Watterson, M.; Mohta, K.; Sun, K.; Bhattacharya, S.; Taylor, C.J.; Kumar, V. Planning Dynamically Feasible Trajectories for Quadrotors Using Safe Flight Corridors in 3-D Complex Environments. IEEE Robot. Autom. Lett. 2017, 2, 1688–1695.
7. Zhang, J.; Singh, S. LOAM: Lidar odometry and mapping in real-time. In Proceedings of the Robotics: Science and Systems, Berkeley, CA, USA, 12–16 July 2014; Volume 2, pp. 1–9.
8. Martínez, C.; Campoy, P.; Mondragón, I.; Olivares-Méndez, M.A. Trinocular ground system to control UAVs. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 10–15 October 2009; pp. 3361–3367.
9. Mueller, M.W.; Hamer, M.; D’Andrea, R. Fusing ultra-wideband range measurements with accelerometers and rate gyroscopes for quadrocopter state estimation. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 1730–1736.
10. Shi, Q.; Zhao, S.; Cui, X.; Lu, M.; Jia, M. Anchor self-localization algorithm based on UWB ranging and inertial measurements. Tsinghua Sci. Technol. 2019, 24, 728–737.
11. Zhao, S.; Zhang, X.P.; Cui, X.; Lu, M. A new TOA localization and synchronization system with virtually synchronized periodic asymmetric ranging network. IEEE Internet Things J. 2021, 8, 9030–9044.
12. Zhen, W.; Scherer, S. Estimating the localizability in tunnel-like environments using LIDAR and UWB. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 4903–4908.
13. Li, K.; Wang, C.; Huang, S.; Liang, G.; Wu, X.; Liao, Y. Self-positioning for UAV indoor navigation based on 3D laser scanner, UWB and INS. In Proceedings of the 2016 IEEE International Conference on Information and Automation (ICIA), Ningbo, China, 1–3 August 2016; pp. 498–503.
14. Zhou, H.; Yao, Z.; Zhang, Z.; Liu, P.; Lu, M. An Online Multi-Robot SLAM System Based on Lidar/UWB Fusion. IEEE Sens. J. 2022, 22, 2530–2542.
15. Ye, H.; Chen, Y.; Liu, M. Tightly coupled 3D LiDAR inertial odometry and mapping. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 3144–3150.
16. Nguyen, T.-M.; Cao, M.; Yuan, S.; Lyu, Y.; Nguyen, T.H.; Xie, L. LIRO: Tightly Coupled Lidar-Inertia-Ranging Odometry. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 14484–14490.
17. Steder, B.; Grisetti, G.; Burgard, W. Robust place recognition for 3D range data based on point features. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–8 May 2010; pp. 1400–1405.
18. Rusu, R.B.; Bradski, G.; Thibaux, R.; Hsu, J. Fast 3D recognition and pose using the Viewpoint Feature Histogram. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; pp. 2155–2162.
19. Bosse, M.; Zlot, R. Place recognition using keypoint voting in large 3D lidar datasets. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 2677–2684.
20. He, L.; Wang, X.; Zhang, H. M2DP: A novel 3D point cloud descriptor and its application in loop closure detection. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Republic of Korea, 9–14 October 2016; pp. 231–237.
21. Kim, G.; Kim, A. Scan Context: Egocentric Spatial Descriptor for Place Recognition Within 3D Point Cloud Map. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 4802–4809.
22. Yin, J.; Li, A.; Li, T.; Yu, W.; Zou, D. M2DGR: A Multi-Sensor and Multi-Scenario SLAM Dataset for Ground Robots. IEEE Robot. Autom. Lett. 2022, 7, 2266–2273.
23. Nguyen, T.-M.; Yuan, S.; Cao, M.; Lyu, Y.; Xie, L. NTU VIRAL: A visual-inertial-ranging-lidar dataset, from an aerial vehicle viewpoint. Int. J. Robot. Res. 2022, 41, 270–280.
24. Malajner, M.; Planinšič, P.; Gleich, D. UWB ranging accuracy. In Proceedings of the 2015 International Conference on Systems, Signals and Image Processing (IWSSIP), London, UK, 10–12 September 2015; pp. 61–64.
25. Henry, J. Ranging and Positioning with UWB. In UWB Technology—New Insights and Developments; IntechOpen: London, UK, 2023.
26. Nguyen, T.-M.; Cao, M.; Yuan, S.; Lyu, Y.; Nguyen, T.H.; Xie, L. VIRAL-Fusion: A Visual-Inertial-Ranging-Lidar Sensor Fusion Approach. IEEE Trans. Robot. 2022, 38, 958–977.
| Sequence | A-LOAM | LIO-SAM | FAST-LIO2 | RLI-SLAM (3 anc) | RLI-SLAM (2 anc) | RLI-SLAM (1 anc) | RLI-SLAM (rand) | RLI-SLAM (0 anc) | RLI-SLAM (3 anc w/o LCD) | RLI-SLAM (2 anc w/o LCD) | RLI-SLAM (1 anc w/o LCD) | RLI-SLAM (rand w/o LCD) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| hall_01 | 0.204 | 0.205 | 0.219 | 0.107 | 0.162 | 0.172 | 0.154 | 0.172 | 0.092 | 0.161 | 0.170 | 0.153 |
| hall_02 | 0.271 | 0.369 | 0.505 | 0.132 | 0.354 | 0.378 | 0.398 | 0.421 | 0.124 | 0.347 | 0.352 | 0.312 |
| door_01 | 0.266 | 0.232 | 0.399 | 0.189 | 0.273 | 0.294 | 0.258 | 0.312 | 0.257 | 0.297 | 0.303 | 0.293 |
| door_02 | 0.220 | 0.177 | 0.311 | 0.138 | 0.273 | 0.282 | 0.192 | 0.291 | 0.175 | 0.320 | 0.391 | 0.298 |
| gate_01 | 0.566 | 0.184 | 0.164 | 0.116 | 0.135 | 0.145 | 0.258 | 0.312 | 0.257 | 0.297 | 0.303 | 0.293 |
| gate_02 | 0.420 | 0.494 | 0.276 | 0.274 | 0.279 | 0.281 | 0.292 | 0.291 | 0.285 | 0.298 | 0.276 | 0.278 |
| gate_03 | 0.170 | 0.101 | 0.201 | 0.098 | 0.113 | 0.128 | 0.139 | 0.121 | 0.115 | 0.120 | 0.191 | 0.198 |
| street_01 | 6.355 | 35.790 | 281.430 | 2.805 | 3.403 | 3.223 | 11.721 | 12.391 | 4.750 | 7.379 | 10.127 | 8.912 |
| street_02 | 2.625 | 3.045 | 2.240 | 1.549 | 2.855 | 3.291 | 4.468 | 4.821 | 2.000 | 3.198 | 3.215 | 2.986 |
| street_04 | 3.153 | 0.822 | 6.087 | 0.185 | 0.303 | 0.270 | 0.354 | 0.791 | 0.261 | 0.311 | 0.391 | 0.397 |
| street_08 | 3.185 | 0.596 | 0.501 | 0.178 | 0.349 | 0.388 | 0.302 | 0.323 | 0.287 | 0.441 | 0.539 | 0.571 |
| Sequence | A-LOAM | LIO-SAM | FAST-LIO2 | VIRAL-Fusion | RLI-SLAM | RLI-SLAM (w/o LCD) |
|---|---|---|---|---|---|---|
| eee_01 | 0.212 | 0.075 | 0.131 | 0.060 | 0.054 | 0.063 |
| eee_02 | 0.199 | 0.069 | 0.124 | 0.058 | 0.047 | 0.054 |
| eee_03 | 0.148 | 0.101 | 0.163 | 0.037 | 0.069 | 0.082 |
| nya_01 | 0.077 | 0.076 | 0.122 | 0.051 | 0.046 | 0.055 |
| nya_02 | 0.091 | 0.090 | 0.142 | 0.043 | 0.058 | 0.082 |
| nya_03 | 0.080 | 0.137 | 0.144 | 0.052 | 0.046 | 0.052 |
| sbs_01 | 0.203 | 0.089 | 0.142 | 0.048 | 0.047 | 0.058 |
| sbs_02 | 0.091 | 0.083 | 0.140 | 0.062 | 0.049 | 0.056 |
| sbs_03 | 0.363 | 0.054 | 0.133 | 0.054 | 0.048 | 0.056 |
| Module | Time (ms) |
|---|---|
| Inertial sensor preprocessing | 2.62 |
| UWB optimization | 0.21 |
| Point cloud feature processing | 6.93 |
| State optimization estimation | 28.72 |
| Building point cloud maps | 1.38 |
| Loop detection | 1.71 |
| Total | 41.57 |