An Enhanced Multi-Sensor Simultaneous Localization and Mapping (SLAM) Framework with Coarse-to-Fine Loop Closure Detection Based on a Tightly Coupled Error State Iterative Kalman Filter
Abstract
1. Introduction
- (1) We present a real-time simultaneous localization, mapping, and coloring framework. Its three components, the LIO module, the VIO module, and the loop closure detection module, are integrated through an ESIKF. The system fuses LIDAR, camera, and IMU data for state estimation while reconstructing a dense, RGB-colored 3D point cloud of the environment in real time (a minimal sketch of the fused update step follows this list).
- (2) We propose a loop closure detection module built on Quatro and Nano-GICP, which estimates the relative state between LIDAR frames via a radius-based candidate search followed by coarse-to-fine point cloud alignment (a sketch of this matching flow also follows the list). A more precise estimate of the current state is then obtained through factor graph optimization with the iSAM2 optimizer. The factor graph contains only the LIDAR odometry factor and the loop closure factor, which shortens the optimization time by excluding the remaining states.
- (3) The developed system is validated on the public NTU VIRAL dataset [35] and on data from our customized device. The results show that our system outperforms other comparable systems and that this approach effectively corrects the degradation of the LIO scheme in structured scenarios.
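To make the fusion in contribution (1) concrete, below is a minimal, illustrative Python sketch of one iterated Kalman update in Gauss-Newton form (cf. Bell and Cathey [30]) consuming a stacked residual from the LIO and VIO modules. The flat state vector, the `residual_fn` callback, and the plain vector arithmetic are simplifying assumptions, not the paper's actual implementation, which operates on the manifold via the boxplus/boxminus operators of Section 2.1.2.

```python
import numpy as np

def esikf_update(x_prior, P_prior, residual_fn, max_iter=5, tol=1e-6):
    """One iterated Kalman update in Gauss-Newton form (cf. [30]).

    x_prior     -- nominal state propagated by the IMU (flat vector here;
                   a real ESIKF retracts on the manifold with boxplus
                   instead of plain vector addition)
    P_prior     -- covariance from the IMU forward propagation
    residual_fn -- hypothetical callback returning (r, H, R): the stacked
                   LIDAR point-to-plane and visual photometric residual,
                   its Jacobian at the current iterate, and the noise
                   covariance
    """
    x = x_prior.copy()
    for _ in range(max_iter):
        r, H, R = residual_fn(x)
        K = P_prior @ H.T @ np.linalg.inv(H @ P_prior @ H.T + R)
        # Re-linearize around the latest iterate, not the prior
        x_new = x_prior + K @ (r - H @ (x_prior - x))
        converged = np.linalg.norm(x_new - x) < tol
        x = x_new
        if converged:
            break
    P = (np.eye(len(x)) - K @ H) @ P_prior   # posterior covariance
    return x, P

# Toy usage: with a linear residual r = z - A x the loop converges
# immediately and reduces to a single Kalman update.
A = np.eye(3)
z = np.array([1.0, 2.0, 3.0])
x, P = esikf_update(np.zeros(3), np.eye(3),
                    lambda x: (z - A @ x, A, 0.01 * np.eye(3)))
```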
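Likewise, the coarse-to-fine loop closure flow of contribution (2) can be sketched as follows. Here `coarse_align` stands in for a global registration step such as Quatro [36], and `fine_align` for a GICP-style local refinement (the role Nano-GICP plays in the paper); both callbacks, the search radius, the frame gap, and the fitness threshold are illustrative placeholders, not the tuned QN-C2F-SAM settings.

```python
import numpy as np

def find_loop_candidates(keyframe_poses, cur_idx, radius=10.0, min_gap=50):
    """Radius-based search: indices of past keyframes whose positions lie
    within `radius` metres of the current keyframe, skipping the most
    recent `min_gap` frames to avoid trivial matches."""
    cur_pos = keyframe_poses[cur_idx][:3, 3]
    return [i for i, T in enumerate(keyframe_poses[:max(0, cur_idx - min_gap)])
            if np.linalg.norm(T[:3, 3] - cur_pos) < radius]

def try_close_loop(cur_cloud, keyframe_clouds, keyframe_poses, cur_idx,
                   coarse_align, fine_align, fitness_thresh=0.3):
    """Coarse-to-fine matching: `coarse_align` yields a global initial
    guess, and `fine_align` refines it and reports a fitness score."""
    for cand in find_loop_candidates(keyframe_poses, cur_idx):
        T_init = coarse_align(cur_cloud, keyframe_clouds[cand])
        T, fitness = fine_align(cur_cloud, keyframe_clouds[cand], T_init)
        if fitness < fitness_thresh:
            return cand, T   # loop accepted
    return None, None
```

On acceptance, the recovered relative pose would be inserted as a loop closure factor alongside the LIDAR odometry factors in the iSAM2 factor graph (Section 3.3), and the optimized pose fed back to the filter.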
2. System Overview
2.1. Filter-Based Odometry
2.1.1. System Definitions
2.1.2. The Boxplus “⊞” and Boxminus “⊟” Operators
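As background for this subsection, the boxplus and boxminus operators are conventionally defined on SO(3) as follows in the ESIKF literature (e.g., [21,31]); the definition below is that standard convention, not necessarily the exact one used here:

```latex
R \boxplus \delta\boldsymbol{\theta} \triangleq R\,\mathrm{Exp}(\delta\boldsymbol{\theta}),
\qquad
R_1 \boxminus R_2 \triangleq \mathrm{Log}\!\left(R_2^{-1} R_1\right),
```

where $R, R_1, R_2 \in SO(3)$, $\delta\boldsymbol{\theta} \in \mathbb{R}^3$, and $\mathrm{Exp}/\mathrm{Log}$ denote the exponential and logarithm maps; on the Euclidean components of the state the operators reduce to ordinary addition and subtraction.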
2.1.3. Discrete State Transition Model
2.2. IMU Forward Propagation
2.3. Map Management
2.3.1. LIDAR Point Cloud Map
2.3.2. Point-Patch Cloud Map
2.3.3. Colored Point Cloud Map
2.4. Frame-to-Map Measurement Error Model
2.4.1. LIDAR Point-to-Plane Residual Error Model
2.4.2. Visual Photometric Error
2.4.3. Loop Closure Detection Error Model
2.5. ESIKF Update
3. QN-C2F-SAM Loop Method
3.1. Coarse Matching Method
3.2. Fine Matching Method
3.3. Factor Graph Optimization
4. Experiments and Results
4.1. Benchmark Dataset
4.1.1. Algorithmic Analysis Tool
4.1.2. Numerical Comparison
4.1.3. Visual Comparison
4.2. Private Dataset
4.2.1. Handheld Device Dataset
4.2.2. Mapping with Normal Point Cloud
4.2.3. Mapping with RGB Point Cloud
4.3. Time Analysis
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition
---|---
SLAM | Simultaneous Localization and Mapping
LIDAR | Light Detection and Ranging
IMU | Inertial Measurement Unit
ICP | Iterative Closest Point
LOAM | LIDAR Odometry and Mapping
VIO | Visual-Inertial Odometry
LIO | LIDAR-Inertial Odometry
LVIO | LIDAR-Visual-Inertial Odometry
ESIKF | Error-State Iterative Kalman Filter
UAV | Unmanned Aerial Vehicle
APE | Absolute Pose Error
RPE | Relative Pose Error
Appendix A
References
1. Cadena, C.; Carlone, L.; Carrillo, H.; Latif, Y.; Scaramuzza, D.; Neira, J.; Reid, I.; Leonard, J.J. Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age. IEEE Trans. Robot. 2016, 32, 1309–1332.
2. Zhang, J.; Singh, S. LOAM: LIDAR Odometry and Mapping in Real-Time. In Proceedings of Robotics: Science and Systems, Berkeley, CA, USA, 12–16 July 2014; Volume 2, pp. 1–9.
3. Low, K.-L. Linear Least-Squares Optimization for Point-to-Plane ICP Surface Registration; University of North Carolina: Chapel Hill, NC, USA, 2004; pp. 2–4.
4. Shan, T.; Englot, B. LeGO-LOAM: Lightweight and Ground-Optimized LIDAR Odometry and Mapping on Variable Terrain. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 4758–4765.
5. Chen, K.; Lopez, B.T.; Agha-Mohammadi, A.A.; Mehta, A. Direct LIDAR Odometry: Fast Localization with Dense Point Clouds. IEEE Robot. Autom. Lett. 2022, 7, 2000–2007.
6. Qin, T.; Li, P.; Shen, S. VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator. IEEE Trans. Robot. 2018, 34, 1004–1020.
7. Mur-Artal, R.; Tardos, J.D. ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras. IEEE Trans. Robot. 2017, 33, 1255–1262.
8. Campos, C.; Elvira, R.; Rodriguez, J.J.G.; Montiel, J.M.M.; Tardos, J.D. ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual-Inertial, and Multimap SLAM. IEEE Trans. Robot. 2021, 37, 1874–1890.
9. Zhang, J.; Singh, S. Visual-LIDAR Odometry and Mapping: Low-Drift, Robust, and Fast. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015.
10. Zhang, J.; Singh, S. Laser–Visual–Inertial Odometry and Mapping with High Robustness and Low Drift. J. Field Robot. 2018, 35, 1242–1264.
11. Wang, Z.; Zhang, J.; Chen, S.; Yuan, C.; Zhang, J.; Zhang, J. Robust High Accuracy Visual-Inertial-Laser SLAM System. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 6636–6641.
12. Lowe, T.; Kim, S.; Cox, M. Complementary Perception for Handheld SLAM. IEEE Robot. Autom. Lett. 2018, 3, 1104–1111.
13. Forster, C.; Carlone, L.; Dellaert, F.; Scaramuzza, D. IMU Preintegration on Manifold for Efficient Visual-Inertial Maximum-a-Posteriori Estimation. In Proceedings of Robotics: Science and Systems, Rome, Italy, 13–17 July 2015; Volume 11.
14. Forster, C.; Carlone, L.; Dellaert, F.; Scaramuzza, D. On-Manifold Preintegration for Real-Time Visual-Inertial Odometry. IEEE Trans. Robot. 2017, 33, 1–21.
15. Geneva, P.; Eckenhoff, K.; Yang, Y.; Huang, G. LIPS: LIDAR-Inertial 3D Plane SLAM. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 123–130.
16. Le Gentil, C.; Vidal-Calleja, T.; Huang, S. IN2LAMA: INertial LIDAR Localisation And MApping. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 6388–6394.
17. Ye, H.; Chen, Y.; Liu, M. Tightly Coupled 3D LIDAR Inertial Odometry and Mapping. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019.
18. Shan, T.; Englot, B.; Meyers, D.; Wang, W.; Ratti, C.; Rus, D. LIO-SAM: Tightly-Coupled LIDAR Inertial Odometry via Smoothing and Mapping. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24–30 October 2020.
19. Kaess, M.; Johannsson, H.; Roberts, R.; Ila, V.; Leonard, J.J.; Dellaert, F. iSAM2: Incremental Smoothing and Mapping Using the Bayes Tree. Int. J. Robot. Res. 2012, 31, 216–235.
20. Qin, C.; Ye, H.; Pranata, C.E.; Han, J.; Zhang, S.; Liu, M. LINS: A LIDAR-Inertial State Estimator for Robust and Efficient Navigation. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020.
21. Xu, W.; Zhang, F. FAST-LIO: A Fast, Robust LIDAR-Inertial Odometry Package by Tightly-Coupled Iterated Kalman Filter. IEEE Robot. Autom. Lett. 2021, 6, 3317–3324.
22. Xu, W.; Cai, Y.; He, D.; Lin, J.; Zhang, F. FAST-LIO2: Fast Direct LIDAR-Inertial Odometry. IEEE Trans. Robot. 2022, 38, 2053–2073.
23. Wang, T.; Su, Y.; Shao, S.; Yao, C.; Wang, Z. GR-Fusion: Multi-Sensor Fusion SLAM for Ground Robots with High Robustness and Low Drift. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 5440–5447.
24. Jia, Y.; Luo, H.; Zhao, F.; Jiang, G.; Li, Y.; Yan, J.; Jiang, Z.; Wang, Z. Lvio-Fusion: A Self-Adaptive Multi-Sensor Fusion SLAM Framework Using Actor-Critic Method. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 286–293.
25. Zheng, T.X.; Huang, S.; Li, Y.F.; Feng, M.C. Key Techniques for Vision Based 3D Reconstruction: A Review. Acta Autom. Sin. 2020, 46, 631–652.
26. Theodorou, C.; Velisavljevic, V.; Dyo, V. Visual SLAM for Dynamic Environments Based on Object Detection and Optical Flow for Dynamic Object Removal. Sensors 2022, 22, 7553.
27. Shan, T.; Englot, B.; Ratti, C.; Rus, D. LVI-SAM: Tightly-Coupled LIDAR-Visual-Inertial Odometry via Smoothing and Mapping. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021.
28. Yang, Y.; Geneva, P.; Zuo, X.; Eckenhoff, K.; Liu, Y.; Huang, G. Tightly-Coupled Aided Inertial Navigation with Point and Plane Features. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 6094–6100.
29. Zheng, C.; Zhu, Q.; Xu, W.; Liu, X.; Guo, Q.; Zhang, F. FAST-LIVO: Fast and Tightly-Coupled Sparse-Direct LIDAR-Inertial-Visual Odometry. In Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022; pp. 4003–4009.
30. Bell, B.M.; Cathey, F.W. The Iterated Kalman Filter Update as a Gauss-Newton Method. IEEE Trans. Autom. Control 1993, 38, 294–297.
31. He, D.; Xu, W.; Zhang, F. Kalman Filters on Differentiable Manifolds. arXiv 2021, arXiv:2102.03804.
32. Zuo, X.; Geneva, P.; Lee, W.; Liu, Y.; Huang, G. LIC-Fusion: LIDAR-Inertial-Camera Odometry. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019.
33. Zuo, X.; Yang, Y.; Geneva, P.; Lv, J.; Liu, Y.; Huang, G.; Pollefeys, M. LIC-Fusion 2.0: LIDAR-Inertial-Camera Odometry with Sliding-Window Plane-Feature Tracking. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021.
34. Lin, J.; Zheng, C.; Xu, W.; Zhang, F. R2LIVE: A Robust, Real-Time, LIDAR-Inertial-Visual Tightly-Coupled State Estimator and Mapping. IEEE Robot. Autom. Lett. 2021, 6, 7469–7476.
35. Nguyen, T.-M.; Yuan, S.; Cao, M.; Lyu, Y.; Nguyen, T.H.; Xie, L. NTU VIRAL: A Visual-Inertial-Ranging-LIDAR Dataset, from an Aerial Vehicle Viewpoint. Int. J. Robot. Res. 2021, 41, 270–280.
36. Lim, H.; Yeon, S.; Ryu, S.; Lee, Y.; Kim, Y.; Yun, J.; Jung, E.; Lee, D.; Myung, H. A Single Correspondence Is Enough: Robust Global Registration to Avoid Degeneracy in Urban Environments. In Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022; pp. 8010–8017.
37. Magnusson, M.; Lilienthal, A.; Duckett, T. Scan Registration for Autonomous Mining Vehicles Using 3D-NDT. J. Field Robot. 2007, 24, 803–827.
38. Rusinkiewicz, S.; Levoy, M. Efficient Variants of the ICP Algorithm. In Proceedings of the Third International Conference on 3-D Digital Imaging and Modeling, Quebec City, QC, Canada, 28 May–1 June 2001.
39. Koide, K.; Yokozuka, M.; Oishi, S.; Banno, A. Voxelized GICP for Fast and Accurate 3D Point Cloud Registration. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 11054–11059.
40. Forster, C.; Zhang, Z.; Gassner, M.; Werlberger, M.; Scaramuzza, D. SVO: Semidirect Visual Odometry for Monocular and Multicamera Systems. IEEE Trans. Robot. 2017, 33, 249–265.
41. Shin, Y.S.; Park, Y.S.; Kim, A. DVL-SLAM: Sparse Depth Enhanced Direct Visual-LIDAR SLAM. Auton. Robot. 2020, 44, 115–130.
42. Umeyama, S. Least-Squares Estimation of Transformation Parameters between Two Point Patterns. IEEE Trans. Pattern Anal. Mach. Intell. 1991, 13, 376–380.
Name | Distance (m) | Duration (s) | Remark
---|---|---|---
eee_01 | 237.01 | 398.7 | Collected at the School of EEE’s central carpark
eee_02 | 171.08 | 321.1 | Collected at the School of EEE’s central carpark
eee_03 | 127.83 | 181.4 | Collected at the School of EEE’s central carpark
sbs_01 | 202.87 | 354.2 | Collected at the School of Bio. Science’s front square
sbs_02 | 183.57 | 373.3 | Collected at the School of Bio. Science’s front square
sbs_03 | 198.54 | 389.3 | Collected at the School of Bio. Science’s front square
nya_01 | 160.24 | 396.3 | Collected inside the Nanyang Auditorium
nya_02 | 249.10 | 428.7 | Collected inside the Nanyang Auditorium
nya_03 | 315.47 | 411.2 | Collected inside the Nanyang Auditorium
Method | eee_01 | eee_02 | eee_03 | sbs_01 | sbs_02 | sbs_03 | nya_01 | nya_02 | nya_03
---|---|---|---|---|---|---|---|---|---
Ours | 0.22 | 0.20 | 0.25 | 0.25 | 0.24 | 0.22 | 0.23 | 0.23 | 0.25
FAST-LIVO | 0.24 | 0.27 | 0.27 | 0.24 | 0.23 | 0.30 | 0.24 | 0.25 | 0.28
R2LIVE | 0.62 | 0.44 | 0.97 | 0.67 | 0.31 | 0.48 | 0.31 | 0.63 | 0.31
SVO2.0 | Fail | Fail | 5.25 | 8.88 | Fail | Fail | 2.49 | 3.56 | 4.49
DVL-SLAM | 2.96 | 2.56 | 5.22 | 2.04 | 2.58 | 2.55 | 3.58 | 2.34 | 3.33
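For context, APE values like those in the table above are conventionally obtained by first rigidly aligning the estimated trajectory to the ground truth with the Umeyama method [42] and then taking the RMSE of the translational residuals. Below is a minimal numpy sketch of that standard procedure, assuming time-synchronized Nx3 position arrays; the exact evaluation settings used in Section 4.1.1 may differ.

```python
import numpy as np

def umeyama_align(est, gt):
    """Rigid (rotation + translation) least-squares alignment of two
    Nx3 point sets [42]; scale is fixed to 1, as is usual for APE."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    H = (est - mu_e).T @ (gt - mu_g)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T     # maps est into the gt frame
    t = mu_g - R @ mu_e
    return R, t

def ape_rmse(est, gt):
    """Root-mean-square translational APE in metres after alignment."""
    R, t = umeyama_align(est, gt)
    err = gt - (est @ R.T + t)
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))
```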
Platform | LIO Module | VIO Module | Loop Closure: Quatro | Loop Closure: GICP | Loop Closure: Graph Optimization (Average per Frame) | Loop Closure: Total Time
---|---|---|---|---|---|---
Intel i7-10875 | 56.46 | 18.02 | 1.08 | 13.48 | 25.24 | 39.80