High-Precision SLAM Based on the Tight Coupling of Dual Lidar Inertial Odometry for Multi-Scene Applications
Abstract
1. Introduction
- A general high-precision SLAM framework is provided by fusing different sensors. It adapts to multi-scene applications, such as corridors with few features, stairs with height changes, and complex outdoor environments.
- The horizontal and vertical lidars are fused through external parameter calibration and an adaptive time synchronization algorithm to overcome the narrow vertical field of view (FOV) of a single lidar.
- To improve the positioning accuracy of SLAM in environments with height information (e.g., stairs), the dual lidar odometry and the IMU are tightly coupled. The dual lidar odometry measurements and the IMU pre-integration are jointly optimized to obtain more accurate IMU state estimates, which are then used to remove motion distortion from the dual lidar scans and thereby improve localization accuracy.
- In addition, several practical experiments are carried out to verify the feasibility and effectiveness of the proposed method.
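The adaptive time synchronization described above must pair each horizontal-lidar scan with the vertical-lidar scan closest to it in time, since the two sensors trigger independently. The sketch below shows one common way to do this (nearest-timestamp matching with a rejection threshold); the function name and the 50 ms tolerance are illustrative assumptions, not the paper's exact algorithm.

```python
from bisect import bisect_left

def pair_scans(h_stamps, v_stamps, max_dt=0.05):
    """Pair each horizontal-lidar timestamp with the nearest
    vertical-lidar timestamp, discarding pairs farther apart than
    max_dt seconds. Both lists must be sorted ascending.
    Returns a list of (h_index, v_index) pairs."""
    pairs = []
    for i, t in enumerate(h_stamps):
        j = bisect_left(v_stamps, t)  # first v_stamp >= t
        # The nearest stamp is one of the two neighbours of the
        # insertion point; compare both and keep the closer one.
        best = None
        for k in (j - 1, j):
            if 0 <= k < len(v_stamps):
                if best is None or abs(v_stamps[k] - t) < abs(v_stamps[best] - t):
                    best = k
        if best is not None and abs(v_stamps[best] - t) <= max_dt:
            pairs.append((i, best))
    return pairs
```

With `h_stamps = [0.00, 0.10, 0.20]` and `v_stamps = [0.01, 0.12, 0.35]`, the first two horizontal scans find partners within 50 ms and the third is rejected, yielding `[(0, 0), (1, 1)]`.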
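The idea behind the tight coupling in the third contribution can be illustrated with a deliberately simplified example: a joint optimization weighs each measurement by its information (inverse variance), so the estimate is pulled toward the more certain sensor. This 1-D toy fuses a lidar-odometry displacement with an IMU pre-integrated displacement; the real system optimizes full 6-DOF poses, velocities, and IMU biases on a manifold, so everything here (function name, scalar state) is an illustrative assumption.

```python
def fuse_displacement(d_lidar, d_imu, sigma_lidar, sigma_imu):
    """Jointly estimate a 1-D displacement d by minimising
        (d - d_lidar)^2 / sigma_lidar^2 + (d - d_imu)^2 / sigma_imu^2.
    Setting the derivative to zero gives the information-weighted
    mean of the two measurements."""
    w_l = 1.0 / sigma_lidar ** 2  # information of the lidar term
    w_i = 1.0 / sigma_imu ** 2    # information of the IMU term
    return (w_l * d_lidar + w_i * d_imu) / (w_l + w_i)
```

For example, with a lidar measurement of 1.0 m (sigma 0.1 m) and an IMU measurement of 1.2 m (sigma 0.2 m), the fused estimate is 1.04 m, much closer to the lower-noise lidar term.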
2. Prerequisites
2.1. IMU State Prediction and Pre-Integration
2.2. Segmentation and Feature Extraction
3. Multi-Sensor Fusion
3.1. Hardware System Description
3.2. System Overview
3.3. Fusion of Horizontal Lidar and Vertical Lidar
3.3.1. External Parameter Calibration of Horizontal Lidar and Vertical Lidar
3.3.2. The Adaptive Time Synchronization Algorithm
3.4. Tight Coupling of Dual Lidar and IMU
3.4.1. External Parameter Calibration of Horizontal Lidar and IMU and Time Synchronization
3.4.2. Joint Optimization
4. Experiment
4.1. Data Acquisition Equipment
4.2. Indoor Experiment 1
4.3. Indoor Experiment 2
4.4. Outdoor Experiment
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
| Algorithm | LeGO-LOAM | LIO-SAM | HSDLIO |
|---|---|---|---|
|  | Fail | 0.079 | 0.077 |
|  | Fail | 0.092 | 0.090 |
|  | Fail | 0.121 | 0.001 |
| RTE | Fail | 0.171 | 0.118 |
| Algorithm | LeGO-LOAM | LIO-SAM | HSDLIO |
|---|---|---|---|
|  | Fail | 0.033 | 0.011 |
|  | Fail | 0.036 | 0.024 |
|  | Fail | 0.062 | 0.004 |
| RTE | Fail | 0.078 | 0.026 |
| Algorithm | LeGO-LOAM | LIO-SAM | HSDLIO |
|---|---|---|---|
| RTE | 0.082 | 0.036 | 0.031 |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Xiao, K.; Yu, W.; Liu, W.; Qu, F.; Ma, Z. High-Precision SLAM Based on the Tight Coupling of Dual Lidar Inertial Odometry for Multi-Scene Applications. Appl. Sci. 2022, 12, 939. https://doi.org/10.3390/app12030939