Indoor and Outdoor Backpack Mapping with Calibrated Pair of Velodyne LiDARs
Abstract
1. Introduction
- It is capable of mapping both small indoor and large open outdoor environments, with georeferencing and a precision in the order of centimeters. These abilities are evaluated on multiple datasets.
- It benefits from a synchronized and calibrated dual LiDAR scanner, which significantly increases the field of view. Both scanners are used for odometry estimation as well as for 3D model reconstruction, which enables scanning of small environments, narrow corridors, staircases, etc. (a minimal sketch of merging the two calibrated scanners is given after this list).
- It provides the ability to recognize objects in the map thanks to sufficient point density and our novel intensity normalization of measurements taken from an arbitrary range.
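To make the dual-scanner setup concrete, the following minimal sketch (not the authors' implementation) merges the measurements of two synchronized scanners into a common frame by applying a rigid-body calibration transform; the matrix `T_secondary_to_primary`, its values, and the random point data are illustrative placeholders only.

```python
import numpy as np

def to_homogeneous(points):
    """Append a column of ones so 3D points can be multiplied by a 4x4 transform."""
    return np.hstack([points, np.ones((points.shape[0], 1))])

def merge_dual_lidar(points_primary, points_secondary, T_secondary_to_primary):
    """Express the secondary scanner's points in the primary scanner's frame
    and concatenate both synchronized sweeps into a single point cloud."""
    transformed = (T_secondary_to_primary @ to_homogeneous(points_secondary).T).T[:, :3]
    return np.vstack([points_primary, transformed])

# Placeholder calibration: secondary LiDAR mounted 0.5 m above the primary one,
# pitched by 45 degrees about the x-axis (illustrative values, not the real calibration).
angle = np.deg2rad(45.0)
T_secondary_to_primary = np.array([
    [1.0, 0.0,            0.0,           0.0],
    [0.0, np.cos(angle), -np.sin(angle), 0.0],
    [0.0, np.sin(angle),  np.cos(angle), 0.5],
    [0.0, 0.0,            0.0,           1.0],
])

primary = np.random.rand(100, 3)    # stand-in for one synchronized sweep
secondary = np.random.rand(100, 3)  # stand-in for the other scanner's sweep
print(merge_dual_lidar(primary, secondary, T_secondary_to_primary).shape)  # (200, 3)
```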
2. Related Work
3. Design of the Laser Mapping Backpack
- It fulfils the requirement for model precision of up to 5 cm. Thanks to the robust loop closure, ambiguities (e.g., “double wall” effects) are avoided.
- The system is comfortable to use and as mobile as possible. The backpack weighs 9 kg (plus 1.4 kg for the optional dual-antenna extension) and is easy to carry through various environments including stairs, narrow corridors, rugged terrain, etc.
- The pair of synchronized and calibrated Velodyne LiDARs increases the field of view (FOV) and enables mapping of small rooms, narrow corridors, staircases, etc. (see Figure 6) without the need for special scanning guidelines.
- The data acquisition process is fast and includes verification of data completeness. There are no special guidelines for the scanning process (compared with the requirements of ZEB), and the operator is only required to visit all places to be captured at a normal pace. Moreover, the captured data are visualized online on a mobile device (smartphone, tablet) so the operator can check whether everything has been captured correctly.
- Since we use long-range Velodyne LiDARs (compared with simple 2D rangefinders such as Hokuyo or SICK) and optional GNSS support, we provide a universal and cost-effective solution for both indoor and outdoor use. In scenarios where GNSS is available, the final reconstruction is georeferenced: a 3D position in the global geographical frame is assigned to every 3D point in the model.
- The final 3D model is dense and colored by the laser intensity, which is further normalized (a hedged sketch of such range normalization is given after this list). This helps distinguish important objects, inventory, larger texts, signs, and some surface texture properties.
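As an illustration of range-dependent intensity normalization, the sketch below rescales every raw intensity to the value expected at a common reference range using a generic inverse power-law model (in the spirit of the radiometric correction literature in the references, e.g., Jutzi and Gross; Kaasalainen et al.); it is not necessarily the exact model of Section 3.10, and the reference range and exponent are assumed parameters.

```python
import numpy as np

def normalize_intensity(intensity, rng, reference_range=10.0, exponent=2.0):
    """Rescale raw LiDAR intensities to a common reference range.

    Assumes an inverse power-law falloff of the received intensity with range
    (exponent = 2 corresponds to the idealized 1/r^2 model); both parameters
    are illustrative and would need calibration for a real sensor."""
    rng = np.maximum(rng, 1e-3)  # guard against division by zero for degenerate returns
    return intensity * (rng / reference_range) ** exponent

# The same surface measured at 5 m and at 20 m yields very different raw
# intensities; after normalization the two values become comparable.
raw = np.array([200.0, 12.5])
ranges = np.array([5.0, 20.0])
print(normalize_intensity(raw, ranges))  # -> [50. 50.]
```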
3.1. Hardware Description
3.2. Dual LiDAR System
3.3. Calibration of the Sensors
3.4. Point Cloud Registration
3.5. Overlap Estimation
3.6. Rolling Shutter Corrections
3.7. Pose Graph Construction and Optimization
Algorithm 1 Progressive refinement of 6DoF poses for a sequence of frames by optimizing the pose graph.
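As a hedged illustration of the pose-graph idea behind Algorithm 1 (frames are nodes, registrations and loop closures contribute relative-pose edges, and all poses are refined jointly by nonlinear least squares), the sketch below optimizes a toy planar (SE(2)) graph with SciPy. The paper's optimization operates on full 6DoF poses; the edge measurements here are made-up values and the solver choice is ours.

```python
import numpy as np
from scipy.optimize import least_squares

def relative_pose(pose_i, pose_j):
    """Relative planar transform between two (x, y, theta) poses:
    translation expressed in frame i, plus the heading change."""
    dx, dy = pose_j[:2] - pose_i[:2]
    c, s = np.cos(pose_i[2]), np.sin(pose_i[2])
    return np.array([c * dx + s * dy, -s * dx + c * dy, pose_j[2] - pose_i[2]])

def residuals(flat_poses, edges):
    """Stack the gauge constraint and the per-edge measurement errors."""
    poses = flat_poses.reshape(-1, 3)
    res = [poses[0]]  # anchor frame 0 at the origin
    for i, j, measurement in edges:
        err = relative_pose(poses[i], poses[j]) - measurement
        err[2] = (err[2] + np.pi) % (2 * np.pi) - np.pi  # wrap the heading error
        res.append(err)
    return np.concatenate(res)

# Toy graph: four frames with noisy odometry edges and one loop closure (3 -> 0).
edges = [
    (0, 1, np.array([1.0, 0.0, np.pi / 2])),
    (1, 2, np.array([1.1, 0.0, np.pi / 2])),
    (2, 3, np.array([0.9, 0.0, np.pi / 2])),
    (3, 0, np.array([1.0, 0.0, np.pi / 2])),  # closing the loop constrains the drift
]
refined = least_squares(residuals, np.zeros(4 * 3), args=(edges,)).x.reshape(-1, 3)
print(np.round(refined, 2))
```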
3.8. Pose Graph Verification
3.9. Horizontal Alignment of the Indoor Map
3.10. Intensity Normalization
4. Experiments
- sufficient relative precision under 5 cm;
- global absolute error within the limits described above;
- data density and coloring by normalized intensities for visual inspection; and
- data consistency without ambiguity (no “double wall” effects).
4.1. Comparison of Point Cloud Registration Methods
4.2. Indoor Experiments
- in 4RECON-10, the registrations were performed only within a small neighborhood of the 10 nearest frames (1 s time window), which reflects the impact of the accumulated error;
- for 4RECON-overlap, the registrations were performed for all overlapping frames as described in Section 3.7, reducing the accumulated error by loop closures at every possible location (the difference in pair selection is illustrated by the sketch after this list); and
- pose graph verification (see Section 3.8) was deployed in 4RECON-verification, yielding the best results with good precision and no ambiguities.
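The practical difference between 4RECON-10 and 4RECON-overlap is which frame pairs receive registration edges. The short sketch below contrasts a fixed time window with overlap-based pair selection; the overlap values are hypothetical, and the paper's actual overlap estimation is the one described in Section 3.5.

```python
from itertools import combinations

def edges_time_window(num_frames, window=10):
    """4RECON-10 style: register only frames close in time, so the error can accumulate."""
    return [(i, j) for i, j in combinations(range(num_frames), 2) if j - i <= window]

def edges_by_overlap(overlap, threshold=0.5):
    """4RECON-overlap style: register every pair whose estimated overlap exceeds a
    threshold, which yields loop-closure edges whenever a place is revisited."""
    return [(i, j) for i, j in combinations(range(len(overlap)), 2) if overlap[i][j] >= threshold]

# Toy 4-frame trajectory with a revisit: frames 0 and 3 observe the same place.
overlap = [
    [1.0, 0.8, 0.1, 0.7],
    [0.8, 1.0, 0.8, 0.1],
    [0.1, 0.8, 1.0, 0.8],
    [0.7, 0.1, 0.8, 1.0],
]
print(edges_time_window(4, window=1))  # [(0, 1), (1, 2), (2, 3)]
print(edges_by_overlap(overlap))       # adds the loop-closure pair (0, 3)
```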
4.3. Outdoor Experiments
4.4. Comparison of Single and Dual Velodyne Solution
5. Discussion
6. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- NavVis. Available online: https://www.navvis.com/ (accessed on 19 August 2019).
- Nava, Y. Visual-LiDAR SLAM with Loop Closure. Master’s Thesis, KTH Royal Institute of Technology, Stockholm, Sweden, 2018. [Google Scholar]
- GeoSLAM. Available online: https://geoslam.com/ (accessed on 19 August 2019).
- Sirmacek, B.; Shen, Y.; Lindenbergh, R.; Zlatanova, S.; Diakite, A. Comparison of Zeb1 and Leica C10 indoor laser scanning point clouds. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 3, 143. [Google Scholar] [CrossRef]
- Bosse, M.; Zlot, R.; Flick, P. Zebedee: Design of a Spring-Mounted 3-D Range Sensor with Application to Mobile Mapping. IEEE Trans. Robot. 2012, 28, 1104–1119. [Google Scholar] [CrossRef]
- Bosse, M.; Zlot, R. Continuous 3D scan-matching with a spinning 2D laser. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 4312–4319. [Google Scholar] [CrossRef]
- GeoSLAM Ltd. The ZEB-REVO Solution. 2018. Available online: https://geoslam.com/wp-content/uploads/2018/04/GeoSLAM-ZEB-REVO-Solution_v9.pdf?x97867 (accessed on 19 July 2019).
- Dewez, T.J.; Plat, E.; Degas, M.; Richard, T.; Pannet, P.; Thuon, Y.; Meire, B.; Watelet, J.M.; Cauvin, L.; Lucas, J.; et al. Handheld Mobile Laser Scanners Zeb-1 and Zeb-Revo to map an underground quarry and its above-ground surroundings. In Proceedings of the 2nd Virtual Geosciences Conference: VGC 2016, Bergen, Norway, 21–23 September 2016. [Google Scholar]
- GreenValley International. LiBackpack DG50, Mobile Handheld 3D Mapping System. 2019. Available online: https://greenvalleyintl.com/wp-content/uploads/2019/04/LiBackpack-DG50.pdf (accessed on 19 July 2019).
- GreenValley International. Available online: https://greenvalleyintl.com/ (accessed on 19 August 2019).
- Leica Geosystems AG. Leica Pegasus: Backpack, Mobile Reality Capture. 2017. Available online: https://www.gefos-leica.cz/data/original/skenery/mobilni-mapovani/backpack/leica_pegasusbackpack_ds.pdf (accessed on 19 July 2019).
- Masiero, A.; Fissore, F.; Guarnieri, A.; Piragnolo, M.; Vettore, A. Comparison of Low Cost Photogrammetric Survey with TLS and Leica Pegasus Backpack 3D Models. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 147. [Google Scholar] [CrossRef]
- Dubey, P. New bMS3D-360: The First Backpack Mobile Scanning System Including Panoramic Camera. 2018. Available online: https://www.geospatialworld.net/news/new-bms3d-360-first-backpack-mobile-scanning-system-panormic-camera/ (accessed on 30 August 2019).
- Viametris. Available online: https://www.viametris.com/ (accessed on 20 August 2019).
- 3D Laser Mapping. Robin Datasheet. 2017. Available online: https://www.3dlasermapping.com/wp-content/uploads/2017/09/ROBIN-Datasheet-front-and-reverse-WEB.pdf (accessed on 19 July 2019).
- 3D Laser Mapping. Available online: https://www.3dlasermapping.com/ (accessed on 19 August 2019).
- Rönnholm, P.; Liang, X.; Kukko, A.; Jaakkola, A.; Hyyppä, J. Quality analysis and correction of mobile backpack laser scanning data. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 3, 41. [Google Scholar] [CrossRef]
- Kukko, A.; Kaartinen, H.; Zanetti, M. Backpack personal laser scanning system for grain-scale topographic mapping. In Proceedings of the 46th Lunar and Planetary Science Conference, The Woodlands, TX, USA, 16–20 March 2015; Volume 2407. [Google Scholar]
- Zhang, J.; Singh, S. LOAM: Lidar Odometry and Mapping in Real-time. In Proceedings of the Robotics: Science and Systems Conference (RSS 2014), Berkeley, CA, USA, 12–16 July 2014. [Google Scholar] [CrossRef]
- Zhang, J.; Singh, S. Visual-lidar odometry and mapping: Low-drift, robust, and fast. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 2174–2181. [Google Scholar] [CrossRef]
- Zhang, J.; Kaess, M.; Singh, S. Real-time depth enhanced monocular odometry. In Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014; pp. 4973–4980. [Google Scholar] [CrossRef]
- Geiger, A.; Lenz, P.; Stiller, C.; Urtasun, R. Vision meets robotics: The KITTI dataset. Int. J. Robot. Res. 2013, 32, 1231–1237. [Google Scholar] [CrossRef] [Green Version]
- Velas, M.; Spanel, M.; Herout, A. Collar Line Segments for fast odometry estimation from Velodyne point clouds. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 4486–4495. [Google Scholar] [CrossRef]
- Besl, P.J.; McKay, N.D. A method for registration of 3-D shapes. IEEE Trans. Pattern Anal. Mach. Intell. 1992, 14, 239–256. [Google Scholar] [CrossRef]
- Maboudi, M.; Bánhidi, D.; Gerke, M. Evaluation of indoor mobile mapping systems. In Proceedings of the GFaI Workshop 3D-NordOst 2017 (20th Application-Oriented Workshop on Measuring, Modeling, Processing and Analysis of 3D-Data), Berlin, Germany, 7–8 December 2017. [Google Scholar]
- Kukko, A.; Kaartinen, H.; Hyyppä, J.; Chen, Y. Multiplatform Mobile Laser Scanning: Usability and Performance. Sensors 2012, 12, 11712–11733. [Google Scholar] [CrossRef] [Green Version]
- Lauterbach, H.A.; Borrmann, D.; Heß, R.; Eck, D.; Schilling, K.; Nüchter, A. Evaluation of a Backpack-Mounted 3D Mobile Scanning System. Remote Sens. 2015, 7, 13753–13781. [Google Scholar] [CrossRef] [Green Version]
- Hess, W.; Kohler, D.; Rapp, H.; Andor, D. Real-time loop closure in 2D LIDAR SLAM. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1271–1278. [Google Scholar] [CrossRef]
- Nüchter, A.; Bleier, M.; Schauer, J.; Janotta, P. Continuous-Time SLAM—Improving Google’s Cartographer 3D Mapping. In Latest Developments in Reality-Based 3D Surveying and Modelling; MDPI: Basel, Switzerland, 2018; pp. 53–73. [Google Scholar] [CrossRef]
- Newcombe, R.A.; Izadi, S.; Hilliges, O.; Molyneaux, D.; Kim, D.; Davison, A.J.; Kohi, P.; Shotton, J.; Hodges, S.; Fitzgibbon, A. KinectFusion: Real-time dense surface mapping and tracking. In Proceedings of the 2011 10th IEEE International Symposium on Mixed and Augmented Reality, Basel, Switzerland, 26–29 October 2011; pp. 127–136. [Google Scholar] [CrossRef]
- Deschaud, J. IMLS-SLAM: Scan-to-Model Matching Based on 3D Data. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 2480–2485. [Google Scholar] [CrossRef]
- Kolluri, R. Provably Good Moving Least Squares. ACM Trans. Algorithms 2008, 4, 18:1–18:25. [Google Scholar] [CrossRef]
- Droeschel, D.; Behnke, S. Efficient Continuous-Time SLAM for 3D Lidar-Based Online Mapping. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 1–9. [Google Scholar] [CrossRef]
- Droeschel, D.; Schwarz, M.; Behnke, S. Continuous Mapping and Localization for Autonomous Navigation in Rough Terrain Using a 3D Laser Scanner. Robot. Auton. Syst. 2017, 88, 104–115. [Google Scholar] [CrossRef]
- Mendes, E.; Koch, P.; Lacroix, S. ICP-based pose-graph SLAM. In Proceedings of the 2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Lausanne, Switzerland, 23–27 October 2016; pp. 195–200. [Google Scholar] [CrossRef]
- Park, C.; Moghadam, P.; Kim, S.; Elfes, A.; Fookes, C.; Sridharan, S. Elastic LiDAR Fusion: Dense Map-Centric Continuous-Time SLAM. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 1206–1213. [Google Scholar] [CrossRef]
- Kashani, A.G.; Olsen, M.J.; Parrish, C.E.; Wilson, N. A Review of LIDAR Radiometric Processing: From Ad Hoc Intensity Correction to Rigorous Radiometric Calibration. Sensors 2015, 15, 28099–28128. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Jutzi, B.; Gross, H. Normalization Of Lidar Intensity Data Based On Range And Surface Incidence Angle. ISPRS J. Photogramm. Remote Sens. 2009, 38, 213–218. [Google Scholar]
- Kaasalainen, S.; Jaakkola, A.; Kaasalainen, M.; Krooks, A.; Kukko, A. Analysis of Incidence Angle and Distance Effects on Terrestrial Laser Scanner Intensity: Search for Correction Methods. Remote Sens. 2011, 3, 2207–2221. [Google Scholar] [CrossRef] [Green Version]
- Velodyne LiDAR. Available online: https://velodynelidar.com/ (accessed on 19 August 2019).
- Nuchter, A.; Lingemann, K.; Hertzberg, J.; Surmann, H. 6D SLAM with approximate data association. In Proceedings of the 12th International Conference on Advanced Robotics (ICAR ’05), Seattle, WA, USA, 18–20 July 2005; pp. 242–249. [Google Scholar] [CrossRef]
- Velas, M.; Faulhammer, T.; Spanel, M.; Zillich, M.; Vincze, M. Improving multi-view object recognition by detecting changes in point clouds. In Proceedings of the 2016 IEEE Symposium Series on Computational Intelligence (SSCI), Athens, Greece, 6–9 December 2016; pp. 1–7. [Google Scholar] [CrossRef]
- Shoemake, K. Animating Rotation with Quaternion Curves. In Proceedings of the 12th Annual Conference on Computer Graphics and Interactive Techniques, San Francisco, CA, USA, 22–26 July 1985; ACM: New York, NY, USA, 1985; pp. 245–254. [Google Scholar] [CrossRef]
- Ila, V.; Polok, L.; Solony, M.; Svoboda, P. SLAM++: A highly efficient and temporally scalable incremental SLAM framework. Int. J. Robot. Res. 2017, 36, 210–230. [Google Scholar] [CrossRef]
- Markley, F.L.; Cheng, Y.; Crassidis, J.L.; Oshman, Y. Averaging Quaternions. J. Guid. Control. Dyn. 2007, 30, 1193–1197. [Google Scholar] [CrossRef]
- Segal, A.; Hähnel, D.; Thrun, S. Generalized-ICP. In Robotics: Science and Systems; Trinkle, J., Matsuoka, Y., Castellanos, J.A., Eds.; The MIT Press: Cambridge, MA, USA, 2009. [Google Scholar] [CrossRef]
Solution (Released in) | Sensor (Precision) | Range | System Precision | Price (€) | Open Method / Properties and Limitations | Intensities
---|---|---|---|---|---|---
ZEB-1 (2013) [3] | Hokuyo UTM-30LX (3 cm up to 10 m range) | 15–20 m (max 30 m under optimal conditions) | up to 3.8 cm indoors [4] | N/A | Proprietary, based on [5,6] | No
ZEB-REVO (2015) [3,7] | Hokuyo UTM-30LX-F (3 cm up to 10 m range) | 15–20 m (max 30 m under optimal conditions) [7] | up to 3.6 cm indoors [8] | 34,000 | Proprietary, based on [5,6] | No
LiBackpack (2019) [9,10] | 2× Velodyne VLP-16 (3 cm) | 100 m (Velodyne scanner limitation) | 5 cm | 60,000 | Proprietary | Yes
Pegasus (2015) [11] | 2× Velodyne VLP-16 (3 cm) | 50 m usable range | 5 cm with GNSS (5–50 cm without); 4.2 cm in underground bastion [12] | 150,000 | Proprietary | Yes
Viametris bMS3D [13,14] | 2× Velodyne VLP-16 (3 cm) | 100 m (Velodyne scanner limitation) | 5 cm under appropriate satellite reception conditions | N/A | Proprietary | Yes
Robin (2016) [15,16] | RIEGL VUX-1HA (3 mm) | 120/420 m in slow/high frequency mode (for sensor) | up to 3.6 cm at 30 m range (FOG IMU update) | 220,000 | Proprietary | Yes
Akhka (2015) [17,18] | FARO Focus3D 120S (1 mm) | 120 m (sensor range) | 8.7 cm in forest environments | N/A | Open [17] | Yes
Error e_s (18) for each method:

Sequence | Length | LOAM Online | LOAM Offline | CLS Single | CLS Multi-Frame
---|---|---|---|---|---
0 | 4540 | 0.052 | 0.022 | 0.022 | 0.018
1 | 1100 | 0.038 | 0.040 | 0.042 | 0.029
2 | 4660 | 0.055 | 0.046 | 0.024 | 0.022
3 | 800 | 0.029 | 0.019 | 0.018 | 0.015
4 | 270 | 0.015 | 0.015 | 0.017 | 0.017
5 | 2760 | 0.025 | 0.018 | 0.017 | 0.012
6 | 1100 | 0.033 | 0.016 | 0.009 | 0.008
7 | 1100 | 0.038 | 0.019 | 0.011 | 0.007
8 | 4070 | 0.035 | 0.024 | 0.020 | 0.015
9 | 1590 | 0.043 | 0.032 | 0.020 | 0.018
Weighted average | 2108 | 0.043 | 0.029 | 0.022 | 0.017
Dataset | Slice # | 4RECON-10 | 4RECON-Overlap | 4RECON-Verification | ZEB-1
---|---|---|---|---|---
Office | 1 | 2.50 | 1.71 | 1.49 | 1.44
 | 2 | 1.97 | 1.47 | 1.31 | 1.06
 | 3 | 1.70 | 1.75 | 1.55 | 1.22
 | 4 | 1.82 | 1.54 | 1.31 | 1.22
 | 5 | 1.93 | 1.63 | 1.53 | 1.44
 | 6 | 2.13 | 1.49 | 1.47 | 1.29
 | 7 | 2.09 | 1.68 | 1.37 | 0.97
 | 8 | 2.07 | 1.36 | 1.37 | 1.31
 | Average (cm) | 2.01 | 1.62 | 1.41 | 1.14
Staircase | 1 | 3.23 | 2.11 | 1.81 | -
 | 2 | 3.99 | 1.87 | 1.60 | -
 | 3 | 2.63 | 1.65 | 1.61 | -
 | 4 | 2.74 | 1.71 | 1.53 | -
 | 5 | 2.42 | 1.68 | 1.50 | -
 | 6 | 2.98 | 2.67 | 1.67 | -
 | 7 | 1.76 | 1.75 | 1.29 | -
 | 8 | 1.82 | 1.67 | 1.56 | -
 | Average (cm) | 2.74 | 1.82 | 1.57 | -
Ref. Point | dX | dY | Horizontal Error | dZ (Vertical) | Total Error |
---|---|---|---|---|---
1 | −5.9 | −1.2 | 6.0 | −15.2 | 16.3 |
2 | −5.6 | 0.5 | 5.6 | −4.7 | 7.3 |