A Survey of Low-Cost 3D Laser Scanning Technology
Abstract
1. Introduction
2. Overview of Principles
3. Classification of the Prototypes
3.1. A Rotating 2D LiDAR and a Pitching 2D LiDAR
3.2. A Push-Broom 2D LiDAR
3.3. Other Categories
3.3.1. An Irregularly Rotating 2D LiDAR
3.3.2. An Obliquely Rotating 2D LiDAR
3.3.3. An Irregularly Moving 2D LiDAR
3.4. Extra 1: A Moving 3D LiDAR
3.5. Extra 2: DIY Low-Cost 3D LiDAR and a Rotating Mirror/Prism
3.6. Discussion of the Classification of the Prototypes
4. Problems of a Moving 2D LiDAR
4.1. Problem I: Accuracy
4.1.1. Problem 1.1—For Categories 1–6: The Measurement Error of the 2D LiDAR
4.1.2. Problem 1.2—For Categories 1, 2, and 5: The Error Caused by the Motor
4.1.3. Problem 1.3—For Categories 1, 2, 4, and 5: The Error Caused by the Assembly Inaccuracy between the 2D LiDAR and the Rotating Unit
4.1.4. Problem 1.4—For Categories 1, 2, and 5: The Error Caused by the Synchronization Inaccuracy between the 2D LiDAR and the Rotating Unit
4.1.5. Problem 1.5—For Categories 1 to 6: The Error Caused by the Assembly Inaccuracy between a Moving 2D LiDAR and the Mobile Platform
4.1.6. Problem 1.6—For Categories 1 to 6: The Error Caused by the Estimation Inaccuracy of the Movement of the Mobile Platform
4.2. Problem II: Real-Time Performance
4.2.1. Problem 2.1—For Categories 1 to 6: The Negative Correlation between the Real-Time Performance and the Density of the 3D Point Cloud
4.2.2. Problem 2.2—For Categories 1, 2, 4, and 5: The Distortion of the 3D Point Cloud Caused by the Movement of the Mobile Platform
4.2.3. Problem 2.3—For Categories 1 to 6: The Distortion of the 3D Point Cloud Caused by the Moving Objects in the Environment
4.3. Other Problems
4.3.1. Problem 3.1—For Categories 1 to 6: The Density Distribution of the 3D Point Cloud Built by a Moving 2D LiDAR
4.3.2. Problem 3.2—For Categories 1 to 6: The Fusion of a Moving 2D LiDAR and 2D LiDAR SLAM
4.3.3. Problem 3.3—For Category 3: A Push-Broom 2D LiDAR, Which Is Used for the Detection of the Obstacles in Front of the Vehicle
4.3.4. Problem 3.4—For Categories 1 to 6: The Fusion of a Moving 2D LiDAR and a Camera
4.4. Discussion of the Problems of a Moving 2D LiDAR
5. Discussion
5.1. The Definition of Low Cost
5.2. Three-Dimensional LiDAR Is Still Indispensable
6. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Appendix A
Categories | Manufacturers | Product Model ID | Performance | Prices (USD) | Application | Nations |
---|---|---|---|---|---|---|
2D LiDAR | Sick | LMS111 | Single-line mechanical LiDAR. Its horizontal field of view is 270°, its measuring range is 0.5 m to 20 m, its scanning frequency is 25 Hz or 50 Hz, and its angular resolution is 0.25° or 0.5° (depending on the working mode). The typical value of its systematic error is ±30 mm, and the typical value of its statistical error is 12 mm [122]. | 3813 | Outdoor; mobile robot | Germany |
2D LiDAR | Sick | LMS511 | Single-line mechanical LiDAR. Its horizontal field of view is 190°, its measuring range is 1 m to 80 m, and its scanning frequency is 25, 35, 50, 75, or 100 Hz (depending on the working mode). Its angular resolution is 0.042°, 0.083°, 0.1667°, 0.25°, 0.333°, 0.5°, 0.667°, or 1° (depending on the working mode). Its systematic error is ±25 mm (1 m to 10 m), ±35 mm (10 m to 20 m), or ±50 mm (20 m to 30 m). Its statistical error is 6 mm (1 m to 10 m), 8 mm (10 m to 20 m), or 14 mm (20 m to 30 m) [123]. | 6405 | Outdoor; mobile robot | Germany |
2D LiDAR | Hokuyo | UST-10LX | Single-line mechanical LiDAR. Its horizontal field of view is 270°, its measuring range is 0.06 m to 10 m, its scanning frequency is 40 Hz, and its angular resolution is 0.25°. The typical value of its accuracy is ±40 mm, and the typical value of its repeated accuracy is 30 mm [124]. | 1525 | Mobile robot | Japan |
2D LiDAR | Hokuyo | UTM-30LX-EW | Single-line mechanical LiDAR. Its horizontal field of view is 270°, its measuring range is 0.1 m to 30 m, its scanning frequency is 40 Hz, and its angular resolution is 0.25°. Its accuracy is ±30 mm (0.1 m to 10 m) or ±50 mm (10 m to 30 m), and its repeated accuracy is 10 mm (0.1 m to 10 m) or 30 mm (10 m to 30 m) [125]. | 5338 | Mobile robot | Japan |
2D LiDAR | Slamtec | RPLIDAR A1 | Single-line mechanical LiDAR. Its horizontal field of view is 360°, its measuring range is 0.15 m to 12 m, and its scanning frequency is 1 Hz to 10 Hz. Its angular resolution is 1° or finer, and its ranging resolution is better than 0.5 mm or 1 percent of the measured range [6]. | 76 | Mobile robot | China |
2D LiDAR | Slamtec | RPLIDAR A2M6 | Single-line mechanical LiDAR. Its horizontal field of view is 360°, its measuring range is 0.2 m to 18 m, and its scanning frequency is 5 Hz to 15 Hz. Its angular resolution is 0.45° to 1.35°, and its ranging resolution is better than 0.5 mm or 1 percent of the measured range [126]. | 290 | Mobile robot | China |
2D LiDAR | Slamtec | RPLIDAR A3 | Single-line mechanical LiDAR. Its horizontal field of view is 360°, its measuring range is 0.2 m to 25 m, its scanning frequency is 5 Hz to 15 Hz, and its angular resolution is 0.225° or 0.36° [127]. | 625 | Mobile robot | China |
2D LiDAR | Slamtec | RPLIDAR S1 | Single-line mechanical LiDAR. Its horizontal field of view is 360°, its measuring range is 0.1 m to 40 m, its scanning frequency is 8 Hz to 15 Hz, and its angular resolution is 0.313° to 0.587°. Its measuring accuracy is ±50 mm, and its measuring resolution is 30 mm [128]. | 686 | Mobile robot | China |
2D LiDAR | Vanjee Technology | WLR-716 | Single-line mechanical LiDAR. Its horizontal field of view is 270°, its measuring range is up to 25 m, its scanning frequency is 15 Hz, and its angular resolution is 0.33°. Its measuring accuracy is better than ±20 mm [129]. | 991 | Mobile robot | China |
3D LiDAR | Leica | BLK360 | Portable 3D laser scanner. Its horizontal field of view is 360°, and its vertical field of view is 300°. Its scanning range is 0.6 m to 60 m. Its scanning rate is as high as 360,000 points per second, and a panoramic scan can be done within 3 min. Its ranging accuracy is 4 mm (at 10 m) or 7 mm (at 20 m), and the accuracy of the constructed 3D point cloud is 6 mm (at 10 m) or 8 mm (at 20 m) [130]. | 22,875 | Surveying Engineering | Switzerland |
3D LiDAR | Faro | FocusS Plus 350 | Terrestrial 3D laser scanner. Its horizontal field of view is 360°, and its vertical field of view is 300°. Its measuring distance is 0.6 m to 350 m, its scanning speed can be as high as 2,000,000 points per second, and the vertical and horizontal scanning steps are both 0.009°. Its ranging error is ±1 mm, and its ranging noise is 0.1 mm to 1.6 mm [131]. | 48,800 | Surveying Engineering | USA |
3D LiDAR | Velodyne | VLP-16 (Puck) | Sixteen-line mechanical LiDAR with a maximum measurement range of 100 m. Its horizontal field of view is 360°, and its vertical field of view is −15° to 15°. Its horizontal angular resolution is 0.1° to 0.4°, and its vertical angular resolution is 2°. Its scanning rate can be as high as 300,000 points per second, and its scanning frequency is 5 Hz to 20 Hz. The typical value of its measuring accuracy is ±3 cm [132]. | 5338 | Automatic driving | USA |
3D LiDAR | Velodyne | HDL-32E | Thirty-two-line mechanical LiDAR with a maximum measurement range of 100 m. Its horizontal field of view is 360°, and its vertical field of view is −30° to 10°. Its horizontal angular resolution is 0.1° to 0.4°, and its vertical angular resolution is 1.33°. Its scanning rate can be as high as 700,000 points per second, and its scanning frequency is 5 Hz to 20 Hz. The typical value of its measuring accuracy is ±2 cm [133]. | 58,064 | Automatic driving | USA |
3D LiDAR | Velodyne | HDL-64E | Sixty-four-line mechanical LiDAR with a maximum measurement range of 120 m. Its horizontal field of view is 360°, and its vertical field of view is −24.8° to 2°. Its horizontal angular resolution is 0.08°, and its vertical angular resolution is 0.4°. Its scanning rate can be as high as 2,200,000 points per second, and its scanning frequency is 5 Hz to 20 Hz. The typical value of its measuring accuracy is ±2 cm [7]. | 103,700 | Automatic driving | USA |
3D LiDAR | Vanjee Technology | WLR-736 | Sixteen-line mechanical LiDAR with a maximum measurement range of 200 m. Its horizontal field of view is 145°, and its vertical field of view is 7.75°. Its horizontal angular resolution is 0.1° to 0.5°, and its vertical angular resolution is 0.56° to 0.6°. Its scanning frequency is 10, 20, 30, 40, or 50 Hz (optional). Its measuring accuracy is ±6 cm [134]. | 3813 | Automatic driving | China |
3D LiDAR | Vanjee Technology | WLR-732 | Thirty-two-line mechanical LiDAR with a maximum measurement range of 200 m. Its horizontal field of view is 360°, and its vertical field of view is 24° (−12° to 12°). Its horizontal angular resolution is 0.1° to 0.4°, and its vertical angular resolution is 0.75°. Its scanning frequency is 5, 10, 15, or 20 Hz (optional). Its measuring accuracy is ±6 cm [135]. | 16,775 | Automatic driving | China |
3D LiDAR | Hesai Photonics Technology | Pandar40 | Forty-line mechanical LiDAR with a measurement range of 0.3 m to 200 m. Its horizontal field of view is 360°, and its vertical field of view is −16° to 7°. Its scanning frequency is 10 Hz or 20 Hz (optional), with a corresponding horizontal angular resolution of 0.2° or 0.4°, respectively. Its vertical angular resolution is 0.33° (within the vertical field of view of −6° to 2°) or 1° (within the vertical fields of view of −16° to −6° and 2° to 7°). Its measuring accuracy is ±50 mm (0.3 m to 0.5 m) or ±20 mm (0.5 m to 200 m) [136]. | 30,500 | Automatic driving | China |
3D LiDAR | Hesai Photonics Technology | Pandar64 | Sixty-four-line mechanical LiDAR with a measurement range of 0.3 m to 200 m. Its horizontal field of view is 360°, and its vertical field of view is −25° to 15°. Its scanning frequency is 10 Hz or 20 Hz (optional), with a corresponding horizontal angular resolution of 0.2° or 0.4°, respectively. Its minimum vertical angular resolution is 0.167°. Its measuring accuracy is ±50 mm (0.3 m to 0.5 m) or ±20 mm (0.5 m to 200 m) [137]. | 68,625 | Automatic driving | China |
3D LiDAR | Robosense | RS-LiDAR-16 | Sixteen-line mechanical LiDAR with a measurement range of 0.4 m to 150 m. Its horizontal field of view is 360°, and its vertical field of view is 30°. Its horizontal angular resolution is 0.1°, 0.2°, or 0.4° (optional), and its vertical angular resolution is 2°. Its scanning frequency is 5, 10, or 20 Hz (optional). Its measurement accuracy is ±20 mm [138]. | 4270 | Automatic driving | China |
3D LiDAR | Robosense | RS-LiDAR-32 | Thirty-two-line mechanical LiDAR with a measurement range of 0.4 m to 200 m. Its horizontal field of view is 360°, and its vertical field of view is 40°. Its horizontal angular resolution is 0.1°, 0.2°, or 0.4° (optional), and its vertical angular resolution is 0.33°. Its scanning frequency is 5, 10, or 20 Hz (optional). Its measurement accuracy is ±30 mm [139]. | 19,520 | Automatic driving | China |
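A quick way to read the single-line specifications above is to turn field of view, angular resolution, and scanning frequency into a point budget: points per scan ≈ field of view / angular resolution, and points per second ≈ points per scan × scanning frequency. The following is a minimal sketch (Python; the helper function is ours, the numbers come from the LMS111 and UST-10LX rows above, and actual counts depend on each device's sampling scheme):

```python
def scan_budget(fov_deg: float, ang_res_deg: float, scan_hz: float) -> tuple[int, float]:
    """Approximate points per scan and points per second of a single-line LiDAR."""
    points_per_scan = int(round(fov_deg / ang_res_deg)) + 1  # samples across the scanning sector
    return points_per_scan, points_per_scan * scan_hz

# Sick LMS111: 270 deg field of view, 0.25 deg resolution, 25 Hz (see the row above)
print(scan_budget(270.0, 0.25, 25.0))   # -> (1081, 27025.0)
# Hokuyo UST-10LX: 270 deg field of view, 0.25 deg resolution, 40 Hz
print(scan_budget(270.0, 0.25, 40.0))   # -> (1081, 43240.0)
```

Comparing these budgets with the multi-line devices' 300,000 to 2,200,000 points per second makes the density gap between a static 2D LiDAR and a 3D LiDAR concrete.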
Appendix B
Symbol | Explanation |
---|---|
L | The coordinate frame of the 2D LiDAR. |
L′ | The coordinate frame of the 2D LiDAR at another position. |
W | World coordinate frame. |
p | A sampling point of the 2D LiDAR. |
r | The ranging data of the sampling point p. |
θ | The azimuth angle of the sampling point p. |
pL | The coordinate of sampling point p relative to the coordinate frame L of the 2D LiDAR. |
pW | The coordinate of sampling point p relative to the world coordinate frame W. |
R_L^W | The rotation matrix from coordinate frame L to coordinate frame W. |
t_L^W | The translation vector from coordinate frame L to coordinate frame W. |
P | The coordinate frame of the prototype. |
M | The coordinate frame of the mobile platform. |
pP | The coordinate of sampling point p relative to the coordinate frame P of the prototype. |
R_L^P | The rotation matrix from coordinate frame L to coordinate frame P. |
t_L^P | The translation vector from coordinate frame L to coordinate frame P. |
pM | The coordinate of sampling point p relative to the coordinate frame M of the mobile platform. |
R_P^M | The rotation matrix from coordinate frame P to coordinate frame M. |
t_P^M | The translation vector from coordinate frame P to coordinate frame M. |
R_M^W | The rotation matrix from coordinate frame M to coordinate frame W. |
t_M^W | The translation vector from coordinate frame M to coordinate frame W. |
R_L^M | The rotation matrix from coordinate frame L to coordinate frame M. |
t_L^M | The translation vector from coordinate frame L to coordinate frame M. |
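Taken together, the symbols above describe a chain of rigid-body transforms: a 2D sample (r, θ) is first expressed in the LiDAR frame as pL = (r cos θ, r sin θ, 0) and then mapped through the prototype frame P and the mobile-platform frame M into the world frame W, applying p' = R p + t at each step. The following is a minimal sketch of that chain (Python/NumPy; the variable names mirror the Appendix B symbols, the sub/superscript convention is assumed, and the identity transforms are placeholders for calibrated or estimated values):

```python
import numpy as np

def point_in_lidar_frame(r: float, theta: float) -> np.ndarray:
    """2D LiDAR sample (range r, azimuth theta) expressed in frame L."""
    return np.array([r * np.cos(theta), r * np.sin(theta), 0.0])

def transform(R: np.ndarray, t: np.ndarray, p: np.ndarray) -> np.ndarray:
    """Apply one rigid-body transform: p' = R @ p + t."""
    return R @ p + t

# Chain L -> P -> M -> W using the Appendix B transforms.
# Identity/zero values are placeholders for real calibration and pose estimates.
R_L_P, t_L_P = np.eye(3), np.zeros(3)   # LiDAR to prototype (calibration)
R_P_M, t_P_M = np.eye(3), np.zeros(3)   # prototype to mobile platform (mounting)
R_M_W, t_M_W = np.eye(3), np.zeros(3)   # mobile platform to world (odometry/SLAM)

p_L = point_in_lidar_frame(r=2.0, theta=np.deg2rad(30.0))
p_P = transform(R_L_P, t_L_P, p_L)
p_M = transform(R_P_M, t_P_M, p_P)
p_W = transform(R_M_W, t_M_W, p_M)      # coordinates of the sample in the world frame
```

Every accuracy problem listed in Section 4.1 can be read as an error in one link of this chain: Problems 1.2 to 1.4 corrupt (R_L^P, t_L^P), Problem 1.5 corrupts (R_P^M, t_P^M), and Problem 1.6 corrupts (R_M^W, t_M^W).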
References
- Yilmaz, V. Automated ground filtering of LiDAR and UAS point clouds with metaheuristics. Opt. Laser Technol. 2021, 138, 106890. [Google Scholar] [CrossRef]
- Jarén, R.R.; Arranz, J.J. Automatic segmentation and classification of BIM elements from point clouds. Autom. Constr. 2021, 124, 103576. [Google Scholar] [CrossRef]
- Javanmardi, E.; Javanmardi, M.; Gu, Y.; Kamijo, S. Autonomous vehicle self-localization based on multilayer 2D vector map and multi-channel LiDAR. In Proceedings of the 2017 IEEE Intelligent Vehicles Symposium (IV), Los Angeles, CA, USA, 11–14 June 2017. [Google Scholar]
- Briechle, S.; Krzystek, P.; Vosselman, G. Silvi-Net—A dual-CNN approach for combined classification of tree species and standing dead trees from remote sensing data. Int. J. Appl. Earth Obs. Geoinf. 2021, 98, 102292. [Google Scholar] [CrossRef]
- Estornell, J.; Hadas, E.; Martí, J.; López-Cortés, I. Tree extraction and estimation of walnut structure parameters using airborne LiDAR data. Int. J. Appl. Earth Obs. Geoinf. 2021, 96, 102273. [Google Scholar] [CrossRef]
- Slamtec Rplidar A1. Available online: http://www.slamtec.com/cn/Lidar/A1Spec (accessed on 20 December 2020).
- Velodyne HDL-64E. Available online: https://velodynelidar.com/products/hdl-64e/ (accessed on 13 April 2021).
- Kang, X.; Yin, S.; Fen, Y. 3D Reconstruction & Assessment Framework based on affordable 2D Lidar. In Proceedings of the 2018 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Auckland, New Zealand, 9–12 July 2018; pp. 292–297. [Google Scholar]
- Palacín, J.; Martínez, D.; Rubies, E.; Clotet, E. Mobile Robot Self-Localization with 2D Push-Broom LIDAR in a 2D Map. Sensors 2020, 20, 2500. [Google Scholar] [CrossRef]
- Morales, J.; Martínez, J.; Mandow, A.; Reina, A.; Pequenoboter, A.; García-Cerezo, A. Boresight Calibration of Construction Misalignments for 3D Scanners Built with a 2D Laser Rangefinder Rotating on Its Optical Center. Sensors 2014, 14, 20025–20040. [Google Scholar] [CrossRef] [Green Version]
- Morales, J.; Plazaleiva, V.; Mandow, A.; Gomezruiz, J.; Serón, J.; GarcíaCerezo, A. Analysis of 3D Scan Measurement Distribution with Application to a Multi-Beam Lidar on a Rotating Platform. Sensors 2018, 18, 395. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Neumann, T.; Dülberg, E.; Schiffer, S.; Ferrein, A. A Rotating Platform for Swift Acquisition of Dense 3D Point Clouds. In Proceedings of the International Conference on Intelligent Robotics and Applications, Tokyo, Japan, 22–24 August 2016; pp. 257–268. [Google Scholar]
- Pfrunder, A.; Borges, P.V.K.; Romero, A.R.; Catt, G.; Elfes, A. Real-time autonomous ground vehicle navigation in heterogeneous environments using a 3D LiDAR. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017. [Google Scholar]
- Neumann, T.; Ferrein, A.; Kallweit, S.; Scholl, I. Towards a Mobile Mapping Robot for Underground Mines. In Proceedings of the 2014 PRASA, RobMech and AfLaT International Joint Symposium, Cape Town, South Africa, 27–28 November 2014. [Google Scholar]
- Mandow, A.; Morales, J.; Gomez-Ruiz, J.A.; Garcia-Cerezo, A.J. Optimizing Scan Homogeneity for Building Full-3D Lidars Based on Rotating a Multi-Beam Velodyne Range-Finder. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018. [Google Scholar]
- Wen, C.; Sun, X.; Hou, S.; Tan, J.; Dai, Y.; Wang, C.; Li, J. Line Structure-Based Indoor and Outdoor Integration Using Backpacked and TLS Point Cloud Data. IEEE Geosci. Remote Sens. Lett. 2018, 15, 1790–1794. [Google Scholar] [CrossRef]
- Gong, Z.; Wen, C.; Wang, C.; Li, J. A Target-Free Automatic Self-Calibration Approach for Multibeam Laser Scanners. IEEE Trans. Instrum. Meas. 2017, 67, 238–240. [Google Scholar] [CrossRef]
- Wang, C.; Hou, S.; Wen, C.; Gong, Z.; Li, Q.; Sun, X.; Li, J. Semantic line framework-based indoor building modeling using backpacked laser scanning point cloud. ISPRS J. Photogramm. Remote Sens. 2018, 143, 150–166. [Google Scholar] [CrossRef]
- Vlaminck, M.; Luong, H.; Goeman, W.; Philips, W. 3D Scene Reconstruction Using Omnidirectional Vision and LiDAR: A Hybrid Approach. Sensors 2016, 16, 1923. [Google Scholar] [CrossRef] [Green Version]
- Alismail, H.; Browning, B. Automatic Calibration of Spinning Actuated Lidar Internal Parameters. J. Field Robot. 2015, 32, 723–747. [Google Scholar] [CrossRef]
- Kang, J.; Doh, N.L. Full-DOF Calibration of a Rotating 2-D LIDAR with a Simple Plane Measurement. IEEE Trans. Robot. 2016, 32, 1245–1263. [Google Scholar] [CrossRef]
- Gao, Z.; Huang, J.; Yang, X.; An, P. Calibration of rotating 2D LIDAR based on simple plane measurement. Sens. Rev. 2019, 39, 190–198. [Google Scholar] [CrossRef]
- Yadan, Z.; Heng, Y.; Houde, D.; Shuang, S.; Mingqiang, L.; Bo, S.; Wei, J.; Max, M. An Improved Calibration Method for a Rotating 2D LIDAR System. Sensors 2018, 18, 497. [Google Scholar]
- Martinez, J.L.; Morales, J.; Reina, A.J.; Mandow, A.; Pequeno-Boter, A.; Garcia-Cerezo, A.; IEEE. Construction and Calibration of a Low-Cost 3D Laser Scanner with 360 degrees Field of View for Mobile Robots. In Proceedings of the 2015 IEEE International Conference on Industrial Technology (ICIT), Seville, Spain, 17–19 March 2015; pp. 149–154. [Google Scholar]
- Murcia, H.F.; Monroy, M.F.; Mora, L.F. 3D Scene Reconstruction Based on a 2D Moving LiDAR. In International Conference on Applied Informatics; Springer: Cham, Switzerland, 2018. [Google Scholar]
- Petr, O.; Michal, K.; Pavel, M.; David, S. Calibration of Short Range 2D Laser Range Finder for 3D SLAM Usage. J. Sens. 2016, 2016, 1–13. [Google Scholar]
- Oberlander, J.; Pfotzer, L.; Roennau, A.; Dillmann, R. Fast calibration of rotating and swivelling 3-D laser scanners exploiting measurement redundancies. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015. [Google Scholar]
- Kurnianggoro, L.; Hoang, V.D.; Jo, K.H. Calibration of Rotating 2D Laser Range Finder Using Circular Path on Plane Constraints. In New Trends in Computational Collective Intelligence; Springer: Cham, Switzerland, 2015. [Google Scholar]
- Kurnianggoro, L.; Hoang, V.D.; Jo, K.H. Calibration of a 2D laser scanner system and rotating platform using a point-plane constraint. Comput. Sci. Inf. Syst. 2015, 12, 307–322. [Google Scholar] [CrossRef]
- Pfotzer, L.; Oberlaender, J.; Roennau, A.; Dillmann, R. Development and calibration of KaRoLa, a compact, high-resolution 3D laser scanner. In Proceedings of the 2014 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Hokkaido, Japan, 27–30 October 2014. [Google Scholar]
- Lin, C.C.; Liao, Y.D.; Luo, W.J. Calibration method for extending single-layer LIDAR to multi-layer LIDAR. In Proceedings of the 2013 IEEE/SICE International Symposium on System Integration (SII), Honolulu, HI, USA, 12–15 January 2013. [Google Scholar]
- Choi, D.-G.; Bok, Y.; Kim, J.-S.; Kweon, I.S. Extrinsic Calibration of 2-D Lidars Using Two Orthogonal Planes. IEEE Trans. Robot. 2015, 32, 83–98. [Google Scholar] [CrossRef]
- Chen, J.; Quan, S.; Quan, Y.; Guo, Q. Calibration Method of Relative Position and Pose between Dual Two-Dimensional Laser Radar. Chin. J. Lasers 2017, 44, 152–160. [Google Scholar] [CrossRef]
- He, M.; Zhao, H.; Cui, J.; Zha, H. Calibration method for multiple 2D LIDARs system. In Proceedings of the 2014 IEEE International Conference on Robotics & Automation, Hong Kong, China, 31 May–7 June 2014. [Google Scholar]
- He, M.; Zhao, H.; Davoine, F.; Cui, J.; Zha, H. Pairwise LIDAR calibration using multi-type 3D geometric features in natural scene. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013. [Google Scholar]
- Baldwin, I.; Newman, P. Laser-only road-vehicle localization with dual 2D push-broom LIDARS and 3D priors. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots & Systems, Algarve, Portugal, 7–12 October 2012. [Google Scholar]
- Newman, P.M.; Baldwin, I. Generation of 3D Models of an Environment. U.S. Patent WO2014128498A2, 28 August 2014. Available online: https://patentimages.storage.googleapis.com/9c/d8/c3/cf9155249ecc3a/US10109104.pdf (accessed on 26 April 2021).
- Bosse, M.; Zlot, R. Continuous 3D scan-matching with a spinning 2D laser. In Proceedings of the IEEE International Conference on Robotics & Automation, Kobe, Japan, 12–17 May 2009. [Google Scholar]
- Zheng, F.; Shibo, Z.; Shiguang, W.; Yu, Z. A Real-Time 3D Perception and Reconstruction System Based on a 2D Laser Scanner. J. Sens. 2018, 2018, 1–14. [Google Scholar] [CrossRef]
- Almqvist, H.; Magnusson, M.; Lilienthal, A.J. Improving Point Cloud Accuracy Obtained from a Moving Platform for Consistent Pile Attack Pose Estimation. J. Intell. Robot. Syst. Theory Appl. 2014, 75, 101–128. [Google Scholar] [CrossRef]
- Zhang, J.; Singh, S. LOAM: Lidar Odometry and Mapping in real-time. In Proceedings of the Robotics: Science and Systems Conference (RSS), Berkeley, CA, USA, 12–16 July 2014. [Google Scholar]
- Zhang, T.; Nakamura, Y. Moving Humans Removal for Dynamic Environment Reconstruction from Slow-Scanning LIDAR Data. In Proceedings of the 2018 15th International Conference on Ubiquitous Robots (UR), Jeju, Korea, 28 June–1 July 2018. [Google Scholar]
- Kim, J.; Jeong, H.; Lee, D. Single 2D lidar based follow-me of mobile robot on hilly terrains. J. Mech. Sci. Technol. 2020, 34, 1–10. [Google Scholar]
- Dewan, A.; Caselitz, T.; Tipaldi, G.D.; Burgard, W. Motion-based detection and tracking in 3D LiDAR scans. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016. [Google Scholar]
- Chu, P.M.; Cho, S.; Sim, S.; Kwak, K.; Park, Y.W.; Cho, K. Removing past data of dynamic objects using static Velodyne LiDAR sensor. In Proceedings of the 2016 16th International Conference on Control, Automation and Systems (ICCAS), Gyeongju, Korea, 16–19 October 2016. [Google Scholar]
- Morton, P.; Douillard, B.; Underwood, J. An evaluation of dynamic object tracking with 3D LIDAR. In Proceedings of the Australasian Conference on Robotics and Automation, Melbourne, Australia, 7–9 December 2011. [Google Scholar]
- Spinello, L.; Arras, K.O.; Triebel, R.; Siegwart, R. A Layered Approach to People Detection in 3D Range Data. In Proceedings of the Twenty-fourth Aaai Conference on Artificial Intelligence, Atlanta, GA, USA, 11–15 July 2010. [Google Scholar]
- Wulf, O.; Wagner, B. Fast 3D scanning methods for laser measurement systems. In Proceedings of the International Conference on Control Systems and Computer Science, CSCS14, Bucharest, Romania, 2–5 July 2003. [Google Scholar]
- Ueda, T.; Kawata, H.; Tomizawa, T.; Ohya, A.; Yuta, S.I. Mobile SOKUIKI Sensor System-Accurate Range Data Mapping System with Sensor Motion. In Proceedings of the 2006 International Conference on Autonomous Robots and Agents, Palmerston North, New Zealand, 12–14 December 2006. [Google Scholar]
- Ohno, K.; Kawahara, T.; Tadokoro, S. Development of 3D laser scanner for measuring uniform and dense 3D shapes of static objects in dynamic environment. In Proceedings of the IEEE International Conference on Robotics & Biomimetics, Guilin, China, 19–23 December 2009. [Google Scholar]
- Yoshida, T.; Irie, K.; Koyanagi, E.; Tomono, M. A sensor platform for outdoor navigation using gyro-assisted odometry and roundly-swinging 3D laser scanner. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010. [Google Scholar]
- Matsumoto, M. 3D laser range sensor module with roundly swinging mechanism for fast and wide view range image. In Proceedings of the IEEE Conference on Multisensor Fusion and Integration, Salt Lake City, UT, USA, 5–7 September 2010. [Google Scholar]
- Schubert, S.; Neubert, P.; Protzel, P. How to Build and Customize a High-Resolution 3D Laserscanner Using Off-the-shelf Components. In Proceedings of the Conference towards Autonomous Robotic Systems, Sheffield, UK, 26 June–1 July 2016. [Google Scholar]
- Ocando, M.G.; Certad, N.; Alvarado, S.; Terrones, N. Autonomous 2D SLAM and 3D mapping of an environment using a single 2D LIDAR and ROS. In Proceedings of the 2017 Latin American Robotics Symposium (LARS) and 2017 Brazilian Symposium on Robotics (SBR), Curitiba, Brazil, 8–10 November 2017. [Google Scholar]
- Wu, Q.; Sun, K.; Zhang, W.; Huang, C.; Wu, X. Visual and LiDAR-based for the mobile 3D mapping. In Proceedings of the IEEE International Conference on Robotics & Biomimetics, Qingdao, China, 3–7 December 2016. [Google Scholar]
- Brenneke, C.; Wulf, O.; Wagner, B. Using 3D Laser Range Data for SLAM in Outdoor Environments. In Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), Las Vegas, NV, USA, 27–31 October 2003. [Google Scholar]
- Wen, C.; Qin, L.; Zhu, Q.; Wang, C. Three-Dimensional Indoor Mobile Mapping with Fusion of Two-Dimensional Laser Scanner and RGB-D Camera Data. IEEE Geosci. Remote Sens. Lett. 2013, 11, 843–847. [Google Scholar]
- Cong, P.; Xunyu, Z.; Huosheng, H.; Jun, T.; Xiafu, P.; Jianping, Z. Adaptive Obstacle Detection for Mobile Robots in Urban Environments Using Downward-Looking 2D LiDAR. Sensors 2018, 18, 1749. [Google Scholar]
- Demir, S.O.; Ertop, T.E.; Koku, A.B.; Konukseven, E.I. An adaptive approach for road boundary detection using 2D LIDAR sensor. In Proceedings of the 2017 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), Daegu, Korea, 16–18 November 2017. [Google Scholar]
- Xu, W.; Zhuang, Y.; Hu, H.; Zhao, Y. Real-time road detection and description for robot navigation in an unstructured campus environment. In Proceedings of the 11th World Congress on Intelligent Control and Automation, Shenyang, China, 29 June–4 July 2014. [Google Scholar]
- Wang, X.; Cai, Y.; Shi, T. Road edge detection based on improved RANSAC and 2D LIDAR Data. In Proceedings of the International Conference on Control, Jeju Island, Korea, 25–28 November 2015. [Google Scholar]
- Dias, P.; Matos, M.; Santos, V. 3D Reconstruction of Real World Scenes Using a Low-Cost 3D Range Scanner. Comput. Aided Civil. Infrastruct. Eng. 2010, 21, 486–497. [Google Scholar] [CrossRef]
- Li, J.; He, X.; Li, J. 2D LiDAR and Camera Fusion in 3D Modeling of Indoor Environment. In Proceedings of the 2015 National Aerospace and Electronics Conference (NAECON), Penang, Malaysia, 27–29 November 2015. [Google Scholar]
- Wang, S.; Zhuang, Y.; Zheng, K.; Wang, W. 3D Scene Reconstruction Using Panoramic Laser Scanning and Monocular Vision. In Proceedings of the 2010 8th World Congress on Intelligent Control and Automation, Jinan, China, 7–9 July 2010. [Google Scholar]
- Alismail, H.; Baker, L.D.; Browning, B. Automatic Calibration of a Range Sensor and Camera System. In Proceedings of the 2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission, Zurich, Switzerland, 13–15 October 2012. [Google Scholar]
- Scaramuzza, D.; Harati, A.; Siegwart, R. Extrinsic self calibration of a camera and a 3D laser range finder from natural scenes. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots & Systems, San Diego, CA, USA, 29 October–2 November 2007. [Google Scholar]
- Shaukat, A.; Blacker, P.; Spiteri, C.; Gao, Y. Towards Camera-LIDAR Fusion-Based Terrain Modelling for Planetary Surfaces: Review and Analysis. Sensors 2016, 16, 1952. [Google Scholar] [CrossRef] [Green Version]
- Weingarten, J.W.; Siegwart, R. 3D SLAM using planar segments. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots & Systems, Beijing, China, 9–15 October 2006. [Google Scholar]
- Morales, J.; Martinez, J.L.; Mandow, A.; Pequenoboter, A.; Garciacerezo, A. Design and development of a fast and precise low-cost 3D laser rangefinder. In Proceedings of the International Conference on Mechatronics, Beijing, China, 7–10 August 2011; pp. 621–626. [Google Scholar]
- Baldwin, I.; Newman, P. Road vehicle localization with 2D push-broom LIDAR and 3D priors. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012. [Google Scholar]
- Napier, A.; Corke, P.; Newman, P. Cross-calibration of push-broom 2D LIDARs and cameras in natural scenes. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013. [Google Scholar]
- Wen, C.; Pan, S.; Wang, C.; Li, J. An Indoor Backpack System for 2-D and 3-D Mapping of Building Interiors. IEEE Geosci. Remote Sens. Lett. 2016, 13, 992–996. [Google Scholar] [CrossRef]
- Liu, T.; Carlberg, M.; Chen, G.; Chen, J.; Zakhor, A. Indoor localization and visualization using a human-operated backpack system. In Proceedings of the 2010 International Conference on Indoor Positioning and Indoor Navigation, Zurich, Switzerland, 15–17 September 2010. [Google Scholar]
- Bok, Y.; Choi, D.; Jeong, Y.; Kweon, I.S. Capturing village-level heritages with a hand-held camera-laser fusion sensor. In Proceedings of the IEEE International Conference on Computer Vision Workshops, Barcelona, Spain, 6–13 November 2011. [Google Scholar]
- Dong-Geol, C.; Yunsu, B.; Jun-Sik, K.; Inwook, S.; In, K. Structure-From-Motion in 3D Space Using 2D Lidars. Sensors 2017, 17, 242. [Google Scholar]
- Winkvist, S.; Rushforth, E.; Young, K. Towards an autonomous indoor aerial inspection vehicle. Ind. Robot. 2013, 40, 196–207. [Google Scholar] [CrossRef]
- Wang, A.; Li, C.; Liu, Y.; Zhuang, Y.; Bu, C.; Xiao, J. Laser-based Online Sliding-window Approach for UAV Loop-closure Detection in Urban Environments. Int. J. Adv. Robot. Syst. 2017, 13, 1–11. [Google Scholar] [CrossRef]
- Mcgarey, P.; Yoon, D.; Tang, T.; Pomerleau, F.; Barfoot, T.D. Developing and deploying a tethered robot to map extremely steep terrain. J. Field Robot. 2018, 35, 1327–1341. [Google Scholar] [CrossRef]
- Kaul, L.; Zlot, R.; Bosse, M. Continuous-Time Three-Dimensional Mapping for Micro Aerial Vehicles with a Passively Actuated Rotating Laser Scanner. J. Field Robot. 2015, 33, 103–132. [Google Scholar] [CrossRef]
- Bosse, M.; Zlot, R.; Flick, P. Zebedee: Design of a Spring-Mounted 3-D Range Sensor with Application to Mobile Mapping. IEEE Trans. Robot. 2012, 28, 1104–1119. [Google Scholar] [CrossRef]
- Bosse, M.; Zlot, R. Place recognition using keypoint voting in large 3D lidar datasets. In Proceedings of the IEEE International Conference on Robotics & Automation, Karlsruhe, Germany, 6–10 May 2013. [Google Scholar]
- Leica. Available online: https://shop.leica-geosystems.com/ (accessed on 12 April 2021).
- Faro. Available online: https://www.faro.com/ (accessed on 26 April 2021).
- Velodyne. Available online: https://velodynelidar.com/ (accessed on 12 April 2021).
- Desai, A.; Huber, D. Objective Evaluation of Scanning Ladar Configurations for Mobile Robots. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots & Systems, St. Louis, MO, USA, 11–15 October 2009. [Google Scholar]
- Son, Y.; Yoon, S.; Oh, S.Y.; Han, S. A Lightweight and Cost-Effective 3D Omnidirectional Depth Sensor Based on Laser Triangulation. IEEE Access 2019, 7, 58740–58750. [Google Scholar] [CrossRef]
- Kimoto, K.; Asada, N.; Mori, T.; Hara, Y.; Yuta, S.I. Development of small size 3D LIDAR. In Proceedings of the IEEE International Conference on Robotics & Automation, Hong Kong, China, 31 May–7 June 2014. [Google Scholar]
- Hu, C.; Huang, Z.; Qin, S. A New 3D Imaging Lidar Based on the High-Speed 2D Laser Scanner; SPIE—The International Society for Optical Engineering: Bellingham, WA, USA, 2012. [Google Scholar]
- Ryde, J.; Hu, H. Mobile Robot 3D Perception and Mapping without Odometry Using Multi-Resolution Occupancy Lists. In Proceedings of the 2007 International Conference on Mechatronics and Automation, Harbin, China, 5–9 August 2007; pp. 331–336. [Google Scholar]
- Park, C.S.; Kim, D.; You, B.J.; Oh, S.R. Characterization of the Hokuyo UBG-04LX-F01 2D laser rangefinder. In Proceedings of the 19th International Symposium in Robot and Human Interactive Communication, Viareggio, Italy, 13–15 September 2010. [Google Scholar]
- Okubo, Y.; Ye, C.; Borenstein, J. Characterization of the Hokuyo URG-04LX laser rangefinder for mobile robot obstacle negotiation. Proc. SPIE Int. Soc. Opt. Eng. 2009, 7332, 733212. [Google Scholar]
- Ueda, T.; Kawata, H.; Tomizawa, T.; Ohya, A.; Yuta, S.I. Visual Information Assist System Using 3D SOKUIKI Sensor for Blind People, System Concept and Object Detecting Experiments. In Proceedings of the Conference of the IEEE Industrial Electronics Society, Paris, France, 7–10 November 2006. [Google Scholar]
- Raymond, S.; Nawid, J.; Mohammed, K.; Claude, S. A Low-Cost, Compact, Lightweight 3D Range Sensor. In Proceedings of the Australian Conference on Robotics and Automation, Auckland, New Zealand, 6–8 December 2006. [Google Scholar]
- Matsumoto, M.; Yuta, S. 3D SOKUIKI sensor module with roundly swinging mechanism for taking wide-field range and reflection intensity image in high speed. In Proceedings of the IEEE International Conference on Robotics and Biomimetics, Phuket, Thailand, 7–11 December 2011. [Google Scholar]
- Nasrollahi, M.; Bolourian, N.; Zhu, Z.; Hammad, A. Designing LiDAR-equipped UAV Platform for Structural Inspection. In Proceedings of the 34th International Symposium on Automation and Robotics in Construction, Taipei, Taiwan, 27–30 June 2017. [Google Scholar]
- Nagatani, K.; Tokunaga, N.; Okada, Y.; Yoshida, K. Continuous Acquisition of Three-Dimensional Environment Information for Tracked Vehicles on Uneven Terrain. In Proceedings of the IEEE International Workshop on Safety, Sendai, Japan, 21–24 October 2008. [Google Scholar]
- Walther, M.; Steinhaus, P.; Dillmann, R. A foveal 3D laser scanner integrating texture into range data. In Proceedings of the International Conference on Intelligent Autonomous Systems 9-ias, Tokyo, Japan, 7–9 March 2006. [Google Scholar]
- Bertussi, S. Spin_Hokuyo—ROS Wiki. Available online: http://wiki.ros.org/spin_hokuyo (accessed on 13 February 2021).
- Yuan, C.; Bi, S.; Cheng, J.; Yang, D.; Wang, W. Low-Cost Calibration of Matching Error between Lidar and Motor for a Rotating 2D Lidar. Appl. Sci. 2021, 11, 913. [Google Scholar] [CrossRef]
- Ozbay, B.; Kuzucu, E.; Gul, M.; Ozturk, D.; Tasci, M.; Arisoy, A.M.; Sirin, H.O.; Uyanik, I. A high frequency 3D LiDAR with enhanced measurement density via Papoulis-Gerchberg. In Proceedings of the 2015 International Conference on Advanced Robotics (ICAR), Istanbul, Turkey, 27–31 July 2015; pp. 543–548. [Google Scholar]
- Kuzucu, E.; Öztürk, D.; Gül, M.; Özbay, B.; Arisoy, A.M.; Sirin, H.O.; Uyanik, I. Enhancing 3D range image measurement density via dynamic Papoulis–Gerchberg algorithm. Trans. Inst. Meas. Control 2018, 40, 4407–4420. [Google Scholar] [CrossRef]
- Yang, D.; Bi, S.; Wang, W.; Qi, X.; Cai, Y. DRE-SLAM: Dynamic RGB-D Encoder SLAM for a Differential-Drive Robot. Remote Sens. 2019, 11, 380. [Google Scholar] [CrossRef] [Green Version]
- YOLO. Available online: https://pjreddie.com/darknet/yolo/ (accessed on 16 February 2021).
- Li, Q.; Dai, B.; Fu, H. LIDAR-based dynamic environment modeling and tracking using particles based occupancy grid. In Proceedings of the 2016 IEEE International Conference on Mechatronics and Automation, Harbin, China, 7–10 August 2016. [Google Scholar]
- Qin, B.; Chong, Z.J.; Soh, S.H.; Bandyopadhyay, T.; Ang, M.H.; Frazzoli, E.; Rus, D. A Spatial-Temporal Approach for Moving Object Recognition with 2D LIDAR. In Experimental Robotics; Springer: Cham, Switzerland, 2016. [Google Scholar]
- Wang, D.Z.; Posner, I.; Newman, P. Model-free detection and tracking of dynamic objects with 2D lidar. Int. J. Robot. Res. 2015, 34, 1039–1063. [Google Scholar] [CrossRef]
- Park, Y.; Yun, S.; Won, C.; Cho, K.; Um, K.; Sim, S. Calibration between Color Camera and 3D LIDAR Instruments with a Polygonal Planar Board. Sensors 2014, 14, 5333–5353. [Google Scholar] [CrossRef] [Green Version]
- Gong, X.; Lin, Y.; Liu, J. 3D LIDAR-Camera Extrinsic Calibration Using an Arbitrary Trihedron. Sensors 2013, 13, 1902–1918. [Google Scholar] [CrossRef] [Green Version]
- Mirzaei, F.M.; Kottas, D.G.; Roumeliotis, S.I. 3D LIDAR–camera intrinsic and extrinsic calibration: Identifiability and analytical least-squares-based initialization. Int. J. Robot. Res. 2012, 31, 452–467. [Google Scholar] [CrossRef] [Green Version]
- Zhou, L.; Li, Z.; Kaess, M. Automatic Extrinsic Calibration of a Camera and a 3D LiDAR using Line and Plane Correspondences. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots & Systems, Madrid, Spain, 30 October–5 November 2018. [Google Scholar]
- Fremont, V.; Bonnifait, P. Extrinsic calibration between a multi-layer lidar and a camera. In Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, Seoul, Korea, 20–22 August 2008. [Google Scholar]
- Zhou, L.; Deng, Z. Extrinsic calibration of a camera and a lidar based on decoupling the rotation from the translation. In Proceedings of the Intelligent Vehicles Symposium, Alcalá de Henares, Spain, 3–7 June 2012. [Google Scholar]
- Weimin, W.; Ken, S.; Nobuo, K. Reflectance Intensity Assisted Automatic and Accurate Extrinsic Calibration of 3D LiDAR and Panoramic Camera Using a Printed Chessboard. Remote Sens. 2017, 9, 851. [Google Scholar]
- Hokuyo. Available online: https://www.hokuyo-aut.co.jp/ (accessed on 12 April 2021).
- Slamtec. Available online: http://www.slamtec.com/ (accessed on 12 April 2021).
- Riegl. Available online: http://www.riegl.com/ (accessed on 12 April 2021).
- Barbarella, M.; De Blasiis, M.R.; Fiani, M. Terrestrial laser scanner for the analysis of airport pavement geometry. Int. J. Pavement Eng. 2018, 20, 466–480. [Google Scholar] [CrossRef]
- Barbarella, M.; D’Amico, F.; De Blasiis, M.R.; Di Benedetto, A.; Fiani, M. Use of Terrestrial Laser Scanner for Rigid Airport Pavement Management. Sensors 2018, 18, 44. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Blasiis, M.D.; Benedetto, A.D.; Fiani, M.; Garozzo, M. Assessing of the Road Pavement Roughness by Means of LiDAR Technology. Coatings 2021, 11, 17. [Google Scholar] [CrossRef]
- De Giglio, M.; Greggio, N.; Goffo, F.; Merloni, N.; Dubbini, M.; Barbarella, M. Comparison of Pixel- and Object-Based Classification Methods of Unmanned Aerial Vehicle Data Applied to Coastal Dune Vegetation Communities: Casal Borsetti Case Study. Remote Sens. 2019, 11, 1416. [Google Scholar] [CrossRef] [Green Version]
- Barbarella, M.; Fiani, L. Application of Lidar-Derived Dem for Detection of Mass Movements on a Landslide. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 1, 159–165. [Google Scholar] [CrossRef] [Green Version]
- SICK LMS111. Available online: https://www.sick.com/ag/en/detection-and-ranging-solutions/2d-lidar-sensors/lms1xx/lms111-10100/p/p109842 (accessed on 13 April 2021).
- SICK LMS511. Available online: https://www.sick.com/ag/en/detection-and-ranging-solutions/2d-lidar-sensors/lms5xx/lms511-10100-pro/p/p215941 (accessed on 13 April 2021).
- Hokuyo UST-10LX. Available online: https://www.hokuyo-aut.co.jp/search/single.php?serial=16 (accessed on 12 November 2020).
- UTM-30LX-EW. Available online: https://www.hokuyo-aut.jp/search/single.php?serial=170 (accessed on 26 April 2021).
- Slamtec Rplidar A2M6. Available online: http://www.slamtec.com/cn/Lidar/A2Spec (accessed on 12 April 2021).
- Slamtec Rplidar A3. Available online: http://www.slamtec.com/cn/Lidar/A3Spec (accessed on 12 April 2021).
- Slamtec Rplidar S1. Available online: http://www.slamtec.com/cn/Lidar/S1Spec (accessed on 12 April 2021).
- Vanjee Technology WLR-716. Available online: http://wanji.net.cn/index.php?m=content&c=index&a=show&catid=110&id=123 (accessed on 12 April 2021).
- Leica BLK360. Available online: https://shop.leica-geosystems.com/blk360-scanner (accessed on 12 April 2021).
- Faro FocusS Plus 350. Available online: https://www.faro.com/zh-CN/Resource-Library/Tech-Sheet/techsheet-faro-focus-laser-scanners (accessed on 26 April 2021).
- Velodyne VLP-16 (Puck). Available online: https://velodynelidar.com/products/puck/ (accessed on 13 April 2021).
- Velodyne HDL-32E. Available online: https://velodynelidar.com/products/hdl-32e/ (accessed on 13 April 2021).
- Vanjee Technology WLR-736. Available online: http://wanji.net.cn/index.php?m=content&c=index&a=show&catid=99&id=86 (accessed on 13 April 2021).
- Vanjee Technology WLR-732. Available online: http://wanji.net.cn/index.php?m=content&c=index&a=show&catid=99&id=87 (accessed on 13 April 2021).
- Hesai Photonics Technology Pandar40. Available online: https://www.hesaitech.com/zh/Pandar40 (accessed on 13 April 2021).
- Hesai Photonics Technology Pandar64. Available online: https://www.hesaitech.com/zh/Pandar64 (accessed on 13 April 2021).
- Robosense RS-LiDAR-16. Available online: https://www.robosense.cn/rslidar/rs-lidar-16 (accessed on 13 April 2021).
- Robosense RS-LiDAR-32. Available online: https://www.robosense.cn/rslidar/RS-LiDAR-32 (accessed on 13 April 2021).
Serial Numbers | Categories | The Movement of 2D LiDAR | Characteristics |
---|---|---|---|
1 | A rotating 2D LiDAR | The 2D LiDAR is rotated regularly around the middle line of the scanning sector [20,38,48]. | |
2 | A pitching 2D LiDAR | The 2D LiDAR is rotated regularly around the perpendicular of the middle line of the scanning sector [68,69]. | |
3 | A push-broom 2D LiDAR | The 2D LiDAR is rigidly fixed to the mobile platform [16,36,37,70,71,72,73,74,75,76,77]. | |
4 | An irregularly rotating 2D LiDAR | The rotation of the 2D LiDAR is non-periodic and irregular [78,79]. | |
5 | An obliquely rotating 2D LiDAR | The 2D LiDAR is rotated obliquely so that the distribution of the collected 3D point cloud is grid-like [50,51,52,53]. | |
6 | An irregularly moving 2D LiDAR | The movement of the 2D LiDAR is irregular and is recorded by an IMU (inertial measurement unit) [80,81]. | |
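For category 1 in the table above, the transform between the LiDAR frame L and the prototype frame P (R_L^P in Appendix B) varies with the motor angle: the scanning plane is swept about the middle line of the sector. The following is a minimal geometric sketch (Python/NumPy; it assumes the rotation axis is the x-axis of L, i.e., the θ = 0 direction, which is a convention choice rather than any specific prototype's exact setup):

```python
import numpy as np

def rot_x(phi: float) -> np.ndarray:
    """Rotation about the x-axis by angle phi (radians)."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def rotating_lidar_point(r: float, theta: float, phi: float) -> np.ndarray:
    """3D point from a category-1 prototype: the 2D sample (r, theta) in the
    scanning plane is rotated by the motor angle phi about the middle line of
    the scanning sector (assumed here to be the x-axis of frame L)."""
    p_scan = np.array([r * np.cos(theta), r * np.sin(theta), 0.0])
    return rot_x(phi) @ p_scan

# One sample taken while the motor stands at 45 degrees
print(rotating_lidar_point(r=3.0, theta=np.deg2rad(10.0), phi=np.deg2rad(45.0)))
```

Samples near θ = 0 lie close to the rotation axis and are revisited on every revolution, which is why the point density of a rotating 2D LiDAR concentrates along that axis (Problem 3.1).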
Problems | Categories of Problems | Prototype Category 1 | Prototype Category 2 | Prototype Category 3 | Prototype Category 4 | Prototype Category 5 | Prototype Category 6 |
---|---|---|---|---|---|---|---|
Problem 1.1. The measurement error of the 2D LiDAR. | Accuracy | • | • | • | • | • | • |
Problem 1.2. The error caused by the motor. | Accuracy | • | • | | | • | |
Problem 1.3. The error caused by the assembly inaccuracy between the 2D LiDAR and the rotating unit. | Accuracy | • | • | | • | • | |
Problem 1.4. The error caused by the synchronization inaccuracy between the 2D LiDAR and the rotating unit. | Accuracy | • | • | | | • | |
Problem 1.5. The error caused by the assembly inaccuracy between a moving 2D LiDAR and the mobile platform. | Accuracy | • | • | • | • | • | • |
Problem 1.6. The error caused by the estimation inaccuracy of the movement of the mobile platform. | Accuracy | • | • | • | • | • | • |
Problem 2.1. The negative correlation between the real-time performance and the density of the 3D point cloud. | Real-time performance | • | • | • | • | • | • |
Problem 2.2. The distortion of the 3D point cloud caused by the movement of the mobile platform. | Real-time performance | • | • | | • | • | |
Problem 2.3. The distortion of the 3D point cloud caused by the moving objects in the environment. | Real-time performance | • | • | • | • | • | • |
Problem 3.1. The density distribution of the 3D point cloud built by a moving 2D LiDAR. | Others | • | • | • | • | • | • |
Problem 3.2. The fusion of a moving 2D LiDAR and 2D LiDAR SLAM. | Others | • | • | • | • | • | • |
Problem 3.3. A push-broom 2D LiDAR, which is used for the detection of the obstacles in front of the vehicle. | Others | | | • | | | |
Problem 3.4. The fusion of a moving 2D LiDAR and a camera. | Others | • | • | • | • | • | • |
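Problem 2.2 is commonly mitigated by de-skewing: each sample carries a timestamp, the platform pose is interpolated at that timestamp (from odometry, an IMU, or SLAM), and the sample is re-projected with the interpolated pose before the scans are assembled into one cloud. The sketch below illustrates the idea only; it is not the method of any specific prototype surveyed here, and it simplifies the pose to a planar (x, y, yaw) triple with linear interpolation:

```python
import numpy as np

def rot_z(yaw: float) -> np.ndarray:
    """Planar rotation about the z-axis."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def deskew(points: np.ndarray, stamps: np.ndarray,
           pose0, pose1, t0: float, t1: float) -> np.ndarray:
    """Re-project each sample with the platform pose interpolated at its timestamp.

    points: (N, 3) samples in the moving sensor/platform frame.
    stamps: (N,) acquisition times.
    pose0, pose1: planar poses (x, y, yaw) at times t0 and t1.
    Linear interpolation of (x, y, yaw) is a simplification; real systems
    interpolate full 6-DOF poses."""
    p0, p1 = np.asarray(pose0, float), np.asarray(pose1, float)
    out = np.empty_like(points, dtype=float)
    for i, (p, ts) in enumerate(zip(points, stamps)):
        a = (ts - t0) / (t1 - t0)            # interpolation weight in [0, 1]
        x, y, yaw = (1.0 - a) * p0 + a * p1  # pose at the sample's timestamp
        out[i] = rot_z(yaw) @ p + np.array([x, y, 0.0])
    return out

# Two samples taken 0.05 s apart while the platform drives forward and turns slightly
pts = np.array([[2.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
ts = np.array([0.00, 0.05])
print(deskew(pts, ts, pose0=(0.0, 0.0, 0.0), pose1=(0.5, 0.0, 0.1), t0=0.0, t1=0.05))
```

The same per-point treatment also underlies Problem 2.3: once every sample has its own pose, points that still disagree between revisits can be attributed to moving objects rather than to sensor motion.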
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).