A Lidar-Inertial Navigation System for UAVs in GNSS-Denied Environment with Spatial Grid Structures
Abstract
1. Introduction
- (1)
- Focusing on spatial grid structures in GNSS-denied environments, we designed a grid feature-extraction algorithm and a reticulated-shell feature-extraction algorithm that rely only on logical judgments.
- (2)
- We implemented a lidar-inertial navigation system based on the assumptions of local collinearity of grid features and local coplanarity of reticulated-shell features (a sketch of the corresponding residuals follows this list).
- (3)
- We evaluated the navigation system in real application scenarios. The experiments show that, compared with other recently proposed lidar-based UAV navigation systems, the proposed system achieves faster and more accurate UAV pose estimation in a GNSS-denied environment with spatial grid structures.
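As a rough illustration of contribution (2): collinearity and coplanarity assumptions of this kind are typically enforced as point-to-line and point-to-plane residuals during scan registration. The formulation below is a minimal sketch in our own notation; the symbols p, q_a, q_b, q_c, n and the pose (R, t) are illustrative, not taken from the paper.

```latex
% Grid (edge-like) feature point p, assumed locally collinear with two
% map points q_a and q_b: distance from the transformed point to that line.
\[
  d_{\mathrm{line}}(\mathbf{R},\mathbf{t}) =
    \frac{\bigl\lVert (\mathbf{R}\mathbf{p} + \mathbf{t} - \mathbf{q}_a) \times
                      (\mathbf{R}\mathbf{p} + \mathbf{t} - \mathbf{q}_b) \bigr\rVert}
         {\lVert \mathbf{q}_a - \mathbf{q}_b \rVert}
\]

% Reticulated-shell (surface-like) feature point p, assumed locally coplanar
% with a map plane through q_c with unit normal n: signed point-to-plane distance.
\[
  d_{\mathrm{plane}}(\mathbf{R},\mathbf{t}) =
    \mathbf{n}^{\top} (\mathbf{R}\mathbf{p} + \mathbf{t} - \mathbf{q}_c)
\]
```

In LOAM-style systems, the pose (R, t) is estimated by jointly minimizing such residuals over all matched edge-like and surface-like features.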
2. Lidar-Inertial Navigation System for UAVs in GNSS-Denied Environment with Spatial Grid Structures
2.1. System Overview
2.2. Fast Feature Extraction Algorithm for Spatial Grid Structure
- (1)
- Calculate the IMU pose prediction over the time interval between the two consecutive point cloud frames.
- (2)
- According to the scanning time of each point in the original point cloud, interpolate the pose transformation from that point's acquisition time to the lidar frame at the frame reference time.
- (3)
- Each point is projected to the lidar frame at the frame reference time through its interpolated pose transformation, and the resulting lidar point cloud, from which motion distortion has been removed, is recorded (see the code sketch after this list).
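The three steps above amount to de-skewing the scan with the IMU-predicted motion. The snippet below is a minimal Python sketch under our own assumptions (constant-velocity motion over the frame, rotation interpolation via SciPy's Slerp); the function name `deskew_scan` and its arguments are illustrative, not the paper's implementation.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp


def deskew_scan(points, point_times, t_start, t_end, T_end_start):
    """Project every raw point into the lidar frame at the reference time t_end.

    points       : (N, 3) raw points, each expressed in the lidar frame at
                   its own acquisition time
    point_times  : (N,) per-point acquisition times inside [t_start, t_end]
    T_end_start  : (4, 4) pose of the lidar frame at t_start expressed in the
                   lidar frame at t_end, predicted from the IMU (step (1))
    """
    R_es = T_end_start[:3, :3]
    t_es = T_end_start[:3, 3]

    # Interpolation factor: 1 at t_start (full correction), 0 at t_end (none).
    beta = (t_end - np.asarray(point_times)) / (t_end - t_start)

    # Step (2): per-point pose by interpolation, assuming constant velocity.
    # Rotations are slerped between identity (t_end) and R_es (t_start);
    # translations are interpolated linearly.
    slerp = Slerp([0.0, 1.0], Rotation.from_matrix(np.stack([np.eye(3), R_es])))
    R_i = slerp(beta).as_matrix()           # (N, 3, 3)
    t_i = beta[:, None] * t_es[None, :]     # (N, 3)

    # Step (3): map every point into the frame at t_end.
    return np.einsum('nij,nj->ni', R_i, np.asarray(points)) + t_i
```

The constant-velocity assumption is the usual simplification when only the relative pose between the two frame times is available; if higher-rate IMU poses are kept, the interpolation can instead be done piecewise between consecutive IMU samples.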
2.3. Implementation of the Lidar Navigation System
3. Experiment
3.1. Experimental System Construction
3.2. Experiments and Data Analysis
3.2.1. Analysis of Feature Extraction
3.2.2. Analysis of Localization
3.2.3. Analysis of Mapping
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Parameter | d | | | | |
---|---|---|---|---|---
Value /m | 0.3 | 6 | 0.3 | 6 | 0.2
Direction | LIO-SAM | F-LOAM | DJI | Proposed System |
---|---|---|---|---|
X /m | −8.6679 | 2.1266 | −11.8198 | 0.0550 |
Y /m | −1.9806 | −7.2650 | −7.1752 | −0.0454 |
Z /m | 0.0399 | 1.4190 | −0.5295 | −0.0880 |