A Simultaneous Control, Localization, and Mapping System for UAVs in GPS-Denied Environments
Abstract
1. Introduction
2. Related Work
3. Contributions
4. Materials and Methods
4.1. Problem Description
- When using SLAM methods (Figure 1, upper plots), it is common to assume perfect control: the robot is manually guided along a predefined trajectory. As the robot moves farther from its starting position, the estimated trajectory inevitably drifts from the actual path due to integration errors. However, when the robot returns to its base and re-detects a previously mapped area (a process known as loop closure), the SLAM algorithm can reduce the accumulated drift. The performance of the SLAM algorithm is then evaluated by comparing the estimated trajectory to the predefined path.
- Conversely, with SCLAM methods (Figure 1, lower plots), the pose estimated by the SLAM subsystem is fed back directly to the autonomous control subsystem, which therefore treats the SLAM-estimated pose as ground truth. Consequently, any error in the SLAM-estimated position produces a corresponding error between the robot’s desired and actual positions. When a loop closure occurs, the estimated position is corrected, enabling the control subsystem to detect and compensate for the robot’s actual drifted position (see the sketch after this list).
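To make this distinction concrete, the following minimal sketch (illustrative only; variable names, gains, and the drift model are assumptions, not the paper's implementation) shows why feeding the SLAM estimate back into the controller lets a loop-closure correction of the estimate also correct the robot's real, drifted position:

```python
# Minimal SCLAM feedback-loop sketch: the controller consumes the SLAM
# pose estimate as if it were ground truth, so when a loop closure snaps
# the estimate back to the true pose, the controller suddenly "sees" the
# real position error and corrects the physical drift.
import numpy as np

def control_step(estimated_pos, desired_pos, kp=0.8):
    """P-controller acting on the SLAM-estimated position error."""
    return kp * (desired_pos - estimated_pos)

true_pos = np.zeros(2)            # actual robot position (unknown to the system)
est_pos = np.zeros(2)             # SLAM estimate fed back to the controller
desired = np.array([5.0, 0.0])    # commanded position
dt = 0.1

for step in range(200):
    v = control_step(est_pos, desired)   # control uses the estimate, not truth
    true_pos += v * dt                   # robot moves in the real world
    drift = 0.002 * np.abs(v)            # odometry error accumulates with motion
    est_pos += v * dt + drift            # the estimate slowly diverges from truth
    if step == 150:                      # loop closure detected:
        est_pos = true_pos.copy()        # the estimate is corrected, and the
                                         # controller can now drive the *actual*
                                         # position toward the desired one
```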
4.2. System Architecture
4.3. SLAM System
- Attitude measurements, which are acquired from the Attitude and Heading Reference System (AHRS);
- Altitude measurements, which are obtained from the altimeter (a sketch of how such direct measurements can enter the filter follows this list).
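Although the paper's exact filter is not reproduced here, a generic EKF measurement update illustrates how direct attitude and altitude readings can enter the estimator; the state layout, measurement matrices, and noise values below are assumptions for illustration:

```python
# Generic EKF/KF measurement update: attitude (AHRS) and altitude (altimeter)
# readings observe state components directly, so their measurement models
# are linear selections of the state vector.
import numpy as np

def ekf_update(x, P, z, H, R):
    """Standard EKF update: x = state mean, P = covariance,
    z = measurement, H = measurement Jacobian, R = measurement noise."""
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Example: state = [x, y, z, roll, pitch, yaw]; the altimeter observes z
# directly, and the AHRS observes (roll, pitch, yaw) directly.
x = np.zeros(6)
P = np.eye(6)
H_alt = np.zeros((1, 6)); H_alt[0, 2] = 1.0          # altimeter picks out z
x, P = ekf_update(x, P, np.array([0.48]), H_alt, np.array([[0.01]]))
```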
4.4. Control System
4.4.1. Low-Level Control
- Take-off: To lift the robot off the ground from a landed state, the module sets the vertical reference to a value slightly greater than zero (e.g., 0.5 m).
- Landing: When the robot’s altitude is near the ground (e.g., less than 0.2 m), the command completes the landing by setting the rotor velocities to zero (see the sketch after this list).
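A minimal sketch of the take-off and landing logic described above, using the quoted thresholds; the function names and the state dictionary are hypothetical, not the paper's implementation:

```python
# Illustrative take-off/landing commands for the low-level control module.
TAKEOFF_ALT = 0.5   # vertical reference set slightly above ground (m)
LANDING_ALT = 0.2   # below this altitude, rotors are commanded to zero (m)

def takeoff_command(state):
    """Set the vertical reference so the low-level controller lifts off."""
    state["z_ref"] = TAKEOFF_ALT

def landing_command(state):
    """Descend; once near the ground, stop the rotors."""
    if state["altitude"] < LANDING_ALT:
        state["rotor_speed"] = 0.0      # motors off: touchdown
    else:
        state["z_ref"] = 0.0            # keep descending toward the ground
```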
4.4.2. Waypoint Control
- Go-to-point: Commands the robot to move to the specified position and orientation (a minimal controller sketch follows this item).
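A minimal go-to-point sketch, assuming simple proportional control on the position and yaw errors; the gains and structure are illustrative assumptions, not the paper's tuned controllers:

```python
# Proportional go-to-point control: velocity setpoints from the world-frame
# position error and a wrapped yaw error.
import numpy as np

def go_to_point(pose, target, kp_pos=0.6, kp_yaw=1.0):
    """pose/target = (x, y, z, yaw); returns (linear velocity setpoint,
    yaw-rate setpoint) for the low-level controller."""
    pos_err = np.array(target[:3]) - np.array(pose[:3])
    yaw_err = np.arctan2(np.sin(target[3] - pose[3]),
                         np.cos(target[3] - pose[3]))   # wrap to [-pi, pi]
    return kp_pos * pos_err, kp_yaw * yaw_err

# Example: move from (0, 0, 1) facing 0 rad to (2, 1, 1.5) facing pi/2.
v_xyz, w_yaw = go_to_point((0, 0, 1, 0), (2, 1, 1.5, np.pi / 2))
```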
4.4.3. Visual-Mark-Based Control
- Go-to-point-visual: Commands the robot to move to the specified position and orientation with respect to the visual mark (see the sketch below).
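The PnP references and the OpenCV dependency suggest how the camera pose relative to a visual mark can be obtained; in this sketch the marker geometry, camera matrix, and detected corner coordinates are placeholders, not values from the paper:

```python
# Camera pose relative to a square visual mark via OpenCV's solvePnP.
import numpy as np
import cv2

side = 0.20  # assumed marker side length (m)
object_pts = np.array([[-side/2,  side/2, 0], [ side/2,  side/2, 0],
                       [ side/2, -side/2, 0], [-side/2, -side/2, 0]],
                      dtype=np.float32)            # corners in the marker frame
image_pts = np.array([[310, 225], [330, 226],
                      [329, 247], [309, 246]],
                     dtype=np.float32)             # detected corners (pixels)
K = np.array([[500, 0, 320],
              [0, 500, 240],
              [0,   0,   1]], dtype=np.float32)    # assumed camera matrix

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
# tvec gives the marker position in the camera frame; the go-to-point-visual
# reference can then be expressed relative to this pose.
```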
4.4.4. Trajectory Generation
Algorithm 1: Waypoints p2, …, pn generation.
Algorithm 2: Go-home command.
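The listings for Algorithms 1 and 2 are not reproduced above. As a stand-in, this hypothetical sketch generates intermediate waypoints p2, …, pn by sampling the straight segment from the current position p1 to the goal at a fixed spacing; this is an assumption about the generation step's intent, not the paper's actual algorithm:

```python
# Hypothetical waypoint generation: evenly spaced points along the segment
# from the current position p1 to the goal, ending exactly at the goal.
import numpy as np

def generate_waypoints(p1, goal, spacing=1.0):
    p1, goal = np.asarray(p1, float), np.asarray(goal, float)
    dist = np.linalg.norm(goal - p1)
    n = max(int(np.ceil(dist / spacing)), 1)   # number of points after p1
    return [p1 + (goal - p1) * k / n for k in range(1, n + 1)]

# Example: waypoints from (0, 0, 1) to (4, 3, 1) at ~1 m spacing.
waypoints = generate_waypoints([0, 0, 1], [4, 3, 1], spacing=1.0)
```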
5. Results
5.1. Low-Level Control
5.2. Waypoint Control
5.3. Visual-Mark-Based Control
5.4. Explore Area
5.5. Full Exploration Mission
- Take-off → Go-to-point-visual → Go-to-point → Explore-area → Go-to-point → Go-to-point → Go-home → Landing (a sequencing sketch follows).
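Such a mission is a fixed sequence of high-level commands; a simple sequencer that dispatches them in order might look like the following sketch (the handler names and blocking semantics are hypothetical):

```python
# Sequential mission executor: run each high-level command in order,
# waiting for each handler to report completion before moving on.
MISSION = ["take-off", "go-to-point-visual", "go-to-point", "explore-area",
           "go-to-point", "go-to-point", "go-home", "landing"]

def run_mission(commands, handlers):
    """Execute each command in order; each handler blocks until done."""
    for cmd in commands:
        handlers[cmd]()

# Example wiring (handlers are hypothetical callables):
# handlers = {"take-off": takeoff, "landing": land, ...}
# run_mission(MISSION, handlers)
```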
5.6. Experiment in an Indoor Environment
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
| --- | --- |
| SCLAM | Simultaneous Control, Localization, and Mapping |
| UAV | Unmanned Aerial Vehicle |
| GPS | Global Positioning System |
| SLAM | Simultaneous Localization and Mapping |
| VINS | Visual-Inertial Navigation System |
| MPC | Model Predictive Control |
| NMPC | Nonlinear Model Predictive Control |
| MAV | Micro Aerial Vehicle |
| NED | North-East-Down |
| EKF | Extended Kalman Filter |
| AHRS | Attitude and Heading Reference System |
| PI | Proportional-Integral |
| PD | Proportional-Derivative |
| P | Proportional |
| IMU | Inertial Measurement Unit |
| PnP | Perspective-n-Point |
| ROS | Robot Operating System |
| MAE | Mean Absolute Error |
| KF | Key Frame |
References
| | Feats: I/D | Feat/Frame | Time (s)/Frame | Comp/Exec Time (s) |
| --- | --- | --- | --- | --- |
| Local SLAM | 36,610/36,421 | 194.5 ± 11.80 | 0.0261 ± 0.010 | 526.05/836.90 |

| | Number of KFs | Anchors: I/D | Time per Update (s) | Comp/Exec Time (s) |
| --- | --- | --- | --- | --- |
| Global Map | 555 | 88,350/31,938 | 0.354602 ± 0.12 | 197.15/836.90 |