Emergency Landing Spot Detection Algorithm for Unmanned Aerial Vehicles
Abstract
1. Introduction
- Global Positioning System (GPS) failures. UAVs generally use GPS messages to navigate. Although several other sensors can aid navigation, the vehicle may need to land when the GPS signal is lost.
- Loss of communication. In the event of loss of communication between the UAV and a base station, one possible action is to perform an emergency landing.
- Battery failure. If a battery failure is detected, the vehicle may not be able to continue its operation, and as a result, an emergency landing is necessary.
- Software and hardware errors. A UAV can experience a mechanical fault during a mission, such as a broken propeller or a motor failure, or even a software issue that requires the vehicle to perform an emergency landing.
- Environmental factors. Bad weather conditions, such as strong winds and rain, can make the vehicle unable to carry out the mission, forcing it to land.
2. Related Work
2.1. LiDAR Based Detection Systems
2.2. Vision-Based Detection Systems
2.3. Other Approaches
2.4. Overall Discussion
3. System Design
3.1. Hardware
3.2. Software
- Data Fusion: the data fusion module estimates the UAV’s state by fusing the measurements from the IMU and GPS. Several approaches exist for this estimation; for instance, an Extended Kalman Filter (EKF) can be applied.
- Frame Transformations: considering that the LiDAR is rigidly mounted on the vehicle, the LiDAR pose is directly related to the UAV pose. Therefore, the LiDAR point cloud can be transformed from the LiDAR frame to the local navigation frame. Using the tf2 (http://wiki.ros.org/tf2 (accessed on 14 May 2021)) package, the relations between the UAV, LiDAR and global coordinate frames are established.
- Point Cloud Downsampling: each LiDAR scan produces a point cloud with thousands of points. Since the vehicle does not travel a large distance while detecting a spot, consecutive point clouds are accumulated so that no information is lost. Conversely, accumulating point clouds increases the computational effort. Therefore, the accumulated point cloud is downsampled.
- Plane Detection: after downsampling, a plane can be detected in the reduced point cloud. Plane detection can be subdivided into several steps, depending on the available time, the computational power, and the desired resolution. Generally, the algorithms estimate the parameters of the plane equation in a limited region of the original point cloud, and they must fit a plane in the presence of outliers, i.e., points that do not fit the plane model. Besides landing spot detection for aerial vehicles, estimating the ground conditions is also useful for detecting clear paths and simplifying further processing.
- Spot Evaluation: the segmented point cloud containing the detected plane is then evaluated to classify the spot’s reliability. The roughness of the landing zone can be assessed by computing the standard deviation along the z-axis: the higher the standard deviation, the rougher the area. A high value can also indicate the presence of obstacles, such as trees or buildings. The spots are also evaluated for slope, since a landing zone must not be so steep that it destabilizes the vehicle once landed. Furthermore, the size of the area and the distance to the UAV can also be used to evaluate the spot. Each factor’s importance varies with the environment, so it is useful to assign different weights to these factors.
- Spot Register: the assessed spots are then registered as landing spots. However, whether a registered spot remains the best choice varies during operation, so the registered spots must be periodically reassessed.
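As an illustration of the spot-evaluation step above, the sketch below scores a segmented plane by its radius, slope, z-axis standard deviation, and distance to the vehicle, combining the factors with configurable weights. The weights, normalization constants, and function names are illustrative assumptions, not values from the paper:

```python
import numpy as np

def evaluate_spot(points, drone_xy, weights=(0.3, 0.3, 0.2, 0.2),
                  max_slope_deg=15.0, min_radius=1.0):
    """Score a candidate landing plane from its segmented point cloud.

    points: (N, 3) array of the plane's points in the local navigation frame.
    Returns a score in [0, 1]; higher is better. All constants here are
    illustrative, not the values used in the paper.
    """
    centroid = points.mean(axis=0)
    # Roughness: standard deviation along the z-axis (higher = rougher).
    roughness = points[:, 2].std()
    # Slope: angle between the plane normal (eigenvector of the smallest
    # covariance eigenvalue) and the vertical axis.
    cov = np.cov((points - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    normal = eigvecs[:, 0]
    slope = np.degrees(np.arccos(min(abs(normal[2]), 1.0)))
    # Size: approximate spot radius from the horizontal spread of the points.
    radius = np.linalg.norm(points[:, :2] - centroid[:2], axis=1).max()
    # Distance from the spot to the vehicle.
    dist = np.linalg.norm(centroid[:2] - np.asarray(drone_xy))
    if slope > max_slope_deg or radius < min_radius:
        return 0.0                                # reject unsafe spots outright
    w_radius, w_slope, w_rough, w_dist = weights
    score = (w_radius * min(radius / 5.0, 1.0)    # bigger area is better
             + w_slope * (1.0 - slope / max_slope_deg)
             + w_rough * np.exp(-roughness / 0.05)
             + w_dist * np.exp(-dist / 50.0))     # closer is better
    return float(score)
```

Changing the weight tuple lets the same routine favor, e.g., proximity over size in cluttered environments, matching the point made above that the factors’ importance depends on the environment.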
4. Emergency Landing Spot Detection Algorithm
4.1. Frames Transformation
- Traveled distance: if the vehicle has traveled a long distance from the last detected landing spot, a new location must be found, since in an emergency scenario the vehicle may not be able to reach a distant landing zone.
- Size of accumulated point cloud: a high number of points in the point cloud increases the execution time of search algorithms and the memory consumption of the onboard computer. For this reason, if the size of the accumulated cloud reaches a threshold, the algorithm starts the next step.
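The two triggers above can be sketched as a simple check; the threshold values and names below are illustrative assumptions, not the ones used in the paper:

```python
import numpy as np

# Illustrative thresholds (not the paper's values).
MAX_TRAVELED_DIST = 30.0     # metres since the last registered spot
MAX_CLOUD_POINTS = 100_000   # accumulated point-cloud size limit

def should_search(uav_xy, last_spot_xy, accumulated_cloud):
    """Return True when a new landing-spot search should start:
    either the UAV moved too far from the last spot, or the
    accumulated cloud grew past the size threshold."""
    traveled = np.linalg.norm(np.asarray(uav_xy) - np.asarray(last_spot_xy))
    too_far = traveled > MAX_TRAVELED_DIST
    too_big = len(accumulated_cloud) > MAX_CLOUD_POINTS
    return bool(too_far or too_big)
```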
4.2. Point Cloud Downsampling
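The downsampling step can be sketched with a voxel grid (the results tables sweep voxel sizes from 0.05 m to 1.0 m). Keeping the centroid of each occupied voxel is a common strategy and an assumption here, not necessarily the paper's exact implementation:

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Keep one representative point (the centroid) per occupied voxel.

    points: (N, 3) float array; voxel_size in metres. Larger voxels give
    smaller clouds at the cost of spatial resolution.
    """
    # Integer voxel index of every point.
    idx = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel and average each group.
    _, inverse, counts = np.unique(idx, axis=0, return_inverse=True,
                                   return_counts=True)
    inverse = inverse.ravel()
    sums = np.zeros((len(counts), 3))
    np.add.at(sums, inverse, points)
    return sums / counts[:, None]
```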
4.3. Data Structuring and Neighbor Search
4.4. Plane Detection
Algorithm 1 Algorithm for the neighborhood and plane identification steps.
Input: downsampled point cloud
Output: point cloud cluster
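The neighborhood and plane-identification steps of Algorithm 1 can be sketched as follows. This is a hedged reconstruction from the surrounding description: random seed points, a radius neighborhood, and a PCA plane fit with slope and roughness checks. The thresholds are illustrative, and the brute-force neighbor search stands in for the structured search of Section 4.3:

```python
import numpy as np

def detect_planes(cloud, n_seeds=10, radius=3.0, min_points=50,
                  max_slope_deg=15.0, max_std_z=0.1, rng=None):
    """For each randomly chosen seed point, gather its radius neighborhood
    and fit a plane with PCA; return the clusters that pass the slope and
    roughness checks. Parameter values are illustrative."""
    rng = rng if rng is not None else np.random.default_rng()
    accepted = []
    seeds = rng.choice(len(cloud), size=min(n_seeds, len(cloud)), replace=False)
    for seed in seeds:
        # Radius neighborhood; brute force here for brevity, whereas the
        # paper uses a structured neighbor search.
        dists = np.linalg.norm(cloud - cloud[seed], axis=1)
        cluster = cloud[dists <= radius]
        if len(cluster) < min_points:
            continue                      # too sparse to fit a plane
        centered = cluster - cluster.mean(axis=0)
        # PCA plane fit: the normal is the right singular vector associated
        # with the smallest singular value.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        normal = vt[-1]
        slope = np.degrees(np.arccos(min(abs(normal[2]), 1.0)))
        if slope <= max_slope_deg and cluster[:, 2].std() <= max_std_z:
            accepted.append(cluster)      # candidate landing plane
    return accepted
```

On a flat cloud most seeds yield an accepted cluster, while a rough or steep cloud is filtered out by the standard-deviation and slope tests, mirroring the rejected/accepted plane counts reported in the results tables.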
4.5. Registration and Classification
4.6. Algorithm Output
5. Results
5.1. Simulated Environment
5.1.1. Simulation Setup
5.1.2. Environment I
5.1.3. Environment II
5.2. Experimental Dataset
5.2.1. Experimental Setup
5.2.2. Dataset
5.2.3. Dataset Results
6. Discussion
7. Conclusions and Future Work
Author Contributions
Funding
Conflicts of Interest
Abbreviations
EKF | Extended Kalman Filter |
GNSS | Global Navigation Satellite System |
GPS | Global Positioning System |
IMU | Inertial Measurement Unit |
LiDAR | Light Detection and Ranging |
LMS | Least Mean Square |
MORSE | Modular Open Robots Simulation Engine |
PCA | Principal Component Analysis |
PROSAC | Progressive Sample Consensus |
RADAR | Radio Detection and Ranging |
ROS | Robot Operating System |
RTK | Real-Time Kinematic |
UAV | Unmanned Aerial Vehicle |
Parameters | Description |
---|---|
| Number of random points chosen from the cloud cluster |
| Minimum points used to fit a plane |
| Maximum radius considered for a plane |
| Minimum radius considered for a plane |
| Maximum slope accepted for the drone |
Parameters | Weight | Description |
---|---|---|
Spot radius | | |
Spot slope | | |
Standard deviation of the spot | | |
Distance from the spot to the vehicle | | |
Search Points (#) | Radius (m) | Downsample Mean Time (ms) | PCA Mean Time (ms) | Rejected Planes | Accepted Planes | Chosen Spots |
---|---|---|---|---|---|---|
10 | 3 | 20.54 | 84.73 | 184 | 146 | 27 |
20 | 3 | 19.07 | 408.00 | 495 | 65 | 41 |
30 | 3 | 19.55 | 556.93 | 754 | 116 | 54 |
50 | 3 | 20.00 | 898.35 | 1197 | 203 | 56 |
100 | 3 | 19.621 | 1674.37 | 2499 | 401 | 70 |
10 | 4 | 21.50 | 400.96 | 36 | 244 | 62 |
10 | 5 | 23.18 | 802.21 | 35 | 245 | 82 |
10 | 10 | 22.14 | 4583.9 | 24 | 156 | 7 |
Search Points (#) | Downsample Mean Time (ms) | PCA Mean Time (ms) | Rejected Planes | Accepted Planes | Chosen Spots |
---|---|---|---|---|---|
20 | 23.77 | 20.64 | 278 | 22 | 3 |
30 | 24.27 | 25.30 | 566 | 54 | 5 |
50 | 23.57 | 52.43 | 1362 | 137 | 5 |
100 | 23.97 | 119.63 | 2712 | 288 | 8 |
Voxel Size (m) | Mean Point Cloud Size | Mean Downsampled Point Cloud Size | Downsample Mean Time (ms) | PCA Mean Time (ms) | Accepted Planes | Detected Spots |
---|---|---|---|---|---|---|
0.05 | 183,278.6 | 70,022.4 | 15.507 | 18.907 | 22 | 3 |
0.10 | 183,278.6 | 32,146.2 | 15.7388 | 9.155 | 78 | 7 |
0.5 | 183,278.6 | 5735.2 | 12.390 | 9.606 | 132 | 8 |
1.0 | 183,278.6 | 1541.5 | 11.733 | 10.565 | 210 | 0 |
Parameter Evaluated | Value |
---|---|
Plane detection step mean time | 37.87 ms |
Plane detection physical memory mean usage | 14.90 MB |
Total of detected landing spots | 1724 |
Mean of spots’ radius | 3.24 m |
Mean of spots’ standard deviation | 0.028 m |
Mean power consumption without the algorithm | 15.26 W |
Mean power consumption with the algorithm | 18.22 W |
Maximum power consumption with the algorithm | 21.89 W |
Voxel Size (m) | Mean Point Cloud Size | Mean Downsampled Point Cloud Size | Downsample Mean Time (ms) | PCA Mean Time (ms) | Accepted Planes | Detected Spots |
---|---|---|---|---|---|---|
0.05 | 31,155.9 | 23,131.5 | 5.333 | 55.035 | 2206 | 372 |
0.10 | 31,155.9 | 13,945.4 | 4.493 | 39.390 | 1703 | 319 |
0.5 | 31,155.9 | 1996.8 | 3.695 | 4.838 | 1440 | 302 |
1.0 | 31,155.9 | 538.9 | 3.261 | 3.177 | 565 | 37 |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Loureiro, G.; Dias, A.; Martins, A.; Almeida, J. Emergency Landing Spot Detection Algorithm for Unmanned Aerial Vehicles. Remote Sens. 2021, 13, 1930. https://doi.org/10.3390/rs13101930