Vision-Based UAV Landing with Guaranteed Reliability in Adverse Environment
Abstract
1. Introduction
- The developed safety strategy adjusts the altitude of the UAV to recover visual detection of the landing mark if it is lost in open fields under strong wind disturbances; this recovery mechanism is integrated directly into the controller (a sketch of the idea is given after this list).
- A robust vision-based controller, using AprilTags as the landing mark, is designed to achieve high-precision landing on a customized multi-altitude landing platform.
- The proposed system was evaluated and validated through 20 real-world experimental scenarios; the results show that the proposed strategy is effective and reliable even in adverse landing conditions.
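As a minimal sketch of the recovery idea in the first contribution, the snippet below climbs to widen the camera footprint whenever the tag is lost and resumes descent once it is re-acquired. The function name, the velocity-setpoint convention, and all numeric values are assumptions for illustration only, not the paper's controller.

```python
# Hedged sketch: if the landing tag is lost (e.g. blown off-centre by wind),
# climb to enlarge the camera footprint until the tag is re-acquired, then
# resume the nominal descent. All names and values below are assumptions.

MAX_SEARCH_ALT = 6.0   # [m] altitude ceiling while searching for the tag (assumed)
CLIMB_RATE     = 0.5   # [m/s] vertical speed used to regain detection (assumed)
DESCEND_RATE   = -0.3  # [m/s] nominal descent speed when the tag is visible (assumed)

def recovery_velocity_z(tag_visible: bool, altitude: float) -> float:
    """Return the vertical velocity command for the current control step."""
    if tag_visible:
        return DESCEND_RATE        # tag in view: continue the normal descent
    if altitude < MAX_SEARCH_ALT:
        return CLIMB_RATE          # tag lost: climb to recover visual detection
    return 0.0                     # at the ceiling: hold altitude and keep searching
```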
2. UAV-UGV Auto-Landing System
2.1. Flight Controller
2.2. Jetson Xavier
2.3. Camera and Gimbal
3. Vision-Based UAV Autonomous Safe Landing Control
3.1. Target Detection Algorithm
Algorithm 1: Pose Estimation
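Algorithm 1 itself is not reproduced in this outline. The sketch below shows one common way to obtain the relative pose from a fiducial tag, assuming the open-source pupil_apriltags detector and OpenCV's solvePnP; the camera intrinsics, tag size, and corner ordering are placeholder assumptions, not the paper's calibration.

```python
# Hedged sketch of fiducial-tag pose estimation (not the paper's Algorithm 1).
# Assumes the pupil_apriltags package and OpenCV; intrinsics and tag size are
# placeholders that must be replaced with calibrated values.
import cv2
import numpy as np
from pupil_apriltags import Detector

TAG_SIZE = 0.16  # [m] edge length of the printed tag (assumed value)
K = np.array([[600.0,   0.0, 320.0],   # placeholder pinhole intrinsics
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])

# 3D tag corners in the tag frame (z = 0 plane); the ordering is assumed to
# match the detector's corner ordering.
half = TAG_SIZE / 2.0
OBJ_PTS = np.array([[-half,  half, 0.0],
                    [ half,  half, 0.0],
                    [ half, -half, 0.0],
                    [-half, -half, 0.0]])

detector = Detector(families="tag36h11")

def estimate_tag_pose(gray_image):
    """Return (rvec, tvec) of the first detected tag, or None if no tag is seen."""
    detections = detector.detect(gray_image)
    if not detections:
        return None
    corners = detections[0].corners.astype(np.float64)   # 4x2 pixel corners
    ok, rvec, tvec = cv2.solvePnP(OBJ_PTS, corners, K, None)
    return (rvec, tvec) if ok else None
```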
3.2. Safe Landing Control
- Phase 1: Triggered by a valid tag detection from the camera, the UAV hovers 2.5 m over the center of the UGV and exchanges signals with the UGV over LoRa to check whether the cones of the platform are open for landing;
- Phase 2: Triggered by the signal sent from the UGV once the attitude errors are within a pre-defined range, the UAV hovers 0.5 m over the center of the UGV for a final calibration before landing;
- Phase 3: Triggered when the confidence level reached in Phase 2 is high enough, the UAV cuts off the thrust and lands on the platform (the three phases are sketched as a simple state machine after Algorithm 2 below).
Algorithm 2: Safe Landing Algorithm
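A minimal sketch of the three-phase logic above, written as a simple state machine. Algorithm 2 itself is not reproduced; hover_over_ugv, ugv_ready, pose_confidence, and cut_thrust are hypothetical placeholders for the flight-stack and LoRa interfaces, and the confidence threshold is an assumed value.

```python
# Hedged sketch of the three-phase safe-landing logic as a state machine.
# hover_over_ugv(), ugv_ready(), pose_confidence() and cut_thrust() are
# hypothetical placeholders for the flight-stack and LoRa interfaces.
from enum import Enum, auto

class Phase(Enum):
    APPROACH = auto()   # tag detected: move to 2.5 m above the UGV
    CALIBRATE = auto()  # UGV confirmed ready: move to 0.5 m for final alignment
    TOUCHDOWN = auto()  # confidence high enough: cut thrust and land

CONFIDENCE_THRESHOLD = 0.9  # assumed threshold on the Phase-2 confidence level

def step(phase, tag_detected, hover_over_ugv, ugv_ready, pose_confidence, cut_thrust):
    """Advance the landing state machine by one control step and return the new phase."""
    if phase is Phase.APPROACH and tag_detected:
        hover_over_ugv(altitude=2.5)          # Phase 1: hover 2.5 m above the UGV
        if ugv_ready():                       # LoRa handshake: are the cones open?
            return Phase.CALIBRATE
    elif phase is Phase.CALIBRATE:
        hover_over_ugv(altitude=0.5)          # Phase 2: hover 0.5 m for calibration
        if pose_confidence() >= CONFIDENCE_THRESHOLD:
            return Phase.TOUCHDOWN
    elif phase is Phase.TOUCHDOWN:
        cut_thrust()                          # Phase 3: cut thrust and land
    return phase
```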
4. Experiments and Results
4.1. Simulations
4.2. Field Testing
- Case 1 is an instance of “no disturbance”, where the UAV lands in a relatively calm environment (the effect of a light breeze can be ignored). The results indicate that the proposed method works in a “no disturbance” scenario.
- Case 2 is an instance of “consistent disturbance”, where the UAV lands against a consistent wind. The results indicate that the proposed method works in a “consistent disturbance” scenario.
- Case 3 is an instance of “pulse disturbance”, where a strong wind gust unexpectedly emerges during the landing. The results indicate that the proposed method works in a “pulse disturbance” scenario.
4.2.1. Case 1: UAV Landing without Disturbances
4.2.2. Case 2: UAV Landing Subject to Consistent Disturbance
4.2.3. Case 3: UAV Landing Subject to Pulse Disturbance
5. Conclusions and Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Channel |  |  |  |
| --- | --- | --- | --- |
| x | 0.55 | 0.02 | 0.15 |
| y | 0.55 | 0.02 | 0.15 |
| z | 0.4 | 0.03 | 0.1 |
|  | 0.5 | 0.01 | 0.1 |
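The table lists three numbers per channel, but its column headers were lost in extraction. Under the assumption (and it is only an assumption) that the three columns are the proportional, integral, and derivative gains of a per-axis loop, a minimal per-channel PID sketch would look as follows; the unlabeled fourth row is omitted.

```python
# Hedged sketch of a per-channel PID controller. The gain values come from the
# table above under the ASSUMPTION that its three unnamed columns are
# (Kp, Ki, Kd); the true column meaning is not stated in this outline.
GAINS = {
    "x": (0.55, 0.02, 0.15),
    "y": (0.55, 0.02, 0.15),
    "z": (0.40, 0.03, 0.10),
}

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        """Return the control output for the current tracking error."""
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One controller per translational channel (x, y, z).
controllers = {axis: PID(*gains) for axis, gains in GAINS.items()}
```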
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ge, Z.; Jiang, J.; Pugh, E.; Marshall, B.; Yan, Y.; Sun, L. Vision-Based UAV Landing with Guaranteed Reliability in Adverse Environment. Electronics 2023, 12, 967. https://doi.org/10.3390/electronics12040967