A Robust and Accurate Landing Methodology for Drones on Moving Targets
Abstract
1. Introduction
1.1. Related Works
- GNSS-based landing: This method is commonly used by commercial drones [3]. For improved accuracy, differential GNSS [4] and RTK (real-time kinematic) positioning [5] have been suggested. In these methods, additional sensors such as an INS (inertial navigation system), an altimeter, or a barometric pressure sensor are commonly used in conjunction with GNSS.
- Visual processing landing: GNSS signals may be unavailable in many remote regions, so GNSS-based autonomous landing is not always possible. Optical flow is a significant capability for autonomous flight, and researchers have therefore suggested vision-based solutions. These solutions require a camera and an algorithm to detect the landing area, allowing the drone to track it and maintain its position. Research studies have suggested several methods of detecting the landing target: an algorithm that identifies an "H-shaped" landing target using invariant moments [6], a black-and-white pattern consisting of six squares of different sizes [7], or a target design with four white rings [8]. Recently, researchers have begun to use ArUco markers as targets [9,10,11]. Alternatively, the use of AprilTag markers (QR-code-like fiducials) combined with simulated GNSS signals was suggested as a landing method for micro drones in windy conditions [12].
- Drone in a box: The "drone-in-a-box" concept [18] supports an autonomous drone system by allowing a drone that performs remote automatic operations to nest inside an enclosure for charging and safe harbor. Several drone-in-a-box systems are available commercially, such as those from Airobotics, Percepto, and DJI.
1.2. Motivation
1.3. Our Contribution
2. Modeling a Commercial Camera Drone
2.1. Drone Basic Sensors
- Pitch: so it can move backwards and forwards.
- Roll: so it can move left and right.
- Yaw: so it can rotate.
- Throttle: so it can change its altitude and move up and down. A minimal sketch of these four channels follows this list.
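As a rough illustration of how these four channels can be exposed to a controlling program, the following Python sketch bundles them into one command object. The `ControlCommand` class and its field semantics are our own illustrative assumptions, not a specific vendor API:

```python
from dataclasses import dataclass

@dataclass
class ControlCommand:
    """Hypothetical container for the four basic control channels."""
    pitch: float     # forward/backward motion component
    roll: float      # left/right motion component
    yaw: float       # rotation rate
    throttle: float  # vertical (up/down) motion component

    def clamped(self, limit: float = 1.0) -> "ControlCommand":
        """Keep every channel inside [-limit, limit] before sending."""
        c = lambda v: max(-limit, min(limit, v))
        return ControlCommand(c(self.pitch), c(self.roll),
                              c(self.yaw), c(self.throttle))
```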
2.2. Remote Control API
3. Modeling the Landing Process of Commercial Drones
3.1. Landing Using GNSS
3.2. Vision-Based Landing
3.2.1. Target
3.2.2. Controller Mode
3.3. Distance Calculation Method
Camera Gimbal Position
Angle of −90°
Angle of −45°
Adjustable Angle
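To give intuition for the fixed gimbal positions, the sketch below computes the horizontal distance to the point where the camera's optical axis meets the ground, assuming flat ground; the function name and interface are hypothetical. At −90° the camera looks straight down (distance 0), and at −45° the distance equals the altitude:

```python
import math

def ground_distance(altitude_m: float, gimbal_pitch_deg: float) -> float:
    """Horizontal distance to where the camera axis hits flat ground.

    gimbal_pitch_deg follows the convention above: -90 is straight down.
    """
    tilt = math.radians(-gimbal_pitch_deg)  # downward tilt in radians
    if not 0.0 < tilt <= math.pi / 2:
        raise ValueError("gimbal must be tilted downward")
    return altitude_m / math.tan(tilt)
```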
4. Drone Landing Methodology
4.1. Simulation of Autonomous Successful Landing
4.2. Controlling Algorithms
4.3. VSL Algorithm
- Disarmed: The drone is on the ground, not armed.
- Arm: The drone is on the ground armed (ready to take off).
- Take Off: The drone starts flying upwards and reaches a predefined altitude (e.g., 1 m).
- Mission: The drone initiates a predefined route (mission) which is composed of several waypoints.
- Search Target: The drone is looking for a known target (e.g., QR code) (see Figure 13).
- Leash Tracking: The drone maintains its distance and orientation from the target. The "leash" is relatively flexible, allowing smooth tracking. Note that in this mode the relative velocity between the drone and the moving target is close to zero.
- Centering: The drone keeps the detected target in the center of the image captured by the gimbal camera. This can be achieved using yaw, roll, and throttle; see Figure 14a–d.
- Helipad Approach: After detecting the helipad target and executing the centering process, the drone reaches a certain distance and angle to the helipad (i.e., the approach funnel); see Figure 15a.
- Guiding Target Tracking: This stage starts after the drone detects the guiding target. It then maintains the "leash" and executes a centering process according to the guiding target (see Figure 15b).
- Gimbal Adjustment: The camera angle is adjusted according to the descent (e.g., distance 3 m, angle −45°).
- Final Approach: The drone performs a vertical landing while maintaining a fixed distance from the guiding target and adjusting the gimbal according to the height and the required descent rate from the helipad.
- Touchdown: On touching the helipad, the drone shuts down the motors, reports "landing", and moves to the "Arm" (ready-to-fly) stage.
- Fail-safe: The drone detects some kind of abnormality or risk, e.g., hardware malfunction, RC communication loss, or loss of sight of the target. The drone returns to a "safe stage" according to the type of error and its fail-safe policy. These stages form a state machine, sketched after this list.
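The following Python sketch shows one possible encoding of the nominal forward transitions; the state names mirror the list above, but the `step` function and its triggers are our simplified assumptions (the actual fail-safe policy selects the recovery stage per error type):

```python
from enum import Enum, auto

class VSLState(Enum):
    DISARMED = auto()
    ARM = auto()
    TAKE_OFF = auto()
    MISSION = auto()
    SEARCH_TARGET = auto()
    LEASH_TRACKING = auto()
    CENTERING = auto()
    HELIPAD_APPROACH = auto()
    GUIDING_TARGET_TRACKING = auto()
    GIMBAL_ADJUSTMENT = auto()
    FINAL_APPROACH = auto()
    TOUCHDOWN = auto()
    FAIL_SAFE = auto()

# Nominal forward transitions; a fail-safe event may fire from any stage.
NEXT_STAGE = {
    VSLState.DISARMED: VSLState.ARM,
    VSLState.ARM: VSLState.TAKE_OFF,
    VSLState.TAKE_OFF: VSLState.MISSION,
    VSLState.MISSION: VSLState.SEARCH_TARGET,
    VSLState.SEARCH_TARGET: VSLState.LEASH_TRACKING,
    VSLState.LEASH_TRACKING: VSLState.CENTERING,
    VSLState.CENTERING: VSLState.HELIPAD_APPROACH,
    VSLState.HELIPAD_APPROACH: VSLState.GUIDING_TARGET_TRACKING,
    VSLState.GUIDING_TARGET_TRACKING: VSLState.GIMBAL_ADJUSTMENT,
    VSLState.GIMBAL_ADJUSTMENT: VSLState.FINAL_APPROACH,
    VSLState.FINAL_APPROACH: VSLState.TOUCHDOWN,
    VSLState.TOUCHDOWN: VSLState.ARM,  # touchdown reports "landing", then re-arms
}

def step(state: VSLState, stage_done: bool, failure: bool) -> VSLState:
    """Advance the landing state machine by one tick."""
    if failure:
        return VSLState.FAIL_SAFE  # the policy then picks a recovery stage
    return NEXT_STAGE.get(state, state) if stage_done else state
```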
4.4. Fixed Camera Drones
- (i) The wide-angle camera should be calibrated in order to improve target detection and tracking.
- (ii) A virtual gimbal should be implemented on the camera image by cropping the wide-angle frame into a standard-FoV (field of view) image, as sketched below.
- (iii) In case the drone has a standard-FoV camera, the virtual gimbal can be implemented by changing the y-axis of the center point of the camera. In other words, the y-coordinate of the image center should be compensated as if the camera could be tilted by a gimbal.
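A minimal sketch of the virtual gimbal from step (ii), assuming a calibrated frame and a constant pixels-per-degree approximation of the vertical field of view (the function and its default FoV values are illustrative):

```python
import numpy as np

def virtual_gimbal_crop(frame: np.ndarray, pitch_deg: float,
                        vfov_wide_deg: float = 120.0,
                        vfov_out_deg: float = 60.0) -> np.ndarray:
    """Crop a calibrated wide-angle frame to emulate a gimbal tilt.

    A downward tilt (negative pitch) moves the crop window lower in
    the frame, since image y grows downward.
    """
    h = frame.shape[0]
    out_h = int(h * vfov_out_deg / vfov_wide_deg)  # crop height in pixels
    px_per_deg = h / vfov_wide_deg                 # vertical pixels per degree
    center_y = h / 2 - pitch_deg * px_per_deg      # shifted virtual optical center
    top = int(np.clip(center_y - out_h / 2, 0, h - out_h))
    return frame[top:top + out_h, :]
```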
4.5. Safe Landing
- (i) Drone parameters: the drone's global behavioral parameters, e.g., max descent speed or max horizontal speed.
- (ii) Helipad parameters: e.g., type, size, or max slope (in our case, no more than 20°).
- (iii) Drone-to-helipad parameters: e.g., the drone-to-helipad relative speed, or the marker tracking confidence (as estimated by the drone). The credibility factor is the overall combined parameter, which takes all of the above parameters into consideration and also checks for abnormality or risk; one possible fusion is sketched below. As mentioned, we also defined a fail-safe policy (see Figure 17).
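The text above does not spell out the exact fusion formula, so the sketch below shows one plausible way to combine these parameter groups into a single credibility score in [0, 1]; apart from the 20° slope limit, all weights and thresholds here are assumptions:

```python
def credibility_factor(tracking_confidence: float,  # marker tracking confidence, 0..1
                       relative_speed_ms: float,    # drone-to-helipad speed, m/s
                       helipad_slope_deg: float,
                       abnormality: bool) -> float:
    """Fuse the drone, helipad, and drone-to-helipad parameters into one score."""
    if abnormality or helipad_slope_deg > 20.0:  # hard limits trigger the fail-safe
        return 0.0
    speed_score = max(0.0, 1.0 - relative_speed_ms)   # assumed 1 m/s tolerance
    slope_score = 1.0 - helipad_slope_deg / 20.0
    return min(tracking_confidence, 0.5 * speed_score + 0.5 * slope_score)
```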
4.6. Study Limitations
5. Experimental Results
5.1. ArUco Tracking
5.2. PID Calibration
- The drone needs to have smooth reactions.
- The fastest settling time (the time until the system converges to a steady state).
- The maximum overshoot cannot be more than half of the distance from the target needed to land (the distance between the target and the helipad). A generic PID sketch follows this list.
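For reference, the sketch below is a textbook single-axis PID loop of the kind these criteria constrain; the gains and output limit are exactly what the calibration tunes, while the class itself is generic and not the paper's code:

```python
class PID:
    """Generic single-axis PID controller (e.g., distance to the target)."""

    def __init__(self, kp: float, ki: float, kd: float, out_limit: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit  # bounds the reaction, keeping it smooth
        self.integral = 0.0
        self.prev_error = None

    def update(self, error: float, dt: float) -> float:
        """One control step; returns a bounded correction command."""
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * deriv
        return max(-self.out_limit, min(self.out_limit, out))
```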
5.3. Experiment System
- Hover, which allows us to track the target and maintain a constant distance from it.
- Land, which allows us to close the distance to the target, change the gimbal pitch angle, and decrease height while staying focused on the target.
- Stop (the most important button), which interrupts the running command and lets us take back manual control of the drone from the remote control; a sketch of this interruption mechanism follows below.
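A minimal sketch of how such a stop button can break a running command, assuming a hypothetical per-iteration `step_fn` that returns True when the command completes:

```python
import threading

stop_event = threading.Event()  # set by the "Stop" button handler

def run_command(step_fn, rate_hz: float = 10.0) -> bool:
    """Run an autonomous command (hover/land) until it finishes or is stopped."""
    while not stop_event.is_set():
        if step_fn():                   # one control iteration
            return True                 # command completed normally
        stop_event.wait(1.0 / rate_hz)  # doubles as the loop timer
    return False  # aborted: manual control via the remote takes over
```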
5.4. Experiment Strategy
- Find the large target and approach it.
- Fix the orientation.
- Hover according to the smaller (landing) target.
- Land according to the small target. The full sequence is sketched below.
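Putting the four steps together, a sketch of the experiment sequence; the `drone` and `detector` interfaces are hypothetical placeholders for the actual control stack:

```python
def run_experiment(drone, detector):
    """Execute the four-stage landing strategy (hypothetical APIs)."""
    # 1. Find the large target and approach it.
    while (pose := detector.find("large")) is None:
        drone.search_step()
    drone.approach(pose)
    # 2. Fix the orientation (yaw) relative to the target.
    drone.align_yaw(pose.yaw_error)
    # 3. Hover according to the smaller (landing) target.
    while (pose := detector.find("small")) is None:
        drone.hold()
    drone.hover_on(pose)
    # 4. Land according to the small target.
    drone.land_on(pose)
```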
5.4.1. Fix the Orientation Error
5.4.2. Hover and Landing
5.4.3. Logging Results
6. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Casagrande, G.; Sik, A.; Szabó, G. Small Flying Drones; Springer: Berlin/Heidelberg, Germany, 2018.
2. Aasen, H.; Honkavaara, E.; Lucieer, A.; Zarco-Tejada, P.J. Quantitative remote sensing at ultra-high resolution with UAV spectroscopy: A review of sensor technology, measurement procedures, and data correction workflows. Remote Sens. 2018, 10, 1091.
3. Cho, A.; Kim, J.; Lee, S.; Choi, S.; Lee, B.; Kim, B.; Park, N.; Kim, D.; Kee, C. Fully automatic taxiing, takeoff and landing of a UAV using a single-antenna GPS receiver only. In Proceedings of the 2007 International Conference on Control, Automation and Systems, Seoul, Korea, 17–20 October 2007; pp. 821–825.
4. Smit, S.J.A. Autonomous Landing of a Fixed-Wing Unmanned Aerial Vehicle Using Differential GPS. Ph.D. Thesis, Stellenbosch University, Stellenbosch, South Africa, 2013.
5. Baca, T.; Stepan, P.; Spurny, V.; Hert, D.; Penicka, R.; Saska, M.; Thomas, J.; Loianno, G.; Kumar, V. Autonomous landing on a moving vehicle with an unmanned aerial vehicle. J. Field Robot. 2019, 36, 874–891.
6. Saripalli, S.; Montgomery, J.F.; Sukhatme, G.S. Vision-based autonomous landing of an unmanned aerial vehicle. In Proceedings of the 2002 IEEE International Conference on Robotics and Automation, Washington, DC, USA, 11–15 May 2002; Volume 3, pp. 2799–2804.
7. Sharp, C.S.; Shakernia, O.; Sastry, S.S. A vision system for landing an unmanned aerial vehicle. In Proceedings of the 2001 IEEE International Conference on Robotics and Automation, Seoul, Korea, 21–26 May 2001; Volume 2, pp. 1720–1727.
8. Lange, S.; Sunderhauf, N.; Protzel, P. A vision-based onboard approach for landing and position control of an autonomous multirotor UAV in GPS-denied environments. In Proceedings of the 2009 International Conference on Advanced Robotics, Munich, Germany, 22–26 June 2009; pp. 1–6.
9. Wubben, J.; Fabra, F.; Calafate, C.T.; Krzeszowski, T.; Marquez-Barja, J.M.; Cano, J.C.; Manzoni, P. Accurate landing of unmanned aerial vehicles using ground pattern recognition. Electronics 2019, 8, 1532.
10. Sani, M.F.; Karimian, G. Automatic navigation and landing of an indoor AR.Drone quadrotor using ArUco marker and inertial sensors. In Proceedings of the 2017 International Conference on Computer and Drone Applications (IConDA), Kuching, Malaysia, 9–11 November 2017; pp. 102–107.
11. Marut, A.; Wojtowicz, K.; Falkowski, K. ArUco markers pose estimation in UAV landing aid system. In Proceedings of the 2019 IEEE 5th International Workshop on Metrology for AeroSpace (MetroAeroSpace), Turin, Italy, 19–21 June 2019; pp. 261–266.
12. Paris, A.; Lopez, B.T.; How, J.P. Dynamic landing of an autonomous quadrotor on a moving platform in turbulent wind conditions. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 9577–9583.
13. Ruffier, F.; Viollet, S.; Amic, S.; Franceschini, N. Bio-inspired optical flow circuits for the visual guidance of micro air vehicles. In Proceedings of the 2003 International Symposium on Circuits and Systems (ISCAS '03), Bangkok, Thailand, 25–28 May 2003; Volume 3.
14. Saripalli, S.; Sukhatme, G.S. Landing a helicopter on a moving target. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Rome, Italy, 10–14 April 2007; pp. 2030–2035.
15. Zhang, X.; Liu, P.; Zhang, C. An integration method of inertial navigation system and three-beam lidar for the precision landing. Math. Probl. Eng. 2016, 2016.
16. Kong, W.; Zhou, D.; Zhang, Y.; Zhang, D.; Wang, X.; Zhao, B.; Yan, C.; Shen, L.; Zhang, J. A ground-based optical system for autonomous landing of a fixed wing UAV. In Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014; pp. 4797–4804.
17. Kong, W.; Zhang, D.; Zhang, J. A ground-based multi-sensor system for autonomous landing of a fixed wing UAV. In Proceedings of the 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO), Zhuhai, China, 6–9 December 2015; pp. 1303–1310.
18. Langåker, H.A.; Kjerkreit, H.; Syversen, C.L.; Moore, R.J.; Holhjem, Ø.H.; Jensen, I.; Morrison, A.; Transeth, A.A.; Kvien, O.; Berg, G.; et al. An autonomous drone-based system for inspection of electrical substations. Int. J. Adv. Robot. Syst. 2021, 18, 17298814211002973.
19. Alam, M.S.; Oluoch, J. A survey of safe landing zone detection techniques for autonomous unmanned aerial vehicles (UAVs). Expert Syst. Appl. 2021, 179, 115091.
20. Cohen, A.P.; Shaheen, S.A.; Farrar, E.M. Urban air mobility: History, ecosystem, market potential, and challenges. IEEE Trans. Intell. Transp. Syst. 2021, 22, 6074–6087.
21. Matus-Vargas, A.; Rodriguez-Gomez, G.; Martinez-Carranza, J. Ground effect on rotorcraft unmanned aerial vehicles: A review. Intell. Serv. Robot. 2021, 14, 99–118.
22. Praveen, V.; Pillai, S. Modeling and simulation of quadcopter using PID controller. Int. J. Control Theory Appl. 2016, 9, 7151–7158.
23. Bolanakis, G.; Nanos, K.; Papadopoulos, E. A QR code-based high-precision docking system for mobile robots exhibiting submillimeter accuracy. In Proceedings of the 2021 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Delft, The Netherlands, 12–16 July 2021; pp. 830–835.
24. Yu, J.; Jiang, W.; Luo, Z.; Yang, L. Application of a vision-based single target on robot positioning system. Sensors 2021, 21, 1829.