An Onboard Vision-Based System for Autonomous Landing of a Low-Cost Quadrotor on a Novel Landing Pad
Abstract
1. Introduction
2. Landing Pad Design and Vision Algorithm
2.1. Landing Pad Design and Recognition
2.2. Relative Pose Estimation
2.3. Fusion Estimation
3. Dynamic Model and Landing System
3.1. Dynamic Model
3.2. Flight Control Algorithm
4. Experiments and Results
4.1. Experimental Setup
4.2. Hovering Flight Control and Accuracy Analysis
4.3. Autonomous Landing
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Table. Hovering position accuracy: single-marker estimate vs. fused estimate.

| RMSE (mm) | x | y | z | xy plane | 3D |
|---|---|---|---|---|---|
| Single marker | 30.8 | 44.0 | 8.9 | 53.7 | 54.4 |
| Fused estimate | 10.1 | 19.1 | 5.5 | 21.6 | 22.2 |
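The xy-plane and 3D columns combine the per-axis values in quadrature, e.g. sqrt(30.8² + 44.0²) ≈ 53.7 mm. The sketch below shows one way such RMSEs can be computed from paired onboard-estimate and ground-truth position logs; the function name, array layout, and synthetic data are illustrative assumptions, not the authors' evaluation code.

```python
# Minimal sketch: per-axis, xy-plane, and 3D position RMSEs from two
# time-aligned trajectories (e.g. onboard estimate vs. motion-capture
# ground truth). All names and data here are hypothetical.
import numpy as np

def position_rmse(estimate, truth):
    """Return (per-axis RMSE, xy-plane RMSE, 3D RMSE).

    estimate, truth: (N, 3) arrays of [x, y, z] positions sampled at
    matching timestamps, in the same units.
    """
    err = estimate - truth                              # (N, 3) errors
    per_axis = np.sqrt(np.mean(err ** 2, axis=0))       # RMSE per axis
    rmse_xy = np.sqrt(np.mean(np.sum(err[:, :2] ** 2, axis=1)))
    rmse_3d = np.sqrt(np.mean(np.sum(err ** 2, axis=1)))
    return per_axis, rmse_xy, rmse_3d

# Usage with synthetic data standing in for logged trajectories
# (noise scales in meters, roughly matching the table above).
rng = np.random.default_rng(0)
truth = rng.normal(size=(1000, 3))
estimate = truth + rng.normal(scale=[0.010, 0.019, 0.006],
                              size=(1000, 3))
per_axis, rmse_xy, rmse_3d = position_rmse(estimate, truth)
print("x/y/z RMSE:", per_axis, "xy:", rmse_xy, "3D:", rmse_3d)
```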
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite

MDPI and ACS Style
Liu, X.; Zhang, S.; Tian, J.; Liu, L. An Onboard Vision-Based System for Autonomous Landing of a Low-Cost Quadrotor on a Novel Landing Pad. Sensors 2019, 19, 4703. https://doi.org/10.3390/s19214703

AMA Style
Liu X, Zhang S, Tian J, Liu L. An Onboard Vision-Based System for Autonomous Landing of a Low-Cost Quadrotor on a Novel Landing Pad. Sensors. 2019; 19(21):4703. https://doi.org/10.3390/s19214703

Chicago/Turabian Style
Liu, Xuancen, Shifeng Zhang, Jiayi Tian, and Longbin Liu. 2019. "An Onboard Vision-Based System for Autonomous Landing of a Low-Cost Quadrotor on a Novel Landing Pad." Sensors 19, no. 21: 4703. https://doi.org/10.3390/s19214703

APA Style
Liu, X., Zhang, S., Tian, J., & Liu, L. (2019). An Onboard Vision-Based System for Autonomous Landing of a Low-Cost Quadrotor on a Novel Landing Pad. Sensors, 19(21), 4703. https://doi.org/10.3390/s19214703