Visual Servoing of a Moving Target by an Unmanned Aerial Vehicle
Abstract
1. Introduction
- An optimal tracking controller is developed to track a moving target undergoing unknown motion while satisfying motion, control, and sensing constraints; the relative motion between the target and the UAV is estimated.
- In contrast to previous approaches [34,35,36,37,38], this work models the dynamics of the relative rotation between the target and the UAV, so the relative angle can be controlled to a predefined desired value; this is useful in applications such as the automatic search, detection, and recognition of car license plates.
- The controller is designed to ensure that the target remains within the camera's field of view (FOV).
- Compared with an IBVS controller, the developed controller yields smoother control inputs, lower energy consumption, and smaller tracking error.
- The developed control architecture can be applied to track other moving targets, provided they can be detected by the YOLO network.
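The constrained-tracking idea in the bullets above can be sketched as a toy receding-horizon step: a single-integrator image-feature model, box limits on the velocity command, and rejection of any candidate input whose predicted feature path would leave the FOV. The function name, the model, and every parameter value here are illustrative assumptions, not the paper's actual formulation:

```python
import itertools
import numpy as np

def mpc_step(p, p_des, fov_lo, fov_hi, u_max=40.0, horizon=3, dt=0.1,
             q=1.0, r=0.01, n_levels=5):
    """Pick the constant input over a short horizon that minimizes
    tracking error plus control effort, discarding inputs whose
    predicted feature trajectory exits the field of view.

    p, p_des       : (2,) current and desired feature position (pixels)
    fov_lo, fov_hi : (2,) pixel bounds of the image
    Returns the best (2,) velocity command (pixels/s).
    """
    levels = np.linspace(-u_max, u_max, n_levels)
    best_u, best_cost = np.zeros(2), np.inf
    for ux, uy in itertools.product(levels, levels):
        u = np.array([ux, uy])
        pk, cost, feasible = np.array(p, dtype=float), 0.0, True
        for _ in range(horizon):
            pk = pk + dt * u                       # single-integrator model
            if np.any(pk < fov_lo) or np.any(pk > fov_hi):
                feasible = False                   # feature would leave FOV
                break
            cost += q * np.sum((pk - p_des) ** 2) + r * np.sum(u ** 2)
        if feasible and cost < best_cost:
            best_u, best_cost = u, cost
    return best_u
```

Searching a small discrete input set keeps the sketch dependency-free; an actual implementation would solve the constrained optimal control problem with a proper QP/NLP solver and include the relative-rotation dynamics modeled in the paper.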
2. Preliminaries
Control Objectives
3. Estimating the Motion of the Moving Target
3.1. Kinematics Model
3.2. Measurements: Bounding Box and Orientation
3.2.1. Bounding Box
3.2.2. Orientation Measurement
3.3. Observability
4. Controller Architecture
Controller Design
5. Simulations
5.1. Environment Setup
5.2. Simulation Results
5.2.1. Simulation 1: Controller with Relative Rotational Dynamics
Comparison of State Feature Vectors
Comparison of Control Inputs
5.2.2. Simulation 2: Controller with Target Motion Pattern
Comparison of State Feature Vectors
Comparison of Control Inputs
5.2.3. Simulation 3: IBVS vs. the Developed Controller
Comparison of State Feature Vectors
Comparison of Control Inputs
IBVS Controller vs. the Developed Controller
6. Conclusions
- The control effort is lower than that of traditional controllers, and the tracking error, quantified by the RMS error, is also smaller than that of a traditional controller.
- The control input always remains within the UAV's motion capabilities.
- The target can be tracked and its image features always remain within the image, which a traditional IBVS controller cannot guarantee.
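The RMS metric used above to compare tracking error can be computed per image axis as follows; the function name and the (N, 2) trajectory layout are illustrative assumptions:

```python
import numpy as np

def rms_error(tracked, desired):
    """Per-axis RMS pixel error between a tracked feature trajectory
    and its desired location(s); both inputs are (N, 2) arrays of
    (u, v) image coordinates."""
    e = np.asarray(tracked, dtype=float) - np.asarray(desired, dtype=float)
    return np.sqrt(np.mean(e ** 2, axis=0))   # -> [rms_u, rms_v]
```

This mirrors the per-case U-axis/V-axis comparison tables reported for the simulations.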
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Minaeian, S.; Liu, J.; Son, Y. Vision-based target detection and localization via a team of cooperative UAV and UGVs. IEEE Trans. Syst. Man Cybern. Syst. 2016, 46, 1005–1016.
- Cavaliere, D.; Loia, V.; Saggese, A.; Senatore, S.; Vento, M. Semantically enhanced UAVs to increase the aerial scene understanding. IEEE Trans. Syst. Man Cybern. Syst. 2019, 49, 555–567.
- Chitrakaran, V.; Dawson, D.M.; Dixon, W.E.; Chen, J. Identification of a moving object’s velocity with a fixed camera. Automatica 2005, 41, 553–562.
- Schlichenmaier, J.; Selvaraj, N.; Stolz, M.; Waldschmidt, C. Template matching for radar-based orientation and position estimation in automotive scenarios. In Proceedings of the IEEE MTT-S International Conference on Microwaves for Intelligent Mobility (ICMIM), Nagoya, Japan, 19–21 March 2017; pp. 95–98.
- Stroupe, A.; Martin, M.; Balch, T. Distributed sensor fusion for object position estimation by multi-robot systems. In Proceedings of the IEEE International Conference on Robotics and Automation, Seoul, Korea, 21–26 May 2001; pp. 1092–1098.
- Kamthe, A.; Jiang, L.; Dudys, M.; Cerpa, A. SCOPES: Smart cameras object position estimation system. In EWSN ’09; Springer: Berlin/Heidelberg, Germany, 2009; pp. 279–295.
- Ahn, H.S.; Ko, K.H. Simple pedestrian localization algorithms based on distributed wireless sensor networks. IEEE Trans. Ind. Electron. 2009, 56, 4296–4302.
- Chitrakaran, V.K.; Dawson, D.M. A Lyapunov-based method for estimation of Euclidean position of static features using a single camera. In Proceedings of the American Control Conference, New York, NY, USA, 9–13 July 2007; pp. 1988–1993.
- Braganza, D.; Dawson, D.M.; Hughes, T. Euclidean position estimation of static features using a moving camera with known velocities. In Proceedings of the IEEE Conference on Decision and Control, New Orleans, LA, USA, 12–14 December 2007; pp. 2695–2700.
- Chhatkuli, A.; Pizarro, D.; Collins, T.; Bartoli, A. Inextensible non-rigid structure-from-motion by second-order cone programming. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 40, 2428–2441.
- Agudo, A.; Moreno-Noguer, F. Simultaneous pose and non-rigid shape with particle dynamics. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015.
- Mateus, A.; Tahri, O.; Miraldo, P. Active Structure-from-Motion for 3D Straight Lines. arXiv 2018, arXiv:1807.00753.
- Spica, R.; Giordano, P.R.; Chaumette, F. Active structure from motion: Application to point, sphere, and cylinder. IEEE Trans. Robot. 2014, 30, 1499–1513.
- Spica, R.; Giordano, P.R.; Chaumette, F. Plane estimation by active vision from point features and image moments. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 6003–6010.
- Giordano, P.R.; Spica, R.; Chaumette, F. Learning the shape of image moments for optimal 3D structure estimation. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 5990–5996.
- Chaumette, F.; Hutchinson, S. Visual servo control Part I: Basic approaches. IEEE Robot. Autom. Mag. 2006, 13, 82–90.
- Keshmiri, M.; Xie, W. Image-based visual servoing using an optimized trajectory planning technique. IEEE/ASME Trans. Mechatron. 2017, 22, 359–370.
- Allibert, G.; Courtial, E.; Chaumette, F. Predictive control for constrained image-based visual servoing. IEEE Trans. Robot. 2010, 26, 933–939.
- Qiu, Z.; Hu, S.; Liang, X. Model predictive control for uncalibrated and constrained image-based visual servoing without joint velocity measurements. IEEE Access 2019, 7, 73540–73554.
- Gao, J.; Proctor, A.A.; Shi, Y.; Bradley, C. Hierarchical model predictive image-based visual servoing of underwater vehicles with adaptive neural network dynamic control. IEEE Trans. Cybern. 2016, 46, 2323–2334.
- Ke, F.; Li, Z.; Xiao, H.; Zhang, X. Visual servoing of constrained mobile robots based on model predictive control. IEEE Trans. Syst. Man Cybern. Syst. 2017, 47, 1428–1438.
- Falanga, D.; Foehn, P.; Lu, P.; Scaramuzza, D. PAMPC: Perception-aware model predictive control for quadrotors. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, Madrid, Spain, 1–5 October 2018; pp. 1–8.
- Razzanelli, M.; Innocenti, M.; Pannocchia, G.; Pollini, L. Vision-based Model Predictive Control for Unmanned Aerial Vehicles Automatic Trajectory Generation and Tracking. Available online: https://arc.aiaa.org/doi/abs/10.2514/6.2019-1409 (accessed on 5 June 2021).
- Shi, H.; Sun, G.; Wang, Y.; Hwang, K. Adaptive image-based visual servoing with temporary loss of the visual signal. IEEE Trans. Ind. Inf. 2019, 15, 1956–1965.
- Mondragon, I.; Campoy, P.; Olivares-Mendez, M.; Martinez, C. 3D object following based on visual information for unmanned aerial vehicles. In Proceedings of the IEEE Colombian Conference on Automatic Control, Bogota, Colombia, 1–4 October 2011; pp. 1–7.
- Wang, H.; Yang, B.; Wang, J.; Liang, X.; Chen, W.; Liu, Y. Adaptive visual servoing of contour features. IEEE/ASME Trans. Mechatron. 2018, 23, 811–822.
- Siradjuddin, I.; Tundung, S.P.; Indah, A.S.; Adhisuwignjo, S. A real-time model based visual servoing application for a differential drive mobile robot using BeagleBone Black embedded system. In Proceedings of the 2015 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS), Langkawi, Malaysia, 18–20 October 2015; pp. 186–192.
- Pan, W.; Lyu, M.; Hwang, K.-S.; Ju, M.-Y.; Shi, H. A neuro-fuzzy visual servoing controller for an articulated manipulator. IEEE Access 2018, 6, 3346–3357.
- Lang, H.; Wang, Y.; de Silva, C.W. Vision based object identification and tracking for mobile robot visual servo control. In Proceedings of the IEEE ICCA 2010, Xiamen, China, 9–11 June 2010; pp. 92–96.
- Pence, W.G.; Farelo, F.; Alqasemi, R.; Sun, Y.; Dubey, R. Visual servoing control of a 9-DOF WMRA to perform ADL tasks. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 916–922.
- Djelal, N.; Saadia, N.; Ramdane-Cherif, A. Target tracking based on SURF and image based visual servoing. In Proceedings of the CCCA12, Marseilles, France, 6–8 December 2012; pp. 1–5.
- Anh, T.L.; Song, J. Robotic grasping based on efficient tracking and visual servoing using local feature descriptors. Int. J. Precis. Eng. Manuf. 2012, 13, 387–393.
- Penin, B.; Giordano, P.R.; Chaumette, F. Vision-based reactive planning for aggressive target tracking while avoiding collisions and occlusions. IEEE Robot. Autom. Lett. 2018, 3, 3725–3732.
- Hulens, D.; Goedemé, T. Autonomous flying cameraman with embedded person detection and tracking while applying cinematographic rules. In Proceedings of the Conference on Computer and Robot Vision, Edmonton, AB, Canada, 17–19 May 2017; pp. 56–63.
- Gupta, M.; Kumar, S.; Behera, L.; Subramanian, V.K. A novel vision-based tracking algorithm for a human-following mobile robot. IEEE Trans. Syst. Man Cybern. Syst. 2017, 47, 1415–1427.
- Liu, Y.; Wang, Q.; Hu, H.; He, Y. A novel real-time moving target tracking and path planning system for a quadrotor UAV in unknown unstructured outdoor scenes. IEEE Trans. Syst. Man Cybern. Syst. 2019, 49, 2362–2372.
- Lee, J.H.; Millard, J.D.; Lusk, P.C.; Beard, R.W. Autonomous target following with monocular camera on UAS using recursive-RANSAC tracker. In Proceedings of the International Conference on Unmanned Aircraft Systems, Dallas, TX, USA, 12–15 June 2018; pp. 1070–1074.
- Cao, Z.; Chen, X.; Yu, Y.; Yu, J.; Liu, X.; Zhou, C.; Tan, M. Image dynamics-based visual servoing for quadrotors tracking a target with a nonlinear trajectory observer. IEEE Trans. Syst. Man Cybern. Syst. 2020, 50, 376–384.
- Limon, D.; Pereira, M.; Peña, D.; Alamo, T.; Jones, C.N.; Zeilinger, M.N. MPC for tracking periodic references. IEEE Trans. Autom. Control 2016, 61, 1123–1128.
- Redmon, J.; Farhadi, A. YOLOv3: An incremental improvement. arXiv 2018, arXiv:1804.02767.
- Li, J.-M.; Chen, C.-W.; Cheng, T.-H. Motion prediction and robust tracking of a dynamic and temporarily-occluded target by an unmanned aerial vehicle. IEEE Trans. Control Syst. Technol. 2020, 29, 1623–1635.
- Wan, E.A.; Van Der Merwe, R. The unscented Kalman filter for nonlinear estimation. In Proceedings of the IEEE Adaptive Systems for Signal Processing, Communications, and Control Symposium, Lake Louise, AB, Canada, 1–4 October 2000; pp. 153–158.
- Whalen, A.J.; Brennan, S.N.; Sauer, T.D.; Schiff, S.J. Observability and controllability of nonlinear networks: The role of symmetry. Phys. Rev. X 2015, 5, 011005. Available online: https://link.aps.org/doi/10.1103/PhysRevX.5.011005 (accessed on 5 June 2021).
- Sun, Y.; Xu, D.; Ng, D.W.K.; Dai, L.; Schober, R. Optimal 3D-trajectory design and resource allocation for solar-powered UAV communication systems. IEEE Trans. Commun. 2019, 67, 4281–4298.
Table: state constraints (Min./Max. per state); numeric values not preserved in this extract.
Table: input constraints (Min./Max. per input); numeric values not preserved in this extract.
Table: controller parameters; Sampling Time (value not preserved) and Horizon Length = 50.
Tables (Simulations 1–3): RMS tracking error along the U- and V-axes for Cases 1 and 2; numeric values not preserved in this extract.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Chen, C.-W.; Hung, H.-A.; Yang, P.-H.; Cheng, T.-H. Visual Servoing of a Moving Target by an Unmanned Aerial Vehicle. Sensors 2021, 21, 5708. https://doi.org/10.3390/s21175708