A Multilevel Architecture for Autonomous UAVs
Abstract
1. Introduction
2. Materials and Methods
2.1. Architecture
- Functional high-level. This level comprises the hardware and software necessary to implement the VBN system and to run the autonomous navigation algorithms.
- Functional mid-level. This level is represented by the software that manages the commands sent to the flight controller, which can come either from the ground station (human pilot remote control) or from the autonomous navigation algorithm. It acts as the communication interface between the other two levels of the architecture (a minimal sketch of this arbitration is given after this list).
- Functional low-level. This level comprises the flight controller together with the hardware and software necessary for the attitude stabilization of the drone.
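To make the mid-level arbitration concrete, the following Python sketch shows how commands from the human pilot and from the autonomous navigation algorithm could be multiplexed before reaching the flight controller. This is a minimal illustration under the architecture described above, not the authors' implementation; the `Command` type and the `select_command` function are hypothetical names.

```python
from dataclasses import dataclass

@dataclass
class Command:
    """Four-element command vector: roll/pitch set points, thrust, yaw rate."""
    roll: float
    pitch: float
    thrust: float
    yaw_rate: float

def select_command(manual: Command, autonomous: Command,
                   autonomous_mode: bool) -> Command:
    # The mid-level overwrites the manual commands with the autonomous
    # ones only while an autonomous flight mode is engaged; otherwise
    # the human pilot's commands pass through unchanged.
    return autonomous if autonomous_mode else manual
```

From the flight controller's perspective, the selected command is indistinguishable from a manual one, which is the key property of the mid-level layer.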
- A multi-sensor board, which also includes a GNSS receiver, connected through a UART3 port;
- An Intel RealSense T265, connected via USB 3.0 and used for the 3D localization and orientation of the drone (a pose-reading sketch follows this list);
- A digital camera, connected via the Camera Serial Interface (CSI) and used by the computer vision algorithms.
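As an illustration of how the high-level hardware is accessed, the following sketch reads the 6-DoF pose stream of the RealSense T265 using the `pyrealsense2` library. It is a minimal, generic example of the T265 pose API, not taken from the paper's software.

```python
import pyrealsense2 as rs

# Configure the T265 to stream 6-DoF pose data (position + orientation).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.pose)
pipeline.start(config)

try:
    while True:
        frames = pipeline.wait_for_frames()
        pose = frames.get_pose_frame()
        if pose:
            data = pose.get_pose_data()
            # Translation is expressed in meters in the camera's
            # fixed (world) frame, as used by the SLAM thread.
            print(f"x={data.translation.x:.3f} "
                  f"y={data.translation.y:.3f} "
                  f"z={data.translation.z:.3f}")
finally:
    pipeline.stop()
```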
2.2. Mid-Level Software Layer: Cyber Pilot and Interface with the Low-Level Implementation
- Pulse width modulation (PWM);
- Pulse position modulation (PPM);
- Serial bus (SBUS) (see the decoding sketch after this list).
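Of these command formats, SBUS is a serial digital protocol that packs 16 channels of 11 bits each into a 25-byte frame. The following Python sketch decodes one such frame; it is a generic illustration of the standard framing (0x0F header byte), not the authors' mid-level code.

```python
def decode_sbus_frame(frame: bytes) -> list[int]:
    """Decode a 25-byte SBUS frame into 16 channel values (0-2047)."""
    # Frame layout: 0x0F header, 22 data bytes (16 x 11-bit channels,
    # packed LSB-first), one flags byte, one footer byte.
    if len(frame) != 25 or frame[0] != 0x0F:
        raise ValueError("invalid SBUS frame")
    data = frame[1:23]
    channels = []
    bits = 0
    bit_count = 0
    for byte in data:
        bits |= byte << bit_count
        bit_count += 8
        while bit_count >= 11:
            channels.append(bits & 0x7FF)  # extract the next 11-bit channel
            bits >>= 11
            bit_count -= 11
    return channels
```

Each decoded channel is an integer in the range 0-2047, which the receiving layer maps onto stick positions or set points.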
2.3. High-Level Software Layer
- displaySampleTime: Shows the active threads and their expected and actual sample rates;
- windows: Shows the images processed by the vision algorithm inside a window;
- plot: Plots the variables of interest in real time inside a window (it also requires the windows command);
- log: Saves all shared variables in a .txt file, usable for data post-processing;
- rec: Records the video stream processed by the vision algorithm and superimposes the telemetry obtained in real time;
- ssh: If enabled, it disables the OpenGL optimization of the windows, which are rendered on the remote PC.
- SLAM thread. Thread 1 implements the SLAM algorithms. It receives the 3D position data generated at 200 Hz by the T265 stereoscopic camera; these data describe the position of the UAV in the environment and are expressed in a fixed frame. Thread 1 also receives the video streams of the two fish-eye lenses of the stereoscopic camera at 30 Hz;
- Vision thread. Thread 2 receives the information computed by Thread 1 together with the additional video stream from the CSI camera. The vision thread also receives further data, such as the attitude and the relative and absolute position given by the T265 stereoscopic camera and by the GNSS (when available). Inside this thread, various computer vision algorithms can be implemented, such as environmental marker detection [21,22] or object recognition [23,24];
- Inertial and GNSS thread. Thread 3 provides additional acceleration and position information to the vision thread. This information is processed by sensor fusion algorithms to improve the precision of the estimates of the drone position and speed;
- Planner thread. Thread 4 exploits the outputs of the previous threads to build the reference trajectory. It contains several routines for path generation, which are specific to different kinds of missions. For example, this thread is responsible for generating a specific path built from environmental markers, but it can also handle fixed trajectories in space. The desired path is a sequence of vectors of four elements, $P_k = [x_k, y_k, z_k, \psi_k]$, i.e., the three reference coordinates and the reference yaw angle;
- Control thread. Thread 5 implements the control strategy [25,26,27] of the cyber pilot with respect to the planned trajectory generated by Thread 4. The result is a sequence of vectors containing the reference values, which are sent to the flight controller implementing the low-level: $u_k = [\phi_k, \theta_k, T_k, \dot{\psi}_k]$. In each vector $u_k$, $\phi_k$ and $\theta_k$ are the set points for the roll and pitch angles, $T_k$ is the reference value of the thrust and $\dot{\psi}_k$ is the set point for the yaw rate. Many different control algorithms can be used here, depending on the mission specifications, for instance whether the UAV has to maintain a specific position (hovering) or track a complex trajectory. A simple but efficient control scheme implemented in this thread and tested in experiments consists of an array of double-nested loops of proportional–integral–derivative (PID) controllers that process the 3D position and velocity errors to compute the first three components of $u_k$, i.e., the roll, pitch and thrust set points, whereas the yaw rate is left as an additional degree of freedom (a sketch of this scheme is given after this list). In the cyber pilot paradigm, the vector $u_k$ contains the commands used by the mid-level software layer to overwrite the manual commands during the autonomous flight modes. It is important to stress that the signals in $u_k$ depend on the commands accepted by the flight controller implementing the low-level. Nevertheless, the commands generated by the cyber pilot share their nature with those received from the human pilot during manual flight: from the flight controller's point of view, the two kinds of commands look exactly the same. In this sense, the cyber pilot makes the autonomous flight system independent of the flight controller, provided that its input interface can be properly replicated;
- Communication thread. Thread 6 is devoted to sending the outputs of the control thread to the mid-level software layer, which, based on its policies, decides whether to use them or those coming from the human pilot. This thread also handles the inputs from the auxiliary channels of the remote controller, which carry information useful for the management of the drone and, in particular, for interacting with the mid-level functions;
- Log thread;
- Display thread;
- Plot thread. Threads 7, 8 and 9 are optional and can be activated if needed. They are used to log, store and plot the shared variables, and are therefore useful during the experimental phase, as they allow data to be acquired for post-processing and debugging.
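The double-nested PID scheme mentioned in the control thread can be sketched as follows: an outer PID on each axis of the position error produces a velocity reference, and an inner PID on the velocity error produces the roll, pitch and thrust set points, while the yaw rate is passed through as a free degree of freedom. The code below is a hypothetical minimal reconstruction of that structure; the gains, the 200 Hz period and the axis-to-command mapping are placeholder assumptions, not the authors' values.

```python
class PID:
    """Basic discrete PID controller."""
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error: float) -> float:
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

DT = 0.005  # control period (hypothetical, matching the 200 Hz pose stream)

# Outer loops: position error -> velocity reference (one PID per axis).
pos_pid = {axis: PID(kp=1.0, ki=0.0, kd=0.1, dt=DT) for axis in "xyz"}
# Inner loops: velocity error -> roll/pitch/thrust set points.
vel_pid = {axis: PID(kp=0.5, ki=0.05, kd=0.02, dt=DT) for axis in "xyz"}

def control_step(pos_ref: dict, pos: dict, vel: dict,
                 yaw_rate_ref: float = 0.0) -> list:
    """One cascade step; returns u_k = [roll, pitch, thrust, yaw_rate]."""
    vel_ref = {a: pos_pid[a].update(pos_ref[a] - pos[a]) for a in "xyz"}
    cmd = {a: vel_pid[a].update(vel_ref[a] - vel[a]) for a in "xyz"}
    # Hypothetical mapping consistent with the frame used in the results
    # (x horizontal, y altitude, z longitudinal): x -> roll, z -> pitch,
    # y -> thrust; the yaw rate is left as a free degree of freedom.
    return [cmd["x"], cmd["z"], cmd["y"], yaw_rate_ref]
```

In the cyber pilot paradigm, the vector returned by `control_step` would be handed to the communication thread, which forwards it to the mid-level layer for arbitration against the manual commands.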
2.4. Frame Sizing
3. Results
| Axis | Standard deviation |
| --- | --- |
| horizontal (along x) | 1.36 cm |
| altitude (along y) | 2.20 cm |
| longitudinal (along z) | 0.42 cm |
| yaw | 0.22 deg |
4. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Grabe, V.; Bülthoff, H.H.; Scaramuzza, D.; Giordano, P.R. Nonlinear ego-motion estimation from optical flow for online control of a quadrotor UAV. Int. J. Robot. Res. 2015, 34, 1114–1135.
- Penin, B.; Spica, R.; Giordano, P.R.; Chaumette, F. Vision-based minimum-time trajectory generation for a quadrotor UAV. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017.
- Al Habsi, S.; Shehada, M.; Abdoon, M.; Mashood, A.; Noura, H. Integration of a Vicon camera system for indoor flight of a Parrot AR Drone. In Proceedings of the 2015 10th International Symposium on Mechatronics and Its Applications (ISMA), Sharjah, United Arab Emirates, 8–10 December 2015.
- Gargioni, G.; Peterson, M.; Persons, J.B.; Schroeder, K.; Black, J. A Full Distributed Multipurpose Autonomous Flight System Using 3D Position Tracking and ROS. In Proceedings of the 2019 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 11–14 June 2019.
- Antonio-Toledo, M.E.; Sanchez, E.N.; Alanis, A.Y.; Flórez, J.A.; Perez-Cisneros, M.A. Real-Time Integral Backstepping with Sliding Mode Control for a Quadrotor UAV. In Proceedings of the 2nd IFAC Conference on Modelling, Identification and Control of Nonlinear Systems (MICNON 2018), Guadalajara, Jalisco, Mexico, 20–22 June 2018.
- Masiero, A.; Fissore, F.; Antonello, R.; Cenedese, A.; Vettore, A. A comparison of UWB and motion capture UAV indoor positioning. In The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, ISPRS Geospatial Week 2019, Enschede, The Netherlands, 10–14 June 2019.
- Xiao, X.; Dufek, J.; Suhail, M.; Murphy, R. Motion Planning for a UAV with a Straight or Kinked Tether. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018.
- Aguilar, W.G.; Manosalvas, J.F.; Guillén, J.A.; Collaguazo, B. Robust Motion Estimation Based on Multiple Monocular Camera for Indoor Autonomous Navigation of Micro Aerial Vehicle. In Proceedings of the International Conference on Augmented Reality, Virtual Reality, and Computer Graphics (AVR 2018), Otranto, Italy, 14 July 2018.
- Basso, M.; Bigazzi, L.; Innocenti, G. DART Project: A High Precision UAV Prototype Exploiting On-board Visual Sensing. In Proceedings of the 15th International Conference on Autonomic and Autonomous Systems (ICAS), Athens, Greece, 2–6 June 2019.
- Bigazzi, L.; Basso, M.; Gherardini, S.; Innocenti, G. Mitigating latency problems in vision-based autonomous UAVs. In Proceedings of the 29th Mediterranean Conference on Control and Automation (MED 2021), Bari, Italy, 22–25 June 2021.
- Ceron, A.; Mondragon, I.; Prieto, F. Onboard visual-based navigation system for power line following with UAV. Int. J. Adv. Robot. Syst. 2018, 15, 2–12.
- Lu, L.; Redondo, C.; Campoy, P. Optimal Frontier-Based Autonomous Exploration in Unconstructed Environment Using RGB-D Sensor. Sensors 2020, 20, 6507.
- Ma, C.; Zhou, Y.; Li, Z. A New Simulation Environment Based on Airsim, ROS, and PX4 for Quadcopter Aircrafts. In Proceedings of the 2020 6th International Conference on Control, Automation and Robotics (ICCAR), Singapore, 20–23 April 2020.
- Hinas, A.; Roberts, J.M.; Gonzalez, F. Vision-Based Target Finding and Inspection of a Ground Target Using a Multirotor UAV System. Sensors 2017, 17, 12.
- Atoev, S.; Kwon, K.R.; Lee, S.H.; Moon, K.S. Data analysis of the MAVLink communication protocol. In Proceedings of the 2017 International Conference on Information Science and Communications Technologies (ICISCT), Tashkent, Uzbekistan, 2–4 November 2017.
- Kwon, Y.M.; Yu, J.; Cho, B.M.; Eun, Y.; Park, K.J. Empirical Analysis of MAVLink Protocol Vulnerability for Attacking Unmanned Aerial Vehicles. IEEE Access 2018, 6, 43203–43212.
- Madgwick, S.O.H. An Efficient Orientation Filter for Inertial and Inertial/Magnetic Sensor Arrays; Technical Report; University of Bristol: Bristol, UK, April 2010.
- Mahony, R.; Hamel, T.; Pflimlin, J.M. Nonlinear Complementary Filters on the Special Orthogonal Group. IEEE Trans. Autom. Control 2008, 53, 1203–1218.
- Bigazzi, L.; Gherardini, S.; Innocenti, G.; Basso, M. Development of Non Expensive Technologies for Precise Maneuvering of Completely Autonomous Unmanned Aerial Vehicles. Sensors 2021, 21, 391.
- Gardner, W.; Brown, W.; Chen, C.-K. Spectral Correlation of Modulated Signals: Part II—Digital Modulation. IEEE Trans. Commun. 1987, 35, 595–601.
- Olson, E. AprilTag: A robust and flexible visual fiducial system. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011.
- Wang, J.; Olson, E. AprilTag 2: Efficient and robust fiducial detection. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016.
- Kechagias-Stamatis, O.; Aouf, N.; Nam, D. 3D Automatic Target Recognition for UAV Platforms. In Proceedings of the 2017 Sensor Signal Processing for Defence Conference (SSPD), London, UK, 6–7 December 2017.
- Vujasinovic, S.; Becker, S.; Breuer, T.; Bullinger, S.; Scherer-Negenborn, N.; Arens, M. Integration of the 3D Environment for UAV Onboard Visual Object Tracking. Appl. Sci. 2020, 10, 7622.
- Antonelli, G.; Cataldi, E.; Giordano, P.R.; Chiaverini, S.; Franchi, A. Experimental validation of a new adaptive control scheme for quadrotors MAVs. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013.
- Koubaa, A.; Taher Azar, A. Unmanned Aerial Systems: Theoretical Foundation and Applications; Advances in Nonlinear Dynamics and Chaos (ANDC); Academic Press: Cambridge, MA, USA, 2021.
- Sutton, A.; Fidan, B.; van der Walle, D. Hierarchical UAV Formation Control for Cooperative Surveillance. IFAC Proc. Vol. 2008, 41, 12087–12092.
- Castiblanco, J.M.; Garcia-Nieto, S.; Simarro, R.; Salcedo, J.V. Experimental study on the dynamic behaviour of drones designed for racing competitions. Int. J. Micro Air Veh. 2021.