Infrared and Visible Camera Integration for Detection and Tracking of Small UAVs: Systematic Evaluation
Abstract
1. Introduction
Related Work
- RADAR: RADAR transmits electromagnetic waves, in a frequency range from 3 MHz to 300 GHz, that are reflected by targets. Interpreting the reflected signals allows the position and velocity of objects to be determined. A significant issue with using RADAR to detect UAVs is their low RADAR cross-section, which may render them undetectable. However, because RADAR is highly robust to weather and lighting conditions, research on micro-Doppler signature-based methods has been conducted [6] (the underlying Doppler relations are sketched after this list). In [7], the micro-Doppler effect for frequency-modulated continuous-wave (FMCW) RADAR applications is modelled, showing high confidence rates for UAV class identification based on the number of UAV motors. In [8], an X-band pulse-Doppler RADAR is used to compare the RADAR signatures of fixed-wing UAVs with only puller blades, multirotor UAVs with only lifting blades, and VTOL UAVs with both lifting and puller blades, which can help identify UAV types.
- LiDAR: LiDAR shares the working principle of RADAR, but operates at much higher frequencies, from 200 THz to 400 THz, and can also produce a 3D map of the environment (the time-of-flight relation behind its ranging is given after this list). Although it is less robust to weather, it remains a valuable tool for initializing the position of an object in a detection system. In [9], a probabilistic analysis of the detection of small UAVs in various scenarios is proposed.
- RF sensor: These passive sensors capture the signals a target uses to communicate with the ground, making it possible to detect, locate, and, in some cases, identify the aircraft. Apart from being robust to weather and lighting, one important feature of RF sensors is the possibility of detecting the controller on the ground, which is relevant when countering a threat. In [10], spectral–temporal localization and classification of UAV RF signals are performed with a visual object detection approach, processing the spectrograms with the YOLOv5 object detector and showing promising results even under noise interference (a minimal spectrogram-thresholding sketch follows this list). In [11], a novel RF signal image representation scheme that incorporates a convolutional neural network (CNN) is implemented to perform UAV classification, achieving high classification accuracy scores of 98.72% and 98.67% on two different datasets.
- Acoustic sensor: An acoustic sensor can detect, distinguish, and identify the sound emitted by the engine and propellers of a UAV. With a suitable arrangement of multiple microphones, the azimuth and elevation of one or more UAVs can be estimated (a two-microphone sketch of the idea follows this list). However, this sensor has limitations in detection range and accuracy, and is susceptible to background noise interference, even though it is a low-cost and accessible tool. In [12], good detection and localization performance for small UAVs is achieved using an acoustic-based surveillance system. A UAV detection system based on acoustic signatures fed to two machine learning models, a Random Forest and a Multilayer Perceptron (MLP), is proposed in [13]; the MLP model was considered the better solution for detecting complex and nonlinear acoustic features.
- EO camera: An EO camera detects objects by capturing the light they reflect. While intuitive to interpret and capable of providing detailed information on the surrounding environment, these sensors have low robustness to poorly lit scenes, namely at night, and to weather conditions such as rain and fog. For visual object detection, different computer vision algorithms based on Deep Learning (DL) models have been developed. A comparison of 14 object detectors on the proposed visual UAV dataset is conducted in [14], drawing conclusions on performance and processing time. In [15], a detection and tracking system for small UAVs using a DL framework that performs image alignment is proposed; results with high-resolution images show a track probability of more than 95% up to 700 m. The problem of distinguishing small UAVs from birds is addressed in [16], which concludes that object detectors benefit from training datasets containing various UAVs and birds in order to decrease the number of False Positive (FP) detections at inference (the IoU rule that separates a True Positive from a FP is sketched after this list).
- IR sensor: IR sensors, which capture the thermal signature of objects, are widely explored in the military sector and are better suited to some challenging scenarios. In particular, Long-Wave Infrared (LWIR) sensors can detect the thermal signatures emitted by the batteries of UAVs. These sensors typically have a lower imaging resolution and higher granularity, which limit their standalone use. The resulting lack of texture and feature detail is addressed in [17], where an improved detector that drops low-resolution layers and enhances high-resolution layers is proposed; it also includes a multi-frame filtering stage consisting of an adaptive pipeline filter (APF) to reduce the FP rate, achieving a precision of more than 95%. Promising results in small-UAV detection with an IR sensor are achieved in [18] by learning the nonlinear mapping from an input image to a residual image and highlighting the target UAV by subtracting the two (a residual-subtraction sketch follows this list).
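As standard radar background for the micro-Doppler methods in [6,7,8] (textbook relations, not results from those works): a scatterer with radial velocity $v_r$ seen by a monostatic RADAR at carrier frequency $f_c$ produces a Doppler shift $f_d$, and a rotor blade tip at radius $r$ spinning at $\Omega$ revolutions per second sweeps this shift sinusoidally up to a maximum micro-Doppler extent $f_{\mu,\max}$:

```latex
f_d = \frac{2 v_r}{\lambda} = \frac{2 v_r f_c}{c},
\qquad
f_{\mu,\max} = \frac{2\,(2\pi \Omega r)}{\lambda}
```

For instance, a 0.12 m blade at 100 rev/s has a tip speed of about 75 m/s; at a 77 GHz FMCW carrier ($\lambda \approx 3.9$ mm) the micro-Doppler sidebands extend to roughly ±39 kHz around the body return, which is the kind of blade-dependent structure exploited in [7,8].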
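Similarly, LiDAR ranging rests on the time-of-flight relation (again textbook material, not specific to [9]): a pulse returning after round-trip delay $\Delta t$ places the target at

```latex
R = \frac{c\,\Delta t}{2}
```

so a return delayed by about 667 ns corresponds to a range of 100 m.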
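For the RF modality, the sketch below is a minimal, hypothetical stand-in for the spectral–temporal localization idea in [10]: the spectrogram of captured I/Q samples is treated as an image in which UAV link bursts are localized ([10] uses YOLOv5 for this step; here a simple threshold against the median noise floor marks candidate cells). Function and parameter names are illustrative, not taken from [10].

```python
import numpy as np
from scipy.signal import spectrogram

def rf_burst_candidates(iq, fs, nperseg=1024, thresh_db=12.0):
    """Mark time-frequency cells that rise above the noise floor.

    Crude stand-in for the detector-on-spectrogram approach of [10]:
    compute the spectrogram of complex I/Q samples, estimate the noise
    floor as the median cell power, and flag cells exceeding it by
    `thresh_db` decibels.
    """
    f, t, sxx = spectrogram(iq, fs=fs, nperseg=nperseg,
                            return_onesided=False)
    p_db = 10.0 * np.log10(sxx + 1e-12)
    noise_floor = np.median(p_db)
    return f, t, p_db > (noise_floor + thresh_db)

# Example: a synthetic 25 kHz tone burst buried in noise at fs = 1 MHz.
fs = 1_000_000
n = np.arange(200_000)
iq = 0.01 * (np.random.randn(n.size) + 1j * np.random.randn(n.size))
iq[50_000:80_000] += np.exp(2j * np.pi * 25_000 * n[50_000:80_000] / fs)
f, t, mask = rf_burst_candidates(iq, fs)
print(mask.sum(), "candidate time-frequency cells")
```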
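For the acoustic modality, a minimal two-microphone sketch of azimuth estimation via time difference of arrival (TDOA); the arrays and learned models in [12,13] are considerably more capable, and all names here are illustrative.

```python
import numpy as np

def estimate_azimuth(sig_l, sig_r, fs, mic_distance, c=343.0):
    """Estimate source azimuth from one microphone pair via TDOA.

    The inter-channel delay is taken at the peak of the cross-correlation
    and converted to an angle with the far-field plane-wave model
    delay = mic_distance * sin(azimuth) / c.
    """
    corr = np.correlate(sig_l, sig_r, mode="full")
    lag = np.argmax(corr) - (len(sig_r) - 1)  # > 0: left channel lags right
    sin_az = np.clip((lag / fs) * c / mic_distance, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_az))

# Example: noise delayed by 5 samples across an 8 cm microphone baseline.
fs, d = 48_000, 0.08
src = np.random.randn(4800)
print(f"estimated azimuth: {estimate_azimuth(np.roll(src, 5), src, fs, d):.1f} deg")
```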
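For the EO modality, and for the mAP@0.5 scores reported in Section 4, the standard Intersection-over-Union (IoU) matching rule decides whether a detection counts as a True Positive or a FP; this is the common definition, not something specific to this work.

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2).

    Under mAP@0.5, a detection matches a ground-truth box (True Positive)
    only if their IoU is at least 0.5; otherwise it is a False Positive.
    """
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# Two 90x90 boxes offset by 30 px in each axis overlap 60x60:
print(round(iou((0, 0, 90, 90), (30, 30, 120, 120)), 3))  # 0.286 -> FP at 0.5
```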
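For the IR modality, a sketch of the residual-subtraction idea attributed to [18]. In [18] a network learns the mapping from the input image to a residual (background) image; here a simple local-mean blur stands in for that learned prediction, which is an assumption made purely for illustration.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def residual_highlight(frame, background_pred, k=4.0):
    """Highlight small IR targets by subtracting a predicted background.

    `background_pred` stands in for the learned residual image of [18];
    pixels whose residual exceeds mean + k * std are flagged as target.
    """
    residual = frame.astype(np.float64) - background_pred.astype(np.float64)
    return residual > residual.mean() + k * residual.std()

# Demo: a 2x2 hot spot on a noisy 64x64 frame, box blur as "background".
frame = np.random.rand(64, 64) * 0.1
frame[30:32, 40:42] += 1.0                 # small thermal signature
mask = residual_highlight(frame, uniform_filter(frame, size=9))
print(mask.sum(), "pixels flagged")
```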
2. Detection and Tracking Architecture
2.1. Proposed System Architecture
2.2. Data Fusion Methodology
3. Dataset
3.1. Experimental Work
3.2. Labelled Dataset
3.3. Inference Dataset
4. Results and Discussion
4.1. Detector on Labelled Dataset
4.2. Overfitting
4.3. Independent Model Testing
4.3.1. ECC Algorithm
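The ECC algorithm of [25] estimates a parametric warp between two images by maximizing an enhanced correlation coefficient, which makes it suitable for aligning the EO and IR frames. Below is a minimal sketch using OpenCV's implementation; the Euclidean motion model and iteration settings are assumptions for illustration, not the exact configuration evaluated here.

```python
import cv2
import numpy as np

def align_ir_to_eo(ir_gray, eo_gray, iterations=200, eps=1e-6):
    """Warp a grayscale IR frame onto an EO frame with ECC alignment [25].

    Estimates a 2x3 Euclidean warp by enhanced-correlation-coefficient
    maximization, then applies it to the IR frame.
    """
    warp = np.eye(2, 3, dtype=np.float32)           # initial guess: identity
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT,
                iterations, eps)
    cc, warp = cv2.findTransformECC(eo_gray, ir_gray, warp,
                                    cv2.MOTION_EUCLIDEAN, criteria,
                                    None, 5)         # no mask, 5 px Gauss blur
    h, w = eo_gray.shape
    aligned = cv2.warpAffine(ir_gray, warp, (w, h),
                             flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)
    return aligned, cc                               # cc: final correlation

```

Both inputs must be single-channel images of matching size and dtype, so the lower-resolution IR frame would typically be resampled to the EO resolution before alignment.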
4.3.2. Target Operational Conditions
4.4. Data Fusion Architecture Testing
Target Operational Conditions
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition
---|---
C-UAS | Counter-UAS
CNN | Convolutional Neural Network
DL | Deep Learning
EO | Electro-optical
FE | Flight Experiment
FN | False Negative
FP | False Positive
GAN | Generative Adversarial Network
IDS | Identification Switch
IR | Infrared
mAP@0.5 | Mean Average Precision at 0.5
mAP@[.5:.95] | Mean Average Precision at [.5:.95]
MLP | Multilayer Perceptron
RF | Radio Frequency
UAS | Unmanned Aerial System
UAV | Unmanned Aerial Vehicle
UVIC-CfAR | University of Victoria’s Center for Aerospace Research
References
1. Worldwide Drone Incidents. Available online: https://www.dedrone.com/resources/incidents-new/all (accessed on 19 January 2024).
2. Castrillo, V.U.; Manco, A.; Pascarella, D.; Gigante, G. A Review of Counter-UAS Technologies for Cooperative Defensive Teams of Drones. Drones 2022, 6, 65.
3. Park, S.; Kim, H.T.; Lee, S.; Joo, H.; Kim, H. Survey on Anti-Drone Systems: Components, Designs, and Challenges. IEEE Access 2021, 9, 42635–42659.
4. Wang, J.; Liu, Y.; Song, H. Counter-Unmanned Aircraft System(s) (C-UAS): State of the Art, Challenges, and Future Trends. IEEE Aerosp. Electron. Syst. Mag. 2021, 36, 4–29.
5. Wang, B.; Li, Q.; Mao, Q.; Wang, J.; Chen, C.L.P.; Shangguan, A.; Zhang, H. A Survey on Vision-Based Anti Unmanned Aerial Vehicles Methods. Drones 2024, 8, 518.
6. Sun, Y.; Abeywickrama, S.; Jayasinghe, L.; Yuen, C.; Chen, J.; Zhang, M. Micro-Doppler Signature-Based Detection, Classification, and Localization of Small UAV with Long Short-Term Memory Neural Network. IEEE Trans. Geosci. Remote Sens. 2021, 59, 6285–6300.
7. Passafiume, M.; Rojhani, N.; Collodi, G.; Cidronali, A. Modeling Small UAV Micro-Doppler Signature Using Millimeter-Wave FMCW Radar. Electronics 2021, 10, 747.
8. Yan, J.; Hu, H.; Gong, J.; Kong, D.; Li, D. Exploring Radar Micro-Doppler Signatures for Recognition of Drone Types. Drones 2023, 7, 280.
9. Dogru, S.; Marques, L. Drone Detection Using Sparse Lidar Measurements. IEEE Robot. Autom. Lett. 2022, 7, 3062–3069.
10. Nelega, R.; Belean, B.; Valeriu, R.; Turcu, F.; Puschita, E. Radio Frequency-Based Drone Detection and Classification Using Deep Learning Algorithms. In Proceedings of the 2023 International Conference on Software, Telecommunications and Computer Networks (SoftCOM), Split, Croatia, 21–23 September 2023.
11. Fu, Y.; He, Z. Radio Frequency Signal-Based Drone Classification with Frequency Domain Gramian Angular Field and Convolutional Neural Network. Drones 2024, 8, 511.
12. Shi, Z.; Chang, X.; Yang, C.; Wu, Z.; Wu, J. An Acoustic-Based Surveillance System for Amateur Drones Detection and Localization. IEEE Trans. Veh. Technol. 2020, 69, 2731–2739.
13. Ahmed, C.A.; Batool, F.; Haider, W.; Asad, M.; Raza Hamdani, S.H. Acoustic Based Drone Detection via Machine Learning. In Proceedings of the 2022 International Conference on IT and Industrial Technologies (ICIT), Shanghai, China, 28–31 March 2022.
14. Zhao, J.; Zhang, J.; Li, D.; Wang, D. Vision-Based Anti-UAV Detection and Tracking. IEEE Trans. Intell. Transp. Syst. 2022, 23, 25323–25334.
15. Ghosh, S.; Patrikar, J.; Moon, B.; Hamidi, M.M.; Scherer, S. AirTrack: Onboard Deep Learning Framework for Long-Range Aircraft Detection and Tracking. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 29 May–2 June 2023.
16. Coluccia, A.; Fascista, A.; Schumann, A.; Sommer, L.; Dimou, A.; Zarpalas, D.; Méndez, M.; de la Iglesia, D.; González, I.; Mercier, J.P.; et al. Drone vs. Bird Detection: Deep Learning Algorithms and Results from a Grand Challenge. Sensors 2021, 21, 2824.
17. Ding, L.; Xu, X.; Cao, Y.; Zhai, G.; Yang, F.; Qian, L. Detection and Tracking of Infrared Small Target by Jointly Using SSD and Pipeline Filter. Digit. Signal Process. 2021, 110, 102949.
18. Fang, H.; Ding, L.; Wang, L.; Chang, Y.; Yan, L.; Han, J. Infrared Small UAV Target Detection Based on Depthwise Separable Residual Dense Network and Multiscale Feature Fusion. IEEE Trans. Instrum. Meas. 2022, 71, 1–20.
19. Wu, X.; Li, W.; Hong, D.; Tao, R.; Du, Q. Deep Learning for Unmanned Aerial Vehicle-Based Object Detection and Tracking: A Survey. IEEE Geosci. Remote Sens. Mag. 2022, 10, 91–124.
20. Svanström, F.; Alonso-Fernandez, F.; Englund, C. Drone Detection and Tracking in Real-Time by Fusion of Different Sensing Modalities. Drones 2022, 6, 317.
21. Alldieck, T.; Bahnsen, C.H.; Moeslund, T.B. Context-Aware Fusion of RGB and Thermal Imagery for Traffic Monitoring. Sensors 2016, 16, 1947.
22. Yang, L.; Ma, R.; Zakhor, A. Drone Object Detection Using RGB/IR Fusion. In Proceedings of the Symposium on Electronic Imaging: Computational Imaging XX, Online, 17–20 January 2022.
23. Wang, C.-Y.; Bochkovskiy, A.; Liao, H.-Y.M. YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, BC, Canada, 18–22 June 2023.
24. Zhang, Y.; Sun, P.; Jiang, Y.; Yu, D.; Weng, F.; Yuan, Z.; Luo, P.; Liu, W.; Wang, X. ByteTrack: Multi-Object Tracking by Associating Every Detection Box. In Proceedings of the European Conference on Computer Vision (ECCV), Tel Aviv, Israel, 23–27 October 2022.
25. Evangelidis, G.D.; Psarakis, E.Z. Parametric Image Alignment Using Enhanced Correlation Coefficient Maximization. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 1858–1865.
26. Lopes, J.P.D.; Suleman, A.; Figueiredo, M.A.T. Detection and Tracking of Non-Cooperative UAVs: A Deep Learning Moving-Object Tracking Approach. M.Sc. Thesis, Instituto Superior Técnico, Lisbon, Portugal, 2022.
27. Sun, C.; Zhang, C.; Xiong, N. Infrared and Visible Image Fusion Techniques Based on Deep Learning: A Review. Electronics 2020, 9, 2162.
28. Ma, J.; Yu, W.; Liang, P.; Li, C.; Jiang, J. FusionGAN: A Generative Adversarial Network for Infrared and Visible Image Fusion. Inf. Fusion 2019, 48, 11–26.
29. Szeliski, R. Computer Vision: Algorithms and Applications, 2nd ed.; Springer: New York, NY, USA, 2021; pp. 33–96.
30. Pedro, S.; Tomás, D.; Vale, J.L.; Suleman, A. Design and Performance Quantification of VTOL Systems for a Canard Aircraft. Aeronaut. J. 2021, 125, 1768–1791.
31. Castellani, N.; Pedrosa, F.; Matlock, J.; Mazur, A.; Lowczycki, K.; Widera, P.; Zawadzki, K.; Lipka, K.; Suleman, A. Development of a Series Hybrid Multirotor. In Proceedings of the 13th EASN International Conference on Innovation in Aviation & Space for Opening New Horizons, Salerno, Italy, 5–8 September 2023.
32. Zheng, Y.; Lin, S.; Kambhamettu, C.; Yu, J.; Kang, S.B. Single-Image Vignetting Correction. IEEE Trans. Pattern Anal. Mach. Intell. 2009, 31, 2243–2256.
33. Zheng, Y.; Grossman, M.; Awate, S.; Gee, J. Automatic Correction of Intensity Nonuniformity from Sparseness of Gradient Distribution in Medical Images. In Proceedings of the 12th International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI), London, UK, 20–24 September 2009.
Sensor | Calibration Matrix | Distortion Coefficients
---|---|---
EO | | {2.1355, −5.5289, , 0.1064}
IR | | {−0.3653, 23.3465, −0.0247, −0.020}
Model | Precision | Recall | mAP@0.5 | mAP@[.5:.95]
---|---|---|---|---
EO | 0.860 | 0.827 | 0.839 | 0.599
IR | 0.894 | 0.846 | 0.885 | 0.656
IR BR | 0.887 | 0.860 | 0.886 | 0.646
Pixel Fused | 0.893 | 0.873 | 0.900 | 0.622
Pixel Fused BR | 0.886 | 0.866 | 0.896 | 0.614
Average | 0.884 | 0.854 | 0.885 | 0.627
Model | Precision | Recall | mAP@0.5 | mAP@[.5:.95] | Time per Image Variation (%)
---|---|---|---|---|---
EO | 0.856 | 0.813 | 0.823 | 0.544 | −47.6
IR | 0.877 | 0.835 | 0.873 | 0.615 | −67.1
Pixel Fused | 0.878 | 0.855 | 0.872 | 0.550 | −55.7
Average | 0.870 | 0.834 | 0.856 | 0.570 | −56.8
FE | Data | Range | Nr. of Frames | Precision | Recall | Frame Rate (fps) | IDS per 100 Frames
---|---|---|---|---|---|---|---
A | EO | close | 2736 | 0.956 | 0.983 | 97.6 | 3.408
A | EO | medium | 5721 | 0.962 | 0.995 | 101.3 | 2.483
A | EO | far | 4645 | 0.937 | 0.878 | 101.2 | 3.896
A | IR | close | 2736 | 0.940 | 0.977 | 99.2 | 5.783
A | IR | medium | 5721 | 0.960 | 0.963 | 103.0 | 4.398
A | IR | far | 4645 | 0.905 | 0.955 | 100.4 | 11.779
C | EO | close | 2226 | 0.950 | 0.638 | 105.1 | 5.809
C | EO | medium | 8680 | 0.986 | 0.770 | 101.8 | 1.501
C | EO | far | 9087 | 0.986 | 0.634 | 105.0 | 1.734
C | EO | v. far | 2812 | 0.932 | 0.111 | 117.0 | 1.798
C | IR | close | 2226 | 0.964 | 0.455 | 111.4 | 1.565
C | IR | medium | 8680 | 0.943 | 0.773 | 103.9 | 4.354
C | IR | far | 9087 | 0.919 | 0.717 | 104.8 | 9.071
C | IR | v. far | 2812 | 0.928 | 0.245 | 115.8 | 3.872
FE | Data | Range | Precision | Precision Variation (%) | Recall | Recall Variation (%) | Frame Rate (fps) | IDS per 100 Frames
---|---|---|---|---|---|---|---|---
A | EO-IR | close | 0.999 | +4.3 | 0.979 | −0.4 | 91.1 | 0.000
A | EO-IR | medium | 0.999 | +3.7 | 0.988 | −0.7 | 93.1 | 0.494
A | EO-IR | far | 0.996 | +5.9 | 0.951 | +7.3 | 82.2 | 3.885
A | IR-EO | close | 0.992 | +5.2 | 0.979 | +0.3 | 89.8 | 0.395
A | IR-EO | medium | 0.997 | +3.7 | 0.989 | +2.6 | 92.4 | 0.681
A | IR-EO | far | 0.992 | +8.7 | 0.952 | −0.3 | 84.1 | 3.007
C | EO-IR | close | 0.999 | +4.9 | 0.634 | −0.4 | 81.5 | 0.679
C | EO-IR | medium | 0.999 | +1.3 | 0.808 | +3.8 | 87.2 | 0.427
C | EO-IR | far | 0.995 | +0.9 | 0.719 | +8.4 | 78.9 | 1.662
C | EO-IR | v. far | 0.994 | +6.1 | 0.182 | +7.1 | 76.1 | 1.180
C | IR-EO | close | 0.995 | +3.1 | 0.572 | +11.7 | 77.1 | 0.679
C | IR-EO | medium | 0.992 | +4.9 | 0.822 | +4.9 | 87.0 | 0.532
C | IR-EO | far | 0.989 | +7.0 | 0.752 | +3.5 | 81.7 | 1.431
C | IR-EO | v. far | 0.975 | +4.7 | 0.241 | −0.4 | 75.8 | 0.600
C | Pixel Fused | close | 0.943 | −0.70 / −2.10 | 0.429 | −21.0 / −2.70 | 113.4 | 4.253
C | Pixel Fused | medium | 0.940 | −4.50 / +0.10 | 0.577 | −16.5 / −16.9 | 107.4 | 5.219
C | Pixel Fused | far | 0.954 | −3.30 / +1.70 | 0.413 | −15.8 / −15.5 | 113.7 | 2.190
C | Pixel Fused | v. far | 0.984 | +5.20 / +5.60 | 0.123 | +1.30 / −12.1 | 117.1 | 0.397
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).