Real-Time Target Detection System for Intelligent Vehicles Based on Multi-Source Data Fusion
Abstract
1. Introduction
2. System Design
- (1) Information acquisition: The jointly calibrated millimeter-wave radar and camera detect targets in their working areas; the measured velocity, relative position, azimuth, and other information is received, pre-processed, and published to the ROS space.
- (2) Spatio-temporal synchronization: The two sensors differ in data format and transmission period, so without temporal alignment their measurements of the same target cannot be obtained at the same moment, and fusion detection cannot begin. Likewise, because the two sensors are mounted at different frontal positions, spatial alignment is needed to describe the spatial attributes of the same measurement in a unified coordinate system (the body coordinate system in this paper). For temporal synchronization, two sensor data frames with the same timestamp are combined into one data frame and published to the ROS space. For spatial synchronization, the target coordinate points detected by the millimeter-wave radar are projected into the camera coordinate system using the stored joint calibration data and published to the ROS space.
- (3) Fusion decision: Based on the spatio-temporal synchronization results, the detection region obtained by vision is taken as the primary reference and combined with the data detected by the millimeter-wave radar to refine the detection bounding box and complete the detection process.
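The two synchronization steps above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the greedy timestamp pairing stands in for what a ROS message filter (e.g., `message_filters.ApproximateTimeSynchronizer`) would do on live topics, and the intrinsic/extrinsic values used in practice would come from the joint calibration described in Section 6, not the placeholder values here.

```python
import numpy as np

def pair_by_timestamp(radar_frames, camera_frames, tol=0.05):
    """Temporal synchronization sketch: greedily pair each radar frame with
    the camera frame whose timestamp is closest, accepting the pair only if
    the gap is within tol seconds. Inputs are lists of (timestamp, payload)
    tuples sorted by timestamp."""
    pairs, j = [], 0
    for t_r, radar in radar_frames:
        # Advance the camera index while the next frame is at least as close.
        while (j + 1 < len(camera_frames)
               and abs(camera_frames[j + 1][0] - t_r) <= abs(camera_frames[j][0] - t_r)):
            j += 1
        t_c, cam = camera_frames[j]
        if abs(t_c - t_r) <= tol:
            pairs.append((radar, cam))  # one combined data frame
    return pairs

def radar_point_to_pixel(p_radar, R, t, K):
    """Spatial synchronization sketch: rigid transform (extrinsics R, t) of a
    radar point into the camera frame, followed by a pinhole projection with
    the camera intrinsics K. Returns pixel coordinates (u, v)."""
    p_cam = R @ np.asarray(p_radar, dtype=float) + t
    u, v, w = K @ p_cam
    return u / w, v / w
```

For example, with identity extrinsics and a focal length of 500 px, a point 5 m ahead and 1 m to the side projects 100 px off the image center, as the assertions below check.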
- (1) On-board platform: Carries the sensors, domain controller, and display. After the sensors are installed, their extrinsic parameters are jointly calibrated according to their mounting positions.
- (2) Sensors: Sense the position, size, and azimuth of targets in the environment and upload the data to the controller over CAN (Controller Area Network) and FPD-Link communication.
- (3) Domain controller: Runs the real-time target detection program, which performs sensor data pre-processing, data fusion, and target detection, and publishes the results.
3. System Composition
3.1. Hardware System
3.2. Software System
4. Data Processing
4.1. Visual Data Processing Based on Improved YOLO Algorithm
4.2. Millimeter-Wave Radar Data Processing
5. The Key Technologies in Data Fusion
5.1. Time Synchronization
5.2. Space Synchronization
5.3. Data Fusion
5.4. Target Detection and Tracking
6. Equipment Commissioning and Real Vehicle Test
6.1. Equipment Calibration
6.1.1. Internal Reference Calibration
6.1.2. External Reference Calibration
6.2. Fusion Effect
7. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Equipment | Quantity | Function |
|---|---|---|
| Senyun Smart industrial camera | 1 | Acquires images of the environment to be measured |
| Continental ARS-408 millimeter-wave radar | 1 | Obtains the distance, position, and other information of targets in the environment |
| Cuckoo autopilot domain controller | 1 | Deploys programs, receives data, and publishes recognition results |
| 12 V DC regulated power supply | 1 | Powers the system |
| 7-inch display | 1 | Displays the program's recognition results and related information |
| Communication module | 1 | Transfers data between the sensors and the controller |
| Software Name | Version |
|---|---|
| Ubuntu | 18.04 LTS |
| JetPack | 4.4.1 |
| CUDA | 10.2.89 |
| OpenCV | 4.1.1 |
| ROS | Melodic |
| Python | 3.6.9 |
| YOLOv5 | PyTorch version |
(a)

| Training Set | Sunny | Rain | Cloudy | Test Set | Sunny | Rain |
|---|---|---|---|---|---|---|
| Daytime open space | 400 | 300 | 300 | Daytime open space | 100 | 75 |
| Daytime road complex | 400 | 300 | 300 | Daytime road complex | 100 | 75 |
| Evening open space | 400 | 300 | 300 | Evening open space | 100 | 75 |
| Evening road complex | 400 | 300 | 300 | Evening road complex | 100 | 75 |

(b)

| Name | Model Size | Image Size | Detection Speed (fps) | Average Correct Detection Rate (%) | Recall (%) |
|---|---|---|---|---|---|
| Improved YOLOv5 | 168 M | 640 × 640 | 31.9 | 88.06 | 80.46 |
| YOLOv5 | 90.2 M | 640 × 640 | 32.6 | 87.21 | 79.98 |
| YOLOv3 | 237 M | 640 × 640 | 16.5 | 85.54 | 77.66 |
| Camera | Millimeter-Wave Radar | Strategy |
|---|---|---|
| Target detected | Target detected | Perform fusion detection |
| No target detected | Target detected | Output millimeter-wave radar detection data |
| Target detected | No target detected | Output visual detection data |
| No target detected | No target detected | Fall back on the extended Kalman filter's prediction |
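The strategy table maps directly onto a small dispatch function. The sketch below uses hypothetical names (`camera_dets`, `radar_dets`, `ekf_predict`) and is only an illustration of the decision logic, not the paper's code:

```python
def select_strategy(camera_dets, radar_dets, ekf_predict):
    """Choose an output per the fusion strategy table: both sensors -> fuse,
    one sensor -> pass its detections through, neither -> fall back on the
    extended Kalman filter's prediction of the tracked target's state."""
    if camera_dets and radar_dets:
        return ("fusion", (camera_dets, radar_dets))
    if radar_dets:
        return ("radar_only", radar_dets)
    if camera_dets:
        return ("vision_only", camera_dets)
    return ("ekf_prediction", ekf_predict())
```

Keeping the fallback as a callable means the (comparatively expensive) EKF prediction runs only when both sensors report nothing.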
| Detection Method | Class | Total Targets | Correct Detections | False Detections | Missed Detections | Distance Information Error | Accuracy (%) | Missed Detection Rate (%) |
|---|---|---|---|---|---|---|---|---|
| Single vision detection | Vehicles | 832 | 716 | 724 | 116 | None | 86.06 | 13.94 |
| | People | 1772 | 1521 | 1544 | 251 | | 85.84 | 14.16 |
| Single radar detection | Vehicles | 832 | 631 | 740 | 201 | <2 m | 75.84 | 24.16 |
| | People | 1772 | 1323 | 1602 | 449 | | 74.66 | 25.34 |
| Fusion detection | Vehicles | 832 | 766 | 74 | 66 | <2 m | 92.07 | 7.93 |
| | People | 1772 | 1604 | 172 | 168 | | 90.52 | 9.48 |
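The per-class rates in the comparison table follow directly from the correct and missed counts (e.g., for single vision detection of vehicles, 716 correct out of 716 + 116 = 832 targets gives 86.06%); false detections are extra boxes and do not enter these two rates. A quick check:

```python
def detection_rates(correct, missed):
    """Accuracy and missed-detection rate (%) from per-class counts,
    rounded to two decimals as in the table."""
    total = correct + missed
    return round(100 * correct / total, 2), round(100 * missed / total, 2)
```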
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zou, J.; Zheng, H.; Wang, F. Real-Time Target Detection System for Intelligent Vehicles Based on Multi-Source Data Fusion. Sensors 2023, 23, 1823. https://doi.org/10.3390/s23041823