A Survey on Sensor Failures in Autonomous Vehicles: Challenges and Solutions
Abstract
1. Introduction
- Identifying and categorizing sensor failures.
- Identifying sensor limitations and the most frequent failures.
- Researching mitigation strategies for the categorized problems, such as sensor calibration, sensor fusion, radar ambiguity detection, and techniques for dealing with camera failures (e.g., blur, condensation, broken lenses, heat, and water). Since none of these mitigation strategies addresses the complete failure of a sensor, we also explore how redundancy is applied in AVs.
- Discussing ongoing and potential future advances in perception systems, such as the continuous improvement of sensors, increasing the robustness of machine-learning models, and the importance of redundancy in AVs.
2. Background Concepts
3. Sensors Used in the Perception Layer
3.1. Ultrasonic Sensors
3.2. RADAR: Radio Detection and Ranging
3.3. LiDAR: Light Detection and Ranging
3.4. Camera
3.5. GNSS: Global Navigation Satellite Systems
- Timing errors, due to variations between the satellite’s atomic clock and the receiver’s crystal clock.
- Signal delays, caused by propagation through the ionosphere and troposphere.
- Multipath effect.
- Satellite orbit uncertainties.
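For context, these error sources can be related through the standard single-frequency pseudorange observation model (a textbook expression, not an equation given in this survey):

$$\rho = r + c\,(\delta t_r - \delta t^s) + I + T + M + \varepsilon$$

where $r$ is the geometric receiver-to-satellite range, $c$ the speed of light, $\delta t_r$ and $\delta t^s$ the receiver and satellite clock offsets, $I$ and $T$ the ionospheric and tropospheric delays, $M$ the multipath error, and $\varepsilon$ a residual term collecting receiver noise and broadcast-orbit (ephemeris) errors.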
3.6. Inertial Measurement Units
3.7. Summary of Problems and Weaknesses of Interoceptive and Exteroceptive Sensors
4. Mitigation Strategies
4.1. Sensor Calibration
4.1.1. Intrinsic Calibration
4.1.2. Extrinsic Calibration
4.1.3. Temporal Calibration
4.2. Sensor Fusion
4.2.1. Sensor Fusion Methodologies
4.2.2. Sensor Fusion Techniques and Algorithms
4.3. Radar Interference Mitigation Strategies
- Detection and suppression at the receiver: interference is detected in the measurement data and eliminated by removing the corrupted samples and reconstructing their values [73] (a minimal sketch of this idea appears after this list).
- Detection and avoidance: when interference is detected in the measurement signal, the radar actively modifies its own signal to avoid interference in subsequent cycles. This strategy is inspired by the interference-avoidance mechanism of bats [74]; it avoids interference rather than suppressing it locally.
- Interference-aware cognitive radar: the radar senses the entire operational spectrum and adaptively avoids interference through waveform modification [75].
- Centralized coordination: self-driving cars are centrally coordinated to avoid radar interference [76]. Vehicles send their locations and routes to a control center, which models the radar operating schedules of vehicles in the same environment as a graph-coloring problem and creates playbooks for each self-driving car so that its radars operate without interference.
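As referenced in the first item above, receiver-side detection and suppression can be sketched in a few lines: impulsive interference typically shows up as short, high-amplitude bursts in the FMCW beat signal, so corrupted samples can be flagged with an amplitude threshold and re-created by interpolating the neighboring clean samples. The sketch below is illustrative only; the signal model, threshold factor, and burst shape are assumptions rather than the method of [73].

```python
import numpy as np

def suppress_interference(beat_signal: np.ndarray, k: float = 4.0) -> np.ndarray:
    """Detect and suppress impulsive interference in a complex FMCW beat signal.

    Samples whose magnitude exceeds k times the median magnitude are treated as
    interference-corrupted; they are removed and re-created by linear
    interpolation from the surrounding clean samples (illustrative sketch only).
    """
    mag = np.abs(beat_signal)
    threshold = k * np.median(mag)          # robust amplitude threshold
    corrupted = mag > threshold             # mask of interference-hit samples

    clean = beat_signal.copy()
    idx = np.arange(len(beat_signal))
    if corrupted.any() and (~corrupted).sum() >= 2:
        # Interpolate real and imaginary parts separately across the gaps.
        clean[corrupted] = (
            np.interp(idx[corrupted], idx[~corrupted], beat_signal[~corrupted].real)
            + 1j * np.interp(idx[corrupted], idx[~corrupted], beat_signal[~corrupted].imag)
        )
    return clean

# Usage: a clean beat tone with a short synthetic interference burst injected.
t = np.arange(1024)
signal = np.exp(2j * np.pi * 0.05 * t)                           # target beat tone
signal[400:420] += 10 * np.exp(2j * np.pi * 0.3 * t[400:420])    # interference burst
restored = suppress_interference(signal)
```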
4.4. Radar Ambiguity Detection Mitigation Strategies
4.5. Redundancy
4.6. Camera Image Failures Mitigation Strategies
4.7. LiDAR Mirror-like Object Detection Using Unsupervised Learning
5. Future Research Directions
5.1. Synthetic Aperture Radar (SAR) in AVs
5.2. Sensor Fusion Algorithms
5.3. Advanced Driver Assistance Systems (ADAS) Redundancy
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Silva, Ó.; Cordera, R.; González-González, E.; Nogués, S. Environmental impacts of autonomous vehicles: A review of the scientific literature. Sci. Total Environ. 2022, 830, 154615. [Google Scholar] [CrossRef] [PubMed]
- Xie, G.; Li, Y.; Han, Y.; Xie, Y.; Zeng, G.; Li, R. Recent Advances and Future Trends for Automotive Functional Safety Design Methodologies. IEEE Trans. Ind. Inf. 2020, 16, 5629–5642. [Google Scholar] [CrossRef]
- Rojas-Rueda, D.; Nieuwenhuijsen, M.J.; Khreis, H.; Frumkin, H. Autonomous Vehicles and Public Health. Annu. Rev. Public. Health 2020, 41, 329–345. [Google Scholar] [CrossRef] [PubMed]
- Vinkhuyzen, E.; Cefkin, M. Developing Socially Acceptable Autonomous Vehicles. Ethnogr. Prax. Ind. Conf. Proc. 2016, 2016, 522–534. [Google Scholar] [CrossRef]
- Morita, T.; Managi, S. Autonomous vehicles: Willingness to pay and the social dilemma. Transp. Res. Part. C Emerg. Technol. 2020, 119, 102748. [Google Scholar] [CrossRef]
- Wang, J.; Zhang, L.; Huang, Y.; Zhao, J.; Bella, F. Safety of Autonomous Vehicles. J. Adv. Transp. 2020, 2020, 8867757. [Google Scholar] [CrossRef]
- Ahangar, M.N.; Ahmed, Q.Z.; Khan, F.A.; Hafeez, M. A Survey of Autonomous Vehicles: Enabling Communication Technologies and Challenges. Sensors 2021, 21, 706. [Google Scholar] [CrossRef]
- Pandharipande, A.; Cheng, C.-H.; Dauwels, J.; Gurbuz, S.Z.; Ibanez-Guzman, J.; Li, G.; Piazzoni, A.; Wang, P.; Santra, A. Sensing and Machine Learning for Automotive Perception: A Review. IEEE Sens. J. 2023, 23, 11097–11115. [Google Scholar] [CrossRef]
- Ramos, M.A.; Correa Jullian, C.; McCullough, J.; Ma, J.; Mosleh, A. Automated Driving Systems Operating as Mobility as a Service: Operational Risks and SAE J3016 Standard. In Proceedings of the 2023 Annual Reliability and Maintainability Symposium (RAMS), Orlando, FL, USA, 23–26 January 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1–6. [Google Scholar]
- Scurt, F.B.; Vesselenyi, T.; Tarca, R.C.; Beles, H.; Dragomir, G. Autonomous vehicles: Classification, technology and evolution. IOP Conf. Ser. Mater. Sci. Eng. 2021, 1169, 012032. [Google Scholar] [CrossRef]
- Velasco-Hernandez, G.; Yeong, D.J.; Barry, J.; Walsh, J. Autonomous Driving Architectures, Perception and Data Fusion: A Review. In Proceedings of the 2020 IEEE 16th International Conference on Intelligent Computer Communication and Processing (ICCP), Cluj-Napoca, Romania, 3–5 September 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 315–321. [Google Scholar]
- Ignatious, H.A.; Sayed, H.-E.; Khan, M. An overview of sensors in Autonomous Vehicles. Procedia Comput. Sci. 2022, 198, 736–741. [Google Scholar] [CrossRef]
- Ortiz, F.M.; Sammarco, M.; Costa, L.H.M.K.; Detyniecki, M. Applications and Services Using Vehicular Exteroceptive Sensors: A Survey. IEEE Trans. Intell. Veh. 2023, 8, 949–969. [Google Scholar] [CrossRef]
- Yeong, D.J.; Velasco-Hernandez, G.; Barry, J.; Walsh, J. Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review. Sensors 2021, 21, 2140. [Google Scholar] [CrossRef] [PubMed]
- Wang, Z.; Wu, Y.; Niu, Q. Multi-Sensor Fusion in Automated Driving: A Survey. IEEE Access 2020, 8, 2847–2868. [Google Scholar] [CrossRef]
- AlZu’bi, S.; Jararweh, Y. Data Fusion in Autonomous Vehicles Research, Literature Tracing from Imaginary Idea to Smart Surrounding Community. In Proceedings of the 2020 Fifth International Conference on Fog and Mobile Edge Computing (FMEC), Paris, France, 20–23 April 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 306–311. [Google Scholar]
- Avizienis, A.; Laprie, J.-C.; Randell, B.; Landwehr, C. Basic concepts and taxonomy of dependable and secure computing. IEEE Trans. Dependable Secur. Comput. 2004, 1, 11–33. [Google Scholar] [CrossRef]
- Budisusila, E.N.; Khosyi’in, M.; Prasetyowati, S.A.D.; Suprapto, B.Y.; Nawawi, Z. Ultrasonic Multi-Sensor Detection Patterns on Autonomous Vehicles Using Data Stream Method. In Proceedings of the 2021 8th International Conference on Electrical Engineering, Computer Science and Informatics (EECSI), Semarang, Indonesia, 20–21 October 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 144–150. [Google Scholar]
- Paidi, V.; Fleyeh, H.; Håkansson, J.; Nyberg, R.G. Smart parking sensors, technologies and applications for open parking lots: A review. IET Intell. Transp. Syst. 2018, 12, 735–741. [Google Scholar] [CrossRef]
- Vargas, J.; Alsweiss, S.; Toker, O.; Razdan, R.; Santos, J. An Overview of Autonomous Vehicles Sensors and Their Vulnerability to Weather Conditions. Sensors 2021, 21, 5397. [Google Scholar] [CrossRef] [PubMed]
- Rosique, F.; Navarro, P.J.; Fernández, C.; Padilla, A. A Systematic Review of Perception System and Simulators for Autonomous Vehicles Research. Sensors 2019, 19, 648. [Google Scholar] [CrossRef]
- Komissarov, R.; Kozlov, V.; Filonov, D.; Ginzburg, P. Partially coherent radar unties range resolution from bandwidth limitations. Nat. Commun. 2019, 10, 1423. [Google Scholar] [CrossRef] [PubMed]
- Li, Y.; Ibanez-Guzman, J. Lidar for Autonomous Driving: The Principles, Challenges, and Trends for Automotive Lidar and Perception Systems. IEEE Signal Process Mag. 2020, 37, 50–61. [Google Scholar] [CrossRef]
- Dreissig, M.; Scheuble, D.; Piewak, F.; Boedecker, J. Survey on LiDAR Perception in Adverse Weather Conditions. In Proceedings of the 2023 IEEE Intelligent Vehicles Symposium (IV), Anchorage, AK, USA, 4–7 June 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1–8. [Google Scholar]
- Damodaran, D.; Mozaffari, S.; Alirezaee, S.; Ahamed, M.J. Experimental Analysis of the Behavior of Mirror-like Objects in LiDAR-Based Robot Navigation. Appl. Sci. 2023, 13, 2908. [Google Scholar] [CrossRef]
- Li, Y.; Moreau, J.; Ibanez-Guzman, J. Emergent Visual Sensors for Autonomous Vehicles. IEEE Trans. Intell. Transp. Syst. 2023, 24, 4716–4737. [Google Scholar] [CrossRef]
- Roszyk, K.; Nowicki, M.R.; Skrzypczyński, P. Adopting the YOLOv4 Architecture for Low-Latency Multispectral Pedestrian Detection in Autonomous Driving. Sensors 2022, 22, 1082. [Google Scholar] [CrossRef]
- Sun, C.; Chen, Y.; Qiu, X.; Li, R.; You, L. MRD-YOLO: A Multispectral Object Detection Algorithm for Complex Road Scenes. Sensors 2024, 24, 3222. [Google Scholar] [CrossRef]
- Xie, Y.; Zhang, L.; Yu, X.; Xie, W. YOLO-MS: Multispectral Object Detection via Feature Interaction and Self-Attention Guided Fusion. IEEE Trans. Cogn. Dev. Syst. 2023, 15, 2132–2143. [Google Scholar] [CrossRef]
- Altay, F.; Velipasalar, S. The Use of Thermal Cameras for Pedestrian Detection. IEEE Sens. J. 2022, 22, 11489–11498. [Google Scholar] [CrossRef]
- Chen, Y.; Shin, H. Pedestrian Detection at Night in Infrared Images Using an Attention-Guided Encoder-Decoder Convolutional Neural Network. Appl. Sci. 2020, 10, 809. [Google Scholar] [CrossRef]
- Tan, M.; Chao, W.; Cheng, J.-K.; Zhou, M.; Ma, Y.; Jiang, X.; Ge, J.; Yu, L.; Feng, L. Animal Detection and Classification from Camera Trap Images Using Different Mainstream Object Detection Architectures. Animals 2022, 12, 1976. [Google Scholar] [CrossRef]
- Iwasaki, Y.; Misumi, M.; Nakamiya, T. Robust Vehicle Detection under Various Environmental Conditions Using an Infrared Thermal Camera and Its Application to Road Traffic Flow Monitoring. Sensors 2013, 13, 7756–7773. [Google Scholar] [CrossRef]
- Bijelic, M.; Gruber, T.; Mannan, F.; Kraus, F.; Ritter, W.; Dietmayer, K.; Heide, F. Seeing Through Fog Without Seeing Fog: Deep Multimodal Sensor Fusion in Unseen Adverse Weather. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 13–19 June 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 11679–11689. [Google Scholar]
- Yamaguchi, E.; Higuchi, H.; Yamashita, A.; Asama, H. Glass Detection Using Polarization Camera and LRF for SLAM in Environment with Glass. In Proceedings of the 2020 21st International Conference on Research and Education in Mechatronics (REM), Cracow, Poland, 9–11 December 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6. [Google Scholar]
- Shariff, W.; Dilmaghani, M.S.; Kielty, P.; Moustafa, M.; Lemley, J.; Corcoran, P. Event Cameras in Automotive Sensing: A Review. IEEE Access 2024, 12, 51275–51306. [Google Scholar] [CrossRef]
- Ceccarelli, A.; Secci, F. RGB Cameras Failures and Their Effects in Autonomous Driving Applications. IEEE Trans. Dependable Secur. Comput. 2023, 20, 2731–2745. [Google Scholar] [CrossRef]
- Dosovitskiy, A.; Ros, G.; Codevilla, F.; Lopez, A.; Koltun, V. CARLA: An Open Urban Driving Simulator. In Proceedings of the 1st Annual Conference on Robot Learning, CoRL 2017, Mountain View, CA, USA, 13–15 November 2017. [Google Scholar]
- Raveena, C.S.; Sravya, R.S.; Kumar, R.V.; Chavan, A. Sensor Fusion Module Using IMU and GPS Sensors For Autonomous Car. In Proceedings of the 2020 IEEE International Conference for Innovation in Technology (INOCON), Bengaluru, India, 6–8 November 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6. [Google Scholar]
- Yusefi, A.; Durdu, A.; Bozkaya, F.; Tığlıoğlu, Ş.; Yılmaz, A.; Sungur, C. A Generalizable D-VIO and Its Fusion With GNSS/IMU for Improved Autonomous Vehicle Localization. IEEE Trans. Intell. Veh. 2024, 9, 2893–2907. [Google Scholar] [CrossRef]
- Xia, X.; Hashemi, E.; Xiong, L.; Khajepour, A.; Xu, N. Autonomous Vehicles Sideslip Angle Estimation: Single Antenna GNSS/IMU Fusion With Observability Analysis. IEEE Internet Things J. 2021, 8, 14845–14859. [Google Scholar] [CrossRef]
- Zong, W.; Zhang, C.; Wang, Z.; Zhu, J.; Chen, Q. Architecture Design and Implementation of an Autonomous Vehicle. IEEE Access 2018, 6, 21956–21970. [Google Scholar] [CrossRef]
- Goberville, N.; El-Yabroudi, M.; Omwanas, M.; Rojas, J.; Meyer, R.; Asher, Z.; Abdel-Qader, I. Analysis of LiDAR and Camera Data in Real-World Weather Conditions for Autonomous Vehicle Operations. SAE Int. J. Adv. Curr. Pr. Mobil. 2020, 2, 2428–2434. [Google Scholar] [CrossRef]
- Raiyn, J. Performance Metrics for Positioning Terminals Based on a GNSS in Autonomous Vehicle Networks. Wirel. Pers. Commun. 2020, 114, 1519–1532. [Google Scholar] [CrossRef]
- Jing, H.; Gao, Y.; Shahbeigi, S.; Dianati, M. Integrity Monitoring of GNSS/INS Based Positioning Systems for Autonomous Vehicles: State-of-the-Art and Open Challenges. IEEE Trans. Intell. Transp. Syst. 2022, 23, 14166–14187. [Google Scholar] [CrossRef]
- Kamal, M.; Barua, A.; Vitale, C.; Laoudias, C.; Ellinas, G. GPS Location Spoofing Attack Detection for Enhancing the Security of Autonomous Vehicles. In Proceedings of the 2021 IEEE 94th Vehicular Technology Conference (VTC2021-Fall), Norman, OK, USA, 27–30 December 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–7. [Google Scholar]
- Liu, Z.; Wang, L.; Wen, F.; Zhang, H. IMU/Vehicle Calibration and Integrated Localization for Autonomous Driving. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 4013–4019. [Google Scholar]
- Wang, Z.; Zhang, Z.; Hu, X.; Zhu, W.; Deng, H. Extrinsic Calibration of Visual and Inertial Sensors for the Autonomous Vehicle. IEEE Sens. J. 2023, 23, 15934–15941. [Google Scholar] [CrossRef]
- Choi, J.D.; Kim, M.Y. A Sensor Fusion System with Thermal Infrared Camera and LiDAR for Autonomous Vehicles: Its Calibration and Application. In Proceedings of the 2021 Twelfth International Conference on Ubiquitous and Future Networks (ICUFN), Jeju Island, Republic of Korea, 17–20 August 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 361–365. [Google Scholar]
- Shahian Jahromi, B.; Tulabandhula, T.; Cetin, S. Real-Time Hybrid Multi-Sensor Fusion Framework for Perception in Autonomous Vehicles. Sensors 2019, 19, 4357. [Google Scholar] [CrossRef]
- Lundquist, C. Sensor Fusion for Automotive Applications. Ph.D. Thesis, Department of Electrical Engineering, Linköping University, Linköping, Sweden, 2011. [Google Scholar]
- Nobis, F.; Geisslinger, M.; Weber, M.; Betz, J.; Lienkamp, M. A Deep Learning-based Radar and Camera Sensor Fusion Architecture for Object Detection. In Proceedings of the 2019 Sensor Data Fusion: Trends, Solutions, Applications (SDF), Bonn, Germany, 15–17 October 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–7. [Google Scholar]
- Gu, S.; Zhang, Y.; Yang, J.; Alvarez, J.M.; Kong, H. Two-View Fusion based Convolutional Neural Network for Urban Road Detection. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–9 November 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 6144–6149. [Google Scholar]
- Pollach, M.; Schiegg, F.; Knoll, A. Low Latency And Low-Level Sensor Fusion For Automotive Use-Cases. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 6780–6786. [Google Scholar]
- Fayyad, J.; Jaradat, M.A.; Gruyer, D.; Najjaran, H. Deep Learning Sensor Fusion for Autonomous Vehicle Perception and Localization: A Review. Sensors 2020, 20, 4220. [Google Scholar] [CrossRef]
- Wang, X.; Li, K.; Chehri, A. Multi-Sensor Fusion Technology for 3D Object Detection in Autonomous Driving: A Review. IEEE Trans. Intell. Transp. Syst. 2023, 25, 1148–1165. [Google Scholar] [CrossRef]
- Shi, J.; Tang, Y.; Gao, J.; Piao, C.; Wang, Z. Multitarget-Tracking Method Based on the Fusion of Millimeter-Wave Radar and LiDAR Sensor Information for Autonomous Vehicles. Sensors 2023, 23, 6920. [Google Scholar] [CrossRef] [PubMed]
- Gao, L.; Xia, X.; Zheng, Z.; Ma, J. GNSS/IMU/LiDAR fusion for vehicle localization in urban driving environments within a consensus framework. Mech. Syst. Signal Process 2023, 205, 110862. [Google Scholar] [CrossRef]
- Xiang, C.; Feng, C.; Xie, X.; Shi, B.; Lu, H.; Lv, Y.; Yang, M.; Niu, Z. Multi-Sensor Fusion and Cooperative Perception for Autonomous Driving: A Review. IEEE Intell. Transp. Syst. Mag. 2023, 15, 36–58. [Google Scholar] [CrossRef]
- Hasanujjaman, M.; Chowdhury, M.Z.; Jang, Y.M. Sensor Fusion in Autonomous Vehicle with Traffic Surveillance Camera System: Detection, Localization, and AI Networking. Sensors 2023, 23, 3335. [Google Scholar] [CrossRef] [PubMed]
- Zhao, X.; Sun, P.; Xu, Z.; Min, H.; Yu, H. Fusion of 3D LIDAR and Camera Data for Object Detection in Autonomous Vehicle Applications. IEEE Sens. J. 2020, 20, 4901–4913. [Google Scholar] [CrossRef]
- Ogunrinde, I.; Bernadin, S. Deep Camera–Radar Fusion with an Attention Framework for Autonomous Vehicle Vision in Foggy Weather Conditions. Sensors 2023, 23, 6255. [Google Scholar] [CrossRef] [PubMed]
- Yao, S.; Guan, R.; Huang, X.; Li, Z.; Sha, X.; Yue, Y.; Lim, E.G.; Seo, H.; Man, K.L.; Zhu, X.; et al. Radar-Camera Fusion for Object Detection and Semantic Segmentation in Autonomous Driving: A Comprehensive Review. IEEE Trans. Intell. Veh. 2024, 9, 2094–2128. [Google Scholar] [CrossRef]
- Wang, S.; Mei, L.; Yin, Z.; Li, H.; Liu, R.; Jiang, W.; Lu, C.X. End-to-End Target Liveness Detection via mmWave Radar and Vision Fusion for Autonomous Vehicles. ACM Trans. Sens. Netw. 2024, 20, 1–26. [Google Scholar] [CrossRef]
- Kurniawan, I.T.; Trilaksono, B.R. ClusterFusion: Leveraging Radar Spatial Features for Radar-Camera 3D Object Detection in Autonomous Vehicles. IEEE Access 2023, 11, 121511–121528. [Google Scholar] [CrossRef]
- Banerjee, K.; Notz, D.; Windelen, J.; Gavarraju, S.; He, M. Online Camera LiDAR Fusion and Object Detection on Hybrid Data for Autonomous Driving. In Proceedings of the 2018 IEEE Intelligent Vehicles Symposium (IV), Changshu, China, 26–30 June 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1632–1638. [Google Scholar]
- Brena, R.F.; Aguileta, A.A.; Trejo, L.A.; Molino-Minero-Re, E.; Mayora, O. Choosing the Best Sensor Fusion Method: A Machine-Learning Approach. Sensors 2020, 20, 2350. [Google Scholar] [CrossRef]
- Kim, J.; Kim, J.; Cho, J. An advanced object classification strategy using YOLO through camera and LiDAR sensor fusion. In Proceedings of the 2019 13th International Conference on Signal Processing and Communication Systems (ICSPCS), Gold Coast, QLD, Australia, 16–18 December 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–5. [Google Scholar]
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.-Y.; Berg, A.C. SSD: Single Shot MultiBox Detector. In Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2016; pp. 21–37. [Google Scholar]
- Roth, M.; Jargot, D.; Gavrila, D.M. Deep End-to-end 3D Person Detection from Camera and Lidar. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 521–527. [Google Scholar]
- Mazher, K.U.; Heath, R.W.; Gulati, K.; Li, J. Automotive Radar Interference Characterization and Reduction by Partial Coordination. In Proceedings of the 2020 IEEE Radar Conference (RadarConf20), Florence, Italy, 21–25 September 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6. [Google Scholar]
- Hakobyan, G.; Yang, B. High-Performance Automotive Radar: A Review of Signal Processing Algorithms and Modulation Schemes. IEEE Signal Process Mag. 2019, 36, 32–44. [Google Scholar] [CrossRef]
- Bechter, J.; Rameez, M.; Waldschmidt, C. Analytical and Experimental Investigations on Mitigation of Interference in a DBF MIMO Radar. IEEE Trans. Microw. Theory Tech. 2017, 65, 1727–1734. [Google Scholar] [CrossRef]
- Bechter, J.; Sippel, C.; Waldschmidt, C. Bats-inspired frequency hopping for mitigation of interference between automotive radars. In Proceedings of the 2016 IEEE MTT-S International Conference on Microwaves for Intelligent Mobility (ICMIM), San Diego, CA, USA, 19–20 May 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 1–4. [Google Scholar]
- Greco, M.S.; Gini, F.; Stinco, P.; Bell, K. Cognitive Radars: On the Road to Reality: Progress Thus Far and Possibilities for the Future. IEEE Signal Process Mag. 2018, 35, 112–125. [Google Scholar] [CrossRef]
- Khoury, J.; Ramanathan, R.; McCloskey, D.; Smith, R.; Campbell, T. RadarMAC: Mitigating Radar Interference in Self-Driving Cars. In Proceedings of the 2016 13th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON), London, UK, 27–30 June 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 1–9. [Google Scholar]
- Tebaldini, S.; Manzoni, M.; Tagliaferri, D.; Rizzi, M.; Monti-Guarnieri, A.V.; Prati, C.M.; Spagnolini, U.; Nicoli, M.; Russo, I.; Mazzucco, C. Sensing the Urban Environment by Automotive SAR Imaging: Potentials and Challenges. Remote Sens. 2022, 14, 3602. [Google Scholar] [CrossRef]
- Schurwanz, M.; Oettinghaus, S.; Mietzner, J.; Hoeher, P.A. Reducing On-Board Interference and Angular Ambiguity Using Distributed MIMO Radars in Medium-Sized Autonomous Air Vehicle. IEEE Aerosp. Electron. Syst. Mag. 2024, 39, 4–14. [Google Scholar] [CrossRef]
- Fürst, S. Scalable, safe and multi-OEM capable architecture for autonomous driving. In Proceedings of the 9th Vector Congress, Stuttgart, Germany, 21 November 2018. [Google Scholar]
- Hanselaar, C.A.J.; Silvas, E.; Terechko, A.; Heemels, W.P.M.H. Detection and Mitigation of Functional Insufficiencies in Autonomous Vehicles: The Safety Shell. In Proceedings of the 2022 IEEE 25th International Conference on Intelligent Transportation Systems (ITSC), Macau, China, 8–12 October 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 2021–2027. [Google Scholar]
- Boubakri, A.; Mettali Gamar, S. A New Architecture of Autonomous Vehicles: Redundant Architecture to Improve Operational Safety. Int. J. Robot. Control Syst. 2021, 1, 355–368. [Google Scholar] [CrossRef]
- Cai, J.-F.; Ji, H.; Liu, C.; Shen, Z. Blind motion deblurring from a single image using sparse approximation. In Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 104–111. [Google Scholar]
- Fang, L.; Liu, H.; Wu, F.; Sun, X.; Li, H. Separable Kernel for Image Deblurring. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 2885–2892. [Google Scholar]
- Eigen, D.; Krishnan, D.; Fergus, R. Restoring an Image Taken through a Window Covered with Dirt or Rain. In Proceedings of the 2013 IEEE International Conference on Computer Vision, Sydney, Australia, 1–8 December 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 633–640. [Google Scholar]
- Gu, J.; Ramamoorthi, R.; Belhumeur, P.; Nayar, S. Removing image artifacts due to dirty camera lenses and thin occluders. In Proceedings of the ACM SIGGRAPH Asia 2009 Papers, Yokohama, Japan, 16–19 December 2009; ACM: New York, NY, USA, 2009; pp. 1–10. [Google Scholar]
- Kondou, M. Condensation Prevention Camera Device. U.S. Patent 9,525,809, 20 December 2016. [Google Scholar]
- Bhagavathy, S.; Llach, J.; Zhai, J. Multi-Scale Probabilistic Dithering for Suppressing Banding Artifacts in Digital Images. In Proceedings of the 2007 IEEE International Conference on Image Processing, San Antonio, TX, USA, 16–19 September 2007; IEEE: Piscataway, NJ, USA, 2007; pp. IV–397–IV–400. [Google Scholar]
- Cho, C.-Y.; Chen, T.-M.; Wang, W.-S.; Liu, C.-N. Real-Time Photo Sensor Dead Pixel Detection for Embedded Devices. In Proceedings of the 2011 International Conference on Digital Image Computing: Techniques and Applications, Noosa, QLD, Australia, 6–8 December 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 164–169. [Google Scholar]
- Zamfir, A.; Drimbarean, A.; Zamfir, M.; Buzuloiu, V.; Steinberg, E.; Ursu, D. An optical model of the appearance of blemishes in digital photographs. In Digital Photography III; Martin, R.A., DiCarlo, J.M., Sampat, N., Eds.; SPIE: Bellingham, WA, USA, 2007; p. 65020I. [Google Scholar]
- Chung, S.-W. Removing chromatic aberration by digital image processing. Opt. Eng. 2010, 49, 067002. [Google Scholar] [CrossRef]
- Kang, S.B. Automatic Removal of Chromatic Aberration from a Single Image. In Proceedings of the 2007 IEEE Conference on Computer Vision and Pattern Recognition, Minneapolis, MN, USA, 17–22 June 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 1–8. [Google Scholar]
- Grossmann, B.; Eldar, Y.C. An efficient method for demosaicing. In Proceedings of the 2004 23rd IEEE Convention of Electrical and Electronics Engineers in Israel, Tel-Aviv, Israel, 6–7 September 2004; IEEE: Piscataway, NJ, USA, 2004; pp. 436–439. [Google Scholar]
- Shi, L.; Ovsiannikov, I.; Min, D.-K.; Noh, Y.; Kim, W.; Jung, S.; Lee, J.; Shin, D.; Jung, H.; Waligorski, G.; et al. Demosaicing for RGBZ Sensor. In Computational Imaging XI; Bouman, C.A., Pollak, I., Wolfe, P.J., Eds.; SPIE: Bellingham, WA, USA, 2013; p. 865705. [Google Scholar]
- Prescott, B.; McLean, G.F. Line-Based Correction of Radial Lens Distortion. Graph. Models Image Process. 1997, 59, 39–47. [Google Scholar] [CrossRef]
- Yoneyama, S. Lens distortion correction for digital image correlation by measuring rigid body displacement. Opt. Eng. 2006, 45, 023602. [Google Scholar] [CrossRef]
- Sawhney, H.S.; Kumar, R. True multi-image alignment and its application to mosaicing and lens distortion correction. IEEE Trans. Pattern Anal. Mach. Intell. 1999, 21, 235–243. [Google Scholar] [CrossRef]
- Baek, Y.-M.; Cho, D.-C.; Lee, J.-A.; Kim, W.-Y. Noise Reduction for Image Signal Processor in Digital Cameras. In Proceedings of the 2008 International Conference on Convergence and Hybrid Information Technology, Daejeon, Republic of Korea, 28–30 August 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 474–481. [Google Scholar]
- Guo, L.; Manglani, S.; Liu, Y.; Jia, Y. Automatic Sensor Correction of Autonomous Vehicles by Human-Vehicle Teaching-and-Learning. IEEE Trans. Veh. Technol. 2018, 67, 8085–8099. [Google Scholar] [CrossRef]
- Deng, D. DBSCAN Clustering Algorithm Based on Density. In Proceedings of the 2020 7th International Forum on Electrical Engineering and Automation (IFEEA), Hefei, China, 25–27 September 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 949–953. [Google Scholar]
- Gao, X.; Roy, S.; Xing, G. MIMO-SAR: A Hierarchical High-Resolution Imaging Algorithm for mmWave FMCW Radar in Autonomous Driving. IEEE Trans. Veh. Technol. 2021, 70, 7322–7334. [Google Scholar] [CrossRef]
- Manzoni, M.; Tebaldini, S.; Monti-Guarnieri, A.V.; Prati, C.M. Multipath in Automotive MIMO SAR Imaging. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–12. [Google Scholar] [CrossRef]
- Qian, K.; Zhang, X. SAR on the Wheels: High-Resolution Synthetic Aperture Radar Sensing on Moving Vehicles. In Proceedings of the 2023 57th Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, USA, 29 October–1 November 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1204–1209. [Google Scholar]
- Park, S.; Kim, Y.; Matson, E.T.; Smith, A.H. Accessible synthetic aperture radar system for autonomous vehicle sensing. In Proceedings of the 2016 IEEE Sensors Applications Symposium (SAS), Catania, Italy, 20–22 April 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 1–6. [Google Scholar]
- Harrer, F.; Pfeiffer, F.; Loffler, A.; Gisder, T.; Biebl, E. Synthetic aperture radar algorithm for a global amplitude map. In Proceedings of the 2017 14th Workshop on Positioning, Navigation and Communications (WPNC), Bremen, Germany, 25–26 October 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1–6. [Google Scholar]
- Bharilya, V.; Kumar, N. Machine learning for autonomous vehicle’s trajectory prediction: A comprehensive survey, challenges, and future research directions. Veh. Commun. 2024, 46, 100733. [Google Scholar] [CrossRef]
Table: Advantages and limitations/weaknesses of the perception-layer sensors (ultrasonic sensors, radar, LiDAR, cameras, GNSS, and IMU).
Sensor Type | Failure | Impact
---|---|---
Ultrasonic sensors | Wrong perception due to interference between multiple sensors. | Extreme range errors due to overlapping ultrasonic signals. Requires unique identification to reject false echoes.
Radar | False positives due to bounced waves. | This can lead to incorrect object detection or classification due to reflected signals from the environment.
Radar | Wrong perception due to frequency interference from multiple radars. | Shared-frequency interference may cause inaccuracies in object detection and tracking.
LiDAR | Detection performance degradation due to adverse weather conditions. | Reduced effectiveness in fog, rain, or snow, leading to incomplete or inaccurate spatial data.
LiDAR | Missing or wrong perception due to reflection from mirrors or highly reflective surfaces. | This can result in faulty maps or missing data due to the laser beams being completely reflected.
Camera | Poor object detection due to variability in lighting conditions. | Performance can be significantly impaired in varying light conditions, leading to poor object detection.
Camera | Image degradation due to rain, snow, or fog. | This can result in blurred or obscured images, affecting the accuracy of perception tasks.
Camera | Misinterpretation in ADAS due to degraded images. | Degraded images can lead to AV collisions if the AI/ML systems fail to properly interpret the information.
GNSS | Timing errors due to clock differences. | This can affect the accuracy of location information, leading to incorrect positioning.
GNSS | Susceptibility to jamming and spoofing. | This can lead to loss of navigation accuracy or misdirection if the GNSS signals are blocked or falsified.
GNSS | Multipath effect and satellite orbit uncertainties. | This can lead to errors in location determination due to signal reflections and orbital inaccuracies.
IMU | Error accumulation and drift. | Errors in acceleration and rotational data can lead to inaccuracies in vehicle movement and orientation over time.
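The IMU row above can be made concrete with a small dead-reckoning example: a constant accelerometer bias, integrated once into velocity and twice into position, produces an error that grows linearly and quadratically with time, respectively. The bias and sampling values below are arbitrary assumptions chosen only to illustrate the effect.

```python
import numpy as np

# Dead-reckoning drift from a constant accelerometer bias (illustrative values).
bias = 0.01          # accelerometer bias in m/s^2 (assumed)
dt = 0.01            # sampling period in seconds (100 Hz, assumed)
duration = 60.0      # one minute of open-loop integration

t = np.arange(0.0, duration, dt)
accel_error = np.full_like(t, bias)        # measured minus true acceleration
vel_error = np.cumsum(accel_error) * dt    # first integration: velocity drift
pos_error = np.cumsum(vel_error) * dt      # second integration: position drift

# Position error grows roughly as 0.5 * bias * t^2: about 18 m after 60 s here.
print(f"velocity error after {duration:.0f} s: {vel_error[-1]:.2f} m/s")
print(f"position error after {duration:.0f} s: {pos_error[-1]:.2f} m")
```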
Factors | Best Sensor
---|---
Range | Radar |
Resolution | Camera |
Distance Accuracy | LiDAR |
Velocity | Radar |
Color Perception (e.g., traffic lights) | Camera |
Object Detection | LiDAR |
Object Classification | Camera |
Lane Detection | Camera |
Obstacle Edge Detection | Camera and LiDAR |
Algorithm | Characteristics
---|---
YOLO | You Only Look Once (YOLO) is a single-stage detector that uses a single convolutional neural network (CNN) to predict bounding boxes and compute class probabilities and confidence scores for an image [68].
SSD | The Single-Shot Multibox Detector (SSD) is a single-stage CNN detector that converts bounding boxes into a collection of boxes with different sizes and aspect ratios to detect obstacles of various dimensions [69].
VoxelNet | VoxelNet is a generic 3D obstacle detection network that combines feature extraction and bounding box prediction into a single-stage, fully trainable deep network. It detects obstacles using a voxelized technique based on point cloud data [70].
PointNet | PointNet is a permutation-invariant deep neural network that learns global features from unordered point clouds using a two-stage detection approach [70].
Component | Failure | Mitigations
---|---|---
Lens | Brightness | Brightness, if detected, can be compensated to some degree in post-processing. Images that are completely black or white are easy to detect, but recovering the original image is difficult.
Lens | Blur | Several approaches exist for eliminating or correcting image blur [82,83]. For example, in [82], the authors formulate blind deblurring as a joint optimization problem that optimizes the sparsity of both the blur kernel and the clear image.
Lens | Broken Lens | Image processing can detect a broken lens, but reconstructing a clean image can be difficult.
Lens | Dirty | Image processing software can remove localized rain and soil effects from a single image [84]. For example, physics-based approaches [85] can remove dust and dirt from digital photos and videos.
Lens | Flare | Lens artifacts such as flare and ghosting can be minimized or eliminated from a single input image during post-processing.
Lens | Rain | Rain effects can be mitigated with the same techniques used for a dirty lens [84].
Lens | Condensation | Several studies, including [86], address avoiding or eliminating condensation inside cameras.
Image Sensor | Banding | There are numerous strategies to reduce the visual effects of banding, such as using dithering patterns [87].
Image Sensor | Dead Pixel | Dead pixels can be detected using solutions that can be implemented on an embedded device [88].
Image Sensor | Spots | There are many image-processing approaches to eliminate spots; for example, [89] identifies spots and computes their physical appearance (size, shape, position, and transparency) in an image based on camera parameters.
Image Sensor | Chromatic Aberration | Image processing can help reduce chromatic aberration [90,91].
Image Sensor | Mosaic | Although efficient demosaicing methods exist [92,93], it is difficult to recover the image when demosaicing fails.
Image Sensor | Distortion | Lens distortion can be detected and measured [94] and corrected using image-processing methods [95,96].
Image Sensor | Noise, Sharpness | There are several methods to reduce or eliminate noise and sharpening artifacts, including commercial tools. There are also methods that work directly at the sensor level, e.g., [97,98].
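As an illustration of the dead-pixel row above, the following sketch flags pixels that deviate strongly from the median of their local neighbourhood across several consecutive frames. The window size, deviation threshold, and frame count are assumptions made for this example and are not taken from [88].

```python
import numpy as np
from scipy.ndimage import median_filter

def find_dead_pixels(frames: np.ndarray, deviation: float = 50.0, min_frames: int = 3) -> np.ndarray:
    """Flag candidate dead/stuck pixels in a stack of grayscale frames.

    A pixel is suspicious in one frame if it differs from the median of its
    3x3 neighbourhood by more than `deviation` grey levels; it is reported as
    dead/stuck only if it is suspicious in at least `min_frames` frames.
    """
    suspicious_counts = np.zeros(frames.shape[1:], dtype=int)
    for frame in frames.astype(float):
        neighbourhood_median = median_filter(frame, size=3)
        suspicious_counts += np.abs(frame - neighbourhood_median) > deviation
    return suspicious_counts >= min_frames

# Usage: five synthetic 100x100 frames with one pixel stuck at full brightness.
frames = np.random.randint(90, 110, size=(5, 100, 100)).astype(np.uint8)
frames[:, 40, 60] = 255                      # simulated stuck pixel
dead_mask = find_dead_pixels(frames)
print(np.argwhere(dead_mask))                # expected to contain [40 60]
```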