Article

The Effect of Rainfall and Illumination on Automotive Sensors Detection Performance

1 Institute of Automotive Engineering, Graz University of Technology, 8010 Graz, Austria
2 DigiTrans GmbH, 4020 Linz, Austria
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(9), 7260; https://doi.org/10.3390/su15097260
Submission received: 16 March 2023 / Revised: 9 April 2023 / Accepted: 25 April 2023 / Published: 27 April 2023
(This article belongs to the Collection Emerging Technologies and Sustainable Road Safety)

Abstract
Vehicle safety promises to be one of the biggest benefits of Advanced Driver Assistance Systems (ADAS). Higher levels of automation remove the human driver from the chain of events that can lead to a crash. Sensors play an influential role both in manual driving and in ADAS by monitoring the vehicle's surroundings, and they substantially reduce the driver's workload in steering, accelerating and braking during long-term driving. The development of future intelligent vehicles relies even more on the fusion of data from surrounding sensors such as Camera, LiDAR and Radar. These sensors must perceive reliably not only in clear weather but also under adverse weather and illumination conditions; otherwise, a small error can have an incalculable impact on ADAS. Most current studies are based on indoor or static testing. To address this gap, this paper designs a series of dynamic test cases with the help of an outdoor rain plant and intelligent lighting simulation facilities to make the sensor application scenarios more realistic. On this basis, the effect of rainfall and illumination on sensor perception performance is investigated. As expected, the performance of all automotive sensors is degraded by adverse environmental factors, but their behaviour is not identical. Future work on sensor model development and sensor information fusion should therefore take this into account.

1. Introduction

The causes of traffic accidents can be assigned to three categories: driver-related, vehicle-related and environment-related critical causes [1]. According to the Stanford Center for Internet and Society, “ninety percent of motor vehicle crashes are caused at least in part by human error” [2]. Thus, in order to eliminate driver-related factors, the demand for autonomous driving vehicles has primarily driven the development of ADAS. Vehicle-related factors are mainly related to the robustness of vehicle components: if the data coming from the sensors are not accurate or reliable, they can corrupt everything downstream in the ADAS. Finally, environment-related factors also raise challenges for road safety. For example, in a dataset of traffic accidents collected in a Chinese city from 2014 to 2016, approximately 30.5% of accidents were related to harsh weather and illumination conditions [3]. For these reasons, automotive manufacturers place a very high priority on the development of safety systems. A reliable and safe ADAS can prevent accidents and reduce the risk of injury to vehicle occupants and vulnerable road users. To fulfil this requirement, the sensors must be highly robust and operate in real time while also being able to cope with adverse weather and lighting conditions. As a result, multi-sensor fusion solutions based on Camera, LiDAR and Radar are widely used in higher-level automated driving for a powerful interpretation of a vehicle's surroundings [4,5,6].
According to road traffic accident severity analyses [3,7,8], late-night and adverse-weather accidents are more fatal than other traffic accidents. Driving at night under low illumination and in rainfall proves to be the most critical combination, leading to the highest number of accidents, fatalities and injuries. In the state of the art, several studies have outlined the impact of these environmental factors on sensor performance [9,10,11]. Regarding illumination, LiDAR and Radar are active sensors that do not depend on sunlight for perception and measurement, as summarized in [9]. In contrast, the Camera is a passive sensor affected by illumination, which raises the problem of image saturation [12]. The Camera is mainly responsible for traffic lane detection, which relies on the difference in grey values between the road surface and lane boundary points; the grayscale gradient varies with illumination intensity [13]. The study in [14] demonstrated that artificial illumination is a factor in detection accuracy, and object detection used in ADAS is also sensitive to illumination [15]. Therefore, it is important to build a perception system with multiple sensors rather than depending on a single one.
Unlike illumination, which mainly affects the Camera, the negative effects of rainfall must be taken into account for all vehicle sensors, since rain is a frequent adverse condition. As shown in [16], raindrops on the lens cause noise in the captured image, resulting in poor object recognition performance. Although the wipers remove raindrops to preserve Camera perception performance, the sight distance varies with rainfall intensity to the extent that the ADAS function may be suspended [17]. Studies on the influence of rainfall on LiDAR used in ADAS likewise show sensitivity to rain: as rainfall intensity increases, the received laser power and the number of point cloud returns decrease, reducing object recognition because LiDAR perception depends on the received point cloud data [18,19]. This effect is mostly caused by water absorption in the near-infrared spectral band, and experimental evidence indicates that rainfall reduces the relative intensity of the point cloud [10]. Although Radar is more environmentally tolerant than LiDAR, it is subject to radio attenuation due to rainfall [20]. Compared to normal conditions, simulation results show that the detection range drops to 45% under heavy rainfall of 150 mm/h [21]; a similar phenomenon is confirmed in [22]. A humid environment can also cause a water film to form on the radome, which affects the propagation of electromagnetic waves at microwave frequencies and leads to considerable loss [23]. The second major cause of Radar signal attenuation is the interaction of electromagnetic waves with rain in the propagation medium; several studies have obtained quantitative data demonstrating that precipitation generally attenuates electromagnetic wave propagation at millimetre-wave frequencies [24,25]. Therefore, the negative impact of rainfall directly affects the recognition capability of the perception system, which can result in ADAS functions being downgraded or disabled.
No sensor is perfect in harsh environmental conditions. Several scientific studies already report experimental results for sensors in different environments and give quantitative data. However, in most cases these experiments are carried out under static or indoor conditions [10,11,19,26,27], which makes it difficult to comprehensively evaluate sensor performance from such laboratory data alone. In real road traffic, vehicles equipped with sensors drive dynamically and ADAS must cope with various environmental factors at different speeds. To compensate for these limitations, in this study we design a series of dynamic test cases under different illumination and rainfall conditions. In addition, we replicate day-to-day traffic scenarios, such as cutting in, following and overtaking, rather than a single longitudinal test. The study statistically evaluates sensor detection data collected on a proving ground for autonomous driving. This allows a more comprehensive and realistic comparison of experimental data from different sensors in adverse environments, and we discuss the main barriers to the development of ADAS.
The outline of the subsequent sections of this paper is as follows: The proving ground and test facilities are introduced in Section 2. Section 3 presents the methodology for test case implementation. Section 4 demonstrates the statistics from real sensor measurement and evaluation for main automotive sensors. Limitations of sensors for ADAS are discussed in Section 5. Finally, a conclusion is provided in Section 6.

2. Test Facilities

The proving ground and test facilities used for our measurements at the DigiTrans test track are introduced in this section. This proving ground is designed to replicate realistic driving conditions and provide a controlled environment for testing autonomous driving systems; it enables the simulation of various environmental conditions to test the detection performance of the sensors under different scenarios. Further details regarding the test track are provided in Section 2.1. Section 2.2 introduces the three commonly used sensor types tested in our experiments, which are widely used in the current automotive industry and whose detection performance under adverse weather conditions is of great interest. Section 2.3 introduces the ground truth system used to analyze the sensor detection errors and to evaluate the accuracy and reliability of the sensors in detecting the surrounding environment.

2.1. Test Track

DigiTrans is a test environment located in Austria that collaborates with national and international partners to furnish expertise and testing infrastructure while supporting testing, validation, research and implementation of automated applications within the realm of municipal services, logistics and heavy goods transport. DigiTrans expanded the decades-old testing site in St. Valentin (see [28]) in multiple phases to meet the demands of testing automated and autonomous vehicles.
Our study focuses on the influence of adverse weather conditions on automotive sensors. Testing these technologies in a suitable, realistic and reproducible test environment is absolutely necessary to ensure functional capability and to increase the road safety of ADAS and AD systems. To create this test environment, DigiTrans has built a unique outdoor rain plant (see Figure 1) that provides important insights into how natural precipitation conditions affect the performance of optical sensors and into how the characteristics of natural rain can be replicated.
The outdoor rain plant covers a total length of 100 m with a lane width of 6 m. It is designed and built to replicate natural rain characteristics in a reproducible manner. Figure 2 shows a cross-section of the rain plant in a longitudinal driving direction.
The characteristics of rain are predominantly delineated by its intensity, homogeneity distribution, droplet size distribution and droplet velocities. Rain intensity refers to the average amount of water per unit of time (e.g., mm/h). The homogeneity distribution describes the spatial distribution of rain within a specific wetted area, with its mean value being the intensity. Droplet sizes are measured as the mass distribution of different droplet sizes within a defined volume, with typical diameters ranging from 0.5 to 5 mm; further technical information can be found in [29]. It should be mentioned that natural rain droplets do not have the classical teardrop shape [30]. The fourth characteristic is droplet speed, which varies according to droplet size and weight-to-drag ratio, as described in [31]. The present study conducted tests under two rain intensities, 25 and 100 mm/h, corresponding to mid- and high-intensity rain based on internationally accepted definitions [32].

2.2. Tested Sensors

In our research, we primarily focus on the performance of three sensors (Camera, Radar, LiDAR) widely used in the automotive industry. Table 1 shows their specific parameters and performance. In the experiment, the sensors and measurement systems were integrated into the measurement vehicle, referred to as the “ego car”, shown in Figure 3. The vehicle's motion trajectory and dynamics data were recorded using GPS-RTK positioning (a Novatel OEM-6-RT2 receiver) and the GENESYS Automotive Dynamic Motion Analyser (ADMA). The GPS-RTK system was also used to provide global time synchronization, ensuring that the sensor detection data transmitted on the bus were aligned, which simplifies post-processing of the data.
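To illustrate how a GPS-synchronized global time base can be used to align detections from the different sensor buses, the following Python sketch interpolates one sensor's track onto a common reference timeline. This is a minimal sketch; the sampling rates, field layout and function names are illustrative assumptions, not the actual logging format used in the experiments.

```python
import numpy as np

def align_to_reference(ref_times, sensor_times, sensor_values):
    """Resample one sensor's measurements onto the reference (GPS) timeline.

    ref_times     : 1-D array of GPS-synchronized reference timestamps [s]
    sensor_times  : 1-D array of timestamps of this sensor's messages [s]
    sensor_values : 1-D array of the measured quantity (e.g., lateral distance) [m]
    """
    # Linear interpolation; outside the sensor's own time span the values are
    # held at the boundary samples and should be masked if necessary.
    return np.interp(ref_times, sensor_times, sensor_values)

# Hypothetical example: a 20 Hz camera track aligned to a 100 Hz reference timeline.
ref_t = np.arange(0.0, 2.0, 0.01)                 # reference timeline [s]
cam_t = np.arange(0.0, 2.0, 0.05)                 # camera message times [s]
cam_y = 0.1 * np.sin(2 * np.pi * 0.5 * cam_t)     # lateral distance signal [m]
cam_on_ref = align_to_reference(ref_t, cam_t, cam_y)
```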

2.3. Ground Truth Definition

In Figure 3, the RoboSense RS-Reference system is mounted on the roof of the vehicle and provides the ground truth data for the measurements; we quantify the detection errors of the sensors against these data. It is a high-precision reference system designed to accurately evaluate the performance of LiDAR, Radar and Camera systems, providing a reliable standard for comparison and ensuring the accuracy and consistency of the test results. The RS-Reference system uses advanced algorithms and sensors to provide precise and accurate data for multiple targets, which allows high-accuracy detection and tracking of objects even in challenging environments and conditions. Because multiple vehicles are involved in our test scenarios, inertial measurement systems relying on GPS-RTK are unsuitable, as they can only be installed on one target vehicle. In our case, multiple object tracking is essential, which is why we opted for the RS-Reference.
To validate the measurement accuracy of the RS-Reference system, we compared it with the ADMA, which enables highly accurate positioning with an accuracy of up to 1 cm and therefore serves as the benchmark. The target and ego cars were equipped with ADMA testing equipment during the entire testing process, and the RS-Reference was additionally installed on the ego car. Due to the difference in reference frames, the reference points for the ADMA and RS-Reference on the ego car were calibrated to the midpoint of the rear axle. On the target car, the ADMA coordinate system was transformed to the centre point of the bumper to serve as a reference, which is consistent with the measurement information provided by the RS-Reference. The accuracy information is summarized in Table 2. The RS-Reference does not perform as accurately as the ADMA in longitudinal displacement; however, our designed test cases require multiple targets to be tracked, and with limited testing equipment the RS-Reference meets this multi-target detection need and greatly simplifies subsequent post-processing. Therefore, the measurement information from the RS-Reference can serve as ground truth to support the evaluation of the other sensors' performance.
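Because the ADMA and RS-Reference outputs refer to different points on the vehicles (rear-axle midpoint on the ego car, bumper centre on the target), comparing them requires moving all measurements to a common reference point. The snippet below is a minimal 2D rigid-transform sketch of that calibration step; the lever-arm value and function name are assumptions for illustration, not the actual calibration used in the experiments.

```python
import numpy as np

def to_reference_point(x, y, yaw, lever_arm):
    """Shift a measured position (x, y) from a sensor's native reference point
    to another point on the same rigid body.

    x, y      : measured position in the common (ground) frame [m]
    yaw       : heading of the body the point belongs to [rad]
    lever_arm : (dx, dy) offset from the native point to the desired reference
                point, expressed in the body frame [m]
    """
    dx, dy = lever_arm
    # Rotate the body-frame offset into the ground frame and add it.
    x_ref = x + dx * np.cos(yaw) - dy * np.sin(yaw)
    y_ref = y + dx * np.sin(yaw) + dy * np.cos(yaw)
    return x_ref, y_ref

# Hypothetical lever arm: measurement unit mounted 1.2 m ahead of the rear-axle midpoint.
x_ref, y_ref = to_reference_point(x=10.0, y=2.0, yaw=0.05, lever_arm=(-1.2, 0.0))
```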

3. Test Methodology

The methodology for test case implementation involved a total of seven different manoeuvres in the overall manoeuvre matrix. These scenarios were carefully selected to replicate real-life driving situations, and each test scenario considers both low and high vehicle speeds to cover a common repertoire of manoeuvres. One main research question of this project is to assess the influence of different day/night and weather conditions on sensors such as LiDAR, Radar and Camera. Figure 4 shows the test matrix with all day/night and weather conditions. The daytime experiments were performed on a dry or waterlogged road as well as under moderate and heavy rain. The nighttime tests were all conducted under good artificial lighting, with the weather conditions being a dry road and simulated moderate and heavy rain. All scenarios were performed on the dynamic driving track underneath the rain plant of the proving ground (see Section 2), ensuring consistent experimental conditions.
In total, 276 test cases were performed. Each weather condition consists of seven manoeuvres with two speed variations, and each variation was repeated three times to allow an additional statistical evaluation. To get as close as possible to automated driving behaviour, every vehicle equipped with Adaptive Cruise Control (ACC) had it activated while driving; hence, distance keeping to the front target and the vehicle's acceleration and deceleration were controlled by the ACC. Since the rain simulator is only 100 m long, the main part of each manoeuvre had to be performed underneath it. Table 3 illustrates the entire test matrix with pictograms and a short description.

4. Results and Evaluation

After a series of post-processing steps, we obtained 278 valid measurement cases, with each sensor contributing more than 80,000 detections, which provide the statistics for the evaluation of the main automotive sensors. In this section, we present a quantitative analysis of each sensor's detection performance. Since the distance covered by the rain simulator is only 80 m, the collected data are filtered based on GPS location information to ensure that all test results are produced within the coverage area of the rain simulator. For rainfall simulation, we split the measurements into moderate and heavy rain conditions with intensities of 25 mm/h and 100 mm/h, respectively. The artificial illumination condition is also considered in our tests. Additionally, we discuss the influence of detection distance on the results; however, due to the rain simulator's length limitation, the effect of environmental factors on sensor detection is not considered in that part of the results. As introduced in Section 3, the test scenarios are divided into two parts: daytime and nighttime. The daytime tests are further divided into dry and wet road conditions, as well as moderate and heavy rainfall conditions simulated using the rain simulator. The nighttime tests focus only on dry road conditions and moderate rainfall, given the test conditions. This approach thoroughly evaluates the sensors' performance under different weather and lighting conditions.
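As a sketch of the GPS-based filtering step described above, the following snippet keeps only the samples recorded inside the rain simulator's coverage area, approximated here as a simple rectangle in a local East-North frame. The bounds and variable names are illustrative assumptions rather than the actual geofence used in post-processing.

```python
import numpy as np

def inside_rain_plant(east, north, e_min, e_max, n_min, n_max):
    """Boolean mask for samples whose local (east, north) position lies
    within the rectangular footprint of the rain simulator."""
    east = np.asarray(east)
    north = np.asarray(north)
    return (east >= e_min) & (east <= e_max) & (north >= n_min) & (north <= n_max)

# Hypothetical local coordinates of ego-vehicle samples [m].
east = np.array([-5.0, 10.0, 40.0, 75.0, 120.0])
north = np.array([0.5, 1.0, 0.8, 1.2, 0.9])

# Assumed footprint: 80 m long, 6 m wide, starting at east = 0.
mask = inside_rain_plant(east, north, 0.0, 80.0, -3.0, 3.0)
filtered_east = east[mask]   # the samples at 10, 40 and 75 m are kept
```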
According to the guide to the expression of uncertainty in measurement [36], the detection error can be defined as in Equation (1), where Data_measured is the sensor measurement output, Data_reference is the ground truth and ε_uncertainty denotes the measurement error. After obtaining a series of detection errors for the corresponding sensors, we quantify the Interquartile Range (IQR) of the boxplot and the number of outliers to indicate the detection capability of the sensor.
$$\varepsilon_{\mathrm{uncertainty}} = \mathrm{Data}_{\mathrm{measured}} - \mathrm{Data}_{\mathrm{reference}} \tag{1}$$
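The following Python sketch shows how the detection error of Equation (1) and the boxplot statistics reported below (IQR and number of outliers) can be computed from paired sensor and ground-truth samples. The 1.5 × IQR whisker rule is the usual boxplot convention and is assumed here; it is not stated explicitly in the paper, and the sample values are purely illustrative.

```python
import numpy as np

def detection_error(data_measured, data_reference):
    """Equation (1): epsilon_uncertainty = Data_measured - Data_reference."""
    return np.asarray(data_measured) - np.asarray(data_reference)

def boxplot_stats(errors, whisker=1.5):
    """Return the interquartile range and the number of outliers of an error
    sample, using the conventional 1.5*IQR whisker rule (assumed)."""
    q1, q3 = np.percentile(errors, [25, 75])
    iqr = q3 - q1
    lower, upper = q1 - whisker * iqr, q3 + whisker * iqr
    n_outliers = int(np.sum((errors < lower) | (errors > upper)))
    return iqr, n_outliers

# Hypothetical lateral-distance samples [m] from one sensor and the reference system.
measured = np.array([1.98, 2.03, 2.10, 1.95, 2.60, 2.01])
reference = np.array([2.00, 2.00, 2.00, 2.00, 2.00, 2.00])
iqr, n_out = boxplot_stats(detection_error(measured, reference))
```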
In this study, our sole focus is on the sensors' lateral distance detection performance. Autonomous vehicles rely on sensors to detect and respond to their surroundings, and lateral distance detection is essential in this process: compared to longitudinal detection-related functions, accurate lateral detection enables the vehicle to maintain a safe and stable driving path, which is critical to ensuring the safety of passengers and other road users. By continuously monitoring the vehicle's position in relation to its lane and the surrounding vehicles, an autonomous vehicle can make real-time adjustments to its driving path and speed to maintain a safe and stable driving experience. This information is also used by the vehicle's control systems to make decisions about lane changes, merging and navigating curves and intersections. A typical example is Baidu Apollo, the world's largest autonomous driving platform, which provides trajectory planning via the EM planner [37]. Therefore, the results for the other sensor outputs are presented in Appendix A, while the following sections focus solely on the sensors' lateral distance detection performance.

4.1. Camera

Cameras are currently widely utilized in the field of automotive safety. Hasirlioglu et al. [10] demonstrated through a series of experiments that intense rainfall causes a loss of information between the Camera sensor and the object, which cannot be fully retrieved in real time. Meanwhile, Borkar et al. [14] showed that the presence of artificial lighting can be a distraction factor that makes lane detection very difficult. In addition, Koschmieder's model, widely used over the last century, describes visibility as inversely proportional to the extinction coefficient of air [38]. Following [39], this model can be written as Equation (2).
$$I(x,\lambda) = e^{-\beta(\lambda) d(x)} \cdot J(x,\lambda) + \left[1 - e^{-\beta(\lambda) d(x)}\right] \cdot A(\lambda) \tag{2}$$
where x denotes the horizontal and vertical coordinates of the pixel, λ the wavelength of visible light, β the extinction coefficient of the atmosphere and d the scene depth. Furthermore, I and J denote the scene radiance of the observed and clear images, respectively, depending on x and λ. The last term, A, indicates the atmospheric light (the lightness of the observed scene). From this equation, once illumination and the extinction coefficient influence the image observed by the Camera sensor, the estimation of obstacles can suffer from detection errors. The corresponding test results can be observed in Figure 5 and Figure 6, which illustrate the Camera's lateral detection results.
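To make the role of the extinction coefficient and ambient light tangible, the snippet below evaluates Equation (2) for a single pixel and shows how an increasing β pushes the observed radiance I towards the atmospheric light A, washing out the object signal J. All numerical values are illustrative assumptions.

```python
import numpy as np

def koschmieder_observed(J, A, beta, d):
    """Equation (2): observed radiance for scene radiance J, atmospheric
    light A, extinction coefficient beta [1/m] and scene depth d [m]."""
    t = np.exp(-beta * d)          # transmission of the medium
    return t * J + (1.0 - t) * A   # direct attenuation + airlight

# Illustrative pixel: dark vehicle (J = 0.2) against bright airlight (A = 0.8), 50 m away.
for beta in (0.001, 0.01, 0.05):   # clear air -> heavy scattering
    I = koschmieder_observed(J=0.2, A=0.8, beta=beta, d=50.0)
    print(f"beta = {beta:0.3f} 1/m -> observed radiance I = {I:0.3f}")
```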
In general, having the wipers activated during rainfall helps to maintain a clear view for the Camera, making detection more stable. However, the Camera's detection performance is still impacted by other environmental factors, such as rain intensity and the level of ambient light, which affect the captured image quality and make it more difficult to detect and track objects accurately. As shown in Table 4, the IQR increases as the rainfall increases; specifically, the outlier numbers increase by at least 23% compared to the dry road conditions. It is also evident that the Camera is susceptible to illumination. In principle, the reflectivity of the car body material is higher under sufficient lighting; hence, the extinction coefficient decreases, and good contrast with the surrounding environment aids recognition. Meanwhile, artificial lighting provides the Camera with enough light at night to capture clear images and perform accurate detection. Nevertheless, there is a significant increase in outliers at night: for dry road conditions, the detected outliers are 55% higher at night than during the day, and in the case of moderate rainfall the number of outliers increases by 41.7%. The high uncertainty of detection at night leads to a decrease in average accuracy, as seen in Table 5.
Comparing Figure 5 and Figure 6, the Camera detection error with the smallest range of outliers occurs during the day on a dry road. In addition, nighttime conditions with moderate rainfall are a challenge for Camera detection, with the outlier range significantly increased in Figure 6b. Since the Camera is a passive sensor, like most computer vision systems it relies on clearly visible features in its field of view to detect and track objects. On a waterlogged road, the water can cause reflections and glare that degrade the captured image quality and make it difficult to process the information accurately. As a result, the average error is greatest in this condition, as illustrated in Table 5.
Finally, Figure 7 demonstrates the influence of distance on the detection results. It can be clearly seen that the effective detection range of the Camera is about 100 m; beyond this range, the error increases and the confidence interval grows, making the detection results unreliable.

4.2. Radar

In the last decade, Radar-based ADAS has been widely adopted by almost every car manufacturer in the world. However, in the millimetre-wave spectrum, adverse weather conditions such as rain, snow, fog and hail can have a significant impact on Radar performance [21]. Moreover, the study in [10] demonstrated that different rain intensities directly affect the capability of an obstacle to reflect an echo signal towards the Radar receiver, thus impacting the maximum detectable range, target detectability and tracking stability. Rain effects on mm-wave Radar can therefore be classified into attenuation and backscatter. Mathematical models for the attenuation and backscattering effects of rain are given in Equation (3) and Equation (4), respectively.
$$P_r = \frac{P_t G^{2} \lambda^{2} \sigma_t}{(4\pi)^{3} r^{4}} \cdot V^{4} \exp(-2\gamma r) \tag{3}$$
where r is the distance between the Radar sensor and the target obstacle, λ is the Radar wavelength, P_t is the transmission power, G denotes the antenna gain and σ_t denotes the Radar cross-section of the target. The rain attenuation coefficient γ is determined by the rainfall rate, and V is the multipath coefficient. Equation (3) shows that the received signal power P_r must account for the rain attenuation effects in addition to the path loss and the multipath coefficient.
$$\frac{S_t}{S_b} = \frac{8 \sigma_t}{\tau c\, \theta_{BW}^{2}\, \pi r^{2} \sigma_i} \tag{4}$$
The relationship between the power intensity S_t of the target signal and that of the backscatter signal S_b is characterized by Equation (4); this ratio must remain above a certain threshold for reliable detection. Here, τ is the pulse duration, θ_BW denotes the antenna beamwidth and c is the speed of light. The rain backscatter coefficient σ_i is highly variable as a function of the drop-size distribution. Consequently, according to Equation (4), the Radar must expend more energy and suffers greater rain backscatter interference. Rainwater can also produce a water film on the Radar's housing and thus affect detection, which can be observed in Table 6. Although the difference in IQR values between rainy and clear weather conditions is insignificant, the number of outliers increases during heavy rainfall. Overall, the Radar is not sensitive to environmental factors; in particular, the illumination level does not affect the Radar's detection performance, and the IQR values at night remain consistent with those during the day. Figure 5 and Figure 6 demonstrate this phenomenon: the ranges of IQR and outliers are basically the same, but when the rainfall intensity is relatively high, rain backscatter interference leads to more outliers in Figure 6b.
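The sketch below evaluates Equations (3) and (4) numerically to show how a larger rain attenuation coefficient γ reduces the received power and how the backscatter coefficient σ_i enters the signal-to-backscatter ratio. All parameter values (antenna gain, cross-sections, attenuation coefficient, etc.) are illustrative assumptions, not the specification of the tested ARS 408 Radar.

```python
import numpy as np

def radar_received_power(P_t, G, lam, sigma_t, r, V, gamma):
    """Equation (3): Radar received power with rain attenuation."""
    return (P_t * G**2 * lam**2 * sigma_t) / ((4 * np.pi)**3 * r**4) \
           * V**4 * np.exp(-2 * gamma * r)

def signal_to_backscatter(sigma_t, tau, c, theta_bw, r, sigma_i):
    """Equation (4): ratio of target signal power to rain backscatter power."""
    return 8 * sigma_t / (tau * c * theta_bw**2 * np.pi * r**2 * sigma_i)

# Illustrative 77 GHz example (all values assumed).
lam = 3e8 / 77e9                                  # wavelength [m]
P_clear = radar_received_power(P_t=1.0, G=1000.0, lam=lam, sigma_t=10.0,
                               r=100.0, V=1.0, gamma=0.0)
P_rain = radar_received_power(P_t=1.0, G=1000.0, lam=lam, sigma_t=10.0,
                              r=100.0, V=1.0, gamma=0.005)   # rainy medium
ratio = signal_to_backscatter(sigma_t=10.0, tau=1e-7, c=3e8,
                              theta_bw=np.deg2rad(2.0), r=100.0, sigma_i=1e-5)
print(P_rain / P_clear, ratio)   # attenuation factor and S_t / S_b
```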
However, the average error of the Radar's lateral detection is larger than that of the Camera and LiDAR, as shown in Table 5. This is because the Radar works by emitting and receiving radio waves, which are less focused and have a wider beamwidth than the laser used by LiDAR, resulting in a lower spatial resolution and posing a challenge for lateral detection. LiDAR uses the laser to construct a high-resolution 3D map of the surrounding environment, while the Camera captures high-resolution images that can be processed with advanced algorithms to detect objects in the scene; this makes LiDAR and Camera systems more suitable for lateral detection than Radar. Finally, comparing the performance of the Camera and LiDAR in Figure 7, the Radar has the longest detection range of approximately 200 m, but its error increases with distance.

4.3. LiDAR

In recent years, automotive LiDAR scanners have become essential sensors for the development of autonomous vehicles. A large number of algorithms have been developed around the 3D point cloud generated by LiDAR for object detection, tracking, environmental mapping and localization. However, LiDAR's performance is more susceptible to adverse weather. The studies in [19,40] tested the performance of various LiDARs in a well-controlled fog and rain facility and verified that as the rainfall intensity increases, the number of points received by the LiDAR decreases, which affects the tracking and recognition of objects. This behaviour can be summarized by the LiDAR power model in Equation (5).
$$P(r) = E_p \frac{c \eta A}{2 r^{2}} \cdot \beta \cdot T(r) \tag{5}$$
This equation describes the received laser power returned from a target at distance r, where E_p is the total energy of a transmitted laser pulse and c is the speed of light. A represents the receiver's optical aperture area and η is the overall system efficiency. β denotes the reflectivity of the target's surface, which is determined by the surface properties and the incident angle. The last term, T(r), is the transmission loss through the transmission medium, given by Equation (6).
$$T(r) = \exp\left(-2 \int_{0}^{r} \alpha(x)\, \mathrm{d}x\right) \tag{6}$$
where α(x) is the extinction coefficient of the transmission medium; extinction arises because particles within the transmission medium scatter and absorb the laser light.
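A minimal numerical sketch of Equations (5) and (6) is given below: a constant extinction coefficient is assumed inside the rain plant, so the integral in Equation (6) reduces to α·r, and the received power is compared for clear air and rain. All parameter values are illustrative assumptions, not the specification of the tested RS-LiDAR-16.

```python
import numpy as np

def transmission_loss(alpha, r):
    """Equation (6) for a homogeneous medium: T(r) = exp(-2 * alpha * r)."""
    return np.exp(-2.0 * alpha * r)

def lidar_received_power(E_p, c, eta, A, beta, alpha, r):
    """Equation (5): received laser power at range r."""
    return E_p * (c * eta * A) / (2.0 * r**2) * beta * transmission_loss(alpha, r)

# Illustrative values (assumed): 30 m target with reflectivity 0.3.
common = dict(E_p=1e-6, c=3e8, eta=0.8, A=1e-3, beta=0.3, r=30.0)
P_clear = lidar_received_power(alpha=1e-4, **common)   # clear air
P_rain = lidar_received_power(alpha=5e-3, **common)    # rain: higher extinction
print(P_rain / P_clear)   # fraction of the clear-air power that survives the rain
```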
From a short review of Equations (5) and (6), we can infer that rainfall increases the transmission loss T(r) and hence decreases the received laser power P(r), which can make the subsequent signal processing steps fail. In practice, LiDAR performance is degraded by changes in the extinction coefficient α and the target reflectivity β. Most previous studies focused on the statistics of point cloud intensities, which decrease with rain intensity and distance. However, object recognition based on deep learning is robust and can largely resist the impact of environmental noise on the final results. This can be observed in our statistics in Figure 5 and Figure 6: although the object list output by the LiDAR is less influenced by the environment, there are still performance differences. Table 7 shows that dry road conditions are indeed the most suitable for LiDAR detection and that the difference in IQR between daytime and nighttime is insignificant. Since the tests under the rain simulator are all close-range detections, the results on wet road surfaces do not differ much from those on dry surfaces. However, once the rain test started, the difference became noticeable. Raindrops can scatter the laser beams, causing false or distorted readings; this reduces visibility and makes it more difficult for the system to detect objects and obstacles on the road. Therefore, as the amount of rain increases, LiDAR detection becomes more difficult: in heavy rain, the IQR increased by 0.156 m compared to dry conditions, and the number of outliers also increased significantly. Figure 5c,d demonstrate this phenomenon, with a wider range of outliers under rainy conditions, and the range of outliers covers the entire observed range of the boxplot in Figure 6b. The influence of rain on LiDAR performance is evident.
Figure 7c shows that the detection range of the LiDAR can reach 100 m. However, the effective range of the 16-beam LiDAR for stable target tracking is about 30 m; beyond this range, tracking becomes unstable and targets are occasionally lost. This is because the recognition algorithms may apply a threshold on the minimum signal strength or confidence required to recognize an object, which limits the maximum range of the object recognition output. In addition, to preserve robustness and accuracy, the algorithm may filter out point cloud returns at long ranges, where the limited resolution and other error sources dominate, thereby reducing the computational requirements and the potential errors associated with processing distant data. In this way, a 16-beam LiDAR can provide higher resolution and accuracy over a shorter range, which is suitable for many applications such as automated driving vehicles and robotics. The error is only larger at very close range, which is caused by the mounting position of the LiDAR: since the tested LiDAR is installed at the front end of the vehicle, it is difficult to cover the whole object when the ego car is close to the target, which makes recognition more difficult and less accurate. This problem gradually disappears as the target vehicle moves away from the ego car.
Figure 7. Camera, Radar and LiDAR detection performance over the distance. (a) Camera detection performance; (b) Radar detection performance; (c) LiDAR detection performance.

5. Discussion

In this section, we discuss the observations and limitations of the sensors for ADAS during the measurements under the rain simulator. Using Table 5, we compare the average detection error of the sensors for the different environmental conditions. The comparison of lateral distance errors reveals no significant difference between the Camera and LiDAR sensors, whose average detection errors are only 0.054 m and 0.042 m, respectively. The conclusion drawn in [40] is consistent with our findings: changes in the propagation medium of the laser due to rain and fog adversely affect detection. Radar lateral detection, however, is not as reliable as its longitudinal detection, as indicated by the average error of 0.479 m. This is due to the small number of Radar detection points, which makes it challenging to discern lateral deviations after clustering, as discussed in [10,20]. Furthermore, the error results for the different environments indicate that Radar is the least affected by environmental factors. Although the Camera is also only moderately impacted by rainfall, it should be noted that the tests were performed with the wipers on. Additionally, the night tests demonstrated that detection performance is enhanced by the contrast improvement provided by sufficient artificial light. Finally, while LiDAR has the highest detection accuracy, it is susceptible to the amount of rain, and its accuracy differs by more than a factor of four between conditions.
To investigate the impact of distance on detection accuracy, we statistically aggregated all test cases in Figure 7. LiDAR shows extremely high accuracy: the mean error is merely 0.041 m and the standard deviation is also well controlled. However, the effective detection range of the LiDAR is only about 30 m; beyond this range, target tracking is occasionally lost. Compared with the Radar's effective detection range of up to 200 m, the limitation of the usage scenario is obvious. The detection errors of both Radar and Camera grow with distance: the Camera's average lateral error is 0.617 m, whereas the Radar exhibits a surprisingly high lateral error of 1.456 m, indicating a potential deviation of one lane width as the distance increases. This presents a significant risk to the accuracy of estimated target vehicle trajectories. Finally, using uniform sampling, we calculated the detection error for each sensor under all conditions, as summarized in Table 8.
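The full-range aggregation of Table 8 could be reproduced along the following lines. We interpret "uniform sampling" here as drawing the same number of error samples per distance bin so that the heavily covered near range does not dominate the statistics; this interpretation, as well as the bin width and the synthetic data, are assumptions rather than the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_distance_sampling(distances, errors, bin_width=10.0, per_bin=100):
    """Resample lateral errors so that every distance bin contributes equally,
    then return the mean and standard deviation over the resampled set."""
    distances = np.asarray(distances)
    errors = np.asarray(errors)
    bins = np.arange(distances.min(), distances.max() + bin_width, bin_width)
    resampled = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        in_bin = errors[(distances >= lo) & (distances < hi)]
        if in_bin.size:                                  # skip empty bins
            resampled.append(rng.choice(in_bin, size=per_bin, replace=True))
    resampled = np.concatenate(resampled)
    return resampled.mean(), resampled.std()

# Hypothetical per-detection data: distance [m] and absolute lateral error [m].
dist = rng.uniform(5, 200, 5000)
err = 0.2 + 0.005 * dist + rng.normal(0, 0.1, 5000)
mean_err, std_err = uniform_distance_sampling(dist, err)
```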

6. Conclusions

Through a series of experiments, we have shown the impact of unfavourable weather conditions on the detection performance of automotive sensors. Our analysis focused on lateral distance detection and we quantitatively evaluated the experimental results. Our studies demonstrated that rainfall can significantly reduce the performance of automotive sensors, especially LiDAR and Camera. Based on the results presented in Table 5, it can be inferred that LiDAR's detection accuracy diminishes by a factor of 4.8 as the rainfall intensity increases, yet it still exhibits relatively high precision. In contrast, the Camera's performance varies less in rainy weather, with a maximum degradation factor of 1.57. However, as the Camera is significantly affected by lighting conditions, its detection accuracy declines by a factor of 4.6 in rainy nighttime conditions compared to clear weather. Additionally, the detection error fluctuation of the Radar is slight, but it lacks lateral estimation accuracy: under the same weather conditions, Radar detection is on average 16.5 and 14 times less precise than the Camera and LiDAR, respectively.
Furthermore, we conducted a series of nighttime tests that illustrated the positive effect of high artificial illumination on Camera detection. These experimental findings provide essential insights for automotive manufacturers to design and test their sensors under various weather and lighting conditions to ensure accurate and reliable detection. Additionally, drivers should be aware of the limitations of their vehicle’s sensors and adjust their driving behaviour accordingly during adverse weather conditions. Overall, the detection performance of different automotive sensors under environmental conditions provides valuable data to support sensor fusion. For instance, while LiDAR has a maximum effective detection range of around 100 m, tracking loss occurs beyond 30 m. Thus, to address the limitations of individual sensors, multi-sensor fusion is a promising approach.
As part of our future work, we aim to conduct a more in-depth analysis of the raw data obtained from automotive sensors and to introduce additional rain and illumination conditions, for example more variations of rain and artificial light intensity. Raw data are critical inputs to the perception algorithm and often have a significant impact on the final detection output. We particularly want to investigate the effects of rainfall on LiDAR's point cloud data, as they can significantly impact detection accuracy. Additionally, we plan to explore the development of a sensor fusion algorithm based on the experimental results. By combining data from multiple sensors, sensor fusion can compensate for the limitations of individual sensors, providing a more comprehensive perception of the environment and enabling safer and more effective decision-making for autonomous driving systems. Therefore, our future work will focus on improving the accuracy and reliability of sensor data to enable more robust sensor fusion algorithms.

Author Contributions

Conceptualization, H.L., T.M., Z.F.M., N.B., F.O., Y.Z. and A.E.; methodology, H.L., Z.F.M., N.B. and A.E.; software, H.L., Z.F.M. and N.B.; validation, H.L., T.M., Z.F.M., N.B., F.O. and A.E.; formal analysis, H.L. and N.B.; investigation, H.L., T.M., Z.F.M., N.B., F.O. and A.E.; resources, H.L., T.M., Z.F.M., N.B., F.O. and A.E.; data curation, H.L., Z.F.M., N.B., F.O., Y.Z. and A.E.; writing—original draft preparation, H.L., T.M., Z.F.M., N.B., F.O., Y.Z., C.F. and A.E.; writing—review and editing, H.L., T.M., Z.F.M., N.B., F.O., Y.Z. and A.E.; visualization, H.L., T.M., Z.F.M., N.B., F.O. and A.E.; supervision, A.E.; project administration, A.E. All authors have read and agreed to the published version of the manuscript.

Funding

Open Access Funding by the Graz University of Technology. This activity is part of the research project InVADE (FFG nr. 889349) and has received funding from the program Mobility of the Future, operated by the Austrian research funding agency FFG. Mobility of the Future is a mission-oriented research and development program to help Austria create a transport system designed to meet future mobility and social challenges.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Figure A1. Effect of environmental condition on sensors at daytime in rain simulator (near range). (a) Longitudinal distance detection error under dry road condition; (b) Longitudinal distance detection error under wet road condition; (c) Longitudinal distance detection error under moderate intensity rain conditions; (d) Longitudinal distance detection error under heavy intensity rain conditions.
Figure A2. Effect of environmental condition on sensors at nighttime in rain simulator (near range) with artificial lighting. (a) Longitudinal distance detection error under dry road condition; (b) Longitudinal distance detection error under moderate intensity rain conditions.
Figure A3. Effect of environmental condition on sensors at daytime in rain simulator (near range). (a) Velocity detection error under dry road condition; (b) Velocity detection error under wet road condition; (c) Velocity detection error under moderate intensity rain conditions; (d) Velocity detection error under heavy intensity rain conditions.
Figure A4. Effect of environmental condition on sensors at nighttime in rain simulator (near range) with artificial lighting. (a) Velocity detection error under dry road condition; (b) Velocity detection error under moderate intensity rain conditions.
Figure A5. Effect of environmental condition on sensors at daytime in rain simulator (near range). (a) Heading angle detection error under dry road condition; (b) Heading angle detection error under wet road condition; (c) Heading angle detection error under moderate intensity rain conditions; (d) Heading angle detection error under heavy intensity rain conditions.
Figure A6. Effect of environmental condition on sensors at nighttime in rain simulator (near range) with artificial lighting. (a) Heading angle detection error under dry road condition; (b) Heading angle detection error under moderate intensity rain conditions.

References

  1. Singh, S. Critical Reasons for Crashes Investigated in the National Motor Vehicle Crash Causation Survey; Technical Report; NHTSA's National Center for Statistics and Analysis: Washington, DC, USA, 2015.
  2. Smith, B.W. Human Error as a Cause of Vehicle Crashes. 2013. Available online: http://cyberlaw.stanford.edu/blog/2013/12/human-error-cause-vehicle-crashes (accessed on 18 December 2013).
  3. Liu, J.; Li, J.; Wang, K.; Zhao, J.; Cong, H.; He, P. Exploring factors affecting the severity of night-time vehicle accidents under low illumination conditions. Adv. Mech. Eng. 2019, 11, 1687814019840940.
  4. Kawasaki, T.; Caveney, D.; Katoh, M.; Akaho, D.; Takashiro, Y.; Tomiita, K. Teammate Advanced Drive System Using Automated Driving Technology. SAE Int. J. Adv. Curr. Pract. Mobil. 2021, 3, 2985–3000.
  5. Schrepfer, J.; Picron, V.; Mathes, J.; Barth, H. Automated driving and its sensors under test. ATZ Worldw. 2018, 120, 28–35.
  6. Marti, E.; De Miguel, M.A.; Garcia, F.; Perez, J. A review of sensor technologies for perception in automated driving. IEEE Intell. Transp. Syst. Mag. 2019, 11, 94–108.
  7. Wang, D.; Liu, Q.; Ma, L.; Zhang, Y.; Cong, H. Road traffic accident severity analysis: A census-based study in China. J. Saf. Res. 2019, 70, 135–147.
  8. Brázdil, R.; Chromá, K.; Zahradníček, P.; Dobrovolnỳ, P.; Dolák, L. Weather and traffic accidents in the Czech Republic, 1979–2020. Theor. Appl. Climatol. 2022, 149, 153–167.
  9. Yoneda, K.; Suganuma, N.; Yanase, R.; Aldibaja, M. Automated driving recognition technologies for adverse weather conditions. IATSS Res. 2019, 43, 253–262.
  10. Hasirlioglu, S.; Kamann, A.; Doric, I.; Brandmeier, T. Test methodology for rain influence on automotive surround sensors. In Proceedings of the 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro, Brazil, 1–4 November 2016; pp. 2242–2247.
  11. Hasirlioglu, S.; Doric, I.; Kamann, A.; Riener, A. Reproducible fog simulation for testing automotive surround sensors. In Proceedings of the 2017 IEEE 85th Vehicular Technology Conference (VTC Spring), Sydney, Australia, 4–7 June 2017; pp. 1–7.
  12. Li, C.; Jia, Z.; Li, P.; Wen, H.; Lv, G.; Huang, X. Parallel detection of refractive index changes in a porous silicon microarray based on digital images. Sensors 2017, 17, 750.
  13. Goldbeck, J.; Huertgen, B. Lane detection and tracking by video sensors. In Proceedings of the 1999 IEEE/IEEJ/JSAI International Conference on Intelligent Transportation Systems (Cat. No. 99TH8383), Tokyo, Japan, 5–8 October 1999; pp. 74–79.
  14. Borkar, A.; Hayes, M.; Smith, M.T.; Pankanti, S. A layered approach to robust lane detection at night. In Proceedings of the 2009 IEEE Workshop on Computational Intelligence in Vehicles and Vehicular Systems, Nashville, TN, USA, 30 March–2 April 2009; pp. 51–57.
  15. Balisavira, V.; Pandey, V. Real-time object detection by road plane segmentation technique for ADAS. In Proceedings of the 2012 Eighth International Conference on Signal Image Technology and Internet Based Systems, Naples, Italy, 25–29 November 2012; pp. 161–167.
  16. Roh, C.G.; Kim, J.; Im, I.J. Analysis of impact of rain conditions on ADAS. Sensors 2020, 20, 6720.
  17. Hadi, M.; Sinha, P.; Easterling IV, J.R. Effect of environmental conditions on performance of image recognition-based lane departure warning system. Transp. Res. Rec. 2007, 2000, 114–120.
  18. Bijelic, M.; Gruber, T.; Ritter, W. A benchmark for LiDAR sensors in fog: Is detection breaking down? In Proceedings of the 2018 IEEE Intelligent Vehicles Symposium (IV), Suzhou, China, 26–30 June 2018; pp. 760–767.
  19. Kutila, M.; Pyykönen, P.; Holzhüter, H.; Colomb, M.; Duthon, P. Automotive LiDAR performance verification in fog and rain. In Proceedings of the 2018 21st International Conference on Intelligent Transportation Systems (ITSC), Maui, HI, USA, 4–7 November 2018; pp. 1695–1701.
  20. Chaudhary, S.; Wuttisittikulkij, L.; Saadi, M.; Sharma, A.; Al Otaibi, S.; Nebhen, J.; Rodriguez, D.Z.; Kumar, S.; Sharma, V.; Phanomchoeng, G.; et al. Coherent detection-based photonic radar for autonomous vehicles under diverse weather conditions. PLoS ONE 2021, 16, e0259438.
  21. Zang, S.; Ding, M.; Smith, D.; Tyler, P.; Rakotoarivelo, T.; Kaafar, M.A. The impact of adverse weather conditions on autonomous vehicles: How rain, snow, fog and hail affect the performance of a self-driving car. IEEE Veh. Technol. Mag. 2019, 14, 103–111.
  22. Arage Hassen, A. Indicators for the Signal Degradation and Optimization of Automotive Radar Sensors Under Adverse Weather Conditions. Ph.D. Thesis, Technische Universität, Berlin, Germany, 2007.
  23. Blevis, B. Losses Due to Rain on Radomes and Antenna Reflecting Surfaces; Technical Report; Defence Research Telecommunications Establishment Ottawa (Ontario): Ottawa, ON, USA, 1964.
  24. Slavik, Z.; Mishra, K.V. Phenomenological modeling of millimeter-wave automotive radar. In Proceedings of the 2019 URSI Asia-Pacific Radio Science Conference (AP-RASC), New Delhi, India, 9–15 March 2019; pp. 1–4.
  25. Gourova, R.; Krasnov, O.; Yarovoy, A. Analysis of rain clutter detections in commercial 77 GHz automotive radar. In Proceedings of the 2017 European Radar Conference (EURAD), Nuremberg, Germany, 11–13 October 2017; pp. 25–28.
  26. Bijelic, M.; Gruber, T.; Ritter, W. Benchmarking image sensors under adverse weather conditions for autonomous driving. In Proceedings of the 2018 IEEE Intelligent Vehicles Symposium (IV), Suzhou, China, 26–30 June 2018; pp. 1773–1779.
  27. Rosenberger, P.; Holder, M.; Huch, S.; Winner, H.; Fleck, T.; Zofka, M.R.; Zöllner, J.M.; D'hondt, T.; Wassermann, B. Benchmarking and functional decomposition of automotive LiDAR sensor models. In Proceedings of the 2019 IEEE Intelligent Vehicles Symposium (IV), Paris, France, 9–12 June 2019; pp. 632–639.
  28. DigiTrans. Test Track for Autonomous Driving in St. Valentin. 2023. Available online: https://www.digitrans.expert/en/test-track/ (accessed on 11 October 2020).
  29. Kathiravelu, G.; Lucke, T.; Nichols, P. Rain Drop Measurement Techniques: A Review. Water 2016, 8, 29.
  30. Pruppacher, H.; Pitter, R. A semi-empirical determination of the shape of cloud and rain drops. J. Atmos. Sci. 1971, 28, 86–94.
  31. Foote, G.; Toit, P.D. Terminal velocity of raindrops aloft. J. Appl. Meteorol. 1969, 8, 249–253.
  32. AMS. Glossary of Meteorology. 2023. Available online: https://glossary.ametsoc.org/wiki/Rain (accessed on 25 April 2012).
  33. Nassi, D.; Ben-Netanel, R.; Elovici, Y.; Nassi, B. MobilBye: Attacking ADAS with camera spoofing. arXiv 2019, arXiv:1906.09765.
  34. Continental. ARS 408 Long Range Radar Sensor 77 GHz. 2020. Available online: https://conti-engineering.com/components/ars-408/ (accessed on 26 February 2020).
  35. RoboSense. RS-LiDAR-16 Powerful 16 laser-beam LiDAR. 2021. Available online: https://www.robosense.ai/en/rsLiDAR/RS-LiDAR-16 (accessed on 8 April 2021).
  36. BIPM; IEC; IFCC; ISO; IUPAC; OIML. Guide to the Expression of Uncertainty in Measurement; International Organisation for Standardisation: Geneva, Switzerland, 1995.
  37. Fan, H.; Zhu, F.; Liu, C.; Zhang, L.; Zhuang, L.; Li, D.; Zhu, W.; Hu, J.; Li, H.; Kong, Q. Baidu Apollo EM motion planner. arXiv 2018, arXiv:1807.08048.
  38. Lee, Z.; Shang, S. Visibility: How applicable is the century-old Koschmieder model? J. Atmos. Sci. 2016, 73, 4573–4581.
  39. Ngo, D.; Lee, S.; Lee, G.D.; Kang, B. Single-image visibility restoration: A machine learning approach and its 4K-capable hardware accelerator. Sensors 2020, 20, 5795.
  40. Li, Y.; Duthon, P.; Colomb, M.; Ibanez-Guzman, J. What happens for a ToF LiDAR in fog? IEEE Trans. Intell. Transp. Syst. 2020, 22, 6670–6681.
Figure 1. Overview of DigiTrans test track in St. Valentin, Lower Austria, with outdoor rain plant © DigiTrans GmbH.
Figure 2. Cross-section of outdoor rain plant in a longitudinal direction with rain characteristics © DigiTrans GmbH.
Figure 3. Overview of sensors and measurement equipment integrated on the testing car.
Figure 4. Structure of weather combination.
Figure 5. Effect of environmental condition on sensors at daytime in rain simulator (near range). (a) Lateral distance detection error under dry road conditions; (b) Lateral distance detection error under wet road conditions; (c) Lateral distance detection error under moderate intensity rain conditions; (d) Lateral distance detection error under heavy intensity rain conditions.
Figure 6. Effect of environmental condition on sensors at nighttime in rain simulator (near range) with artificial lighting. (a) Lateral distance detection error under dry road conditions; (b) Lateral distance detection error under moderate intensity rain conditions.
Table 1. The tested sensors with their respective information.
Parameter | Mobileye 6 Series Camera [33] | Continental ARS 408 [34] | RoboSense RS-16 [35]
Distance range | app. 190 m | 250 m | 150 m
Azimuth angle | 38° | 18° (far range) | 360°
Elevation angle | 28° | 14° (far range) | 30°
Cycle time | app. 50 ms | app. 72 ms | 50 ms
Interface | 500 kbit/s CAN-bus | 500 kbit/s CAN-bus | 100 Mbps Ethernet
Dimensions | L122 × W79 × H43 mm | L138 × W91 × H31 mm | ϕ109 mm × H80.7 mm
Table 2. Accuracy of RS-Reference compared with GENESYS ADMA.
Measurement Information | Precision
Target longitudinal direction (m) | 0.131
Target lateral direction (m) | 0.028
Target longitudinal speed (m/s) | 0.054
Target lateral speed (m/s) | 0.008
Target heading angle (rad) | 0.012
Table 3. Test cases description.
Manoeuvre | Short Description
Accelerate Leaving | The target vehicle in front accelerates and drives away from the ego vehicle; this case represents driving off from traffic lights or a stop sign.
Accelerate Approach | The ego car heads towards and approaches traffic congestion or slow-moving traffic. The ego car with ACC automatically slows down and maintains a safe distance from the target car.
Lateral Leaving | An evasive manoeuvre or a lateral lane change is represented. The test speed covers situations from low to high speed and the target car performs a double lane change after reaching the preset speed.
Cut-in | Cut-in is a common driving behaviour. The ego car with ACC follows the front car while the vehicle alongside accelerates to overtake and cuts in at a preset speed.
Cut-out | The opposite of the previous manoeuvre. With the ACC following function activated, the front car cuts out to the adjacent lane and performs an overtake.
Separation Test | This scenario presents an ego car with ACC approaching traffic congestion or waiting at a red light. It usually occurs in urban scenarios with multiple lanes.
Platoon Test | Vehicle platooning often occurs on highways; using the ACC function allows for smaller speed fluctuations to keep up with traffic.
Table 4. Quantification of the statistics of Camera detection performance in Figure 5 and Figure 6.
Environmental Condition | IQR (m) | Number of Outliers
Day—Dry | 0.154 | 140
Day—Wet Road | 0.154 | 187
Day—Moderate Rain | 0.201 | 173
Day—Heavy Rain | 0.238 | 162
Night—Dry Road | 0.140 | 217
Night—Moderate Rain | 0.171 | 265
Table 5. Average lateral detection accuracy comparison of sensors in rain simulator (near range) in meters.
Environmental Condition | Camera | Radar | LiDAR
Day—Dry | 0.019 | 0.453 | 0.015
Day—Wet Road | 0.111 | 0.439 | 0.021
Day—Moderate Rain | 0.010 | 0.472 | 0.046
Day—Heavy Rain | 0.030 | 0.507 | 0.073
Night—Dry Road | 0.064 | 0.458 | 0.035
Night—Moderate Rain | 0.088 | 0.545 | 0.063
Table 6. Quantification of the statistics of Radar detection performance in Figure 5 and Figure 6.
Environmental Condition | IQR (m) | Number of Outliers
Day—Dry | 0.334 | 113
Day—Wet Road | 0.311 | 74
Day—Moderate Rain | 0.350 | 109
Day—Heavy Rain | 0.359 | 135
Night—Dry Road | 0.320 | 102
Night—Moderate Rain | 0.328 | 196
Table 7. Quantification of the statistics of LiDAR detection performance in Figure 5 and Figure 6.
Environmental Condition | IQR (m) | Number of Outliers
Day—Dry | 0.222 | 115
Day—Wet Road | 0.189 | 114
Day—Moderate Rain | 0.253 | 131
Day—Heavy Rain | 0.378 | 288
Night—Dry Road | 0.172 | 127
Night—Moderate Rain | 0.248 | 227
Table 8. Average lateral detection accuracy comparison of sensors over the full range.
Parameters | Camera | Radar | LiDAR
Mean (m) | 0.617 | 1.456 | 0.041
Standard Deviation (m) | 0.571 | 0.826 | 0.118