Do Statistics Show Differences between Distance Estimations of 3D Objects in the Traffic Environment Using Glances, Side View Mirrors, and Camera Display?
Abstract
1. Introduction
2. Materials and Methods
2.1. Procedure
2.2. Experimental Design and Protocol
- Looking over the driver’s shoulder;
- Looking in the side-view mirrors;
- Looking at the camera display.
2.3. Stimuli
2.3.1. Characteristics of Vehicle
2.3.2. Description of Objects Used in the Experiment
2.4. Collecting and Processing Data
- Null and alternative hypotheses were established. Null hypothesis (H0): there is no statistically significant difference in the estimation of the distance to objects of different colors.
- The alternative hypotheses were that statistically significant differences in the estimation of the distance to objects of different colors exist depending on:
- Hypothesis 1 (H1): all examined conditions;
- Hypothesis 2 (H2): gender;
- Hypothesis 3 (H3): category of driver’s license;
- Hypothesis 4 (H4): driving experience;
- Hypothesis 5 (H5): frequency of driving;
- Hypothesis 6 (H6): use of glasses while driving;
- Hypothesis 7 (H7): dominant lateralization.
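The excerpt does not include the authors' analysis scripts, but the decision logic behind testing H0 against an alternative such as H2 (gender) can be sketched in a few lines. This is a minimal Python illustration with synthetic data, a pooled-variance independent-samples t-test, and the standard two-tailed critical value for df = 18 at α = 0.05; none of the numbers come from the study.

```python
from statistics import mean, variance

# Synthetic distance-estimation values (hypothetical data, not the study's).
men = [5.1, 4.8, 5.3, 5.0, 4.9, 5.2, 5.1, 4.7, 5.0, 4.9]
women = [4.2, 4.5, 4.1, 4.4, 4.3, 4.6, 4.2, 4.0, 4.4, 4.3]

def pooled_t(a, b):
    """Independent-samples t statistic with pooled variance."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

T_CRIT = 2.101  # two-tailed critical t, alpha = 0.05, df = 18 (standard table value)
t_stat = pooled_t(men, women)
reject_h0 = abs(t_stat) > T_CRIT  # True -> a statistically significant difference
```

The same reject/retain pattern applies to each of H1-H7, with the test statistic (t or F) chosen to match the number of groups being compared.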
3. Results
3.1. Descriptive Statistics
3.2. Estimation of Distance to the Object
3.3. The Relationship between the Estimation of Distance and Colors of the Object
3.4. Gender Differences
3.5. Driving License Category and Estimation of the Distance to the Object
3.6. The Influence of the Driver’s Experience on the Estimation of the Distance to the Object
3.7. The Influence of the Frequency of Driving on the Estimation of the Distance to the Object
3.8. The Influence of Wearing Glasses on the Estimation of the Distance to the Object
3.9. The Influence of Dominant Lateralization on the Estimation of the Distance to the Object
4. Discussion
5. Conclusions
- Participants estimated the distance to the red object with the smallest error in the condition in which the driver was seated and looking at the camera display (M = 3.934; SD = 1.927), whereas in the conditions in which the driver was seated and looking in the side-view mirror (M = 4.957; SD = 1.667) or looking over the shoulder (M = 5.065; SD = 1.401), participants estimated the distance to the yellow object with the smallest error;
- The participants estimated the distance to the blue and green objects with the highest error in all described conditions;
- There are differences for the red color when the driver estimated the distance to the object by looking at it in the side-view mirrors and over the shoulder;
- There are differences for the blue, green, yellow, and red colors when the driver estimated the distance to the object by looking at it over the driver’s shoulder and at the camera display;
- There are differences for the blue, green, yellow, and red colors when the driver estimated the distance to the object by looking at it in the side-view mirrors and at the camera display;
- Women estimated the distance of blue and green objects with the smallest error, while men estimated the distance of yellow and red objects with the smallest error, in a situation where the driver was looking at the object over his shoulder and through the camera display;
- When the participants estimated the distance by observing the objects using the side-view mirrors, men estimated the distance to blue, green, and red objects with the smallest error, while women estimated the distance to the yellow object with the smallest error;
- There was no significant difference in the estimation of the distance to the object, depending on the category of driver’s license, in any of the tested conditions or colors;
- The results show a statistically significant difference between drivers with different driving experience in the estimated distance to the green object in all tested conditions (looking in the side-view mirrors (F = 2.658; p = 0.050), looking over the driver’s shoulder (F = 3.811; p = 0.011), looking at the camera display (F = 3.825; p = 0.011)). The best estimate, i.e., the smallest error in estimating the distance to the green object in all tested conditions, was made by participants with 1 to 3 years of driving experience (looking in the side-view mirrors (M = 5.366; SD = 1.935), looking over the driver’s shoulder (M = 5.226; SD = 1.508), looking at the camera display (M = 4.716; SD = 1.531));
- The results show a statistically significant difference between drivers with different frequencies of driving in the estimation of the distance to the blue object (F = 3.001; p = 0.013), the green object (F = 2.646; p = 0.025), and the yellow object (F = 2.746; p = 0.021) when the driver was looking in the side-view mirrors; to the blue object (F = 2.350; p = 0.043) when the driver was looking over the driver’s shoulder; and to the red object (F = 3.272; p = 0.008) when the driver was looking at the camera display;
- The results showed a statistically significant difference between participants who wore glasses (M = 4.531; SD = 1.777) and participants who did not (M = 3.657; SD = 1.939) in estimating the distance to the red object (t = −2.845; p = 0.005) when the driver estimated the distance by looking at the camera display;
- The results showed statistically significant differences between right-handed (M = 5.094; SD = 1.426) and left-handed (M = 4.500; SD = 0.535) participants in estimating the distance to the yellow object (t = 2.689; p = 0.019) when the driver estimated the distance by looking over the driver’s shoulder. Additionally, statistically significant differences were observed between right-handed (M = 4.794; SD = 1.435) and left-handed (M = 4.225; SD = 0.515) participants for the yellow object (t = 2.642; p = 0.020) when the driver estimated the distance by looking at the camera display.
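The multi-group comparisons above (driving experience, frequency of driving) rest on one-way ANOVAs that produce the reported F and p values. The F ratio and the accompanying eta-squared effect size can be computed by hand as in this minimal sketch; the group labels and data are hypothetical, and the 4.26 cutoff is the standard α = 0.05 critical F for (2, 9) degrees of freedom, matching the synthetic sample sizes only.

```python
from statistics import mean

# Hypothetical estimation values grouped by driving experience (synthetic data).
groups = {
    "< 1 year":  [6.0, 6.2, 5.8, 6.1],
    "1-3 years": [5.0, 5.2, 4.8, 5.0],
    "> 3 years": [5.5, 5.6, 5.4, 5.5],
}

def one_way_anova(samples):
    """Return (F, eta_squared) for a one-way ANOVA over the given groups."""
    all_values = [x for g in samples for x in g]
    grand_mean = mean(all_values)
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in samples)
    ss_within = sum((x - mean(g)) ** 2 for g in samples for x in g)
    df_between = len(samples) - 1
    df_within = len(all_values) - len(samples)
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, ss_between / (ss_between + ss_within)

F, eta_sq = one_way_anova(list(groups.values()))
significant = F > 4.26  # critical F(2, 9) at alpha = 0.05
```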
Recommendations

The above results can be used to improve how information is displayed to the driver in the vehicle cockpit. Using modern mathematical tools [18,19], objects modeled from the real environment can be displayed in the vehicle in 3D. The appearance and position of the in-vehicle screen certainly need to be examined from an ergonomic point of view, to improve the driver’s perception and reduce distraction. In this way, it would be easier for the driver to assess information from the external environment. Beyond the automobile and truck industries, the approach is also highly applicable to tractors, where precise monitoring of the attachment device at the rear of the tractor is of great importance. However, users’ willingness to switch to modern technologies remains an open question.
5.1. Limitations
5.2. Future Research
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Frostig, M.; Lefever, D.W.; Whittlesey, J.R. A developmental test of visual perception for evaluating normal and neurologically handicapped children. Percept. Mot. Ski. 1961, 12, 383–394.
- Grandi, B.; Cardinali, M.G. Colours and Price Offers: How Different Price Communications Can Affect Sales and Customers’ Perceptions. SSRN Electron. J. 2022, 68, 103073.
- Milić, A. Traffic Psychology; Faculty of Transport and Traffic Engineering: Belgrade, Serbia, 2007.
- Chun, J.; Lee, I.; Park, G.; Seo, J.; Choi, S.; Han, S.H. Efficacy of haptic blind spot warnings applied through a steering wheel or a seatbelt. Transp. Res. Part F Traffic Psychol. Behav. 2013, 21, 231–241.
- Vincent, D.S.; Pitchipoo, P.; Rajakarunakaran, S. Elimination of blind spots for heavy transport vehicles by driver seat design. In Proceedings of the Second International Conference on Advanced Manufacturing and Automation, Krishnankoil, India, 28–30 March 2013.
- Pitchipoo, P.; Vincent, D.S.; Rajini, N.; Rajakarunakaran, S. COPRAS Decision Model to Optimize Blind Spot in Heavy Vehicles: A Comparative Perspective. Procedia Eng. 2014, 97, 1049–1059.
- Gudzulić, V.; Baroš, M. Upside-Down World – The Importance of Vestibular Information for Perceived Distance. J. Psychol. 2008, 455–461.
- Bernhard, C.; Hecht, H. The Ups and Downs of Camera-Monitor Systems: The Effect of Camera Position on Rearward Distance Perception. Hum. Factors J. Hum. Factors Ergon. Soc. 2020, 63, 415–432.
- Bernhard, C.; Klem, A.; Altuntas, E.C.; Hecht, H. Wider is better but sharper is not: Optimizing the image of camera-monitor systems. Ergonomics 2022, 65, 899–914.
- Hahnel, U.J.; Hecht, H. The impact of rear-view mirror distance and curvature on judgements relevant to road safety. Ergonomics 2011, 55, 23–36.
- Hecht, H.; Brauer, J. Convex rear view mirrors compromise distance and time-to-contact judgements. Ergonomics 2007, 50, 601–614.
- Ivanišević, T.; Ivković, I.; Čičević, S.; Trifunović, A.; Pešić, D.; Vukšić, V.; Simović, S. The impact of daytime running (LED) lights on motorcycles speed estimation: A driving simulator study. Transp. Res. Part F Traffic Psychol. Behav. 2022, 90, 47–57.
- Simović, S.; Ivanišević, T.; Trifunović, A.; Čičević, S.; Taranović, D. What Affects the E-Bicycle Speed Perception in the Era of Eco-Sustainable Mobility: A Driving Simulator Study. Sustainability 2021, 13, 5252.
- Pešić, D.; Trifunović, A.; Ivković, I.; Čičević, S.; Žunjić, A. Evaluation of the effects of daytime running lights for passenger cars. Transp. Res. Part F Traffic Psychol. Behav. 2019, 66, 252–261.
- Ehlers, S.G.; Field, W.E. Determining the Effectiveness of Mirrors and Camera Systems in Monitoring the Rearward Visibility of Self-Propelled Agricultural Machinery. J. Agric. Saf. Health 2017, 23, 183–201.
- Jin, S.X.; Panvini, A.R.; Chuck, R.S. Penetrating ocular injury from motor vehicle rear-view side-mirror. Am. J. Ophthalmol. Case Rep. 2020, 20, 100863.
- Slović, Ž.S.; Vitošević, K.; Mihajlović, F.; Trifunović, A.; Todorović, M. Abdominal injuries in road traffic accidents-autopsy study. Vojnosanit. Pregl. 2022, 42.
- Markova, K.T.; Dovramadjiev, T.A.; Jecheva, G.V. Computer parametric designing in Blender software for creating 3D paper models. Annu. J. Tech. Univ. Varna Bulg. 2017, 1, 77–84.
- Wiegand, G.; Mai, C.; Holländer, K.; Hussmann, H. Incarar: A design space towards 3d augmented reality applications in vehicles. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Utrecht, The Netherlands, 21–25 September 2019; pp. 1–13.
- Zeng, S.; Geng, G.; Zhou, M. Automatic Representative View Selection of a 3D Cultural Relic Using Depth Variation Entropy and Depth Distribution Entropy. Entropy 2021, 23, 1561.
- Contreras-Reyes, J.E. Analyzing Fish Condition Factor Index Through Skew-Gaussian Information Theory Quantifiers. Fluct. Noise Lett. 2016, 15, 1650013.
Mean (standard deviation) of the estimated distance to the object, by viewing condition and object color:

| Condition | Red | Yellow | Green | Blue |
|---|---|---|---|---|
| Look over the driver’s shoulder | 5.030 (1.586) | 5.065 (1.401) | 5.115 (1.496) | 5.046 (1.358) |
| Looking in the side-view mirrors | 5.267 (2.021) | 4.957 (1.677) | 5.227 (1.905) | 5.011 (1.544) |
| Looking at the camera display | 3.934 (1.927) | 4.766 (1.409) | 4.614 (1.509) | 4.474 (1.379) |
| Condition/Color | Red | Yellow | Green | Blue |
|---|---|---|---|---|
| Look over the driver’s shoulder | −0.53 | 0.44 | 0.88 | 1.95 |
| Looking in the side-view mirrors | −0.77 | 0.54 | 0.77 | 1.99 |
| Looking at the camera display | 0.57 | 0.73 | 1.39 | 2.53 |
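If the table above reports mean signed estimation errors (estimated minus actual distance, so negative values indicate underestimation), the quantity can be computed as in this minimal sketch. The true distance and the per-participant estimates here are hypothetical, not the study's data.

```python
from statistics import mean

TRUE_DISTANCE = 5.0  # hypothetical actual distance to the object

# Hypothetical per-participant distance estimates for two colors.
estimates = {
    "Red":  [4.5, 4.4, 4.6],   # slight underestimation
    "Blue": [7.0, 6.8, 7.2],   # strong overestimation
}

# Mean signed error per color: negative -> underestimate, positive -> overestimate.
mean_error = {color: mean(vals) - TRUE_DISTANCE for color, vals in estimates.items()}
```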
| Color | Compared Conditions | Mean | Std. Deviation | t | p | Eta Squared | Magnitude of Impact |
|---|---|---|---|---|---|---|---|
| Blue | Side-view mirrors vs. over the driver’s shoulder | −0.034 | 0.928 | −0.480 | 0.632 | - | - |
| Blue | Camera display vs. over the driver’s shoulder | 0.572 | 0.332 | 22.035 | 0.000 | 0.749 | Large |
| Blue | Side-view mirrors vs. camera display | 0.537 | 0.942 | 7.306 | 0.000 | 0.247 | Large |
| Green | Side-view mirrors vs. over the driver’s shoulder | 0.112 | 1.076 | 1.327 | 0.186 | - | - |
| Green | Camera display vs. over the driver’s shoulder | 0.501 | 0.216 | 29.766 | 0.000 | 0.845 | Large |
| Green | Side-view mirrors vs. camera display | 0.613 | 1.075 | 7.298 | 0.000 | 0.246 | Large |
| Yellow | Side-view mirrors vs. over the driver’s shoulder | −0.107 | 0.936 | −1.468 | 0.144 | - | - |
| Yellow | Camera display vs. over the driver’s shoulder | 0.299 | 0.082 | 46.582 | 0.000 | 0.930 | Large |
| Yellow | Side-view mirrors vs. camera display | 0.191 | 0.947 | 2.589 | 0.010 | 0.039 | Moderate |
| Red | Side-view mirrors vs. over the driver’s shoulder | 0.237 | 1.290 | 2.348 | 0.020 | 0.032 | Moderate |
| Red | Camera display vs. over the driver’s shoulder | 1.096 | 1.979 | 7.096 | 0.000 | 0.236 | Large |
| Red | Side-view mirrors vs. camera display | 1.333 | 2.310 | 7.390 | 0.000 | 0.251 | Large |
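For paired t-tests like those tabulated above, eta squared is conventionally derived from the t statistic as η² = t² / (t² + df). The sketch below uses that formula to reproduce two of the reported values. Note two assumptions: df = 163 is inferred for illustration only (it is not stated in this excerpt, though it reproduces the reported η² values), and the effect-size labels follow Cohen's common cutoffs (0.01 small, 0.06 moderate, 0.14 large), which may differ slightly from the cutoffs the authors used.

```python
def eta_squared(t, df):
    """Eta squared for a paired t-test: t^2 / (t^2 + df)."""
    return t * t / (t * t + df)

def effect_label(eta2):
    """Classify effect size using Cohen's conventional cutoffs."""
    if eta2 >= 0.14:
        return "Large"
    if eta2 >= 0.06:
        return "Moderate"
    if eta2 >= 0.01:
        return "Small"
    return "Negligible"

DF = 163  # assumed degrees of freedom (N - 1); inferred, not given in the excerpt

# Two entries from the table: blue, camera display vs. over the shoulder,
# and blue, side-view mirrors vs. camera display.
eta_a = eta_squared(22.035, DF)
eta_b = eta_squared(7.306, DF)
```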
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Trifunović, A.; Ivanišević, T.; Čičević, S.; Simović, S.; Vukšić, V.; Slović, Ž. Do Statistics Show Differences between Distance Estimations of 3D Objects in the Traffic Environment Using Glances, Side View Mirrors, and Camera Display? Mathematics 2023, 11, 1258. https://doi.org/10.3390/math11051258