Evaluation of Multimodal External Human–Machine Interface for Driverless Vehicles in Virtual Reality
Abstract
1. Introduction
2. Framework for a Multimodal eHMI of a Driverless Vehicle
2.1. Physical Modality
2.2. Visual Modality
2.3. Auditory Modality
3. Materials and Methods
3.1. Participants
3.2. Apparatus
3.3. Virtual Environment
3.4. Experiment Design
3.5. Procedure
4. Results
4.1. Objective Data Analyses
4.2. Subjective Data Analyses
4.2.1. Simulator Sickness Questionnaire Statistical Analysis Results
4.2.2. Presence Questionnaire Statistical Analysis Results
4.2.3. Interface Rating Scale Statistical Analysis Results
5. Discussion
5.1. Effect on Psychological Comfort
5.2. Effect on Feature Evaluation
5.3. Limitations
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. Interface Rating Scale (IRS)
- Which visual feature on a driverless vehicle makes you feel safer and more comfortable when you are crossing the street? (Table A1)
- Which combination of eHMI modalities on a driverless vehicle makes you feel safer and more comfortable when you are crossing the street? (Table A2)
Visual Feature | First Choice | Second Choice | Third Choice
---|---|---|---
None | | |
Smile | | |
Arrow | | |
Physical | Visual + Auditory | First Choice | Second Choice | Third Choice | Fourth Choice | Fifth Choice | Sixth Choice
---|---|---|---|---|---|---|---
Yielding | None + none | | | | | |
 | None + human voice | | | | | |
 | Smile + none | | | | | |
 | Arrow + none | | | | | |
 | Smile + human voice | | | | | |
 | Arrow + human voice | | | | | |
Not yielding | None + none | | | | | |
 | None + warning sound | | | | | |
 | Smile + none | | | | | |
 | Arrow + none | | | | | |
 | Smile + warning sound | | | | | |
 | Arrow + warning sound | | | | | |
Modality | Combinations
---|---
Physical | Yielding
 | Not yielding
Visual + physical | Smile + yielding
 | Arrow + yielding
 | Smile + not yielding
 | Arrow + not yielding
Auditory + physical | Human voice + yielding
 | Warning sound + not yielding
Visual + auditory + physical | Smile + human voice + yielding
 | Smile + warning sound + not yielding
 | Arrow + human voice + yielding
 | Arrow + warning sound + not yielding
Trial Conditions | Mean (SD) | Median (Interquartile Range)
---|---|---
Yielding | 79.64% (0.142) | 83.94% (76.12%~92.73%)
Smile + yielding | 84.33% (0.106) | 94.52% (89.00%~97.28%)
Human voice + yielding | 90.36% (0.098) | 79.79% (68.90%~92.69%)
Arrow + yielding | 92.81% (0.055) | 94.09% (80.10%~100.00%)
Smile + human voice + yielding | 94.20% (0.080) | 98.97% (86.99%~100.00%)
Arrow + human voice + yielding | 95.66% (0.050) | 96.77% (92.50%~100.00%)
Group | Median (Interquartile Range) | Z | p
---|---|---|---
Single-modal | 77.85% (71.53%~88.99%) | −4.995 | <0.001
Multimodal | 91.60% (85.56%~97.48%) | |
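The single-modal versus multimodal comparison above is a paired nonparametric test. The sketch below implements a Wilcoxon signed-rank test with the large-sample normal approximation; the per-participant scores shown are hypothetical placeholders, not the study's raw data.

```python
import math

def wilcoxon_signed_rank(x, y):
    """Paired Wilcoxon signed-rank test with a normal approximation.

    Returns (W_plus, z, two-sided p). Zero differences are dropped and
    ties in |d| receive average ranks. The normal approximation is only
    reasonable for moderate-to-large n.
    """
    diffs = [b - a for a, b in zip(x, y) if b - a != 0]
    n = len(diffs)
    # Rank the absolute differences, assigning average ranks to ties.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mean) / sd
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return w_plus, z, p

# Hypothetical per-participant comfort proportions for illustration only.
single = [0.72, 0.78, 0.75, 0.80, 0.77]
multi = [0.88, 0.90, 0.85, 0.93, 0.91]
w, z, p = wilcoxon_signed_rank(single, multi)
```

In practice a library routine such as `scipy.stats.wilcoxon` would be used; the hand-rolled version is shown only to make the ranking and z-approximation explicit.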
Trial Conditions | Mean (SD) | Median (Interquartile Range)
---|---|---
Not yielding | 75.51% (0.184) | 77.23% (62.43%~92.93%)
Arrow + not yielding | 71.63% (0.163) | 70.69% (62.10%~80.30%)
Smile + not yielding | 70.07% (0.145) | 71.54% (58.58%~78.45%)
Smile + warning sound + not yielding | 69.83% (0.168) | 67.08% (57.32%~79.83%)
Warning sound + not yielding | 67.84% (0.161) | 68.64% (52.85%~72.68%)
Arrow + warning sound + not yielding | 63.05% (0.124) | 71.05% (56.50%~80.75%)
Group | Median (Interquartile Range) | p
---|---|---
Arrow | 94.52% (89.90%~97.28%) | 0.016
Smile | 83.94% (76.12%~92.73%) |
Mean SSQ score (SD) at various times:

SSQ Score | Baseline | After 12 Trials (10 min) | After 24 Trials (20 min) | After 36 Trials (30 min)
---|---|---|---|---
Total Score | 4.49 (3.22) | 5.98 (5.25) | 10.97 (5.38) | 14.96 (7.35)
Nausea | 5.72 (4.84) | 3.82 (6.03) | 8.26 (6.11) | 11.45 (10.33)
Oculomotor Disturbance | 4.55 (3.84) | 7.07 (6.70) | 11.12 (9.00) | 14.15 (8.53)
Disorientation | 1.86 (4.90) | 3.71 (6.37) | 8.35 (8.80) | 12.99 (12.30)
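The SSQ subscale scores above follow the standard Kennedy et al. (1993) scoring: each of the 16 symptoms is rated 0 (none) to 3 (severe), the raw sums of the three symptom clusters are multiplied by fixed weights (Nausea 9.54, Oculomotor 7.58, Disorientation 13.92), and the total is the sum of the three unweighted cluster sums times 3.74. A minimal sketch, assuming the conventional item-to-cluster mapping of the 16-item questionnaire:

```python
# 1-based item indices per symptom cluster (Kennedy et al., 1993);
# some items contribute to more than one cluster.
NAUSEA = [1, 6, 7, 8, 9, 15, 16]
OCULOMOTOR = [1, 2, 3, 4, 5, 9, 11]
DISORIENTATION = [5, 8, 10, 11, 12, 13, 14]

def ssq_scores(ratings):
    """ratings: list of 16 symptom ratings, each 0 (none) to 3 (severe)."""
    raw = lambda items: sum(ratings[i - 1] for i in items)
    n_raw = raw(NAUSEA)
    o_raw = raw(OCULOMOTOR)
    d_raw = raw(DISORIENTATION)
    return {
        "nausea": n_raw * 9.54,
        "oculomotor": o_raw * 7.58,
        "disorientation": d_raw * 13.92,
        "total": (n_raw + o_raw + d_raw) * 3.74,
    }

# All symptoms rated 1: each cluster has 7 items, so nausea = 7 * 9.54.
scores = ssq_scores([1] * 16)
```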
Sub-Scales of PQ | Items | Mean Score (SD)
---|---|---
Involvement | 1, 2, 3, 4, 6, 7, 8, 13, 16 | 5.06 (0.34)
Immersion | 9, 10, 17 a, 18, 19, 22 | 5.33 (0.37)
Visual fidelity | 14, 15 | 5.17 (0.36)
Interface quality | 20 a, 21 a | 5.10 (0.47)
Sound b | 5, 11, 12 | 5.22 (0.48)
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Dou, J.; Chen, S.; Tang, Z.; Xu, C.; Xue, C. Evaluation of Multimodal External Human–Machine Interface for Driverless Vehicles in Virtual Reality. Symmetry 2021, 13, 687. https://doi.org/10.3390/sym13040687