Smiles and Angry Faces vs. Nods and Head Shakes: Facial Expressions at the Service of Autonomous Vehicles
Abstract
1. Introduction
1.1. Interaction in Traffic
1.2. External Human–Machine Interfaces as a Substitute
1.3. The Case for Anthropomorphism in eHMI Development
1.4. Facial Expressions and Vehicle Yielding Intention
1.5. Facial Expressions and Vehicle Non-Yielding Intention
1.6. Aim and Approach
2. Method
2.1. Design
2.2. Stimuli
2.3. Apparatus
2.4. Procedure
2.5. Dependent Variables
2.6. Participants
2.7. Data Analysis
3. Results
3.1. Latency
3.2. Accuracy
4. Discussion
4.1. Findings
4.2. Implications
4.3. Limitations
4.4. Future Work
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A. Stills Taken from the Animated Sequences
1. Rex looking directly at the participant while smiling.
2. Roxie looking directly at the participant while smiling.
3. Rex looking directly at the participant while nodding.
4. Roxie looking directly at the participant while nodding.
5. Rex looking directly at the participant while making an angry expression.
6. Roxie looking directly at the participant while making an angry expression.
7. Rex looking directly at the participant while shaking its head.
8. Roxie looking directly at the participant while shaking its head.
9. Green circle.
10. Red circle.
- Gong, L. How social is social responses to computers? The function of the degree of anthropomorphism in computer representations. Comput. Hum. Behav. 2008, 24, 1494–1509. [Google Scholar] [CrossRef]
- Tapiro, H.; Oron-Gilad, T.; Parmet, Y. Pedestrian distraction: The effects of road environment complexity and age on pedestrian’s visual attention and crossing behavior. J. Saf. Res. 2020, 72, 101–109. [Google Scholar] [CrossRef] [PubMed]
- Andrade, C. Internal, external, and ecological validity in research design, conduct, and evaluation. Indian J. Psychol. Med. 2018, 40, 498–499. [Google Scholar] [CrossRef]
- Deb, S.; Carruth, D.W.; Sween, R.; Strawderman, L.; Garrison, T.M. Efficacy of virtual reality in pedestrian safety research. Appl. Ergon. 2017, 65, 449–460. [Google Scholar] [CrossRef]
- Vermersch, P. Describing the practice of introspection. J. Conscious. Stud. 2009, 16, 20–57. [Google Scholar]
- Cahour, B.; Salembier, P.; Zouinar, M. Analyzing lived experience of activity. Le Trav. Hum. 2016, 79, 259–284. [Google Scholar] [CrossRef]
- Deb, S.; Carruth, D.W.; Fuad, M.; Stanley, L.M.; Frey, D. Comparison of Child and Adult Pedestrian Perspectives of External Features on Autonomous Vehicles Using Virtual Reality Experiment. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, Washington, DC, USA, 24–28 July 2019; Springer: Berlin/Heidelberg, Germany, 2019; pp. 145–156. [Google Scholar]
- Tapiro, H.; Meir, A.; Parmet, Y.; Oron-Gilad, T. Visual search strategies of child-pedestrians in road crossing tasks. In Proceedings of the Human Factors and Ergonomics Society Europe Chapter 2013 Annual Conference, Torino, Italy, 16–18 October 2013; pp. 119–130. [Google Scholar]
- Charisi, V.; Habibovic, A.; Andersson, J.; Li, J.; Evers, V. Children’s views on identification and intention communication of self-driving vehicles. In Proceedings of the 2017 Conference on Interaction Design and Children, Stanford, CA, USA, 27–30 June 2017; pp. 399–404. [Google Scholar]
- Klin, A.; Jones, W.; Schultz, R.; Volkmar, F.; Cohen, D. Visual fixation patterns during viewing of naturalistic social situations as predictors of social competence in individuals with autism. Arch. Gen. Psychiatry 2002, 59, 809–816. [Google Scholar] [CrossRef] [Green Version]
- Crehan, E.T.; Althoff, R.R. Me looking at you, looking at me: The stare-in-the-crowd effect and autism spectrum disorder. J. Psychiatr. Res. 2021, 140, 101–109. [Google Scholar] [CrossRef]
- Strauss, D.; Shavelle, R.; Anderson, T.W.; Baumeister, A. External causes of death among persons with developmental disability: The effect of residential placement. Am. J. Epidemiol. 1998, 147, 855–862. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Rouchitsas, A.; Alm, H. Smiles and Angry Faces vs. Nods and Head Shakes: Facial Expressions at the Service of Autonomous Vehicles. Multimodal Technol. Interact. 2023, 7, 10. https://doi.org/10.3390/mti7020010