Human Perception of the Emotional Expressions of Humanoid Robot Body Movements: Evidence from Survey and Eye-Tracking Measurements
Abstract
1. Introduction
- Design the robot’s body movements by imitating human emotional movements, so that the expressed emotions can be perceived by humans, and provide a guide to emotional HRI design for humanoid robots.
- Explore how humans perceive a robot’s different emotional body movements, including the perceived emotional category, intensity, and arousal, to provide references for the design of emotional body movements in HRIs.
2. Related Works
2.1. Bodily Expression of Emotion
2.2. Evaluation of Human Emotional Perception in HRIs
3. Methods
3.1. Robot Platform
3.2. Materials of Robot’s Emotional Body Movements
3.3. Measures
3.3.1. Implementation of Questionnaire
3.3.2. Eye-Tracking Measurement
4. Analysis
4.1. Analysis of Questionnaires
4.2. Eye-Tracking Analysis
4.2.1. Analysis of Pupil Diameter and Saccade Count
4.2.2. Analysis of Fixation Duration and Trajectory
5. Discussion
5.1. Perceptions of Emotional Category, Intensity, and Arousal
5.2. Guide for Grading the Designed Robot’s Bodily Expressions of Emotion
5.3. Corresponding Characteristics of Human’s Perception
6. Conclusions
- It provides subjective and objective evidence that humans can perceive robots’ emotional movements through vision alone, offering a reference for designing long-distance human–robot emotional interactions.
- It proposes a guide for grading the designed robot’s bodily expressions of emotion, which supports nuanced emotional expression by robots in real-world environments; designers can select different emotional expressions for robots according to their needs.
- It summarizes the corresponding characteristics of human perception, which is helpful for regulating human emotions during HRI. Humans perceive happiness more readily than negative emotions such as fear, sadness, anger, and disgust; eye movements show different characteristics when perceiving happiness versus negative emotions; and gender may not affect how humans perceive robots’ emotional body movements.
- It also shows that robots with relatively few DoFs can express strong and diverse emotions with proper body movement design, and that humans are able to perceive these emotions.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Intensity Level | E1 * | E2 * | E3 * | E4 * | E5 * | E6 * |
---|---|---|---|---|---|---|
TL1 | B4, B1 | | | | B16, B15, B14 | |
TL2 | B12 | B7 | B9, B14 | | | |
TL3 | B5, B2 | B8, B6 | B11, B16 | B13, B12 | B9, B19 | B17, B15, B7 |
Emotion | Behaviors * | DoF * | Description |
---|---|---|---|
E1 (Happiness) | B1 | D3, D4, D15, D16 | Raise arms |
| B2 | | Raise arms with forearms vertical |
| B3 | | Stretch arms laterally |
| B4 | | Raise and wave arms |
| B5 | D2, D3, D4, D15, D16, D17 | Wave arms in front of the body |
E2 (Anger) | B6 | D1, D2, D3, D4, D15, D16, D17 | Hold arms together and turn the head sideways |
| B7 | D1, D3, D16 | Arms akimbo and turn the head sideways |
| B8 | D3, D15, D16, D17 | Arm akimbo and point with the hand |
E3 (Sadness) | B9 | D1, D7, D8, D11, D12 | Crouch and shake the head |
| B10 | D2, D3, D4, D15, D16, D17 | Keep hands in front of the body |
| B11 | D3, D4, D6, D13, D15, D16 | Bend forward with hands down |
E4 (Surprise) | B12 | D3, D4, D15, D16 | Raise arms vertically |
| B13 | D6, D7, D12, D13 | Bend backward |
E5 (Fear) | B14 | D3, D4, D6, D7, D8, D11, D12, D13, D15, D16 | Bend backward, hold the head in the hands, and crouch down |
| B15 | D6, D7, D8, D9, D10, D11, D12, D13 | Step back |
| B16 | D3, D4, D7, D8, D11, D12, D15, D16 | Crouch down and hold the head in the hands |
E6 (Disgust) | B17 | D15, D16, D17 | Push one arm out |
| B18 | D3, D4, D15, D16, D17 | Draw back one arm with the other arm down |
| B19 | D3, D4, D15, D16 | Clamp the arms |
Intensity Level | E1 * (Rate) | E2 * (Rate) | E3 * (Rate) | E4 * (Rate) | E5 * (Rate) | E6 * (Rate) |
---|---|---|---|---|---|---|
L1 | B4 (95.45%) | - | - | - | B16 (83.33%) | - |
L2 | B1 (90.91%) | - | - | - | B15 (81.82%) B14 (68.18%) | - |
L3 | B12 (63.64%) B5 (57.57%) | B7 (65.15%) | B9 (56.06%) B14 (37.88%) B11 (66.67%) | B12 (36.36%) B13 (48.48%) | B9 (51.51%) | B17 (63.64%) |
L4 | B2 (51.52%) | B8 (56.06%) B6 (39.39%) | B16 (18.18%) | - | B19 (46.97%) | B15 (18.18%) B7 (31.82%) |
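The grading in the table above orders behaviors within each emotion by their recognition rates, with higher rates corresponding to more reliably perceived intensity levels. A minimal sketch of that ranking step, assuming the rates transcribed from the table (the helper function is our illustration, not the authors’ procedure; only the E1 and E5 columns are shown):

```python
# Recognition rates (%) per behavior, transcribed from the table above for two
# emotion columns. The ranking helper is a hypothetical illustration.
RATES = {
    "E1": {"B4": 95.45, "B1": 90.91, "B12": 63.64, "B5": 57.57, "B2": 51.52},
    "E5": {"B16": 83.33, "B15": 81.82, "B14": 68.18, "B9": 51.51, "B19": 46.97},
}

def rank_by_intensity(emotion: str) -> list[str]:
    """Behaviors for one emotion, ordered from most to least reliably perceived."""
    return sorted(RATES[emotion], key=RATES[emotion].get, reverse=True)
```

Under this reading, B4 ranks as the most reliably perceived happiness behavior and B16 as the most reliably perceived fear behavior, matching the L1 row of the table.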
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Gao, W.; Shen, S.; Ji, Y.; Tian, Y. Human Perception of the Emotional Expressions of Humanoid Robot Body Movements: Evidence from Survey and Eye-Tracking Measurements. Biomimetics 2024, 9, 684. https://doi.org/10.3390/biomimetics9110684