Emotion Recognizing by a Robotic Solution Initiative (EMOTIVE Project)
Abstract
1. Introduction
- to determine whether traditional machine learning algorithms could be used to evaluate each user’s emotions independently (intra-classification task);
- to compare emotion recognition between two robotic modalities: a static robot (which does not perform any movement) and a motion robot (which performs movements concordant with the elicited emotions); and
- to assess the acceptability and usability of the assistive robot from the end-user point of view.
2. Materials and Methods
2.1. Robot Used for Experimentation
- Twenty degrees of freedom for natural and expressive movements.
- Perception components to identify and interact with the person talking to it.
- Speech recognition and dialogue available in 15 languages.
- Bumpers, infrared sensors, 2D and 3D cameras, sonars for omnidirectional and autonomous navigation, and an inertial measurement unit (IMU).
- Touch sensors, LEDs, and microphones for multimodal interactions.
- Open and fully programmable platform.
2.2. Recruitment
- No significant neuropsychiatric symptoms, as assessed by the Neuropsychiatric Inventory (NPI) [31]: a LabVIEW interface was created to upload the NPI data and automatically compute the scores (a minimal scoring sketch is given after these criteria lists);
- No significant visual or hearing loss; and
- No cognitive impairment evaluated by mini mental state examination (MMSE) [32]: MMSE score ≥ 27.
- Informed consent not completed and signed;
- Incomplete acceptability and usability assessment; and
- Recorded video not clearly visible.
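For illustration only, the following is a minimal sketch of the standard NPI scoring rule (frequency 1–4 multiplied by severity 1–3 for each domain, summed across the endorsed domains). It is not the authors’ LabVIEW interface, and the domain names and ratings used below are hypothetical examples.

```python
# Minimal sketch of standard NPI scoring (assumption: frequency x severity per
# domain, summed over endorsed domains). Not the authors' LabVIEW implementation.
def npi_total(domain_ratings):
    """domain_ratings: dict mapping domain name -> (frequency, severity).
    Domains with absent symptoms are simply omitted."""
    total = 0
    for domain, (frequency, severity) in domain_ratings.items():
        if not (1 <= frequency <= 4 and 1 <= severity <= 3):
            raise ValueError(f"out-of-range rating for {domain}")
        total += frequency * severity  # domain score = frequency x severity
    return total

# Hypothetical example: two endorsed domains -> total score 2*2 + 3*1 = 7
print(npi_total({"anxiety": (2, 2), "irritability": (3, 1)}))
```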
2.3. Experimentation Protocol
2.4. Data Analysis
2.4.1. Feature Extraction
2.4.2. Classification
- KNN—a non-parametric algorithm used for classification and regression. Class membership is determined by a majority vote among each point’s nearest neighbors: a query point is assigned the class that has the greatest number of representatives within its closest neighbors [3]. In our case, we used K = 3.
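As a rough illustration of this setup (not the authors’ exact pipeline), the sketch below trains a 3-nearest-neighbor classifier with scikit-learn, which is cited among the tools used in this study. The feature matrix and emotion labels are hypothetical placeholders standing in for the per-frame facial features and the six elicited-emotion classes; in the intra-classification task, such a model would be fitted and evaluated separately for each participant.

```python
# Minimal sketch (assumed data shapes, not the authors' pipeline): a 3-NN
# classifier on placeholder per-frame features with six emotion classes.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
features = rng.random((300, 35))        # placeholder: 300 frames x 35 features
labels = rng.integers(0, 6, size=300)   # placeholder: 6 emotion classes

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, stratify=labels, random_state=0)

knn = KNeighborsClassifier(n_neighbors=3)  # K = 3, as stated above
knn.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, knn.predict(X_test)))
```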
2.5. Statistical Analysis
3. Results
3.1. Participant Characteristics
3.2. Emotion Analysis
3.3. Usability and Acceptability Results
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
AMQ | Almere Model Questionnaire
ANX | anxiety
ATT | attitude
FC | facilitating conditions
ITU | intention to use
PAD | perceived adaptability
PENJ | perceived enjoyment
PEOU | perceived ease of use
PS | perceived sociability
PU | perceived usefulness
SI | social influence
SP | social presence
References
- Darwin, C. The Expression of Emotion in Man and Animals; D. Appleton And Company: New York, NY, USA, 1899. [Google Scholar]
- Ekman, P. Afterword: Universality of Emotional Expression? A Personal History of the Dispute. In The Expression of the Emotions in Man and Animals; Darwin, C., Ed.; Oxford University Press: New York, NY, USA, 1998; pp. 363–393. [Google Scholar]
- Sorrentino, A.; Fiorini, L.; Fabbricotti, I.; Sancarlo, D.; Ciccone, F.; Cavallo, F. Exploring Human attitude during Human-Robot Interaction. In Proceedings of the 29th IEEE International Symposium on Robot and Human Interactive Communication, Naples, Italy, 31 August–4 September 2020. [Google Scholar]
- Horstmann, A.C.; Krämer, N.C. Great Expectations? Relation of Previous Experiences With Social Robots in Real Life or in the Media and Expectancies Based on Qualitative and Quantitative Assessment. Front. Psychol. 2019, 10, 939. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Vinciarelli, A.; Pantic, M.; Bourlard, H. Social signal processing: Survey of an emerging domain. Image Vis. Comput. 2009, 27, 1743–1759. [Google Scholar] [CrossRef] [Green Version]
- Nocentini, O.; Fiorini, L.; Acerbi, G.; Sorrentino, A.; Mancioppi, G.; Cavallo, F. A survey of behavioural models for social robots. Robotics 2019, 8, 54. [Google Scholar] [CrossRef] [Green Version]
- De Carolis, B.; Ferilli, S.; Palestra, G. Simulating empathic behavior in a social assistive robot. Multimed. Tools Appl. 2017, 76, 5073–5094. [Google Scholar] [CrossRef]
- Tanevska, A.; Rea, F.; Sandini, G.; Cañamero, L.; Sciutti, A. A Socially Adaptable Framework for Human-Robot Interaction. Front. Robot. AI 2020, 7, 121. [Google Scholar] [CrossRef]
- Chumkamon, S.; Hayashi, E.; Masato, K. Intelligent emotion and behavior based on topological consciousness and adaptive resonance theory in a companion robot. Biol. Inspired Cogn. Arch. 2016, 18, 51–67. [Google Scholar] [CrossRef]
- Cavallo, F.; Semeraro, F.; Fiorini, L.; Magyar, G.; Sinčák, P.; Dario, P. Emotion Modelling for Social Robotics Applications: A Review. J. Bionic Eng. 2018, 15, 185–203. [Google Scholar] [CrossRef]
- Mei, Y.; Liu, Z.T. An emotion-driven attention model for service robot. In Proceedings of the 2016 12th World Congress on Intelligent Control and Automation (WCICA), Guilin, China, 12–15 June 2016; pp. 1526–1531. [Google Scholar]
- Hirschberg, J.; Manning, C.D. Advances in natural language processing. Science 2015, 349, 261–266. [Google Scholar] [CrossRef]
- Nho, Y.H.; Seo, J.W.; Seol, W.J.; Kwon, D.S. Emotional interaction with a mobile robot using hand gestures. In Proceedings of the 2014 11th International Conference on Ubiquitous Robots and Ambient Intelligence, Kuala Lumpur, Malaysia, 12–15 November 2014; pp. 506–509. [Google Scholar]
- Röning, J.; Holappa, J.; Kellokumpu, V.; Tikanmäki, A.; Pietikäinen, M. Minotaurus: A system for affective human–robot interaction in smart environments. Cogn. Comput. 2014, 6, 940–953. [Google Scholar] [CrossRef]
- Jitviriya, W.; Koike, M.; Hayashi, E. Behavior selection system based on emotional variations. In Proceedings of the 2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Kobe, Japan, 31 August–4 September 2015; pp. 462–467. [Google Scholar]
- Van Chien, D.; Sung, K.J.; Trung, P.X.; Kim, J.W. Emotion expression of humanoid robot by modification of biped walking pattern. In Proceedings of the 2015 15th International Conference on Control, Automation and Systems (ICCAS), Busan, Korea, 13–16 October 2015; pp. 741–743. [Google Scholar]
- Sinčák, P.; Novotná, E.; Cádrik, T.; Magyar, G.; Mach, M.; Cavallo, F.; Bonaccorsi, M. Cloud-based Wizard of Oz as a service. In Proceedings of the 2015 IEEE 19th International Conference on Intelligent Engineering Systems (INES), Bratislava, Slovakia, 3–5 September 2015; pp. 445–448. [Google Scholar]
- Leo, M.; Del Coco, M.; Carcagnì, P.; Distante, C.; Bernava, M.; Pioggia, G.; Palestra, G. Automatic emotion recognition in robot-children interaction for ASD treatment. In Proceedings of the 2015 IEEE International Conference on Computer Vision Workshop (ICCVW), Santiago, Chile, 7–13 December 2015; pp. 537–545. [Google Scholar]
- Mazzei, D.; Zaraki, A.; Lazzeri, N.; De Rossi, D. Recognition and expression of emotions by a symbiotic android head. In Proceedings of the 2014 14th IEEE-RAS International Conference on Humanoid Robots (Humanoids), Madrid, Spain, 18–20 November 2014; pp. 134–139. [Google Scholar]
- Boccanfuso, L.; Barney, E.; Foster, C.; Ahn, Y.A.; Chawarska, K.; Scassellati, B.; Shic, F. Emotional robot to examine differences in play patterns and affective response of Children with and without ASD. In Proceedings of the 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Christchurch, New Zealand, 7–10 March 2016; pp. 19–26. [Google Scholar]
- Cao, H.L.; Esteban, P.G.; De Beir, A.; Simut, R.; Van De Perre, G.; Lefeber, D.; Vanderborght, B. ROBEE: A homeostatic-based social behavior controller for robots in Human-Robot Interaction experiments. In Proceedings of the 2014 IEEE International Conference on Robotics and Biomimetics (ROBIO), 5–10 December 2014; pp. 516–521. [Google Scholar]
- Han, J.; Xie, L.; Li, D.; He, Z.J.; Wang, Z.L. Cognitive emotion model for eldercare robot in smart home. China Commun. 2015, 12, 32–41. [Google Scholar]
- Thinakaran, P.; Guttman, D.; Taylan Kandemir, M.; Arunachalam, M.; Khanna, R.; Yedlapalli, P.; Ranganathan, N. Chapter 11—Visual Search Optimization. In High Performance Parallelism Pearls; Reinders, J., Jeffers, J., Eds.; Morgan Kaufmann: Burlington, MA, USA, 2015; pp. 191–209. [Google Scholar]
- Deng, J.; Pang, G.; Zhang, Z.; Pang, Z.; Yang, H.; Yang, G. cGAN based facial expression recognition for human-robot interaction. IEEE Access 2019, 7, 9848–9859. [Google Scholar] [CrossRef]
- Sang, D.V.; Cuong, L.T.B.; Van Thieu, V. Multi-task learning for smile detection, emotion recognition and gender classification. In Proceedings of the Eighth International Symposium on Information and Communication Technology, New York, NY, USA, 7–8 December 2017; pp. 340–347. [Google Scholar]
- Shan, K.; Guo, J.; You, W.; Lu, D.; Bie, R. Automatic facial expression recognition based on a deep convolutional-neural-network structure. In Proceedings of the 2017 IEEE 15th International Conference on Software Engineering Research, Management and Applications (SERA), London, UK, 7–9 June 2017; pp. 123–128. [Google Scholar]
- Siam, A.I.; Soliman, N.F.; Algarni, A.D.; Abd El-Samie, F.E.; Sedik, A. Deploying Machine Learning Techniques for Human Emotion Detection. Comput. Intell. Neurosci. 2022, 2, 8032673. [Google Scholar] [CrossRef] [PubMed]
- SoftBank Robotics Home Page. Available online: https://www.softbankrobotics.com/emea/en/pepper (accessed on 27 January 2022).
- World Medical Association. World Medical Association Declaration of Helsinki: Ethical principles for medical research involving human subjects. JAMA 2013, 310, 2191. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- von Elm, E.; Altman, D.G.; Egger, M.; Pocock, S.J.; Gøtzsche, P.C.; Vandenbroucke, J.P.; STROBE-Initiative. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: Guidelines for reporting of observational studies. Internist 2008, 49, 688–693. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Cummings, J.L.; Mega, M.; Gray, K.; Rosenberg-Thompson, S.; Carusi, D.A.; Gornbein, J. The Neuropsychiatric Inventory: Comprehensive assessment of psychopathology in dementia. Neurology 1994, 44, 2308–2314. [Google Scholar] [CrossRef] [Green Version]
- Folstein, M.; Folstein, S.; McHugh, P. Mini-mental state: A practical method for grading the cognitive state of patients for the clinician. J. Psychiatr. Res. 1975, 12, 189–198. [Google Scholar] [CrossRef]
- Lang, P.J.; Bradley, M.M.; Cuthbert, B.N. International affective picture System (IAPS): Affective ratings of pictures and instruction manual. In Technical Report A-8; University of Florida: Gainesville, FL, USA, 2008. [Google Scholar]
- Lang, P.J. Behavioral treatment and bio-behavioral assessment: Computer applications. In Technology in Mental Health Care Delivery Systems; Sidowski, J.B., Johnson, J.H., Williams, E.A., Eds.; Ablex: Norwood, NJ, USA, 1980; pp. 119–137. [Google Scholar]
- Ekman, P.; Friesen, W.V.; Hager, J.C. Facial Action Coding System. In Manual and Investigator’s Guide; Research Nexus: Salt Lake City, UT, USA, 2002. [Google Scholar]
- Gottman, J.M.; McCoy, K.; Coan, J.; Collier, H. The Specific Affect Coding System (SPAFF) for Observing Emotional Communication in Marital and Family Interaction; Erlbaum: Mahwah, NJ, USA, 1995. [Google Scholar]
- Heerink, M.; Kröse, B.J.A.; Wielinga, B.J.; Evers, V. Assessing acceptance of assistive social agent technology by older adults: The Almere model. Int. J. Soc. Robot. 2010, 2, 361–375. [Google Scholar] [CrossRef] [Green Version]
- Borsci, S.; Federici, S.; Lauriola, M. On the dimensionality of the System Usability Scale: A test of alternative measurement models. Cogn. Process. 2009, 10, 193–197. [Google Scholar] [CrossRef]
- Baltrusaitis, T.; Zadeh, A.; Lim, Y.C.; Morency, L.P. OpenFace 2.0: Facial behavior analysis toolkit. In Proceedings of the 13th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2018), Xi’an, China, 15–19 May 2018; pp. 59–66.
- Baltrusaitis, T. Posted on 28 October 2019. Available online: https://github.com/TadasBaltrusaitis/OpenFace/wiki/Output-Format (accessed on 21 October 2020).
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830. [Google Scholar]
- Bisong, E. Google Colaboratory. In Building Machine Learning and Deep Learning Models on Google Cloud Platform; Apress: Berkeley, CA, USA, 2019; pp. 59–64. [Google Scholar]
- Rudovic, O.; Lee, J.; Dai, M.; Schuller, B.; Picard, R.W. Personalized machine learning for robot perception of affect and engagement in autism therapy. Sci. Robot. 2018, 3, eaao6760. [Google Scholar] [CrossRef] [Green Version]
- Li, X.; Ono, C.; Warita, N.; Shoji, T.; Nakagawa, T.; Usukura, H.; Yu, Z.; Takahashi, Y.; Ichiji, K.; Sugita, N.; et al. Heart Rate Information-Based Machine Learning Prediction of Emotions Among Pregnant Women. Front. Psychiatry 2022, 12, 799029. [Google Scholar] [CrossRef] [PubMed]
- Rakshit, R.; Reddy, V.R.; Deshpande, P. Emotion detection and recognition using HRV features derived from photoplethysmogram signals. In Proceedings of the 2nd Workshop on Emotion Representations and Modelling for Companion Systems, Tokyo, Japan, 16 November 2016; pp. 1–6. [Google Scholar]
- Cheng, Z.; Shu, L.; Xie, J.; Chen, C.P. A novel ECG-based real-time detection method of negative emotions in wearable applications. In Proceedings of the 2017 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC), Shenzhen, China, 15–18 December 2017; pp. 296–301. [Google Scholar]
- Jang, E.H.; Rak, B.; Kim, S.H.; Sohn, J.H. Emotion classification by machine learning algorithm using physiological signals. Proc. Comput. Sci. Inf. Technol. Singap. 2012, 25, 1–5. [Google Scholar]
- Guo, H.W.; Huang, Y.S.; Lin, C.H.; Chien, J.C.; Haraikawa, K.; Shieh, J.S. Heart rate variability signal features for emotion recognition by using principal component analysis and support vectors machine. In Proceedings of the 2016 IEEE 16th International Conference on Bioinformatics and Bioengineering (BIBE), Taichung, Taiwan, 31 October–2 November 2016; pp. 274–277. [Google Scholar]
- Dominguez-Jimenez, J.A.; Campo-Landines, K.C.; Martínez-Santos, J.C.; Delahoz, E.J.; Contreras-Ortiz, S.H. A machine learning model for emotion recognition from physiological signals. Biomed. Signal Process. Control. 2020, 55, 101646. [Google Scholar] [CrossRef]
- Zheng, B.S.; Murugappan, M.; Yaacob, S. Human emotional stress assessment through Heart Rate Detection in a customized protocol experiment. In Proceedings of the 2012 IEEE Symposium on Industrial Electronics and Applications, Bandung, Indonesia, 23–26 September 2012; pp. 293–298. [Google Scholar]
- Ferdinando, H.; Seppänen, T.; Alasaarela, E. Comparing features from ECG pattern and HRV analysis for emotion recognition system. In Proceedings of the 2016 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB), Chiang Mai, Thailand, 5–7 October 2016; pp. 1–6. [Google Scholar]
- Ayata, D.; Yaslan, Y.; Kamasak, M.E. Emotion based music recommendation system using wearable physiological sensors. IEEE Trans. Consum. Electron. 2018, 64, 196–203. [Google Scholar] [CrossRef]
Positive | Negative |
---|---|
To smile | To step back slightly showing disgust |
To clap hands | To cry |
To raise arms and cheer | To bend chest forward showing boredom |
To blow a kiss | To turn head left and right quickly showing fear |
To wave | To bow head showing sadness |
To express appreciation | To fold arms showing confusion |
Parameter | Category | Types |
---|---|---|
Behavioral | Emotion | Joy, sadness, fear, anger, disgust, neutral |
Behavioral | Gaze | Directed gaze, mutual face gaze, none |
Behavioral | Facial expressions | Smile, laugh, raise eyebrows, frown, inexpressive |
Characteristic | All n = 27 | Static Robot n = 18 | Accordant Motion n = 9 | p-Value
---|---|---|---|---
Gender | | | | 0.411
Men/Women | 12/15 | 7/11 | 5/4 |
Men (%) | 44.40 | 38.90 | 55.60 |
Age (years) | | | | 0.049
Mean ± SD | 40.48 ± 10.82 | 37.61 ± 8.14 | 46.22 ± 13.56 |
Range | 28–66 | 28–53 | 31–66 |
Educational level | | | | 0.194
Degree—n (%) | 24 (88.90) | 17 (94.40) | 7 (77.80) |
High school—n (%) | 3 (11.10) | 1 (5.60) | 2 (22.20) |
Measure | All n = 27 | Static Robot n = 18 | Accordant Motion n = 9 | p-Value
---|---|---|---|---
SUS | | | | 0.157
Mean ± SD | 72.87 ± 13.11 | 75.42 ± 14.98 | 67.78 ± 6.18 |
Range * | 45.00–100.00 | 45.00–100.00 | 60.00–77.50 |
AMQ | | | | 0.716
ANX | | | |
Mean ± SD | 7.59 ± 2.54 | 7.62 ± 2.60 | 7.33 ± 2.54 |
Range * | 4–13 | 4–13 | 4–11 |
ATT | | | | 0.726
Mean ± SD | 11.59 ± 1.88 | 11.50 ± 2.01 | 11.78 ± 1.71 |
Range * | 7–15 | 7–15 | 9–14 |
FC | | | | 0.226
Mean ± SD | 6.18 ± 1.88 | 6.50 ± 2.09 | 5.56 ± 1.24 |
Range * | 2–10 | 2–10 | 4–8 |
ITU | | | | 0.525
Mean ± SD | 8.44 ± 3.13 | 8.72 ± 3.18 | 7.89 ± 3.14 |
Range * | 3–15 | 3–15 | 3–12 |
PAD | | | | 0.701
Mean ± SD | 10.74 ± 1.72 | 10.83 ± 1.85 | 10.55 ± 1.51 |
Range * | 7–15 | 7–15 | 8–13 |
PENJ | | | | 0.624
Mean ± SD | 20.18 ± 2.97 | 20.39 ± 3.29 | 19.78 ± 2.33 |
Range * | 15–25 | 15–25 | 16–24 |
PEOU | | | | 0.525
Mean ± SD | 16.96 ± 2.71 | 16.72 ± 3.02 | 17.44 ± 2.01 |
Range * | 12–21 | 12–21 | 14–20 |
PS | | | | 0.527
Mean ± SD | 13.74 ± 2.72 | 13.50 ± 3.18 | 14.22 ± 1.48 |
Range * | 4–18 | 4–18 | 12–16 |
PU | | | | 0.519
Mean ± SD | 9.85 ± 2.26 | 10.05 ± 2.48 | 9.44 ± 1.81 |
Range * | 5–15 | 5–15 | 7–12 |
SI | | | | 0.197
Mean ± SD | 5.48 ± 2.08 | 5.11 ± 2.13 | 6.22 ± 1.85 |
Range * | 2–9 | 2–8 | 4–9 |
SP | | | | 0.194
Mean ± SD | 14.44 ± 2.79 | 13.94 ± 2.62 | 15.44 ± 3.00 |
Range * | 9–19 | 9–19 | 9–19 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).