Expressure: Detect Expressions Related to Emotional and Cognitive Activities Using Forehead Textile Pressure Mechanomyography
Abstract
1. Introduction
1.1. Paper Contribution
1.2. Paper Structure
2. Background and Related Work
2.1. The Psychology of Emotional Facial Expressions
2.2. Wearable Expression and Cognitive Activity Detection: Related Work
2.3. Textile Mechanomyography
2.4. Challenges
2.4.1. Anatomical Sensing Challenges
2.4.2. Psychological Challenges
3. Apparatus
4. Experiment 1: Instructed Mimicking
4.1. Participants
4.2. Experiment Design
4.3. Data Analysis
- the per-pixel average of all frames;
- the sum of per-pixel differences between consecutive frames;
- the sums of only the positive or only the negative per-pixel differences;
- the frame with the maximum mean pixel value;
- the frame with the minimum mean pixel value;
- the frame with the maximum standard deviation in the stream;
- the per-pixel average counting only pixel values greater than the corresponding frame's mean (see the sketch after this list).
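As an illustration only, the following is a minimal NumPy sketch of these frame-level features, assuming the pressure stream is an array of shape (T, H, W), i.e., T frames of an H × W sensor matrix; the function name and array layout are our assumptions, not the authors' implementation.

```python
import numpy as np

def frame_features(stream: np.ndarray) -> dict:
    """Statistical features from a pressure-frame stream of shape (T, H, W).
    A sketch under an assumed array layout, not the paper's implementation."""
    diffs = np.diff(stream, axis=0)            # per-pixel frame-to-frame differences
    frame_means = stream.mean(axis=(1, 2))     # mean pixel value of each frame
    return {
        # per-pixel average of all frames (an H x W map)
        "avg_frame": stream.mean(axis=0),
        # sum of per-pixel differences, and the positive/negative parts alone
        "diff_sum": diffs.sum(),
        "diff_sum_pos": diffs[diffs > 0].sum(),
        "diff_sum_neg": diffs[diffs < 0].sum(),
        # frames with extreme summary statistics
        "max_mean_frame": stream[np.argmax(frame_means)],
        "min_mean_frame": stream[np.argmin(frame_means)],
        "max_std_frame": stream[np.argmax(stream.std(axis=(1, 2)))],
        # per-pixel average over only the values above each frame's mean
        "above_mean_avg": np.nanmean(
            np.where(stream > frame_means[:, None, None], stream, np.nan), axis=0),
    }
```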
4.4. Cross-Validation of Detecting Facial Expressions
- Mode 1: three eyebrow states {Neutral, Lower, Raise}
- Mode 2: seven emotions {Neutral, Joy, Disgust, Fear, Anger, Sadness, Surprise}
- Mode 3: emotion groups based on eyebrow state {{Neutral, Joy, Sadness}, {Anger, Disgust}, {Fear, Surprise}} (see the relabelling sketch below)
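Mode 3 is a deterministic regrouping of the Mode 2 labels, so evaluating it reduces to relabelling before scoring. A minimal sketch, assuming a scikit-learn workflow with an illustrative k-NN classifier and 5-fold split (neither confirmed by the paper):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Mode 2 -> Mode 3 relabelling: group the seven emotions by the eyebrow
# state they involve (grouping taken from the list above).
MODE3_GROUPS = {
    "Neutral": "neutral-brow", "Joy": "neutral-brow", "Sadness": "neutral-brow",
    "Anger": "lowered-brow", "Disgust": "lowered-brow",
    "Fear": "raised-brow", "Surprise": "raised-brow",
}

def cross_validate(X: np.ndarray, emotions: list, mode: int, folds: int = 5) -> float:
    """k-fold cross-validation for one labelling mode. The classifier and
    fold count are illustrative assumptions, not the paper's exact setup."""
    y = np.array([MODE3_GROUPS[e] for e in emotions]) if mode == 3 else np.array(emotions)
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, X, y, cv=folds).mean()
```

Cross-mode validation (Section 4.5) could then train under one labelling and score under another by mapping predictions through the same grouping.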
4.5. Cross-Mode Validation
5. Experiment 2: Cognitive Loads
5.1. Participants
5.2. Experiment Design
5.3. Data Analysis Method
5.4. Correlating with Short Time Window Cognitive Activities
5.5. Predicting Cognitive Loads during Longer Periods
6. Conclusions, Discussion and Future Outlook
Author Contributions
Funding
Conflicts of Interest
References
| No. | Stimulus/Input Conditions | χ² | p | Num. |
|---|---|---|---|---|
| 1 | Any positive stimulus | 30.5152 | 0.2100 | 0 |
| 2 | Any positive input | 25.9685 | 0.4100 | 0 |
| 3 | Both inputs equal stimuli | 34.9081 | 0.0900 | 0 |
| 4 | Any positive correct input | 42.8637 | 0.0100 | 1 |
| 5 | Any positive input or stimulus | 24.8547 | 0.4700 | 0 |
| 6 | Any positive input or stimulus and both inputs match stimuli | 57.3435 | 0.0002 | 18 |
| 7 | Any input matches stimulus | 112.6133 | 0.0000 | 68 |
| 8 | Any positive input does not match stimulus (red feedback) | 44.1696 | 0.0100 | 4 |
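The statistic and p columns are consistent with a two-histogram consistency test comparing signal distributions inside and outside each stimulus/input condition. Below is a minimal sketch of the classic two-histogram chi-square statistic in Porter's formulation; whether the paper used exactly this variant is an assumption on our part.

```python
import numpy as np
from scipy.stats import chi2

def histogram_consistency(u: np.ndarray, v: np.ndarray):
    """Chi-square test that two histograms u and v (raw counts over the same
    bins) are drawn from the same parent distribution. Follows the standard
    two-histogram statistic (e.g., Porter, arXiv:0804.0380); using this exact
    variant is an assumption, not confirmed by the paper."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    Nu, Nv = u.sum(), v.sum()
    nonzero = (u + v) > 0                      # empty bins carry no information
    stat = ((Nv * u - Nu * v)[nonzero] ** 2 /
            (u + v)[nonzero]).sum() / (Nu * Nv)
    dof = nonzero.sum() - 1                    # normalization removes one degree
    return stat, chi2.sf(stat, dof)            # statistic and p-value
```

For each numbered condition, one histogram would presumably be built from the sensor features during matching windows and the other from the remaining windows; a small p then flags conditions where the two distributions differ.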
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).