Editorial

Advancements in Sensors and Analyses for Emotion Sensing

Wataru Sato
Psychological Process Team, Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
Sensors 2024, 24(13), 4166; https://doi.org/10.3390/s24134166
Submission received: 21 June 2024 / Accepted: 24 June 2024 / Published: 27 June 2024
(This article belongs to the Special Issue Advanced-Sensors-Based Emotion Sensing and Recognition)

1. Introduction

Exploring objective signals associated with subjective emotional states has practical significance. Emotional experiences play a fundamental role in people’s well-being [1] and in business success, such as in the development of new products [2]. However, subjective ratings of emotion have several limitations. First, the scientific rigor of self-reports is controversial [3]. Second, subjective ratings can be biased by demand characteristics [4] and social desirability [5]. Third, dynamic, continuous ratings are difficult to provide while performing active tasks [6]. Fourth, specific individuals, such as those with alexithymia [7], those with dementia [8], and young children [9], have difficulty evaluating their emotional experiences. Finally, bodily and behavioral emotional responses may be elicited without subjective experience [10]. Measuring objective signals associated with emotional experiences complements subjective ratings by providing objective, unbiased, dynamic, monitorable, and subtle emotional information.
Many studies have developed emotion-sensing methods based on physical or physiological signals [11]. For example, previous studies reported that facial expressions [12] and autonomic nervous system activity [13] reflect the subjective experiences of basic emotional categories. Other studies reported that facial electromyography (EMG) and electrodermal activity (EDA) reflect the dimensions of emotional valence and arousal, respectively [14]. Several wearable devices have been developed to implement emotion sensing in real-life situations [15], and various machine-learning (ML) models have been proposed to analyze these emotion-related data [16].
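To make the dimensional approach concrete, the following minimal sketch maps facial EMG and EDA features onto valence and arousal with linear regression, in the spirit of the associations reported in ref. [14]. The feature names and data are hypothetical placeholders, not drawn from any of the cited studies.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_trials = 200

# Hypothetical per-trial features: corrugator EMG, zygomatic EMG (a.u.),
# and mean EDA level (microsiemens). Real studies would extract these
# from preprocessed physiological recordings.
X = rng.normal(size=(n_trials, 3))

# Hypothetical continuous ratings; valence loads on the EMG channels and
# arousal on EDA, mimicking the pattern described in ref. [14].
valence = -0.6 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.2, n_trials)
arousal = 0.7 * X[:, 2] + rng.normal(0, 0.2, n_trials)

val_model = LinearRegression().fit(X, valence)
aro_model = LinearRegression().fit(X, arousal)
print("valence coefficients (corrugator, zygomatic, EDA):", val_model.coef_)
print("arousal coefficients (corrugator, zygomatic, EDA):", aro_model.coef_)
```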
However, several limitations remain in current emotion-sensing methods. For example, whether emotion categories can be estimated from facial expressions [17] or physiological signals [18] remains controversial. Some studies report only weak correlations between facial EMG activity and subjective valence experience (e.g., ref. [19]). Only a small proportion of wearable emotion-sensing devices have been empirically validated [20]. Several challenges for emotion analysis have also been identified, including the advanced analysis of multimodal and real-life data [16].
The present Special Issue brings together articles that explore new sensors and analyses for emotion sensing, aiming to overcome these limitations. These articles are introduced in the following sections.

2. Advancements in Sensors

Some of the studies in this Special Issue explored new sensors or measures for emotion sensing. Hsu and Sato [21] validated video-based automated facial action analysis by simultaneously recording video of facial expressions and facial EMG. The results indicated positive correlations between automated facial action signals and facial EMG, although the sensitivity of the former was weaker than that of the latter. To identify new measures for emotion sensing, Sato and Kochiyama [22] recorded EMG from the facial and trapezius muscles, fingertip temperature, and dynamic ratings of subjective emotional valence and arousal while participants viewed emotional film clips. Valence ratings were linearly associated with corrugator and zygomatic EMG, and fingertip temperature changes were associated with dynamic arousal ratings, whereas trapezius EMG was unrelated to either valence or arousal ratings. These results suggest that new measures, including automatically analyzed facial action units and fingertip temperature, are useful for emotion sensing.
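The validation logic of ref. [21] can be illustrated with a minimal sketch: correlate a video-derived action-unit time series with simultaneously recorded facial EMG. The signals below are simulated placeholders (AU4, the brow lowerer, is assumed as the video counterpart of corrugator EMG), not data from the study.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
t = np.arange(0, 60, 0.1)  # 60 s of signal sampled at 10 Hz

# Simulated AU4 (brow lowerer) intensity and a noisier corrugator EMG
# envelope that partly tracks it, standing in for synchronized recordings.
au4 = np.clip(np.sin(0.2 * t) + rng.normal(0, 0.3, t.size), 0, None)
corrugator_emg = 0.8 * au4 + rng.normal(0, 0.4, t.size)

r, p = pearsonr(au4, corrugator_emg)
print(f"AU4 vs. corrugator EMG: r = {r:.2f}, p = {p:.1g}")
```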
Several studies in this Special Issue tested the effectiveness of wearable devices for implementing emotion sensing in real-life situations. Balconi et al. [23] used a portable system to record several measures of autonomic nervous system activity, including pulse volume amplitude, alongside behavioral and self-reported data, in visually impaired and sighted participants during product selection in a grocery store. Across the entire sample, pulse volume amplitudes correlated negatively with the time spent on product identification and positively with the ease of finding products, suggesting that the physiological signals reflected positive emotional reactions. Algumaei et al. [24] recorded heart rate (HR) signals using wearable sensors and measured task performance and subjective frustration during a three-member collaborative computer-based simulation task. Several measures of physiological synchrony estimated from the HR signals were associated with task performance and subjective frustration. Arabian et al. [25] recorded EDA and HR signals from wearable sensors on the hand and chest, respectively, while participants viewed emotional images; ML models built on features selected from these signals effectively identified emotional states. Wang et al. [26] detected driver fatigue based on dual-channel electroencephalography (EEG) and HR signals recorded using a portable device, demonstrating that EEG-based real-time evaluation of driving fatigue is possible. Ohnishi et al. [27] measured children’s behavior during storytelling activities using wearable motion sensors attached to the children’s heads and showed that curiosity levels could be estimated from the head motion data. These findings reveal that recording physiological or physical signals using wearable devices provides useful information about emotional states in real-life situations.
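As a concrete illustration of the physiological-synchrony idea in ref. [24], the sketch below computes one common synchrony measure, the mean pairwise Pearson correlation of HR time series across three team members. This is an illustrative assumption about the estimator, not the exact method of that study, and the data are simulated.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

# Simulated HR (bpm) for three members: a shared task-driven component
# plus member-specific noise, standing in for wearable recordings.
shared = rng.normal(0, 1, 300)
hr = np.stack([70 + 5 * shared + rng.normal(0, 3, 300) for _ in range(3)])

# Mean pairwise correlation as a single team-level synchrony score.
pairwise_r = [np.corrcoef(hr[i], hr[j])[0, 1]
              for i, j in itertools.combinations(range(3), 2)]
print("team synchrony (mean pairwise r):", round(float(np.mean(pairwise_r)), 2))
```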
In summary, these studies suggest that further exploration of sensors and measures for emotion sensing is a promising line of research.

3. Advancements in Analyses

Several studies in this Special Issue reported new analyses aimed at improving emotion sensing. John and Kawanishi [28] proposed a novel method for analyzing multimodal information that addresses the missing-modality and multimodal-portability problems. Experiments on existing datasets validated the proposed method, which classified emotions from audio-visual data more accurately than competing methods. Wang et al. [29] proposed a new feature extraction method based on amplitude-level quantization of HR variability signals for emotion sensing; its emotion classification accuracy surpassed that of existing techniques. Bahameish et al. [30] proposed robust strategies to reduce overfitting and data leakage in ML modeling of HR variability for stress detection, yielding a model that differentiated stress from neutral states accurately and with high generalizability. Xu et al. [31] proposed a network model based on a gated recurrent unit and multi-head attention for speech emotion recognition; the experimental results showed that this model classified speech emotion better than commonly used models. Li et al. [32] proposed a method that adds perturbations to generate diverse emotional responses in dialogue systems, which can be useful for emotion recognition; the method improved the accuracy and diversity of emotional expressions compared with baseline methods. These studies illustrate the development of new analyses that enhance the efficacy and usefulness of emotion sensing based on physical and physiological signals.
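One leakage-avoidance strategy of the kind discussed in ref. [30] is subject-wise cross-validation, in which no participant’s samples appear in both training and test folds. The sketch below illustrates this with scikit-learn’s GroupKFold; the features, labels, and classifier are hypothetical placeholders rather than the study’s actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(3)
n_subjects, per_subject = 20, 30

# Hypothetical HRV feature vectors and stress/neutral labels.
X = rng.normal(size=(n_subjects * per_subject, 8))
y = rng.integers(0, 2, n_subjects * per_subject)

# One group ID per participant; GroupKFold keeps each participant's
# samples entirely within a single fold, preventing subject leakage.
groups = np.repeat(np.arange(n_subjects), per_subject)

scores = cross_val_score(RandomForestClassifier(random_state=0),
                         X, y, groups=groups, cv=GroupKFold(n_splits=5))
print("subject-wise CV accuracy per fold:", scores.round(2))
```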

4. Conclusions

The findings in this Special Issue illustrate recent advancements in sensors and analyses for emotion sensing. With improved sensors and analyses, subjective emotional states can be estimated more accurately and in more realistic situations. Understanding the associations between subjective emotional states and physical/physiological states is also theoretically important and has long been of interest because it can illuminate the psychological mechanisms underlying subjective emotional experiences [33]. Further research on emotion-sensing methods using objective signals is warranted.

Conflicts of Interest

The author declares no conflicts of interest.

References

  1. Lyubomirsky, S. Why are some people happier than others? The role of cognitive and motivational processes in well-being. Am. Psychol. 2001, 56, 239–249. [Google Scholar] [CrossRef] [PubMed]
  2. Meiselman, H.L. A review of the current state of emotion research in product development. Food Res. Int. 2015, 76, 192–199. [Google Scholar] [CrossRef]
  3. Paul, E.S. Towards a comparative science of emotion: Affect and consciousness in humans and animals. Neurosci. Biobehav. Rev. 2020, 108, 749–770. [Google Scholar] [CrossRef] [PubMed]
  4. Saxena, A.; Khanna, A.; Gupta, D. Emotion recognition and detection methods: A comprehensive survey. J. Artif. Intell. Syst. 2020, 2, 53–79. [Google Scholar] [CrossRef]
  5. Westermann, R.; Spies, K.; Stahl, G.; Hesse, F.W. Relative effectiveness and validity of mood induction procedures: A meta-analysis. Eur. J. Soc. Psychol. 1996, 26, 557–580. [Google Scholar] [CrossRef]
  6. Ruef, A.M.; Levenson, R.W. Continuous measurement of emotion: The affect rating dial. In Handbook of Emotion Elicitation and Assessment; Coan, J.A., Allen, J.J.B., Eds.; Oxford University Press: New York, NY, USA, 2007; pp. 286–297. [Google Scholar]
  7. Suslow, T.; Kersting, A. Beyond face and voice: A review of alexithymia and emotion perception in music, odor, taste, and touch. Front. Psychol. 2021, 12, 707599. [Google Scholar] [CrossRef] [PubMed]
  8. Logsdon, R.G.; Gibbons, L.E.; McCurry, S.M.; Teri, L. Assessing quality of life in older adults with cognitive impairment. Psychosom. Med. 2002, 64, 510–519. [Google Scholar] [CrossRef] [PubMed]
  9. Chambers, C.T.; Johnston, C. Developmental differences in children’s use of rating scales. J. Pediatr. Psychol. 2002, 27, 27–36. [Google Scholar] [CrossRef] [PubMed]
  10. Winkielman, P.; Berridge, K.C. Unconscious emotion. Curr. Dir. Psychol. Sci. 2004, 13, 120–123. [Google Scholar] [CrossRef]
  11. Dzedzickis, A.; Kaklauskas, A.; Bucinskas, V. Human emotion recognition: Review of sensors and methods. Sensors 2020, 20, 592. [Google Scholar] [CrossRef]
  12. Sariyanidi, E.; Gunes, H.; Cavallaro, A. Automatic analysis of facial affect: A survey of registration, representation, and recognition. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 37, 1113–1133. [Google Scholar] [CrossRef] [PubMed]
  13. Levenson, R.W. The autonomic nervous system and emotion. Emot. Rev. 2014, 6, 100–112. [Google Scholar] [CrossRef]
  14. Cacioppo, J.T.; Berntson, G.G.; Klein, D.J. What is an emotion? The role of somatovisceral afference, with special emphasis on somatovisceral “illusions”. In Emotion and Social Behavior. IX; Clark, M.S., Ed.; Sage: Thousand Oaks, CA, USA, 1992; pp. 63–98. [Google Scholar]
  15. Saganowski, S.; Perz, B.; Polak, A.G.; Kazienko, P. Emotion recognition for everyday life using physiological signals from wearables: A systematic literature review. IEEE Trans. Affect. Comput. 2023, 14, 1876–1897. [Google Scholar] [CrossRef]
  16. Younis, E.M.G.; Mohsen, S.; Houssein, E.H.; Ibrahim, O.A.S. Machine learning for human emotion recognition: A comprehensive review. Neural Comput. Appl. 2024, 36, 8901–8947. [Google Scholar] [CrossRef]
  17. Duran, J.I.; Reisenzein, R.; Fernandez-Dols, J.-M. Coherence between emotions and facial expressions: A research synthesis. In The Science of Facial Expression; Fernandez-Dols, J.-M., Russell, J.A., Eds.; Oxford University Press: New York, NY, USA, 2017; pp. 107–129. [Google Scholar]
  18. Kreibig, S.D. Autonomic nervous system activity in emotion: A review. Biol. Psychol. 2010, 84, 394–421. [Google Scholar] [CrossRef] [PubMed]
  19. Sato, W.; Minemoto, K.; Ikegami, A.; Nakauma, M.; Funami, T.; Fushiki, T. Facial EMG correlates of subjective hedonic responses during food consumption. Nutrients 2020, 12, 1174. [Google Scholar] [CrossRef] [PubMed]
  20. Peake, J.M.; Kerr, G.; Sullivan, J.P. A critical review of consumer wearables, mobile applications, and equipment for providing biofeedback, monitoring stress, and sleep in physically active populations. Front. Physiol. 2018, 9, 743. [Google Scholar] [CrossRef] [PubMed]
  21. Hsu, C.T.; Sato, W. Electromyographic validation of spontaneous facial mimicry detection using automated facial action coding. Sensors 2023, 23, 9076. [Google Scholar] [CrossRef]
  22. Sato, W.; Kochiyama, T. Exploration of emotion dynamics sensing using trapezius EMG and fingertip temperature. Sensors 2022, 22, 6553. [Google Scholar] [CrossRef] [PubMed]
  23. Balconi, M.; Acconito, C.; Angioletti, L. Emotional effects in object recognition by the visually impaired people in grocery shopping. Sensors 2022, 22, 8442. [Google Scholar] [CrossRef]
  24. Algumaei, M.; Hettiarachchi, I.; Veerabhadrappa, R.; Bhatti, A. Physiological synchrony predict task performance and negative emotional state during a three-member collaborative task. Sensors 2023, 23, 2268. [Google Scholar] [CrossRef]
  25. Arabian, H.; Alshirbaji, T.A.; Schmid, R.; Wagner-Hartl, V.; Chase, J.G.; Moeller, K. Harnessing wearable devices for emotional intelligence: Therapeutic applications in digital health. Sensors 2023, 23, 8092. [Google Scholar] [CrossRef] [PubMed]
  26. Wang, L.; Song, F.; Zhou, T.H.; Hao, J.; Ryu, K.H. EEG and ECG-based multi-sensor fusion computing for real-time fatigue driving recognition based on feedback mechanism. Sensors 2023, 23, 8386. [Google Scholar] [CrossRef] [PubMed]
  27. Ohnishi, A.; Kosaka, S.; Hama, Y.; Saito, K.; Terada, T. A curiosity estimation in storytelling with picture books for children using wearable sensors. Sensors 2024, 24, 4043. [Google Scholar] [CrossRef]
  28. John, V.; Kawanishi, Y. Progressive learning of a multimodal classifier accounting for different modality combinations. Sensors 2023, 23, 4666. [Google Scholar] [CrossRef] [PubMed]
  29. Wang, M.L.; Hao, J.; Zhou, T.H. ECG multi-emotion recognition based on heart rate variability signal features. Sensors 2023, 23, 8636. [Google Scholar] [CrossRef] [PubMed]
  30. Bahameish, M.; Stockman, T.; Carrión, J.R. Strategies for reliable stress recognition: A machine learning approach using heart rate variability features. Sensors 2024, 24, 3210. [Google Scholar] [CrossRef] [PubMed]
  31. Xu, C.; Liu, Y.; Song, W.; Liang, Z.; Chen, X. A new network structure for speech emotion recognition research. Sensors 2024, 24, 1429. [Google Scholar] [CrossRef] [PubMed]
  32. Li, B.; Zhao, H.; Zhang, Z. Diversifying emotional dialogue generation via selective adversarial training. Sensors 2023, 23, 5904. [Google Scholar] [CrossRef]
  33. Friedman, B.H. Feelings and the body: The Jamesian perspective on autonomic specificity of emotion. Biol. Psychol. 2010, 84, 383–393. [Google Scholar] [CrossRef]