Review of Studies on Emotion Recognition and Judgment Based on Physiological Signals
Abstract
1. Introduction
2. Research and Application of EEG
3. Research and Application of EDA, ECG and EMG
4. Research and Application of Multimodal Physiological Signals
4.1. Two Physiological Signals
4.2. Multiple Physiological Signals
5. Ethical and Privacy Concerns Related to the Use of Physiological Signals for Emotion Recognition
6. Conclusions and Prospects
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Type of Feature Selection | Classifier | Two-Level Fear Evaluation F1 Score (%) | Two-Level Fear Evaluation Accuracy (%) | Four-Level Fear Evaluation F1 Score (%) | Four-Level Fear Evaluation Accuracy (%) |
| --- | --- | --- | --- | --- | --- |
| No Feature Selection | DNN1 | 81.40 | 81.41 | 59.89 | 62.67 |
| | DNN2 | 77.01 | 76.99 | 36.96 | 47.74 |
| | DNN3 | 81.09 | 81.14 | 49.11 | 57.12 |
| | DNN4 | 78.51 | 78.52 | 24.51 | 41.67 |
| | SCN | 77.15 | 78.50 | 45.25 | 46.20 |
| | SVM | 81.64 | 81.64 | 80.85 | 81.70 |
| | RF | 89.96 | 90.07 | 82.59 | 83.24 |
| | LDA | 69.09 | 69.10 | 64.96 | 65.32 |
| | KNN | 83.38 | 83.36 | 80.52 | 80.73 |
| Type of Feature Selection | Classifier | Two-Level Fear Evaluation F1 Score (%) | Two-Level Fear Evaluation Accuracy (%) | Four-Level Fear Evaluation F1 Score (%) | Four-Level Fear Evaluation Accuracy (%) |
| --- | --- | --- | --- | --- | --- |
| No Feature Selection | DNN1 | 81.99 | 81.99 | 67.46 | 68.98 |
| | DNN2 | 78.16 | 78.14 | 55.92 | 58.85 |
| | DNN3 | 82.21 | 82.26 | 57.70 | 60.94 |
| | DNN4 | 79.14 | 79.12 | 30.13 | 43.63 |
| | SCN | 75.12 | 75.50 | 51.20 | 51.50 |
| | SVM | 83.15 | 83.13 | 83.46 | 84.01 |
| | RF | 93.11 | 93.13 | 85.33 | 85.74 |
| | LDA | 70.46 | 70.52 | 60.98 | 61.46 |
| | KNN | 85.84 | 85.82 | 82.94 | 83.24 |
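
The two tables above report F1 scores and accuracies of several classifiers (DNN variants, SCN, SVM, RF, LDA, KNN) for two-level and four-level fear evaluation. As an illustration only, the following minimal sketch shows how such a classifier comparison can be scored with cross-validated macro F1 and accuracy in scikit-learn; the feature matrix, labels, and classifier settings are placeholder assumptions and do not reproduce the cited study's pipeline (the DNN and SCN models in particular are omitted).

```python
# Illustrative sketch only: scoring SVM, RF, LDA, and KNN with cross-validated
# macro F1 and accuracy, as in the comparison tables above. X and y are
# synthetic placeholders, not the physiological features or fear labels of the
# cited study; the DNN and SCN models from the tables are omitted.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_validate
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 30))       # placeholder feature matrix (e.g., EEG/EDA features)
y = rng.integers(0, 4, size=400)     # placeholder four-level fear labels (0-3)

classifiers = {
    "SVM": SVC(kernel="rbf"),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "LDA": LinearDiscriminantAnalysis(),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}

for name, clf in classifiers.items():
    pipe = make_pipeline(StandardScaler(), clf)   # standardize features before classification
    scores = cross_validate(pipe, X, y, cv=5, scoring=("f1_macro", "accuracy"))
    print(f"{name}: F1 = {scores['test_f1_macro'].mean():.2%}, "
          f"accuracy = {scores['test_accuracy'].mean():.2%}")
```

With real, subject-wise splits of physiological features, the same loop would produce a table of the form shown above; the scores printed here are meaningless because the inputs are random.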
| | JAWS, Without PCA (%) | JAWS, With PCA (%) | JAWS_pos, Without PCA (%) | JAWS_pos, With PCA (%) | JAWS_neg, Without PCA (%) | JAWS_neg, With PCA (%) |
| --- | --- | --- | --- | --- | --- | --- |
| ACC | 65.85 | 73.17 | 60.89 | 73.17 | 63.41 | 78.05 |
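
The table above compares classification accuracy (ACC) with and without PCA-based dimensionality reduction for the JAWS, JAWS_pos, and JAWS_neg feature sets. The sketch below illustrates the general "with vs. without PCA" comparison under stated assumptions: X and y are synthetic placeholders, and the SVM classifier and number of retained components are arbitrary choices, not the original study's setup.

```python
# Illustrative sketch only: comparing one classifier's accuracy with and
# without a PCA step, mirroring the "Without PCA / With PCA" columns above.
# X and y are synthetic placeholders; the SVM classifier and the number of
# retained components are assumptions, not the original study's configuration.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 40))       # placeholder feature matrix
y = rng.integers(0, 2, size=120)     # placeholder binary labels

without_pca = make_pipeline(StandardScaler(), SVC())
with_pca = make_pipeline(StandardScaler(), PCA(n_components=10), SVC())

acc_without = cross_val_score(without_pca, X, y, cv=5, scoring="accuracy").mean()
acc_with = cross_val_score(with_pca, X, y, cv=5, scoring="accuracy").mean()
print(f"ACC without PCA: {acc_without:.2%}   ACC with PCA: {acc_with:.2%}")
```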
Signals | Applications |
---|---|
EEG | CAD activity; Cognitive level; Driver’s mentality; Environmental design; Insider risk evaluation; Playing games; Product aesthetics; Product design; Response to automobile sound; Response to songs and music; Viewing paintings, videos, films. |
EDA | CAD activity; Completing arithmetic tasks; Driving stress; Driver’s mentality; Experience of virtual reality; Hearing aid systems; Learners’ physiology; Making decisions in response to health-related messages; Playing games; Pleasure caused by tactile sensations; Response to songs and music; Social and communicative behavior of children; Viewing paintings, films, videos, scenery. |
ECG | CAD activity; Completing arithmetic tasks; Driving stress; Driver’s mentality; Experience of virtual reality; Learners’ physiology; Making decisions in response to health-related messages; Playing games; Product design; Response to songs and music; Viewing paintings, films, videos. |
EMG | Completing arithmetic tasks; Driver’s mentality; Learners’ physiology; Playing games; Response to songs and music; Viewing paintings, films, videos. |
Pulse wave | Response to songs and music; Viewing paintings, scenery. |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).