Graph Theoretical Analysis of EEG Functional Connectivity Patterns and Fusion with Physiological Signals for Emotion Recognition
Abstract
1. Introduction
- Assessing the performance of graph theory analysis of EEG signals for the problem of emotion recognition.
- Proposing a novel framework for multimodal emotion recognition from EEG and peripheral physiological signals. The novelty of the method lies in the use of graph theory measures for feature extraction from the EEG signals, combined with a scheme for fusing these graph theory features with statistical features from the peripheral physiological signals (a minimal sketch of this pipeline is given after this list).
- Testing the accuracy of different classifiers and a CNN for the emotion recognition problem based on the aforementioned analysis framework.
- Examining the performance of the proposed framework in two different scenarios: a subject-dependent scenario and a subject-independent scenario.
- Evaluating the two different scenarios of the proposed framework using the DEAP dataset [23].
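The sketch below illustrates the general idea of this pipeline. It assumes the phase locking value (PLV, listed in the Abbreviations) as the EEG connectivity measure, an arbitrary binarisation threshold, and a small subset of the graph and statistical features; all function names, parameters, and the classifier are illustrative placeholders rather than the paper's exact implementation.

```python
# Minimal sketch of the proposed fusion pipeline (assumptions: PLV connectivity,
# binary thresholding at 0.5, a subset of the features, and an SVM classifier).
import numpy as np
import networkx as nx
from scipy.signal import hilbert
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC

def plv_matrix(eeg):
    """eeg: (n_channels, n_samples) band-passed EEG -> (n_channels, n_channels) PLV matrix."""
    phase = np.angle(hilbert(eeg, axis=1))
    n = eeg.shape[0]
    plv = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            plv[i, j] = plv[j, i] = np.abs(np.mean(np.exp(1j * (phase[i] - phase[j]))))
    return plv

def graph_features(plv, threshold=0.5):
    """A few global and local graph measures from a thresholded (binary) PLV graph."""
    adj = (plv >= threshold).astype(int)
    np.fill_diagonal(adj, 0)
    g = nx.from_numpy_array(adj)
    feats = [nx.global_efficiency(g), nx.transitivity(g), nx.density(g)]   # global measures
    feats += list(nx.clustering(g).values())                               # local, per channel
    feats += list(nx.betweenness_centrality(g).values())
    feats += list(nx.degree_centrality(g).values())
    return np.array(feats)

def peripheral_features(x):
    """Simple statistical features per peripheral channel (a subset of those used in the paper)."""
    return np.concatenate([x.mean(axis=1), x.var(axis=1), skew(x, axis=1), kurtosis(x, axis=1)])

def fuse(eeg, peripheral):
    """Feature-level fusion by concatenating graph theory and statistical features."""
    return np.concatenate([graph_features(plv_matrix(eeg)), peripheral_features(peripheral)])

# Hypothetical usage on random data shaped like DEAP trials (32 EEG + 8 peripheral channels).
rng = np.random.default_rng(0)
X = np.stack([fuse(rng.standard_normal((32, 1024)), rng.standard_normal((8, 1024)))
              for _ in range(20)])
y = rng.integers(0, 2, 20)          # e.g., low/high valence labels
clf = SVC().fit(X, y)
```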
2. Related Work
2.1. Single Modality Emotion Recognition
2.2. Multimodal Emotion Recognition
3. Materials and Methods
3.1. Dataset
3.2. Data Analysis
3.2.1. Feature Extraction
3.2.2. Experimental Design
4. Results and Discussion
4.1. Subject-Dependent Results
4.2. Subject-Independent Results
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
|---|---|
| EEG | Electroencephalograph |
| CNN | Convolutional neural network |
| HCI | Human–computer interaction |
| ECG | Electrocardiograph |
| GSR | Galvanic skin response |
| MI | Mutual information |
| SVM | Support vector machines |
| RF | Random forest |
| XGB | Extreme gradient boosting |
| PCA | Principal component analysis |
| LDA | Linear discriminant analysis |
| GBDT | Gradient boosting decision tree |
| PLV | Phase locking value |
| P-GCNN | PLV-based graph CNN |
| HR | Heart rate |
| EMG | Electromyograph |
| PPG | Photoplethysmograph |
| LSTM | Long short-term memory |
| RSP | Respiration |
| SC | Skin conductivity |
| GA | Genetic algorithm |
| LOOCV | Leave one out cross validation |
References
- Picard, R.W. Affective Computing; MIT Press: Cambridge, MA, USA, 1997.
- Akinloye, F.O.; Obe, O.; Boyinbode, O. Development of an affective-based e-healthcare system for autistic children. Sci. Afr. 2020, 9, e00514.
- Lara-Alvarez, C.; Mitre-Hernandez, H.; Flores, J.J.; Pérez-Espinosa, H. Induction of emotional states in educational video games through a fuzzy control system. IEEE Trans. Affect. Comput. 2018, 12, 66–77.
- Kumar, S.; Yadava, M.; Roy, P.P. Fusion of EEG response and sentiment analysis of products review to predict customer satisfaction. Inf. Fusion 2019, 52, 41–52.
- Samara, A.; Galway, L.; Bond, R.; Wang, H. Affective state detection via facial expression analysis within a human–computer interaction context. J. Ambient. Intell. Humaniz. Comput. 2019, 10, 2175–2184.
- Picard, R.W.; Vyzas, E.; Healey, J. Toward machine emotional intelligence: Analysis of affective physiological state. IEEE Trans. Pattern Anal. Mach. Intell. 2001, 23, 1175–1191.
- Wu, S.; Xu, X.; Shu, L.; Hu, B. Estimation of valence of emotion using two frontal EEG channels. In Proceedings of the 2017 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), Kansas City, MO, USA, 13–16 November 2017; pp. 1127–1130.
- Sarkar, P.; Etemad, A. Self-supervised learning for ECG-based emotion recognition. In Proceedings of the ICASSP 2020–2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Barcelona, Spain, 4–8 May 2020; pp. 3217–3221.
- Domínguez-Jiménez, J.A.; Campo-Landines, K.C.; Martínez-Santos, J.C.; Delahoz, E.J.; Contreras-Ortiz, S.H. A machine learning model for emotion recognition from physiological signals. Biomed. Signal Process. Control. 2020, 55, 101646.
- Abdullah, S.M.S.A.; Ameen, S.Y.A.; Sadeeq, M.A.; Zeebaree, S. Multimodal emotion recognition using deep learning. J. Appl. Sci. Technol. Trends 2021, 2, 52–58.
- Wang, Z.; Tong, Y.; Heng, X. Phase-locking value based graph convolutional neural networks for emotion recognition. IEEE Access 2019, 7, 93711–93722.
- Wu, X.; Zheng, W.L.; Li, Z.; Lu, B.L. Investigating EEG-based functional connectivity patterns for multimodal emotion recognition. J. Neural Eng. 2022, 19, 016012.
- Sánchez-Reyes, L.M.; Rodríguez-Reséndiz, J.; Avecilla-Ramírez, G.N.; García-Gomar, M.L.; Robles-Ocampo, J.B. Impact of EEG Parameters Detecting Dementia Diseases: A Systematic Review. IEEE Access 2021, 9, 78060–78074.
- Ortiz-Echeverri, C.J.; Salazar-Colores, S.; Rodríguez-Reséndiz, J.; Gómez-Loenzo, R.A. A New Approach for Motor Imagery Classification Based on Sorted Blind Source Separation, Continuous Wavelet Transform, and Convolutional Neural Network. Sensors 2019, 19, 4541.
- Padfield, N.; Zabalza, J.; Zhao, H.; Masero, V.; Ren, J. EEG-based brain-computer interfaces using motor-imagery: Techniques and challenges. Sensors 2019, 19, 1423.
- Padfield, N.; Ren, J.; Qing, C.; Murray, P.; Zhao, H.; Zheng, J. Multi-segment majority voting decision fusion for MI EEG brain-computer interfacing. Cogn. Comput. 2021, 13, 1484–1495.
- Padfield, N.; Ren, J.; Murray, P.; Zhao, H. Sparse learning of band power features with genetic channel selection for effective classification of EEG signals. Neurocomputing 2021, 463, 566–579.
- Anagnostopoulou, A.; Styliadis, C.; Kartsidis, P.; Romanopoulou, E.; Zilidou, V.; Karali, C.; Karagianni, M.; Klados, M.; Paraskevopoulos, E.; Bamidis, P.D. Computerized physical and cognitive training improves the functional architecture of the brain in adults with Down syndrome: A network science EEG study. Netw. Neurosci. 2021, 5, 274–294.
- Jalili, M. Graph theoretical analysis of Alzheimer’s disease: Discrimination of AD patients from healthy subjects. Inf. Sci. 2017, 384, 145–156.
- Supriya, S.; Siuly, S.; Wang, H.; Zhang, Y. Epilepsy detection from EEG using complex network techniques: A review. IEEE Rev. Biomed. Eng. 2021, early access.
- Mahmud, M.S.; Yeasin, M.; Shen, D.; Arnott, S.R.; Alain, C.; Bidelman, G.M. What brain connectivity patterns from EEG tell us about hearing loss: A graph theoretic approach. In Proceedings of the 2018 10th International Conference on Electrical and Computer Engineering (ICECE), Dhaka, Bangladesh, 20–22 December 2018; pp. 205–208.
- Van der Velde, B.; Haartsen, R.; Kemner, C. Test-retest reliability of EEG network characteristics in infants. Brain Behav. 2019, 9, e01269.
- Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A database for emotion analysis; using physiological signals. IEEE Trans. Affect. Comput. 2011, 3, 18–31.
- Ko, B.C. A brief review of facial emotion recognition based on visual information. Sensors 2018, 18, 401.
- Schuller, B.W. Speech emotion recognition: Two decades in a nutshell, benchmarks, and ongoing trends. Commun. ACM 2018, 61, 90–99.
- Kar, N.B.; Babu, K.S.; Sangaiah, A.K.; Bakshi, S. Face expression recognition system based on ripplet transform type II and least square SVM. Multimed. Tools Appl. 2019, 78, 4789–4812.
- Hasani, B.; Mahoor, M.H. Facial expression recognition using enhanced deep 3D convolutional neural networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, Honolulu, HI, USA, 21–26 July 2017; pp. 2278–2288.
- Zhao, J.; Mao, X.; Chen, L. Speech emotion recognition using deep 1D & 2D CNN LSTM networks. Biomed. Signal Process. Control. 2019, 47, 312–323.
- Doma, V.; Pirouz, M. A comparative analysis of machine learning methods for emotion recognition using EEG and peripheral physiological signals. J. Big Data 2020, 7, 1–21.
- Song, T.; Lu, G.; Yan, J. Emotion recognition based on physiological signals using convolution neural networks. In Proceedings of the 2020 12th International Conference on Machine Learning and Computing, Shenzhen, China, 15–17 February 2020; pp. 161–165.
- Yang, S.; Yang, G. Emotion Recognition of EMG Based on Improved LM BP Neural Network and SVM. J. Softw. 2011, 6, 1529–1536.
- Jerritta, S.; Murugappan, M.; Wan, K.; Yaacob, S. Emotion recognition from facial EMG signals using higher order statistics and principal component analysis. J. Chin. Inst. Eng. 2014, 37, 385–394.
- Latha, G.C.P.; Priya, M.M. Multirate Analysis and Neural Network Based Classification of Human Emotions Using Facial Electromyography Signals. ARPN J. Eng. Appl. Sci. 2016, 11, 12767–12776.
- Mithbavkar, S.A.; Shah, M.S. Analysis of EMG Based Emotion Recognition for Multiple People and Emotions. In Proceedings of the 2021 IEEE 3rd Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability (ECBIOS), Tainan, Taiwan, 28–30 May 2021; pp. 1–4.
- Yang, W.; Rifqi, M.; Marsala, C.; Pinna, A. Physiological-based emotion detection and recognition in a video game context. In Proceedings of the 2018 International Joint Conference on Neural Networks (IJCNN), Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–8.
- Xie, J.; Xu, X.; Shu, L. WT feature based emotion recognition from multi-channel physiological signals with decision fusion. In Proceedings of the 2018 First Asian Conference on Affective Computing and Intelligent Interaction (ACII Asia), Beijing, China, 20–22 May 2018; pp. 1–6.
- Gong, P.; Ma, H.T.; Wang, Y. Emotion recognition based on the multiple physiological signals. In Proceedings of the 2016 IEEE International Conference on Real-time Computing and Robotics (RCAR), Angkor Wat, Cambodia, 6–10 June 2016; pp. 140–143.
- Zhao, S.; Gholaminejad, A.; Ding, G.; Gao, Y.; Han, J.; Keutzer, K. Personalized emotion recognition by personality-aware high-order learning of physiological signals. ACM Trans. Multimed. Comput. Commun. Appl. 2019, 15, 1–18.
- Boonthong, P.; Kulkasem, P.; Rasmequan, S.; Rodtook, A.; Chinnasarn, K. Fisher feature selection for emotion recognition. In Proceedings of the 2015 International Computer Science and Engineering Conference (ICSEC), Chiang Mai, Thailand, 23–26 November 2015; pp. 1–6.
- Cui, Y.; Luo, S.; Tian, Q.; Zhang, S.; Peng, Y.; Jiang, L.; Jin, J.S. Mutual information-based emotion recognition. In The Era of Interactive Media; Springer: New York, NY, USA, 2013; pp. 471–479.
- Torres-Valencia, C.; Álvarez-López, M.; Orozco-Gutiérrez, Á. SVM-based feature selection methods for emotion recognition from multimodal data. J. Multimodal User Interfaces 2017, 11, 9–23.
- Park, C.H.; Sim, K.B. The novel feature selection method based on emotion recognition system. In Proceedings of the International Conference on Intelligent Computing; Springer: Berlin/Heidelberg, Germany, 2006; pp. 731–740.
- Zhang, H. Expression-EEG based collaborative multimodal emotion recognition using deep autoencoder. IEEE Access 2020, 8, 164130–164143.
- Zhang, Y.; Cheng, C.; Zhang, Y. Multimodal emotion recognition using a hierarchical fusion convolutional neural network. IEEE Access 2021, 9, 7943–7951.
- Tzirakis, P.; Trigeorgis, G.; Nicolaou, M.A.; Schuller, B.W.; Zafeiriou, S. End-to-end multimodal emotion recognition using deep neural networks. IEEE J. Sel. Top. Signal Process. 2017, 11, 1301–1309.
- Ranganathan, H.; Chakraborty, S.; Panchanathan, S. Multimodal emotion recognition using deep learning architectures. In Proceedings of the 2016 IEEE Winter Conference on Applications of Computer Vision (WACV), Lake Placid, NY, USA, 7–10 March 2016; pp. 1–9.
- Hassan, M.M.; Alam, M.G.R.; Uddin, M.Z.; Huda, S.; Almogren, A.; Fortino, G. Human emotion recognition using deep belief network architecture. Inf. Fusion 2019, 51, 10–18.
- Tang, H.; Liu, W.; Zheng, W.L.; Lu, B.L. Multimodal emotion recognition using deep neural networks. In Proceedings of the International Conference on Neural Information Processing; Springer: Cham, Switzerland, 2017; pp. 811–819.
- Citron, F.M.; Gray, M.A.; Critchley, H.D.; Weekes, B.S.; Ferstl, E.C. Emotional valence and arousal affect reading in an interactive way: Neuroimaging evidence for an approach-withdrawal framework. Neuropsychologia 2014, 56, 79–89.
- Zaki, M.; Alquraini, A.; Sheltami, T.R. Home Automation using EMOTIV: Controlling TV by Brainwaves. J. Ubiquitous Syst. Pervasive Netw. 2018, 10, 27–32.
- Jeong, J.; Chae, J.H.; Kim, S.Y.; Han, S.H. Nonlinear dynamic analysis of the EEG in patients with Alzheimer’s disease and vascular dementia. J. Clin. Neurophysiol. 2001, 18, 58–67.
- Watts, D.J.; Strogatz, S.H. Collective dynamics of ‘small-world’ networks. Nature 1998, 393, 440–442.
- De Vico Fallani, F.; Astolfi, L.; Cincotti, F.; Mattia, D.; La Rocca, D.; Maksuti, E.; Salinari, S.; Babiloni, F.; Vegso, B.; Kozmann, G.; et al. Evaluation of the brain network organization from EEG signals: A preliminary evidence in stroke patient. Anat. Rec. Adv. Integr. Anat. Evol. Biol. 2009, 292, 2023–2031.
- Rubinov, M.; Sporns, O. Complex network measures of brain connectivity: Uses and interpretations. Neuroimage 2010, 52, 1059–1069.
- Bullmore, E.; Sporns, O. The economy of brain network organization. Nat. Rev. Neurosci. 2012, 13, 336–349.
- Rodríguez-Abreo, O.; Rodríguez-Reséndiz, J.; Montoya-Santiyanes, L.; Álvarez-Alvarado, J.M. Non-linear regression models with vibration amplitude optimization algorithms in a microturbine. Sensors 2021, 22, 130.
- Cruz-Miguel, E.E.; García-Martínez, J.R.; Rodríguez-Reséndiz, J.; Carrillo-Serrano, R.V. A new methodology for a retrofitted self-tuned controller with open-source FPGA. Sensors 2020, 20, 6155.
- Lin, Y.P.; Wang, C.H.; Jung, T.P.; Wu, T.L.; Jeng, S.K.; Duann, J.R.; Chen, J.H. EEG-Based Emotion Recognition in Music Listening. IEEE Trans. Biomed. Eng. 2010, 57, 1798–1806.
- Anh, V.H.; Van, M.N.; Ha, B.B.; Quyet, T.H. A real-time model based Support Vector Machine for emotion recognition through EEG. In Proceedings of the 2012 International Conference on Control, Automation and Information Sciences (ICCAIS), Saigon, Vietnam, 26–29 November 2012; pp. 191–196.
- Pandey, P.; Seeja, K. Subject independent emotion recognition from EEG using VMD and deep learning. J. King Saud Univ. Comput. Inf. Sci. 2019, 35, 1730–1738.
- Chao, H.; Dong, L.; Liu, Y.; Lu, B. Emotion recognition from multiband EEG signals using CapsNet. Sensors 2019, 19, 2212.
- Joshi, V.M.; Ghongade, R.B. EEG based emotion detection using fourth order spectral moment and deep learning. Biomed. Signal Process. Control. 2021, 68, 102755.
- Xing, X.; Li, Z.; Xu, T.; Shu, L.; Hu, B.; Xu, X. SAE+LSTM: A New framework for emotion recognition from multi-channel EEG. Front. Neurorobot. 2019, 13, 37.
| Feature Group | Feature | Total Number of Features |
|---|---|---|
| Peripheral physiological signals | Mean | 8 |
| Peripheral physiological signals | Variance | 8 |
| Peripheral physiological signals | Standard deviation | 8 |
| Peripheral physiological signals | Maximum value | 8 |
| Peripheral physiological signals | Minimum value | 8 |
| Peripheral physiological signals | Skewness | 8 |
| Peripheral physiological signals | Kurtosis | 8 |
| Peripheral physiological signals | 25% quantile range | 8 |
| Peripheral physiological signals | 50% quantile range | 8 |
| Peripheral physiological signals | 75% quantile range | 8 |
| Peripheral physiological signals | Zero-crossing rate | 8 |
| Peripheral physiological signals | Approximate entropy | 8 |
| Global graph measures | Characteristic path length | 1 |
| Global graph measures | Global efficiency | 1 |
| Global graph measures | Transitivity | 1 |
| Global graph measures | Modularity | 1 |
| Global graph measures | Density | 1 |
| Local graph measures | Clustering coefficient | 32 |
| Local graph measures | Local efficiency | 32 |
| Local graph measures | Betweenness centrality | 32 |
| Local graph measures | Degree centrality | 32 |
| Total | | 224 |
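As a complement to the pipeline sketch in the Introduction, the snippet below shows one way to obtain the global measures listed above, plus per-node local efficiency, with NetworkX. The binary (thresholded) graph and the greedy community-detection step used for modularity are assumptions for illustration; the paper may use weighted graphs or a dedicated toolbox such as the Brain Connectivity Toolbox.

```python
# Illustrative computation of the global graph measures in the table above
# (binary adjacency matrix assumed; thresholding choice not taken from the paper).
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity

def global_graph_measures(adj):
    """adj: symmetric binary (n_channels x n_channels) adjacency matrix."""
    g = nx.from_numpy_array(adj)
    # Characteristic path length is only finite on a connected graph,
    # so fall back to the largest connected component if necessary.
    gc = g.subgraph(max(nx.connected_components(g), key=len))
    communities = greedy_modularity_communities(g)
    return {
        "characteristic_path_length": nx.average_shortest_path_length(gc),
        "global_efficiency": nx.global_efficiency(g),
        "transitivity": nx.transitivity(g),
        "modularity": modularity(g, communities),
        "density": nx.density(g),
    }

def local_efficiency_per_node(g):
    """Per-node local efficiency: global efficiency of each node's neighbourhood subgraph."""
    return [nx.global_efficiency(g.subgraph(list(g.neighbors(n)))) for n in g]

# Hypothetical usage with a random symmetric 32-channel adjacency matrix.
rng = np.random.default_rng(1)
upper = np.triu((rng.random((32, 32)) > 0.7).astype(int), 1)
adj = upper + upper.T
print(global_graph_measures(adj))
print(local_efficiency_per_node(nx.from_numpy_array(adj)))
```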
Subject-dependent classification accuracy (%, mean ± standard deviation) for each feature set:

| Classifier | Physiological Features (Valence) | Physiological Features (Arousal) | Graph Theory Features (Valence) | Graph Theory Features (Arousal) | Concatenation (Valence) | Concatenation (Arousal) |
|---|---|---|---|---|---|---|
| SVM | 68.5 ± 4.76 | 71.12 ± 6.42 | 71.5 ± 5.21 | 72.58 ± 7.12 | 82.4 ± 5.39 | 81.15 ± 8.39 |
| RF | 72.7 ± 5.18 | 73.64 ± 5.12 | 75.2 ± 5.19 | 78.27 ± 6.26 | 82.68 ± 5.77 | 81.9 ± 7.09 |
| XGB | 73.2 ± 4.76 | 75.34 ± 8.07 | 79.8 ± 4.98 | 80.12 ± 8.51 | 83.41 ± 6.09 | 82.92 ± 7.41 |
| CNN | 76.5 ± 5.14 | 78.24 ± 7.35 | 81.2 ± 5.41 | 80.89 ± 6.72 | 83.94 ± 6.77 | 83.87 ± 7.72 |
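For the CNN rows in the table above, a purely illustrative PyTorch model operating on the fused feature vector is sketched below. The paper's actual architecture, input arrangement, and training procedure are not reproduced here; the layer sizes, kernel widths, and the 224-dimensional input are placeholders chosen only to make the example concrete.

```python
# Hypothetical 1-D CNN over a fused feature vector (not the paper's architecture).
import torch
import torch.nn as nn

class FusionCNN(nn.Module):
    def __init__(self, n_features=224, n_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Flatten(),
            nn.Linear(32 * (n_features // 4), 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):                    # x: (batch, n_features)
        return self.net(x.unsqueeze(1))      # add a channel dimension for Conv1d

model = FusionCNN()
logits = model(torch.randn(8, 224))          # a batch of 8 fused feature vectors
```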
Subject-dependent classification accuracy (%, mean ± standard deviation) of the fused features without and with GA feature selection:

| Classifier | Without GA (Valence) | Without GA (Arousal) | With GA (Valence) | With GA (Arousal) |
|---|---|---|---|---|
| SVM | 82.4 ± 5.39 | 81.15 ± 8.39 | 85.71 ± 5.27 | 84.37 ± 7.32 |
| RF | 82.68 ± 5.77 | 81.9 ± 7.09 | 87.65 ± 4.68 | 86.92 ± 6.06 |
| XGB | 83.41 ± 6.09 | 82.92 ± 7.41 | 87.78 ± 4.99 | 87.72 ± 6.39 |
| CNN | 83.94 ± 6.77 | 83.87 ± 7.72 | 88.27 ± 5.43 | 90.84 ± 6.15 |
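The GA feature selection can be pictured as evolving a binary mask over the fused feature vector, with cross-validated accuracy as the fitness. The sketch below is a generic, hand-rolled GA; the population size, selection, crossover, and mutation settings are illustrative and not taken from the paper.

```python
# Minimal GA-based feature selection sketch (hypothetical hyperparameters).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def ga_select(X, y, pop_size=20, generations=30, p_mut=0.02, seed=0):
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]
    pop = rng.random((pop_size, n_feat)) < 0.5              # random binary feature masks

    def fitness(mask):
        if not mask.any():
            return 0.0
        return cross_val_score(SVC(), X[:, mask], y, cv=5).mean()

    for _ in range(generations):
        scores = np.array([fitness(m) for m in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]               # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_feat)                   # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n_feat) < p_mut             # bit-flip mutation
            children.append(child)
        pop = np.vstack([parents, np.array(children)])
    scores = np.array([fitness(m) for m in pop])
    return pop[scores.argmax()]                             # best feature mask found
```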
Comparison of the subject-dependent results with related work on the DEAP dataset (accuracy %):

| Paper | Method | Valence | Arousal |
|---|---|---|---|
| Wang et al. [11] | | 73.31 ± 11.66 | 77.03 ± 11.49 |
| Tang et al. [48] | | 83.82 ± 5.01 | 83.23 ± 2.61 |
| Zhang et al. [44] | | 84.71 | 83.28 |
| Wu et al. [12] | | 85.34 ± 2.90 | 86.61 ± 3.76 |
| Our work | | 88.27 ± 5.43 | 90.84 ± 6.15 |
Subject-independent classification accuracy (%, mean ± standard deviation) of the CNN on the fused features without and with GA feature selection:

| Classifier | Without GA (Valence) | Without GA (Arousal) | With GA (Valence) | With GA (Arousal) |
|---|---|---|---|---|
| CNN | 55.62 ± 4.42 | 57.38 ± 6.12 | 75.44 ± 5.14 | 78.76 ± 5.42 |
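For the subject-independent scenario, the evaluation is assumed to follow leave-one-subject-out cross-validation (the LOOCV of the abbreviation list), which scikit-learn expresses with LeaveOneGroupOut over subject IDs. The data shapes below mirror DEAP (32 subjects × 40 trials), while the feature matrix, labels, and classifier are random placeholders standing in for the fused features and the CNN.

```python
# Leave-one-subject-out evaluation sketch (placeholder data and classifier).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((32 * 40, 224))       # fused feature vectors for all trials
y = rng.integers(0, 2, 32 * 40)               # e.g., low/high arousal labels
subjects = np.repeat(np.arange(32), 40)       # subject ID of each trial

logo = LeaveOneGroupOut()                     # each fold holds out one subject entirely
scores = cross_val_score(RandomForestClassifier(), X, y, cv=logo, groups=subjects)
print(f"leave-one-subject-out accuracy: {scores.mean():.3f} ± {scores.std():.3f}")
```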
Comparison of the subject-independent results with related work on the DEAP dataset (accuracy %):

| Paper | Method | Valence | Arousal |
|---|---|---|---|
| Pandey et al. [60] | | 62.5 | 61.25 |
| Chao et al. [61] | | 66.73 | 68.28 |
| Joshi et al. [62] | | 75.5 | 76 |
| Xing et al. [63] | | 81.1 | 74.38 |
| Our work | | 75.44 ± 5.14 | 78.76 ± 5.42 |