Feature Selection for Continuous within- and Cross-User EEG-Based Emotion Recognition
Abstract
1. Introduction
- An analysis of existing EEG features and their selection for recognizing time-varying emotions induced by longer stimuli.
- An evaluation of the impact of various cross-validation evaluation frameworks, and thereby use cases, on the design and performance of EEG-based emotion classification.
2. Related Work
3. Methods
3.1. Dataset & Preprocessing
3.2. Feature Extraction & Selection
3.2.1. Feature Extraction
- PSD: The power spectral density (PSD), which describes the distribution of signal power across frequencies [58], is one of the most common EEG features. The PSD was calculated for different EEG frequency bands, as different bands have been associated with different processes [12]. Theta waves are associated with affective processing [59]. The slower (lower) alpha band reflects attentional demands such as alertness and expectancy, whereas the alpha band as a whole reflects task-related processes [60]; alpha waves tend to occur when someone is in a relaxed state of mind, while beta waves tend to occur when an individual is in a more active state [61]. The gamma band has been linked to the processing of emotional material [62]. Following AMIGOS [29], the Welch method with windows of 128 samples (1 s) was used to calculate the PSDs. The PSDs were then averaged over each frequency band, and the logarithms of these band powers were taken as features. The terms ‘low’ and ‘high’ refer to the lower and upper portions of a band’s frequency range. A sketch of this computation, together with the asymmetry features, follows this list.
- Spectral Asymmetry (Asymm): Spectral asymmetry leverages both frequency-domain and spatial information about emotional changes in the brain [41]. Asymmetry in the different bands across channels has been shown to be indicative of different emotions; for example, the alpha asymmetry between frontal-lobe channels F3 and F4 can relate to valence [63], and the beta asymmetry between parietal-lobe channels P3 and P4 can correlate with responses to angry facial expressions [64]. The differential spectral asymmetry was calculated by taking the difference of the PSD features from symmetric channels in the left and right hemispheres (see the sketch after this list).
- Hjorth: The Hjorth Mobility (HM) is an estimate of the signal’s mean frequency, defined as the square root of the variance of the signal’s first derivative normalized by the variance of the signal. The Hjorth Complexity (HC) reflects the bandwidth and the change in frequency, and is computed as the mobility of the signal’s derivative divided by the mobility of the signal [65]. While not yet widely adopted, Hjorth features have been shown to be relevant for emotion recognition [43]. Equations are as given in [66]; a sketch follows this list.
- Detrended Fluctuation Analysis (DFA): DFA quantifies the statistical persistence (auto-correlation) of non-stationary physiological signals [67]. Briefly, DFA evaluates the fluctuation of the detrended, integrated signal as a function of window size. Commonly used in many fields, including ECG analysis, DFA has also been found to be beneficial for EEG emotion recognition [68]. Equations are as given in [66]; a sketch follows this list.
- Fractal Dimension (FD): Fractal dimension approaches, such as the Petrosian fractal dimension (PFD) and the Higuchi fractal dimension (HFD), measure signal complexity [69] and are commonly used for non-stationary and transient signals. The Higuchi fractal dimension has been used more frequently in emotion recognition works [12], but in neurophysiology, both the Higuchi and Petrosian fractal dimensions are commonly cited [70]. Equations are as given in [66]; a sketch follows this list.
- Entropy (Ent): Entropy is a measure of chaos, or disorder, in a system or signal and is therefore used to quantify signal complexity [71]. Here, the spectral entropy (SpecEnt), the entropy of the PSD [72], and the SVD entropy (svdEnt), an indicator of how many singular vectors are needed to adequately reconstruct the signal [72], were used. Hatamikia et al. [44] found that spectral entropy outperformed the Petrosian and Katz fractal dimensions for emotion recognition. Gupta et al. [35] used SVD entropy as part of a feature set to classify discrete emotions for short movies. Equations are as given in [72]; sketches follow this list.
- Fisher Information (FI): The Fisher information is a measure of how much information a random variable carries about the data that it models, and is also known as the expected value of the observed information [73]. Although less commonly used, the FI of EEG has been shown to contain affect information [35]. Equations are as given in [66]; a sketch of the entropy and FI features follows this list.
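To make the first two feature types concrete, below is a minimal Python sketch of the band-wise log-PSD and differential asymmetry computations under the settings described above (Welch’s method, 128-sample windows at 128 Hz). The band edges, the F3/F4 pairing in the example, and taking the asymmetry as a difference of log band powers are illustrative assumptions rather than the paper’s exact configuration.

```python
import numpy as np
from scipy.signal import welch

FS = 128  # AMIGOS EEG sampling rate (Hz)

# Illustrative band edges (Hz); the paper's exact low/high sub-bands may differ.
BANDS = {"theta": (4, 8), "slow_alpha": (8, 10), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_log_psd(x, fs=FS):
    """Log of the mean Welch PSD within each band for one EEG channel."""
    freqs, psd = welch(x, fs=fs, nperseg=fs)  # 128-sample (1 s) windows
    return {name: np.log(psd[(freqs >= lo) & (freqs < hi)].mean())
            for name, (lo, hi) in BANDS.items()}

def differential_asymmetry(x_left, x_right, fs=FS):
    """Band-wise difference of log-PSD features between symmetric channels."""
    left, right = band_log_psd(x_left, fs), band_log_psd(x_right, fs)
    return {band: left[band] - right[band] for band in BANDS}

# Example: alpha asymmetry between frontal channels F3 and F4 (random stand-ins).
rng = np.random.default_rng(0)
f3, f4 = rng.standard_normal(10 * FS), rng.standard_normal(10 * FS)
print(differential_asymmetry(f3, f4)["alpha"])
```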
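A sketch of the Hjorth parameters from the definitions above; using finite differences for the derivative is a standard discrete approximation and an assumption here.

```python
import numpy as np

def hjorth(x):
    """Hjorth mobility and complexity of a 1-D signal (definitions per [65,66])."""
    dx = np.diff(x)    # first derivative (finite differences)
    ddx = np.diff(dx)  # second derivative
    mobility = np.sqrt(np.var(dx) / np.var(x))                 # ~ mean frequency
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility  # bandwidth / frequency change
    return mobility, complexity
```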
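A compact DFA sketch following the standard procedure (integrate, detrend each window with a linear fit, fit the log-log slope of fluctuation versus window size); the window sizes are illustrative and assume a signal of at least a few hundred samples.

```python
import numpy as np

def dfa(x, window_sizes=(4, 8, 16, 32, 64, 128)):
    """Detrended fluctuation analysis scaling exponent of a 1-D signal."""
    y = np.cumsum(x - np.mean(x))  # integrated, mean-removed profile
    flucts = []
    for n in window_sizes:
        n_windows = len(y) // n
        t = np.arange(n)
        sq_err = 0.0
        for w in range(n_windows):
            seg = y[w * n:(w + 1) * n]
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrending
            sq_err += np.mean((seg - trend) ** 2)
        flucts.append(np.sqrt(sq_err / n_windows))
    # The DFA exponent is the slope of log F(n) versus log n.
    return np.polyfit(np.log(window_sizes), np.log(flucts), 1)[0]
```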
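Sketches of the two fractal dimensions, following the commonly cited formulations (e.g., as implemented in PyEEG [66]); the scale parameter k_max is an assumed default, not a value from the paper.

```python
import numpy as np

def petrosian_fd(x):
    """Petrosian fractal dimension from sign changes of the first derivative."""
    n = len(x)
    dx = np.diff(x)
    n_delta = np.sum(dx[1:] * dx[:-1] < 0)  # sign changes in the derivative
    return np.log10(n) / (np.log10(n) + np.log10(n / (n + 0.4 * n_delta)))

def higuchi_fd(x, k_max=8):
    """Higuchi fractal dimension from average curve lengths at scales 1..k_max."""
    n = len(x)
    mean_lengths = []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            # Normalized length of the curve sub-sampled at interval k, offset m
            lm = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((len(idx) - 1) * k * k)
            lengths.append(lm)
        mean_lengths.append(np.mean(lengths))
    # FD is the slope of log L(k) versus log(1/k).
    k = np.arange(1, k_max + 1)
    return np.polyfit(np.log(1.0 / k), np.log(mean_lengths), 1)[0]
```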
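Finally, sketches of the spectral entropy, SVD entropy, and Fisher information. The SVD-based measures operate on a delay-embedding matrix; the embedding order and delay defaults mirror common PyEEG/EntroPy-style choices [66,72] and are assumptions here.

```python
import numpy as np
from scipy.signal import welch

def _embed(x, order=3, delay=1):
    """Delay-embedding matrix shared by the SVD-based measures."""
    n = len(x) - (order - 1) * delay
    return np.array([x[i * delay:i * delay + n] for i in range(order)]).T

def spectral_entropy(x, fs=128):
    """Shannon entropy of the normalized Welch PSD (normalized to [0, 1])."""
    _, psd = welch(x, fs=fs, nperseg=fs)
    p = psd / psd.sum()
    p = p[p > 0]  # avoid log(0)
    return -np.sum(p * np.log2(p)) / np.log2(len(p))

def _normalized_singular_values(x, order, delay):
    s = np.linalg.svd(_embed(x, order, delay), compute_uv=False)
    return s / s.sum()

def svd_entropy(x, order=3, delay=1):
    """Entropy of the normalized singular values: how many components matter."""
    s = _normalized_singular_values(x, order, delay)
    return -np.sum(s * np.log2(s))

def fisher_info(x, order=3, delay=1):
    """Fisher information over the same normalized singular-value spectrum."""
    s = _normalized_singular_values(x, order, delay)
    return np.sum((s[1:] - s[:-1]) ** 2 / s[:-1])
```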
3.2.2. Feature Selection
3.3. Emotion Classification & Evaluation Metrics
3.4. Evaluation Frameworks
3.5. Statistical Testing
4. Results
4.1. LOPO Cross-Validation
4.2. LOMO Cross-Validation
4.2.1. LOMO-Inter Cross-Validation
4.2.2. LOMO-Within Cross-Validation
4.3. LOPMO Cross-Validation
4.4. Comparative Performance Analysis
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Mental Health Commission of Canada. Making the Case for Investing in Mental Health in Canada; Technical Report; Mental Health Commission of Canada: Ottawa, ON, Canada, 2013.
- Public Health Agency of Canada. Mood and Anxiety Disorders in Canada. 2015. Available online: https://www.canada.ca/content/dam/canada/health-canada/migration/healthy-canadians/publications/diseases-conditions-maladies-affections/mental-mood-anxiety-anxieux-humeur/alt/mental-mood-anxiety-anxieux-humeur-eng.pdf (accessed on 6 December 2019).
- Osuch, E.A.; Vingilis, E.; Fisman, S.; Summerhurst, C. Early Intervention in Mood and Anxiety Disorders: The First Episode Mood and Anxiety Program (FEMAP). Healthc. Q. 2016, 18, 42–49.
- Mower, E.; Mataric, M.J.; Narayanan, S. A Framework for Automatic Human Emotion Classification Using Emotion Profiles. IEEE Trans. Audio Speech Lang. Process. 2011, 19, 1057–1070.
- Valstar, M.F.; Pantic, M. Induced Disgust, Happiness and Surprise: An Addition to the MMI Facial Expression Database. In Proceedings of the 3rd International Workshop on EMOTION (satellite of LREC): Corpora for Research on Emotion and Affect, Valletta, Malta, 23 May 2010; pp. 65–70.
- Niu, X.; Chen, L.; Xie, H.; Chen, Q.; Li, H. Emotion pattern recognition using physiological signals. Sens. Transducers 2014, 172, 147.
- He, C.; Yao, Y.J.; Ye, X.S. An Emotion Recognition System Based on Physiological Signals Obtained by Wearable Sensors. In Wearable Sensors and Robots; Springer: Singapore, 2017; pp. 15–25.
- Liu, Y.; Sourina, O.; Nguyen, M.K. Real-time EEG-based emotion recognition and its applications. In Transactions on Computational Science XII; Springer: Berlin/Heidelberg, Germany, 2011; pp. 256–277.
- Balconi, M.; Lucchiari, C. EEG correlates (event-related desynchronization) of emotional face elaboration: A temporal analysis. Neurosci. Lett. 2006, 392, 118–123.
- Wang, X.W.; Nie, D.; Lu, B.L. EEG-based emotion recognition using frequency domain features and support vector machines. In International Conference on Neural Information Processing; Springer: Berlin/Heidelberg, Germany, 2011; pp. 734–743.
- Shu, L.; Xie, J.; Yang, M.; Li, Z.; Li, Z.; Liao, D.; Xu, X.; Yang, X. A review of emotion recognition using physiological signals. Sensors 2018, 18, 2074.
- Alarcao, S.M.; Fonseca, M.J. Emotions Recognition Using EEG Signals: A Survey. IEEE Trans. Affect. Comput. 2019, 10, 374–393.
- Gross, J.J.; Levenson, R.W. Emotion elicitation using films. Cogn. Emot. 1995, 9, 87–108.
- Schaefer, A.; Nils, F.; Philippot, P.; Sanchez, X. Assessing the effectiveness of a large database of emotion-eliciting films: A new tool for emotion researchers. Cogn. Emot. 2010, 24, 1153–1172.
- Wang, X.W.; Nie, D.; Lu, B.L. Emotional state classification from EEG data using machine learning approach. Neurocomputing 2014, 129, 94–106.
- Liu, Y.J.; Yu, M.; Zhao, G.; Song, J.; Ge, Y.; Shi, Y. Real-time movie-induced discrete emotion recognition from EEG signals. IEEE Trans. Affect. Comput. 2017, 9, 550–562.
- Jung, T.P.; Sejnowski, T.J. Multi-modal approach for affective computing. In Proceedings of the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018; pp. 291–294.
- Miranda-Correa, J.A.; Patras, I. A multi-task cascaded network for prediction of affect, personality, mood and social context using EEG signals. In Proceedings of the 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018), Xi’an, China, 15–19 May 2018; pp. 373–380.
- Kolodyazhniy, V.; Kreibig, S.D.; Gross, J.J.; Roth, W.T.; Wilhelm, F.H. An affective computing approach to physiological emotion specificity: Toward subject-independent and stimulus-independent classification of film-induced emotions. Psychophysiology 2011, 48, 908–922.
- Jenke, R.; Peer, A.; Buss, M. Feature extraction and selection for emotion recognition from EEG. IEEE Trans. Affect. Comput. 2014, 5, 327–339.
- Nakisa, B.; Rastgoo, M.N.; Tjondronegoro, D.; Chandran, V. Evolutionary computation algorithms for feature selection of EEG-based emotion recognition using mobile sensors. Expert Syst. Appl. 2018, 93, 143–155.
- Widiyanti, E.; Endah, S.N. Feature Selection for Music Emotion Recognition. In Proceedings of the 2018 2nd International Conference on Informatics and Computational Sciences (ICICoS), Semarang, Indonesia, 30–31 October 2018; pp. 1–5.
- Soleymani, M.; Lichtenauer, J.; Pun, T.; Pantic, M. A multimodal database for affect recognition and implicit tagging. IEEE Trans. Affect. Comput. 2012, 3, 42–55.
- Siddharth, S.; Jung, T.P.; Sejnowski, T.J. Utilizing Deep Learning Towards Multi-modal Bio-sensing and Vision-based Affective Computing. IEEE Trans. Affect. Comput. 2019, 13, 96–107.
- Adolphs, R.; Tranel, D.; Damasio, A.R. Dissociable neural systems for recognizing emotions. Brain Cogn. 2003, 52, 61–69.
- Malandrakis, N.; Potamianos, A.; Evangelopoulos, G.; Zlatintsi, A. A supervised approach to movie emotion tracking. In Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Prague, Czech Republic, 22–27 May 2011; pp. 2376–2379.
- Baveye, Y.; Dellandrea, E.; Chamaret, C.; Chen, L. LIRIS-ACCEDE: A Video Database for Affective Content Analysis. IEEE Trans. Affect. Comput. 2015, 6, 43–55.
- Soleymani, M.; Asghari-Esfeden, S.; Fu, Y.; Pantic, M. Analysis of EEG Signals and Facial Expressions for Continuous Emotion Detection. IEEE Trans. Affect. Comput. 2016, 7, 17–28.
- Miranda-Correa, J.A.; Khomami Abadi, M.; Sebe, N.; Patras, I. AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups. IEEE Trans. Affect. Comput. 2018, 3045, 1–14.
- Hussain, I.; Park, S.J. HealthSOS: Real-time health monitoring system for stroke prognostics. IEEE Access 2020, 8, 213574–213586.
- Hussain, I.; Park, S.J. Quantitative evaluation of task-induced neurological outcome after stroke. Brain Sci. 2021, 11, 900.
- Chang, E.J.; Rahimi, A.; Benini, L.; Wu, A.Y.A. Hyperdimensional Computing-based Multimodality Emotion Recognition with Physiological Signals. In Proceedings of the 2019 IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS), Hsinchu, Taiwan, 18–20 March 2019; pp. 137–141.
- Althobaiti, T.; Katsigiannis, S.; West, D.; Ramzan, N. Examining Human-Horse Interaction by Means of Affect Recognition via Physiological Signals. IEEE Access 2019, 7, 77857–77867.
- Shukla, A.; Gullapuram, S.S.; Katti, H.; Kankanhalli, M.; Winkler, S.; Subramanian, R. Recognition of Advertisement Emotions with Application to Computational Advertising. IEEE Trans. Affect. Comput. 2020, 13, 781–792.
- Gupta, A.; Sahu, H.; Nanecha, N.; Kumar, P.; Roy, P.P.; Chang, V. Enhancing text using emotion detected from EEG signals. J. Grid Comput. 2019, 17, 325–340.
- Mehmood, R.M.; Du, R.; Lee, H.J. Optimal feature selection and deep learning ensembles method for emotion recognition from human brain EEG sensors. IEEE Access 2017, 5, 14797–14806.
- Kumar, P.; Scheme, E. A deep spatio-temporal model for EEG-based imagined speech recognition. In Proceedings of the ICASSP 2021–2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Toronto, ON, Canada, 6–11 June 2021; pp. 995–999.
- Rayatdoost, S.; Rudrauf, D.; Soleymani, M. Multimodal Gated Information Fusion for Emotion Recognition from EEG Signals and Facial Behaviors. In Proceedings of the 2020 22nd International Conference on Multimodal Interaction, Utrecht, The Netherlands, 25–29 October 2020; pp. 655–659.
- Zhong, P.; Wang, D.; Miao, C. EEG-based emotion recognition using regularized graph neural networks. IEEE Trans. Affect. Comput. 2020, 13, 1290–1301.
- Bhardwaj, A.; Gupta, A.; Jain, P.; Rani, A.; Yadav, J. Classification of human emotions from EEG signals using SVM and LDA Classifiers. In Proceedings of the 2015 2nd International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 19–20 February 2015; pp. 180–185.
- Lin, Y.P.; Wang, C.H.; Jung, T.P.; Wu, T.L.; Jeng, S.K.; Duann, J.R.; Chen, J.H. EEG-based emotion recognition in music listening. IEEE Trans. Biomed. Eng. 2010, 57, 1798–1806.
- Petrantonakis, P.C.; Hadjileontiadis, L.J. Emotion recognition from EEG using higher order crossings. IEEE Trans. Inf. Technol. Biomed. 2009, 14, 186–197.
- Li, X.; Song, D.; Zhang, P.; Zhang, Y.; Hou, Y.; Hu, B. Exploring EEG features in cross-subject emotion recognition. Front. Neurosci. 2018, 12, 162.
- Hatamikia, S.; Nasrabadi, A.M. Recognition of emotional states induced by music videos based on nonlinear feature extraction and SOM classification. In Proceedings of the 2014 21th Iranian Conference on Biomedical Engineering (ICBME), Tehran, Iran, 26–28 November 2014; pp. 333–337.
- Solhjoo, S.; Nasrabadi, A.M.; Golpayegani, M.R.H. EEG-based mental task classification in hypnotized and normal subjects. In Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, Shanghai, China, 17–18 January 2006; pp. 2041–2043.
- Shoeibi, A.; Ghassemi, N.; Alizadehsani, R.; Rouhani, M.; Hosseini-Nejad, H.; Khosravi, A.; Panahiazar, M.; Nahavandi, S. A comprehensive comparison of handcrafted features and convolutional autoencoders for epileptic seizures detection in EEG signals. Expert Syst. Appl. 2020, 163, 113788.
- Zhang, J.; Chen, P.; Nichele, S.; Yazidi, A. Emotion Recognition Using Time-frequency Analysis of EEG Signals and Machine Learning. In Proceedings of the 2019 IEEE Symposium Series on Computational Intelligence (SSCI), Xiamen, China, 6–9 December 2019; pp. 404–409.
- Atkinson, J.; Campos, D. Improving BCI-based emotion recognition by combining EEG feature selection and kernel classifiers. Expert Syst. Appl. 2016, 47, 35–41.
- Yin, Z.; Wang, Y.; Liu, L.; Zhang, W.; Zhang, J. Cross-subject EEG feature selection for emotion recognition using transfer recursive feature elimination. Front. Neurorobotics 2017, 11, 19.
- Koelstra, S.; Mühl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I.Y. DEAP: A Database for Emotion Analysis Using Physiological Signals. IEEE Trans. Affect. Comput. 2011, 3, 18–31.
- Abadi, M.K.; Subramanian, R.; Kia, S.M.; Avesani, P.; Patras, I.; Sebe, N. DECAF: MEG-Based Multimodal Database for Decoding Affective Physiological Responses. IEEE Trans. Affect. Comput. 2015, 6, 209–222.
- Soleymani, M.; Pantic, M.; Pun, T. Multimodal Emotion Recognition in Response to Videos. IEEE Trans. Affect. Comput. 2012, 3, 211–223.
- Hutchison, D.; Mitchell, J.C. Transactions on Computational Science XVII; Springer: Berlin/Heidelberg, Germany, 1973; pp. 172–185.
- Shukla, J.; Barreda-Angeles, M.; Oliver, J.; Nandi, G.C.; Puig, D. Feature Extraction and Selection for Emotion Recognition from Electrodermal Activity. IEEE Trans. Affect. Comput. 2019, 12, 857–869.
- Mou, W.; Gunes, H.; Patras, I. Alone versus in-a-group: A multi-modal framework for automatic affect recognition. ACM Trans. Multimed. Comput. Commun. Appl. 2019, 15, 47.
- Gómez-Herrero, G.; Rutanen, K.; Egiazarian, K. Blind source separation by entropy rate minimization. IEEE Signal Process. Lett. 2009, 17, 153–156.
- Russell, J.A. A circumplex model of affect. J. Personal. Soc. Psychol. 1980, 39, 1161.
- Dressler, O.; Schneider, G.; Stockmanns, G.; Kochs, E. Awareness and the EEG power spectrum: Analysis of frequencies. Br. J. Anaesth. 2004, 93, 806–809.
- Aftanas, L.; Varlamov, A.; Pavlov, S.; Makhnev, V.; Reva, N. Event-related synchronization and desynchronization during affective processing: Emergence of valence-related time-dependent hemispheric asymmetries in theta and upper alpha band. Int. J. Neurosci. 2001, 110, 197–219.
- Klimesch, W.; Doppelmayr, M.; Russegger, H.; Pachinger, T.; Schwaiger, J. Induced alpha band power changes in the human EEG and attention. Neurosci. Lett. 1998, 244, 73–76.
- Plass-Oude Bos, D. EEG-based Emotion Recognition: The Influence of Visual and Auditory Stimuli. 2006, 56, 1–17.
- Oathes, D.J.; Ray, W.J.; Yamasaki, A.S.; Borkovec, T.D.; Castonguay, L.G.; Newman, M.G.; Nitschke, J. Worry, Generalized Anxiety Disorder, and Emotion: Evidence from the EEG gamma band. Biol. Psychol. 2008, 79, 165–170.
- Schmidt, L.A.; Trainor, L.J. Frontal brain electrical activity (EEG) distinguishes valence and intensity of musical emotions. Cogn. Emot. 2001, 15, 487–500.
- Schutter, D.J.; Putman, P.; Hermans, E.; van Honk, J. Parietal electroencephalogram beta asymmetry and selective attention to angry facial expressions in healthy human subjects. Neurosci. Lett. 2001, 314, 13–16.
- Hjorth, B. EEG analysis based on time domain properties. Electroencephalogr. Clin. Neurophysiol. 1970, 29, 306–310.
- Bao, F.S.; Liu, X.; Zhang, C. PyEEG: An Open Source Python Module for EEG/MEG Feature Extraction. Comput. Intell. Neurosci. 2011, 2011, 406391.
- Peng, C.K.; Havlin, S.; Stanley, H.E.; Goldberger, A.L. Quantification of scaling exponents and crossover phenomena in nonstationary heartbeat time series. Chaos Interdiscip. J. Nonlinear Sci. 1995, 5, 82–87.
- Sanyal, S.; Banerjee, A.; Pratihar, R.; Maity, A.K.; Dey, S.; Agrawal, V.; Sengupta, R.; Ghosh, D. Detrended Fluctuation and Power Spectral Analysis of alpha and delta EEG brain rhythms to study music elicited emotion. In Proceedings of the 2015 International Conference on Signal Processing, Computing and Control (ISPCC), Solan, India, 24–26 September 2015; pp. 205–210.
- Goh, C.; Hamadicharef, B.; Henderson, G.; Ifeachor, E. Comparison of fractal dimension algorithms for the computation of EEG biomarkers for dementia. In Proceedings of the 2nd International Conference on Computational Intelligence in Medicine and Healthcare (CIMED2005), Costa da Caparica, Portugal, 29 June–1 July 2005.
- Kesić, S.; Spasić, S.Z. Application of Higuchi’s fractal dimension from basic to clinical neurophysiology: A review. Comput. Methods Programs Biomed. 2016, 133, 55–70.
- Roberts, S.J.; Penny, W.; Rezek, I. Temporal and spatial complexity measures for electroencephalogram based brain–computer interfacing. Med. Biol. Eng. Comput. 1999, 37, 93–98.
- Vallat, R. EntroPy. 2018. Available online: https://github.com/raphaelvallat/entropy (accessed on 20 February 2019).
- James, C.J.; Lowe, D. Extracting multisource brain activity from a single electromagnetic channel. Artif. Intell. Med. 2003, 28, 89–104.
- Goshvarpour, A.; Goshvarpour, A. A novel approach for EEG electrode selection in automated emotion recognition based on Lagged Poincare’s Indices and sLORETA. Cogn. Comput. 2020, 12, 602–618.
- Xiong, R.; Kong, F.; Yang, X.; Liu, G.; Wen, W. Pattern Recognition of Cognitive Load Using EEG and ECG Signals. Sensors 2020, 20, 5122.
- Zhang, D.; Yao, L.; Chen, K.; Wang, S.; Chang, X.; Liu, Y. Making sense of spatio-temporal preserving representations for EEG-based human intention recognition. IEEE Trans. Cybern. 2019, 50, 3033–3044.
- Bota, P.J.; Wang, C.; Fred, A.L.; Placido Da Silva, H. A Review, Current Challenges, and Future Possibilities on Emotion Recognition Using Machine Learning and Physiological Signals. IEEE Access 2019, 7, 140990–141020.
- Schaaff, K.; Schultz, T. Towards Emotion Recognition from Electroencephalographic Signals. In Proceedings of the 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, Amsterdam, The Netherlands, 10–12 September 2009; pp. 1–6.
- Tian, L.; Muszynski, M.; Lai, C.; Moore, J.D.; Kostoulas, T.; Lombardo, P.; Pun, T.; Chanel, G. Recognizing induced emotions of movie audiences: Are induced and perceived emotions the same? In Proceedings of the 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), San Antonio, TX, USA, 23–26 October 2017; pp. 28–35.
- Massey, F.J., Jr. The Kolmogorov–Smirnov test for goodness of fit. J. Am. Stat. Assoc. 1951, 46, 68–78.
- Kruskal, W.H.; Wallis, W.A. Use of ranks in one-criterion variance analysis. J. Am. Stat. Assoc. 1952, 47, 583–621.
- Anh, V.H.; Van, M.N.; Ha, B.B.; Quyet, T.H. A real-time model based support vector machine for emotion recognition through EEG. In Proceedings of the 2012 International Conference on Control, Automation and Information Sciences (ICCAIS), Saigon, Vietnam, 26–29 November 2012; pp. 191–196.
- Tung, K.; Liu, P.K.; Chuang, Y.C.; Wang, S.H.; Wu, A.Y. Entropy-assisted multi-modal emotion recognition framework based on physiological signals. In Proceedings of the 2018 IEEE-EMBS Conference on Biomedical Engineering and Sciences (IECBES), Sarawak, Malaysia, 3–6 December 2018; pp. 22–26.
- Wang, S.H.; Li, H.T.; Chang, E.J.; Wu, A.Y.A. Entropy-assisted emotion recognition of valence and arousal using XGBoost classifier. IFIP Adv. Inf. Commun. Technol. 2018, 519, 249–260.
- Rayatdoost, S.; Soleymani, M. Cross-corpus EEG-based emotion recognition. In Proceedings of the 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP), Aalborg, Denmark, 17–20 September 2018; pp. 1–6.
- Zheng, W.L.; Lu, B.L. Personalizing EEG-based affective models with transfer learning. In Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence, New York, NY, USA, 9–15 July 2016; pp. 2732–2738.
- Jin, Y.M.; Luo, Y.D.; Zheng, W.L.; Lu, B.L. EEG-based emotion recognition using domain adaptation network. In Proceedings of the 2017 International Conference on Orange Technologies (ICOT), Singapore, 8–10 December 2017; pp. 222–225.
| Dataset | Movie Details | Duration |
|---|---|---|
| AMIGOS [29] (EEG, 14 channels, 128 Hz) | The Descent. Dir. Neil Marshall. Lionsgate. 2005. | 23:35 |
| | Back to School, Mr. Bean. Dir. John Birkin. Tiger Aspect Productions. 1994. | 18:43 |
| | The Dark Knight. Dir. Christopher Nolan. Warner Bros. 2008. | 23:30 |
| | Up. Dirs. Pete Docter and Bob Peterson. Walt Disney Pictures and Pixar Animation Studios. 2009. | 14:06 |
| | Known Subject | Unknown Subject |
|---|---|---|
| Known Stimulus | Subject- and stimulus-dependent (k-fold, LOO) | Subject-independent (LOPO) |
| Unknown Stimulus | Stimulus-independent (LOMO) | Subject- and stimulus-independent (LOPMO) |
| Validation Scheme | #Users | #Train (Users × Movies) | #Test (Users × Movies) |
|---|---|---|---|
| LOPO | 37, 4 movies | 36 × 4 | 1 × 4 |
| LOMO-Inter | 37, 4 movies | 37 × 3 | 37 × 1 |
| LOMO-Within | 26 (valence), 19 (arousal) | 1 × 3 | 1 × 1 |
| LOPMO | 34, 4 movies | 33 × 3 | 1 × 1 |
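As an illustration of the splits summarized above, below is a minimal sketch of leave-one-person-out (LOPO) cross-validation using scikit-learn's LeaveOneGroupOut; the array shapes, window counts, and classifier are placeholders, not the paper's exact pipeline. LOMO follows the same pattern with movie IDs as the grouping variable, and LOPMO holds out one subject and one movie simultaneously.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import f1_score
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
n_subjects, n_windows = 37, 400           # placeholder window count per subject
X = rng.standard_normal((n_subjects * n_windows, 217))  # 217 features per window
y = rng.integers(0, 2, size=len(X))       # binary high/low valence or arousal
subjects = np.repeat(np.arange(n_subjects), n_windows)

# Each fold trains on 36 subjects and tests on the held-out subject.
scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    clf = LinearDiscriminantAnalysis().fit(X[train_idx], y[train_idx])
    scores.append(f1_score(y[test_idx], clf.predict(X[test_idx])))
print(f"LOPO mean F1: {np.mean(scores):.3f}")
```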
| CLF, FS | Valence F1 | Valence #Features | Arousal F1 | Arousal #Features |
|---|---|---|---|---|
| LDA, SFS | 0.687 | 24 | 0.683 | 168 |
| LDA, SBS | 0.695 | 77 | 0.699 | 63 |
| LDA, ALL | 0.648 | 217 | 0.664 | 217 |
| SVM, SFS | 0.680 | 72 | 0.676 | 75 |
| SVM, SBS | 0.706 † | 45 | 0.694 | 49 |
| SVM, ALL | 0.648 | 217 | 0.643 | 217 |
| CLSTM | 0.741 *,† | - | 0.677 | - |
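The "SFS"/"SBS" rows above pair each classifier with sequential forward or backward feature selection. Below is a minimal sketch using scikit-learn's SequentialFeatureSelector as a stand-in for the paper's wrapper-based selection; the random data, feature count, and stopping point are assumptions for illustration.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 217))  # placeholder: 500 windows, 217 features
y = rng.integers(0, 2, size=len(X))

# Wrapper selection around the classifier, scored by F1 under 5-fold CV.
selector = SequentialFeatureSelector(
    LinearDiscriminantAnalysis(),
    n_features_to_select=24,  # assumed stopping point (cf. LDA, SFS valence above)
    direction="forward",      # "backward" gives SBS
    scoring="f1",
    cv=5,
)
X_selected = selector.fit_transform(X, y)
print(X_selected.shape)       # (500, 24)
```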
| FS | Rank | LDA Valence | LDA Arousal | SVM Valence | SVM Arousal |
|---|---|---|---|---|---|
| SFS | 1 | T7 HFD | T7 HFD | T7 PSD | T7 PSD |
| | 2 | AF4 HFD | AF4 HM | AF4 PSD | P8 HFD |
| | 3 | T8 HFD | T7 PSD | P7 HFD | AF3 HFD |
| | 4 | F8 PFD | T7 PSD | FC Asymm | T7 PSD |
| | 5 | FC6 HM | T8 HFD | F34 Asymm | F4 SpecEnt |
| | F1, top 5 (% of best) | 0.651 (95%) | 0.643 (94%) | 0.644 (95%) | 0.632 (94%) |
| SBS | 1 | T8 HFD | T8 HFD | T8 HFD | T7 HFD |
| | 2 | AF4 HFD | AF4 HFD | FC5 PSD | F7 svdEnt |
| | 3 | F7 PFD | T7 PSD | FC5 PSD | FC5 PSD |
| | 4 | FC5 HM | T7 PSD | F7 PSD | F7 HFD |
| | 5 | O2 HC | F8 HM | O2 HC | FC5 PSD |
| | F1, top 5 (% of best) | 0.633 (91%) | 0.620 (89%) | 0.639 (91%) | 0.629 (91%) |
| CLF, FS | Valence F1 | Valence #Features | Arousal F1 | Arousal #Features |
|---|---|---|---|---|
| LDA, SFS | 0.636 | 119 | 0.675 | 99 |
| LDA, SBS | 0.646 | 68 | 0.684 | 63 |
| LDA, ALL | 0.604 | 217 | 0.647 | 217 |
| SVM, SFS | 0.632 | 115 | 0.650 | 131 |
| SVM, SBS | 0.648 | 73 | 0.657 | 67 |
| SVM, ALL | 0.610 | 217 | 0.626 | 217 |
| CLSTM | 0.612 | - | 0.603 | - |
| FS | Rank | LDA Valence | LDA Arousal | SVM Valence | SVM Arousal |
|---|---|---|---|---|---|
| SFS | 1 | T7 PSD | T8 PSD | T7 PSD | T8 PSD |
| | 2 | AF3 PSD | T7 HFD | AF4 PSD | T7 HFD |
| | 3 | F8 HFD | F7 HM | T8 HFD | AF4 HC |
| | 4 | AF4 HC | T7 svdEnt | AF4 HFD | T7 svdEnt |
| | 5 | T8 HFD | FC5 PSD | F8 HFD | P Asymm |
| | F1, top 5 (% of best) | 0.571 (90%) | 0.608 (90%) | 0.604 (96%) | 0.616 (95%) |
| SBS | 1 | T7 HFD | T8 HFD | T7 HFD | T7 HFD |
| | 2 | AF4 HFD | T8 PSD | AF4 HFD | F7 svdEnt |
| | 3 | T8 HFD | T7 PSD | F8 PSD | F7 HFD |
| | 4 | F7 PSD | T7 PSD | T8 HFD | FC5 PSD |
| | 5 | FC5 PSD | P8 PSD | F8 PSD | FC5 PSD |
| | F1, top 5 (% of best) | 0.578 (90%) | 0.598 (87%) | 0.607 (94%) | 0.603 (92%) |
| CLF, FS | Valence F1 | Valence #Features | Arousal F1 | Arousal #Features |
|---|---|---|---|---|
| LDA, SFS | 0.596 * | 20 | 0.657 * | 20 |
| LDA, SBS | 0.592 * | 13 | 0.642 * | 20 |
| LDA, ALL | 0.445 | 217 | 0.470 | 217 |
| SVM, SFS | 0.593 † | 53 | 0.616 † | 35 |
| SVM, SBS | 0.578 † | 30 | 0.612 † | 26 |
| SVM, ALL | 0.481 | 217 | 0.528 ‡ | 217 |
| CLSTM | 0.543 *† | - | 0.579 *† | - |
| FS | Rank | LDA Valence | LDA Arousal | SVM Valence | SVM Arousal |
|---|---|---|---|---|---|
| SFS | 1 | T7 HFD | T7 HFD | T7 HFD | T7 PFD |
| | 2 | AF4 PSD | FC6 PSD | T8 PSD | P8 PFD |
| | 3 | T8 PSD | AF3 PSD slow | O2 DFA | P7 PFD |
| | 4 | FC6 HFD | AF3 PSD | AF4 FI | F7 PFD |
| | 5 | P Asymm | T7 FI | AF4 PFD | O1 PFD |
| | F1, top 5 (% of best) | 0.569 (95%) | 0.606 (92%) | 0.574 (97%) | 0.549 (89%) |
| SBS | 1 | T7 HFD | T7 HFD | T8 PSD | T8 PSD |
| | 2 | AF3 PSD | AF3 HFD | O2 HC | T Asymm |
| | 3 | T8 HFD | AF3 PSD slow | AF3 PSD | AF3 PSD |
| | 4 | T8 PSD | F4 PSD | T Asymm | P7 PSD |
| | 5 | AF4 FI | F7 HFD | F8 PSD | P7 HC |
| | F1, top 5 (% of best) | 0.560 (95%) | 0.616 (96%) | 0.539 (93%) | 0.562 (92%) |
| CLF, FS | Valence F1 | Valence #Features | Arousal F1 | Arousal #Features |
|---|---|---|---|---|
| LDA, SFS | 0.575 | 61 | 0.670 *‡ | 102 |
| LDA, SBS | 0.594 * | 28 | 0.686 *‡ | 72 |
| LDA, ALL | 0.521 | 217 | 0.605 | 217 |
| SVM, SFS | 0.586 † | 32 | 0.593 | 123 |
| SVM, SBS | 0.578 † | 49 | 0.624 † | 27 |
| SVM, ALL | 0.511 | 217 | 0.574 | 217 |
| CLSTM | 0.639 *† | - | 0.682 *† | - |
| FS | Rank | LDA Valence | LDA Arousal | SVM Valence | SVM Arousal |
|---|---|---|---|---|---|
| SFS | 1 | FC6 HM | T8 HFD | P8 PSD | T7 HFD |
| | 2 | F8 PSD | T8 PSD slow | T7 HFD | F7 PSD |
| | 3 | O1 DFA | AF3 svdEnt | AF3 HFD | AF3 svdEnt |
| | 4 | AF Asymm | T7 HFD | T8 HFD | O2 PSD |
| | 5 | P Asymm | F4 HFD | T8 PSD | F8 PSD |
| | F1, top 5 (% of best) | 0.484 (84%) | 0.630 (94%) | 0.571 (97%) | 0.576 (97%) |
| SBS | 1 | AF4 HFD | T8 HFD | T7 HFD | T8 HFD |
| | 2 | T7 HFD | T8 PSD slow | AF3 HFD | T8 PSD |
| | 3 | T8 HFD | T7 PSD | T8 HFD | T7 HFD |
| | 4 | AF3 PSD | T7 PSD | T7 svdEnt | F4 HFD |
| | 5 | P8 PSD | AF3 PSD | T8 PSD | FC5 PSD |
| | F1, top 5 (% of best) | 0.544 (92%) | 0.626 (91%) | 0.561 (97%) | 0.587 (94%) |
(V) = valence, (A) = arousal.

| Framework | Precision (V) | Recall (V) | F1 (V) | CLF (V) | FS (V) | Precision (A) | Recall (A) | F1 (A) | CLF (A) | FS (A) |
|---|---|---|---|---|---|---|---|---|---|---|
| AMIGOS [29] | - | - | 0.557 | NB | FLD | - | - | 0.571 | NB | FLD |
| LOPO | 0.729 | 0.722 | 0.706 | SVM | SBS | 0.794 | 0.691 | 0.699 | LDA | SBS |
| | 0.779 | 0.738 | 0.741 | CLSTM | - | 0.743 | 0.695 | 0.677 | CLSTM | - |
| LOMO-Inter | 0.643 | 0.731 | 0.648 | SVM | SBS | 0.716 | 0.682 | 0.684 | LDA | SBS |
| | 0.649 | 0.649 | 0.612 | CLSTM | - | 0.635 | 0.630 | 0.603 | CLSTM | - |
| LOMO-Within | 0.635 | 0.638 | 0.596 | LDA | SFS | 0.700 | 0.692 | 0.657 | LDA | SFS |
| | 0.612 | 0.611 | 0.543 | CLSTM | - | 0.636 | 0.666 | 0.579 | CLSTM | - |
| LOPMO | 0.666 | 0.655 | 0.594 | LDA | SBS | 0.738 | 0.716 | 0.686 | LDA | SBS |
| | 0.711 | 0.710 | 0.639 | CLSTM | - | 0.747 | 0.735 | 0.682 | CLSTM | - |
| Source | CLF | Feats | Length | F1 Valence | F1 Arousal |
|---|---|---|---|---|---|
| AMIGOS 2018 [29] | NB | PSD & Asymm | Long | 0.557 | 0.571 |
| | | | Short | 0.576 | 0.592 |
| Siddharth et al., 2019 [24] | ELM | PSD, Deep & Entropy | Short | 0.800 | 0.740 |
| Miranda-Correa and Patras 2018 [18] | CNN | PSD & EEG-sequence | Short & Long | 0.580 | 0.570 |
| | RNN | | | 0.570 | 0.590 |
| | Fusion | | | 0.590 | 0.610 |
| Wang et al., 2018 [84] | XGB | PSD & Asymm | Short | 0.577 | 0.604 |
| | SVM | | | 0.556 | 0.557 |
| Tung et al., 2019 [83] | XGB (1) | Entropy-domain | Short | 0.575 | 0.568 |
| | XGB (2) | | | 0.753 | 0.568 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).