Investigating EEG Patterns for Dual-Stimuli Induced Human Fear Emotional State
Abstract
1. Introduction
- A new paradigm to induce emotion, specifically fear, was proposed with two different modes of elicitation: self-induction versus audio/video clips
- We attempted to identify whether the low-amplitude EEG signals acquired under a neuro-paradigm based on emotional imagery or recall of past memories could be classified effectively against the comparatively strong stimuli of audio/video clips
- As CSP has been widely used in different scenarios of EEG-based BCI applications, such as motor imagery, this work investigates whether it is also a good choice for emotion recognition
- We compared conventional CSP with its regularized variants to investigate whether classification performance could be improved
- By analyzing the spatial filter weight distributions, different electrode set configurations were found and, ultimately, a subject-independent electrode placement was obtained with a minor compromise on classification performance
- We confirmed that self-induced versus audio/video-induced fear exhibits subject-independent neural signatures in critical frequency bands and brain regions
2. Materials and Methods
2.1. Experimental Setup
2.1.1. Participants
2.1.2. EEG Recordings
2.1.3. Procedure
- a. Single block of Session-I (self-induced)
- 5 s of baseline signal collection
- A verbal signal was given to start the activity
- The participant recalled the incident they had described in the questionnaire
- After 60 s, the activity stopped
- 30 s of rest
- b. Single block of Session-II (video-induced)
- 5 s of baseline signal collection
- The video clip started and a verbal signal was given to start the activity
- Display of the movie clip to induce fear (120–180 s)
- Video finished
- Display of the movie clip to relax (30 s)
- Video finished
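The timings above define how epochs are cut from each continuous recording before analysis. Below is a minimal sketch of slicing one self-induced block into baseline and activity segments; the 128 Hz sampling rate (the Emotiv EPOC's nominal rate), the function name, and the dummy data are assumptions for illustration only.

```python
import numpy as np

FS = 128  # assumed sampling rate in Hz (Emotiv EPOC nominal rate)

def cut_block(eeg, activity_onset_s, baseline_s=5, activity_s=60):
    """Split one self-induced block into baseline and activity epochs.

    eeg              : [n_samples, n_channels] continuous recording
    activity_onset_s : time (s) of the verbal start signal
    Returns (baseline, activity), each of shape [samples, channels].
    """
    onset = int(activity_onset_s * FS)
    baseline = eeg[onset - baseline_s * FS:onset]  # 5 s before the signal
    activity = eeg[onset:onset + activity_s * FS]  # 60 s of emotional recall
    return baseline, activity

# Example: a 70 s block of 14-channel dummy data, verbal signal at t = 5 s
block = np.random.randn(70 * FS, 14)
baseline, activity = cut_block(block, activity_onset_s=5)
```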
2.2. Data Analysis
2.2.1. EEG Data Acquisition
2.2.2. Segmentation into Samples
2.2.3. EEG Preprocessing
2.2.4. Feature Extraction Based on CSP
Variables Considered

Input:
- EEGSignals: the EEG signals from which the CSP features are extracted, structured such that:
  - EEGSignals.x: the EEG signals as an [Ns × Nc × Nt] matrix, where Ns is the number of EEG samples per trial, Nc the number of channels (EEG electrodes), and Nt the number of trials
  - EEGSignals.y: a [1 × Nt] vector containing the class label of each trial
  - EEGSignals.s: the sampling frequency (in Hz)
- CSPMatrix: the CSP projection matrix
- nbFilterPairs: the number of pairs of CSP filters to be used. The number of features extracted is twice this value; the filters selected are those corresponding to the largest and smallest eigenvalues.

Output:
- Features: the features extracted from the above-mentioned EEG data set as an [Nt × (nbFilterPairs × 2 + 1)] matrix, with the class labels as the last column.
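A minimal Python sketch of the extraction step specified above (the structure described is a MATLAB-style interface, cf. the regularized-CSP toolbox of [37]; names in this sketch are illustrative). Each trial is projected through the filter pairs with the largest and smallest eigenvalues, and the log-variance of each projected signal is taken as a feature.

```python
import numpy as np

def extract_csp_features(x, y, csp_matrix, nb_filter_pairs):
    """x: [Ns, Nc, Nt] EEG (samples x channels x trials); y: [Nt] labels;
    csp_matrix: [Nc, Nc] projection matrix, rows sorted by eigenvalue.
    Returns an [Nt, 2*nb_filter_pairs + 1] matrix, labels in the last column."""
    ns, nc, nt = x.shape
    # Keep the filter pairs with the largest and smallest eigenvalues.
    filters = np.vstack([csp_matrix[:nb_filter_pairs],
                         csp_matrix[-nb_filter_pairs:]])
    feats = np.empty((nt, 2 * nb_filter_pairs + 1))
    for t in range(nt):
        projected = filters @ x[:, :, t].T      # [2*pairs, Ns]
        feats[t, :-1] = np.log(np.var(projected, axis=1))
    feats[:, -1] = y                            # class labels, last column
    return feats
```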
2.3. Classification
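The comparison table in Section 3.2.3 lists LDA as the classifier used in this work. Below is a minimal scikit-learn sketch of classifying the CSP features; the 5-fold cross-validation split is an illustrative choice, not necessarily the paper's exact evaluation protocol.

```python
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def classify_csp_features(features):
    """features: [Nt, d+1] matrix from extract_csp_features, labels last."""
    X, y = features[:, :-1], features[:, -1]
    lda = LinearDiscriminantAnalysis()
    return cross_val_score(lda, X, y, cv=5).mean()  # illustrative 5-fold CV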
2.4. Conventional CSP vs. Regularized Algorithms
- Conventional CSP (CSP) (already explained earlier)
- Composite CSP (CCSP)
- Composite CSP with Kullback–Leibler divergence (CCSP-KL)
- CSP with Tikhonov regularization (TR-CSP) (see the sketch following this list)
- CSP with weighted Tikhonov regularization (WTR-CSP)
- CSP with diagonal loading using automatic shrinkage (DL-CSP-auto)
- CSP with diagonal loading using cross validation (DL-CSP)
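As noted at the TR-CSP item, the Tikhonov variants can be summarized by the quadratically penalized objective of [37], J(w) = (w' C1 w) / (w' (C1 + C2) w + α w' P w), solved as a generalized eigenvalue problem. A minimal sketch under that formulation (names illustrative): α = 0 recovers conventional CSP, P = I gives TR-CSP, and a diagonal P of per-channel penalty weights gives WTR-CSP.

```python
import numpy as np
from scipy.linalg import eigh

def rcsp_filters(cov_a, cov_b, n_pairs, alpha=0.0, penalty=None):
    """Spatial filters maximizing w' Ca w / (w' (Ca + Cb) w + alpha * w' P w).

    alpha = 0: conventional CSP; P = I: TR-CSP; diagonal P of per-channel
    weights: WTR-CSP. Returns 2*n_pairs filters as rows."""
    nc = cov_a.shape[0]
    P = np.eye(nc) if penalty is None else penalty

    def one_class(c1, c2):
        # Generalized eigenproblem c1 w = lambda (c1 + c2 + alpha P) w;
        # the top eigenvectors are the filters favoring class 1.
        vals, vecs = eigh(c1, c1 + c2 + alpha * P)
        return vecs[:, np.argsort(vals)[::-1][:n_pairs]].T

    return np.vstack([one_class(cov_a, cov_b), one_class(cov_b, cov_a)])
```

The diagonal-loading variants (DL-CSP, DL-CSP-auto) instead regularize the covariance estimates themselves, shrinking them toward a scaled identity matrix with a shrinkage parameter chosen by cross-validation (DL-CSP) or by the automatic estimator of [49] (DL-CSP-auto).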
2.5. Electrode Reduction
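The reduction explored in Section 3.2 builds on the spatial-filter-weight idea of [50]: channels that receive large absolute weights in the most discriminative CSP filters carry the most class-relevant signal and are retained. Below is a minimal sketch of one plausible scoring rule under that idea (the paper's exact rule may differ); the channel names match the 14-electrode montage used in Section 3.2.

```python
import numpy as np

def rank_channels(filters, channel_names, keep=8):
    """Score each channel by the total absolute weight it receives across
    the selected CSP filters (rows), then keep the `keep` best channels."""
    scores = np.abs(filters).sum(axis=0)       # one score per channel
    top = np.argsort(scores)[::-1][:keep]
    return [channel_names[i] for i in sorted(top)]

channels = ["AF3", "F7", "F3", "FC5", "T7", "P7", "O1",
            "O2", "P8", "T8", "FC6", "F4", "F8", "AF4"]
# filters: e.g., the rows returned by rcsp_filters(...) above
```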
3. Results and Discussion
3.1. Classification Performance
Statistical Analysis
3.2. Electrode Reduction
3.2.1. Subject-Dependent Channel Selection
3.2.2. Common Channels/Subject Independent Channels
3.2.3. Comparison to Related Work
Computational Efficiency
Electrode Reduction
3.2.4. Limitations
4. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
1. Chai, X.; Wang, Q.; Zhao, Y.; Liu, X.; Bai, O.; Li, Y. Unsupervised domain adaptation techniques based on auto-encoder for non-stationary EEG-based emotion recognition. Comput. Biol. Med. 2016, 79, 205–214.
2. Gao, Y.; Lee, H.J.; Mehmood, R.M. Deep learning of EEG signals for emotion recognition. In Proceedings of the 2015 IEEE International Conference on Multimedia & Expo Workshops (ICMEW), Turin, Italy, 29 June–3 July 2015.
3. Akar, S.A.; Akdemir, S.; Kara, S.; Agambayev, S.; Bilgiç, V. Nonlinear analysis of EEGs of patients with major depression during different emotional states. Comput. Biol. Med. 2015, 67, 49–60.
4. Lee, Y.-Y.; Hsieh, S. Classifying different emotional states by means of EEG-based functional connectivity patterns. PLoS ONE 2014, 9, e95415.
5. Kashihara, K. A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions. Front. Neurosci. 2014, 8, 244.
6. Widge, A.S.; Dougherty, D.D.; Moritz, C.T. Affective brain-computer interfaces as enabling technology for responsive psychiatric stimulation. Brain-Comput. Interfaces 2014, 1, 126–136.
7. Lerner, J.S.; Keltner, D. Beyond valence: Toward a model of emotion-specific influences on judgement and choice. Cognit. Emot. 2000, 14, 473–493.
8. Suess, F.; Rahman, R.A. Mental imagery of emotions: Electrophysiological evidence. NeuroImage 2015, 114, 147–157.
9. Costa, V.D.; Lang, P.J.; Sabatinelli, D.; Versace, F.; Bradley, M.M. Emotional imagery: Assessing pleasure and arousal in the brain's reward circuitry. Hum. Brain Map. 2010, 31, 1446–1457.
10. Lang, P.J.; McTeague, L.M. The anxiety disorder spectrum: Fear imagery, physiological reactivity, and differential diagnosis. Anxiety Stress Coping 2009, 22, 5–25.
11. Shin, L.M.; Dougherty, D.D.; Orr, S.P.; Pitman, R.K.; Lasko, M.; Macklin, M.L.; Alpert, N.M.; Fischman, A.J.; Rauch, S.L. Activation of anterior paralimbic structures during guilt-related script-driven imagery. Biol. Psychiatry 2000, 48, 43–50.
12. Kothe, C.A.; Makeig, S.; Onton, J.A. Emotion recognition from EEG during self-paced emotional imagery. In Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII), Geneva, Switzerland, 2–5 September 2013.
13. Hu, X.; Yu, J.; Song, M.; Yu, C.; Wang, F.; Sun, P.; Wang, D.; Zhang, D. EEG Correlates of Ten Positive Emotions. Front. Hum. Neurosci. 2017, 11, 26.
14. Zheng, W.-L.; Lu, B.-L. Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks. IEEE Trans. Auton. Mental Dev. 2015, 7, 162–175.
15. Daly, I.; Malik, A.; Hwang, F.; Roesch, E.; Weaver, J.; Kirke, A.; Williams, D.; Miranda, E.; Nasuto, S.J. Neural correlates of emotional responses to music: An EEG study. Neurosci. Lett. 2014, 573, 52–57.
16. Chanel, G.; Kierkels, J.J.M.; Soleymani, M.; Pun, T. Short-term emotion assessment in a recall paradigm. Int. J. Hum.-Comput. Stud. 2009, 67, 607–627.
17. Iacoviello, D.; Petracca, A.; Spezialetti, M.; Placidi, G. A classification algorithm for electroencephalography signals by self-induced emotional stimuli. IEEE Trans. Cybern. 2016, 46, 3171–3180.
18. Li, M.; Lu, B.-L. Emotion classification based on gamma-band EEG. In Proceedings of the EMBC 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, 3–6 September 2009.
19. Nie, D.; Wang, X.-W.; Shi, L.-C.; Lu, B.-L. EEG-based emotion recognition during watching movies. In Proceedings of the 2011 5th International IEEE/EMBS Conference on Neural Engineering (NER), Cancun, Mexico, 27 April–1 May 2011.
20. Zhang, Y.; Nam, C.S.; Zhou, G.; Jin, J.; Wang, X.; Cichocki, A. Temporally constrained sparse group spatial patterns for motor imagery BCI. IEEE Trans. Cybern. 2018, 1–11.
21. Zhang, Y.; Zhou, G.; Jin, J.; Zhao, Q.; Wang, X.; Cichocki, A. Sparse Bayesian classification of EEG for brain–computer interface. IEEE Trans. Neural Netw. Learn. Syst. 2016, 27, 2256–2267.
22. Jin, Z.; Zhou, G.; Gao, D.; Zhang, Y. EEG classification using sparse Bayesian extreme learning machine for brain–computer interface. Neural Comput. Appl. 2018.
23. Qiu, Z.; Jin, J.; Lam, H.-K.; Zhang, Y.; Wang, X.; Cichocki, A. Improved SFFS method for channel selection in motor imagery based BCI. Neurocomputing 2016, 207, 519–527.
24. Liu, Y.-H.; Huang, S.; Huang, Y.-D. Motor Imagery EEG Classification for Patients with Amyotrophic Lateral Sclerosis Using Fractal Dimension and Fisher's Criterion-Based Channel Selection. Sensors 2017, 17, 1557.
25. Yang, Y.; Bloch, I.; Chevallier, S.; Wiart, J. Subject-Specific Channel Selection Using Time Information for Motor Imagery Brain–Computer Interfaces. Cognit. Comput. 2016, 8, 505–518.
26. Yang, J.; Singh, H.; Hines, E.L.; Schlaghecken, F.; Iliescu, D.D.; Leeson, M.S.; Stocks, N.G. Channel selection and classification of electroencephalogram signals: An artificial neural network and genetic algorithm-based approach. Artif. Intell. Med. 2012, 55, 117–126.
27. Zhang, J.; Chen, M.; Zhao, S.; Hu, S.; Shi, Z.; Cao, Y. ReliefF-based EEG sensor selection methods for emotion recognition. Sensors 2016, 16, 1558.
28. Dai, S.; Wei, Q. Electrode channel selection based on backtracking search optimization in motor imagery brain–computer interfaces. J. Integr. Neurosci. 2017, 16, 241–254.
29. Handiru, V.S.; Prasad, V.A. Optimized Bi-Objective EEG Channel Selection and Cross-Subject Generalization with Brain–Computer Interfaces. IEEE Trans. Hum.-Mach. Syst. 2016, 46, 777–786.
30. Kang, H.; Nam, Y.; Choi, S. Composite common spatial pattern for subject-to-subject transfer. IEEE Signal Process. Lett. 2009, 16, 683–686.
31. Alarcao, S.M.; Fonseca, M.J. Emotions recognition using EEG signals: A survey. IEEE Trans. Affect. Comput. 2017.
32. Wang, S.; Gwizdka, J.; Chaovalitwongse, W.A. Using Wireless EEG Signals to Assess Memory Workload in the n-Back Task. IEEE Trans. Hum.-Mach. Syst. 2016, 46, 424–435.
33. Chumerin, N.; Manyakov, N.V.; van Vliet, M.; Robben, A.; Combaz, A.; van Hulle, M.M. Steady-state visual evoked potential-based computer gaming on a consumer-grade EEG device. IEEE Trans. Comput. Intell. AI Games 2013, 5, 100–110.
34. Rodríguez, A.; Rey, B.; Clemente, M.; Wrzesien, M.; Alcañiz, M. Assessing brain activations associated with emotional regulation during virtual reality mood induction procedures. Expert Syst. Appl. 2015, 42, 1699–1709.
35. Askari, E.; Setarehdan, S.K.; Sheikhani, A.; Mohammadi, M.R.; Teshnehlab, M. Designing a model to detect the brain connections abnormalities in children with autism using 3D-cellular neural networks and wavelet transform. J. Integr. Neurosci. 2018, 17, 391–411.
36. Wang, Q.; Sourina, O. Real-time mental arithmetic task recognition from EEG signals. IEEE Trans. Neural Syst. Rehabil. Eng. 2013, 21, 225–232.
37. Lotte, F.; Guan, C. Regularizing common spatial patterns to improve BCI designs: Unified theory and new algorithms. IEEE Trans. Biomed. Eng. 2011, 58, 355–362.
38. Tonoyan, Y.; Chanwimalueang, T.; Mandic, D.P.; van Hulle, M.M. Discrimination of emotional states from scalp- and intracranial EEG using multiscale Rényi entropy. PLoS ONE 2017, 12, e0186916.
39. Becker, H.; Fleureau, J.; Guillotel, P.; Wendling, F.; Merlet, I.; Albera, L. Emotion recognition based on high-resolution EEG recordings and reconstructed brain sources. IEEE Trans. Affect. Comput. 2017.
40. Gómez, O.; Quintero, L.; López, N.; Castro, J. An approach to emotion recognition in single-channel EEG signals: A mother child interaction. J. Phys. Conf. Ser. 2016, 705, 012051.
41. Ackermann, P.; Kohlschein, C.; Bitsch, J.A.; Wehrle, K.; Jeschke, S. EEG-based automatic emotion recognition: Feature extraction, selection and classification methods. In Proceedings of the 2016 IEEE 18th International Conference on e-Health Networking, Applications and Services (Healthcom), Munich, Germany, 14–16 September 2016.
42. Blankertz, B.; Tomioka, R.; Lemm, S.; Kawanabe, M.; Muller, K.-R. Optimizing spatial filters for robust EEG single-trial analysis. IEEE Signal Process. Mag. 2008, 25, 41–56.
43. Dornhege, G.; Blankertz, B.; Curio, G.; Muller, K.-R. Boosting bit rates in noninvasive EEG single-trial classifications by feature combination and multiclass paradigms. IEEE Trans. Biomed. Eng. 2004, 51, 993–1002.
44. Davis, J.J.J.; Lin, C.; Gillett, G.; Kozma, R. An Integrative Approach to Analyze EEG Signals and Human Brain Dynamics in Different Cognitive States. J. Artif. Intell. Soft Comput. Res. 2017, 7, 287–299.
45. Buccino, A.P.; Keles, H.O.; Omurtag, A. Hybrid EEG-fNIRS asynchronous brain-computer interface for multiple motor tasks. PLoS ONE 2016, 11, e0146610.
46. Song, Y.J.; Sepulveda, F. Classifying siren-sound mental rehearsal and covert production vs. idle state towards onset detection in brain-computer interfaces. In Proceedings of the 2015 3rd International Winter Conference on Brain-Computer Interface (BCI), Sabuk, Korea, 12–14 January 2015.
47. Wang, X.-W.; Nie, D.; Lu, B.-L. Emotional state classification from EEG data using machine learning approach. Neurocomputing 2014, 129, 94–106.
48. Lemm, S.; Blankertz, B.; Dickhaus, T.; Müller, K.-R. Introduction to machine learning for brain imaging. NeuroImage 2011, 56, 387–399.
49. Ledoit, O.; Wolf, M. A well-conditioned estimator for large-dimensional covariance matrices. J. Multivar. Anal. 2004, 88, 365–411.
50. Masood, N.; Farooq, H.; Mustafa, I. Selection of EEG channels based on spatial filter weights. In Proceedings of the International Conference on Communication, Computing and Digital Systems (C-CODE), Islamabad, Pakistan, 8–9 March 2017.
51. Balconi, M.; Lucchiari, C. Consciousness and arousal effects on emotional face processing as revealed by brain oscillations. A gamma band analysis. Int. J. Psychophysiol. 2008, 67, 41–46.
52. Jatupaiboon, N.; Pan-ngum, S.; Israsena, P. Emotion classification using minimal EEG channels and frequency bands. In Proceedings of the 2013 10th International Joint Conference on Computer Science and Software Engineering (JCSSE), Maha Sarakham, Thailand, 29–31 May 2013.
53. Valenzi, S.; Islam, T.; Jurica, P.; Cichocki, A. Individual classification of emotions using EEG. J. Biomed. Sci. Eng. 2014, 7, 604.
54. Petrantonakis, P.C.; Hadjileontiadis, L.J. Emotion recognition from EEG using higher order crossings. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 186–197.
55. Schmidt, L.A.; Trainor, L.J. Frontal brain electrical activity (EEG) distinguishes valence and intensity of musical emotions. Cognit. Emot. 2001, 15, 487–500.
56. Arvaneh, M.; Guan, C.; Ang, K.K.; Quek, C. Optimizing the channel selection and classification accuracy in EEG-based BCI. IEEE Trans. Biomed. Eng. 2011, 58, 1865–1873.
57. Lan, T.; Erdogmus, D.; Adami, A.; Pavel, M.; Mathan, S. Salient EEG channel selection in brain computer interfaces by mutual information maximization. In Proceedings of the IEEE-EMBS 2005 27th Annual International Conference of the Engineering in Medicine and Biology Society, Shanghai, China, 17–18 January 2006.
58. Zhuang, N.; Zeng, Y.; Yang, K.; Zhang, C.; Tong, L.; Yan, B. Investigating Patterns for Self-Induced Emotion Recognition from EEG Signals. Sensors 2018, 18, 841.
S. No. | Videos | Emotional State | Video Length (s) |
---|---|---|---|
1. | Lights out movie trailer | Fear | 180 |
2. | Best Vacations: Jumping | Pleasant | 30 |
3. | Video clip from Insidious movie | Fear | 120 |
4. | Caught red-handed | Pleasant | 30 |
5. | Conjuring official trailer | Fear | 125 |
6. | Stunning China-UNESCO World Heritage | Pleasant | 30 |
7. | Die in Disaster Movies | Fear | 145 |
8. | Tourism Sites In Pakistan | Pleasant | 30 |
9. | Scene from The Eye-Horror movie | Fear | 120 |
10. | Berlin City Tour | Pleasant | 30 |
11. | Snakes catcher in Indian forest | Fear | 80 |
12. | BBC nature documentary 2016 | Pleasant | 30 |
13. | Female Restroom-Horror clip | Fear | 180 |
14. | Nat Geo Wild HD Ocean of Giants | Pleasant | 30 |
15. | Frightening Creepy Clown | Fear | 130 |
16. | 10-month-old babies | Pleasant | 30 |
17. | Scene from The Conjuring 2 | Fear | 120 |
18. | Roller Coaster & Candy Coaster | Pleasant | 30 |
19. | Fear of Snakes | Fear | 125 |
20. | Army Man surprises his 8-year-old daughter | Pleasant | 30 |
Subjects | Delta (1–3 Hz) | Theta (4–7 Hz) | Alpha (8–13 Hz) | Beta (14–30 Hz) | Low Gamma (31–50 Hz) | High Gamma (50–100 Hz) | Full Band (1–100 Hz) |
---|---|---|---|---|---|---|---|
S1 | 50.70 | 66.35 | 62.09 | 72.56 | 70.16 | 73.90 | 65.88 |
S2 | 53.70 | 59.26 | 55.56 | 70.37 | 70.52 | 66.67 | 66.67 |
S3 | 64.81 | 57.41 | 51.85 | 68.52 | 64.41 | 70.41 | 64.41 |
S4 | 62.96 | 59.26 | 59.32 | 72.50 | 65.00 | 69.64 | 69.64 |
S5 | 58.93 | 53.70 | 64.29 | 71.64 | 66.07 | 70.81 | 64.81 |
S6 | 53.70 | 55.36 | 61.11 | 61.11 | 71.81 | 70.91 | 70.91 |
S7 | 65.45 | 53.70 | 56.36 | 61.82 | 63.64 | 74.07 | 74.07 |
S8 | 57.50 | 62.96 | 55.56 | 61.11 | 73.81 | 75.00 | 71.00 |
S9 | 47.50 | 53.00 | 58.40 | 78.19 | 74.81 | 78.21 | 59.40 |
S10 | 53.56 | 61.22 | 65.29 | 56.12 | 57.14 | 58.18 | 67.87 |
S11 | 45.00 | 51.50 | 60.00 | 65.00 | 63.00 | 57.50 | 61.00 |
S12 | 50.39 | 59.04 | 63.82 | 71.80 | 70.35 | 78.77 | 64.42 |
S13 | 42.39 | 58.07 | 58.53 | 70.54 | 66.84 | 78.34 | 67.34 |
S14 | 60.00 | 82.22 | 71.11 | 75.56 | 93.33 | 94.81 | 87.41 |
S15 | 56.31 | 68.46 | 74.27 | 77.18 | 73.31 | 73.80 | 70.89 |
Mean value | 54.86 | 60.10 | 61.17 | 68.93 | 69.61 | 72.74 | 68.38 |
Algorithm/Frequency Band | CSP | CCSP | CCSP-KL | TR-CSP | WTR-CSP | DL-CSP | DL-CSP-auto |
---|---|---|---|---|---|---|---|
(1–3 Hz) | 54.86 | 65.00 | 63.93 | 52.54 | 53.51 | 49.49 | 51.62 |
(4–7 Hz) | 60.10 | 65.85 | 66.30 | 61.30 | 62.05 | 60.91 | 60.52 |
(8–13 Hz) | 61.17 | 64.22 | 63.77 | 60.32 | 59.27 | 59.46 | 59.79 |
(14–30 Hz) | 68.93 | 71.38 | 71.21 | 69.24 | 69.05 | 66.76 | 68.08 |
(31–50 Hz) | 69.61 | 72.49 | 72.19 | 69.48 | 70.19 | 69.91 | 68.77 |
(50–100 Hz) | 72.74 | 76.97 | 75.95 | 72.06 | 72.86 | 72.20 | 66.28 |
Subjects | AF3 | F7 | F3 | FC5 | T7 | P7 | O1 | O2 | P8 | T8 | FC6 | F4 | F8 | AF4 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
S1 | √ | √ | √ | √ | √ | √ | √ | √ | ||||||
S2 | √ | √ | √ | √ | √ | √ | √ | √ | ||||||
S3 | √ | √ | √ | √ | √ | √ | √ | √ | ||||||
S4 | √ | √ | √ | √ | √ | √ | √ | √ | ||||||
S5 | √ | √ | √ | √ | √ | √ | √ | √ | ||||||
S6 | √ | √ | √ | √ | √ | √ | √ | √ | ||||||
S7 | √ | √ | √ | √ | √ | √ | √ | √ | √ | |||||
S8 | √ | √ | √ | √ | √ | √ | √ | √ | ||||||
S9 | √ | √ | √ | √ | √ | √ | √ | √ | ||||||
S10 | √ | √ | √ | √ | √ | √ | √ | √ | ||||||
S11 | √ | √ | √ | √ | √ | √ | √ | |||||||
S12 | √ | √ | √ | √ | √ | √ | √ | √ | √ | |||||
S13 | √ | √ | √ | √ | √ | √ | √ | √ | ||||||
S14 | √ | √ | √ | √ | √ | √ | √ | √ | ||||||
S15 | √ | √ | √ | √ | √ | √ | √ | |||||||
Frequency of appearance | 12 | 8 | 9 | 4 | 11 | 11 | 9 | 6 | 9 | 6 | 6 | 12 | 8 | 8 |
No. of Electrodes in Selected Configuration | Possible Configurations | Mean Accuracy Achieved |
---|---|---|
6 | AF3 F4 T7 P7 F3 O1 | 64.49 |
 | AF3 F4 T7 P7 F3 P8 | 70.35 |
 | AF3 F4 T7 P7 O1 P8 | 66.86 |
7 | AF3 F4 T7 P7 F3 O1 P8 | 65.01 |
8 | AF3 F4 T7 P7 F3 O1 P8 F8 | 69.94 |
 | AF3 F4 T7 P7 F3 O1 P8 AF4 | 74.81 |
 | AF3 F4 T7 P7 F3 O1 P8 F7 | 71.67 |
9 | AF3 F4 T7 P7 F3 O1 P8 F8 F7 | 74.45 |
 | AF3 F4 T7 P7 F3 O1 P8 F8 AF4 | 67.92 |
 | AF3 F4 T7 P7 F3 O1 P8 F7 AF4 | 71.15 |
Studies/Year | Type of Study (Emotion Recognition/Others) | Classifier | EEG Device with Total Number of Electrodes | Classification Performance | Relevant Frequency Band/Brain Regions |
---|---|---|---|---|---|
Zhuang et al. [58] | Self-induced emotion recognition (joy, neutrality, sadness, disgust, anger, and fear) | SVM | g.HIamp system with 62 electrodes | 54.52% | High-frequency rhythms from electrodes over the bilateral temporal, prefrontal, and occipital lobes produced the best performance. |
Jatupaiboon et al. [52] | Emotion recognition, two emotions (positive and negative) | SVM | Emotiv with 14 electrodes (7 pairs) | 85.41% with all channels; 84.18% with five pairs (ten electrodes) | Gamma band |
Zhang et al. [27] | Four emotional states (joy, fear, sadness, and relaxation) | SVM | 32 electrodes | Reduced from 32 to 8 channels: 58.51% versus the best classification accuracy of 59.13% | The high-frequency bands (beta, gamma) play a more important role in emotion processing. |
Zheng et al. [14] | Positive, neutral, and negative | kNN, logistic regression, SVM, and deep belief networks (DBNs) | ESI Neuroscan with 62 channels | 83.99% with all 62 electrodes (DE features, SVM); 82.88% with 4 channels; 85.03% with 6 channels | Beta and gamma bands |
Kothe et al. [12] | Self-induced emotion: positive vs. negative | Logistic regression | BioSemi gel-based system with 250 electrodes | 71.3% | - |
Chanel et al. [16] | Memory recall: negatively excited, positively excited, and calm-neutral states | LDA, linear SVM, probabilistic linear SVM, RVM | BioSemi Active II system with 64 electrodes | 63% | - |
Iacoviello et al. [17] | Self-induced emotions: disgust vs. relax | SVM | Enobio with 8 channels | Above 90% accuracy with the T8 channel only | - |
Li and Lu [18] | Happiness vs. sadness | Linear SVM | 62 channels | 93.5% | Gamma band (30–100 Hz) |
Wang et al. [36] | Arithmetic mental task | SVM | Emotiv EPOC with 14 electrodes | 97.14% with 14 electrodes; 97.11% with four electrodes | - |
Authors' work | Fear emotion recognition: self- vs. video-induced | LDA | Emotiv EPOC with 14 electrodes | 76.97% with all 14 channels; 74.81% with 8 channels | High gamma and beta bands |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).