Musical Emotions Recognition Using Entropy Features and Channel Optimization Based on EEG
Abstract
1. Introduction
2. Materials and Methods
2.1. EEG Experiment
2.1.1. Materials
2.1.2. EEG Signal Acquisition and Sample Data
2.2. Feature Extraction of EEG Signals
2.2.1. Approximate Entropy
2.2.2. Sample Entropy
2.3. KNN Classification Algorithm
2.4. Channel Selection Based on the PSO Algorithm
- (1) For the position vector x_i of the ith particle, the Sigmoid function is used to map each component into the range [0, 1], yielding a weight for each of the 30 channels, and the threshold is set to 0.5. If a channel’s weight is greater than 0.5, the channel is selected and its value is forced to 1; otherwise, the channel is discarded and its value is forced to 0.
- (2) For the selected channels, the KNN algorithm of Section 2.3 is used to calculate the fitness value of the ith particle, based on the features computed in Section 2.2. When calculating the fitness value, ten-fold cross-validation randomly divides the samples, described by the two features (ApEn and SampEn), into 10 parts: each time, one part is taken as the test set and the remaining nine parts as the training set, and KNN yields the corresponding recognition accuracy. The average accuracy over the 10 runs is taken as the fitness value.
- (3) Repeat steps (1) and (2) for each particle to obtain the fitness values of all particles. pbest_i^t is defined as the position vector corresponding to the best fitness value of the ith particle in the tth iteration, and gbest^t is defined as the position vector corresponding to the global optimal solution (i.e., the maximum fitness value of the population) in the tth iteration, where t = 1, 2, …, t_max is the current iteration number.
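Steps (1) and (2) above can be sketched in Python. This is a minimal illustration, not the authors' code: the feature array layout `(n_samples, n_channels, 2)`, `k = 5` neighbours, and scikit-learn's cross-validation routine are assumptions chosen for concreteness.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def particle_to_mask(position, threshold=0.5):
    """Step (1): pass the particle's position vector through the Sigmoid,
    giving per-channel weights in [0, 1]; weight > 0.5 selects the channel."""
    return (sigmoid(np.asarray(position, dtype=float)) > threshold).astype(int)

def knn_fitness(position, features, labels, k=5):
    """Step (2): fitness = mean 10-fold cross-validated KNN accuracy on the
    entropy features of the selected channels only.
    `features` has shape (n_samples, n_channels, 2) -- ApEn and SampEn."""
    mask = particle_to_mask(position)
    if mask.sum() == 0:                     # no channel selected: useless particle
        return 0.0
    x = features[:, mask == 1, :].reshape(len(features), -1)
    knn = KNeighborsClassifier(n_neighbors=k)
    return cross_val_score(knn, x, labels, cv=10).mean()
```

With a 30-channel montage, `position` is a length-30 real vector; the same mapping is applied before every fitness evaluation, so the particles move in a continuous space while the classifier only ever sees the binary channel subset.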
Algorithm 1. Pseudo-code of the PSO algorithm.

```
Input:  maximum number of iterations t_max, total population size n, dimension D
Output: optimal channel number, best fitness

1. Set the parameters and generate the initial population randomly.
2. Calculate the fitness value of the population:
     For i = 1 → n
         For j = 1 → D
             If channel j is selected
                 Perform feature extraction of the EEG signal on channel j:
                 calculate ApEn and SampEn of all sample data by Equations (1)–(7).
             End If
         End For
         For the selected channels, divide the sample data randomly into 10 parts,
         where one part is the test set and the remaining nine are the training set;
         take the average accuracy of the 10 KNN runs as the fitness value.
     End For
3. Update the position vectors pbest_i and gbest.
4. For t = 1 → t_max
5.     Update the positions of the population by Equations (8) and (9).
6.     Repeat steps 2 and 3.
7.     If the maximum number of iterations is reached, end the iteration and
       output the optimal solution; otherwise, continue the loop.
8. End For
```
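The loop of Algorithm 1 can be sketched as a minimal binary PSO. This is a sketch, not the authors' implementation: Equations (8) and (9) are not reproduced in this excerpt, so the standard inertia-weight velocity and position updates are assumed (the parameters `w`, `c1`, `c2` are illustrative), and `fitness` is any callable that scores a position vector, such as the KNN fitness of step (2).

```python
import numpy as np

def pso_channel_selection(fitness, dim=30, n_particles=10, t_max=20,
                          w=0.7, c1=1.5, c2=1.5, seed=0):
    """Maximise fitness(position) over continuous position vectors whose
    Sigmoid-thresholded form encodes the selected-channel mask."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1, 1, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest_pos = pos.copy()                                # per-particle best
    pbest_val = np.array([fitness(p) for p in pos])
    g = np.argmax(pbest_val)                              # global best
    gbest_pos, gbest_val = pbest_pos[g].copy(), pbest_val[g]

    for _ in range(t_max):
        # assumed standard inertia-weight update (stand-in for Eqs. (8)-(9))
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest_pos - pos) + c2 * r2 * (gbest_pos - pos)
        pos = pos + vel
        vals = np.array([fitness(p) for p in pos])
        improved = vals > pbest_val
        pbest_pos[improved], pbest_val[improved] = pos[improved], vals[improved]
        g = np.argmax(pbest_val)
        if pbest_val[g] > gbest_val:
            gbest_pos, gbest_val = pbest_pos[g].copy(), pbest_val[g]

    mask = (1.0 / (1.0 + np.exp(-gbest_pos)) > 0.5).astype(int)
    return mask, gbest_val
```

Thresholding happens only inside the fitness function and when reporting the final mask, which is the usual way to run a continuous PSO over a binary channel-selection problem.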
3. Results
3.1. Two-Emotion Classification Based on “Waltz No. 2”
3.2. Three-Emotion Classification Based on “No. 14 Couplets”
3.3. Four-Emotion Classification Based on “Symphony No. 5 in C Minor”
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Material | Time Segment | Emotion Aroused | Sample Size | Emotion (Sample Size) | Subjects
---|---|---|---|---|---
Waltz No. 2 | 0:20–1:16 | Pleasure | 57 | Pleasure (114), Excitement (88) | No. 1, 6, 7, 12, 13, 18, 19, 24, 25, 30
 | 1:17–2:21 | Excitement | 65 | |
 | 2:22–3:01 | Pleasure | 40 | |
 | 3:02–3:24 | Excitement | 23 | |
 | 3:25–3:41 | Pleasure | 17 | |
No. 14 Couplets | 0:06–0:33 | Excitement | 28 | Briskness (59), Excitement (71), Nervousness (10) | No. 2, 5, 8, 11, 14, 17, 20, 23, 26, 29
 | 0:34–0:42 | Briskness | 9 | |
 | 0:43–0:52 | Nervousness | 10 | |
 | 0:53–1:07 | Excitement | 15 | |
 | 1:08–1:57 | Briskness | 50 | |
 | 1:58–2:25 | Excitement | 28 | |
The first movement of “Symphony No. 5 in C minor” | 0:16–0:45 | Passion | 30 | Passion (48), Relaxation (59), Cheerfulness (36), Nervousness (97) | No. 3, 4, 9, 10, 15, 16, 21, 22, 27, 28
 | 0:46–1:05 | Relaxation | 20 | |
 | 1:06–1:23 | Cheerfulness | 18 | |
 | 1:24–1:41 | Passion | 18 | |
 | 1:42–1:53 | Relaxation | 12 | |
 | 1:54–2:10 | Nervousness | 17 | |
 | 2:11–2:37 | Relaxation | 27 | |
 | 2:38–2:55 | Cheerfulness | 18 | |
 | 2:56–4:15 | Nervousness | 80 | |
Channel | Subject 1 | Subject 6 | Subject 7 | Subject 12 | Subject 13 | Subject 18 | Subject 19 | Subject 24 | Subject 25 | Subject 30 | Total
---|---|---|---|---|---|---|---|---|---|---|---
Fp1 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
F3 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 8 |
F7 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 6 |
FT9 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 5 |
FC5 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 3 |
FC1 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 4 |
C3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 |
T7 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 5 |
CP5 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 7 |
CP1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 1 | 6 |
Pz | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 10 |
P3 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 6 |
P7 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 7 |
O1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 7 |
Oz | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 4 |
O2 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 6 |
P4 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
P8 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
CP6 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 5 |
CP2 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
Cz | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
C4 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
T8 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 8 |
FT10 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 4 |
FC6 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 4 |
FC2 | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 7 |
F4 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 2 |
F8 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 8 |
Fp2 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 5 |
Fz | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 4 |
Accuracy (%) | 81.47 | 82.69 | 84.66 | 84.12 | 83.70 | 82.21 | 68.84 | 70.53 | 82.15 | 83.20 | — |
Channel | Subject 2 | Subject 5 | Subject 8 | Subject 11 | Subject 14 | Subject 17 | Subject 20 | Subject 23 | Subject 26 | Subject 29 | Total
---|---|---|---|---|---|---|---|---|---|---|---
Fp1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 9 |
F3 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 6 |
F7 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
FT9 | 1 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 8 |
FC5 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 6 |
FC1 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 6 |
C3 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 8 |
T7 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 5 |
CP5 | 1 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 7 |
CP1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 4 |
Pz | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
P3 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 6 |
P7 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 5 |
O1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 1 | 6 |
Oz | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 7 |
O2 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 6 |
P4 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 5 |
P8 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 6 |
CP6 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 6 |
CP2 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 7 |
Cz | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 4 |
C4 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
T8 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 8 |
FT10 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
FC6 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 4 |
FC2 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 3 |
F4 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 1 | 6 |
F8 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 5 |
Fp2 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
Fz | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 4 |
Accuracy (%) | 81.79 | 68.57 | 83.57 | 82.14 | 80.36 | 64.29 | 82.14 | 82.50 | 86.07 | 74.64 | — |
Channel | Subject 3 | Subject 4 | Subject 9 | Subject 10 | Subject 15 | Subject 16 | Subject 21 | Subject 22 | Subject 27 | Subject 28 | Total
---|---|---|---|---|---|---|---|---|---|---|---
Fp1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 7 |
F3 | 1 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 7 |
F7 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 7 |
FT9 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 6 |
FC5 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 1 | 4 |
FC1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 8 |
C3 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 4 |
T7 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 5 |
CP5 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 8 |
CP1 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 3 |
Pz | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 7 |
P3 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 7 |
P7 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 5 |
O1 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 7 |
Oz | 1 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 6 |
O2 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 9 |
P4 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 6 |
P8 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 7 |
CP6 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 1 | 5 |
CP2 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 6 |
Cz | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 3 |
C4 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 5 |
T8 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 1 | 6 |
FT10 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 5 |
FC6 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
FC2 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 7 |
F4 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 5 |
F8 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 5 |
Fp2 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 1 | 6 |
Fz | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 6 |
Accuracy (%) | 68.33 | 72.29 | 69.38 | 73.96 | 76.67 | 68.33 | 52.50 | 49.79 | 74.17 | 76.04 | — |
Channel | Fp1 * | F3 * | F7 | FT9 | FC5 | FC1 | C3 | T7 | CP5 * | CP1
---|---|---|---|---|---|---|---|---|---|---
FRSOC | 1.265 | 1.265 | 0.096 | 1.145 | 0.783 | 1.084 | 0.783 | 0.904 | 1.325 | 0.783
OCSR (%) | 70.0 | 70.0 | 53.3 | 63.3 | 43.3 | 60.0 | 43.3 | 50.0 | 73.3 | 43.3

Channel | Pz * | P3 | P7 | O1 | Oz | O2 * | P4 | P8 | CP6 | CP2
---|---|---|---|---|---|---|---|---|---|---
FRSOC | 1.265 | 1.145 | 1.024 | 1.145 | 1.024 | 1.265 | 0.904 | 1.084 | 0.096 | 1.145
OCSR (%) | 70.0 | 63.3 | 56.7 | 63.3 | 56.7 | 70.0 | 50.0 | 60.0 | 53.3 | 63.3

Channel | Cz | C4 ** | T8 * | FT10 ** | FC6 ** | FC2 | F4 | F8 | Fp2 | Fz
---|---|---|---|---|---|---|---|---|---|---
FRSOC | 0.723 | 0.663 | 1.325 | 0.663 | 0.663 | 1.024 | 0.783 | 1.084 | 0.843 | 0.843
OCSR (%) | 40.0 | 36.7 | 73.3 | 36.7 | 36.7 | 56.7 | 43.3 | 60.0 | 46.7 | 46.7
Share and Cite
Xie, Z.; Pan, J.; Li, S.; Ren, J.; Qian, S.; Ye, Y.; Bao, W. Musical Emotions Recognition Using Entropy Features and Channel Optimization Based on EEG. Entropy 2022, 24, 1735. https://doi.org/10.3390/e24121735