Novel Approach for Emotion Detection and Stabilizing Mental State by Using Machine Learning Techniques
Abstract
1. Introduction
2. Literature Survey
- Subtle emotions are not recognized.
- No work has been done on emotion detection for mentally disabled people.
- Emotions are captured in a controlled environment, so natural emotions are not captured.
- No therapeutic intervention is proposed to stabilize the emotional state.
3. Proposed System
4. Method
4.1. Participant
4.2. Equipment and Brain Signals
4.3. Procedure
4.3.1. Database Collection
- The participant was seated in front of the experimenter.
- When the participant has settled down, the experimenter starts the experiment.
- Renu saline solution is applied to the sensor material to improve its conductivity.
- The Emotiv EPOC 14-channel EEG headset is placed on the participant's scalp.
- The electrodes are adjusted until maximum contact quality is achieved.
- The participant is given some time to become aware of his/her mental state.
- The participant is asked to choose a stimulus to activate the target emotion.
- Three types of stimuli (thoughts, audio, and video) are used to elicit emotions; subjects choose the type of stimulus.
- When the participant gives the ready signal, EEG acquisition starts. Each recording lasts at least 1 min and is sampled at 128 Hz.
- This procedure is repeated to capture signals for the four emotion classes (angry, calm, happy, and sad); a sketch of the resulting data layout follows this list.
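The layout of the captured data can be summarized as follows. This is a minimal bookkeeping sketch, assuming the acquisition parameters stated above (14 channels, 128 Hz, recordings of at least 1 min, four emotion classes); the variable names and the struct layout are illustrative, not taken from the paper.

```matlab
% Hypothetical bookkeeping for the recorded sessions (names are illustrative,
% not taken from the paper): each trial is at least 60 s at 128 Hz on 14 channels.
fs        = 128;                                 % sampling rate (Hz)
nChannels = 14;                                  % Emotiv EPOC channels
emotions  = {'angry', 'calm', 'happy', 'sad'};   % target emotion classes

recordings = struct('emotion', {}, 'stimulus', {}, 'eeg', {});
for k = 1:numel(emotions)
    trial.emotion  = emotions{k};
    trial.stimulus = 'chosen by participant';    % thoughts, audio, or video
    trial.eeg      = zeros(nChannels, 60 * fs);  % placeholder for one 60 s recording
    recordings(end + 1) = trial;                 %#ok<AGROW>
end
```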
4.3.2. Pre-Processing
- Zero phase distortion.
- A filter transfer function equal to the squared magnitude of the original filter transfer function.
- A filter order that is double the order of the filter specified by the numerator and denominator coefficients. These are the properties of forward-backward zero-phase filtering, as sketched below.
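The three properties above correspond to forward-backward (zero-phase) filtering, for example MATLAB's filtfilt. The paper does not state the filter design or band edges; the sketch below assumes a fourth-order Butterworth band-pass of 0.5–45 Hz and a placeholder signal.

```matlab
% Zero-phase pre-processing sketch (requires the Signal Processing Toolbox).
% The 0.5–45 Hz Butterworth band-pass is an assumed design; the paper only
% states the zero-phase (forward-backward) filtering properties listed above.
fs     = 128;                                  % sampling rate (Hz)
eegRaw = randn(1, 60 * fs);                    % placeholder for one recorded channel
[b, a] = butter(4, [0.5 45] / (fs / 2), 'bandpass');
eegFiltered = filtfilt(b, a, eegRaw);          % zero phase distortion; squared
                                               % magnitude response; effective
                                               % filter order doubled
```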
4.3.3. Feature Extraction
(a) Chirplet Transform
- Convert the signal column vector into a row vector.
- Select the chirp rate as 5 and the sampling frequency as 128 Hz:
- C = 5;
- Fs = 128; % Hz
- The length of the window function can be selected as h = 2^v, v ∈ {0, 1, 2, 3, 4, 5, 6, 7}, i.e., h ∈ {1, 2, 4, 8, 16, 32, 64, 128}.
- Ideally, the window length h should be equal to the sampling frequency Fs.
- Use the reshape function: reshape(X, M, N) returns the M-by-N matrix whose elements are taken column-wise from X (an error results if X does not have M*N elements); reshape(X, ..., [], ...) calculates the length of the dimension represented by [] such that the product of the dimensions equals numel(X), so numel(X) must be evenly divisible by the product of the specified dimensions.
- SIGLCT = reshape(SIGLCT, size(SIGLCT,2)/h, h); % segment the signal into h columns
- Sig_Len = size(SIGLCT, 1); % number of rows (samples per segment)
- WindowVector = 1:Sig_Len;
- hwin = 1; Lh = 0; % single-point window, so the half window length Lh is 0
- tt = (1:Sig_Len)/Fs; % time vector
- TFR = zeros(Sig_Len, Sig_Len); % time-frequency representation
- for i1 = 1:Sig_Len
- ti = WindowVector(i1); % current time index
- tau = -min([round(Sig_Len/2)-1, Lh, ti-1]):min([round(Sig_Len/2)-1, Lh, Sig_Len-ti]);
- indices = rem(Sig_Len + tau, Sig_Len) + 1;
- rSig = SIGLCT(ti + tau, 1); % signal samples inside the window
- TFR(indices, i1) = rSig .* conj(hwin(Lh + 1 + tau)).' .* exp(-1j*2*pi*(C/2)*(tt(ti + tau) - tt(i1)).^2).';
- end
- TFR = fft(TFR); % move to the frequency domain
- TFR = TFR(1:round(end/2), :); % keep positive frequencies
- TFR1 = mean(TFR, 1); % average over frequency bins
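After the chirplet-based representation, 24 statistical and spectral features are extracted (Table 2). The paper lists the feature names but not the exact formulas; the sketch below assumes standard textbook definitions and Signal Processing Toolbox routines for a few of them, applied to a generic pre-processed channel (a random placeholder here).

```matlab
% Sketch of a few of the 24 features for one pre-processed channel x.
% Standard definitions are assumed; the paper lists the feature names
% (Table 2) but not the implementations.
fs = 128;
x  = randn(1, 60 * fs);                     % placeholder: pre-processed EEG channel

f.mean = mean(x);                           % mean
f.std  = std(x);                            % standard deviation
f.var  = var(x);                            % variance
f.rms  = rms(x);                            % root mean square
f.ieeg = sum(abs(x));                       % integrated EEG
f.ssi  = sum(x.^2);                         % simple square integral
f.wl   = sum(abs(diff(x)));                 % waveform length

dx = diff(x); ddx = diff(dx);               % Hjorth parameters
f.ha = var(x);                              % activity
f.hm = sqrt(var(dx) / var(x));              % mobility
f.hc = sqrt(var(ddx) / var(dx)) / f.hm;     % complexity

f.bandpower = bandpower(x, fs, [4 45]);     % band power (4–45 Hz range assumed)
[pxx, fvec] = periodogram(x, [], [], fs);   % periodogram-based features
[f.maxPow, idx] = max(pxx);                 % peak power
f.maxPowFreq = fvec(idx);                   % frequency of the peak
f.psd = mean(pwelch(x, [], [], [], fs));    % summarized Welch PSD (assumed summary)
f.thd = thd(x, fs);                         % total harmonic distortion (dB)
```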
(b) Feature Ranking
(c) Observations
- It is found that there is a high correlation between the features.
- Although it is suggested that, in the chirplet transform, the window length h should equal the sampling frequency Fs, it is observed that as h increases the calculated values of all the features become very low and are difficult to process. The ranking of the features is given in Table 3.
- For h = 64, the total harmonic distortion (THD) of the signal is negligible, close to 0. A lower THD represents a more faithful reproduction of the original recording; in communication systems it also means less interference with other devices and a higher transmit power for the signal of interest.
- The power spectral density (PSD) is ranked first by the feature selection algorithm. It shows the strength of the variations (energy) as a function of frequency, i.e., at which frequencies the variations are strong and at which they are weak. The best and worst features are ranked by a linear regression model.
- The band power is ranked high for all values of h; hence, band power could be the most prominent feature for classifying emotion. Comparing the delta band power of deep sleep and wakefulness, the former is very high and the latter very low.
- Figure 6 shows the correlations between the features. The correlation between all the features is very high; hence, all 24 features are first considered for emotion classification, and then the accuracy of the KNN classifier is tested on the features ranked highest by the feature ranking algorithm (a correlation and ranking sketch follows this list). The KNN classifier is used for classification.
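A minimal sketch of the correlation analysis and feature ranking follows. The univariate linear-regression ranking is an assumed stand-in for the linear regression model mentioned above, and FVSET is replaced by a random placeholder; the exact selector used in the paper is not specified.

```matlab
% Feature correlation and a regression-based ranking sketch (Statistics and
% Machine Learning Toolbox). The per-feature fitlm ranking is an assumed
% stand-in for the "linear regression model" ranking mentioned in the text.
FVSET = randn(24, 120);                  % placeholder for the 24 x 120 feature matrix
X = FVSET.';                             % 120 trials x 24 features
y = repelem((1:4).', 30);                % class labels: 30 trials per emotion

R = corrcoef(X);                         % 24 x 24 feature correlation matrix (cf. Figure 6)

score = zeros(1, size(X, 2));
for k = 1:size(X, 2)
    mdl = fitlm(X(:, k), y);             % one simple linear regression per feature
    score(k) = mdl.Rsquared.Ordinary;    % explained variance as a ranking score
end
[~, ranking] = sort(score, 'descend');   % feature indices, best first (cf. Table 3)
```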
4.3.4. Classification
- The FVSET file is created after feature extraction. Its dimension is 24 rows × 120 columns: each row corresponds to one of the 24 extracted features, and each column contains the feature vector of one recorded signal. Columns 1 to 30 hold features of the anger (annoying, angry, nervous) emotion class, columns 31 to 60 of the calm (calm, peaceful, relaxed) class, columns 61 to 90 of the happy (excited, happy, pleased) class, and columns 91 to 120 of the sad (sleepy, bored, sad) class. Figure 7 graphically shows the accuracy of the K-nearest neighbour (KNN), convolutional neural network (CNN), recurrent neural network (RNN), and deep neural network (DNN) classifiers; it is found to be very low [20]. The accuracy of emotion detection is mentioned below, and a classification sketch under these assumptions follows this list.
- At the time of signal acquisition, it was difficult for the subjects to label the emotion.
- Participants experienced sudden changes in emotion.
- Participants selected stimuli randomly.
- Features are highly correlated with each other.
- The calculated feature values are too low to discriminate one feature from another.
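A minimal classification sketch under the stated assumptions (24 × 120 feature matrix, 30 trials per class) is given below. The choices of k = 5, Euclidean distance, z-scoring, and 5-fold cross-validation are assumptions; the paper reports accuracies but not these hyper-parameters.

```matlab
% KNN classification sketch for the 24 x 120 feature matrix (Statistics and
% Machine Learning Toolbox). k = 5 and 5-fold cross-validation are assumed
% choices, not taken from the paper.
FVSET  = randn(24, 120);                 % placeholder feature matrix
labels = repelem((1:4).', 30);           % 1 = anger, 2 = calm, 3 = happy, 4 = sad

X   = zscore(FVSET.');                   % 120 x 24, standardized features
mdl = fitcknn(X, labels, 'NumNeighbors', 5, 'Distance', 'euclidean');

cvmdl = crossval(mdl, 'KFold', 5);       % 5-fold cross-validation
acc   = 1 - kfoldLoss(cvmdl);            % overall accuracy
fprintf('Cross-validated KNN accuracy: %.2f%%\n', 100 * acc);
```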
4.3.5. Music Therapy as an Intervention Technique
5. Discussion
- There are variations in the band power of each person.
- The band power of each captured signal can be visualized in EmotivPRO. The band power is represented graphically by showing the theta (4–8 Hz), alpha (8–12 Hz), low beta (12–16 Hz), beta (16–25 Hz), and gamma (25–45 Hz) variations in the captured signal with respect to time (a band-power computation sketch follows this list). The significance of each wave is given in Table 8.
- It is observed that theta wave activity is very high for intense emotion of any type.
- Gamma wave activity is mostly very low, close to negligible, for any type of emotion. Gamma activity is observed when participants try to remember or recall bad events.
- It is observed that gamma wave activity is reduced after applying the intervention in the form of meditation music.
- While listening to meditation music, all types of brain activity are reduced and participants report feeling a calm mental state.
- Meditation music lowers brain activity.
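The band-wise activity discussed above can be quantified as band power per frequency range. The sketch below assumes MATLAB's bandpower and the band limits quoted in this section; the signal is a random placeholder.

```matlab
% Band power per EEG band for one channel, using the ranges given above
% (Signal Processing Toolbox). The signal here is a placeholder.
fs = 128;
x  = randn(1, 60 * fs);                       % placeholder pre-processed channel
bands = struct('theta', [4 8], 'alpha', [8 12], 'lowBeta', [12 16], ...
               'beta', [16 25], 'gamma', [25 45]);
names = fieldnames(bands);
for k = 1:numel(names)
    bp = bandpower(x, fs, bands.(names{k}));  % average power within the band
    fprintf('%-8s band power: %.4f\n', names{k}, bp);
end
```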
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Matiko, J.W.; Beeby, S.P.; Tudor, J. Real time emotion detection within a wireless sensor network and its impact on power consumption. IET Wirel. Sens. Syst. 2014, 4, 183–190.
- Kimmatkar, N.V.; Vijaya Babu, B. A Survey and Comparative Analysis of Various Existing Techniques Used to Develop an Intelligent Emotion Recognition System Using EEG Signal Analysis; Serials Publications Pvt. Ltd.: New Delhi, India, 2017; pp. 707–717.
- Bhardwaj, A.; Gupta, A.; Jain, P.; Rani, A.; Yadav, J. Classification of human emotions from EEG signals using SVM and LDA classifiers. In Proceedings of the 2015 2nd International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 19–20 February 2015.
- Gawali, B.W.; Rao, S.; Abhang, P.; Rokade, P.; Mehrotra, S.C. Classification of EEG signals for different emotional states. In Proceedings of the Fourth International Conference on Advances in Recent Technologies in Communication and Computing (ARTCom2012), Bangalore, India, 19–20 October 2012; pp. 177–181.
- Blaiech, H.; Neji, M.; Wali, A.; Alimi, A.M. Emotion recognition by analysis of EEG signals. In Proceedings of the 13th International Conference on Hybrid Intelligent Systems (HIS 2013), Gammarth, Tunisia, 4–6 December 2013.
- Kaundanya, V.L.; Patil, A.; Panat, A. Performance of k-NN classifier for emotion detection using EEG signals. In Proceedings of the 2015 International Conference on Communications and Signal Processing (ICCSP), Melmaruvathur, India, 2–4 April 2015.
- Murugappan, M. Human emotion classification using wavelet transform and KNN. In Proceedings of the 2011 International Conference on Pattern Analysis and Intelligence Robotics, Kuala Lumpur, Malaysia, 28–29 June 2011.
- Mehmood, R.M.; Lee, H.J. Towards emotion recognition of EEG brain signals using Hjorth parameters and SVM. Adv. Sci. Technol. Lett. Biosci. Med. Res. 2015, 91, 24–27.
- Mehmood, R.M.; Lee, H.J. Emotion classification of EEG brain signal using SVM and KNN. In Proceedings of the 2015 IEEE International Conference on Multimedia & Expo Workshops (ICMEW), Turin, Italy, 29 June–3 July 2015.
- Zheng, W.L.; Lu, B.L. Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks. IEEE Trans. Auton. Ment. Dev. 2015, 7, 162–175.
- Duan, R.N.; Zhu, J.Y.; Lu, B.L. Differential entropy feature for EEG-based emotion classification. In Proceedings of the 2013 6th International IEEE/EMBS Conference on Neural Engineering (NER), San Diego, CA, USA, 6–8 November 2013.
- Harischandra, J.; Perera, M.U.S. Intelligent emotion recognition system using brain signals (EEG). In Proceedings of the 2012 IEEE-EMBS Conference on Biomedical Engineering and Sciences, Langkawi, Malaysia, 17–19 December 2012.
- Fan, J.; Wade, J.W.; Bian, D.; Key, A.P.; Warren, Z.E.; Mion, L.C.; Sarkar, N. A step towards EEG-based brain computer interface for autism intervention. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015.
- bin Yunus, J. The effect of noise removing on emotional classification. In Proceedings of the 2012 International Conference on Computer & Information Science (ICCIS), Kuala Lumpur, Malaysia, 12–14 June 2012.
- Ramaraju, S.; Izzidien, A.; Roula, M.A. The detection and classification of the mental state elicited by humor from EEG patterns. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015.
- Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A database for emotion analysis using physiological signals. IEEE Trans. Affect. Comput. 2011, 3, 18–31.
- Ekanayake, H. P300 and Emotiv EPOC: Does Emotiv EPOC Capture Real EEG? 2010. Available online: http://neurofeedback.visaduma.info/emotivresearch.htm (accessed on 28 April 2020).
- EMOTIV. Brain Controlled Technology Using Emotiv’s Algorithms | EMOTIV. 2020. Available online: https://www.emotiv.com/brain-controlled-technology/ (accessed on 12 June 2020).
- Yu, G.; Zhou, Y. General linear chirplet transform. Mech. Syst. Signal Process. 2016, 70, 958–973.
- Feature Selection for Machine Learning in Python. Available online: https://machinelearningmastery.com/feature-selection-machine-learning-python/ (accessed on 12 May 2020).
- Caesarendra, W.; Tjahjowidodo, T.; Pamungkas, D. EMG based classification of hand gestures using PCA and ANFIS. In Proceedings of the 2017 International Conference on Robotics, Biomimetics, and Intelligent Computational Systems (Robionetics), Bali, Indonesia, 23–25 August 2017.
- Kimmatkar, N.V.; Babu, B.V. Initial analysis of brain EEG signal for mental state detection of human being. In Proceedings of the 2017 International Conference on Trends in Electronics and Informatics (ICEI), Tirunelveli, India, 11–12 May 2017; pp. 287–295.
- Kimmatkar, N.V.; Babu, V.B. Human emotion classification from brain EEG signal using multimodal approach of classifier. In Proceedings of the 2018 International Conference on Intelligent Information Technology, Chennai, India, 11–14 December 2018.
Paper | Signal Acquisition | Pre-Processing | Feature Extraction | Features | Classification | Emotional State Detection |
---|---|---|---|---|---|---|
[3] | BIOPAC MP150 4-channel | BPF | ICA | Power Spectral Density | SVM | Happy, Sad, Disgust, Neutral, Fear, Surprised, Anger |
[4] | 64 Channel EEG Cap | BPF | STM | Mean, St. Dev., Variance, Covariance | LDA | Happy, Relaxed, Sad |
[5] | Emotiv EPOC 14 channel EEG headset | BPF | FFT | Arousal, Valence, Dominance | Fuzzy Rules | Neutrality, Joy, Sadness, Fear, Anger, Disgust, Surprise |
[6] | An ADInstruments PowerLab instrument | BPF | WT | Mean, St. Dev., Variance, RMS Value, Skewness, Power, Entropy | KNN | Sad, Happy |
[7] | 64 channel EEG electrodes | BPF | WT, STFT | Standard Deviation, Entropy, Power | KNN | Disgust, Happy, Surprise, Fear, Neutral |
[8] | Emotiv EPOC 14 channel EEG headset | BPF | STM | Valence, Arousal | SVM, KNN | Happy, Calm, Neutral, Sad, Scared |
[9] | Emotiv EPOC 14 channel EEG headset | ICA | STM | Frequency Band Power | SVM, KNN | Happy, Calm, Sad, Scared |
[10] | 64 Channel EEG Cap | BPF | PCA | DE, RASM, ASM | DBN, SVM, LR and KNN | Negative, Positive, Neutral |
[11] | 64 Channel EEG Cap | BPF | PCA | DE, DASM, RASM, ES | SVM, KNN | Negative, Positive, Neutral |
[12] | 64 Channel EEG Cap | BPF | DWT | Frequency Band Power | Fuzzy Rules | Bands |
[13] | 14-channel EEG | STM | FFT, STM | Spectral Features, Power Spectral Density | BN, NB, SVM, MMPN, KNN | Engagement, Mental Workload, Enjoyment, Frustration, Boredom |
[14] | 14-channel EEG | ICA, PCA | CSP | Frequency Band Power | SVM | Bands Separation |
[15] | 14-channel EEG | PCA | CSP | Frequency Band Power | SVM | Neutral and Humor |
[16] | 64 Channel EEG Cap | BPF | WT, FT | Frequency Band Power Spectral Density | Single-Trial Classification | Sadness, Amusement, Fear, Anger, Frustration, and Surprise |
Feature | Formula |
---|---|
Mean | ∑fx/∑f |
Std | √[∑D²/N] |
Variance | σ × σ |
IEEG | |
Mean absolute value (MAV) | |
Modified mean absolute value type I (MAV1) | |
Modified mean absolute value type II (MAV2) | |
Simple square integral (SSI) | |
Variance of EEG (VEEG) | |
Root mean square (RMS) | |
Difference absolute standard deviation value (DASTD) | mean2(DASTD) |
Autoregressive (PXX) | |
Hjorth activity (HA) | |
Hjorth mobility (HM) | |
Hjorth complexity (HC) | |
Waveform length (WL) | |
Harmonic distortion (THD) | |
Feature | Rank (h = 2) | Rank (h = 64) |
---|---|---|
Mean | 19 | 21 |
STD | 11 | 10 |
VAR | 7 | 9 |
SKEW | 18 | 17 |
Kurtosis | 20 | 18 |
IEEG | 4 | 7 |
MAV | 3 | 4 |
MAV1 | 12 | 19 |
MAV2 | 22 | 22 |
SSI | 2 | 16 |
VEEG | 9 | 11 |
RMS | 10 | 14 |
DASTD | 5 | 5 |
AREG_PXX | 24 | 23 |
HA | 6 | 12 |
HM | 16 | 8 |
HC | 15 | 15 |
WL | 14 | 13 |
BANDPOWER | 1 | 2 |
PERIODOGRAM1 (max power) | 13 | 6 |
PERIODOGRAM2 (max power index) | 21 | 20 |
ENVELOPE (harmonic distortion) | 23 | 24 |
Hilbert | 8 | 3 |
PSD | 17 | 1 |
Emotion | HA | HM | HC | THD | PSD | Mean | Std. Dev. | VEEG | DASTD | PXX | DASTD |
---|---|---|---|---|---|---|---|---|---|---|---|
Anger | 60 | 53.33 | 56.67 | 56.67 | 66.67 | 63.33 | 60 | 63.33 | 56.67 | 36.67 | 40 |
Calm | 46.67 | 63.33 | 63.33 | 46.67 | 53.33 | 46.67 | 36.67 | 33.33 | 50 | 53.33 | 56.67 |
Happy | 43.33 | 56.67 | 53.33 | 43.33 | 50 | 40 | 43.33 | 26.67 | 36.67 | 76.67 | 66.67 |
Sad | 56.67 | 50 | 46.67 | 63.33 | 63.33 | 56.67 | 50 | 53.33 | 73.33 | 53.33 | 36.67 |
Emotion | KNN Accuracy (%) |
---|---|
Anger | 46.67 |
Calm | 53.33 |
Happy | 33.33 |
Sad | 70.00 |
Sr. No | Signal | Detected Emotion by KNN Classifier |
---|---|---|
1 | Test1 | Angry |
2 | Test2 | Sleepy |
3 | Test3 | Calm |
4 | Test4 | Sleepy |
5 | Test5 | Angry |
6 | Test6 | Calm |
7 | Test7 | Angry |
8 | Test8 | Calm |
9 | Test9 | Sleepy |
Signal | Detected Emotion | Signal | Detected Emotion |
---|---|---|---|
Test1 | Relaxed | Test11 | Relaxed |
Test2 | Pleased | Test12 | Relaxed |
Test3 | Relaxed | Test13 | Nervous |
Test4 | Relaxed | Test14 | Sleepy |
Test5 | Nervous | Test15 | Pleased |
Test6 | Relaxed | Test16 | Relaxed |
Test7 | Nervous | Test17 | Nervous |
Test8 | Nervous | Test18 | Relaxed |
Test9 | Relaxed | Test19 | Relaxed |
Test10 | Relaxed | Test20 | Pleased |
Brain Wave Type | Frequency Range (Hz) | Location | Mental States and Conditions |
---|---|---|---|
Delta wave | 0–3.5 | Frontal lobe | Dreamless deep sleep, unconscious |
Theta wave | 4–7.5 | Midline, temporal | Enthusiastic, fantasy, imaginary |
Alpha wave | 8–12 | Frontal, occipital | Relaxed, calm, conscious |
Low beta wave | 12–15 | Frontal | Relaxed and focused, integrated |
Mid-range beta wave | 16–20 | Frontal | Thinking, aware of self and surroundings |
High beta wave | 21–30 | Frontal, central | Alertness, agitation |
Gamma wave | 30–100 | Frontal, central | Motor functions, higher mental activity |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).