Emotion Classification Based on CWT of ECG and GSR Signals Using Various CNN Models
Abstract
1. Introduction
- Emotions can be elicited using virtual reality, which closely resembles real environments.
- Participants of diverse cultural backgrounds and age groups, unfamiliar with one another, can be evaluated.
- ECG and GSR recordings can be captured with smart bands while participants are in a naturalistic environment.
- ECG and GSR recordings were obtained while participants watched short videos. Emotions were classified with a novel approach: continuous wavelet transform (CWT) coefficients were computed and rendered as scalogram images, which were then classified by deep learning models (see the sketch after this list).
- ECG recordings were obtained while participants watched long-duration videos individually and in groups, and emotions were classified using the same technique.
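A minimal sketch of the scalogram step named in the highlights, assuming PyWavelets with a Morlet mother wavelet, a 128 Hz sampling rate, and an illustrative scale range; these settings are assumptions for illustration, not necessarily the paper's exact configuration:

```python
import numpy as np
import pywt
import matplotlib.pyplot as plt

fs = 128                                  # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)              # one 10 s segment
ecg = np.sin(2 * np.pi * 1.2 * t)         # stand-in for a filtered ECG segment

scales = np.arange(1, 128)                # illustrative scale range
coeffs, freqs = pywt.cwt(ecg, scales, "morl", sampling_period=1 / fs)

# Render |CWT coefficients| as a time-frequency image (scalogram) and save
# it as a CNN input image.
plt.imshow(np.abs(coeffs), aspect="auto", cmap="jet",
           extent=[t[0], t[-1], freqs[-1], freqs[0]])
plt.axis("off")
plt.savefig("scalogram.png", bbox_inches="tight", pad_inches=0)
plt.close()
```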
2. Related Works
3. Methodology
3.1. Data Description
3.1.1. Case 1: Valence and Arousal Classification with Short Videos
3.1.2. Case 2: Valence and Arousal Classification with Long Videos
3.2. Pre-Processing
3.2.1. Filtering
3.2.2. Segmentation
3.3. Continuous Wavelet Transform
3.4. Scalogram
3.5. Convolutional Neural Network
3.5.1. DenseNet CNN
3.5.2. MobileNet CNN
3.5.3. NASNet Mobile CNN
3.5.4. InceptionResNetV2 CNN
3.5.5. EfficientNetB7 CNN
4. Results and Discussion
4.1. Case 1: Arousal and Valence Classification Using ECG and GSR Data while Participants Watched Short Videos
4.1.1. ECG Arousal Classification
4.1.2. ECG Valence Classification
4.1.3. GSR Arousal Classification
4.1.4. GSR Valence Classification
4.2. Case 2: Arousal and Valence Classification of Participants while Watching Long Videos in Individual and Group Settings
4.2.1. Individual Arousal Classification
4.2.2. Individual Valence Classification
4.2.3. Group Arousal Classification
4.2.4. Group Valence Classification
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Egger, M.; Ley, M.; Hanke, S. Emotion recognition from physiological signal analysis: A review. Electron. Notes Theor. Comput. Sci. 2019, 343, 35–55.
- Lee, M.S.; Lee, Y.K.; Pae, D.S.; Lim, M.T.; Kim, D.W.; Kang, T.K. Fast emotion recognition based on single pulse PPG signal with convolutional neural network. Appl. Sci. 2019, 9, 3355.
- Bulagang, A.F.; Weng, N.G.; Mountstephens, J.; Teo, J. A review of recent approaches for emotion classification using electrocardiography and electrodermography signals. Inform. Med. Unlocked 2020, 20, 100363.
- Dessai, A.; Virani, H. Emotion Classification using Physiological Signals: A Recent Survey. In Proceedings of the 2022 IEEE International Conference on Signal Processing, Informatics, Communication and Energy Systems (SPICES), Thiruvananthapuram, India, 10–12 March 2022; IEEE: Piscataway, NJ, USA, 2022; Volume 1, p. 333.
- Meléndez, J.C.; Satorres, E.; Reyes-Olmedo, M.; Delhom, I.; Real, E.; Lora, Y. Emotion recognition changes in a confinement situation due to COVID-19. J. Environ. Psychol. 2020, 72, 101518.
- Goshvarpour, A.; Abbasi, A.; Goshvarpour, A. An accurate emotion recognition system using ECG and GSR signals and matching pursuit method. Biomed. J. 2017, 40, 355–368.
- Hsu, Y.; Wang, J.; Chiang, W.; Hung, C. Automatic ECG-Based Emotion Recognition in Music Listening. IEEE Trans. Affect. Comput. 2020, 11, 85–99.
- Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A Database for Emotion Analysis Using Physiological Signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31.
- Soleymani, M.; Lichtenauer, J.; Pun, T.; Pantic, M. A Multimodal Database for Affect Recognition and Implicit Tagging. IEEE Trans. Affect. Comput. 2012, 3, 42–55.
- Subramanian, R.; Wache, J.; Abadi, M.K.; Vieriu, R.L.; Winkler, S.; Sebe, N. ASCERTAIN: Emotion and personality recognition using commercial sensors. IEEE Trans. Affect. Comput. 2018, 9, 147–160.
- Miranda-Correa, J.A.; Abadi, M.K.; Sebe, N.; Patras, I. AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups. IEEE Trans. Affect. Comput. 2021, 12, 479–493.
- Maggio, A.V.; Bonomini, M.P.; Leber, E.L.; Arini, P.D. Quantification of ventricular repolarization dispersion using digital processing of the surface ECG. In Advances in Electrocardiograms—Methods and Analysis; IntechOpen: Rijeka, Croatia, 2012.
- Dessai, A.; Virani, H. Emotion Detection Using Physiological Signals. In Proceedings of the 2021 International Conference on Electrical, Computer and Energy Technologies (ICECET), Cape Town, South Africa, 9–10 December 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–4.
- Al Machot, F.; Elmachot, A.; Ali, M.; Al Machot, E.; Kyamakya, K. A deep-learning model for subject-independent human emotion recognition using electrodermal activity sensors. Sensors 2019, 19, 1659.
- Continuous Wavelet Transform. Available online: https://en.wikipedia.org/wiki/Continuous_wavelet_transform (accessed on 20 March 2023).
- Sepúlveda, A.; Castillo, F.; Palma, C.; Rodriguez-Fernandez, M. Emotion recognition from ECG signals using wavelet scattering and machine learning. Appl. Sci. 2021, 11, 4945.
- Shukla, J.; Barreda-Angeles, M.; Oliver, J.; Nandi, G.C.; Puig, D. Feature Extraction and Selection for Emotion Recognition from Electrodermal Activity. IEEE Trans. Affect. Comput. 2019, 12, 857–869.
- Romeo, L.; Cavallo, A.; Pepa, L.; Bianchi-Berthouze, N.; Pontil, M. Multiple Instance Learning for Emotion Recognition Using Physiological Signals. IEEE Trans. Affect. Comput. 2022, 13, 389–407.
- Bulagang, A.F.; Mountstephens, J.; Teo, J. Multiclass emotion prediction using heart rate and virtual reality stimuli. J. Big Data 2021, 8, 12.
- Dessai, A.U.; Virani, H.G. Emotion Detection and Classification Using Machine Learning Techniques. In Multidisciplinary Applications of Deep Learning-Based Artificial Emotional Intelligence; IGI Global: Hershey, PA, USA, 2023; pp. 11–31.
- Dar, M.N.; Akram, M.U.; Khawaja, S.G.; Pujari, A.N. CNN and LSTM-based emotion charting using physiological signals. Sensors 2020, 20, 4551.
- Santamaria-Granados, L.; Munoz-Organero, M.; Ramirez-Gonzalez, G.; Abdulhay, E.; Arunkumar, N. Using deep convolutional neural network for emotion detection on a physiological signals dataset (AMIGOS). IEEE Access 2018, 7, 57–67.
- Hammad, D.S.; Monkaresi, H. ECG-Based Emotion Detection via Parallel-Extraction of Temporal and Spatial Features Using Convolutional Neural Network. Traitement du Signal 2022, 39, 43–57.
- Lee, M.; Lee, Y.K.; Lim, M.T.; Kang, T.K. Emotion recognition using convolutional neural network with selected statistical photoplethysmogram features. Appl. Sci. 2020, 10, 3501.
- Harper, R.; Southern, J. A Bayesian deep learning framework for end-to-end prediction of emotion from heartbeat. IEEE Trans. Affect. Comput. 2020, 13, 985–991.
- Ismail, S.N.M.S.; Aziz, N.A.A.; Ibrahim, S.Z.; Nawawi, S.W.; Alelyani, S.; Mohana, M.; Chun, L.C. Evaluation of electrocardiogram: Numerical vs. image data for emotion recognition system. F1000Research 2021, 10, 1114.
- Pólya, T.; Csertő, I. Emotion Recognition Based on the Structure of Narratives. Electronics 2023, 12, 919.
- Bhangale, K.; Kothandaraman, M. Speech Emotion Recognition Based on Multiple Acoustic Features and Deep Convolutional Neural Network. Electronics 2023, 12, 839.
- Dukić, D.; Sovic Krzic, A. Real-Time Facial Expression Recognition Using Deep Learning with Application in the Active Classroom Environment. Electronics 2022, 11, 1240.
- Kamińska, D.; Aktas, K.; Rizhinashvili, D.; Kuklyanov, D.; Sham, A.H.; Escalera, S.; Nasrollahi, K.; Moeslund, T.B.; Anbarjafari, G. Two-Stage Recognition and beyond for Compound Facial Emotion Recognition. Electronics 2021, 10, 2847.
- Gaddam, D.K.R.; Ansari, M.D.; Vuppala, S.; Gunjan, V.K.; Sati, M.M. Human facial emotion detection using deep learning. In ICDSMLA 2020: Proceedings of the 2nd International Conference on Data Science, Machine Learning and Applications; Springer: Singapore, 2022; pp. 1417–1427.
- Saganowski, S. Bringing Emotion Recognition Out of the Lab into Real Life: Recent Advances in Sensors and Machine Learning. Electronics 2022, 11, 496.
- Wang, T.; Lu, C.; Sun, Y.; Yang, M.; Liu, C.; Ou, C. Automatic ECG classification using continuous wavelet transform and convolutional neural network. Entropy 2021, 23, 119.
- Byeon, Y.-H.; Pan, S.-B.; Kwak, K.-C. Intelligent deep models based on scalograms of electrocardiogram signals for biometrics. Sensors 2019, 19, 935.
- Mashrur, F.R.; Roy, A.D.; Saha, D.K. Automatic identification of arrhythmia from ECG using AlexNet convolutional neural network. In Proceedings of the 2019 4th International Conference on Electrical Information and Communication Technology (EICT), Khulna, Bangladesh, 20–22 December 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–5.
- Aslan, M. CNN based efficient approach for emotion recognition. J. King Saud Univ.-Comput. Inf. Sci. 2021, 34, 7335–7346.
- Garg, N.; Garg, R.; Anand, A.; Baths, V. Decoding the neural signatures of valence and arousal from portable EEG headset. Front. Hum. Neurosci. 2022, 16, 808.
- Alsubai, S. Emotion Detection Using Deep Normalized Attention-Based Neural Network and Modified-Random Forest. Sensors 2023, 23, 225.
- Castiblanco Jimenez, I.A.; Gomez Acevedo, J.S.; Olivetti, E.C.; Marcolin, F.; Ulrich, L.; Moos, S.; Vezzetti, E. User Engagement Comparison between Advergames and Traditional Advertising Using EEG: Does the User’s Engagement Influence Purchase Intention? Electronics 2022, 12, 122.
- Cui, X.; Wu, Y.; Wu, J.; You, Z.; Xiahou, J.; Ouyang, M. A review: Music-emotion recognition and analysis based on EEG signals. Front. Neuroinform. 2022, 16, 997282.
- Russell, J.A. A circumplex model of affect. J. Personal. Soc. Psychol. 1980, 39, 1161.
- Cai, Y.; Li, X.; Li, J. Emotion Recognition Using Different Sensors, Emotion Models, Methods and Datasets: A Comprehensive Review. Sensors 2023, 23, 2455.
- Khan, C.M.T.; Ab Aziz, N.A.; Raja, J.E.; Nawawi, S.W.B.; Rani, P. Evaluation of Machine Learning Algorithms for Emotions Recognition using Electrocardiogram. Emerg. Sci. J. 2022, 7, 147–161.
- DenseNet|Densely Connected Convolutional Networks. YouTube, 2022. Available online: https://www.youtube.com/watch?v=hCg9bolMeJM (accessed on 29 March 2023).
- Ahmed, A. Architecture of DenseNet-121. OpenGenus IQ, 2021. Available online: https://iq.opengenus.org/architecture-of-densenet121/ (accessed on 29 March 2023).
- Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. arXiv 2017, arXiv:1704.04861. Available online: https://arxiv.org/abs/1704.04861 (accessed on 29 March 2023).
- Available online: https://sh-tsang.medium.com/review-nasnet-neural-architecture-search-network-image-classification (accessed on 29 March 2023).
- Nair, D. NASNet—A Brief Overview. OpenGenus IQ, 2020. Available online: https://iq.opengenus.org/nasnet (accessed on 29 March 2023).
- Elhamraoui, Z. InceptionResNetV2 Simple Introduction. Medium, 2020. Available online: https://medium.com/@zahraelhamraoui1997/inceptionresnetv2-simple-introduction-9a2000edcdb6 (accessed on 29 March 2023).
- Szegedy, C.; Ioffe, S.; Vanhoucke, V.; Alemi, A. Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning. arXiv 2016, arXiv:1602.07261. Available online: https://arxiv.org/abs/1602.07261 (accessed on 29 March 2023).
- Tan, M.; Le, Q. EfficientNetV2: Smaller Models and Faster Training. arXiv 2021, arXiv:2104.00298v3. Available online: https://arxiv.org/pdf/2104.00298 (accessed on 29 March 2023).
- Available online: https://towardsdatascience.com/a-look-at-precision-recall-and-f1-score (accessed on 29 March 2023).
- Sarkar, P.; Etemad, A. Self-supervised ECG representation learning for emotion recognition. IEEE Trans. Affect. Comput. 2020, 13, 1541–1554.
Video Number | Category | Source Dataset | Source Movie | Video Duration (mm:ss) |
---|---|---|---|---|
1 | HVLA | DECAF | August Rush | 01:30 |
6 | LVLA | DECAF | My Girl | 01:00 |
8 | LVHA | MAHNOB | Silent Hill | 01:10 |
12 | HVHA | DECAF | Airplane | 01:25 |
Video Number | Category |
---|---|
17 | LVHA |
18 | HVLA |
19 | LVLA |
20 | LVLA |
CNN MODEL | Architectural Specifications | Size | Depth | Total Parameters | Trainable Parameters | Non-Trainable Parameters |
---|---|---|---|---|---|---|
MobileNet | Depthwise separable convolutions; smaller and faster than standard convolutional networks | 16 MB | 88 | 3,228,864 | 3,206,976 | 21,888
NASNetMobile | Building blocks are not predefined but discovered with a reinforcement-learning-based architecture search | 23 MB | _ | 4,269,716 | 4,232,978 | 36,738
DenseNet201 | Feature maps of each layer are passed as input to all subsequent layers; mitigates the vanishing-gradient problem | 80 MB | 201 | 18,321,984 | 18,092,928 | 229,056
InceptionResNetV2 | Multiple-sized convolutional filters of the Inception architecture combined with residual connections | 215 MB | 572 | 54,336,736 | 54,276,192 | 60,544
EfficientNetB7 | Combines training-aware neural architecture search with compound scaling | _ | 813 | 64,097,687 | 63,786,960 | 310,727
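To illustrate how such pretrained models can be applied to scalogram images, the sketch below builds a binary high/low classifier on Keras's MobileNet. The directory layout, 224×224 input size, frozen backbone, and training settings are illustrative assumptions; the trainable-parameter counts in the table above suggest the authors fine-tuned most layers rather than freezing them.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Scalogram images sorted into class subfolders, e.g.
# scalograms/train/high_arousal and scalograms/train/low_arousal (hypothetical paths).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "scalograms/train", image_size=(224, 224), batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "scalograms/val", image_size=(224, 224), batch_size=32)

base = tf.keras.applications.MobileNet(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False                         # freeze ImageNet features (one common choice)

model = models.Sequential([
    layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNet expects inputs in [-1, 1]
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(1, activation="sigmoid"),     # binary high/low output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```

Any other model in the table can be swapped in by replacing `tf.keras.applications.MobileNet` with, for example, `tf.keras.applications.DenseNet201`, together with that model's own input preprocessing.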
No. | CNN Model | Accuracy (%) | Precision | Recall | F1 Score | Data Execution Time (s) |
---|---|---|---|---|---|---|
1 | InceptionResNetV2 | 91.45 | 0.91 | 0.90 | 0.90 | 920.778 |
2 | MobileNet | 90.55 | 0.90 | 0.90 | 0.90 | 215.212 |
3 | DenseNet201 | 90.55 | 0.92 | 0.92 | 0.92 | 850.947 |
4 | NASNetMobile | 89.27 | 0.90 | 0.90 | 0.90 | 297.214 |
5 | EfficientNetB7 | 80.18 | 0.82 | 0.80 | 0.80 | 1743.611 |
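The accuracy, precision, recall, and F1 score columns in these results tables can be reproduced from model predictions; a toy sketch using scikit-learn with made-up labels (not the authors' evaluation code):

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]   # toy ground-truth high/low labels
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]   # toy model predictions

print(f"Accuracy : {100 * accuracy_score(y_true, y_pred):.2f}%")
print(f"Precision: {precision_score(y_true, y_pred):.2f}")
print(f"Recall   : {recall_score(y_true, y_pred):.2f}")
print(f"F1 score : {f1_score(y_true, y_pred):.2f}")
```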
No. | CNN Model | Accuracy (%) | Precision | Recall | F1 Score | Data Execution Time (s) |
---|---|---|---|---|---|---|
1 | InceptionResNetV2 | 91.27 | 0.90 | 0.90 | 0.90 | 892.63
2 | MobileNet | 90.55 | 0.93 | 0.93 | 0.93 | 183.081
3 | DenseNet201 | 92.00 | 0.93 | 0.93 | 0.93 | 895.516
4 | NASNetMobile | 89.09 | 0.92 | 0.92 | 0.92 | 303.057 |
5 | EfficientNetB7 | 71.64 | 0.78 | 0.72 | 0.70 | 17,772.865 |
No. | CNN Model | Accuracy (%) | Precision | Recall | F1 Score | Data Execution Time (s) |
---|---|---|---|---|---|---|
1 | InceptionResNetV2 | 97.58 | 0.98 | 0.98 | 0.98 | 205.267 |
2 | MobileNet | 98.39 | 0.98 | 0.98 | 0.98 | 62.214 |
3 | DenseNet201 | 99.19 | 0.98 | 0.98 | 0.98 | 163.155 |
4 | NASNetMobile | 96.77 | 0.97 | 0.97 | 0.97 | 76.611 |
5 | EfficientNetB7 | 76.61 | 0.75 | 0.75 | 0.75 | 486.086 |
No. | CNN Model | Accuracy (%) | Precision | Recall | F1 Score | Data Execution Time (s) |
---|---|---|---|---|---|---|
1 | InceptionResNetV2 | 91.13 | 0.95 | 0.95 | 0.95 | 202.349 |
2 | MobileNet | 99.19 | 0.97 | 0.97 | 0.97 | 43.428 |
3 | DenseNet201 | 96.77 | 0.97 | 0.97 | 0.97 | 186.739 |
4 | NASNetMobile | 95.97 | 0.93 | 0.92 | 0.92 | 79.452 |
5 | EfficientNetB7 | 79.84 | 0.79 | 0.78 | 0.78 | 479.478 |
No. | CNN Model | ECG Arousal Accuracy (%) | GSR Arousal Accuracy (%) | ECG Valence Accuracy (%) | GSR Valence Accuracy (%) |
---|---|---|---|---|---|
1 | InceptionResNetV2 | 91.45 | 97.58 | 91.27 | 91.13 |
2 | MobileNet | 90.55 | 98.39 | 90.55 | 99.19 |
3 | DenseNet201 | 90.55 | 99.19 | 92.00 | 96.77
4 | NASNetMobile | 89.27 | 96.77 | 89.09 | 95.97 |
5 | EfficientNetB7 | 80.18 | 76.61 | 71.64 | 79.84 |
Ref. No. | Databases | Biosignals | Methods | Accuracy |
---|---|---|---|---|
[2] | DEAP | ECG | 1D CNN | Valence: 75.3%; arousal: 76.2% |
[14] | DEAP, MAHNOB | GSR | 1D CNN | MAHNOB: valence and arousal: 78%; DEAP: valence and arousal: 82%
[21] | DREAMER, AMIGOS | ECG, GSR | 1D CNN, LSTM | DREAMER: valence and arousal: 89.25%; AMIGOS: ECG valence and arousal: 98.73%; GSR valence and arousal: 63.67%
[22] | AMIGOS | ECG, GSR | 1D CNN | ECG valence: 75%; arousal: 76%; GSR valence: 75%; arousal: 71%
[24] | DEAP | PPG | 1D CNN | Valence: 82.1%; arousal: 80.9%
[25] | DREAMER, AMIGOS | ECG | CNN-LSTM (Bayesian) | DREAMER: valence: 86%; arousal: 83%; AMIGOS: valence: 90%; arousal: 88%
[53] | DREAMER | ECG | 1D CNN | Self-supervised CNN: valence: 85%; arousal: 85.9%; fully supervised CNN: valence: 66.6%; arousal: 70.7%
Proposed Work | AMIGOS | ECG (short video data) | InceptionResNetV2 CNN | Valence: 91.27%; arousal: 91.45%
Proposed Work | AMIGOS | GSR (short video data) | MobileNet CNN | Valence: 99.19%; arousal: 98.39%
No. | CNN Model | Accuracy (%) | Precision | Recall | F1 Score | Data Execution Time (s) |
---|---|---|---|---|---|---|
1 | InceptionResNetV2 | 99.8 | 1 | 1 | 1 | 857.976 |
2 | MobileNet | 99.8 | 1 | 1 | 1 | 211.122 |
3 | DenseNet201 | 99.8 | 1 | 1 | 1 | 881.271 |
4 | NASNetMobile | 99.8 | 1 | 1 | 1 | 300.944 |
No. | CNN Model | Accuracy (%) | Precision | Recall | F1 Score | Data Execution Time (s) |
---|---|---|---|---|---|---|
1 | InceptionResNetV2 | 99.8 | 1 | 0.9 | 1 | 314.81 |
2 | MobileNet | 99.8 | 1 | 1 | 1 | 174.88 |
3 | DenseNet201 | 99.8 | 1 | 1 | 1 | 932.799 |
4 | NASNetMobile | 99.8 | 1 | 1 | 1 | 561.291 |
No. | CNN Model | Accuracy (%) | Precision | Recall | F1 Score | Data Execution Time (s) |
---|---|---|---|---|---|---|
1 | MobileNet | 99.82 | 1 | 1 | 1 | 223.211 |
2 | DenseNet201 | 99.10 | 1 | 1 | 1 | 960.337 |
3 | InceptionResNetV2 | 98.74 | 1 | 1 | 1 | 871.891 |
4 | NASNetMobile | 98.74 | 0.99 | 0.99 | 0.99 | 301.785 |
No. | CNN Model | Accuracy (%) | Precision | Recall | F1 Score | Data Execution Time (s) |
---|---|---|---|---|---|---|
1 | MobileNet | 99.82 | 1 | 1 | 1 | 191.844 |
2 | DenseNet201 | 99.82 | 1 | 1 | 1 | 849.547 |
3 | NASNetMobile | 99.45 | 1 | 1 | 1 | 300.732 |
4 | InceptionResNetV2 | 99.27 | 1 | 1 | 1 | 872.207 |
No. | CNN Model | ECG Short Video Arousal Accuracy (%) | ECG Short Video Valence Accuracy (%) | ECG Individual Long Video Arousal Accuracy (%) | ECG Individual Long Video Valence Accuracy (%) | ECG Group Long Video Arousal Accuracy (%) | ECG Group Long Video Valence Accuracy (%) |
---|---|---|---|---|---|---|---|
1 | InceptionResNetV2 | 91.45 | 91.27 | 99.8 | 99.8 | 98.74 | 99.27 |
2 | MobileNet | 90.55 | 90.55 | 99.8 | 99.8 | 99.82 | 99.82 |
3 | DenseNet201 | 90.55 | 92.00 | 99.8 | 99.8 | 99.10 | 99.82
4 | NASNetMobile | 89.27 | 89.09 | 99.8 | 99.8 | 98.74 | 99.45 |