Temporal Convolutional Network-Enhanced Real-Time Implicit Emotion Recognition with an Innovative Wearable fNIRS-EEG Dual-Modal System
Abstract
1. Introduction
1.1. Background and Motivation
1.2. Related Works
1.3. Research Gap
1.4. Contribution
2. Materials and Methods
2.1. Participants
2.2. Experimental Procedure
2.3. Overview of the New Portable Wearable Functional Near-Infrared Spectroscopy-Electroencephalography (fNIRS-EEG) System
2.3.1. Functional Near-Infrared Spectroscopy-Electroencephalography (fNIRS-EEG) Data Preprocessing
- (1) Functional near-infrared spectroscopy (fNIRS) data preprocessing
- (2) Electroencephalography (EEG) data preprocessing
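For concreteness, the following is a minimal sketch of the two preprocessing steps listed above: zero-phase band-pass filtering for both modalities and a modified Beer-Lambert conversion of fNIRS optical density to HbO/HbR. The filter bands, sampling rates, differential pathlength factor, and extinction coefficients are illustrative assumptions, not the authors' exact settings.

```python
# Minimal fNIRS/EEG preprocessing sketch (illustrative settings only).
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass(data, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter along the last axis."""
    sos = butter(order, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, data, axis=-1)

def fnirs_preprocess(delta_od, fs=10.0):
    """Band-pass optical-density changes, then apply the modified
    Beer-Lambert law to estimate HbO/HbR concentration changes.
    Extinction coefficients (1/(mM*cm)) for 760/850 nm, the DPF, and
    the source-detector distance are illustrative values."""
    od = bandpass(delta_od, 0.01, 0.2, fs)          # drift/cardiac suppression
    ext = np.array([[1.4866, 3.8437],               # 760 nm: [HbO, HbR]
                    [2.5264, 1.7986]])              # 850 nm: [HbO, HbR]
    dpf, distance_cm = 6.0, 3.0
    hbo, hbr = np.linalg.pinv(ext * dpf * distance_cm) @ od
    return hbo, hbr

def eeg_preprocess(raw, fs=500.0):
    """1-45 Hz band-pass; ICA-based artifact removal would follow."""
    return bandpass(raw, 1.0, 45.0, fs)

# Toy usage: one fNIRS channel measured at two wavelengths, one EEG channel.
hbo, hbr = fnirs_preprocess(np.random.randn(2, 600), fs=10.0)
eeg_clean = eeg_preprocess(np.random.randn(5000), fs=500.0)
```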
2.3.2. Functional Near-Infrared Spectroscopy-Electroencephalography (fNIRS-EEG) Correlation Analysis
2.4. Temporal Convolutional Network (TC-ResNet) Model
2.4.1. Temporal Convolution for Emotion Recognition
2.4.2. Temporal Convolutional Network (TC-ResNet) Architecture
2.4.3. Temporal Convolutional Network (TC-ResNet8) Setup
- (1) Data processing
- (2) Training
- (3) Evaluation
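To make the three setup steps above concrete, here is a minimal TC-ResNet8-style model sketch in TensorFlow/Keras (the framework cited in [90] below). The 16/24/32/48 channel widths and 3/9-tap temporal kernels follow the original TC-ResNet8 design; the input shape (250 time steps, 44 channels, e.g., 24 fNIRS + 20 EEG) and the four emotion classes follow the tables later in this article, while the remaining details are illustrative assumptions rather than the authors' exact configuration.

```python
# TC-ResNet8-style sketch: temporal (1D) convolutions over the time axis,
# with fused fNIRS-EEG features stacked along the channel axis.
import tensorflow as tf
from tensorflow.keras import layers

def residual_block(x, filters, stride):
    """Temporal residual block: conv-BN-ReLU x2 plus identity/projection."""
    y = layers.Conv1D(filters, 9, strides=stride, padding="same", use_bias=False)(x)
    y = layers.BatchNormalization()(y)
    y = layers.ReLU()(y)
    y = layers.Conv1D(filters, 9, strides=1, padding="same", use_bias=False)(y)
    y = layers.BatchNormalization()(y)
    if stride != 1 or x.shape[-1] != filters:
        x = layers.Conv1D(filters, 1, strides=stride, use_bias=False)(x)
        x = layers.BatchNormalization()(x)
    return layers.ReLU()(layers.Add()([x, y]))

def tc_resnet8(seq_len, n_features, n_classes=4):
    """Three residual blocks on the time axis; features act as channels."""
    inp = layers.Input(shape=(seq_len, n_features))       # (time, channels)
    x = layers.Conv1D(16, 3, padding="same", use_bias=False)(inp)
    for filters in (24, 32, 48):
        x = residual_block(x, filters, stride=2)
    x = layers.GlobalAveragePooling1D()(x)
    out = layers.Dense(n_classes, activation="softmax")(x)  # calm/sad/happy/fear
    return tf.keras.Model(inp, out)

model = tc_resnet8(seq_len=250, n_features=44)  # e.g., 24 fNIRS + 20 EEG channels
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```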
3. Results and Analysis
3.1. Functional Near-Infrared Spectroscopy (fNIRS) Data Analysis
3.2. Electroencephalography (EEG) Data Analysis
3.3. Functional Near-Infrared Spectroscopy-Electroencephalography (fNIRS-EEG) Data Correlation
3.4. Analysis of Emotion Recognition Classification Results
3.5. Temporal Convolutional Network (TC-ResNet) Model Evaluation
4. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Jiang, X.; Fan, J.; Zhu, Z.; Wang, Z.; Guo, Y.; Liu, X.; Jia, F.; Dai, C. Cybersecurity in neural interfaces: Survey and future trends. Comput. Biol. Med. 2023, 167, 107604. [Google Scholar] [CrossRef]
- Liu, Z.-T.; Xie, Q.; Wu, M.; Cao, W.-H.; Mei, Y.; Mao, J.-W. Speech emotion recognition based on an improved brain emotion learning model. Neurocomputing 2018, 309, 145–156. [Google Scholar] [CrossRef]
- Liu, H.; Cai, H.; Lin, Q.; Zhang, X.; Li, X.; Xiao, H. FEDA: Fine-grained emotion difference analysis for facial expression recognition. Biomed. Signal Process. Control 2023, 79, 104209. [Google Scholar] [CrossRef]
- Zhang, F.; Li, X.-C.; Lim, C.P.; Hua, Q.; Dong, C.-R.; Zhai, J.-H. Deep Emotional Arousal Network for Multimodal Sentiment Analysis and Emotion Recognition. Inf. Fusion 2022, 88, 296–304. [Google Scholar] [CrossRef]
- Rahman, M.M.; Sarkar, A.K.; Hossain, M.A.; Hossain, M.S.; Islam, M.R.; Hossain, M.B.; Quinn, J.M.W.; Moni, M.A. Recognition of human emotions using EEG signals: A review. Comput. Biol. Med. 2021, 136, 104696. [Google Scholar] [CrossRef]
- Eastmond, C.; Subedi, A.; De, S.; Intes, X. Deep learning in fNIRS: A review. Neurophotonics 2022, 9, 041411. [Google Scholar] [CrossRef]
- Vanutelli, M.E.; Grippa, E. 104. Resting lateralized activity (fNIRS) predicts the cortical response and appraisal of emotions. Clin. Neurophysiol. 2016, 127, e156. [Google Scholar] [CrossRef]
- Bandara, D.; Velipasalar, S.; Bratt, S.; Hirshfield, L. Building predictive models of emotion with functional near-infrared spectroscopy. Int. J. Hum.-Comput. Stud. 2018, 110, 75–85. [Google Scholar] [CrossRef]
- Manelis, A.; Huppert, T.J.; Rodgers, E.; Swartz, H.A.; Phillips, M.L. The role of the right prefrontal cortex in recognition of facial emotional expressions in depressed individuals: fNIRS study. J. Affect. Disord. 2019, 258, 151–158. [Google Scholar] [CrossRef]
- Floreani, E.D.; Orlandi, S.; Chau, T. A pediatric near-infrared spectroscopy brain-computer interface based on the detection of emotional valence. Front. Hum. Neurosci. 2022, 16, 938708. [Google Scholar] [CrossRef]
- Yeung, M.K. The prefrontal cortex is differentially involved in implicit and explicit facial emotion processing: An fNIRS study. Biol. Psychol. 2023, 181, 108619. [Google Scholar] [CrossRef]
- Zheng, W.L.; Zhu, J.Y.; Lu, B.L. Identifying Stable Patterns over Time for Emotion Recognition from EEG. IEEE Trans. Affect. Comput. 2019, 10, 417–429. [Google Scholar] [CrossRef]
- Zhang, Y.; Chen, J.; Tan, J.H.; Chen, Y.; Chen, Y.; Li, D.; Yang, L.; Su, J.; Huang, X.; Che, W. An Investigation of Deep Learning Models for EEG-Based Emotion Recognition. Front. Neurosci. 2020, 14, 622759. [Google Scholar] [CrossRef]
- Gao, Q.; Yang, Y.; Kang, Q.; Tian, Z.; Song, Y. EEG-based Emotion Recognition with Feature Fusion Networks. Int. J. Mach. Learn. Cybern. 2022, 13, 421–429. [Google Scholar] [CrossRef]
- Zheng, Y.; Ding, J.; Liu, F.; Wang, D. Adaptive neural decision tree for EEG based emotion recognition. Inf. Sci. 2023, 643, 119160. [Google Scholar] [CrossRef]
- Cao, J.; Huppert, T.J.; Grover, P.; Kainerstorfer, J.M. Enhanced spatiotemporal resolution imaging of neuronal activity using joint electroencephalography and diffuse optical tomography. Neurophotonics 2021, 8, 015002. [Google Scholar] [CrossRef]
- Abtahi, M.; Borgheai, S.B.; Jafari, R.; Constant, N.; Diouf, R.; Shahriari, Y.; Mankodiya, K. Merging fNIRS-EEG Brain Monitoring and Body Motion Capture to Distinguish Parkinson's Disease. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 1246–1253. [Google Scholar] [CrossRef]
- Tan, X.; Fan, Y.; Sun, M.; Zhuang, M.; Qu, F. An emotion index estimation based on facial action unit prediction. Pattern Recognit. Lett. 2022, 164, 183–190. [Google Scholar] [CrossRef]
- Bendjoudi, I.; Vanderhaegen, F.; Hamad, D.; Dornaika, F. Multi-label, multi-task CNN approach for context-based emotion recognition. Inf. Fusion 2021, 76, 422–428. [Google Scholar] [CrossRef]
- Alruily, M. Sentiment analysis for predicting stress among workers and classification utilizing CNN: Unveiling the mechanism. Alex. Eng. J. 2023, 81, 360–370. [Google Scholar] [CrossRef]
- Jiang, Y.C.; Ma, R.; Qi, S.; Ge, S.; Sun, Z.; Li, Y.; Song, J.; Zhang, M. Characterization of Bimanual Cyclical Tasks From Single-Trial EEG-fNIRS Measurements. IEEE Trans. Neural Syst. Rehabil. Eng. 2022, 30, 146–156. [Google Scholar] [CrossRef]
- Yi, L.; Xie, G.; Li, Z.; Li, X.; Zhang, Y.; Wu, K.; Shao, G.; Lv, B.; Jing, H.; Zhang, C.; et al. Automatic depression diagnosis through hybrid EEG and near-infrared spectroscopy features using support vector machine. Front. Neurosci. 2023, 17, 1205931. [Google Scholar] [CrossRef]
- Lin, J.; Lu, J.; Shu, Z.; Han, J.; Yu, N. Subject-Specific Modeling of EEG-fNIRS Neurovascular Coupling by Task-Related Tensor Decomposition. IEEE Trans. Neural Syst. Rehabil. Eng. 2024, 32, 452–461. [Google Scholar] [CrossRef]
- Carvalho, S.; Leite, J.; Galdo-Álvarez, S.; Gonçalves, Ó.F. The Emotional Movie Database (EMDB): A Self-Report and Psychophysiological Study. Appl. Psychophysiol. Biofeedback 2012, 37, 279–294. [Google Scholar] [CrossRef]
- Zheng, Q.; Chi, A.; Shi, B.; Wang, Y.; Ma, Q.; Zhou, F.; Guo, X.; Zhou, M.; Lin, B.; Ning, K. Differential features of early childhood motor skill development and working memory processing: Evidence from fNIRS. Front. Behav. Neurosci. 2023, 17, 1279648. [Google Scholar] [CrossRef]
- Karmakar, S.; Kamilya, S.; Dey, P.; Guhathakurta, P.K.; Dalui, M.; Bera, T.K.; Halder, S.; Koley, C.; Pal, T.; Basu, A. Real time detection of cognitive load using fNIRS: A deep learning approach. Biomed. Signal Process. Control 2023, 80, 104227. [Google Scholar] [CrossRef]
- Jahani, S.; Setarehdan, S.K.; Boas, D.A.; Yücel, M.A. Motion artifact detection and correction in functional near-infrared spectroscopy: A new hybrid method based on spline interpolation method and Savitzky–Golay filtering. Neurophotonics 2018, 5, 015003. [Google Scholar] [CrossRef]
- Hong, K.-S.; Khan, M.J.; Hong, M.J. Feature Extraction and Classification Methods for Hybrid fNIRS-EEG Brain-Computer Interfaces. Front. Hum. Neurosci. 2018, 12, 246. [Google Scholar] [CrossRef]
- Bizzego, A.; Balagtas, J.P.M.; Esposito, G. Commentary: Current Status and Issues Regarding Pre-processing of fNIRS Neuroimaging Data: An Investigation of Diverse Signal Filtering Methods Within a General Linear Model Framework. Front. Hum. Neurosci. 2020, 14, 00247. [Google Scholar] [CrossRef]
- Firooz, S.; Setarehdan, S.K. IQ estimation by means of EEG-fNIRS recordings during a logical-mathematical intelligence test. Comput. Biol. Med. 2019, 110, 218–226. [Google Scholar] [CrossRef]
- Fogazzi, D.V.; Neary, J.P.; Sonza, A.; Reppold, C.T.; Kaiser, V.; Scassola, C.M.; Casali, K.R.; Rasia-Filho, A.A. The prefrontal cortex conscious and unconscious response to social/emotional facial expressions involve sex, hemispheric laterality, and selective activation of the central cardiac modulation. Behav. Brain Res. 2020, 393, 112773. [Google Scholar] [CrossRef]
- Stropahl, M.; Bauer, A.-K.R.; Debener, S.; Bleichner, M.G. Source-Modeling Auditory Processes of EEG Data Using EEGLAB and Brainstorm. Front. Hum. Neurosci. 2018, 12, 2018. [Google Scholar] [CrossRef]
- Zheng, J.; Li, Y.; Zhai, Y.; Zhang, N.; Yu, H.; Tang, C.; Yan, Z.; Luo, E.; Xie, K. Effects of sampling rate on multiscale entropy of electroencephalogram time series. Biocybern. Biomed. Eng. 2023, 43, 233–245. [Google Scholar] [CrossRef]
- Aghajani, H.; Garbey, M.; Omurtag, A. Measuring Mental Workload with EEG+fNIRS. Front. Hum. Neurosci. 2017, 11, 00359. [Google Scholar] [CrossRef]
- Dong, Y.; Tang, X.; Li, Q.; Wang, Y.; Jiang, N.; Tian, L.; Zheng, Y.; Li, X.; Zhao, S.; Li, G.; et al. An Approach for EEG Denoising Based on Wasserstein Generative Adversarial Network. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 3524–3534. [Google Scholar] [CrossRef]
- Li, R.; Zhao, C.; Wang, C.; Wang, J.; Zhang, Y. Enhancing fNIRS Analysis Using EEG Rhythmic Signatures: An EEG-Informed fNIRS Analysis Study. IEEE Trans. Biomed. Eng. 2020, 67, 2789–2797. [Google Scholar] [CrossRef]
- Abidi, A.; Nouira, I.; Assali, I.; Saafi, M.A.; Bedoui, M.H. Hybrid Multi-Channel EEG Filtering Method for Ocular and Muscular Artifact Removal Based on the 3D Spline Interpolation Technique. Comput. J. 2022, 65, 1257–1271. [Google Scholar] [CrossRef]
- Kang, G.; Jin, S.-H.; Keun Kim, D.; Kang, S.W. T59. EEG artifacts removal using machine learning algorithms and independent component analysis. Clin. Neurophysiol. 2018, 129, e24. [Google Scholar] [CrossRef]
- Rosenbaum, D.; Leehr, E.J.; Kroczek, A.; Rubel, J.A.; Int-Veen, I.; Deutsch, K.; Maier, M.J.; Hudak, J.; Fallgatter, A.J.; Ehlis, A.-C. Neuronal correlates of spider phobia in a combined fNIRS-EEG study. Sci. Rep. 2020, 10, 12597. [Google Scholar] [CrossRef]
- Xu, H.; Li, C.; Shi, T. Is the z-score standardized RSEI suitable for time-series ecological change detection? Comment on Zheng et al. (2022). Sci. Total Environ. 2022, 853, 158582. [Google Scholar] [CrossRef]
- Zhang, Y.; Suda, N.; Lai, L.; Chandra, V. Hello Edge: Keyword Spotting on Microcontrollers. arXiv 2017, arXiv:1711.07128. [Google Scholar] [CrossRef]
- Tang, R.; Lin, J. Deep Residual Learning for Small-Footprint Keyword Spotting. arXiv 2017, arXiv:1710.10361. [Google Scholar] [CrossRef]
- Cheng, C.; Parhi, K.K. Fast 2D Convolution Algorithms for Convolutional Neural Networks. IEEE Trans. Circuits Syst. I Regul. Pap. 2020, 67, 1678–1691. [Google Scholar] [CrossRef]
- He, F.; Liu, T.; Tao, D. Why ResNet Works? Residuals Generalize. IEEE Trans. Neural Netw. Learn. Syst. 2020, 31, 5349–5362. [Google Scholar] [CrossRef]
- Ioffe, S.; Szegedy, C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. arXiv 2015, arXiv:1502.03167. [Google Scholar] [CrossRef]
- Abadi, M.; Barham, P.; Chen, J.; Chen, Z.; Zhang, X. TensorFlow: A system for large-scale machine learning. arXiv 2016, arXiv:1605.08695. [Google Scholar] [CrossRef]
- Prechelt, L. Early Stopping—But When? In Neural Networks: Tricks of the Trade; Springer: Berlin/Heidelberg, Germany, 1998. [Google Scholar] [CrossRef]
- Zibman, S.; Daniel, E.; Alyagon, U.; Etkin, A.; Zangen, A. Interhemispheric cortico-cortical paired associative stimulation of the prefrontal cortex jointly modulates frontal asymmetry and emotional reactivity. Brain Stimul. 2019, 12, 139–147. [Google Scholar] [CrossRef]
- Segar, R.; Chhabra, H.; Sreeraj, V.S.; Parlikar, R.; Kumar, V.; Ganesan, V.; Kesavan, M. fNIRS study of prefrontal activation during emotion recognition: A potential endophenotype for bipolar I disorder? J. Affect. Disord. 2021, 282, 869–875. [Google Scholar] [CrossRef]
- Liang, Z.; Oba, S.; Ishii, S. An unsupervised EEG decoding system for human emotion recognition. Neural Netw. 2019, 116, 257–268. [Google Scholar] [CrossRef] [PubMed]
- Gao, C.; Uchitomi, H.; Miyake, Y. Influence of Multimodal Emotional Stimulations on Brain Activity: An Electroencephalographic Study. Sensors 2023, 23, 4801. [Google Scholar] [CrossRef] [PubMed]
- Xie, J.; Lan, P.; Wang, S.; Luo, Y.; Liu, G. Brain Activation Differences of Six Basic Emotions Between 2D Screen and Virtual Reality Modalities. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 700–709. [Google Scholar] [CrossRef] [PubMed]
- Baldo, D.; Viswanathan, V.S.; Timpone, R.J.; Venkatraman, V. The heart, brain, and body of marketing: Complementary roles of neurophysiological measures in tracking emotions, memory, and ad effectiveness. Psychol. Mark. 2022, 39, 1979–1991. [Google Scholar] [CrossRef]
- Vanutelli, M.E.; Grippa, E.; Balconi, M. 105. Hemodynamic (fNIRS), electrophysiological (EEG) and autonomic responses to affective pictures: A multi-method approach to the study of emotions. Clin. Neurophysiol. 2016, 127, e156. [Google Scholar] [CrossRef]
- Jin, Z.; Xing, Z.; Wang, Y.; Fang, S.; Gao, X.; Dong, X. Research on Emotion Recognition Method of Cerebral Blood Oxygen Signal Based on CNN-Transformer Network. Sensors 2023, 23, 8643. [Google Scholar] [CrossRef] [PubMed]
- Tang, T.B.; Chong, J.S.; Kiguchi, M.; Funane, T.; Lu, C.K. Detection of Emotional Sensitivity Using fNIRS Based Dynamic Functional Connectivity. IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 894–904. [Google Scholar] [CrossRef]
- Andreu-Perez, A.R.; Kiani, M.; Andreu-Perez, J.; Reddy, P.; Andreu-Abela, J.; Pinto, M.; Izzetoglu, K. Single-Trial Recognition of Video Gamer’s Expertise from Brain Haemodynamic and Facial Emotion Responses. Brain Sci. 2021, 11, 106. [Google Scholar] [CrossRef] [PubMed]
- Sánchez-Reolid, R.; Martínez-Sáez, M.C.; García-Martínez, B.; Fernández-Aguilar, L.; Ros, L.; Latorre, J.M.; Fernández-Caballero, A. Emotion Classification from EEG with a Low-Cost BCI Versus a High-End Equipment. Int. J. Neural Syst. 2022, 32, 2250041. [Google Scholar] [CrossRef] [PubMed]
- Chatterjee, S.; Byun, Y.-C. EEG-Based Emotion Classification Using Stacking Ensemble Approach. Sensors 2022, 22, 8550. [Google Scholar] [CrossRef]
- Shah, S.J.H.; Albishri, A.; Kang, S.S.; Lee, Y.; Sponheim, S.R.; Shim, M. ETSNet: A deep neural network for EEG-based temporal–spatial pattern recognition in psychiatric disorder and emotional distress classification. Comput. Biol. Med. 2023, 158, 106857. [Google Scholar] [CrossRef]
- Su, Y.; Hu, B.; Xu, L.; Cai, H.; Moore, P.; Zhang, X.; Chen, J. EmotionO+: Physiological signals knowledge representation and emotion reasoning model for mental health monitoring. In Proceedings of the 2014 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), Belfast, UK, 2–5 November 2014; pp. 529–535. [Google Scholar] [CrossRef]
- Sun, Y.; Ayaz, H.; Akansu, A.N. Neural correlates of affective context in facial expression analysis: A simultaneous EEG-fNIRS study. In Proceedings of the 2015 IEEE Global Conference on Signal and Information Processing (GlobalSIP), Orlando, FL, USA, 14–16 December 2015. [Google Scholar] [CrossRef]
- Sun, Y.; Ayaz, H.; Akansu, A.N. Multimodal Affective State Assessment Using fNIRS + EEG and Spontaneous Facial Expression. Brain Sci. 2020, 10, 85. [Google Scholar] [CrossRef]
- Wang, Y.; Yang, Z.; Ji, H.; Li, J.; Liu, L.; Zhuang, J. Cross-Modal Transfer Learning From EEG to Functional Near-Infrared Spectroscopy for Classification Task in Brain-Computer Interface System. Front. Psychol. 2022, 13, 2022. [Google Scholar] [CrossRef]
- Zhao, Q.; Zhang, X.; Chen, G.; Zhang, J. EEG and fNIRS emotion recognition based on modal attention map convolutional feature fusion. J. Zhejiang Univ. (Eng. Sci.) 2023, 57, 1987–1997. Available online: https://kns.cnki.net/kcms/detail/33.1245.T.20231017.0939.008.html (accessed on 6 March 2024).
- Maher, A.; Mian Qaisar, S.; Salankar, N.; Jiang, F.; Tadeusiewicz, R.; Pławiak, P.; Abd El-Latif, A.A.; Hammad, M. Hybrid EEG-fNIRS brain-computer interface based on the non-linear features extraction and stacking ensemble learning. Biocybern. Biomed. Eng. 2023, 43, 463–475. [Google Scholar] [CrossRef]
- Al-Shargie, F.; Kiguchi, M.; Badruddin, N.; Dass, S.C.; Hani, A.F.M.; Tang, T.B. Mental stress assessment using simultaneous measurement of EEG and fNIRS. Biomed. Opt. Express 2016, 7, 3882–3898. [Google Scholar] [CrossRef] [PubMed]
- Güven, A.; Altınkaynak, M.; Dolu, N.; İzzetoğlu, M.; Pektaş, F.; Özmen, S.; Demirci, E.; Batbat, T. Combining functional near-infrared spectroscopy and EEG measurements for the diagnosis of attention-deficit hyperactivity disorder. Neural Comput. Appl. 2020, 32, 8367–8380. [Google Scholar] [CrossRef]
- Kassab, A.; Hinnoutondji Toffa, D.; Robert, M.; Lesage, F.; Peng, K.; Khoa Nguyen, D. Hemodynamic changes associated with common EEG patterns in critically ill patients: Pilot results from continuous EEG-fNIRS study. NeuroImage Clin. 2021, 32, 102880. [Google Scholar] [CrossRef]
- Xu, T.; Zhou, Z.; Yang, Y.; Li, Y.; Li, J.; Bezerianos, A.; Wang, H. Motor Imagery Decoding Enhancement Based on Hybrid EEG-fNIRS Signals. IEEE Access 2023, 11, 65277–65288. [Google Scholar] [CrossRef]
Technical Indicators | Parameters |
---|---|
Measurement items | fNIRS: HbO, HbR, Hb. EEG: brain electrical activity. |
Channels | fNIRS: 24 channels. EEG: 20-lead. |
Sampling frequency | fNIRS: ≤150 Hz. EEG: ≥500 Hz. |
Weight of the main unit | fNIRS: ≤300 g. EEG: ≤65 g. |
Size of the main unit | fNIRS: ≤8.5 × 8.5 × 3.5 cm. EEG: 6 × 8.5 × 2 cm. |
Light source/electrode type | fNIRS: LED. EEG: Ag/AgCl wet electrodes. |
Data transmission | Bluetooth, real-time transmission range 20 m. |
Acquisition and expansion | D-LAB plug-in for synchronized acquisition; supports extension with EEG, fMRI, tDCS, and other devices. |
Sensor technology | fNIRS: built-in 9-axis motion sensor. EEG: built-in 3-axis motion sensor. |
fNIRS/EEG battery | 1. Power adapter: 100–240 V, 50/60 Hz input; 5 V output. 2. Battery type: built-in lithium battery; external batteries and power banks can extend battery life. 3. Battery dimensions: 3 × 2.5 × 0.6 cm. 4. Battery capacity: 1400 mAh. 5. Battery output voltage: 3.7 V. 6. Battery efficiency: 92%. 7. Battery life: ≥3 h. |
fNIRS parameters | 1. Spectral type: continuous wave; wavelengths: 760 nm, 850 nm. 2. Sources/detectors: 10 sources, 8 detectors (weight ≤ 12 g). 3. Detector type: SiPDs; sensitivity < 1 pW; dynamic range ≥ 90 dB. 4. Functions: one-stop data preprocessing, event and data editing, artifact correction, probe position editing, dynamic display of oximetry status, GLM, fast real-time 2D topographic mapping, display of HbO, HbR, and Hb status, and signal-quality detection in 2D, scalp, cerebral cortex, and glass views. |
EEG parameters | 1. Bandwidth: 0–250 Hz; synchronization accuracy ≤ 1 ms. 2. Noise level: 1 μV RMS. 3. Common-mode rejection ratio: ≥120 dB. 4. Functions: 3D current density maps, 3D FFT mapping and spectrum analysis, inter-/intra-group comparison, real-time display of each electrode's signal quality, real-time communication and remote-control ports for MATLAB and other tools, automatic filtering and classification of EEG bands, online impedance detection, filter settings, and data analysis. |
Evoked Emotion | Mean | Standard Deviation |
---|---|---|
Calm | 9.351 | 0.20149 |
Sad | 9.474 | 0.21360 |
Happy | 9.100 | 0.16456 |
Fear | 9.540 | 0.27401 |
Labels | Film Clip Sources | Clips | Chinese Audience Web Rating |
---|---|---|---|
Calm | Tip of the Tongue China | 2 | 9.0 |
Sad | Tangshan Earthquake | 5 | 9.9 |
Happy | Lost in Thailand/Kung Fu Panda | 6/4 | 9.7/9.8 |
Fear | A Wicked Ghost/Soul Ferry/Double Pupils | 5/3/2 | 9.3/8.9/8.8
Method | Weights | MACs | Output |
---|---|---|---|
2D convolution | | | |
This method (temporal convolution) | | | |
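To make the comparison in this table concrete: a 2D convolution slides a small kernel over the whole features × time map, whereas the temporal reformulation used here treats the feature dimension as input channels and convolves along time only, trading a few more weights for far fewer multiply-accumulate operations (MACs). A hedged sketch with illustrative sizes (kernel widths and channel counts are assumptions, not the authors' values):

```python
# Weight/MAC counts for 2D vs. temporal convolution (illustrative sizes).
def conv2d_cost(f, t, k, c_out):
    weights = k * k * 1 * c_out      # k x k kernel over a 1-channel f x t map
    macs = weights * f * t           # applied at every (feature, time) position
    return weights, macs

def temporal_conv_cost(f, t, k, c_out):
    weights = k * f * c_out          # k-tap kernel, f input channels
    macs = weights * t               # applied along the time axis only
    return weights, macs

f, t = 44, 250                       # feature channels x time steps per window
for name, cost in [("2D convolution", conv2d_cost),
                   ("Temporal convolution", temporal_conv_cost)]:
    w, m = cost(f, t, k=3, c_out=16)
    print(f"{name}: weights = {w}, MACs = {m}")
# -> 2D: 144 weights, 1584000 MACs; temporal: 2112 weights, 528000 MACs
```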
Parameter | Value | Description |
---|---|---|
Window size (n_mels) | 250 | 250 samples × 0.004 s = 1 s window |
Time step (step_len) | 25 | One window captured every 0.1 s (25 × 0.004 s) |
fNIRS_seq_len | Number of fNIRS features | - |
EEG_seq_len | Number of EEG features | - |
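The parameters above imply a standard sliding-window segmentation: at 0.004 s per sample, a 250-sample window spans 1 s and a 25-sample step yields a new window every 0.1 s. A minimal sketch (the 44-channel array shape is an illustrative assumption):

```python
import numpy as np

def segment(signal, window=250, step=25):
    """Slice (channels, time) data into overlapping (1 s, 0.1 s hop) windows."""
    n = (signal.shape[-1] - window) // step + 1
    return np.stack([signal[..., i * step : i * step + window] for i in range(n)])

fused = np.random.randn(44, 2500)   # e.g., 24 fNIRS + 20 EEG channels, 10 s
windows = segment(fused)
print(windows.shape)                # (91, 44, 250): 91 one-second windows
```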
 | Delta | Theta | Alpha | Beta | Gamma |
---|---|---|---|---|---|
Mean | r = −0.372, p = 0.628 | r = −0.515, p = 0.485 | r = −0.266, p = 0.234 | r = 0.682, p = 0.318 | r = 0.988, p = 0.012 |
Variance | r = −0.202, p = 0.798 | r = −0.328, p = 0.672 | r = −0.441, p = 0.020 | r = −0.854, p = 0.036 | r = −0.905, p = 0.095 |
Slope | r = −0.725, p = 0.028 | r = −0.705, p = 0.030 | r = −0.813, p = 0.187 | r = −0.572, p = 0.428 | r = −0.108, p = 0.892 |
Peak | r = −0.080, p = 0.920 | r = −0.213, p = 0.787 | r = −0.081, p = 0.919 | r = −0.914, p = 0.086 | r = −0.844, p = 0.116 |
Kurtosis | r = 0.901, p = 0.099 | r = 0.833, p = 0.167 | r = 0.959, p = 0.041 | r = 0.692, p = 0.308 | r = 0.007, p = 0.993 |
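Each cell above pairs a Pearson correlation coefficient r with its p-value between an fNIRS-derived statistic (mean, variance, slope, peak, kurtosis) and EEG band power. A minimal sketch of how one such pair is computed, on illustrative toy data:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
fnirs_stat = rng.normal(size=8)                           # e.g., mean HbO per condition
band_power = fnirs_stat + rng.normal(scale=0.2, size=8)   # e.g., gamma-band power

r, p = pearsonr(fnirs_stat, band_power)                   # one (r, p) cell of the table
print(f"r = {r:.3f}, p = {p:.3f}")
```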
Modality | Accuracy (%) |
---|---|
fNIRS | 86.70 [55]; 89.49 [56]; 91.44 [57] |
EEG | 98.78 [58]; 99.55 [59]; 99.57 [60] |
fNIRS + EEG (this work) | 99.81 |
Data Type | Models | Accuracy (%) | Time (ms) |
---|---|---|---|
fNIRS-EEG | RF [61] | 99.11 | - |
fNIRS-EEG | RHMM-SVM [62] | 75.00 | - |
fNIRS-EEG | SVM [63] | 80.00 | - |
fNIRS-EEG | CNN [55] | 82.00 | - |
fNIRS-EEG | CNN-Transformer [55] | 86.70 | - |
fNIRS-EEG | R-CSP-E [64] | 66.83 | - |
fNIRS-EEG | backpropagation ANN [61] | 96.92 | - |
fNIRS-EEG | MA-MP-GF [65] | 95.71 | - |
fNIRS-EEG | Stacking ensemble learning [66] | 95.83 | - |
fNIRS-EEG | This work | 99.81 | 1.1 |
Models | Accuracy (%) | Time (ms) | FLOPs (M) |
---|---|---|---|
TC-ResNet8-1.5 | 99.82 | 2.8 | 6.6 |
TC-ResNet14 | 99.83 | 2.5 | 6.1 |
TC-ResNet14-1.5 | 99.85 | 5.7 | 13.4 |
2D-TC-ResNet8 | 99.81 | 10.1 | 35.8 |
TC-ResNet8-pooling | 98.63 | 3.5 | 4.0 |
This work | 99.81 | 1.1 | 3.0 |
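The latency column can be reproduced by timing single-window forward passes; FLOPs can be obtained with a profiler. A minimal timing sketch, assuming the tc_resnet8 model from the earlier snippet (absolute numbers depend on hardware and runtime):

```python
import time
import numpy as np
import tensorflow as tf

def mean_latency_ms(model, seq_len=250, n_features=44, runs=100):
    """Average wall-clock time of one forward pass, in milliseconds."""
    x = tf.constant(np.random.randn(1, seq_len, n_features), dtype=tf.float32)
    _ = model(x, training=False)                  # warm-up (graph/weight init)
    t0 = time.perf_counter()
    for _ in range(runs):
        _ = model(x, training=False)
    return (time.perf_counter() - t0) / runs * 1e3

# e.g., with the model sketched in Section 2.4.3:
# print(f"{mean_latency_ms(model):.1f} ms per 1 s window")
```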
Parameters | [67] | [68] | [69] | [22] | [70] | This Work |
---|---|---|---|---|---|---|
Wavelength (nm) | 695, 830 | 730, 850 | 760, 850 | 750, 850 | 762, 845.5 | 760, 850 |
fNIRS/EEG Channels | 23/16 | 16/4 | 128/19 | 8/32 | 20/64 | 24/20 |
fNIRS/EEG sampling rate (Hz) | 10/256 | 2/2500 | 20/500 | 10/500 | 5/1000 | ≤150/≥500 |
Optode spacing (mm) (see Supplementary Materials S2 fNIRS optodes) | 30 | 25 | 25–50 | 35 | - | 10–55 |
Light source type | - | LED | - | - | LED | LED |
Detector type | APD | Photodiode | - | - | APD | SiPDs |
Source-probe quantity | 16, 8 | 4, 10 | 32, 32 | 8, 2 | - | 10, 8 |
Data transmission | Wired | Wired | Wired | Wired | Wired | Bluetooth wireless |
High-density measurement | Not supported | Not supported | Supported | Not supported | Supported | Supported |
Operational complexity | Moderate | Moderate | Highly complex | Simple | Complex | Simple |
Instrument power supply method | Direct AC power supply | Direct plug-in or battery-powered with rechargeable batteries | Direct AC power supply | Direct AC power supply | Direct AC power supply | Direct plug-in or battery-powered with rechargeable batteries |
Portable and compact design | Wearable | Wearable | Wearable | Wearable | Wearable | Small, portable, and wearable |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Chen, J.; Yu, K.; Wang, F.; Zhou, Z.; Bi, Y.; Zhuang, S.; Zhang, D. Temporal Convolutional Network-Enhanced Real-Time Implicit Emotion Recognition with an Innovative Wearable fNIRS-EEG Dual-Modal System. Electronics 2024, 13, 1310. https://doi.org/10.3390/electronics13071310