Decoding Subject-Driven Cognitive States from EEG Signals for Cognitive Brain–Computer Interface
Abstract
1. Introduction
2. Methods
2.1. General Procedure
2.2. Experimental Paradigm Design
2.3. Data Preprocessing
2.4. Network Architecture
2.4.1. Channel and Frequency Attention (CFA) Module
2.4.2. Training Process and Strategies
2.4.3. Evaluation Metrics
3. Results
3.1. Model Performance Analysis
3.1.1. Ablation Analysis
3.1.2. Qualitative Analysis (Visual Analysis)
3.2. Comparative Analysis
3.3. Time Length Impact
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Wolpaw, J.R.; Birbaumer, N.; McFarland, D.J.; Pfurtscheller, G.; Vaughan, T.M. Brain–computer interfaces for communication and control. Clin. Neurophysiol. 2002, 113, 767–791. [Google Scholar] [CrossRef] [PubMed]
- Gao, S.; Wang, Y.; Gao, X.; Hong, B. Visual and Auditory Brain–Computer Interfaces. IEEE Trans. Biomed. Eng. 2014, 61, 1436–1447. [Google Scholar] [CrossRef] [PubMed]
- Kawala-Sterniuk, A.; Browarska, N.; Al-Bakri, A.; Pelc, M.; Zygarlicki, J.; Sidikova, M.; Martinek, R.; Gorzelanczyk, E.J. Summary of over Fifty Years with Brain-Computer Interfaces—A Review. Brain Sci. 2021, 11, 43. [Google Scholar] [CrossRef] [PubMed]
- Hochberg, L.R.; Bacher, D.; Jarosiewicz, B.; Masse, N.Y.; Simeral, J.D.; Vogel, J.; Haddadin, S.; Liu, J.; Cash, S.S.; van der Smagt, P.; et al. Reach and grasp by people with tetraplegia using a neurally controlled robotic arm. Nature 2012, 485, 372–375. [Google Scholar] [CrossRef] [PubMed]
- Edelman, B.J.; Meng, J.; Suma, D.; Zurn, C.; Nagarajan, E.; Baxter, B.S.; Cline, C.C.; He, B. Noninvasive neuroimaging enhances continuous neural tracking for robotic device control. Sci. Robot. 2019, 4, eaaw6844. [Google Scholar] [CrossRef]
- Flesher, S.N.; Collinger, J.L.; Foldes, S.T.; Weiss, J.M.; Downey, J.E.; Tyler-Kabara, E.C.; Bensmaia, S.J.; Schwartz, A.B.; Boninger, M.L.; Gaunt, R.A. Intracortical microstimulation of human somatosensory cortex. Sci. Transl. Med. 2016, 8, 361ra141. [Google Scholar] [CrossRef]
- Arpaia, P.; Esposito, A.; Natalizio, A.; Parvis, M. How to successfully classify EEG in motor imagery BCI: A metrological analysis of the state of the art. J. Neural Eng. 2022, 19, 031002. [Google Scholar] [CrossRef]
- Velliste, M.; Perel, S.; Spalding, M.C.; Whitford, A.S.; Schwartz, A.B. Cortical control of a prosthetic arm for self-feeding. Nature 2008, 453, 1098–1101. [Google Scholar] [CrossRef] [PubMed]
- Makeig, S.; Delorme, A.; Westerfield, M.; Jung, T.-P.; Townsend, J.; Courchesne, E.; Sejnowski, T.J. Electroencephalographic Brain Dynamics Following Manually Responded Visual Targets. PLOS Biol. 2004, 2, e176. [Google Scholar] [CrossRef] [PubMed]
- Shih, J.J.; Krusienski, D.J.; Wolpaw, J.R. Brain-Computer Interfaces in Medicine. Mayo Clin. Proc. 2012, 87, 268–279. [Google Scholar] [CrossRef]
- Ju, X.; Li, M.; Tian, W.; Hu, D. EEG-based emotion recognition using a temporal-difference minimizing neural network. Cogn. Neurodyn. 2024, 18, 405–416. [Google Scholar] [CrossRef] [PubMed]
- Wu, D.; Jiang, X.; Peng, R. Transfer learning for motor imagery based brain–computer interfaces: A tutorial. Neural Netw. 2022, 153, 235–253. [Google Scholar] [CrossRef] [PubMed]
- Veeranki, Y.R.; McNaboe, R.; Posada-Quintero, H.F. EEG-Based Seizure Detection Using Variable-Frequency Complex Demodulation and Convolutional Neural Networks. Signals 2023, 4, 816–835. [Google Scholar] [CrossRef]
- Galán, F.; Nuttin, M.; Lew, E.; Ferrez, P.W.; Vanacker, G.; Philips, J.; Millán, J.d.R. A brain-actuated wheelchair: Asynchronous and non-invasive brain–computer interfaces for continuous control of robots. Clin. Neurophysiol. 2008, 119, 2159–2169. [Google Scholar] [CrossRef] [PubMed]
- Doud, A.J.; Lucas, J.P.; Pisansky, M.T.; He, B. Continuous Three-Dimensional Control of a Virtual Helicopter Using a Motor Imagery Based Brain-Computer Interface. PLoS ONE 2011, 6, e26322. [Google Scholar] [CrossRef] [PubMed]
- Royer, A.S.; Doud, A.J.; Rose, M.L.; He, B. EEG Control of a Virtual Helicopter in 3-Dimensional Space Using Intelligent Control Strategies. IEEE Trans. Neural Syst. Rehabil. Eng. 2010, 18, 581–589. [Google Scholar] [CrossRef] [PubMed]
- LaFleur, K.; Cassady, K.; Doud, A.; Shades, K.; Rogin, E.; He, B. Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain–computer interface. J. Neural Eng. 2013, 10, 046003. [Google Scholar] [CrossRef]
- Meng, J.; Zhang, S.; Bekyo, A.; Olsoe, J.; Baxter, B.; He, B. Noninvasive Electroencephalogram Based Control of a Robotic Arm for Reach and Grasp Tasks. Sci. Rep. 2016, 6, 38565. [Google Scholar] [CrossRef] [PubMed]
- Chengaiyan, S.; Retnapandian, A.S.; Anandan, K. Identification of vowels in consonant–vowel–consonant words from speech imagery based EEG signals. Cogn. Neurodyn. 2019, 14, 1–19. [Google Scholar] [CrossRef]
- Philip, J.T.; George, S.T. Visual P300 Mind-Speller Brain-Computer Interfaces: A Walk Through the Recent Developments With Special Focus on Classification Algorithms. Clin. EEG Neurosci. 2019, 51, 19–33. [Google Scholar] [CrossRef]
- Aghili, S.N.; Kilani, S.; Khushaba, R.N.; Rouhani, E. A spatial-temporal linear feature learning algorithm for P300-based brain-computer interfaces. Heliyon 2023, 9, e15380. [Google Scholar] [CrossRef]
- Wang, Z.; Chen, C.; Li, J.; Wan, F.; Sun, Y.; Wang, H. ST-CapsNet: Linking Spatial and Temporal Attention With Capsule Network for P300 Detection Improvement. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 991–1000. [Google Scholar] [CrossRef] [PubMed]
- Du, P.; Li, P.; Cheng, L.; Li, X.; Su, J. Single-trial P300 classification algorithm based on centralized multi-person data fusion CNN. Front. Neurosci. 2023, 17, 1132290. [Google Scholar] [CrossRef]
- Apicella, A.; Arpaia, P.; De Benedetto, E.; Donato, N.; Duraccio, L.; Giugliano, S.; Prevete, R. Enhancement of SSVEPs Classification in BCI-Based Wearable Instrumentation Through Machine Learning Techniques. IEEE Sens. J. 2022, 22, 9087–9094. [Google Scholar] [CrossRef]
- Na, R.; Zheng, D.; Sun, Y.; Han, M.; Wang, S.; Zhang, S.; Hui, Q.; Chen, X.; Zhang, J.; Hu, C. A Wearable Low-Power Collaborative Sensing System for High-Quality SSVEP-BCI Signal Acquisition. IEEE Internet Things J. 2022, 9, 7273–7285. [Google Scholar] [CrossRef]
- Ming, G.; Pei, W.; Gao, X.; Wang, Y. A high-performance SSVEP-based BCI using imperceptible flickers. J. Neural Eng. 2023, 20, 016042. [Google Scholar] [CrossRef] [PubMed]
- Zhao, S.; Wang, R.; Bao, R.; Yang, L. Spatially-coded SSVEP BCI without pre-training based on FBCCA. Biomed. Signal Process. Control 2023, 84, 104717. [Google Scholar] [CrossRef]
- Xiong, H.; Song, J.; Liu, J.; Han, Y. Deep transfer learning-based SSVEP frequency domain decoding method. Biomed. Signal Process. Control 2024, 89, 105931. [Google Scholar] [CrossRef]
- Rivera-Flor, H.; Guerrero-Mendez, C.D.; Hernandez-Ossa, K.A.; Delisle-Rodriguez, D.; Mello, R.; Bastos-Filho, T.F. Compressive sensing applied to SSVEP-based brain–computer interface in the cloud for online control of a virtual wheelchair. Biomed. Signal Process. Control 2024, 89, 105698. [Google Scholar] [CrossRef]
- Wolpaw, J.R.; McFarland, D.J.; Neat, G.W.; Forneris, C.A. An EEG-based brain-computer interface for cursor control. Electroencephalogr. Clin. Neurophysiol. 1991, 78, 252–259. [Google Scholar] [CrossRef]
- Wolpaw, J.R.; McFarland, D.J. Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans. Proc. Natl. Acad. Sci. USA 2004, 101, 17849–17854. [Google Scholar] [CrossRef] [PubMed]
- Onose, G.; Grozea, C.; Anghelescu, A.; Daia, C.; Sinescu, C.J.; Ciurea, A.V.; Spircu, T.; Mirea, A.; Andone, I.; Spânu, A.; et al. On the feasibility of using motor imagery EEG-based brain–computer interface in chronic tetraplegics for assistive robotic arm control: A clinical test and long-term post-trial follow-up. Spinal Cord 2012, 50, 599–608. [Google Scholar] [CrossRef] [PubMed]
- Choy, C.S.; Cloherty, S.L.; Pirogova, E.; Fang, Q. Virtual Reality Assisted Motor Imagery for Early Post-Stroke Recovery: A Review. IEEE Rev. Biomed. Eng. 2023, 16, 487–498. [Google Scholar] [CrossRef] [PubMed]
- Yuan, H.; He, B. Brain–Computer Interfaces Using Sensorimotor Rhythms: Current State and Future Perspectives. IEEE Trans. Biomed. Eng. 2014, 61, 1425–1435. [Google Scholar] [CrossRef] [PubMed]
- Blankertz, B.; Sannelli, C.; Halder, S.; Hammer, E.M.; Kübler, A.; Müller, K.-R.; Curio, G.; Dickhaus, T. Neurophysiological predictor of SMR-based BCI performance. NeuroImage 2010, 51, 1303–1309. [Google Scholar] [CrossRef] [PubMed]
- Stieger, J.R.; Engel, S.; Jiang, H.; Cline, C.C.; Kreitzer, M.J.; He, B. Mindfulness Improves Brain Computer Interface Performance by Increasing Control over Neural Activity in the Alpha Band. bioRxiv 2020. [Google Scholar] [CrossRef]
- Allison, B.Z.; Neuper, C. Could Anyone Use a BCI? Springer: London, UK, 2010; pp. 35–54. [Google Scholar] [CrossRef]
- Koizumi, K.; Ueda, K.; Nakao, M. Development of a Cognitive Brain-Machine Interface Based on a Visual Imagery Method. In Proceedings of the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018. [Google Scholar] [CrossRef]
- Yousefi, R.; Sereshkeh, A.R.; Chau, T. Development of a robust asynchronous brain-switch using ErrP-based error correction. J. Neural Eng. 2019, 16, 066042. [Google Scholar] [CrossRef] [PubMed]
- Shirer, W.R.; Ryali, S.; Rykhlevskaia, E.; Menon, V.; Greicius, M.D. Decoding Subject-Driven Cognitive States with Whole-Brain Connectivity Patterns. Cereb. Cortex 2011, 22, 158–165. [Google Scholar] [CrossRef] [PubMed]
- Delorme, A.; Makeig, S. EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J. Neurosci. Methods 2004, 134, 9–21. [Google Scholar] [CrossRef]
- Van der Maaten, L.; Hinton, G. Visualizing Data using t-SNE. J. Mach. Learn. Res. 2008, 9, 2579–2605. [Google Scholar]
- Lawhern, V.J.; Solon, A.J.; Waytowich, N.R.; Gordon, S.M.; Hung, C.P.; Lance, B.J. EEGNet: A Compact Convolutional Neural Network for EEG-Based Brain–Computer Interfaces. J. Neural Eng. 2018, 15, 056013. [Google Scholar] [CrossRef] [PubMed]
- Chen, X.; Teng, X.; Chen, H.S.; Pan, Y.; Geyer, P. Toward reliable signals decoding for electroencephalogram: A benchmark study to EEGNeX. Biomed. Signal Process. Control 2024, 87, 105475. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016. [Google Scholar] [CrossRef]
- Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. In Proceedings of the 3rd International Conference on Learning Representations (ICLR 2015), San Diego, CA, USA, 7–9 May 2015. [Google Scholar]
- Lecun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324. [Google Scholar] [CrossRef]
- Veeranki, Y.R.; Diaz, L.R.M.; Swaminathan, R.; Posada-Quintero, H.F. Nonlinear Signal Processing Methods for Automatic Emotion Recognition Using Electrodermal Activity. IEEE Sens. J. 2024, 24, 8079–8093. [Google Scholar] [CrossRef]
- Yin, J.; Liu, A.; Li, C.; Qian, R.; Chen, X. Frequency Information Enhanced Deep EEG Denoising Network for Ocular Artifact Removal. IEEE Sens. J. 2022, 22, 21855–21865. [Google Scholar] [CrossRef]
- Gabardi, M.; Saibene, A.; Gasparini, F.; Rizzo, D.; Stella, F.A. A multi-artifact EEG denoising by frequency-based deep learning. arXiv 2023. [Google Scholar] [CrossRef]
- Dong, Y.; Tang, X.; Li, Q.; Wang, Y.; Jiang, N.; Tian, L.; Zheng, Y.; Li, X.; Zhao, S.; Li, G.; et al. An Approach for EEG Denoising Based on Wasserstein Generative Adversarial Network. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 3524–3534. [Google Scholar] [CrossRef]
- Sun, W.; Su, Y.; Wu, X.; Wu, X.; Zhang, Y. EEG denoising through a wide and deep echo state network optimized by UPSO algorithm. Appl. Soft Comput. 2021, 105, 107149. [Google Scholar] [CrossRef]
- Xiong, W.; Ma, L.; Li, H. A general dual-pathway network for EEG denoising. Front. Neurosci. 2024, 17, 1258024. [Google Scholar] [CrossRef]
- Vansteensel, M.J.; Hermes, D.; Aarnoutse, E.J.; Bleichner, M.G.; Schalk, G.; van Rijen, P.C.; Leijten, F.S.S.; Ramsey, N.F. Brain-computer interfacing based on cognitive control. Ann. Neurol. 2010, 67, 809–816. [Google Scholar] [CrossRef]
- Ryun, S.; Kim, J.S.; Lee, S.H.; Jeong, S.; Kim, S.-P.; Chung, C.K. Movement Type Prediction before Its Onset Using Signals from Prefrontal Area: An Electrocorticography Study. Biomed Res. Int. 2014, 2014, 783203. [Google Scholar] [CrossRef] [PubMed]
- Chavarriaga, R.; Sobolewski, A.; Millán, J.d.R. Errare machinale est: The use of error-related potentials in brain-machine interfaces. Front. Neurosci. 2014, 8, 208. [Google Scholar] [CrossRef] [PubMed]
- Wang, Y.; Makeig, S. Predicting Intended Movement Direction Using EEG from Human Posterior Parietal Cortex; Springer: Berlin/Heidelberg, Germany, 2009; pp. 437–446. [Google Scholar] [CrossRef]
- Sanno, S.; Misawa, T.; Hirobayashi, S. Brain-computer interface for cursor control using brain activity in the prefrontal cortex. In Proceedings of the Asia Pacific Industrial Engineering & Management Systems Conference, Phuket, Thailand, 2–5 December 2012; pp. 1440–1446. [Google Scholar]
Input | Layer | Output | Feature Maps | Kernel | Stride |
---|---|---|---|---|---|
N1(32, 177, 60, 100) | conv1 | N2(32, 177, 60, 100) | 177 | 3 × 3 | 1 |
N2(32, 177, 60, 100) | CFA module | N2(32, 177, 60, 100) | 177 | - | - |
N2(32, 177, 60, 100) | pool1 | N3(32, 177, 12, 20) | 177 | 5 × 5 | 5 |
N3(32, 177, 12, 20) | conv2 | N4(32, 128, 12, 20) | 128 | 5 × 5 | 1 |
N4(32, 128, 12, 20) | pool2 | N5(32, 128, 6, 10) | 128 | 2 × 2 | 2 |
N5(32, 128, 6, 10) | conv3 | N6(32, 128, 6, 10) | 128 | 5 × 5 | 1 |
N6(32, 128, 6, 10) | conv4 | N7(32, 64, 6, 10) | 64 | 5 × 5 | 1 |
N7(32, 64, 6, 10) | dropout | N7(32, 64, 6, 10) | 64 | - | - |
N7(32, 64, 6, 10) | pool3 | N8(32, 64, 3, 5) | 64 | 2 × 2 | 2 |
N8(32, 64, 3, 5) | flatten | N9(32, 960) | 960 | - | - |
N9(32, 960) | fc | N10(32, 4) | 4 | - | - |
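For readers who want to reproduce the layer stack, below is a minimal PyTorch sketch of the backbone implied by the table. The feature-map counts, kernels, strides, and output shapes come directly from the table; the padding values, ReLU activations, max pooling, and dropout rate are assumptions the table does not specify. The CFA module (sketched after the next table) is passed in as a plug-in, with `nn.Identity` as a placeholder so this block stays runnable on its own.

```python
import torch
import torch.nn as nn
from typing import Optional

class TFCNNCFA(nn.Module):
    """Sketch of the backbone implied by the layer table.

    Input (32, 177, 60, 100): batch of 177 time-frequency maps per sample.
    Padding, ReLU, max pooling, and the dropout rate are assumptions.
    """
    def __init__(self, n_classes: int = 4, cfa_module: Optional[nn.Module] = None):
        super().__init__()
        # The CFA module is sketched after the next table; Identity keeps
        # this block self-contained and runnable.
        self.cfa = cfa_module if cfa_module is not None else nn.Identity()
        self.conv1 = nn.Conv2d(177, 177, 3, padding=1)  # keeps 60 x 100
        self.pool1 = nn.MaxPool2d(5, stride=5)          # 60 x 100 -> 12 x 20
        self.conv2 = nn.Conv2d(177, 128, 5, padding=2)  # keeps 12 x 20
        self.pool2 = nn.MaxPool2d(2, stride=2)          # 12 x 20 -> 6 x 10
        self.conv3 = nn.Conv2d(128, 128, 5, padding=2)  # keeps 6 x 10
        self.conv4 = nn.Conv2d(128, 64, 5, padding=2)   # keeps 6 x 10
        self.dropout = nn.Dropout(0.5)                  # rate assumed
        self.pool3 = nn.MaxPool2d(2, stride=2)          # 6 x 10 -> 3 x 5
        self.fc = nn.Linear(64 * 3 * 5, n_classes)      # 960 -> 4
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.pool1(self.cfa(self.act(self.conv1(x))))
        x = self.pool2(self.act(self.conv2(x)))
        x = self.act(self.conv4(self.act(self.conv3(x))))
        x = self.pool3(self.dropout(x))
        return self.fc(torch.flatten(x, 1))             # (batch, 4) logits

# Shape check: TFCNNCFA()(torch.randn(32, 177, 60, 100)).shape == (32, 4)
```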
Input | Layer | Output | Feature Maps |
---|---|---|---|
(32, 177, 60, 100) | AdaptiveMaxPool2d | (32, 177, 1, 1) | 177 |
(32, 177, 60, 100) | AdaptiveAvgPool2d | (32, 177, 1, 1) | 177 |
(32, 177, 1, 1) | Linear(fc1) | (32, 59) | 59 |
(32, 59) | ReLU | (32, 59) | 59 |
(32, 59) | Linear(fc2) | (32, 177) | 177 |
(32, 177) | ReLU | (32, 177) | 177 |
(32, 177) | Sigmoid | (32, 177, 1, 1) | 177 |
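A minimal sketch of the channel-attention path this table describes, assuming a CBAM-style design: the max- and avg-pooled descriptors share the same two-layer MLP (177 → 59 → 177, i.e., reduction ratio 3), and the sigmoid output rescales the input maps. The class name, the summation of the two pooled branches, and the final elementwise re-weighting are assumptions; the table fixes only the layers and shapes.

```python
import torch
import torch.nn as nn

class ChannelFrequencyAttention(nn.Module):
    """Channel attention matching the CFA table (177 -> 59 -> 177).

    Summing the two pooled branches before the sigmoid is an assumption;
    the table does not state how they are merged.
    """
    def __init__(self, channels: int = 177, reduction: int = 3):
        super().__init__()
        self.max_pool = nn.AdaptiveMaxPool2d(1)
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.fc1 = nn.Linear(channels, channels // reduction)  # 177 -> 59
        self.fc2 = nn.Linear(channels // reduction, channels)  # 59 -> 177
        self.relu = nn.ReLU()
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Shared MLP applied to both global descriptors (ReLU after fc2 per the table).
        w_max = self.relu(self.fc2(self.relu(self.fc1(self.max_pool(x).view(b, c)))))
        w_avg = self.relu(self.fc2(self.relu(self.fc1(self.avg_pool(x).view(b, c)))))
        w = self.sigmoid(w_max + w_avg).view(b, c, 1, 1)  # (B, 177, 1, 1)
        return x * w  # re-weight each time-frequency map
```

Plugged into the backbone sketch above as `TFCNNCFA(cfa_module=ChannelFrequencyAttention(177))`.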
Subject | Accuracy | Precision | Recall | F-Score | Kappa |
---|---|---|---|---|---|
S1 | 0.8661 | 0.8724 | 0.8661 | 0.8663 | 0.8214 |
S2 | 0.7304 | 0.7488 | 0.7304 | 0.7263 | 0.6406 |
S3 | 0.6670 | 0.7309 | 0.6670 | 0.6555 | 0.5569 |
S4 | 0.7116 | 0.7505 | 0.7116 | 0.7051 | 0.6151 |
S5 | 0.6920 | 0.7280 | 0.6920 | 0.6854 | 0.5890 |
S6 | 0.7536 | 0.7645 | 0.7536 | 0.7524 | 0.6715 |
S7 | 0.9089 | 0.9151 | 0.9089 | 0.9091 | 0.8784 |
Mean | 0.7614 | 0.7872 | 0.7614 | 0.7572 | 0.6818 |
Std. | 0.0913 | 0.0749 | 0.0913 | 0.0950 | 0.1215 |
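The per-subject numbers above can be reproduced from predicted and true labels with scikit-learn. Note that Recall equals Accuracy for every subject, which is exactly what support-weighted recall yields, so weighted averaging is the assumed (not stated) setting; the function name is ours.

```python
import numpy as np
from sklearn.metrics import (accuracy_score, cohen_kappa_score, f1_score,
                             precision_score, recall_score)

def subject_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Per-subject metrics as in the table; weighted averaging is assumed."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred, average="weighted"),
        "recall": recall_score(y_true, y_pred, average="weighted"),  # == accuracy
        "f_score": f1_score(y_true, y_pred, average="weighted"),
        "kappa": cohen_kappa_score(y_true, y_pred),
    }
```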
Model | Accuracy | Precision | Recall | F-Score | Kappa | AUC Value |
---|---|---|---|---|---|---|
EEGNet_4_2 | 0.27 ± 0.02 | 0.27 ± 0.02 | 0.27 ± 0.02 | 0.26 ± 0.02 | 0.02 ± 0.03 | 0.51 ± 0.01 |
EEGNet_8_2 | 0.26 ± 0.03 | 0.26 ± 0.03 | 0.26 ± 0.03 | 0.26 ± 0.02 | 0.02 ± 0.03 | 0.51 ± 0.02 |
EEGNeX | 0.36 ± 0.08 | 0.35 ± 0.08 | 0.36 ± 0.08 | 0.35 ± 0.08 | 0.14 ± 0.10 | 0.57 ± 0.05 |
TF-ResNet18 | 0.22 ± 0.01 | 0.06 ± 0.01 | 0.22 ± 0.01 | 0.09 ± 0.01 | 0.00 ± 0.01 | 0.50 ± 0.00 |
TF-VGG16 | 0.25 ± 0.00 | 0.06 ± 0.00 | 0.25 ± 0.00 | 0.10 ± 0.00 | 0.00 ± 0.00 | 0.50 ± 0.00 |
TF-LeNet | 0.43 ± 0.08 | 0.49 ± 0.09 | 0.43 ± 0.08 | 0.38 ± 0.11 | 0.25 ± 0.11 | 0.62 ± 0.05 |
TF-CNN | 0.70 ± 0.08 | 0.76 ± 0.07 | 0.70 ± 0.08 | 0.69 ± 0.08 | 0.60 ± 0.10 | 0.80 ± 0.05 |
TF-CNN-CFA | 0.76 ± 0.09 | 0.79 ± 0.07 | 0.76 ± 0.09 | 0.76 ± 0.10 | 0.68 ± 0.12 | 0.84 ± 0.06 |
Subjects | EEGNeX Acc | EEGNeX Kappa | TF-LeNet Acc | TF-LeNet Kappa | TF-CNN Acc | TF-CNN Kappa | TF-CNN-CFA Acc | TF-CNN-CFA Kappa |
---|---|---|---|---|---|---|---|---|
S1 | 0.3679 ±0.03 | 0.1578 ±0.04 | 0.5232 ±0.09 | 0.3638 ±0.09 | 0.8295 ±0.05 | 0.7724 ±0.08 | 0.8661 ±0.02 | 0.8214 ±0.03 |
S2 | 0.3598 ±0.03 | 0.1475 ±0.04 | 0.3920 ±0.10 | 0.1926 ±0.12 | 0.6134 ±0.15 | 0.4838 ±0.17 | 0.7304 ±0.05 | 0.6406 ±0.04 |
S3 | 0.3259 ±0.02 | 0.1003 ±0.03 | 0.4589 ±0.06 | 0.2783 ±0.07 | 0.6643 ±0.07 | 0.5520 ±0.09 | 0.6670 ±0.08 | 0.5569 ±0.12 |
S4 | 0.4545 ±0.05 | 0.2730 ±0.06 | 0.3143 ±0.05 | 0.0931 ±0.06 | 0.7366 ±0.04 | 0.6491 ±0.05 | 0.7116 ±0.03 | 0.6151 ±0.05 |
S5 | 0.3223 ±0.02 | 0.0959 ±0.02 | 0.3464 ±0.08 | 0.1353 ±0.11 | 0.6152 ±0.09 | 0.4872 ±0.12 | 0.6920 ±0.04 | 0.5890 ±0.05 |
S6 | 0.2295 ±0.02 | 0.0285 ±0.04 | 0.4821 ±0.07 | 0.3096 ±0.08 | 0.7196 ±0.05 | 0.6266 ±0.05 | 0.7536 ±0.04 | 0.6715 ±0.04 |
S7 | 0.4366 ±0.06 | 0.2489 ±0.08 | 0.5196 ±0.08 | 0.3585 ±0.10 | 0.7250 ±0.13 | 0.6340 ±0.17 | 0.9089 ±0.03 | 0.8784 ±0.03 |
Mean | 0.3566 ±0.08 | 0.1421 ±0.10 | 0.4338 ±0.08 | 0.2473 ±0.10 | 0.7005 ±0.08 | 0.6007 ±0.10 | 0.7614 ±0.09 | 0.6818 ±0.12 |
Time Length | Training Set | Test Set | Image Resolution |
---|---|---|---|
1.5 s | 13,440 | 3360 | 60 × 100 |
2.0 s | 10,080 | 2520 | 60 × 100 |
2.5 s | 8064 | 2016 | 60 × 100 |
3.0 s | 6720 | 1680 | 60 × 100 |
3.5 s | 5712 | 1428 | 60 × 100 |
4.0 s | 5040 | 1260 | 60 × 100 |
4.5 s | 4436 | 1108 | 60 × 100 |
5.0 s | 4032 | 1008 | 60 × 100 |
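The counts in this table are consistent with non-overlapping windows drawn from roughly 25,200 s of EEG (inferred from the 5.0 s row: (4032 + 1008) × 5.0 s) and an 80/20 train/test split. A sketch under those assumptions is below; the 3.5 s and 4.5 s rows fall slightly short of this reconstruction in the paper, plausibly because windows are floored at trial boundaries.

```python
# Approximate reconstruction of the segment counts, assuming non-overlapping
# windows over ~25,200 s of EEG and an 80/20 train/test split. The total
# duration and trial-wise segmentation are inferences, not stated values.
TOTAL_SECONDS = 25_200  # inferred: 5.0 s * (4032 + 1008) segments

for length in (1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0):
    n = int(TOTAL_SECONDS / length)          # non-overlapping window count
    train = int(n * 0.8)
    test = n - train
    print(f"{length:.1f} s: ~{train} train / ~{test} test")
# Matches the table exactly for most rows; 3.5 s and 4.5 s come out slightly
# high here, consistent with per-trial boundary flooring in the actual split.
```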
Time Length L | Model | Class 1 | Class 2 | Class 3 | Class 4 | Overall |
---|---|---|---|---|---|---|
5.0 s | TF-LeNet | 0.39 | 0.34 | 0.33 | 0.42 | 0.37 ± 0.04 |
 | TF-CNN | 0.52 | 0.61 | 0.59 | 0.54 | 0.57 ± 0.04 |
 | TF-CNN-CFA | 0.65 | 0.56 | 0.65 | 0.58 | 0.61 ± 0.05 |
4.5 s | TF-LeNet | 0.31 | 0.37 | 0.36 | 0.48 | 0.38 ± 0.07 |
 | TF-CNN | 0.57 | 0.59 | 0.50 | 0.48 | 0.54 ± 0.05 |
 | TF-CNN-CFA | 0.63 | 0.63 | 0.71 | 0.55 | 0.63 ± 0.07 |
4.0 s | TF-LeNet | 0.42 | 0.42 | 0.32 | 0.48 | 0.41 ± 0.07 |
 | TF-CNN | 0.69 | 0.62 | 0.61 | 0.58 | 0.63 ± 0.05 |
 | TF-CNN-CFA | 0.75 | 0.66 | 0.68 | 0.66 | 0.69 ± 0.04 |
3.5 s | TF-LeNet | 0.44 | 0.40 | 0.38 | 0.41 | 0.41 ± 0.03 |
 | TF-CNN | 0.66 | 0.70 | 0.67 | 0.67 | 0.68 ± 0.02 |
 | TF-CNN-CFA | 0.80 | 0.72 | 0.65 | 0.68 | 0.71 ± 0.07 |
3.0 s | TF-LeNet | 0.51 | 0.39 | 0.45 | 0.39 | 0.44 ± 0.06 |
 | TF-CNN | 0.67 | 0.74 | 0.71 | 0.69 | 0.70 ± 0.03 |
 | TF-CNN-CFA | 0.80 | 0.72 | 0.77 | 0.75 | 0.76 ± 0.03 |
2.5 s | TF-LeNet | 0.57 | 0.46 | 0.55 | 0.53 | 0.53 ± 0.05 |
 | TF-CNN | 0.80 | 0.69 | 0.73 | 0.67 | 0.72 ± 0.06 |
 | TF-CNN-CFA | 0.79 | 0.75 | 0.73 | 0.73 | 0.75 ± 0.03 |
2.0 s | TF-LeNet | 0.64 | 0.52 | 0.45 | 0.45 | 0.52 ± 0.09 |
 | TF-CNN | 0.77 | 0.80 | 0.78 | 0.78 | 0.78 ± 0.01 |
 | TF-CNN-CFA | 0.84 | 0.83 | 0.85 | 0.79 | 0.83 ± 0.03 |
1.5 s | TF-LeNet | 0.64 | 0.52 | 0.57 | 0.65 | 0.60 ± 0.06 |
 | TF-CNN | 0.89 | 0.83 | 0.86 | 0.82 | 0.85 ± 0.03 |
 | TF-CNN-CFA | 0.92 | 0.86 | 0.87 | 0.83 | 0.87 ± 0.04 |