1. Introduction
Brain-Computer Interface (BCI) is a direct link between a computer and the human brain [1]. It is the most recent advancement in Human-Computer Interface (HCI) technology. In contrast with typical input devices (mouse, keyboard, etc.), a BCI receives the waves generated by the brain at various locations on the human head and converts these signals into commands and actions that can control the computer. BCI applications are traditionally directed toward the medical field, mainly targeting patients with chronic neurological diseases [2].
BCI systems are classified as invasive or non-invasive according to where the sensors are placed. A system is considered an invasive BCI [3] if the array of electrodes used to collect brain signals is implanted inside the brain. On the other hand, when the electrodes are placed on the scalp, the system is non-invasive [4]. The first category has the benefit of a higher-quality signal with fewer artifacts and less noise. Patients with serious brain disabilities, such as blindness or paralysis, are the primary targets of invasive BCI, since neurosurgery is a risky and costly operation. The non-invasive category [5] has many benefits, including portability, affordability, and safety.
Electroencephalography (EEG) is the most frequently used non-invasive method [6]. EEG records the electrical activity of the brain as it occurs on the scalp’s surface [7]. EEG equipment can be either clinical or commercial. Clinical devices are utilized in medical applications and typically have 32 or 64 electrodes. Commercial devices are more affordable and widely available, mainly for lifestyle applications, and have anywhere between 1 and 64 electrodes.
BCI has the potential to be integrated with 3D and Extended Reality (XR) software [8,9] to help people with neurological and motor rehabilitation. The BCI can measure the patient’s brain activity, while the 3D/XR software provides immersive, interactive, and engaging experiences. For example, the patient could be guided through a series of exercises designed to improve their motor skills, with the BCI measuring progress and providing feedback so that the exercises can be adjusted accordingly. BCI has made great advances in the medical area with numerous applications, for example, controlling prosthetic limbs [10] or smart home systems [11]. As a result of this progress, BCI is now being deployed outside the medical arena. In the last decade, the literature has shown that two out of five BCI application scenarios primarily focused on non-medical fields [12]. One of these fields is the gaming industry [13,14]. BCI in gaming can be used to create a more immersive gaming experience: it allows gamers to use their thoughts to control the game instead of a mouse, keyboard, or gamepad. For example, a gamer could use their thoughts to move objects in the game or control the speed of their character. BCI-enabled gaming could also allow people with disabilities to enjoy gaming in a more interactive way. There are several potential benefits of using BCI in gaming [15], including:
- Increased immersion: One of the key advantages of using BCI in gaming is that it has the potential to increase immersion by allowing players to directly control game characters with their thoughts. This could lead to a more immersive and realistic gaming experience.
- Increased interactivity: Another benefit of using BCI in gaming is that it could allow for increased interactivity between players and game characters/environments. For example, if a player’s EEG signals indicate that they are feeling scared, this could trigger a change in the game environment (e.g., an enemy appearing), which would then require the player to react accordingly (e.g., by fighting back).
- Accessibility: Another potential advantage of using BCI in gaming is that it could make games more accessible to people with disabilities who may not be able to use traditional input devices such as keyboards or controllers.
Despite the potential benefits, there are also challenges associated with using BCI in gaming [16]. One of the main challenges is that BCI technology is still in its infancy, and significant technical obstacles need to be overcome before it can be widely used. For example, current BCI systems often have high error rates, making them impractical for gaming. In addition, they require users to undergo a period of training to learn how to control the device, which can take a considerable amount of time and makes it difficult for users to become proficient. Moreover, these systems are typically designed to work with specific games, making them difficult to use with other types of games. There is also a need to develop more user-friendly and affordable BCI systems [17].
This study presents an approach for creating a BCI game controlled by mental commands expressed through EEG signals. The Muse 2 headband is used to measure EEG signals from multiple players, while OpenViBE is employed to record the EEG data, to perform offline signal processing and classification, and to carry out real-time data acquisition with online EEG analysis and classification. Finally, the 3D game is developed in Unity; players are required to collect coins through motor imagery actions, such as left and right movements. The aim of this work is to determine whether users can adjust and improve their performance in manipulating the BCI system through training.
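For context, the sketch below shows one way the Muse 2 EEG stream published by BlueMuse over the Lab Streaming Layer could be read from Python with pylsl. The channel order and the 256 Hz sampling rate are assumptions, and in the proposed system this acquisition step is performed by OpenViBE rather than by custom code.

```python
# Minimal sketch: reading the Muse 2 EEG stream exposed by BlueMuse via LSL.
# Assumes the pylsl package and a running BlueMuse stream of type "EEG";
# the channel order (TP9, AF7, AF8, TP10) and 256 Hz rate are illustrative assumptions.
from pylsl import StreamInlet, resolve_byprop

streams = resolve_byprop("type", "EEG", timeout=10)   # locate the BlueMuse EEG stream
if not streams:
    raise RuntimeError("No EEG stream found; is BlueMuse streaming?")

inlet = StreamInlet(streams[0])
print("Connected to:", inlet.info().name())

samples = []
while len(samples) < 3 * 256:                # collect roughly one 3 s epoch of data
    sample, timestamp = inlet.pull_sample(timeout=1.0)
    if sample is not None:
        samples.append(sample[:4])           # keep only the four EEG channels
```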
5. Results
Table 2 presents the classification results of the EEG signals into three classes: Right Motor Imagery, Left Motor Imagery, and Blink. The Right Motor Imagery, Left Motor Imagery, and Blink columns present the True Positive Rate (TPR) and Precision of the respective commands, while the Overall column gives the classification accuracy (Acc) for each subject. The average classification accuracy over all 33 subjects is 96.9%. Blink is detected almost perfectly, while among the two motor imagery classes the TPR is slightly higher for Left Motor Imagery (95.6% on average) than for Right Motor Imagery (95.4% on average). These results indicate that the proposed method effectively classifies the EEG signals into the three categories with a high degree of accuracy.
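As a reminder of how the reported metrics are defined, the following minimal sketch computes per-class TPR, per-class Precision, and overall accuracy from a confusion matrix; the 3 × 3 counts used here are illustrative and not taken from the study.

```python
import numpy as np

# Rows = true class, columns = predicted class (order: Right MI, Left MI, Blink).
# The counts below are made-up illustrative data.
cm = np.array([[140,   5,   0],
               [  6, 139,   0],
               [  0,   1, 144]])

tpr = cm.diagonal() / cm.sum(axis=1)         # recall / True Positive Rate per class
precision = cm.diagonal() / cm.sum(axis=0)   # precision per class
accuracy = cm.diagonal().sum() / cm.sum()    # overall classification accuracy

for name, t, p in zip(["Right MI", "Left MI", "Blink"], tpr, precision):
    print(f"{name}: TPR={t:.1%}, Precision={p:.1%}")
print(f"Overall accuracy: {accuracy:.1%}")
```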
Twenty-six of the participants had not previously taken part in a BCI game experiment, while the remaining seven (Subjects 1–5 and 32–33) had participated in previous BCI research [28]. To ensure that they were properly prepared, an experienced researcher briefed them on the protocol before the experiment started. After the recordings and the offline processing were completed, the subjects were instructed to perform the left/right motor imagery mental commands for 90 s (45 s for the left and 45 s for the right movement) in order to evaluate the online classification results. This process was repeated after the completion of the testing phase for every subject, to examine whether users had improved after the training and testing process. For the training process, the subjects were given ten practice trials to become familiar with the game environment. Once familiar with the game, they played ten more times to evaluate the BCI system. The total number of coins collected (averaged over the ten repetitions per subject) and the total number of coin clusters (averaged over the ten repetitions per subject) were used as evaluation metrics. A coin cluster is considered completed if at least one of its coins is collected.
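To make the two game metrics concrete, the following sketch computes the game score and the number of completed coin clusters for a single run, assuming three coins per cluster as in the game; the per-cluster counts are invented for illustration.

```python
# Illustrative computation of the two game metrics for a single run.
# coins_per_cluster[i] = number of coins collected in cluster i (0-3); made-up data.
coins_per_cluster = [3, 2, 0, 1, 3, 0, 2, 1, 3, 2, 0, 1, 3, 2, 1, 0, 3]

game_score = sum(coins_per_cluster)                               # total coins collected
clusters_completed = sum(1 for c in coins_per_cluster if c >= 1)  # clusters with >= 1 coin

total_coins = 3 * len(coins_per_cluster)
print(f"Game score: {game_score}/{total_coins} ({game_score / total_coins:.1%})")
print(f"Coin clusters: {clusters_completed}/{len(coins_per_cluster)} "
      f"({clusters_completed / len(coins_per_cluster):.1%})")
```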
Table 3 presents the results of the online testing. The two metrics used to evaluate game performance are the Average Game Score and the Average Coin Clusters. The Average Game Score across all subjects is 27.6 (55.3%); the best-performing participant is subject 5 with an average score of 39.4 (78.8%) and the worst is subject 8 with 21 (42%). During the testing phase, it is observed that users can be divided into three groups depending on their performance and their ability to synchronize their mental commands with the game, as presented in Figure 6. The first group comprises ten individuals who experienced difficulty while playing the game; their scores range from 40% to 49.9%. The primary challenge faced by players in this category is their inability to effectively manipulate the BCI. The second group consists of 14 subjects who handled the game quite successfully, with game scores ranging from 50% to 59.9%. The last group includes nine users who controlled the BCI with excellent precision; their game scores range from 60% to 69.9% (six subjects) and from 70% to 78.8% (three subjects).
For the second online evaluation metric, the Average Coin Clusters is employed. The average number of clusters is 10.04 (59%), ranging from 7.8 (45.8%) to 13.8 (81.1%); subject 15 had the lowest average and subject 5 the highest. These percentages are higher than the Average Game Score percentages because most subjects could not collect all three coins of a cluster due to synchronization issues.
Another online metric used to evaluate the results is User Improvement (Table 4). Every subject performed 15 motor imagery commands for each of the left and right movements before and after playing the game, in order to monitor user adaptation to the BCI (60 trials per subject). The results indicate an average increase of approximately 7.5% in the accuracy of the two mental commands. The average percentage of correctly classified left motor imagery commands is 73.73% before playing the game and 80.80% after, an average improvement of 7.1%. For right motor imagery, the average percentage of correctly classified commands is 71.66% before and 79.73% after, an average improvement of 8.07%. These results are presented in Figure 7.
A Kolmogorov-Smirnov test was used to test the four data samples (Left MI before training, Left MI after training, Right MI before training, and Right MI after training) for normality. The results are D(33) = 0.18, p = 0.197 for the first sample; D(33) = 0.18, p = 0.201 for the second; D(33) = 0.15, p = 0.385 for the third; and D(33) = 0.14, p = 0.479 for the fourth. These results indicate that the data in all samples do not deviate significantly from a normal distribution.
Since the data are normally distributed, a paired-samples t-test is performed to assess user improvement before and after playing the game. For the left motor imagery command, the results indicate a statistically significant difference between before (M = 11.1, SD = 3.1) and after (M = 12.1, SD = 2.6) playing the game, t(33) = 5.8, p < 0.001. For the right motor imagery command, the results also indicate a statistically significant difference between before (M = 10.8, SD = 2.9) and after (M = 12, SD = 2.3) playing the game, t(33) = 5.5, p < 0.001.
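The statistical analysis above can be reproduced with standard tools; the sketch below uses scipy.stats.kstest for the normality check and scipy.stats.ttest_rel for the paired comparison. The arrays are placeholders rather than the study data, and estimating the normal distribution's parameters from the same sample makes the Kolmogorov-Smirnov p-values approximate (a Lilliefors-type correction would be stricter).

```python
import numpy as np
from scipy import stats

# Placeholder per-subject counts of correct commands out of 15 (not the study data).
left_before = np.array([11, 13, 12, 11, 14, 12, 13, 7, 10, 11, 10, 10, 12, 13, 9,
                        10, 12, 11, 13, 12, 8, 10, 13, 15, 14, 10, 15, 6, 15, 0,
                        15, 10, 8])
left_after = np.clip(left_before + np.random.randint(0, 3, size=left_before.size), 0, 15)

# Kolmogorov-Smirnov test against a normal distribution with the sample's mean/std.
for name, x in [("Left MI before", left_before), ("Left MI after", left_after)]:
    d, p = stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1)))
    print(f"{name}: D = {d:.2f}, p = {p:.3f}")

# Paired-samples t-test on the before/after counts.
t, p = stats.ttest_rel(left_after, left_before)
print(f"Paired t-test (Left MI): t = {t:.2f}, p = {p:.4f}")
```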
6. Discussion
This article presents a 3D game that employs a non-invasive BCI. The game utilizes the Muse 2 EEG headband to gather EEG data, while the signals are processed and classified into three distinct classes (left motor imagery, right motor imagery, and blink) using the OpenViBE platform. The primary objective of the game is to evaluate the users’ adaptation and progress within the BCI environment after undergoing training. The classification algorithm employed is a multi-layer perceptron (MLP), which achieved an accuracy rate of 96.94%. The experiment involved 33 participants, who effectively controlled an avatar by issuing mental commands in order to collect coins.
Our research aimed to explore the feasibility of using commercial, low-cost wireless EEG devices to create a reliable BCI system based on MI commands. The Muse 2 provides a user-friendly approach for studying motor imagery, but it is important to acknowledge the limitations of the electrode positions employed in the system. The optimal electrode positions for capturing MI commands are over the motor cortex, specifically at the C3 and C4 locations of the 10–20 EEG system, which correspond to the primary motor cortex on the left and right sides of the brain, respectively. The sub-optimal electrode placement, lacking direct coverage of the motor cortex (such as the C3 and C4 locations), may limit the spatial resolution and specificity of the recorded brain activity. As a result, the interpretation of neural signals related to motor imagery can be challenging, and the extracted features may not fully capture the intricacies of motor-related brain processes. To address this limitation, we employed a technique known as eye polarization, using left and right gaze. By conducting a series of training sessions with multiple users, we aimed to evaluate the effectiveness and reliability of the resulting BCI system. Despite encountering certain limitations and challenges, our experimental results showed promising potential for the development of a reliable BCI game based on MI commands with a low-cost EEG headset.
While our BCI game showed promising results, it is essential to acknowledge its limitations. The game is designed around a three-second epoch: the avatar continuously moves forward and receives a command for turning or jumping every 3 s, based on the user’s mental command. The game maintains continuous movement without interruption, but to execute a turning or jumping action the player must wait for the next three-second interval. The avatar’s constant forward movement is implemented to ensure a captivating and engaging gameplay experience. Additionally, the placement of coins in the game is strategically designed so that they are collectible within the current analysis framework and time windows. More generally, developing a BCI game requires careful adjustment of the game parameters to suit the requirements and constraints of the chosen EEG analysis approach. These considerations highlight the importance of adapting game designs to the capabilities and limitations of BCI systems, thus optimizing the user experience within the given constraints. Lastly, this BCI is not applicable to other games without adjustments.
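As an illustration of the three-second command cycle described above (and not the actual Unity implementation), the following sketch polls a classifier-output stream, keeps the majority label within each 3 s window, and dispatches a single action per window; the stream name and the label-to-action mapping are assumptions.

```python
import time
from collections import Counter
from pylsl import StreamInlet, resolve_byprop

EPOCH_S = 3.0
# Hypothetical mapping of classifier labels to in-game actions.
ACTIONS = {"left": "turn_left", "right": "turn_right", "blink": "jump"}

# "openvibe_labels" is a hypothetical marker-stream name for the classifier output.
streams = resolve_byprop("name", "openvibe_labels", timeout=10)
inlet = StreamInlet(streams[0])

while True:
    window_end = time.monotonic() + EPOCH_S
    labels = []
    while time.monotonic() < window_end:        # avatar keeps moving forward meanwhile
        sample, _ = inlet.pull_sample(timeout=0.1)
        if sample is not None:
            labels.append(sample[0])
    if labels:
        label = Counter(labels).most_common(1)[0][0]   # majority label of the window
        print("Dispatching action:", ACTIONS.get(label, "none"))
```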
Throughout the study, consistent stimuli are used during both the MI training phase and the data collection for classification training. Participants engaged in asynchronous MI tasks, mentally simulating hand movement actions without physical execution. This approach ensured that all participants were exposed to identical mental tasks, eliminating potential confounding factors associated with varying stimuli. It is worth noting that Action Observation (AO), which involves observing motor actions performed by others (videos, games, real movement), activates neural pathways similar to those of MI. The main objective of AO is to visually perceive and process motor actions executed by someone else in order to activate relevant neural pathways and enhance motor cognition. While this study focused on MI training, future research could consider incorporating AO to explore potential synergistic effects on motor learning and performance. Combining both modalities may enhance neural activation and functional connectivity within the motor system, potentially leading to improved outcomes in motor skill acquisition and rehabilitation [33].
A comparison of this work with other papers from the literature is provided in Table 5, which presents a comparative overview of studies that have used EEG devices to measure brain activity and control applications through mental commands. Although a direct comparison is not feasible, since each study has employed different subjects, devices, techniques, and evaluation methods, several remarks can be drawn from the table.
Hjørungdal et al. [18] used three subjects, the Emotiv EPOC EEG device, four mental commands, and the time to complete the task as the evaluation metric. Fiałek and Liarokapis [20] used two different EEG devices, the Emotiv EPOC+ and the NeuroSky MindWave, to compare them for two mental commands on 31 subjects; to evaluate their experiment they employed a combination of average rating values, learnability, satisfaction, performance, and effort.
Alchalabi et al. [19] used four subjects, the Emotiv EPOC+ EEG device, two mental commands, and average focus, average stress, average relaxation, average excitement, and average engagement as evaluation metrics. Djamal et al. [21] used 20 subjects, the NeuroSky MindWave EEG device, two mental commands, and the average accuracy of training and testing data as the evaluation metric. Joselli et al. [22] used 11 subjects, the NeuroSky MindWave Mobile EEG device, one mental command, and a combination of average player score, average missed cuts, average attention level, average stress level, average engagement level, and average evolution attention as evaluation metrics. Glavas et al. [23] used 38 subjects and the Muse 2 headband to acquire EEG data for two mental commands; to evaluate their work they employed classification accuracy, two average game scores, and improvement as evaluation metrics.
In this work, 33 subjects participated to evaluate the BCI game. In the presented literature, the minimum number of subjects is three and the maximum is 38, while the average is 17.8 subjects; thus, this study includes one of the largest datasets presented in the literature. Furthermore, the number of mental commands in this study is three, while the majority of the papers in the literature investigate BCI systems with two mental commands, with the exception of [18], which includes four mental commands, although with a very limited number of participants. In addition, multiple repetitions were performed in this study, with the participants playing the game 20 times (ten for practice and the rest for evaluation). The combination of a large number of participants, three mental commands, and multiple recordings for each subject (training, practice, and evaluation) composes a very demanding experimental setting, which surpasses previous attempts presented in the literature. Moreover, the evaluation metrics in this work are extensive (Classification Accuracy, Game Score, Number of Clusters, and User Improvement) in order to clearly assess the capabilities of the proposed system and capture its potential.
Author Contributions
Conceptualization, M.G.T.; methodology, G.P., K.G., K.D.T., A.T.T. and M.G.T.; software, G.P. and K.G.; validation, G.P. and K.G.; data curation, G.P. and K.G.; writing—original draft preparation, G.P. and K.G.; writing—review and editing, G.P., K.G., K.D.T., A.T.T. and M.G.T.; supervision, M.G.T.; project administration, M.G.T. All authors have read and agreed to the published version of the manuscript.
Funding
This work is partially supported by the project “AGROTOUR–New Technologies and Innovative Approaches to Agri-Food and Tourism to Boost Regional Excellence in Western Macedonia” (MIS 5047196), which is implemented under the Action “Reinforcement of the Research and Innovation Infrastructure”, funded by the Operational Programme “Competitiveness, Entrepreneurship and Innovation” (NSRF 2014-2020), and co-financed by Greece and the European Union (European Regional Development Fund). Additionally, this work has been co-financed by the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation under the call RESEARCH—CREATE—INNOVATE: “Intelli-WheelChair” (Project Code: T2EΔK-02438).
Institutional Review Board Statement
The study was approved by the Research Ethics and Ethics Committee of the University of Western Macedonia (protocol code 246/2023 and date of approval 8 June 2023).
Informed Consent Statement
Informed consent was obtained from all subjects involved in the study.
Data Availability Statement
Data are not available for this study.
Conflicts of Interest
The authors declare no conflict of interest.
Abbreviations
The following abbreviations are used in this manuscript:
BCI | Brain–computer interface |
HCI | Human–computer interface |
EEG | Electroencephalography |
EOG | Electrooculography |
LSL | Lab streaming layer |
MLP | Multi-layer perceptron |
MI | Motor Imagery |
XR | Extended Reality |
References
- Saha, S.; Mamun, K.A.; Ahmed, K.; Mostafa, R.; Naik, G.R.; Darvishi, S.; Khandoker, A.H.; Baumert, M. Progress in brain computer interface: Challenges and opportunities. Front. Syst. Neurosci. 2021, 15, 578875.
- Wolpaw, J.R. Brain-computer interfaces (BCIs) for communication and control. In Proceedings of the 9th International ACM SIGACCESS Conference on Computers and Accessibility, Tempe, AZ, USA, 15–17 October 2007; pp. 1–2.
- Birbaumer, N. Breaking the silence: Brain–computer interfaces (BCI) for communication and motor control. Psychophysiology 2006, 43, 517–532.
- Dornhege, G.; Millán, J.d.R.; Hinterberger, T.; McFarland, D.J.; Muller, K.R. Toward Brain-Computer Interfacing; Citeseer: State College, PA, USA, 2007; Volume 63.
- Kalagi, S.; Machado, J.; Carvalho, V.; Soares, F.; Matos, D. Brain computer interface systems using non-invasive electroencephalogram signal: A literature review. In Proceedings of the 2017 International Conference on Engineering, Technology and Innovation (ICE/ITMC), Madeira, Portugal, 27–29 June 2017; pp. 1578–1583.
- Abiri, R.; Borhani, S.; Sellers, E.W.; Jiang, Y.; Zhao, X. A comprehensive review of EEG-based brain–computer interface paradigms. J. Neural Eng. 2019, 16, 011001.
- Teplan, M. Fundamentals of EEG measurement. Meas. Sci. Rev. 2002, 2, 1–11.
- Georgiev, D.D.; Georgieva, I.; Gong, Z.; Nanjappan, V.; Georgiev, G.V. Virtual reality for neurorehabilitation and cognitive enhancement. Brain Sci. 2021, 11, 221.
- Wen, D.; Fan, Y.; Hsu, S.H.; Xu, J.; Zhou, Y.; Tao, J.; Lan, X.; Li, F. Combining brain–computer interface and virtual reality for rehabilitation in neurological diseases: A narrative review. Ann. Phys. Rehabil. Med. 2021, 64, 101404.
- Robinson, N.; Mane, R.; Chouhan, T.; Guan, C. Emerging trends in BCI-robotics for motor control and rehabilitation. Curr. Opin. Biomed. Eng. 2021, 20, 100354.
- Alrajhi, W.; Alaloola, D.; Albarqawi, A. Smart home: Toward daily use of BCI-based systems. In Proceedings of the 2017 International Conference on Informatics, Health & Technology (ICIHT), Riyadh, Saudi Arabia, 21–23 February 2017; pp. 1–5.
- Brunner, C.; Birbaumer, N.; Blankertz, B.; Guger, C.; Kübler, A.; Mattia, D.; Millán, J.D.R.; Miralles, F.; Nijholt, A.; Opisso, E.; et al. BNCI Horizon 2020: Towards a roadmap for the BCI community. Brain-Comput. Interfaces 2015, 2, 1–10.
- Marshall, D.; Coyle, D.; Wilson, S.; Callaghan, M. Games, gameplay, and BCI: The state of the art. IEEE Trans. Comput. Intell. AI Games 2013, 5, 82–99.
- Vasiljevic, G.A.M.; de Miranda, L.C. Brain–computer interface games based on consumer-grade EEG Devices: A systematic literature review. Int. J. Hum.–Comput. Interact. 2020, 36, 105–142.
- Kerous, B.; Skola, F.; Liarokapis, F. EEG-based BCI and video games: A progress report. Virtual Real. 2018, 22, 119–135.
- Plass-Oude Bos, D.; Reuderink, B.; van de Laar, B.; Gürkök, H.; Mühl, C.; Poel, M.; Nijholt, A.; Heylen, D. Brain-computer interfacing and games. In Brain-Computer Interfaces: Applying our Minds to Human-Computer Interaction; Springer: Cham, Switzerland, 2010; pp. 149–178.
- Stamps, K.; Hamam, Y. Towards inexpensive BCI control for wheelchair navigation in the enabled environment–a hardware survey. In Proceedings of the Brain Informatics: International Conference, BI 2010, Toronto, ON, Canada, 28–30 August 2010; pp. 336–345.
- Hjørungdal, R.M.; Sanfilippo, F.; Osen, O.; Rutle, A.; Bye, R.T. A game-based learning framework for controlling brain-actuated wheelchairs. In Proceedings of the 30th European Conference on Modelling and Simulation, Regensburg, Germany, 31 May–3 June 2016.
- Alchalcabi, A.E.; Eddin, A.N.; Shirmohammadi, S. More attention, less deficit: Wearable EEG-based serious game for focus improvement. In Proceedings of the 2017 IEEE 5th International Conference on Serious Games and Applications for Health (SeGAH), Perth, WA, Australia, 2–4 April 2017; pp. 1–8.
- Fiałek, S.; Liarokapis, F. Comparing Two Commercial Brain Computer Interfaces for Serious Games and Virtual Environments. In Emotion in Games; Springer: Cham, Switzerland, 2016; pp. 103–117.
- Djamal, E.C.; Abdullah, M.Y.; Renaldi, F. Brain computer interface game controlling using fast fourier transform and learning vector quantization. J. Telecommun. Electron. Comput. Eng. 2017, 9, 71–74.
- Joselli, M.; Binder, F.; Clua, E.; Soluri, E. Mindninja: Concept, Development and Evaluation of a Mind Action Game Based on EEGs. In Proceedings of the 2014 Brazilian Symposium on Computer Games and Digital Entertainment, Porto Alegre, Brazil, 12–14 November 2014; pp. 123–132.
- Glavas, K.; Prapas, G.; Tzimourta, K.D.; Giannakeas, N.; Tsipouras, M.G. Evaluation of the User Adaptation in a BCI Game Environment. Appl. Sci. 2022, 12, 12722.
- Interaxon’s Muse 2. Available online: https://choosemuse.com/muse-2/ (accessed on 1 June 2023).
- Garcia-Moreno, F.M.; Bermudez-Edo, M.; Rodríguez-Fórtiz, M.J.; Garrido, J.L. A CNN-LSTM deep learning classifier for motor imagery EEG detection using a low-invasive and low-cost BCI headband. In Proceedings of the 2020 16th International Conference on Intelligent Environments (IE), Madrid, Spain, 20–23 July 2020; pp. 84–91.
- Chaudhary, M.; Mukhopadhyay, S.; Litoiu, M.; Sergio, L.E.; Adams, M.S. Understanding brain dynamics for color perception using wearable EEG headband. arXiv 2020, arXiv:2008.07092.
- Pu, L.; Lion, K.M.; Todorovic, M.; Moyle, W. Portable EEG monitoring for older adults with dementia and chronic pain-A feasibility study. Geriatr. Nurs. 2021, 42, 124–128.
- Prapas, G.; Glavas, K.; Tzallas, A.T.; Tzimourta, K.D.; Giannakeas, N.; Tsipouras, M.G. Motor Imagery Approach for BCI Game Development. In Proceedings of the 2022 7th South-East Europe Design Automation, Computer Engineering, Computer Networks and Social Media Conference (SEEDA-CECNSM), Ioannina, Greece, 23–25 September 2022; pp. 1–5.
- Kowaleski, J. BlueMuse. 2019. Available online: https://github.com/kowalej/BlueMuse (accessed on 10 October 2022).
- Kothe, C. Lab Streaming Layer. 2018. Available online: https://github.com/sccn/labstreaminglayer (accessed on 8 May 2022).
- Marsland, S. Machine Learning: An Algorithmic Perspective; Chapman and Hall/CRC: Boca Raton, FL, USA, 2011.
- Raj, P.; Evangeline, P. The Digital Twin Paradigm for Smarter Systems and Environments: The Industry Use Cases; Academic Press: Cambridge, MA, USA, 2020.
- Miladinović, A.; Barbaro, A.; Valvason, E.; Ajčević, M.; Accardo, A.; Battaglini, P.P.; Jarmolowska, J. Combined and Singular Effects of Action Observation and Motor Imagery Paradigms on Resting-State Sensorimotor Rhythms. In Proceedings of the XV Mediterranean Conference on Medical and Biological Engineering and Computing—MEDICON 2019, Coimbra, Portugal, 26–28 September 2019; pp. 1129–1137.
Figure 1.
The Muse 2 headband (right image) and a screenshot from one recording showing the four EEG channels of the Muse 2 headband and their waveforms (left image).
Figure 2.
This flowchart illustrates the two-step process of the proposed system. Initially, an offline processing phase is employed to train a classifier using EEG data. Following this, an online processing phase is used to take mental commands from the user, which are subsequently translated into in-game movement through the use of the trained classifier.
Figure 3.
Offline processing scenario used to train the classifier. The first step of the scenario involves importing EEG recordings using a CSV file reader; three separate boxes handle the three different EEG recordings (Left, Right, and Blink) of the BCI system. The signals are then filtered to remove frequencies outside the range of 8–40 Hz. The filtered signals are epoched in time windows of 3 s and the EEG waves are calculated (Alpha, Beta 1, Beta 2, Gamma 1, Gamma 2). In the next step, the energy of the signals is calculated and a feature vector of all frequency-band energies is formed on a logarithmic scale. Finally, all feature vectors from all the different EEG recordings are used for the training of the classifier.
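A minimal sketch of the feature extraction described in this caption is given below: band-pass filtering to 8–40 Hz, epoching into 3 s windows, and log-scaled band energies per channel. The 256 Hz sampling rate, the exact band edges, and the filter order are assumptions, as they are not specified in the caption.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 256                       # assumed Muse 2 sampling rate (Hz)
EPOCH = 3 * FS                 # 3 s time windows
BANDS = {                      # assumed band edges (Hz)
    "alpha": (8, 12), "beta1": (12, 20), "beta2": (20, 30),
    "gamma1": (30, 35), "gamma2": (35, 40),
}

def band_energy_features(eeg):
    """eeg: array of shape (n_channels, n_samples); returns one log-energy
    feature vector (n_channels * n_bands values) per 3 s epoch."""
    sos = butter(4, [8, 40], btype="bandpass", fs=FS, output="sos")
    filtered = sosfiltfilt(sos, eeg, axis=1)          # broadband 8-40 Hz filter
    features = []
    for start in range(0, filtered.shape[1] - EPOCH + 1, EPOCH):
        epoch = filtered[:, start:start + EPOCH]
        vec = []
        for lo, hi in BANDS.values():
            band_sos = butter(4, [lo, hi], btype="bandpass", fs=FS, output="sos")
            band = sosfiltfilt(band_sos, epoch, axis=1)
            vec.extend(np.log(np.sum(band ** 2, axis=1) + 1e-12))  # log band energy
        features.append(vec)
    return np.array(features)

# Example: 4 channels, 30 s of synthetic data -> 10 feature vectors of length 20.
print(band_energy_features(np.random.randn(4, 30 * FS)).shape)
```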
Figure 4.
Online scenario for the proposed BCI system. The Acquisition Client connects to the LSL stream from BlueMuse on a specific port and the real-time data processing starts. With the channel selector, only the four EEG channels are included (TP9, TP10, AF7, AF8), and then the same processing as in the offline scenario is applied: the signals are filtered, the EEG waves and their energies are calculated, and these features are fed into the classifier to classify the mental commands. Finally, an LSL stream is employed to transmit the classifier’s results to the game. This is accomplished through the LSL stream box, which facilitates the communication of data between OpenViBE and the game.
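The last step of this caption, transmitting the classifier's decision to the game over LSL, could look like the following sketch; the stream name and label values are illustrative assumptions, and in the actual system this step is handled by the LSL stream box inside OpenViBE.

```python
from pylsl import StreamInfo, StreamOutlet

# One-channel string marker stream carrying the predicted mental command.
# "openvibe_labels" is a hypothetical stream name used only for this example.
info = StreamInfo(name="openvibe_labels", type="Markers", channel_count=1,
                  nominal_srate=0, channel_format="string", source_id="bci_game_demo")
outlet = StreamOutlet(info)

def send_prediction(label: str) -> None:
    """Push one classifier decision ("left", "right" or "blink") to the game."""
    outlet.push_sample([label])

send_prediction("left")   # e.g., after classifying the latest 3 s epoch
```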
Figure 5.
Screenshot from the gameplay. The avatar moves on the platform according to the user’s mental commands.
Figure 6.
Histogram that presents the different groups of users depending on their performance while testing the BCI game.
Figure 7.
The average user improvement for MI commands before and after playing the game.
Table 1.
The three classes (left MI, right MI, and blink) are fully balanced, and the number of feature vectors fed to the classifier for each class per subject is 97.
Feature Vectors for Class 1 per Subject | Feature Vectors for Class 2 per Subject | Feature Vectors for Class 3 per Subject |
---|---|---|
97 | 97 | 97 |
Table 2.
Classification Results for the three classes. The obtained classification results in terms of True Positive Rate and Precision for each Motor Imagery command and Blink are presented for each subject. The last two columns provide the classification accuracy and ROC metric, respectively, for each subject.
Subjects | Right Motor Imagery (TPR/Precision) (%) | Left Motor Imagery (TPR/Precision) (%) | Blink (TPR/Precision) (%) | Overall (Acc %) | ROC Area |
---|---|---|---|---|---|
1 | 76.6/93.4 | 94.5/79.7 | 100.0/100.0 | 90.3 | 0.946 |
2 | 97.2/100.0 | 100.0/96.7 | 100.0/100.0 | 99 | 0.997 |
3 | 99.3/100.0 | 100/98.7 | 100.0/100.0 | 99.7 | 1 |
4 | 95.2/89.8 | 89.7/94.6 | 100.0/100.0 | 94.9 | 0.993 |
5 | 99.3/98.7 | 98.6/99.4 | 100.0/100.0 | 99.3 | 0.997 |
6 | 84.1/85.0 | 85.5/84.6 | 100.0/100.0 | 89.8 | 0.968 |
7 | 84.8/98.5 | 99.3/84.4 | 99.3/99.0 | 94.4 | 0.993 |
8 | 100.0/99.0 | 99.3/98.7 | 100.0/100.0 | 99.7 | 1 |
9 | 95.2/93.5 | 94.6/95.0 | 99.3/100.0 | 96.5 | 0.997 |
10 | 99.3/97.7 | 97.5/98.8 | 100.0/100.0 | 99.1 | 0.998 |
11 | 78.6/94.8 | 96.2/81.3 | 100.0/100.0 | 91.1 | 0.982 |
12 | 96.6/98.4 | 97.9/100.0 | 100.0/100.0 | 98.1 | 0.999 |
13 | 91.7/77.5 | 73.8/90.4 | 99.3/99.7 | 88.2 | 0.935 |
14 | 97.9/91.3 | 91.0/97.5 | 100.0/100.0 | 96.3 | 0.997 |
15 | 99.0/100.0 | 100.0/99.0 | 100.0/100.0 | 99.6 | 1 |
16 | 100.0/98.0 | 97.9/100.0 | 100.0/100.0 | 99.3 | 0.997 |
17 | 91.7/100.0 | 100.0/92.4 | 100.0/100.0 | 97.2 | 0.997 |
18 | 93.8/92.9 | 92.8/93.8 | 100.0/100.0 | 95.5 | 0.994 |
19 | 100.0/100.0 | 100.0/100.0 | 100.0/100.0 | 100 | 1 |
20 | 97.9/96.9 | 96.6/97.9 | 100.0/100.0 | 98.1 | 0.994 |
21 | 95.2/98.5 | 99.3/93.8 | 99.3/100.0 | 97.9 | 0.997 |
22 | 100.0/95.2 | 95.2/100.0 | 100.0/100.0 | 98.3 | 0.994 |
23 | 97.2/94.6 | 95.2/96.4 | 100.0/100.0 | 97.4 | 0.998 |
24 | 99.3/98.7 | 100.0/99.0 | 98.6/100.0 | 99.3 | 1 |
25 | 96.6/100.0 | 100.0/96.3 | 99.3/100.0 | 98.5 | 0.994 |
26 | 99.3/99.3 | 100.0/99.0 | 99.3/100.0 | 99.5 | 1 |
27 | 99.0/100.0 | 100.0/99.0 | 100.0/100.0 | 99.6 | 1 |
28 | 100.0/83.6 | 80.4/100.0 | 100.0/100.0 | 93.4 | 0.982 |
29 | 99.0/96.0 | 96.0/98.8 | 100.0/100.0 | 98.3 | 1 |
30 | 99.0/95.0 | 94.8/99.8 | 100.0/100.0 | 97.9 | 0.993 |
31 | 95.9/96.9 | 96.9/95.9 | 100.0/100.0 | 97.6 | 0.998 |
32 | 91.8/93.7 | 93.8/91.9 | 100.0/100.0 | 95.2 | 0.993 |
33 | 97.9/99.0 | 99.0/98.0 | 100.0/100.0 | 98.9 | 0.996 |
Table 3.
Game Results. The Average Game Score column represents the average number of coins collected during the ten repetitions and the respective percentage. The Average Coin Clusters column represents the average number of coin clusters in which the subject collected at least 1 coin.
Subjects | Average Game Score | Average Coin Clusters |
---|---|---|
1 | 28.7 (57.4%) | 10.6 (62.3%) |
2 | 36.8 (73.6%) | 13.2 (77.6%) |
3 | 30.2 (60.4%) | 10.9 (64.1%) |
4 | 28.6 (57.2%) | 10.4 (61.1%) |
5 | 39.4 (78.8%) | 13.8 (81.1%) |
6 | 29.6 (59.2%) | 11.0 (64.7%) |
7 | 31.4 (62.8%) | 11.3 (66.4%) |
8 | 21.0 (42.0%) | 8.2 (48.2%) |
9 | 25.1 (50.2%) | 9.3 (54.7%) |
10 | 27.6 (55.2%) | 10.1 (59.4%) |
11 | 24.3 (48.6%) | 8.5 (50.0%) |
12 | 25.5 (51.0%) | 9.5 (55.8%) |
13 | 32.0 (64.0%) | 11.5 (67.6%) |
14 | 35.0 (70.0%) | 12.5 (73.5%) |
15 | 21.3 (42.6%) | 7.8 (45.8%) |
16 | 26.4 (52.8%) | 9.5 (55.8%) |
17 | 27.9 (55.8%) | 10.3 (60.5%) |
18 | 25.8 (51.6%) | 9.2 (54.1%) |
19 | 31.4 (62.8%) | 11.3 (66.4%) |
20 | 27.6 (55.2%) | 10.0 (58.8%) |
21 | 23.3 (46.6%) | 8.2 (48.2%) |
22 | 27.9 (55.8%) | 10.4 (61.1%) |
23 | 31.5 (63.0%) | 11.4 (67.0%) |
24 | 25.3 (50.6%) | 9.0 (52.9%) |
25 | 24.8 (49.6%) | 8.6 (50.5%) |
26 | 24.2 (48.4%) | 8.2 (48.2%) |
27 | 28.6 (57.2%) | 10.2 (60.0%) |
28 | 22.3 (44.6%) | 8.4 (49.4%) |
29 | 31.5 (63.0%) | 11.4 (67.0%) |
30 | 22.6 (45.2%) | 8.8 (51.7%) |
31 | 23.2 (46.4%) | 8.4 (49.4%) |
32 | 28.4 (56.8%) | 10.8 (63.5%) |
33 | 23.8 (47.6%) | 8.8 (51.7%) |
Table 4.
User Improvement. The table shows the classification results for the 15 trials of each Motor Imagery command performed before and the 15 trials performed after playing the game, reported as True Positive Rate and Precision for each command.
Subjects | Left MI (TPR/Precision) (%) (Before Training) | Left MI (TPR/Precision) (%) (After Training) | Right MI (TPR/Precision) (%) (Before Training) | Right MI (TPR/Precision) (%) (After Training) |
---|---|---|---|---|
1 | 73.3/66.8 | 80.0/80.0 | 66.7/71.4 | 80.0/80.0 |
2 | 86.7/81.3 | 93.3/87.5 | 80.0/88.5 | 86.7/92.9 |
3 | 80.0/70.6 | 86.7/72.2 | 66.7/76.9 | 66.7/83.3 |
4 | 73.3/73.3 | 73.3/78.6 | 73.3/73.3 | 80.0/75.0 |
5 | 93.3/82.4 | 100.0/88.2 | 80.0/92.3 | 86.7/100 |
6 | 80.0/75.0 | 80.0/80.0 | 73.3/78.6 | 80.0/80.0 |
7 | 86.7/81.3 | 93.3/93.3 | 80.0/85.7 | 93.3/93.3 |
8 | 46.7/50.0 | 60.0/56.3 | 53.3/50.0 | 53.3/57.1 |
9 | 66.7/71.4 | 73.3/78.6 | 73.3/68.8 | 80.0/75.0 |
10 | 73.3/68.8 | 80.0/80.0 | 66.7/71.4 | 80.0/80.0 |
11 | 66.7/62.5 | 73.3/66.8 | 60.0/64.3 | 66.7/71.4 |
12 | 66.7/66.7 | 80.0/75.0 | 66.7/66.7 | 73.3/78.6 |
13 | 80.0/80.0 | 93.3/87.5 | 80.0/80.0 | 86.7/92.9 |
14 | 86.7/92.9 | 93.3/100 | 93.3/87.5 | 100.0/93.8 |
15 | 60.0/52.9 | 53.3/57.1 | 46.7/53.8 | 60.0/56.3 |
16 | 66.7/62.5 | 73.3/64.7 | 60.0/64.3 | 60.0/69.2 |
17 | 80.0/75.0 | 80.0/80.0 | 73.3/78.6 | 80.0/80.0 |
18 | 73.3/68.8 | 80.0/75.0 | 66.7/71.4 | 73.3/78.6 |
19 | 86.7/92.9 | 93.3/87.5 | 93.3/87.5 | 86.7/92.9 |
20 | 80.0/75.0 | 80.0/80.0 | 73.3/78.6 | 80.0/80.0 |
21 | 53.3/57.1 | 60.0/64.3 | 60.0/56.3 | 66.7/62.5 |
22 | 66.7/62.5 | 73.3/73.3 | 60.0/64.3 | 73.3/73.3 |
23 | 86.7/72.2 | 100.0/78.9 | 66.7/83.3 | 73.3/100 |
24 | 100.0/75.0 | 100.0/88.2 | 66.7/100 | 86.7/100 |
25 | 93.3/87.5 | 93.3/87.5 | 86.7/92.9 | 86.7/92.9 |
26 | 66.7/90.9 | 80.0/70.6 | 53.3/73.7 | 66.7/76.9 |
27 | 100.0/78.9 | 100.0/100.0 | 73.3/100.0 | 100.0/100.0 |
28 | 40.0/100 | 53.3/88.9 | 100.0/62.5 | 93.3/66.7 |
29 | 100.0/100.0 | 100.0/100.0 | 100.0/100.0 | 100.0/100.0 |
30 | 0.0/- | 20.0/100.0 | 100.0/50.0 | 100.0/55.6 |
31 | 100.0/50.0 | 100.0/60.0 | 0.0/- | 33.3/100.0 |
32 | 66.7/90.9 | 86.7/100.0 | 93.3/73.7 | 100.0/88.2 |
33 | 53.3/72.7 | 80.0/100 | 80.0/63.2 | 100.0/83.3 |
Table 5.
Comparative study.
Authors | Subjects | EEG Device | Mental Commands | Reps per Subj | Experiment Duration (per Subj) | Evaluation Metrics |
---|---|---|---|---|---|---|
Hjørungdal et al. [18] | 3 | Emotiv EPOC+ | 4 | - | - | Time to Complete the Task |
Fiałek and Liarokapis [20] | 31 | NeuroSky MindWave, Emotiv EPOC+ | 2 | - | - | Avg Rating Values, Learnability, Satisfaction, Performance, Effort |
Alchalabi et al. [19] | 4 | Emotiv EPOC+ | 2 | 1 | - | Avg Focus (0.38), Avg Stress (0.49), Avg Relaxation (0.32), Avg Excitement (0.25), Avg Engagement (0.65) |
Djamal et al. [21] | 20 | NeuroSky MindWave | 2 | 4 | - | Average Accuracy of Training and Testing Data |
Joselli et al. [22] | 11 | NeuroSky MindWave | 1 | 5 | 10 min | Avg Player Score (163.36), Avg Missed Cuts (8), Avg Attention Level (74), Avg Stress Level (41.27), Avg Engagement Level (75.73), Avg Evolution Attention (1.70) |
Glavas et al. [23] | 38 | Muse 2 Headband | 2 | 20 | 40 min | Classification Accuracy (98.75%), Avg Game Score 1 (52.70%), Avg Game Score 2 (70.35%), Improvement |
This work | 33 | Muse 2 Headband | 3 | 20 | 55 min | Classification Accuracy (96.94%), Avg Game Score 27.6 (55.3%), Avg Number of Clusters 10.04 (59%), Avg Improvement 7.5% |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).