Article

EEG-Based Identification of Emotional Neural State Evoked by Virtual Environment Interaction

1 Yonsei Graduate Program in Cognitive Science, Yonsei University, Seoul 03722, Korea
2 Department of Psychology, Yonsei University, Seoul 03722, Korea
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Int. J. Environ. Res. Public Health 2022, 19(4), 2158; https://doi.org/10.3390/ijerph19042158
Submission received: 14 January 2022 / Revised: 11 February 2022 / Accepted: 11 February 2022 / Published: 14 February 2022

Abstract

Classifying emotional states is critical for brain–computer interfaces and psychology-related domains. In previous studies, researchers have tried to identify emotions using neural data such as electroencephalography (EEG) signals or brain functional magnetic resonance imaging (fMRI). In this study, we propose a machine learning framework for emotional state classification using EEG signals recorded in virtual reality (VR) environments. To evoke emotional neural states in the brain signals, we presented three VR stimulus scenarios to 15 participants. Fifty-four features were extracted from the EEG signals collected under each scenario. To find the optimal classifier for our research design, three machine learning algorithms (XGBoost classifier, support vector classifier, and logistic regression) were applied. Additionally, various class conditions were used in the machine learning classifiers to validate the performance of our framework. To evaluate the classification performance, we utilized five evaluation metrics (precision, recall, f1-score, accuracy, and AUROC). Among the three classifiers, the XGBoost classifier showed the best performance under all experimental conditions. Furthermore, the usefulness of features in the differential asymmetry and frequency band power categories was confirmed from the feature importance of the XGBoost classifiers. We expect that our framework can be applied widely, not only to psychological research but also to mental health-related issues.

1. Introduction

The identification of neural states plays an important role in a variety of domains, including brain–computer interaction (BCI) and psychological research [1,2,3]. In several previous studies, diverse methodologies have been applied to measure neural state variations among participants. Kevric and Subasi [4] proposed a motor imagery BCI model using electroencephalography (EEG) signals. Rota et al. [5] utilized brain functional magnetic resonance imaging (fMRI) data to construct a BCI system. Among the brain imaging techniques used in related research, EEG-based approaches are primarily used to analyze neural activity. Gu et al. [6] highlighted the high temporal resolution of EEG, reaching the millisecond level, as a key advantage in this line of research. Additionally, the signal reliability and mobility of recently introduced EEG collection devices have been positively evaluated for real-world BCI and clinical use.
Based on the aforementioned advantages of EEG signals, several researchers have collected EEG signals and used them to compare or classify the psychological states of participants. Duan et al. [7] attempted to classify the emotional states of participants using EEG signals; fluctuations in the EEG signals were recorded while movie clips were used as stimuli. Secerbegovic et al. [8] used a single-channel EEG signal to differentiate between mental workload and stress in student groups. To investigate the different neural state variations under stress and workload, two scenarios were provided to distinct groups: one group performed a difficult 3-back visual task, while the other group was given the same visual task followed by Lumosity games. Wen et al. [9] evaluated spatial cognitive ability based on EEG signal analysis. They recorded and compared EEG signals before and after a spatial navigation training program, using navigation tasks along visible and hidden routes in a virtual city street environment for the evaluation.
In recent research, virtual reality (VR) has been widely used as an experimental or therapeutic material. Kim et al. [10] proposed a VR-based human perception test framework for hemispatial neglect. The test framework consists of screens with targets in 3D environments, and participants' selections of the targets were recorded to assess their perceptual abilities. Stevens et al. [11] utilized VR tools to investigate the efficacy of exposure therapy for posttraumatic stress disorder (PTSD). Exposure therapy with imaginal exposure was delivered through the display of VR devices, and clinician interviews were conducted before and after treatment to compare PTSD levels. Torrico et al. [12] examined emotional responses to chocolate in VR environments. Three environments, including traditional booths and VR devices, were used to identify differences in sensory responses.
Various virtual environments combined with EEG signals have been considered as research methods in related studies. To check neural state variations under VR conditions, the measured EEG signals were analyzed together with participants' reactions. Yu et al. [13] designed a study using VR emotion recognition tasks and EEG measurements. Three categories of stimuli (positive, neutral, and negative) were included in the tasks to compare emotional responses across experimental conditions, and functional brain network indices were calculated to detect the influence of the tasks in the virtual environment. Tamburin et al. [14] used VR settings to examine the associations between craving and EEG measurements. Among diverse craving situations, the authors evaluated smoking-related cue reactivity in smokers and non-smokers. The participant groups (i.e., smokers and non-smokers) were exposed to two VR scenarios (a mountain landscape and an office), and band-power values were computed from the EEG signals collected in the virtual scenarios to distinguish the effects of the cues.
Li and Kim [15] evaluated the effects of task complexity in a VR environment using EEG signals. They calculated descriptive statistics from the behavioral responses. To compare responses between conditions, the authors conducted a repeated-measures ANOVA test. Kisker et al. [16] gathered electrophysiological correlates of fear and avoidance tendencies from EEG signals. Subsequently, Shapiro–Wilk tests were used to check the normality of the variables. Moreover, ANOVA and t-tests were applied to investigate the differences between responses and EEG analysis results.
Various studies have applied machine or deep learning algorithms to find latent patterns in EEG data. Qureshi et al. [17] utilized K-nearest neighbor (KNN) and fuzzy rough nearest neighbor (FRNN) algorithms to classify epileptic seizures from EEG signals. They suggested EEG feature extraction algorithms based on discrete wavelet transformation (DWT) methods, and the classification performance of the fuzzy function-based algorithms was compared with that of traditional algorithms (such as decision tree and random forest). Furthermore, Geraedts et al. [18] proposed a fully automated EEG-based machine learning pipeline to identify patients with Parkinson's disease (PD); the proposed pipeline showed better classification performance than other frameworks.
Based on previous studies, we propose a machine learning framework for neural state classification in a virtual environment based on EEG analysis. EEG signals were collected from 15 participants before and after the tasks to investigate the influence of the VR tasks. A total of three virtual environments (high arousal, low arousal, and social anxiety) were provided to participants for the evaluation of emotional responses. To identify the optimal classification algorithm for our framework, three machine learning models (XGBoost classifier, support vector classifier, and logistic regression) were applied in our experiments. Furthermore, we examined the important features of the classification models to interpret our experimental results.
The main contributions of our research are as follows. First, regarding the external environment during EEG signal collection, we provided various virtual situations through VR devices to investigate the influence of virtual content on neural state variation. Second, to identify a suitable classification algorithm for our research scheme, we established the appropriateness of the XGBoost classifier through comparisons among three machine learning classifiers. Third, regarding the identification of important features in the classification tasks, we verified the usefulness of fourteen features for classification out of the fifty-four features widely used in previous studies. Finally, we propose a machine learning-based emotion classification framework using EEG signals in a virtual environment. This framework can contribute to future research on mental state-related health issues (such as PTSD, obsessive-compulsive disorder, autism spectrum disorder, and attention deficit hyperactivity disorder).

2. Materials and Methods

2.1. Overview

Our experimental design comprised six steps. First, we collected VR-evoked EEG signals from participants in three virtual environment conditions. Second, EEG signals were preprocessed to perform feature extraction. Third, 54 features were extracted from the preprocessed EEG signals. Fourth, lasso and ridge regression models were used to select the proper features from the 54 extracted features. Fifth, we applied three machine learning classifiers to classify emotional neural states based on EEG signal features. Finally, the classification performance was evaluated using five evaluation indices. The detailed steps are shown in Figure 1.

2.2. Participants

Fifteen participants (eight men and seven women; age range, 19–38 years; mean = 26.27, standard deviation = 5.38) participated in the study. None of the participants reported any mental illness. The experiments were designed and conducted in accordance with the institutional review board approval at Yonsei University. In the experiment, participants were required to experience three types of 360-degree VR content on YouTube to induce different neural states [19,20,21]. Oculus Quest 2, a VR head-mounted display, was used to experience the VR content.

2.3. EEG Data Acquisition and Experimental Procedure

Three types of VR content—low arousal, high arousal, and social anxiety—were provided as stimuli. In the low-arousal condition, a calm virtual sea view was shown to induce a low-arousal state. In contrast, in the high-arousal condition, a skydiving environment was shown to induce a high-arousal state; a previous study validated that skydiving experiences elicit a hyperarousal state [22]. To evoke a relatively more intense emotional state than the two aforementioned conditions, we conducted group interviews in the social anxiety condition. Previous studies have shown that VR content can evoke social anxiety [23,24].
Participants were instructed to look around their surroundings in the high- and low-arousal conditions. In the social anxiety condition, interviews with three interviewers were conducted. The level of arousal could be similar in the high-arousal and social anxiety conditions; however, the two were distinct in that the social anxiety condition involved interactions with the interviewers. In addition, the two conditions differed in the valence of the evoked emotions. Skydiving is a positive experience associated with fun, happiness, and pleasure [25], whereas the social anxiety condition can induce more negative feelings than the high-arousal condition, since an interview can evoke anxiety and distress because the participants are evaluated by the interviewers [26]. Examples of the virtual content are shown in Figure 2.
Brain activity was recorded with a portable EEG device, the Muse 2 headset band (InteraXon, Inc.). It has seven electrodes: four active electrodes located at AF7, AF8, TP9, and TP10, which cover the prefrontal and temporoparietal regions; a ground electrode located at FPz in the center; and two reference electrodes to the left and right of the ground (Figure 3). EEG data were acquired at a sampling frequency of 256 Hz.
Prior to the experiment, we explained the EEG signal collection procedure to the participants and asked each participant to sign a consent form. Once the consent form was signed, eyes-open resting-state EEG was recorded for 5 min while the participant sat in a relaxed position to obtain baseline activity. After the baseline EEG recording, the emotional states evoked by each VR experience were examined for each participant. A block diagram of the collection procedure is presented in Figure 4.

2.4. EEG Preprocessing

The VR-evoked EEG signals from the participants were preprocessed using EEGLAB, an interactive MATLAB toolbox [29]. First, baseline correction was performed by subtracting the mean baseline value from each epoch to reduce baseline differences across the EEG channels and temporal drifts. Second, a band-pass filter from 0.5 to 70 Hz was applied in order to minimize artifacts such as direct-current shifts and filtering artifacts at epoch boundaries [30]. Third, the continuous resting-state EEG data were segmented into 2 s epochs, a length that is advisable for removing short artifacts and mitigating non-stationarity [31,32]. Finally, 2 s epochs containing noise or artifacts such as eye movements, eye blinks, and muscle activity were removed manually based on their signal shape (e.g., sudden spikes and patterns resembling cardiac cycles). In addition, an independent component analysis (ICA) algorithm, which identifies maximally temporally independent EEG signals, was used to remove remaining noise in the EEG signals. As a result, an average of 54.27 of the 150 epochs remained after removal and was used for analysis.
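The preprocessing described above was performed in EEGLAB; purely as an illustration, the following Python sketch reproduces the band-pass filtering, 2 s epoching, and per-epoch baseline correction with NumPy/SciPy. The function names, filter order, and synthetic input are our assumptions, and the manual artifact rejection and ICA steps are omitted.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256  # Muse 2 sampling rate (Hz), as reported above


def bandpass(data, low=0.5, high=70.0, fs=FS, order=4):
    """Zero-phase 0.5-70 Hz band-pass filter applied along the sample axis."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, data, axis=-1)


def make_epochs(data, epoch_sec=2.0, fs=FS):
    """Cut a (channels, samples) array into non-overlapping 2 s epochs and
    baseline-correct each epoch by subtracting its per-channel mean
    (a simplification of the baseline correction described in the text)."""
    n = int(epoch_sec * fs)
    n_epochs = data.shape[-1] // n
    epochs = data[:, : n_epochs * n].reshape(data.shape[0], n_epochs, n)
    return epochs - epochs.mean(axis=-1, keepdims=True)


# Example with synthetic data standing in for a 4-channel recording
raw = np.random.randn(4, FS * 60)      # 60 s of AF7/AF8/TP9/TP10 signal (stand-in)
epochs = make_epochs(bandpass(raw))    # shape: (4, 30, 512)
```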

2.5. EEG Feature Extraction

We extracted a total of fifty-four features for the analysis. The power spectral density was calculated by averaging the power within each frequency band in each epoch and then averaging across all epochs. The frequency bands were divided into delta (0.5–4 Hz), theta (4–8 Hz), alpha (8–12 Hz), beta (12–30 Hz), and gamma (30–50 Hz). Twenty frequency power features, that is, five frequency bands for each of the four electrodes, were extracted. Previous findings have suggested that frequency power analysis of resting-state EEG provides insight into the neural correlates of emotional mental states. For example, frontal delta power is present during reduced alertness, such as at rest [33]. Lower delta activity in the posterior temporal and occipital areas before target onset is related to a more alert state [34]. Resting-state theta activity has been linked to anxiety states, including social anxiety [35,36]. Alpha power is an index of the resting state with reduced cortical arousal, and lower alpha power in the posterior region has been reported in high-arousal groups [37], while high beta power is related to stress and anxiety [38]. Resting-state gamma power is associated with high-arousal states, such as higher cognitive and emotional processes [39], and resting frontal gamma power is associated with early cognition and language [40]. Differential asymmetry (DASM) is the difference in the frequency band power of symmetrical electrode pairs (TP9&TP10 and AF7&AF8); there are ten DASM features, comprising five frequency band powers for the two electrode pairs [41]. Rational asymmetry (RASM) is the ratio of the frequency band power of symmetrical electrode pairs (TP9&TP10 and AF7&AF8) [42]; a total of ten RASM features, comprising five frequency band powers for the two electrode pairs, were used. Correlation coefficients of the frequency band power for the electrode pairs TP9&TP10 and AF7&AF8 were also used [43], giving ten correlation coefficient features (five frequency bands for two electrode pairs). The fractal dimension (FD) is a measure of complexity, a key property of fractals. The nonlinear properties of the EEG signal in the time domain can be analyzed using the FD: a low FD corresponds to a regular signal, whereas a high FD corresponds to an irregular signal, and the FD can distinguish EEG data recorded from subjects in different states [44,45]. The Katz FD was used in this study based on previous results showing superior performance in the classification of EEG data [46]. The FDs of the four electrodes were calculated in this study.
In summary, the EEG frequency power analysis was performed using a fast Fourier transform. The frequency bands were divided into delta (0.5–4 Hz), theta (4–8 Hz), alpha (8–12 Hz), beta (12–30 Hz), and gamma (30–50 Hz). The absolute power of each frequency band was estimated for each EEG channel, yielding a total of 20 values (four EEG channels × five frequency bands). DASM and RASM are the differences and ratios, respectively, of the symmetrical electrode pairs TP10–TP9 and AF8–AF7; each was measured for five frequency bands (two pairs of EEG channels × five frequency bands). The correlation coefficient between paired regions was calculated for the five frequency bands (two pairs of EEG channels × five frequency bands). The fractal dimension, which measures fractal complexity, was calculated for each electrode (four channels). The 54 features mentioned above were analyzed in this study. The detailed extracted features are listed in Table 1.
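As a hedged sketch of how the five feature categories could be computed in Python (Welch's method for the power spectral density, the helper names, and the reading of the correlation-coefficient feature below are our assumptions, not the authors' implementation):

```python
import numpy as np
from scipy.signal import welch

FS = 256
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (12, 30), "gamma": (30, 50)}


def band_powers(sig, fs=FS):
    """Mean spectral power per frequency band for one channel of one epoch."""
    freqs, psd = welch(sig, fs=fs, nperseg=len(sig))
    return {b: psd[(freqs >= lo) & (freqs < hi)].mean()
            for b, (lo, hi) in BANDS.items()}


def katz_fd(sig):
    """Katz fractal dimension of a 1-D signal."""
    L = np.abs(np.diff(sig)).sum()       # total length of the waveform
    d = np.abs(sig - sig[0]).max()       # largest distance from the first sample
    n = len(sig) - 1                     # number of steps
    return np.log10(n) / (np.log10(n) + np.log10(d / L))


def pair_features(p_right, p_left, sig_right, sig_left):
    """Asymmetry and correlation features for one symmetric pair (e.g., AF8/AF7)."""
    dasm = p_right - p_left              # differential asymmetry (e.g., AF8 - AF7)
    rasm = p_right / p_left              # rational asymmetry (e.g., AF8 / AF7)
    # one plausible reading of the CC feature: correlation between paired channels
    cc = np.corrcoef(sig_right, sig_left)[0, 1]
    return dasm, rasm, cc
```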

2.6. Feature Selection

The fifty-four features extracted from the EEG signals may pose two problems. First, high correlations can be observed between features within the same category (multicollinearity, or redundancy for classification). Second, features irrelevant to classification may be included (low correlation with the target). To identify proper features for the classification tasks, we applied lasso and ridge regression for feature selection [47,48].
In this step, we fitted lasso and ridge regression models on a dataset consisting of the EEG feature values. The fifty-four features were used as independent variables, and the emotional states were used as the dependent variable for the regression models. The coefficients of the individual features were examined and sorted by magnitude. After sorting the features by their coefficients, we selected and compared the top 20 features between the lasso and ridge regression conditions. As a result, 14 common features were selected from the 54 features. The selected features are listed in Table 2.
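A minimal sketch of this selection step with scikit-learn, assuming standardized features; the regularization strengths, random data, and feature names are illustrative placeholders rather than the values used in the study.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 54))           # stand-in for the 54 EEG features
y = rng.integers(0, 4, 60).astype(float)    # stand-in emotional-state labels
feature_names = [f"feat_{i}" for i in range(54)]

X_std = StandardScaler().fit_transform(X)
lasso = Lasso(alpha=0.01).fit(X_std, y)     # illustrative alpha
ridge = Ridge(alpha=1.0).fit(X_std, y)      # illustrative alpha


def top_k(coefs, names, k=20):
    """Names of the k features with the largest absolute coefficients."""
    order = np.argsort(np.abs(coefs))[::-1][:k]
    return {names[i] for i in order}


# Features ranked in the top 20 by both models are kept
selected = top_k(lasso.coef_, feature_names) & top_k(ridge.coef_, feature_names)
print(sorted(selected))
```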

2.7. Machine Learning Classification Algorithm

To classify emotional neural states based on EEG signals, we utilized three machine learning classifiers with the 14 selected EEG features. The first classifier was the XGBoost classifier, which is based on an ensemble of multiple decision tree models. A given dataset with $n$ samples and $m$ features, $D = \{(x_i, y_i) \mid x_i \in \mathbb{R}^m, y_i \in \mathbb{R}\}$, was used to train the classifier. In our study, we applied a dataset of 60 rows and 14 features. Gradient boosting with a regularized objective function constitutes the basis of the algorithm.
$$\mathcal{L}(\phi) = \sum_i l(\hat{y}_i, y_i) + \sum_k \Omega(f_k), \quad \text{where } \Omega(f) = \gamma T + \tfrac{1}{2}\lambda \lVert \omega \rVert^2 \quad (1)$$
$$\hat{y}_i = \phi(x_i) = \sum_{k=1}^{K} f_k(x_i), \quad f_k \in \mathcal{F} \quad (2)$$
To optimize the XGBoost classifier on a dataset, we minimize the regularized objective function in (1), where $\hat{y}_i$ denotes the value predicted by the decision tree models and $f_k$ denotes an individual tree model, as defined in (2). A differentiable convex loss function $l$, which measures the difference between the predicted value $\hat{y}_i$ and the target value $y_i$, constitutes the first term in (1), and the penalization term $\Omega$ is added as the second term. In this study, we set $y_i$ as the class labels to which the emotional neural state levels were assigned (baseline, low arousal, high arousal, and social anxiety).
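A minimal training sketch with the xgboost package is shown below, using the hyperparameter values listed in Table 3; the synthetic arrays only mirror the 60 × 14 dataset shape described above and stand in for the real EEG features.

```python
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 14))   # 60 samples x 14 selected EEG features (stand-in)
y = rng.integers(0, 4, 60)          # four emotional-state labels (stand-in)

# Hyperparameters follow Table 3 ("eta" is the learning rate)
clf = XGBClassifier(learning_rate=0.3, gamma=0, max_depth=6, min_child_weight=1,
                    eval_metric="mlogloss")
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"mean CV accuracy: {scores.mean():.3f}")
```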
The second classifier was a support vector classifier (SVC) with nonlinear kernels. This classification algorithm classifies the feature space using hyperplanes to separate class labels. We applied a non-linear kernel (radial basis function kernel) to evaluate the performance under more diverse class conditions. In addition, we trained and evaluated the algorithm performance using completely participant-separated datasets to avoid overfitting when non-linear kernels were used.
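The participant-separated evaluation mentioned above can be expressed with a grouped split, as in the following sketch; using scikit-learn's GroupKFold and the synthetic group assignment below is our illustration of one way to keep a participant's data out of their own training folds, not necessarily the authors' exact procedure.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 14))          # stand-in feature matrix
y = rng.integers(0, 2, 60)                 # stand-in binary condition labels
groups = np.repeat(np.arange(15), 4)       # participant ID for each sample (stand-in)

svc = SVC(kernel="rbf", gamma="auto")      # non-linear RBF kernel, as in Table 3
scores = cross_val_score(svc, X, y, groups=groups, cv=GroupKFold(n_splits=5))
print(scores.mean())
```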
The third classifier was the logistic regression (LR) model. A maximum likelihood estimation method is used to estimate the coefficients of the regression model. Consequently, the classifier calculates a likelihood value $L(x)$, where $0 \le L(x) \le 1$; this value indicates the association between the class labels and the input vector. We considered the basic form of the logistic regression model with our EEG features and neural state classes as follows:
$$F(z) = E(Y \mid X) = \frac{1}{1 + e^{-(\alpha + \sum_i \beta_i X_i)}}$$
$$\text{where } z = \alpha + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_k X_k$$
where $Y$ indicates the emotional neural state level as the class label. In summary, the logistic regression classifier outputs the probability of each class under the various conditions.
To determine the optimal hyperparameters of the three classification algorithms, we conducted a random search. The detailed hyperparameters are presented in Table 3.
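As an illustration of such a random search, the sketch below applies scikit-learn's RandomizedSearchCV to the XGBoost model; the search ranges are assumptions for demonstration, while the selected values reported in Table 3 are those listed there.

```python
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import RandomizedSearchCV

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 14))   # stand-in features
y = rng.integers(0, 4, 60)          # stand-in labels

param_distributions = {             # illustrative search ranges (assumptions)
    "learning_rate": [0.05, 0.1, 0.3],
    "gamma": [0, 0.1, 1],
    "max_depth": [3, 6, 9],
    "min_child_weight": [1, 3, 5],
}
search = RandomizedSearchCV(XGBClassifier(eval_metric="mlogloss"),
                            param_distributions, n_iter=20, cv=5, random_state=0)
search.fit(X, y)
print(search.best_params_)
```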

2.8. Evaluation Metrics

We evaluated the classification performance of the three classifiers by comparing five evaluation metrics. The true positive (TP), true negative (TN), false positive (FP), and false negative (FN) counts were obtained from the confusion matrix. The TP and TN values are the counts of correctly classified samples, whereas the FP and FN values are the counts of incorrectly classified samples. From these four values, four metrics were computed: precision, recall, f1-score, and accuracy. Additionally, to construct the receiver operating characteristic (ROC) curve, the true positive rate (TPR) and false positive rate (FPR) were calculated, and the area under the ROC curve (AUROC) was determined from the ROC curve.
$$\mathrm{Precision} = \frac{TP}{TP + FP}$$
$$\mathrm{Recall} = \frac{TP}{TP + FN}$$
$$\mathrm{F1\text{-}score} = \frac{2 \times \mathrm{Precision} \times \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}$$
$$\mathrm{Accuracy} = \frac{TP + TN}{TP + FP + TN + FN}$$
$$\mathrm{True\ Positive\ Rate} = \frac{TP}{TP + FN}$$
$$\mathrm{False\ Positive\ Rate} = \frac{FP}{FP + TN}$$
Fivefold cross-validation was used to evaluate the machine learning algorithms in a robust setting. In each fold, 80% of the data were used for training and the remaining 20% for testing. The classification scores were averaged across the folds to estimate the performance of each classifier.
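A compact sketch of computing the five metrics with scikit-learn for a binary condition (the labels, predictions, and scores are stand-ins; the multi-class conditions would additionally require an averaging strategy such as average="macro"):

```python
from sklearn.metrics import (precision_score, recall_score, f1_score,
                             accuracy_score, roc_auc_score)

y_true  = [0, 0, 1, 1, 1, 0, 1, 0]                    # stand-in labels
y_pred  = [0, 1, 1, 1, 0, 0, 1, 0]                    # stand-in predictions
y_score = [0.2, 0.7, 0.9, 0.8, 0.4, 0.1, 0.6, 0.3]    # stand-in class-1 probabilities

print("precision:", precision_score(y_true, y_pred))
print("recall:   ", recall_score(y_true, y_pred))
print("f1-score: ", f1_score(y_true, y_pred))
print("accuracy: ", accuracy_score(y_true, y_pred))
print("AUROC:    ", roc_auc_score(y_true, y_score))
```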

2.9. Tools

All code for the machine learning classifiers and data preprocessing was written in the Python (version 3.7.1; scikit-learn, version 2.4.1) and R (version 4.0.3) programming languages.

3. Results

3.1. Classification Performance of Machine Learning Classifiers

To identify the optimal classification algorithm, we utilized three machine learning classifiers (XGBoost, SVC, and LR) for emotional neural state classification. Classification performance was evaluated using five metrics (precision, recall, f1-score, accuracy, and AUROC). In the experimental results, the XGBoost classifier showed the best classification performance under the different conditions, and the same trend was found in both the binary-class and multi-class conditions. The detailed classification performance results, including AUROC values, are listed in Table 4 and Table 5.

3.2. Importance of Features for Classification of Emotional Neural States

We compared the important features of the machine learning classifiers to find the proper feature categories for classification. Among the three classifiers (XGBoost, SVC, and LR), the feature importance of the XGBoost classifiers was selected for examination based on their classification performance. The top five important features were compared across the binary-class and multi-class conditions. Among the five feature categories (FP, DASM, RASM, CC, and FD), features from the differential asymmetry (DASM) and frequency band power (FP) categories appeared most frequently in the experimental results. The detailed important features are listed in Table 6 and Table 7.
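A brief sketch of how the top five features can be read from a fitted XGBoost model; the synthetic data and placeholder feature names below only illustrate the attribute access, not the study's actual values.

```python
import numpy as np
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 14))                  # stand-in feature matrix
y = rng.integers(0, 4, 60)                         # stand-in labels
feature_names = [f"feat_{i}" for i in range(14)]   # placeholders for the 14 features

clf = XGBClassifier(eval_metric="mlogloss").fit(X, y)
ranked = sorted(zip(feature_names, clf.feature_importances_),
                key=lambda kv: kv[1], reverse=True)
print(ranked[:5])                                  # top five important features
```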

4. Discussion

In this study, we attempted to identify emotional neural states from VR-evoked EEG signals using machine learning classification algorithms. A total of 15 participants were involved in the study, and EEG signals were collected under four conditions (a baseline and three virtual environments). To detect differences in neural variation between virtual stimuli, fifty-four features spanning five categories were extracted, and 14 features were then selected based on the coefficients of the lasso and ridge regression models. After constructing datasets from the EEG features, three classification algorithms were used to classify the emotional states. The XGBoost classifier showed the best classification performance for our study design. Furthermore, the feature importance of the XGBoost classifiers was compared to validate the usefulness of the features for classification; among the five feature categories, the DASM and FP features ranked higher than those of the other categories.

To provide supporting evidence for our research topics (i.e., VR-evoked emotional EEG signals and classification through machine learning algorithms), we reviewed related previous research. First, regarding the collection of EEG signals based on VR content, Tauscher et al. [49] compared EEG signal quality between VR content and traditional displays; they concluded that signal quality improved when the VR content and straps were fixed. Tremmel et al. [50] estimated cognitive workload in an interactive virtual reality environment using EEG signals, evaluating the feasibility of monitoring cognitive workload via EEG while participants performed a classical n-back test in VR devices. Alimardani et al. [51] assessed empathy levels in an affective virtual environment using EEG signals and examined the relationships between empathy scores and frontal alpha asymmetry. Second, in terms of analysis with machine learning algorithms, Stevens et al. [52] analyzed the relationship between EEG and distraction and engagement content using machine learning models; an artificial neural network (ANN) model was utilized to determine the influence of various contents on EEG signal values. Gross et al. [53] attempted to diagnose premature Internet addiction from EEG spectra with machine learning algorithms, using a random forest classifier for the classification of addiction. Baumgartl et al. [54] proposed a measurement system for social desirability based on EEG data, utilizing random forest models to classify social desirability levels. Based on these previous studies, we concluded that our research topic regarding emotional neural state analysis evoked by VR through machine learning algorithms was reasonable.

In addition, we designed our study based on related previous research to compare and validate the experimental results. Wang et al. [55] attempted to find detailed relationships between different emotional states and various EEG features. To stimulate participants' emotions, the researchers provided a set of movie clips covering two target emotional states (i.e., positive and negative emotions), and EEG signals were collected from six healthy participants while they watched the movie clips. Three analysis methods (power spectrum, wavelet, and nonlinear dynamical analysis) were used to extract the features.
Regarding the extracted features, feature dimensionality reduction was conducted using principal component analysis, linear discriminant analysis, and a correlation-based feature selector. Support vector machine models were applied to the binary classification task for the two emotions, and the performance of the algorithms was compared under various parameter conditions. Bazgir et al. [56] developed an emotion recognition system using EEG signals. EEG signals were measured from participants while they watched consecutive 1 min music videos. The discrete wavelet transform method was used to extract the gamma, beta, alpha, and theta frequency band features, and principal component analysis was then applied to the features. Three classification algorithms (support vector machine, k-nearest neighbor, and artificial neural network) were utilized to classify emotional states. Among the various frequency band features, the beta frequency band condition showed the best classification performance.

Similar to the previously mentioned studies, our study design comprised comparable steps. First, the EEG dataset was collected from 15 participants under VR content stimuli. Second, fifty-four EEG features were extracted to evaluate the effect of each feature. Third, we applied lasso and ridge regression models to select the proper features from the fifty-four features. Fourth, XGBoost, SVC, and LR models were used to classify the emotional neural variations. Finally, we compared the classification performances using five evaluation metrics.

In our experimental procedure, we compared four levels of emotional neural states in the virtual environment. To set a baseline emotional state, we measured the EEG signals without any stimulus materials. In the low- and high-arousal conditions, EEG signals were collected during virtual environmental stimuli without interaction. Furthermore, neural states under social anxiety were investigated using interactive virtual material designed to evoke a relatively more intense emotional state than the other conditions. To determine the optimal algorithms for our research topic, we validated the classification performances under various comparison conditions. The XGBoost classifier showed the best classification performance among the three algorithms across all experimental conditions. To examine the influence of the features, we compared the feature importance results of XGBoost under all conditions. Differential asymmetry and frequency band power features were frequently ranked high (i.e., in the top five by importance). This trend is consistent with previous studies: Jenke et al. [57] suggested the utility of differential asymmetry for emotion classification from EEG, and Jatupaiboon et al. [58] and Li et al. [59] validated the influence of frequency band power values as features for emotion classification.

5. Conclusions

We collected and analyzed EEG signals in virtual environments to identify emotional states based on neural variations. To compare emotional influences across the virtual content, we set four levels of emotional arousal (baseline, low arousal, high arousal, and social anxiety) in our experimental materials. Three machine learning classification algorithms (XGBoost classifier, support vector classifier, and logistic regression) were applied to classify the neural fluctuations evoked by the VR stimuli. Additionally, 54 features across five feature categories were used to compare their usefulness for classification. Among the three classifiers, the XGBoost algorithm showed the best classification performance under all experimental conditions. Furthermore, we confirmed the effectiveness of features from the differential asymmetry and frequency band power categories for classifying emotional states in our study design.
The first strength of this study is the application of machine learning classification algorithms to identify emotional neural states evoked by VR content. Second, the usefulness of the features was investigated through the feature importance of the machine learning algorithms. Third, we collected EEG signals during VR content to examine the influence of the virtual environment on emotion. Finally, our emotional state classification framework can be utilized for various objectives. Our study also has some limitations. First, we extracted and compared features from only five feature categories; other features or indices may be valuable for classification, although we selected 54 features widely used in previous studies. Second, various methodologies, including deep learning algorithms, could be applied to our research topic; in this study, we utilized machine learning models so that the importance of the features in the models could be examined. Third, additional validation with more participants is required in future studies to generalize our framework and results.

Author Contributions

Conceptualization: S.H. and D.J.; methodology: S.H.; validation: S.H.; formal analysis: D.J. and J.C.; investigation: D.J., J.K., and S.C.; writing—original draft preparation: D.J. and J.C.; writing—review and editing: D.J. and J.C.; visualization: D.J. and J.K.; supervision: S.H.; project administration: S.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Brain Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science and ICT (2017M3C7A1029485) and partially supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (NRF-2019R1A2C1007399).

Institutional Review Board Statement

The study was conducted in accordance with the guidelines of the Declaration of Helsinki and approved by the institutional review board of Yonsei University (7001988-202104-HR-659-06).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Yoo, S.S.; Fairneny, T.; Chen, N.K.; Choo, S.E.; Panych, L.P.; Park, H.; Jolesz, F.A. Brain-computer interface using fMRI: Spatial navigation by thoughts. Neuroreport 2004, 15, 1591–1595. [Google Scholar] [CrossRef] [PubMed]
  2. Guger, C.; Schlogl, A.; Neuper, C.; Walterspacher, D.; Strein, T.; Pfurtscheller, G. Rapid prototyping of an EEG-based brain-computer interface (BCI). IEEE Trans. Neural Syst. Rehabil. Eng. 2001, 9, 49–58. [Google Scholar] [CrossRef] [PubMed]
  3. Fabiani, G.E.; McFarland, D.J.; Wolpaw, J.R.; Pfurtscheller, G. Conversion of EEG activity into cursor movement by a brain-computer interface (BCI). IEEE Trans. Neural Syst. Rehabil. Eng. 2004, 12, 331–338. [Google Scholar] [CrossRef]
  4. Kevric, J.; Subasi, A. Comparison of signal decomposition methods in classification of EEG signals for motor-imagery BCI system. Biomed. Signal Process. Control 2017, 31, 398–406. [Google Scholar] [CrossRef]
  5. Rota, G.; Handjaras, G.; Sitaram, R.; Birbaumer, N.; Dogil, G. Reorganization of functional and effective connectivity during real-time fMRI-BCI modulation of prosody processing. Brain Lang. 2011, 117, 123–132. [Google Scholar] [CrossRef] [PubMed]
  6. Gu, X.; Cao, Z.; Jolfaei, A.; Xu, P.; Wu, D.; Jung, T.P.; Lin, C.T. EEG-based brain-computer interfaces (BCIs): A survey of recent studies on signal sensing technologies and computational intelligence approaches and their applications. IEEE/ACM Trans. Comput. Biol. Bioinform. 2021, 18, 1645–1666. [Google Scholar] [CrossRef]
  7. Duan, R.N.; Zhu, J.Y.; Lu, B.L. Differential entropy feature for EEG-based emotion classification. In Proceedings of the 2013 6th International IEEE/EMBS Conference on Neural Engineering (NER), San Diego, CA, USA, 6–8 November 2013; pp. 81–84. [Google Scholar]
  8. Secerbegovic, A.; Ibric, S.; Nisic, J.; Suljanovic, N.; Mujcic, A. Mental Workload vs. Stress Differentiation Using Single-Channel EEG. In CMBEBIH 2017: Proceedings of the International Conference on Medical and Biological Engineering 2017, Sarajevo, Bosnia and Herzegovina, 16–18 March 2017; Springer: Singapore; pp. 511–515.
  9. Wen, D.; Yuan, J.; Zhou, Y.; Xu, J.; Song, H.; Liu, Y.; Jung, T.P. The EEG Signal Analysis for Spatial Cognitive Ability Evaluation Based on Multivariate Permutation Conditional Mutual Information-Multi-Spectral Image. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 2113–2122. [Google Scholar] [CrossRef]
  10. Kim, T.L.; Kim, K.; Choi, C.; Lee, J.Y.; Shin, J.H. FOPR test: A virtual reality-based technique to assess field of perception and field of regard in hemispatial neglect. J. Neuroeng. Rehabil. 2021, 18, 39. [Google Scholar] [CrossRef]
  11. Stevens, E.S.; Bourassa, K.J.; Norr, A.M.; Reger, G.M. Posttraumatic Stress Disorder Symptom Cluster Structure in Prolonged Exposure Therapy and Virtual Reality Exposure. J. Trauma. Stress 2021, 34, 287–297. [Google Scholar] [CrossRef]
  12. Torrico, D.D.; Sharma, C.; Dong, W.; Fuentes, S.; Viejo, C.G.; Dunshea, F.R. Virtual reality environments on the sensory acceptability and emotional responses of no-and full-sugar chocolate. LWT 2021, 137, 110383. [Google Scholar] [CrossRef]
  13. Yu, M.; Xiao, S.; Hua, M.; Wang, H.; Chen, X.; Tian, F.; Li, Y. EEG-based emotion recognition in an immersive virtual reality environment: From local activity to brain network features. Biomed. Signal Process. Control 2022, 72, 103349. [Google Scholar] [CrossRef]
  14. Tamburin, S.; Dal Lago, D.; Armani, F.; Turatti, M.; Saccà, R.; Campagnari, S.; Chiamulera, C. Smoking-related cue reactivity in a virtual reality setting: Association between craving and EEG measures. Psychopharmacology 2021, 238, 1363–1371. [Google Scholar] [CrossRef] [PubMed]
  15. Li, J.; Kim, J.E. The Effect of Task Complexity on Time Estimation in the Virtual Reality Environment: An EEG Study. Appl. Sci. 2021, 11, 9779. [Google Scholar] [CrossRef]
  16. Kisker, J.; Lange, L.; Flinkenflügel, K.; Kaup, M.; Labersweiler, N.; Tetenborg, F.; Schöne, B. Authentic Fear Responses in Virtual Reality: A Mobile EEG Study on Affective, Behavioral and Electrophysiological Correlates of Fear. Front. Virtual Real. 2021, 2, 716318. [Google Scholar] [CrossRef]
  17. Qureshi, M.B.; Afzaal, M.; Qureshi, M.S.; Fayaz, M. Machine learning-based EEG signals classification model for epileptic seizure detection. Multimed. Tools Appl. 2021, 80, 17849–17877. [Google Scholar]
  18. Geraedts, V.J.; Koch, M.; Contarino, M.F.; Middelkoop, H.A.M.; Wang, H.; van Hilten, J.J.; Tannemaat, M.R. Machine learning for automated EEG-based biomarkers of cognitive impairment during Deep Brain Stimulation screening in patients with Parkinson’s Disease. Clin. Neurophysiol. 2021, 132, 1041–1048. [Google Scholar] [CrossRef]
  19. 360 Degree Interview Video for VR Device, Youtube Video, 3 min 50 s, SoohyunChoi. Available online: https://www.youtube.com/watch?v=4NCdEzKfc7A (accessed on 6 October 2020).
  20. [360 Degree Video Series] Beach Sunrise Video for VR, Youtube Video, 29 Minutes 16 Seconds, HotgoraeTV. Available online: https://www.youtube.com/watch?v=XoERcJsk4nQ (accessed on 3 February 2021).
  21. 3D 360 Degress VR Skydiving Experience with the Vuze Camera (4K), Youtube Video, 4 Minutes 25 Seconds, vuze.camera. Available online: https://www.youtube.com/watch?v=rTM8vXtdIUA (accessed on 17 September 2017).
  22. Sterlini, G.L.; Bryant, R.A. Hyperarousal and dissociation: A study of novice skydivers. Behav. Res. Ther. 2002, 40, 431–437. [Google Scholar] [CrossRef]
  23. Kwon, J.H.; Powell, J.; Chalmers, A. How level of realism influences anxiety in virtual reality environments for a job interview. Int. J. Hum. Comput. Stud. 2013, 71, 978–987. [Google Scholar] [CrossRef]
  24. Owens, M.E.; Beidel, D.C. Can Virtual Reality Effectively Elicit Distress Associated with Social Anxiety Disorder? J. Psychopathol. Behav. Assess. 2015, 37, 296–305. [Google Scholar] [CrossRef]
  25. Price, I.R.; Bundesen, C. Emotional changes in skydivers in relation to experience. Personal. Individ. Differ. 2005, 38, 1203–1211. [Google Scholar] [CrossRef]
  26. McCarthy, J.; Goffin, R. Measuring Job Interview Anxiety: Beyond Weak Knees and Sweaty Palms. Pers. Psychol. 2004, 57, 607–637. [Google Scholar] [CrossRef]
  27. Jasper, H.H. The ten-twenty electrode system of the international federation. Electroencephalogr. Clin. Neurophysiol. 1958, 10, 370–375. [Google Scholar]
  28. Bird, J.J.; Manso, L.J.; Ribeiro, E.P.; Ekárt, A.; Faria, D.R. A Study on Mental State Classification using EEG-based Brain-Machine Interface. In Proceedings of the 2018 International Conference on Intelligent Systems (IS), Funchal, Portugal, 25–27 September 2018; pp. 795–800. [Google Scholar]
  29. Delorme, A.; Makeig, S. EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J. Neurosci. Methods 2004, 134, 9–21. [Google Scholar] [CrossRef] [Green Version]
  30. Alomari, M.H.; Samaha, A.; AlKamha, K. Automated Classification of L/R Hand Movement EEG Signals using Advanced Feature Extraction and Machine Learning. Int. J. Adv. Comput. Sci. Appl. 2013, 4, 2013. [Google Scholar]
  31. Rodrigues, J.; Weiß, M.; Hewig, J.; Allen, J.J.B. EPOS: EEG Processing Open-Source Scripts. Front. Neurosci. 2021, 15, 663. [Google Scholar] [CrossRef] [PubMed]
  32. Hassanpour, H.; Shahiri, M. Adaptive Segmentation Using Wavelet Transform. In Proceedings of the 2007 International Conference on Electrical Engineering, Lahore, Pakistan, 11–12 April 2007. [Google Scholar] [CrossRef]
  33. Chen, A.C.N.; Feng, W.; Zhao, H.; Yin, Y.; Wang, P. EEG default mode network in the human brain: Spectral regional field powers. NeuroImage 2008, 41, 561–574. [Google Scholar] [CrossRef]
  34. Jaiswal, S.; Tsai, S.-Y.; Juan, C.-H.; Muggleton, N.G.; Liang, W.-K. Low delta and high alpha power are associated with better conflict control and working memory in high mindfulness, low anxiety individuals. Soc. Cogn. Affect. Neurosci. 2019, 14, 645–655. [Google Scholar] [CrossRef]
  35. Nakashima, K.; Sato, H. The Effects of Various Mental Tasks on Appearance of Frontal Midline Theta Activity in EEG. J. Hum. Ergol. 1992, 21, 201–206. [Google Scholar]
  36. Xing, M.; Tadayonnejad, R.; MacNamara, A.; Ajilore, O.; DiGangi, J.; Phan, K.L.; Klumpp, H. Resting-state theta band connectivity and graph analysis in generalized social anxiety disorder. NeuroImage Clin. 2016, 13, 24–32. [Google Scholar] [CrossRef] [Green Version]
  37. Barry, R.J.; Clarke, A.R.; McCarthy, R.; Selikowitz, M.; Rushby, J.A.; Ploskova, E. EEG differences in children as a function of resting-state arousal level. Clin. Neurophysiol. 2004, 115, 402–408. [Google Scholar] [CrossRef]
  38. Díaz, H.M.; Cid, F.M.; Otárola, J.; Rojas, R.; Alarcón, O.; Cañete, L. EEG Beta band frequency domain evaluation for assessing stress and anxiety in resting, eyes closed, basal conditions. Procedia Comput. Sci. 2009, 162, 974–981. [Google Scholar] [CrossRef]
  39. Engel, A.K.; Fries, P.; Singer, W. Dynamic predictions: Oscillations and synchrony in top–down processing. Nat. Rev. Neurosci. 2001, 2, 704–716. [Google Scholar] [CrossRef] [PubMed]
  40. Benasich, A.A.; Gou, Z.; Choudhury, N.; Harris, K.D. Early cognitive and language skills are linked to resting frontal gamma power across the first 3 years. Behav. Brain Res. 2008, 195, 215–222. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  41. Arsalan, A.; Majid, M.; Butt, A.R.; Anwar, S.M. Classification of Perceived Mental Stress Using a Commercially Available EEG Headband. IEEE J. Biomed. Health Inform. 2019, 23, 2257–2264. [Google Scholar] [CrossRef]
  42. Loo, C.K.; Samraj, A.; Lee, G.C. Evaluation of Methods for Estimating Fractal Dimension in Motor Imagery-Based Brain Computer Interface. Discrete Dyn. Nat. Soc. 2011, 2011, 724697. [Google Scholar] [CrossRef] [Green Version]
  43. Hou, X.; Liu, Y.; Sourina, O.; Tan, Y.R.E.; Wang, L.; Mueller-Wittig, W. EEG Based Stress Monitoring. In Proceedings of the 2015 IEEE International Conference on Systems, Man, and Cybernetics, Hong Kong, China, 9–12 October 2015; pp. 3110–3115. [Google Scholar]
  44. Wijayanto, I.; Rizal, A.; Humairani, A. Seizure Detection Based on EEG Signals Using Katz Fractal and SVM Classifiers. In Proceedings of the 2019 5th International Conference on Science in Information Technology (ICSITech), Yogyakarta, Indonesia, 23–24 October 2019; pp. 78–82. [Google Scholar]
  45. Akar, S.A.; Kara, S.; Latifoğlu, F.; Bilgiç, V. Investigation of the noise effect on fractal dimension of EEG in schizophrenia patients using wavelet and SSA-based approaches. Biomed. Signal Process. Control 2015, 18, 42–48. [Google Scholar] [CrossRef]
  46. Katz, M.J. Fractals and the analysis of waveforms. Comput. Biol. Med. 1988, 18, 145–156. [Google Scholar] [CrossRef]
  47. Tibshirani, R. Regression shrinkage and selection via the lasso. J. R. Stat. Soc. Ser. B 1996, 58, 267–288. [Google Scholar] [CrossRef]
  48. Marquardt, D.W.; Snee, R.D. Ridge regression in practice. Am. Stat. 1975, 29, 3–20. [Google Scholar]
  49. Tauscher, J.P.; Schottky, F.W.; Grogorick, S.; Bittner, P.M.; Mustafa, M.; Magnor, M. Immersive EEG: Evaluating Electroencephalography in Virtual Reality. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1794–1800. [Google Scholar]
  50. Tremmel, C.; Herff, C.; Sato, T.; Rechowicz, K.; Yamani, Y.; Krusienski, D.J. Estimating cognitive workload in an interactive virtual reality environment using EEG. Front. Hum. Neurosci. 2019, 13, 401. [Google Scholar] [CrossRef]
  51. Alimardani, M.; Hermans, A.; Tinga, A.M. Assessment of empathy in an affective VR environment using EEG signals. arXiv 2020, arXiv:2003.10886. Available online: https://arxiv.org/abs/2003.10886 (accessed on 14 December 2021).
  52. Stevens, R.; Galloway, T.; Berka, C. Integrating EEG models of cognitive load with machine learning models of scientific problem solving. Augment. Cogn. Past Present Future 2016, 2, 55–65. [Google Scholar]
  53. Gross, J.; Baumgartl, H.; Buettner, R. A Novel Machine Learning Approach for High-Performance Diagnosis of Premature Internet Addiction Using the Unfolded EEG Spectra. In Proceedings of the 26th Americas Conference on Information Systems, Online, 15–17 August 2020. [Google Scholar]
  54. Baumgartl, H.; Roessler, P.; Sauter, D.; Buettner, R. Measuring Social Desirability Using a Novel Machine Learning Approach Based on EEG Data. In Proceedings of the Pacific Asia Conference on Information Systems, Dubai, United Arab Emirates, 20–24 June 2020; p. 100. [Google Scholar]
  55. Wang, X.W.; Nie, D.; Lu, B.L. Emotional state classification from EEG data using machine learning approach. Neurocomputing 2014, 129, 94–106. [Google Scholar] [CrossRef]
  56. Bazgir, O.; Mohammadi, Z.; Habibi, S.A.H. Emotion recognition with machine learning using EEG signals. In Proceedings of the 2018 25th National and 3rd International Iranian Conference on Biomedical Engineering (ICBME), Qom, Iran, 29–30 November 2018; pp. 1–5. [Google Scholar]
  57. Jenke, R.; Peer, A.; Buss, M. Feature extraction and selection for emotion recognition from EEG. IEEE Trans. Affect. Comput. 2014, 5, 327–339. [Google Scholar] [CrossRef]
  58. Jatupaiboon, N.; Pan-ngum, S.; Israsena, P. Emotion classification using minimal EEG channels and frequency bands. In Proceedings of the 2013 10th International Joint Conference on Computer Science and Software Engineering (JCSSE), Khon Kaen, Thailand, 29–31 May 2013; pp. 21–24. [Google Scholar]
  59. Li, M.; Xu, H.; Liu, X.; Lu, S. Emotion recognition from multichannel EEG signals using K-nearest neighbor classification. Technol. Health Care 2018, 26, 509–519. [Google Scholar] [CrossRef]
Figure 1. Overview of the research scheme.
Figure 2. VR content to induce the state.
Figure 3. Electrode and channel information of EEG devices used in this study. (a) Muse2 headset band. (b) EEG montage of Muse 2 headset band based on the International 10–20 EEG electrode placement standard [27,28].
Figure 4. Block diagram of EEG signal collection with VR contents.
Table 1. Extracted features from EEG signals.

Feature Category | Feature | No. of Features
Frequency band power (FP) | Delta, theta, alpha, beta, and gamma power | 20 features (4 electrodes × 5 band powers)
Differential asymmetry (DASM) | Delta, theta, alpha, beta, and gamma power | 10 features (2 electrode pairs × 5 band powers)
Rational asymmetry (RASM) | Delta, theta, alpha, beta, and gamma power | 10 features (2 electrode pairs × 5 band powers)
Correlation coefficient (CC) | Delta, theta, alpha, beta, and gamma power | 10 features (2 electrode pairs × 5 band powers)
Fractal dimension (FD) | AF7, AF8, TP9, TP10 | 4 features (4 electrodes)
Table 2. Coefficient values of each feature from lasso and ridge regression models.

Rank | Ridge Regression Coefficient | Feature | Lasso Regression Coefficient | Feature | No. | Selected Features (Common Feature)
1 | 1.6540 | CC_Gamma_AF 1 | 1.6025 | CC_Gamma_AF | 1 | CC_Gamma_AF
2 | 1.5410 | CC_Gamma_TP | 1.1770 | CC_Gamma_TP | 2 | CC_Gamma_TP
3 | 1.4678 | DASM_Beta_TP 2 | 0.7070 | FP_Alpha_AF7 | 3 | DASM_Beta_TP
4 | 1.2270 | FP_Delta_TP9 3 | 0.6682 | DASM_Beta_TP | 4 | DASM_Delta_TP
5 | 1.2219 | DASM_Delta_TP | 0.5537 | CC_Beta_AF | 5 | FP_Beta_TP9
6 | 1.2061 | FP_Beta_TP9 | 0.5525 | FP_Beta_TP9 | 6 | FP_Alpha_AF7
7 | 1.1511 | FP_Alpha_AF7 | 0.5366 | FP_Beta_AF8 | 7 | CC_Beta_AF
8 | 1.1342 | RASM_Delta_AF 4 | 0.2972 | CC_Delta_AF | 8 | FP_Beta_AF8
9 | 1.0362 | CC_Beta_AF | 0.2726 | DASM_Alpha_AF | 9 | FP_Gamma_TP9
10 | 1.0280 | DASM_Gamma_TP | 0.2453 | FP_Alpha_TP10 | 10 | CC_Delta_AF
11 | 0.8683 | FP_Beta_AF8 | 0.2225 | FP_Gamma_AF8 | 11 | DASM_Alpha_AF
12 | 0.8216 | FP_Gamma_TP9 | 0.1509 | DASM_Delta_TP | 12 | FP_Alpha_TP10
13 | 0.5619 | RASM_Gamma_TP | 0.1448 | FP_Theta_TP9 | 13 | DASM_Theta_AF
14 | 0.5369 | CC_Delta_AF | 0.1327 | FP_Theta_AF7 | 14 | FP_Theta_AF7
15 | 0.4626 | DASM_Alpha_AF | 0.1064 | FP_Delta_AF8 | |
16 | 0.4328 | RASM_Beta_TP | 0.0798 | DASM_Gamma_TP | |
17 | 0.4295 | RASM_Theta_AF | 0.0751 | FP_Delta_AF7 | |
18 | 0.3106 | FP_Alpha_TP10 | 0.0561 | DASM_Theta_AF | |
19 | 0.2201 | DASM_Theta_AF | 0.0329 | DASM_Beta_AF | |
20 | 0.1270 | FP_Theta_AF7 | 0.0081 | FP_Gamma_TP9 | |

1 CC, correlation coefficient; 2 DASM, differential asymmetry; 3 FP, frequency band power; 4 RASM, rational asymmetry.
Table 3. Hyperparameters of three machine learning classifiers.

Classifier | Hyperparameter | Argument
XGBoost classifier | Eta | 0.3
XGBoost classifier | Gamma | 0
XGBoost classifier | max_depth | 6
XGBoost classifier | min_child_weight | 1
Support vector classifier | Kernel | rbf
Support vector classifier | Gamma | auto
Logistic regression | Penalty | L2
Logistic regression | Solver | newton-cg
Table 4. Classification performance results for classifiers in binary-class condition.

Condition | Classifier | Precision | Recall | F1-Score | Accuracy | AUROC 1
Baseline vs. low arousal | XGBoost | 0.846 | 0.846 | 0.838 | 0.849 | 0.925
Baseline vs. low arousal | SVC 2 | 0.795 | 0.829 | 0.764 | 0.737 | 0.789
Baseline vs. low arousal | LR 3 | 0.533 | 0.563 | 0.528 | 0.522 | 0.583
Baseline vs. high arousal | XGBoost | 0.851 | 0.855 | 0.858 | 0.838 | 0.860
Baseline vs. high arousal | SVC | 0.769 | 0.747 | 0.748 | 0.722 | 0.686
Baseline vs. high arousal | LR | 0.651 | 0.673 | 0.663 | 0.632 | 0.669
Baseline vs. social anxiety | XGBoost | 0.929 | 0.914 | 0.915 | 0.929 | 0.941
Baseline vs. social anxiety | SVC | 0.843 | 0.833 | 0.860 | 0.830 | 0.856
Baseline vs. social anxiety | LR | 0.721 | 0.728 | 0.733 | 0.712 | 0.813
Low arousal vs. high arousal | XGBoost | 0.853 | 0.858 | 0.880 | 0.843 | 0.858
Low arousal vs. high arousal | SVC | 0.757 | 0.751 | 0.752 | 0.750 | 0.814
Low arousal vs. high arousal | LR | 0.740 | 0.696 | 0.717 | 0.704 | 0.778
Low arousal vs. social anxiety | XGBoost | 0.865 | 0.840 | 0.852 | 0.840 | 0.857
Low arousal vs. social anxiety | SVC | 0.777 | 0.788 | 0.743 | 0.739 | 0.814
Low arousal vs. social anxiety | LR | 0.514 | 0.555 | 0.573 | 0.558 | 0.474
High arousal vs. social anxiety | XGBoost | 0.903 | 0.921 | 0.907 | 0.892 | 0.936
High arousal vs. social anxiety | SVC | 0.839 | 0.854 | 0.826 | 0.853 | 0.855
High arousal vs. social anxiety | LR | 0.787 | 0.743 | 0.754 | 0.757 | 0.813

1 AUROC: area under the ROC curve, 2 SVC: support vector classifier, 3 LR: logistic regression.
Table 5. Classification performance results for classifiers in multi-class condition.

Condition | Classifier | Precision | Recall | F1-Score | Accuracy | AUROC 1
Baseline vs. low arousal vs. high arousal | XGBoost | 0.912 | 0.911 | 0.913 | 0.938 | 0.938
Baseline vs. low arousal vs. high arousal | SVC 2 | 0.677 | 0.670 | 0.671 | 0.679 | 0.631
Baseline vs. low arousal vs. high arousal | LR 3 | 0.579 | 0.572 | 0.527 | 0.531 | 0.578
Baseline vs. low arousal vs. social anxiety | XGBoost | 0.847 | 0.844 | 0.839 | 0.860 | 0.856
Baseline vs. low arousal vs. social anxiety | SVC | 0.707 | 0.758 | 0.754 | 0.745 | 0.767
Baseline vs. low arousal vs. social anxiety | LR | 0.537 | 0.532 | 0.564 | 0.547 | 0.534
Low arousal vs. high arousal vs. social anxiety | XGBoost | 0.902 | 0.911 | 0.911 | 0.938 | 0.905
Low arousal vs. high arousal vs. social anxiety | SVC | 0.745 | 0.726 | 0.755 | 0.734 | 0.764
Low arousal vs. high arousal vs. social anxiety | LR | 0.660 | 0.664 | 0.651 | 0.664 | 0.725
Baseline vs. low 4 vs. high arousal vs. social anxiety | XGBoost | 0.843 | 0.874 | 0.846 | 0.845 | 0.858
Baseline vs. low 4 vs. high arousal vs. social anxiety | SVC | 0.730 | 0.752 | 0.752 | 0.742 | 0.683
Baseline vs. low 4 vs. high arousal vs. social anxiety | LR | 0.629 | 0.619 | 0.533 | 0.517 | 0.556

1 AUROC, area under the ROC curve; 2 SVC, support vector classifier; 3 LR, logistic regression; 4 low, low-arousal condition.
Table 6. Top 5 important features from XGBoost classifier in binary-class conditions.

Rank | Baseline vs. Low 1 | Baseline vs. High 2 | Baseline vs. Social 3 | Low vs. High | Low vs. Social | High vs. Social
1 | FP_beta_TP9 | CC_delta_AF | DASM_delta_TP | DASM_delta_TP | FP_theta_AF7 | DASM_delta_TP
2 | DASM_alpha_AF | FP_alpha_TP10 | FP_theta_AF7 | CC_gamma_TP | DASM_delta_TP | FP_theta_AF7
3 | CC_delta_AF | DASM_alpha_AF | DASM_theta_AF | FP_alpha_TP10 | CC_gamma_TP | FP_beta_AF8
4 | FP_alpha_AF7 | CC_beta_AF | DASM_beta_TP | DASM_theta_AF | FP_beta_AF8 | DASM_alpha_AF
5 | DASM_beta_TP | FP_theta_AF7 | FP_alpha_AF7 | CC_beta_AF | CC_delta_AF | FP_beta_TP9

1 low: low-arousal condition, 2 high: high-arousal condition, 3 social: social anxiety condition.
Table 7. Top 5 important features from XGBoost classifier in multi-class conditions.

Rank | Baseline vs. Low 1 vs. High 2 | Baseline vs. Low vs. Social 3 | Low vs. High vs. Social | Baseline vs. Low vs. High vs. Social
1 | DASM_theta_AF | FP_theta_AF7 | DASM_delta_TP | FP_theta_AF7
2 | CC_delta_AF | DASM_alpha_AF | FP_theta_AF7 | DASM_delta_TP
3 | DASM_alpha_AF | FP_alpha_TP10 | DASM_alpha_AF | CC_gamma_TP
4 | DASM_delta_TP | FP_beta_AF8 | CC_beta_AF | FP_alpha_TP10
5 | DASM_beta_TP | FP_beta_TP9 | DASM_theta_AF | CC_delta_AF

1 low: low-arousal condition, 2 high: high-arousal condition, 3 social: social anxiety condition.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
