
Entropy and Electroencephalography II

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Entropy and Biology".

Deadline for manuscript submissions: closed (30 June 2017) | Viewed by 108986

Special Issue Editor


Dr. Osvaldo A. Rosso
Guest Editor
Instituto de Física, Universidade Federal de Alagoas, Maceió 57072-970, Alagoas, Brazil
Interests: time-series analysis; information theory; time–frequency transform; wavelet transform; entropy and complexity; non-linear dynamics and chaos; complex networks; medical and biological applications

Special Issue Information

Dear Colleagues,

Synchronous neuronal discharges create rhythmic potential fluctuations that can be recorded from the scalp through electroencephalography. The electroencephalogram (EEG) can be roughly defined as the mean electrical activity of the brain measured at different sites of the head. An EEG reflects characteristics of the brain activity itself and also yields clues about the underlying neural dynamics. Information processing in the brain involves neurons communicating with each other and results in dynamical changes in their electrical activity. The dynamical changes relevant to information processing are reflected in time, in frequency, and across different brain locations. Studying them therefore requires methods capable of describing qualitative and quantitative signal variations in time, frequency, and spatial localization.

The traditional way of analyzing brain electrical activity, on the basis of electroencephalography (EEG) records, relies mainly on visual inspection and years of training. Although such analysis is quite useful, its subjective nature precludes a systematic protocol.

Over the last few years, complex network theory has gained wider applicability, as methods for transforming time series into networks have been proposed and successfully tested, expanding the ways in which we analyze and characterize time series. In addition, quantifiers based on Information Theory, such as entropy measures and related metrics, have emerged as particularly appropriate complexity measures for studying time series from biological systems (such as the brain). The reasons for this increasing success are manifold.

First, biological systems are typically characterized by complex dynamics. Even at rest, their dynamics have rich temporal structure. On the one hand, spontaneous brain activity encompasses a set of dynamically switching states that are continuously re-edited across the cortex in a non-random way. On the other hand, various pathologies are associated with the appearance of highly stereotyped patterns of activity; epileptic seizures, for instance, are typically characterized by ordered sequences of symptoms. Entropy-based quantifiers seem particularly well equipped to capture such structures (i.e., stereotyped patterns) in both healthy and pathological states.

Second, although a wealth of linear (and, more recently, nonlinear) methods for quantifying these structures from time series has been devised over the last few decades, most of them make restrictive hypotheses about the underlying dynamics and are vulnerable to even low levels of noise. Even mostly deterministic biological time series typically contain a certain degree of randomness (e.g., dynamical and observational noise). Analyzing signals from such systems therefore requires methods that are model-free and robust. Contrary to most nonlinear measures, some entropy measures and derived metrics can be calculated for arbitrary real-world time series, are rather robust with respect to noise sources and artifacts, and can be used to extract information from simultaneously recorded data (causality, information transfer, synchrony, etc.).

Finally, real-time applications for clinical purposes require computationally parsimonious algorithms that provide reliable results for relatively short and noisy time series. Most existing methods require long, stationary, and noiseless data. In contrast, quantifiers based on Information Theory, such as entropy measures, can be extremely fast and robust, and seem particularly advantageous when datasets are huge and there is no time for preprocessing and fine-tuning parameters. These quantifiers can be applied to one-dimensional time series as well as adapted for complex networks.
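
As a concrete illustration of this class of quantifier, the sketch below computes a normalized permutation entropy (the Shannon entropy of the ordinal-pattern distribution) of a one-dimensional time series. The embedding dimension and delay are illustrative defaults, not values prescribed by this Special Issue.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=4, delay=1):
    """Normalized permutation entropy of a 1-D time series.

    Ordinal patterns of length `order` are extracted with the given delay,
    their relative frequencies are estimated, and the Shannon entropy of that
    distribution is normalized by log(order!).
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    # Build the embedding matrix: each row is one delay vector.
    emb = np.array([x[i:i + (order - 1) * delay + 1:delay] for i in range(n)])
    # Rank each delay vector to obtain its ordinal pattern.
    ranks = np.argsort(np.argsort(emb, axis=1), axis=1)
    # Count how often each distinct pattern occurs.
    _, counts = np.unique(ranks, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / np.log(factorial(order))

# Example: white noise is close to 1, a regular sine is clearly lower.
rng = np.random.default_rng(0)
print(permutation_entropy(rng.normal(size=2000)))
print(permutation_entropy(np.sin(0.1 * np.arange(2000))))
```
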

For this second Special Issue on "Entropy and EEG", we welcome submissions related to time-series analysis using entropy quantifiers and related measures to study brain (electrical) dynamics recorded under normal and special conditions, such as sleep or states induced by anesthesia or other drugs. We also welcome studies concerning major abnormalities (pathological states) such as epileptic seizures and disorders such as dementia, schizophrenia, Alzheimer's disease, and Parkinson's disease, as well as cognitive neuroscience and brain-computer interfaces. We envisage contributions that aim to clarify the characteristics of brain dynamics using time series recorded with electroencephalographic (EEG) techniques. In addition, we hope to receive original papers illustrating the wide variety of applications of entropic methods relevant to EEG classification, the relation between EEG and local field potentials (LFP), determinism detection, detection of dynamical changes, prediction, and spatiotemporal dynamics.

Dr. Osvaldo A. Rosso
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • brain dynamics
  • EEG and LFP time series
  • wavelet analysis
  • permutation entropy
  • permutation statistical complexity
  • approximate entropy
  • sample entropy
  • complex networks
  • synchronization
  • sleep
  • epilepsy
  • schizophrenia, Alzheimer's and Parkinson's disease

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies is available on the MDPI website.

Published Papers (12 papers)


Research


Article
The Potential Application of Multiscale Entropy Analysis of Electroencephalography in Children with Neurological and Neuropsychiatric Disorders
by Yen-Ju Chu, Chi-Feng Chang, Jiann-Shing Shieh and Wang-Tso Lee
Entropy 2017, 19(8), 428; https://doi.org/10.3390/e19080428 - 21 Aug 2017
Cited by 17 | Viewed by 5211
Abstract
Electroencephalography (EEG) is frequently used in functional neurological assessment of children with neurological and neuropsychiatric disorders. Multiscale entropy (MSE) can reveal complexity in both short and long time scales and is more feasible in the analysis of EEG. Entropy-based estimation of EEG complexity is a powerful tool in investigating the underlying disturbances of neural networks of the brain. Most neurological and neuropsychiatric disorders in childhood affect the early stage of brain development. The analysis of EEG complexity may show the influences of different neurological and neuropsychiatric disorders on different regions of the brain during development. This article aims to give a brief summary of current concepts of MSE analysis in pediatric neurological and neuropsychiatric disorders. Studies utilizing MSE or its modifications for investigating neurological and neuropsychiatric disorders in children were reviewed. Abnormal EEG complexity was shown in a variety of childhood neurological and neuropsychiatric diseases, including autism, attention deficit/hyperactivity disorder, Tourette syndrome, and epilepsy in infancy and childhood. MSE has been shown to be a powerful method for analyzing the non-linear anomaly of EEG in childhood neurological diseases. Further studies are needed to show its clinical implications on diagnosis, treatment, and outcome prediction. Full article
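
For readers unfamiliar with the method, a minimal sketch of the multiscale entropy idea follows: the signal is coarse-grained at successive scales and the sample entropy of each coarse-grained series is computed. The parameter choices (m = 2, tolerance 0.15 times the SD of the original signal) are common conventions, not necessarily those used in the reviewed studies.

```python
import numpy as np

def sample_entropy(x, m=2, tol=None):
    """Sample entropy; `tol` defaults to 0.15 times the SD of the input."""
    x = np.asarray(x, dtype=float)
    if tol is None:
        tol = 0.15 * np.std(x)

    def count_matches(dim):
        # Embed the series in `dim` dimensions and count template pairs whose
        # Chebyshev distance is within the tolerance (self-matches excluded).
        emb = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return (np.sum(dist <= tol) - len(emb)) / 2.0

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=10, m=2, r=0.15):
    """Coarse-grain the signal at each scale (non-overlapping averages) and
    compute the sample entropy of each coarse-grained series, with the
    tolerance fixed from the SD of the original signal."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    curve = []
    for tau in range(1, max_scale + 1):
        n = (len(x) // tau) * tau
        coarse = x[:n].reshape(-1, tau).mean(axis=1)
        curve.append(sample_entropy(coarse, m=m, tol=tol))
    return np.array(curve)
```
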
Article
Noise Robustness Analysis of Performance for EEG-Based Driver Fatigue Detection Using Different Entropy Feature Sets
by Jianfeng Hu and Ping Wang
Entropy 2017, 19(8), 385; https://doi.org/10.3390/e19080385 - 27 Jul 2017
Cited by 24 | Viewed by 6444
Abstract
Driver fatigue is an important factor in traffic accidents, and the development of a detection system for driver fatigue is of great significance. To estimate and prevent driver fatigue, various classifiers based on electroencephalogram (EEG) signals have been developed; however, as EEG signals have inherent non-stationary characteristics, their detection performance is often deteriorated by background noise. To investigate the effects of noise on detection performance, simulated Gaussian noise, spike noise, and electromyogram (EMG) noise were added into a raw EEG signal. Four types of entropies, including sample entropy (SE), fuzzy entropy (FE), approximate entropy (AE), and spectral entropy (PE), were deployed for feature sets. Three base classifiers (K-Nearest Neighbors (KNN), Support Vector Machine (SVM), and Decision Tree (DT)) and two ensemble methods (Bootstrap Aggregating (Bagging) and Boosting) were employed and compared. Results showed that: (1) the simulated Gaussian noise and EMG noise had an impact on accuracy, while simulated spike noise did not, which is of great significance for the future application of driver fatigue detection; (2) the influence of noise on performance differed across classifiers; for example, DT was the most robust and SVM the least robust; (3) the influence of noise on performance also differed across feature sets, with FE and the combined feature set being the most robust; and (4) while the Bagging method could not significantly improve performance against noise addition, the Boosting method may significantly improve performance against superimposed Gaussian and EMG noise. The entropy feature extraction method could not only identify driver fatigue, but also effectively resist noise, which is of great significance in future applications of an EEG-based driver fatigue detection system. Full article
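
The noise-robustness protocol described above can be prototyped by superimposing simulated disturbances on a clean EEG segment and recomputing the entropy features. The sketch below adds Gaussian noise at a target signal-to-noise ratio and random spikes; the SNR level and spike parameters are illustrative assumptions, not the authors' exact settings.

```python
import numpy as np

def add_gaussian_noise(signal, snr_db):
    """Superimpose white Gaussian noise at a target signal-to-noise ratio (dB)."""
    rng = np.random.default_rng(0)
    p_signal = np.mean(signal ** 2)
    p_noise = p_signal / (10 ** (snr_db / 10.0))
    return signal + rng.normal(scale=np.sqrt(p_noise), size=signal.shape)

def add_spike_noise(signal, n_spikes=20, amplitude=None):
    """Superimpose random spikes (simulated artifacts) on the signal."""
    rng = np.random.default_rng(1)
    noisy = signal.copy()
    if amplitude is None:
        amplitude = 5.0 * np.std(signal)
    idx = rng.choice(len(signal), size=n_spikes, replace=False)
    noisy[idx] += amplitude * rng.choice([-1.0, 1.0], size=n_spikes)
    return noisy

# An entropy feature would then be computed on the clean and contaminated
# versions of each epoch and the resulting classification accuracies compared.
```
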

Article
Cooperative Particle Filtering for Tracking ERP Subcomponents from Multichannel EEG
by Sadaf Monajemi, Delaram Jarchi, Sim-Heng Ong and Saeid Sanei
Entropy 2017, 19(5), 199; https://doi.org/10.3390/e19050199 - 29 Apr 2017
Cited by 9 | Viewed by 6770
Abstract
In this study, we propose a novel method to investigate P300 variability over different trials. The method incorporates spatial correlation between EEG channels to form a cooperative coupled particle filtering method that tracks the P300 subcomponents, P3a and P3b, over trials. Using state space systems, the amplitude, latency, and width of each subcomponent are modeled as the main underlying parameters. With four electrodes, two coupled Rao-Blackwellised particle filter pairs are used to recursively estimate the system state over trials. A number of physiological constraints are also imposed to avoid generating invalid particles in the estimation process. Motivated by the bilateral symmetry of ERPs over the brain, the channels further share their estimates with their neighbors and combine the received information to obtain a more accurate and robust solution. The proposed algorithm is capable of estimating the P300 subcomponents in single trials and outperforms its non-cooperative counterpart. Full article
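
The paper's method is a cooperative, coupled Rao-Blackwellised particle filter; as a much simpler point of reference, the sketch below shows a plain bootstrap particle filter tracking a single slowly drifting parameter (such as an ERP amplitude) across trials. The random-walk state model and Gaussian likelihood are assumptions made for illustration only.

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=500,
                              process_sd=0.1, obs_sd=0.5):
    """Track a slowly drifting scalar parameter across trials with a plain
    bootstrap particle filter: random-walk state model, Gaussian likelihood,
    multinomial resampling at every trial."""
    rng = np.random.default_rng(0)
    particles = rng.normal(observations[0], obs_sd, size=n_particles)
    estimates = []
    for y in observations:
        # Propagate particles through the random-walk state model.
        particles = particles + rng.normal(0.0, process_sd, size=n_particles)
        # Weight particles by the Gaussian likelihood of the observation.
        weights = np.exp(-0.5 * ((y - particles) / obs_sd) ** 2)
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))
        # Resample to avoid weight degeneracy.
        particles = rng.choice(particles, size=n_particles, p=weights)
    return np.array(estimates)
```
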

Article
Permutation Entropy: New Ideas and Challenges
by Karsten Keller, Teresa Mangold, Inga Stolz and Jenna Werner
Entropy 2017, 19(3), 134; https://doi.org/10.3390/e19030134 - 21 Mar 2017
Cited by 65 | Viewed by 11995
Abstract
Over recent years, some new variants of Permutation entropy have been introduced and applied to EEG analysis, including a conditional variant and variants using some additional metric information or being based on entropies that are different from the Shannon entropy. In some situations, it is not completely clear what kind of information the new measures and their algorithmic implementations provide. We discuss the new developments and illustrate them for EEG data. Full article
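
One of the variants mentioned above replaces the Shannon entropy of the ordinal-pattern distribution with a different entropy. A sketch of that idea, using the Rényi entropy of order q, is given below; the order, delay, and q values are illustrative choices.

```python
import numpy as np
from math import factorial

def ordinal_distribution(x, order=3, delay=1):
    """Relative frequencies of the ordinal patterns of a 1-D series."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    emb = np.array([x[i:i + (order - 1) * delay + 1:delay] for i in range(n)])
    ranks = np.argsort(np.argsort(emb, axis=1), axis=1)
    _, counts = np.unique(ranks, axis=0, return_counts=True)
    return counts / counts.sum()

def renyi_permutation_entropy(x, q=2.0, order=3, delay=1):
    """Permutation entropy built on the Renyi entropy instead of Shannon's."""
    p = ordinal_distribution(x, order, delay)
    h = np.log(np.sum(p ** q)) / (1.0 - q)   # Renyi entropy of order q
    return h / np.log(factorial(order))      # normalized to [0, 1]
```
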

Article
Spectral Entropy Parameters during Rapid Ventricular Pacing for Transcatheter Aortic Valve Implantation
by Tadeusz Musialowicz, Antti Valtola, Mikko Hippeläinen, Jari Halonen and Pasi Lahtinen
Entropy 2017, 19(3), 133; https://doi.org/10.3390/e19030133 - 20 Mar 2017
Cited by 1 | Viewed by 5478
Abstract
The time-frequency balanced spectral entropy of the EEG is a monitoring technique measuring the level of hypnosis during general anesthesia. Two components of spectral entropy are calculated: state entropy (SE) and response entropy (RE). Transcatheter aortic valve implantation (TAVI) is a less invasive treatment for patients suffering from symptomatic aortic stenosis with contraindications for open heart surgery. The goal of hemodynamic management during the procedure is to achieve hemodynamic stability with exact blood pressure control and use of rapid ventricular pacing (RVP), which results in severe hypotension. The objective of this study was to examine how the spectral entropy values respond to RVP and other critical events during the TAVI procedure. Twenty-one patients undergoing general anesthesia for TAVI were evaluated. RVP was used twice during the procedure at a rate of 185 ± 9/min with durations of 16 ± 4 s (range 8–22 s) and 24 ± 6 s (range 18–39 s). The systolic blood pressure during RVP was under 50 ± 5 mmHg. SE values declined significantly during RVP, from 28 ± 13 to 23 ± 13 (p < 0.003) and from 29 ± 12 to 24 ± 10 (p < 0.001). The corresponding values for RE were 29 ± 13 vs. 24 ± 13 (p < 0.006) and 30 ± 12 vs. 25 ± 10 (p < 0.001). Both SE and RE returned to pre-RVP values after 1 min. Ultra-short hypotension during RVP changed the spectral entropy parameters; however, these indices rapidly reverted to their pre-RVP values. Full article
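
Both indices are spectral entropies computed over different frequency ranges (roughly 0.8–32 Hz for SE and 0.8–47 Hz for RE in the commercial monitor). The sketch below computes a normalized spectral entropy of an EEG epoch within a chosen band from a Welch power spectrum; it illustrates the quantity rather than reproducing the monitor's proprietary algorithm.

```python
import numpy as np
from scipy.signal import welch

def spectral_entropy(x, fs, band=(0.8, 32.0)):
    """Normalized Shannon entropy of the power spectrum within a band."""
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), 4 * int(fs)))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    p = psd[mask] / psd[mask].sum()   # normalize the PSD to a distribution
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(len(p))

# State-entropy-like and response-entropy-like values from one EEG epoch:
# state = spectral_entropy(epoch, fs, band=(0.8, 32.0))
# response = spectral_entropy(epoch, fs, band=(0.8, 47.0))
```
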

Article
Quantitative EEG Markers of Entropy and Auto Mutual Information in Relation to MMSE Scores of Probable Alzheimer’s Disease Patients
by Carmina Coronel, Heinrich Garn, Markus Waser, Manfred Deistler, Thomas Benke, Peter Dal-Bianco, Gerhard Ransmayr, Stephan Seiler, Dieter Grossegger and Reinhold Schmidt
Entropy 2017, 19(3), 130; https://doi.org/10.3390/e19030130 - 17 Mar 2017
Cited by 35 | Viewed by 7068
Abstract
Analysis of nonlinear quantitative EEG (qEEG) markers describing the complexity of the signal in relation to the severity of Alzheimer’s disease (AD) was the focal point of this study. In this study, 79 patients diagnosed with probable AD were recruited from the multi-centric Prospective Dementia Database Austria (PRODEM). EEG recordings were done with the subjects seated in an upright position in a resting state with their eyes closed. Models of linear regressions explaining disease severity, expressed in Mini Mental State Examination (MMSE) scores, were analyzed by the nonlinear qEEG markers of auto mutual information (AMI), Shannon entropy (ShE), Tsallis entropy (TsE), multiscale entropy (MsE), or spectral entropy (SpE), with age, duration of illness, and years of education as co-predictors. Linear regression models with AMI were significant for all electrode sites and clusters, with R² = 0.46 at the electrode site C3, 0.43 at Cz, F3, and the central region, and 0.42 at the left region. MsE also had significant models at C3 with R² > 0.40 at scales τ = 5 and τ = 6. ShE and TsE also had significant models at T7 and F7 with R² > 0.30. Reductions in complexity, calculated by AMI, SpE, and MsE, were observed as the MMSE score decreased. Full article
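
Auto mutual information, the strongest single predictor in this study, is the mutual information between a signal and a lagged copy of itself. A simple histogram-based estimate is sketched below; the bin count and lag are illustrative, and the authors may well have used a different estimator and normalization.

```python
import numpy as np

def auto_mutual_information(x, lag, bins=16):
    """Histogram estimate of the mutual information between x(t) and x(t+lag)."""
    x = np.asarray(x, dtype=float)
    a, b = x[:-lag], x[lag:]
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

# Example: AMI profile over the first 50 lags of an EEG channel.
# ami = [auto_mutual_information(eeg_channel, lag) for lag in range(1, 51)]
```
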

Article
Towards Operational Definition of Postictal Stage: Spectral Entropy as a Marker of Seizure Ending
by Ancor Sanz-García, Lorena Vega-Zelaya, Jesús Pastor, Rafael G. Sola and Guillermo J. Ortega
Entropy 2017, 19(2), 81; https://doi.org/10.3390/e19020081 - 21 Feb 2017
Cited by 7 | Viewed by 6316
Abstract
The postictal period is characterized by several neurological alterations, but its exact limits are clinically, or even electroencephalographically, hard to determine in most cases. We aim to provide quantitative functions or conditions with a clearly distinguishable behavior during the ictal-postictal transition. Spectral methods were used to analyze foramen ovale electrode (FOE) recordings during the ictal/postictal transition in 31 seizures of 15 patients with strictly unilateral drug-resistant temporal lobe epilepsy. In particular, density of links, spectral entropy, and relative spectral power were analyzed. Simple partial seizures are accompanied by an ipsilateral increase in relative Delta power and a decrease in synchronization in 66% and 91% of the cases, respectively, after seizure offset. Complex partial seizures showed a decrease in spectral entropy in 94% of cases, on both the ipsilateral and contralateral sides (100% and 73%, respectively), mainly due to an increase in relative Delta activity. Seizure offset is defined as the moment at which the “seizure termination mechanisms” actually end, which is quantified in the spectral entropy value. We propose defining the start of the postictal period as the time when the ipsilateral spectral entropy reaches its first global minimum. Full article
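
The relative Delta power used above is the fraction of spectral power in the Delta band relative to a broader reference range. A minimal sketch follows; the band edges (0.5–4 Hz for Delta, 0.5–45 Hz for the total range) are conventional assumptions rather than the authors' exact definition.

```python
import numpy as np
from scipy.signal import welch

def relative_band_power(x, fs, band=(0.5, 4.0), total=(0.5, 45.0)):
    """Fraction of spectral power in `band` relative to the `total` range."""
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), 4 * int(fs)))
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    in_total = (freqs >= total[0]) & (freqs <= total[1])
    return np.trapz(psd[in_band], freqs[in_band]) / np.trapz(psd[in_total], freqs[in_total])
```
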

Article
Classification of Normal and Pre-Ictal EEG Signals Using Permutation Entropies and a Generalized Linear Model as a Classifier
by Francisco O. Redelico, Francisco Traversaro, María Del Carmen García, Walter Silva, Osvaldo A. Rosso and Marcelo Risk
Entropy 2017, 19(2), 72; https://doi.org/10.3390/e19020072 - 16 Feb 2017
Cited by 27 | Viewed by 5774
Abstract
In this contribution, a comparison between different permutation entropies as classifiers of electroencephalogram (EEG) records corresponding to normal and pre-ictal states is made. A discrete probability distribution function derived from symbolization techniques applied to the EEG signal is used to calculate the Tsallis entropy, Shannon entropy, Rényi entropy, and min-entropy, and they are used separately as the only independent variable in a logistic regression model in order to evaluate its capacity as a classification variable in an inferential manner. The area under the Receiver Operating Characteristic (ROC) curve, along with the accuracy, sensitivity, and specificity, are used to compare the models. All the permutation entropies are excellent classifiers, with an accuracy greater than 94.5% in every case and a sensitivity greater than 97%. Accounting for the amplitude in the symbolization technique retains more information about the signal than its counterparts, and it could be a good candidate for automatic classification of EEG signals. Full article
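
The classification scheme, a single entropy feature fed to a logistic regression (a generalized linear model with a logit link) and evaluated with the ROC curve, can be sketched as follows. The synthetic feature values below are invented placeholders standing in for entropies computed from real EEG records.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Placeholder data: one entropy value per record and its label
# (0 = normal, 1 = pre-ictal).
rng = np.random.default_rng(0)
h = np.concatenate([rng.normal(0.95, 0.02, 200), rng.normal(0.85, 0.04, 200)])
y = np.concatenate([np.zeros(200), np.ones(200)])

h_train, h_test, y_train, y_test = train_test_split(
    h.reshape(-1, 1), y, test_size=0.3, random_state=0, stratify=y)
clf = LogisticRegression().fit(h_train, y_train)   # single-predictor GLM
auc = roc_auc_score(y_test, clf.predict_proba(h_test)[:, 1])
print(f"ROC AUC: {auc:.3f}")
```
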

Article
Transfer Learning for SSVEP Electroencephalography Based Brain–Computer Interfaces Using Learn++.NSE and Mutual Information
by Matthew Sybeldon, Lukas Schmit and Murat Akcakaya
Entropy 2017, 19(1), 41; https://doi.org/10.3390/e19010041 - 19 Jan 2017
Cited by 8 | Viewed by 5675
Abstract
Brain–Computer Interfaces (BCI) using Steady-State Visual Evoked Potentials (SSVEP) are sometimes used by injured patients seeking to use a computer. Canonical Correlation Analysis (CCA) is seen as the state of the art for SSVEP BCI systems. However, this assumes that the user has full control over their covert attention, which may not be the case. This introduces high calibration requirements when using other machine learning techniques. These may be circumvented by using transfer learning to utilize data from other participants. This paper proposes a combination of ensemble learning via Learn++ for Nonstationary Environments (Learn++.NSE) and similarity measures such as mutual information to identify ensembles of pre-existing data that result in higher classification accuracy. Results show that this approach performed worse than CCA in participants with typical SSVEP responses, but outperformed CCA in participants whose SSVEP responses violated CCA assumptions. This indicates that similarity measures and Learn++.NSE can introduce a transfer learning mechanism to bring SSVEP system accessibility to users unable to control their covert attention. Full article
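
For reference, the CCA baseline the paper compares against is commonly implemented by correlating each multichannel EEG epoch with sine/cosine templates at each candidate stimulation frequency and picking the best match. A sketch under those assumptions (candidate frequencies and harmonic count are illustrative) is given below.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def cca_correlation(eeg, fs, freq, n_harmonics=2):
    """Largest canonical correlation between an EEG epoch (samples x channels)
    and sin/cos references at `freq` and its harmonics."""
    t = np.arange(eeg.shape[0]) / fs
    ref = np.column_stack([f(2 * np.pi * (h + 1) * freq * t)
                           for h in range(n_harmonics) for f in (np.sin, np.cos)])
    u, v = CCA(n_components=1).fit_transform(eeg, ref)
    return abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1])

def classify_ssvep(eeg, fs, candidate_freqs):
    """Pick the stimulation frequency whose references best match the EEG."""
    scores = [cca_correlation(eeg, fs, f) for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(scores))]
```
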

Article
Healthcare Teams Neurodynamically Reorganize When Resolving Uncertainty
by Ronald Stevens, Trysha Galloway, Donald Halpin and Ann Willemsen-Dunlap
Entropy 2016, 18(12), 427; https://doi.org/10.3390/e18120427 - 29 Nov 2016
Cited by 32 | Viewed by 6170
Abstract
Research on the microscale neural dynamics of social interactions has yet to be translated into improvements in the assembly, training and evaluation of teams. This is partially due to the scale of neural involvements in team activities, which spans from the millisecond oscillations in individual brains to the minutes-to-hours performance behaviors of the team. We have used intermediate neurodynamic representations to show that healthcare teams enter persistent (50–100 s) neurodynamic states when they encounter and resolve uncertainty while managing simulated patients. Each one-second symbol was developed by situating the electroencephalogram (EEG) power of each team member in the context of those of the other team members and the task. These representations were acquired from EEG headsets with 19 recording electrodes, for each of the 1–40 Hz frequencies. Estimates of the information in each symbol stream were calculated from a 60 s moving window of Shannon entropy that was updated each second, providing a quantitative neurodynamic history of the team’s performance. Neurodynamic organizations fluctuated with the task demands, with increased organization (i.e., lower entropy) occurring when the team needed to resolve uncertainty. These results show that intermediate neurodynamic representations can provide a quantitative bridge between the micro and macro scales of teamwork. Full article
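
The entropy computation described above, Shannon entropy over a 60 s moving window of a one-symbol-per-second stream updated each second, can be sketched as follows; the symbol alphabet itself is whatever the team-level neurodynamic representation produces.

```python
import numpy as np
from collections import Counter

def sliding_shannon_entropy(symbols, window=60, step=1):
    """Shannon entropy (bits) of a symbol stream over a sliding window,
    updated every `step` samples (here: one symbol per second)."""
    symbols = list(symbols)
    entropies = []
    for start in range(0, len(symbols) - window + 1, step):
        counts = Counter(symbols[start:start + window])
        p = np.array(list(counts.values()), dtype=float) / window
        entropies.append(-np.sum(p * np.log2(p)))
    return np.array(entropies)
```
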

Article
Weighted-Permutation Entropy Analysis of Resting State EEG from Diabetics with Amnestic Mild Cognitive Impairment
by Zhijie Bian, Gaoxiang Ouyang, Zheng Li, Qiuli Li, Lei Wang and Xiaoli Li
Entropy 2016, 18(8), 307; https://doi.org/10.3390/e18080307 - 22 Aug 2016
Cited by 9 | Viewed by 5931
Abstract
Diabetes is a significant public health issue as it increases the risk for dementia and Alzheimer’s disease (AD). In this study, we aim to investigate whether weighted-permutation entropy (WPE) and permutation entropy (PE) of resting-state EEG (rsEEG) could be applied as potential objective biomarkers to distinguish type 2 diabetes patients with amnestic mild cognitive impairment (aMCI) from those with normal cognitive function. rsEEG series were acquired from 28 patients with type 2 diabetes (16 aMCI patients and 12 controls), and neuropsychological assessments were performed. The rsEEG signals were analysed using the WPE and PE methods. The correlations between the PE or WPE of the rsEEG and the neuropsychological assessments were analysed as well. The WPE in the right temporal (RT) region of the aMCI diabetics was lower than in the controls, and the WPE was significantly positively correlated with the scores of the Auditory Verbal Learning Test (AVLT) (AVLT-Immediate recall, AVLT-Delayed recall, AVLT-Delayed recognition) and the Wechsler Adult Intelligence Scale Digit Span Test (WAIS-DST). These findings were not obtained with PE. We concluded that the WPE of rsEEG recordings could distinguish aMCI diabetics from diabetic controls with normal cognitive function in the current sample of diabetic patients. Thus, the WPE could be a potential index for assisting the diagnosis of aMCI in type 2 diabetes. Full article
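
Weighted-permutation entropy differs from ordinary permutation entropy in that each ordinal pattern is weighted by the variance of the embedding vector it came from, so amplitude information is retained. A minimal sketch follows; the order and delay are illustrative defaults, not the parameters used in this study.

```python
import numpy as np
from math import factorial

def weighted_permutation_entropy(x, order=4, delay=1):
    """Weighted permutation entropy: ordinal patterns weighted by the
    variance of each embedding vector (amplitude information retained)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    emb = np.array([x[i:i + (order - 1) * delay + 1:delay] for i in range(n)])
    ranks = np.argsort(np.argsort(emb, axis=1), axis=1)
    weights = emb.var(axis=1)                    # per-vector variance as weight
    patterns, inverse = np.unique(ranks, axis=0, return_inverse=True)
    wsum = np.bincount(inverse.ravel(), weights=weights, minlength=len(patterns))
    p = wsum / wsum.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(factorial(order))
```
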

Review


Review
Sleep Stage Classification Using EEG Signal Analysis: A Comprehensive Survey and New Investigation
by Khald Ali I. Aboalayon, Miad Faezipour, Wafaa S. Almuhammadi and Saeid Moslehpour
Entropy 2016, 18(9), 272; https://doi.org/10.3390/e18090272 - 23 Aug 2016
Cited by 268 | Viewed by 35189
Abstract
Sleep specialists often conduct manual sleep stage scoring by visually inspecting the patient’s neurophysiological signals collected at sleep labs. This is, generally, a very difficult, tedious and time-consuming task. The limitations of manual sleep stage scoring have escalated the demand for developing Automatic Sleep Stage Classification (ASSC) systems. Sleep stage classification refers to identifying the various stages of sleep and is a critical step in an effort to assist physicians in the diagnosis and treatment of related sleep disorders. The aim of this paper is to survey the progress and challenges in various existing Electroencephalogram (EEG) signal-based methods used for sleep stage identification at each phase; including pre-processing, feature extraction and classification; in an attempt to find the research gaps and possibly introduce a reasonable solution. Many of the prior and current related studies use multiple EEG channels, and are based on 30 s or 20 s epoch lengths which affect the feasibility and speed of ASSC for real-time applications. Thus, in this paper, we also present a novel and efficient technique that can be implemented in an embedded hardware device to identify sleep stages using new statistical features applied to 10 s epochs of single-channel EEG signals. In this study, the PhysioNet Sleep European Data Format (EDF) Database was used. The proposed methodology achieves an average classification sensitivity, specificity and accuracy of 89.06%, 98.61% and 93.13%, respectively, when the decision tree classifier is applied. Finally, our new method is compared with those in recently published studies, which reiterates the high classification accuracy performance. Full article
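
The single-channel, 10 s epoch pipeline described above can be prototyped along the lines below: segment the signal into epochs, compute simple per-epoch statistical features, and train a decision tree. The feature set shown is generic and does not reproduce the authors' new statistical features; the stand-in data are random placeholders.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

def epoch_features(signal, fs, epoch_s=10):
    """Split a single-channel EEG signal into fixed-length epochs and compute
    simple statistical features for each epoch."""
    n = int(epoch_s * fs)
    epochs = signal[:len(signal) // n * n].reshape(-1, n)
    return np.column_stack([
        epochs.mean(axis=1),
        epochs.std(axis=1),
        skew(epochs, axis=1),
        kurtosis(epochs, axis=1),
        np.ptp(epochs, axis=1),   # peak-to-peak amplitude
    ])

# Stand-in data: replace with a real EEG channel and hypnogram labels.
rng = np.random.default_rng(0)
fs = 100
fake_eeg = rng.normal(size=300 * 10 * fs)      # 300 ten-second epochs
fake_labels = rng.integers(0, 5, size=300)     # 5 sleep stages (placeholder)
X = epoch_features(fake_eeg, fs)
clf = DecisionTreeClassifier(random_state=0)
print(cross_val_score(clf, X, fake_labels, cv=5).mean())
```
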
