Internet of Things Technologies and Machine Learning Methods for Parkinson’s Disease Diagnosis, Monitoring and Management: A Systematic Review
Abstract
1. Introduction
1.1. Parkinson’s Disease
1.2. Artificial Intelligence and Internet of Things for Parkinson’s Disease Diagnosis and Management
1.3. Aim of the Current Systematic Review
2. Methods
2.1. Literature Search Strategy
2.2. Eligibility Criteria
- First, eligible studies must be articles or conference papers published between January 2012 and August 2021, as stated above.
- The considered studies should address any aspect or phase of PD. The hypotheses should be tested on adult human subjects under strict ethical guidelines. At least some of the subjects should be PD patients, although healthy subjects and patients with other neurological disorders may also be included, e.g., for differential diagnosis. However, articles presenting technologies that could be leveraged for PD diagnosis or management but are tested only on mixed or healthy populations are excluded. Furthermore, the datasets used should be concisely described, and the respective signals should be real, not simulated.
- Moreover, the signals used should derive directly from wearable or non-wearable sensors, smart devices, ambient or other technologies that are related to IoT. Studies that exploit data from other sources, such as conventional medical equipment, interviews or medical reports that cannot be adopted for remote monitoring or diagnosis are excluded.
- At the same time, at least one specific AI algorithm should be proposed to solve the corresponding PD-related problem. The respective ML models should be trained on datasets that conform to the previous guidelines, and their performance should be measured with specific evaluation metrics. Studies that present only a statistical analysis, without any AI methods, are excluded. Finally, the provided conclusion should be consistent with the initial research goal.
- If there are both a conference paper and an extended research article for the same study, the latter is preferred.
- In the case that the same or similar PD-related problems are addressed by one research group multiple times, utilizing similar methods or the same datasets (e.g., proposing one or two more algorithms each time), only one study is preserved and the rest are excluded; the most recent or most comprehensive and detailed one is preferred.
- Finally, when there are two or more almost identical approaches, only one is presented and the rest are excluded, even if the research groups differ. By "identical approaches" the paper refers to the exact same PD-related task, the same ML/DL algorithms, and the exact same open dataset or a similar train and test population.
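The inclusion and exclusion rules above can be sketched as a simple screening filter. This is a minimal illustration only: the `Study` record and its field names are hypothetical, not part of the review's actual screening tooling.

```python
from dataclasses import dataclass

# Hypothetical record structure; the field names are illustrative,
# chosen to mirror the eligibility criteria listed above.
@dataclass
class Study:
    year: int
    doc_type: str             # "article" or "conference paper"
    has_pd_subjects: bool     # cohort includes PD patients
    signals_simulated: bool   # True if signals are simulated, not real
    iot_sensor_source: bool   # data from wearable/ambient/IoT-related sensors
    proposes_ai_model: bool   # at least one AI algorithm is proposed
    reports_metrics: bool     # performance measured with specific metrics

def is_eligible(s: Study) -> bool:
    """Apply the (simplified) inclusion/exclusion rules to one record."""
    return (
        s.doc_type in {"article", "conference paper"}
        and 2012 <= s.year <= 2021
        and s.has_pd_subjects
        and not s.signals_simulated
        and s.iot_sensor_source
        and s.proposes_ai_model
        and s.reports_metrics
    )
```

Duplicate-study rules (same group, same dataset, near-identical approaches) would require pairwise comparison across records and are omitted here.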
2.3. Selection and Data Collection Processes
3. Results
3.1. Search Results
3.2. Results of Individual Studies and Sensor-Wise Synthesis
3.2.1. Inertial Sensors
3.2.2. Pressure and Force Sensors
3.2.3. Image and Depth Sensors
3.2.4. Voice and Audio Sensors
3.2.5. Other Types of Sensors
3.2.6. Combination of the Previous Sensors
4. Collective Comparison and Overall Insights across the Studied Approaches
4.1. Sensors and Devices Deployed
4.2. Meta-Analysis over Sensor-Specific Results Obtained
4.2.1. Inertial Sensors
4.2.2. Pressure and Force Sensors
4.2.3. Image and Depth Sensors
4.2.4. Voice and Audio Sensors
4.2.5. Other Types of Sensors
4.2.6. Combination of the Previous Sensors
4.3. Overall Addressed Problems Related to Parkinson’s Disease and Obtained Accuracies
4.3.1. Frequency of Problems Addressed
4.3.2. Overall Performance Comparisons per Task
4.4. Ranking of Machine Learning Algorithms
4.5. Subjects Enrolled
5. Discussion and Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
2D | Two Dimensions |
3D | Three Dimensions |
AdaBoost | Adaptive Boosting |
ACC | Accuracy |
ADL | Activity of Daily Living |
AE | Autoencoder |
AI | Artificial Intelligence |
ALS | Amyotrophic Lateral Sclerosis |
ANFIS | Adaptive Neuro-Fuzzy Inference System |
ANN | Artificial Neural Network |
ANOVA | Analysis of Variance |
AR | Autoregressive |
ARMA | Autoregressive Moving Average |
AU | Action Unit |
AUC | Area Under the Curve |
AUPR | Area under the Precision-Recall Curve |
AUROC | Area Under the Receiver-Operating Characteristic Curve |
Avg | Average |
BDI | Beck Depression Inventory |
BE | Backward Elimination |
CAD | Computer-Aided Diagnosis |
CART | Classification and Regression Tree |
CFS | Correlation-based Feature Selection |
CNN | Convolutional Neural Network |
CT | Computed Tomography |
CWT | Continuous Wavelet Transform |
DA | Discriminant Analysis |
DBN | Deep Belief Network |
DCALSTM | Dual-modal Convolutional Neural Network + Attention Enhanced Long Short-Term Memory |
DCNN | Dual-modal Convolutional Neural Network |
DD | Other Movement Disorder |
DESN | Deep Echo State Network |
DFT | Discrete Fourier Transform |
DL | Deep Learning |
DNN | Deep Neural Network |
DT | Decision Tree |
DTW | Dynamic Time Warping |
DWT | Discrete Wavelet Transform |
ECG | Electrocardiography |
EEG | Electroencephalography |
EER | Equal Error Rate |
ELM | Extreme Learning Machine |
EMG | Electromyography |
ESN | Echo State Network |
ESS | Epworth Sleepiness Scale |
ET | Essential Tremor |
EWPT | Empirical Wavelet Packet Transform |
EWT | Empirical Wavelet Transform |
FA | Factor Analysis |
FFT | Fast Fourier Transform |
FIR | Finite Impulse Response |
FLDA | Fisher Linear Discriminant Analysis |
FNR | False Negative Rate |
FoG | Freezing of Gait |
FPR | False Positive Rate |
FRP | Fuzzy Recurrence Plot |
GBM | Gradient Boosting Machine |
GD | Gradient Descent |
GDM | Gradient Descent with Momentum |
GM | Geometric Mean |
GMM | Gaussian Mixture Model |
GP | Gaussian Process |
GRF | Ground Reaction Force |
GRU | Gated Recurrent Unit |
GS | Graph Sequence |
H&Y | Hoehn and Yahr |
HBNN | Hierarchical Bayesian Neural Network |
HD | Huntington’s Disease |
hHMM | Hierarchical Hidden Markov Model |
HMCI | Healthy People with Mild Cognitive Impairment |
HMM | Hidden Markov Model |
HOA | Healthy Older Adult |
HOG | Histogram of Oriented Gradients |
IBk | Instance-Based with parameter k |
ICC | Intraclass Correlation Coefficient |
IIR | Infinite Impulse Response |
IMU | Inertial Measurement Unit |
IoT | Internet of Things |
IPD | Idiopathic Parkinson’s Disease |
JMIM | Joint Mutual Information Maximization |
KELM | Kernel Extreme Learning Machine |
KFD | Kernel Fisher Discriminant Analysis |
kNN | k-Nearest Neighbors |
LASSO | Least Absolute Shrinkage and Selection Operator |
LDA | Linear Discriminant Analysis |
LID | Levodopa-Induced Dyskinesia |
LOSO | Leave One Subject Out |
LR | Logistic Regression |
LSTM | Long Short-Term Memory |
MAE | Mean Absolute Error |
MCC | Matthews Correlation Coefficient |
MDS-UPDRS | Movement Disorder Society-Sponsored Revision of the Unified Parkinson’s Disease Rating Scale |
MFCC | Mel-Frequency Cepstral Coefficient |
ML | Machine Learning |
MLP | Multilayer Perceptron |
MMG | Mechanomyography |
MML | Multi-source Multi-task Learning |
MMTFL | Multiplicative Multi-Task Feature Learning |
MoCA | Montreal Cognitive Assessment |
MRI | Magnetic Resonance Imaging |
mRMR | Maximum Relevance Minimum Redundancy |
MS Kinect | Microsoft Kinect |
mUPDRS | Motor subscale of the Unified Parkinson’s Disease Rating Scale |
NB | Naïve Bayes |
NN | Neural Network |
OPF | Optimum Path Forest |
PCA | Principal Component Analysis |
PD | Parkinson’s Disease |
PDD | Parkinson’s Disease Patients with Dementia |
PDMCI | Parkinson’s Disease Patients with Mild Cognitive Impairment |
PDNC | Parkinson’s Disease Patients with Normal Cognition |
PET | Positron Emission Tomography |
PKG | Personal KinetiGraph |
PNN | Probabilistic Neural Network |
PPV | Positive Predictive Value |
PREC | Precision |
PRISMA | Preferred Reporting Items for Systematic Reviews and Meta-Analyses |
PSD | Power Spectral Density |
PSG | Polysomnography |
PSP | Progressive Supranuclear Palsy |
Q-BTDNN | Q-Backpropagation for a Time Delay Neural Network |
QDA | Quadratic Discriminant Analysis |
RBD | Rapid Eye Movement Sleep Behavior Disorder |
RBF | Radial Basis Function |
REC | Recall |
Res-Net | Residual Neural Network |
RF | Random Forest |
RGB | Red Green Blue |
RGB-D | Red Green Blue-Depth |
RMSE | Root Mean Square Error |
RNN | Recurrent Neural Network |
RUSBoost | Random UnderSampling Boosting |
SBS | Sequential Backward Selection |
SCG | Scaled Conjugate Gradient |
SDE | Sparse Difference Embedding |
SDH | Sum and Difference Histogram |
sEMG | Surface Electromyography |
SENS | Sensitivity |
SF-36 | 36-Item Short Form Survey |
SFS | Sequential Forward Selection |
SOM | Self-Organizing Map |
SPDDS | Self-Assessment Parkinson’s Disease Disability Scale |
SPEC | Specificity |
STFT | Short-Time Fourier Transform |
STL | Single Task Learning |
SVM | Support Vector Machine |
SVR | Support Vector Regression |
TRIS | Treatment Response Index |
TUG | Timed Up and Go |
UDysRS | Unified Dyskinesia Rating Scale |
UPDRS | Unified Parkinson’s Disease Rating Scale |
VAE | Variational Autoencoder |
VaP | Vascular Parkinsonism |
VAS | Visual Analog Scale |
XGBoost/XGB | Extreme Gradient Boosting |
Appendix A
Study | Problem | Dataset Population | Input Data | Analysis/Algorithms | Evaluation |
---|---|---|---|---|---|
Rastegari et al. (2020) [49] | Diagnosis | 10 PD patients and 10 healthy controls from another study [166] | Accelerometer and gyroscope data from both ankles using SHIMMER sensors | Segmentation + bag-of-words features extraction based on sub-sequences clustering with k-medoid + SVM/DT/RF/kNN | Best performing: DT with ACC = 90%, PREC = 90%, REC = 90% |
Zhang et al. (2020) [54] | Diagnosis | 656 PD patients and 2148 healthy controls from the mPower study [126] | Gait features from smartphone sensors located at the pocket | Deep CNN | AUROC = 0.87 |
Juutinen et al. (2021) [50] | Diagnosis | 29 PD patients and 29 healthy controls | Accelerometer and gyroscope signals from a waist-mounted smartphone while performing walking tests | Interpolation + low-pass filtering + smoothing + segmentation to strides + feature extraction + feature selection with mRMR/SFS/SBS + classification of strides with DT/Gaussian kernel/LDA/ensemble/kNN/LR/NB/SVM/RF + majority voting for subject classification | Best performing: SFS + kNN with ACC = 84.5%, SENS = 88.5%, SPEC = 81.3% |
Fernandes et al. (2018) [51] | Parkinsonism diagnosis and differential diagnosis between idiopathic PD (IPD) and vascular parkinsonism (VaP) | 15 IPD, 15 VaP patients and 15 healthy controls | Gait signals from wearable Physilog motion sensors placed on both feet and MOCA scores | Normalization + feature selection based on Kruskal–Wallis and Mann–Whitney tests with Bonferroni correction + features ranking with RFs + MLP/DBN | Best performing for parkinsonism detection: MLP with ACC = 99.33%, SENS = 93.33%; for IPD-VaP differential diagnosis: DBN with ACC = 73.33%, SENS = 73.33%, SPEC = 73.33% |
Cuzzolin et al. (2017) [52] | Diagnosis and severity estimation in the SF-36 (0–100) scale | 156 PD patients and 424 healthy controls | Signals from an IMU attached to the lower spine while walking | HMM + kNN | For diagnosis: ACC = 85.51%, F1-SCORE = 81.54%; for severity estimation: MAE = 27.81 |
Borzì et al. (2020) [70] | Severity estimation (MDS-UPDRS related to leg agility scores classification) | 93 PD patients | Acceleration, angular velocity and orientation data from a smartphone application | Low-pass Butterworth filter + FFT + feature selection + DT/kNN/SVM/ANN | Best performing: ANN with ACC = 77.7%, AUROC = 0.92, r = 0.92, RMSE = 0.42, ICC = 0.82 |
Hssayeni et al. (2018) [96] | Fluctuation identification (on/off states detection) | 12 PD patients (1st dataset) and 7 PD patients (2nd dataset) | Signals from an ankle-mounted triaxial gyroscope | FIR bandpass filter + LSTM | ACC = 73–77%, SENS = 63–75%, SPEC = 78–83% |
Aich et al. (2020) [97] | Fluctuation identification (on/off states detection) | 20 PD patients | Statistical and gait parameter features from 2 knee-worn accelerometers | Feature selection + RF/SVM/kNN/NB | Best performing: RF with ACC = 96.72%, PREC = 96.92%, REC = 97.35%, F1-SCORE = 0.97, AUROC = 0.99 |
Abujrida et al. (2020) [53] | Diagnosis and severity estimation regarding walking balance (MDS-UPDRS-2.12), shaking/tremor (MDS-UPDRS-2.10) and FoG | 152 PD patients and 304 healthy controls from the mPower study [126] | Signals from smartphone embedded accelerometers and gyroscopes, demographics and lifestyle data | Segmentation + smoothing + feature extraction in time, frequency (FFT + PSD), wavelet (DWT) domain + RF/bagged trees/boosted trees/fine tree/cubic SVM/weighted kNN/LR/LDA | Best performing for diagnosis: RF with ACC = 95%, PREC = 94%, AUROC = 0.99; for walking balance: RF with ACC = 93%, PREC = 92%, AUROC = 0.97; for tremor: bagged trees with ACC = 95%, PREC = 95%, AUROC = 0.92; for FoG: bagged trees with ACC = 98%, PREC = 96%, AUROC = 0.98 |
Kim et al. (2015) [84] | FoG detection | 15 PD patients | Gyroscopic and accelerometer data from smartphones placed at the waist, pocket and ankle | AdaBoost | SENS = 81.1–86%, SPEC = 91.5–92.5% |
Ashour et al. (2020) [87] | FoG detection | 10 PD patients | Accelerometer data from sensors placed on the ankles, knees and hips | Patient-dependent LSTM/{DWT/FFT + Patient-dependent SVM/ANN} | Best performing: LSTM with ACC = 68.44–98.89% |
Torvi et al. (2016) [89] | FoG prediction in 1 s, 3 s and 5 s horizons | 10 PD patients from Daphnet dataset | Accelerometer data from sensors placed on the ankles, the thighs and the trunk | LSTM/RNN with or without transfer learning | Best performing: LSTM + transfer learning with ACC = 85–95% |
Arami et al. (2019) [90] | FoG prediction (2-class classification FoG/no-FoG and 3-class classification pre-FoG/FoG/no-FoG) | 10 PD patients from Daphnet dataset | Accelerometer data from sensors placed on the ankles, the thighs and the trunk | Windowing + filtering + feature extraction + feature selection with mRMR/BE + features time series prediction with AR/ARMA + RBF-SVM/PNN | For 2-class classification: ACC = 94%, SENS = 93%, SPEC = 87%; for 3-class classification: ACC = 77% |
Kleanthous et al. (2020) [91] | FoG detection (FoG/walking/transition from walking to FoG classification) | 10 PD patients from Daphnet dataset | Accelerometer data from sensors placed on the ankles, the thighs and the trunk | Low pass Butterworth filter + Boruta algorithm + GBM + XGB for feature selection + XGB/RF/GBM/RBF-SVM/MLP | Best performing: RBF-SVM with ACC = 79.85%, SENS = 72.34–91.49%, SPEC = 87.36–93.62% |
Li et al. (2020) [88] | FoG detection | 10 PD patients from Daphnet dataset | Accelerometer data from sensors placed on the ankles, the thighs and the trunk | Filtering + segmentation + data augmentation + CNN with squeeze-and-excitation blocks for feature extraction + attention-enhanced LSTM | ACC = 98.1–99.7%, SENS = 95.1–99.1%, SPEC = 98.8–99.8% for generalized and personalized models with 10-fold cross-validation; AUC = 0.945, ACC = 91.9%, EER = 10.6% with LOSO validation |
Halder et al. (2021) [92] | FoG states classification (pre-FoG, FoG, pre-post-FoG, no-FoG) | 10 PD patients from Daphnet dataset | Accelerometer data from sensors placed on the ankles, the thighs and the trunk | Second-order Butterworth low-pass filtering + PCA + kNN/MLP/SVM | Best performing: kNN with ACC = 98.92%, SENS = 94.97%, SPEC = 99.19%, F1-SCORE = 95.25%, PREC = 95.55% |
Palmerini et al. (2017) [93] | Pre-FoG detection (classification between gait and pre-FoG) | 11 PD patients | Accelerometer and gyroscope signals from sensors placed at the lower-back and at the ankles | Windowing + feature extraction + LDA | SENS = 83%, SPEC = 67%, AUROC = 0.76 |
Borzì et al. (2021) [94] | FoG and pre-FoG detection | 11 PD patients on and off therapy | A single angular velocity signal from 2 shin-mounted inertial sensors while performing TUG tests | Normalization + segmentation + wrapper feature selection + SVM/kNN/LDA/LR | Best performing: SVM with ACC = 85.5–86.1%, SENS = 84.1–85.5%, SPEC = 85.9–86.3%, F-SCORE = 73.4–74.6% |
Shi et al. (2020) [85] | FoG detection | 63 PD patients | Accelerometer, gyroscope and magnetometer signals from IMUs placed on both ankles and the spine while performing TUG tests | Time-series segmentation with overlapping windows + Morlet CWT/FFT/raw data + 1D-CNN/2D-CNN/LSTM | Best performing: CWT + 2D-CNN with ACC = 89.2%, SENS = 82.1%, SPEC = 96%, GM = 88.8% |
Camps et al. (2018) [86] | FoG detection | 21 PD patients | Signals from a waist-mounted IMU with accelerometer, gyroscope and magnetometer while performing several walking tests and ADLs | Spectral window stacking (with FFT) + RUSBoost/SVM-RBF/CNN | Best performing: CNN with ACC = 89%, SENS = 91.9%, SPEC = 89.5% and geometrical mean of SENS-SPEC = 90.6% |
Ghassemi et al. (2018) [95] | Gait segmentation (strides detection) | 10 PD patients for the 1st experiment and 34 PD patients for the 2nd experiment | Acceleration and angular velocity signals from foot-worn IMUs while walking in straight line (1st experiment) and walking, turning or performing other movements (2nd experiment) | Peak detection algorithm/Euclidean DTW/Probabilistic DTW/hierarchical HMM | Best performing for the 1st experiment: all with F-SCORE = 99.8–100%; For the 2nd experiment: hHMM with PREC = 98.5%, REC = 93.5%, F-SCORE = 95.9% |
Kostikis et al. (2015) [55] | Diagnosis | 25 PD patients and 20 healthy controls | Tremor measurements from accelerometer and gyroscope smartphone sensors | NB/LR/SVM/AdaBoost/C4.5/RF | Best performing: RF with SENS = 82%, SPEC = 90%, AUROC = 0.94 |
Williamson et al. (2021) [59] | Diagnosis | 202 PD patients and 178 healthy controls from the UK Biobank dataset (https://www.ukbiobank.ac.uk/, accessed on 17 February 2022) | Acceleration signals from a wrist-worn sensor | Segmentation + automatic segments labelling (gait or low movement) + segmentation into frames + features extraction + GMM | SENS = 65–75%, AUROC = 0.85 |
Park et al. (2021) [60] | Diagnosis | 25 PD patients and 21 healthy controls | Signals from IMUs attached to the thumb and index fingers while performing finger tapping, hand movements and rapid altering movements | Linear regression + correlation between motor parameters and UPDRS scores + DNN/LR for PD diagnosis | For motor parameters-UPDRS scores correlation: r = 0.838–0.849; Best performing for diagnosis: DNN with AUROC = 0.888–0.950 |
Talitckii et al. (2021) [65] | Differential diagnosis between PD and other extrapyramidal disorders | 41 PD patients and 15 patients with other extrapyramidal disorders | Accelerometer, gyroscope and magnetometer signals from a dorsal-mounted sensor while performing UPDRS-related tasks | STFT + feature extraction + linear-PCA/RBF-PCA/poly-PCA/LDA/FA + RF/SVM/LR/LightGBM/stacked ensemble model | Best performing: standard classifier with ACC = 72–85%, PREC = 72–85%, REC = 77–100%, F1-SCORE = 76–88% |
Varghese et al. (2020) [66] | PD or other movement disorders (DD) patients and healthy controls classification; PD patients and DD patients or healthy controls classification | 192 PD patients, 75 DD patients and 51 healthy controls | Accelerometer data from 2 smartwatches placed at both hands and answers from electronic questionnaires distributed via smartphones | FFT + PCA + RBF-SVM/RF/ANN | Best performing for PD/DD detection: ANN with ACC = 89%, PREC = 94%, REC = 92%, F1-SCORE = 93%; for PD detection: RBF-SVM with ACC = 79%, PREC = 81%, REC = 89%, F1-SCORE = 85% |
Loaiza Duque et al. (2020) [67] | Healthy and trembling subjects classification; PD-ET differential diagnosis | 19 PD patients, 20 ET patients and 12 healthy controls | Angular velocity signals from smartphone built-in triaxial gyroscope with the help of a smartphone application | Kinematic features extraction + feature selection based on chi-square and unbiased tree + linear-SVM/LR/DA/NB/DT/ensemble subspace kNN | Best performing for trembling patients detection: ensemble subspace kNN with ACC = 97.2%, SENS = 98.5%, SPEC = 93.3%; for PD-ET differential diagnosis: linear-SVM with ACC = 77.8%, SENS = 75.7%, SPEC = 80% |
Channa et al. (2021) [83] | Classification between PD patients with tremor and PD patients with bradykinesia and healthy controls | 10 PD patients with tremor, 10 PD patients with bradykinesia and 20 healthy controls | Accelerometer and gyroscope signals from a smart bracelet | Butterworth bandpass IIR filter + FFT to extract both time and frequency domain features + feature selection based on ANOVA test + NN-SOMs clustering/kNN | Best performing: kNN with ACC = 91.7%, SENS = 83–100%, SPEC = 89–100% |
Li et al. (2019) [57] | Diagnosis and severity (H&Y scores) estimation | 13 PD patients and 12 healthy controls | Acceleration and angular velocity signals from a prototype handle for spoons with embedded inertial sensors | IIR filtering + windowing + feature extraction + normalization + kNN/Adaboost/RF/linear-SVM for diagnosis; RF regression for severity estimation | Best performing for diagnosis: linear-SVM with ACC = 92%, SENS = 92.31%, SPEC = 91.67%, AUROC = 0.98; for severity estimation: MAE = 0.166, r = 0.97 |
Koçer et al. (2016) [56] | Diagnosis and severity (H&Y scores) estimation | 35 PD patients and 20 healthy controls | Resting tremor acceleration signals from the Nintendo Wii Remote (Wiimote) | Windowing + feature extraction (with FFT) + SVM | For diagnosis: ACC = 89%, PREC = 91%, REC = 94%; for severity estimation: ACC = 33–77% |
Bazgir et al. (2015) [71] | Severity estimation (UPDRS scores classification) | 52 PD patients | Accelerometer and gyroscope data from a smartphone placed at the wrist | Filtering + STFT + MLP trained with the back propagation algorithm | ACC = 91%, SENS = 89.6%, SPEC = 90.64% |
Kim et al. (2018) [72] | Severity estimation (UPDRS scores classification) | 92 PD patients | Accelerometer and gyroscope signals from a wrist-worn device | High pass filter + FFT + RF/NB/linear regression/DT/MLP/SVM/CNN | Best performing: CNN with ACC = 85%, kappa = 0.85, r = 0.93, RMSE = 0.35 |
Dai et al. (2021) [73] | Severity estimation (MDS-UPDRS scores classification) | 42 PD patients and 30 healthy controls | Accelerometer, gyroscope and geomagnetic data from a finger-mounted sensor while measuring rest and postural tremor and during finger tapping | Denoising with an IIR bandpass filter + FFT + SVM/RF/kNN | Best performing: SVM with ACC = 96–97.33%, SENS = 96.36–100%, SPEC = 95–96.67% |
Khodakarami et al. (2019) [74] | Severity estimation (UPDRS scores classification) and prediction of response to levodopa (absolute value and percentage) | 151 PD patients and 174 healthy controls | Signals from the wrist-worn smartwatch of the Personal KinetiGraph system | Feature extraction + JMIM-based feature selection + (+PCA) LR/RBF-SVM/gradient boosting DTs | Best performing for severity estimation: LR with AUROC = 0.79–0.88, AUPR = 0.65–0.88; for absolute levodopa response estimation AUROC = 0.92 and AUPR = 0.87; for levodopa response percentage estimation AUROC = 0.82, AUPR = 0.73 |
Javed et al. (2018) [58] | Diagnosis and treatment response index (TRIS) estimation | 19 PD patients and 22 healthy controls | Accelerometer and gyroscope data from both wrists using SHIMMER3 sensors while performing hand rotation tests before and after the dose administration | PCA/stepwise regression/LASSO regression + SVM/linear regression/DT/RF | Best performing for diagnosis: stepwise regression + SVM with ACC = 89%; for TRIS estimation: RMSE = 0.69, r = 0.84 |
Watts et al. (2021) [100] | PD patients classification according to their levodopa regimens and response | 26 PD patients | Bradykinesia and dyskinesia-related signals from Personal KinetiGraph (PKG), demographics and MDS-UPDRS-III scores | k-means for clustering based on regimen features + RF for classification based on PKG features, demographics and MDS-UPDRS-III scores | ACC = 86.9%, SENS = 86.5%, SPEC = 87.7%, PPV = 95.3%, F1-SCORE = 90.7%, AUROC = 0.871 |
Pfister et al. (2020) [98] | On/Off/Dyskinesia motor states classification | 30 PD patients | Data from a wrist-worn accelerometer in a free-living environment | Data augmentation + CNN/SVM/kNN/RF/MLP | Best performing: CNN with ACC = 65.4%, Cohen’s Kappa = 0.47, SENS = 64.45–66.68%, SPEC = 66.72–89.48%, F1-SCORE = 62.4–69.01%, 1vsALL ACC = 66.7–82.56% |
Eskofier et al. (2016) [79] | Bradykinesia detection | 10 PD patients | Accelerometer data from IMUs mounted on both hands | Timeseries segmentation + AdaBoost/PART/kNN/SVM/CNN-DNN | Best performing: DNN ACC = 90.9% |
Shawen et al. (2020) [80] | Tremor and bradykinesia detection; severity estimation (UPDRS scores classification) | 13 PD patients | Accelerometer and gyroscope signals from a flexible skin-mounted sensors and accelerometer signals from a wrist-worn smartwatch | Cubic spline interpolation + segmentation + high-pass filtering + time, frequency, entropy, correlation and derivative-based feature extraction + RF | For tremor detection: AUROC = 0.68–0.79; for bradykinesia detection: AUROC = 0.61–0.69; for tremor severity estimation: AUROC = 0.67–0.77; for bradykinesia severity estimation: AUROC = 0.59–0.66 |
San-Segundo et al. (2020) [81] | Tremor detection; tremor duration estimation | 12 PD patients for laboratory set and 6 PD patients for in-the-wild set | Accelerometer signals from wrist-worn sensors | Downsampling + FFT + unsupervised non-negative tremor factorization + feature extraction manually/with a CNN + RF/MLP | Best performing for tremor detection: CNN + MLP with AUC = 0.887; for tremor duration estimation: MAE = 4.1–9.1% |
Ibrahim et al. (2020) [82] | Tremor onset detection | 13 PD patients | Signals from hand-mounted IMUs while performing 6 different rest, postural and motor tasks | Butterworth filtering + zero-phase shifting + Hilbert-Huang Transform + MLP | ACC = 92.9%, PREC = 98.7%, REC = 86.7%, SPEC = 98.9%, F1-SCORE = 0.923 |
Som et al. (2020) [61] | Diagnosis | 152 healthy subjects for pre-training and 18 PD patients and 16 healthy controls for the final classification | Accelerometer signals from a wrist-worn sensor while performing various ADLs for the pre-training + accelerometer signals from 6 sensors located at the sternum, the lumbar, both ankles and wrists while walking | Zero-centering + normalization + segmentation + feature extraction with pre-trained convolutional AE + PCA/global-average-pool layer + SVM/MLP | Best performing: MLP with ACC = 68.64–73.81%, PREC = 69.27–76.53%, REC = 68.64–73.81%, F1-SCORE = 67.65–73.89% |
Ricci et al. (2020) [62] | Diagnosis | 30 newly diagnosed untreated PD patients and 30 healthy controls | Acceleration, angular velocity and orientation signals from a network of 14 IMUs distributed in the whole body | Feature selection with ReliefF ranking and Kruskal–Wallis + NB/kNN/SVM | Best performing: SVM with ACC = 95%, PREC = 95.1%, AUROC = 0.95 |
De Vos et al. (2020) [69] | PD and PSP differential diagnosis | 20 PD patients and 21 PSP patients | Accelerometer, gyroscope and magnetometer signals from 6 IMUs placed on the lumbar spine, the sternum, both wrists and feet | ANOVA + LASSO + LR/RF | Best performing: RF with ACC = 88%, SENS = 86%, SPEC = 90% |
Moon et al. (2020) [68] | PD and ET differential diagnosis | 524 PD patients and 34 ET patients | Balance and gait characteristics from 6 IMUs placed on the lumbar spine, the sternum, both wrists and feet | NN/SVM/kNN/DT/RF/LR | Best performing: NN with ACC = 89%, PREC = 61%, REC = 61%, F1-SCORE = 61% |
Kuhner et al. (2017) [63] | Diagnosis; correlation with severity metrics | 14 PD patients and 26 healthy controls | Fusion of accelerometer, gyroscope and magnetometer signals from XSens motion capture suit while performing several motor tests | RF with probability distributions for classification; PCA for correlation | For diagnosis: ACC = 86–94.6%, SENS up to 91.5% and SPEC up to 97.2%; for correlation between the 1st pc and the UPDRS scores: r = 0.79 |
Caramia et al. (2018) [64] | Diagnosis and severity estimation (H&Y scores classification) | 25 PD patients and 25 healthy controls | Accelerometer, gyroscope and magnetometer signals from 8 IMUs attached to both feet dorsum, thighs, shanks and to the chest and lumbar | Extraction of range of motion parameters/spatio-temporal parameters (+PCA) + LDA/NB/kNN/linear-SVM/RBF-SVM/DT + majority voting with equal weights/weights analogue to the individual accuracies | Best performing for diagnosis: majority voting with weights analogue to the individual accuracies with ACC = 96%; for severity estimation: RBF-SVM with ACC = 87.75–94.5% |
Hssayeni et al. (2021) [76] | Severity (UPDRS-III scores) estimation | 24 PD patients | Angular velocity from one wrist-mounted and one ankle-mounted inertial sensor | Gradient tree boosting/dual-channel LSTM with hand-crafted features and with or without transfer learning/1D-CNN-LSTM for raw signals/2D-CNN-LSTM for time-frequency data + ensemble learning | Best performing: ensemble of dual-channel LSTM with hand-crafted features and transfer learning, 1D-CNN-LSTM for raw signals and 2D-CNN-LSTM for time-frequency data with r = 0.79, MAE = 5.95 |
Butt et al. (2020) [77] | Severity (MDS-UPDRS III scores) estimation | 64 PD patients and 50 healthy controls | Gyroscope and geomagnetic data from 4 wrist-mounted IMUs and 1 foot-mounted IMU | Kolmogorov–Smirnov test + Mann–Whitney U-test + normalization + CFS/PCA ranker/correlation attribute evaluation/chi-square attribute evaluation/wrapper subset evaluation + SVR/RF/LR/ANFIS | Best performing: CFS + ANFIS with r = 0.814, RMSE = 0.101 |
Mirelman et al. (2021) [75] | Severity estimation (H&Y stages classification) | 332 PD patients and 100 healthy controls | Accelerometer and gyroscope signals from sensors placed on the ankles, the wrists and the lower back while performing various walking tests + demographic data | Low-pass Butterworth filter + feature selection based on RF permutation importance/neighborhood component analysis/mRMR + RUSBoost + DT/QDA for weak learner | SENS = 72–84%, SPEC = 69–80%, AUROC = 0.76–0.90 |
Stamate et al. (2018) [78] | Identification of failures to follow the UPDRS-III movement protocol | 12 PD patients | Motor signals from smartphone sensors with the cloudUPDRS application | Filtering + frequency transformations + extra tree/Bernoulli NB/Gaussian NB/MLP/RF/Gradient Boosting/Bagging/AdaBoost/RCNN | ACC = 78%, F1-SCORE = 82%, AUROC = 0.87 |
Belgiovine et al. (2018) [99] | L-dopa induced dyskinesia detection | 18 PD patients | Accelerometer and angular velocity data from smartphone placed on the wrist (for upper-limb experiment) or on the hip (for lower-limb experiment) | z-score normalization + DT/Gaussian-SVM/linear-SVM | Best performing: SVM (with both kernels) with ACC = 65.0–82.0%, MACRO F1-SCORE = 0.65–0.82 |
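Many of the inertial-sensor pipelines in the table above share the same front end: slice the raw accelerometer or gyroscope trace into overlapping windows, then extract time- and frequency-domain features (often via FFT) before feeding a classifier. The sketch below illustrates that recurring step only; the window length, overlap, and feature set are illustrative assumptions, not taken from any specific study.

```python
import numpy as np

def window_features(signal, fs=100, win_s=2.0, overlap=0.5):
    """Slice a 1-D inertial trace into overlapping windows and extract
    simple time- and frequency-domain features per window.
    Returns an array of shape (n_windows, 5)."""
    win = int(win_s * fs)            # samples per window
    step = int(win * (1 - overlap))  # hop between window starts
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        # FFT on the zero-mean window removes the DC component
        spectrum = np.abs(np.fft.rfft(w - w.mean()))
        freqs = np.fft.rfftfreq(win, d=1 / fs)
        feats.append([
            w.mean(),                  # time domain: mean level
            w.std(),                   # time domain: variability
            np.abs(np.diff(w)).mean(), # jerk-like roughness
            freqs[spectrum.argmax()],  # dominant frequency (e.g., tremor band)
            spectrum.sum(),            # total spectral energy
        ])
    return np.array(feats)
```

In the studies above the resulting feature matrix is typically passed to a feature selector (e.g., mRMR) and then to one of the compared classifiers (SVM, RF, kNN, etc.).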
Study | Problem | Dataset Population | Input Data | Analysis/Algorithms | Evaluation |
---|---|---|---|---|---|
Aversano et al. (2020) [102] | Diagnosis and severity estimation (H&Y scores classification) | 93 PD patients and 73 healthy controls from 3 merged datasets [106] | Vertical GRF signals from 8 sensors located underneath each foot | Feed-forward DNN | For diagnosis: ACC = 99.29–99.52%; for severity estimation: ACC = 98.57–99.1% |
El Maachi et al. (2020) [103] | Diagnosis and severity estimation (H&Y scores classification) | 93 PD patients and 73 healthy controls from 3 merged datasets [106] | Vertical GRF signals from 8 sensors located underneath each foot | Gait segmentation + 1D-CNN/DNN/MLP/NB/RF | Best performing for diagnosis: 1D-CNN with ACC = 98.7%, SENS = 98.1%, SPEC = 100%; for severity estimation: ACC = 85.3%, PREC = 87.3%, REC = 85.3%, F1-SCORE = 85.3% |
Xia et al. (2020) [104] | Diagnosis and severity estimation (H&Y scores classification) | 93 PD patients and 73 healthy controls from 3 merged datasets [106] | Vertical GRF signals from 8 sensors located underneath each foot | Dual-modal CNN + attention-enhanced LSTM (DCALSTM)/baseline DL models removing one of the previous stages (DCNN/DALSTM/DCLSTM/CNN-LSTM)/feature-based models | Best performing for diagnosis: DCALSTM with ACC = 99.07–99.31%, SENS = 99.10–99.35%, SPEC = 98.98–99.35%; for severity estimation: ACC = 98.03–99.01% |
Nancy Jane et al. (2016) [101] | Severity estimation (H&Y scores classification) | 93 PD patients and 73 healthy controls from 3 merged datasets [106] | Vertical GRF signals from 8 sensors located underneath each foot | Q-backpropagation/Levenberg–Marquardt/GD/GDM/SCG for a time delay NN | Best performing: Q-BTDNN with ACC = 90.91–92.19% |
Balaji et al. (2021) [105] | Severity estimation (H&Y scores classification) | 93 PD patients and 73 healthy controls from 3 merged datasets [106] | Vertical GRF signals from 8 sensors located underneath each foot | Normalization + spatiotemporal feature extraction + Shapiro–Wilk test + feature selection + kNN/NB/Bagging classifier/SVM | Best performing: SVM with ACC = 98.8%, SENS = 96.6%, SPEC = 99.6%, PPV = 99.1%, FPR = 3.4%, PREC = 99.1%, F-SCORE = 97.8%, MCC = 0.98 |
Papavasileiou et al. (2017) [107] | Differential diagnosis between PD, post-stroke patients and healthy controls | 5 PD patients, 3 post-stroke patients and 3 healthy controls | Ground contact force data from barometric pressure sensors placed on both feet | Multiplicative multi-task feature learning (MMTFL)/single task learning (STL) | Best performing: MMTFL AUROC = 0.880–0.994 |
Khoury et al. (2019) [108] | Diagnosis and differential diagnosis between PD, HD, ALS and healthy controls | 93 PD patients and 73 healthy controls from 3 merged datasets [106] for diagnosis; 15 PD patients, 20 HD patients, 13 ALS patients and 16 healthy controls for differential diagnosis | Vertical GRF signals from 8 sensors located underneath each foot | Feature selection with RF-based wrapper method + kNN/DT/RF/NB/SVM/k-means/GMM | Best performing for diagnosis: kNN/RF/SVM with ACC = 81.25–90.91%, PREC = 81.43–89.41%, REC = 71.48–88.35%, F-SCORE = 79.45–86.83%; for PD vs. ALL differential diagnosis: kNN with ACC = 90%, PREC = 90.18%, REC = 90%, F-SCORE = 90.09% |
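The studies above that reuse the merged vertical-GRF datasets [106] typically segment the force signal into gait cycles, extract spatiotemporal features per cycle, and feed them to a classical learner. A simplified, self-contained sketch of that idea follows; the half-sine gait model, threshold segmentation, and jitter parameters are illustrative assumptions, not properties of the actual dataset:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
FS = 100  # Hz, assumed sampling rate of the force sensors

def simulate_grf(n_steps, stride_jitter):
    """Vertical GRF as half-sine stance bursts separated by swing phases
    of jittered duration (a toy gait model)."""
    sig = []
    for _ in range(n_steps):
        sig.extend(np.sin(np.linspace(0, np.pi, int(0.6 * FS))))            # stance
        sig.extend(np.zeros(int(FS * (0.4 + rng.uniform(0, stride_jitter)))))  # swing
    return np.array(sig)

def stride_features(grf):
    # Heel strikes = rising crossings of a small force threshold.
    strikes = np.flatnonzero(np.diff((grf > 0.1).astype(int)) == 1)
    stride_t = np.diff(strikes) / FS
    return [stride_t.mean(), stride_t.std(), grf.max()]

# PD gait is commonly characterized by higher stride-time variability.
X = [stride_features(simulate_grf(20, j)) for j in [0.02] * 30 + [0.25] * 30]
y = [0] * 30 + [1] * 30
clf = make_pipeline(StandardScaler(), SVC()).fit(X, y)
```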
Study | Problem | Dataset Population | Input Data | Analysis/Algorithms | Evaluation |
---|---|---|---|---|---|
Guayacán et al. (2020) [111] | Diagnosis | 11 PD patients and 11 healthy controls | Video recordings while walking | 3D spatio-temporal CNN | ACC = 88–90% |
Reyes et al. (2019) [109] | Diagnosis | 88 PD patients and 94 healthy controls | Gait samples from MS Kinect | Cropping noisy parts + LSTM/1D-CNN/CNN-LSTM | Best performing: CNN-LSTM with ACC = 83.1%, PREC = 83.5%, REC = 83.4%, F1-SCORE = 81%, Kappa = 64% |
Buongiorno (2019) [110] | Diagnosis and severity estimation (mild vs. moderate) | 16 PD patients and 14 healthy controls | Postural and kinematics features from MS Kinect v2 sensor while performing 3 motor exercises (gait, finger and foot tapping) | Feature selection + SVM/ANN | Best performing for diagnosis: gait-based ANN with ACC = 89.4%, SENS = 87.0%, SPEC = 91.8%; for severity estimation: ACC = 95.0%, SENS = 90.0%, SPEC = 99.0% |
Grammatikopoulou et al. (2019) [117] | Severity estimation (UPDRS scores classification) | 12 advanced PD patients and 6 PD patients in initial stage | Skeletal features from MS Kinect v2 RGB videos while playing an exergame | Transformation to local coordinate system + two parallel LSTMs (the 1st trained with raw joint coordinates and the 2nd with joint line distances) | ACC = 77.7% |
Tucker et al. (2015) [122] | Medication adherence estimation (on/off medication classification) | 7 PD patients | Skeletal joints 3D position, velocity and acceleration from MS Kinect | C4.5 DT for generalized model; C4.5 DT, RF, SVM, IBk for personalized models | For generalized model: ACC = 36.2–77.9%; for personalized models: ACC = 67.7–100% |
Li et al. (2018) [120] | TUG subtasks segmentation and time estimation for each subtask | 24 PD patients | Video recordings while performing TUG tests | Pose estimation with OpenPose/Iterator Error Feedback + SVM/LSTM | Best performing: OpenPose + LSTM with ACC = 93.1%, PREC = 80.8–97.5%, REC = 86.3–97%, F1-SCORE = 83.5–97.3% for subtasks segmentation and MAE = 0.32–1.07 for time estimation |
Wei et al. (2019) [123] | Development of a virtual physical therapist: movement recognition (repetitions and sub-actions detection), patient’s errors identification (satisfactory/non-satisfactory performance), task recommendation (regress/repeat/progress) | 35 PD patients | Motion data recorded by MS Kinect v2 sensor while performing 3 balance/agility tasks | HMM for repetitions and sub-actions detection + linear-SVM for movement errors identification + {majority undersampling/minority oversampling/decision threshold adjustment/hybrid oversampling with feature standardization and interpolation + RF} for task recommendation | For repetitions detection: ACC = 97.1–99.4%; for sub-actions segmentation: SENS = 88.4–96.9%, SPEC = 97.2–98.8%; for errors identification: ACC = 86.3–94.2%; best performing for tasks recommendation: hybrid oversampling + RF with ACC = 81.8–95.7%, FPR = 2.8–5.4% |
Hu et al. (2019) [121] | FoG detection | 45 PD patients | Videos collected while performing TUG tests | Graph representation of videos + pretrained features (Res-Net 50 vertex, C3D vertex and context features) + graph sequence-RNN (Bi-directional GS-GRU/Bi-directional GS-LSTM/forward GS-GRU/forward GS-LSTM) + fusion | Best performing: linear fusion of Bi-directional GS-GRU with context model with AUC = 0.90, SENS = 83.8%, SPEC = 82.3%, ACC = 82.5%, Youden’s J = 0.66, FPR = 17.7%, FNR = 16.2% |
Li et al. (2018) [118] | Binary classification between pathological (PD/LID) and normal motion; multiclass classification (PD with LID, PD without LID and normal); levodopa-induced dyskinesia severity (UDysRS-III scores) estimation; parkinsonism severity (UPDRS-III scores) estimation | 9 PD patients | 2D-Video recordings while performing communication and drinking tasks (for dyskinesia detection) and while performing leg agility and toe tapping tasks (for parkinsonism detection) | Convolutional pose estimators + RF | For binary classification: AUC = 0.634–0.930, F1-SCORE = 50–90.6%; for multiclass classification: ACC = 71.4%, SENS = 83.5–96.2%, SPEC = 68.4–95.7%; for UDysRS-III estimation: RMSE = 2.906, r = 0.741; for UPDRS-III estimation: RMSE = 7.765, r = 0.53 |
Vivar-Estudillo et al. (2018) [112] | Diagnosis | 18 PD patients and 22 healthy controls | Position, velocity and rotation data regarding hand movements from leap motion sensor | Texture features extraction with SDH + kNN/SVM/DT/LDA/LR/ensembles | Best performing: bagged tree with ACC = 98.62%, SENS = 98.43%, SPEC = 98.80%, PREC = 98.80% |
Moshkova et al. (2020) [113] | Diagnosis | 16 PD patients and 16 healthy controls | Signals from leap motion sensor while performing hand motor tasks according to the MDS-UPDRS-III | Features extraction + kNN/SVM/DT/RF | Best performing: SVM with ACC = 98.4% when features are extracted from all the tasks |
Ali et al. (2020) [114] | Diagnosis; classification between PD patients with medication, without medication and healthy controls | 87 PD patients with medication, 119 PD patients without medication and 139 healthy controls | Videos while performing hand motor tasks | Segmentation to frames + temporal segmentation with CNN + spatial segmentation with CNN-AE + FFT for feature extraction + SVM | Best performance when combining 2 tasks for diagnosis: ACC = 91.8%; for 3-class classification: ACC = 73.5% |
Liu et al. (2019) [119] | Severity estimation (Bradykinesia-related MDS-UPDRS scores classification) | 60 PD patients | Video recordings while performing hand motor tests | Pose estimator NN + feature extraction + kNN/RF/linear-SVM/RBF-SVM | Best performing: RBF-SVM with ACC = 89.7%, PREC = 20–100%, REC = 60–100%, F1-SCORE = 33.3–100% |
Rajnoha et al. (2018) [116] | Diagnosis | 50 PD patients and 50 age-matched healthy controls | Face images extracted from video recordings | HOG for face detection + CNN for embeddings generation + kNN/DT/RF/XGBoost/SVM | Best performing: DT with ACC = 67.33% (leave-one-out cross-validation); RF with ACC = 60.7–85.92% (train-test split) |
Jin et al. (2020) [115] | Diagnosis | 33 PD patients and 31 elderly healthy subjects | Short videos while imitating images of smiley people | Splitting videos to frames + coordinate points extraction with Face++ + transformation from absolute to relative coordinates + features extraction + LASSO + LR/SVM/DT/RF/LSTM/RNN | Best performing: SVM with PREC = 99%, REC = 99%, F1-SCORE = 99% |
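Several of the video-based pipelines above normalize pose or facial landmarks before feature extraction — for instance, Jin et al. [115] transform absolute to relative coordinates. A minimal sketch of such a normalization, making landmark positions invariant to translation and uniform scaling (the reference and scale landmark indices are hypothetical):

```python
import numpy as np

def to_relative(landmarks, ref_idx=0, scale_pair=(0, 1)):
    """Convert absolute 2D landmark coordinates to translation- and
    scale-invariant relative coordinates."""
    pts = np.asarray(landmarks, dtype=float)
    origin = pts[ref_idx]                                   # recentre on a reference point
    scale = np.linalg.norm(pts[scale_pair[0]] - pts[scale_pair[1]])  # normalize by a landmark distance
    return (pts - origin) / scale

frame = np.array([[100.0, 120.0], [140.0, 120.0], [120.0, 160.0]])
# Same pose, seen shifted and zoomed (e.g., subject closer to the camera).
shifted_and_zoomed = frame * 2.0 + np.array([50.0, -30.0])
rel_a = to_relative(frame)
rel_b = to_relative(shifted_and_zoomed)
print(np.allclose(rel_a, rel_b))  # True: identical after normalization
```

Because the classifier then sees only relative geometry, it cannot latch onto camera distance or framing, which differ between recording sessions.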
Study | Problem | Dataset Population | Input Data | Analysis/Algorithms | Evaluation |
---|---|---|---|---|---|
Zhang et al. (2017) [127] | Diagnosis | 23 PD patients and 4 healthy controls from the Oxford’s dataset (https://archive.ics.uci.edu/ml/datasets/parkinsons+telemonitoring, accessed on 17 February 2022); 20 PD patients and 20 healthy controls from the Istanbul dataset (https://archive.ics.uci.edu/ml/datasets/Parkinson%27s+Disease+Classification, accessed on 17 February 2022) | Vocal measurements from a smartphone application | Stacked AEs + KELM/linear-SVM/MLP-SVM/RBF-SVM/kNN/NB/CART/LDA | Best performing: kNN with ACC = 94–98% |
Zhang et al. (2018) [124] | Diagnosis | 500 PD patients and 500 healthy controls from the mPower study [126] | Vocal measurements from a smartphone application | DT-STFT + LSTM/CNN | Best performing: CNN with ACC = 90.45% |
Tougui et al. (2020) [125] | Diagnosis | 453 PD patients and 1037 healthy controls from the mPower study [126] | Vocal measurements from a smartphone application | Time, frequency and cepstral domain (with DFT) features + feature selection with ANOVA/LASSO + linear-SVM/kNN/RF/XGBoost | Best performing: LASSO + XGBoost with ACC = 95.78%, SENS = 95.32%, SPEC = 96.23%, F1-SCORE = 95.74% |
Almeida et al. (2019) [128] | Diagnosis | 64 PD patients and 35 healthy controls | Audio recordings from acoustic cardioid and smartphone | Phonation/speech/unvoiced/voiced features + kNN/MLP/OPF/SVM | Best performing: kNN based on phonation features with ACC = 92.94–94.55%, SENS = 92.94–94.55%, SPEC = 89.21–94.26%, AUROC = 0.87–0.92 |
Arora et al. (2021) [129] | Differential diagnosis between PD patients, patients with RBD and healthy controls; severity (MDS-UPDRS, MoCA, ESS, BDI and VAS scores) estimation | 335 PD patients, 112 patients with RBD and 92 healthy controls | Speech recording from smartphones | Segmentation + feature extraction + feature selection + RF | For all the pairwise classifications: SENS = 59.4–74.9%, SPEC = 67.4–73.2%; for severity estimation: MAE = 1–8 (MDS-UPDRS), MAE = 1–14 (MDS-UPDRS I-III), MAE = 1–2 (MoCA), MAE = 2–3 (ESS), MAE = 1–5 (BDI), MAE = 6.5–10 (VAS) |
Bayestehtashk et al. (2015) [132] | Severity (mUPDRS 0–108 scores) estimation | 168 PD patients | Speech recordings from a portable device | Feature extraction with harmonic model + Ridge/LASSO regression/linear-SVR | Best performing: ridge regression with MAE = 5.5, explained variance = 61% |
Yoon et al. (2019) [130] | Severity (UPDRS 0–176 scores) estimation | 42 PD patients | Phonation and speech recordings from an at-home testing device | Standardization + feature extraction + {one-model-fits-all approaches with DT/GP/linear regression/SVR/ensemble}/{single learning approaches with DT/GP/linear regression/SVR/ensemble}/positive transfer learning based on Bayesian parameter transfer model | Best performing: positive transfer learning with MAE = 2–3 approximately |
Raza et al. (2021) [131] | Severity (UPDRS and mUPDRS scores) prediction in 6 months | 42 PD patients | Phonation and speech recordings from an at-home testing device | Feature extraction + XGBoost | MAE = 6.45 (UPDRS), MAE = 5.09 (mUPDRS) |
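The voice-based studies above typically combine time-, frequency- and cepstral-domain features with a classical learner. The sketch below extracts real-cepstrum coefficients from synthetic sustained-phonation frames and fits a random forest; the frame length, sampling rate, coefficient count, and the jitter-based class model are all illustrative assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
FS, N = 8000, 1024  # assumed sampling rate and frame length

def cepstral_coeffs(frame, n_coeffs=12):
    # Real cepstrum: inverse FFT of the log magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) + 1e-10
    cepstrum = np.fft.irfft(np.log(spectrum))
    return cepstrum[1 : n_coeffs + 1]  # skip the 0th (energy) coefficient

def phonation(f0_jitter):
    """Sustained vowel-like tone; impaired voices are modeled with
    larger fundamental-frequency jitter (a toy assumption)."""
    f0 = 120 + rng.normal(0, f0_jitter)
    t = np.arange(N) / FS
    return np.sin(2 * np.pi * f0 * t) + 0.3 * np.sin(2 * np.pi * 2 * f0 * t) \
        + rng.normal(0, 0.05, N)

X = np.array([cepstral_coeffs(phonation(j)) for j in [1] * 40 + [8] * 40])
y = np.array([0] * 40 + [1] * 40)
clf = RandomForestClassifier(random_state=0).fit(X, y)
```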
Study | Problem | Dataset Population | Input Data | Analysis/Algorithms | Evaluation |
---|---|---|---|---|---|
Rahman et al. (2020) [133] | Diagnosis | 5 PD patients and 5 healthy controls | EEG signals from a portable headset with sensors placed at the forehead while watching 4 videos which provoke 4 different emotions | Feed-forward NN trained with Adam optimization algorithm | ACC = 96.5%, PREC = 95.5%, REC = 97%, F1-SCORE = 97.6% |
Kleinholdermann et al. (2021) [134] | Severity (MDS-UPDRS III scores) estimation | 45 PD patients | sEMG signals from a wrist-worn band while performing a simple tapping activity | Windowing + feature extraction + linear regression/poly-SVM/kNN/RF | Best performing: RF regression with r = 0.739 |
Capecci et al. (2019) [135] | Emotion (positive/negative) recognition | 36 PD patients | Body temperature, heart rate and galvanic response from smartwatch sensors | Linear-SVM/poly-SVM/RBF-SVM | Best performing: RBF-SVM with ACC = 88.6–91.3% |
Lacy et al. (2018) [136] | Diagnosis | 49 PD patients and 41 healthy controls (1st dataset); 58 PD patients and 29 healthy controls (2nd dataset) | Position measures from 2 electromagnetic sensors located at the thumb and index finger while performing finger-tapping tests | Low-pass Butterworth filtering + velocity and acceleration features extraction from derivatives + ESN | AUROC = 0.802 |
Picardi et al. (2017) [137] | Diagnosis; classification between different cognition levels (PD patients with normal cognition-PDNC, PD patients with mild cognitive impairment-PDMCI and PD patients with dementia-PDD) | 22 PDNC, 23 PDMCI, 10 PDD and 30 age-matched healthy controls | Flexion signals from a glove with finger-mounted sensors and position and orientation information from a wrist-worn tracking system | Feature extraction + Cartesian Genetic Programming/SVM/ANN | Similar performance for all the algorithms: AUROC = 0.72–0.99 for all the pair-wise classifications |
Memedi et al. (2015) [138] | Symptom detection (bradykinesia/dyskinesia) | 65 advanced PD patients | Spatiotemporal features from spiral drawings, produced with a touchscreen telemetry device | PCA + MLP/RF/RBF-SVM/linear-SVM/LR | Best performing: MLP with ACC = 84.0%, SENS = 75.7%, SPEC = 88.9%, AUROC = 0.86, weighted Kappa = 0.65 |
Pham et al. (2019) [139] | Diagnosis | 42 PD patients and 43 healthy controls from the neuroQWERTY MIT-CSXPD database (https://www.physionet.org/content/nqmitcsxpd/1.0.0/, accessed on 17 February 2022) | Keystroke log time series | CNN/LSTM/CNN-GoogleNet/CNN-AlexNet/LSTM-fuzzy recurrent plots (FRP) | Best performing: LSTM-FRP (m = 3) with ACC = 65.14–81.90%, SENS = 66.67–95%, SPEC = 63.33–66.67% |
Matarazzo et al. (2019) [140] | Medication response detection (improved/not changed) and medication response prediction in 21 weeks | 29 PD patients and 30 age-matched healthy controls | Keystroke logs with the help of neuroQWERTY software | RNN | For medication response detection: ACC = 76.5%, AUROC = 0.75, kappa = 0.47; for medication response prediction: AUROC = 0.69–0.75 |
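The keystroke-based studies above build their inputs from raw press/release timestamps. A minimal sketch of the two standard keystroke-dynamics quantities, hold time and flight time (the summary-statistics feature set is illustrative, not the neuroQWERTY feature set):

```python
import numpy as np

def keystroke_features(press_times, release_times):
    """Hold-time and flight-time statistics from raw keystroke logs."""
    press = np.asarray(press_times)
    release = np.asarray(release_times)
    hold = release - press             # how long each key is held down
    flight = press[1:] - release[:-1]  # gap between releasing one key and pressing the next
    return {"hold_mean": hold.mean(), "hold_std": hold.std(),
            "flight_mean": flight.mean(), "flight_std": flight.std()}

# Four keystrokes with timestamps in seconds (synthetic example).
press = np.array([0.00, 0.35, 0.71, 1.10])
release = np.array([0.09, 0.47, 0.80, 1.22])
feats = keystroke_features(press, release)
```

Slowed or irregular hold and flight times are the kind of fine-motor signal these models exploit, which is why such features can be collected passively during everyday typing.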
Study | Problem | Dataset Population | Input Data | Analysis/Algorithms | Evaluation |
---|---|---|---|---|---|
Aharonson et al. (2018) [141] | Diagnosis | 22 PD patients and 20 healthy controls | Signals from sensors mounted on a support walker (2 encoders on the wheels, 2 force sensors underneath the hand grips and a tri-axial accelerometer) while performing two walking tests | Filtering + wavelet denoising for accelerometer signals + differentiation for encoder signals + ranked feature selection/PCA + k-means/FLDA | Best performing: PCA + FLDA with SENS = 91–96%, SPEC = 95–100% |
Pardoel et al. (2021) [142] | FoG detection (pre-fog, transition, FoG, total-FoG, no-FoG events classification) | 11 PD patients | Accelerometer and gyroscope signals from 4 foot-mounted sensors and plantar pressure distribution data from in-sole sensors | Data windowing + feature extraction in time and frequency domain with FFT and DWT + feature selection with mRMR/Relief-f + RUSBoost | For total-FoG/no-FoG classification: SENS = 61.9–78%, SPEC = 83.2–91.6%; for FoG/no-FoG classification: SENS = 81.4–98.5%, SPEC = 83.2–91.6% |
Wu et al. (2020) [143] | Severity estimation (UPDRS-III scores classification) | 17 PD patients | Acceleration signals from hand-mounted sensors and displacement signals from detection devices | Detrending + Wavelet transform + PCA + linear regression/SVM/NN/RF | Best performing: NN with ACC = 91.18–95.30% |
Cole et al. (2014) [144] | Tremor and dyskinesia detection; severity estimation for tremor and dyskinesia | 8 PD patients and 4 healthy controls | Acceleration and electromyography signals | Dynamical DNN/SVM/HMM for tremor and dyskinesia detection; Bayesian maximum likelihood classifier for severity estimation | Best performing for tremor detection: HMM with global error = 6.1%; for dyskinesia detection: DNN with global error = 8.8%; for tremor severity estimation: SENS = 95.2–97.2%, SPEC = 97.1–99.3%; for dyskinesia severity estimation: SENS = 91.9–95%, SPEC = 94.6–98.6% |
Hossen et al. (2012) [145] | Differential diagnosis between PD and ET patients | 39 PD patients and 41 ET patients | Signals from a hand-mounted accelerometer and 2 sEMG sensors placed at the forearm flexors and extensors | Filtering + feature extraction with SDE based on wavelet decomposition + MLP trained with the back-propagation algorithm | ACC = 91.6%, SENS = 95%, SPEC = 88.2% |
Tahafchi et al. (2019) [146] | FoG detection | 4 PD patients | Accelerometer and gyroscope signals from 2 foot-mounted IMUs and EMG signals from 2 Shimmer modules | Fully connected NN | AUC = 0.906–0.963 |
Huo et al. (2020) [147] | Diagnosis; Severity (UPDRS scores) estimation; PD patients with UPDRS > 0 and PD patients with UPDRS = 0 (due to DBS) classification | 23 PD patients and 10 healthy controls | Bio-signals from hand-placed force sensor, 3 IMUs and 4 MMG sensors during different symptoms measurements (elbow rigidity, wrist rigidity, bradykinesia, kinetic tremor, postural tremor, rest tremor) | Voting classifier of 3 best performing basic classifiers (kNN, MLP and AdaBoost) | For severity estimation: ACC = 80.1–91.8% (avg = 85.4%); for diagnosis: ACC = 95.0–98.9% (avg = 96.6%); for PD patients with UPDRS > 0 and UPDRS = 0 classification: ACC = 85.2–91.1% (avg = 89.0%) |
Yu et al. (2018) [148] | Severity estimation; Fall risk detection | 22 PD patients | Signals from accelerometers, gyroscopes and thermometers embedded in 5 sensors attached to the chest, both thighs and feet while performing TUG tests | CNN for multi-source multi-task learning/single feature-based assessment + kNN/SVM/NB | Best performing for severity estimation: CNN for MML with RMSE = 0.060; for fall risk detection: PREC = 92.5%, REC = 95.8%, F-SCORE = 94% |
Sajal et al. (2020) [149] | Diagnosis and severity estimation (voice and rest tremor MDS-UPDRS scores classification) | 52 PD patients for tremor measurements; 23 PD patients and 8 healthy subjects for voice measurements (https://archive.ics.uci.edu/ml/datasets/parkinsons, accessed on 17 February 2022) | Rest tremor data from a smartphone built-in accelerometer and vowel phonation recordings from a smartphone | For tremor data: detrending + wavelet filtering; for vocal data: bandpass filter + down-sampling + mRMR feature selection algorithm + kNN/SVM/NB + majority voting | Best performing for severity estimation: kNN with ACC = 93.7%, SENS = 94.6%, SPEC = 93.7% (vocal features), ACC = 90.5–98.5%, SENS = 87.5–94%, SPEC = 96–100% (tremor features); for diagnosis: kNN + SVM + NB ensemble averaging with ACC = 99.8% |
Oung et al. (2018) [150] | Severity estimation (healthy/mild/moderate/severe classification) | 15 healthy controls, 20 PD patients with mild severity, 20 PD patients with moderate severity, 15 PD patients with severe symptoms | Accelerometer, gyroscope and magnetometer data from 4 wrist- and limb-mounted IMUs and speech signals recorded with a headset | Segmentation + EWT for motor signals/EWPT for speech signals + Hilbert transformations + wavelet energy and entropy-based feature extraction + kNN/PNN/ELM | Best performing: ELM with ACC = 92.45–95.93% |
Papadopoulos et al. (2020) [151] | Diagnosis; tremor/fine-motor impairment detection | 14 PD patients and 8 healthy controls (1st dataset), 26 PD patients and 131 healthy controls (2nd dataset) | Accelerometer data from a smartphone sensor; typing dynamics from a smartphone virtual keyboard | Filtering + feature extraction with 1D-CNN for accelerometer data and fully connected NN for keystroke data + attention pooling module + NN classifiers | For tremor/fine-motor impairment detection on the 1st dataset: SENS = 85.4–92.8%, SPEC = 84.2–93.6%, PREC = 92.1–93%; for diagnosis on the 2nd dataset: ensemble of 10 models with AUROC = 0.834–0.868, SENS/SPEC = 60%/91.7–92%/68.9% |
Heidarivincheh et al. (2021) [152] | Diagnosis | 5 PD patients (on medication) and 5 healthy controls | Acceleration signals from a wrist-worn sensor and silhouette images from an RGB-D camera while cooking | {Convolutional VAEs + dense layers (MCPD-Net)}/CNN/unimodal VAE/RF/LSTM/other multimodal models | Best performing: MCPD-Net with PREC = 71%, REC = 77%, F1-SCORE = 66% |
Wahid et al. (2015) [153] | Diagnosis | 23 PD patients and 26 healthy controls | Spatial-temporal gait features captured by a video-camera and GRF from force platforms | Filtering + Normalization through multiple regression + KFD/NB/kNN/SVM/RF | Best performing: RF with AUROC = 0.96, ACC = 92.6%, SENS = 96%, SPEC = 90% |
Albani et al. (2019) [154] | Diagnosis and severity estimation (UPDRS scores classification) | 25 PD patients and 15 healthy controls for the final testing of the pre-trained models | Video recordings of upper-limbs from an RGB-D camera and accelerometer, gyroscope and magnetometer signals from 3 wearable sensors attached to the thighs and the chest | NB/LDA/MNR/kNN/poly-SVM for upper-limb classification (1st pretrained model); (PCA) + kNN/SVM for lower-limb classification (2nd pretrained model) | Best performing for diagnosis: SVM (1st model) + kNN (2nd model) with ACC = 91.5–98.6%; for severity estimation: ACC = 60.7–79.1% |
Joshi et al. (2018) [155] | Facial expressivity estimation (classification and regression) | 117 PD patients | Short interview audio-video clips | Audio features extraction (MFCC) + visual features extraction (AU statistics) + HBNN classification/regression + contextual information (gender/sentiment) | Best performing for classification: HBNN-sentiment with F1-SCORE = 0.55; for regression: MAE = 0.48 |
Barth et al. (2012) [159] | Diagnosis | 18 PD patients and 17 healthy controls | Signals from a smart pen (acceleration, grip force, refill force, vibration sound); gyroscope and accelerometer data from a shoe-mounted IMU | Chebyshev low pass filter + linear forward feature selection with CFS/backtracking facility + LDA/linear-SVM/AdaBoost | Best performing: linear forward feature selection with CFS + AdaBoost with classification rate = 97%, SENS = 100%, SPEC = 94% |
Xu et al. (2020) [156] | Diagnosis | 31 PD patients and 35 healthy controls from an extended HandPD dataset (http://wwwp.fc.unesp.br/~papa/pub/datasets/Handpd/, accessed on 17 February 2022) | Pressure, tilt and acceleration signals from a smart pen while performing 6 different handwriting tasks | PCA + RF + voting scheme | ACC = 88.8–89.4%, SENS = 83.7–84.5%, SPEC = 93.4–93.7%, F1-SCORE = 87.2–87.7% |
Gallicchio et al. (2018) [157] | Diagnosis | 61 PD patients and 15 healthy controls from a UCI dataset (https://archive.ics.uci.edu/ml/datasets/Parkinson+Disease+Spiral+Drawings+Using+Digitized+Graphics+Tablet, accessed on 17 February 2022) | Pen position, pressure and grip angle while sketching spirals on a tablet | Deep ESN/shallow ESN + ensemble learning or not | Best performing: ensemble of DESNs with ACC = 89.33%, SENS = 90%, SPEC = 80% |
Pereira et al. (2016) [158] | Diagnosis | 14 PD patients and 21 healthy controls from an extended HandPD dataset (http://wwwp.fc.unesp.br/~papa/pub/datasets/Handpd/, accessed on 17 February 2022) | Various signals from a smart pen (microphone, finger grip, axial pressure of ink refill, tilt and acceleration) while drawing spirals and meanders | CNN architectures (ImageNet/CIFAR-10/LeNet)/OPF | Best performing for meander dataset: ImageNet with ACC = 84.74–87.14%; for spiral dataset: OPF with ACC = 77.92–83.77% |
Schwab et al. (2019) [160] | Diagnosis | 1853 subjects (PD patients and healthy subjects) from the mPower study [126] | Touchscreen data from a tapping activity, accelerometer data from a walking activity, performance in a memory game, vocal measurements and demographics from a mobile application | RF/CNN/RNN for each test + evidence aggregation model combining evidence from all the different tests | AUROC = 0.85, AUPR = 0.87, F1-SCORE = 82% |
Prince et al. (2019) [161] | Diagnosis | 1513 subjects (PD patients and healthy subjects) | Touchscreen data from a tapping activity, accelerometer data from a walking activity, performance in a memory game and vocal measurements from a mobile app | LR/RF/DNN/CNN for each test + ensemble learning combining the previous classifiers | ACC = 82.0%, F1-SCORE = 87.1% |
Cook et al. (2015) [162] | Diagnosis; classification between healthy older adults and PD patients with and without mild cognitive impairment (HOA/PDNC/HMCI/PDMCI) | 50 HOA and 25 PD patients for diagnosis; 18 HOA, 16 PDNC, 9 PDMCI, 9 HMCI for their classification | Signals from ambient sensors in a smart house environment (infrared motion sensors on the ceiling, light sensors, magnetic door sensors, temperature sensors and vibration sensors on selected items); accelerometer, gyroscope and magnetometer data from hand- and ankle- mounted sensors | No dimensionality reduction technique/PCA/k-means clustering/random resampling + DT/NB/RF/SVM/Ada-DT/Ada-RF | Best performing for diagnosis: k-means clustering + Ada-DT with ACC = 80%, AUC = 0.84; for the multiclass classification: random resampling + Ada-DT with ACC = 86%, AUC = 0.97 |
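The multi-test studies above (e.g., Schwab et al. [160], Prince et al. [161]) train one model per test and then aggregate the per-test evidence into a single decision. A minimal sketch of posterior averaging across three hypothetical feature views (the per-view class separations are synthetic assumptions):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 300)

# Three hypothetical per-test feature views (e.g., tapping, gait, voice) of the
# same subjects, each with a different amount of class separation.
views = [rng.normal(y[:, None] * shift, 1.0, (300, 4)) for shift in (0.8, 0.5, 0.3)]
models = [LogisticRegression(max_iter=1000).fit(v, y) for v in views]

def aggregate(subject_views):
    """Average the per-test posteriors into a single probability."""
    probs = [m.predict_proba(v)[:, 1] for m, v in zip(models, subject_views)]
    return np.mean(probs, axis=0)

fused = aggregate(views)
acc = ((fused > 0.5).astype(int) == y).mean()
```

A practical advantage of this late-fusion design is that a subject who skipped one test can still be scored by averaging over the tests actually completed.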
References
- Tysnes, O.-B.; Storstein, A. Epidemiology of Parkinson’s Disease. J. Neural. Transm. 2017, 124, 901–905.
- Dorsey, E.R.; Elbaz, A.; Nichols, E.; Abd-Allah, F.; Abdelalim, A.; Adsuar, J.C.; Ansha, M.G.; Brayne, C.; Choi, J.-Y.J.; Collado-Mateo, D.; et al. Global, Regional, and National Burden of Parkinson’s Disease, 1990–2016: A Systematic Analysis for the Global Burden of Disease Study 2016. Lancet Neurol. 2018, 17, 939–953.
- Kalia, L.V.; Lang, A.E. Parkinson’s Disease. Lancet 2015, 386, 896–912.
- Wolters, E.C. Variability in the Clinical Expression of Parkinson’s Disease. J. Neurol. Sci. 2008, 266, 197–203.
- Chaudhuri, K.R.; Healy, D.G.; Schapira, A.H. Non-Motor Symptoms of Parkinson’s Disease: Diagnosis and Management. Lancet Neurol. 2006, 5, 235–245.
- Ho, A.K.; Iansek, R.; Marigliani, C.; Bradshaw, J.L.; Gates, S. Speech Impairment in a Large Sample of Patients with Parkinson’s Disease. Behav. Neurol. 1998, 11, 131–137.
- Harel, B.; Cannizzaro, M.; Snyder, P.J. Variability in Fundamental Frequency during Speech in Prodromal and Incipient Parkinson’s Disease: A Longitudinal Case Study. Brain Cogn. 2004, 56, 24–29.
- Marras, C. Subtypes of Parkinson’s Disease: State of the Field and Future Directions. Curr. Opin. Neurol. 2015, 28, 382–386.
- Jankovic, J. Motor Fluctuations and Dyskinesias in Parkinson’s Disease: Clinical Manifestations. Mov. Disord. 2005, 20, S11–S16.
- Tolosa, E.; Wenning, G.; Poewe, W. The Diagnosis of Parkinson’s Disease. Lancet Neurol. 2006, 5, 75–86.
- Bhidayasiri, R.; Martinez-Martin, P. Chapter Six—Clinical Assessments in Parkinson’s Disease: Scales and Monitoring. In International Review of Neurobiology; Bhatia, K.P., Chaudhuri, K.R., Stamelou, M., Eds.; Parkinson’s Disease; Academic Press: Cambridge, MA, USA, 2017; Volume 132, pp. 129–182.
- Norvig, P.; Russell, S. Artificial Intelligence: A Modern Approach; Prentice Hall: Upper Saddle River, NJ, USA, 2002.
- Mitchell, T.M. Does Machine Learning Really Work? AI Mag. 1997, 18, 11.
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep Learning. Nature 2015, 521, 436–444.
- Atzori, L.; Iera, A.; Morabito, G. The Internet of Things: A Survey. Comput. Netw. 2010, 54, 2787–2805.
- Espay, A.J.; Bonato, P.; Nahab, F.B.; Maetzler, W.; Dean, J.M.; Klucken, J.; Eskofier, B.M.; Merola, A.; Horak, F.; Lang, A.E.; et al. Technology in Parkinson’s Disease: Challenges and Opportunities: Technology in PD. Mov. Disord. 2016, 31, 1272–1282.
- Miele, G.; Straccia, G.; Moccia, M.; Leocani, L.; Tedeschi, G.; Bonavita, S.; Lavorgna, L.; Padovani, A.; Clerico, M.; Brigo, F.; et al. Telemedicine in Parkinson’s Disease: How to Ensure Patient Needs and Continuity of Care at the Time of COVID-19 Pandemic. Telemed. e-Health 2020, 26, 1533–1536.
- Raghavendra, U.; Acharya, U.R.; Adeli, H. Artificial Intelligence Techniques for Automated Diagnosis of Neurological Disorders. Eur. Neurol. 2019, 82, 41–64.
- Pasluosta, C.F.; Gassner, H.; Winkler, J.; Klucken, J.; Eskofier, B.M. An Emerging Era in the Management of Parkinson’s Disease: Wearable Technologies and the Internet of Things. IEEE J. Biomed. Health Inform. 2015, 19, 1873–1881.
- Suzuki, M.; Mitoma, H.; Yoneyama, M. Quantitative Analysis of Motor Status in Parkinson’s Disease Using Wearable Devices: From Methodological Considerations to Problems in Clinical Applications. Park. Dis. 2017, 2017, e6139716.
- Monje, M.H.G.; Foffani, G.; Obeso, J.; Sánchez-Ferro, Á. New Sensor and Wearable Technologies to Aid in the Diagnosis and Treatment Monitoring of Parkinson’s Disease. Annu. Rev. Biomed. Eng. 2019, 21, 111–143.
- Channa, A.; Popescu, N.; Ciobanu, V. Wearable Solutions for Patients with Parkinson’s Disease and Neurocognitive Disorder: A Systematic Review. Sensors 2020, 20, 2713.
- Abou, L.; Peters, J.; Wong, E.; Akers, R.; Dossou, M.S.; Sosnoff, J.J.; Rice, L.A. Gait and Balance Assessments Using Smartphone Applications in Parkinson’s Disease: A Systematic Review. J. Med. Syst. 2021, 45, 87.
- Little, M.A. Smartphones for Remote Symptom Monitoring of Parkinson’s Disease. J. Park. Dis. 2021, 11, S49–S53.
- Ireland, D.; Liddle, J.; Mcbride, S.; Ding, H.; Knuepffer, C. Chat-Bots for People with Parkinson’s Disease: Science Fiction or Reality? In Driving Reform: Digital Health Is Everyone’s Business; IOS Press: Amsterdam, The Netherlands, 2015; pp. 128–133.
- Dias, S.B.; Konstantinidis, E.; Diniz, J.A.; Bamidis, P.; Charisis, V.; Hadjidimitriou, S.; Stadtschnitzer, M.; Fagerberg, P.; Ioakeimidis, I.; Dimitropoulos, K.; et al. Serious Games as a Means for Holistically Supporting Parkinson’s Disease Patients: The i-PROGNOSIS Personalized Game Suite Framework. In Proceedings of the 2017 9th International Conference on Virtual Worlds and Games for Serious Applications (VS-Games), Athens, Greece, 6–8 September 2017; pp. 237–244.
- Zhou, Y.; Jenkins, M.E.; Naish, M.D.; Trejos, A.L. Development of a Wearable Tremor Suppression Glove. In Proceedings of the 2018 7th IEEE International Conference on Biomedical Robotics and Biomechatronics (Biorob), Enschede, The Netherlands, 26–29 August 2018; pp. 640–645.
- Maetzler, W.; Domingos, J.; Srulijes, K.; Ferreira, J.J.; Bloem, B.R. Quantitative Wearable Sensors for Objective Assessment of Parkinson’s Disease: Wearable Sensors in PD. Mov. Disord. 2013, 28, 1628–1637.
- Ossig, C.; Antonini, A.; Buhmann, C.; Classen, J.; Csoti, I.; Falkenburger, B.; Schwarz, M.; Winkler, J.; Storch, A. Wearable Sensor-Based Objective Assessment of Motor Symptoms in Parkinson’s Disease. J. Neural Transm. 2016, 123, 57–64.
- Godinho, C.; Domingos, J.; Cunha, G.; Santos, A.T.; Fernandes, R.M.; Abreu, D.; Gonçalves, N.; Matthews, H.; Isaacs, T.; Duffen, J.; et al. A Systematic Review of the Characteristics and Validity of Monitoring Technologies to Assess Parkinson’s Disease. J. Neuroeng. Rehabil. 2016, 13, 24.
- Del Din, S.; Kirk, C.; Yarnall, A.J.; Rochester, L.; Hausdorff, J.M. Body-Worn Sensors for Remote Monitoring of Parkinson’s Disease Motor Symptoms: Vision, State of the Art, and Challenges Ahead. J. Park. Dis. 2021, 11, S35–S47.
- Ferreira-Sánchez, M.D.R.; Moreno-Verdú, M.; Cano-de-la-Cuerda, R. Quantitative Measurement of Rigidity in Parkinson’s Disease: A Systematic Review. Sensors 2020, 20, 880.
- Vienne, A.; Barrois, R.P.; Buffat, S.; Ricard, D.; Vidal, P.-P. Inertial Sensors to Assess Gait Quality in Patients with Neurological Disorders: A Systematic Review of Technical and Analytical Challenges. Front. Psychol. 2017, 8, 817.
- De Oliveira Gondim, I.T.G.; de Souza, C.D.C.B.; Rodrigues, M.A.B.; Azevedo, I.M.; de Sales, M.D.G.W.; Lins, O.G. Portable Accelerometers for the Evaluation of Spatio-Temporal Gait Parameters in People with Parkinson’s Disease: An Integrative Review. Arch. Gerontol. Geriatr. 2020, 90, 104097.
- di Biase, L.; Di Santo, A.; Caminiti, M.L.; De Liso, A.; Shah, S.A.; Ricci, L.; Di Lazzaro, V. Gait Analysis in Parkinson’s Disease: An Overview of the Most Accurate Markers for Diagnosis and Symptoms Monitoring. Sensors 2020, 20, 3529.
- Silva de Lima, A.L.; Evers, L.J.W.; Hahn, T.; Bataille, L.; Hamilton, J.L.; Little, M.A.; Okuma, Y.; Bloem, B.R.; Faber, M.J. Freezing of Gait and Fall Detection in Parkinson’s Disease Using Wearable Sensors: A Systematic Review. J. Neurol. 2017, 264, 1642–1654. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Pardoel, S.; Kofman, J.; Nantel, J.; Lemaire, E.D. Wearable-Sensor-Based Detection and Prediction of Freezing of Gait in Parkinson’s Disease: A Review. Sensors 2019, 19, 5141. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Sun, M.; Watson, A.; Zhou, G. Wearable Computing of Freezing of Gait in Parkinson’s Disease: A Survey. Smart Health 2020, 18, 100143. [Google Scholar] [CrossRef]
- Maitín, A.M.; García-Tejedor, A.J.; Muñoz, J.P.R. Machine Learning Approaches for Detecting Parkinson’s Disease from EEG Analysis: A Systematic Review. Appl. Sci. 2020, 10, 8662.
- Mei, J.; Desrosiers, C.; Frasnelli, J. Machine Learning for the Diagnosis of Parkinson’s Disease: A Review of Literature. Front. Aging Neurosci. 2021, 13, 184.
- Rovini, E.; Maremmani, C.; Cavallo, F. How Wearable Sensors Can Support Parkinson’s Disease Diagnosis and Treatment: A Systematic Review. Front. Neurosci. 2017, 11, 555.
- Thorp, J.E.; Adamczyk, P.G.; Ploeg, H.-L.; Pickett, K.A. Monitoring Motor Symptoms During Activities of Daily Living in Individuals with Parkinson’s Disease. Front. Neurol. 2018, 9, 1036.
- Ramdhani, R.A.; Khojandi, A.; Shylo, O.; Kopell, B.H. Optimizing Clinical Assessments in Parkinson’s Disease Through the Use of Wearable Sensors and Data Driven Modeling. Front. Comput. Neurosci. 2018, 12, 72.
- Belić, M.; Bobić, V.; Badža, M.; Šolaja, N.; Đurić-Jovičić, M.; Kostić, V.S. Artificial Intelligence for Assisting Diagnostics and Assessment of Parkinson’s Disease—A Review. Clin. Neurol. Neurosurg. 2019, 184, 105442.
- Zhang, H.; Song, C.; Rathore, A.S.; Huang, M.-C.; Zhang, Y.; Xu, W. MHealth Technologies Towards Parkinson’s Disease Detection and Monitoring in Daily Life: A Comprehensive Review. IEEE Rev. Biomed. Eng. 2021, 14, 71–81.
- Sica, M.; Tedesco, S.; Crowe, C.; Kenny, L.; Moore, K.; Timmons, S.; Barton, J.; O’Flynn, B.; Komaris, D.-S. Continuous Home Monitoring of Parkinson’s Disease Using Inertial Sensors: A Systematic Review. PLoS ONE 2021, 16, e0246528.
- Barrachina-Fernández, M.; Maitín, A.M.; Sánchez-Ávila, C.; Romero, J.P. Wearable Technology to Detect Motor Fluctuations in Parkinson’s Disease Patients: Current State and Challenges. Sensors 2021, 21, 4188.
- Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 Statement: An Updated Guideline for Reporting Systematic Reviews. BMJ 2021, 372, n71.
- Rastegari, E.; Ali, H. A Bag-of-Words Feature Engineering Approach for Assessing Health Conditions Using Accelerometer Data. Smart Health 2020, 16, 100116.
- Juutinen, M.; Wang, C.; Zhu, J.; Haladjian, J.; Ruokolainen, J.; Puustinen, J.; Vehkaoja, A. Parkinson’s Disease Detection from 20-Step Walking Tests Using Inertial Sensors of a Smartphone: Machine Learning Approach Based on an Observational Case-Control Study. PLoS ONE 2020, 15, e0236258.
- Fernandes, C.; Fonseca, L.; Ferreira, F.; Gago, M.; Costa, L.; Sousa, N.; Ferreira, C.; Gama, J.; Erlhagen, W.; Bicho, E. Artificial Neural Networks Classification of Patients with Parkinsonism Based on Gait. In Proceedings of the 2018 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), Madrid, Spain, 3–6 December 2018; pp. 2024–2030.
- Cuzzolin, F.; Sapienza, M.; Esser, P.; Saha, S.; Franssen, M.M.; Collett, J.; Dawes, H. Metric Learning for Parkinsonian Identification from IMU Gait Measurements. Gait Posture 2017, 54, 127–132.
- Abujrida, H.; Agu, E.; Pahlavan, K. Machine Learning-Based Motor Assessment of Parkinson’s Disease Using Postural Sway, Gait and Lifestyle Features on Crowdsourced Smartphone Data. Biomed. Phys. Eng. Express 2020, 6, 035005.
- Zhang, H.; Deng, K.; Li, H.; Albin, R.L.; Guan, Y. Deep Learning Identifies Digital Biomarkers for Self-Reported Parkinson’s Disease. Patterns 2020, 1, 100042.
- Kostikis, N.; Hristu-Varsakelis, D.; Arnaoutoglou, M.; Kotsavasiloglou, C. A Smartphone-Based Tool for Assessing Parkinsonian Hand Tremor. IEEE J. Biomed. Health Inform. 2015, 19, 1835–1842.
- Koçer, A.; Oktay, A.B. Nintendo Wii Assessment of Hoehn and Yahr Score with Parkinson’s Disease Tremor. Technol. Health Care 2016, 24, 185–191.
- Li, N.; Tian, F.; Fan, X.; Zhu, Y.; Wang, H.; Dai, G. Monitoring Motor Symptoms in Parkinson’s Disease via Instrumenting Daily Artifacts with Inertia Sensors. CCF Trans. Pervasive Comput. Interact. 2019, 1, 100–113.
- Javed, F.; Thomas, I.; Memedi, M. A Comparison of Feature Selection Methods When Using Motion Sensors Data: A Case Study in Parkinson’s Disease. In Proceedings of the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018; pp. 5426–5429.
- Williamson, J.R.; Telfer, B.; Mullany, R.; Friedl, K.E. Detecting Parkinson’s Disease from Wrist-Worn Accelerometry in the U.K. Biobank. Sensors 2021, 21, 2047.
- Park, D.J.; Lee, J.W.; Lee, M.J.; Ahn, S.J.; Kim, J.; Kim, G.L.; Ra, Y.J.; Cho, Y.N.; Jeong, W.B. Evaluation for Parkinsonian Bradykinesia by Deep Learning Modeling of Kinematic Parameters. J. Neural Transm. 2021, 128, 181–189.
- Som, A.; Krishnamurthi, N.; Buman, M.; Turaga, P. Unsupervised Pre-Trained Models from Healthy ADLs Improve Parkinson’s Disease Classification of Gait Patterns. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Montreal, QC, Canada, 20–24 July 2020; pp. 784–788.
- Ricci, M.; Di Lazzaro, G.; Pisani, A.; Mercuri, N.B.; Giannini, F.; Saggio, G. Assessment of Motor Impairments in Early Untreated Parkinson’s Disease Patients: The Wearable Electronics Impact. IEEE J. Biomed. Health Inform. 2020, 24, 120–130.
- Kuhner, A.; Schubert, T.; Cenciarini, M.; Wiesmeier, I.K.; Coenen, V.A.; Burgard, W.; Weiller, C.; Maurer, C. Correlations between Motor Symptoms across Different Motor Tasks, Quantified via Random Forest Feature Classification in Parkinson’s Disease. Front. Neurol. 2017, 8, 607.
- Caramia, C.; Torricelli, D.; Schmid, M.; Munoz-Gonzalez, A.; Gonzalez-Vargas, J.; Grandas, F.; Pons, J.L. IMU-Based Classification of Parkinson’s Disease from Gait: A Sensitivity Analysis on Sensor Location and Feature Selection. IEEE J. Biomed. Health Inform. 2018, 22, 1765–1774.
- Talitckii, A.; Kovalenko, E.; Anikina, A.; Zimniakova, O.; Semenov, M.; Bril, E.; Shcherbak, A.; Dylov, D.V.; Somov, A. Avoiding Misdiagnosis of Parkinson’s Disease with the Use of Wearable Sensors and Artificial Intelligence. IEEE Sens. J. 2021, 21, 3738–3747.
- Varghese, J.; Fujarski, M.; Hahn, T.; Dugas, M.; Warnecke, T. The Smart Device System for Movement Disorders: Preliminary Evaluation of Diagnostic Accuracy in a Prospective Study. In Digital Personalized Health and Medicine; Series of Studies in Health Technology and Informatics; IOS Press: Amsterdam, The Netherlands, 2020; Volume 270, pp. 889–893.
- Duque, J.D.L.; Egea, A.J.S.; Reeb, T.; Rojas, H.A.G.; González-Vargas, A.M. Angular Velocity Analysis Boosted by Machine Learning for Helping in the Differential Diagnosis of Parkinson’s Disease and Essential Tremor. IEEE Access 2020, 8, 88866–88875.
- Moon, S.; Song, H.-J.; Sharma, V.D.; Lyons, K.E.; Pahwa, R.; Akinwuntan, A.E.; Devos, H. Classification of Parkinson’s Disease and Essential Tremor Based on Balance and Gait Characteristics from Wearable Motion Sensors via Machine Learning Techniques: A Data-Driven Approach. J. Neuroeng. Rehabil. 2020, 17, 125.
- De Vos, M.; Prince, J.; Buchanan, T.; FitzGerald, J.J.; Antoniades, C.A. Discriminating Progressive Supranuclear Palsy from Parkinson’s Disease Using Wearable Technology and Machine Learning. Gait Posture 2020, 77, 257–263.
- Borzì, L.; Varrecchia, M.; Sibille, S.; Olmo, G.; Artusi, C.A.; Fabbri, M.; Rizzone, M.G.; Romagnolo, A.; Zibetti, M.; Lopiano, L. Smartphone-Based Estimation of Item 3.8 of the MDS-UPDRS-III for Assessing Leg Agility in People with Parkinson’s Disease. IEEE Open J. Eng. Med. Biol. 2020, 1, 140–147.
- Bazgir, O.; Frounchi, J.; Habibi, S.A.H.; Palma, L.; Pierleoni, P. A Neural Network System for Diagnosis and Assessment of Tremor in Parkinson Disease Patients. In Proceedings of the 2015 22nd Iranian Conference on Biomedical Engineering (ICBME), Tehran, Iran, 25–27 November 2015; pp. 1–5.
- Kim, H.B.; Lee, W.W.; Kim, A.; Lee, H.J.; Park, H.Y.; Jeon, H.S.; Kim, S.K.; Jeon, B.; Park, K.S. Wrist Sensor-Based Tremor Severity Quantification in Parkinson’s Disease Using Convolutional Neural Network. Comput. Biol. Med. 2018, 95, 140–146.
- Dai, H.; Cai, G.; Lin, Z.; Wang, Z.; Ye, Q. Validation of Inertial Sensing-Based Wearable Device for Tremor and Bradykinesia Quantification. IEEE J. Biomed. Health Inform. 2021, 25, 997–1005.
- Khodakarami, H.; Ricciardi, L.; Contarino, M.F.; Pahwa, R.; Lyons, K.E.; Geraedts, V.J.; Morgante, F.; Leake, A.; Paviour, D.; De Angelis, A.; et al. Prediction of the Levodopa Challenge Test in Parkinson’s Disease Using Data from a Wrist-Worn Sensor. Sensors 2019, 19, 5153.
- Mirelman, A.; Ben Or Frank, M.; Melamed, M.; Granovsky, L.; Nieuwboer, A.; Rochester, L.; Del Din, S.; Avanzino, L.; Pelosin, E.; Bloem, B.R.; et al. Detecting Sensitive Mobility Features for Parkinson’s Disease Stages Via Machine Learning. Mov. Disord. 2021, 36, 2144–2155.
- Hssayeni, M.D.; Jimenez-Shahed, J.; Burack, M.A.; Ghoraani, B. Ensemble Deep Model for Continuous Estimation of Unified Parkinson’s Disease Rating Scale III. Biomed. Eng. Online 2021, 20, 32.
- Butt, A.H.; Rovini, E.; Fujita, H.; Maremmani, C.; Cavallo, F. Data-Driven Models for Objective Grading Improvement of Parkinson’s Disease. Ann. Biomed. Eng. 2020, 48, 2976–2987.
- Stamate, C.; Magoulas, G.D.; Kueppers, S.; Nomikou, E.; Daskalopoulos, I.; Jha, A.; Pons, J.S.; Rothwell, J.; Luchini, M.U.; Moussouri, T.; et al. The CloudUPDRS App: A Medical Device for the Clinical Assessment of Parkinson’s Disease. Pervasive Mob. Comput. 2018, 43, 146–166.
- Eskofier, B.M.; Lee, S.I.; Daneault, J.-F.; Golabchi, F.N.; Ferreira-Carvalho, G.; Vergara-Diaz, G.; Sapienza, S.; Costante, G.; Klucken, J.; Kautz, T.; et al. Recent Machine Learning Advancements in Sensor-Based Mobility Analysis: Deep Learning for Parkinson’s Disease Assessment. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 655–658.
- Shawen, N.; O’Brien, M.K.; Venkatesan, S.; Lonini, L.; Simuni, T.; Hamilton, J.L.; Ghaffari, R.; Rogers, J.A.; Jayaraman, A. Role of Data Measurement Characteristics in the Accurate Detection of Parkinson’s Disease Symptoms Using Wearable Sensors. J. Neuroeng. Rehabil. 2020, 17, 52.
- San-Segundo, R.; Zhang, A.; Cebulla, A.; Panev, S.; Tabor, G.; Stebbins, K.; Massa, R.E.; Whitford, A.; de la Torre, F.; Hodgins, J. Parkinson’s Disease Tremor Detection in the Wild Using Wearable Accelerometers. Sensors 2020, 20, 5817.
- Ibrahim, A.; Zhou, Y.; Jenkins, M.E.; Naish, M.D.; Trejos, A.L. Parkinson’s Tremor Onset Detection and Active Tremor Classification Using a Multilayer Perceptron. In Proceedings of the 2020 IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), London, ON, Canada, 30 August–2 September 2020; pp. 1–4.
- Channa, A.; Ifrim, R.-C.; Popescu, D.; Popescu, N. A-WEAR Bracelet for Detection of Hand Tremor and Bradykinesia in Parkinson’s Patients. Sensors 2021, 21, 981.
- Kim, H.; Lee, H.J.; Lee, W.; Kwon, S.; Kim, S.K.; Jeon, H.S.; Park, H.; Shin, C.W.; Yi, W.J.; Jeon, B.S.; et al. Unconstrained Detection of Freezing of Gait in Parkinson’s Disease Patients Using Smartphone. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 3751–3754.
- Shi, B.; Yen, S.C.; Tay, A.; Tan, D.M.L.; Chia, N.S.Y.; Au, W.L. Convolutional Neural Network for Freezing of Gait Detection Leveraging the Continuous Wavelet Transform on Lower Extremities Wearable Sensors Data. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Montreal, QC, Canada, 20–24 July 2020; pp. 5410–5415.
- Camps, J.; Samà, A.; Martín, M.; Rodríguez-Martín, D.; Pérez-López, C.; Moreno Arostegui, J.M.; Cabestany, J.; Català, A.; Alcaine, S.; Mestre, B.; et al. Deep Learning for Freezing of Gait Detection in Parkinson’s Disease Patients in Their Homes Using a Waist-Worn Inertial Measurement Unit. Knowl.-Based Syst. 2018, 139, 119–131.
- Ashour, A.S.; El-Attar, A.; Dey, N.; El-Kader, H.A.; Abd El-Naby, M.M. Long Short-Term Memory Based Patient-Dependent Model for FOG Detection in Parkinson’s Disease. Pattern Recognit. Lett. 2020, 131, 23–29.
- Li, B.; Sun, Y.; Yao, Z.; Wang, J.; Wang, S.; Yang, X. Improved Deep Learning Technique to Detect Freezing of Gait in Parkinson’s Disease Based on Wearable Sensors. Electronics 2020, 9, 1919.
- Torvi, V.G.; Bhattacharya, A.; Chakraborty, S. Deep Domain Adaptation to Predict Freezing of Gait in Patients with Parkinson’s Disease. In Proceedings of the 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), Orlando, FL, USA, 17–20 December 2018; pp. 1001–1006.
- Arami, A.; Poulakakis-Daktylidis, A.; Tai, Y.F.; Burdet, E. Prediction of Gait Freezing in Parkinsonian Patients: A Binary Classification Augmented with Time Series Prediction. IEEE Trans. Neural Syst. Rehabil. Eng. 2019, 27, 1909–1919.
- Kleanthous, N.; Hussain, A.J.; Khan, W.; Liatsis, P. A New Machine Learning Based Approach to Predict Freezing of Gait. Pattern Recognit. Lett. 2020, 140, 119–126.
- Halder, A.; Singh, R.; Suri, A.; Joshi, D. Predicting State Transition in Freezing of Gait via Acceleration Measurements for Controlled Cueing in Parkinson’s Disease. IEEE Trans. Instrum. Meas. 2021, 70, 1–16.
- Palmerini, L.; Rocchi, L.; Mazilu, S.; Gazit, E.; Hausdorff, J.M.; Chiari, L. Identification of Characteristic Motor Patterns Preceding Freezing of Gait in Parkinson’s Disease Using Wearable Sensors. Front. Neurol. 2017, 8, 394.
- Borzì, L.; Mazzetta, I.; Zampogna, A.; Suppa, A.; Olmo, G.; Irrera, F. Prediction of Freezing of Gait in Parkinson’s Disease Using Wearables and Machine Learning. Sensors 2021, 21, 614.
- Haji Ghassemi, N.; Hannink, J.; Martindale, C.F.; Gaßner, H.; Müller, M.; Klucken, J.; Eskofier, B.M. Segmentation of Gait Sequences in Sensor-Based Movement Analysis: A Comparison of Methods in Parkinson’s Disease. Sensors 2018, 18, 145.
- Hssayeni, M.D.; Adams, J.L.; Ghoraani, B. Deep Learning for Medication Assessment of Individuals with Parkinson’s Disease Using Wearable Sensors. In Proceedings of the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018; pp. 1–4.
- Aich, S.; Youn, J.; Chakraborty, S.; Pradhan, P.M.; Park, J.-H.; Park, S.; Park, J. A Supervised Machine Learning Approach to Detect the On/Off State in Parkinson’s Disease Using Wearable Based Gait Signals. Diagnostics 2020, 10, 421.
- Pfister, F.M.J.; Um, T.T.; Pichler, D.C.; Goschenhofer, J.; Abedinpour, K.; Lang, M.; Endo, S.; Ceballos-Baumann, A.O.; Hirche, S.; Bischl, B.; et al. High-Resolution Motor State Detection in Parkinson’s Disease Using Convolutional Neural Networks. Sci. Rep. 2020, 10, 5860.
- Belgiovine, G.; Capecci, M.; Ciabattoni, L.; Fiorentino, M.C.; Foresi, G.; Monteriù, A.; Pepa, L. Upper and Lower Limbs Dyskinesia Detection for Patients with Parkinson’s Disease. In Proceedings of the 2018 IEEE 7th Global Conference on Consumer Electronics (GCCE), Nara, Japan, 9–12 October 2018; pp. 704–705.
- Watts, J.; Khojandi, A.; Vasudevan, R.; Nahab, F.B.; Ramdhani, R.A. Improving Medication Regimen Recommendation for Parkinson’s Disease Using Sensor Technology. Sensors 2021, 21, 3553.
- Nancy Jane, Y.; Khanna Nehemiah, H.; Arputharaj, K. A Q-Backpropagated Time Delay Neural Network for Diagnosing Severity of Gait Disturbances in Parkinson’s Disease. J. Biomed. Inform. 2016, 60, 169–176.
- Aversano, L.; Bernardi, M.L.; Cimitile, M.; Pecori, R. Early Detection of Parkinson Disease Using Deep Neural Networks on Gait Dynamics. In Proceedings of the 2020 International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19–24 July 2020; pp. 1–8.
- El Maachi, I.; Bilodeau, G.-A.; Bouachir, W. Deep 1D-Convnet for Accurate Parkinson Disease Detection and Severity Prediction from Gait. Expert Syst. Appl. 2020, 143, 113075.
- Xia, Y.; Yao, Z.; Ye, Q.; Cheng, N. A Dual-Modal Attention-Enhanced Deep Learning Network for Quantification of Parkinson’s Disease Characteristics. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 42–51.
- Balaji, E.; Brindha, D.; Vinodh Kumar, E.; Umesh, K. Data-Driven Gait Analysis for Diagnosis and Severity Rating of Parkinson’s Disease. Med. Eng. Phys. 2021, 91, 54–64.
- Goldberger, A.L.; Amaral, L.A.N.; Glass, L.; Hausdorff, J.M.; Ivanov, P.C.; Mark, R.G.; Mietus, J.E.; Moody, G.B.; Peng, C.-K.; Stanley, H.E. PhysioBank, PhysioToolkit, and PhysioNet. Circulation 2000, 101, e215–e220.
- Papavasileiou, I.; Zhang, W.; Wang, X.; Bi, J.; Zhang, L.; Han, S. Classification of Neurological Gait Disorders Using Multi-Task Feature Learning. In Proceedings of the 2017 IEEE/ACM International Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE), Philadelphia, PA, USA, 17–19 July 2017; pp. 195–204.
- Khoury, N.; Attal, F.; Amirat, Y.; Oukhellou, L.; Mohammed, S. Data-Driven Based Approach to Aid Parkinson’s Disease Diagnosis. Sensors 2019, 19, 242.
- Reyes, J.F.; Steven Montealegre, J.; Castano, Y.J.; Urcuqui, C.; Navarro, A. LSTM and Convolution Networks Exploration for Parkinson’s Diagnosis. In Proceedings of the 2019 IEEE Colombian Conference on Communications and Computing (COLCOM), Barranquilla, Colombia, 5–7 June 2019; pp. 1–4.
- Buongiorno, D.; Bortone, I.; Cascarano, G.D.; Trotta, G.F.; Brunetti, A.; Bevilacqua, V. A Low-Cost Vision System Based on the Analysis of Motor Features for Recognition and Severity Rating of Parkinson’s Disease. BMC Med. Inform. Decis. Mak. 2019, 19, 243.
- Guayacán, L.C.; Rangel, E.; Martínez, F. Towards Understanding Spatio-Temporal Parkinsonian Patterns from Salient Regions of a 3D Convolutional Network. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Montreal, QC, Canada, 20–24 July 2020; pp. 3688–3691.
- Vivar-Estudillo, G.; Ibarra-Manzano, M.-A.; Almanza-Ojeda, D.-L. Tremor Signal Analysis for Parkinson’s Disease Detection Using Leap Motion Device. In Advances in Soft Computing; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2018; Volume 11288, p. 353.
- Moshkova, A.; Samorodov, A.; Voinova, N.; Volkov, A.; Ivanova, E.; Fedotova, E. Parkinson’s Disease Detection by Using Machine Learning Algorithms and Hand Movement Signal from LeapMotion Sensor. In Proceedings of the 2020 26th Conference of Open Innovations Association (FRUCT), Yaroslavl, Russia, 20–24 April 2020; pp. 321–327.
- Ali, M.R.; Hernandez, J.; Dorsey, E.R.; Hoque, E.; McDuff, D. Spatio-Temporal Attention and Magnification for Classification of Parkinson’s Disease from Videos Collected via the Internet. In Proceedings of the 2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020), Buenos Aires, Argentina, 16–20 November 2020; pp. 207–214.
- Jin, B.; Qu, Y.; Zhang, L.; Gao, Z. Diagnosing Parkinson Disease through Facial Expression Recognition: Video Analysis. J. Med. Internet Res. 2020, 22, e18697.
- Rajnoha, M.; Mekyska, J.; Burget, R.; Eliasova, I.; Kostalova, M.; Rektorova, I. Towards Identification of Hypomimia in Parkinson’s Disease Based on Face Recognition Methods. In Proceedings of the 2018 10th International Congress on Ultra Modern Telecommunications and Control Systems and Workshops (ICUMT), Moscow, Russia, 5–9 November 2018; pp. 1–4.
- Grammatikopoulou, A.; Dimitropoulos, K.; Bostantjopoulou, S.; Katsarou, Z.; Grammalidis, N. Motion Analysis of Parkinson Diseased Patients Using a Video Game Approach. In Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments, Rhodes, Greece, 5–7 June 2019; pp. 523–527.
- Li, M.H.; Mestre, T.A.; Fox, S.H.; Taati, B. Vision-Based Assessment of Parkinsonism and Levodopa-Induced Dyskinesia with Pose Estimation. J. Neuroeng. Rehabil. 2018, 15, 97.
- Liu, Y.; Chen, J.; Hu, C.; Ma, Y.; Ge, D.; Miao, S.; Xue, Y.; Li, L. Vision-Based Method for Automatic Quantification of Parkinsonian Bradykinesia. IEEE Trans. Neural Syst. Rehabil. Eng. 2019, 27, 1952–1961.
- Li, T.; Chen, J.; Hu, C.; Ma, Y.; Wu, Z.; Wan, W.; Huang, Y.; Jia, F.; Gong, C.; Wan, S.; et al. Automatic Timed Up-and-Go Sub-Task Segmentation for Parkinson’s Disease Patients Using Video-Based Activity Classification. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 2189–2199.
- Hu, K.; Wang, Z.; Wang, W.; Ehgoetz Martens, K.A.; Wang, L.; Tan, T.; Lewis, S.J.G.; Feng, D.D. Graph Sequence Recurrent Neural Network for Vision-Based Freezing of Gait Detection. IEEE Trans. Image Process. 2020, 29, 1890–1901.
- Tucker, C.S.; Behoora, I.; Nembhard, H.B.; Lewis, M.; Sterling, N.W.; Huang, X. Machine Learning Classification of Medication Adherence in Patients with Movement Disorders Using Non-Wearable Sensors. Comput. Biol. Med. 2015, 66, 120–134.
- Wei, W.; McElroy, C.; Dey, S. Towards On-Demand Virtual Physical Therapist: Machine Learning-Based Patient Action Understanding, Assessment and Task Recommendation. IEEE Trans. Neural Syst. Rehabil. Eng. 2019, 27, 1824–1835.
- Zhang, H.; Wang, A.; Li, D.; Xu, W. DeepVoice: A Voiceprint-Based Mobile Health Framework for Parkinson’s Disease Identification. In Proceedings of the 2018 IEEE EMBS International Conference on Biomedical Health Informatics (BHI), Las Vegas, NV, USA, 4–7 March 2018; pp. 214–217.
- Tougui, I.; Jilbab, A.; Mhamdi, J.E. Analysis of Smartphone Recordings in Time, Frequency, and Cepstral Domains to Classify Parkinson’s Disease. Healthc. Inform. Res. 2020, 26, 274–283.
- Bot, B.M.; Suver, C.; Neto, E.C.; Kellen, M.; Klein, A.; Bare, C.; Doerr, M.; Pratap, A.; Wilbanks, J.; Dorsey, E.R.; et al. The MPower Study, Parkinson Disease Mobile Data Collected Using ResearchKit. Sci. Data 2016, 3, 160011.
- Zhang, Y.N. Can a Smartphone Diagnose Parkinson Disease? A Deep Neural Network Method and Telediagnosis System Implementation. Park. Dis. 2017, 2017, 6209703.
- Almeida, J.S.; Rebouças Filho, P.P.; Carneiro, T.; Wei, W.; Damaševičius, R.; Maskeliūnas, R.; de Albuquerque, V.H.C. Detecting Parkinson’s Disease with Sustained Phonation and Speech Signals Using Machine Learning Techniques. Pattern Recognit. Lett. 2019, 125, 55–62.
- Arora, S.; Lo, C.; Hu, M.; Tsanas, A. Smartphone Speech Testing for Symptom Assessment in Rapid Eye Movement Sleep Behavior Disorder and Parkinson’s Disease. IEEE Access 2021, 9, 44813–44824.
- Yoon, H.; Li, J. A Novel Positive Transfer Learning Approach for Telemonitoring of Parkinson’s Disease. IEEE Trans. Autom. Sci. Eng. 2019, 16, 180–191.
- Raza, M.; Awais, M.; Singh, N.; Imran, M.; Hussain, S. Intelligent IoT Framework for Indoor Healthcare Monitoring of Parkinson’s Disease Patient. IEEE J. Sel. Areas Commun. 2021, 39, 593–602.
- Bayestehtashk, A.; Asgari, M.; Shafran, I.; McNames, J. Fully Automated Assessment of the Severity of Parkinson’s Disease from Speech. Comput. Speech Lang. 2015, 29, 172–185.
- Rahman, M.A.; Tutul, A.A.; Islam, A.B.M.A.A. Solving the Maze of Diagnosing Parkinson’s Disease Based on Portable EEG Sensing to Be Adaptable to Go In-The-Wild. In Proceedings of the 7th International Conference on Networking, Systems and Security, Dhaka, Bangladesh, 22–24 December 2020; pp. 65–73.
- Kleinholdermann, U.; Wullstein, M.; Pedrosa, D. Prediction of Motor Unified Parkinson’s Disease Rating Scale Scores in Patients with Parkinson’s Disease Using Surface Electromyography. Clin. Neurophysiol. 2021, 132, 1708–1713.
- Capecci, M.; Ciabattoni, L.; Foresi, G.; Monteriù, A.; Pepa, L. A Machine-Learning Based Emotion Recognition System in Patients with Parkinson’s Disease. In Proceedings of the 2019 IEEE 9th International Conference on Consumer Electronics (ICCE-Berlin), Berlin, Germany, 8–11 September 2019; pp. 20–21.
- Lacy, S.E.; Smith, S.L.; Lones, M.A. Using Echo State Networks for Classification: A Case Study in Parkinson’s Disease Diagnosis. Artif. Intell. Med. 2018, 86, 53–59.
- Picardi, C.; Cosgrove, J.; Smith, S.L.; Jamieson, S.; Alty, J.E. Objective Assessment of Cognitive Impairment in Parkinson’s Disease Using Evolutionary Algorithm. In Applications of Evolutionary Computation; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2017; Volume 10199, p. 124.
- Memedi, M.; Sadikov, A.; Groznik, V.; Žabkar, J.; Možina, M.; Bergquist, F.; Johansson, A.; Haubenberger, D.; Nyholm, D. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease. Sensors 2015, 15, 23727–23744.
- Pham, T.D.; Wårdell, K.; Eklund, A.; Salerud, G. Classification of Short Time Series in Early Parkinson’s Disease with Deep Learning of Fuzzy Recurrence Plots. IEEE/CAA J. Autom. Sin. 2019, 6, 1306–1317.
- Matarazzo, M.; Arroyo-Gallego, T.; Montero, P.; Puertas-Martín, V.; Butterworth, I.; Mendoza, C.S.; Ledesma-Carbayo, M.J.; Catalán, M.J.; Molina, J.A.; Bermejo-Pareja, F.; et al. Remote Monitoring of Treatment Response in Parkinson’s Disease: The Habit of Typing on a Computer. Mov. Disord. 2019, 34, 1488–1495.
- Aharonson, V.; Schlesinger, I.; McDonald, A.M.; Dubowsky, S.; Korczyn, A.D. A Practical Measurement of Parkinson’s Patients Gait Using Simple Walker-Based Motion Sensing and Data Analysis. J. Med. Devices Trans. ASME 2018, 12.
- Pardoel, S.; Shalin, G.; Nantel, J.; Lemaire, E.D.; Kofman, J. Early Detection of Freezing of Gait during Walking Using Inertial Measurement Unit and Plantar Pressure Distribution Data. Sensors 2021, 21, 2246.
- Wu, H.; Zhang, Y.; Wu, X.; Yang, F. Assessment of Upper Limb Tremors in Patients with Parkinson’s Disease Based on Displacement and Acceleration Information. In Proceedings of the 2020 5th International Conference on Automation, Control and Robotics Engineering (CACRE), Dalian, China, 19–20 September 2020; pp. 177–182.
- Cole, B.T.; Roy, S.H.; De Luca, C.J.; Nawab, S.H. Dynamical Learning and Tracking of Tremor and Dyskinesia from Wearable Sensors. IEEE Trans. Neural Syst. Rehabil. Eng. 2014, 22, 982–991.
- Hossen, A.; Muthuraman, M.; Raethjen, J.; Deuschl, G.; Heute, U. A Neural Network Approach to Distinguish Parkinsonian Tremor from Advanced Essential Tremor. In Proceedings of the International Conference on Soft Computing for Problem Solving (SocProS 2011), AISC, Roorkee, India, 20–22 December 2012; Volume 130.
- Tahafchi, P.; Judy, J.W. Freezing-of-Gait Detection Using Wearable-Sensor Technology and Neural-Network Classifier. In Proceedings of the 2019 IEEE Sensors, Montreal, QC, Canada, 27–30 October 2019; pp. 1–4.
- Huo, W.; Angeles, P.; Tai, Y.F.; Pavese, N.; Wilson, S.; Hu, M.T.; Vaidyanathan, R. A Heterogeneous Sensing Suite for Multisymptom Quantification of Parkinson’s Disease. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 1397–1406.
- Yu, S.; Chen, H.; Brown, R.; Sherman, S. Motion Sensor-Based Assessment on Fall Risk and Parkinson’s Disease Severity: A Deep Multi-Source Multi-Task Learning (DMML) Approach. In Proceedings of the 2018 IEEE International Conference on Healthcare Informatics (ICHI), New York, NY, USA, 4–7 June 2018; pp. 174–179.
- Sajal, M.S.R.; Ehsan, M.T.; Vaidyanathan, R.; Wang, S.; Aziz, T.; Mamun, K.A.A. Telemonitoring Parkinson’s Disease Using Machine Learning by Combining Tremor and Voice Analysis. Brain Inform. 2020, 7, 12.
- Oung, Q.W.; Muthusamy, H.; Basah, S.N.; Lee, H.; Vijean, V. Empirical Wavelet Transform Based Features for Classification of Parkinson’s Disease Severity. J. Med. Syst. 2018, 42, 1–17.
- Papadopoulos, A.; Iakovakis, D.; Klingelhoefer, L.; Bostantjopoulou, S.; Chaudhuri, K.R.; Kyritsis, K.; Hadjidimitriou, S.; Charisis, V.; Hadjileontiadis, L.J.; Delopoulos, A. Unobtrusive Detection of Parkinson’s Disease from Multi-Modal and in-the-Wild Sensor Data Using Deep Learning Techniques. Sci. Rep. 2020, 10, 21370.
- Heidarivincheh, F.; McConville, R.; Morgan, C.; McNaney, R.; Masullo, A.; Mirmehdi, M.; Whone, A.L.; Craddock, I. Multimodal Classification of Parkinson’s Disease in Home Environments with Resiliency to Missing Modalities. Sensors 2021, 21, 4133.
- Wahid, F.; Begg, R.K.; Hass, C.J.; Halgamuge, S.; Ackland, D.C. Classification of Parkinson’s Disease Gait Using Spatial-Temporal Gait Features. IEEE J. Biomed. Health Inform. 2015, 19, 1794–1802. [Google Scholar] [CrossRef]
- Albani, G.; Ferraris, C.; Nerino, R.; Chimienti, A.; Pettiti, G.; Parisi, F.; Ferrari, G.; Cau, N.; Cimolin, V.; Azzaro, C.; et al. An Integrated Multi-Sensor Approach for the Remote Monitoring of Parkinson’s Disease. Sensors 2019, 19, 4764. [Google Scholar] [CrossRef] [Green Version]
- Joshi, A.; Ghosh, S.; Gunnery, S.; Tickle-Degnen, L.; Sclaroff, S.; Betke, M. Context-Sensitive Prediction of Facial Expressivity Using Multimodal Hierarchical Bayesian Neural Networks. In Proceedings of the 2018 13th IEEE International Conference on Automatic Face Gesture Recognition (FG 2018), Xi’an, China, 15–19 May 2018; pp. 278–285. [Google Scholar]
- Xu, S.; Pan, Z. A Novel Ensemble of Random Forest for Assisting Diagnosis of Parkinson’s Disease on Small Handwritten Dynamics Dataset. Int. J. Med. Inf. 2020, 144, 104283. [Google Scholar] [CrossRef] [PubMed]
- Gallicchio, C.; Micheli, A.; Pedrelli, L. Deep Echo State Networks for Diagnosis of Parkinson’s Disease. In Proceedings of the ESANN—European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Bruges, Belgium, 25–27 April 2018; pp. 397–402. [Google Scholar]
- Pereira, C.R.; Weber, S.A.T.; Hook, C.; Rosa, G.H.; Papa, J.P. Deep Learning-Aided Parkinson’s Disease Diagnosis from Handwritten Dynamics. In Proceedings of the 2016 29th SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI), Sao Paulo, Brazil, 4–7 October 2016; pp. 340–346. [Google Scholar]
- Barth, J.; Sünkel, M.; Bergner, K.; Schickhuber, G.; Winkler, J.; Klucken, J.; Eskofier, B. Combined Analysis of Sensor Data from Hand and Gait Motor Function Improves Automatic Recognition of Parkinson’s Disease. In Proceedings of the 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA, 28 August—1 September 2012; pp. 5122–5125. [Google Scholar]
- Schwab, P.; Karlen, W. PhoneMD: Learning to Diagnose Parkinson’s Disease from Smartphone Data. Proc. AAAI Conf. Artif. Intell. 2019, 33, 1118–1125. [Google Scholar] [CrossRef] [Green Version]
- Prince, J.; Andreotti, F.; De Vos, M. Multi-Source Ensemble Learning for the Remote Prediction of Parkinson’s Disease in the Presence of Source-Wise Missing Data. IEEE Trans. Biomed. Eng. 2019, 66, 1402–1411. [Google Scholar] [CrossRef]
- Cook, D.J.; Schmitter-Edgecombe, M.; Dawadi, P. Analyzing Activity Behavior and Movement in a Naturalistic Environment Using Smart Home Techniques. IEEE J. Biomed. Health Inform. 2015, 19, 1882–1892. [Google Scholar] [CrossRef] [PubMed]
- Habibzadeh, H.; Dinesh, K.; Rajabi Shishvan, O.; Boggio-Dandry, A.; Sharma, G.; Soyata, T. A Survey of Healthcare Internet of Things (HIoT): A Clinical Perspective. IEEE Internet Things J. 2020, 7, 53–71. [Google Scholar] [CrossRef] [PubMed]
- Barakat, B.; Taha, A.; Samson, R.; Steponenaite, A.; Ansari, S.; Langdon, P.M.; Wassell, I.J.; Abbasi, Q.H.; Imran, M.A.; Keates, S. 6G Opportunities Arising from Internet of Things Use Cases: A Review Paper. Future Internet 2021, 13, 159. [Google Scholar] [CrossRef]
- Srivastava, G.; Parizi, R.M.; Dehghantanha, A. The Future of Blockchain Technology in Healthcare Internet of Things Security. In Blockchain Cybersecurity, Trust and Privacy; Springer: Cham, Switzerland, 2020; pp. 161–184. [Google Scholar]
- Barth, J.; Oberndorfer, C.; Pasluosta, C.; Schülein, S.; Gassner, H.; Reinfelder, S.; Kugler, P.; Schuldhaus, D.; Winkler, J.; Klucken, J.; et al. Stride Segmentation during Free Walk Movements Using Multi-Dimensional Subsequence Dynamic Time Warping on Inertial Sensor Data. Sensors 2015, 15, 6419–6440. [Google Scholar] [CrossRef] [PubMed]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Giannakopoulou, K.-M.; Roussaki, I.; Demestichas, K. Internet of Things Technologies and Machine Learning Methods for Parkinson’s Disease Diagnosis, Monitoring and Management: A Systematic Review. Sensors 2022, 22, 1799. https://doi.org/10.3390/s22051799