Human Activity Recognition in a Free-Living Environment Using an Ear-Worn Motion Sensor
Abstract
1. Introduction
2. Materials and Methods
2.1. Participants
2.2. Ear-Worn Motion Sensor
2.3. Experimental Procedures
2.4. Classification Models
2.4.1. Data Segmentation
2.4.2. Shallow Learning Models
2.4.3. Deep Learning Models
2.4.4. Performance Metrics and Implementation
3. Results
3.1. Dataset Characteristics
3.2. Shallow Learning Models
3.3. Deep Learning Models
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Activity | Percentage (%) | Mean (s) | Std (s) | Min (s) | Max (s) |
---|---|---|---|---|---|
lying | 12.8 | 62.0 | 10.5 | 30.0 | 100.4 |
sitting/standing | 29.0 | 25.8 | 14.1 | 3.0 | 78.2 |
walking | 30.6 | 66.7 | 28.2 | 12.0 | 199.1 |
ascending stairs | 7.1 | 8.6 | 1.1 | 4.9 | 12.4 |
descending stairs | 6.3 | 7.7 | 1.8 | 4.3 | 33.2 |
running | 14.2 | 32.4 | 9.5 | 9.9 | 84.4 |
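Section 2.4.1 segments these activity bouts into fixed-length sliding windows (0.5 s, 1 s, and 2 s in the results tables). A minimal sketch of such segmentation; the 50 Hz sampling rate and the 50% overlap are illustrative assumptions, not values reported here:

```python
def sliding_windows(samples, win_len, step):
    """Split a 1-D sequence of sensor samples into fixed-length windows.

    samples: sequence of sensor readings (one axis)
    win_len: window length in samples
    step:    hop between window starts (win_len // 2 -> 50% overlap)
    """
    return [samples[i:i + win_len]
            for i in range(0, len(samples) - win_len + 1, step)]

# Hypothetical example: a 66.7 s walking bout at an assumed 50 Hz
fs = 50
bout = list(range(int(66.7 * fs)))        # 3335 dummy samples
wins = sliding_windows(bout, win_len=fs,  # 1 s windows
                       step=fs // 2)      # 50% overlap
```

A bout shorter than one window length yields no windows at all, which is why very brief stair bouts (minimum 4.3–4.9 s above) still comfortably fit even the 2 s window.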
Classifier | 0.5 s Window | 1 s Window | 2 s Window |
---|---|---|---|
K-Nearest Neighbors | 0.890 | 0.892 | 0.889 |
Decision Tree | 0.868 | 0.882 | 0.879 |
Support Vector Machine | 0.924 | 0.929 | 0.930 |
Naive Bayes | 0.861 | 0.867 | 0.870 |
Bagging | 0.903 | 0.910 | 0.910 |
Random Forest | 0.911 | 0.918 | 0.917 |
ExtraTrees | 0.906 | 0.911 | 0.911 |
Gradient Boosting | 0.921 | 0.925 | 0.925 |
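Read programmatically, the best-performing (classifier, window size) combination in this table can be picked with a flat argmax; the scores below are transcribed from the rows above:

```python
# Scores transcribed from the table: classifier -> {window size: score}
scores = {
    "K-Nearest Neighbors":    {"0.5 s": 0.890, "1 s": 0.892, "2 s": 0.889},
    "Decision Tree":          {"0.5 s": 0.868, "1 s": 0.882, "2 s": 0.879},
    "Support Vector Machine": {"0.5 s": 0.924, "1 s": 0.929, "2 s": 0.930},
    "Naive Bayes":            {"0.5 s": 0.861, "1 s": 0.867, "2 s": 0.870},
    "Bagging":                {"0.5 s": 0.903, "1 s": 0.910, "2 s": 0.910},
    "Random Forest":          {"0.5 s": 0.911, "1 s": 0.918, "2 s": 0.917},
    "ExtraTrees":             {"0.5 s": 0.906, "1 s": 0.911, "2 s": 0.911},
    "Gradient Boosting":      {"0.5 s": 0.921, "1 s": 0.925, "2 s": 0.925},
}

# Flatten to (classifier, window, score) triples and take the maximum score
best = max(((clf, win, s) for clf, row in scores.items()
            for win, s in row.items()), key=lambda t: t[2])
```

Here `best` is the Support Vector Machine with a 2 s window (0.930), the strongest shallow configuration in the table.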
Activity | Precision | Recall | F1-Score | Support |
---|---|---|---|---|
lying | 0.987 | 0.997 | 0.992 | 8095 |
sitting/standing | 0.984 | 0.997 | 0.990 | 16,410 |
walking | 0.939 | 0.936 | 0.937 | 19,288 |
ascending stairs | 0.700 | 0.690 | 0.695 | 2549 |
descending stairs | 0.802 | 0.828 | 0.815 | 2001 |
running | 0.995 | 0.965 | 0.979 | 8367 |
accuracy | | | 0.951 | 56,710 |
macro avg | 0.901 | 0.902 | 0.901 | 56,710 |
weighted avg | 0.951 | 0.951 | 0.951 | 56,710 |
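The macro and weighted averages in this report follow the standard definitions: an unweighted mean of the per-class scores versus a support-weighted mean. A sketch reproducing the table's F1 aggregates from its per-class rows:

```python
# Per-class F1 and support transcribed from the table above
f1 = {"lying": 0.992, "sitting/standing": 0.990, "walking": 0.937,
      "ascending stairs": 0.695, "descending stairs": 0.815, "running": 0.979}
support = {"lying": 8095, "sitting/standing": 16410, "walking": 19288,
           "ascending stairs": 2549, "descending stairs": 2001, "running": 8367}

# Macro average: every class counts equally, regardless of frequency
macro_f1 = sum(f1.values()) / len(f1)

# Weighted average: each class weighted by its number of windows (support)
total = sum(support.values())
weighted_f1 = sum(f1[c] * support[c] for c in f1) / total
```

The gap between the two (0.901 macro vs. 0.951 weighted) reflects the class imbalance: the rare stair classes drag the macro average down while barely moving the weighted one.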
Top-ranked features: (1) min, (2) min, (3) min, (5) std, (6) std, (7) iqr, (9) range, (10) range.
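A minimal sketch of per-window extraction for the statistics named above (min, std, iqr, range), stdlib only. The crude index-based quartile estimator here is an illustrative assumption, not the paper's implementation:

```python
import statistics

def window_features(window):
    """Compute min, std, iqr, and range for one window of a single
    sensor axis (a list of floats)."""
    s = sorted(window)
    n = len(s)
    q1 = s[n // 4]          # crude lower-quartile estimate (assumption)
    q3 = s[(3 * n) // 4]    # crude upper-quartile estimate (assumption)
    return {
        "min": min(window),
        "std": statistics.pstdev(window),   # population standard deviation
        "iqr": q3 - q1,
        "range": max(window) - min(window),
    }

# Hypothetical 8-sample window of accelerometer magnitudes
feats = window_features([0.1, 0.4, 0.2, 0.9, 0.5, 0.3, 0.7, 0.6])
```

In practice such features would be computed per axis and per window, then concatenated into the feature vector fed to the shallow classifiers of Section 2.4.2.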
Classifier | 0.5 s Window | 1 s Window | 2 s Window |
---|---|---|---|
DeepConvLSTM | 0.964 | 0.975 | 0.981 |
ConvTransformer | 0.966 | 0.978 | 0.977 |
Activity | Precision | Recall | F1-Score | Support |
---|---|---|---|---|
lying | 0.985 | 0.999 | 0.992 | 6010 |
sitting/standing | 0.986 | 0.997 | 0.992 | 13,518 |
walking | 0.987 | 0.967 | 0.977 | 14,262 |
ascending stairs | 0.907 | 0.937 | 0.922 | 3285 |
descending stairs | 0.970 | 0.959 | 0.965 | 2940 |
running | 0.998 | 0.996 | 0.997 | 6624 |
accuracy | | | 0.981 | 46,639 |
macro avg | 0.972 | 0.976 | 0.974 | 46,639 |
weighted avg | 0.981 | 0.981 | 0.981 | 46,639 |
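Contrasting the per-class F1 scores of the shallow-model report (Section 3.2) with the deep-model report above shows where the deep models gain most; both columns are transcribed from the two tables:

```python
# Per-class F1: shallow-model report (Section 3.2) vs. deep-model report above
shallow_f1 = {"lying": 0.992, "sitting/standing": 0.990, "walking": 0.937,
              "ascending stairs": 0.695, "descending stairs": 0.815,
              "running": 0.979}
deep_f1 = {"lying": 0.992, "sitting/standing": 0.992, "walking": 0.977,
           "ascending stairs": 0.922, "descending stairs": 0.965,
           "running": 0.997}

# Absolute F1 improvement per class, and the class that benefits most
gains = {c: round(deep_f1[c] - shallow_f1[c], 3) for c in shallow_f1}
largest = max(gains, key=gains.get)
```

The largest improvement falls on ascending stairs (+0.227 F1), followed by descending stairs (+0.150): the short, transition-heavy stair bouts are exactly where hand-crafted window statistics struggle and learned temporal features pay off.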
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Boborzi, L.; Decker, J.; Rezaei, R.; Schniepp, R.; Wuehr, M. Human Activity Recognition in a Free-Living Environment Using an Ear-Worn Motion Sensor. Sensors 2024, 24, 2665. https://doi.org/10.3390/s24092665