Recognizing Human Activity of Daily Living Using a Flexible Wearable for 3D Spine Pose Tracking
Abstract
1. Introduction
- Identifying an appropriate wearable device with multi-tasking capability that fits into the daily routine of users;
- Designing and developing a comprehensive protocol that covers a wide range of daily activities for evaluating the device's functionality, contributing to rehabilitation, HAR, and fall detection;
- Developing, implementing, testing, and comparing three lightweight neural networks to evaluate the functionality of the device;
- Dividing the study design into indoor, outdoor, and transition-state activities, 22 in total: ten indoor activities, four outdoor activities, and eight transition states, each with its own contributions;
- Evaluating the accelerometer data, the spinal (sensor strip) data, and their fusion, and identifying the contributions of correlated activities.
2. Materials and Methods
2.1. Device Description
2.2. Study Design, Protocol, and Experiment
- Outdoor-performed dynamic activities: walking, jogging, running, and walking upstairs (WU) from the third floor to the fourth floor;
- Indoor-performed activities: RLD, supine, LLD, walking, standing, BF, BB, BR, BL, and sitting. We categorize nine of these ten activities as static; the exception is walking, which involves movement rather than standing still;
- Transition states (TSs): the change between each pair of static positions, where applicable. The TSs include standing to lying down on the bed (TS1), rolling from RLD to supine (TS2), supine to LLD (TS3), standing to BF (TS4), BF to BB (TS5), BB to BR (TS6), BR to BL (TS7), and BL to sitting (TS8).
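For implementation purposes, the 22 activity classes above can be encoded as a flat label set. The sketch below uses the protocol's abbreviations; the numeric encoding itself (and the `_indoor`/`_outdoor` suffixes to disambiguate the two walking conditions) is an assumption, not the authors' exact scheme:

```python
# Sketch of the 22-class label set: 10 indoor + 4 outdoor + 8 transition states.
# Names follow the protocol's abbreviations; the numeric IDs are hypothetical.
INDOOR = ["RLD", "supine", "LLD", "walking_indoor", "standing",
          "BF", "BB", "BR", "BL", "sitting"]            # 10 indoor activities
OUTDOOR = ["walking_outdoor", "jogging", "running", "WU"]  # 4 outdoor activities
TRANSITIONS = [f"TS{i}" for i in range(1, 9)]              # 8 transition states

ALL_ACTIVITIES = INDOOR + OUTDOOR + TRANSITIONS
LABEL_TO_ID = {name: i for i, name in enumerate(ALL_ACTIVITIES)}
```

Keeping the three groups in separate lists mirrors the study design, so per-group evaluation (indoor vs. outdoor vs. TSs) stays straightforward.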
2.3. Data Acquisition
2.4. Data Processing and Analysis
- Accelerometer with three features;
- Sensor strip with 50 features;
- Fusion of sensor strip and accelerometer with 53 features;
- Sensor strip with dimensionally reduced features using principal component analysis (PCA);
- Fusion of accelerometer data with the PCA-reduced sensor strip features.
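The five feature configurations above can be sketched as follows, on synthetic data. The sampling rate (50 Hz), the non-overlapping windowing, and the SVD-based PCA retaining 95% of the variance are assumptions for illustration, not the authors' exact pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

FS = 50          # assumed sampling rate (Hz); not stated in this excerpt
WINDOW_S = 10    # ten-second window (the two-second window works the same way)

# Synthetic streams: 3 accelerometer channels, 50 sensor-strip channels.
n_samples = FS * 60
accel = rng.standard_normal((n_samples, 3))
strip = rng.standard_normal((n_samples, 50))

def segment(x, fs=FS, window_s=WINDOW_S):
    """Split a (samples, channels) stream into non-overlapping windows."""
    win = fs * window_s
    n_win = x.shape[0] // win
    return x[: n_win * win].reshape(n_win, win, x.shape[1])

def pca_reduce(x2d, var_kept=0.95):
    """SVD-based PCA keeping the fewest components reaching `var_kept` variance."""
    xc = x2d - x2d.mean(axis=0)
    _, s, vt = np.linalg.svd(xc, full_matrices=False)
    ratio = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(ratio, var_kept)) + 1
    return xc @ vt[:k].T, k

# The five feature configurations evaluated in the study:
acc_win = segment(accel)                               # 1) accelerometer, 3 features
strip_win = segment(strip)                             # 2) sensor strip, 50 features
fusion = np.concatenate([acc_win, strip_win], axis=2)  # 3) fusion, 53 features
strip_pca, k = pca_reduce(strip)                       # 4) PCA-reduced sensor strip
pca_fusion_dim = 3 + k                                 # 5) accelerometer + PCA strip
```

On real, correlated spinal data far fewer components survive the 95% threshold (13 in the ten-second-window results below); on this white-noise stand-in, most of the 50 components are kept.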
3. Results
3.1. Ten-Second Window
3.1.1. Architecture: CNN
- Accelerometer: yielded an overall classification accuracy of 87%. Using only the accelerometer data resulted in poor classification of at least four activities (standing = 82%, BB = 40%, sitting = 75%, and WU = 0%), owing to the similar characteristics of activities such as WU and walking;
- Sensor strip: gave an overall classification accuracy of 61%. On the one hand, activities associated with spinal movements, such as BR, BL, and BB, delivered the best classification accuracies (100%, 97%, and 91%, respectively). On the other hand, static activities such as supine, LLD, standing, and RLD (16%, 16%, 25%, and 30%, respectively) yielded the poorest results, indicating that the spinal signals of the static activities share similar characteristics;
- Accelerometer and sensor strip: delivered better results than either data source alone (units one and two), with an overall classification accuracy of 94%. However, WU and BB, with 10% and 58% classification accuracy, were still not correctly detected, showing that the convolutional layers could not differentiate WU from walking, or BB from standing;
- PCA applied to the sensor strip: with PCA retaining 95% of the variance, the 50 features were reduced to 13. The overall classification accuracy dropped slightly, as CNNs benefit from as many features as possible for pattern recognition;
- PCA applied to the sensor strip and accelerometer: with PCA applied to the sensor strip combined with the accelerometer data, activities such as RLD, supine, LLD, standing, BF, BR, and running were detected with 100% accuracy (Figure 10).
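The CNN's layer configuration is not given in this excerpt. Purely to make the data flow concrete, here is a toy forward pass with random weights showing how a 1D convolutional classifier maps one fused 53-feature window to 22 activity scores (a sketch; kernel size, filter count, and pooling choice are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

N_CLASSES, N_FEATURES, WIN = 22, 53, 500   # 10 s at an assumed 50 Hz

def conv1d(x, w):
    """Valid 1D convolution: x (T, C_in), w (K, C_in, C_out) -> (T-K+1, C_out)."""
    k = w.shape[0]
    t_out = x.shape[0] - k + 1
    out = np.empty((t_out, w.shape[2]))
    for t in range(t_out):
        out[t] = np.tensordot(x[t : t + k], w, axes=([0, 1], [0, 1]))
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy weights: one conv layer (kernel 5, 16 filters) plus a dense head.
w_conv = rng.standard_normal((5, N_FEATURES, 16)) * 0.1
w_dense = rng.standard_normal((16, N_CLASSES)) * 0.1

x = rng.standard_normal((WIN, N_FEATURES))      # one fused window
h = np.maximum(conv1d(x, w_conv), 0.0)          # ReLU feature maps
pooled = h.mean(axis=0)                         # global average pooling over time
probs = softmax(pooled @ w_dense)               # 22 class probabilities
```

The global average pooling collapses the time axis before classification, which is one reason a plain CNN can struggle with activities that differ mainly in temporal order (e.g., WU vs. walking).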
3.1.2. Architecture: LSTM
- Accelerometer: delivered an overall accuracy of 80%. RLD, supine, LLD, and running (100% classification accuracy) delivered the best results, and WU, BB, and standing (20%, 31%, and 43%, respectively) the poorest. The LSTM still had difficulty distinguishing activities with similar patterns;
- Sensor strip: feeding the LSTM model with the sensor strip data resulted in poor performance, with an overall classification accuracy of 50%;
- Accelerometer and sensor strip: with a 96% overall classification accuracy, the model classified RLD, supine, LLD, BF, BR, BL, and sitting without error. In addition, WU, one of the challenging activities, reached 80% accuracy. However, BB, with a 60% classification accuracy, remained a bottleneck. The misclassification of WU can be attributed to the landing step, where the subject's last step is recognized as walking rather than ascending;
- PCA applied to the sensor strip: with PCA retaining 95% of the variance, better results were achieved than with the raw sensor strip features, which may be due to the temporal dependency modeled by LSTMs.
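To make the LSTM's temporal dependency concrete, here is a single LSTM cell unrolled over one window in plain NumPy, using the standard gate equations with random weights (an illustrative sketch, not the trained model; the hidden size of 32 is an assumption):

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_forward(x, w, u, b, hidden):
    """Run a standard LSTM cell over x (T, features); return the final hidden state."""
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    for x_t in x:
        z = x_t @ w + h @ u + b            # all four gate pre-activations at once
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)         # cell state carries memory across time
        h = o * np.tanh(c)
    return h

T, FEATURES, HIDDEN = 500, 53, 32          # 10 s at an assumed 50 Hz, fused features
w = rng.standard_normal((FEATURES, 4 * HIDDEN)) * 0.1
u = rng.standard_normal((HIDDEN, 4 * HIDDEN)) * 0.1
b = np.zeros(4 * HIDDEN)

h_final = lstm_forward(rng.standard_normal((T, FEATURES)), w, u, b, HIDDEN)
```

Because the cell state is updated sample by sample, the final hidden state summarizes the order of events within the window, which is what lets the LSTM separate sequence-dependent activities better than a pooled CNN.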
3.1.3. Architecture: Hybrid CNN-LSTM
- Accelerometer: with an overall accuracy of 84%, the hybrid CNN-LSTM was unable to classify activities such as standing, BB, WU, and sitting. This is in line with the CNN results;
- Sensor strip: delivered very poor classification performance, with results similar to the CNN model;
- Accelerometer and sensor strip: yielded an overall classification accuracy of 97%. RLD, supine, LLD, standing, BF, BR, and running were detected with 100% accuracy, while WU reached only 40%;
- PCA applied to the sensor strip: yielded better results than unit two, showing that the LSTM layer compensates for the reduced number of features.
3.2. Two-Second Window
3.2.1. Architecture: CNN
- Accelerometer: whilst the outdoor dynamic activities (walking = 99%, running = 97%) and the lying-down static activities (LLD, RLD, supine = 100%) delivered good results, the bending-related activities, activities with similar characteristics, and their corresponding TSs yielded poor classification accuracy;
- Sensor strip: comparing the outcome of the CNN model using the sensor strip in the two- and ten-second windows indicated poor performance in all activities, regardless of static, dynamic, transition, indoor, or outdoor type (except for BL). This indicates that the sensor strip data are unsuitable for the CNN model;
- Accelerometer and sensor strip: with an overall classification accuracy of 84%, the CNN model performed well on both indoor and outdoor activities. However, the TSs delivered poor results (TS1 = 66%, TS2 = 69%, TS3 = 90%, TS4 = 40%, TS5 = 44%, TS6 = 57%, TS7 = 44%, and TS8 = 67%); having less data for the TSs explains the poor classification accuracy;
- PCA applied to the sensor strip: delivered poor performance. In particular, the lying-down positions (RLD = 15%, supine = 23%, LLD = 30%), the static activities standing (38%) and sitting (59%), and all TSs yielded very poor results;
- PCA applied to the sensor strip and accelerometer: even though the TSs were still poorly detected, the classification accuracy improved by 1% compared with unit three (Figure 13).
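The per-class percentages reported throughout these sections are per-class recall, i.e., the diagonal of the confusion matrix divided by each row's total. A minimal sketch of that computation, on tiny synthetic labels (not the study's data), also illustrates how a TS gets absorbed into a neighbouring activity:

```python
import numpy as np

def per_class_accuracy(y_true, y_pred, n_classes):
    """Confusion matrix and per-class recall (diagonal / row sums), in percent."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    row_sums = cm.sum(axis=1)
    recall = np.where(row_sums > 0,
                      100.0 * cm.diagonal() / np.maximum(row_sums, 1),
                      0.0)
    return cm, recall

# Toy example with 3 classes: class 2 (a "transition") is confused with
# class 1 (its neighbouring activity) half the time.
y_true = [0, 0, 1, 1, 2, 2, 2, 2]
y_pred = [0, 0, 1, 1, 2, 1, 2, 1]
cm, recall = per_class_accuracy(y_true, y_pred, 3)
# recall -> [100.0, 100.0, 50.0]; cm[2, 1] counts the TS->neighbour confusions
```

Reading row sums rather than column sums is what makes this recall: each row normalizes over how often the true activity occurred, so under-represented classes such as the TSs are not hidden by the overall accuracy.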
3.2.2. Architecture: LSTM
- Accelerometer: except for the dynamic outdoor activities running and walking, which delivered classification accuracies greater than 90%, the remaining activities, including the TSs, were poorly classified;
- Sensor strip: with a 40% overall classification accuracy, RLD (2%) and BB (92%) delivered the poorest and best outcomes, respectively;
- Sensor strip and accelerometer: although the data fusion improved the results, the TSs were inadequately detected and were mainly misclassified as the preceding and/or following activity, likely because there are fewer TS samples than samples of the other activities;
- PCA applied to the sensor strip: compared with unit two, the overall classification accuracy improved slightly;
- PCA applied to the sensor strip and accelerometer: RLD, BL, running, walking, and BF were detected with 100%, 100%, 96%, 92%, and 91% accuracy, respectively. TS4 and supine delivered the lowest classification accuracies, at 43% and 46%, respectively (Figure 14).
3.2.3. Architecture: Hybrid CNN-LSTM
- Accelerometer: consistent with the previous architectures, the lying-down activities RLD, supine, and LLD delivered 100% classification accuracy, while none of the TSs were detected with an accuracy greater than 50%. Moreover, the majority of bending-related activities were misclassified, again mirroring the CNN architecture;
- Sensor strip: except for BL, with a 98% classification accuracy, the architecture performed poorly on almost all other activities;
- Sensor strip and accelerometer: data fusion improved the performance (RLD = 100%, supine = 100%, LLD = 100%, BF = 100%, BB = 98%, BR = 90%, BL = 100%, walking = 91%, and running = 98%). However, detecting the TSs remained challenging;
- PCA applied to the sensor strip: although the overall classification accuracy was low, it exhibited better results than unit two.
4. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Data | Architecture | RLD | Supine | LLD | Standing | Sitting | Walking | Running |
---|---|---|---|---|---|---|---|---|
Accelerometer | CNN | 100 | 100 | 100 | 82 | 75 | 100 | 100 |
Accelerometer | LSTM | 100 | 100 | 100 | 43 | 75 | 96 | 100 |
Accelerometer | Hybrid | 100 | 100 | 100 | 48 | 75 | 100 | 99 |
Fusion | CNN | 100 | 100 | 100 | 92 | 94 | 100 | 100 |
Fusion | LSTM | 100 | 100 | 100 | 97 | 100 | 98 | 97 |
Fusion | Hybrid | 100 | 100 | 100 | 90 | 97 | 99 | 100 |
PCA and fusion | CNN | 100 | 100 | 100 | 100 | 89 | 98 | 100 |
PCA and fusion | LSTM | 100 | 100 | 100 | 90 | 100 | 94 | 100 |
PCA and fusion | Hybrid | 100 | 100 | 100 | 75 | 100 | 100 | 100 |
Data | Architecture | BF | BB | BR | BL |
---|---|---|---|---|---|
Accelerometer | CNN | 28 | 91 | 100 | 97 |
Accelerometer | LSTM | 49 | 88 | 92 | 100 |
Accelerometer | Hybrid | 45 | 98 | 62 | 99 |
Fusion | CNN | 98 | 58 | 100 | 100 |
Fusion | LSTM | 100 | 60 | 100 | 100 |
Fusion | Hybrid | 100 | 88 | 100 | 100 |
PCA and fusion | CNN | 100 | 78 | 100 | 97 |
PCA and fusion | LSTM | 100 | 89 | 100 | 100 |
PCA and fusion | Hybrid | 100 | 82 | 100 | 100 |
Data | Architecture | TS1 | TS2 | TS3 | TS4 | TS5 | TS6 | TS7 | TS8 |
---|---|---|---|---|---|---|---|---|---|
Accelerometer | CNN | 48 | 73 | 90 | 40 | 25 | 11 | 56 | 39 |
Accelerometer | LSTM | 65 | 85 | 87 | 30 | 31 | 29 | 48 | 47 |
Accelerometer | Hybrid | 48 | 46 | 42 | 10 | 0 | 14 | 26 | 44 |
Fusion | CNN | 66 | 69 | 90 | 40 | 44 | 57 | 47 | 67 |
Fusion | LSTM | 84 | 96 | 77 | 33 | 47 | 54 | 56 | 53 |
Fusion | Hybrid | 50 | 31 | 19 | 37 | 38 | 31 | 49 | 47 |
PCA and fusion | CNN | 73 | 69 | 77 | 47 | 41 | 36 | 59 | 56 |
PCA and fusion | LSTM | 73 | 88 | 74 | 43 | 53 | 57 | 56 | 67 |
PCA and fusion | Hybrid | 45 | 31 | 23 | 40 | 34 | 39 | 44 | 42 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Haghi, M.; Ershadi, A.; Deserno, T.M. Recognizing Human Activity of Daily Living Using a Flexible Wearable for 3D Spine Pose Tracking. Sensors 2023, 23, 2066. https://doi.org/10.3390/s23042066