Classical Machine Learning Versus Deep Learning for the Older Adults Free-Living Activity Classification
Abstract
1. Introduction
- To develop a physical activity classification (PAC) system for an older population in free-living conditions using a deep learning approach.
- To compare the performance of a classical machine learning-based PAC system with that of a deep learning-based PAC system.
2. Materials and Methods
2.1. Dataset
2.2. Splitting Training and Testing Data
2.3. Deep Learning Algorithm for PAC
2.4. Classical Machine Learning Algorithm for PAC
3. Results and Discussion
3.1. Performance Analysis of LSTM based PAC System
3.2. Performance Analysis of Classical Machine Learning Based PAC System
3.3. Classical Machine Learning Versus Deep Learning: Which Is Better?
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
Feature # | Feature Description | Feature # | Feature Description |
---|---|---|---|
1–3 | Mean of acceleration (x, y, z) a | 40–42 | Variance of angular velocity (x, y, z) |
4–6 | Variance of acceleration (x, y, z) | 43–45 | Correlation between axes of angular velocity (x, y, z) |
7–9 | Correlation between axes of acceleration (x, y, z) | 46–48 | Energy of angular velocity (x, y, z) |
10–12 | Energy of body acceleration (BA) component (x, y, z) | 49 | SMA of the angular velocity |
13 | Signal magnitude area (SMA) of BA component | 50 | Mean of MV of angular velocity |
14 | Tilt angle obtained from gravitational acceleration (GA) component in vertical direction | 51 | Variance of MV of angular velocity |
15–17 | Mean of GA components (x, y, z) | 52 | Energy of MV of angular velocity |
18 | Mean of magnitude vector (MV) of BA component | 53–55 | Mean of jerk signal from angular velocity (x, y, z) |
19 | Variance of MV of BA component | 56–58 | Variance of jerk signal from angular velocity (x, y, z) |
20 | Energy of MV of BA component | 59–61 | Correlation between the axes of the jerk signal from angular velocity (x, y, z) |
21–23 | Mean of jerk signal from acceleration (x, y, z) | 62–64 | Energy of jerk signal from angular velocity (x, y, z) |
24–26 | Variance of jerk signal from acceleration (x, y, z) | 65 | SMA of the jerk signal from angular velocity |
27–29 | Correlation between the axes of jerk signal from acceleration (x, y, z) | 66 | Mean of MV of jerk signal from angular velocity |
30–32 | Energy of the jerk signal from acceleration (x, y, z) | 67 | Variance of MV of jerk signal from angular velocity |
33 | SMA of the jerk signal from acceleration | 68 | Energy of MV of jerk signal from angular velocity |
34 | Mean of MV of jerk signal from acceleration | 69–71 b | Attenuation constant between sensor combinations of acceleration (x, y, z) |
35 | Variance of MV of jerk signal from acceleration | 72–74 b | Correlation between sensor combinations of acceleration (x, y, z) |
36 | Energy of MV of jerk signal from acceleration | 75–77 b | Correlation between sensor combinations of angular velocity signal (x, y, z) |
37–39 | Mean of angular velocity (x, y, z) | | |
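As a rough illustration of how several of the Appendix A feature types can be derived from one raw tri-axial window, the sketch below computes mean, variance, signal magnitude area (SMA), jerk, and magnitude vector (MV) statistics. This is a minimal sketch under assumptions: the authors' exact feature definitions, filtering, and body/gravity decomposition are not reproduced here.

```python
import numpy as np

def window_features(acc, fs=100.0):
    """Illustrative features for one window of tri-axial acceleration.

    acc: (N, 3) array of x, y, z samples. Returns a dict covering a few
    of the Appendix A feature families (mean, variance, SMA, jerk, MV).
    """
    mean_xyz = acc.mean(axis=0)            # cf. features 1-3
    var_xyz = acc.var(axis=0)              # cf. features 4-6
    # Signal magnitude area: mean of summed absolute values across axes.
    sma = np.abs(acc).sum(axis=1).mean()   # cf. feature 13
    # Jerk: first time derivative of the acceleration signal.
    jerk = np.diff(acc, axis=0) * fs       # cf. features 21-36
    # Magnitude vector: Euclidean norm of each sample.
    mv = np.linalg.norm(acc, axis=1)       # cf. features 18-20
    return {"mean": mean_xyz, "var": var_xyz, "sma": sma,
            "jerk_mean": jerk.mean(axis=0), "mv_mean": mv.mean(),
            "mv_var": mv.var()}

# Example: one 5 s window at 100 Hz (500 samples), as used in the paper.
rng = np.random.default_rng(0)
feats = window_features(rng.standard_normal((500, 3)))
```

The same statistics applied to the gyroscope signal yield the angular-velocity counterparts (features 37 onward).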
Appendix B
Sensor Combination | Walking | Sitting | Standing | Lying | Overall F-Score |
---|---|---|---|---|---|
L5 | 85.2 % | 95.0 % | 80.7 % | 76.9 % | 84.4 % |
Wrist (W) | 71.5 % | 89.7 % | 64.2 % | 0.0 % | 56.3 % |
Thigh (T) | 95.0 % | 99.1 % | 96.4 % | 0.0 % | 72.6 % |
Chest (C) | 71.4 % | 86.1 % | 61.7 % | 98.5 % | 79.4 % |
T + L5 | 94.2 % | 99.6 % | 95.9 % | 81.4 % | 92.8 % |
T + C + L5 | 94.2 % | 99.9 % | 95.9 % | 99.5 % | 97.3 % |
T + C + L5 + W | 94.5 % | 99.9 % | 96.1 % | 98.5 % | 97.2 % |
References
- World Health Organization. Global Recommendations on Physical Activity for Health—2010; WHO: Geneva, Switzerland, 2015. [Google Scholar]
- McPhee, J.S.; French, D.P.; Jackson, D.; Nazroo, J.; Pendleton, N.; Degens, H. Physical activity in older age: Perspectives for healthy ageing and frailty. Biogerontology 2016, 17, 567–580. [Google Scholar] [CrossRef]
- Bangsbo, J.; Blackwell, J.; Boraxbekk, C.-J.; Caserotti, P.; Dela, F.; Evans, A.B.; Jespersen, A.P.; Gliemann, L.; Kramer, A.; Lundbye-Jensen, J.; et al. Copenhagen Consensus statement 2019: Physical activity and ageing. Br. J. Sports Med. 2019, 53, 856–858. [Google Scholar] [CrossRef] [Green Version]
- European Commission. The 2012 Ageing Report: Economic and budgetary projections for the EU-27 Member States (2010–2060). 2012. Available online: https://ec.europa.eu/economy_finance/publications/european_economy/2012/pdf/ee-2012-2_en.pdf (accessed on 1 December 2020).
- Preece, S.J.; Goulermas, J.Y.; Kenney, L.P.; Howard, D. A comparison of feature extraction methods for the classification of dynamic activities from accelerometer data. IEEE Trans. Biomed. Eng. 2008, 56, 871–879. [Google Scholar] [CrossRef]
- Figo, D.; Diniz, P.C.; Ferreira, D.R.; Cardoso, J.M. Preprocessing techniques for context recognition from accelerometer data. Pers. Ubiquitous Comput. 2010, 14, 645–662. [Google Scholar] [CrossRef]
- Anguita, D.; Ghio, A.; Oneto, L.; Parra, F.X.L.; Ortiz, J.L.R. Human Activity Recognition on Smartphones Using a Multiclass Hardware-Friendly Support Vector Machine. In International Workshop on Ambient Assisted Living; Springer: Berlin, Germany, 2012; pp. 216–223. [Google Scholar]
- Staudenmayer, J.; He, S.; Hickey, A.; Sasaki, J.; Freedson, P.S. Methods to estimate aspects of physical activity and sedentary behavior from high-frequency wrist accelerometer measurements. J. Appl. Physiol. 2015, 119, 396–403. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Awais, M.; Palmerini, L.; Chiari, L. Physical activity classification using body-worn inertial sensors in a multi-sensor setup. In Proceedings of the 2016 IEEE 2nd International Forum on Research and Technologies for Society and Industry Leveraging a better tomorrow (RTSI), Bologna, Italy, 7–9 September 2016. [Google Scholar]
- Kwon, M.-C.; Choi, S. Recognition of Daily Human Activity Using an Artificial Neural Network and Smartwatch. Wirel. Commun. Mob. Comput. 2018, 2018, 2618045. [Google Scholar] [CrossRef]
- O’Mahony, N.; Campbell, S.; Carvalho, A.; Harapanahalli, S.; Velasco-Hernández, G.; Krpalkova, L.; Riordan, D.; Walsh, J. Deep Learning vs. Traditional Computer Vision; Springer: Cham, Switzerland, 2019; pp. 128–144. [Google Scholar]
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef] [PubMed]
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
- LeCun, Y.; Boser, B.; Denker, J.; Henderson, D.; Howard, R.; Hubbard, W.; Jackel, L. Handwritten digit recognition with a back-propagation network. In Advances in Neural Information Processing Systems; Morgan Kaufmann Publishers Inc.: San Francisco, CA, USA, 1990; pp. 396–404. [Google Scholar]
- El Hihi, S.; Bengio, Y. Hierarchical recurrent neural networks for long-term dependencies. In Proceedings of the Advances in Neural Information Processing Systems, Cambridge, MA, USA, 27 November 1995; pp. 493–499. [Google Scholar]
- Rajkomar, A.; Oren, E.; Chen, K.; Dai, A.M.; Hajaj, N.; Hardt, M.; Liu, P.J.; Liu, X.; Marcus, J.; Sun, M.; et al. Scalable and accurate deep learning with electronic health records. npj Digit. Med. 2018, 1, 18. [Google Scholar] [CrossRef]
- Ahmad, T.; Chen, H. Deep learning for multi-scale smart energy forecasting. Energy 2019, 175, 98–112. [Google Scholar] [CrossRef]
- Tian, Y.; Pei, K.; Jana, S.; Ray, B. Deeptest: Automated testing of deep-neural-network-driven autonomous cars. In Proceedings of the 40th International Conference on Software Engineering, Gothenburg, Sweden, 27 May–3 June 2018; pp. 303–314. [Google Scholar]
- Amodei, D.; Ananthanarayanan, S.; Anubhai, R.; Bai, J.; Battenberg, E.; Case, C.; Casper, J.; Catanzaro, B.; Cheng, Q.; Chen, G.; et al. Deep speech 2: End-to-end speech recognition in English and Mandarin. In Proceedings of the International Conference on Machine Learning, New York, NY, USA, 19–24 June 2016; pp. 173–182. [Google Scholar]
- Heaton, J.; Polson, N.; Witte, J.H. Deep learning for finance: Deep portfolios. Appl. Stoch. Models Bus. Ind. 2017, 33, 3–12. [Google Scholar] [CrossRef]
- Bhattacharya, S.; Lane, N.D. From smart to deep: Robust activity recognition on smartwatches using deep learning. In Proceedings of the 2016 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), Sydney, Australia, 14–18 March 2016. [Google Scholar]
- Ordóñez, F.J.; Roggen, D. Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition. Sensors 2016, 16, 115. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Ronao, C.A.; Cho, S.-B. Human activity recognition with smartphone sensors using deep learning neural networks. Expert Syst. Appl. 2016, 59, 235–244. [Google Scholar] [CrossRef]
- Morales, J.; Akopian, D. Physical activity recognition by smartphones, a survey. Biocybern. Biomed. Eng. 2017, 37, 388–400. [Google Scholar] [CrossRef]
- Gochoo, M.; Tan, T.-H.; Liu, S.-H.; Jean, F.-R.; Alnajjar, F.; Huang, S.-C. Unobtrusive Activity Recognition of Elderly People Living Alone Using Anonymous Binary Sensors and DCNN. IEEE J. Biomed. Health Inform. 2018, 23, 693–702. [Google Scholar] [CrossRef] [PubMed]
- Hassan, M.M.; Huda, S.; Uddin, Z.; Almogren, A.; Alrubaian, M.A. Human Activity Recognition from Body Sensor Data using Deep Learning. J. Med Syst. 2018, 42, 99. [Google Scholar] [CrossRef]
- Hassan, M.M.; Uddin, Z.; Mohamed, A.; Almogren, A. A robust human activity recognition system using smartphone sensors and deep learning. Future Gener. Comput. Syst. 2018, 81, 307–313. [Google Scholar] [CrossRef]
- Pienaar, S.W.; Malekian, R. Human Activity Recognition using LSTM-RNN Deep Neural Network Architecture. In Proceedings of the 2019 IEEE 2nd Wireless Africa Conference (WAC), Pretoria, South Africa, 18–20 August 2019. [Google Scholar]
- Lawal, I.A.; Bano, S. Deep human activity recognition using wearable sensors. In Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments, Rhodes, Greece, 5–7 June 2019; pp. 45–48. [Google Scholar]
- Wang, H.; Zhao, J.; Li, J.; Tian, L.; Tu, P.; Cao, T.; An, Y.; Wang, K.; Li, S. Wearable Sensor-Based Human Activity Recognition Using Hybrid Deep Learning Techniques. Secur. Commun. Networks 2020, 2020, 2132138. [Google Scholar] [CrossRef]
- Papagiannaki, A.; Zacharaki, E.I.; Kalouris, G.; Kalogiannis, S.; Deltouzos, K.; Ellul, J.; Megalooikonomou, V. Recognizing physical activity of older people from wearable sensors and inconsistent data. Sensors 2019, 19, 880. [Google Scholar] [CrossRef] [Green Version]
- Aicha, A.N.; Englebienne, G.; Van Schooten, K.S.; Pijnappels, M.; Kröse, B. Deep Learning to Predict Falls in Older Adults Based on Daily-Life Trunk Accelerometry. Sensors 2018, 18, 1654. [Google Scholar] [CrossRef] [Green Version]
- Awais, M.; Palmerini, L.; Bourke, A.K.; Ihlen, E.A.F.; Helbostad, J.L.; Chiari, L. Performance Evaluation of State of the Art Systems for Physical Activity Classification of Older Subjects Using Inertial Sensors in a Real Life Scenario: A Benchmark Study. Sensors 2016, 16, 2105. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Awais, M.; Chiari, L.; Ihlen, E.A.F.; Helbostad, J.L.; Palmerini, L. Physical Activity Classification for Elderly People in Free-Living Conditions. IEEE J. Biomed. Health Inform. 2018, 23, 197–207. [Google Scholar] [CrossRef] [PubMed]
- Shakya, S.R.; Zhang, C.; Zhou, Z. Comparative Study of Machine Learning and Deep Learning Architecture for Human Activity Recognition Using Accelerometer Data. Int. J. Mach. Learn. Comput. 2018, 8, 577–582. [Google Scholar]
- Baldominos Gómez, A.; Cervantes, A.; Sáez Achaerandio, Y.; Isasi, P. A Comparison of Machine Learning and Deep Learning Techniques for Activity Recognition using Mobile Devices. Sensors 2019, 19, 521. [Google Scholar] [CrossRef] [Green Version]
- Bourke, A.; Ihlen, E.A.F.; Bergquist, R.; Wik, P.B.; Vereijken, B.; Helbostad, J.L. A Physical Activity Reference Data-Set Recorded from Older Adults Using Body-Worn Inertial Sensors and Video Technology—The ADAPT Study Data-Set. Sensors 2017, 17, 559. [Google Scholar] [CrossRef] [Green Version]
- Graves, A.; Mohamed, A.-R.; Hinton, G. Speech recognition with deep recurrent neural networks. In Proceedings of the 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, Vancouver, BC, Canada, 26–31 May 2013; pp. 6645–6649. [Google Scholar]
- Softmax Cross Entropy. Available online: https://www.tensorflow.org/api_docs/python/tf/nn/softmax_cross_entropy_with_logits (accessed on 20 August 2020).
- Adam Optimizer. Available online: https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/keras/optimizers/Adam (accessed on 20 August 2020).
- Rectified Linear Unit. Available online: https://www.tensorflow.org/api_docs/python/tf/nn/relu (accessed on 20 August 2020).
- Hall, M.A.; Smith, L.A. Feature selection for machine learning: Comparing a correlation-based filter approach to the wrapper. In Proceedings of the 12th International FLAIRS Conference, Orlando, FL, USA, 1–5 May 1999; pp. 235–239. [Google Scholar]
- Yu, L.; Liu, H. Feature selection for high-dimensional data: A fast correlation-based filter solution. In Proceedings of the 20th International Conference on Machine Learning (ICML-03), Washington, DC, USA, 21–24 August 2003; pp. 856–863. [Google Scholar]
- Moncada-Torres, A.; Leuenberger, K.; Gonzenbach, R.; Luft, A.; Gassert, R. Activity classification based on inertial and barometric pressure sensors at different anatomical locations. Physiol. Meas. 2014, 35, 1245–1263. [Google Scholar] [CrossRef]
- Awais, M.; Mellone, S.; Chiari, L. Physical Activity Classification Meets Daily Life: Review on Existing Methodologies and Open Challenges. In Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 5050–5053. [Google Scholar]
Subjects | Walk | Sit | Stand | Lie | Split |
---|---|---|---|---|---|
1 | 237 | 1001 | 449 | 54 | Testing |
2 | 187 | 454 | 119 | 13 | |
3 | 559 | 1661 | 801 | 0 | |
4 | 306 | 2493 | 406 | 0 | |
5 | 297 | 593 | 362 | 32 | |
6 | 493 | 2078 | 441 | 234 | Training |
7 | 644 | 1803 | 729 | 19 | |
8 | 323 | 568 | 495 | 23 | |
9 | 349 | 1053 | 554 | 37 | |
10 | 347 | 1762 | 654 | 35 | |
11 | 576 | 617 | 1503 | 2 | |
12 | 664 | 836 | 1293 | 0 | |
13 | 405 | 1027 | 589 | 13 | |
14 | 442 | 1871 | 774 | 0 | |
15 | 289 | 711 | 285 | 24 | |
16 | 222 | 969 | 335 | 27 | |
Total windows | 6340 | 19,497 | 9789 | 513 | |
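The split in the table above is subject-wise (windows from subjects 1–5 are held out for testing, subjects 6–16 are used for training), so no subject contributes to both sets. A minimal sketch of such a split, with hypothetical array names, could look like:

```python
import numpy as np

def split_by_subject(windows, labels, subject_ids, test_subjects):
    """Subject-wise train/test split: all windows from `test_subjects`
    go to the test set, everything else to the training set.
    `subject_ids` maps each window to the subject it was recorded from."""
    test_mask = np.isin(subject_ids, test_subjects)
    train = (windows[~test_mask], labels[~test_mask])
    test = (windows[test_mask], labels[test_mask])
    return train, test

# Toy example: 10 windows of shape (500, 24) from 4 subjects.
X = np.arange(10 * 500 * 24, dtype=np.float32).reshape(10, 500, 24)
y = np.zeros(10, dtype=np.int64)
subj = np.array([1, 1, 2, 2, 2, 3, 3, 4, 4, 4])
train, test = split_by_subject(X, y, subj, test_subjects=[1, 2])
```

Splitting by subject rather than by window avoids leaking a subject's movement style into both sets, which would inflate test performance.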
Parameter | Value |
---|---|
Window Size | N (500 samples) |
Sampling frequency | 100 Hz |
Number of features/signals | 24 (F) |
Training data feature space | (26,115, 500, 24) |
Training data label space | (26,115, 1) |
Testing data feature space | (10,024, 500, 24) |
Testing data label space | (10,024, 1) |
Cost Function | Softmax Cross Entropy [39] |
Optimizer | Adam Optimizer [40] |
LSTM Layers | 2 |
No of Hidden Units | 32 |
Activation function | ReLU [41] |
Regularization | L2 regularization |
Learning rate | 0.0025 |
Batch size | 1500 |
Loss function | Softmax cross entropy with logits |
Software used | TensorFlow with GPU support |
System used | Lenovo Legion 5 Ryzen 7 16GB 512GB SSD RTX 2060 15.6” Win10 Home Gaming Laptop |
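The hyperparameters in the table above can be assembled into a model along the following lines. This is a minimal Keras sketch, not the authors' code: the L2 strength, the placement of the ReLU layer, and the use of four output classes (the four ADLs in the results tables) are assumptions filled in around the listed values.

```python
import tensorflow as tf

def build_pac_lstm(window=500, n_signals=24, n_classes=4):
    reg = tf.keras.regularizers.l2(1e-4)  # L2 regularization; strength assumed
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(window, n_signals)),     # (500 samples, 24 signals)
        # Two LSTM layers with 32 hidden units each, as in the table.
        tf.keras.layers.LSTM(32, return_sequences=True, kernel_regularizer=reg),
        tf.keras.layers.LSTM(32, kernel_regularizer=reg),
        # ReLU activation (table row "Activation function"); placement assumed.
        tf.keras.layers.Dense(32, activation="relu", kernel_regularizer=reg),
        tf.keras.layers.Dense(n_classes),  # logits; softmax applied in the loss
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=0.0025),
        # "Softmax cross entropy with logits" from the table.
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )
    return model

model = build_pac_lstm()
```

Training would then use `model.fit(...)` with the batch size of 1500 listed above; exact epoch counts and callbacks are not given in the table.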
ADLs Classified | PAC-LSTM (%) | PAC-All-Feat (%) | PAC-CFS (%) | PAC-FCBF (%) | PAC-ReliefF (%) |
---|---|---|---|---|---|
Walking | 94.48 | 92.65 | 93.32 | 86.91 | 93.48 |
Sitting | 99.90 | 99.81 | 99.68 | 99.69 | 99.95 |
Standing | 96.09 | 95.48 | 95.29 | 91.58 | 95.41 |
Lying | 98.46 | 89.39 | 84.72 | 86.49 | 98.46 |
Overall F-score | 97.23 | 94.33 | 93.25 | 91.17 | 96.83 |
Actual Class \ Predicted Class | Walking | Sitting | Standing | Lying |
---|---|---|---|---|
Walking | 1464 | 3 | 118 | 0 |
Sitting | 2 | 6197 | 3 | 0 |
Standing | 48 | 1 | 2088 | 0 |
Lying | 0 | 3 | 0 | 96 |
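The per-class F-scores reported for PAC-LSTM can be recovered directly from this confusion matrix. A short sketch of that calculation (rows are actual classes, columns predicted, in the order walking, sitting, standing, lying):

```python
import numpy as np

# Confusion matrix values from the table above (PAC-LSTM).
cm = np.array([
    [1464,    3,  118,  0],   # walking
    [   2, 6197,    3,  0],   # sitting
    [  48,    1, 2088,  0],   # standing
    [   0,    3,    0, 96],   # lying
])

def per_class_f1(cm):
    """F-score per class: harmonic mean of column-wise precision
    and row-wise recall."""
    tp = np.diag(cm).astype(float)
    precision = tp / cm.sum(axis=0)
    recall = tp / cm.sum(axis=1)
    return 2 * precision * recall / (precision + recall)

f1 = per_class_f1(cm)
# Walking works out to ~94.5%, matching the PAC-LSTM column above.
```

The same computation applied to the four confusion matrices of the classical pipelines reproduces their per-class F-scores as well.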
(a) All features

Actual Class \ Predicted Class | Walking | Sitting | Standing | Lying |
---|---|---|---|---|
Walking | 1449 | 1 | 136 | 0 |
Sitting | 18 | 6181 | 3 | 0 |
Standing | 58 | 0 | 2079 | 0 |
Lying | 17 | 2 | 0 | 80 |

(b) CFS

Actual Class \ Predicted Class | Walking | Sitting | Standing | Lying |
---|---|---|---|---|
Walking | 1424 | 0 | 162 | 0 |
Sitting | 1 | 6164 | 4 | 33 |
Standing | 41 | 0 | 2096 | 0 |
Lying | 0 | 2 | 0 | 97 |

(c) FCBF

Actual Class \ Predicted Class | Walking | Sitting | Standing | Lying |
---|---|---|---|---|
Walking | 1268 | 0 | 318 | 0 |
Sitting | 4 | 6167 | 4 | 27 |
Standing | 60 | 0 | 2077 | 0 |
Lying | 0 | 3 | 0 | 96 |

(d) ReliefF

Actual Class \ Predicted Class | Walking | Sitting | Standing | Lying |
---|---|---|---|---|
Walking | 1442 | 1 | 143 | 0 |
Sitting | 1 | 6200 | 1 | 0 |
Standing | 56 | 0 | 2081 | 0 |
Lying | 0 | 3 | 0 | 96 |
Feature Selection Technique | Number of Features |
---|---|
All features (no feature selection) | 326 |
CFS | 18 |
FCBF | 17 |
ReliefF | 105 |
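The table shows how aggressively CFS and FCBF prune the 326-feature set compared with ReliefF. As a loose illustration of the idea behind correlation-based filters such as CFS and FCBF (not the authors' implementation, and with illustrative thresholds), a greedy filter can keep features that correlate with the class label while discarding features redundant with ones already selected:

```python
import numpy as np

def greedy_correlation_filter(X, y, relevance_thr=0.3, redundancy_thr=0.8):
    """Greedy correlation-based filter in the spirit of CFS/FCBF (a sketch):
    keep features well correlated with the label, drop features highly
    correlated with an already selected feature. Thresholds are illustrative."""
    n_features = X.shape[1]
    relevance = np.array(
        [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_features)]
    )
    selected = []
    for j in np.argsort(-relevance):          # most relevant first
        if relevance[j] < relevance_thr:
            break                             # remaining features too weak
        redundant = any(
            abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > redundancy_thr
            for k in selected
        )
        if not redundant:
            selected.append(int(j))
    return selected

# Toy example: feature 1 duplicates feature 0; feature 2 is pure noise.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 400)
f0 = y + 0.1 * rng.standard_normal(400)
X = np.column_stack([f0, f0 + 0.01 * rng.standard_normal(400),
                     rng.standard_normal(400)])
kept = greedy_correlation_filter(X, y)
```

On this toy data the filter keeps one of the two duplicated informative features and discards both the redundant copy and the noise feature, mirroring how CFS/FCBF shrink 326 features to fewer than 20.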
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Awais, M.; Chiari, L.; Ihlen, E.A.F.; Helbostad, J.L.; Palmerini, L. Classical Machine Learning Versus Deep Learning for the Older Adults Free-Living Activity Classification. Sensors 2021, 21, 4669. https://doi.org/10.3390/s21144669