HF-SPHR: Hybrid Features for Sustainable Physical Healthcare Pattern Recognition Using Deep Belief Networks
Abstract
1. Introduction
- Statistical nonparametric operator: a 1D local binary pattern (1D-LBP) [28] encodes each sample relative to its neighbors, describing a larger data window in a compact code.
- Entropy-based features: these features characterize the underlying properties of a signal [29] and can easily differentiate between noisy and plain signals (see the sketch after this list).
- We developed hybrid approaches for feature abstraction, including statistical nonparametric, entropy-based, wavelet transform, and Mel-cepstral features.
- We designed a multi-layer sequential forward selection (MLSFS) algorithm to differentiate and select the optimal features for SPHR.
- A combination of a Gaussian mixture model (GMM) with Gaussian mixture regression (GMR) was introduced to generate the codebook and optimum interpretation of the features.
- We used two publicly available benchmark datasets for our model and fully validated it against other state-of-the-art methods, including CNN-, AdaBoost-, and ANN-based algorithms.
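To illustrate the noisy-versus-plain separation that entropy-based features provide (second bullet above), the following is a minimal sketch using the Shannon entropy of the normalized power spectrum; the spectral_entropy helper and the synthetic signals are illustrative assumptions, not the entropy measures used in HF-SPHR.

```python
import numpy as np
from scipy.stats import entropy

def spectral_entropy(signal):
    """Shannon entropy (bits) of the normalized power spectrum of a 1D signal."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    return entropy(psd / psd.sum(), base=2)

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024, endpoint=False)
plain = np.sin(2 * np.pi * 5 * t)                # energy in a single spectral line
noisy = plain + rng.normal(0, 0.5, t.size)       # same tone buried in broadband noise

print(spectral_entropy(plain))   # close to 0 bits: one dominant frequency
print(spectral_entropy(noisy))   # several bits: energy spread across the spectrum
```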
2. Materials and Methods
2.1. Data Acquisition and Pre-Processing
2.2. Data Segmentation
Algorithm 1 Signals Overlapping Segmentation
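Algorithm 1 is not reproduced here; as a rough sketch of overlapping segmentation, the fragment below splits a signal into fixed-length windows with a fractional overlap. The window length and the 50% overlap are assumptions for illustration, not the values specified by Algorithm 1.

```python
import numpy as np

def overlapping_segments(signal, window_len, overlap=0.5):
    """Split a 1D signal into fixed-length windows with fractional overlap."""
    step = max(1, int(window_len * (1.0 - overlap)))
    starts = range(0, len(signal) - window_len + 1, step)
    return np.stack([signal[s:s + window_len] for s in starts])

# Example: 50-sample windows with 50% overlap on a synthetic channel
x = np.random.randn(500)
segments = overlapping_segments(x, window_len=50, overlap=0.5)
print(segments.shape)   # (number_of_windows, 50)
```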
2.3. IMU-Based Hybrid Feature Extraction
Algorithm 2 IMU Feature Abstraction
2.3.1. 1D Local Binary Pattern
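A minimal sketch of the 1D-LBP operator described in [28]: each sample is compared with its p neighbors (p/2 on each side), the comparison bits are weighted by powers of two, and the histogram of the resulting codes is used as the compact feature. The neighborhood size p = 8 and the normalized histogram output are illustrative assumptions.

```python
import numpy as np

def one_d_lbp(signal, p=8):
    """1D local binary pattern: code each sample against its p neighbours and
    return the normalised histogram of codes as a compact descriptor."""
    half = p // 2
    weights = 2 ** np.arange(p)
    codes = []
    for i in range(half, len(signal) - half):
        neighbours = np.concatenate((signal[i - half:i], signal[i + 1:i + 1 + half]))
        bits = (neighbours >= signal[i]).astype(int)
        codes.append(int(bits @ weights))
    hist, _ = np.histogram(codes, bins=np.arange(2 ** p + 1))
    return hist / hist.sum()

feature = one_d_lbp(np.random.randn(256), p=8)
print(feature.shape)   # (256,): one bin per possible 8-bit code
```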
2.3.2. State–Space Correlation Entropy
2.3.3. Dispersion Entropy
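A minimal sketch of dispersion entropy following Rostaghi and Azami (see the dispersion entropy reference in the list below): samples are mapped to c classes through the normal cumulative distribution function, embedded into dispersion patterns of length m, and the Shannon entropy of the pattern distribution is taken. The defaults m = 2, c = 6, delay = 1 and the normalization to [0, 1] are common choices and may differ from the paper's settings.

```python
import numpy as np
from scipy.stats import norm

def dispersion_entropy(x, m=2, c=6, delay=1):
    """Normalised dispersion entropy of a 1D signal."""
    x = np.asarray(x, dtype=float)
    # 1) map samples to (0, 1) with the normal CDF, then to integer classes 1..c
    y = norm.cdf(x, loc=x.mean(), scale=x.std())
    z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)
    # 2) embed into dispersion patterns of length m
    n = len(z) - (m - 1) * delay
    patterns = np.stack([z[i * delay:i * delay + n] for i in range(m)], axis=1)
    # 3) Shannon entropy of the observed pattern frequencies, normalised by ln(c^m)
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / np.log(float(c) ** m)

print(dispersion_entropy(np.random.randn(1000)))   # higher for irregular signals
```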
2.4. ECG-Based Hybrid Feature Extraction
Algorithm 3 ECG Feature Abstraction
2.4.1. Wavelet Packet Entropy (WPE)
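A minimal sketch of wavelet packet entropy, assuming a Daubechies-4 wavelet and a three-level decomposition (both assumptions): the signal is decomposed with PyWavelets, each terminal node's share of the total energy is treated as a probability, and the Shannon entropy of that distribution is returned.

```python
import numpy as np
import pywt

def wavelet_packet_entropy(signal, wavelet='db4', level=3):
    """Shannon entropy of the relative energies of the terminal wavelet-packet nodes."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode='symmetric', maxlevel=level)
    energies = np.array([np.sum(node.data ** 2) for node in wp.get_level(level, order='natural')])
    p = energies / energies.sum()
    p = p[p > 0]                       # ignore empty bands so log(0) never occurs
    return -np.sum(p * np.log(p))

print(wavelet_packet_entropy(np.random.randn(1024)))
```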
2.4.2. P-Wave and T-Wave Detection
2.4.3. Mel-Frequency Cepstral Coefficients
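MFCC extraction is usually done with a standard audio toolchain (e.g., HTK or the rastamat melfcc.m listed in the references); as a rough sketch, the fragment below applies librosa to an ECG segment. The sampling rate, frame length, filter-bank size, and number of coefficients are all assumptions and will not match the paper's exact configuration.

```python
import numpy as np
import librosa

fs = 256                                            # assumed ECG sampling rate
ecg = np.random.randn(10 * fs).astype(np.float32)   # placeholder ECG segment

# 13 cepstral coefficients per frame, averaged into one descriptor per segment
mfcc = librosa.feature.mfcc(y=ecg, sr=fs, n_mfcc=13, n_fft=256, hop_length=128, n_mels=26)
mfcc_feature = mfcc.mean(axis=1)
print(mfcc_feature.shape)                           # (13,)
```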
2.4.4. R-Point Detection and R–R Interval
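A minimal R-peak and R–R interval sketch using simple peak picking with scipy; the amplitude threshold and refractory distance are crude illustrative assumptions, not the detector used in the paper.

```python
import numpy as np
from scipy.signal import find_peaks

def rr_intervals(ecg, fs, max_rate_bpm=200):
    """Detect R-peaks as prominent local maxima and return R-R intervals in seconds."""
    min_distance = int(fs * 60.0 / max_rate_bpm)      # refractory period between beats
    height = ecg.mean() + 1.5 * ecg.std()             # crude amplitude gate for R-peaks
    r_peaks, _ = find_peaks(ecg, distance=min_distance, height=height)
    return np.diff(r_peaks) / fs, r_peaks

# Usage on a filtered ECG segment sampled at fs Hz:
# rr, peaks = rr_intervals(ecg_segment, fs=256)
# mean_heart_rate = 60.0 / rr.mean()
```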
2.5. EMG-Based Hybrid Feature Extraction
Algorithm 4 EMG Feature Abstraction
2.5.1. Fuzzy Entropy
2.5.2. Approximate Entropy
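A minimal sketch of approximate entropy (the regularity statistic of Pincus and colleagues): phi(m) counts template matches of length m within a tolerance r, and ApEn = phi(m) - phi(m + 1). The defaults m = 2 and r = 0.2 times the standard deviation are conventional assumptions.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """Approximate entropy: phi(m) - phi(m + 1) with tolerance r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def phi(m):
        n = len(x) - m + 1
        templates = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distance between every pair of templates (self-matches included)
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        c = (dist <= r).sum(axis=1) / n
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

print(approximate_entropy(np.random.randn(500)))   # larger for irregular signals
```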
2.5.3. Renyi Entropy Order 2 and Order 3
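Rényi entropy of order alpha is H_alpha = log(sum(p_i^alpha)) / (1 - alpha); orders 2 and 3 are the cases used here. The sketch below estimates the probabilities from an amplitude histogram, which is an illustrative assumption.

```python
import numpy as np

def renyi_entropy(signal, alpha, bins=64):
    """Renyi entropy of order alpha over the amplitude histogram of a signal."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

emg = np.random.randn(1000)          # placeholder EMG segment
print(renyi_entropy(emg, alpha=2))   # order 2
print(renyi_entropy(emg, alpha=3))   # order 3
```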
2.6. Feature-to-Feature Fusion
2.7. Feature Reduction: Modified Multi-Layer Sequential Forward Selection
Algorithm 5 Multi-Layer Sequential Forward Selection
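Algorithm 5 itself is multi-layer; the single-layer core it builds on is ordinary sequential forward selection, sketched below with a k-NN wrapper criterion. The estimator, the cross-validation folds, and the stopping rule are illustrative assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def sequential_forward_selection(X, y, n_select, estimator=None):
    """Greedy forward selection: repeatedly add the feature whose inclusion
    yields the best cross-validated accuracy."""
    estimator = estimator or KNeighborsClassifier(n_neighbors=3)
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_select and remaining:
        scores = [(cross_val_score(estimator, X[:, selected + [j]], y, cv=3).mean(), j)
                  for j in remaining]
        best_score, best_j = max(scores)
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# selected_idx = sequential_forward_selection(fused_features, labels, n_select=20)
```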
2.8. Codebook Generation
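A minimal codebook sketch: a Gaussian mixture model is fitted to the selected feature vectors and its component means serve as codewords, so each segment can be represented by its most likely component. The number of codewords and the diagonal covariance are assumptions, and the GMR-based refinement described in the paper is not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

n_codewords = 12                                    # assumption: one codeword per activity class
features = np.random.randn(600, 20)                 # placeholder selected feature vectors

gmm = GaussianMixture(n_components=n_codewords, covariance_type='diag', random_state=0)
gmm.fit(features)

codebook = gmm.means_                               # (n_codewords, n_features) codeword matrix
codes = gmm.predict(features)                       # codeword index assigned to each segment
print(codebook.shape, codes[:10])
```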
2.9. Deep Belief Network Implementation Using RBMs
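A DBN is a stack of restricted Boltzmann machines trained greedily, layer by layer, with a supervised read-out on top. The sketch below approximates this with scikit-learn's BernoulliRBM in a pipeline; the layer sizes, learning rate, and logistic-regression read-out are illustrative assumptions, whereas the paper's configuration uses five RBMs (500/1000 hidden nodes for reconstruction and a final 12-node classification layer, as listed in the DBN training table below).

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

dbn = Pipeline([
    ('scale', MinMaxScaler()),   # RBMs expect inputs in [0, 1]
    ('rbm1', BernoulliRBM(n_components=500, learning_rate=0.05, n_iter=7, random_state=0)),
    ('rbm2', BernoulliRBM(n_components=500, learning_rate=0.05, n_iter=7, random_state=0)),
    ('rbm3', BernoulliRBM(n_components=100, learning_rate=0.05, n_iter=7, random_state=0)),
    ('clf', LogisticRegression(max_iter=1000)),
])

X = np.random.rand(600, 64)                # placeholder fused-and-selected feature vectors
y = np.random.randint(0, 12, size=600)     # 12 activity labels
dbn.fit(X, y)
print(dbn.score(X, y))
```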
3. Experimental Performance
3.1. Datasets Description
3.2. Results Evaluations
- In practical deployment, pattern recognition remains challenging when the same activity is performed by different individuals.
- Wearable-sensor-based architectures are susceptible to changes in sensor placement and to other locomotion activities.
4. Discussion
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Gochoo, M.; Velusamy, V.; Liu, S.-H.; Bayanduuren, D.; Huang, S.-C. Device-Free Non-Privacy Invasive Classification of Elderly Travel Patterns in A Smart House Using PIR Sensors and DCNN. IEEE Sens. J. 2017, 18, 1287. [Google Scholar] [CrossRef]
- Jalal, A.; Uddin, Z.; Kim, T.-S. Depth video-based human activity recognition system using translation and scaling invariant features for life logging at smart home. IEEE Trans. Consum. Electron. 2012, 58, 863–871. [Google Scholar] [CrossRef]
- Jalal, A.; Kamal, S. Real-time life logging via a depth silhouette-based human activity recognition system for smart home services. In Proceedings of the 2014 11th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), Seoul, Korea, 26–29 August 2014; pp. 74–80. [Google Scholar]
- Dang, L.M.; Min, K.; Wang, H.; Piran, J.; Lee, C.H.; Moon, H. Sensor-based and vision-based human activity recognition: A comprehensive survey. Pattern Recognit. 2020, 108, 107561. [Google Scholar] [CrossRef]
- Kaixuan, C.; Dalin, Z.; Lina, Y.; Bin, G.; Zhiwen, Y.; Yunhao, L. Deep Learning for Sensor-based Human Activity Recognition: Overview, Challenges and Opportunities. J. ACM 2018, 37. [Google Scholar] [CrossRef]
- Shrestha, A.; Mahmood, A. Review of Deep Learning Algorithms and Architectures. IEEE Access 2019, 7, 53040–53065. [Google Scholar] [CrossRef]
- Nweke, H.F.; Teh, Y.W.; Al-Garadi, M.A.; Alo, U.R. Deep learning algorithms for human activity recognition using mobile and wearable sensor networks: State of the art and research challenges. Expert Syst. Appl. 2018, 105, 233–261. [Google Scholar] [CrossRef]
- Tingting, Y.; Junqian, W.; Lintai, W.; Yong, X. Three-stage network for age estimation. CAAI Trans. Intell. Technol. 2019, 4, 122–126. [Google Scholar] [CrossRef]
- Osterland, S.; Weber, J. Analytical analysis of single-stage pressure relief valves. Int. J. Hydromechatron. 2019, 2, 32. [Google Scholar] [CrossRef]
- Wang, J.; Chen, Y.; Hao, S.; Peng, X.; Hu, L. Deep learning for sensor-based activity recognition: A survey. Pattern Recognit. Lett. 2019, 119, 3–11. [Google Scholar] [CrossRef] [Green Version]
- Zhang, S.; Wei, Z.; Nie, J.; Shuang, W.; Wang, S.; Li, Z. A Review on Human Activity Recognition Using Vision-Based Method. J. Healthc. Eng. 2017, 2017, 1–31. [Google Scholar] [CrossRef] [PubMed]
- Jalal, A.; Kamal, S.; Kim, D. A Depth Video Sensor-Based Life-Logging Human Activity Recognition System for Elderly Care in Smart Indoor Environments. Sensors 2014, 14, 11735–11759. [Google Scholar] [CrossRef]
- Espinosa, R.; Ponce, H.; Gutiérrez, S.; Martínez-Villaseñor, L.; Brieva, J.; Moya-Albor, E. A vision-based approach for fall detection using multiple cameras and convolutional neural networks: A case study using the UP-Fall detection dataset. Comput. Biol. Med. 2019, 115, 103520. [Google Scholar] [CrossRef]
- Jalal, A.; Quaid, M.A.K.; Tahir, S.B.U.D.; Kim, K. A Study of Accelerometer and Gyroscope Measurements in Physical Life-Log Activities Detection Systems. Sensors 2020, 20, 6670. [Google Scholar] [CrossRef]
- Yang, X.; Tian, Y. Super Normal Vector for Human Activity Recognition with Depth Cameras. IEEE Trans. Pattern Anal. Mach. Intell. 2016, 39, 1028–1039. [Google Scholar] [CrossRef] [PubMed]
- Jalal, A.; Khalid, N.; Kim, K. Automatic Recognition of Human Interaction via Hybrid Descriptors and Maximum Entropy Markov Model Using Depth Sensors. Entropy 2020, 22, 817. [Google Scholar] [CrossRef] [PubMed]
- Mahmood, M.; Jalal, A.; Kim, K. WHITE STAG model: Wise human interaction tracking and estimation (WHITE) using spatio-temporal and angular-geometric (STAG) descriptors. Multimed. Tools Appl. 2020, 79, 6919–6950. [Google Scholar] [CrossRef]
- Jalal, A.; Kim, Y.-H.; Kim, Y.-J.; Kamal, S.; Kim, D. Robust human activity recognition from depth video using spatiotemporal multi-fused features. Pattern Recognit. 2017, 61, 295–308. [Google Scholar] [CrossRef]
- Irvine, N.; Nugent, C.; Zhang, S.; Wang, H.; Ng, W.W.Y. Neural Network Ensembles for Sensor-Based Human Activity Recognition Within Smart Environments. Sensors 2019, 20, 216. [Google Scholar] [CrossRef] [Green Version]
- Xi, X.; Tang, M.; Miran, S.M. Evaluation of Feature Extraction and Recognition for Activity Monitoring and Fall Detection Based on Wearable sEMG Sensors. Sensors 2017, 17, 1229. [Google Scholar] [CrossRef] [PubMed]
- Wijekoon, A.; Wiratunga, N.; Sani, S.; Cooper, K. A knowledge-light approach to personalised and open-ended human activity recognition. Knowl. Based Syst. 2020, 192, 105651. [Google Scholar] [CrossRef]
- Quaid, M.A.K.; Jalal, A. Wearable sensors based human behavioral pattern recognition using statistical features and reweighted genetic algorithm. Multimed. Tools Appl. 2020, 79, 6061–6083. [Google Scholar] [CrossRef]
- Tahir, S.B.U.D.; Jalal, A.; Kim, K. Wearable Inertial Sensors for Daily Activity Analysis Based on Adam Optimization and the Maximum Entropy Markov Model. Entropy 2020, 22, 579. [Google Scholar] [CrossRef]
- Jeantet, L.; Chevallier, D.; Bergouignan, A.; Sueur, C. A Lean and Performant Hierarchical Model for Human Activity Recognition Using Body-Mounted Sensors. Sensors 2020, 20, 3090. [Google Scholar] [CrossRef]
- Badawi, A.A.; Al-Kabbany, A.; Shaban, H.A. Sensor Type, Axis, and Position-Based Fusion and Feature Selection for Multimodal Human Daily Activity Recognition in Wearable Body Sensor Networks. J. Healthc. Eng. 2020, 2020, 1–14. [Google Scholar] [CrossRef] [PubMed]
- Shokri, M.; Tavakoli, K. A Review on the Artificial Neural Network Approach to Analysis and Prediction of Seismic Damage in Infrastructure. Int. J. Hydromechatron. 2019, 1, 178–196. [Google Scholar] [CrossRef]
- Ahmed, A.; Jalal, A.; Kim, K. A Novel Statistical Method for Scene Classification Based on Multi-Object Categorization and Logistic Regression. Sensors 2020, 20, 3871. [Google Scholar] [CrossRef] [PubMed]
- Susan, S.; Agrawal, P.; Mittal, M.; Bansal, S. New shape descriptor in the context of edge continuity. CAAI Trans. Intell. Technol. 2019, 4, 101–109. [Google Scholar] [CrossRef]
- Guido, R.C. A tutorial review on entropy-based handcrafted feature extraction for information fusion. Inf. Fusion 2018, 41, 161–175. [Google Scholar] [CrossRef]
- Mallat, S. A theory for multiresolution signal decomposition: The wavelet representation. IEEE Trans. Pattern Anal. Mach. Intell. 1989, 11, 674–693. [Google Scholar] [CrossRef] [Green Version]
- Bruce, L.; Koger, C.; Li, J. Dimensionality reduction of hyperspectral data using discrete wavelet transform feature extraction. IEEE Trans. Geosci. Remote Sens. 2002, 40, 2331–2338. [Google Scholar] [CrossRef]
- Jalal, A.; Batool, M.; Kim, K. Stochastic Recognition of Physical Activity and Healthcare Using Tri-Axial Inertial Wearable Sensors. Appl. Sci. 2020, 10, 7122. [Google Scholar] [CrossRef]
- Yusuf, S.A.A.; Hidayat, R. MFCC Feature Extraction and KNN Classification in ECG Signals. In Proceedings of the 2019 6th International Conference on Information Technology, Computer and Electrical Engineering (ICITACEE), Semarang, Indonesia, 26–27 September 2019; pp. 1–5. [Google Scholar]
- Jalal, A.; Quaid, M.A.K.; Hasan, A.S. Wearable Sensor-Based Human Behavior Understanding and Recognition in Daily Life for Smart Environments. In Proceedings of the 2018 International Conference on Frontiers of Information Technology (FIT), Islamabad, Pakistan, 17–19 December 2018; pp. 105–110. [Google Scholar]
- Javeed, M.; Jalal, A.; Kim, K. Wearable Sensors based Exertion Recognition using Statistical Features and Random Forest for Physical Healthcare Monitoring. In Proceedings of the 18th International Bhurban Conference on Applied Sciences and Technology (IBCAST), Islamabad, Pakistan, 12–16 January 2021. [Google Scholar]
- Pervaiz, M.; Jalal, A.; Kim, K. Hybrid Algorithm for Multi People Counting and Tracking for Smart Surveillance. In Proceedings of the 18th International Bhurban Conference on Applied Sciences and Technology (IBCAST), Islamabad, Pakistan, 12–16 January 2021. [Google Scholar]
- Khalid, N.; Gochoo, M.; Jalal, A.; Kim, K. Modeling Two-Person Segmentation and Locomotion for Stereoscopic Action Identification: A Sustainable Video Surveillance System. Sustainability 2021, 13, 970. [Google Scholar] [CrossRef]
- Tahir, S.B.; Jalal, A.; Kim, K. IMU Sensor Based Automatic-Features Descriptor for Healthcare Patient’s Daily Life-log Recognition. In Proceedings of the 18th International Bhurban Conference on Applied Sciences and Technology (IBCAST), Islamabad, Pakistan, 12–16 January 2021. [Google Scholar]
- Jalal, A.; Batool, M.; ud din Tahir, S.B. Markerless Sensors for Physical Health Monitoring System Using ECG and GMM Feature Extraction. In Proceedings of the 18th International Bhurban Conference on Applied Sciences and Technology (IBCAST), Islamabad, Pakistan, 12–16 January 2021. [Google Scholar]
- Ahmed, A.; Jalal, A.; Rafique, A.A. Salient Segmentation based Object Detection and Recognition using Hybrid Genetic Transform. In Proceedings of the 2019 International Conference on Applied and Engineering Mathematics (ICAEM), Taxila, Pakistan, 27–29 August 2019; pp. 203–208. [Google Scholar]
- Jalal, A.; Sarif, N.; Kim, J.T.; Kim, T.-S. Human Activity Recognition via Recognized Body Parts of Human Depth Silhouettes for Residents Monitoring Services at Smart Home. Indoor Built Environ. 2012, 22, 271–279. [Google Scholar] [CrossRef]
- Rafique, A.A.; Jalal, A.; Kim, K. Automated Sustainable Multi-Object Segmentation and Recognition via Modified Sampling Consensus and Kernel Sliding Perceptron. Symmetry 2020, 12, 1928. [Google Scholar] [CrossRef]
- Rostaghi, M.; Azami, H. Dispersion Entropy: A Measure for Time-Series Analysis. IEEE Signal Process. Lett. 2016, 23, 610–614. [Google Scholar] [CrossRef]
- Azami, H.; Rostaghi, M.; Abásolo, D.E.; Escudero, J. Refined Composite Multiscale Dispersion Entropy and its Application to Biomedical Signals. IEEE Trans. Biomed. Eng. 2017, 64, 2872–2879. [Google Scholar] [CrossRef] [Green Version]
- Abdul, Z.K.; Al-Talabani, A.; Abdulrahman, A.O. A New Feature Extraction Technique Based on 1D Local Binary Pattern for Gear Fault Detection. Shock. Vib. 2016, 2016, 1–6. [Google Scholar] [CrossRef] [Green Version]
- Turnip, A.; Kusumandari, D.E.; Wijaya, C.; Turnip, M.; Sitompul, E. Extraction of P and T Waves from Electrocardiogram Signals with Modified Hamilton Algorithm. In Proceedings of the International Conference on Sustainable Engineering and Creative Computing (ICSECC), Bandung, Indonesia, 20–22 August 2019; pp. 58–62. [Google Scholar] [CrossRef]
- Young, S.; Evermann, G.; Gales, M.; Hain, T.; Kershaw, D.; Liu, X.; Moore, G.; Odell, J.; Ollason, D.; Povey, D.; et al. The HTK Book (for HTK Version 3.4.1). Engineering Department, Cambridge University. Available online: http://htk.eng.cam.ac.uk (accessed on 19 November 2020).
- Ellis, D. Reproducing the Feature Outputs of Common Programs Using Matlab and melfcc.m. 2005. Available online: http://labrosa.ee.columbia.edu/matlab/rastamat/mfccs.html (accessed on 31 January 2021).
- Jalal, A.; Kamal, S.; Kim, D.-S. Detecting Complex 3D Human Motions with Body Model Low-Rank Representation for Real-Time Smart Activity Monitoring System. KSII Trans. Internet Inf. Syst. 2018, 12, 1189–1204. [Google Scholar] [CrossRef] [Green Version]
- Jalal, A.; Quaid, M.A.K.; Sidduqi, M.A. A Triaxial acceleration-based human motion detection for ambient smart home system. In Proceedings of the 2019 16th International Bhurban Conference on Applied Sciences and Technology (IBCAST), Islamabad, Pakistan, 8–12 January 2019; pp. 353–358. [Google Scholar]
- Jalal, A.; Quaid, M.A.K.; Kim, K. A Wrist Worn Acceleration Based Human Motion Analysis and Classification for Ambient Smart Home System. J. Electron. Eng. Technol. 2019, 14, 1733–1739. [Google Scholar] [CrossRef]
- Ahmed, A.; Jalal, A.; Kim, K. Region and Decision Tree-Based Segmentations for Multi-Objects Detection and Classification in Outdoor Scenes. In Proceedings of the 2019 International Conference on Frontiers of Information Technology (FIT), Islamabad, Pakistan, 16–18 December 2019; pp. 209–2095. [Google Scholar]
- Azami, H.; Fernández, A.; Escudero, J. Refined multiscale fuzzy entropy based on standard deviation for biomedical signal analysis. Med. Biol. Eng. Comput. 2017, 55, 2037–2052. [Google Scholar] [CrossRef] [PubMed]
- Chen, W.; Wang, Z.; Xie, H.; Yu, W. Characterization of Surface EMG Signal Based on Fuzzy Entropy. IEEE Trans. Neural Syst. Rehabil. Eng. 2007, 15, 266–272. [Google Scholar] [CrossRef]
- Pincus, S.M.; Gladstone, I.M.; Ehrenkranz, R.A. A regularity statistic for medical data analysis. J. Clin. Monit. 1991, 7, 335–345. [Google Scholar] [CrossRef] [PubMed]
- Wenye, G. Shannon and Non-Extensive Entropy. MATLAB Central File Exchange. Available online: https://www.mathworks.com/matlabcentral/fileexchange/18133-shannon-and-non-extensive-entropy (accessed on 6 August 2020).
- Batool, M.; Jalal, A.; Kim, K. Telemonitoring of Daily Activity Using Accelerometer and Gyroscope in Smart Home Environments. J. Electr. Eng. Technol. 2020, 15, 2801–2809. [Google Scholar] [CrossRef]
- Jalal, A.; Uddin, Z.; Kim, J.T.; Kim, T.-S. Recognition of Human Home Activities via Depth Silhouettes and ℜ Transformation for Smart Homes. Indoor Built Environ. 2011, 21, 184–190. [Google Scholar] [CrossRef]
- Jalal, A.; Kim, Y. Dense depth maps-based human pose tracking and recognition in dynamic scenes using ridge data. In Proceedings of the 2014 11th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), Seoul, Korea, 26–29 August 2014; pp. 119–124. [Google Scholar]
- Vranković, A.; Lerga, J.; Saulig, N. A novel approach to extracting useful information from noisy TFDs using 2D local entropy measures. EURASIP J. Adv. Signal Process. 2020, 2020, 1–19. [Google Scholar] [CrossRef]
- Jalal, A.; Lee, S.; Kim, J.T.; Kim, T.-S. Human Activity Recognition via the Features of Labeled Depth Body Parts. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2012; pp. 246–249. [Google Scholar]
- Jalal, A.; Kamal, S.; Kim, D. Shape and Motion Features Approach for Activity Tracking and Recognition from Kinect Video Camera. In Proceedings of the 2015 IEEE 29th International Conference on Advanced Information Networking and Applications Workshops, Gwangju, Korea, 25–27 March 2015; pp. 445–450. [Google Scholar]
- Jalal, A.; Kamal, S.; Kim, D. Depth silhouettes context: A new robust feature for human tracking and activity recognition based on embedded HMMs. In Proceedings of the 2015 12th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), Goyang City, Korea, 28–30 October 2015; pp. 294–299. [Google Scholar]
- O’Connor, P.; Neil, D.; Liu, S.-C.; Delbruck, T.; Pfeiffer, M. Real-time classification and sensor fusion with a spiking deep belief network. Front. Neurosci. 2013, 7, 178. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Keyvanrad, M.A.; Homayounpour, M. A brief survey on deep belief networks and introducing a new object oriented MATLAB toolbox (DeeBNet). arXiv 2014, arXiv:1408.3264. [Google Scholar]
- Akhter, I.; Jalal, A.; Kim, K. Pose Estimation and Detection for Event Recognition Using Sense-Aware Features and Ada-Boost Classifier. In Proceedings of the 18th International Bhurban Conference on Applied Sciences and Technology (IBCAST), Islamabad, Pakistan, 12–16 January 2021. [Google Scholar]
- Banos, O.; Garcia, R.; Holgado-Terriza, J.A.; Damas, M.; Pomares, H.; Rojas, I.; Saez, A.; Villalonga, C. mHealthDroid: A Novel Framework for Agile Development of Mobile Health Applications. In Ambient Assisted Living and Daily Activities. IWAAL 2014. Lecture Notes in Computer Science; Pecchia, L., Chen, L.L., Nugent, C., Bravo, J., Eds.; Springer: Cham, Switzerland, 2020; Volume 8868, pp. 91–98. [Google Scholar]
- Chereshnev, R.; Kertész-Farkas, A. HuGaDB: Human Gait Database for Activity Recognition from Wearable Inertial Sensor Networks. Min. Data Financ. Appl. 2018, 131–141. [Google Scholar] [CrossRef] [Green Version]
- Jalal, A.; Akhter, I.; Kim, K. Human Posture Estimation and Sustainable Events Classification via Pseudo-2D Stick Model and K-ary Tree Hashing. Sustainability 2020, 12, 9814. [Google Scholar] [CrossRef]
- Zhu, C.; Miao, D. Influence of kernel clustering on an RBFN. CAAI Trans. Intell. Technol. 2019, 4, 255–260. [Google Scholar] [CrossRef]
- Wiens, T. Engine Speed Reduction for Hydraulic Machinery Using Predictive Algorithms. Int. J. Hydromechatron. 2019, 2, 16–31. [Google Scholar] [CrossRef]
- Abedin, A.; Motlagh, F.; Shi, Q.; Rezatofighi, H.; Ranasinghe, D. Towards deep clustering of human activities from wearables. In Proceedings of the 2020 International Symposium on Wearable Computers (ISWC ’20). Association for Computing Machinery, New York, NY, USA, 12–17 September 2020; pp. 1–6. [Google Scholar] [CrossRef]
- Fang, B.; Zhou, Q.; Sun, F.; Shan, J.; Wang, M.; Xiang, C.; Zhang, Q. Gait Neural Network for Human-Exoskeleton Interaction. Front. Neurorobot. 2020, 14, 58. [Google Scholar] [CrossRef]
- Maitre, J.; Bouchard, K.; Gaboury, S. Classification models for data fusion in human activity recognition. In Proceedings of the 6th EAI International Conference on Smart Objects and Technologies for Social Good, Antwerp, Belgium, 14–16 September 2020; ACM: New York, NY, USA, 2020; pp. 72–77. [Google Scholar]
- Rasnayaka, S.; Saha, S.; Sim, T. Making the most of what you have! Profiling biometric authentication on mobile devices. In Proceedings of the 2019 International Conference on Biometrics (ICB), Crete, Greece, 4–7 June 2019; pp. 1–7. [Google Scholar] [CrossRef]
- O’Halloran, J.; Curry, E. A comparison of deep learning models in human activity recognition and behavioral prediction on the MHEALTH dataset. In Proceedings of the 27th AIAI Irish conference on Artificial Intelligence and Cognitive Science (AICS), Galway, Ireland, 5–6 December 2019; NUI: Galway, Ireland, 2019. [Google Scholar]
- Sun, Y.; Yang, G.-Z.; Lo, B. An artificial neural network framework for lower limb motion signal estimation with foot-mounted inertial sensors. In Proceedings of the 2018 IEEE 15th International Conference on Wearable and Implantable Body Sensor Networks (BSN), Las Vegas, NV, USA, 4–7 March 2018; pp. 132–135. [Google Scholar]
- Masum, A.K.M.; Hossain, M.E.; Humayra, A.; Islam, S.; Barua, A.; Alam, G.R. A Statistical and Deep Learning Approach for Human Activity Recognition. In Proceedings of the 2019 3rd International Conference on Trends in Electronics and Informatics (ICOEI), Tirunelveli, India, 23–25 April 2019; pp. 1332–1337. [Google Scholar]
- Kumari, G.; Chakraborty, J.; Nandy, A. Effect of Reduced Dimensionality on Deep learning for Human Activity Recognition. In Proceedings of the 2020 11th International Conference on Computing, Communication and Networking Technologies (ICCCNT), Kharagpur, India, 1–3 July 2020; pp. 1–7. [Google Scholar]
- Ha, S.; Choi, S. Convolutional neural networks for human activity recognition using multiple accelerometer and gyroscope sensors. In Proceedings of the 2016 International Joint Conference on Neural Networks (IJCNN), Vancouver, BC, Canada, 24–29 July 2016; pp. 381–388. [Google Scholar]
- Guo, H.; Chen, L.; Peng, L.; Chen, G. Wearable sensor based multimodal human activity recognition exploiting the diversity of classifier ensemble. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Heidelberg, Germany, 12–16 September 2016. [Google Scholar]
- Jalal, A.; Batool, M.; Kim, K. Sustainable Wearable System: Human Behavior Modeling for Life-Logging Activities Using K-Ary Tree Hashing Classifier. Sustainability 2020, 12, 10324. [Google Scholar] [CrossRef]
Confusion matrix of the proposed HF-SPHR model over the mHealth dataset (activity classes L1–L12).

Activities | L1 | L2 | L3 | L4 | L5 | L6 | L7 | L8 | L9 | L10 | L11 | L12
---|---|---|---|---|---|---|---|---|---|---|---|---|
L1 | 9 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 |
L2 | 1 | 10 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
L3 | 0 | 0 | 9 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 |
L4 | 0 | 0 | 0 | 9 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 |
L5 | 0 | 0 | 0 | 0 | 10 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
L6 | 0 | 0 | 0 | 0 | 0 | 9 | 0 | 0 | 0 | 0 | 0 | 0 |
L7 | 1 | 0 | 1 | 0 | 0 | 0 | 10 | 0 | 1 | 0 | 0 | 0 |
L8 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 9 | 0 | 0 | 0 | 0 |
L9 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 9 | 0 | 0 | 0 |
L10 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 10 | 0 | 1 |
L11 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 9 | 0 |
L12 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 9 |
Mean Accuracy = 93.33% |
Confusion matrix of the proposed HF-SPHR model over the HuGaDB dataset (activity classes H1–H12).

Activities | H1 | H2 | H3 | H4 | H5 | H6 | H7 | H8 | H9 | H10 | H11 | H12
---|---|---|---|---|---|---|---|---|---|---|---|---|
H1 | 10 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
H2 | 0 | 9 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 |
H3 | 0 | 0 | 9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
H4 | 0 | 0 | 0 | 9 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 |
H5 | 1 | 0 | 0 | 0 | 9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
H6 | 0 | 0 | 0 | 0 | 1 | 10 | 0 | 0 | 0 | 0 | 0 | 0 |
H7 | 0 | 1 | 0 | 0 | 0 | 1 | 9 | 1 | 0 | 0 | 0 | 0 |
H8 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 9 | 0 | 1 | 0 | 1 |
H9 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 | 0 | 0 | 0 |
H10 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 | 0 | 0 |
H11 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 9 | 0 |
H12 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
Mean Accuracy = 92.50% |
Sensitivity and specificity per activity class over the mHealth dataset.

Activities | Sensitivity | Specificity
---|---|---|
L1 | 0.692 | 0.984 |
L2 | 0.758 | 0.992 |
L3 | 0.687 | 0.992 |
L4 | 0.692 | 0.984 |
L5 | 0.763 | 0.984 |
L6 | 0.687 | 0.992 |
L7 | 0.775 | 0.975 |
L8 | 0.687 | 0.992 |
L9 | 0.703 | 0.983 |
L10 | 0.775 | 0.967 |
L11 | 0.677 | 0.992 |
L12 | 0.698 | 0.984 |
Sensitivity and specificity per activity class over the HuGaDB dataset.

Activities | Sensitivity | Specificity
---|---|---|
H1 | 0.800 | 0.991 |
H2 | 0.732 | 0.983 |
H3 | 0.720 | 1.000 |
H4 | 0.726 | 0.983 |
H5 | 0.726 | 0.991 |
H6 | 0.800 | 0.991 |
H7 | 0.732 | 0.974 |
H8 | 0.732 | 0.974 |
H9 | 0.726 | 0.991 |
H10 | 0.800 | 1.000 |
H11 | 0.714 | 0.992 |
H12 | 0.720 | 0.991 |
Precision, recall, and F-measure of the proposed HF-SPHR model per activity class over the mHealth dataset.

Activities | Precision | Recall | F-Measure
---|---|---|---|
L1 | 0.818 | 0.818 | 0.818 |
L2 | 0.909 | 0.909 | 0.909 |
L3 | 0.818 | 0.900 | 0.857 |
L4 | 0.818 | 0.818 | 0.818 |
L5 | 0.909 | 0.833 | 0.870 |
L6 | 0.818 | 1.000 | 0.900 |
L7 | 0.833 | 0.769 | 0.800 |
L8 | 0.818 | 0.900 | 0.857 |
L9 | 0.692 | 0.818 | 0.750 |
L10 | 0.909 | 0.714 | 0.800 |
L11 | 1.000 | 0.900 | 0.947 |
L12 | 0.818 | 0.818 | 0.818 |
Mean | 0.847 | 0.850 | 0.845 |
Precision, recall, and F-measure of the proposed HF-SPHR model per activity class over the HuGaDB dataset.

Activities | Precision | Recall | F-Measure
---|---|---|---
H1 | 0.909 | 0.909 | 0.909
H2 | 0.818 | 0.818 | 0.818 |
H3 | 0.818 | 1.000 | 0.900 |
H4 | 0.900 | 0.818 | 0.857 |
H5 | 0.818 | 0.900 | 0.857 |
H6 | 0.909 | 0.909 | 0.909 |
H7 | 0.900 | 0.750 | 0.818 |
H8 | 0.900 | 0.750 | 0.818 |
H9 | 0.818 | 0.900 | 0.857 |
H10 | 0.833 | 1.000 | 0.909 |
H11 | 1.000 | 0.900 | 0.947 |
H12 | 0.900 | 0.900 | 0.900 |
Mean | 0.877 | 0.880 | 0.875 |
Layer-wise DBN training details: epochs, node count, average reconstruction error, and training time of each RBM for both datasets.

Dataset | RBM (r) and Role | No. of Epochs | No. of Nodes in Each RBM | Average Reconstruction Error | Time (s)
---|---|---|---|---|---
mHealth | r = 1, reconstruction | 7 | n = 500 | 49,458,754.7895 | 2290
mHealth | r = 2, reconstruction | 7 | n = 500 | 24,327.784 | 4689
mHealth | r = 3, reconstruction | 7 | n = 500 | 0.04582 | 6087
mHealth | r = 4, reconstruction | 7 | n = 1000 | 0.00037 | 8786
mHealth | r = 5, classification | 7 | n = 12 | 0.0000002 | 12,784
HuGaDB | r = 1, reconstruction | 7 | n = 500 | 65,215,315,432.3545 | 2340
HuGaDB | r = 2, reconstruction | 7 | n = 500 | 78,652,131.2563 | 4808
HuGaDB | r = 3, reconstruction | 7 | n = 500 | 156,325.012 | 7090
HuGaDB | r = 4, reconstruction | 7 | n = 1000 | 0.024563 | 10,910
HuGaDB | r = 5, classification | 7 | n = 12 | 0.000000284 | 15,580
Recognition accuracy of the proposed HF-SPHR model compared with state-of-the-art methods on the mHealth and HuGaDB datasets.

Method | Accuracy Using mHealth (%) | Method | Accuracy Using HuGaDB (%)
---|---|---|---|
Abedin et al. [72] | 57.19 | Fang et al. [73] | 79.24 |
Maitre et al. [74] | 84.89 | Rasnayaka et al. [75] | 85 |
O’Halloran et al. [76] | 90.55 | Sun et al. [77] | 88 |
Tahir et al. [23] | 90.91 | Badawi et al. [25] | 88 |
Masum et al. [78] | 91.68 | Kumari et al. [79] | 91.1 |
Ha et al. [80] | 91.94 | - | - |
Guo et al. [81] | 92.3 | - | - |
Proposed HF-SPHR Model | 93.33 | Proposed HF-SPHR Model | 92.50 |
Classification accuracy of the proposed DBN compared with Random Forest and AdaBoost classifiers on both datasets.

Algorithm | Dataset | Accuracy | Dataset | Accuracy
---|---|---|---|---|
DBN | mHealth | 93.33% | HuGaDB | 92.50% |
Random Forest | mHealth | 92.7% | HuGaDB | 91.9% |
AdaBoost | mHealth | 49.9% | HuGaDB | 57.0%