Daily Living Activity Recognition In-The-Wild: Modeling and Inferring Activity-Aware Human Contexts
Abstract
1. Introduction
- A two-stage model is proposed for ARW, which first identifies the primary physical activity and then uses this label to infer activity-related context information, thus providing a detailed activity representation in-the-wild (a minimal code sketch of this two-stage flow follows this list).
- A methodical approach is devised and followed to analyze the co-occurrences of different activity-context pairs in the “ExtraSensory” dataset. Based on this analysis, the ten most frequent human behavioral contexts and four phone contexts/positions are paired with the six primary PADLs for ARW. The approach used to analyze and select activity-context pairs for the proposed ARW scheme is reproducible and can be applied to any multi-label dataset.
- An in-depth exploration of the proposed ARW scheme is conducted for feature selection, model selection (i.e., classifier selection), and classifier hyperparameter optimization to attain state-of-the-art recognition performance. Finally, based on the best-case experimental observations and parameters, the performance of boosted decision tree and neural network classifiers is further evaluated in detail for the proposed scheme using smartphone and watch accelerometers.
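To make the two-stage flow in the first contribution concrete, the following is a minimal sketch of how a primary-activity classifier can gate a set of activity-specific context classifiers. It is an illustration only, not the authors' implementation: the scikit-learn estimator, the flat feature matrix `X`, and all variable names are assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

class TwoStageARW:
    """Stage 1 predicts the primary physical activity; stage 2 infers its context."""

    def fit(self, X, y_activity, y_context):
        # Stage 1: one classifier over all primary physical activities.
        self.activity_clf = GradientBoostingClassifier().fit(X, y_activity)
        # Stage 2: one context classifier per primary activity.
        self.context_clfs = {}
        for act in np.unique(y_activity):
            mask = y_activity == act
            context_labels = np.unique(y_context[mask])
            if len(context_labels) > 1:
                self.context_clfs[act] = GradientBoostingClassifier().fit(X[mask], y_context[mask])
            else:
                # Only one context observed for this activity (e.g., running -> exercise).
                self.context_clfs[act] = context_labels[0]
        return self

    def predict(self, X):
        activities = self.activity_clf.predict(X)
        contexts = []
        for row, act in zip(X, activities):
            model = self.context_clfs[act]
            if hasattr(model, "predict"):
                contexts.append(model.predict(row.reshape(1, -1))[0])
            else:
                contexts.append(model)  # the single known context for this activity
        return activities, contexts
```

The dictionary of per-activity models mirrors the idea that the stage-1 activity label selects which stage-2 (behavioral-context or phone-position) classifier is consulted.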
2. Related Works
3. Proposed Methodology
3.1. Data Acquisition and Preprocessing
3.1.1. Activity-Context Pairs for ARW: Systematic Analysis and Selection
Algorithm 1. Extraction of activity-context pairs and their frequencies per user for ARW. Input: the labeled activity and context data per user; Output: the labels and the counts of all activity-context pairs per user.
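Since the pseudocode listing itself is not reproduced above, the following is a rough stand-in for the kind of per-user pair counting Algorithm 1 describes. The pandas layout (one row per labeled window, one primary-activity column, one binary column per context label) is an assumption about an ExtraSensory-style table, not the dataset's actual schema.

```python
import pandas as pd

def activity_context_pair_counts(df, user_col="user", activity_col="activity", context_cols=None):
    """Count how often each (activity, context) pair co-occurs, per user."""
    if context_cols is None:
        context_cols = [c for c in df.columns if c not in (user_col, activity_col)]
    records = []
    for (user, activity), group in df.groupby([user_col, activity_col]):
        for ctx in context_cols:
            count = int(group[ctx].fillna(0).sum())  # windows where this context is active
            if count > 0:
                records.append({"user": user, "activity": activity, "context": ctx, "count": count})
    return pd.DataFrame(records)

# Toy example (labels purely illustrative):
toy = pd.DataFrame({
    "user": ["u1", "u1", "u1"],
    "activity": ["Lying", "Lying", "Sitting"],
    "Sleeping": [1, 1, 0],
    "Watching TV": [0, 0, 1],
})
print(activity_context_pair_counts(toy))
```

Aggregating such per-user counts across all users gives the kind of activity-context frequency table shown later in the article, from which the most frequent pairs can be selected.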
3.1.2. Signal De-Noising and Segmentation
3.2. Feature Extraction
3.3. Primary Physical Activity Recognition
3.4. Activity-Aware Context Recognition
4. Experimental Results, Performance Analysis, and Discussion
4.1. Method of Validation and Analysis
4.1.1. Model Selection and Hyperparameter Tuning
4.1.2. Performance Evaluation Metrics for Classification
4.2. Performance Analysis of Primary Physical Activity Recognition (PPAR)
4.3. Performance Evaluation of Activity-Aware Context Recognition (AACR)
4.3.1. Behavioral Context Recognition (BCR) Results and Investigation
4.3.2. Phone Context Recognition (PCR) Results and Investigation
4.4. Analysis of BDT vs. NN for Proposed ARW Scheme
4.5. Performance Comparison with Existing AR Schemes
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Liang, Y.; Zhou, X.; Guo, B.; Yu, Z. Activity recognition using ubiquitous sensors: An overview. Wearable Technol. Concepts Methodol. Tools Appl. 2018, 199–230.
- Roggen, D.; Troster, G.; Lukowicz, P.; Ferscha, A.; Millan, J.D.R.; Chavarriaga, R. Opportunistic human activity and context recognition. Computer 2012, 46, 36–45.
- Cao, J.; Lin, M.; Wang, H.; Fang, J.; Xu, Y. Towards Activity Recognition through Multidimensional Mobile Data Fusion with a Smartphone and Deep Learning. Mob. Inf. Syst. 2021, 2021, 1–11.
- Abdallah, Z.; Gaber, M.; Srinivasan, B.; Krishnaswamy, S. Activity Recognition with Evolving Data Streams. ACM Comput. Surv. 2018, 51, 3158645.
- Murtaza, F.; Yousaf, M.H.; Velastin, S.A.; Qian, Y. Vectors of temporally correlated snippets for temporal action detection. Comput. Electr. Eng. 2020, 85, 106654.
- Wang, P.; Li, W.; Ogunbona, P.; Wan, J.; Escalera, S. RGB-D-based human motion recognition with deep learning: A survey. Comput. Vis. Image Underst. 2018, 171, 118–139.
- Zhang, S.; Wei, Z.; Nie, J.; Shuang, W.; Wang, S.; Li, Z. A Review on Human Activity Recognition Using Vision-Based Method. J. Healthc. Eng. 2017, 2017, 1–31.
- Sarabu, A.; Santra, A.K. Human Action Recognition in Videos using Convolution Long Short-Term Memory Network with Spatio-Temporal Networks. Emerg. Sci. J. 2021, 5, 25–33.
- Aguileta, A.A.; Brena, R.F.; Mayora, O.; Molino-Minero-Re, E.; Trejo, L.A. Multi-Sensor Fusion for Activity Recognition—A Survey. Sensors 2019, 19, 3808.
- Xu, Y.; Shen, Z.; Zhang, X.; Gao, Y.; Deng, S.; Wang, Y.; Fan, Y.; Chang, E.I.-C. Learning multi-level features for sensor-based human action recognition. Pervasive Mob. Comput. 2017, 40, 324–338.
- Alshammari, T.; Alshammari, N.; Sedky, M.; Howard, C. SIMADL: Simulated Activities of Daily Living Dataset. Data 2018, 3, 11.
- Alsinglawi, B.; Nguyen, Q.V.; Gunawardana, U.; Maeder, A.; Simoff, S. RFID Systems in Healthcare Settings and Activity of Daily Living in Smart Homes: A Review. E-Health Telecommun. Syst. Netw. 2017, 6, 1–17.
- Vanus, J.; Belesova, J.; Martinek, R.; Nedoma, J.; Fajkus, M.; Bilik, P.; Zidek, J. Monitoring of the daily living activities in smart home care. Hum. Cent. Comput. Inf. Sci. 2017, 7, 30.
- Marques, B.; McIntosh, J.; Valera, A.; Gaddam, A. Innovative and Assistive eHealth Technologies for Smart Therapeutic and Rehabilitation Outdoor Spaces for the Elderly Demographic. Multimodal Technol. Interact. 2020, 4, 76.
- Zhu, Z.; Liu, T.; Li, G.; Li, T.; Inoue, Y. Wearable Sensor Systems for Infants. Sensors 2015, 15, 3721–3749.
- Kristoffersson, A.; Lindén, M. A Systematic Review on the Use of Wearable Body Sensors for Health Monitoring: A Qualitative Synthesis. Sensors 2020, 20, 1502.
- Mukhopadhyay, S.C. Wearable Sensors for Human Activity Monitoring: A Review. IEEE Sens. J. 2014, 15, 1321–1330.
- Schrack, J.A.; Cooper, R.; Koster, A.; Shiroma, E.J.; Murabito, J.M.; Rejeski, W.J.; Ferrucci, L.; Harris, T.B. Assessing Daily Physical Activity in Older Adults: Unraveling the Complexity of Monitors, Measures, and Methods. J. Gerontol. Ser. A Biomed. Sci. Med. Sci. 2016, 71, 1039–1048.
- Ignatov, A. Real-time human activity recognition from accelerometer data using Convolutional Neural Networks. Appl. Soft Comput. 2018, 62, 915–922.
- Hussain, F.; Hussain, F.; Ehatisham-ul-Haq, M.; Azam, M.A. Activity-Aware Fall Detection and Recognition based on Wearable Sensors. IEEE Sens. J. 2019, 19, 4528–4536.
- Xu, L.; Yang, W.; Cao, Y.; Li, Q. Human activity recognition based on random forests. In Proceedings of the 2017 13th International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery (ICNC-FSKD), Guilin, China, 29–31 July 2017; pp. 548–553.
- Fu, Z.; He, X.; Wang, E.; Huo, J.; Huang, J.; Wu, D. Personalized Human Activity Recognition Based on Integrated Wearable Sensor and Transfer Learning. Sensors 2021, 21, 885.
- Esfahani, M.I.M.; Nussbaum, M.A. Classifying Diverse Physical Activities Using “Smart Garments”. Sensors 2019, 19, 3133.
- Gadaleta, M.; Rossi, M. IDNet: Smartphone-based gait recognition with convolutional neural networks. Pattern Recognit. 2018, 74, 25–37.
- Chen, Z.; Zhang, L.; Cao, Z.; Guo, J. Distilling the Knowledge from Handcrafted Features for Human Activity Recognition. IEEE Trans. Ind. Inform. 2018, 14, 4334–4342.
- Lee, K.; Kwan, M.-P. Physical activity classification in free-living conditions using smartphone accelerometer data and exploration of predicted results. Comput. Environ. Urban Syst. 2018, 67, 124–131.
- Incel, O.D.; Kose, M.; Ersoy, C. A Review and Taxonomy of Activity Recognition on Mobile Phones. BioNanoScience 2013, 3, 145–171.
- Shoaib, M.; Bosch, S.; Incel, O.D.; Scholten, J.; Havinga, P.J. A Survey of Online Activity Recognition Using Mobile Phones. Sensors 2015, 15, 2059–2085.
- Dao, M.-S.; Nguyen-Gia, T.-A.; Mai, V.-C. Daily Human Activities Recognition Using Heterogeneous Sensors from Smartphones. Procedia Comput. Sci. 2017, 111, 323–328.
- Shoaib, M.; Bosch, S.; Incel, O.D.; Scholten, J.; Havinga, P.J.M. Complex Human Activity Recognition Using Smartphone and Wrist-Worn Motion Sensors. Sensors 2016, 16, 426.
- Shoaib, M.; Bosch, S.; Scholten, H.; Havinga, P.J.M.; Incel, O.D. Towards detection of bad habits by fusing smartphone and smartwatch sensors. In Proceedings of the 2015 IEEE International Conference on Pervasive Computing and Communication Workshops, PerCom Workshops, St. Louis, MO, USA, 23–27 March 2015; pp. 591–596.
- Ranieri, C.M.; MacLeod, S.; Dragone, M.; Vargas, P.A.; Romero, R.A.F. Activity Recognition for Ambient Assisted Living with Videos, Inertial Units and Ambient Sensors. Sensors 2021, 21, 768.
- Cao, L.; Wang, Y.; Zhang, B.; Jin, Q.; Vasilakos, A.V. GCHAR: An efficient Group-based Context-aware human activity recognition on smartphone. J. Parallel Distrib. Comput. 2018, 118, 67–80.
- Otebolaku, A.M.; Andrade, M.T. User context recognition using smartphone sensors and classification models. J. Netw. Comput. Appl. 2016, 66, 33–51.
- Fahim, M.; Khattak, A.M.; Baker, T.; Chow, F.; Shah, B. Micro-context recognition of sedentary behaviour using smartphone. In Proceedings of the 2016 6th International Conference on Digital Information and Communication Technology and Its Applications, DICTAP, Konya, Turkey, 21–23 July 2016; pp. 30–34.
- Ellis, K.; Godbole, S.; Kerr, J.; Lanckriet, G. Multi-sensor physical activity recognition in free-living. In Proceedings of the ACM International Joint Conference on Pervasive and Ubiquitous Computing Adjunct Publication-UbiComp ’14 Adjunct, Seattle, WA, USA, 13–17 September 2014; pp. 431–440.
- Guiry, J.J.; Van De Ven, P.; Nelson, J. Multi-Sensor Fusion for Enhanced Contextual Awareness of Everyday Activities with Ubiquitous Devices. Sensors 2014, 14, 5687–5701.
- Vaizman, Y.; Ellis, K.; Lanckriet, G.R.G. Recognizing Detailed Human Context in the Wild from Smartphones and Smartwatches. IEEE Pervasive Comput. 2017, 16, 62–74.
- Safyan, M.; Sarwar, S.; Qayyum, Z.U.; Iqbal, M.; Li, S.; Kashif, M. Machine Learning based Activity learning for Behavioral Contexts in Internet of Things. Proc. Inst. Syst. Program. RAS 2021, 33, 47–58.
- Marques, G.; Miranda, N.; Bhoi, A.K.; Garcia-Zapirain, B.; Hamrioui, S.; Díez, I.D.L.T. Internet of Things and Enhanced Living Environments: Measuring and Mapping Air Quality Using Cyber-physical Systems and Mobile Computing Technologies. Sensors 2020, 20, 720.
- Rehman, A.; Iqbal, M.; Xing, H.; Ahmed, I. COVID-19 Detection Empowered with Machine Learning and Deep Learning Techniques: A Systematic Review. Appl. Sci. 2021, 11, 3414.
- Klumpp, M.; Hintze, M.; Immonen, M.; Ródenas-Rigla, F.; Pilati, F.; Aparicio-Martínez, F.; Çelebi, D.; Liebig, T.; Jirstrand, M.; Urbann, O.; et al. Artificial Intelligence for Hospital Health Care: Application Cases and Answers to Challenges in European Hospitals. Healthcare 2021, 9, 961.
- Massaro, A.; Maritati, V.; Savino, N.; Galiano, A. Neural Networks for Automated Smart Health Platforms oriented on Heart Predictive Diagnostic Big Data Systems. In Proceedings of the 2018 AEIT International Annual Conference, Bari, Italy, 3–5 October 2018.
- Fahad, L.G.; Ali, A.; Rajarajan, M. Learning models for activity recognition in smart homes. In Information Science and Applications; Springer: Berlin/Heidelberg, Germany, 2015; pp. 819–826.
- Lu, L.; Qing-Ling, C.; Yi-Ju, Z. Activity Recognition in Smart Homes. Multimed. Tools Appl. 2016, 76, 24203–24220.
- Chahuara, P.; Fleury, A.; Vacher, M. On-line Human Activity Recognition from Audio and Home Automation Sensors. J. Ambient Intell. Smart Environ. 2016, 8, 399–422.
- Ni, Q.; Hernando, A.B.G.; de la Cruz, I.P. A Context-Aware System Infrastructure for Monitoring Activities of Daily Living in Smart Home. J. Sens. 2016, 2016, 9493047.
- Ghayvat, H.; Mukhopadhyay, S.; Shenjie, B.; Chouhan, A.; Chen, W. Smart home based ambient assisted living: Recognition of anomaly in the activity of daily living for an elderly living alone. In Proceedings of the I2MTC 2018-2018 IEEE International Instrumentation and Measurement Technology Conference: Discovering New Horizons in Instrumentation and Measurement, Houston, TX, USA, 14–17 May 2018; pp. 1–5.
- Muheidat, F.; Tawalbeh, L.; Tyrer, H. Context-Aware, Accurate, and Real Time Fall Detection System for Elderly People. In Proceedings of the 12th IEEE International Conference on Semantic Computing, ICSC, Laguna Hills, CA, USA, 31 January–2 February 2018; pp. 329–333.
- Esfahani, M.I.M.; Nussbaum, M.A. Preferred Placement and Usability of a Smart Textile System vs. Inertial Measurement Units for Activity Monitoring. Sensors 2018, 18, 2501.
- Cleland, I.; Kikhia, B.; Nugent, C.; Boytsov, A.; Hallberg, J.; Synnes, K.; McClean, S.; Finlay, D. Optimal Placement of Accelerometers for the Detection of Everyday Activities. Sensors 2013, 13, 9183–9200.
- Boerema, S.T.; Van Velsen, L.; Schaake, L.; Tönis, T.M.; Hermens, H.J. Optimal Sensor Placement for Measuring Physical Activity with a 3D Accelerometer. Sensors 2014, 14, 3188–3206.
- Özdemir, A.T. An Analysis on Sensor Locations of the Human Body for Wearable Fall Detection Devices: Principles and Practice. Sensors 2016, 16, 1161.
- Martinez-Hernandez, U.; Dehghani-Sanij, A.A. Probabilistic identification of sit-to-stand and stand-to-sit with a wearable sensor. Pattern Recognit. Lett. 2018, 118, 32–41.
- Mehrang, S.; Pietilä, J.; Korhonen, I. An Activity Recognition Framework Deploying the Random Forest Classifier and A Single Optical Heart Rate Monitoring and Triaxial Accelerometer Wrist-Band. Sensors 2018, 18, 613.
- Bharti, P.; De, D.; Chellappan, S.; Das, S.K. HuMAn: Complex Activity Recognition with Multi-Modal Multi-Positional Body Sensing. IEEE Trans. Mob. Comput. 2018, 18, 857–870.
- Anwary, A.R.; Yu, H.; Vassallo, M. Gait Evaluation Using Procrustes and Euclidean Distance Matrix Analysis. IEEE J. Biomed. Health Inform. 2018, 23, 2021–2029.
- Russell, B.; McDaid, A.; Toscano, W.; Hume, P. Moving the Lab into the Mountains: A Pilot Study of Human Activity Recognition in Unstructured Environments. Sensors 2021, 21, 654.
- Antos, S.A.; Albert, M.V.; Kording, K. Hand, belt, pocket or bag: Practical activity tracking with mobile phones. J. Neurosci. Methods 2013, 231, 22–30.
- Khan, A.M.; Siddiqi, M.H.; Lee, S.-W. Exploratory Data Analysis of Acceleration Signals to Select Light-Weight and Accurate Features for Real-Time Activity Recognition on Smartphones. Sensors 2013, 13, 13099–13122.
- Shoaib, M.; Bosch, S.; Incel, O.D.; Scholten, J.; Havinga, P.J.M. Fusion of Smartphone Motion Sensors for Physical Activity Recognition. Sensors 2014, 14, 10146–10176.
- Shi, D.; Wang, R.; Wu, Y.; Mo, X.; Wei, J. A novel orientation- and location-independent activity recognition method. Pers. Ubiquitous Comput. 2017, 21, 427–441.
- Martín, H.; Bernardos, A.M.; Iglesias, J.; Casar, J.R. Activity logging using lightweight classification techniques in mobile devices. Pers. Ubiquitous Comput. 2012, 17, 675–695.
- Coskun, D.; Incel, O.D.; Ozgovde, A. Phone position/placement detection using accelerometer: Impact on activity recognition. In Proceedings of the 2015 IEEE 10th International Conference on Intelligent Sensors, Sensor Networks and Information Processing, Singapore, 7–9 April 2015.
- Nalepa, G.J.; Kutt, K.; Bobek, S. Mobile platform for affective context-aware systems. Futur. Gener. Comput. Syst. 2019, 92, 490–503.
- Hoseini-Tabatabaei, S.A.; Gluhak, A.; Tafazolli, R. A survey on smartphone-based systems for opportunistic user context recognition. ACM Comput. Surv. 2013, 45, 1–51.
- Esfahani, P.; Malazi, H.T. PAMS: A new position-aware multi-sensor dataset for human activity recognition using smartphones. In Proceedings of the 2017 19th International Symposium on Computer Architecture and Digital Systems, CADS, Kish Island, Iran, 21–22 December 2017; pp. 1–7.
- Kohavi, R. Scaling Up the Accuracy of Naive-Bayes Classifiers: A Decision-Tree Hybrid. Proc. Second Int. Conf. Knowl. Discov. Data Min. 1996, 7, 202–207.
- Lim, H.; An, G.; Cho, Y.; Lee, K.; Suh, B. WhichHand: Automatic Recognition of a Smartphone’s Position in the Hand Using a Smartwatch. In Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, Florence, Italy, 6–9 September 2016; pp. 675–681.
- Yang, Z.; Shangguan, L.; Gu, W.; Zhou, Z.; Wu, C.; Liu, Y. Sherlock: Micro-Environment Sensing for Smartphones. IEEE Trans. Parallel Distrib. Syst. 2014, 25, 3295–3305.
- Li, X.; Goldberg, D.W. Toward a mobile crowdsensing system for road surface assessment. Comput. Environ. Urban Syst. 2018, 69, 51–62.
- Wang, A.; Chen, G.; Yang, J.; Zhao, S.; Chang, C.-Y. A Comparative Study on Human Activity Recognition Using Inertial Sensors in a Smartphone. IEEE Sens. J. 2016, 16, 4566–4578.
- Costa, A.A.M.; Almeida, H.; Lorayne, A.; de Sousa, R.R.; Perkusich, A.; Ramos, F.B.A. Combining Smartphone and Smartwatch Sensor Data in Activity Recognition Approaches: An Experimental Evaluation. In Proceedings of the 28th International Conference on Software Engineering and Knowledge Engineering, Redwood City, CA, USA, 1–3 July 2016; pp. 267–272.
- Nweke, H.F.; Teh, Y.W.; Al-Garadi, M.A.; Alo, U.R. Deep learning algorithms for human activity recognition using mobile and wearable sensor networks: State of the art and research challenges. Expert Syst. Appl. 2018, 105, 233–261.
- Monteiro, J.; Granada, R.; Barros, R.C.; Meneguzzi, F. Deep neural networks for kitchen activity recognition. In Proceedings of the International Joint Conference on Neural Networks, Anchorage, AK, USA, 14–19 May 2017; pp. 2048–2055.
- Mekruksavanich, S.; Jitpattanakul, A. LSTM Networks Using Smartphone Data for Sensor-Based Human Activity Recognition in Smart Homes. Sensors 2021, 21, 1636.
- Ramos, R.; Domingo, J.; Zalama, E.; Gómez-García-Bermejo, J. Daily Human Activity Recognition Using Non-Intrusive Sensors. Sensors 2021, 21, 5270.
- Nafea, O.; Abdul, W.; Muhammad, G.; Alsulaiman, M. Sensor-Based Human Activity Recognition with Spatio-Temporal Deep Learning. Sensors 2021, 21, 2141.
- Wang, J.; Chen, Y.; Hao, S.; Peng, X.; Hu, L. Deep learning for sensor-based activity recognition: A survey. Pattern Recognit. Lett. 2019, 119, 3–11.
- Hammerla, N.Y.; Halloran, S.; Plötz, T. Deep, convolutional, and recurrent models for human activity recognition using wearables. In Proceedings of the IJCAI International Joint Conference on Artificial Intelligence, New York, NY, USA, 9–15 July 2016; pp. 1533–1540.
- Anguita, D.; Ghio, A.; Oneto, L.; Parra, X.; Reyes-Ortiz, J.L. A Public Domain Dataset for Human Activity Recognition Using Smartphones. In Proceedings of the 21st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, ESANN, Bruges, Belgium, 24–26 April 2013.
- Sucerquia, A.; López, J.D.; Vargas-Bonilla, J.F. SisFall: A Fall and Movement Dataset. Sensors 2017, 17, 198.
- Vavoulas, G.; Chatzaki, C.; Malliotakis, T.; Pediaditis, M.; Tsiknakis, M. The MobiAct Dataset: Recognition of Activities of Daily Living using Smartphones. In Proceedings of the International Conference on Information and Communication Technologies for Ageing Well and e-Health, Rome, Italy, 21–22 April 2016; pp. 143–151.
- Chen, C.; Jafari, R.; Kehtarnavaz, N. UTD-MHAD: A multimodal dataset for human action recognition utilizing a depth camera and a wearable inertial sensor. In Proceedings of the 2015 IEEE International Conference on Image Processing (ICIP), Quebec City, QC, Canada, 27–30 September 2015; pp. 168–172.
- Liu, H.; Hartmann, Y.; Schultz, T. CSL-SHARE: A Multimodal Wearable Sensor-Based Human Activity Dataset. Front. Comput. Sci. 2021, 3, 759136.
- Lima, W.S.; Souto, E.; El-Khatib, K.; Jalali, R.; Gama, J. Human Activity Recognition Using Inertial Sensors in a Smartphone: An Overview. Sensors 2019, 19, 3213.
- Vaizman, Y.; Weibel, N. Context Recognition In-the-Wild: Unified Model for Multi-Modal Sensors and Multi-Label Classification. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2017, 168, 22.
- Ehatisham-Ul-Haq, M.; Azam, M.A.; Loo, J.; Shuang, K.; Islam, S.; Naeem, U.; Amin, Y. Authentication of Smartphone Users Based on Activity Recognition and Mobile Sensing. Sensors 2017, 17, 2043.
- Ehatisham-Ul-Haq, M.; Azam, M.A.; Naeem, U.; Amin, Y.; Loo, J. Continuous authentication of smartphone users based on activity pattern recognition using passive mobile sensing. J. Netw. Comput. Appl. 2018, 109, 24–35.
- Hall, M.A.; Smith, L.A. Feature subset selection: A correlation based filter approach. In Progress in Connectionist-Based Information Systems; Kasabov, N., Kozma, R., Ko, K., O’Shea, R., Coghill, G., Gedeon, T., Eds.; Springer: Berlin/Heidelberg, Germany, 1997; pp. 855–858.
- Friedman, J.H. Greedy Function Approximation: A Gradient Boosting Machine. Ann. Stat. 2001, 29, 1189–1232.
- Kothari, S.; Oh, H. Neural Networks for Pattern Recognition. In Advances in Computers; Elsevier: Amsterdam, The Netherlands, 1993; pp. 119–166.
- Barga, R.; Fontama, V.; Tok, W.H. Predictive Analytics with Microsoft Azure Machine Learning; Springer: Berlin/Heidelberg, Germany, 2015.
- Dernbach, S.; Das, B.; Krishnan, N.C.; Thomas, B.L.; Cook, D.J. Simple and Complex Activity Recognition through Smart Phones. In Proceedings of the 2012 8th International Conference on Intelligent Environments (IE), Guanajuato, Mexico, 26–29 June 2012; pp. 214–221.
- Rifkin, R.; Klautau, A. In defense of one-vs-all classification. J. Mach. Learn. Res. 2004, 5, 101–141.
- San-Segundo, R.; Blunck, H.; Moreno-Pimentel, J.; Stisen, A.; Gil-Martín, M. Robust Human Activity Recognition using smartwatches and smartphones. Eng. Appl. Artif. Intell. 2018, 72, 190–202.
- Oniga, S.; Süto, J. Human activity recognition using neural networks. In Proceedings of the 2014 15th International Carpathian Control Conference, ICCC, Velke Karlovice, Czech Republic, 28–30 May 2014; pp. 403–406.
- Yao, S.; Hu, S.; Zhao, Y.; Zhang, A.; Abdelzaher, T. DeepSense: A unified deep learning framework for time-series mobile sensing data processing. In Proceedings of the 26th International World Wide Web Conference, WWW, Perth, Australia, 3–7 April 2017; pp. 351–360.
- Chawla, N.V.; Bowyer, K.W.; Hall, L.O.; Kegelmeyer, W.P. SMOTE: Synthetic minority over-sampling technique. J. Artif. Intell. Res. 2002, 16, 321–357.
- Catal, C.; Tufekci, S.; Pirmit, E.; Kocabag, G. On the use of ensemble of classifiers for accelerometer-based activity recognition. Appl. Soft Comput. 2015, 37, 1018–1022.
- Hassan, M.M.; Uddin, Z.; Mohamed, A.; Almogren, A. A robust human activity recognition system using smartphone sensors and deep learning. Futur. Gener. Comput. Syst. 2018, 81, 307–313.
- Garcia-Ceja, E.; Galván-Tejada, C.E.; Brena, R. Multi-view stacking for activity recognition with sound and accelerometer data. Inf. Fusion 2018, 40, 45–56.
- Fatima, I.; Fahim, M.; Lee, Y.-K.; Lee, S. A Genetic Algorithm-based Classifier Ensemble Optimization for Activity Recognition in Smart Homes. KSII Trans. Internet Inf. Syst. 2013, 7, 2853–2873.
- Ravi, D.; Wong, C.; Lo, B.; Yang, G.-Z. A Deep Learning Approach to on-Node Sensor Data Analytics for Mobile or Wearable Devices. IEEE J. Biomed. Health Inform. 2016, 21, 56–64.
- Aly, H.; Ismail, M.A. UbiMonitor: Intelligent fusion of body-worn sensors for real-time human activity recognition. In Proceedings of the ACM Symposium on Applied Computing, Salamanca, Spain, 13–17 April 2015; pp. 563–568.
- Wang, Y.; Cang, S.; Yu, H. A Data Fusion-Based Hybrid Sensory System for Older People’s Daily Activity and Daily Routine Recognition. IEEE Sens. J. 2018, 18, 6874–6888.
Primary Physical Activities | | | Physical Activities and Behavioral Contexts | | | Physical Activities and Phone Contexts |
---|---|---|---|---|---|---|---
Code | Activity | Count | Code | (Activity, Behavioral Context) | Count | (Activity, Phone Context) | Count
A1 | Lying | 20,348 | A1C1 | (Lying, Sleeping) | 19,001 | (Lying, Phone in Hand) | 134 |
A2 | Sitting | 15,647 | A1C2 | (Lying, Surfing the Internet) | 1069 | (Lying, Phone on Table) | 20,214 |
A3 | Walking | 2790 | A1C3 | (Lying, Watching TV) | 278 | (Sitting, Phone in Bag) | 992
A4 | Standing | 7115 | A2C1 | (Sitting, Surfing the Internet) | 7501 | (Sitting, Phone in Hand) | 1214
A5 | Running | 3488 | A2C2 | (Sitting, In a Car) | 1427 | (Sitting, Phone in Pocket) | 618 |
A6 | Bicycling | 713 | A2C3 | (Sitting, In a Meeting) | 1084 | (Sitting, Phone on Table) | 12,823 |
- | - | - | A2C4 | (Sitting, Watching TV) | 5635 | (Walking, Phone in Bag) | 383 |
- | - | - | A3C1 | (Walking, Indoor) | 534 | (Walking, Phone in Hand) | 768 |
- | - | - | A3C2 | (Walking, Outdoor) | 1715 | (Walking, Phone in Pocket) | 1406 |
- | - | - | A3C3 | (Walking, Shopping) | 145 | (Walking, Phone on Table) | 233 |
- | - | - | A3C4 | (Walking, Talking) | 396 | (Standing, Phone in Bag) | 426 |
- | - | - | A4C1 | (Standing, Indoor) | 6477 | (Standing, Phone in Hand) | 587 |
- | - | - | A4C2 | (Standing, Outdoor) | 638 | (Standing, Phone in Pocket) | 2013 |
- | - | - | A5C1 | (Running, Exercise) | 3488 | (Standing, Phone on Table) | 4089 |
- | - | - | A6C1 | (Bicycling, Exercise) | 713 | (Running, Phone in Pocket) | 3488 |
- | - | - | - | - | - | (Bicycling, Phone in Pocket) | 713 |
Experiment Type | Based on (Activity) | Selected Features for Each Sensor Axis | Feature Vector Length per Sensor
---|---|---|---
PPAR | - | → {F2, F5, F8, F15, F12, F13, F16}; → {F2, F5, F8, F10, F13, F14, F15, F16}; → {F3, F4, F5, F8, F9, F10, F12, F13, F14, F15, F16, F17} | 27
 | | → {F1, F4, F5, F10, F14, F15, F16, F17, F18}; → {F4, F5, F8, F10, F15, F16, F17, F18, F20}; → {F2, F5, F8, F10, F12, F15, F16, F17, F19, F20} | 28
BCR | Lying | → {F1, F2, F4, F6, F8, F10, F11, F20}; → {F1, F2, F6, F10, F15, F18, F20}; → {F2, F6, F10, F11, F12, F19, F20} | 22
 | | → {F1, F2, F4, F6, F10, F11, F15, F23}; → {F1, F2, F6, F10, F15, F16, F20}; → {F1, F6, F10, F11, F12, F19, F20} | 22
 | Sitting | → {F2, F4, F6, F10, F11, F20}; → {F1, F2, F6, F10, F15, F20}; → {F1, F2, F6, F10, F11, F12, F14, F20} | 19
 | | → {F1, F2, F4, F6, F10, F11, F20}; → {F1, F2, F4, F10, F15, F16, F20}; → {F2, F6, F10, F11, F12, F14, F20} | 21
 | Walking | → {F2, F4, F5, F15, F10, F20}; → {F1, F2, F6, F10, F15, F20}; → {F2, F6, F10, F11, F12, F20} | 19
 | | → {F2, F4, F6, F10, F11, F15, F20}; → {F1, F2, F6, F10, F15, F20}; → {F1, F2, F6, F10, F11, F12, F14, F20} | 21
 | Standing | → {F2, F4, F6, F8, F10, F11, F14, F20}; → {F1, F2, F4, F6, F10, F15, F19}; → {F1, F2, F4, F6, F10, F11, F12, F19, F20} | 24
 | | → {F2, F4, F6, F10, F11, F19}; → {F2, F4, F10, F15, F19}; → {F1, F2, F6, F8, F10, F11, F12, F19, F20} | 20
PCR | Lying | → {F2, F4, F10, F19}; → {F2, F6, F10, F19, F20}; → {F2, F4, F20} | 12
 | Sitting | → {F2, F4, F10, F11}; → {F4, F10, F20}; → {F4, F10, F19} | 10
 | Walking | → {F2, F4, F10, F19, F20}; → {F2, F6, F10, F15}; → {F2, F4, F10, F11} | 13
 | Standing | → {F2, F10, F20}; → {F6, F10, F19}; → {F2, F10, F15} | 09
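Reading the table above: for each experiment, the features selected for every axis of a sensor are concatenated into one per-sensor vector, which is where the "Feature Vector Length per Sensor" column comes from. The sketch below illustrates that assembly only; the feature codes are copied from the PPAR row, their definitions belong to the paper's feature-extraction step, and the axis keys and placeholder values are assumptions.

```python
import numpy as np

# Per-axis feature subsets for one experiment (codes as in the PPAR row above;
# the mapping of each F-code to a concrete statistic is defined in the paper).
selected = {
    "x": ["F2", "F5", "F8", "F15", "F12", "F13", "F16"],
    "y": ["F2", "F5", "F8", "F10", "F13", "F14", "F15", "F16"],
    "z": ["F3", "F4", "F5", "F8", "F9", "F10", "F12", "F13", "F14", "F15", "F16", "F17"],
}

def build_feature_vector(features_by_axis, selected):
    """Concatenate the selected features of each axis into one per-sensor vector."""
    parts = []
    for axis, codes in selected.items():
        parts.extend(features_by_axis[axis][code] for code in codes)
    return np.asarray(parts)

# Placeholder feature values, just to show the resulting dimensionality.
features_by_axis = {axis: {code: 0.0 for code in codes} for axis, codes in selected.items()}
print(build_feature_vector(features_by_axis, selected).shape)  # (27,), matching the PPAR row
```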
Experiment Type | Based on (Activity) | Sensor(s) | BDT Hyperparameters | | | | NN Hyperparameters |
---|---|---|---|---|---|---|---|---
 | | | No. of Leaves | Minimum Leaf Instances | Learning Rate | No. of Trees | No. of Iterations | Learning Rate
PPAR | - | A | 61 | 17 | 0.2699 | 178 | 114 | 0.0368
 | | W | 32 | 06 | 0.2520 | 270 | 88 | 0.0387
 | | A + W | 26 | 02 | 0.2993 | 358 | 128 | 0.0287
BCR | Lying | A | 08 | 50 | 0.1000 | 500 | 82 | 0.0320
 | | W | 04 | 06 | 0.1120 | 57 | 40 | 0.0109
 | | A + W | 86 | 22 | 0.0636 | 87 | 23 | 0.0306
 | Sitting | A | 128 | 10 | 0.4000 | 100 | 131 | 0.0135
 | | W | 54 | 19 | 0.3364 | 51 | 131 | 0.0135
 | | A + W | 62 | 02 | 0.4000 | 94 | 51 | 0.3380
 | Walking | A | 28 | 10 | 0.4000 | 100 | 96 | 0.0396
 | | W | 32 | 05 | 0.2520 | 270 | 46 | 0.0144
 | | A + W | 61 | 17 | 0.2699 | 178 | 97 | 0.3960
 | Standing | A | 32 | 50 | 0.2000 | 100 | 111 | 0.0158
 | | W | 17 | 13 | 0.0629 | 50 | 55 | 0.0315
 | | A + W | 30 | 16 | 0.2358 | 27 | 135 | 0.0135
PCR | Lying | A | 06 | 47 | 0.1400 | 233 | 58 | 0.0101
 | Sitting | A | 59 | 27 | 0.3911 | 22 | 109 | 0.3030
 | Standing | A | 128 | 01 | 0.4000 | 20 | 23 | 0.0309
 | Walking | A | 36 | 07 | 0.3331 | 182 | 121 | 0.0301
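These hyperparameters map onto familiar arguments of common boosted-tree and feed-forward-network libraries. As a hedged illustration only (the paper tuned its own toolchain, so LightGBM and scikit-learn act as stand-ins here, and the mapping of the table columns onto these arguments is an assumption), the PPAR smartphone-accelerometer (A) row could be written as:

```python
from lightgbm import LGBMClassifier
from sklearn.neural_network import MLPClassifier

# Values taken from the PPAR / sensor "A" row of the table above.
bdt = LGBMClassifier(
    num_leaves=61,          # No. of Leaves
    min_child_samples=17,   # Minimum Leaf Instances
    learning_rate=0.2699,   # Learning Rate
    n_estimators=178,       # No. of Trees
)

nn = MLPClassifier(
    max_iter=114,               # No. of Iterations
    learning_rate_init=0.0368,  # Learning Rate
)

# Both models would then be trained on the selected feature vectors, e.g.:
# bdt.fit(X_train, y_train); nn.fit(X_train, y_train)
```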
Classifier | Sensor(s) | Accuracy | Precision | Sensitivity | Micro-F1 | Macro-F1 | Log Loss |
---|---|---|---|---|---|---|---|
BDT | Acc. | 0.959 | 0.856 | 0.807 | 0.877 | 0.829 | 1.215
 | W. Acc. | 0.941 | 0.825 | 0.708 | 0.822 | 0.752 | 1.414
 | Acc. + W. Acc. | 0.974 | 0.907 | 0.881 | 0.920 | 0.893 | 0.787
NN | Acc. | 0.900 | 0.744 | 0.579 | 0.700 | 0.628 | 1.284
 | W. Acc. | 0.861 | 0.691 | 0.503 | 0.583 | 0.553 | 1.748
 | Acc. + W. Acc. | 0.921 | 0.789 | 0.677 | 0.763 | 0.718 | 1.018
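The metric columns reported in this and the following tables (accuracy, precision, sensitivity, micro- and macro-averaged F1, and log loss, plus the balanced accuracy used in the comparison with existing schemes) can be reproduced from predictions with standard scikit-learn calls. A minimal sketch with toy arrays follows; the choice of macro averaging for precision and sensitivity is an assumption, since the averaging mode is not restated here.

```python
import numpy as np
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, log_loss, balanced_accuracy_score)

# Toy multi-class predictions, for illustration only.
y_true = np.array([0, 1, 2, 2, 1, 0])
y_pred = np.array([0, 1, 2, 1, 1, 0])
y_prob = np.array([[0.8, 0.1, 0.1], [0.1, 0.7, 0.2], [0.2, 0.2, 0.6],
                   [0.1, 0.6, 0.3], [0.2, 0.7, 0.1], [0.9, 0.05, 0.05]])

metrics = {
    "Accuracy":    accuracy_score(y_true, y_pred),
    "Precision":   precision_score(y_true, y_pred, average="macro"),
    "Sensitivity": recall_score(y_true, y_pred, average="macro"),  # recall == sensitivity
    "Micro-F1":    f1_score(y_true, y_pred, average="micro"),
    "Macro-F1":    f1_score(y_true, y_pred, average="macro"),
    "Log Loss":    log_loss(y_true, y_prob),
    "BALACC":      balanced_accuracy_score(y_true, y_pred),
}
print(metrics)
```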
Activity | Classifier | Sensor(s) | Accuracy | Precision | Sensitivity | Micro-F1 | Macro-F1 | Log Loss |
---|---|---|---|---|---|---|---|---|
Lying | BDT | Acc. | 0.997 | 0.794 | 0.698 | 0.944 | 0.755 | 3.640
 | | W. Acc. | 0.997 | 0.945 | 0.719 | 0.973 | 0.803 | 1.502
 | | Acc. + W. Acc. | 0.985 | 0.940 | 0.812 | 0.976 | 0.868 | 1.287
 | NN | Acc. | 0.963 | 0.716 | 0.461 | 0.934 | 0.514 | 2.019
 | | W. Acc. | 0.991 | 0.722 | 0.514 | 0.945 | 0.601 | 1.650
 | | Acc. + W. Acc. | 0.969 | 0.813 | 0.602 | 0.953 | 0.674 | 1.867
Sitting | BDT | Acc. | 0.986 | 0.956 | 0.960 | 0.975 | 0.958 | 0.178
 | | W. Acc. | 0.970 | 0.953 | 0.920 | 0.939 | 0.935 | 0.435
 | | Acc. + W. Acc. | 0.991 | 0.984 | 0.971 | 0.982 | 0.978 | 0.147
 | NN | Acc. | 0.928 | 0.824 | 0.869 | 0.856 | 0.844 | 0.465
 | | W. Acc. | 0.911 | 0.850 | 0.747 | 0.821 | 0.785 | 0.877
 | | Acc. + W. Acc. | 0.961 | 0.924 | 0.902 | 0.922 | 0.913 | 0.321
Walking | BDT | Acc. | 0.918 | 0.854 | 0.683 | 0.835 | 0.741 | 1.560
 | | W. Acc. | 0.859 | 0.703 | 0.485 | 0.717 | 0.535 | 1.980
 | | Acc. + W. Acc. | 0.926 | 0.847 | 0.715 | 0.852 | 0.764 | 1.437
 | NN | Acc. | 0.823 | 0.441 | 0.324 | 0.646 | 0.318 | 1.857
 | | W. Acc. | 0.818 | 0.433 | 0.316 | 0.626 | 0.310 | 2.296
 | | Acc. + W. Acc. | 0.861 | 0.688 | 0.492 | 0.721 | 0.536 | 1.680
Standing | BDT | Acc. | 0.996 | 0.987 | 0.986 | 0.996 | 0.986 | 0.057
 | | W. Acc. | 0.988 | 0.973 | 0.966 | 0.988 | 0.970 | 0.111
 | | Acc. + W. Acc. | 0.996 | 0.991 | 0.986 | 0.996 | 0.988 | 0.049
 | NN | Acc. | 0.983 | 0.988 | 0.907 | 0.983 | 0.943 | 0.241
 | | W. Acc. | 0.987 | 0.880 | 0.974 | 0.987 | 0.920 | 1.118
 | | Acc. + W. Acc. | 0.971 | 0.957 | 0.963 | 0.971 | 0.960 | 0.090
Behavioral context confusion matrices (rows: ground truth; columns: predicted output):

Lying (A1) | A1C1 | A1C2 | A1C3
---|---|---|---
A1C1 | 99.63% | 0.33% | 0.04%
A1C2 | 31.1% | 68.8% | 0.1%
A1C3 | 10.8% | 14.0% | 75.2%

Sitting (A2) | A2C1 | A2C2 | A2C3 | A2C4
---|---|---|---|---
A2C1 | 99.5% | 0.1% | 0.0% | 0.4%
A2C2 | 2.6% | 96.6% | 0.4% | 0.4%
A2C3 | 2.3% | 0.3% | 94.5% | 3.0%
A2C4 | 1.9% | 0.0% | 0.3% | 97.8%

Walking (A3) | A3C1 | A3C2 | A3C3 | A3C4
---|---|---|---|---
A3C1 | 82.2% | 15.2% | 0.4% | 2.2%
A3C2 | 2.4% | 94.9% | 0.5% | 2.3%
A3C3 | 0.0% | 47.6% | 46.9% | 5.5%
A3C4 | 1.5% | 35.6% | 1.0% | 61.9%

Standing (A4) | A4C1 | A4C2
---|---|---
A4C1 | 99.8% | 0.2%
A4C2 | 2.7% | 97.3%
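The percentages in these matrices are row-normalized: each ground-truth row is expressed as a share of that row's samples, so each row sums to roughly 100%. A small illustrative sketch of producing the same presentation from raw predictions:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def row_normalized_confusion(y_true, y_pred, labels):
    """Confusion matrix with each ground-truth row expressed as percentages."""
    cm = confusion_matrix(y_true, y_pred, labels=labels).astype(float)
    row_sums = cm.sum(axis=1, keepdims=True)
    return 100.0 * cm / np.where(row_sums == 0, 1, row_sums)

# Toy example reusing the lying-activity context codes from above.
labels = ["A1C1", "A1C2", "A1C3"]
y_true = ["A1C1", "A1C1", "A1C2", "A1C3", "A1C3"]
y_pred = ["A1C1", "A1C2", "A1C2", "A1C3", "A1C1"]
print(np.round(row_normalized_confusion(y_true, y_pred, labels), 1))
```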
Classifier | Activity | Accuracy | Precision | Sensitivity | Micro-F1 | Macro-F1 | Log Loss |
---|---|---|---|---|---|---|---|
BDT | Lying | 0.996 | 0.923 | 0.772 | 0.996 | 0.831 | 1.420
 | Sitting | 0.982 | 0.917 | 0.904 | 0.964 | 0.911 | 0.426
 | Walking | 0.889 | 0.753 | 0.684 | 0.777 | 0.711 | 1.731
 | Standing | 0.991 | 0.975 | 0.974 | 0.982 | 0.974 | 0.076
NN | Lying | 0.993 | 0.497 | 0.500 | 0.993 | 0.498 | 2.505
 | Sitting | 0.916 | 0.473 | 0.338 | 0.831 | 0.345 | 1.578
 | Walking | 0.776 | 0.427 | 0.336 | 0.552 | 0.313 | 1.356
 | Standing | 0.861 | 0.697 | 0.720 | 0.721 | 0.698 | 1.352
Phone context confusion matrices (rows: ground truth; columns: predicted output; PB = Phone in Bag, PH = Phone in Hand, PP = Phone in Pocket, PT = Phone on Table):

Lying (A1) | PH | PT
---|---|---
PH | 54.5% | 45.5%
PT | 0.1% | 99.9%

Sitting (A2) | PB | PH | PP | PT
---|---|---|---|---
PB | 96.1% | 1.5% | 0.1% | 2.3%
PH | 1.2% | 80.6% | 1.6% | 16.5%
PP | 1.3% | 5.5% | 86.6% | 6.6%
PT | 0.1% | 1.0% | 0.4% | 98.5%

Walking (A3) | PB | PH | PP | PT
---|---|---|---|---
PB | 59.3% | 16.2% | 22.7% | 1.8%
PH | 7.6% | 73.6% | 17.7% | 1.2%
PP | 2.8% | 6.5% | 89.4% | 1.4%
PT | 5.6% | 14.6% | 28.3% | 51.5%

Standing (A4) | PB | PH | PP | PT
---|---|---|---|---
PB | 99.1% | 0.5% | 0.0% | 0.5%
PH | 0.7% | 93.2% | 1.4% | 4.8%
PP | 0.0% | 0.4% | 98.6% | 1.0%
PT | 0.0% | 0.7% | 0.6% | 98.8%
Study | Activity/Context Type | No. of Activities/Contexts | Occupancy/Environment | Sensing Device/Sensors | Classifier(s) | Achieved Results
---|---|---|---|---|---|---
[100] | Daily Living | 06 | Single/Controlled Lab | Smartphone (Acc.) | MLP, LR, DT (Decision-level Fusion) | F1-Score = 91.8%
[104] | Daily Living | 06 | Single/Controlled Lab | Smartphone (Acc.) | CNN | F1-Score = 97.4%
[104] | Daily Living | 07 | Multiple/Indoor and Outdoor | Smartphone (Acc., Gyro.) | CNN | F1-Score = 93.1%
[101] | Daily Living | 12 | Single/- | Smartphone (Acc., Gyro.) | NN, SVM, DBN | Accuracy = 89.61% (DBN)
[102] | Home Task | 07 | Single/Indoor | Smartphone (Acc., Mic.); Wearable (Acc.) | RF | Accuracy = 94.1%
[37] | Daily Living | 09 | Multiple/Indoor and Outdoor | Smartphone (Acc., Gyro., Mag.); Pressure Sensor | DT, NB, SVM, MLP | Accuracy = 92.8% (MLP)
[38] | Human Behavioral Contexts | 25 | Multiple/In-the-Wild | Smartphone (Acc., Gyro., Mag., GPS); Wearable (Acc.) | LR | BALACC = 80%
[103] | Home Tasks | 10 | Single/Smart Home | Motion Sensor; Ambient (Temp. Sensor) | NN, HMM, CRF, SVM, CE (using Genetic Algorithm) | F1-Score = 90.1% (CE)
[103] | Home Tasks | 11 | Single/Smart Home | Motion Sensor; Item; EU; Ambient (Door Sensor, Temp. Sensor, Light Sensor) | NN, HMM, CRF, SVM, CE (using Genetic Algorithm) | F1-Score = 81.9% (CE)
[103] | Home Tasks | 15 | Single/Smart Home | Motion Sensor; Ambient (Door Sensor, Temp. Sensor) | NN, HMM, CRF, SVM, CE (using Genetic Algorithm) | F1-Score = 85.7% (CE)
[105] | Daily Living and Home Tasks | 12 | Single/- | Wearable (Acc.) | DT, SVM (Two-level Fusion) | F1-Score = 93.0% (CE)
[106] | Elderly Activities | 17 | Single/Smart Home | Wearable (Bar., Temp., Acc., Gyro., Mag.); Ambient (PIR) | SVM | Accuracy = 98.32%
Proposed ARW | Daily Living | 06 | Multiple/In-the-Wild | Smartphone (Acc.); Wearable (Acc.) | BDT, NN | BALACC = 93.1% (BDT)
Proposed ARW | Behavioral Contexts | 10 | Multiple/In-the-Wild | Smartphone (Acc.); Wearable (Acc.) | BDT, NN | BALACC = 91% (BDT)
Proposed ARW | Phone Contexts | 04 | Multiple/In-the-Wild | Smartphone (Acc.) | BDT, NN | BALACC = 84.2% (BDT)
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).