Interpretable Passive Multi-Modal Sensor Fusion for Human Identification and Activity Recognition
Abstract
1. Introduction
- We propose PRF-PIR, an interpretable, passive, multi-modal sensor fusion system for human monitoring that achieves high accuracy on human identification and activity recognition (HIAR) while remaining non-intrusive, transparent, passive, and inexpensive by design. To the best of our knowledge, this is the first fully passive sensor fusion system for human monitoring applications.
- The proposed system mitigates the limitations of single-modality solutions, such as the restricted vertical FoV and ambient dependence of MI-PIR and the susceptibility of PRF to electronic interference. PRF-PIR thereby provides a robust, high-accuracy, and reliable classification system for the HIAR task.
- In addition to the inherently transparent decision-level fusion, SHAP is used to interpret how the system reduces the influence of the vertical FoV limitation, ambient dependence, and electronic interference, offering a visual interpretation of the model for practical applications.
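The SHAP-based interpretation named above rests on Shapley-value attribution: each input's contribution to a prediction is its average marginal effect over all subsets of the other inputs. The following is a minimal, self-contained sketch of that idea, not the paper's actual SHAP pipeline; the modality names and the additive `contrib` scores are hypothetical.

```python
from itertools import combinations
from math import factorial

def shapley_values(features, value_fn):
    """Exact Shapley attribution: average marginal contribution of each
    feature over all subsets of the remaining features.
    `value_fn` maps a set of feature names to a model score."""
    n = len(features)
    phi = {}
    for f in features:
        others = [g for g in features if g != f]
        total = 0.0
        for r in range(n):
            for subset in combinations(others, r):
                # Weight of a subset of size r in the Shapley formula.
                w = factorial(r) * factorial(n - r - 1) / factorial(n)
                total += w * (value_fn(set(subset) | {f}) - value_fn(set(subset)))
        phi[f] = total
    return phi

# Hypothetical additive score: each modality contributes a fixed amount,
# so the Shapley value of each modality equals its own contribution.
contrib = {"PRF": 0.6, "PIR": 0.3}
score = lambda s: sum(contrib[m] for m in s)

print(shapley_values(["PRF", "PIR"], score))  # PRF ≈ 0.6, PIR ≈ 0.3
```

For a real fused classifier the score function is non-additive, and tools such as the `shap` library approximate these values efficiently instead of enumerating all subsets.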
2. Related Work
2.1. SDR/PRF
2.2. PIR Sensor
2.3. Sensor Fusion
2.4. Explainable AI (XAI)
3. Methodologies
3.1. Data Acquisition
3.1.1. SDR Device
3.1.2. PIR Sensor
3.2. Sensor Fusion
4. Experiment and Results
4.1. Experimental Set-Up
4.2. Optimization of SDR Device Antenna Location
4.3. Experimental Results
4.3.1. Human Identification Results
4.3.2. Activity Recognition Results
4.4. Explainable AI (XAI)
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
| Activity Recognition ID | Activity | Human Subject ID | Location | Description |
|---|---|---|---|---|
| 0 | Unoccupied | 0 | N/A | Unoccupied. |
| 1 | Smartphone | 1–12 | 6 | Subject sits and uses a smartphone. |
| 2 | Laptop Desk | 2–3 | 2–4 | Subject sits and uses a laptop placed on the desk. |
| 3 | Laptop Lap | 1–2, 4, 7 | 6 | Subject sits and uses a laptop placed on the lap. |
| 4 | Watch TV | 1–3 | 4, 6 | Subject sits and watches videos on a laptop placed on a desk. |
| 5 | Walking | 1–4 | Walk | Subject walks randomly in the laboratory. |
| 6 | Standing | 1–4 | 4–6 | Subject stands. |
| 7 | Exercise | 2–3 | 6 | Subject performs random exercises continuously. |
| 8 | Board | 1–4 | 5 | Subject writes on a whiteboard. |
| 9 | Sleep | 2–5 | 3 | Subject places their head on a desk. |
| 10 | Eating | 2 | 4 | Subject eats at a desk. |
| 11 | Fall | 1–4 | 6 | Subject lies on the ground to simulate a fall. |
| | Minimum | Maximum | Average | Standard Deviation | Variance |
|---|---|---|---|---|---|
| Age (years) | 20.00 | 27.00 | 22.75 | 2.34 | 5.48 |
| Weight (kg) | 51.00 | 100.00 | 70.25 | 13.10 | 171.66 |
| Height (cm) | 162.00 | 188.00 | 176.17 | 8.04 | 64.70 |
| BMI | 17.65 | 29.86 | 22.55 | 3.36 | 11.29 |
Activity Recognition

| Antenna Location | PIR | PRF | SF | SF vs. PRF Improvement ¹ |
|---|---|---|---|---|
| A | 0.8066 | 0.9080 | 0.9458 | 4.16% |
| B | – | 0.8868 | 0.9575 | 7.98% |
| C | – | 0.9434 | 0.9623 | 2.00% |
| D | – | 0.8915 | 0.9646 | 8.20% |
| E | – | 0.7453 | 0.9057 | 21.52% |

Human Identification

| Antenna Location | PIR | PRF | SF | SF vs. PRF Improvement ¹ |
|---|---|---|---|---|
| A | 0.9530 | 0.8389 | 0.9664 | 15.20% |
| B | – | 0.8054 | 0.9866 | 22.50% |
| C | – | 0.8993 | 0.9866 | 9.70% |
| D | – | 0.8456 | 0.9799 | 15.87% |
| E | – | 0.7919 | 0.9799 | 23.73% |

¹ Relative improvement of SF over PRF, computed as (SF − PRF)/PRF. PIR accuracy does not depend on the SDR antenna location, so it is reported once.
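The SF columns above combine the two modalities at the decision level. One common decision-level scheme is accuracy-weighted averaging of per-modality class posteriors; the sketch below illustrates that scheme under assumed, illustrative posteriors and weights, and the paper's exact fusion rule may differ.

```python
def fuse(posteriors, weights):
    """Combine per-modality class-probability vectors using
    normalized modality weights (decision-level fusion)."""
    total_w = sum(weights.values())
    n_classes = len(next(iter(posteriors.values())))
    fused = [0.0] * n_classes
    for modality, probs in posteriors.items():
        w = weights[modality] / total_w  # normalize so weights sum to 1
        for k, p in enumerate(probs):
            fused[k] += w * p
    return fused

# Hypothetical 3-class posteriors from each modality.
posteriors = {"PRF": [0.7, 0.2, 0.1], "PIR": [0.4, 0.5, 0.1]}
# Weight each modality by its standalone validation accuracy (values illustrative).
weights = {"PRF": 0.91, "PIR": 0.81}

fused = fuse(posteriors, weights)
print(max(range(len(fused)), key=fused.__getitem__))  # → 0, the class favored after fusion
```

Because the fused vector is a convex combination of valid probability vectors, it remains a valid posterior, which keeps the fusion step transparent: each modality's vote and weight are directly inspectable.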
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Citation: Yuan, L.; Andrews, J.; Mu, H.; Vakil, A.; Ewing, R.; Blasch, E.; Li, J. Interpretable Passive Multi-Modal Sensor Fusion for Human Identification and Activity Recognition. Sensors 2022, 22, 5787. https://doi.org/10.3390/s22155787