Smartphone-Based Activity Recognition in a Pedestrian Navigation Context
Abstract
1. Introduction
- (a) a stationary user who is standing still, e.g., while reading signage on a wall or instructions on their phone;
- (b) a stationary user who is looking around, e.g., while trying to orient themselves or looking for a landmark that was referred to in an instruction;
- (c) different kinds of locomotion and transitions thereof, i.e., walking straight, walking around corners and going through doors; and
- (d) the direction of doors, i.e., whether they need to be pushed or pulled.
- (1) We collect and describe a dataset containing fine-grained pedestrian navigation activities that exceed the generic classes found in other datasets.
- (2) We design and evaluate an activity recognition approach that relies on hierarchical deep learning classifiers for these specific activities.
- (3) We apply the approach to naturalistic log data from real users of a navigation app in order to gain insights into which device placements and activities occur during the course of a typical navigation session.
2. Materials and Methods
2.1. The Activity Logs Dataset
- The linear acceleration sensor is Android’s variant of the acceleration sensor that does not include the Earth’s gravity. It is useful for detecting motion along one (or more) of the device’s axes, be it periodic or sporadic.
- The gyroscope allows us to detect rotation and therefore—over a longer period of time—changes in device orientation. Typically, gyroscopes react much more quickly than magnetic field sensors but have the disadvantage of accumulating drift over time.
- The magnetic field sensor provides, in theory, the absolute orientation of the device with respect to the Earth’s magnetic field. In practice, it is often too slow to react to sudden changes in orientation and can be disturbed by external influences, especially inside buildings.
- The proximity sensor was also active during data collection and is only used during data pre-processing, not for classification. It is located near the earpiece and detects whether the front of the device is close to an object. We use it to estimate whether the device was held in the phone call position or put into a pocket. (A sketch of the windowing performed during pre-processing follows this list.)
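To make the pre-processing step concrete, the following is a minimal sketch of segmenting synchronized sensor streams into fixed-size windows for classification; the sampling rate, window length, and stride are illustrative assumptions, not the exact parameters of our pipeline.

```python
import numpy as np

def sliding_windows(samples: np.ndarray, window: int = 128,
                    stride: int = 64) -> np.ndarray:
    """Split a (T, C) array of sensor samples into overlapping windows of
    shape (window, C). The C channels could be, e.g., 3 axes each of linear
    acceleration, gyroscope, and magnetometer readings."""
    if len(samples) < window:
        return np.empty((0, window, samples.shape[1]))
    n = (len(samples) - window) // stride + 1
    return np.stack([samples[i * stride:i * stride + window] for i in range(n)])

# Example (assumed rates): 10 s of 9-channel data at 50 Hz, 2.56 s windows
# with 50% overlap.
data = np.random.randn(500, 9)
print(sliding_windows(data).shape)  # (6, 128, 9)
```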
- WALKING_STRAIGHT, LEFT_TURN and RIGHT_TURN cover three different kinds of regular locomotion on a level surface.
- STANDING_STILL and LOOKING_AROUND are two types of stationary behavior: STANDING_STILL simulates the case where a user is mostly not moving, e.g., when reading signage on a wall or instructions on their phone, whereas LOOKING_AROUND implies that the user is, while still mostly stationary, more actively looking at their surroundings, leading to increased upper body motion. The rationale for distinguishing between these two activities is the pursuit of more context-aware navigation assistance.
- THROUGH_DOOR_PUSHING and THROUGH_DOOR_PULLING, the final two activities, are detected whenever a user encounters, opens, and walks through a door that needs to be pushed or pulled, respectively. Detecting a door transition provides valuable information to the indoor positioning system and directly allows it to make fine-grained assumptions about the user’s position, decreasing its uncertainty. This is all the more true if we can also correctly detect the orientation of the door, at least for doors that only open in one direction. (The grouping of these activities into the stages used by the hierarchical classifiers is sketched below.)
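For reference, the activity labels above can be grouped the way the hierarchical classifier stages of Section 3.1 split them; the grouping follows the class definitions, while the dictionary layout and constant names are an editorial illustration.

```python
# Activity labels from Section 2.1, grouped by the splits used in Section 3.1.
ACTIVITY_HIERARCHY = {
    # Stage 2 separates locomotion from stationary behavior; stage 4
    # separates regular locomotion from door transitions.
    "LOCOMOTION": ["WALKING_STRAIGHT", "LEFT_TURN", "RIGHT_TURN"],
    # Stage 3 distinguishes the two types of stationary behavior.
    "STATIONARY": ["STANDING_STILL", "LOOKING_AROUND"],
    # Stage 5 distinguishes the two door types.
    "DOOR_TRANSITION": ["THROUGH_DOOR_PUSHING", "THROUGH_DOOR_PULLING"],
}
```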
- Handheld, in front (IN_FRONT): This placement occurs in essentially every navigation session, whenever the user actively interacts with the screen of the device. Additionally, it covers the periods where the user is holding the device in front of the body, following instructions and ready to interact at any moment.
- Handheld, swinging (IN_HAND_SWINGING): The second case is when the device is still held in hand, but not statically in front of the body; instead, it swings at the user’s side in an outstretched arm. Here, the device movement is most decoupled from the user’s activity, imposing additional challenges for activity recognition.
- Handheld, phone call (IN_PHONE_CALL): This covers the case where the phone is held tightly and generally upright next to the user’s head, potentially picking up more of the upper body motion.
- Phone in pocket (IN_POCKET): This class of placements covers situations in which the phone is put away into a pocket, usually at waist level. It has the advantage of being more tightly coupled to the user’s lower body movements but might increase the difficulty of distinguishing between some of the more granular activities. (A heuristic for separating the two covered placements is sketched after this list.)
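The proximity-based estimate mentioned in Section 2.1 (phone call position vs. pocket) can be sketched as follows; the orientation check and its threshold are illustrative assumptions, not the exact rule used during pre-processing.

```python
# Hedged sketch: distinguish IN_PHONE_CALL from IN_POCKET for windows where
# the proximity sensor reports the front of the device as covered.
def guess_covered_placement(proximity_near: bool, y_gravity: float) -> str:
    """proximity_near: True if an object is close to the front of the device.
    y_gravity: gravity component along the device's y-axis in m/s^2; about
    +9.8 when the device is upright with the top edge up (Android convention).
    """
    if not proximity_near:
        return "UNCOVERED"  # IN_FRONT or IN_HAND_SWINGING; handled elsewhere
    # Assumption: a device held to the ear is roughly upright, whereas a
    # pocketed device often sits top-down or at an arbitrary angle.
    return "IN_PHONE_CALL" if y_gravity > 5.0 else "IN_POCKET"

# Example: covered device sitting top-down in a pocket.
print(guess_covered_placement(True, -8.9))  # IN_POCKET
```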
2.2. Activity Recognition Methodology
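As a minimal illustration of the hierarchical deep learning approach (contribution (2) above), the sketch below builds one stage of such a classifier as a 1D CNN over sensor windows. The architecture, window length, and channel count are illustrative assumptions carried over from the windowing sketch in Section 2.1, not the exact network used here.

```python
import tensorflow as tf

WINDOW_LEN = 128  # samples per window (assumed, as in the windowing sketch)
N_CHANNELS = 9    # 3 axes each of linear acceleration, gyroscope, magnetometer

def build_stage_model(n_classes: int) -> tf.keras.Model:
    """1D CNN over one sensor window; one such model per hierarchy stage."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(WINDOW_LEN, N_CHANNELS)),
        tf.keras.layers.Conv1D(64, 5, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Conv1D(64, 5, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# One model per decision in the hierarchy, mirroring the stages of Section 3.1:
loco_vs_stationary = build_stage_model(2)  # stage 2
stationary_types = build_stage_model(2)    # stage 3: STANDING_STILL vs. LOOKING_AROUND
```

Since Section 3.1 reports performance per device placement, one plausible deployment is to run the device placement classifier first and route each window to placement-specific stage models.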
2.3. The Navigation Experiment Dataset
2.4. The Naturalistic Dataset
3. Results
3.1. Classification Performance
3.1.1. Device Placement
3.1.2. A General Classifier for All Activities
3.1.3. Locomotion vs. Stationary Behavior
3.1.4. Types of Stationary Behavior
3.1.5. Locomotion vs. Door Transitions
3.1.6. Door Types
3.1.7. Performance on Actual Devices
3.2. Applying Classifiers to the Navigation Experiment Dataset
3.3. Finding Behavioral Patterns in the Naturalistic Dataset
- (a) how device placement changes over time in a typical navigation session;
- (b) patterns in the distribution of stationary and locomotion periods; and
- (c) the correlation of device placement and activity.
3.3.1. Changes in Device Placement over Time
3.3.2. Stationary and Locomotion Periods
3.3.3. Correlation between Device Placement and Activity
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Classification performance of the five classifier stages (Sections 3.1.2–3.1.6) per device placement:

Device Placement | 1: General | 2: Locomotion vs. Stationary | 3: Stationary Types | 4: Locomotion vs. Door Transitions | 5: Door Types
---|---|---|---|---|---
unknown | 68.8 | 97.0 | 84.4 | 77.5 | 71.4
IN_FRONT | 74.3 | 98.7 | 93.0 | 89.7 | 82.9
IN_HAND_SWINGING | 68.8 | 97.0 | 84.9 | 83.1 | 78.9
IN_PHONE_CALL | 69.7 | 98.1 | 78.7 | 70.3 | 86.5
IN_POCKET | 68.4 | 97.0 | 88.8 | 73.4 | 62.6