Article

Know Your Grip: Real-Time Holding Posture Recognition for Smartphones

by Rene Hörschinger 1,†, Marc Kurz 2,*,†,‡ and Erik Sonnleitner 2,†,‡
1 Windpuls GmbH, 4020 Linz, Austria
2 Department for Smart and Interconnected Living (SAIL), University of Applied Sciences Upper Austria, 4232 Hagenberg, Austria
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
‡ Current address: Softwarepark 11, 4232 Hagenberg, Austria.
Electronics 2024, 13(23), 4596; https://doi.org/10.3390/electronics13234596
Submission received: 11 October 2024 / Revised: 11 November 2024 / Accepted: 15 November 2024 / Published: 21 November 2024
(This article belongs to the Special Issue Applied Machine Learning in Intelligent Systems)

Abstract

This paper introduces a model that predicts four common smartphone-holding postures, aiming to enhance user interface adaptability. It is unique in being completely independent of platform and hardware, utilizing the inertial measurement unit (IMU) for real-time posture detection based on sensor data collected around tap gestures. The model identifies whether the user is holding and operating the smartphone with one hand or using both hands in different configurations. For model training and validation, sensor time series data undergo extensive feature extraction, including statistical, frequency, magnitude, and wavelet analyses. These features are combined into 74 distinct feature sets, which are tested across three machine learning classifiers, k-nearest neighbors (KNN), support vector machine (SVM), and random forest (RF), and evaluated using metrics such as cross-validation scores, test accuracy, Kappa statistics, confusion matrices, and ROC curves. The optimized model demonstrates a high degree of accuracy, successfully predicting the holding hand with a 95.7% success rate. This approach highlights the potential of leveraging sensor data to improve mobile user experiences by adapting interfaces to natural user interactions.
Keywords: smartphone posture recognition; smartphone interaction; human–computer interaction (HCI); machine learning (ML); intelligent system
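The pipeline outlined in the abstract lends itself to a compact illustration. Below is a minimal sketch, assuming scikit-learn and PyWavelets, of how statistical, frequency, magnitude, and wavelet features might be extracted from IMU windows around tap gestures and compared across the three classifiers named above. The window length (128 samples), wavelet choice ("db4"), 6-axis accelerometer+gyroscope layout, posture labels, and placeholder random data are all assumptions for illustration, not details taken from the paper.

```python
# Hypothetical sketch of the feature-extraction and classifier-comparison
# pipeline described in the abstract. Window length, wavelet, axis layout,
# and the placeholder data are assumptions, not values from the paper.
import numpy as np
import pywt  # PyWavelets, for the wavelet features
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

def imu_features(window: np.ndarray) -> np.ndarray:
    """Feature vector for one IMU window (shape: samples x 6, accel+gyro)."""
    feats = []
    for axis in window.T:
        # Statistical features
        feats += [axis.mean(), axis.std(), axis.min(), axis.max()]
        # Frequency feature: strongest non-DC FFT magnitude
        spectrum = np.abs(np.fft.rfft(axis))
        feats.append(spectrum[1:].max())
        # Wavelet features: energy per decomposition level (db4 assumed)
        coeffs = pywt.wavedec(axis, "db4", level=3)
        feats += [float(np.sum(c ** 2)) for c in coeffs]
    # Magnitude feature over the (assumed) first three accelerometer axes
    feats.append(np.linalg.norm(window[:, :3], axis=1).mean())
    return np.array(feats)

# Placeholder stand-ins: IMU snippets around tap gestures, posture labels 0-3
rng = np.random.default_rng(0)
X_windows = rng.normal(size=(200, 128, 6))
y = rng.integers(0, 4, size=200)

X = np.stack([imu_features(w) for w in X_windows])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, clf in [("KNN", KNeighborsClassifier()),
                  ("SVM", SVC()),
                  ("RF", RandomForestClassifier(random_state=0))]:
    cv = cross_val_score(clf, X_tr, y_tr, cv=5).mean()
    y_pred = clf.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: cv={cv:.3f}  acc={accuracy_score(y_te, y_pred):.3f}  "
          f"kappa={cohen_kappa_score(y_te, y_pred):.3f}")
```

In a study like the one described, subsets of these per-axis features would be grouped into candidate feature sets (the paper evaluates 74 of them) and ranked with the cross-validation, accuracy, and Kappa metrics printed above, alongside confusion matrices and ROC curves.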

