Hand Movement Activity-Based Character Input System on a Virtual Keyboard
Abstract
1. Introduction
2. The Proposed System
2.1. Preprocessing
2.2. Feature Extraction
2.2.1. Feature Extraction of Accelerometer and Gyroscope Sensors
2.2.2. Feature Extraction of EMG Sensors
2.3. Classification Using SVM
3. System Configuration and Virtual Keyboard
4. Experimental Result Analysis
4.1. Data Collection
4.2. Signal Preprocessing and Feature Extraction
4.3. Experimental Results and Performance Analysis
5. Discussion
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
Features | Descriptions |
---|---|
Center frequency | Power-spectrum-weighted mean frequency of the windowed signal |
RMS frequency | Square root of the power-spectrum-weighted mean of the squared frequencies |
Root variance frequency | Square root of the power-spectrum-weighted variance of frequency about the center frequency |
Slope sign change | Number of times the slope of the signal changes sign within the window |
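As a rough illustration of how such features can be computed from one windowed sensor signal, the sketch below uses the standard definitions with NumPy; the sampling rate, window length, and function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def spectral_features(window, fs):
    """Frequency-domain features of one signal window (standard textbook formulas)."""
    power = np.abs(np.fft.rfft(window)) ** 2           # power spectrum of the window
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)   # frequency bins in Hz
    total = power.sum()
    fc = (freqs * power).sum() / total                           # center (mean) frequency
    rmsf = np.sqrt((freqs ** 2 * power).sum() / total)           # RMS frequency
    rvf = np.sqrt((((freqs - fc) ** 2) * power).sum() / total)   # root variance frequency
    return fc, rmsf, rvf

def slope_sign_changes(window, threshold=0.0):
    """Count samples at which the slope of the signal changes sign (EMG time-domain feature)."""
    x = np.asarray(window, dtype=float)
    left = x[1:-1] - x[:-2]     # difference to the previous sample
    right = x[1:-1] - x[2:]     # difference to the next sample
    return int(np.sum(left * right > threshold))

# Example on a stand-in window (fs and window length are assumed values)
fs = 200
window = np.random.randn(2 * fs)
print(spectral_features(window, fs), slope_sign_changes(window))
```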
SVM Kernel Function | Kernel Scale |
---|---|
Linear KF | Auto |
Quadratic KF | Auto |
Cubic KF | Auto |
Fine Gaussian KF | 0.79 |
Medium Gaussian KF | 3.2 |
Coarse Gaussian KF | 1.3 |
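To make these kernel settings concrete, here is a hedged scikit-learn sketch that cross-validates each kernel configuration on an already-extracted feature matrix `X` and gesture label vector `y`; translating a MATLAB-style kernel scale s into `gamma = 1 / s**2` is an assumption, and the paper's actual tooling may differ.

```python
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Kernel settings mirroring the table above; the gamma values derived from the
# listed kernel scales are an assumption, not the paper's exact configuration.
KERNELS = {
    "Linear": SVC(kernel="linear"),
    "Quadratic": SVC(kernel="poly", degree=2, gamma="scale"),
    "Cubic": SVC(kernel="poly", degree=3, gamma="scale"),
    "Fine Gaussian": SVC(kernel="rbf", gamma=1 / 0.79 ** 2),
    "Medium Gaussian": SVC(kernel="rbf", gamma=1 / 3.2 ** 2),
    "Coarse Gaussian": SVC(kernel="rbf", gamma=1 / 1.3 ** 2),
}

def compare_kernels(X, y, folds=5):
    """Mean cross-validated accuracy per kernel; X, y are extracted features and gesture labels."""
    scores = {}
    for name, svc in KERNELS.items():
        pipeline = make_pipeline(StandardScaler(), svc)   # scale features before the SVM
        scores[name] = cross_val_score(pipeline, X, y, cv=folds).mean()
    return scores
```

Calling `compare_kernels(X, y)` returns a dictionary of mean accuracies comparable in spirit to the kernel-wise accuracy table reported later.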
Gestures | Description | Functions |
---|---|---|
Double-tap | Tap the index finger and thumb together twice | Input a character |
Hold-fist | Clench all fingers of the right hand into a fist | Change character |
Wave-left | Move all fingers together to the left | Delete a character |
Wave-right | Move all fingers together to the right | Line break |
Spread-fingers | Spread all fingers of the hand away from each other | Space character |
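A minimal sketch of how a recognized gesture label could be dispatched to a virtual-keyboard action, assuming the classifier emits the labels in the table above; the `VirtualKeyboard` interface and its method names are hypothetical placeholders, not the paper's implementation.

```python
from typing import Protocol

class VirtualKeyboard(Protocol):
    """Hypothetical interface to the on-screen keyboard."""
    def type_selected(self) -> None: ...
    def next_character(self) -> None: ...
    def delete_character(self) -> None: ...
    def line_break(self) -> None: ...
    def insert_space(self) -> None: ...

def dispatch(gesture: str, kb: VirtualKeyboard) -> None:
    """Map a recognized gesture label to its keyboard function (per the table above)."""
    actions = {
        "double-tap": kb.type_selected,      # input the selected character
        "hold-fist": kb.next_character,      # change the character selection
        "wave-left": kb.delete_character,    # delete a character
        "wave-right": kb.line_break,         # start a new line
        "spread-fingers": kb.insert_space,   # insert a space
    }
    actions.get(gesture, lambda: None)()     # ignore unknown labels
```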
SVM Kernels | Average Classification Accuracy (%) |
---|---|
Linear | 95.92 |
Quadratic | 97.62 |
Cubic | 96.56 |
Fine Gaussian | 97.10 |
Medium Gaussian | 97.16 |
Coarse Gaussian | 95.10 |
RBF | 97.50 |
Gestures | Average Accuracy (%) |
---|---|
Double-tap | 99 |
Hold-fist | 99.3 |
Wave-left | 97.68 |
Wave-right | 98.02 |
Spread-fingers | 98.56 |
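Per-gesture figures such as these are commonly obtained as the class-wise recall of a confusion matrix; a small sketch follows, where the example matrix is a placeholder with illustrative values, not the paper's data.

```python
import numpy as np

def per_gesture_accuracy(conf_matrix):
    """Class-wise accuracy (recall): correct predictions on the diagonal divided by row totals."""
    cm = np.asarray(conf_matrix, dtype=float)
    return np.diag(cm) / cm.sum(axis=1)

# Placeholder 5x5 confusion matrix for the five gestures (illustrative values only)
cm = np.array([
    [99, 1, 0, 0, 0],
    [0, 99, 1, 0, 0],
    [1, 1, 98, 0, 0],
    [0, 0, 1, 98, 1],
    [0, 0, 0, 1, 99],
])
print(per_gesture_accuracy(cm))
```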