Design of Digital-Twin Human-Machine Interface Sensor with Intelligent Finger Gesture Recognition
Abstract
1. Introduction
2. Conceptual Design
- Step 1.
- The user/patient pre-defines a DT mapping model consisting of two-digit number groups and the real meaning each group represents.
- Step 2.
- The user points to two positions in a specific 8-digit space with a single finger, generating a two-digit number group.
- Step 3.
- The system automatically looks up the two-digit number group generated in Step 2 and converts it into the corresponding DT gesture command using the user-defined mapping model.
- Step 4.
- The system service continuously monitors the DT gesture commands in the DT mapping model and instantly maps them to DT signals written to input point Xn of the PLC. When the DT mapping characteristics of input point Xn meet the user-defined computing conditions in the logic unit, logic point Yn of the PLC controls and drives external electronic devices, thereby extending the limb function of the user/patient. A minimal code sketch of this workflow follows the list.
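The four steps can be condensed into a small sketch. Below is a minimal Python illustration of Steps 1–3, with hypothetical names (`DT_MAPPING`, `code_to_command`); the example entries mirror the Image Mapping to Device Control Table shown later, and the actual mapping model is user-defined as described in Step 1.

```python
from typing import Optional

# Step 1: the user-defined DT mapping model -- each two-digit number
# group maps to the real meaning it represents (example values mirror
# the "Image Mapping to Device Control Table" below).
DT_MAPPING = {
    (1, 1): "Let me have a look",
    (1, 2): "Please call mom for me",
    (8, 1): "It's up to you",
}

def code_to_command(first_digit: int, second_digit: int) -> Optional[str]:
    """Steps 2-3: convert the two pointed digits into a DT gesture command."""
    return DT_MAPPING.get((first_digit, second_digit))

if __name__ == "__main__":
    # The user points at digit 1 and then digit 2 in the 8-digit space.
    print(code_to_command(1, 2))  # -> "Please call mom for me"
```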
2.1. Color Space Conversion, Noise Reduction, and Color Stability Improvement
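As a concrete illustration of the preprocessing named in this heading, the following OpenCV sketch converts a captured frame to HSV and applies median filtering; the kernel size and file name are illustrative assumptions, not the paper's reported settings.

```python
import cv2

frame = cv2.imread("frame.jpg")  # one captured camera frame (BGR)

# HSV separates hue from brightness, which keeps skin-color
# segmentation more stable under changing illumination.
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Median filtering suppresses salt-and-pepper noise while preserving edges.
denoised = cv2.medianBlur(hsv, 5)
```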
2.2. Distinguishing the Problem of Uneven Color Brightness in the Foreground and Background
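A common remedy for uneven foreground/background brightness is local contrast equalization. The sketch below uses OpenCV's CLAHE as one plausible approach; CLAHE itself, and the clipLimit/tileGridSize values, are assumptions rather than the paper's confirmed method.

```python
import cv2

gray = cv2.imread("frame.jpg", cv2.IMREAD_GRAYSCALE)

# CLAHE equalizes contrast within local tiles, reducing the brightness
# imbalance between foreground and background before thresholding.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
balanced = clahe.apply(gray)
```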
2.3. Segmentation of Images Using Otsu’s Algorithm
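Otsu's algorithm is available directly in OpenCV, so the segmentation step named here can be sketched in a few lines; the input image name is a placeholder.

```python
import cv2

gray = cv2.imread("frame.jpg", cv2.IMREAD_GRAYSCALE)

# Otsu's algorithm chooses the threshold that minimizes intra-class
# variance; passing 0 lets OpenCV compute the threshold automatically.
t, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
print(f"Otsu threshold: {t}")
```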
2.4. Derivation of Method for Extracting Object Foreground and Background Differences
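One standard way to extract the difference between object foreground and background is frame differencing against a reference image. The sketch below illustrates that idea; the reference-frame approach and the threshold value are assumptions about the method this heading summarizes.

```python
import cv2

background = cv2.imread("background.jpg", cv2.IMREAD_GRAYSCALE)
current = cv2.imread("frame.jpg", cv2.IMREAD_GRAYSCALE)

# Per-pixel absolute difference isolates the moving finger (foreground)
# from the static scene (background); the threshold 25 is illustrative.
diff = cv2.absdiff(current, background)
_, foreground = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
```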
2.5. Reassembling and Generating Speech Sentences Using DT Mapping Reference Model and Method
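The reassembly of speech sentences from DT codes can be pictured as a lookup from recognized two-digit groups to pre-recorded audio clips, as in the Image Mapping to Device Control Table shown later. The sketch below is illustrative; `CODE_TO_AUDIO`, the `speak` helper, and the use of the third-party playsound package are assumptions.

```python
from pathlib import Path
from playsound import playsound  # third-party: pip install playsound

# Two-digit DT codes mapped to pre-recorded sentence files, mirroring
# the "Image Mapping to Device Control Table" below.
CODE_TO_AUDIO = {
    (1, 1): "11.mp3",  # "Let me have a look"
    (1, 2): "12.mp3",  # "Please call mom for me"
    (8, 1): "81.mp3",  # "It's up to you"
}

def speak(code, audio_dir=Path("audio")):
    """Play the sentence recorded for one recognized DT code."""
    clip = CODE_TO_AUDIO.get(code)
    if clip is not None:
        playsound(str(audio_dir / clip))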
2.6. Combining Finger Gesture Values into Control Commands and then Mapping the Control Signals to the Method of Extending the Limbs
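Mapping a recognized gesture value to the PLC's input point Xn can be sketched as below. Since the paper does not specify a communication library, the write is left as a placeholder (`write_plc_bit` and `CODE_TO_X_POINT` are hypothetical); the Q03UDE CPU listed in the hardware table speaks Mitsubishi's MC protocol, for which open-source clients exist.

```python
# Recognized DT codes mapped to PLC input points, mirroring the
# device-control table below (X0: bed up, X1: bed down, X2: help lamp).
CODE_TO_X_POINT = {
    (1, 1): "X0",
    (1, 2): "X1",
    (8, 1): "X2",
}

def write_plc_bit(device: str, value: int) -> None:
    """Placeholder for an MC-protocol bit write to the MITSUBISHI Q03UDE.

    A real implementation would open a socket to the PLC and issue a
    bit-device write (e.g., with an open-source MC-protocol client);
    the PLC's ladder logic then drives the matching Yn output.
    """
    raise NotImplementedError

def dispatch(code) -> None:
    point = CODE_TO_X_POINT.get(code)
    if point is not None:
        write_plc_bit(point, 1)  # turn Xn ON; PLC logic switches Yn on
```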
3. Experimental Results and Discussion
- I.
- Enabling DT-HMIS collaboration for industrial and domestic applications, such as assisting individuals who cannot touch device switches quickly or directly due to oily or wet hands on production lines.
- II.
- Used in medical applications [12] for assessing the performance of a patient’s limbs and brain function during awake brain surgery to avoid damage to nerves.
- III.
- Evaluating the recovery status of limbs or brain health before and after surgery.
- IV.
- Assisting patients with mobility impairments to control peripheral electromechanical devices.
- V.
- Helping non-speaking individuals to reorganize vocabulary to form coherent sentences.
- VI.
- Assisting disabled individuals with limb extension training and rehabilitation exercises.
- VII.
- Allowing doctors to use gestures to view X-ray images on a distant screen during surgery.
- VIII.
- Assessing the mental age and maturity of children.
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Wang, H.; Lv, L.; Li, X.; Li, H.; Leng, J.; Zhang, Y.; Thomson, V.; Liu, G.; Wen, X.; Sun, C.; et al. A safety management approach for Industry 5.0’s human-centered manufacturing based on digital-twin. J. Manuf. Syst. 2023, 66, 1–12.
2. Hultman, H.; Cedergren, S.; Wärmefjord, K.; Söderberg, R. Predicting Geometrical Variation in Fabricated Assemblies Using a Digital-twin Approach Including a Novel Non-Nominal Welding Simulation. Aerospace 2022, 9, 512.
3. Han, X.; Lin, Z.; Clark, C.; Vucetic, B.; Lomax, S. AI Based Digital-twin Model for Cattle Caring. Sensors 2022, 22, 7118.
4. Ghandar, A.; Ahmed, A.; Zulfiqar, S.; Hua, Z.; Hanai, M.; Theodoropoulos, G. A Decision Support System for Urban Agriculture Using Digital-twin: A Case Study with Aquaponics. IEEE Access 2021, 9, 35691–35708.
5. Sasikumar, A.; Vairavasundaram, S.; Kotecha, K.; Indragandhi, V.; Ravi, L.; Selvachandran, G.; Abraham, A. Blockchain-based trust mechanism for digital-twin empowered Industrial Internet of Things. Future Gener. Comput. Syst. 2023, 141, 16–27.
6. Davis, S.P.; Ashayer, A.; Tabrizi, N. Predicting Sex and Age using Swipe-Gesture Data from a Mobile Device. In Proceedings of the 2020 IEEE Intl Conf on Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking (ISPA/BDCloud/SocialCom/SustainCom), Exeter, UK, 17–19 December 2020; pp. 1136–1143.
7. Pulfrey, J.; Hossain, M.S. Zoom gesture analysis for age-inappropriate internet content filtering. Expert Syst. Appl. 2022, 199, 116869.
8. Guarino, A.; Malandrino, D.; Zaccagnino, R.; Capo, C.; Lettieri, N. Touchscreen gestures as images. A transfer learning approach for soft biometric traits recognition. Expert Syst. Appl. 2023, 219, 119614.
9. Nguyen, T.; Roy, A.; Memon, N. Kid on the phone! Toward automatic detection of children on mobile devices. Comput. Secur. 2019, 84, 334–348.
10. Zaccagnino, R.; Capo, C.; Guarino, A.; Lettieri, N.; Malandrino, D. Techno-regulation and intelligent safeguards. Multimed. Tools Appl. 2021, 80, 15803–15824.
11. Gallala, A.; Kumar, A.A.; Hichri, B.; Plapper, P. Digital-twin for Human–Robot Interactions by Means of Industry 4.0 Enabling Technologies. Sensors 2022, 22, 4950.
12. Zhao, G.-R.; Cheng, Y.-F.; Feng, K.-K.; Wang, M.; Wang, Y.-G.; Wu, Y.-Z.; Yin, S.-Y. Clinical Study of Intraoperative Microelectrode Recordings during Awake and Asleep Subthalamic Nucleus Deep Brain Stimulation for Parkinson’s Disease: A Retrospective Cohort Study. Brain Sci. 2022, 12, 1469.
13. Yu, W.; Yamaguchi, H.; Yokoi, H.; Maruishi, M.; Mano, Y.; Kakazu, Y. EMG automatic switch for FES control for hemiplegics using artificial neural network. Robot. Auton. Syst. 2002, 40, 213–224.
14. Clark, R.A.; Pua, Y.-H.; Fortin, K.; Ritchie, C.; Webster, K.E.; Denehy, L.; Bryant, A.L. Validity of the Microsoft Kinect for assessment of postural control. Gait Posture 2012, 36, 372–377.
15. Hu, X.; Nenov, V. Multivariate AR modeling of electromyography for the classification of upper arm movements. Clin. Neurophysiol. 2004, 115, 1276–1287.
16. Raheja, J.L.; Chaudhary, A.; Maheshwari, S. Hand gesture pointing location detection. Optik 2014, 125, 993–996.
17. Kılıboz, N.Ç.; Güdükbay, U. A hand gesture recognition technique for human–computer interaction. J. Vis. Commun. Image Represent. 2015, 28, 97–104.
18. Lin, J.; Ding, Y. A temporal hand gesture recognition system based on hog and motion trajectory. Optik 2013, 124, 6795–6798.
19. Mo, D.-H.; Wu, Y.-C.; Lin, C.-S. The Dynamic Image Analysis of Retaining Wall Crack Detection and Gap Hazard Evaluation Method with Deep Learning. Appl. Sci. 2022, 12, 9289.
20. Knibbe, J.; Seah, S.A.; Fraser, M. VideoHandles: Searching through action camera videos by replicating hand gestures. Comput. Graph. 2015, 48, 99–106.
21. Zhou, Y.; Jiang, G.; Lin, Y. A novel finger and hand pose estimation technique for real-time hand gesture recognition. Pattern Recognit. 2016, 49, 102–114.
22. Suau, X.; Alcoverro, M.; López-Méndez, A.; Ruiz-Hidalgo, J.; Casas, J.R. Real-time fingertip localization conditioned on hand gesture classification. Image Vis. Comput. 2014, 32, 522–532.
23. Maqueda, A.I.; del-Blanco, C.R.; Jaureguizar, F.; García, N. Human–computer interaction based on visual hand-gesture recognition using volumetric spatiograms of local binary patterns. Comput. Vis. Image Underst. 2015, 141, 126–137.
24. D’Orazio, T.; Marani, R.; Renò, V.; Cicirelli, G. Recent trends in gesture recognition: How depth data has improved classical approaches. Image Vis. Comput. 2016, 52, 56–72.
25. Lee, Y.W. Implementation of an interactive interview system using hand gesture recognition. Neurocomputing 2013, 116, 272–279.
26. Rempel, D.; Camilleri, M.J.; Lee, D.L. The design of hand gestures for human–computer interaction: Lessons from sign language interpreters. Int. J. Hum.–Comput. Stud. 2014, 72, 728–735.
27. Lin, C.S.; Li, K.C.; Chen, C.T.; Chang, C.C.; Chung, D.S. Hand Gesture Recognition in a Leg Sport System. J. Biomed. Eng.-Appl. Basis Commun. 2009, 21, 97–105.
28. Song, Y.; Sun, Y.; Zhang, H.; Wang, F. Activity testing model for automatic correction of hand pointing. Inf. Process. Lett. 2016, 116, 653–659.
29. Chang, C.M.; Lin, C.S.; Chen, W.C.; Chen, C.T.; Hsu, Y.L. Development and Application of a Human-Machine Interface Using Head Control and Flexible Numeric Tables for Severely Disabled. Appl. Sci. 2020, 10, 7005.
30. Pisella, L.; Grea, H.; Tilikete, C.; Vighetto, A.; Desmurget, M.; Rode, G.; Boisson, D.; Rossetti, Y. An ‘automatic pilot’ for the hand in human posterior parietal cortex: Toward reinterpreting optic ataxia. Nat. Neurosci. 2000, 3, 729–736.
31. Markakis, E.; Nikoloudakis, Y.; Pallis, E.; Manso, M. Security assessment as a service cross-layered system for the adoption of digital, personalised and trusted healthcare. In Proceedings of the IEEE 5th World Forum Internet Things (WF-IoT), Limerick, Ireland, 15–18 April 2019; pp. 91–94.
32. Tao, H.; Bhuiyan, M.Z.A.; Abdalla, A.N.; Hassan, M.M.; Zain, J.M.; Hayajneh, T. Secured data collection with hardware-based ciphers for IoT-based healthcare. In Proceedings of the 2019 IEEE 5th World Forum on Internet of Things (WF-IoT), Limerick, Ireland, 15–18 April 2019; pp. 91–94.
33. Nausheen, F.; Begum, S.H. Healthcare IoT: Benefits, vulnerabilities and solutions. In Proceedings of the 2018 2nd International Conference on Inventive Systems and Control (ICISC), Coimbatore, India, 19–20 January 2018; pp. 517–522.
Image Mapping to Device Control Table

DT Vision Sensing Code | Voice File | Speech Output
---|---|---
1,1 | 11.mp3 | Let me have a look
1,2 | 12.mp3 | Please call mom for me
8,1 | 81.mp3 | It’s up to you
DT Vision Sensing Code | Input Logical Point | Electronic Control Signal Output Result |
---|---|---|
1,1 | X0 ON | Y0 Bed Move Up is On |
1,2 | X1 ON | Y1 Bed Move Down is On |
8,1 | X2 ON | Y2 Help Lamp is On |
8,1 | X2 OFF | Y2 Help Lamp is Off |
Component | Description
---|---
PC | Intel Core i9-12900H processor
RAM | 16 GB
Graphics Card | NVIDIA GeForce RTX 3050 Ti with 4 GB VRAM
Camera and Library | 1080p resolution camera; Python and OpenCV
Operating System | Windows 10
PLC | MITSUBISHI Q03UDE, output module QY10 (DC 24 V/2 A)
Peripheral Equipment | Fan, bed, and speaker
Number of Experimenters | 22 people; the study involves no invasive or assigned medical actions and was conducted with voluntary assistance, so ethics committee review was not required
Frequently-Used Command | Output Function | Average Time (s)
---|---|---|
Let me have a look and PLC X0 | (1,1) | 6.99 |
Please call mom for me and PLC X1 | (1,2) | 8.58 |
It’s up to you and PLC X2 | (8,1) | 9.895 |
Comparison of 22 Items | DT-HMIS Type 3 | Wearable Type 2 | Touch Type 1
---|---|---|---|
User-defined DT model | √ | |
User-defined control logic | √ | |
Distant operation | √ | √ | |
Low cost | √ | √ | |
Customizable DT model | √ | |
Unrestricted light source | √ | √ | |
Contact | √ | √ | |
Non-contact | √ | ||
Extended limbs | √ | ||
Custom commands | √ | ||
Reorganized word pronunciation | √ | ||
Single finger operation | √ | ||
No need to hold by hand | √ | √ | |
Vitality assessment | √ | √ | |
Memory evaluation | √ | √ | √ |
Rehabilitation guidance | √ | √ | √ |
Age prediction | √ | √ | √ |
Training responsiveness | √ | √ | √ |
Hands do not get dirty | √ | |
No battery power required | √ | |
No risk of electric shock | √ | |
Does not cause pain | √ | |