Robust Identification System for Spanish Sign Language Based on Three-Dimensional Frame Information
Abstract
1. Introduction
1.1. Technologies
1.2. Related Works
1.3. Our Proposal
1.4. Structure of the Paper
2. Materials and Methods
2.1. Introduction
2.2. Leap Motion Controller
2.3. Acquisition
2.4. Data Preprocessing
2.5. Pattern Generation
2.5.1. Parameters
2.5.2. Generation
2.6. Comparator
2.6.1. Introduction to DTW
2.6.2. Development of the DTW
- Boundary condition: w1 = (1, 1) and wK = (m, n).
- Continuity: if wk-1 = (A', B'), then the next point of the path wk = (A, B) must satisfy (A - A') <= 1 and (B - B') <= 1.
- Monotonicity: if wk-1 = (A', B'), then the next point of the path wk = (A, B) must satisfy 0 <= (A - A') and 0 <= (B - B').
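The three conditions above can be enforced by restricting each warping step to the set {(1, 0), (0, 1), (1, 1)}. The following is a minimal sketch of such a DTW comparator, not the authors' exact implementation; it assumes each sequence is a list of numeric feature vectors and uses Euclidean distance as the local cost:

```python
import math

def dtw_distance(seq_a, seq_b):
    """Accumulated DTW cost between two sequences of feature vectors.

    The allowed steps {(1, 0), (0, 1), (1, 1)} realize the boundary,
    continuity, and monotonicity conditions listed above.
    """
    m, n = len(seq_a), len(seq_b)
    INF = float("inf")
    # cost[i][j] = best accumulated cost of a warping path ending at (i, j);
    # (0, 0) pairs the first elements (boundary condition w1 = (1, 1)).
    cost = [[INF] * n for _ in range(m)]
    for i in range(m):
        for j in range(n):
            d = math.dist(seq_a[i], seq_b[j])  # local distance between frames
            if i == 0 and j == 0:
                cost[i][j] = d
            else:
                prev = min(
                    cost[i - 1][j] if i > 0 else INF,                # step (1, 0)
                    cost[i][j - 1] if j > 0 else INF,                # step (0, 1)
                    cost[i - 1][j - 1] if i > 0 and j > 0 else INF,  # step (1, 1)
                )
                cost[i][j] = d + prev
    # The path necessarily ends at (m, n) per the boundary condition.
    return cost[m - 1][n - 1]
```

Identical sequences yield a cost of zero, and the cost grows with temporal or spatial mismatch, which is what makes DTW suitable for comparing gestures performed at different speeds.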
2.6.3. Comparator Scheme
2.7. Implementation
3. Experimental Methodologies
3.1. Procedures
3.2. System Quality
3.2.1. Confusion Matrix
- False Positive [FP]: the model predicts an output that does not actually occur.
- False Negative [FN]: the model fails to predict an output that does occur.
- True Positive [TP]: the model predicts an output that does occur.
- True Negative [TN]: the model predicts no output, and none occurs.
3.2.2. Precision
3.2.3. Recall
3.2.4. F1
3.2.5. Accuracy
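The four quality measures in Sections 3.2.2–3.2.5 follow directly from the confusion-matrix counts. As a sketch (a hypothetical helper using the standard definitions, not the authors' code):

```python
def classification_metrics(tp, fp, fn, tn):
    """Standard quality measures derived from confusion-matrix counts."""
    # Precision: fraction of predicted outputs that were correct.
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    # Recall: fraction of actual outputs that were detected.
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # F1: harmonic mean of precision and recall.
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    # Accuracy: fraction of all decisions that were correct.
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return {"precision": precision, "recall": recall,
            "f1": f1, "accuracy": accuracy}
```

For example, with TP = 8, FP = 2, FN = 2, TN = 88, precision and recall are both 0.8 and accuracy is 0.96.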
4. Results
4.1. Experiment
4.2. Experiment for Validation
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | |
---|---|---|---|---|---|---|---|---|---|---|
0 | info.frame_id | tracking_frame_id | info.timestamp | framerate | nHands | confidence | visible_time | id | type | palm.position.x |
1 | palm.position.y | palm.position.z | pinch_distance | grab_angle | pinch_strength | grab_strength | palm.velocity.x | palm.velocity.y | palm.velocity.z | palm.normal.x |
2 | palm.normal.y | palm.normal.z | palm.width | palm.direction.x | palm.direction.y | palm.direction.z | palm.orientation.w | palm.orientation.x | palm.orientation.y | palm.orientation.z |
3 | 0 | digits(0).finger_id | Digits(0).is_extended | Digits(0).bones(0).width | Digits(0).bones(0).next_joint.x | Digits(0).bones(0).next_joint.y | Digits(0).bones(0).next_joint.z | Digits(0).bones(0).prev_joint.x | Digits(0).bones(0).prev_joint.y | Digits(0).bones(0).prev_joint.z |
4 | Digits(0).bones(0).rotation.w | Digits(0).bones(0).rotation.x | Digits(0).bones(0).rotation.y | Digits(0).bones(0).rotation.z | digits(0).bones(1).width | digits(0).bones(1).next_joint.x | Digits(0).bones(1).next_joint.y | Digits(0).bones(1).next_joint.z | digits(0).bones(1).prev_joint.x | Digits(0).bones(1).prev_joint.y |
5 | Digits(0).bones(1).prev_joint.z | Digits(0).bones(1).rotation.w | Digits(0).bones(1).rotation.x | Digits(0).bones(1).rotation.y | Digits(0).bones(1).rotation.z | Digits(0).bones(2).width | Digits(0).bones(2).next_joint.x | Digits(0).bones(2).next_joint.y | Digits(0).bones(2).next_joint.z | Digits(0).bones(2).prev_joint.x |
6 | Digits(0).bones(2).prev_joint.y | Digits(0).bones(2).prev_joint.z | Digits(0).bones(2).rotation.w | Digits(0).bones(2).rotation.x | Digits(0).bones(2).rotation.y | Digits(0).bones(2).rotation.z | Digits(0).bones(3).width | Digits(0).bones(3).next_joint.x | Digits(0).bones(3).next_joint.y | Digits(0).bones(3).next_joint.z |
7 | Digits(0).bones(3).prev_joint.x | Digits(0).bones(3).prev_joint.y | Digits(0).bones(3).prev_joint.z | Digits(0).bones(3).rotation.w | Digits(0).bones(3).rotation.x | Digits(0).bones(3).rotation.y | Digits(0).bones(3).rotation.z | 1 | Digits(1).finger_id | Digits(1).is_extended |
8 | Digits(1).bones(0).width | Digits(1).bones(0).next_joint.x | Digits(1).bones(0).next_joint.y | Digits(1).bones(0).next_joint.z | Digits(1).bones(0).prev_joint.x | Digits(1).bones(0).prev_joint.y | Digits(1).bones(0).prev_joint.z | Digits(1).bones(0).rotation.w | Digits(1).bones(0).rotation.x | Digits(1).bones(0).rotation.y |
9 | Digits(1).bones(0).rotation.z | Digits(1).bones(1).width | Digits(1).bones(1).next_joint.x | Digits(1).bones(1).next_joint.y | Digits(1).bones(1).next_joint.z | Digits(1).bones(1).prev_joint.x | Digits(1).bones(1).prev_joint.y | Digits(1).bones(1).prev_joint.z | Digits(1).bones(1).rotation.w | Digits(1).bones(1).rotation.x |
10 | Digits(1).bones(1).rotation.y | Digits(1).bones(1).rotation.z | Digits(1).bones(2).width | Digits(1).bones(2).next_joint.x | Digits(1).bones(2).next_joint.y | Digits(1).bones(2).next_joint.z | Digits(1).bones(2).prev_joint.x | Digits(1).bones(2).prev_joint.y | Digits(1).bones(2).prev_joint.z | Digits(1).bones(2).rotation.w |
11 | Digits(1).bones(2).rotation.x | Digits(1).bones(2).rotation.y | Digits(1).bones(2).rotation.z | Digits(1).bones(3).width | Digits(1).bones(3).next_joint.x | Digits(1).bones(3).next_joint.y | Digits(1).bones(3).next_joint.z | Digits(1).bones(3).prev_joint.x | Digits(1).bones(3).prev_joint.y | Digits(1).bones(3).prev_joint.z |
12 | Digits(1).bones(3).rotation.w | Digits(1).bones(3).rotation.x | Digits(1).bones(3).rotation.y | Digits(1).bones(3).rotation.z | 2 | Digits(2).finger_id | Digits(2).is_extended | Digits(2).bones(0).width | Digits(2).bones(0).next_joint.x | Digits(2).bones(0).next_joint.y |
13 | Digits(2).bones(0).next_joint.z | Digits(2).bones(0).prev_joint.x | Digits(2).bones(0).prev_joint.y | Digits(2).bones(0).prev_joint.z | Digits(2).bones(0).rotation.w | Digits(2).bones(0).rotation.x | Digits(2).bones(0).rotation.y | Digits(2).bones(0).rotation.z | Digits(2).bones(1).width | Digits(2).bones(1).next_joint.x |
14 | Digits(2).bones(1).next_joint.y | Digits(2).bones(1).next_joint.z | Digits(2).bones(1).prev_joint.x | Digits(2).bones(1).prev_joint.y | Digits(2).bones(1).prev_joint.z | Digits(2).bones(1).rotation.w | Digits(2).bones(1).rotation.x | Digits(2).bones(1).rotation.y | Digits(2).bones(1).rotation.z | Digits(2).bones(2).width |
15 | Digits(2).bones(2).next_joint.x | Digits(2).bones(2).next_joint.y | digits(2).bones(2).next_joint.z | digits(2).bones(2).prev_joint.x | digits(2).bones(2).prev_joint.y | digits(2).bones(2).prev_joint.z | digits(2).bones(2).rotation.w | digits(2).bones(2).rotation.x | digits(2).bones(2).rotation.y | digits(2).bones(2).rotation.z |
16 | digits(2).bones(3).width | digits(2).bones(3).next_joint.x | digits(2).bones(3).next_joint.y | digits(2).bones(3).next_joint.z | digits(2).bones(3).prev_joint.x | digits(2).bones(3).prev_joint.y | digits(2).bones(3).prev_joint.z | digits(2).bones(3).rotation.w | digits(2).bones(3).rotation.x | digits(2).bones(3).rotation.y |
17 | digits(2).bones(3).rotation.z | 3 | digits(3).finger_id | digits(3).is_extended | digits(3).bones(0).width | digits(3).bones(0).next_joint.x | digits(3).bones(0).next_joint.y | digits(3).bones(0).next_joint.z | digits(3).bones(0).prev_joint.x | digits(3).bones(0).prev_joint.y |
18 | digits(3).bones(0).prev_joint.z | digits(3).bones(0).rotation.w | digits(3).bones(0).rotation.x | digits(3).bones(0).rotation.y | digits(3).bones(0).rotation.z | digits(3).bones(1).width | digits(3).bones(1).next_joint.x | digits(3).bones(1).next_joint.y | digits(3).bones(1).next_joint.z | digits(3).bones(1).prev_joint.x |
19 | digits(3).bones(1).prev_joint.y | digits(3).bones(1).prev_joint.z | digits(3).bones(1).rotation.w | digits(3).bones(1).rotation.x | digits(3).bones(1).rotation.y | digits(3).bones(1).rotation.z | digits(3).bones(2).width | digits(3).bones(2).next_joint.x | digits(3).bones(2).next_joint.y | digits(3).bones(2).next_joint.z |
20 | digits(3).bones(2).prev_joint.x | digits(3).bones(2).prev_joint.y | digits(3).bones(2).prev_joint.z | digits(3).bones(2).rotation.w | digits(3).bones(2).rotation.x | digits(3).bones(2).rotation.y | digits(3).bones(2).rotation.z | digits(3).bones(3).width | digits(3).bones(3).next_joint.x | digits(3).bones(3).next_joint.y |
21 | digits(3).bones(3).next_joint.z | digits(3).bones(3).prev_joint.x | digits(3).bones(3).prev_joint.y | digits(3).bones(3).prev_joint.z | digits(3).bones(3).rotation.w | digits(3).bones(3).rotation.x | digits(3).bones(3).rotation.y | digits(3).bones(3).rotation.z | 4 | digits(4).finger_id |
22 | digits(4).is_extended | digits(4).bones(0).width | digits(4).bones(0).next_joint.x | digits(4).bones(0).next_joint.y | digits(4).bones(0).next_joint.z | digits(4).bones(0).prev_joint.x | digits(4).bones(0).prev_joint.y | digits(4).bones(0).prev_joint.z | digits(4).bones(0).rotation.w | digits(4).bones(0).rotation.x |
23 | digits(4).bones(0).rotation.y | digits(4).bones(0).rotation.z | digits(4).bones(1).width | digits(4).bones(1).next_joint.x | digits(4).bones(1).next_joint.y | digits(4).bones(1).next_joint.z | digits(4).bones(1).prev_joint.x | digits(4).bones(1).prev_joint.y | digits(4).bones(1).prev_joint.z | digits(4).bones(1).rotation.w |
24 | digits(4).bones(1).rotation.x | digits(4).bones(1).rotation.y | digits(4).bones(1).rotation.z | digits(4).bones(2).width | digits(4).bones(2).next_joint.x | digits(4).bones(2).next_joint.y | digits(4).bones(2).next_joint.z | digits(4).bones(2).prev_joint.x | digits(4).bones(2).prev_joint.y | digits(4).bones(2).prev_joint.z |
25 | digits(4).bones(2).rotation.w | digits(4).bones(2).rotation.x | digits(4).bones(2).rotation.y | digits(4).bones(2).rotation.z | digits(4).bones(3).width | digits(4).bones(3).next_joint.x | digits(4).bones(3).next_joint.y | digits(4).bones(3).next_joint.z | digits(4).bones(3).prev_joint.x | digits(4).bones(3).prev_joint.y |
26 | digits(4).bones(3).prev_joint.z | digits(4).bones(3).rotation.w | digits(4).bones(3).rotation.x | digits(4).bones(3).rotation.y | digits(4).bones(3).rotation.z | arm.width | arm.next_joint.x | arm.next_joint.y | arm.next_joint.z | arm.prev_joint.x |
27 | arm.prev_joint.y | arm.prev_joint.z | arm.rotation.w | arm.rotation.x | arm.rotation.y | arm.rotation.z |
References
- United Nations. International Day of Sign Languages. Available online: https://www.un.org/en/observances/sign-languages-day (accessed on 19 December 2022).
- Premaratne, P.; Nguyen, Q.; Premaratne, M. Human Computer Interaction Using Hand Gestures. In Advanced Intelligent Computing Theories and Applications; Huang, D.-S., McGinnity, M., Heutte, L., Zhang, X.-P., Eds.; Springer: Berlin, Heidelberg, 2010; pp. 381–386. [Google Scholar]
- LaViola, J.J.J. A Survey of Hand Posture and Gesture Recognition Techniques and Technology 1999. Brown Univ. Provid. RI. 1999. Available online: https://www.semanticscholar.org/paper/A-Survey-of-Hand-Posture-and-Gesture-Recognition-LaViola/856d4bf0f1f5d4480ce3115d828f34d4b2782e1c (accessed on 12 July 2022).
- CyberGlove Systems LLC. Available online: http://www.cyberglovesystems.com/ (accessed on 25 August 2022).
- Hernandez-Rebollar, J.L.; Kyriakopoulos, N.; Lindeman, R.W. The AcceleGlove: A whole-hand input device for virtual reality. In Proceedings of the ACM SIGGRAPH 2002 Conference Abstracts and Applications; Association for Computing Machinery: New York, NY, USA, 2002; p. 259. [Google Scholar]
- Barreto, A.; Scargle, S.; Adjouadi, M. Hands-off human-computer interfaces for individuals with severe motor disabilities. In Proceedings of the on Human-Computer Interaction: Communication, Cooperation, and Application Design, Hillsdale, NJ, USA, 22–26 August 1999; L. Erlbaum Associates Inc.: Munich, Germany; Volume 2, pp. 970–974. [Google Scholar]
- Coleman, K. Electromyography based human-computer-interface to induce movement in elderly persons with movement impairments. In Proceedings of the 2001 EC/NSF Workshop on Universal Accessibility of Ubiquitous Computing: Providing for the Elderly, Alcácer do Sal, Portugal, 22–25 May 2001; Association for Computing Machinery: New York, NY, USA, 2001; pp. 75–79. [Google Scholar]
- Guerreiro, T.; Jorge, J. EMG as a daily wearable interface. In Proceedings of the First International Conference on Computer Graphics Theory and Applications, Setúbal, Portugal, 25–28 February 2006; pp. 216–223. [Google Scholar]
- Ahsan, R.; Ibrahimy, M.I.; Khalifa, O.O. EMG Signal Classification for Human Computer Interaction: A Review. Eur. J. Sci. Res. 2009, 33, 480–501. [Google Scholar]
- Booij, W.E.; Welle, K.O. Ultrasound detectors. US8792305B2, 29 July 2014. Available online: https://patents.google.com/patent/US8792305B2/en (accessed on 12 July 2022).
- Saad, M.; Bleakley, C.J.; Nigram, V.; Kettle, P. Ultrasonic hand gesture recognition for mobile devices. J. Multimodal. User Interfaces 2018, 12, 31–39. [Google Scholar] [CrossRef]
- Sang, Y.; Shi, L.; Liu, Y. Micro Hand Gesture Recognition System Using Ultrasonic Active Sensing. IEEE Access 2018, 6, 49339–49347. [Google Scholar] [CrossRef]
- Asadzadeh, P.; Kulik, L.; Tanin, E. Gesture recognition using RFID technology. Pers. Ubiquit. Comput. 2012, 16, 225–234. [Google Scholar] [CrossRef]
- Bouchard, K.; Bouzouane, A.; Bouchard, B. Gesture recognition in smart home using passive RFID technology. In Proceedings of the 7th International Conference on PErvasive Technologies Related to Assistive Environments, Rhodes, Greece, 27–30 May 2014; Association for Computing Machinery: New York, NY, USA, 2014; pp. 1–8. [Google Scholar]
- Jayatilaka, A.; Ranasinghe, D.C. Real-time fluid intake gesture recognition based on batteryless UHF RFID technology. Pervasive Mob. Comput. 2017, 34, 146–156. [Google Scholar] [CrossRef]
- Wen, Y.; Hu, C.; Yu, G.; Wang, C. A robust method of detecting hand gestures using depth sensors. In Proceedings of the 2012 IEEE International Workshop on Haptic Audio Visual Environments and Games (HAVE 2012), Munich, Germany, 8–9 October 2012; pp. 72–77. [Google Scholar]
- API Overview—Leap Motion JavaScript SDK v3.2 Beta Documentation. Available online: https://developer-archive.leapmotion.com/documentation/javascript/devguide/Leap_Overview.html (accessed on 12 July 2022).
- Funasaka, M.; Ishikawa, Y.; Takata, M.; Joe, K. Sign Language Recognition using Leap Motion. In Proceedings of the International Conference on Parallel and Distributed Processing Techniques and Applications (PDPTA), Las Vegas, NV, USA, 25–28 July 2016; pp. 263–269. [Google Scholar]
- Marin, G.; Dominio, F.; Zanuttigh, P. Hand gesture recognition with leap motion and kinect devices. In Proceedings of the 2014 IEEE International Conference on Image Processing (ICIP), Paris, France, 27–30 October 2014; pp. 1565–1569. [Google Scholar]
- Simos, M.; Nikolaidis, N. Greek sign language alphabet recognition using the leap motion device. In Proceedings of the 9th Hellenic Conference on Artificial Intelligence, Thessaloniki, Greece, 18–20 May 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 1–4. [Google Scholar]
- Mapari, R.B.; Kharat, G. American Static Signs Recognition Using Leap Motion Sensor. In Proceedings of the Second International Conference on Information and Communication Technology for Competitive Strategies, Udaipur, India, 4–5 March 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 1–5. [Google Scholar]
- Vaitkevičius, A.; Taroza, M.; Blažauskas, T.; Damaševičius, R.; Maskeliūnas, R.; Woźniak, M. Recognition of American Sign Language Gestures in a Virtual Reality Using Leap Motion. Appl. Sci. 2019, 9, 445. [Google Scholar] [CrossRef] [Green Version]
- Mohandes, M.; Deriche, M.; Liu, J. Image-Based and Sensor-Based Approaches to Arabic Sign Language Recognition. IEEE Trans. Hum.-Mach. Syst. 2014, 44, 551–557. [Google Scholar] [CrossRef]
- Mohandes, M.; Aliyu, S.; Deriche, M. Arabic Sign Language Recognition using the Leap Motion Controller. In Proceedings of the 2014 IEEE 23rd International Symposium on Industrial Electronics (ISIE), Istanbul, Turkey, 1–4 June 2014; pp. 960–965. [Google Scholar]
- Hisham, B.; Hamouda, A. Arabic Sign Language Recognition using Microsoft Kinect and Leap Motion Controller. In Proceedings of the 11th International Conference on Informatics & Systems (INFOS 2018), Rochester, NY, USA, 27 October 2018. [Google Scholar]
- Naglot, D.; Kulkarni, M. Real time sign language recognition using the leap motion controller. In Proceedings of the 2016 International Conference on Inventive Computation Technologies (ICICT), Coimbatore, India, 26–27 August 2016; Volume 3, pp. 1–5. [Google Scholar]
- Chong, T.-W.; Lee, B.G. American Sign Language Recognition Using Leap Motion Controller with Machine Learning Approach. Sensors 2018, 18, 3554. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Lee, C.K.M.; Ng, K.K.H.; Chen, C.-H.; Lau, H.C.W.; Chung, S.Y.; Tsoi, T. American sign language recognition and training method with recurrent neural network. Expert Syst. Appl. 2021, 167, 114403. [Google Scholar] [CrossRef]
- Tao, W.; Lai, Z.-H.; Leu, M.C.; Yin, Z. American Sign Language Alphabet Recognition Using Leap Motion Controller. In Proceedings of the IIE Annual Conference, Orlando, FL, USA, 19–22 May 2018; ProQuest: Orlando, FL, USA, 2018; pp. 599–604. Available online: https://www.proquest.com/scholarly-journals/american-sign-language-alphabet-recognition-using/docview/2553578468/se-2 (accessed on 19 December 2022).
- Anwar, A.; Basuki, A.; Sigit, R.; Rahagiyanto, A.; Zikky, M. Feature Extraction for Indonesian Sign Language (SIBI) Using Leap Motion Controller. In Proceedings of the 2017 21st International Computer Science and Engineering Conference (ICSEC), Bangkok, Thailand, 15–18 November 2017; pp. 1–5. [Google Scholar]
- Alnahhas, A.; Alkhatib, B. Enhancing The Recognition Of Arabic Sign Language By Using Deep Learning And Leap Motion Controller. Int. J. Sci. Technol. Res. 2020, 9, 1865–1870. [Google Scholar]
- Avola, D.; Bernardi, M.; Cinque, L.; Foresti, G.L.; Massaroni, C. Exploiting Recurrent Neural Networks and Leap Motion Controller for the Recognition of Sign Language and Semaphoric Hand Gestures. IEEE Trans. Multimed. 2019, 21, 234–245. [Google Scholar] [CrossRef] [Green Version]
- Elons, A.S.; Ahmed, M.; Shedid, H.; Tolba, M.F. Arabic sign language recognition using leap motion sensor. In Proceedings of the 2014 9th International Conference on Computer Engineering & Systems (ICCES), Cairo, Egypt, 22–23 December 2014; pp. 368–373. [Google Scholar]
- Jenkins, J.; Rashad, S. An Innovative Method for Automatic American Sign Language Interpretation using Machine Learning and Leap Motion Controller. In Proceedings of the 2021 IEEE 12th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON), New York, NY, USA, 1–4 December 2021; pp. 0633–0638. [Google Scholar]
- Tuzcu, V.; Nas, S. Dynamic time warping as a novel tool in pattern recognition of ECG changes in heart rhythm disturbances. In Proceedings of the 2005 IEEE International Conference on Systems, Man and Cybernetics, Waikoloa, HI, USA, 10–12 October 2005; Volume 1, pp. 182–186. [Google Scholar]
- Legrand, B.; Chang, C.S.; Ong, S.H.; Neo, S.-Y.; Palanisamy, N. Chromosome classification using dynamic time warping. Pattern Recognit. Lett. 2008, 29, 215–222. [Google Scholar] [CrossRef]
- Kovacs-Vajna, Z.M. A fingerprint verification system based on triangular matching and dynamic time warping. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1266–1276. [Google Scholar] [CrossRef] [Green Version]
- Rath, T.M.; Manmatha, R. Word image matching using dynamic time warping. In Proceedings of the 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Madison, WI, USA, 18–20 June 2003; Volume 2, p. II. [Google Scholar]
- Okawa, M. Template Matching Using Time-Series Averaging and DTW With Dependent Warping for Online Signature Verification. IEEE Access 2019, 7, 81010–81019. [Google Scholar] [CrossRef]
- Piyush Shanker, A.; Rajagopalan, A.N. Off-line signature verification using DTW. Pattern Recognit. Lett. 2007, 28, 1407–1414. [Google Scholar] [CrossRef]
- Muda, L.; Begam, M.; Elamvazuthi, I. Voice Recognition Algorithms using Mel Frequency Cepstral Coefficient (MFCC) and Dynamic Time Warping (DTW) Techniques. arXiv 2010, arXiv:1003.4083. [Google Scholar]
- Amin, T.B.; Mahmood, I. Speech Recognition using Dynamic Time Warping. In Proceedings of the 2008 2nd International Conference on Advances in Space Technologies, Islamabad, Pakistan, 29–30 November 2008; pp. 74–79. [Google Scholar]
- Adwan, S.; Arof, H. On improving Dynamic Time Warping for pattern matching. Measurement 2012, 45, 1609–1620. [Google Scholar] [CrossRef]
- Arici, T.; Celebi, S.; Aydin, A.S.; Temiz, T.T. Robust gesture recognition using feature pre-processing and weighted dynamic time warping. Multimed. Tools Appl. 2014, 72, 3045–3062. [Google Scholar] [CrossRef]
- Calin, A.D. Gesture Recognition on Kinect Time Series Data Using Dynamic Time Warping and Hidden Markov Models. In Proceedings of the 2016 18th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC), Timisoara, Romania, 24–27 September 2016; pp. 264–271. [Google Scholar]
- Riofrío, S.; Pozo, D.; Rosero, J.; Vásquez, J. Gesture Recognition Using Dynamic Time Warping and Kinect: A Practical Approach. In Proceedings of the 2017 International Conference on Information Systems and Computer Science (INCISCOS), Quito, Ecuador, 23–25 November 2017; pp. 302–308. [Google Scholar]
- Reyes, M.; Domínguez, G.; Escalera, S. Feature weighting in dynamic time warping for gesture recognition in depth data. In Proceedings of the 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops), Barcelona, Spain, 7 November 2011; pp. 1182–1188. [Google Scholar]
- Raheja, J.L.; Minhas, M.; Prashanth, D.; Shah, T.; Chaudhary, A. Robust gesture recognition using Kinect: A comparison between DTW and HMM. Optik 2015, 126, 1098–1104. [Google Scholar] [CrossRef]
- Ahmed, W.; Chanda, K.; Mitra, S. Vision based Hand Gesture Recognition using Dynamic Time Warping for Indian Sign Language. In Proceedings of the 2016 International Conference on Information Science (ICIS), Kochi, India, 12–13 August 2016; pp. 120–125. [Google Scholar]
- Jambhale, S.S.; Khaparde, A. Gesture recognition using DTW & piecewise DTW. In Proceedings of the 2014 International Conference on Electronics and Communication Systems (ICECS), Coimbatore, India, 13–14 February 2014; pp. 1–5. [Google Scholar]
- Kuzmanic, A.; Zanchi, V. Hand shape classification using DTW and LCSS as similarity measures for vision-based gesture recognition system. In Proceedings of the EUROCON 2007—The International Conference on “Computer as a Tool”, Warsaw, Poland, 9–12 September 2007; pp. 264–269. [Google Scholar]
Medical Words | ||||||
Allergy | Alzheimer | Ambulance | Anxiety | Asthma | Bacteria | Bladder |
Blood | blood circulation | Blood Test | Breasts | Burp | Cancer | Care |
Chest | Consult | Depression | Diabetes | Doctor | Fainting | Fatten |
Fever | Gluteus | Headset | Health | Heart | Heart Attack | Hemorrhage |
Hormone | Hospital | Ictus | Implant | Inflammation | Injection | Injury |
Intestine | Jaw | Liver | Lumbago | Lungs | Mask | Medicines |
Organs | Ovaries | Oxygen | Phlegm | Pressure | Prostate | Serum |
Sickness | Stitches | Stomach | Stress | Swelling | Urgency | Vaccine |
Vagina | ||||||
Verbs | ||||||
I didn’t know | I don’t know | Is | To Ask | To Break | To Burn | To Cook |
To Cure | To Disappear | To Do | To Eat | To Fall | To Feel | To Go |
To Have | To Operate | To Reduce | To Rest | To Run | To Shower | To Size |
To Take | To Take a walk | To Try | To Walk | To Work | ||
Everyday Words | ||||||
Accident | After | Air | All Right | Already | Also | Always |
April | August | Back | Before | Car | Centre | Coat |
Cold | Danger | Day | Deaf | December | Ear | Effort |
Elbow | End | Example | Eyelid | Eyes | Family | Fat |
February | Feet | Friday | Gloves | Good | Good Afternoon | Good Bye |
Good Morning | Good Night | Hands | Head | High | Higher | Hour |
How are you? | Hungry | Information | June | Last Night | Little | Lonely |
Man | March | Midday | Moto | Mouth | Much | Mummy |
Neck | Night | No | No Problem | Noise | Nothing | Now |
October | Panic | Private | Rain | Regular | Risk | Saturday |
Sensation | September | Sex | Shape | Shoulder | Something | Suffer |
Sure | Thank You | They | Thin | To | Too | Trouble |
Unusual | Upset | Wednesday | Without | Yesterday | Yours |
Parameters | ||||||||
---|---|---|---|---|---|---|---|---|
4 | 8 | 9 | 10 | 11 | 27 | 28 | 29 | 32 |
41 | 42 | 43 | 52 | 53 | 54 | 63 | 64 | 65 |
74 | 75 | 76 | 79 | 88 | 89 | 90 | 99 | 100 |
101 | 110 | 111 | 112 | 121 | 122 | 123 | 126 | 135 |
136 | 137 | 146 | 147 | 148 | 157 | 158 | 159 | 168 |
169 | 170 | 173 | 182 | 183 | 184 | 193 | 194 | 195 |
204 | 205 | 206 | 215 | 216 | 217 | 220 | 229 | 230 |
231 | 240 | 241 | 242 | 251 | 252 | 253 | 262 | 263 |
264 | 272 |
Nº Words | Precision | Recall | F1 |
---|---|---|---|
50 | 98.93% | 98.80% | 98.80% |
100 | 97.96% | 97.60% | 97.59% |
176 | 96.32% | 95.17% | 95.02% |
Nº Words | Precision | Recall | F1 |
---|---|---|---|
50 | 95.34% | 94.80% | 94.77% |
Reference | Gestures | Data Set | Classifier | Accuracy |
---|---|---|---|---|
Static Signs | ||||
[18] | 24 ASL Letters | unclear | Decision tree | 82.71% |
[19] * | 10 ASL | 14 people × 10 sets × 10 samples | SVM | 80.86% |
[20] | 24 GSL | 6 people × 10 sets | MLP (boneTraslation) | 99.08% |
MLP (palmTraslation) | 98.96% | |||
[21] | 32 ASL | 146 people × 1 sample/letter | MLP | 90% |
[22] | 24 ASL Letters Use sentences | 12 people × 10 samples | Linear Regression Analysis | 86.1% |
Static and Dynamic Signs | ||||
[23] | 28 ArSL | 10 samples/letter | KNN | 97.1% |
HMM | 97.7% | |||
[24] | 28 ArSL | 10 samples/letter | Naïve Bayes | 98.3% |
MLP | 99.1% | |||
[25] * | 16 ArSL Static Words 20 Dynamic Words | 2 people × 200 samples | Static | |
Neural Network | 90.35% | |||
K-NN | 95.22% | |||
SVM (RBF Kernel) | 89.12% | |||
SVM (Poly Kernel) | 90.78% | |||
Dynamic | ||||
DTW | 96.41% | |||
[26] | 26 ASL letters | 4 people × 20 samples/letter | MLP-BP | 96.15% |
[27] | 26 ASL letters 10 digits | 12 people Unclear samples | SVM (letters) | 80.30% |
DNN (letters) | 93.81% | |||
SVM (total) | 72.79% | |||
DNN (total) | 88.79% | |||
[28] | 26 ASL letters | 100 people × 1 sample/letter | LSTM | 97.96% |
SVM | 98.35% | |||
RNN | 98.19% | |||
[29] | 26 ASL letters | 5 sets × 450 samples/letter | CNN | 80.1–99.7% |
[30] | 26 SIBI letters | 5 people × 10 samples/letter | K-NN | 95.15% |
SVM | 93.85% | |||
[31] | 15 ArSL words | 5 people × 10 samples/letter | LSTM | 96% |
[32] | 18 static 12 dynamic | 20 people × 60 samples/word | LSTM | 96.41% |
Dynamic Signs | ||||
[33] | 50 ArSL | 4 people × 1 set | MLP | 88% |
[34] | 13 ASL words | 10 samples | Neural Network | 99.9% |
Random Forest | 99.7% | |||
SVM | 99.9% | |||
K-NN | 98.7% | |||
Naïve Bayes | 96.4% | |||
This Paper | 50 SSL words | 1 person × 30 samples/word | DTW | 98.80% |
This Paper | 100 SSL words | 1 person × 30 samples/word | DTW | 97.60% |
This Paper | 176 SSL words | 1 person × 30 samples/word | DTW | 95.17% |
This Paper | 50 SSL words | 15 different people × 10 samples/word | DTW | 94.80% |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Galván-Ruiz, J.; Travieso-González, C.M.; Pinan-Roescher, A.; Alonso-Hernández, J.B. Robust Identification System for Spanish Sign Language Based on Three-Dimensional Frame Information. Sensors 2023, 23, 481. https://doi.org/10.3390/s23010481
Galván-Ruiz J, Travieso-González CM, Pinan-Roescher A, Alonso-Hernández JB. Robust Identification System for Spanish Sign Language Based on Three-Dimensional Frame Information. Sensors. 2023; 23(1):481. https://doi.org/10.3390/s23010481
Chicago/Turabian Style: Galván-Ruiz, Jesús, Carlos M. Travieso-González, Alejandro Pinan-Roescher, and Jesús B. Alonso-Hernández. 2023. "Robust Identification System for Spanish Sign Language Based on Three-Dimensional Frame Information" Sensors 23, no. 1: 481. https://doi.org/10.3390/s23010481
APA Style: Galván-Ruiz, J., Travieso-González, C. M., Pinan-Roescher, A., & Alonso-Hernández, J. B. (2023). Robust Identification System for Spanish Sign Language Based on Three-Dimensional Frame Information. Sensors, 23(1), 481. https://doi.org/10.3390/s23010481