Energy–Accuracy Aware Finger Gesture Recognition for Wearable IoT Devices
Abstract
1. Introduction
- Provide the full design of a finger gesture recognition system using a single flex sensor.
- Explore the design choices of the finger gesture recognition system in terms of performance, accuracy, and energy consumption using the proposed performance and energy-consumption models.
- Demonstrate the functionality and feasibility of the proposed designs by implementing prototypes on four commonly used low-end embedded MCUs.
- Show that the energy–accuracy aware design achieves up to 95.5% accuracy with an energy consumption of 2.74 mJ per gesture.
- Identify the energy–accuracy aware Pareto-optimal designs among a total of 159 design choices, so that design points can be selected under given energy or accuracy constraints; a minimal extraction sketch follows this list.
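To make the last bullet concrete, the sketch below shows one way such a Pareto front can be extracted once every design choice is summarized by its per-gesture energy and accuracy. The struct layout, function names, and the active/standby split are illustrative assumptions, not the paper's implementation; the sample points are taken from the design-space table later on this page.

```c
#include <stdbool.h>
#include <stdio.h>

/* One candidate design point: a hypothetical summary of a single
 * (MCU, sample rate, MLP topology) configuration. */
typedef struct {
    const char *label;
    double energy_mj;  /* energy per gesture (mJ); lower is better        */
    double accuracy;   /* classification accuracy (%); higher is better   */
} design_t;

/* b dominates a if b is at least as good in both objectives and
 * strictly better in at least one. */
static bool dominates(const design_t *b, const design_t *a)
{
    return b->energy_mj <= a->energy_mj && b->accuracy >= a->accuracy &&
           (b->energy_mj <  a->energy_mj || b->accuracy >  a->accuracy);
}

/* Print every non-dominated (Pareto-optimal) design point. */
static void print_pareto(const design_t *pts, int n)
{
    for (int i = 0; i < n; i++) {
        bool kept = true;
        for (int j = 0; j < n && kept; j++)
            kept = (j == i) || !dominates(&pts[j], &pts[i]);
        if (kept)
            printf("%-28s %5.2f mJ  %4.1f%%\n",
                   pts[i].label, pts[i].energy_mj, pts[i].accuracy);
    }
}

int main(void)
{
    /* Points from the design-space table below; a point is dropped only
     * if another point is better on both axes at once. */
    design_t pts[] = {
        { "MSP430, 12 Hz, 24x16x17",   1.42, 89.7 },
        { "MSP430, 14 Hz, 28x19x17",   1.47, 91.0 },
        { "CC2652R, 20 Hz, 40x27x17",  2.39, 92.3 },
        { "CC2652R, 40 Hz, 80x57x17",  2.61, 94.8 },
    };
    print_pareto(pts, 4);
    return 0;
}
```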
2. Background
2.1. Related Work
2.2. Basics of Flex Sensors
3. Designing the Finger Gesture Recognition System
3.1. System Architecture
3.2. Designing Preprocessing Filters
3.3. Designing an MLP-Based Classifier
4. Energy–Accuracy Aware Design Optimization
4.1. Performance (Timing) Estimation Models
4.2. Energy Estimation Models
4.3. Energy–Accuracy Aware System-Level Design
5. Evaluations
5.1. Experimental Setup
5.2. Results of Design Choice Exploration
5.3. Pareto-Optimal Energy–Accuracy Aware Design Points
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
1. Gunawan, M.R.; Djamal, E.C. Spatio-Temporal Approach using CNN-RNN in Hand Gesture Recognition. In Proceedings of the 2021 4th International Conference of Computer and Informatics Engineering (IC2IE), Depok, Indonesia, 14–15 September 2021; pp. 385–389.
2. Chen, X.; Guo, H.; Wang, G.; Zhang, L. Motion feature augmented recurrent neural network for skeleton-based dynamic hand gesture recognition. In Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; pp. 2881–2885.
3. Chen, L.; Fu, J.; Wu, Y.; Li, H.; Zheng, B. Hand Gesture Recognition Using Compact CNN via Surface Electromyography Signals. Sensors 2020, 20, 672.
4. Mendes, N.; Ferrer, J.; Vitorino, J.; Safeea, M.; Neto, P. Human Behavior and Hand Gesture Classification for Smart Human-robot Interaction. Procedia Manuf. 2017, 11, 91–98.
5. Alam, S.; Kwon, K.-C.; Kim, N. Implementation of a Character Recognition System Based on Finger-Joint Tracking Using a Depth Camera. IEEE Trans. Hum.-Mach. Syst. 2021, 51, 229–241.
6. Chen, M.; AlRegib, G.; Juang, B.-H. Air-Writing Recognition—Part I: Modeling and Recognition of Characters, Words, and Connecting Motions. IEEE Trans. Hum.-Mach. Syst. 2015, 46, 403–413.
7. Chuang, W.-C.; Hwang, W.-J.; Tai, T.-M.; Huang, D.-R.; Jhang, Y.-J. Continuous Finger Gesture Recognition Based on Flex Sensors. Sensors 2019, 19, 3986.
8. Kim, M.; Cho, J.; Lee, S.; Jung, Y. IMU Sensor-Based Hand Gesture Recognition for Human-Machine Interfaces. Sensors 2019, 19, 3827.
9. Dang, L.M.; Min, K.; Wang, H.; Piran, J.; Lee, C.H.; Moon, H. Sensor-based and vision-based human activity recognition: A comprehensive survey. Pattern Recognit. 2020, 108, 107561.
10. Fernandez, I.G.; Ahmad, S.A.; Wada, C. Inertial Sensor-Based Instrumented Cane for Real-Time Walking Cane Kinematics Estimation. Sensors 2020, 20, 4675.
11. Côté-Allard, U.; Fall, C.L.; Drouin, A.; Campeau-Lecours, A.; Gosselin, C.; Glette, K.; Laviolette, F.; Gosselin, B. Deep Learning for Electromyographic Hand Gesture Signal Classification Using Transfer Learning. IEEE Trans. Neural Syst. Rehabil. Eng. 2019, 27, 760–771.
12. Lin, B.-S.; Hsiao, P.-C.; Yang, S.-Y.; Su, C.-S.; Lee, I.-J. Data Glove System Embedded With Inertial Measurement Units for Hand Function Evaluation in Stroke Patients. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 2204–2213.
13. Chan, J.; Veas, E.; Simon, J. Designing a Sensor Glove Using Deep Learning. In Proceedings of the 26th International Conference on Intelligent User Interfaces, College Station, TX, USA, 14–17 April 2021; pp. 150–160.
14. Bendlabs. 2-Axis Soft Flex Sensor. Available online: https://www.bendlabs.com/products/2-axis-soft-flex-sensor/ (accessed on 30 August 2018).
15. Laput, G.; Harrison, C. Sensing Fine-Grained Hand Activity with Smartwatches. In Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI'19), Glasgow, UK, 4–9 May 2019.
16. Ketykó, I.; Kovács, F.; Varga, K.Z. Domain Adaptation for sEMG-based Gesture Recognition with Recurrent Neural Networks. In Proceedings of the 2019 International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary, 14–19 July 2019; pp. 1–7.
17. Hao, J.; Yang, P.; Chen, L.; Geng, Y. A gait recognition approach based on surface electromyography and triaxial acceleration signals. Chin. J. Tissue Eng. Res. 2019, 23, 5164.
18. Roland, T.; Amsuess, S.; Russold, M.F.; Baumgartner, W. Ultra-Low-Power Digital Filtering for Insulated EMG Sensing. Sensors 2019, 19, 959.
19. Ponraj, G.; Ren, H. Sensor Fusion of Leap Motion Controller and Flex Sensors Using Kalman Filter for Human Finger Tracking. IEEE Sens. J. 2018, 18, 2042–2049.
20. Lichtenauer, J.F.; Hendriks, E.A.; Reinders, M.J. Sign Language Recognition by Combining Statistical DTW and Independent Classification. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 2040–2046.
21. Vijayalakshmi, P.; Aarthi, M. Sign language to speech conversion. In Proceedings of the 2016 International Conference on Recent Trends in Information Technology (ICRTIT), Chennai, India, 8–9 April 2016; pp. 1–6.
22. Hu, Q.; Tang, X.; Tang, W. A Smart Chair Sitting Posture Recognition System Using Flex Sensors and FPGA Implemented Artificial Neural Network. IEEE Sens. J. 2020, 20, 8007–8016.
23. Shin, S.; Sung, W. Dynamic hand gesture recognition for wearable devices with low complexity recurrent neural networks. In Proceedings of the 2016 IEEE International Symposium on Circuits and Systems (ISCAS), Montréal, QC, Canada, 22–25 May 2016; pp. 2274–2277.
24. Wang, L.; Meydan, T.; Williams, P.; Wolfson, K.T. A proposed optical-based sensor for assessment of hand movement. In Proceedings of the 2015 IEEE Sensors, Busan, Korea, 1–4 November 2015; pp. 1–4.
25. Shafi, I.; Ahmad, J.; Shah, S.I.; Kashif, F.M. Impact of Varying Neurons and Hidden Layers in Neural Network Architecture for a Time Frequency Application. In Proceedings of the 2006 IEEE International Multitopic Conference, Islamabad, Pakistan, 23–24 December 2006; pp. 188–193.
26. Panda, A.K.; Chakravarty, R.; Moulik, S. Hand Gesture Recognition using Flex Sensor and Machine Learning Algorithms. In Proceedings of the 2020 IEEE-EMBS Conference on Biomedical Engineering and Sciences, Langkawi Island, Malaysia, 1–3 March 2021; pp. 449–453.
Definition | Description
---|---
Number of sampled data per gesture to be recognized |
Number of parameters used in the MLP classifier (NMLP) |
Sensor frequency (sample rate) |
Time taken per gesture recognition |
Time taken to read one sample from the sensor | 269 µs (including time to wake up, the I2C transfer, and time to return to sleep)
Time taken to perform preprocessing | Depends on the number of sampled data per gesture
Time taken to perform the MLP evaluation | Depends on the number of parameters in the MLP (NMLP)
Time taken to collect data |
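Taken together, these definitions imply a per-gesture latency decomposition. The display below is a hedged reconstruction consistent with the table, not the paper's exact formula; the symbol names (N_s for samples per gesture, f_s for sample rate, T_read, T_pre, T_MLP) are assumed.

```latex
% Hedged reconstruction of the per-gesture timing model; symbol names assumed.
T_{ges} = T_{collect} + T_{pre} + T_{MLP},
\qquad
T_{collect} \approx \frac{N_s}{f_s}
```

Of the collection window, only about N_s · T_read (with T_read = 269 µs) is spent actively reading; the MCU can stay in standby between samples, which is what makes the standby current in the next table matter.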
MCU | Clock Frequency (MHz) | On-Chip Memory (KB) | Max. NMLP | Architecture | Active Current (mA/MHz) | Standby Current (µA)
---|---|---|---|---|---|---
CC2652R | 48 | 80 | 18,100 | Cortex-M4F 32-bit RISC | 0.07 | 675
Atmega2560 | 16 | 8 | 1972 | AVR 8-bit RISC | 2.3 | 170
Atmega1284P | 16 | 16 | 3960 | AVR 8-bit RISC with picoPower | 0.86 | 210
MSP430 | 16 | 4 | 900 | 16-bit RISC | 0.13 | 420
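The per-MHz active and standby currents in this table support a first-order per-gesture energy estimate in the spirit of the Section 4.2 models. The sketch below is an illustrative approximation: the 3.0 V supply and the active/standby time split are assumptions, not values from the paper.

```c
#include <stdio.h>

#define VDD 3.0  /* supply voltage in volts -- an assumption, not from the paper */

/* First-order per-gesture energy: the MCU is active while reading samples,
 * preprocessing, and evaluating the MLP, and in standby between samples. */
static double energy_per_gesture_mj(double active_ma_per_mhz, double clock_mhz,
                                    double standby_ua,
                                    double t_active_s, double t_standby_s)
{
    double p_active_mw  = active_ma_per_mhz * clock_mhz * VDD; /* mA * V = mW */
    double p_standby_mw = (standby_ua / 1000.0) * VDD;
    return p_active_mw * t_active_s + p_standby_mw * t_standby_s; /* mW*s = mJ */
}

int main(void)
{
    /* MSP430 currents from the table; the 0.2 s active / 1.0 s standby split
     * per gesture is a made-up example, not a measured value. */
    printf("~%.2f mJ per gesture\n",
           energy_per_gesture_mj(0.13, 16.0, 420.0, 0.2, 1.0));
    return 0;
}
```

Note also that the "Max. NMLP" column is consistent with 4-byte parameters: for example, 900 × 4 B = 3600 B fits the MSP430's 4 KB, and 18,100 × 4 B ≈ 70.7 KB fits the CC2652R's 80 KB.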
MCU Type | Sample Rate | NMLP | Memory Size (Byte) | MLP Layers | Accuracy (%) | Eges (mJ)
---|---|---|---|---|---|---
MSP430 | 5 | 185 | 740 | 10 × 6 × 17 | 51.1 | 1.31
 | 7 | 297 | 1188 | 14 × 7 × 7 × 17 | 60.1 | 1.33
 | 9 | 449 | 1796 | 18 × 12 × 17 | 78.1 | 1.36
 | 11 | 589 | 2356 | 22 × 11 × 11 × 17 | 81.7 | 1.37
 | 12 | 689 | 2756 | 24 × 16 × 17 | 89.7 | 1.42
 | 14 | 891 | 3564 | 28 × 19 × 17 | 91.0 | 1.47
CC2652R | 20 | 1583 | 6332 | 40 × 27 × 17 | 92.3 | 2.39
 | 30 | 3287 | 13,148 | 60 × 30 × 30 × 17 | 92.9 | 2.49
 | 40 | 5603 | 22,412 | 80 × 57 × 17 | 94.8 | 2.61
 | 50 | 7787 | 31,148 | 100 × 50 × 40 × 17 | 95.8 | 2.72
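The "MLP Layers" column lists neurons per layer, and the memory size is NMLP × 4 B: for the 24 × 16 × 17 point, 24·16 + 16 + 16·17 + 17 = 689 parameters = 2756 bytes. A minimal fixed-topology forward pass for that design point is sketched below; the single-precision weights, row-major weight layout, and ReLU hidden activation are assumptions for illustration, and may differ from the paper's implementation.

```c
/* Dense layer: out[j] = act(b[j] + sum_i in[i] * w[i*n_out + j]).
 * Row-major weight layout and ReLU are assumptions, not from the paper. */
static void dense(const float *in, int n_in, const float *w,
                  const float *b, float *out, int n_out, int relu)
{
    for (int j = 0; j < n_out; j++) {
        float acc = b[j];
        for (int i = 0; i < n_in; i++)
            acc += in[i] * w[i * n_out + j];
        out[j] = (relu && acc < 0.0f) ? 0.0f : acc;
    }
}

/* Forward pass for the 24x16x17 design point in the table:
 * 24*16 + 16 + 16*17 + 17 = 689 parameters = 2756 bytes at 4 B each. */
void mlp_24_16_17(const float x[24],
                  const float w1[24 * 16], const float b1[16],
                  const float w2[16 * 17], const float b2[17],
                  float y[17])
{
    float h[16];
    dense(x, 24, w1, b1, h, 16, 1);  /* hidden layer, ReLU (assumed)     */
    dense(h, 16, w2, b2, y, 17, 0);  /* output scores for the 17 classes */
}
```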
 | [1] | [3] | [5] | [6] | [7] | [8] | [13] | [23] | [26] | This Work
---|---|---|---|---|---|---|---|---|---|---
Used sensors | Camera | EMG (Myo) | Depth camera | Optical and IMU | Flex sensor | IMU | Pressure, flex, gyro, IMU, etc. | Accelerometer | Flex sensor | 2-axis flex sensor
Models (num. of params or mem. size) | CNN + RNN (N/A) | CNN (34 K) | Custom (600 MB) | HMM (N/A) | GRU + MAP (~50 K) | RCE (274.3 Kb) | LSTM (N/A) | RNN (69 K) | AL (N/A) | MLP (185–8513)
Classes | 4 | 7 | 124 | 26 | 4 | 10 | 31 | 8 | 4 | 17
Accuracy (%) | 96.4 | 98.8 | 91.9 | 98.1 | 97.3 | 98.6 | 90.0 | 88.6 | 88.3 | 95.5
Implementation | N/A | N/A | Intel i5, GPU (GTX 750) | N/A | Raspberry Pi 3 | Arduino + FPGA | N/A | N/A | N/A | CC2652R, Atmega, MSP430