Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors
Abstract
1. Introduction
- This study proposes a novel spatial pose telemanipulation method for a 6-DOF manipulator that requires only two IMU sensors and no human skeletal kinematic model, achieved through an unprecedented combination of the gesture mode and its states.
- Consistent hand-guiding gesture mode classification and state estimation were achieved, even during the operator’s dynamic movements, by integrating the floating body-fixed frame method into the proposed telemanipulation approach.
2. Problem Definition
- Gesture recognition model-based method: this approach enables robot-arm control through specific hand gestures mapped to corresponding motions; however, it cannot control the robot arm in all directions.
- Skeletal kinematic model-based method: this approach allows omnidirectional control of the robot arm by replicating the hand movements of a human worker. Despite its versatility, it faces challenges in producing pure linear or angular motion and in differentiating intended from unintended control motions. Moreover, accurately discerning the operator’s motion intent without external sensor-based error feedback remains unattainable. The proposed method therefore targets the following gesture-to-motion mapping (a sketch follows the list below):
- (Operator side) Linear gesture → (Robot side) spatial translational motion;
- (Operator side) Angular gesture → (Robot side) spatial rotational motion;
- (Operator side) Unintentional gesture → (Robot side) zero motion.
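A minimal sketch of this intended gesture-to-motion mapping, assuming the classifier already provides a gesture mode together with a direction vector and an intensity (the mode labels, function names, and values below are illustrative assumptions, not the authors’ implementation):

```python
# Hedged sketch: map a classified hand-guiding gesture mode to a 6-DOF
# end-effector twist command [vx, vy, vz, wx, wy, wz]. Mode labels and the
# direction/intensity inputs are assumptions for illustration.
import numpy as np

def gesture_to_twist(mode: str, direction: np.ndarray, intensity: float) -> np.ndarray:
    twist = np.zeros(6)
    if mode in ("linear", "angular"):
        unit = direction / np.linalg.norm(direction)
        if mode == "linear":    # linear gesture -> pure spatial translation
            twist[:3] = intensity * unit
        else:                   # angular gesture -> pure spatial rotation
            twist[3:] = intensity * unit
    # unintentional gesture -> zero motion (twist stays all zeros)
    return twist

# Example: a linear gesture along +x at 0.05 m/s commands pure translation.
print(gesture_to_twist("linear", np.array([1.0, 0.0, 0.0]), 0.05))
```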
3. Method
3.1. Floating Body-Fixed Frame
3.2. Bi-Directional LSTM-Based Hand-Guiding Gesture Classification
3.3. Hand-Guiding Gesture’s State Estimation
4. Experiments
4.1. Gesture Recognition
4.2. Gesture State Estimation
- Testbench: within the testbench, six OptiTrack Prime 13 cameras, two retro-reflective marker sets (one on the hand, one on the manipulator’s EEF), and a workstation running MOTIVE 2.1.1 (NaturalPoint Inc., Corvallis, OR, USA) are installed. The pose of the marker-fixed frame is measured with respect to the OptiTrack-fixed frame {Of}, which is defined by an L-shaped calibration square (CS-200) placed within the motion-capture volume.
- Hand trajectory: a wireless IMU sensor is strapped to the back of the subject’s pelvis and another to the right hand, and a marker set is attached to the right hand to measure its position with respect to {Of}.
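One way the reported spatial-direction error could be computed against the OptiTrack ground truth, shown as a rough sketch (the function and variable names are assumptions, not the paper’s evaluation code):

```python
# Hedged sketch: angular error (in degrees) between an IMU-estimated gesture
# direction and the motion-capture ground-truth direction expressed in {Of}.
import numpy as np

def direction_error_deg(d_est: np.ndarray, d_gt: np.ndarray) -> float:
    cos_angle = np.dot(d_est, d_gt) / (np.linalg.norm(d_est) * np.linalg.norm(d_gt))
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

# Example: two nearly parallel directions differ by well under 1 degree.
print(direction_error_deg(np.array([1.0, 0.001, 0.0]), np.array([1.0, 0.0, 0.0])))
```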
4.3. Validation of Hand-Guiding Gesture-Based Telemanipulation
5. Results and Discussion
- Using the floating body-fixed frame, the hand-guiding gesture mode can be classified with 84.5% accuracy in real time, even during the operator’s dynamic movement scenarios.
- The spatial direction of hand-guiding gestures could be estimated with an error of less than 1 degree.
- The gesture intensity of hand-guiding gestures could be successfully estimated with a speed estimator and fine-tuned with a scaling factor (a minimal sketch follows this list).
- Finally, on the first trial, a subject could place the EEF within an average of 83 mm and 2.56 degrees of the target pose using fewer than ten consecutive hand-guiding gestures and visual inspection only.
- Intuitive user interface (UI) with AR-assisted devices: the current system allows unskilled subjects to place the EEF within an average of 83 mm and 2.56 degrees of the goal pose, but there is room for improvement on the UI side. Future work should therefore investigate more intuitive control interfaces or improved feedback systems to help the operator guide the manipulator at remote sites more accurately.
- Constraints on manipulator motion: in the current study, the position and orientation of the manipulator’s EEF can be controlled separately, as intended by the user, but not simultaneously. Future work should therefore add hand-guiding gesture types that control the position and orientation of the EEF at the same time.
- Integration with force feedback: Although the current research does not include haptic feedback, force feedback could offer the operator a more immersive and intuitive control experience.
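Referring to the speed-estimator bullet above, a minimal sketch of one way the gesture intensity could be derived and scaled; the integration scheme, the default scaling factor, and the saturation limit are assumptions for illustration, not the paper’s estimator:

```python
# Hedged sketch: estimate hand speed from gravity-compensated accelerations
# expressed in the floating body-fixed frame, then scale it to an EEF speed.
import numpy as np

def estimate_gesture_speed(acc: np.ndarray, dt: float) -> float:
    """acc: (N, 3) gravity-compensated accelerations over one gesture window."""
    vel = np.cumsum(acc * dt, axis=0)                # simple rectangular integration
    return float(np.linalg.norm(vel, axis=1).max())  # peak hand speed in the window

def to_robot_speed(hand_speed: float, scale: float = 0.5, v_max: float = 0.25) -> float:
    """Scale the estimated hand speed and saturate it at a safe EEF speed limit."""
    return min(scale * hand_speed, v_max)

# Example: a 40-sample gesture window at 100 Hz with constant 1 m/s^2 acceleration.
acc = np.tile([1.0, 0.0, 0.0], (40, 1))
print(to_robot_speed(estimate_gesture_speed(acc, dt=0.01)))
```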
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Kumar, N.; Lee, S.C. Human-machine interface in smart factory: A systematic literature review. Technol. Forecast. Soc. Chang. 2022, 174, 121284.
2. Nuzzi, C.; Pasinetti, S.; Lancini, M.; Docchio, F.; Sansoni, G. Deep learning-based hand gesture recognition for collaborative robots. IEEE Instrum. Meas. Mag. 2019, 22, 44–51.
3. Fang, W.; Ding, Y.; Zhang, F.; Sheng, J. Gesture recognition based on CNN and DCGAN for calculation and text output. IEEE Access 2019, 7, 28230–28237.
4. Jiang, D.; Li, G.; Sun, Y.; Kong, J.; Tao, B. Gesture recognition based on skeletonization algorithm and CNN with ASL database. Multimedia Tools Appl. 2019, 78, 29953–29970.
5. Suarez, J.; Murphy, R.R. Hand gesture recognition with depth images: A review. In Proceedings of the 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, Paris, France, 13 September 2012; pp. 411–417.
6. Mazhar, O.; Navarro, B.; Ramdani, S.; Passama, R.; Cherubini, A. A real-time human-robot interaction framework with robust background invariant hand gesture detection. Robot. Comput. Manuf. 2019, 60, 34–48.
7. CMU-Perceptual-Computing-Lab/Openpose. GitHub. Available online: https://github.com/CMU-Perceptual-Computing-Lab/openpose (accessed on 13 March 2023).
8. OpenSign—Kinect V2 Hand Gesture Data—American Sign Language. NARCIS. Available online: https://www.narcis.nl/dataset/RecordID/oai%3Aeasy.dans.knaw.nl%3Aeasy-dataset%3A127663 (accessed on 4 April 2023).
9. Zhou, D.; Shi, M.; Chao, F.; Lin, C.M.; Yang, L.; Shang, C.; Zhou, C. Use of human gestures for controlling a mobile robot via adaptive CMAC network and fuzzy logic controller. Neurocomputing 2018, 282, 218–231.
10. Bouteraa, Y.; Ben Abdallah, I.; Ghommam, J. Task-space region-reaching control for medical robot manipulator. Comput. Electr. Eng. 2018, 67, 629–645.
11. Vogel, J.; Castellini, C.; van der Smagt, P. EMG-based teleoperation and manipulation with the DLR LWR-III. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011.
12. Bouteraa, Y.; Ben Abdallah, I. A gesture-based telemanipulation control for a robotic arm with biofeedback-based grasp. Ind. Robot. Int. J. 2017, 44, 575–587.
13. Chico, A.; Cruz, P.J.; Vasconez, J.P.; Benalcazar, M.E.; Alvarez, R.; Barona, L.; Valdivieso, A.L. Hand Gesture Recognition and Tracking Control for a Virtual UR5 Robot Manipulator. In Proceedings of the 2021 IEEE Fifth Ecuador Technical Chapters Meeting (ETCM), Cuenca, Ecuador, 12–15 October 2021.
14. Kulkarni, P.V.; Illing, B.; Gaspers, B.; Brüggemann, B.; Schulz, D. Mobile manipulator control through gesture recognition using IMUs and Online Lazy Neighborhood Graph search. Acta IMEKO 2019, 8, 3–8.
15. Shintemirov, A.; Taunyazov, T.; Omarali, B.; Nurbayeva, A.; Kim, A.; Bukeyev, A.; Rubagotti, M. An open-source 7-DOF wireless human arm motion-tracking system for use in robotics research. Sensors 2020, 20, 3082.
16. Roetenberg, D.; Luinge, H.; Slycke, P. Xsens MVN: Full 6DOF Human Motion Tracking Using Miniature Inertial Sensors; Technical Report; Xsens Motion Technologies B.V.: Enschede, The Netherlands, 2013; pp. 1–7.
17. Luinge, H.; Veltink, P.; Baten, C. Ambulatory measurement of arm orientation. J. Biomech. 2007, 40, 78–85.
18. Van der Helm, F.C.T.; Pronk, G.M. Three-dimensional recording and description of motions of the shoulder mechanism. J. Biomech. Eng. 1995, 117, 27–40.
19. Zhang, J.-T.; Novak, A.C.; Brouwer, B.; Li, Q. Concurrent validation of Xsens MVN measurement of lower limb joint angular kinematics. Physiol. Meas. 2013, 34, N63–N69.
20. Kim, M.; Lee, D. Wearable inertial sensor based parametric calibration of lower-limb kinematics. Sens. Actuators A Phys. 2017, 265, 280–296.
21. Jeon, H.; Choi, H.; Noh, D.; Kim, T.; Lee, D. Wearable Inertial Sensor-Based Hand-Guiding Gestures Recognition Method Robust to Significant Changes in the Body-Alignment of Subject. Mathematics 2022, 10, 4753.
22. Yuan, Q.; Chen, I.-M. Human velocity and dynamic behavior tracking method for inertial capture system. Sens. Actuators A Phys. 2012, 183, 123–131.
23. Lynch, K.M.; Park, F.C. Modern Robotics; Cambridge University Press: Cambridge, UK, 2017.
24. Yoo, M.; Na, Y.; Song, H.; Kim, G.; Yun, J.; Kim, S.; Moon, C.; Jo, K. Motion estimation and hand gesture recognition-based human–UAV interaction approach in real time. Sensors 2022, 22, 2513.
25. Chamorro, S.; Collier, J.; Grondin, F. Neural network based lidar gesture recognition for real-time robot teleoperation. In Proceedings of the 2021 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), New York City, NY, USA, 25–27 October 2021.
26. Se-Yun, J.; Kim, E.-S.; Park, B.Y. CNN-based hand gesture recognition method for teleoperation control of industrial robot. IEMEK J. Embed. Syst. Appl. 2021, 16, 65–72.
27. Kim, E.; Shin, J.; Kwon, Y.; Park, B. EMG-Based Dynamic Hand Gesture Recognition Using Edge AI for Human–Robot Interaction. Electronics 2023, 12, 1541.
28. Cruz, P.J.; Vásconez, J.P.; Romero, R.; Chico, A.; Benalcázar, M.E.; Álvarez, R.; López, L.I.B.; Caraguay, L.V. A Deep Q-Network based hand gesture recognition system for control of robotic platforms. Sci. Rep. 2023, 13, 7956.
| Model Size | Input Sequence Length | GPU Memory | Latency |
|---|---|---|---|
| 114 kB | 40 | 48 GB (RTX 3090) | 1.3 |
| Optimizer | Learning Rate | Batch Size | Epochs |
|---|---|---|---|
| Adam | 0.001 | 5000 | 1000 |
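For illustration only, the training configuration above (input sequence length 40, Adam, learning rate 0.001, batch size 5000) could be wired into a bidirectional LSTM classifier roughly as follows; the framework (PyTorch), the per-frame feature dimension, and the hidden size are assumptions, not values taken from the paper:

```python
# Hedged sketch of a bidirectional LSTM gesture-mode classifier
# (linear / angular / unintentional). Feature dimension and hidden size are assumptions.
import torch
import torch.nn as nn

class BiLSTMGestureClassifier(nn.Module):
    def __init__(self, n_features: int = 6, hidden: int = 32, n_classes: int = 3):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)   # 2x hidden: forward + backward passes

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(x)            # (batch, 40, 2 * hidden)
        return self.fc(out[:, -1, :])    # classify from the final time step

model = BiLSTMGestureClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on dummy data: a batch of 5000 windows of 40 frames.
x = torch.randn(5000, 40, 6)
y = torch.randint(0, 3, (5000,))
loss = criterion(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```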
| Training | | | Test | |
|---|---|---|---|---|
| No. of Subjects | No. of Training Sets | No. of Validation Sets | No. of Subjects | No. of Datasets |
| 6 | 163,375 | 108,276 | 3 | 90,547 |
| Training and Test Classification Accuracy [%] (training / test) | | | |
|---|---|---|---|
| Linear Gestures | Angular Gestures | Unintentional Gestures | Total |
| 99.7 / 88.7 | 98.4 / 85.6 | 97.6 / 74.4 | 98.6 / 82.9 |
| Real-Time Classification Accuracy During Dynamic Movement [%] | | | |
|---|---|---|---|
| Linear Gestures | Angular Gestures | Unintentional Gestures | Total |
| 89.4 | 84.5 | 79.6 | 84.5 |
| Gesture Type | Accuracy [%] | Controllability | | |
|---|---|---|---|---|
| | | Spatial Direction | Spatial Displacement | Limitation |
| (Dynamic) Hand gesture (proposed) | 84.5 | O | O | Decoupling of controllable motion into spatial linear and angular components |
| (Static) Hand gesture [24] | 91.7 | × | × | Predefined motion only for UAVs (e.g., move forward/backward, left/right, ascend/descend) |
| (Static & Dynamic) Arm gesture [25] | 93.1 | × | × | Predefined motion only for vehicles (e.g., stop, go forward fast, go forward while turning left/right, etc.) |
| (Static) Hand gesture [26] | 88.0 | × | × | Linear motion limited to the left and right directions only |
| (Static) Hand gesture [27] | 96.0 | × | × | Predefined motion (e.g., close, open, rest, supination, fist, etc.) |
| (Static) Arm & hand gestures [28] | 88.1 | × | × | Predefined motion (e.g., wave in/out for selection of a new orientation reference, open for increasing/decreasing speed, etc.) |
Share and Cite
MDPI and ACS Style: Choi, H.; Jeon, H.; Noh, D.; Kim, T.; Lee, D. Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors. Mathematics 2023, 11, 3514. https://doi.org/10.3390/math11163514
AMA Style: Choi H, Jeon H, Noh D, Kim T, Lee D. Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors. Mathematics. 2023; 11(16):3514. https://doi.org/10.3390/math11163514
Chicago/Turabian Style: Choi, Haegyeom, Haneul Jeon, Donghyeon Noh, Taeho Kim, and Donghun Lee. 2023. "Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors" Mathematics 11, no. 16: 3514. https://doi.org/10.3390/math11163514
APA Style: Choi, H., Jeon, H., Noh, D., Kim, T., & Lee, D. (2023). Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors. Mathematics, 11(16), 3514. https://doi.org/10.3390/math11163514