Article

A Carpometacarpal Thumb Tracking Device for Telemanipulation of a Robotic Thumb: Development, Prototyping, and Evaluation

by
Abdul Hafiz Abdul Rahaman
and
Panos S. Shiakolas
*,†
Mechanical and Aerospace Engineering Department, The University of Texas at Arlington, Arlington, TX 76019, USA
*
Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Appl. Sci. 2025, 15(3), 1301; https://doi.org/10.3390/app15031301
Submission received: 20 November 2024 / Revised: 13 January 2025 / Accepted: 14 January 2025 / Published: 27 January 2025
(This article belongs to the Special Issue Human–Robot Collaboration and Its Applications)

Abstract

Hand-tracking systems are widely employed for telemanipulating grippers with high degrees of freedom (DOFs) such as an anthropomorphic robotic hand (ARH). However, tracking human thumb motion is challenging due to the complex motion of the carpometacarpal (CMC) joint. Existing hand-tracking systems can track the motion of simple joints with one DOF, but most fail to track the motion of the CMC joint or require expensive, intricately configured hardware to do so. This research introduces and realizes an affordable and personalizable tracking device to capture the CMC joint flexion/extension and abduction/adduction motions. Tracked human thumb motion is mapped to a robot thumb in a hybrid approach: the proposed algorithm maps the CMC joint motion to the first two joints of the robot thumb, while joint mapping is established between the metacarpophalangeal and interphalangeal joints and the last two joints. When the tracking device is paired with a flex glove outfitted with bend sensors, the developed system provides the means to telemanipulate an ARH with a four-DOF thumb and one-DOF underactuated digits. A three-stage framework is proposed to telemanipulate the fully actuated robot thumb. The tracking device and framework were evaluated through a device operation and personalization test, as well as a framework verification test. Two volunteers successfully personalized, calibrated, and tested the device using the proposed mapping algorithm. One volunteer further evaluated the framework by performing hand poses and grasps, demonstrating effective control of the robot thumb for precision and power grasps in coordination with the other digits. The successful results support expanding the system and further evaluating it as a research platform for studying human–robot interaction in grasping tasks or in manufacturing, assistive, or medical domains.

1. Introduction

An anthropomorphic robotic hand (ARH) is an effective gripper for robots, allowing interaction with objects in human-designed environments. Advancements in 3D printing have made ARH manufacturing feasible and affordable, driving research into improved hardware and control strategies for applications in prosthetics, assistive robotics, and remote operations [1,2,3,4]. ARHs have been developed and utilized for collaborative teleoperation for bimanual manipulations in unstructured tasks [5]. Because an ARH typically features at least 15 finger joints, manually controlling one can be complex, necessitating specialized interface systems to enhance user control. Manipulating an ARH with handheld devices like joystick controllers can be difficult because of the numerous degrees of freedom (DOFs), prompting the use of intuitive hand-tracking systems. Thus, researchers have developed control interface systems such as hand-tracking systems [6,7,8] and kinematic mapping methods for the telemanipulation of ARHs [4,9,10,11]. Tracking the human thumb is critical in human–robot interaction (HRI) tasks such as telemanipulating an ARH to grasp and manipulate objects [10]. The human thumb is the most complex digit of the hand due to the carpometacarpal (CMC) joint, which plays a significant role in grasping. Available hand-tracking systems can track the motion of simple joints with one DOF, but most either fail to track the motion of the CMC joint or rely on expensive hardware to track its complex motions. Hence, an affordable, personalizable, and easily prototyped tracking device capable of reliably capturing both the CMC joint flexion/extension (FE) and abduction/adduction (AA) motions is necessary for the telemanipulation of a fully actuated thumb. An HRI research platform was developed to investigate, experiment with, and develop various interaction modalities for an ARH [12]. The ARH used in this work is based on the InMoov hand [13], where all the digits are underactuated. However, the underactuated InMoov thumb has been replaced with a new thumb with four rotational joints, each independently controlled with a joint-attached actuator.
Human and robot thumb kinematics differ due to joint configuration, DOFs, and link dimensions. A joint space mapping between the human and robot thumb motion is not desirable, since it produces different poses due to their kinematic differences and makes telemanipulation less intuitive [9]. Thus, a suitable kinematic mapping approach is required for the telemanipulation of the ARH using the tracked thumb motion. In general, a telemanipulation system for a fully actuated robot thumb requires a tracking device capable of capturing the CMC, MCP, and IP joint motions, as well as a kinematic mapping approach to map the human to the robot motion. Also, it is necessary to develop a framework to integrate the functional components of the telemanipulation system to estimate the human thumb pose, establish the mapping between human and robot thumb motion, and actuate the robot thumb.
In this research, an affordable, 3D-printable, personalizable tracking device capable of capturing the complex motion of the human thumb CMC joint is introduced. The tracking device is designed with personalization in mind to ensure proper fitment to different human thumb sizes. The realized device is then employed along with a flex glove to telemanipulate the fully actuated four-DOF thumb of the developed ARH. The flex glove, developed in our lab [14,15], is a glove outfitted with five bendable variable-resistor (flex) sensors and is used to track the combined FE motion of the metacarpophalangeal (MCP) and interphalangeal (IP) joints of each finger and the thumb.
A kinematic model of the human thumb is presented to estimate its joint angles from the tracking device measurements. The tracked human thumb motion is mapped to the robot thumb in a hybrid approach: the proposed algorithm maps the CMC joint motion to the first two joints, while a joint space mapping is used between the distal human MCP and IP joints and the last two joints. The algorithm implements a Cartesian space mapping and accounts for kinematic differences between the human CMC joint and the first two joints of the robot thumb. A framework is proposed to telemanipulate the robot thumb using the CMC tracking device in three stages: (1) device calibration and estimation of human thumb joint angles, (2) kinematic mapping of the human to the robot thumb, and (3) motion control.
The personalization of the tracking device to accommodate different thumb sizes and the proposed mapping algorithm were evaluated by two volunteers. The volunteers were able to personalize and calibrate the CMC tracking device and then use the developed system to oppose a cylinder against the palm of the ARH by controlling only the first two robot thumb joints. The performance of the overall system following the proposed framework was evaluated by one of the volunteers. The system tracked the volunteer's thumb motion and mapped it to control the robot thumb to perform prescribed poses as well as power and precision grasps.
The manuscript is organized as follows. The background and related literature are discussed in Section 2. The kinematic models of the human and robot thumbs are presented in Section 3, and the tracking device development and functionality are discussed in Section 4. The developed framework is introduced in Section 5. The evaluation of the performance of the tracking device and the telemanipulation system following the proposed framework is presented in Section 6. The manuscript closes with conclusions and remarks for future research.

2. Background and Related Literature

Researchers have been investigating and developing various hand-tracking systems [4,6,8,10,16,17,18,19,20,21,22,23] that find applications in fields such as space exploration, surgical robotics, search and rescue operations, and manufacturing. These systems can be classified as either contactless or wearable.
Contactless tracking systems are usually vision-based, with recent versions using non-contact inductive sensors to track hand gestures [23]. Vision-based systems typically use multiple cameras (RGB, depth, or infrared sensors) or a combination of them [6,16]. Kuo et al. [18] developed a vision-based system to estimate the motion of human finger and thumb joints by tracking reflective markers placed on the joints. Handa et al. [4] developed a markerless vision-based hand-tracking system to telemanipulate a robot hand using four RGB-D cameras. They used point cloud data to train a neural network to estimate the hand pose. Commercially available vision-based systems that track markers accurately are generally expensive; a basic vision-based motion capture system with software, such as VICON, cost approximately 12,500 USD in 2023 [24]. Furthermore, vision-based systems experience issues such as motion blur, occlusions, and a limited workspace. Systems with multiple cameras must be deployed at known fixed locations, making them not easily portable and increasing the system cost.
An alternative approach has been to use other sensors, such as magnetic sensors, optical linear encoders, potentiometers, inertial measurement units (IMUs), and flex and stretch sensors, to develop wearable hand-tracking systems [8,19,20,21,22,25,26]. Park et al. [19] used linear potentiometers to capture the motion of the human hand but were able to capture only the FE motion of the CMC joint. Fang et al. [22] developed a glove using magnetic and IMU sensors and used quaternion-based extended Kalman filters to estimate the kinematic model parameters for hand-arm tracking. IMUs and magnetic sensors generally require frequent recalibration due to the accumulation of errors from sensor drift over time.
Researchers have also investigated the use of soft or stretch sensors in wearable hand-tracking systems due to advancements in soft sensor technology and their ability to allow unrestricted finger motion [27,28,29,30,31]. Glauser et al. [8] developed a glove outfitted with 44 stretchable capacitive silicone sensors that captures finger motion using a deep network architecture to reconstruct sensed skin deformations. Under cyclic stretching/releasing, stretch sensors recover slowly and suffer from hysteresis, signal drift, and sensitivity to environmental influences [6,32]. Flex or bend sensors have been successful in reliably tracking the bending of one-DOF hand joints [25,26] and have been adopted by commercial gloves such as the CyberGlove® [33], the Manus VR glove [34], and the VRFree glove [35]; the latter two also use IMU sensors in addition to flex sensors to track hand motion. In general, hand-tracking systems are widely employed for the telemanipulation of ARHs. However, CMC joint tracking remains challenging and requires expensive hardware or setups that are not easily personalized.
Regardless of the tracking system used, the captured human hand motion cannot be directly mapped to a robot thumb due to kinematic differences such as joint configuration, DOFs, dimensions, workspace, and user-specific calibration [9]. The most commonly used mapping approaches are joint space and Cartesian space. Joint space mapping is simple, but it does not provide direct control over the finger/thumb tip location and might result in different poses depending on kinematic parameter and configuration differences; it is mostly suited for power grasps. Cartesian space mapping is used to map human fingertip positions to those of a robot hand and is useful for manipulation tasks where precision grasps are desired [36]. Cartesian mapping for telemanipulation can be implemented using either numerical [36,37] or data-driven approaches [4,11,38]. Numerical methods are easy to implement but often require heuristic modeling to accommodate kinematic differences. In contrast, data-driven approaches need large labeled datasets for training but can leverage valuable features like approach vectors and fingertip positions. Although both joint and Cartesian space mapping can be adapted to a robot thumb with kinematic dissimilarities, they can yield robot thumb poses that differ from the human thumb pose, which makes telemanipulation less intuitive [39,40].
Researchers have proposed hybrid mapping approaches that provide intuitive control for telemanipulation. Chattaraj et al. [39] developed a hybrid mapping between human and robot hands that switches between joint and Cartesian space mapping depending on the executed grasp type: joint space for power grasps and Cartesian space for precision grasps. Meattini et al. [40] developed a hybrid mapping approach that transitions from joint space to Cartesian space mapping depending on the distance between the thumb and fingertips.
In our literature review, we identified various tracking systems and mapping approaches that can effectively track human thumb movements and telemanipulate a robot thumb with kinematic differences. However, employing these systems depends on factors such as cost, portability, and output quality. While wearable systems are less expensive and more portable than vision systems, they generally do not track the entire CMC joint motion without expensive hardware. As such, a cost-effective device capable of accurately capturing both the FE and AA motion of the human CMC joint, along with a suitable kinematic mapping algorithm to the robot thumb, would be desirable and beneficial for research purposes.

3. Kinematic Models of the Human and Robot Thumbs

Human thumb anatomy and motion are more complex than those of the other hand digits. The four thumb bones, starting at the wrist, are the trapezium, first metacarpal, proximal phalange, and distal phalange, and the three thumb joints are the carpometacarpal (CMC), metacarpophalangeal (MCP), and interphalangeal (IP) joints, as shown in Figure 1a.
The saddle-shaped CMC joint, also known as the trapeziometacarpal (TM) joint, has three DOFs and allows the thumb to perform complex motions, thus enabling the hand to engage with a variety of objects through grasps such as power, pinch, or lateral grasps. The majority of clinical thumb models identify the CMC joint movements as flexion/extension (FE), abduction/adduction (AA), and pronation/supination (PS) [41,42,43,44,45].
Even though the metacarpal bone has three motion modes (AA, FE, and PS), it only has two degrees of actuation, since the PS motion is coupled with the FE and AA motions according to Cooney et al. [46]. Hollister et al. [41], using cadaver thumbs, found that the thumb FE and AA axes are non-orthogonal and non-intersecting, making accurate modeling of the thumb more challenging than the other digits. The thumb MCP and IP joints have two DOFs (FE and AA) and one DOF (FE), respectively [42,47].
Developing an exact kinematic model of the CMC joint is difficult due to its complex motion. As such, various approximate kinematic joint definitions have been used to describe its motion [20,21,48,49]. For example, the CyberGlove® [21,45] uses an approximate thumb model, where the CMC joint FE and AA axes are considered orthogonal and intersecting, and the FE and PS motions are coupled and defined as a roll motion instead. The model is based on the location of the flex sensors used to track the CMC joint motion. Cerulo et al. [49] developed a vision-based hand-tracking system to teleoperate an underactuated ARH. They modeled the human thumb as a kinematic chain with two DOFs (FE and AA) for the CMC joint and one DOF (FE) for each of the MCP and IP joints. They assumed the CMC joint PS motion and the MCP joint AA motion to be fixed and coupled the FE motion of the MCP joint with that of the IP joint.
In this research, the developed human thumb kinematic model is based on clinical anatomical models (see Figure 1a) with assumptions. The FE and AA axes are assumed to be intersecting and orthogonal. The PS motion is considered fixed due to its relatively small range (∼17 deg). The FE axes for each of the CMC, MCP, and IP joints are considered to be parallel to each other, as shown in Figure 1b,c. The thumb kinematic base coordinate system is established at the intersection of the thumb and index metacarpal bones, as shown in Figure 1b.
The human thumb is kinematically modeled as a serial linkage with four rotational joints and is shown schematically in Figure 1c. The CMC joint is modeled as two serially intersecting orthogonal rotational joints, J_FE and J_AA, connected by the metacarpal phalange. The kinematic parameters of the human thumb are defined following the modified Denavit–Hartenberg (mDH) kinematic convention [50] and presented in Table 1. L_mp, L_pp, and L_dp are the lengths of the metacarpal, proximal phalange, and distal phalange, respectively, and L_t is the offset distance to the contact surface point of the thumb.
The MCP joint of each of the remaining fingers (index, middle, ring, and pinky), similar to that of the thumb, is a two-degree-of-freedom joint. However, the abduction/adduction motion is small, and it is reasonable to model the MCP as a single degree of freedom for the flexion/extension motion, as shown in Figure 1b for the index finger. The FE axes of the distal IP, proximal IP, and MCP joints of the fingers are modeled as simple one-DOF parallel rotational joints, as presented by Rohling et al. [51] and shown for the index finger in Figure 1b. The kinematic models of the human thumb and the index finger shown in Figure 1b do not consider soft tissue deformation or bone-on-bone sliding effects.
The robot thumb is a four-link serial manipulator with four rotational joints (see Figure 2d), and its solid model and joint motion axes are shown in Figure 2c. The 3D-printed robot thumb is outfitted with an individually controlled servo motor for each joint, as shown in Figure 2b. The kinematic parameters of the robot thumb following the mDH notation are presented in Table 2, with the values obtained from the 3D solid model. L_1, L_2, L_3, and L_4 are the link lengths, and L_5 is the offset distance to the contact surface point of the robot thumb.
The kinematic models of the human and robot thumbs have the same number of joints but different joint configurations, ranges of motion, and link dimensions.
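As an illustration of how mDH parameters such as those in Tables 1 and 2 define each thumb model, the following Python sketch chains modified-DH link transforms to compute a tip position. The parameter values below are placeholders for illustration only, not the values from Table 1, Table 2, or the 3D solid model.

```python
import numpy as np

def mdh_transform(alpha, a, theta, d):
    """Modified Denavit-Hartenberg link transform (Craig convention):
    rotate alpha about x_{i-1}, translate a along x_{i-1},
    rotate theta about z_i, translate d along z_i."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct,      -st,      0.0,  a],
        [st * ca,  ct * ca, -sa, -sa * d],
        [st * sa,  ct * sa,  ca,  ca * d],
        [0.0,      0.0,      0.0, 1.0],
    ])

def tip_position(mdh_rows, joint_angles):
    """Chain the per-joint transforms and return the tip position.
    mdh_rows: list of (alpha, a, d, theta_offset) tuples."""
    T = np.eye(4)
    for (alpha, a, d, theta0), q in zip(mdh_rows, joint_angles):
        T = T @ mdh_transform(alpha, a, theta0 + q, d)
    return T[:3, 3]

# Placeholder 4R chain (illustrative values only, not the paper's tables)
rows = [(0.0,       0.00, 0.0, 0.0),
        (np.pi / 2, 0.05, 0.0, 0.0),
        (0.0,       0.04, 0.0, 0.0),
        (0.0,       0.03, 0.0, 0.0)]
print(tip_position(rows, [0.0, 0.0, 0.0, 0.0]))  # tip at [0.12, 0, 0]
```

For the zero pose, the tip position is simply the sum of the link-length offsets along the base X axis, which serves as a quick sanity check of any mDH table.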

4. Thumb Tracking Device Development and Functionality

The tracking device is used to capture the complex CMC joint motion while being modular, ergonomic, easily personalized to different-sized thumbs, 3D-printable, and inexpensive. The combined motion of the MCP and IP thumb joints is captured using a single flex sensor attached to the dorsal side of the flex glove thumb, as shown in Figure 3. The sensor signal acquisition, processing, computation, and control of the robot thumb were performed on a myRIO-1900 microcontroller by National Instruments (NI) [52] programmed in NI LabVIEW [53]. The prototyped device and the flex glove worn by a user are shown in Figure 3.

4.1. Device Components and Their Function

The structural components of the CMC tracking device were 3D-printed for ease of manufacturing without the need for specialized equipment. These are the bracelet, pot joint, connector linkage, and thumb ring, as shown in Figure 3. The bracelet is the base of the device and is firmly attached to the human wrist using a Velcro strap. The thumb ring attaches to the MCP knuckle. The pot joint tracks the angular motion of the two-DOF CMC joint and is shown in Figure 3. The connector linkage connects the pot joint and thumb ring to transfer the CMC joint motion to the pot joint and provides the means for personalization for different thumb sizes.
The bracelet, pot joint, connector linkage, and the metacarpal (along with the CMC joint FE axis) establish a four-bar linkage mechanism, as presented in Figure 4. Thus, the CMC joint FE and AA motions are transferred to the pot joint and measured by their respective axis potentiometers. The FE and AA axis rotary potentiometers are attached to the pot joint (model PT6MV-103A2020; 0 to 90° rotation, 0–10 kΩ resistance, and 6.3 mm diameter [54]).

4.2. Device Personalization

The components of the tracking device are modular and provide the means to personalize the device to different thumb sizes. The bracelet consists of a ball–socket joint and a prismatic joint for personalizing the device. The pot joint attaches to the bracelet using the lockable ball–socket joint (see Figure 3), which provides the ability to adjust the pot joint orientation such that the user CMC joint motion range remains within the operational range of the pot joint. The lockable prismatic joint (see Figure 3) allows the user to adjust the height of the pot joint relative to the wrist to avoid interfering with the thumb during operation. Both the ball–socket joint and the prismatic joint are locked at a desired configuration using thumb screws. The connector linkage is designed with three connecting offsets/locations to adjust the link length (b) of the four-bar mechanism (see Figure 4) to accommodate the variations in metacarpal lengths of different users. The thumb ring should be worn on the metacarpal phalange close to the MCP knuckle (between the MCP and CMC joints) to ensure that only the CMC joint motion is captured while maximizing the operating range of motion of the pot joint. The attachment location of the thumb ring makes a rigid ring design impractical, whereas the use of elastic bands or Velcro makes the attachment possible without causing discomfort. The elastic bands provide the means to easily and quickly adapt and personalize the device to accommodate different thumb girths. Overall, the CMC tracking device offers several personalization options to accommodate different-sized thumbs without hindering its operation.

5. Framework

The telemanipulation of the robot thumb using the developed CMC tracking device and the flex glove is accomplished through a three-stage framework, as shown in Figure 5. The three stages are (1) device calibration and estimation of human thumb joint angles, (2) kinematic analysis to map human thumb to robot thumb joint motion, and (3) robot thumb motion control.

5.1. Stage 1: Device Calibration and Estimation of Human Thumb Joint Angles

The developed device needs to be personalized to the user, and as such, the first step in the calibration procedure is to obtain a set of user-specific measurements. The user-specific limits for the MCP (θ_MCP,min/max) and IP (θ_IP,min/max) joints are measured using a goniometer, and their ranges of motion are determined. The location of the CMC joint is identified by extending the thumb and palpating the prominent bony structure at its base, where the first metacarpal meets the trapezium bone. Then, the metacarpal length (L_mp) and MCP knuckle thickness (t) are measured using a vernier caliper/ruler. The user must wear the device using the bracelet such that the AA pot joint axis is visually parallel to, and coincident with or as close as possible to, the CMC AA axis. Based on the personalization procedure discussed in Section 4.2, the user must adjust the device to ensure an unrestricted range of motion for the thumb while performing a set of prescribed pure FE or AA motions. The user must perform all the prescribed motions and record the FE and AA pot joint potentiometer readings at their respective limits for each thumb motion. Then, the user must measure the height of the AA pot joint center from the CMC joint (g) along the direction of pot joint height adjustment (see Figure 3). Next, the user must flex/extend the MCP and IP joints and record the flex sensor readings at their motion limits.
After calibration, the pot joint FE and AA potentiometer readings must be correlated to the human CMC joint motion. When a user wears the tracking device, the FE motion of the thumb is transferred to the FE axis pot joint through an equivalent four-bar kinematic linkage, shown in Figure 4, with links a, f, b, and g. The CMC FE joint angle, θ_FE, is the angular displacement of link a relative to X_B (the X axis of the base frame in Figure 4) and is estimated using Equation (1):
θ_FE = α + μ
where α is the angular displacement of link a relative to link g, and μ = arctan(t_0/L_mp) is the angular offset of the metacarpal phalange from link a. Note that t_0 (= t/2 + t_1) is the position offset of the thumb ring pivot along the negative Y_FE direction, where t is the thickness of the MCP knuckle, and t_1 is the thumb ring pivot joint offset from the base of the ring to the joint center along the X_FE direction (see Figure 4), which is obtained from the 3D model of the thumb ring. The angular displacement (α) of link a is estimated using Equation (2):
α = arccos((g² + d² − b²)/(2gd)) + arccos((a² + d² − f²)/(2ad))
where d² = g² + b² − 2gb·cos(π − θ_pot,FE), θ_pot,FE is the angular displacement of the FE pot joint relative to X_B (the X axis of the kinematic base frame in Figure 4), and a² = t_0² + L_mp². The link lengths f and b are obtained from the 3D model of the tracking device. The FE axis pot joint angular displacement, θ_p,FE, is estimated using Equation (3):
θ_p,FE = ((v_FE − v_FE,min)/Δv_FE)·Δθ_p,FE + θ_p,FE,min
where Δθ_p,FE = (θ_p,FE,max − θ_p,FE,min) ≈ 90° is the range of FE pot joint motion, θ_p,FE,min = −45° is the minimum angular displacement, and Δv_FE = (v_FE,max − v_FE,min) ≈ 1.6 V and v_FE,min ≈ 1.7 V are the voltage range and minimum voltage reading, respectively. These values are based on the 3D solid model constraints. v_FE is the voltage reading of the potentiometer associated with the FE motion.
The personalization of the device for adjusting the orientation of the pot joint introduces a rotational offset in the FE potentiometer (θ_p,FE,0). Post-personalization, the FE potentiometer is rotated until the AA axis is parallel to X_B, and the estimated angular displacement is recorded to determine θ_p,FE,0. Then, the corrected angular displacement of the FE potentiometer relative to X_B is evaluated as θ_pot,FE = θ_p,FE − θ_p,FE,0. The AA potentiometer reading is calibrated similarly to the FE one, and its angular displacement is estimated using Equation (4):
θ_pot,AA = ((v_AA − v_AA,min)/Δv_AA)·Δθ_pot,AA + θ_pot,AA,min
where Δθ_pot,AA = (θ_pot,AA,max − θ_pot,AA,min) ≈ 90° is the range of motion, θ_pot,AA,min = −45° is the minimum angular displacement measured using a protractor, v_AA is the voltage reading of the potentiometer associated with the AA motion, and Δv_AA = (v_AA,max − v_AA,min) ≈ 1.6 V and v_AA,min = 1.7 V are its voltage range and minimum voltage reading, respectively.
The CMC joint AA motion is estimated from the angular rotation of the AA axis pot joint. The CMC AA joint angle is measured relative to X_FE and is estimated from the rotation of the AA axis potentiometer as θ_AA = θ_pot,AA − θ_pot,AA,0, where θ_pot,AA,0 is the rotational offset introduced during personalization and is equal to the estimated angular displacement of the AA potentiometer when the CMC joint is completely adducted.
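To make the calibration chain concrete, the sketch below implements the voltage-to-angle map of Equations (3) and (4) and the four-bar relations of Equations (1) and (2) in Python. All numeric constants (link lengths, offsets, voltage range) are illustrative placeholders, not the calibrated values of the actual device or of any user.

```python
import numpy as np

# Illustrative device/user constants (placeholders, not measured values)
L_MP = 0.046            # first metacarpal length L_mp (m)
T_0 = 0.012             # thumb-ring pivot offset t_0 = t/2 + t_1 (m)
B, F, G = 0.05, 0.055, 0.03   # four-bar link lengths b, f, g (m)
V_MIN, DV = 1.7, 1.6    # pot minimum voltage and voltage range (V)
THETA_P_MIN, DTHETA_P = np.deg2rad(-45), np.deg2rad(90)

def pot_angle(v):
    """Eqs. (3)/(4): linear voltage-to-angle map for either pot axis."""
    return (v - V_MIN) / DV * DTHETA_P + THETA_P_MIN

def cmc_fe_angle(v_fe, theta_p_fe_0=0.0):
    """Eqs. (1)-(2): CMC FE angle recovered through the four-bar linkage."""
    theta_pot = pot_angle(v_fe) - theta_p_fe_0       # corrected pot angle
    a = np.hypot(T_0, L_MP)                          # length of link a
    d = np.sqrt(G**2 + B**2 - 2 * G * B * np.cos(np.pi - theta_pot))
    alpha = (np.arccos((G**2 + d**2 - B**2) / (2 * G * d))
             + np.arccos((a**2 + d**2 - F**2) / (2 * a * d)))
    mu = np.arctan(T_0 / L_MP)                       # ring-pivot offset angle
    return alpha + mu                                # Eq. (1)

def cmc_aa_angle(v_aa, theta_pot_aa_0=0.0):
    """CMC AA angle: AA pot rotation minus the personalization offset."""
    return pot_angle(v_aa) - theta_pot_aa_0
```

With these placeholder constants, a mid-range voltage of 2.5 V corresponds to a zero pot displacement, and the FE angle then reduces to the constant four-bar and ring-pivot offsets.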
The flex sensor reading measures the combined flexion of the user's MCP and IP joints. The MCP and IP joint motions are assumed to be coupled and to actuate at the same rate. Due to this approximation, both the MCP and IP joints reach their limits at the same time, and their respective joint angles can be correlated independently to the flex sensor readings using Equation (5):
θ_i = ((v_f − v_f,min)/(v_f,max − v_f,min))·Δθ_i
where i = {MCP, IP}, Δθ_i is the respective range of motion, and θ_i is the estimated angular displacement.
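A minimal sketch of the Equation (5) mapping, with illustrative calibration voltages and joint ranges rather than measured values:

```python
def flex_joint_angle(v_f, v_min, v_max, joint_range):
    """Eq. (5): linearly map the flex-sensor voltage onto a joint's range
    of motion; MCP and IP are assumed to flex proportionally."""
    return (v_f - v_min) / (v_max - v_min) * joint_range

# Illustrative calibration: a 1.0-3.0 V sweep with hypothetical
# 55 deg MCP and 80 deg IP ranges of motion
theta_mcp = flex_joint_angle(2.0, 1.0, 3.0, 55.0)  # mid-sweep: 27.5 deg
theta_ip = flex_joint_angle(2.0, 1.0, 3.0, 80.0)   # mid-sweep: 40.0 deg
```

Because both joints share the same normalized sensor reading, each reaches its own limit exactly when the sensor reaches its voltage limit, matching the coupling assumption above.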

5.2. Stage 2: Kinematic Mapping Algorithm of Human to Robot Thumb

In the second stage of the framework, the estimated human thumb angles are mapped to the robot joint angles. The first two joints (CMC joint) of the human thumb model are orthogonal and intersecting (see Figure 1), whereas those of the robot thumb are non-orthogonal and non-intersecting (see Figure 2c,d). Due to these anatomical and kinematic topology differences, the estimated human thumb joint angles cannot be directly mapped to the robot thumb [9]. Hence, a hybrid approach was followed to map the human thumb motion to the robot thumb: Cartesian space mapping was used to map the CMC joint to the first two joints of the robot thumb, and joint space mapping was used to map the MCP and IP joints to the third and fourth joints of the robot thumb.
For the Cartesian space mapping, the motion of the CMC joint is decoupled and mapped to the first two joints of the robot thumb using the developed Radial Projection Algorithm (RPA). The kinematic model of the human thumb CMC joint (H_cmc) includes the metacarpal phalange, link L_mp, as shown in Figure 1d. The kinematic model of the equivalent CMC joint of the robot thumb (R_cmc), which includes the base two joints and links (J_1, J_2, L_1, and L_2), is defined as shown in Figure 2d. The position vectors of the distal ends of H_cmc and R_cmc are defined as P_h and P_r, respectively, and are shown in Figure 6.
The H_cmc workspace is the partial surface of a sphere with radius r_0 = L_mp, whereas that of R_cmc is the partial surface of a warped ellipsoid, as shown in Figure 7a. Due to the geometric differences between these two workspaces, the distal end of R_cmc (P_r) cannot reach the actual position of the distal end of the metacarpal phalange (P_h).
The proposed RPA maps the motion of H_cmc to R_cmc such that P_r follows P_h radially towards the base, as shown in Figure 6. Note that P_r cannot follow P_h throughout its workspace due to the joint range limitations. The workspaces of H_cmc and R_cmc are shown in Figure 7a. The H_cmc workspace radius, r_0, is user-specific and is the length of the user's first metacarpal, L_mp. The mapped area obtained by radial projection from the H_cmc to the R_cmc workspace does not change even if the workspace radius r_0 changes; however, it depends on the boundary of the H_cmc workspace, which is a function of the user-specific joint limits and offsets.
A complete mapping between the workspaces is desirable, since it will allow a user's CMC joint to control the entire R_cmc workspace. As such, to achieve complete mapping, a projected workspace is defined by projecting the R_cmc workspace onto a spherical surface with radius r_0, with the bounds of the projected area determined as shown in Figure 7b. The boundary of the projected workspace in the configuration space of H_cmc is shown in Figure 7c. The rectangle inscribing the projected workspace in the configuration space is considered to be the H_cmc mapped workspace. The H_cmc mapped workspace bounds are determined to be θ_m,FE,limits = (0°, 70°) and θ_m,AA,limits = (10°, 90°). The mapped H_cmc joint angles (θ_m,FE, θ_m,AA) are determined by linearly remapping the estimated CMC joint angles to the mapped workspace bounds using Equation (6):
$$\theta_{m,i} = \frac{\theta_{e,i} - \theta_{e,i,min}}{\theta_{e,i,max} - \theta_{e,i,min}} \, \Delta\theta_{m,i} + \theta_{m,i,min} \qquad (6)$$
where i = { F E , A A } with m i n and m a x representing their respective minimum and maximum values, Δ θ m , i = ( θ m , i , m a x θ m , i , m i n ) represents the angular range of the mapped joint angles, and θ e , i and θ m , i are the estimated and mapped human thumb joint angles, respectively.
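The linear remapping of Equation (6) can be sketched in a few lines; the following is a minimal Python illustration (the function name and the example calibrated range are illustrative assumptions, with only the mapped FE bounds (0, 70) degrees taken from the text):

```python
def remap_joint_angle(theta_e, e_min, e_max, m_min, m_max):
    """Linearly remap an estimated CMC joint angle from its calibrated
    range [e_min, e_max] to the mapped workspace range [m_min, m_max],
    per Equation (6). Units must be consistent (e.g., degrees)."""
    return (theta_e - e_min) / (e_max - e_min) * (m_max - m_min) + m_min

# Example (assumed calibration): an estimated FE angle of 22.5 deg in a
# calibrated range of (0, 45) deg maps to the midpoint of the mapped FE
# bounds (0, 70) deg.
theta_m_fe = remap_joint_angle(22.5, 0.0, 45.0, 0.0, 70.0)  # 35.0 deg
```

The same affine form reappears in Equation (14) for the MCP and IP joints, with the robot joint limits as the target range.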
Using the mapped CMC joint angles, the position of the distal end of the CMC joint, P m , h , is determined through forward kinematics [50] using Equations (7)–(9):
$$P_{m,h_x} = L_{mp} \cos(\theta_{FE}) \cos(\theta_{AA}) \qquad (7)$$
$$P_{m,h_y} = L_{mp} \sin(\theta_{AA}) \qquad (8)$$
$$P_{m,h_z} = L_{mp} \sin(\theta_{FE}) \cos(\theta_{AA}) \qquad (9)$$
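Equations (7)–(9) can be checked numerically with a short sketch (function and variable names are illustrative, not from the paper); a useful sanity check is that the distal point always lies on the H c m c workspace sphere of radius L m p :

```python
import math

def human_cmc_fk(L_mp, th_fe, th_aa):
    """Distal-end position of the first metacarpal from the CMC joint
    angles, per Equations (7)-(9); angles in radians, length in mm."""
    x = L_mp * math.cos(th_fe) * math.cos(th_aa)
    y = L_mp * math.sin(th_aa)
    z = L_mp * math.sin(th_fe) * math.cos(th_aa)
    return (x, y, z)

# For L_mp = 52 mm (Volunteer 1) the distal point stays on a sphere of
# radius 52 mm regardless of the joint angles:
p = human_cmc_fk(52.0, math.radians(35.0), math.radians(50.0))
radius = math.sqrt(sum(c * c for c in p))  # ~52.0
```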
Since the H c m c mapped workspace encapsulates the entire projection of the R c m c workspace, P m , h can guide P r throughout the R c m c workspace using the RPA. The RPA imposes a constraint between P m , h and P r such that these vectors become aligned and pass through the base frame origin. This constraint is realized by satisfying Equation (10):
$$P_{m,h} \times P_r = 0 \qquad (10)$$
The position of the distal point of the robot thumb, P r , is evaluated using forward kinematics and the mDH kinematic parameters in Table 2 and is shown in Equations (11)–(13).
$$P_{r_x} = K_1 \cos\theta_1 - K_2 \sin\theta_1 \qquad (11)$$
$$P_{r_y} = K_1 \sin\theta_1 + K_2 \cos\theta_1 \qquad (12)$$
$$P_{r_z} = L_2 \sin(0.38\pi) \sin(\theta_2 - \pi/2) \qquad (13)$$
where $K_1 = L_1 + L_2 \cos(\theta_2 - \pi/2)$ and $K_2 = L_2 \cos(0.38\pi) \sin(\theta_2 - \pi/2)$.
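The robot-side forward kinematics can likewise be sketched directly from the mDH parameters in Table 2 (the function name and the spot check are illustrative assumptions):

```python
import math

def robot_cmc_fk(th1, th2, L1=20.9, L2=18.0):
    """Distal-end position of the robot R_cmc chain per Equations
    (11)-(13); link lengths in mm from Table 2, angles in radians."""
    K1 = L1 + L2 * math.cos(th2 - math.pi / 2)
    K2 = L2 * math.cos(0.38 * math.pi) * math.sin(th2 - math.pi / 2)
    x = K1 * math.cos(th1) - K2 * math.sin(th1)
    y = K1 * math.sin(th1) + K2 * math.cos(th1)
    z = L2 * math.sin(0.38 * math.pi) * math.sin(th2 - math.pi / 2)
    return (x, y, z)

# Spot check: at (th1, th2) = (0, pi/2) both links stretch along the
# base x-axis, giving roughly (L1 + L2, 0, 0) = (38.9, 0, 0) mm.
p = robot_cmc_fk(0.0, math.pi / 2)
```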
The robot thumb ( R c m c ) joint angles ( θ 1 , θ 2 ) cannot be evaluated in closed form. Therefore, an optimization−based inverse kinematics approach was followed using Equations (7)–(13) as presented in Algorithm 1.
Algorithm 1 Optimization-based inverse kinematics procedure.
Input: $(L_{mp}, \theta_{m,FE}, \theta_{m,AA}, L_1, L_2)$
1: Define constraints $(\theta_1, \theta_2) \in [0, \pi/2]$
2: Initialize $(\theta_1, \theta_2) = (0, 0)$
3: Define the optimization cost function $C_f(\theta_1, \theta_2) = \| P_{m,h} \times P_r \|^2$
4: Calculate $P_{m,h_x}, P_{m,h_y}, P_{m,h_z}$ using Equations (7)–(9)
5: Calculate $P_{r_x}, P_{r_y}, P_{r_z}$ using Equations (11)–(13)
6: Evaluate the optimization cost function $C_f(\theta_1, \theta_2)$
7: Minimize $C_f(\theta_1, \theta_2)$ using fmincon (MATLAB constrained optimization function)
Output: $(\theta_1, \theta_2)$
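Algorithm 1 can be prototyped outside of MATLAB; the sketch below substitutes SciPy's bounded minimizer for fmincon and is illustrative only — the function names, starting point, and the example target along the base x-axis (using Volunteer 1's L m p = 52 mm) are assumptions, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import minimize

def rpa_inverse_kinematics(P_mh, L1=20.9, L2=18.0):
    """Optimization-based IK per Algorithm 1: find (th1, th2) in
    [0, pi/2] minimizing ||P_mh x P_r||^2 so that P_r aligns radially
    with the mapped human distal point P_mh. SciPy's bounded minimizer
    stands in for MATLAB's fmincon."""
    P_mh = np.asarray(P_mh, dtype=float)

    def P_r(th):
        th1, th2 = th
        K1 = L1 + L2 * np.cos(th2 - np.pi / 2)
        K2 = L2 * np.cos(0.38 * np.pi) * np.sin(th2 - np.pi / 2)
        return np.array([
            K1 * np.cos(th1) - K2 * np.sin(th1),                  # Eq. (11)
            K1 * np.sin(th1) + K2 * np.cos(th1),                  # Eq. (12)
            L2 * np.sin(0.38 * np.pi) * np.sin(th2 - np.pi / 2),  # Eq. (13)
        ])

    def cost(th):
        # C_f = ||P_mh x P_r||^2, enforcing Eq. (10) as a soft constraint
        return float(np.sum(np.cross(P_mh, P_r(th)) ** 2))

    res = minimize(cost, x0=[0.1, 0.1], bounds=[(0.0, np.pi / 2)] * 2)
    return res.x, res.fun

# A target on the base x-axis should drive the solution toward
# (th1, th2) = (0, pi/2), where P_r also points along x.
(th1, th2), cf = rpa_inverse_kinematics([52.0, 0.0, 0.0])
```

With bounds supplied and no method specified, `scipy.optimize.minimize` selects L-BFGS-B, which plays the role of the constrained solver in step 7.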
The estimated MCP and IP joint angles were linearly mapped to the third and fourth joints of the robot thumb using Equation (14).
$$\theta_{r,i} = \frac{\theta_{h,j} - \theta_{h,j,min}}{\theta_{h,j,max} - \theta_{h,j,min}} \, \Delta\theta_{r,i} + \theta_{r,i,min} \qquad (14)$$
where θ r , i and θ h , j are the robot and human thumb joint angles with i = { 3 , 4 } and j = { M C P , I P } representing the associated robot and human thumb joint, respectively, and Δ θ r , i = ( θ r , i , m a x θ r , i , m i n ) represents the angular range of the respective robot joint.

5.3. Stage 3: Motion Control

The robot thumb has a single servo motor for each joint, controlled using pulse width modulation (PWM) at a frequency of 50 Hz. The joint angles computed by the mapping algorithm (Algorithm 1) are provided as input to the motion control algorithm in real time. The developed system performs thumb tracking, angle estimation, and kinematic mapping at a rate of approximately 55 Hz.
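For context, a hobby-servo command at 50 Hz PWM is typically encoded as the pulse width within the 20 ms period; the mapping below is a generic sketch with assumed endpoint values (1000–2000 µs over 0–180°), not the actual servo calibration used in the paper:

```python
def angle_to_pulse_us(theta_deg, theta_min=0.0, theta_max=180.0,
                      pulse_min_us=1000.0, pulse_max_us=2000.0):
    """Linearly map a commanded joint angle to a servo pulse width in
    microseconds within the 20 ms period of 50 Hz PWM. The 1000-2000 us
    endpoints and 0-180 deg range are typical hobby-servo values
    (assumed, not from the paper)."""
    frac = (theta_deg - theta_min) / (theta_max - theta_min)
    return pulse_min_us + frac * (pulse_max_us - pulse_min_us)

# Mid-range command: 90 deg -> 1500 us pulse, i.e., 7.5% duty at 50 Hz.
duty_cycle = angle_to_pulse_us(90.0) / 20000.0
```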

6. CMC Tracking Device and System Evaluation

The operation of the tracking device and the developed system were evaluated using two tests. The first test evaluated the operation and personalization of the tracking device and verified the implemented RPA mapping between H c m c and R c m c . The second test verified the framework and the system. Two volunteers took part in the first test and one in the second test. The developed system used by the volunteers for the evaluation consisted of the CMC tracking device and a flex glove interfaced with the ARH (see Figure 2a) using an NI myRIO device. The myRIO sampled the tracking device and flex glove readings and transmitted PWM signals to the servo motors controlling the ARH. The NI myRIO was connected to a host PC and programmed using NI LabVIEW.

6.1. Evaluation Protocol

The first step was to introduce the volunteer to the developed system and provide instructional cues on its operation based on the framework presented in Figure 5. The volunteer, guided by the research team, measured their thumb parameters and calibrated the device. After calibration, the volunteer visually verified that the ARH thumb motion resembled natural thumb movements by performing thumb circumduction a few times. After this verification, the volunteer was asked to test the device and telemanipulate the ARH. The test procedures were as follows:
  • Device operation and personalization test
    This test evaluated the operation of the CMC tracking device using the RPA and its ability to be personalized to different thumb sizes. The volunteer used the developed system without the flex glove and telemanipulated the R c m c (the equivalent CMC joint) of the ARH thumb to oppose a cylinder (40 mm diameter and 200 mm height) suspended by a string against the palm. The opposition was deemed successful when no slippage was observed after removing the string holding the cylinder.
  • Framework verification test
    This test verified the operation of the developed system using the proposed framework. The volunteer was asked to perform prescribed thumb motions and poses. The motions included pure flexion, extension, abduction, and adduction of the CMC joint, as well as thumb opposition poses with the index and middle fingers. An observational approach was followed in which a research team member observed the hand poses performed by the volunteer to evaluate how closely the thumb reached the target pose.
    Then, the volunteer was provided with four objects and asked to perform power and precision grasps. The power grasps were performed using a cylinder (diameter of 40 mm and height of 200 mm) and a cuboid (115 × 57 × 28 mm). The precision grasps were performed using a marker pen (a thin cylinder with a diameter of 18 mm and length of 115 mm) and a sphere (radius of 25 mm). The objects used for power grasps were suspended by a string such that the object was touching the stationary ARH palm. The objects used for precision grasps were positioned and oriented by a research team member to achieve a human−like grasp. A grasp was considered successful if the object grasped by the ARH did not slip even after the string suspending the object was removed. A grasp was considered unsuccessful if the object slipped during the grasp or an ARH finger or the thumb reached their respective operational limits without adequately securing the object.

6.2. Results and Discussion

The performance of the tracking device was evaluated by two able−bodied volunteers. The proposed methodology was evaluated by one volunteer. The measured thumb parameters for each volunteer were used as inputs to the mapping algorithm and are presented in Table 3.
During the device operation test, each volunteer easily personalized the tracking device and calibrated the system with guidance from a research team member. The tracking device captured the CMC joint motion of both volunteers. During visual verification, the R c m c was observed to follow the tracked thumb circumduction motion for both volunteers. The H c m c , mapped H c m c , and estimated R c m c positions during the thumb circumduction for Volunteer 1 are shown in Figure 8. Subsequently, using the system, each volunteer was able to telemanipulate the ARH (with only R c m c ) to successfully oppose the cylinder against the palm, as shown in Figure 9.
The framework validation test was performed by Volunteer 1, who successfully telemanipulated the ARH to perform the prescribed thumb poses and grasp objects of different shapes, as shown in Figure 10. The prescribed poses performed by the volunteer are shown in Figure 10a–d. The power grasps for a cylinder and a cuboid are shown in Figure 10e and Figure 10f, respectively. The precision grasps for a marker pen (thin cylinder) and a sphere are shown in Figure 10g and Figure 10h, respectively. The trajectory of poses during the telemanipulation of the ARH by Volunteer 1 to grasp a cylinder (diameter of 40 mm and height of 120 mm) is shown in Figure 11, and the trajectory of the robot thumb joint angles is shown in Figure 12.
The experimental results of the device operation tests verify that the developed tracking device could be easily personalized and calibrated to volunteers with different thumb dimensions and successfully track their CMC joint motion. Using the RPA, the estimated H c m c motion was mapped to the R c m c , allowing each volunteer to position the robot thumb and oppose an object against the palm.
According to the framework verification test results, the tracking device integrated with the developed system following the proposed framework could be successfully deployed to telemanipulate a robot thumb with kinematic differences. The results validate the proposed framework and its ability to integrate functional components of the telemanipulation system by successfully reaching prescribed poses and performing various grasps (precision and power) of objects using the CMC tracking device.

7. Conclusions

Hand-tracking systems are widely adopted as control interfaces for telemanipulating ARHs. The human thumb is the most complex digit of the hand due to the CMC joint, which plays a significant role in grasping. As such, tracking the CMC joint is critical for the telemanipulation of fully actuated four-DOF ARH thumbs to grasp and manipulate objects. Existing hand-tracking systems either fail to track the CMC joint motion or rely on expensive hardware and setups. The developed CMC joint thumb tracking device is affordable, 3D-printable, and personalizable to different thumb sizes. The tracked human thumb motion requires suitable mapping to telemanipulate a robot thumb with kinematic differences. Accordingly, a hybrid mapping approach, utilizing the RPA for CMC joint mapping, has been proposed and implemented within the framework. Based on the device operation and personalization tests, the CMC tracking device demonstrated its ability to be personalized and calibrated to volunteers with different thumb dimensions and captured the FE and AA motions of their CMC joint. During thumb circumduction, it was observed that the RPA mapped the CMC joint motion of H c m c to the entire R c m c range of motion. The ability of the proposed framework to integrate system components to telemanipulate the ARH was confirmed during the framework verification test. This test also demonstrated the ability of the tracking device to be used along with the flex glove to telemanipulate the whole ARH and grasp objects. The results from the evaluations of the tracking device, mapping algorithm, and telemanipulation framework provide confidence to expand and employ this system as a research platform to investigate HRI in assistive, manufacturing, or other applicable domains.
The CMC tracking device with the flex glove can be a valuable tool to teach and train robotic hands, in both physical and virtual environments, to effectively and reliably grasp objects of regular or irregular shapes, or to evaluate grasping performance.

Author Contributions

Conceptualization, P.S.S. and A.H.A.R.; methodology, P.S.S. and A.H.A.R.; software, A.H.A.R.; formal analysis, P.S.S. and A.H.A.R.; investigation, P.S.S. and A.H.A.R.; resources, P.S.S.; writing—original draft preparation, A.H.A.R.; writing—review and editing, P.S.S.; visualization, A.H.A.R.; supervision, P.S.S.; project administration, P.S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data presented in this study are contained within the article.

Acknowledgments

The authors would like to thank the two volunteers who participated in the evaluation of the proposed tracking device and framework on their own accord and without any compensation. The authors would also like to thank the anonymous reviewers whose comments improved the paper from its original form.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ARH: Anthropomorphic Robotic Hand
CMC: Carpometacarpal Joint
MCP: Metacarpophalangeal Joint
IP: Interphalangeal Joint
AA: Abduction/Adduction
FE: Flexion/Extension
DOF: Degree(s) of Freedom
RPA: Radial Projection Algorithm

References

  1. Kim, U.; Jung, D.; Jeong, H.; Park, J.; Jung, H.M.; Cheong, J.; Choi, H.R.; Do, H.; Park, C. Integrated linkage-driven dexterous anthropomorphic robotic hand. Nat. Commun. 2021, 12, 7177. [Google Scholar] [CrossRef] [PubMed]
  2. Min, S.; Yi, S. Development of Cable-driven Anthropomorphic Robot Hand. IEEE Robot. Autom. Lett. 2021, 6, 1176–1183. [Google Scholar] [CrossRef]
  3. Woods, S.; Fisher, C. Development of a self-contained robotic hand with dexterous grasping capabilities for research applications. SAIEE Afr. Res. J. 2023, 114, 87–92. [Google Scholar] [CrossRef]
  4. Handa, A.; Van Wyk, K.; Yang, W.; Liang, J.; Chao, Y.W.; Wan, Q.; Birchfield, S.; Ratliff, N.; Fox, D. DexPilot: Vision-Based Teleoperation of Dexterous Robotic Hand-Arm System. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 9164–9170. [Google Scholar] [CrossRef]
  5. Zhu, G.; Xiao, X.; Li, C.; Ma, J.; Ponraj, G.; Prituja, A.V.; Ren, H. A Bimanual Robotic Teleoperation Architecture with Anthropomorphic Hybrid Grippers for Unstructured Manipulation Tasks. Appl. Sci. 2020, 10, 2086. [Google Scholar] [CrossRef]
  6. Chen, W.; Yu, C.; Tu, C.; Lyu, Z.; Tang, J.; Ou, S.; Fu, Y.; Xue, Z. A Survey on Hand Pose Estimation with Wearable Sensors and Computer-Vision-Based Methods. Sensors 2020, 20, 1074. [Google Scholar] [CrossRef]
  7. Rashid, A.; Hasan, O. Wearable technologies for hand joints monitoring for rehabilitation: A survey. Microelectron. J. 2019, 88, 173–183. [Google Scholar] [CrossRef]
  8. Glauser, O.; Wu, S.; Panozzo, D.; Hilliges, O.; Sorkine-Hornung, O. Interactive Hand Pose Estimation Using a Stretch-Sensing Soft Glove. ACM Trans. Graph. 2019, 38, 1–15. [Google Scholar] [CrossRef]
  9. Li, R.; Wang, H.; Liu, Z. Survey on Mapping Human Hand Motion to Robotic Hands for Teleoperation. IEEE Trans. Circuits Syst. Video Technol. 2022, 32, 2647–2665. [Google Scholar] [CrossRef]
  10. Mizera, C.; Delrieu, T.; Weistroffer, V.; Andriot, C.; Decatoire, A.; Gazeau, J.P. Evaluation of Hand-Tracking Systems in Teleoperation and Virtual Dexterous Manipulation. IEEE Sensors J. 2020, 20, 1642–1655. [Google Scholar] [CrossRef]
  11. Li, S.; Ma, X.; Liang, H.; Görner, M.; Ruppel, P.; Fang, B.; Sun, F.; Zhang, J. Vision-based Teleoperation of Shadow Dexterous Hand using End-to-End Deep Neural Network. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 416–422. [Google Scholar] [CrossRef]
  12. Abrego, C.E. A Two Stage Event Based Data Driven Controller for Improved Grasping of an Artificial Hand. Ph.D. Thesis, The University of Texas at Arlington, Arlington, TX, USA, 2018. [Google Scholar]
  13. Langevin, G. Inmoov Hand. 2014. Available online: http://inmoov.fr/hand-and-forarm/ (accessed on 30 August 2016).
  14. Abdul Rahaman, A.H.; Hazra, S.; Shiakolas, P.S. On the Development and Evaluation of an Affordable Telerobotic System for Object Grasping for Human-Machine Interaction. In ASME International Mechanical Engineering Congress and Exposition, New Orleans, LA, USA, 29 October–2 November 2023; Volume 5: Biomedical and Biotechnology; American Society of Mechanical Engineers: New York, NY, USA, 2023. [Google Scholar] [CrossRef]
  15. Hazra, S.; Abdul Rahaman, A.H.; Shiakolas, P.S. An Affordable Telerobotic System Architecture for Grasp Training and Object Grasping for Human-Machine Interaction. J. Eng. Sci. Med. Diagn. Ther. 2023, 7, 011011. [Google Scholar] [CrossRef]
  16. Theodoridou, E.; Cinque, L.; Mignosi, F.; Placidi, G.; Polsinelli, M.; Tavares, J.M.R.S.; Spezialetti, M. Hand Tracking and Gesture Recognition by Multiple Contactless Sensors: A Survey. IEEE Trans. Hum. Mach. Syst. 2023, 53, 35–43. [Google Scholar] [CrossRef]
  17. Rijpkema, H.; Girard, M. Computer Animation of Knowledge-Based Human Grasping; ACM Siggraph Computer Graphics: New York, NY, USA, 1991; Volume 25, pp. 339–348. [Google Scholar] [CrossRef]
  18. Kuo, L.C.; Su, F.C.; Chiu, H.Y.; Yu, C.Y. Feasibility of using a video-based motion analysis system for measuring thumb kinematics. J. Biomech. 2002, 35, 1499–1506. [Google Scholar] [CrossRef] [PubMed]
  19. Park, Y.; Lee, J.; Bae, J. Development of a Wearable Sensing Glove for Measuring the Motion of Fingers Using Linear Potentiometers and Flexible Wires. IEEE Trans. Ind. Inform. 2015, 11, 198–206. [Google Scholar] [CrossRef]
  20. Li, K.; Chen, I.M.; Yeo, S.H. Design and validation of a multi-finger sensing device based on Optical linear encoder. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–7 May 2010; pp. 3629–3634. [Google Scholar] [CrossRef]
  21. Kessler, G.D.; Hodges, L.F.; Walker, N. Evaluation of the CyberGlove as a whole-hand input device. ACM Trans. Comput.-Hum. Interact. 1995, 2, 263–283. [Google Scholar] [CrossRef]
  22. Fang, B.; Sun, F.; Liu, H.; Guo, D. Development of a Wearable Device for Motion Capturing Based on Magnetic and Inertial Measurement Units. Sci. Program. 2017, 2017, 7594763. [Google Scholar] [CrossRef]
  23. Khatoon, F.; Ravan, M.; Amineh, R.K.; Byberi, A. Hand Gesture Recognition Pad Using an Array of Inductive Sensors. IEEE Trans. Instrum. Meas. 2023, 72, 2516611. [Google Scholar] [CrossRef]
  24. VICON. Motion Capture Systems. 2018. Available online: https://www.vicon.com/ (accessed on 30 February 2018).
  25. Borghetti, M.; Sardini, E.; Serpelloni, M. Sensorized Glove for Measuring Hand Finger Flexion for Rehabilitation Purposes. IEEE Trans. Instrum. Meas. 2013, 62, 3308–3314. [Google Scholar] [CrossRef]
  26. P, G.J.; Miss, A.K. IoT Based Sign Language Interpretation System. J. Physics Conf. Ser. 2019, 1362, 012034. [Google Scholar] [CrossRef]
  27. Dickey, M.D.; Chiechi, R.C.; Larsen, R.J.; Weiss, E.A.; Weitz, D.A.; Whitesides, G.M. Eutectic Gallium-Indium (EGaIn): A Liquid Metal Alloy for the Formation of Stable Structures in Microchannels at Room Temperature. Adv. Funct. Mater. 2008, 18, 1097–1104. [Google Scholar] [CrossRef]
  28. Muth, J.T.; Vogt, D.M.; Truby, R.L.; Mengüç, Y.; Kolesky, D.B.; Wood, R.J.; Lewis, J.A. Embedded 3D Printing of Strain Sensors within Highly Stretchable Elastomers. Adv. Mater. 2014, 26, 6307–6312. [Google Scholar] [CrossRef]
  29. Park, Y.L.; Majidi, C.; Kramer, R.; Bérard, P.; Wood, R.J. Hyperelastic pressure sensing with a liquid-embedded elastomer. J. Micromechanics Microengineering 2010, 20, 125029. [Google Scholar] [CrossRef]
  30. Park, Y.L.; Chen, B.R.; Wood, R.J. Design and Fabrication of Soft Artificial Skin Using Embedded Microchannels and Liquid Conductors. IEEE Sensors J. 2012, 12, 2711–2718. [Google Scholar] [CrossRef]
  31. Kim, D.H.; Lee, S.W.; Park, H.S. Improving Kinematic Accuracy of Soft Wearable Data Gloves by Optimizing Sensor Locations. Sensors 2016, 16. [Google Scholar] [CrossRef]
  32. Si, Y.; Chen, S.; Li, M.; Li, S.; Pei, Y.; Guo, X. Flexible Strain Sensors for Wearable Hand Gesture Recognition: From Devices to Systems. Adv. Intell. Syst. 2022, 4, 2100046. [Google Scholar] [CrossRef]
  33. Cyberglove 3, CyberGlove Systems. Available online: https://www.cyberglovesystems.com/cyberglove-ii/ (accessed on 21 February 2024).
  34. Prime 3 Mocap, MANUS. Available online: https://www.manus-meta.com/buy-now (accessed on 21 February 2024).
  35. Sensoryx VRFree Glove Hand Tracking System. Available online: https://www.interhaptics.com/blog/2021/07/08/ (accessed on 21 February 2024).
  36. Hu, H.; Gao, X.; Li, J.; Wang, J.; Liu, H. Calibrating human hand for teleoperating the HIT/DLR hand. In Proceedings of the 2004 IEEE International Conference on Robotics and Automation (ICRA ’04), New Orleans, LA, USA, 26 April–1 May 2004; IEEE: Piscataway, NJ, USA, 2004. Proceedings. Volume 5, pp. 4571–4576. [Google Scholar] [CrossRef]
  37. Cui, L.; Cupcic, U.; Dai, J.S. Kinematic Mapping and Calibration of the Thumb Motions for Teleoperating a Humanoid Robot Hand. In Proceedings of the International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Volume 6: 35th Mechanisms and Robotics Conference, Parts A and B, Washington, DC, USA, 28–31 August 2011; Volume 8, pp. 1139–1147. [Google Scholar] [CrossRef]
  38. Villegas, R.; Yang, J.; Ceylan, D.; Lee, H. Neural Kinematic Networks for Unsupervised Motion Retargetting. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 8639–8648. [Google Scholar] [CrossRef]
  39. Chattaraj, R.; Bepari, B.; Bhaumik, S. Grasp mapping for Dexterous Robot Hand: A hybrid approach. In Proceedings of the 2014 Seventh International Conference on Contemporary Computing (IC3), Noida, India, 7–9 August 2014; pp. 242–247. [Google Scholar] [CrossRef]
  40. Meattini, R.; Chiaravalli, D.; Palli, G.; Melchiorri, C. Exploiting In-Hand Knowledge in Hybrid Joint-Cartesian Mapping for Anthropomorphic Robotic Hands. IEEE Robot. Autom. Lett. 2021, 6, 5517–5524. [Google Scholar] [CrossRef]
  41. Hollister, A.; Buford, W.L.; Myers, L.M.; Giurintano, D.J.; Novick, A. The axes of rotation of the thumb carpometacarpal joint. J. Orthop. Res. 1992, 10, 454–460. [Google Scholar] [CrossRef]
  42. Valero-Cuevas, F.J.; Johanson, M.; Towles, J.D. Towards a realistic biomechanical model of the thumb: The choice of kinematic description may be more critical than the solution method or the variability/uncertainty of musculoskeletal parameters. J. Biomech. 2003, 36, 1019–1030. [Google Scholar] [CrossRef]
  43. Crisco, J.J.; Halilaj, E.; Moore, D.C.; Patel, T.; Weiss, A.P.C.; Ladd, A.L. In Vivo Kinematics of the Trapeziometacarpal Joint During Thumb Extension-Flexion and Abduction-Adduction. J. Hand Surg. 2015, 40, 289–296. [Google Scholar] [CrossRef]
  44. Chang, L.Y.; Pollard, N.S. Method for Determining Kinematic Parameters of the In Vivo Thumb Carpometacarpal Joint. IEEE Trans. Biomed. Eng. 2008, 55, 1897–1906. [Google Scholar] [CrossRef]
  45. Griffin, W.B.; Findley, R.P.; Turner, M.L.; Cutkosky, M.R. Calibration and Mapping of a Human Hand for Dexterous Telemanipulation. In ASME International Mechanical Engineering Congress and Exposition; Dynamic Systems and Control: Volume 2; American Society of Mechanical Engineers: Orlando, FL, USA, 2000; pp. 1145–1152. [Google Scholar] [CrossRef]
  46. Cooney, W.; Lucca, M.; Chao, E.; Linscheid, R. The kinesiology of the thumb trapeziometacarpal joint. J. Bone Jt. Surg. Ser. A 1981, 63, 1371–1381. [Google Scholar] [CrossRef]
  47. Imaeda, T.; An, K.N.; Cooney, W.P. Functional Anatomy and Biomechanics of the Thumb. Hand Clin. 1992, 8, 9–15. [Google Scholar] [CrossRef] [PubMed]
  48. Cui, L.; Cupcic, U.; Dai, J.S. An Optimization Approach to Teleoperation of the Thumb of a Humanoid Robot Hand: Kinematic Mapping and Calibration. J. Mech. Des. 2014, 136, 091005. [Google Scholar] [CrossRef]
  49. Cerulo, I.; Ficuciello, F.; Lippiello, V.; Siciliano, B. Teleoperation of the SCHUNK S5FH under-actuated anthropomorphic hand using human hand motion tracking. Robot. Auton. Syst. 2017, 89, 75–84. [Google Scholar] [CrossRef]
  50. Craig, J.J. Introduction to Robotics: Mechanics and Control; Pearson Prentice Hall: Upper Saddle River, NJ, USA, 2005; Volume 3. [Google Scholar]
  51. Rohling, R.N.; Hollerbach, J.M. Calibrating the Human Hand for Haptic Interfaces. Presence Teleoperators Virtual Environ. 1993, 2, 281–296. [Google Scholar] [CrossRef]
  52. National Instruments. myRIO. 2015. Available online: http://www.ni.com/pdf/manuals/376047c.pdf (accessed on 30 August 2016).
  53. National Instruments. LabVIEW. 2015. Available online: http://www.ni.com/myrio (accessed on 30 August 2016).
  54. Piher Sensing Systems. Trimmer and Control Potentiometers. Available online: https://www.piher.net/products/contacting-position-sensors/trimmers-and-control-potentiometers/ (accessed on 11 January 2024).
Figure 1. Human thumb (a) Anatomy, (b) Visual of assumed kinematic model, and (c) Kinematic model schematic.
Figure 2. (a) Kinematic model of the robot thumb ( R c m c model consists of the first two joints ( J 1 , J 2 ) and links ( L 1 , L 2 ) ). (b) 3D model of the 4−DOF robot thumb. (c) Fully actuated 4−DOF thumb developed in our lab. (d) ARH used in this research.
Figure 3. The 3D model of the tracking device labeled with components (left). The thumb tracking device and flex glove worn by a volunteer (right).
Figure 4. Graphical layout of the four−bar linkage of the device (links a, g, b, f in solid lines). The base frame axes are X B and Z B , and CMC FE joint frame axes are X F E and Y F E .
Figure 5. Framework for mapping human thumb to an anatomically different robot thumb motion.
Figure 6. Radial projection of a desired point from human, P h , to robot, P r , space reference to the base frame.
Figure 7. (a) Workspace comparison of the CMC joint and R c m c . (b) Radial Projection of R c m c workspace onto the mapped workspace of CMC joint. (c) Projected bounds of H c m c in configuration space based on the R c m c workspace boundary. The hand model is not to scale but is shown to facilitate the understanding of the workspace orientation.
Figure 8. The trajectory/positions of H c m c , mapped H c m c , and estimated R c m c using the tracking device data captured during thumb circumduction by Volunteer 1. The hand model is not to scale but is shown to facilitate the understanding of the workspace orientation.
Figure 9. Time−lapse images of the robot CMC joint controlled by the human CMC joint while opposing a cylinder against the palm by each volunteer.
Figure 10. Telemanipulation of the ARH by Volunteer 1. (a,b) CMC joint pure flexion and extension poses. (c,d) Thumb opposition with index and middle fingers. (eh) Grasps for different objects (cylinder, cuboid, marker, and sphere) using the developed system.
Figure 11. Time−lapse images of Volunteer 1 thumb poses followed by the thumb of the ARH to grasp a cylinder.
Figure 12. Trajectories of the evaluated robot thumb joint angles to grasp a cylinder based on the thumb motion of Volunteer 1 captured by the thumb tracking device and the flex glove.
Table 1. Human thumb (Figure 1b,c) mDH kinematic table.
Frames (n-1 → n) | Joint | a_{n-1} (mm) | α_{n-1} (rad) | d_n (mm) | θ_n (rad)
B → FE | FE | 0 | 3π/2 | 0 | θ_FE
FE → AA | AA | 0 | π/2 | 0 | θ_AA
AA → MCP | MCP | L_mp | 3π/2 | 0 | θ_MCP
MCP → IP | IP | L_pp | 0 | 0 | θ_IP
IP → p_{t,h} | - | L_dp | 3π/2 | L_t | 0
Table 2. Robot thumb (Figure 2a,b) mDH kinematic table.
Frames (n-1 → n) | Joint | a_{n-1} (mm) | α_{n-1} (rad) | d_n (mm) | θ_n (rad)
B → 1 | 1 | 0 | 0 | 0 | θ_1
1 → 2 | 2 | L_1 = 20.9 | 0.38π | 0 | θ_2 - π/2
2 → 3 | 3 | L_2 = 18.0 | 3π/2 | 0 | θ_3
3 → 4 | 4 | L_3 = 22.4 | 0 | 0 | θ_4
4 → p_{t,r} | - | L_4 = 30.2 | 3π/2 | L_5 = 5.9 | 0
Table 3. Measured thumb parameters for each volunteer.
Volunteer | L_mp * | L_pp * | L_dp * | MCP Thickness * | θ_FE ** | θ_AA **
1 | 52 | 45 | 34 | 20 | 45 | 50
2 | 48 | 35 | 27 | 15 | 55 | 50
* All length dimensions are in millimeters (mm). ** All angular dimensions are in degrees (deg).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Abdul Rahaman, A.H.; Shiakolas, P.S. A Carpometacarpal Thumb Tracking Device for Telemanipulation of a Robotic Thumb: Development, Prototyping, and Evaluation. Appl. Sci. 2025, 15, 1301. https://doi.org/10.3390/app15031301
