Article

Dexterous Manipulation of Unknown Objects Using Virtual Contact Points

Institute of Industrial and Control Engineering (IOC), Universitat Politècnica de Catalunya (UPC), 08028 Barcelona, Spain
* Author to whom correspondence should be addressed.
Robotics 2019, 8(4), 86; https://doi.org/10.3390/robotics8040086
Submission received: 12 July 2019 / Revised: 6 October 2019 / Accepted: 9 October 2019 / Published: 12 October 2019
(This article belongs to the Special Issue Robotics in Spain 2019)

Abstract

The manipulation of unknown objects is a problem of special interest in robotics, since it is not always possible to have exact models of the objects with which the robot interacts. This paper presents a simple strategy to manipulate unknown objects using a robotic hand equipped with tactile sensors. The hand configurations that allow the rotation of an unknown object are computed using only tactile and kinematic information obtained during the manipulation process, reasoning about the desired and real positions of the fingertips during the manipulation. This is done taking into account that the desired positions of the fingertips are not physically reachable, since they are located in the interior of the manipulated object; they are therefore virtual positions, with associated virtual contact points. The proposed approach was satisfactorily validated using three fingers of an anthropomorphic robotic hand (Allegro Hand), with the original fingertips replaced by tactile sensors (WTS-FT). In the experimental validation, several everyday objects with different shapes were successfully manipulated, rotating them without needing to know their shape or any other physical property.

1. Introduction

Dexterous manipulation of objects has been one of the most relevant topics in robotics research in recent years [1]. Several approaches to the development of new manipulation strategies are inspired by the human capability to manipulate quite different objects with the hands, and this has led to the development of a wide variety of anthropomorphic robotic hands [2]. The list of robotic hands is long; some representative examples are the DLR-HIT Hand II [3], the MA-I [4], the Shadow Hand [5], the Schunk Dexterous Hand [6], the Robonaut 2 Hand [7], the Robotiq three-finger gripper [8] and the Allegro Hand [9], among several others.
Dexterous manipulation has been defined in different ways in the literature. Looking at pioneering works in the field, the definitions range, for instance, from "dexterity is defined as the ability of a grasp to achieve one or more useful secondary objectives while satisfying the kinematic relationship (between joint and Cartesian spaces) as the primary objective" (from Reference [10]) to "manipulation that achieves the goal configuration for the object and the [grasp] contacts" (from Reference [11]), and a more recent work summarizes it as "In the robotics research literature 'dexterity' often refers to the manipulation of an object in the hand, with the hand" (from Reference [12]). There are other definitions, but all of them refer explicitly or implicitly to the manipulation of the object by properly locating/changing the positions of the grasp contact points, that is, by properly managing the finger configurations, which in turn gives rise to the expression "in-hand manipulation" to refer explicitly to object manipulation using only finger movements [12,13,14,15,16].
In-hand manipulation can be done in two ways [11] or combinations of them:
  • Keeping the contacts of each finger with the object during the manipulation, which includes rolling and/or sliding at each contact. In this case the grasp changes continuously (e.g., References [11,13,14,16,17,18,19,20]).
  • Sequentially removing one (or more) finger contact and placing it on another point on the object surface, which is called "finger gaiting". In this case the grasp changes in a discontinuous way (e.g., References [11,21,22,23]) and is sometimes called "in-hand regrasping," an expression not frequently used for the previous case.
Moreover, the inclusion of tactile sensors in robotic hands improves their manipulation capabilities, because these sensors provide contact information during the manipulation process, allowing the execution of more complex tasks with better results, both in industrial and in everyday environments (see References [24,25] for two recent reviews of the state of the art regarding tactile sensors for dexterous hands).
In manipulation, tactile sensors are used in different ways. On the one hand, when the object model is partially or completely unknown, tactile sensors are used to reduce uncertainty and adjust the object geometric model, which can be used to recognize the object (using a database) as well as to precisely identify the actual position of the object within the hand [26,27,28,29]. Another approach uses tactile sensors in manipulation strategies based on tactile feedback without caring about the object model, that is, the manipulation is performed even when the object model is completely unknown [18,30]. Tactile sensors are also used to detect sliding of the grasped object within the gripper [31,32], and to improve the grasp stability of deformable objects by adjusting the forces applied by the fingers when there are changes in the center of mass of the grasped object [33]. In some cases, the hardware design of the gripper was particularly influenced by a tactile application, with specific fingers designed [17,34,35], for instance, to roll on an unknown object in order to perform dexterous manipulation and identify the object surface, or to manipulate wires for manufacturing applications. Other manipulation strategies decompose the manipulation problem into small movements that allow the description of a complex task in terms of simpler actions [36], apply control techniques to model the manipulation problem [20], or use geometric reasoning to manipulate unknown objects in order to improve the grasp quality from the point of view of the hand, the grasp and the task [30], using two articulated fingers of an industrial gripper to rotate an object around a given axis.
On the other hand, regarding complementary hardware, vision systems have been used to complement the tactile information in the exploration of unknown objects [37]; regarding complementary software, machine learning approaches have been proposed for the detection of slippage [38], for object recognition [29], for the adaptation of the grasping motion [39], and for the extraction of manipulation primitives for compliant robot hands [14].
In this context, the goal of the approach proposed in this work is the dexterous in-hand manipulation (with no need of wrist or arm movements) of an unknown object, starting from a given initial grasp and keeping the contact between each finger and the object during the manipulation (i.e., finger gaiting is not applied), while keeping the grasping forces within a desired range and preventing the object from falling. The expression "unknown object" is used to indicate that the model of the object is not needed during the manipulation procedure (note that "model of the object" includes shape, texture, stiffness, center of mass, friction coefficient and any other related physical property of the object). The main contribution of the approach is that it is a relatively simple geometric procedure based on the commanded positions of the fingertips, which are used to define a set of virtual contact points, without caring about the positions of the real contact points between the fingertips and the object. The object manipulation is done as a reactive procedure, considering only the tactile information and the kinematic data of the hand measured during the object manipulation. It must be remarked that tactile information is used to obtain only two relevant pieces of knowledge: the position of the contact point on each fingertip and the magnitude of the corresponding contact force. Inspired by the typical movements that a human being makes to rotate an object, the experimental implementation uses three fingers of an anthropomorphic hand to rotate an unknown object forward and backward. The work presented in this paper merges and extends the initial ideas previously presented in References [40] and [41].
The rest of the paper is organized as follows: Section 2 introduces the problem statement. The proposed approach and the manipulation algorithm are described in Section 3. The description of the hardware used for experimentation and illustrative experimental results are presented in Section 4. Finally, Section 5 presents the conclusions and the proposed future work.

2. Problem Statement

The problem addressed in this work is the manipulation of unknown objects with a robotic hand equipped with tactile sensors, understanding by "unknown object" an object whose properties, such as shape, weight and center of mass, are not known during the manipulation, that is, the manipulation is done without knowledge of an object model. The manipulation task is the rotation of the grasped object, avoiding its fall and keeping the contact force at each fingertip within a threshold around a desired value. The manipulation commands are continuously provided by the user at a high level, that is, in each iteration the system receives a user command indicating the sense of the rotation movement and the system autonomously determines the finger movements. There is no external measurement of the object orientation but, by adding, for instance, a vision system, the proposed methodology could be used to position the object at a commanded absolute orientation, if such an orientation is actually reachable. The computation of the finger movements for the object manipulation uses only the tactile information (contact forces and contact points) and the kinematic information (values of the finger joints) obtained online during the manipulation, that is, no other external feedback sources are considered to obtain information about the object position (such as, for instance, a vision system).
Considering that the finger joints work under position control, the commanded hand configurations must be such that the commanded positions of the fingertips lie "inside" the object in order to apply a force on the object surface. It must be noted that if the fingertips are positioned exactly on the surface of the object, they will not produce grasping forces on it. From now on, we will refer to the commanded fingertip positions located "inside" the object as "virtual contact points," since they are not physically reachable. Furthermore, the magnitude of the force applied by each fingertip on the object surface depends on the distance between the virtual contact points and the real contact points actually reached on the object surface. Thus, each virtual contact point is adjusted as a function of the force error, that is, the difference between the desired and the current contact force sensed at each fingertip. Determining the finger movements using only the virtual contact points allows the object to be manipulated without knowing its real shape or any other physical property.
It is assumed that the initial grasp is a force-closure grasp [42], but the determination of the initial grasp is outside the scope of this work; it can be obtained, for instance, using a generic grasp planner [43] or even by trial and error. In our experimentation we simply move the fingers towards the object surface until a proper grasp is obtained. We use three fingers of a robotic hand to grasp and manipulate the object, performing a tripod grasp [44], that is, the Thumb works opposite to the other two fingers (abduction movement) in the same way that humans do. In this work, we consider that the Thumb works as a supporting finger, while the Index and Middle fingers lead the object movements.

3. Proposed Manipulation Strategy

The object manipulation is performed by an iterative process such that, in each iteration, the finger movements are computed according to the sense of rotation $s_k$ indicated by the user. In this work, the indices $k$ and $k+1$ denote the current and the next iteration, respectively.
A finger $f_i$, $i \in \{I, M, T\}$, with $I$, $M$ and $T$ corresponding respectively to the Index, Middle and Thumb fingers, is a serial kinematic chain with $n_i$ degrees of freedom (DOF) and $n_i$ links. Each finger link has an associated reference frame $\varepsilon_{ij}$, $j \in \{1, \ldots, n_i\}$, which defines its position in the absolute reference frame $W$ located at the palm of the hand. The position of each link $j$ with respect to the previous one is determined by the joint angle $q_{ij}$. The finger configuration $q_i$ is given by the concatenation of all the joint angles of the finger as $q_i = \{q_{i1}, \ldots, q_{in_i}\}$. The hand configuration is given by the concatenation of the finger configurations as $Q = \{q_I, q_M, q_T\}$.
The flexion/extension joints of each finger $i$ move the finger within a working plane $\Pi_i$, defined by three points corresponding to the positions of the reference frames $\varepsilon_{ij}$ of the three phalanges of the finger. The variables involved in the manipulation are computed using the projections of the relevant points on the working plane of each finger. In a tripod grasp, the finger working planes must be oriented as parallel as possible to each other, as shown in Figure 1. In this way, the fingers can perform cooperative movements and the object can be rotated around an axis orthogonal to the working planes of the fingers, as is usually done by human beings. Nevertheless, the proposed procedure can easily be generalized to rotate objects around any arbitrary axis; there is no restriction that prevents this, although the kinematics of the hand may allow only very small rotations around some particular axes.
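To make the plane construction and projection concrete, the following minimal sketch (in Python with NumPy; the helper names are illustrative, not part of the original implementation) shows how a working plane can be obtained from the origins of the three phalange frames and how a point is projected onto it:

```python
import numpy as np

def working_plane(p1, p2, p3):
    """Plane through the origins of the three phalange frames of a finger:
    returns a point on the plane and its unit normal."""
    n = np.cross(p2 - p1, p3 - p1)     # normal of the plane through the three points
    return p1, n / np.linalg.norm(n)

def project_onto_plane(p, plane_point, n):
    """Orthogonal projection of a point p onto the plane (plane_point, n)."""
    return p - np.dot(p - plane_point, n) * n
```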
Given the current virtual contact points $P_i^k$, the computation of the points $P_i^{k+1}$ for the leading fingers (Index and Middle) is done as follows. Two auxiliary points $P_i^{k+1*}$, $i = \{I, M\}$, are defined as the points resulting from a displacement $\pm\zeta$ of $P_i^k$ along the line perpendicular to the segment between $P_i^k$ and $P_T^k$, as shown in Figure 2; the intention is to make the axis of rotation pass through $P_T^k$. The sign of $\zeta$ depends on the desired sense of rotation for the current iteration. Thus,
$$P_i^{k+1*} = P_i^k \pm \zeta\, \hat{p}$$
with $\hat{p} \in \mathbb{R}^3$ and $\hat{p} \cdot (P_i^k - P_T^k) = 0$.
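A minimal sketch of this step, assuming all points have already been expressed in the absolute frame and the working plane has unit normal n (the function name and the encoding of the sense of rotation as ±1 are illustrative choices):

```python
import numpy as np

def auxiliary_point(P_i, P_T, n, zeta, sense):
    """Auxiliary point P_i^{k+1*}: displace P_i^k by ±zeta perpendicular to the
    segment P_i^k - P_T^k, staying in the working plane with unit normal n.
    sense = +1 or -1 encodes the commanded sense of rotation s_k."""
    d = P_i - P_T
    p_hat = np.cross(n, d)              # in-plane direction, perpendicular to d
    p_hat /= np.linalg.norm(p_hat)
    return P_i + sense * zeta * p_hat
```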
Since the shape of the object is unknown, any movement of the fingers may alter the contact force $F_i^k$. The magnitude of $F_i^k$ must remain within a threshold around a desired value $F_i^d$: if it increases too much, the object or the hand may be damaged, and if it decreases too much, the grasp may fail and the object may fall. In order to control the value of the grasping forces, a force error $e_i^k$ is defined as the difference between the magnitude of the current force measured by the sensors, $F_i^k$, and the desired force, $F_i^d$, that is,
$$e_i^k = \|F_i^k\| - F_i^d$$
Now, let us consider the distance $d_i$, defined as the Euclidean distance between each virtual contact point $P_i$, $i = \{I, M\}$, and the rotation point $P_T$, that is,
$$d_i^k = \left\| P_i^k - P_T^k \right\|$$
An adjustment of $d_i^k$ allows changing the grasping force applied on the object; therefore, $d_i^k$ is modified in each iteration depending on the force error $e_i^k$ by properly determining the final positions of $P_i^{k+1}$ and $P_T^{k+1}$.
$P_i^{k+1}$ is determined as,
$$P_i^{k+1} = P_i^{k+1*} + \Delta d_i^k\, \hat{p}_i^*$$
with
$$\hat{p}_i^* = \frac{P_i^{k+1*} - P_T^k}{\left\| P_i^{k+1*} - P_T^k \right\|}$$
and
$$\Delta d_i^k = \begin{cases} 2\lambda\, e_i^k & \text{if } e_i^k \leq 0 \\ \lambda\, e_i^k & \text{if } e_i^k > 0 \end{cases}$$
where $\lambda$ is a predefined constant, empirically obtained. The reason for the different expression (different gain) depending on the sign of $e_i^k$ is that a potential fall of the object ($\|F_i^k\| \to 0$) is considered more critical than a potential application of large grasping forces ($\|F_i^k\| > F_i^d$).
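Under the piecewise-linear reading reconstructed above, the asymmetric adjustment can be sketched as follows (the function name is illustrative; λ and the error are in the units used by the implementation):

```python
def delta_d(e, lam):
    """Radial adjustment Delta d_i^k from the force error e_i^k = |F_i^k| - F_i^d.
    A negative error (too little force, risk of dropping the object) gets
    twice the gain of a positive error (too much force)."""
    return 2.0 * lam * e if e <= 0.0 else lam * e
```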
In the case of the Thumb, since it is used only as a supporting point for the object rotation, the computation of $P_T^{k+1}$ is done with the sole aim of adjusting the contact force, without computing any intermediate point. $P_T^{k+1}$ is computed considering an adjustment with respect to the Index and Middle fingers as,
$$P_T^{k+1} = P_T^k + \Delta d_T^k\, \hat{p}_T$$
with
$$\Delta d_T^k = \left( \Delta d_I^k + \Delta d_M^k \right) / 2$$
and
$$\hat{p}_T = \frac{\hat{p}_I^* + \hat{p}_M^*}{\left\| \hat{p}_I^* + \hat{p}_M^* \right\|}$$
Finally, the new hand configuration $Q^{k+1}$ is computed using the inverse kinematics (IK) of $P_i^{k+1}$, $i = \{I, M, T\}$. The movement of the fingers is executed only if each $P_i^{k+1}$ belongs to the workspace of the corresponding finger, that is, if the target $Q^{k+1}$ lies within the hand workspace. Algorithm 1 summarizes the main steps of the computation of the hand configuration that produces the desired object manipulation.
Algorithm 1: Manipulation algorithm
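The algorithm was published as an image and is not reproduced here; the following sketch reconstructs the loop from the steps described in this section, reusing the auxiliary_point() and delta_d() helpers sketched above (the hand interface methods are hypothetical stand-ins for the real hand and sensor drivers):

```python
import numpy as np

def manipulation_step(P, F_d, sense, n, zeta, lam, hand):
    """One iteration of the manipulation loop (cf. Algorithm 1).
    P: dict finger id ('I','M','T') -> current virtual contact point (3-vector).
    F_d: dict finger id -> desired contact force magnitude.
    hand: interface object whose methods are hypothetical stand-ins."""
    F = hand.contact_forces()                  # measured |F_i^k| from the tactile sensors
    P_next, p_star, dd = {}, {}, {}
    for i in ('I', 'M'):                       # leading fingers
        P_aux = auxiliary_point(P[i], P['T'], n, zeta, sense)      # P_i^{k+1*}
        e = F[i] - F_d[i]                      # force error e_i^k
        dd[i] = delta_d(e, lam)                # radial adjustment Delta d_i^k
        p_star[i] = (P_aux - P['T']) / np.linalg.norm(P_aux - P['T'])
        P_next[i] = P_aux + dd[i] * p_star[i]  # target virtual contact point P_i^{k+1}
    p_T = p_star['I'] + p_star['M']            # Thumb: force adjustment only
    p_T /= np.linalg.norm(p_T)
    P_next['T'] = P['T'] + 0.5 * (dd['I'] + dd['M']) * p_T
    Q_next = hand.inverse_kinematics(P_next)   # new hand configuration Q^{k+1}
    if hand.in_workspace(Q_next):              # move only if every target is reachable
        hand.command(Q_next)
        return P_next
    return P                                   # workspace limit reached: keep current points
```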

4. Experimental Validation

4.1. Hardware Set-up

The Allegro Hand from Wonik Robotics [9] was used for the experimental validation. This is a four-finger anthropomorphic hand with 4 degrees of freedom (DOF) per finger (see Figure 3). The Index, Middle and Ring fingers have the same kinematic structure: the first DOF fixes the orientation of the working plane $\Pi_i$ within the finger workspace, while the other three DOF (flexion/extension) are used to make the fingertip reach a position and orientation in this plane. In the case of the Thumb, the first DOF produces the abduction movement and the second DOF fixes the orientation of the working plane, leaving only two DOF to work in this plane, that is, the position and the orientation of the fingertip are not independent. The joints of the hand are actuated by DC motors, with potentiometers that measure their positions with a resolution of 0.002 degrees. The Allegro Hand is connected to a PC via a CAN bus. The joints of the hand have PID position controllers, and the system includes gravity compensation.
The fingertips of the commercial version of the Allegro Hand do not have tactile sensors; thus, the original fingertips were replaced by sensorized WTS-FT fingertips from Weiss Robotics [45], increasing in this way the capabilities of the hand. Each WTS-FT sensor has a tactile sensing matrix with 4 × 8 taxels. The surface of each taxel is a square with a side length of 3.8 mm. A pressure measurement in each taxel returns a value between 0, when no force is applied, and 4095, for the maximum measurable normal force of 1.23 N. In this work, the contact is modeled using the point-contact model [46]. Thus, when the contact between each fingertip and the object takes place over a contact region including several taxels, the barycenter of this region is considered as the current effective contact point, and the summation of the forces sensed at each taxel is considered as the current contact force. Two virtual prismatic joints $l_1$ (x) and $l_2$ (y) are used to locate the contact point on the sensor pad, assuming that the sensor surface is flat. These virtual joints add two non-controlled DOF at the end of the finger kinematic chain. Figure 4 shows the taxel distribution of the WTS-FT sensor, with an example of a contact region marked with an ellipse. Pressure measurements in the taxels are represented by colors. The figure also shows the barycenter of the contact region (considered as the effective contact point between the fingertip and the object) and the virtual joints that localize the contact point with respect to the fingertip center point (TCP) on the sensor surface.
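A minimal sketch of this contact-extraction step, assuming a linear mapping from raw taxel readings to force and a force-weighted barycenter (one plausible reading of "barycenter of this region"; the function name is illustrative):

```python
import numpy as np

TAXEL_SIDE = 3.8e-3      # m, side length of one taxel
FULL_SCALE_N = 1.23      # N, maximum measurable normal force (raw reading 4095)

def contact_from_taxels(raw):
    """raw: (4, 8) array of taxel readings in [0, 4095].
    Returns the contact point (l1, l2) on the pad, in meters, and the total
    contact force in N. A linear raw-to-force mapping is assumed."""
    forces = raw.astype(float) / 4095.0 * FULL_SCALE_N
    total = forces.sum()                      # summation of the taxel forces
    if total == 0.0:
        return None, 0.0                      # no contact detected
    rows, cols = np.indices(raw.shape)
    # force-weighted barycenter of the contact region, in pad coordinates
    l1 = (cols * forces).sum() / total * TAXEL_SIDE
    l2 = (rows * forces).sum() / total * TAXEL_SIDE
    return (l1, l2), total
```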
Robot Operating System (ROS) [47] is the communication layer that integrates the software modules developed for the implementation of the proposed approach: a module to control the Allegro Hand with a PID controller and gravity compensation, a module to get the measurements of the tactile sensor system, a module with a graphical user interface to command the movements of each joint of the hand (used to perform the initial grasp), and a module to perform the object manipulation following the proposed approach, as shown in Figure 5.
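As an illustration of how such modules typically connect over ROS (the topic names and message types below are hypothetical, not those of the actual system):

```python
import rospy
from sensor_msgs.msg import JointState
from std_msgs.msg import Float32MultiArray

latest_taxels = None

def on_tactile(msg):
    """Store the latest taxel readings, later processed by contact_from_taxels()."""
    global latest_taxels
    latest_taxels = msg.data

rospy.init_node('manipulation_node')
rospy.Subscriber('/wts_ft/taxels', Float32MultiArray, on_tactile)          # hypothetical topic
cmd_pub = rospy.Publisher('/allegro/joint_cmd', JointState, queue_size=1)  # hypothetical topic

rate = rospy.Rate(10)            # run the manipulation loop at a fixed rate
while not rospy.is_shutdown():
    # compute Q^{k+1} with manipulation_step() and publish it as a JointState
    rate.sleep()
```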

4.2. Experiments

In the following illustrative experiments, the fingers of the Allegro Hand were closed around an unknown object until approximately reaching a desired contact force $F_i^d = 5$ N. In the first set of experiments, the initial grasp was obtained using the graphical application to individually control each hand joint; this application also allows the visualization of the measured force on each fingertip sensor. The objects used for experimentation, shown in Figure 6, were chosen with different shapes, so that the performance of the proposed approach can be illustrated under different conditions; the objects also have different stiffness. The constant $\lambda$ used to compute $\Delta d_i$ and the distance $\zeta$ used to compute the auxiliary points $P_i^{k+1*}$ were both set to 1 mm. The manipulation experiment for each object includes the following steps: first, the initial grasp was performed; then, the object was rotated clockwise until reaching the limit of the hand workspace; then, the object was rotated counterclockwise until again reaching the limit of the hand workspace; and, finally, the object was released.
Figure 7 shows snapshots of the manipulation of three objects with different shapes: a regular bottle, a bottle with multiple curvatures and a jar with flat faces. From left to right, the first picture shows the user putting the object in the workspace of the hand; the second picture shows the hand performing the initial grasp; the third picture shows the configuration of the hand when the limit of the hand workspace was reached after rotating the object clockwise; and the last picture shows the configuration of the hand when the limit of the hand workspace was reached after rotating the object counterclockwise.
Figure 8, Figure 9 and Figure 10 show the evolution of the commanded and reached values of the finger joints when the regular bottle (first row in Figure 7), the bottle with multiple curvatures (second row in Figure 7) and the jar with flat faces (third row in Figure 7) were manipulated. The commanded joint values correspond to the virtual contact points $P_i^k$, $i = \{I, M, T\}$, and the reached joint values are those obtained due to the real contact with the object surface. Figure 11, Figure 12 and Figure 13 show the evolution of the measured forces at the fingertips for the three manipulation examples. In these figures, five regions are marked using vertical dashed lines and a number inside a circle: region 1 shows the joint and force values at the initial hand configuration, before grasping the object; region 2 shows the evolution of the values while the initial grasp was performed; region 3 shows the evolution of the values while the object was rotated clockwise; region 4 shows the evolution of the values while the object was rotated counterclockwise; and, finally, region 5 shows the values when the object was released and the hand returned to the initial configuration. In each region, the contact forces behaved as follows. In region 1, the contact forces were zero at all the fingertips, since there was no contact between them and the object. In region 2, when the initial grasp was performed, the contact forces at the fingertips did not appear at the same time, because the finger movements were performed individually and sequentially using the graphical interface (see the previous subsection). In region 3 and region 4, during the manipulation, the measured forces remained close to the desired value. Finally, in region 5, the measured forces were constant until the object was released.
Figure 14, Figure 15 and Figure 16 show the resulting contact points on the sensor surface for the three manipulation examples. In the first example, the resulting contact points for the three fingers are distributed in a similar way because the constant curvature of the object surface produces rolling over all the sensor surfaces. In the last two examples, the contact points on the Thumb are concentrated in a smaller region because the object surface has a larger curvature at the contact regions.
In the second set of experiments, the hand used is part of a dual-arm mobile manipulator, and the initial grasps of the objects (shown in Figure 17) were performed by the robot itself. It must be noted that one of the objects (the box) is almost completely rigid. The arm moves the hand to a position such that it envelops the object, and then the fingers are closed until the object is grasped with contact forces close to the desired value. We remark that, as stated before, the problem of obtaining optimized initial grasps is outside the scope of this work. Once the object is grasped, it is lifted and then rotated counterclockwise and clockwise until reaching the limits of the hand workspace. The adjustable parameters were set to the same values as in the first set of experiments.
Figure 18 shows snapshots of the manipulation of the two objects. Figure 19 and Figure 20 show the evolution of the commanded and reached values of the finger joints when the objects were manipulated. Figure 21 and Figure 22 show the evolution of the measured forces at the fingertips for each manipulation example. In Figure 19, Figure 20, Figure 21 and Figure 22, four regions are marked using vertical dashed lines and a number inside a circle: region 1 and region 4 show the values previous and posterior to the manipulation process, while region 2 and region 3 show the values during the counterclockwise and clockwise rotations of the objects, respectively.
Videos showing the system performance for each case in both sets of experiments can be found at https://bit.ly/2lLvbDY.

5. Conclusions

This paper has presented a simple but effective approach for the manipulation of unknown objects based on tactile information. The approach relies on geometric reasoning to determine the movements of the fingers and is able to keep the grasping forces around a predefined value. The experimental validation was done using three fingers of an anthropomorphic robotic hand equipped with tactile sensors to rotate objects of different shapes and stiffness around an axis parallel to the palm of the hand, clockwise and counterclockwise. The manipulation was performed without using any model of the object, so the object is unknown to the system. The experimental results show that the approach is effective and can be applied in real practical cases. Some positive aspects of the proposed approach are that the finger movements are determined in a very simple way using basic geometry, which is fast and effective, and that it can be easily implemented for hands with different kinematics and basic position control of the finger joints. On the other hand, since the object shape is unknown, it is not possible to predict, nor to know with precision, how much the object actually rotates for each commanded movement of the fingers. It must be noted that the maximum possible rotation range of the object depends on the initial grasp (contact points on the object and hand configuration); this is not a particular limitation of the proposed algorithm but is inherent to this type of in-hand manipulation. Starting from a good grasp increases the possible rotation range of the object but, since the object shape is unknown, the initial grasp commonly obtained by closing the fingers until touching the object may not be the most convenient one for the in-hand manipulation (as already mentioned, looking for an adequate initial grasp is outside the scope of this work).
A natural extension of the proposed approach is to consider the optimization of a grasp quality index as another goal during the manipulation process.

Author Contributions

Conceptualization, A.M. and R.S.; methodology, A.M. and R.S.; software and validation, A.M.; writing—original draft, A.M. and R.S.; writing—review and editing, A.M. and R.S.; supervision, R.S.

Funding

This work was partially supported by the Spanish Government through the project DPI2016-80077-R.

Conflicts of Interest

The authors declare no conflict of interest. The funding sponsors had no role in the design of the study, in the interpretation of data, in the writing of the manuscript, or in the decision to publish the results.

References

  1. Okamura, A.M.; Smaby, N.; Cutkosky, M.R. An overview of dexterous manipulation. In Proceedings of the IEEE International Conference on Robotics and Automation, San Francisco, CA, USA, 24–28 April 2000; Volume 1, pp. 255–262.
  2. Bicchi, A. Hands for Dextrous Manipulation and Powerful Grasping: A Difficult Road Towards Simplicity. IEEE Trans. Robot. Autom. 2000, 16, 652–662.
  3. Butterfass, J.; Fischer, M.; Grebenstein, M.; Haidacher, S.; Hirzinger, G. Design and experiences with DLR hand II. In Proceedings of the World Automation Congress, Seville, Spain, 28 June–1 July 2004; Volume 15, pp. 105–110.
  4. Grosch, P.; Suárez, R. Dexterous Robotic Hand MA-I, Software and Hardware Architecture. In Proceedings of the Intelligent Manipulation and Grasping International Conference, IMG'04, Genoa, Italy, 1–2 July 2004; pp. 91–96.
  5. Shadow Robot Company. Shadow Dexterous Hand. 2015. Available online: http://www.shadowrobot.com (accessed on 10 July 2019).
  6. SCHUNK GmbH & Co. KG. Schunk Dexterous Hand—SDH2. 2011. Available online: http://www.schunk.com (accessed on 10 July 2019).
  7. Bridgwater, L.; Ihrke, C.A.; Diftler, M.; Abdallah, M.; Radford, N.; Rogers, J.; Yayathi, S.; Askew, R.S.; Linn, D.M. The Robonaut 2 hand—Designed to do work with tools. In Proceedings of the IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 3425–3430.
  8. Sadun, A.S.; Jalani, J.; Jamil, F. Grasping analysis for a 3-Finger Adaptive Robot Gripper. In Proceedings of the IEEE International Symposium on Robotics and Manufacturing Automation (ROMA), Ipoh, Malaysia, 25–27 September 2016; pp. 1–6.
  9. Allegro Robotic Hand—Wonik Robotics. 2018. Available online: http://wonikrobotics.com/Allegro-Hand.htm (accessed on 10 July 2019).
  10. Shimoga, K.B. Robot grasp synthesis algorithms: A survey. Int. J. Robot. Res. 1996, 15, 230–266.
  11. Han, L.; Trinkle, J.C. Dextrous manipulation by rolling and finger gaiting. In Proceedings of the IEEE International Conference on Robotics and Automation, Leuven, Belgium, 16–20 May 1998; Volume 1, pp. 730–735.
  12. Dafle, N.C.; Rodriguez, A.; Paolini, R.; Tang, B.; Srinivasa, S.S.; Erdmann, M.; Mason, M.T.; Lundberg, I.; Staab, H.; Fuhlbrigge, T. Extrinsic dexterity: In-hand manipulation with external forces. In Proceedings of the IEEE International Conference on Robotics and Automation, Hong Kong, China, 31 May–7 June 2014; pp. 1578–1585.
  13. Funabashi, S.; Schmitz, A.; Sato, T.; Somlor, S.; Sugano, S. Robust in-hand manipulation of variously sized and shaped objects. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Hamburg, Germany, 28 September–2 October 2015; pp. 257–263.
  14. Liarokapis, M.; Dollar, A.M. Deriving dexterous, in-hand manipulation primitives for adaptive robot hands. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Vancouver, BC, Canada, 24–28 September 2017; pp. 1951–1958.
  15. Ho, V.A.; Nagatani, T.; Noda, A.; Hirai, S. What can be inferred from a tactile arrayed sensor in autonomous in-hand manipulation? In Proceedings of the IEEE International Conference on Automation Science and Engineering, Seoul, Korea, 20–24 August 2012; pp. 461–468.
  16. Shi, J.; Woodruff, J.Z.; Umbanhowar, P.B.; Lynch, K.M. Dynamic In-Hand Sliding Manipulation. IEEE Trans. Robot. 2017, 33, 778–795.
  17. Bicchi, A.; Marigo, A.; Prattichizzo, D. Dexterity through rolling: Manipulation of unknown objects. In Proceedings of the IEEE International Conference on Robotics and Automation, Detroit, MI, USA, 10–15 May 1999; Volume 2, pp. 1583–1588.
  18. Montaño, A.; Suárez, R. Unknown object manipulation based on tactile information. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Hamburg, Germany, 28 September–2 October 2015; pp. 5642–5647.
  19. Van Hoof, H.; Hermans, T.; Neumann, G.; Peters, J. Learning robot in-hand manipulation with tactile features. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots, Seoul, Korea, 3–5 November 2015; pp. 121–127.
  20. Shaw-Cortez, W.; Oetomo, D.; Manzie, C.; Choong, P. Tactile-Based Blind Grasping: A Discrete-Time Object Manipulation Controller for Robotic Hands. IEEE Robot. Autom. Lett. 2018, 3, 1064–1071.
  21. Hong, J.; Lafferriere, G.; Mishra, B.; Tan, X. Fine manipulation with multifinger hands. In Proceedings of the IEEE International Conference on Robotics and Automation, Cincinnati, OH, USA, 13–18 May 1990; pp. 1568–1573.
  22. Goodwine, B.; Burdick, J.W. Stratified motion planning with application to robotic finger gaiting. IFAC Proc. Vol. 1999, 32, 201–206.
  23. Goodwine, B.; Burdick, J.W. Motion planning for kinematic stratified systems with application to quasi-static legged locomotion and finger gaiting. IEEE Trans. Robot. Autom. 2002, 18, 209–222.
  24. Kappassov, Z.; Corrales, J.A.; Perdereau, V. Tactile sensing in dexterous robot hands—Review. Robot. Auton. Syst. 2015, 74, 195–220.
  25. Nadon, F.; Valencia, A.; Payeur, P. Multi-Modal Sensing and Robotic Manipulation of Non-Rigid Objects: A Survey. Robotics 2018, 7, 74.
  26. Montaño, A.; Suárez, R. Object Shape Reconstruction Based on the Object Manipulation. In Proceedings of the 16th International Conference on Advanced Robotics (ICAR 2013), Montevideo, Uruguay, 25–29 November 2013.
  27. Chebotar, Y.; Kroemer, O.; Peters, J. Learning robot tactile sensing for object manipulation. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014; pp. 3368–3375.
  28. Kaboli, M.; De La Rosa, A.T.; Walker, R.; Cheng, G. In-hand object recognition via texture properties with robotic hands, artificial skin, and novel tactile descriptors. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots, Seoul, Korea, 3–5 November 2015; pp. 1155–1160.
  29. Funabashi, S.; Morikuni, S.; Geier, A.; Schmitz, A.; Ogasa, S.; Tomo, T.P.; Somlor, S.; Sugano, S. Object Recognition Through Active Sensing Using a Multi-Fingered Robot Hand with 3D Tactile Sensors. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Madrid, Spain, 1–5 October 2018; pp. 2589–2595.
  30. Montaño, A.; Suárez, R. Manipulation of Unknown Objects to Improve the Grasp Quality Using Tactile Information. Sensors 2018, 18, 1412.
  31. Shirafuji, S.; Hosoda, K. Detection and prevention of slip using sensors with different properties embedded in elastic artificial skin on the basis of previous experience. Robot. Auton. Syst. 2014, 62, 46–52.
  32. Su, Z.; Hausman, K.; Chebotar, Y.; Molchanov, A.; Loeb, G.E.; Sukhatme, G.S.; Schaal, S. Force estimation and slip detection/classification for grip control using a biomimetic tactile sensor. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots, Seoul, Korea, 3–5 November 2015; pp. 297–303.
  33. Kaboli, M.; Yao, K.; Cheng, G. Tactile-based manipulation of deformable objects with dynamic center of mass. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots, Cancun, Mexico, 15–17 November 2016.
  34. Ward-Cherrier, B.; Rojas, N.; Lepora, N.F. Model-Free Precise in-Hand Manipulation with a 3D-Printed Tactile Gripper. IEEE Robot. Autom. Lett. 2017, 2, 2056–2063.
  35. Palli, G.; Pirozzi, S. A Tactile-Based Wire Manipulation System for Manufacturing Applications. Robotics 2019, 8, 46.
  36. Felip, J.; Bernabé, J.; Morales, A. Contact-based blind grasping of unknown objects. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots, Osaka, Japan, 29 November–1 December 2012; pp. 396–401.
  37. Li, Q.; Haschke, R.; Ritter, H. A visuo-tactile control framework for manipulation and exploration of unknown objects. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots, Seoul, Korea, 3–5 November 2015; pp. 610–615.
  38. Agriomallos, I.; Doltsinis, S.; Mitsioni, I.; Doulgeri, Z. Slippage Detection Generalizing to Grasping of Unknown Objects Using Machine Learning With Novel Features. IEEE Robot. Autom. Lett. 2018, 3, 942–948.
  39. Steffen, J.; Haschke, R.; Ritter, H. Experience-based and tactile-driven dynamic grasp control. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, CA, USA, 29 October–2 November 2007; pp. 2938–2943.
  40. Montaño, A.; Suárez, R. Manipulación diestra de objetos desconocidos usando puntos de contacto virtuales (Dexterous manipulation of unknown objects using virtual contact points). In Proceedings of the Jornadas Nacionales de Robótica (Spanish National Robotics Conference, JNR19), Alicante, Spain, 13–14 June 2019; pp. 221–228.
  41. Montaño, A.; Suárez, R. Model-free in-hand manipulation based on the commanded virtual contact points. In Proceedings of the 24th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA 2019), Zaragoza, Spain, 10–13 September 2019; pp. 586–592.
  42. Bicchi, A. On the Closure Properties of Robotic Grasping. Int. J. Robot. Res. 1995, 14, 319–334.
  43. Rosales, C.; Suárez, R.; Gabiccini, M.; Bicchi, A. On the synthesis of feasible and prehensile robotic grasps. In Proceedings of the IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 550–556.
  44. Feix, T.; Romero, J.; Schmiedmayer, H.; Dollar, A.M.; Kragic, D. The GRASP Taxonomy of Human Grasp Types. IEEE Trans. Hum.-Mach. Syst. 2016, 46, 66–77.
  45. WTS-FT—Weiss Robotics GmbH & Co. KG. 2018. Available online: https://www.weiss-robotics.com/ (accessed on 10 July 2019).
  46. Salisbury, J.K.; Roth, B. Kinematic and Force Analysis of Articulated Mechanical Hands. J. Mech. Transm. Autom. Des. 1983, 105, 35.
  47. Quigley, M.; Conley, K.; Gerkey, B.P.; Faust, J.; Foote, T.; Leibs, J.; Wheeler, R.; Ng, A.Y. ROS: An Open-Source Robot Operating System. In Proceedings of the IEEE International Conference on Robotics and Automation, Workshop on Open Source Software, Kobe, Japan, 12–17 May 2009.
Figure 1. Allegro Hand with the finger working planes $\Pi_i$ for the Index, Middle and Thumb fingers, and the axis for the object rotation.
Figure 2. Example of the computation of $P_i^{k+1}$, $i = \{I, M\}$, when the contact force $F_i^k$ is larger than $F_i^d$ (i.e., $e_i^k \geq 0$). After obtaining $P_i^{k+1*}$ with a displacement $\zeta$ from the current position $P_i^k$, the target virtual contact point $P_i^{k+1}$ is obtained by applying the adjustment $\Delta d_i^k$ to displace $P_i^{k+1*}$ away from $P_T^{k+1}$. All the points are projections onto $\Pi_i^k$.
Figure 3. Anthropomorphic Allegro Hand, with 16 degrees of freedom (DOF) and four fingertips with WTS-FT tactile sensors. The Index, Middle and Ring fingers have the same kinematic structure, while the Thumb can rotate over the hand palm (abduction movement).
Figure 4. WTS-FT tactile sensor with a graphical representation of a measurement, highlighting the contact region with an ellipse. The bar at the bottom indicates the color scale corresponding to the force values returned by each taxel.
Figure 5. Hardware and software components overview. The Allegro Hand is connected to a PC via a CAN bus, and the WTS-FT sensors in the fingertips via a USB port. All the software components are integrated using the Robot Operating System (ROS) as the communication layer.
Figure 6. Set of everyday objects used for the first set of experiments: bottle with multiple curvatures (left), jar with flat faces (center) and regular bottle (right).
Figure 7. Snapshots of the manipulation of three objects with different shapes. Objects were rotated clockwise and counterclockwise until reaching the limits of the hand workspace.
Figure 8. Evolution of the joint values (in radians) of the three fingers while the regular bottle was manipulated: commanded joint values in dashed lines and reached joint values in continuous lines.
Figure 9. Evolution of the joint values (in radians) of the three fingers while the bottle with multiple curvatures was manipulated: commanded joint values in dashed lines and reached joint values in continuous lines.
Figure 10. Evolution of the joint values (in radians) of the three fingers while the jar with flat faces was manipulated: commanded joint values in dashed lines and reached joint values in continuous lines.
Figure 11. Evolution of the measured forces (in Newtons) at the fingertips while the regular bottle was manipulated.
Figure 12. Evolution of the measured forces (in Newtons) at the fingertips while the bottle with multiple curvatures was manipulated.
Figure 13. Evolution of the measured forces (in Newtons) at the fingertips while the jar with flat faces was manipulated.
Figure 14. Contact point positions on the tactile sensor pads (in millimeters) when the regular bottle was manipulated.
Figure 15. Contact point positions on the tactile sensor pads (in millimeters) when the bottle with multiple curvatures was manipulated.
Figure 16. Contact point positions on the tactile sensor pads (in millimeters) when the jar with flat faces was manipulated.
Figure 17. Set of objects used for the second set of experiments: plastic box (left) and shampoo bottle (right).
Figure 18. Snapshots of the manipulation of a plastic box and a shampoo bottle. Objects were rotated counterclockwise and clockwise until reaching the limits of the hand workspace.
Figure 19. Evolution of the joint values (in radians) of the three fingers while the plastic box was manipulated by the robot: commanded joint values in dashed lines and reached joint values in continuous lines.
Figure 20. Evolution of the joint values (in radians) of the three fingers while the shampoo bottle was manipulated: commanded joint values in dashed lines and reached joint values in continuous lines.
Figure 21. Evolution of the measured forces (in Newtons) at the fingertips while the plastic box was manipulated.
Figure 22. Evolution of the measured forces (in Newtons) at the fingertips while the shampoo bottle was manipulated.
