Article

Dual-Arm Visuo-Haptic Optical Tweezers for Bimanual Cooperative Micromanipulation of Nonspherical Objects

Yoshio Tanaka and Ken’ichi Fujimoto

Faculty of Engineering and Design, Kagawa University, 2217-20 Hayashi-cho, Takamatsu 761-0396, Japan
* Author to whom correspondence should be addressed.
Micromachines 2022, 13(11), 1830; https://doi.org/10.3390/mi13111830
Submission received: 28 September 2022 / Revised: 19 October 2022 / Accepted: 23 October 2022 / Published: 26 October 2022
(This article belongs to the Special Issue State-of-the-Art in Optical Trapping and Manipulation)

Abstract

Cooperative manipulation with dual-arm robots is widely implemented to perform precise and dexterous tasks in automation; however, cooperative micromanipulation with dual-arm optical tweezers remains relatively rare in biomedical laboratories. To enable the bimanual and dexterous cooperative handling of nonspherical objects in microscopic workspaces, we present a dual-arm visuo-haptic optical tweezer system that uses two optically trapped microspheres (commercially available microbeads) as end-effectors for indirect micromanipulation. By combining a precise technique for correcting distortions in scanning optical tweezers with computer vision techniques, the dual-arm system allows a user to perceive the real contact forces during the cooperative manipulation of an object. The system enhances the dexterity of bimanual micromanipulation through the real-time representation of the forces and their directions. As a proof of concept, we demonstrate the cooperative indirect micromanipulation of single nonspherical objects, specifically a glass fragment and a large diatom. Moreover, the precise correction method for the scanning optical tweezers is described. The unique capabilities offered by the proposed dual-arm visuo-haptic system can facilitate research on biomedical materials and single cells under an optical microscope.

1. Introduction

In biomedical studies pertaining to cell exploration, cell surgery, and cell-to-cell interaction, the precise and complex manipulation of single cells, for instance, cell isolation (translation) and three-dimensional (3D) orientation (rotation), is an essential but delicate and time-consuming task [1]. Consequently, this task can be performed only by experienced operators. Optical tweezers (whose development was recognized with the 2018 Nobel Prize in Physics) have emerged as popular tools for handling single biological samples, and their use has been successfully demonstrated in a wide range of biomedical experiments [2,3,4]. In these experiments, the optical tweezers were used to manipulate the object of interest directly. Although advanced direct manipulation methods based on computer vision techniques and optical multiple-force clamps are valuable for biomedical studies [5,6,7], direct manipulation has two key limitations: the samples may undergo photothermal damage because of exposure to a high-power laser beam, and the quality of the trap depends considerably on the shape and refractive index of the samples. To ensure the reliable handling of various biomedical samples, indirect micromanipulation, an alternative way of using optical tweezers, has been developed, in which optically trapped microbeads [8,9] or microstructures [10,11,12] are used as end-effectors. In indirect micromanipulation, the microbeads can also function as tactile or force sensors [13], because a microsphere is trapped at the center of the optical tweezers when no external force acts on it. This technique, which combines tactile/force sensing and micromanipulation with optical tweezers, is called haptic optical tweezers and improves the dexterity of micromanipulation. A comprehensive review of haptic optical tweezers [14] covered their concept, design process, specifications, and significant potential, focusing on haptic teleoperation. However, to the best of our knowledge, real-time multiple tactile/force sensing and its display using multiple optically trapped microbeads as end-effectors (namely, dual-arm visuo-haptic optical tweezers) has not been reported, except in one paper [15], in which the touching of cells was demonstrated using holographic optical tweezers.
In general, the use of multiple tactile/force sensors and their control is essential for grasping or stably cooperatively manipulating a single object, because multiple forces act on the object at different contact points and together determine its stable position and posture. Multi-arm/finger robot systems have become the prevalent platform for performing precise and dexterous tasks in medical surgery [16,17] and automation [18,19,20]. In biomedical laboratories, however, cooperative micromanipulation with multiple end-effectors or microrobots [12] remains relatively rare, except in the case of intracytoplasmic sperm injection [21].
Therefore, to enable the cooperative handling of nonspherical objects in microscopic workspaces, we present in this paper dual-arm visuo-haptic optical tweezers that use two optically trapped microspheres (commercially available microbeads) as end-effectors for indirect micromanipulation. By combining the precise correction of distortions in the scanning optical tweezers with computer vision techniques for detecting multiple microspheres, the system enables an operator to perceive the real contact forces during the cooperative manipulation of an object. In this manner, the system enhances the dexterity of bimanual micromanipulation through the real-time representation of the forces and their directions. As a proof of concept, we demonstrate the cooperative indirect micromanipulation of two types of single nonspherical samples: a glass fragment and a large diatom. Moreover, we describe the precise correction method for the scanning optical tweezers.

2. Dual-Arm Visuo-Haptic Optical Tweezers

2.1. System Design and Experimental Setup

Recently, we developed a dual-arm optical tweezer system to enable precise and dexterous micromanipulation in a 3D workspace [22]. In general, dual-arm systems extend the possibilities of research on single-cell and 3D biology. However, for more precise and complex tasks such as cooperative manipulation, the use of two PC mice as interface devices for bimanual 3D control is inadequate, because simultaneous tactile/force sensing at multiple positions, with effective control to stabilize the object handled by the multiple end-effectors, is required [14]. In Ref. [15], the touching of red blood cells was demonstrated only in a quite limited 2D workspace (10 × 10 μm), although real force feedback was achieved through the bilateral teleoperation of holographic optical tweezers. In this section, to realize the bimanual cooperative handling of various nonspherical objects in a wider workspace (roughly, 50 × 50 μm to 100 × 100 μm squares), we describe the design of a dual-arm visuo-haptic optical tweezer system with two optically trapped microspheres.
Figure 1 shows the optical and control system configurations of the dual-arm visuo-haptic 3D optical tweezers, which involve two haptic devices (Novint Falcon®). The optical system was linked to an inverted microscope (Olympus IX70) via its epifluorescence port. The laser source was a continuous-wave (cw) YAG laser (Laser Quantum Opus 1064-5000, 1064 nm, TEM00, 5 W max), and the beam diameter was expanded to roughly 4 mm by a beam expander (BE). A single computer (Windows 7, Intel® Core™ i7-4790, 3.6 GHz) served as the control system. The system comprises two-beam scanning optical tweezers, in which the two beams separated by a polarizing beam splitter (PBS_1) individually form true 3D optical tweezers through an electrically focus-tunable lens (L_Z: Optotune EL-10-30-NIR-LD) for z-coordinate steering and a two-axis scanning gimbal mirror (GM: Newport FSM-300) for xy-plane steering [23]. These two true 3D optical tweezers, in which the focal position of each trapping beam can be fully controlled in the 3D workspace, employ a 2f relay optical system sharing a common relay lens (L_R: f_R = 200 mm). Although 4f afocal relay systems are often employed in multibeam optical tweezers involving a spatial light modulator, optical tweezers based on 2f relay systems require fewer optical elements, resulting in higher light transmission, fewer potential optical aberrations, and simpler alignment. To realize simultaneous tactile/force sensing at the contact positions of the two end-effectors, we exploited the well-established fact that a microsphere is trapped at its center (P_B) by optical tweezers when no external force acts on it; that is, P_B coincides with the commanded focus position (P_C) of the trapping beam, and the restoring force of the optical tweezers on the microsphere is simply proportional to the distance between P_B and P_C. Consequently, the contact force F is
F = k (P_B − P_C),   (1)
where k is the spring coefficient of the optical tweezers [24]. Therefore, accurate detection of the center of each bead and precise control of each beam focus (corresponding to P_C) are extremely important for representing the contact forces. A circular Hough transform, a standard real-time image processing technique [25,26], was applied to detect the bead centers in the real-time microscopic image captured using a fast USB3 camera (Point Grey GS3-U3-41C6C-C), and the force arrows (indicating the force vectors calculated using Equation (1)) were superimposed on this image. Digital-analog (DA) signal commands were used to control the beam focus positions. The force arrows enable the operator to visually perceive the contact forces exerted during the cooperative manipulation of a single object with the two trapped beads.
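To make this pipeline concrete, the following minimal Python/OpenCV sketch detects bead centers with a circular Hough transform and overlays force arrows according to Equation (1). The trap stiffness K_TRAP, the image scale UM_PER_PIXEL, the display gain, and the Hough parameters are illustrative assumptions, not the calibrated values of the actual system.

```python
import cv2
import numpy as np

K_TRAP = 2.0e-6      # trap spring coefficient k [N/m] (assumed value)
UM_PER_PIXEL = 0.27  # image scale [um/pixel] (assumed value)

def detect_bead_centers(gray):
    """Detect microbead centers with a circular Hough transform (cf. [25,26])."""
    blurred = cv2.medianBlur(gray, 5)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                               param1=100, param2=30, minRadius=8, maxRadius=30)
    return [] if circles is None else circles[0, :, :2]  # one (x, y) per bead

def contact_force(p_b_px, p_c_px):
    """Equation (1): F = k (P_B - P_C), pixel displacement converted to meters."""
    disp_m = (np.asarray(p_b_px) - np.asarray(p_c_px)) * UM_PER_PIXEL * 1e-6
    return K_TRAP * disp_m  # force vector [N]

def draw_force_arrow(frame, p_b_px, force_n, gain=1.0e14):
    """Overlay a yellow line at the bead, length proportional to |F|."""
    tip = np.asarray(p_b_px) + gain * force_n  # scale newtons to pixels for display
    cv2.line(frame,
             tuple(int(v) for v in p_b_px),
             tuple(int(v) for v in tip),
             (0, 255, 255), 2)  # yellow in BGR
```

In a real loop, this detection and overlay step would run on every frame from the camera, and k would first be calibrated, e.g., against the viscous drag on a bead moved at a known velocity.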

2.2. Correction Method for Field Distortion in a Dual-Arm System

As mentioned previously, to represent a force arrow, the actual focus position must exactly correspond to P_C, since we assume that a microsphere is trapped at P_B by the optical tweezers in the absence of external forces. Hence, to realize visuo-haptic manipulation, the error between the actual and commanded focus positions must be compensated over the entire workspace in which the trapped bead (that is, the end-effector) can be handled by controlling the two-axis scanning mirror. In this section, we describe the precise correction method for the scanning optical tweezers.
As shown in Figure 1, the dual-arm visuo-haptic system involves two-beam scanning optical tweezers, with each beam forming optical tweezers through a pre-objective scanning control system; in this system, the scanning mirror (GM) is located in front of the objective lens of the microscope (L_O), which acts as an ideal F-theta focusing lens [27]. Moreover, the common relay lens (L_R), which projects the image of the scanning mirrors onto the back pupil of L_O, is inserted in each scanning optical tweezer system. In general, the focus position of a laser beam in an F-theta scanning system is linear in the commanded angle of the scanning mirror. However, in the dual-arm scanning system, the focus position of each trapping beam is significantly influenced by L_R, which leads to field distortion. Notably, the field distortion of each arm of the scanning system, which is composed of tangential distortion and asymmetric radial distortion, is mainly generated by the optical path error caused by the decentering, tilt, and aberration of L_R. Consequently, the focus position of each trapping beam is not linear in the commanded angle of the corresponding two-axis scanning mirror. This nonlinearity of the commanded focus positions, caused by the field distortion, can be corrected using a compensation function. Considering the tangential and radial distortions [27,28], the compensation function of each two-axis scanning system can be expressed as a two-dimensional fifth-order polynomial Θ(x, y):
Θ(x, y) = a_0 + a_1 x + a_2 y + a_3 xy + a_4 x^2 + a_5 y^2 + a_6 x^2 y + a_7 xy^2 + a_8 x^3 + a_9 y^3 + a_{10} x^5 + a_{11} y^5 + a_{12} x^4 y + a_{13} xy^4 + a_{14} x^3 y^2 + a_{15} x^2 y^3,   (2)
where (x, y) denotes the specified focal position (P_C), determined using the camera in the xy imaging coordinates, and Θ(x, y) corresponds to the commanded DA signals (voltages) for the xy positioning angles of the two-axis mirror; we consider two voltage values, Θ = (v_x, v_y), for the x- and y-axes. In Equation (2), the coefficient a_0 expresses the offset due to the decentering of L_R; the coefficients a_1, …, a_5 express the effect of the tangential distortion owing to the tilt of L_R with respect to the optical axis of L_O; and the coefficients a_6, …, a_9 and a_{10}, …, a_{15} express the effect of the radial distortion caused by the tilt and aberration of L_R, considering the effects of the r^2 (r^2 = x^2 + y^2) and r^4 terms, respectively.
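As an illustration, Equation (2) amounts to a 16-element design row dotted with the fitted coefficients. The Python sketch below is a minimal rendering of this evaluation; design_row and compensate are hypothetical helper names, and a_x, a_y stand for the two fitted coefficient columns of Equation (6) below.

```python
import numpy as np

def design_row(x, y):
    """The 16 monomials of Equation (2), ordered to match a_0, ..., a_15."""
    return np.array([
        1.0, x, y, x*y, x**2, y**2,    # a_0..a_5: offset + tangential terms
        x**2*y, x*y**2, x**3, y**3,    # a_6..a_9: radial terms (r^2 effect)
        x**5, y**5, x**4*y, x*y**4,    # a_10..a_13: radial terms (r^4 effect)
        x**3*y**2, x**2*y**3,          # a_14, a_15: radial terms (r^4 effect)
    ])

def compensate(a_x, a_y, x, y):
    """Map a target focus position (x, y) in image coordinates to the two
    commanded DA voltages (v_x, v_y) of one two-axis gimbal mirror."""
    row = design_row(x, y)
    return float(row @ a_x), float(row @ a_y)
```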
For N sets of measured data (v_x, v_y, x, y), Equation (2) can be represented in the following matrix form:

\Theta = \mathbf{M}\,\mathbf{A},   (3)

where

\Theta =
\begin{bmatrix}
v_{x_1} & v_{y_1} \\
v_{x_2} & v_{y_2} \\
\vdots  & \vdots  \\
v_{x_N} & v_{y_N}
\end{bmatrix} \in \mathbb{R}^{N \times 2},   (4)

\mathbf{M} =
\begin{bmatrix}
1 & x_1 & y_1 & x_1 y_1 & x_1^2 & y_1^2 & x_1^2 y_1 & x_1 y_1^2 & x_1^3 & y_1^3 & x_1^5 & y_1^5 & x_1^4 y_1 & x_1 y_1^4 & x_1^3 y_1^2 & x_1^2 y_1^3 \\
1 & x_2 & y_2 & x_2 y_2 & x_2^2 & y_2^2 & x_2^2 y_2 & x_2 y_2^2 & x_2^3 & y_2^3 & x_2^5 & y_2^5 & x_2^4 y_2 & x_2 y_2^4 & x_2^3 y_2^2 & x_2^2 y_2^3 \\
\vdots &&&&&&&&&&&&&&& \vdots \\
1 & x_N & y_N & x_N y_N & x_N^2 & y_N^2 & x_N^2 y_N & x_N y_N^2 & x_N^3 & y_N^3 & x_N^5 & y_N^5 & x_N^4 y_N & x_N y_N^4 & x_N^3 y_N^2 & x_N^2 y_N^3
\end{bmatrix} \in \mathbb{R}^{N \times 16},   (5)

\mathbf{A} =
\begin{bmatrix}
a_{x_0} & a_{y_0} \\
a_{x_1} & a_{y_1} \\
\vdots  & \vdots  \\
a_{x_{15}} & a_{y_{15}}
\end{bmatrix} \in \mathbb{R}^{16 \times 2}.   (6)
To determine the unknown coefficients a_0, …, a_{15} in the compensation function of Equation (2), we solve the matrix equation of Equation (3). Using the pseudoinverse matrix M^+, we can obtain the coefficient matrix A:

\mathbf{A} = \mathbf{M}^{+}\,\Theta,   (7)

where

\mathbf{M}^{+} = (\mathbf{M}^{\top}\mathbf{M})^{-1}\mathbf{M}^{\top},   (8)

in which the superscripts ⊤ and −1 denote the transpose and the inverse of a matrix, respectively [29]. If the rank of M is less than 16, M^⊤M in Equation (8) is singular and the coefficients cannot be determined; hence, the number of measured data sets must satisfy N ≥ 16. Note that the data sets (v_x, v_y, x, y) must be collected uniformly over the complete workspace; in practice, N > 100 is adequate to ensure rank(M) = 16 and to avoid evaluating Equation (8) under numerically ill-conditioned circumstances.
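A minimal NumPy sketch of the fit of Equations (3)–(8) follows, reusing design_row from the sketch above; fit_compensation is a hypothetical helper name. It relies on np.linalg.lstsq, which computes the pseudoinverse solution via SVD and is numerically safer than forming (M^⊤M)^{-1} M^⊤ explicitly.

```python
def fit_compensation(samples):
    """Least-squares fit of Equations (3)-(8) from N >= 16 measured tuples
    (v_x, v_y, x, y); `samples` is an (N, 4) array. As noted above, spreading
    N > 100 points uniformly over the workspace keeps the fit well conditioned."""
    samples = np.asarray(samples, dtype=float)
    M = np.vstack([design_row(x, y) for _, _, x, y in samples])  # (N, 16)
    Theta = samples[:, :2]                                       # (N, 2): v_x, v_y
    # lstsq realizes the pseudoinverse solution of Equation (7) via SVD.
    A, _, rank, _ = np.linalg.lstsq(M, Theta, rcond=None)
    assert rank == 16, "calibration points do not span the workspace"
    return A[:, 0], A[:, 1]  # coefficient columns a_x0..a_x15 and a_y0..a_y15
```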
Figure 2 shows the result of the correction of the field distortions in the proposed dual-arm scanning system. The errors in four cases involving different coefficient subsets are indicated by different symbols (⋄, ×, etc.) on a five-times-expanded scale. The data were collected using a low-magnification objective lens (Olympus, LCPlanFL, ×40, 0.6 NA), and the number of measured data points for the calculation of Equation (7) was N = 119. When only coefficients a_0 to a_2 were used, the distortions due to the relay lens were not compensated. When coefficients a_0 to a_5 were used, only the tangential distortion was compensated. When coefficients a_0 to a_9 and a_0 to a_15 were used, the radial distortion of the relay lens was compensated as well, considering the effects of the r^2 and r^4 terms, respectively. When all coefficients from a_0 to a_15 were applied, the error between the actual and commanded focus positions was compensated over the complete workspace of the imaging plane of the microscope (512 × 440 pixels).
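The four cases of Figure 2 can be explored numerically by refitting with truncated coefficient sets. The sketch below is a hypothetical helper that evaluates the residual in commanded-voltage space (a proxy for, not identical to, the focus-position error actually plotted in Figure 2).

```python
def rms_residual_voltage(samples, n_coeff):
    """RMS fit residual when only the first n_coeff polynomial terms are used;
    n_coeff = 3, 6, 10, 16 correspond to the four cases a_0..a_2, a_0..a_5,
    a_0..a_9, and a_0..a_15 discussed above."""
    samples = np.asarray(samples, dtype=float)
    M = np.vstack([design_row(x, y)[:n_coeff] for _, _, x, y in samples])
    Theta = samples[:, :2]
    A, *_ = np.linalg.lstsq(M, Theta, rcond=None)
    return float(np.sqrt(np.mean((M @ A - Theta) ** 2)))
```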

3. Demonstrations and Discussion

3.1. Bimanual Control of End-Effectors

To verify whether the abovementioned correction method sufficiently reduces the error between the actual and commanded focus positions, we demonstrate bimanual and simultaneous tactile/force sensing with two optically trapped microbeads as end-effectors, realizing cooperative indirect micromanipulation.
Figure 3a shows a snapshot of the 3D position control of the end-effectors (Duke Scientific, borosilicate glass microspheres, 7.8 μm) through the two haptic devices. The laser powers for trapping the individual end-effectors were adjusted to the equivalent value (50 mW) at the entrance aperture of the objective lens (Olympus, UPlanSApo, ×100, 1.40 NA, IR). As shown in Figure 3a and its Supplementary Videos, the operator bimanually and independently manipulated the two microbeads while visually perceiving the forces generated by their contact or by viscous drag. Figure 3b shows the video frame sequence of the simultaneous handling of the two end-effectors, in which the beam focus positions controlled by the individual haptic devices were superimposed on the PC monitor as green and red circles (these positions cannot be observed through the fast USB camera itself). Furthermore, the forces generated by contact or viscous drag were superimposed in real time at their points of action as yellow lines whose lengths were proportional to the forces. In Figure 3b1, the viscous drag exerted by the surrounding medium (water), which acted on each trapped microbead in the direction opposite to its movement and was proportional to its velocity, is represented by a yellow line. In Figure 3b2,b3, the reaction forces on the two contacting microbeads are displayed at the point of contact in real time. When the two beads contact each other, the two yellow lines have almost equal lengths and point to the same contact position; thus, we can verify that the correction method achieves a complete calibration.
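For scale, the viscous drag represented by the yellow lines can be estimated with Stokes' law, assuming a sphere moving through water far from walls; the speed below is an illustrative assumption, not a measured value.

```python
import math

eta = 1.0e-3   # dynamic viscosity of water [Pa s]
r = 3.9e-6     # bead radius [m] (7.8 um diameter end-effector)
v = 10e-6      # assumed bead speed [m/s]

F_drag = 6.0 * math.pi * eta * r * v     # Stokes drag, proportional to velocity
print(f"drag = {F_drag * 1e12:.2f} pN")  # ~0.74 pN at 10 um/s
```

Drag on the order of a piconewton is comparable to typical optical-trap restoring forces, which is consistent with the drag arrows being clearly visible at moderate bead speeds.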

3.2. Cooperative Micromanipulation of Nonspherical Objects

To demonstrate the enhanced stability and dexterity of bimanual micromanipulation by using the visuo-haptic information, we performed simple cooperative handling tasks for nonspherical objects. The end-effectors for indirect micromanipulation were two microbeads (Duke Scientific, borosilicate glass microspheres, 7.8 μm), and the laser powers for trapping the individual end-effectors were adjusted to the equivalent value (50 mW) at the entrance aperture of the objective lens (Olympus, UPlanFLN, ×60, 1.25 NA, IR).
Figure 4 shows the video frame sequence of the bimanual and indirect cooperative micromanipulation of a single glass fragment, with the reaction forces and their directions at each contact point of the end-effectors superimposed as yellow lines in real time. In each snapshot of the video frame, the movement direction of the glass fragment during the cooperative manipulation is indicated by a white arrow, and the capture time, as obtained from the Supplementary Videos, is indicated in the upper-left corner. The visuo-haptic information represented by the yellow lines allows intuitive perception of the individual reaction forces when the microbeads are in contact with the glass fragment and of the individual viscous drag when the microbeads move through water. In this manner, the real-time visuo-haptic information helps the user apply and control suitable forces for the cooperative pushing/pinching of the nonspherical object during bimanual manipulation. For example, when the operator pushed the glass fragment with the two beads contacting its opposite sides, the fragment, subjected to two different forces at different action points, rotated to the desired posture (Figure 4a,b,e–k); when the operator pushed the fragment with the two beads contacting the same side, the fragment translated to the desired position while rotating to a stable posture (Figure 4c–e). Consequently, cooperative pushing with two microbeads applying different forces at different action points rotated/translated the fragment without either bead escaping its trap, as shown in Figure 4l. In other words, by exploiting the visuo-haptic information (that is, the yellow lines representing the current forces), the 2D cooperative handling of a single nonspherical object pushed/pinched by two end-effectors was robustly performed.
In another demonstration, as shown in Figure 5, a single diatom, which had an ellipse-like silica cell wall and was larger and heavier than the abovementioned glass fragment, was handled using the two microbeads; the forces exerted on the microbeads are displayed as yellow lines in real time. In each snapshot of the video frame, the movement direction of each microbead and that of the diatom during cooperative manipulation are indicated by white and red arrows, respectively. First, to examine the suitable forces and the corresponding movement of the diatom, we interactively performed pushing/pinching motions on both sides of the diatom while monitoring the superimposed forces (Figure 5a–d). Next, the left bead was moved into contact with the upper-left edge of the diatom (Figure 5e) and pushed along the left edge of the diatom from its upper to its center position, as indicated by the white arrow in Figure 5e; this maneuver gradually rotated the diatom, as indicated by the red arrows in Figure 5f,i. Finally, the right bead pushing the diatom escaped its trap (Figure 5l).
Without real-time visuo-haptic information or force feedback [13], the cooperative handling of a heavy and rigid object such as this diatom is a difficult task. The beads used as end-effectors often escape from their optical traps because the bead pushing the rigid object on one side is often subjected to an unexpectedly large reaction force transmitted from the other bead supporting/pushing the object on the opposite side. In this context, by exploiting the real-time representation of the reaction forces, we can perceive and keep the corresponding pushing forces within a desirable range, thereby alleviating the escape problem. Thus, the proposed dual-arm system exhibits enhanced robustness in conducting cooperative micromanipulation tasks.

4. Conclusions

We designed a dual-arm visuo-haptic optical tweezer system in which a user can perceive real forces during bimanual micromanipulation. Moreover, we presented a precise correction method for the field distortion of scanning optical tweezers in a wide workspace, enabling the representation of the force vectors acting on the end-effectors during cooperative indirect micromanipulation. Although we realized the cooperative indirect micromanipulation of only two types of single samples in a limited (that is, 2D rather than 3D) workspace because of the insufficient performance of the PC used, the demonstrations showed that the system enables highly robust indirect micromanipulation (that is, the microbeads used as end-effectors remain locked in their optical traps) during cooperative manipulation supported by visuo-haptic information. A possible application, based on the straightforward implementation of 3D cell rotation by multiple fingers (namely, multiple traps) in a 3D workspace, is tomographic imaging and 3D microscopy of nonspherical cells [30]. As future work, we plan to improve the image processing performance and extend the multiple traps from 2D to 3D by upgrading the PC. The proposed dual-arm system can thus enable the bimanual and dexterous handling of various micro-objects. Furthermore, the capabilities offered by this system can facilitate novel research in biomedicine (e.g., knotting [31], bridging [32], and twisting of cells/biomaterials) as well as in single-cell biology performed under optical microscopes.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/mi13111830/s1, Video S1: 3D position control of end-effectors; Video S2: Simultaneous handling of the two microbeads; Video S3: Cooperative indirect micromanipulation of a glass fragment; Video S4: Cooperative indirect micromanipulation of a large diatom.

Author Contributions

Conceptualization, Y.T. and K.F.; original draft and data, Y.T. and K.F.; funding acquisition, Y.T. and K.F. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partly supported by the Japan Society for the Promotion of Science (JSPS) KAKENHI [Grant No. JP19K04319].

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Xie, M.; Shakoor, A.; Shen, Y.; Mills, J.K.; Sun, D. Out-of-plane rotation control of biological cells with a robot-tweezers manipulation system for orientation-based cell surgery. IEEE Trans. Biomed. Eng. 2019, 66, 199–207.
2. Ashkin, A.; Dziedzic, J.M.; Yamane, T. Optical trapping and manipulation of single cells using infrared laser beams. Nature 1987, 330, 769–771.
3. Gross, S.P. Application of optical traps in vivo. Meth. Enzymol. 2003, 361, 162–174.
4. Zhang, H.; Liu, K.K. Optical tweezers for single cells. J. R. Soc. Interface 2008, 5, 671–690.
5. Rodrigo, P.J.; Kelemen, L.; Palima, D.; Alonzo, C.A.; Ormos, P.; Glückstad, J. Optical microassembly platform for constructing reconfigurable microenvironments for biomedical studies. Opt. Express 2009, 17, 6578–6583.
6. Tanaka, Y.; Kawada, H.; Hirano, K.; Ishikawa, M.; Kitajima, H. Automated manipulation of non-spherical micro-objects using optical tweezers combined with image processing techniques. Opt. Express 2008, 16, 15115–15122.
7. Tanaka, Y.; Wakida, S. Controlled 3D rotation of biological cells using optical multiple-force clamps. Biomed. Opt. Express 2014, 5, 2341–2348.
8. Chowdhury, S.; Thakur, A.; Švec, P.; Wang, C.; Losert, W.; Gupta, S.K. Automated manipulation of biological cells using gripper formations controlled by optical tweezers. IEEE Trans. Autom. Sci. Eng. 2014, 11, 338–347.
9. Cheah, C.C.; Ta, Q.M.; Haghighi, R. Grasping and manipulation of a micro-particle using multiple optical traps. Automatica 2016, 68, 216–227.
10. Gerena, E.; Régnier, S.; Haliyo, S. High-bandwidth 3-D multitrap actuation technique for 6-DoF real-time control of optical robots. IEEE Robot. Autom. Lett. 2019, 4, 647–654.
11. Gerena, E.; Legendre, F.; Molawade, A.; Vitry, Y.; Régnier, S.; Haliyo, S. Tele-robotic platform for dexterous optical single-cell manipulation. Micromachines 2019, 10, 677.
12. Hu, S.; Hu, R.; Dong, X.; Wei, T.; Chen, S.; Sun, D. Translational and rotational manipulation of filamentous cells using optically driven microrobots. Opt. Express 2019, 27, 16475–16482.
13. Pacoret, C.; Bowman, R.; Gibson, G.; Haliyo, S.; Carberry, D.; Bergander, A.; Régnier, S.; Padgett, M. Touching the microworld with force-feedback optical tweezers. Opt. Express 2009, 17, 10259–10264.
14. Pacoret, C.; Régnier, S. Invited article: A review of haptic optical tweezers for an interactive microworld exploration. Rev. Sci. Instrum. 2013, 84, 081301.
15. Onda, K.; Arai, F. Multi-beam bilateral teleoperation of holographic optical tweezers. Opt. Express 2012, 20, 3633–3641.
16. Cadière, G.B.; Himpens, J.; Germay, O.; Izizaw, R.; Degueldre, M.; Vandromme, J.; Capelluto, E.; Bruyns, J. Feasibility of robotic laparoscopic surgery: 146 cases. World J. Surg. 2001, 25, 1467–1477.
17. Crew, B. A closer look at a revered robot. Nature 2020, 580, S5–S7.
18. Sepúlveda, D.; Fernández, R.; Navas, E.; Armada, M.; González-De-Santos, P. Robotic aubergine harvesting using dual-arm manipulation. IEEE Access 2020, 8, 121889–121904.
19. Kitagawa, S.; Wada, K.; Hasegawa, S.; Okada, K.; Inaba, M. Few-experiential learning system of robotic picking task with selective dual-arm grasping. Adv. Robot. 2020, 34, 1171–1189.
20. Fleischer, H.; Joshi, S.; Roddelkopf, T.; Klos, M.; Thurow, K. Automated analytical measurement processes using a dual-arm robotic system. SLAS Technol. 2019, 24, 354–356.
21. Zareinejad, M.; Rezaei, S.M.; Abdullah, A.; Shiry Ghidary, S. Development of a piezo-actuated micro-teleoperation system for cell manipulation. Int. J. Med. Robot. 2009, 5, 66–76.
22. Tanaka, Y. Double-arm optical tweezer system for precise and dexterous handling of micro-objects in 3D workspace. Opt. Lasers Eng. 2018, 111, 65–70.
23. Tanaka, Y. 3D multiple optical tweezers based on time-shared scanning with a fast focus tunable lens. J. Opt. 2013, 15, 025708.
24. Preece, D.; Bowman, R.; Linnenberger, A.; Gibson, G.; Serati, S.; Padgett, M. Increasing trap stiffness with position clamping in holographic optical tweezers. Opt. Express 2009, 17, 22718–22725.
25. Ballard, D.H.; Brown, C.M. Computer Vision; Prentice-Hall: Englewood Cliffs, NJ, USA, 1982.
26. Kaehler, A.; Bradski, G. Learning OpenCV 3: Computer Vision in C++ with the OpenCV Library, 3rd ed.; O’Reilly: Sebastopol, CA, USA, 2017.
27. Chen, M.F.; Chen, Y.P.; Hsiao, W.T. Correction of field distortion of laser marking systems using surface compensation function. Opt. Lasers Eng. 2009, 47, 84–89.
28. Brown, D.C. Decentering distortion of lenses. Photogramm. Eng. 1966, 32, 444–462.
29. Rao, C.R.; Mitra, S.K. Generalized Inverse of Matrices and Its Application; Wiley: Hoboken, NJ, USA, 1971.
30. Landenberger, B.; Yatish; Rohrbach, A. Towards non-blind optical tweezing by finding 3D refractive index changes through off-focus interferometric tracking. Nat. Commun. 2021, 12, 6922.
31. Arai, Y.; Yasuda, R.; Akashi, K.; Harada, Y.; Miyata, H.; Kinosita, K.; Itoh, H. Tying a molecular knot with optical tweezers. Nature 1999, 399, 446–448.
32. Brouwer, I.; Sitters, G.; Candelli, A.; Heerema, S.J.; Heller, I.; de Melo, A.J.; Zhang, H.; Normanno, D.; Modesti, M.; Peterman, E.J.G.; et al. Sliding sleeves of XRCC4-XLF bridge DNA and connect fragments of broken DNA. Nature 2016, 535, 566–569.
Figure 1. Schematic of the dual-arm visuo-haptic 3D optical tweezer system, which can represent the forces during indirect micromanipulation with two microspheres.
Figure 2. Result of the correction for distortions in the scanning optical tweezer system. The errors pertaining to the target focus positions are indicated by four symbols on a five-times-expanded scale. The number of measured data points is N = 119. The 512 × 440 pixel imaging plane is equivalent to a real workspace of 138 × 118 μm for the ×40 objective lens, 92 × 79 μm for ×60, and 54 × 47 μm for ×100.
Figure 3. (a) Snapshot of the 3D position control of the end-effectors (two 7.8 μm glass microbeads) through two haptic devices. (b) Video frame sequence of the simultaneous handling of the two microbeads. The tweezing beam positions controlled using the haptic devices are superimposed as green and red circles. The forces generated by contact or viscous drag are also superimposed in real time at the points of action as yellow lines. The images are also shown in Supplementary Videos S1 and S2.
Figure 4. Video frame sequence of the cooperative indirect micromanipulation of a glass fragment, with the reaction forces and their directions at each contact position of the end-effectors (7.8 μm microbeads) superimposed as yellow lines in real time. In each snapshot, the movement direction of the fragment during cooperative manipulation is indicated by a white arrow. The images are also shown in Supplementary Video S3.
Figure 5. Video frame sequence of the cooperative indirect micromanipulation of a large diatom, with the exerted forces and their directions on the end-effectors (7.8 μm microbeads) displayed as yellow lines in real time. In each snapshot, the movement directions of the microbeads and the diatom during cooperative manipulation are indicated by white and red arrows, respectively. The images are also shown in Supplementary Video S4.