Article

Automatically Tightening Tiny Screw Using Two Images and Positioning Control

Department of Mechanical Engineering, National Central University, Jhong-Li District, Tao-Yuan City 32001, Taiwan
*
Author to whom correspondence should be addressed.
Mathematics 2021, 9(19), 2521; https://doi.org/10.3390/math9192521
Submission received: 14 September 2021 / Revised: 5 October 2021 / Accepted: 5 October 2021 / Published: 7 October 2021
(This article belongs to the Special Issue Mathematical Problems in Mechanical Engineering)

Abstract

This paper describes how to tighten M1.4 screws by controlling a manipulator. The whole process is based on a human–machine interface designed using Visual Studio C++ to run image processing algorithms and control the position of a manipulator. Two charge-coupled device cameras are used. One is fixed on the stationary frame above the screw holes and takes pictures of them. The positions of the holes are determined using image processing algorithms and then transformed into the coordinate system of the manipulator by coordinate transformation. The other camera, installed on the end effector of the manipulator, photographs the screw hole to fine-tune the position of the manipulator, improving positioning accuracy. Image processing methods, including grayscale conversion, Gaussian filtering, bilateral filtering, binarization, edge detection, center of gravity, and minimum circumcircle, are used to find the center coordinates of the target holes. An experimental study shows that M1.4 screws can be tightened into the target holes with the manipulator.

1. Introduction

With the development of automation in the manufacturing and production industries, increasing types of automation equipment have been designed to replace human labor and improve performance [1]. Automation equipment can be used for many purposes such as spray-painting [2], laser-cutting [3], welding [4], polishing [5], and the handling of objects [6].
Pitipong et al. [7] used two cameras to investigate screw tightening. The two cameras were placed at equal distances from the screw and photographed it from different perspectives. Image processing was then used to measure the angle of the screw and detect whether the screw was perpendicular to the screw hole. However, the accuracy of the coordinate position of the screw hole was not considered. The screw used by Pitipong et al. [7] was also much larger than the one used in our research. In our research, two cameras are placed at different distances from the screw hole; one camera photographs the screw hole from afar, while the other does so from a close distance. The screw used is an M1.4 screw, much smaller than that used in [7], and image processing is used to detect the accurate position of the screw hole.
This study designed a human–machine interface (HMI) for controlling the manipulator that tightens an M1.4 screw. An electric screwdriver is modified and installed on the end of a manipulator. One charge-coupled device (CCD) camera is fixed on the stationary frame above the screw holes to detect their positions. Another CCD camera is installed on the end effector of the manipulator. Once the screw has been positioned above the chosen screw hole, the second camera photographs the screw hole to fine-tune the position of the manipulator.
The visual system [8,9] and HMI were designed using Microsoft Visual Studio C++ (Microsoft Co., Redmond, WA, USA). By clicking the buttons on the interface, different commands can be performed such as image processing, selecting the screw hole, starting the manipulator, and starting the electric screwdriver.
The center coordinates of a screw hole, calculated using image processing, are transformed into the objective coordinates of the manipulator through coordinate transformation. Once the screw has arrived at a fixed height above the intended screw hole, the camera installed on the end effector of the manipulator photographs the screw hole. The system then processes the image, detects the position of the screw hole, and transforms it into the coordinate system of the manipulator to fine-tune the manipulator, improving positioning accuracy.
Finally, when the screw is in the appropriate position, touching the screw hole, the user clicks the button to start the motor in the screwdriver, finishing the process of screw tightening.

2. Image Processing

After the camera photographs the screw holes on the workpiece, the picture must be processed to detect the center coordinates of the screw holes. Images are processed using OpenCV with the following methods, applied in order: grayscale conversion [12], Gaussian filtering [15], bilateral filtering [14], binarization [10], edge detection [11], center of gravity [13], and minimum circumcircle.

3. Coordinate Transformation

The coordinates of the screw hole, determined using image processing, must be transformed into the coordinate system of the manipulator through coordinate transformation.
Consider a point P and two coordinate systems {A} and {B}, as shown in Figure 1. ${}^{A}P$ represents the point P described in coordinate system {A}, ${}^{B}P$ represents the point P described in coordinate system {B}, and $\hat{X}_A$ represents the unit vector of the X-direction of {A}.
$[{}^{A}\hat{X}_B \;\; {}^{A}\hat{Y}_B \;\; {}^{A}\hat{Z}_B]$ denotes the three unit vectors of {B} described in the coordinate system {A}. The rotation matrix between {A} and {B} is composed of inner products of these unit vectors:
$${}^{A}_{B}R = \begin{bmatrix} {}^{A}\hat{X}_B & {}^{A}\hat{Y}_B & {}^{A}\hat{Z}_B \end{bmatrix} = \begin{bmatrix} \hat{X}_B\cdot\hat{X}_A & \hat{Y}_B\cdot\hat{X}_A & \hat{Z}_B\cdot\hat{X}_A \\ \hat{X}_B\cdot\hat{Y}_A & \hat{Y}_B\cdot\hat{Y}_A & \hat{Z}_B\cdot\hat{Y}_A \\ \hat{X}_B\cdot\hat{Z}_A & \hat{Y}_B\cdot\hat{Z}_A & \hat{Z}_B\cdot\hat{Z}_A \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix} = \left({}^{B}_{A}R\right)^{T} \tag{1}$$
The superscript $T$ denotes the matrix transpose. The equation representing the transformation between ${}^{A}P$ and ${}^{B}P$ is as follows,
$${}^{A}P = {}^{A}_{B}R \, {}^{B}P \tag{2}$$
The displacement between the origins of {A} and {B} is denoted by ${}^{A}P_{BORG}$, as shown in Figure 2. The equation representing the transformation between ${}^{A}P$ and ${}^{B}P$ is then as follows,
$${}^{A}P = {}^{A}_{B}R \, {}^{B}P + {}^{A}P_{BORG} \tag{3}$$
$$\begin{bmatrix} {}^{A}P \\ 1 \end{bmatrix} = \begin{bmatrix} {}^{A}_{B}R & {}^{A}P_{BORG} \\ 0\;\;0\;\;0 & 1 \end{bmatrix} \begin{bmatrix} {}^{B}P \\ 1 \end{bmatrix} \tag{4}$$
where ${}^{A}P_{BORG}$ denotes the position of the origin of {B} expressed in {A}, and ${}^{A}_{B}R$ denotes the rotation matrix between the two coordinate systems, consisting of the elements $r_{11}$–$r_{33}$.
A 4 × 4 transformation matrix can be used to describe the transformation of P,
$${}^{A}P = {}^{A}_{B}T \, {}^{B}P \tag{5}$$
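The homogeneous transformation of Equations (4)–(5) can be sketched numerically as follows. This is a minimal NumPy illustration with assumed values (a 90° rotation about Z and an arbitrary translation), not data from the experiment.

```python
import numpy as np

def homogeneous_transform(R, t):
    """Assemble the 4x4 matrix of Equation (5) from a 3x3 rotation R and a
    translation t (the origin of {B} expressed in {A})."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed example: {B} rotated 90 degrees about Z relative to {A},
# with its origin shifted by (10, 5, 0)
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([10.0, 5.0, 0.0])
T = homogeneous_transform(R, t)

P_B = np.array([1.0, 0.0, 0.0, 1.0])   # point P in {B}, homogeneous form
P_A = T @ P_B                          # the same point described in {A}
```

Here the point (1, 0, 0) in {B} is first rotated to (0, 1, 0) and then shifted by the translation, giving (10, 6, 0) in {A}.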

4. Experiment Platform and Equipment

As shown in Figure 3, a manipulator with six degrees of freedom and Camera 1 are fixed on the stationary frame of the platform; thus, the position of the manipulator relative to that of Camera 1 does not change. Camera 1 is fixed above the screw holes of the workpiece. Camera 2 is installed on the end effector of the manipulator and photographs the screw hole. The resolutions of Cameras 1 and 2 are 1280 × 960 and 1024 × 768 pixels, respectively. The focal length and aperture of the lenses are fixed during the experiment. An Arduino UNO microcontroller board is used to control a DC motor in the electric screwdriver mounted on the manipulator, and an L298N motor driver is used to drive the motor to tighten the screw. The actual diameter of the M1.4 screw hole is 1.1 mm; thus, the system has to identify a 1.1 mm hole using image processing. Fixing mechanisms were designed to hold the cameras in place. Table 1 summarizes the equipment used in this study.

5. Transformation between Camera 1 and Manipulator

The coordinates of the screw hole, calculated using image processing, must be transformed into the coordinate system of the manipulator through coordinate transformation. Therefore, the equation representing the transformation between Camera 1 and the manipulator is required. The coordinate systems of the manipulator and Camera 1 are defined as {A} and {C}, respectively, as shown in Figure 4. The origin of {A} is at the center of the robot's base, and the unit of measurement used in the coordinate system is millimeters. The origin of {C} is defined at the upper left corner of an image taken by Camera 1 (Figure 5); one unit of length in the X- and Y-directions corresponds to one pixel.

5.1. Transformation Formula

$$\begin{bmatrix} {}^{A}P \\ 1 \end{bmatrix} = \begin{bmatrix} {}^{A}_{C}R & {}^{A}P_{CORG} \\ 0\;\;0\;\;0 & 1 \end{bmatrix} \begin{bmatrix} {}^{C}P \\ 1 \end{bmatrix} \tag{6}$$
$$\begin{bmatrix} X_A \\ Y_A \\ Z_A \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} & X_0 \\ r_{21} & r_{22} & r_{23} & Y_0 \\ r_{31} & r_{32} & r_{33} & Z_0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X_C \\ Y_C \\ Z_C \\ 1 \end{bmatrix} \tag{7}$$
where APCORG denotes the translation vector from the origin of the Camera 1 coordinate system to the origin of the manipulator coordinate system, i.e., ( X 0 , Y 0 , Z 0 ).
A 4 × 4 matrix is used as the transformation matrix, of which 12 elements are unknown. The target height $Z_A$ is fixed at 253 and $Z_C$ is set to 0; thus, the formula can be simplified to the following equation with six unknown elements,
$$\begin{bmatrix} X_A \\ Y_A \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & X_0 \\ r_{21} & r_{22} & Y_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X_C \\ Y_C \\ 1 \end{bmatrix} \tag{8}$$

5.2. Measurement Data

At least six $(X_A, Y_A)$ and $(X_C, Y_C)$ data points are required to solve Equation (8) for the six unknowns. In the present experiment, the manipulator was manually controlled and the screw was aligned vertically above the screw holes to determine the $(X_A, Y_A)$ data. Each screw hole was aligned three times and the average coordinates were determined. For the same screw hole, if the error was larger than 0.1 mm, the datum was deleted and the measurement was repeated. The 0.1 mm threshold is an empirically chosen error bound; the M1.4 screw can still be tightened successfully within it. Image processing was used to determine $(X_C, Y_C)$.
It should be noted that increasing the number of measurement data points increases the accuracy of the least-squares fit. However, high accuracy is not required for the calibration of Camera 1, since fine-tuning is performed with Camera 2. If the experimental process must be sped up, the number of screw-hole detections can be halved. Furthermore, because the elements $r_{11}$–$r_{22}$ of the rotation matrix ${}^{A}_{C}R$ depend only on the rotation angle, Equation (8) effectively contains only three unknowns, and the calibration of Camera 1 can be performed with as few as three points measured on the workpiece. Table 2 lists the measurement data of twelve screw holes.

5.3. Calculation

The 12 data points in Table 2 are substituted into Equation (8) as follows,
$$\begin{bmatrix} X_{C1} & Y_{C1} & 1 \\ X_{C2} & Y_{C2} & 1 \\ \vdots & \vdots & \vdots \\ X_{C12} & Y_{C12} & 1 \end{bmatrix} \begin{bmatrix} r_{11} \\ r_{12} \\ X_0 \end{bmatrix} = \begin{bmatrix} X_{A1} \\ X_{A2} \\ \vdots \\ X_{A12} \end{bmatrix} \tag{9}$$
$$\begin{bmatrix} X_{C1} & Y_{C1} & 1 \\ X_{C2} & Y_{C2} & 1 \\ \vdots & \vdots & \vdots \\ X_{C12} & Y_{C12} & 1 \end{bmatrix} \begin{bmatrix} r_{21} \\ r_{22} \\ Y_0 \end{bmatrix} = \begin{bmatrix} Y_{A1} \\ Y_{A2} \\ \vdots \\ Y_{A12} \end{bmatrix} \tag{10}$$
The equations are simplified as follows,
$$C \begin{bmatrix} r_{11} \\ r_{12} \\ X_0 \end{bmatrix} = A_X \tag{11}$$
$$C \begin{bmatrix} r_{21} \\ r_{22} \\ Y_0 \end{bmatrix} = A_Y \tag{12}$$
Here, C is the matrix of the {C} data and AX and AY are the matrices of the {A} data. Using the 12 data points, the six unknowns are calculated using the least square method,
$$\begin{bmatrix} r_{11} \\ r_{12} \\ X_0 \end{bmatrix} = \left(C^{T}C\right)^{-1}C^{T}A_X \tag{13}$$
$$\begin{bmatrix} r_{21} \\ r_{22} \\ Y_0 \end{bmatrix} = \left(C^{T}C\right)^{-1}C^{T}A_Y \tag{14}$$
Next, the value of the transformation matrix T can be obtained,
$$T = \begin{bmatrix} r_{11} & r_{12} & X_0 \\ r_{21} & r_{22} & Y_0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} -0.0022 & 0.0955 & 383.1165 \\ 0.0914 & -0.0013 & -54.4697 \\ 0 & 0 & 1 \end{bmatrix} \tag{15}$$
Because the Z axis is fixed in advance, the third row of the matrix is simply [0 0 1] rather than forming part of a full rotation matrix as in Equation (1). In addition, $|r_{12}|$ is not exactly equal to $|r_{21}|$, which is due to measurement errors in the data $(X_A, Y_A)$ and $(X_C, Y_C)$. In the automatic process, the screw-hole workpiece is first moved to an arbitrary position; the coordinates of the screw hole are then determined using image processing, substituted into Equation (8), and automatically transformed into coordinates in the manipulator's coordinate system.
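As a cross-check of this calibration, the least-squares fit of Equation (8) can be reproduced from the twelve data pairs of Table 2. The sketch below uses NumPy rather than the authors' Visual Studio C++; the numeric values are transcribed from Table 2, and small differences from the published matrix are expected due to rounding in the table.

```python
import numpy as np

# (XC, YC) pixel coordinates from Camera 1 and (XA, YA) manipulator
# coordinates (mm) for the twelve screw holes, transcribed from Table 2
data = np.array([
    [921.3380,  772.7296, 455.00,  30.66],
    [195.9257,  749.3015, 454.85, -35.30],
    [635.7407,  676.7407, 446.46,   5.05],
    [1116.6790, 603.7736, 438.80,  48.70],
    [399.4202,  544.6489, 434.55, -16.70],
    [122.8602,  463.9617, 427.50, -42.10],
    [642.1467,  411.4099, 421.36,   5.30],
    [932.9725,  385.6583, 418.31,  31.58],
    [328.2440,  275.1014, 409.00, -23.21],
    [460.1701,  213.3160, 402.95, -11.51],
    [785.4467,  198.0349, 400.80,  17.86],
    [1091.8260, 152.6689, 395.57,  46.00],
])
XC, YC, XA, YA = data.T

# Design matrix C of Equations (9)-(12): each row is [XC_i, YC_i, 1]
C = np.column_stack([XC, YC, np.ones(len(XC))])

# Least-squares solutions of Equations (13)-(14)
row1, *_ = np.linalg.lstsq(C, XA, rcond=None)   # r11, r12, X0
row2, *_ = np.linalg.lstsq(C, YA, rcond=None)   # r21, r22, Y0

T = np.vstack([row1, row2, [0.0, 0.0, 1.0]])
residual = np.abs(C @ np.vstack([row1, row2]).T - np.column_stack([XA, YA]))
```

The fitted coefficients agree with Equation (15) in magnitude, and the residuals stay within a few millimeters, consistent with the manual-alignment accuracy described in Section 5.2.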

6. Transformation Equation between Camera 2 and Manipulator

Camera 2 photographs the screw hole from a fixed height. The coordinates of the screw hole, determined using image processing, must be transformed into the coordinate system of the manipulator through coordinate transformation. Therefore, the equation representing the transformation between the coordinate systems of Camera 2 and the manipulator must be calculated. The coordinate system of the manipulator is defined as {A}, whereas that of Camera 2 is defined as {C2}, as shown in Figure 6.
Coordinate system {C2} moves with the manipulator. The coordinate system {D} is defined as having the same $\hat{X}$ and $\hat{Y}$ directions as {C2} but a different origin, as shown in Figure 7. From its current position, the manipulator descends vertically until the screw touches the workpiece. The point at which the screw touches the workpiece is the origin of {D}; its (x, y) coordinates thus equal those of the manipulator. If the origin of {D} is at the center of the screw hole, the screw is considered to be aligned over the screw hole.

6.1. Transformation Formula

$${}^{A}P = {}^{A}_{D}R \, {}^{D}P + {}^{A}P_{DORG} \tag{16}$$
$$\begin{bmatrix} X_A \\ Y_A \\ Z_A \end{bmatrix} = \begin{bmatrix} \hat{X}_D\cdot\hat{X}_A & \hat{Y}_D\cdot\hat{X}_A & \hat{Z}_D\cdot\hat{X}_A \\ \hat{X}_D\cdot\hat{Y}_A & \hat{Y}_D\cdot\hat{Y}_A & \hat{Z}_D\cdot\hat{Y}_A \\ \hat{X}_D\cdot\hat{Z}_A & \hat{Y}_D\cdot\hat{Z}_A & \hat{Z}_D\cdot\hat{Z}_A \end{bmatrix} \begin{bmatrix} X_D \\ Y_D \\ Z_D \end{bmatrix} + \begin{bmatrix} x \\ y \\ z \end{bmatrix} \tag{17}$$
The 3 × 3 matrix is a rotation matrix, for which nine unknowns must be calculated. The objective ZA is fixed, with ZD set to 0; thus, the formula can be simplified to the following equation, which has four unknown elements. In the equation, (x, y) indicates the coordinates of the manipulator.
$$\begin{bmatrix} X_A \\ Y_A \end{bmatrix} = \begin{bmatrix} \hat{X}_D\cdot\hat{X}_A & \hat{Y}_D\cdot\hat{X}_A \\ \hat{X}_D\cdot\hat{Y}_A & \hat{Y}_D\cdot\hat{Y}_A \end{bmatrix} \begin{bmatrix} X_D \\ Y_D \end{bmatrix} + \begin{bmatrix} x \\ y \end{bmatrix} \tag{18}$$
In Figure 8, X ^ D is perpendicular to the dashed line and Y ^ D is collinear with the dashed line. Therefore, Equation (18) can be rewritten as the following equation, with unknowns (a, b, c, d),
$$\begin{bmatrix} X_A \\ Y_A \end{bmatrix} = \begin{bmatrix} \dfrac{y}{\sqrt{x^2+y^2}}\,a & \dfrac{x}{\sqrt{x^2+y^2}}\,b \\ \dfrac{x}{\sqrt{x^2+y^2}}\,c & \dfrac{y}{\sqrt{x^2+y^2}}\,d \end{bmatrix} \begin{bmatrix} X_D \\ Y_D \end{bmatrix} + \begin{bmatrix} x \\ y \end{bmatrix} \tag{19}$$

6.2. Measurement Data

In the present experiment, the manipulator was manually controlled and the screw manually aligned over each screw hole. Camera 2 took 12 pictures of each screw hole. The coordinates (XC2, YC2) of the screw holes were determined using these 12 pictures (Table 3). The average values indicate the correct position of the screw hole.
To speed up the process, the number of detections can be reduced. At least four $(X_A, Y_A)$ and $(X_D, Y_D)$ data points were required to solve for the four unknowns. The manipulator was manually controlled to align the screw at the No. 6 screw hole, after which it was moved 1 mm forward, backward, left, and right. Camera 2 took a picture in each of these positions, and the coordinates of the screw hole were calculated from these five pictures. The same steps were performed for the No. 12 screw hole. Table 4 and Table 5 list the measurement data for these two screw holes. For the No. 6 screw hole, (x, y) = (427.5, −42.1); for the No. 12 screw hole, (x, y) = (395.57, 46).

6.3. Calculation

Variables u and v are defined as follows,
$$u = \frac{x}{\sqrt{x^2+y^2}} \tag{20}$$
$$v = \frac{y}{\sqrt{x^2+y^2}} \tag{21}$$
The 10 data points in Table 4 and Table 5 were substituted into Equation (19),
$$\begin{bmatrix} X_{D1} & Y_{D1} \\ X_{D2} & Y_{D2} \\ \vdots & \vdots \\ X_{D10} & Y_{D10} \end{bmatrix} \begin{bmatrix} v\,a \\ u\,b \end{bmatrix} = \begin{bmatrix} X_{A1}-x_1 \\ X_{A2}-x_2 \\ \vdots \\ X_{A10}-x_{10} \end{bmatrix} \tag{22}$$
$$\begin{bmatrix} X_{D1} & Y_{D1} \\ X_{D2} & Y_{D2} \\ \vdots & \vdots \\ X_{D10} & Y_{D10} \end{bmatrix} \begin{bmatrix} u\,c \\ v\,d \end{bmatrix} = \begin{bmatrix} Y_{A1}-y_1 \\ Y_{A2}-y_2 \\ \vdots \\ Y_{A10}-y_{10} \end{bmatrix} \tag{23}$$
Variables u and v were moved to the left matrix:
$$\begin{bmatrix} v_1 X_{D1} & u_1 Y_{D1} \\ v_2 X_{D2} & u_2 Y_{D2} \\ \vdots & \vdots \\ v_{10} X_{D10} & u_{10} Y_{D10} \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} X_{A1}-x_1 \\ X_{A2}-x_2 \\ \vdots \\ X_{A10}-x_{10} \end{bmatrix} \tag{24}$$
$$\begin{bmatrix} u_1 X_{D1} & v_1 Y_{D1} \\ u_2 X_{D2} & v_2 Y_{D2} \\ \vdots & \vdots \\ u_{10} X_{D10} & v_{10} Y_{D10} \end{bmatrix} \begin{bmatrix} c \\ d \end{bmatrix} = \begin{bmatrix} Y_{A1}-y_1 \\ Y_{A2}-y_2 \\ \vdots \\ Y_{A10}-y_{10} \end{bmatrix} \tag{25}$$
The equations were simplified,
$$D_1 \begin{bmatrix} a \\ b \end{bmatrix} = A_X \tag{26}$$
$$D_2 \begin{bmatrix} c \\ d \end{bmatrix} = A_Y \tag{27}$$
The {D} data matrices are represented by D1 and D2. The {A} data matrices are represented by AX and AY. With the 10 data points, the four unknowns were solved using the least square method.
$$\begin{bmatrix} a \\ b \end{bmatrix} = \left(D_1^{T}D_1\right)^{-1}D_1^{T}A_X \tag{28}$$
$$\begin{bmatrix} c \\ d \end{bmatrix} = \left(D_2^{T}D_2\right)^{-1}D_2^{T}A_Y \tag{29}$$
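This least-squares step can likewise be reproduced from the ten measurements in Tables 4 and 5. The following NumPy sketch (not the authors' C++ implementation; numeric values transcribed from the tables) builds the design matrices of Equations (24)–(25) and solves Equations (28)–(29).

```python
import numpy as np

# Five measurements each around screw holes No. 6 and No. 12, transcribed
# from Tables 4 and 5: manipulator position (x, y), hole position (XD, YD)
# seen by Camera 2; (XA, YA) is the aligned manipulator position.
hole6 = dict(XA=427.5, YA=-42.1, rows=[
    (427.5, -42.1,   0.0,      0.0),
    (428.5, -42.1,  -4.0,    -38.5),
    (426.5, -42.1,   4.4524,  41.9232),
    (427.5, -43.1,  38.2689,  -4.5554),
    (427.5, -41.1, -45.0,      6.5),
])
hole12 = dict(XA=395.57, YA=46.0, rows=[
    (395.57, 46.0,   0.0,      0.0),
    (396.57, 46.0,   3.4782, -35.6662),
    (394.57, 46.0,  -1.9721,  40.8050),
    (395.57, 45.0,  40.6770,   3.3841),
    (395.57, 47.0, -45.3624,  -2.5046),
])

D1, D2, AX, AY = [], [], [], []
for hole in (hole6, hole12):
    for x, y, XD, YD in hole["rows"]:
        n = np.hypot(x, y)
        u, v = x / n, y / n                  # Equations (20)-(21)
        D1.append([v * XD, u * YD])          # row of Equation (24)
        D2.append([u * XD, v * YD])          # row of Equation (25)
        AX.append(hole["XA"] - x)
        AY.append(hole["YA"] - y)

# Least-squares solutions of Equations (28)-(29)
ab, *_ = np.linalg.lstsq(np.array(D1), np.array(AX), rcond=None)
cd, *_ = np.linalg.lstsq(np.array(D2), np.array(AY), rcond=None)
```

The recovered coefficients are of the same order of magnitude as those reported in Equation (30); exact agreement is not expected because the table values are rounded.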
The elements of rotation matrix R were then calculated,
$$R = \begin{bmatrix} \hat{X}_D\cdot\hat{X}_A & \hat{Y}_D\cdot\hat{X}_A \\ \hat{X}_D\cdot\hat{Y}_A & \hat{Y}_D\cdot\hat{Y}_A \end{bmatrix} = \begin{bmatrix} 0.0244\,\dfrac{y}{\sqrt{x^2+y^2}} & 0.0254\,\dfrac{x}{\sqrt{x^2+y^2}} \\ 0.0234\,\dfrac{x}{\sqrt{x^2+y^2}} & 0.0196\,\dfrac{y}{\sqrt{x^2+y^2}} \end{bmatrix} \tag{30}$$
The matrix is not perfectly symmetrical; the slight discrepancy arises because the manipulator's internal coordinate system is not a perfectly absolute reference.
Through the automatic process, the coordinates of the screw hole are solved using image processing. These coordinates and the current coordinates of the manipulator (x, y) can be substituted into Equation (19) and automatically transformed into coordinates in the manipulator’s coordinate system.
For real-time implementation, the parameter values of the mathematical model are those given in Equations (15) and (30). With these values, the transformations are executed in real time during the experiments.

7. Program and Experiment Process

The camera photographs the screw hole; the image is processed and the coordinates are then transformed through the HMI. The manipulator controller and the HMI communicate over an RS-232 serial link. The DC motor in the electric screwdriver is controlled by an Arduino UNO, which communicates with the HMI through a Universal Serial Bus connection.
Figure 9 displays the HMI, which has buttons and text boxes as well as an image of all the screw holes. First, the user randomly moves the screw hole workpiece. The workpiece should be in the shooting range of Camera 1. After Camera 1 photographs the workpiece, the image is processed using the methods described in Section 2: gray scale, Gaussian filter, bilateral filter, binarization, edge detection, and center of gravity. The system detects 12 screw holes. If the user chooses the No. 8 screw hole, the system calculates its coordinates and then transforms them into the coordinate system of the manipulator, with the resulting coordinates displayed by the HMI. When the user presses the button “Manipulator Start,” the manipulator moves to the intended position automatically, which in this case is a fixed height above the No. 8 screw hole.
After the manipulator has arrived at the calculated position, Camera 2 photographs the No. 8 screw hole. The image is processed, and the system detects the screw hole and calculates its coordinates, transforming them into the coordinate system of the manipulator. The HMI displays the coordinates, as shown in Figure 10. When the user presses the button “Manipulator Start,” the manipulator moves to the determined position automatically, so that the screw touches the No. 8 screw hole vertically. When the user presses the button “Screwdriver Start,” the dc motor in the electric screwdriver starts rotating, tightening the screw into the screw hole (Figure 11). When the user presses the button “Manipulator Return,” the manipulator returns to its initial position (Figure 12).
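The button sequence described above amounts to a coarse-then-fine positioning loop. The sketch below captures that control flow; all function names are hypothetical stand-ins (not the authors' API), and the hardware calls are stubbed so the sequence itself can be inspected.

```python
# Hypothetical stand-ins for the vision/hardware calls driven by the HMI
# buttons; the real system talks to the manipulator over RS-232 and to the
# Arduino-controlled screwdriver over USB.
log = []

def detect_hole_camera1():
    log.append("camera1_detect")
    return (642.15, 411.41)        # pixel coordinates, input to Eq. (8)

def coarse_move(xa, ya):
    log.append("coarse_move")      # move to a fixed height above the hole

def detect_hole_camera2():
    log.append("camera2_detect")
    return (2.1, -0.8)             # offset in {D}, input to Eq. (19)

def fine_move(dx, dy):
    log.append("fine_move")        # descend so the screw touches the hole

def tighten_screw():
    log.append("screwdriver_on")   # Arduino starts the DC motor

def return_home():
    log.append("return_home")

def tighten_one_hole():
    """Coarse positioning with Camera 1, fine-tuning with Camera 2,
    then tightening, mirroring the button sequence of Section 7."""
    xc, yc = detect_hole_camera1()
    coarse_move(xc, yc)            # transformed via Eq. (8) in practice
    xd, yd = detect_hole_camera2()
    fine_move(xd, yd)              # transformed via Eq. (19) in practice
    tighten_screw()
    return_home()

tighten_one_hole()
```

Running the stubbed loop records the six actions in the same order as the HMI button presses.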

8. Conclusions

The proposed approach tightens an M1.4 screw automatically. Digital image processing on a general-purpose computer takes about 0.5 s; the manipulator motion takes about 5 s in slow mode and about 2 s in fast mode. Compared with manual labor, an automatic screw-tightening manipulator is faster and its success rate does not degrade with fatigue. The success rate in this study is about 80%. Failures occur mainly because the screw is not properly seated on the screwdriver head, as it is held there only by the magnetized screw head. Since there is no other research related to tightening tiny screws, performance indicators for comparison with other studies cannot be provided. Future work will extend the scheme to curved, three-dimensional, car-body-like surfaces.

Author Contributions

Conceptualization, P.-C.T.; methodology, C.-K.L., J.-R.H. and P.-C.T.; investigation, S.-Y.C.; resources, C.-K.L., J.-R.H. and P.-C.T.; data curation, P.-C.T.; writing—original draft preparation, S.-Y.C. and P.-C.T.; writing—review and editing, J.-R.H. and C.-K.L.; project administration, J.-R.H. and C.-K.L.; funding acquisition, P.-C.T. and C.-K.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by the Ministry of Science and Technology (Taiwan) under contract numbers MOST 110-2218-E-008-008 and MOST 110-2221-E-008-089.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript or in the decision to publish the results.

References

  1. Brogårdh, T. Present and future robot control development—An industrial perspective. Annu. Rev. Control. 2007, 31, 69–79. [Google Scholar] [CrossRef]
  2. Asakawa, N.; Takeuchi, Y. Teachingless spray-painting of sculptured surface by an industrial robot. In Proceedings of the 1999 IEEE International Conference on Robotics and Automation (Cat. No.99CH36288C), Detroit, MI, USA, 10–15 May 1999. [Google Scholar]
  3. Choi, S.; Newman, W.S. Design and evaluation of a laser-cutting robot for laminated, solid freeform fabrication. In Proceedings of the 2000 ICRA. Millennium Conference. IEEE International Conference on Robotics and Automation. Symposia Proceedings (Cat. No.00CH37065), San Francisco, CA, USA, 24–28 April 2000. [Google Scholar]
  4. Smith, C.B. Robotic friction stir welding using a standard industrial robot. Keikinzoku Yosetsu/J. Light Met. Weld. Constr. 2004, 42, 40–41. [Google Scholar]
  5. Nagata, F.; Watanabe, K.; Kiguchi, K.; Tsuda, K.; Kawaguchi, S.; Noda, Y.; Komino, M. Joystick teaching system for polishing robots using fuzzy compliance control. In Proceedings of the 2001 IEEE International Symposium on Computational Intelligence in Robotics and Automation, Banff, AB, Canada, 29 July–1 August 2001. [Google Scholar]
  6. Zheng, Y.F.; Luh, J. Optimal load distribution for two industrial robots handling a single object. In Proceedings of the IEEE International Conference on Robotics and Automation, Philadelphia, PA, USA, 24–29 April 1988; pp. 344–349. [Google Scholar]
  7. Pitipong, S.; Pornjit, P.; Watcharin, P. An automated four-DOF robot screw fastening using visual servo. In Proceedings of the 2010 IEEE/SICE International Symposium on System Integration, Sendai, Japan, 21–22 December 2010. [Google Scholar]
  8. Schmid, A.J.; Gorges, N.; Goger, D.; Worn, H. Opening a door with a humanoid robot using multi-sensory tactile feedback. In Proceedings of the 2008 IEEE International Conference on Robotics and Automation, Pasadena, CA, USA, 19–23 May 2008; pp. 285–291. [Google Scholar]
  9. Prats, M.; Sanz, P.J.; Del Pobil, A.P. Vision-tactile-force integration and robot physical interaction. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 3975–3980. [Google Scholar]
  10. Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man Cybern. 2007, 9, 62–66. [Google Scholar] [CrossRef] [Green Version]
  11. Bliton, A.; Patton, M.; Rolli, M.; Roos, K.; Taylor, S. Microscopic motion analysis: Laplacian-of-Gaussian masks for subpixel edge detection. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, New Orleans, LA, USA, 4–7 November 1988; pp. 1098–1099. [Google Scholar]
  12. Zare, M.; Jampour, M.; Farrokhi, I.R. A heuristic method for gray images pseudo coloring with histogram and RGB layers. In Proceedings of the 2011 IEEE 3rd International Conference on Communication Software and Networks, Xi’an, China, 27–29 May 2011; pp. 524–527. [Google Scholar]
  13. Pratt, W.K. Digital Image Processing: PIKS Inside, 3rd ed.; John Wiley: New York, NY, USA, 2001. [Google Scholar]
  14. Wennersten, P.; Ström, J.; Wang, Y.; Andersson, K.; Sjoberg, R.; Enhorn, J. Bilateral filtering for video coding. In Proceedings of the 2017 IEEE Visual Communications and Image Processing (VCIP), St. Petersburg, FL, USA, 10–13 December 2017; pp. 1–4. [Google Scholar]
  15. Ma, Z.; Zhu, J.; Li, W.; Xu, H. Detection of point sources in X-ray astronomical images using elliptical Gaussian filters. In Proceedings of the 2017 2nd International Conference on Image, Vision and Computing (ICIVC), Chengdu, China, 2–4 June 2017; pp. 36–40. [Google Scholar]
Figure 1. Coordinate systems {A} and {B} share the same origin; thus, one is a rotation of the other.
Figure 2. Rotation and displacement between the coordinate systems {A} and {B}.
Figure 3. Experiment platform.
Figure 4. Positions of the coordinate systems {A} and {C}.
Figure 5. Processed image and the coordinate system {C}.
Figure 6. Positions of the coordinate systems {A} and {C2}.
Figure 7. Positions of the coordinate systems {C2} and {D}.
Figure 8. Relationship between the coordinate systems {A} and {D}.
Figure 9. A photograph is captured using Camera 1, and the image is processed on the HMI.
Figure 10. A photograph is captured using Camera 2, and the image is processed on the HMI.
Figure 11. A screw comes into contact with the workpiece and tightening begins.
Figure 12. Tightening of the screw is complete and the manipulator returns to its original position.
Table 1. Equipment specifications.
Equipment | Specification/Description
Manipulator | MITSUBISHI RV-2SD
CCD Camera 1 | SONY DFW-SX910
CCD Camera 2 | SONY XCD-X710
Camera lenses | 6–60 mm, F1.4
Image capture card | NI PCI-8254R
Microcontroller board | Arduino UNO
Drive circuit | L298N
Electric screwdriver | Neopower, 6 V, 200 rpm
Screw | M1.4 glasses screw
Screw hole workpiece | Twelve M1.4 screw holes
Lamp | LED
Fixing mechanisms | Made in house
Table 2. Camera 1 and manipulator measurement data.
No. of Screw Hole | XC | YC | XA | YA
1 | 921.3380 | 772.7296 | 455.00 | 30.66
2 | 195.9257 | 749.3015 | 454.85 | −35.30
3 | 635.7407 | 676.7407 | 446.46 | 5.05
4 | 1116.6790 | 603.7736 | 438.80 | 48.70
5 | 399.4202 | 544.6489 | 434.55 | −16.70
6 | 122.8602 | 463.9617 | 427.50 | −42.10
7 | 642.1467 | 411.4099 | 421.36 | 5.30
8 | 932.9725 | 385.6583 | 418.31 | 31.58
9 | 328.2440 | 275.1014 | 409.00 | −23.21
10 | 460.1701 | 213.3160 | 402.95 | −11.51
11 | 785.4467 | 198.0349 | 400.80 | 17.86
12 | 1091.8260 | 152.6689 | 395.57 | 46.00
Table 3. Measurement data from 12 pictures taken by Camera 2.
No. of Screw Hole | XC2 | YC2
1 | 348.5488 | 241.7073
2 | 349.4948 | 232.7661
3 | 357.0000 | 235.5000
4 | 354.3511 | 250.7480
5 | 357.8855 | 234.8419
6 | 360.0000 | 223.0000
7 | 358.5000 | 239.0000
8 | 355.8795 | 245.0656
9 | 349.0000 | 240.5000
10 | 348.5000 | 242.5000
11 | 347.3996 | 245.8686
12 | 337.9721 | 246.1950
Average | 352.0443 | 239.8077
Table 4. Measurement data for No. 6 screw hole.
Position | x | y | XC2 | YC2 | XD | YD
Aligned | 427.5 | −42.1 | 360.0000 | 223.0000 | 0.0000 | 0.0000
1 mm forward | 428.5 | −42.1 | 356.0000 | 184.5000 | −4.0000 | −38.5000
1 mm backward | 426.5 | −42.1 | 364.4524 | 264.9232 | 4.4524 | 41.9232
1 mm left | 427.5 | −43.1 | 398.2689 | 218.4446 | 38.2689 | −4.5554
1 mm right | 427.5 | −41.1 | 315.0000 | 229.5000 | −45.0000 | 6.5000
Table 5. Measurement data for No. 12 screw hole.
Position | x | y | XC2 | YC2 | XD | YD
Aligned | 395.57 | 46 | 337.9721 | 246.1950 | 0.0000 | 0.0000
1 mm forward | 396.57 | 46 | 341.4503 | 210.5288 | 3.4782 | −35.6662
1 mm backward | 394.57 | 46 | 336.0000 | 287.0000 | −1.9721 | 40.8050
1 mm left | 395.57 | 45 | 378.6491 | 249.5791 | 40.6770 | 3.3841
1 mm right | 395.57 | 47 | 292.6097 | 243.6904 | −45.3624 | −2.5046
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Chen, S.-Y.; Ho, J.-R.; Tung, P.-C.; Lin, C.-K. Automatically Tightening Tiny Screw Using Two Images and Positioning Control. Mathematics 2021, 9, 2521. https://doi.org/10.3390/math9192521

