Article

Calibrating Laser Three-Dimensional Projection Systems Using Binocular Vision

1 College of Automation, University of Science and Technology Beijing, Beijing 100083, China
2 Institute of Microelectronics of the Chinese Academy of Sciences, Beijing 100029, China
* Author to whom correspondence should be addressed.
Sensors 2023, 23(4), 1941; https://doi.org/10.3390/s23041941
Submission received: 9 December 2022 / Revised: 18 January 2023 / Accepted: 31 January 2023 / Published: 9 February 2023
(This article belongs to the Special Issue Advances in Optical Sensing, Instrumentation and Systems)

Abstract

A laser three-dimensional (3D) projection system is an auxiliary system in intelligent manufacturing that works together with a positioning system in practical applications. This study proposes a calibration method for laser 3D projection systems based on binocular vision. The role of the binocular vision positioning function in the calibration process was analyzed, and two calibration methods based on it were proposed: one simplifies the computational model of the galvo scanner, and the other uses data to solve the conversion relationship directly. The experimental calibration of the projection system was performed using the data-driven method. The experiment demonstrated the simplicity of the proposed calibration method and showed that it required less computation time. Moreover, the mean calibration error was 0.38 mm at a working distance of 1.8–2.2 m.

1. Introduction

Laser three-dimensional (3D) projection is one of the innovative uses of lasers, compensating for the limitations of manual processing. It is an auxiliary tool for the precise production of large-scale equipment. Based on vision theory, it produces the desired pattern or outline by directing the laser beam at the exact location on the projected target through a fast-moving deflection mechanism [1]. Laser 3D projection is increasingly used in manufacturing processes such as composite laying, pattern spraying, assembly guidance, and the welding of large-scale components with complex shapes in aerospace, shipbuilding, automobile manufacturing, and other large-scale equipment manufacturing industries [2]. Functioning as an auxiliary measurement and marking system, it enables intelligent production while minimizing errors and increasing productivity [3].
The laser deflection device is the core component of a laser 3D projection system; the system's operating speed and accuracy depend on the deflection speed and accuracy of this device. Compared with other deflection devices, the galvo scanner offers significant benefits in precision and speed owing to the galvanometers it uses for deflection [4], so it is the typical choice. To determine the relationship between the deflection values of the galvo scanner and the target coordinates, most laser 3D projection systems require manually adjusting the galvo scanner's deflection so that the laser is projected onto known coordinate points on the projected target. This process is the calibration of the laser 3D projection system [5].
Calibration is essential before operating the 3D laser projection system. Two typical approaches are used for calibrating the system: the first evaluates the physical model of the galvo scanner, establishes the galvo scanner coordinate system, and carries out the conversion between coordinate systems; the other uses a large amount of data to fit the nonlinear relationship between the input and output. A method for projecting graphic outlines with a laser projector was created by Rueb et al. in 1997 [6]. This method manually calibrates the laser projector to match a recognized object coordinate system; it is labor-intensive, and the production process of the galvo scanner influences the calibration accuracy. Some researchers have investigated simpler algorithms and auxiliary calibration tools to streamline the calibration procedure and enhance the usability of laser 3D projection systems. A laser projector that projects a 3D image onto an object was invented by Kaufman et al. [7]. The projector measures the distance between the instrument and the projected target using a time module connected to an optical module, adding this parameter to the calibration procedure. Cui et al. [8] applied the imaging principle of a camera to laser projection and used the camera imaging model to calibrate the projection system. Guo [9] proposed using a laser reflection device to acquire the galvo scanner's exit coordinates in 3D space, shortening the projection time and steps. These techniques simplify the calibration procedure to some extent; however, they do not address the need to repeat the calibration when the position of the target changes. In 2011, Rueb et al. [10] suggested using a laser template to deduce the projection target's location. Kaufman et al. [11] also developed a laser 3D projection motion tracking component, which enables the system to update its pose according to the target. These techniques address the issue of repeated calibration, but they make the overall structural layout of the system more complex. Tobias et al. [12] built a laser 3D projection system with a single camera, using Gaussian process (GP) regression, ridge regression, artificial neural networks (ANN), support vector regression (SVR), and other fitting algorithms to study the relationship between the driving values of the galvo scanner and image coordinates. This calibration approach does not require a complex physical model and is not affected by the system's structural design; it has been widely studied in recent years.
The visual measurement method offers convenient operation and simple positioning, making it suitable for calibrating laser 3D projection systems. Morden et al. [13] suggested that a corrected laser 3D scanning system may comprise one or more cameras set in a fixed orientation on the scanner for system calibration. Based on this, Qi et al. [14] proposed a stereo vision laser galvanometric scanning system and a system calibration method using plastic thin-film targets, and applied it to cutting duck feathers for badminton shuttle manufacture; however, its use scenario is relatively simple, and it must be recalibrated when the position changes. Tu et al. [15] proposed a calibration method that builds a neural network taking the digital control signals at the drives of the galvanometric laser scanning (GLS) system as input and the space vector of the corresponding outgoing laser beam as output. This paper improves on that neural network calibration method: it takes the set of coordinates in the target point cloud as input and the deflection values of the galvo scanner as output, and it supports dynamic adjustment of the input and output data. Combined with the pose relationship between the camera and the galvo scanner, this method achieves accurate calibration of the 3D laser projection system. It is simpler to use, less repetitive than the conventional approach, more precise and intelligent, and adaptable to various application scenarios.

2. Calibration Method Based on Binocular Vision

2.1. Positioning Theory for Binocular Vision

The binocular vision positioning theory is based on the geometric model of similar triangles in the binocular field of view. A mathematical model can be used to explain how the camera captures an image of a real object. The model comprises three coordinate systems: image, camera, and world. The two-dimensional (2D) image coordinate system lies in a plane identical to the physical imaging plane of the camera; the pixel layout determines the directions of the X and Y coordinate axes, with millimeters as the unit of measurement, and the center of the physical imaging plane serves as the origin. The origin of the camera coordinate system is the camera optical center O; the XC and YC axes are parallel to those of the image coordinate system, and the ZC axis coincides with the optical axis of the camera, perpendicular to the physical imaging plane. The world coordinate system is typically established by the experimenter; the location of its origin and axes can be selected arbitrarily, and it is usually placed on the measured object to facilitate description. Figure 1 shows the positional relationship of the three coordinate systems.
Assume a point (xc, yc, zc) in the camera coordinate system. After the camera captures the image, this point corresponds to a point (x′, y′) in the image coordinate system. The coordinate transformation relationship between the image coordinate system and the camera coordinate system can be expressed as Equation (1) [16]:
$$Z_c \begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix}, \tag{1}$$
where f is the focal length of the camera, determined during camera calibration, and Zc is a coefficient.
The conversion relationship between the coordinates (xc,yc,zc) of the camera coordinate system and coordinates (xn,yn,zn) of the world coordinate system can be expressed as Equation (2) [16]:
$$\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \\ 0^{T} & 1 \end{bmatrix} \begin{bmatrix} x_n \\ y_n \\ z_n \\ 1 \end{bmatrix}. \tag{2}$$
Equations (1) and (2) can be combined to obtain the camera imaging model, where R and t are the rotation matrix and translation vector of the coordinate system transformation, respectively, and ax, ay, u0, and v0 are the internal parameters of the camera. Writing (u, v) for the pixel coordinates of the image point, the imaging model of the camera can be expressed as Equation (3) [16]:
$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} a_x & 0 & u_0 & 0 \\ 0 & a_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & t \\ 0^{T} & 1 \end{bmatrix} \begin{bmatrix} x_n \\ y_n \\ z_n \\ 1 \end{bmatrix}. \tag{3}$$
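As an illustration of this imaging chain, the following minimal numpy sketch projects a world point to pixel coordinates using Equations (2) and (3); the intrinsic and extrinsic values are assumed for the example, not taken from the cameras used in this study:

```python
import numpy as np

# Intrinsic matrix built from (ax, ay, u0, v0) as in Equation (3); the numeric
# values here are illustrative assumptions, not the paper's camera parameters.
K = np.array([[800.0,   0.0, 320.0, 0.0],
              [  0.0, 800.0, 240.0, 0.0],
              [  0.0,   0.0,   1.0, 0.0]])

# Extrinsics [R t; 0 1] from Equation (2); identity rotation and a small
# translation are assumed for the example.
Rt = np.eye(4)
Rt[:3, 3] = [0.1, 0.0, 2.0]

def project(world_point):
    """Map a world point to pixel coordinates via Equation (3)."""
    p_h = np.append(world_point, 1.0)   # homogeneous world coordinates
    uvw = K @ Rt @ p_h                  # equals Zc * [u, v, 1]^T
    return uvw[:2] / uvw[2]             # divide out Zc

print(project([0.05, -0.02, 0.0]))      # pixel coordinates of an example point
```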
When the binocular vision system uses two cameras to capture photographs of the same target from different angles, the difference between the two images encodes the three-dimensional information of the object. Disparity can be calculated when the two image planes are completely coplanar and their rows are aligned; achieving this is known as binocular stereoscopic correction, which applies a perspective transformation to the images. After stereoscopic correction, the three-dimensional coordinates of a space point can be solved using the physical model shown in Figure 2: P is a point on the object, Z is the distance from point P to the focal plane, and f is the focal length. Pl and Pr are the projection points of P on the focal planes of the left and right cameras, with abscissas Xl and Xr, respectively, and T is the distance between the optical centers of the two cameras.
Based on the geometric relationship of similar triangles, the spatial coordinates (x/w, y/w, z/w) of point P in the camera coordinate system can be expressed as Equation (4) [17].
$$\begin{bmatrix} x \\ y \\ z \\ w \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 & -s u_0 \\ 0 & 1 & 0 & -s v_0 \\ 0 & 0 & 0 & f \\ 0 & 0 & 1/T & 1 \end{bmatrix} \begin{bmatrix} X_r \\ Y_r \\ d \\ 1 \end{bmatrix}, \tag{4}$$
where (Xr, Yr) are the coordinates of the projected point of P in the stereo-corrected image, s is the pixel size in millimeters, d = Xl − Xr is the disparity between the coordinates of the two cameras, and w is the scale factor.
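A minimal sketch of applying Equation (4) follows; the focal length, pixel size, principal point, and baseline are assumed example values rather than the experimental parameters:

```python
import numpy as np

# Reprojection matrix of Equation (4). f, s, u0, v0, and the baseline T are
# illustrative assumptions (units: meters), not the experimental values.
f, s, u0, v0, T = 8e-3, 4.8e-6, 320.0, 240.0, 0.12
Q = np.array([[1.0, 0.0, 0.0,     -s * u0],
              [0.0, 1.0, 0.0,     -s * v0],
              [0.0, 0.0, 0.0,      f     ],
              [0.0, 0.0, 1.0 / T,  1.0   ]])

def triangulate(Xr, Yr, d):
    """Recover P = (x/w, y/w, z/w) from a rectified image point and disparity d."""
    x, y, z, w = Q @ np.array([Xr, Yr, d, 1.0])
    return np.array([x, y, z]) / w

print(triangulate(Xr=1.2e-3, Yr=0.8e-3, d=5.0e-4))   # example rectified point [m]
```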
Any known coordinate point in the world coordinate system can be converted to the camera coordinate system using the aforementioned method. The unique conversion relationship between the world and camera coordinate systems can be solved if four pairs of matching coordinates in the two coordinate systems are known. The correspondence between the deflection values of the galvo scanner and the coordinates of the projected target is built on this conversion relationship.
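The following sketch shows one way such a conversion could be solved from matched point pairs, here as a least-squares affine fit in numpy; the helper name and the sample points are illustrative assumptions:

```python
import numpy as np

def solve_world_to_camera(world_pts, cam_pts):
    """Least-squares affine transform mapping world points onto camera points.

    world_pts, cam_pts: (N, 3) arrays of matched coordinates with N >= 4.
    Returns a 3x4 matrix M such that cam ~ M @ [world; 1].
    """
    N = world_pts.shape[0]
    A = np.hstack([world_pts, np.ones((N, 1))])        # homogeneous coordinates
    M_t, *_ = np.linalg.lstsq(A, cam_pts, rcond=None)  # solves A @ M^T = cam
    return M_t.T

# Four matched pairs; the camera frame here is a pure translation of the world
# frame, purely for illustration.
world = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
cam = world + np.array([0.1, -0.2, 2.0])
M = solve_world_to_camera(world, cam)
```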

2.2. Calibration Using Binocular Vision

The relationship between the camera and the projected target can be determined using the positioning function of binocular vision described in Section 2.1. To complete the calibration of the laser 3D projection system, the pose relationship between the camera and the galvo scanner must also be solved. Two methods for solving the relationship between the camera coordinates and the deflection values of the galvo scanner are studied below.

2.2.1. Simplifying the Computational Model of Galvanometers Using Binocular Vision

The galvo scanner structure can be simplified into the physical model shown in Figure 3. The digital circuit converts the digital signals dx and dy into voltage signals Vx and Vy, which drive the motors to deflect by a horizontal angle θx and a pitch angle θy; the mirror fixed on each motor deflects by the same angle, acting as a galvanometer. A galvo scanner coordinate system is defined with the center of mirror 2 as the origin, the Xg axis parallel to the rotation axis of motor 2, and the Zg axis parallel to the rotation axis of motor 1. After the laser is emitted from the transmitter and deflected twice, it can theoretically be projected to any position (x, y, z) in the front space. The relationship between the deflection values of the galvanometers and the coordinates can be expressed as Equation (5):
$$\begin{cases} z = d_p, \\ y = d_p \tan 2\theta_y, \\ x = e \tan 2\theta_x + d_p \tan 2\theta_x / \cos 2\theta_y, \end{cases} \tag{5}$$
where e is the distance between the central axes of the two mirrors, determined during galvo scanner installation, and dp is the Z-axis coordinate in the galvo scanner coordinate system, typically obtained from the laser exit time tc or measured with a laser reflection device.
Using a binocular camera simplifies solving for the coordinate value z. When the hardware platform of the laser 3D projection system is installed, the binocular camera is mounted parallel to the XgOZg plane of the galvo scanner coordinate system and fixed relative to it; the plane to be projected is placed parallel to the XgOYg plane. The deflection values of the two galvanometers are first set to zero, so the current exit coordinates in the galvo scanner coordinate system are (0, 0, z). The galvo scanner is then deflected by an arbitrary angle θy. Because the projection plane is parallel to the XgOYg plane, the Zg-axis coordinate does not change, and the exit coordinate in the galvo scanner coordinate system becomes (0, y, z). As shown in Figure 4, when the binocular camera is installed parallel to the XgOZg plane, its positioning function can be used to solve the coordinate z directly: the coordinate value y in the galvanometer coordinate system equals the corresponding coordinate value in the camera coordinate system, and in this case z = y / tan 2θy.
After the coordinate value z and deflection angles θx and θy of the galvo scanner are determined, arbitrary deflections of the galvo scanner yield precise coordinates (x, y, z) on the projected target through Equation (5). This method is used to obtain four groups of corresponding coordinates in the galvo scanner and camera coordinate systems, determine the relationship between them, and solve the affine transformation matrix.
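A small sketch of the forward model in Equation (5); the angles, mirror spacing e, and plane distance dp below are assumed values for illustration:

```python
import numpy as np

def galvo_forward(theta_x, theta_y, e, dp):
    """Exit coordinates (x, y, z) in the galvo scanner frame per Equation (5).

    theta_x, theta_y: mirror deflection angles in radians; e is the distance
    between the mirrors' central axes and dp the Z-coordinate of the target
    plane. The numeric values below are assumptions for illustration only.
    """
    z = dp
    y = dp * np.tan(2.0 * theta_y)
    x = e * np.tan(2.0 * theta_x) + dp * np.tan(2.0 * theta_x) / np.cos(2.0 * theta_y)
    return x, y, z

print(galvo_forward(np.deg2rad(2.0), np.deg2rad(3.0), e=0.02, dp=2.0))
```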
This physical-model analysis of the scanning galvanometer assumes ideal hardware installation conditions. In reality, when the hardware platform of the 3D projection system is designed, whether the laser source strikes the center of the mirror perpendicularly and whether the central axes of the two mirrors are perpendicular to each other significantly influence the calibration result. Therefore, the method described above is only suitable for certain specific scenarios.

2.2.2. Solving the Calibration Directly Using a Data-Driven Approach

When calibrating the system using binocular vision, the data can be used directly to solve the conversion relationship between the deflection values of the galvo scanner and the coordinates of the camera coordinate system. Fitting their relationship with least squares is a straightforward method. Given N groups of different deflection values of the galvo scanner,
$$D = \{ (d_{xi}, d_{yi}) \mid i = 1, 2, \ldots, N \}, \tag{6}$$
and the corresponding coordinates of camera coordinate system,
$$V = \{ (x_i, y_i, z_i) \mid i = 1, 2, \ldots, N \}, \tag{7}$$
the least-squares solution of the conversion relationship H between the two can be determined using Equation (8):
$$H = (V^{T} V)^{-1} V^{T} D. \tag{8}$$
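A minimal numpy sketch of the fit in Equation (8), using synthetic stand-in data for V and D:

```python
import numpy as np

# Synthetic stand-ins: V holds N camera-frame coordinates, D the matching galvo
# deflection values; a linear ground truth is assumed purely for illustration.
rng = np.random.default_rng(0)
V = rng.uniform(-0.5, 0.5, size=(100, 3))
D = V @ rng.normal(size=(3, 2))

# Equation (8): H = (V^T V)^(-1) V^T D, the ordinary least-squares solution.
H = np.linalg.solve(V.T @ V, V.T @ D)
# Equivalent and numerically safer: H, *_ = np.linalg.lstsq(V, D, rcond=None)
```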
Although the least-squares method is straightforward, it has several limitations: it is susceptible to outlier interference, and the relationship between the model input and output is nonlinear, whereas the least-squares fit above is linear. When the z coordinate is fixed, the laser is deflected by the galvo scanner and projected onto a plane. Substituting z = d into Equation (5) and eliminating θy gives the relationship between the deflection values of the galvo scanner and the coordinates of the camera coordinate system:
$$\left( \frac{x}{\tan 2\theta_x} - e \right)^{2} - y^{2} = d^{2}. \tag{9}$$
Equation (9) shows that when the galvo scanner is driven linearly, the projected figure is bounded by hyperbolic edges rather than forming an ideal rectangle, as shown in Figure 5. The same kind of relationship holds when the galvo scanner works in 3D space; therefore, using linear techniques such as the least-squares method to solve the conversion directly is inappropriate.
A neural network is a suitable direct solution technique with good nonlinear function approximation capabilities. The single hidden layer neural network shown in Figure 6 can approximate the nonlinear relationship accurately [18]. For the calibration model in this study, the input and output dimensions were three and two, respectively, so no complex network model is required: a single hidden layer suffices to determine the conversion between the deflection values of the galvo scanner and the coordinates of the camera coordinate system quickly and accurately.
In the calibration model, the numbers of neurons in the input and output layers were three and two, respectively. Assuming that there are N groups of different coordinate values as inputs v_i, the network output can be expressed as follows [15]:
$$f(v_i) = \sum_{j=1}^{L} \beta_j \, G(\omega_j, b_j, v_i), \quad i = 1, \ldots, N, \tag{10}$$
where L is the number of neurons in the hidden layer, typically set equal or close to the number of samples N to obtain a better fit; ω and b represent the weights and biases from the input layer to the hidden layer, respectively; β represents the weights from the hidden layer to the output layer; and G is the activation function.
To simplify the solution, the input–output relationship of the network is written as D = Hβ, where H is given by Equation (11):
$$H = H(\omega_1, \ldots, \omega_L, b_1, \ldots, b_L, v_1, \ldots, v_N). \tag{11}$$
When the number of neurons in the hidden layer is sufficiently large (greater than N) and ω and b in H are randomly selected, where ω is a set of random values with a mean of 1 and b is a set of random values with a mean of 0, the sample points in the interval can be interpolated arbitrarily; that is, the network can approximate any sample point with zero error. Moreover, the matrix H must be invertible in the generalized sense; that is, it must have an inverse or a pseudo-inverse. The least-squares solution of β can then be obtained using Equation (12):
$$\beta = H^{+} D, \tag{12}$$
where H⁺ is the Moore–Penrose generalized inverse of H. The known random values of ω and b and the calculated β determine the mapping relationship. This method follows the extreme learning machine (ELM) algorithm and computes the least-squares solution after randomly selecting the weights and biases between the input and hidden layers. The smallest empirical risk is obtained conveniently, and new data can be added at any time. The solution is also the minimum-norm solution among all least-squares solutions, as shown in Equation (13) [1]:
$$\| \beta_0 \| = \| H^{+} D \| \leq \| \beta \|, \qquad \beta \in \left\{ \beta : \| H\beta - D \| \leq \| H\zeta - D \|, \ \forall \zeta \in \mathbb{R}^{L \times N} \right\}. \tag{13}$$
This results in a good network generalization performance.
When the camera coordinate system coordinates are given, the deflection values of the galvo scanner can be obtained by solving the network output, and the conversion relationship can be determined through the neural network.
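A compact sketch of this ELM-style solution is given below; the helper names (elm_fit, elm_predict), random seed, and distribution shapes are assumptions of this sketch, not the authors' implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def elm_fit(V, D, L, seed=0):
    """Fit deflection values D (N, 2) from coordinates V (N, 3) in ELM fashion.

    omega and b are drawn randomly and kept fixed (means of 1 and 0, as in the
    text); only beta is solved, via the Moore-Penrose pseudo-inverse of
    Equation (12). Distribution shapes are assumptions of this sketch.
    """
    rng = np.random.default_rng(seed)
    omega = rng.normal(loc=1.0, size=(V.shape[1], L))  # input-to-hidden weights
    b = rng.normal(loc=0.0, size=L)                    # hidden-layer biases
    H = sigmoid(V @ omega + b)                         # (N, L) hidden output
    beta = np.linalg.pinv(H) @ D                       # Equation (12): beta = H+ D
    return omega, b, beta

def elm_predict(V, omega, b, beta):
    """Network output: drive values for camera-frame coordinates V (N, 3)."""
    return sigmoid(V @ omega + b) @ beta
```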
Combining the methods in Section 2.1 and Section 2.2.2 yields the complete calibration process for the laser 3D projection system. The block-level diagram of the calibration is shown in Figure 7.

3. Projection System Calibration Experiment and Result Analysis

3.1. Projection System Calibration Experiment

The calibration process of the laser 3D projection system comprises the following steps:
  • The relative position of the camera and the galvo scanner in the laser 3D projection device remains unchanged. A set of drive values is predetermined to deflect the laser by an angle such that it is projected within the working range (1.8–2.2 m in this experiment) to generate a spot.
  • The laser 3D projection system is adjusted such that the binocular camera can clearly obtain the laser spot image. Moreover, the internal and external parameters of the binocular camera are calibrated. The positioning function of binocular vision is used to obtain the spatial coordinates of the spot and match the deflection values of the galvo scanner.
  • Steps one and two are repeated to obtain the 3D coordinates of multiple groups of different outgoing laser spots and matching deflection values of the galvo scanner. The neural network is used to solve the relationship between the coordinates of the camera coordinate system and deflection values of the galvanometer.
  • The position of the projected target with marking points is fixed, and a world coordinate system is established. The positioning function of binocular vision is used to obtain the coordinates of the four marking points on the projected target in the camera coordinate system. The transformation matrix between the two coordinate systems is solved after determining the coordinates of the four points.
  • The corresponding relationship between the deflection values of the galvo scanner and coordinates of the coordinate system of the object to be projected can be obtained by combining steps three and four. Thereafter, the system calibration is completed.
The relationship between the camera coordinate system coordinates and the deflection values of the galvo scanner needs to be solved only once, because the positional relationship between the galvo scanner and the binocular camera is fixed. Afterwards, only the positioning function of binocular vision is required to determine the conversion relationship between the camera coordinate system and the coordinate system of the projected target when calibrating the laser 3D projection system. Figure 8 shows the flow chart of the calibration procedure.
A metal plate, shown in Figure 9, was custom-made to verify the calibration accuracy during the calibration experiment of the laser 3D projection system. The plate has four holes used as the targets to be located; it reflects well, and the positions of the projected laser points can be measured and collected conveniently. Multiple measurements with vernier calipers and other measuring tools confirmed that the machining error of the metal plate was less than 0.005 mm, satisfying the requirements. A laser 3D projection system was designed to strictly meet the calibration conditions, as shown in Figure 10a. The system comprises two modules: visual positioning and laser projection. The visual positioning module comprises two cameras; two identical sets of cameras and lenses were used to keep the focal length and other lens parameters as consistent as possible, which helps satisfy the requirements of the binocular vision positioning principle discussed in Section 2.1 and improves the calibration accuracy. The installation position of the binocular camera is fixed relative to that of the galvo scanner. The binocular camera was accurately calibrated before use, and the influence of distortion on the camera imaging was eliminated. The laser projection module comprises a galvo scanner, a galvo scanner control card, and other optics. The galvo scanner is controlled by the control card connected to the computer. The laser transmitter emits light through the collimating beam expander, dynamic focus lens, and galvo scanner, so that millimeter-level laser beams can be projected onto targets at different distances. Figure 10b shows the physical diagram of the system.
A program was created that projects a 55 × 50 dot matrix of laser points onto three metal plates at a time. The position of the metal plates was moved freely four times within the working range of 1.8–2.2 m, and 11,000 sets of data were collected. A network model with L = 12,000 hidden layer neurons and penalty factor C = 5000 was used to train on the data. Computing was performed on an ASUS laptop with an AMD Ryzen 5 4600H CPU and an NVIDIA GeForce GTX 1660 Ti graphics card. Following the discussion in Section 2.2.2, the sigmoid function was used as the activation function. The weights ω and biases b from the input layer to the hidden layer and the weights β from the hidden layer to the output layer were recorded. When the 3D vector X = (X, Y, Z) of the coordinates of the camera coordinate system is known, the 2D vector T of the drive values of the galvo scanner can be obtained using Equation (14):
$$T = (t_x, t_y) = \frac{1}{e^{-(X \cdot \omega + b)} + 1} \, \beta. \tag{14}$$
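As a runnable illustration of Equation (14), the following sketch maps one camera-frame coordinate to drive values; the network parameters here are random stand-ins for the recorded ω, b, and β:

```python
import numpy as np

# Applying Equation (14): map a designed coordinate to galvo drive values with
# already-solved network parameters. The parameters below are random stand-ins
# for the recorded omega, b, and beta, purely to make the sketch runnable.
rng = np.random.default_rng(1)
omega = rng.normal(loc=1.0, size=(3, 50))
b = rng.normal(size=50)
beta = rng.normal(size=(50, 2))

X = np.array([0.10, -0.05, 2.00])                      # camera-frame (X, Y, Z)
T = (1.0 / (1.0 + np.exp(-(X @ omega + b)))) @ beta    # T = (t_x, t_y)
print(T)
```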

3.2. Results Analysis

The calibration results were verified and evaluated. A set of galvo scanner drive values (tx, ty) and the corresponding camera-coordinate-system coordinates (X, Y, Z) of the laser spots projected under these values onto the metal plate served as the test set. For example, consider a designed coordinate point (xp, yp, zp) in the point cloud of the shape “3” formed from the CAD digital model file in Figure 11a. The deflection values of the galvo scanner (txr, tyr) at this coordinate can be obtained from the calibration result using Equation (15):
$$(t_{xr}, t_{yr}) = \frac{1}{e^{-\left( (x_p, y_p, z_p) \cdot \omega + b \right)} + 1} \, \beta. \tag{15}$$
Driving the galvo scanner with the solved (txr, tyr) generates a laser spot on the projection target, as shown in Figure 11b. The positioning function of the binocular camera solves the coordinates of the spot (xr, yr, zr) in the camera coordinate system. The distance d between the 3D coordinates (xp, yp, zp) designed in CAD and the actual projected 3D coordinates (xr, yr, zr) is used as the system error, as shown in Figure 11c:
$$d = \sqrt{ (x_r - x_p)^2 + (y_r - y_p)^2 + (z_r - z_p)^2 }. \tag{16}$$
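A one-line numpy rendering of this error metric, with illustrative coordinates:

```python
import numpy as np

# Equation (16): Euclidean deviation between the CAD-designed coordinates and
# the measured projected coordinates; the values here are illustrative.
designed = np.array([[0.1000, -0.0500, 2.0000]])   # (x_p, y_p, z_p)
measured = np.array([[0.1003, -0.0498, 2.0004]])   # (x_r, y_r, z_r)
d = np.linalg.norm(measured - designed, axis=1)    # per-point error
```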
A total of 1000 designed coordinate points were selected, and the above process was repeated to obtain 1000 error values. Figure 12 shows the error for a test set of Nt = 1000. The maximum and average deviations of the projected laser spots were 0.87 mm and 0.37 mm, respectively.
Because a single hidden layer neural network is used to solve the conversion between the coordinates of the camera coordinate system and the deflection values of the galvo scanner, the error in establishing the relationship between the scanning galvanometer and camera coordinate systems is mainly related to the solution process of the neural network. With the activation function fixed, the training data volume was presumed to influence the calibration accuracy. To test this hypothesis, training sets of different sizes were selected, with appropriate numbers of hidden layer neurons, for system calibration, and the same test set was used to analyze the calibration error. Table 1 shows the mean deviations, maximum deviations, and computing times.
Table 1 shows that increasing the training set reduces the deviation, but with diminishing returns, while the computing time grows quadratically, as shown in Figure 13. Since the system design usually requires a mean deviation of less than 0.5 mm, a training set of 11,000 samples is an appropriate choice. In actual applications, the amount of data can be increased or reduced according to the precision needed.
The calibration errors at working distances of 1.8 m, 1.9 m, 2 m, 2.1 m, and 2.2 m were determined using the same test set (Nt = 200) to examine the relationship between the calibration effect and the projection distance. Table 2 shows the calibrated mean and maximum deviations. The results show that the calibration accuracy and the working distance were not correlated.
Table 3 compares the proposed method with the calibration method based on establishing a physical model and with the binocular vision calibration method based on the coordinate system. The physical-model calibration method, such as that shown in Figure 3, uses Equation (5) to solve the laser output coordinates (x, y, z), and laser trackers or other laser positioning instruments to obtain the corresponding coordinates in the world coordinate system. In this method, 12 × 12 reflective points are usually selected to form a lattice, and the conversion relationship is solved using the Newton iterative algorithm, the Levenberg–Marquardt algorithm, or the hybrid particle swarm algorithm [9]. The instrument design of this method is relatively complicated, and systematic errors arise from the difference between the galvo scanner as manufactured and its ideal parameters. The calibration time is shorter with this method, but the system needs recalibration whenever the position of the target to be projected changes. This is the standard system calibration method.
The binocular vision calibration method based on the coordinate system likewise establishes the galvo scanner coordinate system and determines the relationship between the scanning galvanometer and camera coordinate systems; it is therefore also affected by the manufacturing process of the galvo scanner. The proposed method uses binocular cameras for the calibration and directly determines the conversion relationship between the driving values of the galvanometer and the coordinates of the camera coordinate system without establishing the galvo scanner coordinate system. In addition, its instrument design is simpler. When the position of the target to be projected changes, the proposed method uses the binocular camera to locate the target, simplifying the operation.

4. Conclusions

This study explored a calibration method for a laser 3D projection system based on binocular vision. First, the positioning principle of binocular vision was analyzed: the basic process of binocular vision positioning was briefly described, and the necessity of the binocular vision positioning function for system calibration was explained. Accordingly, two calibration methods for laser 3D projection systems based on the binocular vision positioning function were proposed. One uses binocular vision to simplify the mathematical model of the relationship between the deflection values of the galvo scanner and the coordinates. The other is based on corresponding data of the camera-coordinate-system coordinates and the deflection values of the galvo scanner, and uses a neural network to solve the relationship between them. Finally, this study used the second method to perform experiments and analyze the results. The experimental results show that the proposed method improves calibration accuracy and convenience compared with traditional methods and exhibits a relatively good calibration effect.

Author Contributions

Conceptualization, D.L. and C.G.; methodology, Y.W.; software, Y.W.; validation, D.L. and F.W.; formal analysis, D.L. and C.G.; data curation, Y.W. and F.W.; writing—original draft preparation, Y.W.; writing—review and editing, D.L.; project administration, D.L.; funding acquisition, D.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key Research and Development Program of China (Displacement Sensing Technology of Large-Format Structured Light Field), grant number 2021YFB3200202.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, M. Research on Machine-Vision-Aided Three Dimensional Laser Scan Indication. Master’s Thesis, Nanjing University of Aeronautics and Astronautics, Nanjing, China, 2017. [Google Scholar]
  2. Kaufman, S.; Savikovsky, A.; Chagaris, C. Laser Projection with Object Feature Detection. U.S. Patent 7,306,339, 11 December 2007. [Google Scholar]
  3. Liu, M. Research on Key Technique and Application of the Precise 3D Optical Projecting Indication System. Ph.D. Thesis, Tianjin University, Tianjin, China, 2016. [Google Scholar]
  4. Li, D.; Deng, L.; Duan, J.; Zheng, X.; You, X. Study on large scale 3-D laser dynamic scanning manufacturing systems. Laser Technol. 2016, 40, 466–471. [Google Scholar]
  5. Chen, Y. Research on Autonomous Scanning Laser Projection Technology Integrated with Photogrammetry. Master’s Thesis, Changchun University of Science and Technology, Changchun, China, 2021. [Google Scholar]
  6. Rueb, K.; Bordignon, R.; Wieczorek, J. 3D Imaging Using a Laser Projector. U.S. Patent 5,661,667, 26 August 1997. [Google Scholar]
  7. Kaufman, S.; Savikovsky, A. Apparatus and Method for Projecting a 3D Image. U.S. Patent 6,547,397, 14 April 2003. [Google Scholar]
  8. Cui, S.; Zhu, X.; Wang, W.; Xie, Y. Calibration of a Laser Galvanometric Scanning System by Adapting a Camera Model. Appl. Opt. 2009, 48, 2632–2637. [Google Scholar] [PubMed]
  9. Guo, L. Research on Intelligent Laser 3D Projection System and Calibration Technique for Spatial Positioning Accuracy. Ph.D. Thesis, Changchun University of Science and Technology, Changchun, China, 2018. [Google Scholar]
  10. Rueb, K.; Jackson, V.; Morden, J. Laser Projection Systems and Methods. U.S. Patent 7,986,417, 26 July 2011. [Google Scholar]
  11. Kaufman, S.; Mettinen, K.; Mohazzab, M. 3D Laser Projection, Scanning and Object Tracking. U.S. Patent 9,826,207, 21 November 2017. [Google Scholar]
  12. Tobias, W.; Benjamin, W.; Patrick, S. Data-Driven Learning for Calibrating Galvanometric Laser Scanners. IEEE Sens. J. 2015, 15, 5709–5716. [Google Scholar]
  13. Morden, J.; Rueb, K. Laser Projection System, Intelligent Data Correction System and Method. U.S. Patent 7,463,368, 9 December 2008. [Google Scholar]
  14. Qi, L.; Zhang, Y.; Wang, S.; Tang, Z.; Yang, H.; Zhang, X. Laser cutting of irregular shape object based on stereo vision laser galvanometric scanning system. Opt. Lasers Eng. 2015, 68, 180–187. [Google Scholar]
  15. Tu, J.; Zhang, L. Effective data-driven calibration for a galvanometric laser scanning system using binocular stereo vision. Sensors 2018, 18, 197. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  16. Moreno-Noguer, F.; Lepetit, V.; Fua, P. EPnP: Accurate Non-Iterative O(n) Solution to the PnP Problem. In Proceedings of the IEEE 11th International Conference on Computer Vision, Rio de Janeiro, Brazil, 14–21 October 2007; pp. 1–8. [Google Scholar]
  17. Dhond, U.; Aggarwal, J. Structure from stereo-a review. Syst. Man Cybern. IEEE Trans. 1989, 19, 1489–1510. [Google Scholar]
  18. McCulloch, W.; Pitts, W. A Logical Calculus of the Ideas Immanent in Nervous Activity; Springer: Berlin/Heidelberg, Germany, 1986; Volume 52, pp. 229–230. [Google Scholar]
Figure 1. Position diagram of the coordinate systems in the camera imaging model.
Figure 2. Physical model of binocular vision localization.
Figure 3. Physical model diagram of the 2D galvo scanner. The green lines indicate the laser emitted along the coordinate axis, and the red lines indicate the laser emitted along an arbitrary direction.
Figure 4. Mapping of camera and galvanometer coordinates.
Figure 5. Schematic of the nonlinear relationship between galvanometer driving values and projection coordinates.
Figure 6. Schematic of a single hidden layer neural network.
Figure 7. Block-level diagram of the calibration.
Figure 8. Calibration flow chart based on binocular vision.
Figure 9. Calibration metal plate.
Figure 10. (a) System hardware design schematic diagram and (b) actual laser 3D projection system.
Figure 11. (a) The coordinates to be projected from the CAD model, in the shape of the number “723”; (b) the actual projected coordinates, like the number “3”; and (c) system error diagram.
Figure 12. Laser spot error distribution of the calibration result points (average of every 20 groups).
Figure 13. (a) Relationship between the training data volume and the average error; (b) relationship between the training data volume and the computing time.
Table 1. Average and maximum deviations of calibration under different training set data volumes.

Number of Training Sets | Number of Hidden Layer Neurons | Mean Deviation/mm | Maximum Deviation/mm | Computing Time/h
1000   | 2000   | 1.15 | 6.52 | 0.18
2000   | 3000   | 1.03 | 4.11 | 0.41
4000   | 5000   | 0.76 | 4.01 | 1.1
8000   | 9000   | 0.52 | 1.65 | 3.5
11,000 | 12,000 | 0.37 | 0.87 | 6.2
14,000 | 15,000 | 0.31 | 0.64 | 10.4
19,000 | 20,000 | 0.28 | 0.66 | 18.3
Table 2. Average deviation and maximum deviation at different working distances.

Working Distance/m | Mean Deviation/mm | Maximum Deviation/mm
1.8 | 0.43 | 0.75
1.9 | 0.31 | 1.41
2.0 | 0.35 | 0.79
2.1 | 0.39 | 0.95
2.2 | 0.44 | 0.52
Table 3. Comparison of accuracy and convenience of different calibration methods.

Calibration Method | Mean Deviation/mm | Structural Complexity | Calibration Time
Physical model, Newton iterative algorithm [9] | 0.88 | Complex | 2 h; recalibration required
Physical model, Levenberg–Marquardt algorithm [9] | 0.32 | Complex | 2 h; recalibration required
Physical model, hybrid particle swarm algorithm [9] | 0.998 | Complex | 2 h; recalibration required
Binocular vision calibration based on the coordinate system | 0.5 | Complex | 6 h; no recalibration required
Method in this article | 0.37 | Simple | 7 h; no recalibration required