Article

3D Point Cloud Acquisition and Correction in Radioactive and Underwater Environments Using Industrial 3D Scanners

Korea Atomic Energy Research Institute, 989-111 Daedeok-daero, Yuseong-gu, Daejeon 305-353, Republic of Korea
*
Author to whom correspondence should be addressed.
Sensors 2022, 22(23), 9053; https://doi.org/10.3390/s22239053
Submission received: 26 October 2022 / Revised: 3 November 2022 / Accepted: 19 November 2022 / Published: 22 November 2022
(This article belongs to the Collection 3D Imaging and Sensing System)

Abstract

This study proposes a method to acquire an accurate 3D point cloud in radioactive and underwater environments using industrial 3D scanners. Applying robotic systems to nuclear facility dismantling requires 3D imaging equipment for the localization of target structures in radioactive and underwater environments. For researchers with basic knowledge, using an industrial 3D scanner may be a better option than developing a prototype. However, industrial 3D scanners are designed to operate in normal environments and cannot be used directly in radioactive and underwater environments, and modifying them to overcome environmental obstacles is hampered by their hidden technical details. This study shows how 3D imaging equipment based on an industrial 3D scanner can satisfy the requirements of a remote dismantling system using a robotic system, despite the insufficient environmental resistance and hidden technical details of industrial 3D scanners. A housing unit is designed for waterproofing and radiation protection using windows, mirrors and shielding. The shielding protects the industrial 3D scanner from radiation damage. Because the shielding blocks the light required for 3D scanning, mirrors reflect that light around it, and windows in the waterproof housing transmit it. A basic method for calculating the shielding thickness from experimental results is described, including an analysis of those results. A method for refraction correction through refraction modeling, measurement experiments and parameter studies is also described. The developed 3D imaging equipment successfully satisfies the requirements of the remote dismantling system: waterproofing, radiation resistance of 1 kGy and positional accuracy within 1 mm. The proposed method is expected to provide researchers with an easy approach to 3D scanning in radioactive and underwater environments.

1. Introduction

In recent years, many industrial 3D scanners have been introduced whose performance is sufficient to meet the needs of various industries and research fields. However, most industrial 3D scanners are designed to work in normal environments, and some special-purpose 3D scanners are very expensive and do not support a wide range of applications. When a researcher develops a 3D scanner for research purposes using well-known basic knowledge, the developed 3D scanner may show insufficient performance compared to industrial 3D scanners, which are competitive in the market.
This study starts from the need for 3D imaging equipment for a remote dismantling system using a robotic system in nuclear facility dismantling. When dismantling a nuclear facility that has been in operation for several decades, highly activated structures must be remotely dismantled underwater to prevent radiation exposure of equipment and workers. In existing nuclear facility decommissioning projects, large submerged cutting equipment has been operated by field workers above the water surface. Since long-term low-dose exposure of workers cannot be avoided despite the protection of the water, and all simple repetitive tasks are performed manually, a remote dismantling system using robots is required for safety and efficiency. The 3D imaging equipment is necessary for the localization of a target structure by a robotic system in radioactive and underwater environments.
In the remote dismantling system developed by us, the robotic system uses a pre-planned path created by a digital manufacturing system developed in our previous research [1]. Since the pre-planned path is created in a virtual environment, aligning the pre-planned path to the actual target structure is needed through target localization using 3D scanning. The proposed remote dismantling system implements a semi-automated robotic system through the pre-planned path and target localization instead of the direct teleoperation or shared teleoperation in previous studies [2,3,4]. Since both direct teleoperation and shared teleoperation rely on manipulation by an operator and visual feedback, the operator’s excessive workload and visual obstacles, such as flames, bubbles and debris, have made the application of robotic systems impractical in nuclear facility dismantling. Therefore, 3D imaging equipment capable of implementing the pre-planned path and target localization is key to the application of the semi-automated robotic system in nuclear facility dismantling.
Considering the radiation intensity of the dismantling site and the working space of the robotic system, it is necessary for the 3D imaging equipment to be able to measure up to 2.5 m away, to have a positioning accuracy within 1 mm, and to survive up to a cumulative dose of 1 kGy. During a survey of commercial underwater 3D scanners [5], Newton Labs’ NM200UW [6] was found to be usable in radioactive and underwater environments, but the NM200UW is insufficient for the requirements of a remote dismantling system in terms of a maximum measurable distance of 0.9 m and a radiation resistance of 43 Gy. Since Photoneo’s PhoXi XL [7] can measure up to 3.78 m and has a positional accuracy of 0.5 mm, we decided to develop 3D imaging equipment based on the PhoXi XL because it satisfies the requirements of the remote dismantling system. The PhoXi XL uses a continuous wave Laser Line Scanning (LLS) method and consists of a moving laser pointer made of a mirror galvanometer, a camera and an embedded computer for image processing.
We aimed to develop 3D imaging equipment based on the industrial 3D scanner PhoXi XL that is waterproof and has radiation resistance and refraction correction. Waterproofing could be solved simply by making a housing with windows and waterproof connectors. Radiation resistance could also be solved simply by mounting a shield on the front side since the main radiation source is the target structure on the front side in dismantling highly activated structures. However, 3D imaging equipment needs to have an optical path that can send and receive light to the front despite the front shielding. Refraction correction could be solved by refraction modeling, measurement experiments and parameter studies. Hidden technical details of the industrial 3D scanner cause many unknown parameters in the refraction model, so a process for optimizing unknown parameters through experimental results was necessary.
Radiation hardening and shielding are typical methods used for radiation protection. Radiation hardening makes electronic components and circuits radiation tolerant; shielding uses a solid or liquid material that absorbs radiation energy to block radiation. Radiation hardening can be a more fundamental solution than shielding and can minimize the increase in volume and weight, but it can only be achieved by exchanging fragile electronic components for components that can tolerate radiation exposure. Therefore, shielding was the only choice for radiation protection of 3D imaging equipment built on an industrial 3D scanner, whose electronic components are not easily accessible to users.
Radiation hardening could, in principle, be attempted to develop the 3D imaging equipment. Zhao and Chi [8,9] developed underwater 3D scanners using a continuous wave LLS method. To make such 3D scanners radiation tolerant, the electronic components and the camera would need to be exchanged for radiation tolerant components [10,11], and the embedded computer for image processing would need to be located away from the radioactive environment. However, such an attempt may require a very long time and much effort to satisfy the requirements of the remote dismantling system. In a radioactive environment, the use of industrial products behind shielding can be a very efficient alternative, as the study by Shin shows [12].
Refraction correction is required to improve the positional accuracy of the 3D point cloud measured in water, so that the position estimated by 3D registration represents the actual location of the scanned object. When a 3D scanner housed in waterproof housing scans an object in water through a window, the laser emitted from the laser pointer is refracted as it passes through the window and travels through the water, where it is reflected by the object and refracted into the camera through the window. Since the 3D scanner calculates the 3D position by triangulation principles, the angle changed by refraction causes a position error. From previous studies [8,9,13,14,15,16,17,18], all the parameters for refraction correction were known because all 3D scanners were designed and manufactured by the researchers themselves. In this study, unlike in previous studies, we optimized unknown parameters for refraction correction by an intuitive parameter study.
The goal of this study was to develop 3D imaging equipment for use with a robotic system that can accurately localize an activated target structure in water despite the position difference between the digital model and the actual target structure. The requirements of the 3D imaging equipment are the positional accuracy within 1 mm in a distance range of 0.5 to 3.5 m and radiation resistance of 1 kGy. To achieve the requirements, we designed a housing structure for radiation protection and waterproofing, calculating the shielding thickness which we verified through experiments, and performed refraction correction through refraction modeling, measurement experiments and parameter studies.
The following section presents the proposed housing design for radiation protection and waterproofing. Section 3 describes the process of investigating the initial radiation resistance, calculating the thickness of the shield, and verifying the shield thickness. Section 4 includes the process of refraction modeling for the developed 3D imaging equipment, performing measurement experiments in air and water, and studying unknown parameters of the refraction model. The conclusion section summarizes the study and states the potential contributions of the presented method.

2. Housing Design for Radiation Protection and Waterproofing

A housing structure is designed to protect the industrial 3D scanner from water and radiation. The industrial 3D scanner that satisfies the requirements of the remote dismantling system is the XL model of Photoneo's PhoXi 3D scanner [7]. However, due to the limited size of the laboratory, the industrial 3D scanner applied in this study was the L model, which has a slightly smaller scan volume than the XL model. The scanning range of the L model is 780 mm to 2150 mm, and its dimensions are 77 × 68 × 616 mm.

2.1. Optical Path Configuration

The key to housing design is to protect the 3D scanner from radiation while not obstructing the view of the industrial 3D scanner. In this study, we used reflective mirrors so that the industrial 3D scanner could be protected from a front activated structure by a shield and be visually accessible to the outside. Ray optics simulations [19] were performed to schematically understand the phenomenon of seeing the outside underwater through a reflective mirror and a glass window. Figure 1a shows that the image projected onto the image plane through the reflective mirror is flipped and the refraction of the ray in water narrows the field of view. Figure 1b shows that when the 3D scanner measures a position in water, the measured position is closer than the actual position.
The housing needed to be designed to minimize refraction, because a larger refraction angle amplifies the position error caused by any modeling error. Figure 2 shows two options for the window layout to configure the optical paths consisting of a camera, a laser pointer and a galvanometer. The neutral lines of the camera's view and the laser projection pattern are inclined inward by angle θ. A window parallel to the scanner housing can contribute to size reduction, but the magnitude of the refraction angle increases, and the increased refraction angle causes larger errors. Therefore, a window normal to the camera's z-axis was used.
The optical path was configured as shown in Figure 3. Figure 3a shows the scan volume and the mirrored scan volume of the industrial 3D scanner. Figure 3b shows the boundaries of the mirrors and windows, which surround the respective intersections of planes and rays. Since the mirrored scan volume shown in Figure 3a is formed by one plane, the two mirror planes shown in Figure 3b belong to the same plane, but each has two boundaries to reduce the size of the mirror. The planes of the windows are, respectively, normal to the mirrored neutral lines of the laser projection pattern and the camera's view.

2.2. Housing Design

The housing contains the industrial 3D scanner, connectors, first surface mirrors, windows and shielding, as shown in Figure 4. The industrial 3D scanner is fixed to the housing facing upwards and is provided with communication and power through waterproof connectors installed on the housing. The size and position of the two first surface mirrors are determined by the intersections shown in Figure 3b. The windows located over the camera and the laser pointer are inclined inward by angle θ, normal to the two neutral lines, as shown in Figure 4a.
We designed the shielding to absorb gamma rays radiated from the activated target structure in front. To reduce the weight of the 3D imaging equipment, the shielding was divided into three pieces, and each piece is placed in front of the laser pointer, the camera and the embedded computer for image processing, as shown in Figure 4a. At the actual dismantling site, the position of the industrial 3D scanner and the size of the shield can be changed.

3. Radiation Protection

Radiation protection through shielding is ultimately determined by calculating the thickness based on the initial radiation resistance, the sources of radiation and the shielding material. According to the radiological characterization of the reactor pressure vessel in a nuclear power plant, the main source emitting gamma rays is Co-60 [20,21]. Half value layers vary according to the energy of the gamma rays and the shield material. When the source of radiation is Co-60 and the shield material is lead, the half value layer is 1.2 cm [22]. Below, we describe the process of investigating the initial radiation resistance, calculating the shielding thickness and verifying the final radiation resistance.

3.1. Initial Radiation Resistance

An irradiation experiment was performed to investigate the initial radiation resistance of the industrial 3D scanner and the changes in the optical characteristics of window and mirror candidates, as shown in Figure 5. A thick lead block protects the embedded computer inside the industrial 3D scanner, because the embedded computer must survive to the end of the experiment to drive the laser pointer, mirror galvanometer and camera. The initial radiation resistance of the embedded computer itself was investigated using a second embedded computer of the same type. The window and mirror candidates are a fused silica window, an aluminum-coated first surface mirror and a silver-coated first surface mirror. One of the two IP cameras observes the scan target continuously: when an image change event is caused by the laser pattern projected by the industrial 3D scanner, the IP camera transmits the laser pattern image to the monitoring computer. The other IP camera observes the windows and mirrors and transmits images to the monitoring computer every 60 s. The monitoring computer requests a 3D point cloud from the industrial 3D scanner and a ping from the embedded computer every 60 s.
The source of radiation is Co-60, whose energy spectrum is 1.17 and 1.33 MeV. The absorbed dose rate in the experiment is 4.32 × 10¹ Gy/h, which is the dose rate at a location 730 mm away from the source of radiation. The total absorbed dose during the experiment is 1.21 × 10³ Gy over an irradiation time of 28 h. The ambient laboratory temperature is 25.6 °C at the start of the experiment and 25.4 °C at the end of the experiment.
The initial radiation resistance of the industrial 3D scanner is 259.4 Gy, since an abnormality occurred 5 h and 57 min after the start of the experiment. As shown in Figure 6, in the normal case, when the monitoring computer requests data from the industrial 3D scanner, the industrial 3D scanner sends the 3D point cloud and the image captured by the camera. At the same time, the IP camera detects an image change event caused by the laser pattern projected by the industrial 3D scanner and transmits the laser pattern image to the monitoring computer.
At 5 h 57 min, as shown in Figure 7, when the monitoring computer requests data from the industrial 3D scanner, the image captured by the camera of the industrial 3D scanner is the same as the image in Figure 6b, but the 3D point cloud is almost empty. In Figure 7c, the reason that the IP camera transmits the image without any laser pattern is that the holding time of the laser pattern is shorter than normal, so when the IP camera detects the projected laser pattern and then sends the image to the monitoring computer, the laser pattern has already disappeared at that moment. The state shown in Figure 7 lasts for 5 min, after which the industrial 3D scanner stops responding. When the laser pattern completely stops working, the embedded computer also stops responding, and the IP camera stops sending images.
In the industrial 3D scanner, the component most vulnerable to radiation is the laser pattern projector; the radiation resistances of the remaining components are unknown. Considering that the quality of the images captured by the camera of the industrial 3D scanner is maintained for the last 5 min, the radiation resistance of the camera is larger than that of the laser pattern projector, but its exact value cannot be determined. Because the monitoring program stops working when the industrial 3D scanner stops responding, the exact survival time of the other embedded computer is also unknown. In a ping test after the irradiation experiment, the embedded computer in the industrial 3D scanner responds normally, while the unshielded embedded computer does not respond.
Figure 8a is an image of the IP camera at the start of irradiation, and Figure 8b is an image of the IP camera at the end of irradiation. The noise observed in Figure 8a is the photon shot noise generated during irradiation, and it disappears at the end of irradiation as shown in Figure 8b. In all optical components, no significant deterioration is observed after irradiation with the naked eye. The reason why the silver-coated mirror in Figure 8b turned slightly yellow is because the substrate of the silver-coated mirror is discolored. In the case of a first surface mirror, the discoloration of the substrate does not affect the reflectivity of the mirror.
Table 1 shows the measurement results of light transmittance and light reflectance for irradiated components. In all optical components, no performance degradation of more than 1% is observed at 637 nm, the wavelength of the industrial 3D scanner’s laser. Therefore, all optical components are suitable for the 3D scanner in this study. Finally, the silver coated first surface mirror is selected because the reflectance of the silver coated first surface mirror is superior to that of the aluminum coated first surface mirror.
Since the degradation of the optical components is less than 1% at the total absorbed dose of 1.21 × 10³ Gy, the radiation resistance of the developed 3D imaging equipment is estimated to be dominated by the shielding thickness.

3.2. Shielding Thickness Calculation

The shielding thickness is calculated using the following basic equation [21], which assumes a narrow beam of radiation penetrating a thin shield.
$$X = X_0\, e^{-(\mu/\rho)\rho x} \tag{1}$$
where X is the exposure rate with the shield in place, X0 is the exposure rate without the shield, μ/ρ is the mass attenuation coefficient, ρ is the density of the shielding material, and x is the thickness of the shield. If Equation (1) is used, since Co-60 emits 1173 and 1332 keV gamma rays, the exposure rates for the two gamma rays must be calculated and added. In this study, the Half Value Layer (HVL) was used for a simpler calculation. An HVL is the thickness of material that reduces the radiation intensity by one-half.
$$\mathrm{HVL} = \frac{0.693}{\mu} = \frac{0.693}{(\mu/\rho)\,\rho} \tag{2}$$
$$X = X_0\, e^{-\frac{0.693}{\mathrm{HVL}}\, x} \tag{3}$$
$$x = \frac{\mathrm{HVL}}{0.693}\, \ln\frac{X_0}{X} \tag{4}$$
As shown in Table 2, when the shielding material is lead and the radiation source is Co-60, the HVL is 1.2 cm. The shielding thickness is calculated by Equation (4), where X0 is 1000 Gy and X is 259.4 Gy, giving 2.3 cm; considering a margin, a thickness of 3.0 cm was selected, and the corresponding expected radiation resistance is 1467 Gy.
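The HVL calculation above is easy to reproduce. The following sketch, using the 1.2 cm HVL and the dose values from the text (function names are ours, for illustration only), shows how the 2.3 cm thickness and the 1467 Gy expected resistance of the 3.0 cm shield follow from Equation (4):

```python
import math

HVL_PB_CO60_CM = 1.2  # half value layer of lead for Co-60 gamma rays (Table 2)

def shield_thickness_cm(x0_gy, x_gy, hvl_cm=HVL_PB_CO60_CM):
    """Equation (4): x = (HVL / 0.693) * ln(X0 / X)."""
    return hvl_cm / 0.693 * math.log(x0_gy / x_gy)

def expected_resistance_gy(tolerable_gy, thickness_cm, hvl_cm=HVL_PB_CO60_CM):
    """Tolerable external dose when each HVL of shield halves the intensity."""
    return tolerable_gy * 2.0 ** (thickness_cm / hvl_cm)

print(shield_thickness_cm(1000.0, 259.4))   # ~2.34 cm, rounded up to 3.0 cm
print(expected_resistance_gy(259.4, 3.0))   # ~1467 Gy
```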

3.3. Radiation Resistance Verification

An irradiation experiment was performed to verify the radiation resistance of the shielded 3D scanner, as shown in Figure 9. The industrial 3D scanner is housed in the housing manufactured in this study, and it scans the target through the mirrors and windows. The housing is equipped with three lead blocks of 3 cm thickness. Three dosimeters are fixed in front of the components of the 3D scanner, as shown at a, b and c in Figure 9b. Dosimeter a is fixed in front of the shielding at the position closest to the radiation source, and dosimeters b and c are fixed behind the shielding at the positions of the laser pattern projector and the camera, respectively. Dosimeter a is located at a distance of 644 mm from the surface of the radiation source. The irradiation experiment is performed for 24 h at a dose rate of 64.3 Gy/h based on dosimeter a.
After the 24 h irradiation experiment, the 3D scanner still operated normally, and the total absorbed dose was 1534.4 Gy based on dosimeter a. An abnormal termination occurred at 22 h and 34 min during the experiment; however, it turned out to be temporary, because restarting the monitoring process caused no problems. Figure 10 and Figure 11 show the 3D point clouds and images captured by the industrial 3D scanner at the start of the experiment (a), just before the abnormal termination (b), and at the end of the experiment (c). No degradation of quality is observed in any of the 3D point clouds or images. In Figure 10, since the cylinder-shaped surface of the radiation source is closer than 800 mm to the 3D imaging equipment, only the scan target is measured.
Figure 12 shows the results from analyzing images captured by the 3D scanner using the image analysis tool. In the 3D scanner based on the LLS method, the image quality has a decisive effect on the quality of the measured 3D point cloud, so the result of the image quality analysis may represent the quality of the measured 3D point cloud. The noise level and hot pixels increase rapidly after irradiation starts, but do not show a distinct increase trend during 24 h of the experiment. Since position estimation based on 3D registration uses hundreds or more points, such random noise may not significantly affect position accuracy. Therefore, the developed 3D imaging equipment is expected to be successfully applied to the remote dismantling system in nuclear facility decommissioning.
The total absorbed dose of dosimeter b, fixed in front of the laser pattern projector, is 184.2 Gy, and the total absorbed dose of dosimeter c, fixed in front of the camera, is 191.8 Gy. When the actual penetration thickness of the gamma rays is calculated from the drawing in Figure 9b, the shielding thickness is 31.3 mm, and the total absorbed dose without shielding is calculated as 1112.3 Gy and 1171.6 Gy at dosimeters b and c, respectively. In the previous experiment the laser pattern projector was found to be the component weakest against radiation, and in this verification experiment the shielded laser pattern projector survived a location where the unshielded dose would have been 1112.3 Gy, so we consider the radiation protection successful. In addition, considering the radiation resistance of the applied optical components and the result of the image quality analysis, the radiation resistance is expected to be easily improved by reinforcing the shielding design.

4. Refraction Correction

Refraction correction finds the light plane and camera ray from the 3D coordinates of the point measured by the industrial 3D scanner, and then finds the corrected 3D coordinates of the point by recalculating the refracted light plane and the refracted camera ray based on the refraction model. The light plane is the equation of the plane projected by the laser pointer, and the camera ray is the equation of the line connecting the point on the camera’s image plane and the point measured by the industrial 3D scanner. An accurate refraction model is required for this calculation. However, the exact values required for the refraction model are rarely found due to the inaccessibility of design information in the industrial 3D scanner. In this study, we developed a basic refraction model and studied the unknown parameters by a simple intuitive approach.

4.1. Refraction Modeling

A refraction model is developed fundamentally based on a pinhole model and a perspective projection [23]. Although the calculation algorithm performed inside the industrial 3D scanner is unknown, the point measured by the industrial 3D scanner can be reconstructed with a light plane and a camera ray, assuming that the positions of the laser pointer and camera are known. Figure 13 shows the reconstructed light plane and camera ray for the measured point pmea and the refraction-corrected point pcor on the x-z plane of the coordinate frame, whose origin is located at the pinhole of the 3D scanner's camera. In front of the laser pointer and camera, a glass window is installed at distances of dL and dC, respectively. The thickness of both glass windows is the same, tG. Of the two planes of the glass window positioned at the laser pointer, the plane PGL is the interface between air and glass, and the plane PWL is the interface between glass and water. Likewise, the plane PGC is the interface between air and glass, and the plane PWC is the interface between glass and water at the glass window positioned at the camera.
The plane of the glass window located in 3D space can be represented as in Figure 14 for the angles θ and φ, at which the normal vector is inclined with respect to the z-axis of the coordinate frame.
The planes PGC, PWC, PGL and PWL are given by Equations (5)–(8)
$$P_{GC} = \{\, p : n^{t}(p - q_P) = 0 \,\}_{GC} \quad\text{where } n = \begin{pmatrix} \tan\theta_C \\ \tan\varphi_C \\ 1 \end{pmatrix},\; q_P = \begin{pmatrix} 0 \\ 0 \\ \sqrt{\tan^2\theta_C + \tan^2\varphi_C + 1}\cdot d_C \end{pmatrix} \tag{5}$$
$$P_{WC} = \{\, p : n^{t}(p - q_P) = 0 \,\}_{WC} \quad\text{where } n = \begin{pmatrix} \tan\theta_C \\ \tan\varphi_C \\ 1 \end{pmatrix},\; q_P = \begin{pmatrix} 0 \\ 0 \\ \sqrt{\tan^2\theta_C + \tan^2\varphi_C + 1}\cdot (d_C + t_G) \end{pmatrix} \tag{6}$$
$$P_{GL} = \{\, p : n^{t}(p - q_P) = 0 \,\}_{GL} \quad\text{where } n = \begin{pmatrix} \tan\theta_L \\ \tan\varphi_L \\ 1 \end{pmatrix},\; q_P = \begin{pmatrix} 0 \\ 0 \\ \sqrt{\tan^2\theta_L + \tan^2\varphi_L + 1}\cdot d_L + \tan\theta_L x_L + \tan\varphi_L y_L + z_L \end{pmatrix} \tag{7}$$
$$P_{WL} = \{\, p : n^{t}(p - q_P) = 0 \,\}_{WL} \quad\text{where } n = \begin{pmatrix} \tan\theta_L \\ \tan\varphi_L \\ 1 \end{pmatrix},\; q_P = \begin{pmatrix} 0 \\ 0 \\ \sqrt{\tan^2\theta_L + \tan^2\varphi_L + 1}\cdot (d_L + t_G) + \tan\theta_L x_L + \tan\varphi_L y_L + z_L \end{pmatrix} \tag{8}$$
where θC and φC are the angles of the glass window positioned at the camera, θL and φL are the angles of the glass window positioned at the laser, and the point (xL, yL, zL) represents pL1 of the laser.
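As an illustration of Equations (5)-(8), a window plane can be represented by its normal n and a reference point q_P. The following minimal numpy sketch (the function name and the test angles are ours, not from the paper) builds the plane and checks that the perpendicular distance from the reference point recovers d:

```python
import math
import numpy as np

def window_plane(theta, phi, d, p_ref=np.zeros(3)):
    """Plane (n, q_P) of a glass window tilted by angles (theta, phi), lying at
    perpendicular distance d from the reference point p_ref (the camera pinhole
    at the origin, or the laser origin p_L1), as in Equations (5)-(8)."""
    n = np.array([math.tan(theta), math.tan(phi), 1.0])
    z_q = math.sqrt(math.tan(theta) ** 2 + math.tan(phi) ** 2 + 1.0) * d + n @ p_ref
    return n, np.array([0.0, 0.0, z_q])

# Perpendicular distance from p_ref to the plane recovers d:
p_ref = np.array([0.1, 0.2, 0.0])
n, q_p = window_plane(0.10, 0.05, 0.07, p_ref)
print(n @ (q_p - p_ref) / np.linalg.norm(n))   # ~0.07
```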
When pL1, θC, φC, θL, φL, dC, dL, tG and pmea are known, the refraction correction point pcor can be derived by Equations (9)-(31). First, the camera ray Rmea is given by Equation (9)
$$R_{mea} = \{\, p = q_L + \lambda v : \lambda \geq 0 \,\}_{mea} \quad\text{where } q_L = 0,\; v = v_{C1} = p_{mea} \tag{9}$$
The intersection point pC2 of Rmea and PGC can be computed by Equations (10) and (11).
$$n_{GC}^{t}(p - q_{GC}) = n_{GC}^{t}(\lambda_{C2} v_{C1} + q_L - q_{GC}) = 0, \qquad \lambda_{C2} = \frac{n_{GC}^{t}(q_{GC} - q_L)}{n_{GC}^{t}\, v_{C1}} \tag{10}$$
$$\text{where } n_{GC} = \begin{pmatrix} \tan\theta_C \\ \tan\varphi_C \\ 1 \end{pmatrix},\; q_{GC} = \begin{pmatrix} 0 \\ 0 \\ \sqrt{\tan^2\theta_C + \tan^2\varphi_C + 1}\cdot d_C \end{pmatrix},\; q_L = 0,\; v_{C1} = p_{mea}$$
$$p_{C2} = \lambda_{C2} v_{C1} \tag{11}$$
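Equations (10) and (11) are a standard ray-plane intersection, which can be sketched in a few lines of numpy (the numeric values below are illustrative only, not taken from the equipment):

```python
import numpy as np

def ray_plane_intersection(q_l, v, n, q_p):
    """Intersect the ray p = q_l + lam * v with the plane n . (p - q_p) = 0,
    as in Equations (10) and (11)."""
    lam = (n @ (q_p - q_l)) / (n @ v)
    return q_l + lam * v

# Camera ray from the pinhole toward p_mea, hitting a window plane at z = 0.07:
p_c2 = ray_plane_intersection(np.zeros(3), np.array([0.1, 0.0, 1.0]),
                              np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.0, 0.07]))
# intersection at (0.007, 0, 0.07)
```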
The vector v2 of refraction with the incident vector v1 and the interface normal n is computed by Snell’s law, quaternion rotation operation and Equations (12)–(17) as shown Figure 15.
$$\theta_1 = \cos^{-1}\frac{v_1 \cdot n}{\|v_1\|\,\|n\|} \tag{12}$$
$$\theta_2 = \sin^{-1}\frac{N_1 \sin\theta_1}{N_2} \quad\text{where } N_1 \text{ and } N_2 \text{ are refractive indices} \tag{13}$$
$$r = \frac{v_1 \times n}{\|v_1 \times n\|} = \begin{pmatrix} r_x \\ r_y \\ r_z \end{pmatrix} \tag{14}$$
$$q = Q(v_1, n, N_1, N_2) = \left[\cos\frac{\theta_2}{2},\; -\sin\frac{\theta_2}{2}\,(r_x i + r_y j + r_z k)\right] \tag{15}$$
$$\mathbf{n} = [\,0,\; n\,] = [\,0,\; n_x i + n_y j + n_z k\,] \tag{16}$$
$$v_2 = q\,\mathbf{n}\,q^{*} \quad\text{where } v_2 = [\,0,\; v_{2x} i + v_{2y} j + v_{2z} k\,] \tag{17}$$
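Equations (12)-(17) can be sketched as follows. This is our minimal reading of the model, with the rotation axis normalized and the quaternion sign chosen so that the refracted ray stays on the incident side; it rotates the interface normal n by θ2 about the axis r = v1 × n:

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def refract(v1, n, n1, n2):
    """Rotate the interface normal n by theta_2 (Snell's law) about the axis
    r = v1 x n to obtain the refracted direction v2, per Equations (12)-(17)."""
    v1 = v1 / np.linalg.norm(v1)
    n = n / np.linalg.norm(n)
    theta1 = np.arccos(np.clip(v1 @ n, -1.0, 1.0))            # Eq. (12)
    theta2 = np.arcsin(n1 * np.sin(theta1) / n2)              # Eq. (13)
    axis = np.cross(v1, n)
    s = np.linalg.norm(axis)
    if s < 1e-12:                     # normal incidence: direction unchanged
        return n
    axis = axis / s                                           # Eq. (14)
    h = theta2 / 2.0
    q = np.concatenate([[np.cos(h)], -np.sin(h) * axis])      # Eq. (15)
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    v2 = quat_mul(quat_mul(q, np.concatenate([[0.0], n])), q_conj)  # Eqs. (16)-(17)
    return v2[1:]

# 45 degree incidence from air (N=1.0) into water (N=1.33) gives ~32.1 degrees:
v2 = refract(np.array([np.sin(np.pi/4), 0.0, np.cos(np.pi/4)]),
             np.array([0.0, 0.0, 1.0]), 1.0, 1.33)
```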
Therefore, the refracted camera ray RC2 at the point pC2 on the plane PGC is computed by Equations (18) and (19).
$$v_{C2} = q\,\mathbf{n}_{GC}\,q^{*} \quad\text{where } \mathbf{n}_{GC} = [\,0,\; n_{GC}\,],\; q = Q(v_{C1}, n_{GC}, N_{air}, N_{glass}) \tag{18}$$
$$R_{C2} = \{\, p = q_L + \lambda v : \lambda \geq 0 \,\}_{C2} \quad\text{where } q_L = p_{C2},\; v = v_{C2} \tag{19}$$
The intersection point pC3 of RC2 and PWC is computed by Equation (20).
$$p_{C3} = p_{C2} + \lambda_{C3} v_{C2} \quad\text{where } \lambda_{C3} = \frac{n_{WC}^{t}(q_{WC} - p_{C2})}{n_{WC}^{t}\, v_{C2}} \tag{20}$$
In the same way, the corrected camera ray Rcor at the point pC3 on the plane PWC is computed by Equations (21) and (22).
$$v_{C3} = q\,\mathbf{n}_{WC}\,q^{*} \quad\text{where } \mathbf{n}_{WC} = [\,0,\; n_{WC}\,],\; q = Q(v_{C2}, n_{WC}, N_{glass}, N_{water}) \tag{21}$$
$$R_{cor} = \{\, p = q_L + \lambda v : \lambda \geq 0 \,\}_{cor} \quad\text{where } q_L = p_{C3},\; v = v_{C3} \tag{22}$$
The corrected point pcor is the intersection of the corrected camera ray Rcor and the corrected light plane Pcor, so the corrected light plane Pcor needs to be found. Since the laser starts from a point light source, the laser rays RL1, RL2 and RL3 are calculated in the same way as the camera rays above, and the corrected light plane Pcor can be found from the conditions that it is perpendicular to the x-z plane and includes the finally refracted laser ray RL3. The corrected light plane Pcor is computed by Equations (23)-(30).
$$R_{L1} = \{\, p = q_L + \lambda v : \lambda \geq 0 \,\}_{L1} \quad\text{where } q_L = p_{L1},\; v = v_{L1} = p_{mea} - p_{L1} \tag{23}$$
$$p_{L2} = p_{L1} + \lambda_{L2} v_{L1} \quad\text{where } \lambda_{L2} = \frac{n_{GL}^{t}(q_{GL} - p_{L1})}{n_{GL}^{t}\, v_{L1}} \tag{24}$$
$$v_{L2} = q\,\mathbf{n}_{GL}\,q^{*} \quad\text{where } \mathbf{n}_{GL} = [\,0,\; n_{GL}\,],\; q = Q(v_{L1}, n_{GL}, N_{air}, N_{glass}) \tag{25}$$
$$R_{L2} = \{\, p = q_L + \lambda v : \lambda \geq 0 \,\}_{L2} \quad\text{where } q_L = p_{L2},\; v = v_{L2} \tag{26}$$
$$p_{L3} = p_{L2} + \lambda_{L3} v_{L2} \quad\text{where } \lambda_{L3} = \frac{n_{WL}^{t}(q_{WL} - p_{L2})}{n_{WL}^{t}\, v_{L2}} \tag{27}$$
$$v_{L3} = q\,\mathbf{n}_{WL}\,q^{*} \quad\text{where } \mathbf{n}_{WL} = [\,0,\; n_{WL}\,],\; q = Q(v_{L2}, n_{WL}, N_{glass}, N_{water}) \tag{28}$$
$$R_{L3} = \{\, p = q_L + \lambda v : \lambda \geq 0 \,\}_{L3} \quad\text{where } q_L = p_{L3},\; v = v_{L3} \tag{29}$$
$$P_{cor} = \{\, p : n^{t}(p - q_P) = 0 \,\}_{cor} \quad\text{where } n = \begin{pmatrix} v_{L3z} \\ 0 \\ -v_{L3x} \end{pmatrix},\; q_P = p_{L3} \tag{30}$$
Finally, the corrected point pcor is found by computing the intersection of the corrected camera ray Rcor and the corrected light plane Pcor by Equation (31).
p_{cor} = p_{C3} + \lambda_{cor} v_{C3}, \quad \text{where}\ \lambda_{cor} = \frac{n_{cor}^{t}(p_{L3} - p_{C3})}{n_{cor}^{t}\, v_{C3}} \qquad (31)
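The ray-tracing steps above can be sketched compactly in code. The sketch below is illustrative only: it replaces the paper's quaternion operator Q with the equivalent vector form of Snell's law, and the refractive indices, interface planes and source positions are assumed example values, not the calibrated parameters of the study.

```python
import numpy as np

# Nominal refractive indices (assumed values, not from the paper)
N_AIR, N_GLASS, N_WATER = 1.000, 1.458, 1.333

def refract(v, n, n1, n2):
    """Snell's law in vector form; a stand-in for the paper's quaternion
    operator Q(v, n, N1, N2). v: incoming direction, n: surface normal."""
    v = v / np.linalg.norm(v)
    n = n / np.linalg.norm(n)
    cos_i = -np.dot(n, v)
    if cos_i < 0:                      # make the normal face the incoming ray
        n, cos_i = -n, -cos_i
    r = n1 / n2
    k = 1.0 - r * r * (1.0 - cos_i * cos_i)
    if k < 0:
        raise ValueError("total internal reflection")
    return r * v + (r * cos_i - np.sqrt(k)) * n

def ray_plane(p0, v, q, n):
    """Intersect the ray p = p0 + lam*v with the plane n.(p - q) = 0,
    as in Equations (20), (24) and (27)."""
    lam = np.dot(n, q - p0) / np.dot(n, v)
    return p0 + lam * v

def correct_point(p_mea, p_C1, p_L1, planes_C, planes_L):
    """Refraction correction of one measured point p_mea, assuming the
    camera center p_C1 and laser source p_L1 are known.
    planes_C = (q_GC, n_GC, q_WC, n_WC): camera-side glass/water interfaces.
    planes_L = (q_GL, n_GL, q_WL, n_WL): laser-side glass/water interfaces."""
    q_GC, n_GC, q_WC, n_WC = planes_C
    q_GL, n_GL, q_WL, n_WL = planes_L
    # Camera ray: air -> glass -> water (cf. Equations (20)-(22))
    v_C1 = p_mea - p_C1
    p_C2 = ray_plane(p_C1, v_C1, q_GC, n_GC)
    v_C2 = refract(v_C1, n_GC, N_AIR, N_GLASS)
    p_C3 = ray_plane(p_C2, v_C2, q_WC, n_WC)
    v_C3 = refract(v_C2, n_WC, N_GLASS, N_WATER)
    # Laser ray: air -> glass -> water (cf. Equations (23)-(29))
    v_L1 = p_mea - p_L1
    p_L2 = ray_plane(p_L1, v_L1, q_GL, n_GL)
    v_L2 = refract(v_L1, n_GL, N_AIR, N_GLASS)
    p_L3 = ray_plane(p_L2, v_L2, q_WL, n_WL)
    v_L3 = refract(v_L2, n_WL, N_GLASS, N_WATER)
    # Corrected light plane: perpendicular to the xz plane, contains R_L3 (Equation (30))
    n_cor = np.array([v_L3[2], 0.0, -v_L3[0]])
    # Corrected point: intersection of R_cor with P_cor (Equation (31))
    return ray_plane(p_C3, v_C3, p_L3, n_cor)
```

Because refraction into a denser medium bends both rays toward the interface normal, the corrected intersection lies farther from the scanner than the naively triangulated point, matching the observation that underwater objects appear closer than they are.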

4.2. Experiments and Results

To find the unknown parameters in the presented refraction model, experiments measuring 3D point clouds of a target board at the same distances in air and in water were performed. The experimental equipment consisted of a water tank, a 3D scanner, a linear motion stage, a rotation stage, a micrometer and a target board, as shown in Figure 16. At the bottom of the water tank, a breadboard with an M6 threaded hole pattern was installed to fix the components in the correct positions. The 3D scanner was housed in the manufactured waterproof housing and fixed to one end of the breadboard. The linear motion stage was fixed longitudinally to vary the distance between the 3D scanner and the target board. The rotation stage was fixed on the movable table of the linear motion stage and aligned the target board parallel to the image plane of the 3D scanner. The micrometer was connected to the drive shaft of the linear motion stage by a timing belt so that the position of the movable table could be precisely adjusted. The target board was an 800 × 600 mm aluminum plate printed with a 30 mm checkerboard pattern.
Figure 17 shows the experimental setup with the water tank filled and the target board.
Figure 18 shows the results of 17 scans in air (a) and in water (b), taken while the movable table moved from 0 mm to 800 mm at 50 mm intervals. The 3D point clouds obtained in water are enlarged and curved compared with those obtained in air.
Figure 19 shows the image captured by the camera of the industrial 3D scanner and the results of checkerboard detection at the position where the micrometer indicates 0 mm. Since the image and the 3D point cloud share the same indices, the 3D coordinates of the corners found by checkerboard detection in the image can be looked up directly. Therefore, pairs of 3D points indicating the same physical position could be extracted from the point clouds measured in water and in air, and each pair could be used to quantify the error of the point measured underwater. For example, the 3D point indicated by the upper right dot in Figure 19a should have the same position as the 3D point indicated by the upper right dot in Figure 19b. If the positions of the two 3D points differ, the difference is the measurement error due to refraction. Figure 20 shows the 408 pairs and the measurement errors estimated from Figure 19.
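Because the image and the point cloud are index-aligned, mapping a detected corner to its 3D point is a pure indexing step. A minimal sketch follows; the checkerboard corners themselves would come from a detector such as OpenCV's cv2.findChessboardCorners, and the array shapes and names here are illustrative:

```python
import numpy as np

def corners_to_points(corners_px, organized_cloud):
    """Look up the 3D point for each checkerboard corner detected in the
    image. Works because the organized point cloud (H x W x 3) shares its
    row/column indices with the camera image."""
    pts = []
    for u, v in corners_px:              # (u, v) = (column, row) pixel coords
        r, c = int(round(v)), int(round(u))
        pts.append(organized_cloud[r, c])
    return np.array(pts)

def pairwise_errors(p_air, p_water):
    """Per-corner measurement error due to refraction: the distance between
    the same corner measured in air and in water (cf. Figure 20)."""
    return np.linalg.norm(np.asarray(p_air) - np.asarray(p_water), axis=1)
```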
The measurement-error data set was obtained by processing and merging 15 pairs of 3D point clouds with the method described above. Each pair consists of points p_air measured in air and points pwater measured in water, as shown in Figure 21. The 16th and 17th point clouds were excluded because they contained few points and deviated from the proper measurement range: in water, the angle of view narrows and objects appear closer than their actual positions.

4.3. Parameter Study and Refraction Correction Result

In the refraction model presented in this study, the variables needed to find the refraction correction point pcor are pL1, θC, φC, θL, φL, dC, dL, tG and pmea. Among these, the glass thickness tG can be treated as a constant and pmea is the value measured by the 3D scanner, so pL1x, pL1y, pL1z, θC, φC, θL, φL, dC and dL remain as unknown parameters. The measured point pmea corresponds to the points pwater measured in water.
The objective function is the sum of the refraction correction errors, i.e., the distances between the refraction correction points pcor and the points p_air measured in air; the goal of the parameter study was to minimize this objective function. Equation (32) defines the objective function J.
J = \sum_{i=1}^{n} \left\| p_{cor} - p_{air} \right\|, \quad \text{where}\ p_{cor} = f(\theta_C, \varphi_C, \theta_L, \varphi_L, d_C, d_L, p_{L1x}, p_{L1y}, p_{L1z}) \qquad (32)
In this study, we took an intuitive and simple approach to the parameter study rather than a sophisticated theoretical one. A discrete linear vector space was generated for each parameter, and the parameter values minimizing the objective function were selected from the generated vector space. However, searching the vector spaces of all nine parameters simultaneously makes the number of combinations grow exponentially, and the search time becomes prohibitive. Therefore, at the beginning of the search, parameters with similar characteristics were grouped and searched together; in the stationary phase of the search, random combinations drawn from the entire vector space of all parameters were evaluated. The random-combination search enabled a homogeneous search of the entire vector space and reduced the search time through parallel computation, despite not covering the space exhaustively.
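The random-combination stage of this strategy can be sketched as follows; the grid resolutions, sample count and objective below are placeholders, not the values used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_combination_search(grids, objective, n_samples=10_000):
    """Stationary-phase strategy from the text: each parameter has a discrete
    linear vector space (a grid); rather than exhausting the full product
    space, random combinations are drawn and the best one is kept.
    Independent draws also parallelize trivially."""
    best_x, best_j = None, np.inf
    for _ in range(n_samples):
        x = np.array([rng.choice(g) for g in grids])  # one random combination
        j = objective(x)
        if j < best_j:
            best_x, best_j = x, j
    return best_x, best_j
```

With a toy quadratic objective this converges to a grid point near the optimum; in the paper, the objective is Equation (32) evaluated over all measured point pairs.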
The initial values of all parameters were obtained from the CAD geometry of the housing design in Section 2.2 and are shown in Table 3. With these initial values, the mean error is 12.2 mm, and the 95% and 99% errors of the fitted normal error distribution are 101.2 mm and 138.0 mm, respectively. The 99% error must be considered because the accuracy guaranteed by the 3D scanner presented in this study has to be verified. The refraction correction based on the initial values has a very poor error distribution, as the 99% error reveals, and Figure 22 illustrates this clearly: for the first pair of 3D point clouds measured in air and water, the plane of the refraction correction points pcor is inclined relative to the plane of the points p_air measured in air.
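The paper does not state explicitly how the 95% and 99% errors are derived from the error distribution. Assuming one-sided normal quantiles (z ≈ 1.645 and z ≈ 2.326) is consistent with the initial-value row of Table 3: 12.2 + 2.326σ = 138.0 gives σ ≈ 54.1, and 12.2 + 1.645 × 54.1 ≈ 101.2. A sketch under that assumption:

```python
import numpy as np

def error_statistics(errors):
    """Mean, 95% and 99% error levels, assuming the errors follow a normal
    distribution and using one-sided quantiles (z = 1.645 and z = 2.326)."""
    errors = np.asarray(errors, dtype=float)
    mu = errors.mean()
    sigma = errors.std(ddof=1)          # sample standard deviation
    return {"mean": mu, "95%": mu + 1.645 * sigma, "99%": mu + 2.326 * sigma}
```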
Table 3 shows the results of the parameter study using the search strategy described above. The decrease in mean error is not evident in case No. 1, in which the search minimizes the objective function in the vector space of θC and φC, nor in case No. 2, in which the search is performed in the vector space of θL and φL. A distinct decrease appears in case No. 3, which searches the vector space of dC and dL. Through cases Nos. 4, 5 and 6, the mean error is reduced to 0.57 mm, reaching the precision required by the remote dismantling system described above. In particular, as the gap between the 99% error and the mean error shrinks, the error distribution improves considerably, so a homogeneous 3D registration error can be expected over the entire scan volume.
Figure 23 shows the result of the refraction correction applied to the entire scan volume. The corrected points pcor, obtained by transforming pwater through refraction correction, are visually indistinguishable from the points p_air.
Figure 24 shows the measurement error for all points. Many peak errors of 1 mm or more, exceeding the mean and 99% errors, are observed. However, such peak errors should not significantly affect the 3D registration method, which finds the position of the target object using a 3D point cloud consisting of hundreds of points.

5. Conclusions

The 3D imaging equipment required by the remote dismantling system in the field of nuclear power plant decommissioning was successfully developed using an industrial 3D scanner. The equipment survives a cumulative dose of up to 1 kGy and measures 3D point clouds both in air and in water. The measured 3D point cloud is accurate enough to estimate the position of a target object with an error of less than 1 mm. To achieve this, we selected a suitable industrial 3D scanner, designed a housing structure for waterproofing and radiation protection, and provided radiation protection through shielding and accuracy improvement through refraction correction. The experimental results proved the validity of the optical path configuration using mirrors, the shielding thickness calculation method and the refraction modeling equations. The proposed method is expected to contribute to various applications requiring 3D imaging equipment in radioactive or underwater environments.

Author Contributions

Writing—original draft preparation, D.H.; methodology, D.H. and S.J.; software, I.K.; project administration, J.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIP) (RS-2022-00155255).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Ray Optics Simulation: (a) Vertical Plane; (b) Horizontal Plane.
Figure 2. Optical system configuration of industrial 3D scanner.
Figure 3. Optical Path Configuration: (a) Scan Volume; (b) Boundaries of Mirror and Window.
Figure 4. Housing Design: (a) Top View; (b) Isometric View.
Figure 5. Irradiation Experimental Setup: (a) Layout Drawing; (b) Irradiation Facility.
Figure 6. Experimental Data at 5H56M: (a) 3D Point Cloud; (b) Image of Camera; (c) Image of IP Camera.
Figure 7. Experimental Data at 5H57M: (a) 3D Point Cloud; (b) Image of Camera; (c) Image of IP Camera.
Figure 8. Optical Parts: (a) Initial State; (b) Final State.
Figure 9. Verification Experiment: (a) Experimental Setup; (b) Detailed Layout Drawing.
Figure 10. Measured 3D Point Cloud: (a) At Start; (b) Before Abnormal Termination; (c) At End.
Figure 11. Images of Scanner: (a) At Start; (b) Before Abnormal Termination; (c) At End.
Figure 12. Image Quality Analysis: (a) Noise Level; (b) Dead Pixels.
Figure 13. Refraction Model.
Figure 14. Plane Model.
Figure 15. Refraction of a ray: (a) Isometric view; (b) Plane normal to the rotation vector.
Figure 16. Design of experimental equipment.
Figure 17. Experimental setup.
Figure 18. Obtained 3D point cloud: (a) in air; (b) in water.
Figure 19. Results of checkerboard detection: (a) air; (b) water.
Figure 20. Estimation of measurement error due to refraction.
Figure 21. A pair of measurement errors.
Figure 22. Initial error of refraction correction.
Figure 23. Refraction correction result.
Figure 24. Measurement error.
Table 1. Light Transmittance for Window and Light Reflectance for Mirror.

| Optical Parts | Wavelength (nm) | Normal Parts (%) | Irradiated Parts (%) | Results |
|---|---|---|---|---|
| Fused Silica Window | 400 | 90.1 | 88.3 | −1.8% |
| Fused Silica Window | 700 | 97.3 | 96.6 | −0.7% |
| Aluminum Coated First Surface Mirror | 400 | 91.1 | 90.9 | −0.2% |
| Aluminum Coated First Surface Mirror | 700 | 80.5 | 79.9 | −0.6% |
| Silver Coated First Surface Mirror | 400 | 96.0 | 95.1 | −0.9% |
| Silver Coated First Surface Mirror | 700 | 98.0 | 97.5 | −0.5% |
Table 2. Approximate Half Value Layers in cm.

| Energy (MeV) | Uranium | Tungsten | Lead | Iron | Concrete | Water |
|---|---|---|---|---|---|---|
| 0.5 | – | – | 0.51 | 1.0 | 3.3 | 7.62 |
| 1.0 | – | – | 0.76 | 1.52 | 4.57 | 9.91 |
| 1.5 | – | – | 1.27 | 1.78 | 5.84 | 12.19 |
| 2.0 | – | – | 1.52 | 2.03 | 6.6 | 13.97 |
| Ir-192 | 0.28 | 0.33 | 0.48 | 1.27 | 4.5 | – |
| Cs-137 | – | – | 0.65 | 1.6 | 4.8 | – |
| Co-60 | 0.69 | 0.79 | 1.2 | 2.1 | 6.2 | – |
| Ra-226 | – | – | 1.66 | 2.2 | 6.9 | – |
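Since each half-value layer (HVL) halves the transmitted intensity, the HVLs in Table 2 convert directly into a required shielding thickness for a target attenuation factor; a minimal sketch:

```python
import math

def shield_thickness(hvl_cm, attenuation_factor):
    """Shield thickness that attenuates gamma intensity by the given factor:
    every half-value layer halves the intensity, so t = HVL * log2(factor)."""
    return hvl_cm * math.log2(attenuation_factor)
```

For example, attenuating Co-60 gammas by a factor of 10 with lead (HVL 1.2 cm from Table 2) requires roughly 4 cm of shielding.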
Table 3. Parameter study.

| No. | Mean Error (mm) | 95% Error (mm) | 99% Error (mm) | θC (deg) | φC (deg) | θL (deg) | φL (deg) | dC (mm) | dL (mm) | pL1x (mm) | pL1y (mm) | pL1z (mm) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Init. | 12.1812 | 101.1734 | 138.0446 | 0 | 0 | 18.8 | 0 | 69.4 | 79 | −542.5 | −8.7 | 87.1 |
| 1 | 12.0995 | 95.6218 | 130.2267 | 0.07 | −0.12 | 18.8 | 0 | 69.4 | 79 | −542.5 | −8.7 | 87.1 |
| 2 | 11.6205 | 93.4643 | 127.3738 | 0.07 | −0.12 | 18.57 | 0.24 | 69.4 | 79 | −542.5 | −8.7 | 87.1 |
| 3 | 2.015 | 2.9558 | 3.3455 | 0 | −0.05 | 18.69 | 0.5 | 89.4 | 0 | −542.5 | −8.7 | 87.1 |
| 4 | 1.5879 | 1.8913 | 2.0171 | 0 | −0.05 | 18.69 | 0.5 | 89.4 | 0 | −537.5 | −27.7 | 97 |
| 5 | 0.8352 | 1.0931 | 1.1999 | −0.22 | −0.09 | 18.485 | 1.18 | 89.4 | 0 | −537.5 | −27.7 | 97 |
| 6 | 0.5683 | 0.6895 | 0.7397 | −0.0038 | −0.0016 | 0.3226 | 0.0206 | 82.6 | 0 | −536.3 | −34.8 | 100.2 |
| 7 | 0.5531 | 0.6728 | 0.7222 | −0.225 | −0.1025 | 18.61 | 1.155 | 81.8 | 0 | −539.1 | −32.2 | 103.6 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
