Article

Time-of-Flight Imaging in Fog Using Polarization Phasor Imaging

1 Key Laboratory of Photoelectronic Imaging Technology and System, Ministry of Education, School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China
2 Beijing Institute of Technology, Zhuhai 519000, China
* Author to whom correspondence should be addressed.
Sensors 2022, 22(9), 3159; https://doi.org/10.3390/s22093159
Submission received: 27 March 2022 / Revised: 18 April 2022 / Accepted: 19 April 2022 / Published: 20 April 2022
(This article belongs to the Special Issue Optical Imaging, Optical Sensing and Devices)

Abstract

Due to light scattered by atmospheric aerosols, the contrast of amplitude images is degraded and depth measurements are greatly distorted for time-of-flight (ToF) imaging in fog. This problem limits the application of ToF imaging in outdoor settings, such as autonomous driving. To improve the quality of images captured by ToF cameras, we propose a polarization phasor imaging method for image recovery in foggy scenes. In this paper, optical polarimetric defogging is introduced into ToF phasor imaging, and the degree of polarization phasor is proposed to estimate the scattering component. A polarization phasor imaging model is established to separate the target component from the signal received by the ToF camera and recover the amplitude and depth information. The effectiveness of this method is confirmed by several experiments with artificial fog, and the experimental results demonstrate that the proposed method significantly improves image quality, with robustness across different thicknesses of fog.

1. Introduction

Time-of-flight (ToF) imaging is an active depth-sensing technology with the advantages of a compact structure, low cost, and real-time image capture [1]. It is widely used in many fields, such as autonomous driving [2], machine vision [3], and human–computer interaction [4]. The continuous-wave ToF (CW-ToF) camera is a common depth-sensing camera that indirectly acquires depth images by calculating the phase difference between the sent and received signals [5], while simultaneously capturing amplitude images of the scene. When a CW-ToF camera is used in foggy scenes, the contrast of the amplitude image is degraded and the depth image is greatly distorted, which limits the application of ToF cameras in outdoor settings, as shown in Figure 1. Therefore, it is necessary to improve the quality of the images captured by a ToF camera in fog.
In a foggy scene, a CW-ToF camera receives the sum of the light reflected from the target and the light scattered by atmospheric aerosols. The scattered light causes multipath interference (MPI), in which a single pixel of the ToF camera receives light along multiple paths through the scene [6], leading to significant errors in depth measurement. Various approaches have been studied for MPI correction, such as mixed pixel restoration [7,8], multiple frequency measurements [9,10,11,12], compressed sensing [13,14], and deep learning [15,16,17]. In this work, we only consider the MPI caused by scattering media. To correct MPI generated by scattering media, many methods have been developed, including convolutional sparse coding [18], data-driven priors [19], temporal properties [20], iterative optimization algorithms [21,22], image filtering [23], polarization properties of light [24,25], and phasor imaging [26,27]. Kijima D. et al. used multiple time-gating with a short-pulse ToF camera to estimate the scattering properties of fog and reconstruct the depth and intensity of a scene [20], which is not applicable to CW-ToF cameras. Wu R. et al. proposed the transient degree of polarization [24] and used the polarization properties of light to recover transient images in scattering scenes [25]. Transient images record the propagation of light through a scene with rich information [28], but they require hardware modifications or complex computational processes to accurately recover the time-domain response [28,29,30,31]. Phasor imaging recovers only the depth information in a scattering scene, using high-frequency [26] or multiple frequency measurements [27]. Our goal is to simultaneously recover the depth and amplitude information from a CW-ToF camera in a scattering scene, without restoring transient images.
Optical polarimetric defogging is a powerful method to enhance visibility in fog, exploiting the difference in polarization properties between reflected and scattered light [32]. It can be divided into passive polarization defogging [33,34,35,36] and active polarization defogging [37,38,39,40,41], both with good performance in visibility enhancement for RGB images. The effect of fog on amplitude images captured by a CW-ToF camera differs from its effect on RGB images, as it follows the regularity of MPI. Therefore, optical polarimetric defogging cannot be used directly to restore the amplitude image. Considering the principle of ToF imaging, we introduce polarization-based defogging into phasor imaging to improve the quality of images from a CW-ToF camera.
In this paper, we propose a polarization phasor imaging method for simultaneously recovering the depth and amplitude information of ToF imaging in fog, and establish the corresponding imaging model based on the polarization properties of light. The model estimates the scattering component using the degree of polarization phasor, and separates the target component to recover the depth and amplitude images. The main contributions of this work are:
  • Firstly, we introduce optical polarimetric defogging to ToF phasor imaging, expanding the application of polarization defogging.
  • Secondly, we define the degree of polarization phasor to describe the scattering effect in ToF imaging.
  • Finally, we establish a polarization phasor imaging model that recovers amplitude and depth images in foggy scenes by estimating the scattering component.
The remainder of this paper is organized as follows: Section 2 describes the phasor representation and models polarization phasor imaging; Section 3 verifies the effectiveness of the method through experiments; Section 4 discusses the limitations and potential future works; and Section 5 presents the conclusions.

2. The Polarization Phasor Imaging Method

The polarization phasor imaging method processes and analyzes polarization data in polar coordinates. First, a group of polarization images is captured at two orthogonal orientations of the linear polarization analyzer. The amplitude and depth captured at the same orientation are combined to form a phasor. Afterwards, a background region is selected to estimate the degree of polarization phasor of the scattering component. Once the scattering component is obtained, the amplitude and depth of the target component can be recovered. For better performance, image enhancement is applied to the amplitude image, and background masking is applied to the depth image. The pipeline of the proposed method is shown in Figure 2, and the details are discussed below.

2.1. Polarization Phasor Representation

A CW-ToF camera illuminates the scene with amplitude modulated light, and senses the depth information by calculating the phase shift between the sent and received signals. The amplitude image captured by a CW-ToF camera records the intensity information returned from the scene within a certain integration time, as shown in Figure 3a. In this work, the illumination beam of the ToF camera is polarized by a linear polarizer in front of the light source, and the images are captured from the sensor through a linear polarization analyzer, in Figure 3b.
The phasor representation of measurements from a ToF camera was proposed in [26] and further developed in [27]; it comprehensively combines the amplitude and depth information. Building on these studies, we apply the phasor representation to ToF polarization measurements (Figure 4), expressed as
p(θ, x) = a(θ, x) e^{iφ(θ, d(x))}    (1)
where x denotes a pixel; θ is the angle between the orientation of the linear polarizer and the orientation of the linear polarization analyzer; a is the amplitude at pixel x; φ is the phase difference between the sent and the reflected light, with φ(θ, d(x)) = 4πf d(x)/c; and d(x) is the distance between the target and the sensor at pixel x. The co-linear image, obtained when the orientation of the analyzer is identical to that of the linear polarizer, is represented as p(θ∥, x). The cross-linear image p(θ⊥, x) is captured with the orientation of the analyzer orthogonal to that of the linear polarizer.
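In code, a phasor is simply a complex number built from the measured amplitude and depth. A minimal numpy sketch (function names and the 1 m test value are illustrative; the 40 MHz modulation frequency matches the camera settings reported in Section 3):

```python
import numpy as np

C = 3e8   # speed of light (m/s)
F = 40e6  # modulation frequency (Hz); the paper's experiments use 40 MHz

def make_phasor(amplitude, depth):
    """Combine amplitude and depth into a complex phasor p = a * e^{i*phi},
    with phi = 4*pi*f*d/c as in the phasor representation above."""
    phase = 4.0 * np.pi * F * depth / C
    return amplitude * np.exp(1j * phase)

def depth_from_phasor(p):
    """Invert the phase back to distance: d = c * arg(p) / (4*pi*f)."""
    return C * np.angle(p) / (4.0 * np.pi * F)

# Round trip at 1 m; the unambiguous range at 40 MHz is c / (2f) = 3.75 m.
p = make_phasor(0.8, 1.0)
```

The modulus of the phasor carries the amplitude image and its angle carries the depth image, so both are processed with a single complex array.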
We assume that the target radiance is unpolarized, the scattering medium is spatially homogeneous, and single scattering is dominant, as in many earlier works [33,34,35,36,38,39,40,41]. Under these assumptions, in a clear scene the amplitude is invariant with θ, and the phase depends only on the actual distance and the modulation frequency of the light source. Therefore, p(θ∥, x) = p(θ⊥, x), and p(θ∥, x) coincides with p(θ⊥, x) in polar coordinates.
When a scattering medium exists in a scene, the signal received by the ToF camera is not the single-path reflected light from the scene, but the summation of the directly reflected light and multipath scattered light. This contradicts the single-path assumption of ToF measurement, so significant errors are induced. Figure 5 shows the process of ToF imaging in fog and the corresponding phasor representation. Assuming that the scattering medium is spatially uniform, the observed phasor is the composition of the phasors of the target and scattering components, as shown in Figure 5b, and is expressed as
p_m(θ, x) = p_t(θ, x) + p_s(θ, x) = ã_t(θ, x) e^{iφ(θ, d(x))} + ∫_{L(d(x))} s(θ) e^{iφ(θ, d(l))} dl    (2)
where ã_t is the attenuated amplitude of the target component, with ã_t = μ a_t; μ is the transmittance of the scattering medium and a_t is the amplitude of the target in a clear scene; s(θ) is the amplitude of the scattering component; and L(d(x)) is the set of all possible optical paths in front of the target.
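To see how scattering biases the measurement, the composition above can be simulated by discretizing the scattering integral along the optical path. A hedged sketch with illustrative values (it assumes a constant scattering amplitude s(θ) along the path, which is a simplification of the model):

```python
import numpy as np

C, F = 3e8, 40e6  # speed of light (m/s), modulation frequency (Hz)

def phi(d):
    """Phase accumulated over round-trip distance d."""
    return 4.0 * np.pi * F * d / C

def observed_phasor(a_t, d_target, mu, s_theta, n_steps=256):
    """Observed phasor = attenuated target phasor + scattering integral,
    with the integral over path lengths l in [0, d_target) approximated
    by a Riemann sum. Assumes a constant scattering amplitude s_theta."""
    target = mu * a_t * np.exp(1j * phi(d_target))
    l = np.linspace(0.0, d_target, n_steps, endpoint=False)
    scatter = np.sum(s_theta * np.exp(1j * phi(l))) * (d_target / n_steps)
    return target + scatter, target, scatter

p_m, p_t, p_s = observed_phasor(a_t=1.0, d_target=1.0, mu=0.5, s_theta=0.3)
```

Because the scattered paths are shorter than the target path, the phase of p_m is pulled below that of p_t, i.e., the depth is underestimated, consistent with the distortion observed in Section 3.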

2.2. Polarization Phasor Imaging Model

According to the theory of optical polarimetric defogging [33], the observed phasors measured at different orientations of the analyzer can be expressed as
p_m(θ∥, x) = p_t(θ∥, x) + p_s(θ∥, x)    (3)
p_m(θ⊥, x) = p_t(θ⊥, x) + p_s(θ⊥, x)    (4)
where p_t(θ∥, x) and p_s(θ∥, x) are the phasors of the target and scattering components measured at θ∥, and p_t(θ⊥, x) and p_s(θ⊥, x) are those measured at θ⊥. In addition, the total measurement is the vector sum of the data measured at the two orthogonal polarization states:
p_m(x) = p_m(θ∥, x) + p_m(θ⊥, x)
p_t(x) = p_t(θ∥, x) + p_t(θ⊥, x)
p_s(x) = p_s(θ∥, x) + p_s(θ⊥, x)    (5)
where p_t(θ∥, x) = p_t(θ⊥, x) = ½ p_t(x), since the amplitude of the target component is invariant when the orientation of the analyzer changes, and the phase of the target component is independent of the orientation of the analyzer.
Here, we define the degree of polarization phasor (DOPP) of the observed and scattering phasors as PP_m and PP_s, respectively, represented as
PP_m = [p_m(θ∥, x) − p_m(θ⊥, x)] / [p_m(θ∥, x) + p_m(θ⊥, x)]
PP_s = [p_s(θ∥, x) − p_s(θ⊥, x)] / [p_s(θ∥, x) + p_s(θ⊥, x)]    (6)
We assume that the proportion of the scattering component is higher in the measurement at θ∥ than in that at θ⊥. In this case, a_m(θ∥, x) > a_m(θ⊥, x) and a_s(θ∥, x) > a_s(θ⊥, x), as shown in Figure 6. If φ_s(θ∥, x) = φ_s(θ⊥, x), PP_s is a constant equal to the degree of polarization. Combining the above equations, the scattering phasor can be calculated from
p_s = [p_s(θ∥, x) − p_s(θ⊥, x)] / PP_s = [p_m(θ∥, x) − p_m(θ⊥, x)] / PP_s    (7)
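The DOPP definition and Equation (7) translate directly into code: because the unpolarized target splits evenly between the two analyzer orientations, the target term cancels in the polarization difference. A minimal numpy sketch with synthetic phasors (all values illustrative):

```python
import numpy as np

def dopp(p_par, p_perp):
    """Degree of polarization phasor: (p_par - p_perp) / (p_par + p_perp)."""
    return (p_par - p_perp) / (p_par + p_perp)

def scattering_phasor(p_m_par, p_m_perp, pp_s):
    """Equation (7): the target component, split evenly between the two
    orientations, cancels in the difference; dividing by PP_s gives p_s."""
    return (p_m_par - p_m_perp) / pp_s

# Synthetic check: an unpolarized target splits evenly, scattering does not.
p_t = 0.8 * np.exp(1j * 1.2)       # target phasor (illustrative)
p_s_par = 0.5 * np.exp(1j * 0.4)   # scattering at the co-linear orientation
p_s_perp = 0.2 * np.exp(1j * 0.4)  # scattering at the cross-linear orientation
p_m_par = 0.5 * p_t + p_s_par
p_m_perp = 0.5 * p_t + p_s_perp
pp_s = dopp(p_s_par, p_s_perp)     # in practice estimated from the background
p_s_hat = scattering_phasor(p_m_par, p_m_perp, pp_s)
```

Here p_s_hat recovers the total scattering phasor p_s_par + p_s_perp exactly, and the target phasor follows by subtracting it from p_m_par + p_m_perp.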
Equation (7) shows that PP_s is the critical factor in estimating the scattering component. In an outdoor scene, the images captured by a ToF camera usually contain a background region, where the measurements can be approximated as pure scattering component, since the reflected light there is negligible. PP_s can therefore be estimated from the background:
PP_s = [p_s(θ∥, x_b) − p_s(θ⊥, x_b)] / [p_s(θ∥, x_b) + p_s(θ⊥, x_b)]    (8)
where x_b is a pixel in the background region. After estimating PP_s, p_s can be calculated from Equation (7), and the target component can be obtained from the observed phasor:
p_t = p_m − p_s    (9)
The depth is then recovered from the phase of p_t as
d_recovery = c · arg(p_t) / (4πf)    (10)
where the arg(·) operator returns the angle of the phasor. In the background region, the recovered depth is noisy due to the weak reflected light, and is usually suppressed using an image mask [21,22,27]. The image mask can be obtained by image segmentation; we applied the maximum between-class variance (Otsu) method [42] to the recovered amplitude image to mask the background of the depth image.
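The last two steps, phase-to-depth conversion and background masking, can be sketched as follows. The Otsu threshold is reimplemented here in plain numpy for self-containment (a library routine such as scikit-image's would normally be used); array shapes and values are illustrative:

```python
import numpy as np

C, F = 3e8, 40e6  # speed of light (m/s), modulation frequency (Hz)

def otsu_threshold(img, bins=256):
    """Maximum between-class variance (Otsu) threshold on a grayscale image."""
    hist, edges = np.histogram(img, bins=bins)
    w = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(w)                 # probability of the lower class
    w1 = 1.0 - w0                     # probability of the upper class
    mu_cum = np.cumsum(w * centers)
    mu0 = mu_cum / np.maximum(w0, 1e-12)
    mu1 = (mu_cum[-1] - mu_cum) / np.maximum(w1, 1e-12)
    var_between = w0 * w1 * (mu0 - mu1) ** 2
    return centers[np.argmax(var_between)]

def masked_depth(p_t):
    """Depth from the target phasor, with low-amplitude background masked out."""
    depth = C * np.angle(p_t) / (4.0 * np.pi * F)
    amplitude = np.abs(p_t)
    mask = amplitude > otsu_threshold(amplitude)
    return np.where(mask, depth, np.nan)
```

Masked pixels are set to NaN here purely for illustration; any sentinel or a separate boolean mask works equally well.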
The amplitude recovered from p_t is ã_t, which still contains the attenuation effect of fog. In traditional optical imaging, the transmittance μ is usually obtained from the scattering component. However, this is not suitable for ToF imaging, since the amplitude of the scattering phasor is affected by the phase. Therefore, image enhancement can be used to further improve the quality of the amplitude image, such as histogram-based [43,44], Retinex-based [45,46], and grayscale transformation methods [47,48]. In this work, we used a simple grayscale power transformation from [49] to enhance the amplitude image.
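As an illustration of the grayscale power transformation (the exponent 0.5 below is an arbitrary example value; the self-adaptive method of [49] chooses the exponent from the image statistics):

```python
import numpy as np

def power_transform(amplitude, gamma=0.5):
    """Normalize the amplitude image to [0, 1] and apply a power-law
    (gamma) curve; gamma < 1 brightens dark, fog-attenuated regions."""
    a = np.asarray(amplitude, dtype=float)
    a = (a - a.min()) / max(a.max() - a.min(), 1e-12)
    return a ** gamma

a_enhanced = power_transform(np.array([0.0, 0.25, 1.0]))
```

The transform is monotonic, so it stretches contrast in dark regions without reordering gray levels.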

3. Experiments and Results

The effectiveness of the proposed method was evaluated in several experiments with artificial fog. The experiments were conducted in a closed chamber with a light-absorbing cover to avoid additional MPI. The size of the closed chamber is around 120 × 40 × 50 cm³, as shown in Figure 7a. Figure 7b shows our experimental setup, which includes a CW-ToF camera manufactured by Texas Instruments (OPT8241-CDK-EVM), a linear polarizer in front of the light source, a linear polarization analyzer in front of the sensor, and a fog generator. The fog was generated from a compound of water, glycerol, and alcohol, and the images were captured after a few minutes, once a stable and uniform scattering environment had formed. The thickness of the fog was measured with a laser haze detector (HK-B5S) and described as the number of scattering particles per cubic decimeter. In the parameter settings of the ToF camera, the modulation frequency of the light source was 40 MHz and the illumination power was 8800 mW. A group of polarization images was captured at two orthogonal orientations of the linear polarization analyzer, with an integration time of 6.2 ms. In addition, each image was averaged over 100 frames as pre-processing to reduce time-dependent noise. The amplitude and depth images captured without fog were considered the ground truth. The background pixels x_b were selected from the area of the light-absorbing cover behind the targets, and the DOPP was estimated by averaging over the background pixels. The background pixels correspond to the distant and dark points of the scene, whose amplitudes are significantly lower, and phases larger, than those of other pixels.
The experimental scene is shown in Figure 8a and contains five targets: a cylinder made of white paper, a white diffuse plane, a plush toy, a kraft paper box, and a blue box. The distance between the targets and the sensor was roughly 1 m. We tested our method under different thicknesses of fog (Figure 8b), with 753,640, 935,340, and 1,204,890 scattering particles per cubic decimeter corresponding to thin, medium, and thick fog, respectively. The experimental results are shown in Figure 9, and the quantitative evaluations of the amplitude and depth images are listed in Table 1 and Table 2, respectively.
Ordinary ToF cameras directly capture the amplitude and depth images through the linear polarization analyzer, as shown in Figure 9a,b. The amplitude measured at θ∥ is brighter than that measured at θ⊥, since the sensor receives more of the fog component at θ∥. For the same reason, the depth measured at θ⊥ is closer to the ground truth than that measured at θ∥. As the density of the fog increases, the contrast of the amplitude gradually decreases, and the difference between the measured depth and the ground truth increases. Our method performs well in different thicknesses of fog, as shown in Figure 9c. The contrast of the amplitude images is enhanced by our method, and the fog component is effectively suppressed. The amplitude images are evaluated by the peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) in Table 1. In both indices, our method is superior to the ordinary ToF camera with the linear polarization analyzer.
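For reference, PSNR against the fog-free ground truth can be computed as follows (a standard definition, not code from the paper; SSIM is more involved and a library implementation would typically be used):

```python
import numpy as np

def psnr(img, ref, peak=1.0):
    """Peak signal-to-noise ratio in dB between a recovered image and the
    ground-truth image, both scaled to the same peak value."""
    mse = np.mean((np.asarray(img, float) - np.asarray(ref, float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

ref = np.linspace(0.0, 1.0, 101)
noisy = ref + 0.1  # uniform error of 0.1 -> MSE = 0.01 -> PSNR = 20 dB
```

Higher PSNR indicates a smaller pixel-wise deviation from the ground truth amplitude image.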
The depth is greatly distorted in a foggy scene and is significantly underestimated by an ordinary ToF camera without further processing, whereas our method restores the depth of the different targets in different thicknesses of fog. Figure 10 shows the absolute depth error under different thicknesses of fog, and the quantitative results for the different target regions are given in Table 2. In the regions of the white diffuse plane and the cylinder, the depth recovery error is smallest, since the target reflectance and the depolarization effect are highest there. At the edge of the blue box region, some depth information is lost, and the size of the depth-loss area increases with the density of the fog. This is because the returned light intensity is low, causing the edge to be incorrectly classified as background during masking. In addition, flying pixels exist in the recovered depth, especially in thick fog. The reason is that the background of the recovered amplitude image contains residual scattering information and is easily misclassified during background masking.

4. Discussion

4.1. The Depolarization Degree of Targets

We assume that the target radiance is unpolarized in ToF polarization imaging in fog. In practice, some target radiance is partially polarized. In this case, the polarization of the target component must be considered, and the scattering phasor is calculated as
p_s = { [p_m(θ∥, x) − p_m(θ⊥, x)] − [p_t(θ∥, x) − p_t(θ⊥, x)] } / PP_s    (11)
If the scattering phasor is still calculated by Equation (7), the estimation error increases. However, the target phasor is unknown and is exactly what we expect to recover from the observed phasor, so this is an ill-posed problem. Some studies have addressed this problem for RGB images [37]. In future work, we will consider the polarization contribution from the target and improve the estimation accuracy of the scattering phasor.

4.2. The Attenuation Factor of the Amplitude

In computer vision, the transmittance of the scattering medium is usually calculated based on an atmospheric degradation model. However, it is not applicable to directly estimate the transmittance μ from the scattering component for ToF imaging, as the amplitude of the scattering phasor is not a simple superposition of the amplitudes over all scattering paths. Transmittance estimation is important for improving the quality of the recovered amplitude image, as well as the image mask. In addition, the transmittance is related to the distance, so calculating the attenuation factor based on the recovered depth information is worth studying in future work.

4.3. The Homogeneity of Scattering Media

In this work, the scattering medium is assumed to be spatially homogeneous. In a real foggy environment, atmospheric aerosols are usually distributed non-uniformly in the direction perpendicular to the ground but approximately homogeneously in directions parallel to the ground. The proposed method estimates the scattering component caused by fog per pixel, so this assumption is not a limitation of our method in most foggy scenes.

5. Conclusions

We proposed a polarization phasor imaging method for recovering depth and amplitude information with a polarization ToF imaging system. We established a polarization phasor imaging model and defined the DOPP, which is used to estimate the scattering component for image recovery. The effectiveness of the proposed method was verified by several experiments with different thicknesses of fog. However, the method still faces several challenges, such as the low depolarization degree of some targets, obtaining the attenuation factor of the amplitude, and non-uniform scattering media. In future research, the polarization degree of targets and the attenuation factor should be considered to enhance the applicability in practical scenes.

Author Contributions

Conceptualization, Y.Z. (Yixin Zhang); methodology, Y.Z. (Yixin Zhang); software, Y.Z. (Yixin Zhang); validation, Y.Z. (Yixin Zhang) and Y.Z. (Yuwei Zhao); formal analysis, Y.Z. (Yixin Zhang); investigation, Y.Z. (Yixin Zhang), Y.Z. (Yuwei Zhao) and Y.F.; resources, X.W. and Y.F.; data curation, Y.Z. (Yixin Zhang); writing—original draft preparation, Y.Z. (Yixin Zhang); writing—review and editing, Y.Z. (Yixin Zhang); visualization, Y.Z. (Yixin Zhang); supervision, X.W., Y.Z. (Yuwei Zhao) and Y.F.; project administration, Y.Z. (Yixin Zhang); funding acquisition, X.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China, grant number 62031018.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

We thank Zhibin Sun for providing the hardware setups.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Mufti, F.; Mahony, R. Statistical Analysis of Measurement Processes for Time-of-Flight Cameras. Proc. Spie Int. Soc. Opt. Eng. 2009, 7447, 720–731. [Google Scholar] [CrossRef]
  2. Niskanen, I.; Immonen, M.; Hallman, L.; Yamamuchi, G.; Mikkonen, M.; Hashimoto, T.; Nitta, Y.; Keränen, P.; Kostamovaara, J.; Heikkilä, R. Time-of-Flight Sensor for Getting Shape Model of Automobiles toward Digital 3D Imaging Approach of Autonomous Driving—Science Direct. Autom. Constr. 2021, 121, 103429. [Google Scholar] [CrossRef]
  3. Conde, M.H. A Material-Sensing Time-of-Flight Camera. IEEE Sensors Lett. 2020, 4, 1–4. [Google Scholar] [CrossRef]
  4. Routray, J.; Rout, S.; Panda, J.J.; Mohapatra, B.S.; Panda, H. Hand Gesture Recognition Using TOF Camera. Int. J. Appl. Eng. Res. 2021, 16, 302–307. [Google Scholar] [CrossRef]
  5. Lange, R.; Seitz, P. Solid-State Time-of-Flight Range Camera. IEEE J. Quantum Electron. 2001, 37, 390–397. [Google Scholar] [CrossRef] [Green Version]
  6. Bhandari, A.; Raskar, R. Signal Processing for Time-of-Flight Imaging Sensors: An Introduction to Inverse Problems in Computational 3-D Imaging. IEEE Signal Process. Mag. 2016, 33, 45–58. [Google Scholar] [CrossRef]
  7. Godbaz, J.P.; Cree, M.J.; Dorrington, A.A. Mixed Pixel Return Separation for a Full-Field Ranger. In Proceedings of the 2008 23rd International Conference Image and Vision Computing New Zealand, Christchurch, New Zealand, 26–28 November 2008; pp. 1–6. [Google Scholar] [CrossRef] [Green Version]
  8. Godbaz, J.P.; Cree, M.J.; Dorrington, A.A. Multiple Return Separation for a Full-Field Ranger via Continuous Waveform Modelling. In Image Processing: Machine Vision Applications II; SPIE: Bellingham, WA, USA, 2009; Volume 7251, pp. 269–280. [Google Scholar] [CrossRef] [Green Version]
  9. Bhandari, A.; Feigin, M.; Izadi, S.; Rhemann, C.; Schmidt, M.; Raskar, R. Resolving Multipath Interference in Kinect: An Inverse Problem Approach. In Proceedings of the SENSORS, 2014 IEEE, Valencia, Spain, 15 December 2014; pp. 614–617. [Google Scholar] [CrossRef]
  10. Dorrington, A.A.; Godbaz, J.P.; Cree, M.J.; Payne, A.D.; Streeter, L.V. Separating True Range Measurements from Multi-Path and Scattering Interference in Commercial Range Cameras. In Three-Dimensional Imaging, Interaction, and Measurement; International Society for Optics and Photonics: Bellingham, WA, USA, 2011; Volume 7864, p. 786404. [Google Scholar] [CrossRef] [Green Version]
  11. Kirmani, A.; Benedetti, A.; Chou, P.A. SPUMIC: Simultaneous Phase Unwrapping and Multipath Interference Cancellation in Time-of-Flight Cameras Using Spectral Methods. In Proceedings of the 2013 IEEE International Conference on Multimedia and Expo (ICME), San Jose, CA, USA, 15–19 July 2013; pp. 1–6. [Google Scholar] [CrossRef]
  12. Freedman, D.; Smolin, Y.; Krupka, E.; Leichter, I.; Schmidt, M. SRA: Fast Removal of General Multipath for ToF Sensors. In Proceedings of the Computer Vision—ECCV 2014, Zurich, Switzerland, 6–12 September 2014; Volume 8689, pp. 234–249. [Google Scholar] [CrossRef] [Green Version]
  13. Patil, S.; Bhade, P.M.; Inamdar, V.S. Depth Recovery in Time of Flight Range Sensors via Compressed Sensing Algorithm. Int. J. Intell. Robot. Appl. 2020, 4, 243–251. [Google Scholar] [CrossRef]
  14. Patil, S.S.; Inamdar, V.S. Resolving Interference in Time of Flight Range Sensors via Sparse Recovery Algorithm. In Proceedings of the ICIGP 2020: 2020 3rd International Conference on Image and Graphics Processing, New York, NY, USA, 8–10 February 2020; pp. 106–112. [Google Scholar] [CrossRef]
  15. Guo, Q.; Frosio, I.; Gallo, O.; Zickler, T.; Kautz, J. Tackling 3D ToF Artifacts Through Learning and the FLAT Dataset. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 368–383. [Google Scholar] [CrossRef] [Green Version]
  16. Agresti, G.; Schaefer, H.; Sartor, P.; Zanuttigh, P. Unsupervised Domain Adaptation for ToF Data Denoising with Adversarial Learning. In Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019; pp. 5579–5586. [Google Scholar] [CrossRef] [Green Version]
  17. Su, S.; Heide, F.; Wetzstein, G.; Heidrich, W. Deep End-to-End Time-of-Flight Imaging. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 6383–6392. [Google Scholar] [CrossRef] [Green Version]
  18. Heide, F.; Xiao, L.; Kolb, A.; Hullin, M.B.; Heidrich, W. Imaging in Scattering Media Using Correlation Image Sensors and Sparse Convolutional Coding. Opt. Express 2014, 22, 26338–26350. [Google Scholar] [CrossRef] [Green Version]
  19. Gutierrez-Barragan, F.; Chen, H.G.; Gupta, M.; Velten, A.; Gu, J. IToF2dToF: A Robust and Flexible Representation for Data-Driven Time-of-Flight Imaging. IEEE Trans. Comput. Imaging 2021, 7, 1205–1214. [Google Scholar] [CrossRef]
  20. Kijima, D.; Kushida, T.; Kitajima, H.; Tanaka, K.; Kubo, H.; Funatomi, T.; Mukaigawa, Y. Time-of-Flight Imaging in Fog Using Multiple Time-Gated Exposures. Opt. Express 2021, 29, 6453–6467. [Google Scholar] [CrossRef]
  21. Fujimura, Y.; Sonogashira, M.; Iiyama, M. Defogging Kinect: Simultaneous Estimation of Object Region and Depth in Foggy Scenes. arXiv 2019, arXiv:1904.00558. [Google Scholar]
  22. Fujimura, Y.; Sonogashira, M.; Iiyama, M. Simultaneous Estimation of Object Region and Depth in Participating Media Using a ToF Camera. IEICE Trans. Inf. Syst. 2020, 103, 660–673. [Google Scholar] [CrossRef] [Green Version]
  23. Lu, H.; Zhang, Y.; Li, Y.; Zhou, Q.; Tadoh, R.; Uemura, T.; Kim, H.; Serikawa, S. Depth Map Reconstruction for Underwater Kinect Camera Using Inpainting and Local Image Mode Filtering. IEEE Access 2017, 5, 7115–7122. [Google Scholar] [CrossRef]
  24. Wu, R.; Suo, J.; Dai, F.; Zhang, Y.; Dai, Q. Scattering Robust 3D Reconstruction via Polarized Transient Imaging. Opt. Lett. 2016, 41, 3948–3951. [Google Scholar] [CrossRef] [PubMed]
  25. Wu, R.; Jarabo, A.; Suo, J.; Dai, F.; Zhang, Y.; Dai, Q.; Gutierrez, D. Adaptive Polarization-Difference Transient Imaging for Depth Estimation in Scattering Media. Opt. Lett. 2018, 43, 1299–1302. [Google Scholar] [CrossRef] [PubMed]
  26. Gupta, M.; Nayar, S.K.; Hullin, M.B.; Martin, J. Phasor Imaging: A Generalization of Correlation-Based Time-of-Flight Imaging. ACM Trans. Graph. 2015, 34, 156. [Google Scholar] [CrossRef]
  27. Muraji, T.; Tanaka, K.; Funatomi, T.; Mukaigawa, Y. Depth from Phasor Distortions in Fog. Opt. Express 2019, 27, 18858–18868. [Google Scholar] [CrossRef]
  28. Heide, F.; Hullin, M.; Gregson, J.; Heidrich, W. Light-in-Flight: Transient Imaging Using Photonic Mixer Devices. ACM Trans. Graph. 2013, 32, 9. [Google Scholar] [CrossRef]
  29. Heide, F.; Hullin, M.B.; Gregson, J.; Heidrich, W. Low-Budget Transient Imaging Using Photonic Mixer Devices. ACM Trans. Graph. 2013, 32, 45. [Google Scholar] [CrossRef]
Figure 1. Amplitude and depth measurements in clear (upper) and foggy scenes (lower) using CW-ToF cameras.
Figure 2. Pipeline for the polarization phasor imaging method. DOPP is an abbreviation for degree of polarization phasor.
Figure 3. The CW-ToF imaging process and polarization imaging setup. (a) The illumination of a CW-ToF camera is modulated with a continuous wave, and the camera simultaneously acquires the amplitude and depth of a scene. (b) Linear polarizers are added to the CW-ToF imaging system, which constitutes the polarization imaging system.
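As background for Figure 3a, the amplitude and depth delivered by a CW-ToF camera are conventionally recovered by four-bucket demodulation of the correlation signal. A minimal sketch of this textbook formulation follows; it is not code from the paper, and all names and the synthetic example are illustrative:

```python
import numpy as np

C = 3e8  # speed of light (m/s)

def cw_tof_decode(c0, c1, c2, c3, mod_freq_hz):
    """Standard four-bucket CW-ToF demodulation.

    c0..c3 are correlation samples taken at reference phase offsets
    of 0, 90, 180 and 270 degrees. Returns (amplitude, depth in m).
    """
    quadrature = c3 - c1
    in_phase = c0 - c2
    amplitude = 0.5 * np.hypot(in_phase, quadrature)
    phase = np.mod(np.arctan2(quadrature, in_phase), 2 * np.pi)  # [0, 2*pi)
    depth = C * phase / (4 * np.pi * mod_freq_hz)  # factor of 2 for round trip
    return amplitude, depth

# Synthetic check: a target at 2.5 m with 20 MHz modulation.
f = 20e6
phi = 4 * np.pi * f * 2.5 / C                       # true phase delay
samples = [2.0 + 1.0 * np.cos(phi + k * np.pi / 2) for k in range(4)]
amp, d = cw_tof_decode(*samples, f)                 # amp ≈ 1.0, d ≈ 2.5 m
```

The unambiguous range of this scheme is c / (2 f); at 20 MHz that is 7.5 m, which is why multi-frequency variants exist.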
Figure 4. Polarization phasor representation.
Figure 5. The imaging process and phasor representation of CW-ToF imaging in fog. (a) Light path of ToF imaging in the scattering scene. The red lines are the optical paths of directly reflected photons, and the brown lines represent the light paths of scattered photons. (b) Phasor representation of ToF measurements in fog. The observed phasor in red can be decomposed into the target phasor in orange and the scattering phasors in blue.
Figure 6. Polarization phasor representation in a scattering scene.
Figure 7. Experimental environment for polarization phasor imaging. (a) Fog chamber. Experiments with fog are conducted in this fog chamber. (b) Experimental setups.
Figure 8. Experimental scene. (a) Target objects. (b) Scenes under different thicknesses of fog. The numbers of scattering particles per cubic decimeter are 753,640, 935,340, and 1,204,890 from left to right.
Figure 9. Experimental results under different thicknesses of fog. (a) The image captured by an ordinary ToF camera at θ∥. (b) The image captured by an ordinary ToF camera at θ⊥. (c) The image recovered by our method. (d) The ground truth captured without fog.
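Panels (a) and (b) of Figure 9 are the raw polarizer-parallel and polarizer-perpendicular measurements from which the recovery in (c) starts. For intuition only, here is a sketch of classical intensity-domain polarization descattering in the spirit of Schechner et al.; the paper's contribution is to carry this idea into the ToF phasor domain, which this simplified sketch does not reproduce, and the closed-form inputs below are assumed known:

```python
import numpy as np

def polarization_descatter(i_par, i_perp, p_scat, a_inf):
    """Classical intensity-domain polarization dehazing (Schechner-style).

    i_par, i_perp : measurements through parallel / perpendicular polarizers
    p_scat        : degree of polarization of the scattered light
    a_inf         : scattered radiance at infinite distance (airlight)
    Assumes the directly reflected target signal is essentially unpolarized.
    """
    i_total = i_par + i_perp
    airlight = (i_par - i_perp) / p_scat                # estimated scattering term
    transmission = np.clip(1.0 - airlight / a_inf, 1e-3, 1.0)
    return (i_total - airlight) / transmission          # recovered target radiance

# Synthetic check: target radiance 2.0, transmission 0.6, scattering DOP 0.5.
l_obj, t, a_inf, p = 2.0, 0.6, 1.0, 0.5
airlight = a_inf * (1.0 - t)
i_par = l_obj * t / 2 + airlight * (1 + p) / 2
i_perp = l_obj * t / 2 + airlight * (1 - p) / 2
rec = polarization_descatter(i_par, i_perp, p, a_inf)   # ≈ 2.0
```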
Figure 10. Absolute error of depth under different thicknesses of fog. (a) The absolute error of the image captured by an ordinary ToF camera at θ∥. (b) The absolute error of the image captured by an ordinary ToF camera at θ⊥. (c) The absolute error of the image recovered by our method.
Table 1. Quantitative evaluation of the amplitudes using PSNR and SSIM.
| Thickness of Fog | Amplitude | PSNR (dB) | SSIM |
|---|---|---|---|
| Thin | Ordinary ToF measured at θ∥ | 59.48 | 0.338 |
| Thin | Ordinary ToF measured at θ⊥ | 57.92 | 0.262 |
| Thin | Ours | 63.75 | 0.684 |
| Medium | Ordinary ToF measured at θ∥ | 59.47 | 0.328 |
| Medium | Ordinary ToF measured at θ⊥ | 57.87 | 0.249 |
| Medium | Ours | 61.40 | 0.583 |
| Thick | Ordinary ToF measured at θ∥ | 59.21 | 0.294 |
| Thick | Ordinary ToF measured at θ⊥ | 57.84 | 0.239 |
| Thick | Ours | 60.28 | 0.504 |
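The PSNR figures in Table 1 compare each amplitude image against the fog-free ground truth and can be reproduced with a few lines of NumPy (SSIM is more involved; `skimage.metrics.structural_similarity` is the usual off-the-shelf choice). A sketch, with the synthetic images and the 8-bit peak value as illustrative assumptions:

```python
import numpy as np

def psnr(img, ref, peak=None):
    """Peak signal-to-noise ratio (dB) between a recovered amplitude
    image and its fog-free ground truth."""
    img = np.asarray(img, dtype=np.float64)
    ref = np.asarray(ref, dtype=np.float64)
    if peak is None:
        peak = ref.max()                       # dynamic range of the reference
    mse = np.mean((img - ref) ** 2)
    return float(10.0 * np.log10(peak ** 2 / mse))

# Synthetic check: additive Gaussian noise, sigma = 5, on an 8-bit range.
rng = np.random.default_rng(0)
truth = rng.uniform(0.0, 255.0, size=(64, 64))
noisy = truth + rng.normal(0.0, 5.0, size=truth.shape)
val_noisy = psnr(noisy, truth, peak=255.0)     # roughly 34 dB for sigma = 5
```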
Table 2. Quantitative evaluation of depths using the mean absolute error and mean squared error (MSE). Each cell shows the mean absolute error (m)/MSE.
| Thickness of Fog | Depth | Cylinder | White Diffuse Plane | Plush Toy | Kraft Paper Box | Blue Box |
|---|---|---|---|---|---|---|
| Thin | Ordinary ToF measured at θ∥ | 0.28/3.03 | 0.18/1.74 | 0.43/6.35 | 0.21/2.80 | 0.34/3.06 |
| Thin | Ordinary ToF measured at θ⊥ | 0.11/1.27 | 0.05/0.45 | 0.22/3.23 | 0.09/1.21 | 0.21/2.30 |
| Thin | Ours | 0.02/0.18 | 0.01/0.14 | 0.03/0.57 | 0.03/0.38 | 0.04/0.49 |
| Medium | Ordinary ToF measured at θ∥ | 0.28/2.52 | 0.19/1.50 | 0.44/6.58 | 0.23/3.75 | 0.35/5.32 |
| Medium | Ordinary ToF measured at θ⊥ | 0.14/1.30 | 0.07/0.55 | 0.29/4.55 | 0.12/2.05 | 0.27/3.29 |
| Medium | Ours | 0.02/0.20 | 0.02/0.17 | 0.04/0.60 | 0.03/0.53 | 0.05/0.66 |
| Thick | Ordinary ToF measured at θ∥ | 0.37/3.29 | 0.28/2.80 | 0.52/6.94 | 0.30/4.84 | 0.40/5.38 |
| Thick | Ordinary ToF measured at θ⊥ | 0.19/1.68 | 0.10/1.04 | 0.35/4.73 | 0.16/2.66 | 0.30/4.07 |
| Thick | Ours | 0.03/0.23 | 0.03/0.28 | 0.06/0.67 | 0.06/0.98 | 0.05/0.70 |
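The per-object errors in Table 2 follow from masking each object's pixel region and comparing the measured depth map with the fog-free one. A minimal sketch of that evaluation; how the masks are obtained is scene-specific and assumed given here:

```python
import numpy as np

def depth_errors(depth, depth_gt, mask):
    """Mean absolute error (m) and mean squared error of a measured
    depth map over one object's boolean pixel mask."""
    err = depth[mask] - depth_gt[mask]
    return float(np.mean(np.abs(err))), float(np.mean(err ** 2))

# Synthetic check: a flat plane at 2 m with a uniform 10 cm bias.
depth_gt = np.full((4, 4), 2.0)
depth = depth_gt + 0.1
mask = np.ones_like(depth_gt, dtype=bool)
mae, mse = depth_errors(depth, depth_gt, mask)  # mae ≈ 0.1 m, mse ≈ 0.01
```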
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Zhang, Y.; Wang, X.; Zhao, Y.; Fang, Y. Time-of-Flight Imaging in Fog Using Polarization Phasor Imaging. Sensors 2022, 22, 3159. https://doi.org/10.3390/s22093159

