Article

Real-Time Vision through Haze Based on Polarization Imaging

1 College of Computer Science and Technology, Jilin University, Changchun 130012, China
2 State Key Laboratory of Applied Optics, Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
3 School of Physics and Optoelectronic Engineering, Xidian University, Xi'an 710071, China
* Authors to whom correspondence should be addressed.
Appl. Sci. 2019, 9(1), 142; https://doi.org/10.3390/app9010142
Submission received: 19 November 2018 / Revised: 19 December 2018 / Accepted: 25 December 2018 / Published: 3 January 2019
(This article belongs to the Special Issue State-of-the-art Laser Gas Sensing Technologies)

Abstract

Various gases and aerosols present in bad weather can cause severe image degradation, which seriously affects the detection efficiency of optical monitoring stations for high-pollutant-discharge systems. Penetrating these gases and aerosols to sense and detect the discharge of pollutants therefore plays an important role in pollutant emission detection. Against this backdrop, we propose a real-time optical monitoring system based on the Stokes vectors, developed through an analysis of the scattering and polarization characteristics of both gases and aerosols in the atmosphere. The system is immune to the effects of various gases and aerosols on the target to be detected and achieves real-time sensing and detection of high-pollutant-discharge systems under bad weather conditions. The imaging system comprises four polarizers with different polarization directions integrated into independent cameras aligned parallel to the optical axis, so that the Stokes vectors can be acquired from the polarized azimuth images. Our results show that this approach achieves high-contrast, high-definition images in real time without loss of spatial resolution compared with conventional imaging techniques.

1. Introduction

Environmental pollution has become a matter of great concern in recent years. To protect the environment, the inspection of environmental quality, and especially the sensing and detection of pollutant emissions, is a problem that must be addressed [1,2]. In view of this situation, many countries have set up environmental stations to monitor companies with high pollutant emissions [3]. Optical sensing and detection has become the most popular monitoring method owing to its high resolution, abundant information, and simple equipment. However, in bad weather, such as haze and rain, atmospheric aerosols and various gases, especially haze particles, strongly scatter light waves [4,5]. Images obtained under these conditions undergo serious degradation, which reduces the efficiency of optical sensing and detection as well as the monitoring distance [6,7]. Therefore, without increasing the number of monitoring stations, improving the sensing and detection distance of a monitoring station in bad weather and removing the influence of gases and aerosols on monitoring efficiency have become urgent problems.
This paper proposes a real-time optical sensing and detection system based on the Stokes vectors, developed through an analysis of the scattering and polarization characteristics of gases and aerosols in the atmosphere. The system can sense and detect the emission of pollutants in real time in bad weather and extends the monitoring distance of the environmental monitoring system. In this study, we first analyzed the principles of polarization imaging algorithms and established a physical model for polarization imaging through gases and aerosols in bad weather based on the Stokes vectors. Next, we developed a real-time optical sensing and detection system with four cameras aligned parallel to the optical axis. By solving for the linear polarization parameters of the Stokes matrix, we achieved real-time registration of the different polarization images. Further, we achieved visual enhancement of the polarization degree images using an image registration algorithm based on the speeded-up robust features (SURF) algorithm and a contrast-limited adaptive histogram equalization (CLAHE) algorithm based on bilinear interpolation. Our results show that the proposed system can achieve real-time high-contrast, high-definition imaging under bad weather conditions without a loss of spatial resolution.

2. Physical Model for Polarization Image

To construct our physical model, we consider the situation in which two degraded orthogonally polarized azimuth images are first acquired by rotating the polarizer installed in front of the detector in bad weather that has various gases and aerosols present. A clear scene is subsequently acquired by effectively splitting the azimuth images based on the differences of polarization properties between the atmospheric light and the target light [8]. The total intensity incident on the detector can be expressed as:
$$ I_{Total} = T + A = L_{Object}\, e^{-\beta z} + A_{\infty} \left( 1 - e^{-\beta z} \right) \tag{1} $$
The total light intensity $I_{Total}$ received by the detector consists of the target light $T$ and the atmospheric light $A$. Target light $T$ is the target radiance $L_{Object}$ that reaches the detector after being scattered by various gases and aerosols; it decays exponentially during transmission [9], as shown in the attenuation model in Figure 1a. Atmospheric light $A$ refers to the sunlight received by the detector after scattering by various gases and aerosols, and it increases with the detection distance, as shown in the atmospheric light model in Figure 1b. The relationship between $I_{Total}$, $T$, $A$, and $L_{Object}$ is given by Equation (1), where $A_{\infty}$ denotes the atmospheric light intensity from infinity, $\beta$ the atmospheric attenuation coefficient, and $z$ the distance between the target and the detector.
During polarization imaging, two orthogonally polarized azimuth images are acquired at the detector by rotating the polarizer. In particular, $I_{min}$ denotes the minimum atmospheric light intensity, whereas its orthogonal counterpart, $I_{max}$, denotes the maximum atmospheric light intensity [6]. The expressions for $I_{min}$, $I_{max}$, and $I_{Total}$ are as follows:
$$ I_{max} = I_T / 2 + I_B^{max}, \qquad I_{min} = I_T / 2 + I_B^{min} \tag{2} $$
$$ I_{Total} = I_{max} + I_{min} \tag{3} $$
Upon selecting an area of uniform atmospheric light that contains no targets and measuring $A_{max}(x, y)$ and $A_{min}(x, y)$, the polarization of the atmospheric light can be expressed as Equation (4), where $A_{max}(x, y)$ and $A_{min}(x, y)$ denote the brightest and darkest atmospheric light, respectively, and $(x, y)$ denotes the image pixel coordinates:
$$ p = \frac{A_{max}(x, y) - A_{min}(x, y)}{A_{max}(x, y) + A_{min}(x, y)} \tag{4} $$
Substitution of Equations (2)–(4) into Equation (1) yields the mathematical model for a high-definition scenario, as follows:
$$ L_{Object} = \frac{I_{Total} - \left( I_{max} - I_{min} \right) / p}{1 - A / A_{\infty}} \tag{5} $$
From Equation (5), we note that the polarization image model uses the vector vibration direction difference between the target and the atmospheric light to filter the effects of the atmospheric light. High-contrast and high-definition images of targets under smoggy conditions are acquired by comparing the differences of the orthogonally linear-polarized azimuth images acquired by rotating the polarizers.
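As an illustration, the recovery in Equations (1)–(5) can be sketched in a few lines of Python. The function and variable names are hypothetical, the target light is assumed unpolarized, and the sky-patch estimation of $A_{\infty}$ and $p$ follows Equation (4):

```python
import numpy as np

def dehaze_polarization(I_max, I_min, sky):
    """Recover the object radiance L_Object from two orthogonally
    polarized images, Eqs. (1)-(5).  `sky` is a boolean mask of a
    target-free region used to estimate A_inf and p."""
    I_total = I_max + I_min                          # Eq. (3)
    # Eq. (4): degree of polarization of the atmospheric light,
    # estimated from the target-free sky patch
    A_max, A_min = I_max[sky].mean(), I_min[sky].mean()
    p = (A_max - A_min) / (A_max + A_min)
    A_inf = A_max + A_min            # atmospheric light at infinity
    A = (I_max - I_min) / p          # airlight map (target assumed unpolarized)
    t = np.clip(1.0 - A / A_inf, 1e-3, 1.0)  # transmission, clipped for stability
    return (I_total - A) / t                 # Eq. (5)
```

On a synthetic scene with known radiance and airlight this recovers the target radiance closely; on real data the sky patch must be chosen manually or by segmentation.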

3. Real-Time Optical Sensing and Detection System

The polarization image model primarily relies on rotating the polarizer installed in front of the detector to obtain the maximum and minimum polarization azimuth images, which limits its real-time performance. In this study, we propose a polarization image acquisition method based on the Stokes vectors. According to the Stokes principle, we designed a real-time optical sensing and detection system capable of acquiring the Stokes parameters in real time; these parameters are then used to solve for the clear scene based on the polarization image model. In addition, to ensure that all the Stokes vectors cover the same field of view, the system uses a calibration platform based on a luminous theodolite to calibrate the common optical axis of the four-camera array.

3.1. Polarization Image Acquisition Method Based on Stokes Vectors

In 1852, Stokes proposed a detection method that utilizes light intensity to describe polarization characteristics. This method, known as the Stokes vector method [10], allows a more intuitive and convenient observation and acquisition of the polarization information of light. The Stokes matrix S = [S0, S1, S2, S3]T is commonly used to describe the polarization state of a light wave, where S0 denotes the total intensity of radiation, S1 the intensity difference between vertically and horizontally polarized light waves, S2 the intensity difference between the two 45°-polarized light waves, and S3 the intensity difference between right-handed and left-handed circularly polarized light waves. The expression for S is as follows:
$$ S = \begin{bmatrix} S_0 \\ S_1 \\ S_2 \\ S_3 \end{bmatrix} = \begin{bmatrix} I_{0^\circ} + I_{90^\circ} \\ I_{0^\circ} - I_{90^\circ} \\ I_{45^\circ} - I_{135^\circ} \\ 2 I_{\lambda/4, 45^\circ} - \left( I_{0^\circ} + I_{90^\circ} \right) \end{bmatrix} \tag{6} $$
where $I_{0^\circ}$, $I_{45^\circ}$, $I_{90^\circ}$, and $I_{135^\circ}$ denote the intensities of light corresponding to polarization directions of 0°, 45°, 90°, and 135°; these intensities are obtained by installing polarizers with the corresponding transmission directions in front of the detector. $I_{\lambda/4, 45^\circ}$ denotes the intensity of the right-handed circularly polarized light, which is measured by installing a polarizer that transmits 45° light in front of the detector, and then a quarter-wave plate whose fast axis is at 0° in front of the polarizer.
Under natural conditions, the circular component can be neglected because circularly polarized light rarely occurs in nature. Hence, the Stokes vector can be reduced to a vector containing only the linear polarization components, S′ = [S0, S1, S2]T. Therefore, only the intensities of light corresponding to the four polarization directions, namely $I_{0^\circ}$, $I_{45^\circ}$, $I_{90^\circ}$, and $I_{135^\circ}$, need to be measured. The linear components S0, S1, and S2 of the target-light Stokes matrix can then be obtained using Equation (7). The polarization state of the light wave can be expressed as follows:
$$ S' = \begin{bmatrix} S_0 \\ S_1 \\ S_2 \end{bmatrix} = \begin{bmatrix} I_{0^\circ} + I_{90^\circ} \\ I_{0^\circ} - I_{90^\circ} \\ I_{45^\circ} - I_{135^\circ} \end{bmatrix} \tag{7} $$
After the polarization information of the light wave is obtained through measurement, it is visualized using polarization degree images P or polarization angle images θ. In general, the degree of polarization (DoP) can be calculated by means of the Stokes vectors, and the degree of polarization of the actual measurement, that is, the degree of linear polarization (DoLP), can be calculated using Equation (8):
$$ DoLP = \frac{\sqrt{S_1^2 + S_2^2}}{S_0} \tag{8} $$
Further, Imax and Imin, that is, the maximum and minimum light intensities, respectively, required for the polarization-based imaging model, can be calculated using Equation (9):
$$ I_{max} = \frac{\left( 1 + DoLP \right) S_0}{2}, \qquad I_{min} = \frac{\left( 1 - DoLP \right) S_0}{2} \tag{9} $$
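Equations (7)–(9) map the four analyzer images to the quantities required by the imaging model. A minimal sketch (function names are illustrative, not from the paper):

```python
import numpy as np

def stokes_linear(I0, I45, I90, I135):
    """Linear Stokes components from the four analyzer images, Eq. (7)."""
    return I0 + I90, I0 - I90, I45 - I135

def dolp_and_extrema(S0, S1, S2):
    """Degree of linear polarization, Eq. (8), and the extreme
    intensities required by the imaging model, Eq. (9)."""
    dolp = np.sqrt(S1 ** 2 + S2 ** 2) / np.maximum(S0, 1e-12)
    return dolp, (1 + dolp) * S0 / 2, (1 - dolp) * S0 / 2
```

For partially linearly polarized light, the analyzer intensity at angle θ is (S0 + S1·cos 2θ + S2·sin 2θ)/2, so the four measurements determine S0, S1, and S2 exactly.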

3.2. Design of Real-Time Optical Sensing and Detection System

As per the polarization image model, the real-time optical sensing and detection system with four cameras aligned parallel to the optical axis was designed. Analyzers were mounted and integrated inside each lens at angles of 0°, 45°, 90°, and 135° relative to the polarization direction of the incident light. The structure of our optical system is shown in Figure 2.
A calibration platform using a luminous theodolite was adopted for the calibration of the common optical axis of the four-camera array. During calibration, by controlling the linear motion of a two-dimensional rail, the exit pupil of the theodolite was aligned with the entrance pupil of each camera in turn. Next, with the crosshair lit, the orientation of the optical axis of each camera was adjusted individually to center the crosshair image on the target surface of the camera. High-precision calibration of the common optical axis was thereby completed, and the requirements for subsequent image registration were fulfilled. According to the above analysis, we constructed a four-axis calibration platform composed of the following elements: a luminous theodolite (1), a horizontal rail (2), a vertical rail (3), a load platform (4), a right-angle fixed block (5), and a servo drive (6). Each part is marked in Figure 3.
In addition, the cameras were equipped with external triggers for synchronized exposure (trigger IN) and image capture (trigger OUT) across the four cameras. Transistor–transistor logic (TTL) levels with both trigger IN and OUT interfaces were employed so that one signal could trigger multiple cameras: trigger OUT was enabled on the first camera, while the remaining three were set to trigger IN, enabling synchronized imaging with multiple cameras. Compared with other polarization imaging systems, such as focal-plane polarization imaging systems [11], micropolarizer-array polarization imaging systems [12], and others [13,14,15,16], the proposed system has the advantages of low cost and a simple imaging structure, and it can obtain the polarization images without loss of light energy.

4. Target Enhancement Algorithms Based on Polarization Information

4.1. Polarization Image Registration Algorithm Based on SURF

The real-time optical sensing and detection system described in this study used four cameras aligned parallel to the optical axis, which can cause misalignment and rotation of pixel units because of different shooting positions and directions of the cameras. Because the solutions of intensity, polarization difference, and polarization degree images are based on pixel units, misalignment of pixel units between polarized azimuth images can result in false edges or blurring.
In our study, a SURF-based real-time image registration algorithm was adopted for subpixel-precision registration of the acquired linear polarization degree images. The pixels between the polarization images were aligned via the procedure shown in Figure 4, which comprises three stages: feature detection and extraction, feature vector matching, and spatial transformation model parameter estimation [17,18]. The detailed procedure is as follows.
Step 1: Use the high-precision four-axis calibration platform to pre-calibrate the overlapping imaging areas of the adjacent cameras.
Step 2: Use fast Hessian detectors to extract the feature points of the reference image and the image to be registered within the overlapping area, and generate SURF description vectors.
Step 3: Use the fast approximate nearest neighbors (FANN) algorithm to obtain the initial matching point pair, and sort the Euclidean distance of the feature vectors of the pair.
Step 4: Use the progressive sample consensus (PROSAC) algorithm to perform spatial transformation model parameter estimation, which derives the geometric spatial transformation relationship between the reference image and the image to be registered.
This algorithm is invariant to changes in image size, rotation, and illumination. The registration speed is within milliseconds, and the accuracy is at the level of 0.1 pixel.
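To illustrate Step 4, the sketch below estimates the geometric transform from matched point pairs with a plain least-squares direct linear transform (DLT). This is a simplified, hypothetical stand-in for the SURF/FANN/PROSAC pipeline, which additionally rejects outlier matches before estimation:

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct linear transform (DLT) estimate of the 3x3 homography
    mapping src -> dst from >= 4 matched point pairs (N x 2 arrays).
    The homography is the null vector of the stacked constraint matrix,
    recovered here via SVD."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on h
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]      # normalize so H[2,2] = 1
```

With noise-free correspondences the estimate is exact; a robust estimator such as PROSAC, as used in Step 4, wraps this least-squares core with iterative inlier selection.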

4.2. CLAHE Image Enhancement Algorithm Based on Bilinear Interpolation

While the overall quality and contrast were significantly improved in the polarization reconstructed scenes, local details were not adequately enhanced, and overexposure was noted in the sky portion of the image [19]. To resolve these issues, we used a bilinear interpolation CLAHE algorithm to further enhance the polarization reconstructed images via the following steps:
Step 1: Divide the image into several area blocks to perform individual contrast enhancement based on the block’s histogram. The local contrast enhancement is represented by Equation (10):
$$ x'_{i,j} = m_{i,j} + k \left( x_{i,j} - m_{i,j} \right) \tag{10} $$
where $x_{i,j}$ and $x'_{i,j}$ denote the grayscale values before and after enhancement, respectively, and $m_{i,j} = \frac{1}{m \times n} \sum_{(i,j) \in W} x_{i,j}$ denotes the average grayscale value of the pixels in block W. Parameter k can be represented as in Equation (11):
$$ k = k' \left( \frac{\sigma_{i,j}^{2}}{\sigma_{n}^{2}} \right)^{-1} \tag{11} $$
where $k'$ denotes the scale factor, $\sigma_n^2$ the noise variance of the whole image, and $\sigma_{i,j}^2$ the grayscale variance of block W.
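A sketch of Step 1 for a single block, following the local gain of Equations (10) and (11) as reconstructed here (the exact gain definition in the original is ambiguous, so this reading is an assumption, as are the function and parameter names):

```python
import numpy as np

def enhance_block(block, k_prime, noise_var):
    """Local contrast stretch of one block, Eqs. (10)-(11):
    x' = m + k (x - m), with gain k = k' * (sigma_local^2 / sigma_n^2)^-1,
    i.e. low-variance blocks are amplified more strongly."""
    m = block.mean()                                           # block mean m_{i,j}
    k = k_prime * (max(block.var(), 1e-12) / noise_var) ** -1  # Eq. (11)
    return m + k * (block - m)                                 # Eq. (10)
```

Note that the stretch preserves the block mean, so the subsequent bilinear stitching of Step 2 only has to smooth the gain differences between neighboring blocks.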
Step 2: Stitch neighboring regions by means of bilinear interpolation to effectively eliminate the artifacts between neighboring regions introduced by the local contrast enhancement. Assume that the values of a function f at four points K11 = (x1, y1), K12 = (x1, y2), K21 = (x2, y1), and K22 = (x2, y2) are known; then, the value of f at a point H = (x, y) can be derived by linear interpolation.
First, linear interpolation is performed along the x-direction to obtain the value as represented by Equations (12) and (13):
$$ f(H_1) \approx \frac{x_2 - x}{x_2 - x_1} f(K_{11}) + \frac{x - x_1}{x_2 - x_1} f(K_{21}), \qquad H_1 = (x, y_1) \tag{12} $$
$$ f(H_2) \approx \frac{x_2 - x}{x_2 - x_1} f(K_{12}) + \frac{x - x_1}{x_2 - x_1} f(K_{22}), \qquad H_2 = (x, y_2) \tag{13} $$
Subsequently, the same operation is performed along the y-direction as represented in Equation (14):
$$ f(H) \approx \frac{y_2 - y}{y_2 - y_1} f(H_1) + \frac{y - y_1}{y_2 - y_1} f(H_2) \tag{14} $$
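Equations (12)–(14) amount to the standard bilinear interpolation formula; a direct transcription (illustrative function name):

```python
def bilerp(f11, f21, f12, f22, x1, x2, y1, y2, x, y):
    """Bilinear interpolation of f at (x, y) from its values at the
    four corners K11=(x1,y1), K21=(x2,y1), K12=(x1,y2), K22=(x2,y2),
    following Eqs. (12)-(14)."""
    # Eqs. (12)-(13): interpolate along x, at y1 and at y2
    fH1 = (x2 - x) / (x2 - x1) * f11 + (x - x1) / (x2 - x1) * f21
    fH2 = (x2 - x) / (x2 - x1) * f12 + (x - x1) / (x2 - x1) * f22
    # Eq. (14): interpolate along y between the two intermediate values
    return (y2 - y) / (y2 - y1) * fH1 + (y - y1) / (y2 - y1) * fH2
```

Because the scheme is exact for functions linear in x and y, it blends neighboring block gains smoothly and suppresses the block-boundary artifacts.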

5. Results and Discussion

Testing Environment

All images in this study were acquired using the real-time optical sensing and detection system with four cameras aligned parallel to the optical axis. Figure 5 shows a photograph of the imaging system. The specifications of the cameras were as follows: 60° field of view, 8-mm focal length, and 4 million pixels. Unlike other optical imaging systems, the polarization imaging system only requires the installation of a polarizer in front of ordinary industrial cameras to achieve clear scene imaging. In this study, the cameras were GS3-U3-41C6C-C models (Point Grey, Vancouver, BC, Canada), and the polarizers mounted in front of the cameras were LPVISC100 models (Thorlabs, Newton, NJ, USA).
MATLAB R2017b (MathWorks, Natick, MA, USA) was used to execute the real-time polarization image processing and enhancement algorithm, and the program was run on a Dell computer (Dell, TX, USA) with a Windows 7 (64-bit) operating system, an Intel Core i7-4790K processor, and 32 GB of RAM. The time required to process one image frame on this system was 30 ms.
The raw images obtained from the real-time optical sensing and detection system are shown in Figure 6. These images were acquired under hazy weather conditions, and the image quality was seriously degraded: the atmospheric light produced by haze particles severely reduced the image contrast and the detection distance of the system. We first applied the conventional polarization image enhancement algorithm proposed by Y.Y. Schechner [20]; the result is shown in Figure 6h. The algorithms proposed by Schechner are the most classical of all polarization imaging algorithms [21,22,23], and their performance is representative of most such algorithms. Compared with the raw images, the images reconstructed by the conventional algorithm showed obvious visual enhancement, as the atmospheric light at both the near and far ends of the images was removed and the contrast was improved. However, this enhancement algorithm resulted in overexposure in the sky region, thereby limiting the visual perception of the image. Figure 6i shows the reconstructed image obtained by applying the registration and enhancement algorithm proposed in this study. This algorithm effectively eliminates the effects of atmospheric light on the image and improves the contrast. Further, the proposed algorithm reduced the effects of atmospheric light not only for objects at the near end but also for those at the far end, creating a perception of stereovision. In addition, a comparison of Figure 6h,i demonstrates that the proposed algorithm not only avoided overexposure in the sky but also enriched the details of the image.
Zoomed-in views of the region of interest marked with a red rectangle in Figure 6g,i are shown in Figure 7a,b. The distance between building A and the detector was 1.6 km, and under the influence of haze only the outline of the building can be observed, which means that the spatial resolution of Figure 7a at a detection distance of 1.6 km is 15 m. In contrast, two windows of building A are visible in Figure 7b, which means that the spatial resolution of Figure 7b at the same distance is 2 m. Furthermore, building B, which is completely invisible in Figure 7a, is clearly visible in Figure 7b. These changes suggest that our imaging system can improve spatial resolution in severe weather conditions. However, the limitation of the proposed method is also obvious: there is a large amount of noise in the distant area. This is because when Equation (1) is used to enhance the reflected light energy of distant targets, the noise present there is amplified by the same factor [24].
Figure 8 shows the intensity distribution curve in the pixels of row 415 (counting from top to bottom) across the background and the target from Figure 6g–i, which provides an intuitive demonstration of the difference between the atmospheric light and the target light in the imaging scene, as well as changes in the contrast. When compared with the results obtained with the Schechner algorithm (red curve), the reconstruction result of our algorithm (blue curve) increases the atmospheric light-target light difference to a certain extent and enhances the contrast of the image. In Figure 8, the fluctuations of the blue curve are stronger than those of the red one, particularly at the target location. The fluctuation of the pixel intensity curve after reconstruction using the proposed algorithm increased significantly, which indicates that this algorithm is superior to that of the Schechner algorithm in improving the image contrast.
Figure 9a–c shows the grayscale histograms corresponding to Figure 6g–i, respectively. This rendering of information provides a more intuitive representation of the characteristics of Figure 6g–i. When compared with the raw image that has its histogram concentrated in the right half of the panel, the reconstructed image exhibited a wider and more evenly distributed histogram. This result confirms that the proposed algorithm can effectively reduce the effects of scattering due to aerosols and various gases to increase image contrast and enrich image details. Figure 9d–f illustrates the pixel intensity distributions in the red, green and blue (RGB) channels of Figure 6g–i, respectively. We note that the pixel intensity distribution in Figure 9f was optimized over the distributions of the raw image and the image processed by means of the conventional polarization imaging algorithm. In addition, the proposed algorithm increased the dynamic range and stereovision of the image.
Figure 10 shows the detection results under different weather conditions. Figure 10a is the total intensity image captured under rainy and foggy weather, and Figure 10b is the detection result obtained from Figure 10a using the proposed method, demonstrating the effectiveness of our method under these conditions. Figure 10c shows the total intensity image of a polluted lake captured in dense fog; the pollution of the lake water is difficult to observe because of the haze, the overall contrast of the image is low, and the detection distance is limited. After processing by our imaging system, as shown in Figure 10d, the influence of aerosols and various gases is removed, and the pollution of the lake can be observed directly, enabling it to be controlled. In addition, the detection distance of the imaging system is improved, which greatly reduces costs.
To validate the proposed algorithm, we selected multiple scenes for imaging, as shown in Figure 11. When compared with the scenes processed by the Schechner algorithm [25], we note that the proposed algorithm could effectively avoid overexposure in the blank regions of the sky and considerably enhance the visual effect of the image. In addition, the proposed algorithm achieved far superior image detail enhancement for both near and far objects in the image, consequently providing an improved sense of stereovision.
To provide an objective evaluation of the effect of sensing and detection with various scenes, we next adopted four commonly used image quality assessment metrics to compare algorithms; the corresponding results are listed in Table 1. The mean gradient assesses high-frequency information such as edges and details of the image; a higher gradient indicates an image with clearer edges and more details. The edge strength is the amplitude of the edge point gradient. The image contrast represents the ratio between the bright and dark areas of the general image; an image with a higher contrast indicates more levels between bright/dark gradual changes, and thus, it contains more information. Overall, the quality of the image processed by the proposed algorithm was significantly improved. When compared with the raw image, the processed image generally had its contrast increased by a factor of ≈10, mean gradient increased by a factor of 4, and edge strength increased by a factor of ≈3. These results demonstrate that the processed images provided improved quality in terms of contrast, details, and definition, which agrees well with the conclusion of the previous subjective assessment.
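The metrics above can be computed as follows. The paper does not give its exact formulas, so these are common textbook definitions (mean gradient as the average gradient magnitude, contrast in Michelson form), offered as plausible assumptions rather than the authors' implementations:

```python
import numpy as np

def mean_gradient(img):
    """Average gradient magnitude over the image interior; higher
    values indicate sharper edges and richer detail."""
    gx = np.diff(img, axis=1)[:-1, :]   # horizontal differences
    gy = np.diff(img, axis=0)[:, :-1]   # vertical differences
    return float(np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2)))

def michelson_contrast(img):
    """Ratio-style contrast between the brightest and darkest areas."""
    return float((img.max() - img.min()) / (img.max() + img.min()))
```

Edge strength, the amplitude of the gradient at detected edge points, can be obtained from the same gradient maps restricted to an edge mask.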

6. Conclusions

In this study, we proposed a real-time polarization imaging algorithm and investigated its performance both theoretically and experimentally. We designed and constructed a real-time optical sensing and detection system based on the Stokes vectors, underpinned by the principles of polarization imaging, wherein optical analyzers at different angles relative to the polarization direction of the incident light were integrated into four independent cameras. Linear polarization components were calculated using the Stokes vectors, followed by real-time registration of the images with different polarization components based on the SURF algorithm and subsequent visualization of the polarization images. Finally, we adopted an improved image enhancement algorithm, a CLAHE variant based on bilinear interpolation, to generate real-time high-contrast, high-definition images. Our experimental results reinforce the conclusion that the proposed method can acquire high-contrast, high-definition images in real time without loss of spatial resolution, which improves the detection range of environmental monitoring stations in hazy weather conditions.

Author Contributions

This research article has five authors: X.W., J.O., Y.W., F.L., and G.Z. X.W. and F.L. jointly designed the overall architecture and related algorithms. G.Z. and Y.W. conceived and designed the experiments. J.O. coordinated the overall plan and direction of the experiments and related techniques. X.W. wrote the paper.

Funding

This research was funded by Jilin Province Science and Technology Development Plan Project (approval number: 20160209006GX, 20170309001GX); Self-fund of State Key Laboratory of Applied Optics (approval number: 2016Y6133FQ164); and Open-fund of State Key Laboratory of Applied Optics (approval number: CS16017050001).

Acknowledgments

The authors would like to thank Pingli Han from Xidian University for her useful writing advice. The authors would also like to thank Xuan Li, Kui Yang, and Xin Li for their help with the experiments.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Xie, C.; Nishizawa, T.; Sugimoto, N.; Matsui, I.; Wang, Z. Characteristics of aerosol optical properties in pollution and Asian dust episodes over Beijing, China. Appl. Opt. 2008, 47, 4945–4951. [Google Scholar] [CrossRef]
  2. Edner, H.; Ragnarson, P.; Spännare, S.; Svanberg, S. Differential optical absorption spectroscopy (DOAS) system for urban atmospheric pollution monitoring. Appl. Opt. 1993, 32, 327–333. [Google Scholar] [CrossRef] [PubMed]
  3. Xu, F.; Lv, Z.; Lou, X.; Zhang, Y.; Zhang, Z. Nitrogen dioxide monitoring using a blue LED. Appl. Opt. 2008, 47, 5337–5340. [Google Scholar] [CrossRef] [PubMed]
  4. Guo, F.; Cai, Z.X.; Xie, B.; Tang, J. Review and prospect of image dehazing techniques. J. Comput. Appl. 2010, 30, 2417–2421. [Google Scholar] [CrossRef]
  5. Huang, B.; Liu, T.; Han, J.; Hu, H. Polarimetric target detection under uneven illumination. Opt. Express 2015, 23, 23603–23612. [Google Scholar] [CrossRef]
  6. Chen, C.W.; Fang, S.; Huo, X.; Xia, X.S. Image dehazing using polarization effects of objects and airlight. Opt. Express 2014, 22, 19523–19537. [Google Scholar] [CrossRef]
  7. Liu, F.; Cao, L.; Shao, X.; Han, P.; Bin, X. Polarimetric dehazing utilizing spatial frequency segregation of images. Appl. Opt. 2015, 54, 8116–8122.
  8. Han, P.; Liu, F.; Yang, K.; Ma, J.; Li, J.; Shao, X. Active underwater descattering and image recovery. Appl. Opt. 2017, 56, 6631–6638.
  9. Liu, F.; Shao, X.; Gao, Y.; Xiangli, B.; Han, P.; Li, G. Polarization characteristics of objects in long-wave infrared range. J. Opt. Soc. Am. A 2016, 33, 237–243.
  10. Sadjadi, F.A. Invariants of polarization transformations. Appl. Opt. 2007, 46, 2914–2921.
  11. Zhang, M.; Wu, X.; Cui, N.; Engheta, N.; van der Spiegel, J. Bioinspired focal-plane polarization image sensor design: From application to implementation. Proc. IEEE 2014, 102, 1435–1449.
  12. Zhao, X.; Boussaid, F.; Bermak, A.; Chigrinov, V.G. High-resolution thin “guest-host” micropolarizer arrays for visible imaging polarimetry. Opt. Express 2011, 19, 5565–5573.
  13. Sarkar, M.; Bello, D.S.S.; van Hoof, C.; Theuwissen, A.J. Biologically inspired CMOS image sensor for fast motion and polarization detection. IEEE Sens. J. 2013, 13, 1065–1073.
  14. Garcia, M.; Edmiston, C.; Marinov, R.; Vail, A.; Gruev, V. Bio-inspired color-polarization imager for real-time in situ imaging. Optica 2017, 4, 1263–1271.
  15. Maruyama, Y.; Terada, T.; Yamazaki, T.; Uesaka, Y.; Nakamura, M.; Matoba, Y.; Komori, K.; Ohba, Y.; Arakawa, S.; Hirasawa, Y. 3.2-MP back-illuminated polarization image sensor with four-directional air-gap wire grid and 2.5-μm pixels. IEEE Trans. Electron Devices 2018, 65, 2544–2551.
  16. Garcia, M.; Davis, T.; Blair, S.; Cui, N.; Gruev, V. Bioinspired polarization imager with high dynamic range. Optica 2018, 5, 1240–1246.
  17. Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110.
  18. Muja, M.; Lowe, D.G. Scalable nearest neighbor algorithms for high dimensional data. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 36, 2227–2240.
  19. Liang, J.; Ren, L.; Qu, E.; Hu, B.; Wang, Y. Method for enhancing visibility of hazy images based on polarimetric imaging. Photonics Res. 2014, 2, 38–44.
  20. Shwartz, S.; Namer, E.; Schechner, Y.Y. Blind haze separation. In Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, New York, NY, USA, 17–22 June 2006; Volume 2, pp. 1984–1991.
  21. Nishino, K.; Kratz, L.; Lombardi, S. Bayesian defogging. Int. J. Comput. Vis. 2012, 98, 263–278.
  22. Ancuti, C.O.; Ancuti, C. Single image dehazing by multi-scale fusion. IEEE Trans. Image Process. 2013, 22, 3271–3282.
  23. Pust, N.J.; Shaw, J.A. Wavelength dependence of the degree of polarization in cloud-free skies: Simulations of real environments. Opt. Express 2012, 20, 15559–15568.
  24. Schechner, Y.Y.; Averbuch, Y. Regularized image recovery in scattering media. IEEE Trans. Pattern Anal. Mach. Intell. 2007, 29, 1655–1660.
  25. Schechner, Y.Y.; Narasimhan, S.G.; Nayar, S.K. Polarization-based vision through haze. Appl. Opt. 2003, 42, 511–525.
Figure 1. (a) Attenuation model, and (b) atmospheric light model.
Figure 2. Real-time optical sensing and detection system.
Figure 3. Four-axis calibration platform.
Figure 4. Flowchart of the image registration algorithm.
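The registration step in Figure 4 aligns the four sub-camera images before the Stokes vectors are computed; the cited feature-based approach (Refs. 17 and 18, SIFT keypoints with fast nearest-neighbor matching) handles general misalignment. As a minimal self-contained illustration of the alignment idea only, the sketch below estimates a pure integer translation between two sub-images by phase correlation; this is an assumed simplification, not the paper's actual registration algorithm:

```python
import numpy as np

def phase_correlation_shift(ref, moving):
    """Estimate the integer (dy, dx) circular shift mapping `moving` onto `ref`
    from the phase of the cross-power spectrum."""
    R = np.fft.fft2(ref) * np.conj(np.fft.fft2(moving))
    R /= np.abs(R) + 1e-12                     # keep only the phase ramp
    corr = np.fft.ifft2(R).real                # delta peak at the negative shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts beyond half the image size around to negative offsets
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

def register(ref, moving):
    """Align `moving` to `ref` by undoing the estimated translation."""
    dy, dx = phase_correlation_shift(ref, moving)
    return np.roll(moving, (dy, dx), axis=(0, 1))
```

In the four-camera arrangement of Figure 2, one channel would serve as `ref` and the other three would be registered to it before any per-pixel Stokes arithmetic.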
Figure 5. Real-time polarization dehazing imaging system.
Figure 6. (a–d) Polarization images corresponding to polarization angles of 0°, 45°, 90°, and 135°; (e) polarization image with minimum airlight; (f) polarization image with maximum airlight; (g) total intensity image; (h) detection result obtained using the method of Schechner; (i) detection result obtained with the proposed method. Zoomed-in views of the region of interest marked with a red rectangle in (g) and (i) are shown in Figure 7a,b.
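The intermediate images in Figure 6e–g follow from the four analyzer images in Figure 6a–d via the Stokes parameters, after which a Schechner-style estimate (Ref. 25) recovers airlight and transmission. A minimal NumPy sketch under stated assumptions: `p_air` (degree of polarization of the airlight) and `a_inf` (airlight radiance at infinite distance) are taken as already estimated, e.g. from a sky region, rather than derived here:

```python
import numpy as np

def stokes_dehaze(i0, i45, i90, i135, p_air, a_inf, t_min=0.1):
    """Polarimetric haze removal from four analyzer images (Schechner-style)."""
    s0 = (i0 + i45 + i90 + i135) / 2.0       # Stokes S0: total intensity (Fig. 6g)
    s1 = i0 - i90                            # Stokes S1
    s2 = i45 - i135                          # Stokes S2
    pol = np.sqrt(s1**2 + s2**2)             # polarized intensity = Imax - Imin
    # Fig. 6e,f correspond to (s0 - pol) / 2 and (s0 + pol) / 2
    airlight = pol / p_air                   # A = (Imax - Imin) / p
    t = np.clip(1.0 - airlight / a_inf, t_min, 1.0)   # transmission map
    return np.clip((s0 - airlight) / t, 0.0, None)    # object radiance estimate
```

The clipping of the transmission map at `t_min` is a common regularization to avoid amplifying noise at large optical depths; the value 0.1 is illustrative.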
Figure 7. (a,b) Zoomed-in views of the region of interest marked with a red rectangle in Figure 6g,i; A and B are two buildings at different distances from the detector in the scene.
Figure 8. Horizontal intensity profiles at vertical pixel position 415 in Figure 6g–i.
Figure 9. (a–c) Gray histograms corresponding to Figure 6g–i; (d–f) pixel intensity distributions of channels R, G, and B of Figure 6g–i, respectively.
Figure 10. (a) Total intensity image captured under rainy and foggy weather; (b) detection result of (a) using the proposed method; (c) total intensity image of polluted lake water (the region marked with a red circle) captured under dense fog; and (d) detection result of (c) using the proposed method, where the region marked with a red circle shows that the pollution level of the lake can be observed directly.
Figure 11. (a,d) Total intensity images of different scenes; (b,e) detection results obtained by the method of Schechner; and (c,f) detection results obtained with the proposed method.
Table 1. Objective evaluation of dehazed images.

| Image | Mean Gradient | Edge Strength | Contrast |
| --- | --- | --- | --- |
| Figure 6g: total light intensity image | 0.0049 | 0.0523 | 1.3455 |
| Figure 6g: result of the conventional algorithm | 0.0084 | 0.0897 | 5.0434 |
| Figure 6g: result of the proposed algorithm | 0.0233 | 0.2443 | 17.5323 |
| Figure 11a: total light intensity image | 0.0066 | 0.07 | 1.9452 |
| Figure 11a: result of the conventional algorithm | 0.0095 | 0.101 | 5.3448 |
| Figure 11a: result of the proposed algorithm | 0.0245 | 0.2575 | 22.656 |
| Figure 11d: total light intensity image | 0.0048 | 0.0505 | 0.9295 |
| Figure 11d: result of the conventional algorithm | 0.0055 | 0.0589 | 1.6168 |
| Figure 11d: result of the proposed algorithm | 0.0155 | 0.165 | 9.1692 |
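The three metrics in Table 1 (mean gradient, edge strength, contrast) each have several definitions in the image-quality literature; the sketch below implements one plausible set (average forward-difference gradient, mean Sobel magnitude, and mean squared difference between 4-connected neighbors), which may differ in normalization constants from the paper's exact formulas:

```python
import numpy as np

def _xcorr_valid(img, k):
    """'Valid'-mode 2-D cross-correlation with a small kernel (no SciPy needed)."""
    oh = img.shape[0] - k.shape[0] + 1
    ow = img.shape[1] - k.shape[1] + 1
    out = np.zeros((oh, ow))
    for i in range(k.shape[0]):
        for j in range(k.shape[1]):
            out += k[i, j] * img[i:i + oh, j:j + ow]
    return out

def mean_gradient(img):
    """Average gradient: mean of sqrt((Gx^2 + Gy^2) / 2) from forward differences."""
    gx = np.diff(img, axis=1)[:-1, :]
    gy = np.diff(img, axis=0)[:, :-1]
    return float(np.mean(np.sqrt((gx**2 + gy**2) / 2.0)))

def edge_strength(img):
    """Mean Sobel gradient magnitude, one common 'edge intensity' measure."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    gx = _xcorr_valid(img, kx)
    gy = _xcorr_valid(img, kx.T)
    return float(np.mean(np.sqrt(gx**2 + gy**2)))

def contrast(img):
    """Mean squared gray-level difference between 4-connected neighbor pixels."""
    dh = np.diff(img, axis=1) ** 2
    dv = np.diff(img, axis=0) ** 2
    return float((dh.sum() + dv.sum()) / (dh.size + dv.size))
```

All three increase monotonically with local detail, which is why the dehazed results in Table 1 score uniformly higher than the hazy total-intensity inputs.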

Share and Cite

Wang, X.; Ouyang, J.; Wei, Y.; Liu, F.; Zhang, G. Real-Time Vision through Haze Based on Polarization Imaging. Appl. Sci. 2019, 9, 142. https://doi.org/10.3390/app9010142