Article

Night-Light Image Restoration Method Based on Night Scattering Model for Luojia 1-01 Satellite

by Lijing Bu, Zhenghui Xu, Guo Zhang and Zhengpeng Zhang
1 School of Geomatics, Liaoning Technical University, Fuxin 123000, China
2 State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China
* Author to whom correspondence should be addressed.
Sensors 2019, 19(17), 3761; https://doi.org/10.3390/s19173761
Submission received: 30 June 2019 / Revised: 23 August 2019 / Accepted: 28 August 2019 / Published: 30 August 2019
(This article belongs to the Special Issue The Design, Data Processing and Applications of Luojia 1-01 Satellite)

Abstract: To address the degradation of Luojia 1-01 night-light remote sensing images, the main cause of the "glow" phenomenon was analyzed. The APSF (Atmospheric Point Spread Function) template of the night-light image was obtained from the atmospheric scattering of the light source. This template was used as the initial value in the regularization restoration model proposed in this paper. Experiments were carried out using single-point and regional images. The results demonstrate that the estimated APSF and the restoration results of the method are better than those of other methods, and the image quality is improved after restoration.

1. Introduction

With the continuous development of remote sensing technology, different types of remote sensing data are becoming more and more abundant. In particular, night-time remote sensing, which can capture visible light sources on land and water under cloud-free conditions at night, has attracted much attention worldwide. Night-light data record information related to human activities, such as urban lighting, ship lighting, the light of oil well flares, and natural lighting, and thus have a unique ability to reflect human social activities [1]. Therefore, night-light remote sensing data can be used in many fields, such as monitoring socio-economic development [2] and war analysis [3]. The Luojia 1-01 satellite is a scientific experimental satellite developed by Wuhan University. It provides both night-light remote sensing and navigation enhancement, and was successfully launched on 2 June 2018. Its orbital altitude is 645 km, its ground pixel resolution is about 130 m, and its imaging swath is 260 km. Luojia 1-01 is equipped with a 4-megapixel CMOS sensor composed of 2048 × 2048 detectors that record weak nighttime light on Earth [4], and a GNSS (Global Navigation Satellite System) receiver for measuring and transmitting the satellite position and velocity [5]. Luojia 1-01 adopts a highly integrated design of imaging, compression, and storage. The satellite is very light, its rolling-shutter array imaging design supports long imaging durations, and its noise level is low. Its high dynamic range of more than 96 dB meets the requirements of long-exposure imaging at night. In addition, Luojia 1-01 can image both day and night. Compared with other night-light remote sensing data, Luojia 1-01 data have high spatial resolution, high temporal resolution, high radiometric resolution, and a high signal-to-noise ratio. Thus, Luojia 1-01 night-light remote sensing data have wide application prospects in many fields, such as estimating socio-economic parameters [6], evaluating artificial light pollution [7], and change detection [8]. Although Luojia 1-01 makes up for the shortcomings of existing night-light remote sensing imaging platforms in its design, image quality is still reduced during camera imaging under the influence of low illumination at night, the optical imaging system, electronic signal conversion, space radiation [9], tremor of the satellite platform [10,11], satellite viewing angles [12], atmospheric disturbance, the imaging environment, and other factors. Moreover, the main application of night-light images is the estimation of social and economic indicators, for which the brightness values are particularly important. Therefore, given the constraints of the current satellite hardware, image quality needs to be improved through software post-processing, and the main goal of this paper is to restore degraded images by means of image restoration. The estimation of the PSF (Point Spread Function) is very important in image restoration and can be addressed by analyzing the degradation factors of the images.
The main reasons for the quality degradation of Luojia 1-01 night-light images are as follows. First, image degradation is caused by night imaging in a low-light environment. The differences between a night-light image and an optical image taken in the daytime are that the scene content is simple, there is little texture or spectral information, and the image is prone to noise [13], as shown in Figure 1a. Second, image degradation is caused by the complex atmospheric environment of night imaging. Although satellite photography is usually scheduled under clear and cloudless weather conditions, the multiple scattering of light by particles in the atmosphere, including cloud and haze, remains significant. The typical appearance of multiple scattering is a "glow" around the light source under various weather conditions, which is the most obvious cause of degradation in night-light images. Together these factors produce a comprehensive reduction in quality, as shown in Figure 1b. Therefore, this paper focuses on the atmospheric environment during imaging, which is a major degradation factor, and studies a restoration method based on APSF estimation for the night imaging environment.
The estimation of the PSF and the selection of the image restoration model are the main problems in image restoration. The PSF can be estimated separately or calculated during restoration. Many PSF estimation methods exist for daytime images, such as APSF estimation based on multiple scattering [14] and APSF estimation based on the geometric relationship among the sensor, target pixels, and neighboring pixels [15]. There is also some research on APSF estimation under low-illumination conditions. For example, APSF estimation of common ground images under weather conditions such as mist, haze, and rain has been studied [16,17,18], as has PSF estimation of source-degraded images [19]. More restoration methods exist for daytime or normal low-illumination conditions. For example, Bayesian blind deconvolution image restoration was proposed by Karam [20], degraded image restoration for uniform linear motion was proposed by Yang Guijun [21], constrained least-squares Richardson-Lucy restoration was proposed by Richardson [22], PSF estimation and restoration of image linear features was proposed by Liu Zhengjun [23], a regularization model based on the Lp norm was proposed by K. Bredies [24], and a blind deconvolution method with normalized sparsity constraints was proposed by Dilip [25]. Most of these methods use the geometric or texture information of the image for PSF estimation and restoration. In general, the regularization method can effectively use prior information in the image to reduce the sensitivity of the model to noise, and it is currently widely used in image processing.
Due to the low spatial resolution of the most commonly used night-light remote sensing data (such as DMSP-OLS and NPP-VIIRS), such images lack the necessary detail, and at present there is little research on image restoration of night-light remote sensing data. The Luojia 1-01 night-light remote sensing image has a higher spatial resolution and more brightness detail, and identifiable approximate point features, such as isolated ships or large landmark buildings, are obvious. These isolated point sources provide good data conditions for APSF estimation of night-light images. The Luojia 1-01 night-light image data have been radiometrically corrected and the image quality has been improved. However, radiometric correction only eliminates distortions attached to the radiance in the image data [26]. The "glow" phenomenon caused by atmospheric scattering of nighttime light is a degradation problem within the image that cannot be eliminated by radiometric correction, but the image quality can be further improved by image restoration. For current applications of night-light remote sensing data, in addition to radiometric correction and other necessary processing, denoising and similar processing should be carried out [27]. According to the image degradation and restoration model, restoration can achieve the effect of denoising and improving quality, and can therefore also provide more accurate data as a basis for estimating socio-economic indicators.
In this paper, an APSF estimation method and a ratio sparse constraint restoration model are proposed to address the blurring and noise in Luojia 1-01 night-light images. For APSF estimation, a point object with good imaging quality is selected in the image, and the initial value of the APSF is obtained according to the atmospheric transfer equation. In general, the initial APSF estimated in this way may be accurate around the point object, but because the value differs far away from the point, the initial estimate cannot properly reflect the APSF of the whole image [28]. To obtain the best APSF for the whole image, the initial value needs to be further optimized. Night-light images also contain noise, so we further optimize the APSF value with the ratio sparse constraint restoration model and reduce the noise in the images.

2. Degradation Model of Night-Light Image and Atmospheric Scattering Analysis

If the degradation function is a linear and spatially invariant process, the degradation model of night-light remote sensing images is shown in Equation (1) [29].
$$g(u,v) = f(u,v) \ast h(u,v) + n(u,v) \qquad (1)$$
where $g(u,v)$ represents the observed degraded image, $f(u,v)$ is the original clear image, $h(u,v)$ is the image degradation matrix, $\ast$ stands for convolution, and $n(u,v)$ is noise. The image degradation discussed in this paper is only related to the atmospheric imaging environment.
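As a concrete illustration of Equation (1), the following Python sketch simulates the degradation of a clear image by convolving it with a degradation kernel and adding noise. The kernel, noise level, and image used here are arbitrary illustrative choices and are not values from the paper.

```python
import numpy as np
from scipy.signal import fftconvolve

def degrade(f, h, noise_sigma=0.01, seed=0):
    """Simulate Equation (1): g = f * h + n (convolution plus additive noise)."""
    rng = np.random.default_rng(seed)
    g = fftconvolve(f, h, mode="same")          # f(u,v) convolved with h(u,v)
    n = rng.normal(0.0, noise_sigma, f.shape)   # n(u,v), illustrative Gaussian noise
    return g + n

# Toy example: a single bright "point source" blurred by a small uniform kernel.
f = np.zeros((64, 64)); f[32, 32] = 1.0
h = np.ones((7, 7)) / 49.0                      # placeholder kernel, not the APSF
g = degrade(f, h)
```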
In the ideal case, light propagates along a straight line in the atmosphere without changing direction and passes directly through the pinhole to the imaging plane, forming the "ideal imaging point". In reality, however, the imaging environment, including the atmosphere, weather conditions, and clouds, affects the image. Because particles in the atmosphere act on the light, the light is redirected, and light from the source is scattered multiple times in different directions as it propagates through the atmosphere. As a result, the light diffuses around the "ideal imaging point" when passing through the pinhole, producing the "glow" phenomenon. For night-light remote sensing images, the extent of the glow caused by atmospheric scattering at night differs from that of images photographed on the ground. Figure 2 shows the influence of atmospheric scattering on the imaging of a light source at night. The light from ground light sources propagates through the atmosphere; owing to the particles in the atmosphere, the received light includes both multiply scattered and unscattered components, and the extent of the "glow" differs because the photographing position and height differ from those on the ground.
The interaction of light sources also differs between night-light remote sensing images and common ground night images. The orbital altitude of Luojia 1-01 is 645 km, so the imaging distance is long. The photographing angle of the satellite can generally be regarded as vertical to the ground, and the influence of the shape and type of the light source is small. A single pixel of an image with a spatial resolution of 130 m captures the sum of the light sources within a 130 m × 130 m area, so the night-light image records the combined brightness of multiple light sources. Moreover, the field of view is smaller than that of common ground photography at night, and the light intensity in most directions is cut off, resulting in a relatively small scattering scale. Consequently, the extent of the "glow" in satellite images is small, as shown in Figure 2. A ground night scene usually contains a variety of active light sources, such as street lights, car lights, and building lights, while the imaging distance is short, usually from several meters to several hundred meters. These sources interact with each other because of their relative positions, strengths, types, and shapes, resulting in an image that is brighter than the natural atmospheric light and has a larger "glow" range. Image restoration of common ground night images recovers information about buildings and people, whereas image restoration of night-light remote sensing images recovers the true brightness information. All of the above factors affect the estimation of the atmospheric point spread function.

3. APSF Estimation Model of Luojia 1-01

This paper analyzes the scattering characteristics of a point source at night in an atmosphere that can be regarded as homogeneous and effectively infinite, and estimates the shape of the APSF of Luojia 1-01. In Figure 3, A represents an isolated point light source on the ground, and the atmospheric environment around it is homogeneous and isotropic. The distance between the camera and the point light source is the radiation depth R, and θ represents the scattering angle of the light. When the camera images through the pinhole, a "glow" is produced on the image plane, which reflects the point diffusion effect caused by scattering of the isolated point light source by the atmosphere. The "glow" phenomenon can reveal the effect of the atmosphere on the image. As long as an approximately point-like ground object with good image quality can be found in the image, the APSF of the degraded night image can be obtained.
APSF models under various weather conditions such as air, small aerosols, haze, fog and rain were deduced in reference [19]. According to reference [19] and the imaging conditions of Luojia 1-01, the APSF model of night-light remote sensing image is shown in Equations (2)–(5).
$$I(T,\mu) = \sum_{m=0}^{\infty} \left( g_m(T) + g_{m+1}(T) \right) L_m(\mu) \qquad (2)$$
$$g_m(T) = I_0 \, e^{-\beta_m T - \alpha_m \log T} \qquad (3)$$
$$\alpha_m = m + 1 \qquad (4)$$
$$\beta_m = \frac{2m+1}{m}\left( 1 - q^{m-1} \right) \qquad (5)$$
where $I(T,\mu)$ is the APSF derived from the model, $m$ is the number of Legendre polynomial expansion terms ($g_0 = 0$ when m = 0), $g_m(T)$ describes the light attenuation in different weather environments, $L_m(\mu)$ is the Legendre polynomial of order m and represents the diffusion of light due to scattering, $I_0$ is the intensity of the light source, $\alpha_m$ and $\beta_m$ are coefficients, $q$ is the forward scattering coefficient, and $T$ is the atmospheric optical thickness, which at night can be calculated using Equation (6) [19].
It can be seen from Equation (2) that the main factors influencing the nighttime APSF are the light attenuation and light diffusion in the weather environment. According to Equations (3)–(5), the essential factors influencing $g_m(T)$ are $I_0$, $T$, $q$, and $m$. In an image with a spatial resolution of 130 m, a large isolated ship at sea can be regarded as a point light source, and its brightness is used as the value of $I_0$. The value of $m$ determines the number of Legendre polynomial expansion terms. $q$ generally lies in the range 0 to 1, and its value differs under different weather conditions (Figure 4). Night-light images are generally acquired on clear nights, for which $q$ ranges from 0 to 0.7.
$$T = \sigma R = \frac{3.912}{V} R \qquad (6)$$
where $\sigma$ is the extinction coefficient of the atmosphere and $R$ is the radiation depth, so $T$ can also be expressed in terms of the visual distance $V$ and the radiation depth $R$. When estimating the APSF, if $T$ is less than 1, the number of expansion terms $m$ should be greater than 10; otherwise $m$ can be less than 10. The larger $T$ becomes, the smaller $m$ can be.
The essential factor affecting the light diffusion $L_m(\mu)$ is the scattering angle θ, as shown in Equation (7).
$$\mu = \cos\theta \qquad (7)$$
The APSF obtained from Equation (2) has the following features: (1) The APSF template is non-negative; when an estimated value is less than 0, it is replaced with 0. (2) The values of the APSF are less than 1, and the sum of all its elements is approximately 1 [30]. (3) The spread of the APSF increases with the brightness of the light source: the brighter the light source, the greater the blurring effect.
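A minimal sketch of how the APSF template of Equations (2)–(7) and the three listed features could be computed is given below. The mapping from pixel offset to scattering angle (theta_max), the truncation of the infinite series, and the template geometry are illustrative assumptions not specified in the text.

```python
import numpy as np
from scipy.special import eval_legendre

def apsf_profile(mu, T, q, I0=1.0, n_terms=10):
    """Evaluate the APSF series of Equations (2)-(5) at mu = cos(theta).
    g_0 is taken as 0, as stated in the text; n_terms truncates the series
    (the text suggests around 10 terms when T >= 1, more when T < 1)."""
    def g(m):
        if m == 0:
            return 0.0
        alpha = m + 1                                       # Equation (4)
        beta = (2 * m + 1) / m * (1.0 - q ** (m - 1))       # Equation (5)
        return I0 * np.exp(-beta * T - alpha * np.log(T))   # Equation (3)
    I = np.zeros_like(mu, dtype=float)
    for m in range(n_terms):
        I += (g(m) + g(m + 1)) * eval_legendre(m, mu)       # Equation (2)
    return I

def apsf_template(size, T, q, I0=1.0, theta_max=np.deg2rad(15.0)):
    """Build a square APSF template by sampling the radial profile.
    theta_max, the assumed maximum scattering angle mapped to the template
    border, is an illustrative choice only."""
    c = size // 2
    yy, xx = np.mgrid[0:size, 0:size]
    r = np.hypot(xx - c, yy - c)
    theta = r / r.max() * theta_max
    k = apsf_profile(np.cos(theta), T, q, I0)
    k = np.clip(k, 0.0, None)       # feature (1): non-negative
    return k / k.sum()              # feature (2): elements sum to ~1

# Parameters used in the paper's experiments: T = 1.2, q = 0.2, 11 x 11 template.
k0 = apsf_template(11, T=1.2, q=0.2)
```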

4. Image Restoration Method of Luojia 1-01 Based on Night Scattering Model

4.1. Image Restoration Model of Night-Light Remote Sensing

In image restoration, the regularization method introduces prior information about the target, has strong denoising ability and high convergence stability, and therefore has many useful applications. Night-light image content consists mainly of light information and black background, and restoration mainly aims to reduce the degradation of the image and remove image noise. From the analysis in Section 2, the night-light image records the light intensity, and the light produces the "glow" phenomenon in the image. The APSF at the imaging time can be obtained by analyzing an approximately ideal isolated point light source, and this prior information can be used to mitigate the "glow" phenomenon across the whole image. As noted in Section 1, the initial value needs to be further optimized in order to obtain the best APSF for the whole image.
In terms of restoration methods, a blind deconvolution method with normalized sparsity constraints has been proposed [25], but the method is sensitive to noise, and solving the model is complex and slow. Therefore, the L0.5/L2 norm is used here as the sparse constraint term to preserve information such as the boundaries of lights in the image, and the L1 norm is added to the restoration model to increase the effect of the shape of the APSF on the restoration result. In addition, the initial APSF value obtained in Section 3 is optimized. The L1 and L0 norms both promote sparsity, but minimization with the L0 norm is NP-hard (non-deterministic polynomial-time hard) and therefore difficult to optimize. In general, the constrained solution of the L1 norm is equivalent to that of the L0 norm, so the L1 norm can replace the L0 norm as the sparse constraint term for images. The L1 norm also has a noise-suppressing effect, which weakens image noise. Thus our method accelerates the solution and suppresses the noise in the image. The restoration model of this paper is shown in Equation (8).
$$J = \min_{x,k} \; \gamma \left\| y - k \ast x \right\|_2^2 + \frac{\left\| x \right\|_{0.5}}{\left\| x \right\|_2} + \lambda \left\| k \right\|_1 \qquad (8)$$
where $x$ is the clear image, $y$ is the degraded image, and $k$ is the APSF of the image. $\left\| y - k \ast x \right\|_2^2$ is the fidelity term, which keeps the reconstruction consistent with the observed data. The second term is the L0.5/L2 regularization term, a sparse constraint whose value does not change with the scale of $x$; the third term is the APSF constraint term. Together the two regularization terms keep the test error of the model small. $\gamma$ is the weight of the fidelity term and $\lambda$ is the weight of the APSF regularization term. For noisy images, as $\gamma$ becomes larger, the noise suppression becomes stronger and the image becomes smoother; as $\gamma$ becomes smaller, the image detail improves, but if $\gamma$ becomes too small, it can lead to over-sharpening. As $\lambda$ increases, the noise in the APSF is suppressed. The initial value of $k$ is the APSF estimated in Section 3.
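To make the roles of the three terms concrete, the following sketch evaluates the objective of Equation (8) for given estimates of x and k. The FFT-based convolution and the element-wise definition of the L0.5 quasi-norm are implementation assumptions, not details given in the paper.

```python
import numpy as np
from scipy.signal import fftconvolve

def objective(x, k, y, gamma, lam, eps=1e-8):
    """Value of Eq. (8): gamma*||y - k*x||_2^2 + ||x||_0.5/||x||_2 + lam*||k||_1."""
    fidelity = np.sum((y - fftconvolve(x, k, mode="same")) ** 2)
    l_half = np.sum(np.abs(x) ** 0.5)       # ||x||_0.5 taken element-wise (quasi-norm)
    l_two = np.sqrt(np.sum(x ** 2)) + eps   # ||x||_2; eps avoids division by zero
    l_one_k = np.sum(np.abs(k))             # ||k||_1
    return gamma * fidelity + l_half / l_two + lam * l_one_k
```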

4.2. Model Solution

It is difficult for the unknown variables $x$ and $k$ in Equation (8) to converge simultaneously, and the joint problem easily falls into a local minimum. Using the method of alternating minimization [31], Equation (8) can be divided into two relatively simple and independent problems, Equations (9) and (10). The iterative shrinkage-thresholding method [32] and iteratively reweighted least squares [33] are used to alternately iterate the image $x$ and the kernel $k$.
$$J(x) = \min_{x} \; \gamma \left\| y - k \ast x \right\|_2^2 + \frac{\left\| x \right\|_{0.5}}{\left\| x \right\|_2} \qquad (9)$$
$$J(k) = \min_{k} \; \gamma \left\| y - k \ast x \right\|_2^2 + \lambda \left\| k \right\|_1 \qquad (10)$$
In Equation (9), the ratio $\left\| x \right\|_{0.5} / \left\| x \right\|_2$ is non-convex and cannot be solved directly. In the solution process, Equation (9) is transformed into a convex optimization problem of L1 type by setting up an inner and an outer loop: $\left\| x \right\|_2$ is computed from the current estimate of $x$ and held fixed, and the resulting subproblem in $\left\| x \right\|_{0.5}$ is then solved to update $x$. The stopping condition for $x$ is determined by Equation (11).
$$\left| t\!\left(x^{(i+1)}\right) - t\!\left(x^{(i)}\right) \right| > n \qquad (11)$$
where $t\!\left(x^{(i+1)}\right)$ and $t\!\left(x^{(i)}\right)$ are the cost function values of the current and previous iterations, respectively, and $n$ is a small threshold value. To avoid the problem that the Lp norm is not differentiable at zero when calculating Equation (9), a smooth approximation of the Lp norm is made first [34], as shown in Equation (12).
$$\left\| x \right\|_P^P = \sum_{i=1}^{M \times N} \left| x_i \right|^P \approx \sum_{i=1}^{M \times N} \left( \left| x_i \right|^2 + \varepsilon \right)^{\frac{P}{2}} \qquad (12)$$
where $\varepsilon > 0$ is a very small constant, usually $\varepsilon = 10^{-5}$, and $x$ is a column vector of dimension $MN \times 1$. The general form of $x$ is $(x_1, x_2, x_3, \ldots, x_{M \times N})$, and $\sum_{i=1}^{M \times N} \left| x_i \right|^2 = x_1^2 + x_2^2 + x_3^2 + \cdots + x_{M \times N}^2$. Thus, Equation (9) can be changed into Equation (13). Taking the partial derivative of Equation (13) with respect to $x$ yields Equation (14). Equation (14) is solved iteratively by the steepest descent method, and finally the optimal $k$ is obtained from Equation (10).
$$J(x) = \min_{x} \; \gamma \left\| y - k \ast x \right\|_2^2 + \sum_{i=1}^{M \times N} \left( \left| x_i \right|^2 + \varepsilon \right)^{\frac{P}{2}} \qquad (13)$$
$$\frac{\partial J(x)}{\partial x} = -2\gamma \, k^{T}\!\left( y - k \ast x \right) + P \cdot \mathrm{diag}\!\left[ \left( \left| x_i \right|^2 + \varepsilon \right)^{\frac{P}{2}-1} \right] \cdot x \qquad (14)$$
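The following sketch shows one possible form of the steepest-descent update for the $x$ subproblem using the smoothed Lp approximation of Equations (12)–(14). The step size, iteration count, non-negativity clipping, and correlation-based adjoint are assumptions made for illustration, not settings reported in the paper.

```python
import numpy as np
from scipy.signal import fftconvolve

def grad_x(x, k, y, gamma, p=0.5, eps=1e-5):
    """Gradient of Equation (13): fidelity term plus the smoothed Lp term of Eq. (12).
    The adjoint of convolution with k is implemented as correlation with the flipped kernel."""
    residual = y - fftconvolve(x, k, mode="same")
    k_flip = k[::-1, ::-1]
    grad_fid = -2.0 * gamma * fftconvolve(residual, k_flip, mode="same")
    grad_lp = p * (x ** 2 + eps) ** (p / 2.0 - 1.0) * x   # diag[(|x_i|^2 + eps)^(P/2 - 1)] x
    return grad_fid + grad_lp

def update_x(x, k, y, gamma, step=1e-3, iters=50):
    """A few steepest-descent iterations for the x subproblem (Equation (9))."""
    for _ in range(iters):
        x = x - step * grad_x(x, k, y, gamma)
        x = np.clip(x, 0.0, None)   # keep brightness values non-negative (assumption)
    return x
```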
The recovery process for a night-light remote sensing image is shown in Figure 5. The detailed steps are as follows:
  • Input the night-light remote sensing image taken by the Luojia 1-01 satellite.
  • Estimate the APSF of the image: calculate the atmospheric optical thickness T, estimate the forward scattering coefficient q according to the weather conditions, and put T, q and the light source intensity I0 into the atmospheric point spread function model to calculate the APSF of the image.
  • Put the APSF into the restoration model, input the parameters γ and λ, and perform image restoration with Equation (8).
  • Output the restored image and evaluate its quality.
For image quality evaluation, an image with more detailed information and less "glow" is visually better. The variance and the Tenengrad function were used for quantitative evaluation [35]. The variance measures the dispersion of the image pixel grey values about the mean; a larger variance indicates more dispersed grey values and better image quality. The Tenengrad function is an image evaluation index that measures image sharpness and edge information; the larger the Tenengrad value, the better the image quality.
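As a reference for how the two quality indices could be computed, a short sketch is given below. The Sobel-based form of the Tenengrad function follows its common definition and is an assumption about the exact variant used here.

```python
import numpy as np
from scipy.ndimage import sobel

def variance_index(img):
    """Variance of pixel grey values about their mean; larger means more dispersed values."""
    return np.var(img.astype(float))

def tenengrad_index(img):
    """Tenengrad sharpness: mean squared gradient magnitude from Sobel derivatives."""
    img = img.astype(float)
    gx = sobel(img, axis=1)
    gy = sobel(img, axis=0)
    return np.mean(gx ** 2 + gy ** 2)
```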

5. APSF Estimation and Image Restoration Experiments in This Paper

5.1. Introduction of Experimental Data

This paper uses night-light remote sensing images taken by Luojia 1-01 on 29 October 2018 as the experimental data. Three isolated point light sources with good quality in the Bohai Sea (Figure 6) and night-light remote sensing images of Dongying in Shandong province and of Tianjin were selected as the experimental areas. The characteristics of the experimental areas are shown in Table 1. The forward scattering coefficient q in the APSF estimation is 0.2, and the atmospheric optical thickness T is 1.2. The estimated APSF template is shown in Table 2 and its 3D result in Figure 7; the restored APSF template is shown in Table 3 and its 3D result in Figure 8. The APSF template presented in this paper is estimated by analyzing the blurring factors of night-light remote sensing imaging and is obtained from the image's own information and imaging characteristics, so the template better reflects the characteristics of the image itself.

5.2. Results and Analysis

The "glow" phenomenon around the point source appears as concentric circles. The image before and after restoration with this method is shown in Figure 9. Comparing Figure 9c with Figure 9b, the maximum brightness range of the pixels is increased, the contour lines are denser, and the diffusion range is clearly contracted, which indicates that the point light source is brighter and the blurring is weakened after restoration. The restoration results for night-light remote sensing data of Tianjin acquired on clear and hazy nights are compared in Figure 10. The experimental results show that the method used in this paper also has a good dehazing effect: the influence of haze in the restoration result is decreased and the clearness of the image is increased.
To further verify the effectiveness of the APSF estimation and of the night-light image restoration method, two comparison experiments were performed with the blind deconvolution model. In Method Two, the blind deconvolution model is used with a Gaussian PSF template; in Method Three, the blind deconvolution model is used with the APSF template estimated in this paper. The experimental results are shown in Figure 11, Figure 12, Figure 13, Figure 14, Figure 15, Figure 16 and Figure 17. At the same time, the experimental results were analyzed using the image variance (Table 4) and the Tenengrad function (Table 5).
From Figure 11, Figure 12 and Figure 13, it can be seen that, compared with the original image data, in the point-source images recovered by the three restoration methods the blurring of the point source is significantly reduced and the brightness value is increased, which proves the effectiveness of the restoration. At the same time, the restoration result of our method is visually superior to those of Method Two and Method Three, and the visual effect of Method Three is better than that of Method Two. This further illustrates that the APSF template estimated in this paper is superior to the Gaussian PSF template and that the restoration method of this paper is more effective.
From Figure 14, Figure 15, Figure 16 and Figure 17, it can be seen that, compared with the original image data, the city images recovered by the three methods show improved clarity to different degrees: the noise is suppressed, the restoration result is brighter and clearer, the image details are increased, and the blurring is reduced to varying extents. Compared with Method Two and Method Three, the image details after restoration with our method are clearly increased and the "glow" phenomenon is clearly improved, indicating that the method is effective.
From the evaluation of the two indicators in Table 4 and Table 5, the two indices of the three groups of restoration results are all higher than those of the original images, indicating that all three restoration methods can improve the image quality. The indices of our method are higher than those of Method Two and Method Three, and the indices of Method Three are better than those of Method Two.
Therefore, from the perspective of the restoration effects and the objective evaluation indicators, both the visual evaluation and the evaluation indices of our method are superior to those of the other methods, indicating that the image quality after restoration is improved and that the method is feasible.

6. Conclusions

Aiming at the "glow" phenomenon in the image, the characteristics of the Luojia 1-01 night-light image were analyzed. By analyzing the APSF parameters during night imaging, the APSF template was obtained, and a Luojia 1-01 night-light image restoration method based on the night scattering model was proposed. Single-point and regional images were used for the restoration experiments. The experimental results demonstrate that the APSF estimation method is feasible. After restoration with the proposed method, the "glow" phenomenon is improved, the noise is reduced, and the detail information is increased. The method is applicable to images after radiometric correction and can be used in the pre-processing stage to improve image quality. The processing results help to improve the accuracy of later index estimation. This method is of great significance for improving the quality of night-light remote sensing images and expanding their fields of application.

Author Contributions

L.B. conceived and designed the experiments; Z.X. completed the experiment of the paper; G.Z. completed the APSF model estimation of the paper; Z.Z. completed the quality evaluation of the experiment; and all authors edited the paper.

Funding

This research was funded by the Open Research Fund of the State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, grant number 18T07; the National Natural Science Foundation of China, grant number 41801294; and the Natural Science Foundation of Liaoning, grant number 20180551209.

Acknowledgments

We thank the research team at Wuhan University for freely providing Luojia 1-01 nighttime light imagery. Furthermore, the authors would like to thank the reviewers for their helpful comments.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Li, D.; Li, X. An overview on data mining of nighttime light remote sensing. Acta Geod. Cartogr. Sin. 2015, 44, 591–601.
  2. Bennett, M.M.; Smith, L.C. Advances in using multitemporal night-time lights satellite imagery to detect, estimate, and monitor socioeconomic dynamics. Remote Sens. Environ. 2017, 192, 176–197.
  3. Li, X.; Li, D.; Xu, H.; Wu, C. Intercalibration between DMSP/OLS and VIIRS night-time light images to evaluate city light dynamics of Syria's major human settlement during Syrian Civil War. Int. J. Remote Sens. 2017, 38, 5934–5951.
  4. Zhang, G.; Li, L.; Jiang, Y.; Shen, X.; Li, D. On-Orbit Relative Radiometric Calibration of the Night-Time Sensor of the LuoJia1-01 Satellite. Sensors 2018, 18, 4225.
  5. Zhang, G.; Wang, J.; Jiang, Y.; Zhou, P.; Zhao, Y.; Xu, Y. On-Orbit Geometric Calibration and Validation of Luojia 1-01 Night-Light Satellite. Remote Sens. 2019, 11, 264.
  6. Zhang, G.; Guo, X.; Li, D.; Jiang, B. Evaluating the Potential of LJ1-01 Nighttime Light Data for Modeling Socio-Economic Parameters. Sensors 2019, 19, 1465.
  7. Jiang, W.; He, G.; Long, T.; Guo, H.; Yin, R.; Leng, W.; Liu, H.; Wang, G. Potentiality of Using Luojia 1-01 Nighttime Light Imagery to Investigate Artificial Light Pollution. Sensors 2018, 18, 2900.
  8. Li, X.; Li, X.; Li, D.; He, X.; Jendryke, M. A preliminary investigation of Luojia-1 night-time light imagery. Remote Sens. Lett. 2019, 10, 526–535.
  9. Kimoto, Y.; Nemoto, N.; Matsumoto, H.; Ueno, K.; Goka, T.; Omodaka, T. Space radiation environment and its effects on satellites: Analysis of the first data from TEDA on board ADEOS-II. IEEE Trans. Nucl. Sci. 2005, 52, 1574–1578.
  10. Sudey, J.; Schulman, J.R. In-orbit measurements of Landsat-4 thematic mapper dynamic disturbances. Acta Astronaut. 1985, 12, 485–503.
  11. Toyoshima, M.; Araki, K. In-orbit measurements of short term attitude and vibrational environment on the Engineering Test Satellite VI using laser communication equipment. Opt. Eng. 2001, 40, 827–832.
  12. Li, X.; Ma, R.; Zhang, Q.; Li, D.; Liu, S.; He, T.; Zhao, L. Anisotropic characteristic of artificial light at night—Systematic investigation with VIIRS DNB multi-temporal observations. Remote Sens. Environ. 2019, 233, 111357.
  13. Zhang, Y. The Improvement of Micro-Satellite's Low-Light Remote Sensing Image Quality. Master's Thesis, Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun, China, 1 October 2016.
  14. Qiu, X.; Dai, M.; Yin, C. UAV remote sensing atmospheric degradation image restoration based on multiple scattering APSF estimation. Optoelectron. Lett. 2017, 13, 386–391.
  15. Hu, B.X.; Li, X.W.; Zhu, C.G.; Strahler, A.H. Deriving the Anisotropic Atmospheric Point Spread Function of Off-nadir Remote Sensing. J. Image Graph. 1996, 1, 19–29.
  16. Metari, S.; Deschênes, F. A new convolution kernel for atmospheric point spread function applied to computer vision. In Proceedings of the IEEE International Conference on Computer Vision, Rio de Janeiro, Brazil, 14–21 October 2007; pp. 1–8.
  17. He, R.; Wang, Z.; Fan, Y.; Dagan Feng, D. Multiple scattering model based single image dehazing. In Proceedings of the 8th Conference on Industrial Electronics and Applications, ICIEA, Melbourne, Australia, 19–21 June 2013; pp. 733–737.
  18. Guo, F.; Tang, J.; Xiao, X. Foggy Scene Rendering Based on Transmission Map Estimation. Int. J. Comput. Games Technol. 2014, 2014, 10.
  19. Narasimhan, S.G.; Nayar, S.K. Shedding Light on the Weather. In Proceedings of the IEEE Computer Society Conference on Computer Vision & Pattern Recognition, Madison, WI, USA, 18–20 June 2003; pp. 1–8.
  20. Dowy, H.G.; Kareem, H.H.; Abood, Z.M.; Ghada, S.K. Blurred Image Restoration with Unknown Point Spread Function. Al-Mustansiriyah J. Sci. 2018, 29, 189–194.
  21. Yang, G.; Xing, Z.; Huang, W.; Wang, J. On-orbit MTF estimation and restoration for CCD cameras of environment and disaster reduction small satellites. China Univ. Min. Technol. 2012, 40, 481–486.
  22. Rudin, L.I.; Osher, S.; Fatemi, E. Nonlinear total variation based noise removal algorithms. Phys. D Nonlinear Phenom. 1992, 60, 259–268.
  23. Liu, Z.; Wang, C.; Luo, C. Estimation of CBERS-1 Point Spread Function and Image Restoration. J. Remote Sens. 2004, 8, 234–238.
  24. Bredies, K.; Lorenz, D.A. Iterated hard shrinkage for minimization problems with sparsity constraints. SIAM J. Sci. Comput. 2008, 30, 657–683.
  25. Krishnan, D.; Tay, T.; Fergus, R. Blind deconvolution using a normalized sparsity measure. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, IEEE Computer Society, Washington, DC, USA, 20–25 June 2011; pp. 233–240.
  26. Tian, Q.; Zheng, L.; Tong, L. Image-based atmospheric radiation correction and reflectance retrieval methods. J. Appl. Meteorol. 1998, 4, 456–461.
  27. Li, X.; Zhao, L.; Li, D.; Xu, H. Mapping Urban Extent Using Luojia 1-01 Nighttime Light Imagery. Sensors 2018, 18, 3665.
  28. Zheng, Y.; Huang, W.; Pan, Y.; Xu, M. Optimal PSF Estimation for Simple Optical System Using a Wide-Band Sensor Based on PSF Measurement. Sensors 2018, 18, 3552.
  29. Gonzalez, R.C.; Woods, R.E. Digital Image Processing; Publishing House of Electronics Industry: Beijing, China, 2007.
  30. Qu, R. Regularization Image Restoration Methods Based on Point Spread Function Estimation. Master's Thesis, Harbin Institute of Technology, Harbin, China, 1 June 2016.
  31. Cho, S.; Lee, S. Fast motion deblurring. ACM Trans. Graph. 2009, 28, 145.
  32. Wu, G.; Luo, S. Adaptive fixed-point iterative shrinkage/thresholding algorithm for MR imaging reconstruction using compressed sensing. Magn. Reson. Imaging 2014, 32, 372–378.
  33. Rubin, D.B. Iteratively Reweighted Least Squares. In Encyclopedia of Statistical Sciences; John Wiley & Sons: Hoboken, NJ, USA, 2004.
  34. Wang, Z. Technique of SAR Image Superresolution; Science Press: Beijing, China, 2007.
  35. Zhai, Y.; Zhou, D.; Liu, Y.; Liu, S.; Peng, K. Design of performance evaluation index focusing function and selection of optimal function. Opt. J. 2011, 4, 242–252.
Figure 1. Degradation phenomena of Luojia 1-01 night-light data: (a) Isolated noise degradation phenomenon in Beijing; (b) Comprehensive degradation phenomenon in Wuhan.
Figure 2. Multiple scattering glow model of light from a source to a sensor.
Figure 3. Scattering of an isotropic point light source in homogeneous atmosphere.
Figure 4. The range of forward scattering coefficient q under different weather conditions [19].
Figure 5. Flow chart of image restoration of night-light remote sensing.
Figure 6. Point light distribution.
Figure 7. 3D result of the estimated APSF.
Figure 8. 3D results of restored APSF.
Figure 9. (a) The point source and its surrounding glow phenomenon; (b) Approximate representation of the blurred point light source; (c) Approximate representation of the point light source after restoration.
Figure 10. The restoration results of Tianjin in clear and hazy nights. (a) The original image of the clear night; (b) The restoration result of (a), T = 1.2, q = 0.2; (c) The original image of the hazy night; (d) The restoration result of (c), T = 4, q = 0.75.
Figure 11. The restoration results of Point 1. (a) The original image; (b) The method used by this paper; (c) Method Two; (d) Method Three.
Figure 12. The restoration results of Point 2. (a) The original image; (b) The method used by this paper; (c) Method Two; (d) Method Three.
Figure 13. The restoration results of Point 3. (a) The original image; (b) The method used by this paper; (c) Method Two; (d) Method Three.
Figure 14. The restoration results of Dongying. (a) The original image; (b) The method used by this paper; (c) Method Two; (d) Method Three.
Figure 15. The detail restoration results of Dongying. (a) The original image; (b) The method used by this paper; (c) Method Two; (d) Method Three.
Figure 16. The restoration results of Tianjin. (a) The original image; (b) The method used by this paper; (c) Method Two; (d) Method Three.
Figure 17. The detail restoration results of Tianjin. (a) The original image; (b) The method of this paper; (c) Method Two; (d) Method Three.
Table 1. Characteristics of experimental areas.

Experimental Area | Imaging Time (UTC) | Weather Condition | Positioning Accuracy/m | Spatial Resolution/m | Characteristics of Experimental Area
Three isolated point light sources | 2018.10.29 14:17:34 | Clear | 82 | 130 | The image quality of point light source is good, clear and bright, without interference from other light sources.
Dongying | 2018.10.29 14:17:34 | Clear | 82 | 130 | Geography plays an important role in the Bohai economic zone.
Tianjin | 2018.10.29 14:17:34 | Clear | 82 | 130 | The largest open coastal city in northern China, with a large population, plays an important geographical, economic and social role.
Tianjin | 2018.09.26 14:12:10 | Haze | 399 | |
Table 2. The APSF template estimated.

0.0000 0.0000 0.0000 0.0001 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0001
0.0001 0.0005 0.0012 0.0009 0.0006 0.0011 0.0007 0.0006 0.0004 0.0001 0.0001
0.0000 0.0001 0.0008 0.0020 0.0032 0.0045 0.0038 0.0026 0.0016 0.0009 0.0003
0.0001 0.0004 0.0018 0.0049 0.0106 0.0156 0.0162 0.0096 0.0045 0.0020 0.0007
0.0003 0.0012 0.0041 0.0113 0.0369 0.0675 0.0695 0.0326 0.0127 0.0041 0.0016
0.0008 0.0024 0.0076 0.0170 0.0508 0.1267 0.1035 0.0539 0.0165 0.0047 0.0015
0.0000 0.0015 0.0050 0.0108 0.0194 0.0646 0.0581 0.0331 0.0101 0.0038 0.0011
0.0000 0.0004 0.0013 0.0032 0.0069 0.0113 0.0126 0.0095 0.0041 0.0021 0.0012
0.0001 0.0000 0.0002 0.0009 0.0018 0.0027 0.0029 0.0026 0.0016 0.0008 0.0001
0.0001 0.0000 0.0007 0.0007 0.0006 0.0008 0.0004 0.0004 0.0005 0.0006 0.0001
0.0001 0.0000 0.0000 0.0001 0.0004 0.0003 0.0001 0.0001 0.0004 0.0001 0.0000
Table 3. Restored APSF template.

0.0000 0.0000 0.0000 0.0000 0.0000 0.0032 0.0000 0.0064 0.0000 0.0031 0.0000
0.0000 0.0032 0.0000 0.0000 0.0000 0.0000 0.0060 0.0000 0.0051 0.0000 0.0000
0.0000 0.0000 0.0000 0.0042 0.0082 0.0074 0.0000 0.0063 0.0000 0.0000 0.0000
0.0000 0.0000 0.0185 0.0000 0.0000 0.0050 0.0062 0.0055 0.0000 0.0038 0.0000
0.0000 0.0000 0.0411 0.0000 0.0524 0.0213 0.0186 0.0040 0.0059 0.0000 0.0103
0.0170 0.0000 0.0193 0.0000 0.1434 0.2516 0.0000 0.0157 0.0000 0.0057 0.0000
0.0000 0.0110 0.0000 0.0307 0.0160 0.1125 0.0000 0.0123 0.0063 0.0083 0.0000
0.0000 0.0044 0.0049 0.0000 0.0080 0.0000 0.0101 0.0000 0.0000 0.0031 0.0000
0.0046 0.0000 0.0080 0.0000 0.0099 0.0064 0.0059 0.0061 0.0000 0.0000 0.0046
0.0000 0.0028 0.0000 0.0045 0.0000 0.0035 0.0000 0.0000 0.0028 0.0000 0.0000
0.0000 0.0000 0.0068 0.0027 0.0028 0.0000 0.0000 0.0000 0.0000 0.0000 0.0027
Table 4. Quality evaluation of the variance of deblurred images.

Variance | Point 1 | Point 2 | Point 3 | Dongying | Tianjin
The original image | 10.21 | 9.46 | 9.37 | 60.77 | 67.20
The method used by this paper | 11.95 | 11.35 | 10.87 | 64.49 | 71.98
Method Two | 10.38 | 9.68 | 9.73 | 61.88 | 68.45
Method Three | 10.63 | 10.01 | 10.17 | 61.94 | 70.05
Table 5. Quality evaluation of the Tenengrad of deblurred images.

Tenengrad | Point 1 | Point 2 | Point 3 | Dongying | Tianjin
The original image | 3.57 | 2.80 | 2.91 | 2517.4 | 4953.6
The method of this paper | 6.76 | 4.93 | 4.80 | 5775 | 14122
Method Two | 4.18 | 3.18 | 3.66 | 3357.1 | 6696
Method Three | 5.18 | 4.02 | 4.76 | 3402.5 | 7240.4
