An Image Fusion Algorithm for Sustainable Development Goals Satellite-1 Night-Time Light Images Based on Optimized Image Stretching and Dual-Domain Fusion
Abstract
1. Introduction
- An OIS method combining linear stretching and gamma stretching is proposed, bringing the stretched PAN image closer to the MS images; it is applied to GIU images from different regions of the world.
- A DDF method that applies different fusion strategies at different luminance levels is proposed and applied to GIU images from different regions of the world.
- A comprehensive evaluation of different fusion methods on different NL images, using multiple evaluation metrics, shows that the fusion method proposed in this study is more effective.
2. Materials and Analysis
2.1. Datasets and Study Area
2.2. Dark Value Analysis of NL Images
3. Methods
3.1. Optimized Image Stretching
Algorithm 1. OIS Algorithm

Input: SDGSAT-1 GIU MS and PAN images original_ms, original_pan
1: Convert original_ms to a grayscale image with the rgb2gray function and calculate its mean and standard deviation target_mean, target_std
2: Set the initial parameters a, b, c, d = 1, 1, 0, 0
3: Calculate the mean, standard deviation, minimum, and maximum of the stretched image under the current parameters: current_mean, current_std, current_min, current_max
4: Set the penalty for missing the optimization objective: penalty = (current_mean − target_mean)^2 + (current_std − target_std)^2 + (current_min − 1)^2 + ((current_max − 4095)/10)^2
5: Calculate the derivative of the optimized stretching formula: derivative = a·b·x^(b−1) + c
6: Set the nonlinear inequality constraint derivative > 0
7: Apply the fmincon optimization function, iteratively modifying the parameters under the deviation penalty and the nonlinear inequality constraint until the penalty no longer decreases
8: Apply the final parameters for optimized stretching and calculate the resulting maximum value transmax
9: Calculate the number of columns N of original_pan
10: for i in range(N):
11:   x ← the current pixel value
12:   if x > 2730: ¹
13:     x = 2730 + (x − 2730)·(4095 − 2730)/(transmax − 2730)
14:   elif x < 1:
15:     x = 1
Output: PAN image after applying optimized stretching, OIS_pan
¹ The threshold value of 2730 was chosen because it allows pixels stretched beyond 4095 to be compressed back into range without changing the smaller pixel values too much.
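As a rough illustration, Algorithm 1 can be sketched in Python, with `scipy.optimize.minimize` (SLSQP) standing in for MATLAB's fmincon. The stretching function s(x) = a·x^b + c·x + d is inferred from the derivative in step 5; the function name, parameter bounds, and solver choice are our assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def ois_stretch(original_pan, target_mean, target_std, t=2730, dn_max=4095):
    """Optimized Image Stretching sketch: fit s(x) = a*x**b + c*x + d so the
    stretched PAN statistics approach the MS grayscale statistics."""
    x = np.clip(original_pan.astype(float), 1, None)  # GIU DN values start at 1

    def stretched(p):
        a, b, c, d = p
        return a * np.power(x, b) + c * x + d

    def penalty(p):
        # Deviation penalty from Algorithm 1, step 4.
        s = stretched(p)
        return ((s.mean() - target_mean) ** 2
                + (s.std() - target_std) ** 2
                + (s.min() - 1) ** 2
                + ((s.max() - dn_max) / 10) ** 2)

    # Monotonicity constraint from steps 5-6: a*b*x**(b-1) + c > 0 everywhere.
    cons = {"type": "ineq",
            "fun": lambda p: np.min(p[0] * p[1] * np.power(x, p[1] - 1) + p[2])}

    # Bounds are an assumed stabilizer; fmincon's defaults may differ.
    res = minimize(penalty, x0=[1.0, 1.0, 0.0, 0.0], method="SLSQP",
                   bounds=[(1e-3, 10), (0.1, 3), (-10, 10), (-1000, 1000)],
                   constraints=cons)
    s = stretched(res.x)

    # Steps 9-15: compress values above the threshold back into [t, dn_max]
    # and clip lows to the minimum valid DN.
    transmax = s.max()
    if transmax > t:
        high = s > t
        s[high] = t + (s[high] - t) * (dn_max - t) / (transmax - t)
    s[s < 1] = 1
    return s
```

The final compression step guarantees the output stays within the 12-bit DN range [1, 4095] regardless of how far the fitted curve overshoots.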
3.2. Dual-Domain Fusion
4. Experiments and Results
4.1. Image Fusion Quality Assessment Methods
4.1.1. Subjective Evaluation
4.1.2. Objective Evaluation
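The result tables in Section 4.2 report five objective metrics (ERGAS, SSIM, SM, MI, QNR). As one concrete example, ERGAS [Wald et al., 1997] aggregates per-band relative RMSE, scaled by the resolution ratio; lower is better, and identical images score 0. A minimal sketch assuming the standard definition, with the 10 m PAN / 40 m MS pair giving a scale factor of 1/4 (the function name and array layout are our assumptions):

```python
import numpy as np

def ergas(fused, reference, ratio=4):
    """ERGAS (Wald, 1997): relative dimensionless global synthesis error.
    fused, reference: (H, W, bands) arrays; ratio: MS/PAN pixel-size ratio,
    so the scale factor h/l is 1/ratio (10 m / 40 m -> 1/4 here)."""
    fused = fused.astype(float)
    reference = reference.astype(float)
    n_bands = reference.shape[-1]
    acc = 0.0
    for k in range(n_bands):
        # Per-band RMSE, normalized by that band's mean radiance.
        rmse = np.sqrt(np.mean((fused[..., k] - reference[..., k]) ** 2))
        acc += (rmse / reference[..., k].mean()) ** 2
    return 100.0 * (1.0 / ratio) * np.sqrt(acc / n_bands)
```

A uniform +10 DN error on a reference with band means of 100 yields ERGAS = 100 × (1/4) × 0.1 = 2.5, which gives a feel for the magnitudes in the tables below.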
4.2. Results of the Four Datasets
4.2.1. Results for the Beijing Dataset
4.2.2. Results for the Shanghai Dataset
4.2.3. Results for the Moscow Dataset
4.2.4. Results for the New York Dataset
5. Discussion
- Using a single Otsu algorithm to select the threshold separating the light and dark zones does not always yield an appropriate split.
- The OIS algorithm proposed in this study has multiple parameters and relies on MATLAB’s objective optimization function to compute them, which introduces some uncertainty.
- The image fusion evaluation metrics used in this study are those commonly applied to daytime optical images; no experiments were conducted with metrics designed specifically for NL images.
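The first limitation can be made concrete: Otsu's method picks one global threshold by maximizing the between-class variance of the two resulting classes, and on the heavily skewed histograms of NL images (most pixels near-dark) that single threshold can land the light/dark split poorly. A minimal NumPy sketch of the standard Otsu computation (not the authors' exact implementation; the function name and bin count are our assumptions):

```python
import numpy as np

def otsu_threshold(image, nbins=256):
    """Otsu's method: choose the threshold maximizing the between-class
    variance of the (dark, light) split of the histogram."""
    hist, bin_edges = np.histogram(image.ravel(), bins=nbins)
    hist = hist.astype(float)
    centers = (bin_edges[:-1] + bin_edges[1:]) / 2

    w0 = np.cumsum(hist)              # pixel counts at or below each bin
    w1 = w0[-1] - w0                  # pixel counts above each bin
    m0 = np.cumsum(hist * centers)    # cumulative intensity sums
    m1 = m0[-1] - m0

    valid = (w0[:-1] > 0) & (w1[:-1] > 0)   # skip degenerate splits
    mu0 = m0[:-1][valid] / w0[:-1][valid]   # dark-class mean
    mu1 = m1[:-1][valid] / w1[:-1][valid]   # light-class mean
    between = w0[:-1][valid] * w1[:-1][valid] * (mu0 - mu1) ** 2

    return centers[:-1][valid][np.argmax(between)]
```

On a cleanly bimodal image this recovers a threshold between the two modes, but when one mode dwarfs the other, as in typical NL scenes, the maximizer can sit far from where a visually sensible light/dark boundary lies, which is the behavior the discussion point flags.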
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Ma, T.; Zhou, C.; Pei, T.; Haynie, S.; Fan, J. Quantitative estimation of urbanization dynamics using time series of DMSP/OLS nighttime light data: A comparative case study from China’s cities. Remote Sens. Environ. 2012, 124, 99–107. [Google Scholar] [CrossRef]
- Stokes, E.C.; Seto, K.C. Characterizing urban infrastructural transitions for the Sustainable Development Goals using multi-temporal land, population, and nighttime light data. Remote Sens. Environ. 2019, 234, 111430. [Google Scholar] [CrossRef]
- Bailang, Y.; Congxiao, W.; Wenkang, G.; Zuoqi, C.; Kaifang, S.; Bin, W.; Yuchen, H.; Qiaoxuan, L.; Jianping, W. Nighttime light remote sensing and urban studies: Data, methods, applications, and prospects. Natl. Remote Sens. Bull. 2021, 25, 342–364. [Google Scholar]
- Li, C.; Chen, F.; Wang, N.; Yu, B.; Wang, L. SDGSAT-1 nighttime light data improve village-scale built-up delineation. Remote Sens. Environ. 2023, 297, 113764. [Google Scholar] [CrossRef]
- Jia, M.; Zeng, H.; Chen, Z.; Wang, Z.; Ren, C.; Mao, D.; Zhao, C.; Zhang, R.; Wang, Y. Nighttime light in China’s coastal zone: The type classification approach using SDGSAT-1 Glimmer Imager. Remote Sens. Environ. 2024, 305, 114104. [Google Scholar] [CrossRef]
- Lin, Z.; Jiao, W.; Liu, H.; Long, T.; Liu, Y.; Wei, S.; He, G.; Portnov, B.A.; Trop, T.; Liu, M. Modelling the public perception of urban public space lighting based on SDGSAT-1 glimmer imagery: A case study in Beijing, China. Sustain. Cities Soc. 2023, 88, 104272. [Google Scholar] [CrossRef]
- Liu, S.; Zhou, Y.; Wang, F.; Wang, S.; Wang, Z.; Wang, Y.; Qin, G.; Wang, P.; Liu, M.; Huang, L. Lighting characteristics of public space in urban functional areas based on SDGSAT-1 glimmer imagery: A case study in Beijing, China. Remote Sens. Environ. 2024, 306, 114137. [Google Scholar] [CrossRef]
- Quan, X.; Song, X.; Miao, J.; Huang, C.; Gao, F.; Li, J.; Ying, L. Study on the substitutability of nighttime light data for SDG indicators: A case study of Yunnan Province. Front. Environ. Sci. 2023, 11, 1309547. [Google Scholar] [CrossRef]
- Xie, Q.; Li, H.; Jing, L.; Zhang, K. Road Extraction Based on Deep Learning Using SDGSAT-1 Nighttime Light Data. In Proceedings of the IGARSS 2024—2024 IEEE International Geoscience and Remote Sensing Symposium, Athens, Greece, 7–12 July 2024; pp. 8033–8036. [Google Scholar]
- Guo, B.; Hu, D.; Zheng, Q. Potentiality of SDGSAT-1 glimmer imagery to investigate the spatial variability in nighttime lights. Int. J. Appl. Earth Obs. Geoinf. 2023, 119, 103313. [Google Scholar] [CrossRef]
- Chang, D.; Wang, Q.; Yang, J.; Xu, W. Research on road extraction method based on sustainable development goals satellite-1 nighttime light data. Remote Sens. 2022, 14, 6015. [Google Scholar] [CrossRef]
- Chen, J.; Cheng, B.; Zhang, X.; Long, T.; Chen, B.; Wang, G.; Zhang, D. A TIR-visible automatic registration and geometric correction method for SDGSAT-1 thermal infrared image based on modified RIFT. Remote Sens. 2022, 14, 1393. [Google Scholar] [CrossRef]
- Zhang, D.; Cheng, B.; Shi, L.; Gao, J.; Long, T.; Chen, B.; Wang, G. A destriping algorithm for SDGSAT-1 nighttime light images based on anomaly detection and spectral similarity restoration. Remote Sens. 2022, 14, 5544. [Google Scholar] [CrossRef]
- Guo, H.; Chen, H.; Chen, L.; Fu, B. Progress on CASEarth satellite development. Chin. J. Space Sci. 2020, 40, 707–717. [Google Scholar] [CrossRef]
- Jiang, M.; Shen, H.; Li, J.; Yuan, Q.; Zhang, L. A differential information residual convolutional neural network for pansharpening. ISPRS J. Photogramm. Remote Sens. 2020, 163, 257–271. [Google Scholar] [CrossRef]
- González-Audícana, M.; Saleta, J.L.; Catalán, R.G.; García, R. Fusion of multispectral and panchromatic images using improved IHS and PCA mergers based on wavelet decomposition. IEEE Trans. Geosci. Remote Sens. 2004, 42, 1291–1299. [Google Scholar] [CrossRef]
- Chavez, P.; Sides, S.C.; Anderson, J.A. Comparison of three different methods to merge multiresolution and multispectral data: Landsat TM and SPOT panchromatic. Photogramm. Eng. Remote Sens. 1991, 57, 295–303. [Google Scholar]
- Shah, V.P.; Younan, N.H.; King, R.L. An efficient pan-sharpening method via a combined adaptive PCA approach and contourlets. IEEE Trans. Geosci. Remote Sens. 2008, 46, 1323–1335. [Google Scholar] [CrossRef]
- Laben, C.A.; Brower, B.V. Process for Enhancing the Spatial Resolution of Multispectral Imagery Using Pan-Sharpening. Google Patents US6011875A, 4 January 2000. [Google Scholar]
- Aiazzi, B.; Alparone, L.; Baronti, S.; Garzelli, A.; Selva, M. MTF-tailored multiscale fusion of high-resolution MS and Pan imagery. Photogramm. Eng. Remote Sens. 2006, 72, 591–596. [Google Scholar] [CrossRef]
- Li, S.; Kwok, J.T.; Wang, Y. Using the discrete wavelet frame transform to merge Landsat TM and SPOT panchromatic images. Inf. Fusion 2002, 3, 17–23. [Google Scholar] [CrossRef]
- Wei, Q.; Dobigeon, N.; Tourneret, J.-Y. Fast fusion of multi-band images based on solving a Sylvester equation. IEEE Trans. Image Process. 2015, 24, 4109–4121. [Google Scholar] [CrossRef]
- Ma, J.; Yu, W.; Chen, C.; Liang, P.; Guo, X.; Jiang, J. Pan-GAN: An unsupervised pan-sharpening method for remote sensing image fusion. Inf. Fusion 2020, 62, 110–120. [Google Scholar] [CrossRef]
- Chen, J.; Pan, Y.; Chen, Y. Remote sensing image fusion based on Bayesian GAN. arXiv 2020, arXiv:2009.09465. [Google Scholar]
- Masi, G.; Cozzolino, D.; Verdoliva, L.; Scarpa, G. Pansharpening by convolutional neural networks. Remote Sens. 2016, 8, 594. [Google Scholar] [CrossRef]
- Shakya, A.; Biswas, M.; Pal, M. CNN-based fusion and classification of SAR and Optical data. Int. J. Remote Sens. 2020, 41, 8839–8861. [Google Scholar] [CrossRef]
- Huang, W.; Xiao, L.; Wei, Z.; Liu, H.; Tang, S. A new pan-sharpening method with deep neural networks. IEEE Geosci. Remote Sens. Lett. 2015, 12, 1037–1041. [Google Scholar] [CrossRef]
- Azarang, A.; Manoochehri, H.E.; Kehtarnavaz, N. Convolutional autoencoder-based multispectral image fusion. IEEE Access 2019, 7, 35673–35683. [Google Scholar] [CrossRef]
- Rousseeuw, P.J. Silhouettes: A graphical aid to the interpretation and validation of cluster analysis. J. Comput. Appl. Math. 1987, 20, 53–65. [Google Scholar] [CrossRef]
- Davies, D.L.; Bouldin, D.W. A cluster separation measure. IEEE Trans. Pattern Anal. Mach. Intell. 1979, PAMI-1, 224–227. [Google Scholar] [CrossRef]
- Otsu, N. A threshold selection method from gray-level histograms. Automatica 1975, 11, 23–27. [Google Scholar] [CrossRef]
- Wald, L.; Ranchin, T.; Mangolini, M. Fusion of satellite images of different spatial resolutions: Assessing the quality of resulting images. Photogramm. Eng. Remote Sens. 1997, 63, 691–699. [Google Scholar]
- Jagalingam, P.; Hegde, A.V. A review of quality metrics for fused image. Aquat. Procedia 2015, 4, 133–142. [Google Scholar] [CrossRef]
- Vivone, G.; Alparone, L.; Chanussot, J.; Dalla Mura, M.; Garzelli, A.; Licciardi, G.A.; Restaino, R.; Wald, L. A critical comparison among pansharpening algorithms. IEEE Trans. Geosci. Remote Sens. 2014, 53, 2565–2586. [Google Scholar] [CrossRef]
- Qu, G.; Zhang, D.; Yan, P. Information measure for performance of image fusion. Electron. Lett. 2002, 38, 313–315. [Google Scholar] [CrossRef]
- Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612. [Google Scholar] [CrossRef] [PubMed]
- Santini, S.; Jain, R. Similarity measures. IEEE Trans. Pattern Anal. Mach. Intell. 1999, 21, 871–883. [Google Scholar] [CrossRef]
- Wang, Z.; Bovik, A.C. A universal image quality index. IEEE Signal Process. Lett. 2002, 9, 81–84. [Google Scholar] [CrossRef]
- Alparone, L.; Aiazzi, B.; Baronti, S.; Garzelli, A.; Nencini, F.; Selva, M. Multispectral and panchromatic data fusion assessment without reference. Photogramm. Eng. Remote Sens. 2008, 74, 193–200. [Google Scholar] [CrossRef]
- Zhou, B.; Khosla, A.; Lapedriza, A.; Oliva, A.; Torralba, A. Learning deep features for discriminative localization. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 2921–2929. [Google Scholar]
Satellite/Sensor | Operational Years | Spatial Resolution (m) | Temporal Resolution | Spectral Bands |
---|---|---|---|---|
DMSP/OLS | 1992 to 2013 | 2700 | Daily global coverage | PAN: 400–1100 nm |
NPP/VIIRS | 2011 to present | 750 | Daily global coverage | PAN: 505–890 nm |
ISS | 2003 to present | 5–200 | Irregular image acquisition | RGB |
Luojia1-01 | 2018 to 2019 | 130 | Global coverage in 15 days | PAN: 460–980 nm |
SDGSAT-1/GIU | 2021 to present | PAN: 10 MS: 40 | Global coverage in 11 days | PAN: 430–900 nm; B: 430–520 nm; G: 520–615 nm; R: 615–900 nm. |
ID | Location | Date | Time (UTC) | Image Size (MS/PAN) |
---|---|---|---|---|
1 | Beijing, China | 16 April 2024 | 13:13 | 800 × 800/3200 × 3200 |
2 | Shanghai, China | 18 May 2024 | 13:05 | 800 × 800/3200 × 3200 |
3 | Moscow, Russia | 27 February 2024 | 18:22 | 800 × 800/3200 × 3200 |
4 | New York, USA | 7 February 2024 | 02:00 | 800 × 800/3200 × 3200 |
ID | Location | Band | Mean | Standard Deviation | Minimum | Maximum |
---|---|---|---|---|---|---|
1 | Beijing, China | R | 342 | 509 | 1 | 4095 |
 | | G | 319 | 500 | 1 | 4095 |
 | | B | 98 | 263 | 1 | 4095 |
 | | PAN | 25 | 120 | 1 | 4095 |
2 | Shanghai, China | R | 439 | 690 | 1 | 4095 |
 | | G | 478 | 710 | 1 | 4095 |
 | | B | 186 | 427 | 1 | 4095 |
 | | PAN | 47 | 228 | 1 | 4095 |
3 | Moscow, Russia | R | 850 | 922 | 1 | 4095 |
 | | G | 847 | 883 | 1 | 4095 |
 | | B | 265 | 377 | 1 | 4095 |
 | | PAN | 58 | 142 | 1 | 4095 |
4 | New York, USA | R | 281 | 441 | 1 | 4095 |
 | | G | 334 | 544 | 1 | 4095 |
 | | B | 110 | 254 | 1 | 4095 |
 | | PAN | 22 | 106 | 1 | 4095 |
Location | Image | Mean | Standard Deviation | Minimum | Maximum |
---|---|---|---|---|---|
Beijing, China | Original MS | 254 | 401 | 1 | 4095 |
 | Original PAN | 25 | 120 | 1 | 4095 |
 | OIS PAN | 253 | 391 | 1 | 4095 |
Shanghai, China | Original MS | 368 | 591 | 1 | 4095 |
 | Original PAN | 47 | 228 | 1 | 4095 |
 | OIS PAN | 359 | 527 | 1 | 4095 |
Moscow, Russia | Original MS | 654 | 700 | 1 | 4095 |
 | Original PAN | 58 | 142 | 1 | 4095 |
 | OIS PAN | 646 | 655 | 1 | 4095 |
New York, USA | Original MS | 242 | 399 | 1 | 4095 |
 | Original PAN | 22 | 106 | 1 | 4095 |
 | OIS PAN | 241 | 391 | 1 | 4095 |
Method | ERGAS | SSIM | SM | MI | QNR |
---|---|---|---|---|---|
HSV | 48.333 | 0.289 | 0.307 | 0.654 | 0.212 |
GS | 30.786 | 0.672 | 0.564 | 0.559 | 0.941 |
WT | 27.486 | 0.822 | 0.624 | 0.865 | 0.915 |
Laplacian | 49.797 | 0.262 | 0.281 | 0.280 | 0.367 |
DDF | 30.239 | 0.747 | 0.562 | 0.770 | 0.886 |
DO | 22.858 (3) | 0.827 (2) | 0.709 (2) | 0.922 (3) | 0.960 (3) |
DW | 21.005 (2) | 0.806 (3) | 0.626 (3) | 0.961 (2) | 0.963 (2) |
DWO | 20.161 (1) | 0.855 (1) | 0.724 (1) | 0.983 (1) | 0.973 (1) |
Method | ERGAS | SSIM | SM | MI | QNR |
---|---|---|---|---|---|
HSV | 43.491 | 0.223 | 0.372 | 0.786 | 0.182 |
GS | 29.802 | 0.594 | 0.533 | 0.585 | 0.927 |
WT | 18.026 (3) | 0.759 | 0.584 | 0.999 | 0.960 (3) |
Laplacian | 45.767 | 0.182 | 0.338 | 0.260 | 0.377 |
DDF | 29.273 | 0.665 | 0.531 | 0.783 | 0.881 |
DO | 19.598 | 0.788 (2) | 0.680 (2) | 1.014 (3) | 0.963 (2) |
DW | 17.912 (2) | 0.778 (3) | 0.611 (3) | 1.065 (2) | 0.948 |
DWO | 16.822 (1) | 0.827 (1) | 0.698 (1) | 1.085 (1) | 0.974 (1) |
Method | ERGAS | SSIM | SM | MI | QNR |
---|---|---|---|---|---|
HSV | 35.844 | 0.199 | 0.241 | 0.927 | 0.216 |
GS | 21.471 | 0.598 | 0.496 | 0.794 | 0.948 |
WT | 12.315 (3) | 0.728 | 0.543 | 1.323 (2) | 0.969 (3) |
Laplacian | 37.318 | 0.171 | 0.212 | 0.257 | 0.531 |
DDF | 20.737 | 0.660 | 0.494 | 0.974 | 0.936 |
DO | 13.451 | 0.737 (3) | 0.669 (2) | 1.199 | 0.968 |
DW | 11.627 (1) | 0.754 (2) | 0.588 (3) | 1.364 (1) | 0.971 (2) |
DWO | 11.697 (2) | 0.786 (1) | 0.683 (1) | 1.315 (3) | 0.978 (1) |
Method | ERGAS | SSIM | SM | MI | QNR |
---|---|---|---|---|---|
HSV | 47.109 | 0.376 | 0.305 | 0.600 | 0.188 |
GS | 32.282 | 0.699 | 0.541 | 0.614 | 0.939 |
WT | 19.345 (3) | 0.815 | 0.558 | 1.047 (3) | 0.971 (2) |
Laplacian | 48.688 | 0.352 | 0.280 | 0.230 | 0.383 |
DDF | 31.281 | 0.791 | 0.540 | 0.879 | 0.901 |
DO | 20.264 | 0.841 (3) | 0.719 (2) | 0.973 | 0.969 (3) |
DW | 18.068 (2) | 0.847 (2) | 0.622 (3) | 1.123 (1) | 0.969 |
DWO | 17.189 (1) | 0.874 (1) | 0.734 (1) | 1.083 (2) | 0.978 (1) |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Li, K.; Cheng, B.; Li, X.; Zhang, X.; Wang, G.; Gao, J.; He, Q.; Gan, Y. An Image Fusion Algorithm for Sustainable Development Goals Satellite-1 Night-Time Light Images Based on Optimized Image Stretching and Dual-Domain Fusion. Remote Sens. 2024, 16, 4298. https://doi.org/10.3390/rs16224298