Guided Image Filtering-Based Pan-Sharpening Method: A Case Study of GaoFen-2 Imagery
Abstract
1. Introduction
2. Guided Image Filtering
2.1. Guided Image Filtering
2.2. Influence of Parameters
3. Proposed Algorithm for GaoFen-2 (GF-2) Datasets
3.1. Problem Formulation and Notations
3.2. Guided Filtering Based Pan-Sharpening
- (i) The original multispectral image is registered and resampled to the same size as the original Pan image.
- (ii) By minimizing the residual sum of squares (Equation (5)), the weights can be estimated. Thereafter, a synthetic low-resolution panchromatic image is obtained with Equation (6).
- (iii) Each resampled MS band is taken as the guidance image to guide the filtering of the low-resolution Pan image, and the filter output is obtained as follows:
- (iv) The pan-sharpening result is obtained by extracting the spatial information from the Pan image and injecting it into the resampled MS image according to the weights. This process is formulated in Equations (8) and (9):
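The four steps above can be sketched in Python. This is a minimal illustration, not the authors' implementation: the guided filter follows He et al.'s box-filter formulation [23], the function names are ours, and the injection gain in step (iv) is simplified to 1 per band, whereas the paper's Equations (8) and (9) define band-wise gains.

```python
import numpy as np
from scipy.ndimage import uniform_filter, zoom

def guided_filter(guide, src, radius=4, eps=1e-3):
    """He et al.'s guided filter: local linear model src ~ a * guide + b."""
    mean = lambda x: uniform_filter(x, size=2 * radius + 1)
    m_i, m_p = mean(guide), mean(src)
    var_i = mean(guide * guide) - m_i ** 2
    cov_ip = mean(guide * src) - m_i * m_p
    a = cov_ip / (var_i + eps)          # eps regularizes flat regions
    b = m_p - a * m_i
    return mean(a) * guide + mean(b)

def pan_sharpen(ms, pan, radius=4, eps=1e-3):
    """Sketch of steps (i)-(iv); ms is (bands, h, w), pan is (H, W)."""
    n, h, w = ms.shape
    H, W = pan.shape
    # (i) resample the MS image to the Pan grid
    ms_up = np.stack([zoom(band, (H / h, W / w), order=1) for band in ms])
    # (ii) least-squares weights for a synthetic low-resolution Pan
    A = ms_up.reshape(n, -1).T
    wts, *_ = np.linalg.lstsq(A, pan.ravel(), rcond=None)
    pan_low = np.tensordot(wts, ms_up, axes=1)
    # (iii) filter the low-resolution Pan, guided by each MS band
    filtered = np.stack([guided_filter(b, pan_low, radius, eps)
                         for b in ms_up])
    # (iv) inject the extracted spatial detail (unit gain, a simplification)
    return ms_up + (pan - filtered)
```

A fused image would then be obtained as `pan_sharpen(ms, pan)` for co-registered GF-2 tiles with a 1:4 resolution ratio.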
3.3. Effectiveness of the Proposed Method
4. Datasets and Experimental Settings
4.1. GF-2 Datasets
4.2. Methods Considered for Comparison
- (1) Gram-Schmidt Transformation (GS) [30]: The general GS method uses the Blue, Green, Red and NIR bands of an MS image to simulate a low-resolution Pan band according to predefined weights. Thereafter, the GS transformation is applied to the synthetic Pan and low-resolution MS images, using the first band of the former. Finally, the high-resolution Pan image replaces the first band of the GS-transformed bands, and the inverse GS transformation is applied to produce a fused MS image. This method is integrated into the ENVI 5.3 software. In this study, the average of the low-resolution multispectral bands was used as the simulated low-resolution Pan band.
- (2) Adaptive GS method (GSA) [12]: The GSA method follows the same processing procedure as the general GS method, except that GS uses equal weight coefficients for each MS band to obtain the intensity image, whereas GSA employs weight coefficients derived from a regression between the MS bands and the degraded low-resolution Pan image. Both methods use the injection gains given by Equation (10):
- (3) Nearest-neighbor Diffusion-based Pan-Sharpening (NND) [17]: The NND method first downsamples the high-resolution Pan image to the size of the MS image. It then calculates the spectral band contribution vector using linear regression and obtains difference factors from the neighboring superpixels of each pixel in the original Pan image. Finally, it applies a linear mixture model to produce the fused image. Two external parameters, an intensity smoothness factor and a spatial smoothness factor, are set according to the intended application; in this study, their default values were used in all experiments.
- (4) University of New Brunswick method (UNB) [14]: The UNB pan-sharpening method first histogram-equalizes the MS and Pan images. Then, the spectral bands of the MS image that are covered by the Pan band are combined into a new synthetic image using the least-squares technique. Finally, all the equalized MS bands are fused with the synthesized image to obtain a high-resolution multispectral image. This method is integrated into the PCI Geomatica software and was executed here with its default parameters.
- (5) GD method: Zhao et al. [31] proposed a fusion method based on a guided image filter that takes the resampled MS image as the guidance image and the original Pan image as the input image in the filtering process, yielding a filtered image that carries the related spatial information. The spatial details of the original Pan image are then extracted and injected into each MS band according to the weight defined by Equation (11) [12] to obtain the fused image. Equation (11) gives the optimal coefficient, which indicates how much spatial detail should be injected into the corresponding MS band:
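The GSA-style regression weights and covariance-based injection gains referred to in Equations (10) and (11) can be sketched as follows. This is our own illustrative reading of the idea (weights from a least-squares fit of the MS bands to the degraded Pan; gains as the covariance of each band with the intensity image over the intensity variance), not the exact formulation used in the paper.

```python
import numpy as np

def gsa_weights_and_gains(ms_low, pan_low):
    """ms_low: (bands, h, w) MS image; pan_low: (h, w) degraded Pan.
    Returns regression weights w and per-band injection gains g."""
    n = ms_low.shape[0]
    A = ms_low.reshape(n, -1).T
    # weights: least-squares fit of MS bands to the degraded Pan
    w, *_ = np.linalg.lstsq(A, pan_low.ravel(), rcond=None)
    intensity = np.tensordot(w, ms_low, axes=1)
    # gains: g_k = cov(MS_k, I) / var(I)
    gains = np.array([np.cov(b.ravel(), intensity.ravel())[0, 1]
                      for b in ms_low]) / np.var(intensity, ddof=1)
    return w, gains
```

With exact linear data the regression recovers the mixing weights; with real imagery it yields the best least-squares approximation of the Pan band.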
4.3. Evaluation Methods
- (1) Entropy measures the spatial information contained in a fused image: the higher the entropy, the richer the spatial information in the fused image. Entropy is expressed as follows:
- (2) CC [36] measures the correlation between the MS image and the fused image. The value of CC ranges from 0 to 1, and a higher value indicates a better correspondence between the two images; the ideal value is 1. CC is defined as follows:
- (3) UIQI [37] models any distortion as a combination of three factors: loss of correlation, luminance distortion, and contrast distortion. It is suitable for most image evaluations, and its best value is 1. UIQI is given by:
- (4) ERGAS [32] evaluates the overall spectral distortion of the pan-sharpened image: the lower the ERGAS value, the better the spectral quality of the fused image. The best ERGAS value is 0. ERGAS is defined as follows:
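Minimal NumPy versions of the four indices described above might look like the following sketch. These are standard textbook forms, not necessarily the exact implementations used in the experiments; UIQI is computed globally rather than over sliding windows, and the ERGAS resolution ratio is 4 for GF-2 (0.8 m Pan vs. 3.2 m MS).

```python
import numpy as np

def entropy(img, bins=256):
    """Shannon entropy of the grey-level histogram, in bits."""
    counts, _ = np.histogram(img, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

def cc(x, y):
    """Correlation coefficient between two bands."""
    return float(np.corrcoef(x.ravel(), y.ravel())[0, 1])

def uiqi(x, y):
    """Universal Image Quality Index (Wang & Bovik), global form."""
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return float(4 * cov * mx * my / ((vx + vy) * (mx**2 + my**2)))

def ergas(ref, fused, ratio=4):
    """ERGAS; ref and fused are (bands, h, w), ratio = MS/Pan pixel-size ratio."""
    rmse = np.sqrt(((ref - fused) ** 2).mean(axis=(1, 2)))
    mu = ref.mean(axis=(1, 2))
    return float(100.0 / ratio * np.sqrt(((rmse / mu) ** 2).mean()))
```

As sanity checks, `cc` and `uiqi` equal 1 for identical images, `ergas` equals 0 when fused matches the reference, and the entropy of a constant image is 0.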
5. Results and Discussion
5.1. Analysis of the Influence of Parameters
5.1.1. Parameter Influences in the Guided Filter
5.1.2. The Influence of the Window Radius for Calculating Weights
5.2. Comparison of Different Pan-Sharpening Approaches
5.3. Computational Complexity
6. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
- Zhang, Y.; Mishra, R.K. A review and comparison of commercially available pan-sharpening techniques for high resolution satellite image fusion. In Proceedings of the Geoscience and Remote Sensing Symposium, Munich, Germany, 22–27 July 2012; pp. 182–185.
- Dong, J.; Zhuang, D.; Huang, Y.; Fu, J. Advances in multi-sensor data fusion: Algorithms and applications. Sensors 2009, 9, 7771–7784.
- Ehlers, M.; Klonus, S.; Johan Åstrand, P.; Rosso, P. Multi-sensor image fusion for pansharpening in remote sensing. Int. J. Image Data Fusion 2010, 1, 25–45.
- Xu, Q.; Li, B.; Zhang, Y.; Ding, L. High-fidelity component substitution pansharpening by the fitting of substitution data. IEEE Trans. Geosci. Remote Sens. 2014, 52, 7380–7392.
- Gillespie, A.R.; Kahle, A.B.; Walker, R.E. Color enhancement of highly correlated images. II. Channel ratio and chromaticity transformation techniques. Remote Sens. Environ. 1987, 22, 343–365.
- Zhang, Y. A new merging method and its spectral and spatial effects. Int. J. Remote Sens. 1999, 20, 2003–2014.
- Murga, J.N.D.; Otazu, X.; Fors, O.; Arbiol, R. Multiresolution-based image fusion with additive wavelet decomposition. IEEE Trans. Geosci. Remote Sens. 1999, 37, 1204–1211.
- Tu, T.-M.; Huang, P.S.; Hung, C.-L.; Chang, C.-P. A fast intensity–hue–saturation fusion technique with spectral adjustment for IKONOS imagery. IEEE Geosci. Remote Sens. Lett. 2004, 1, 309–312.
- Dou, W.; Chen, Y.; Li, X.; Sui, D.Z. A general framework for component substitution image fusion: An implementation using the fast image fusion method. Comput. Geosci. 2007, 33, 219–228.
- Aiazzi, B.; Baronti, S.; Selva, M. Improving component substitution pansharpening through multivariate regression of MS + Pan data. IEEE Trans. Geosci. Remote Sens. 2007, 45, 3230–3239.
- Vivone, G.; Alparone, L.; Chanussot, J. A critical comparison among pansharpening algorithms. IEEE Trans. Geosci. Remote Sens. 2015, 53, 2565–2585.
- Zhang, Y. Understanding image fusion. Photogramm. Eng. Remote Sens. 2004, 70, 657–661.
- Pohl, C.; Van Genderen, J.L. Multisensor image fusion in remote sensing: Concepts, methods and applications. Int. J. Remote Sens. 1998, 19, 823–854.
- Xie, B.; Zhang, H.; Huang, B. Revealing implicit assumptions of the component substitution pansharpening methods. Remote Sens. 2017, 9, 443.
- Fryskowska, A.; Wojtkowska, M.; Delis, P.; Grochala, A. Some aspects of satellite imagery integration from EROS B and Landsat 8. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B7, 647–652.
- Santurri, L.; Carlà, R.; Fiorucci, F.; Aiazzi, B.; Baronti, S.; Guzzetti, F. Assessment of very high resolution satellite data fusion techniques for landslide recognition. In Proceedings of the ISPRS Centenary Symposium, Vienna, Austria, 5–7 July 2010; Volume 38, pp. 492–497.
- Sun, W.; Chen, B.; Messinger, D.W. Nearest-neighbor diffusion-based pan-sharpening algorithm for spectral images. Opt. Eng. 2014, 53, 013107.
- Hinton, G.E.; Salakhutdinov, R.R. Reducing the dimensionality of data with neural networks. Science 2006, 313, 504–507.
- Liu, Y.; Chen, X.; Peng, H.; Wang, Z. Multi-focus image fusion with a deep convolutional neural network. Inf. Fusion 2017, 36, 191–207.
- Jiang, W.; Baker, M.L.; Wu, Q.; Bajaj, C.; Chiu, W. Applications of a bilateral denoising filter in biological electron microscopy. J. Struct. Biol. 2003, 144, 114–122.
- Fukunaga, K.; Hostetler, L.D. The estimation of the gradient of a density function, with applications in pattern recognition. IEEE Trans. Inf. Theory 1975, 21, 32–40.
- Cheng, Y.Z. Mean shift, mode seeking, and clustering. IEEE Trans. Pattern Anal. Mach. Intell. 1995, 17, 790–799.
- He, K.; Sun, J.; Tang, X. Guided image filtering. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 1397–1409.
- Levin, A.; Lischinski, D.; Weiss, Y. A closed form solution to natural image matting. IEEE Trans. Pattern Anal. Mach. Intell. 2006, 30, 228–242.
- Durand, F.; Dorsey, J. Fast bilateral filtering for the display of high-dynamic-range images. ACM Trans. Graph. 2002, 21, 257–266.
- Petschnigg, G.; Szeliski, R.; Agrawala, M.; Cohen, M.; Hoppe, H.; Toyama, K. Digital photography with flash and no-flash image pairs. ACM Trans. Graph. 2004, 23, 664–672.
- He, K.; Sun, J.; Tang, X. Single image haze removal using dark channel prior. IEEE Trans. Pattern Anal. Mach. Intell. 2011, 33, 2341–2353.
- Li, S.; Kang, X.; Hu, J. Image fusion with guided filtering. IEEE Trans. Image Process. 2013, 22, 2864–2875.
- Draper, N.; Smith, H. Applied Regression Analysis, 2nd ed.; John Wiley: New York, NY, USA, 1981.
- Laben, C.A.; Brower, B.V. Process for Enhancing the Spatial Resolution of Multispectral Imagery Using Pan-Sharpening. U.S. Patent 6011875 A, 4 January 2000.
- Zhao, W.; Dai, Q.; Zheng, Y.; Wang, L. A new pansharpen method based on guided image filtering: A case study over Gaofen-2 imagery. In Proceedings of the IGARSS IEEE International Geoscience and Remote Sensing Symposium, Beijing, China, 10–15 July 2016; pp. 3766–3769.
- Wald, L.; Ranchin, T.; Mangolini, M. Fusion of satellite images of different spatial resolutions: Assessing the quality of resulting images. Photogramm. Eng. Remote Sens. 1997, 63, 691–699.
- Klonus, S.; Ehlers, M. Image fusion using the Ehlers spectral characteristics preserving algorithm. GIS Remote Sens. 2007, 44, 93–116.
- Bovik, A.; Wang, Z. A universal image quality index. IEEE Signal Process. Lett. 2002, 9, 81–84.
- Ranchin, T.; Wald, L. Fusion of high spatial and spectral resolution images: The ARSIS concept and its implementation. Photogramm. Eng. Remote Sens. 2000, 66, 49–61.
- Yuhas, R.H.; Goetz, A.F.H.; Boardman, J.W. Discrimination among semi-arid landscape endmembers using the Spectral Angle Mapper (SAM) algorithm. In Proceedings of the Summaries 3rd Annual JPL Airborne Geoscience Workshop, Pasadena, CA, USA, 1–5 June 1992; pp. 147–149.
- Alparone, L.; Baronti, S.; Garzelli, A.; Nencini, F. A global quality measurement of pan-sharpened multispectral imagery. IEEE Geosci. Remote Sens. Lett. 2004, 1, 313–317.
- Boser, B.; Guyon, I.; Vapnik, V. A training algorithm for optimal margin classifiers. In Proceedings of the 5th Annual Workshop on Computational Learning Theory, Pittsburgh, PA, USA, 27–29 July 1992; pp. 144–152.
Property | Value
---|---
Spatial resolution | MS: 3.2 m; Pan: 0.8 m
Spectral range | Blue: 450–520 nm; Green: 520–590 nm; Red: 630–690 nm; NIR: 770–890 nm; Pan: 450–900 nm
Image locations | Guangzhou
Land cover types | Urban, rural, water body, cropland, forest, concrete buildings, etc.
Image size | ① MS: 250 × 250, Pan: 1000 × 1000; ② MS: 1250 × 1250, Pan: 5000 × 5000; ③ MS: 1250 × 1250, Pan: 5000 × 5000; ④ MS: 1250 × 1250, Pan: 5000 × 5000
Method | Entropy | UIQI | CC | ERGAS
---|---|---|---|---
MS | 7.189 | – | – | –
GS | 6.877 | 0.848 | 0.878 | 22.504
NND | 6.863 | 0.779 | 0.881 | 36.835
UNB | 6.685 | 0.824 | 0.881 | 26.102
GSA | 6.912 | 0.893 | 0.888 | 21.001
GD | 6.891 | 0.878 | 0.902 | 25.731
Proposed | 7.156 | 0.959 | 0.962 | 14.150

Method | Entropy | UIQI | CC | ERGAS
---|---|---|---|---
MS | 4.113 | – | – | –
GS | 3.978 | 0.608 | 0.633 | 31.796
NND | 3.997 | 0.393 | 0.593 | 70.754
UNB | 3.802 | 0.606 | 0.661 | 40.418
GSA | 3.916 | 0.700 | 0.757 | 39.723
GD | 3.805 | 0.538 | 0.598 | 44.323
Proposed | 3.933 | 0.726 | 0.790 | 39.589

Method | Entropy | UIQI | CC | ERGAS
---|---|---|---|---
MS | 6.709 | – | – | –
GS | 6.642 | 0.855 | 0.859 | 14.697
NND | 6.765 | 0.661 | 0.766 | 40.106
UNB | 6.464 | 0.828 | 0.844 | 18.575
GSA | 6.625 | 0.908 | 0.912 | 15.126
GD | 6.658 | 0.777 | 0.841 | 29.136
Proposed | 6.563 | 0.905 | 0.921 | 16.460

Method | Entropy | UIQI | CC | ERGAS
---|---|---|---|---
MS | 5.882 | – | – | –
GS | 6.749 | 0.663 | 0.732 | 56.697
NND | 6.578 | 0.727 | 0.745 | 28.991
UNB | 6.482 | 0.688 | 0.752 | 56.064
GSA | 6.588 | 0.720 | 0.792 | 55.977
GD | 6.667 | 0.675 | 0.753 | 59.855
Proposed | 6.504 | 0.856 | 0.896 | 34.666

Method | GS | NND | UNB | GSA | GD | Proposed
---|---|---|---|---|---|---
OA (%) | 74.08 | 73.91 | 72.50 | 74.77 | 74.77 | 76.04
Kappa | 0.7029 | 0.7009 | 0.6844 | 0.7107 | 0.7103 | 0.7256
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Zheng, Y.; Dai, Q.; Tu, Z.; Wang, L. Guided Image Filtering-Based Pan-Sharpening Method: A Case Study of GaoFen-2 Imagery. ISPRS Int. J. Geo-Inf. 2017, 6, 404. https://doi.org/10.3390/ijgi6120404