Pansharpening of WorldView-2 Data via Graph Regularized Sparse Coding and Adaptive Coupled Dictionary
Abstract
1. Introduction
- (1)
- Considering the degree of correlation among the MS channels and the PAN channel, the PS process of the WorldView-2 data is regarded as a multitask problem. The first task is to process the adjacent MS channels, i.e., green, yellow, red, and red edge, which have high correlation to the PAN band and lie within the wavelength range well covered by the PAN image. The second task is to process a single MS channel, i.e., the blue band, which is partially outside the wavelength range covered by the PAN image and has low correlation to the PAN image. The third task is to process the MS channels, i.e., coastal, NIR1, and NIR2, outside the wavelength range covered by the PAN image.
- (2)
- To acquire precise sparse representations of the MS image patches, the GRSC algorithm is used in the GRSC-ACD method by exploiting the local manifold structure that describes the spatial similarity of the image patches. In each task, the LR MS channels are tiled into image patches, which make up an image patch set. Then, the image patch set is clustered into several subsets using the K-means algorithm so that the structural similarities of the image patches are further strengthened. Finally, each subset is sparsely represented by the GRSC algorithm. The accurate sparse representations contribute to a high-quality reconstruction of the HR MS image.
- (3)
- Adaptive coupled dictionaries are constructed for the different PS tasks. For the first task, a coupled dictionary learned from the PAN image and its degraded version is used to sparsely represent the MS image patches. For the second task, to effectively represent the single blue band, the PAN image and the reconstructed green band, which has high correlation to the blue band, are selected as the image dataset to train the coupled dictionary. For the third PS task, the reconstructed blue band, which has high correlation to the coastal band, is selected as the image dataset to learn the adaptive coupled dictionary for the coastal band. Meanwhile, the reconstructed red edge band is taken as the image dataset to learn the adaptive coupled dictionary for sharpening the NIR1 and NIR2 bands.
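The patch-based workflow in these contributions relies on tiling each LR MS band into overlapping patches and later reassembling the reconstructed patches into a full band. A minimal NumPy sketch of this tiling and overlap-add reassembly (the helper names and patch/step sizes are illustrative assumptions, not the authors' code):

```python
import numpy as np

def extract_patches(img, patch=8, step=4):
    """Tile a single-band image into overlapping patches (one column each)."""
    H, W = img.shape
    cols, coords = [], []
    for i in range(0, H - patch + 1, step):
        for j in range(0, W - patch + 1, step):
            cols.append(img[i:i + patch, j:j + patch].ravel())
            coords.append((i, j))
    return np.stack(cols, axis=1), coords   # (patch*patch, n_patches)

def assemble_patches(cols, coords, shape, patch=8):
    """Average overlapping patches back into an image (overlap-add)."""
    out = np.zeros(shape)
    weight = np.zeros(shape)
    for k, (i, j) in enumerate(coords):
        out[i:i + patch, j:j + patch] += cols[:, k].reshape(patch, patch)
        weight[i:i + patch, j:j + patch] += 1.0
    return out / np.maximum(weight, 1e-12)
```

With step smaller than patch, every pixel is covered by several patches, and averaging the overlaps suppresses block artifacts in the reconstructed HR band.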
2. Related Works
2.1. SR-Based PS Methods
2.2. Sparse Representation
2.3. GRSC
3. Multitask Pansharpening Method: GRSC-ACD
3.1. Description of Multitask Pansharpening
- (1)
- First task: The correlation coefficients between the PAN channel and the MS channels green, yellow, red, and red edge are listed in Table 1 and highlighted in red. The green, yellow, red, and red edge bands have high correlation to the PAN image; also, these bands are almost entirely within the wavelength range covered by the PAN image. Hence, in the first task, these MS channels are sharpened together. For this task, the HR PAN image and its degraded version are used to learn the coupled dictionary pair.
- (2)
- Second task: In Figure 1, the blue band is mostly within the wavelength range covered by the PAN image. However, it has low correlation to the PAN image. Hence, the second task deals specifically with the blue band. The correlation coefficient highlighted in blue shows that the blue band and the green band are highly correlated. Hence, the PAN image and the reconstructed green band are used as the dataset to learn the adaptive coupled dictionary for this task.
- (3)
- Third task: The remaining MS channels, i.e., coastal, NIR1, and NIR2, are almost entirely outside the wavelength range covered by the PAN image, as shown in Figure 1. In this task, the three MS channels are divided into two groups: (1) the coastal band; (2) NIR1 and NIR2. For these two groups, different reconstructed HR MS bands are chosen to learn the adaptive coupled dictionaries. From the correlation coefficient highlighted in purple, it can be concluded that the coastal band is highly correlated with the blue band. Hence, the reconstructed blue band is used to learn the coupled dictionary for sharpening the coastal band. The correlation coefficients highlighted in green show the high degree of correlation among the red edge, NIR1, and NIR2 bands. Hence, for sharpening the NIR1 and NIR2 bands, we use the reconstructed red edge band to train the coupled dictionary.
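The task split above is driven by inter-band correlation coefficients like those in Table 1. A small sketch of how such a correlation matrix and a threshold-based grouping could be computed (the 0.9 threshold and the function names are illustrative assumptions, not values taken from the paper):

```python
import numpy as np

def band_correlation(bands):
    """Pearson correlation matrix of flattened bands.
    `bands` maps band name -> 2-D array of the same shape."""
    names = list(bands)
    X = np.stack([bands[n].ravel() for n in names])
    return names, np.corrcoef(X)

def assign_tasks(names, C, pan="PAN", thresh=0.9):
    """Split MS bands by their correlation to the PAN band
    (hypothetical threshold rule mirroring the grouping in Table 1)."""
    p = names.index(pan)
    high = [n for k, n in enumerate(names) if k != p and C[p, k] >= thresh]
    low = [n for k, n in enumerate(names) if k != p and C[p, k] < thresh]
    return high, low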
3.2. Pansharpening Algorithm via GRSC for Each Task
- (1)
- Constructing image patch sets with similar geometrical structure: To acquire precise sparse representations of the image patches, the patch set is first separated into several subsets with the K-means clustering algorithm, one subset per cluster. All the image patches in a subset share the same or similar local geometrical structures.
- (2)
- Sparse coding of the subsets via GRSC: The proposed method rests on the assumption that an LR MS image patch and its corresponding HR MS image patch share the same sparse representation over the coupled dictionary pair, consisting of an LR dictionary and an HR dictionary; the dictionary construction method is introduced in the following subsection. Following graph regularized sparse coding for image representation, we first construct the weighted graph matrix and the degree matrix for each subset, from which the graph Laplacian matrix is defined. The sparse representation of the subset is estimated by minimizing an objective function that combines the reconstruction error over the dictionary, a sparsity penalty, and the graph regularization term. By combining (7) and (8), problem (5) can be rewritten as problem (9). Based on the feature-sign search algorithm proposed in [59], problem (9) can be solved efficiently to obtain the optimal sparse coefficient matrix.
- (3)
- Reconstructing the HR MS channels for each task: The estimated sparse coefficient matrix for each subset is obtained by solving the problem in (9). Then, the HR MS image patch subset corresponding to it is calculated through Formula (10).
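The per-subset sparse coding step can be sketched numerically. The paper solves problem (9) with the feature-sign search algorithm [59]; the sketch below instead substitutes a simpler ISTA-style proximal gradient loop for the same graph regularized objective, so it illustrates the formulation rather than the authors' actual solver:

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding operator (proximal map of the l1 penalty)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def grsc(Y, D, W, lam=0.1, beta=0.1, n_iter=200):
    """Graph regularized sparse coding by ISTA:
        min_A ||Y - D A||_F^2 + beta * tr(A L A^T) + lam * ||A||_1,
    where L = Dg - W is the Laplacian of the patch-similarity graph.
    (Illustrative substitute for the feature-sign search solver.)"""
    Dg = np.diag(W.sum(axis=1))
    L = Dg - W
    A = np.zeros((D.shape[1], Y.shape[1]))
    # step size from the Lipschitz constant of the smooth part
    step = 1.0 / (2 * (np.linalg.norm(D, 2) ** 2
                       + beta * np.linalg.norm(L, 2)) + 1e-12)
    for _ in range(n_iter):
        grad = 2 * D.T @ (D @ A - Y) + 2 * beta * A @ L
        A = soft(A - step * grad, step * lam)
    return A
```

The graph term tr(A L Aᵀ) pulls the codes of similar patches within a subset toward each other, which is what makes the representations more accurate than independent patch-wise sparse coding.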
3.3. Dictionary Learning
- (1)
- First task: This task processes the MS channels: green, red, yellow, and red edge. These MS bands are within the wavelength range covered by the PAN image and show high correlation to the PAN image. Hence, the HR PAN image and its degraded version are suitable to learn the coupled dictionary pair for the first task.
- (2)
- Second task: This task only processes the blue band, which is partially outside the wavelength range covered by the PAN image and has low correlation to the PAN image. Thus, using only the PAN image to learn the coupled dictionary is not suitable for this task. To effectively represent the image patch subsets, the PAN image and the reconstructed HR green band, which has high correlation to the blue band, are selected to learn the coupled dictionary.
- (3)
- Third task: This task sharpens the MS channels that are almost outside the wavelength range covered by the PAN image, i.e., coastal, NIR1, and NIR2. As shown in Table 1, the coastal band has very low correlation to the NIR1 and NIR2 bands. Hence, this task is divided into two subtasks. One subtask processes the coastal spectral band. For this subtask, the reconstructed blue band is used to learn the coupled dictionary. Another subtask processes the NIR1 and NIR2 bands. For this subtask, the reconstructed red edge band is used to learn the coupled dictionary.
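One simple way to realize a coupled dictionary pair of this kind is to stack co-located LR/HR patch vectors and learn a single joint dictionary, so that both halves share one sparse code. The sketch below uses method-of-optimal-directions updates with k-sparse thresholding as a stand-in for whatever learning algorithm the paper actually employs; all names and parameters are illustrative assumptions:

```python
import numpy as np

def learn_coupled_dictionary(P_lr, P_hr, n_atoms=32, n_iter=20, k=4, seed=0):
    """Jointly learn (D_l, D_h) from co-located LR/HR patch pairs so that
    both share one sparse code. Alternates k-sparse coding and a
    method-of-optimal-directions dictionary update (sketch only)."""
    rng = np.random.default_rng(seed)
    X = np.vstack([P_lr, P_hr])              # stacked joint patch features
    D = rng.standard_normal((X.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(n_iter):
        # sparse coding: keep the k largest correlations per patch
        A = D.T @ X
        thr = -np.sort(-np.abs(A), axis=0)[k - 1]
        A[np.abs(A) < thr] = 0.0
        # dictionary update: D = X A^+ (method of optimal directions)
        D = X @ np.linalg.pinv(A)
        D /= np.linalg.norm(D, axis=0) + 1e-12
    d = P_lr.shape[0]
    return D[:d], D[d:]                      # split into (D_l, D_h)
```

Because each atom is split into an LR half and an HR half after training, a code estimated from LR patches over D_l can be applied directly to D_h to synthesize the HR patches, which is exactly the shared-representation assumption behind the method.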
Algorithm 1. The GRSC-ACD Pansharpening Method.
Input: LR MS image, PAN image
Initialization: Set the regularization and clustering parameters
1: Split the PS process into multiple tasks according to the relative spectral response shown in Figure 1 and the channel correlation matrix listed in Table 1
2: for each task do
3:   Separate all the MS bands of the task into image patches and form an image patch set
4:   Generate the subsets using the K-means clustering algorithm
5:   for each subset do
6:     Learn the LR dictionary and the HR dictionary
7:     Compute the sparse coefficient matrix by solving (9)
8:     Compute the HR image patch subset through (10)
9:   end for
10:  Generate the HR MS bands of the task
11: end for
Output: Target HR MS image.
4. Experiments
4.1. Experimental Dataset and Comparison Methods
4.2. Quality Assessment Indexes
4.3. The Choice of Tuning Parameters
4.3.1. Regularization Parameters
4.3.2. Patch Size and Overlapping Size
4.4. Experimental Results on Degraded Images
4.5. Analysis of Difference Images
4.6. Experimental Results of Real Images
4.7. Algorithm Execution Time Analysis
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Alparone, L.; Aiazzi, B.; Baronti, S.; Garzelli, A. Remote Sensing Image Fusion; CRC Press: Boca Raton, FL, USA, 2015. [Google Scholar]
- Ghassemian, H. A Review of Remote Sensing Image Fusion Methods. Inf. Fusion 2016, 32, 75–89. [Google Scholar] [CrossRef]
- Meng, X.; Shen, H.; Li, H.; Zhang, L.; Fu, R. Review of the Pansharpening Methods for Remote Sensing Images Based on the Idea of Meta-analysis: Practical Discussion and Challenges. Inf. Fusion 2019, 46, 102–113. [Google Scholar] [CrossRef]
- Tu, T.M.; Huang, P.S.; Hung, C.L.; Chang, C.P. A Fast Intensity-hue-saturation Fusion Technique with Spectral Adjustment for IKONOS Imagery. IEEE Geosci. Remote Sens. Lett. 2004, 1, 309–312. [Google Scholar] [CrossRef]
- Laben, C.A.; Brower, B.V. Process for Enhancing the Spatial Resolution of Multispectral Imagery Using Pan-Sharpening. U.S. Patent 6011875, 4 January 2000. [Google Scholar]
- Shah, V.P.; Younan, N.H.; King, R.L. An Efficient Pan-sharpening Method via a Combined Adaptive PCA Approach and Contourlets. IEEE Trans. Geosci. Remote Sens. 2008, 46, 1323–1335. [Google Scholar]
- Aiazzi, B.; Baronti, S.; Selva, M. Improving Component Substitution Pansharpening through Multivariate Regression of MS +Pan Data. IEEE Trans. Geosci. Remote Sens. 2007, 45, 3230–3239. [Google Scholar] [CrossRef]
- Choi, J.; Yu, K.; Kim, Y. A New Adaptive Component-substitution-based Satellite Image Fusion by Using Partial Replacement. IEEE Trans. Geosci. Remote Sens. 2011, 49, 295–309. [Google Scholar] [CrossRef]
- Xu, Q.; Li, B.; Zhang, Y.; Ding, L. High-fidelity Component Substitution Pansharpening by the Fitting of Substitution Data. IEEE Trans. Geosci. Remote Sens. 2014, 52, 7380–7392. [Google Scholar]
- Rahmani, S.; Strait, M.; Merkurjev, D.; Moeller, M.; Wittman, T. An Adaptive IHS Pan-sharpening Method. IEEE Geosci. Remote Sens. Lett. 2010, 7, 746–750. [Google Scholar] [CrossRef] [Green Version]
- Li, H.; Manjunath, B.S.; Mitra, S.K. Multisensor Image Fusion Using the Wavelet Transform. Gr. Models Image Process. 1995, 57, 235–245. [Google Scholar] [CrossRef]
- Garzelli, A.; Nencini, F. Panchromatic Sharpening of Remote Sensing Images Using a Multiscale Kalman Filter. Pattern Recognit. 2007, 40, 3568–3577. [Google Scholar] [CrossRef]
- Otazu, X.; González-Audícana, M.; Fors, O.; Núnez, J. Introduction of Sensor Spectral Response into Image Fusion Methods. Application to Wavelet-based Methods. IEEE Trans. Geosci. Remote Sens. 2005, 43, 2376–2385. [Google Scholar] [CrossRef] [Green Version]
- Aiazzi, B.; Alparone, L.; Baronti, S.; Garzelli, A. Context-driven Fusion of High Spatial and Spectral Resolution Data Based on Oversampled Multiresolution Analysis. IEEE Trans. Geosci. Remote Sens. 2002, 40, 2300–2312. [Google Scholar] [CrossRef]
- Burt, P.J.; Adelson, E.H. The Laplacian Pyramid as a Compact Image Code. IEEE Trans. Commun. 1983, 4, 532–540. [Google Scholar] [CrossRef]
- Do, M.N.; Vetterli, M. The Contourlet Transform: An Efficient Directional Multiresolution Image Representation. IEEE Trans. Image Process. 2005, 14, 2091–2106. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Ghahremani, M.; Ghassemian, H. Remote-sensing Image Fusion Based on Curvelets and ICA. Int. J. Remote Sens. 2015, 36, 4131–4143. [Google Scholar] [CrossRef]
- Vivone, G.; Alparone, L.; Chanussot, J.; Mura, M.D.; Garzelli, A.; Licciardi, G.A.; Restaino, R.; Wald, L. A Critical Comparison among Pansharpening Algorithms. IEEE Trans. Geosci. Remote Sens. 2015, 53, 2565–2586. [Google Scholar] [CrossRef]
- Vivone, G.; Restaino, R.; Mura, M.D.; Licciardi, G.; Chanussot, J. Contrast and Error-based Fusion Schemes for Multispectral Image Pansharpening. IEEE Geosci. Remote Sens. Lett. 2014, 11, 930–934. [Google Scholar] [CrossRef] [Green Version]
- Vivone, G.; Restaino, R.; Chanussot, J. Full Scale Regression-based Injection Coefficients for Panchromatic Sharpening. IEEE Trans. Image Process. 2018, 27, 3418–3431. [Google Scholar] [CrossRef]
- Nunez, J.; Otazu, X.; Fors, O.I.; Prades, A.; Pala, V.; Arbiol, R. Multiresolution-based Image Fusion with Additive Wavelet Decomposition. IEEE Trans. Geosci. Remote Sens. 1999, 37, 1204–1212. [Google Scholar] [CrossRef] [Green Version]
- Chen, F.; Qin, F.; Peng, G.; Chen, S. Fusion of Remote Sensing Images Using Improved ICA Mergers Based on Wavelet Decomposition. Proc. Eng. 2012, 29, 2938–2943. [Google Scholar] [CrossRef] [Green Version]
- Ballester, C.; Caselles, V.; Igual, L.; Verdera, J.; Rougé, B. A Variational Model for P+XS Image Fusion. Int. J. Comput. Vis. 2006, 69, 43–58. [Google Scholar] [CrossRef]
- Liu, P.; Xiao, L.; Zhang, J.; Naz, B. Spatial-Hessian-Feature-Guided Variational Model for Pansharpening. IEEE Trans. Geosci. Remote Sens. 2016, 54, 2235–2253. [Google Scholar] [CrossRef]
- Zhang, L.; Shen, H.; Gong, W.; Zhang, H. Adjustable Model-based Fusion Method for Multispectral and Panchromatic Images. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2012, 42, 1693–1704. [Google Scholar] [CrossRef]
- Li, Z.; Leung, H. Fusion of Multispectral and Panchromatic Images Using a Restoration-based Method. IEEE Trans. Geosci. Remote Sens. 2009, 47, 1482–1491. [Google Scholar]
- Fang, F.; Li, F.; Zhang, G.; Shen, C. A Variational Method for Multisource Remote Sensing Image Fusion. Int. J. Remote Sens. 2013, 34, 2470–2486. [Google Scholar] [CrossRef]
- Wang, W.; Liu, H.; Liang, L.; Liu, Q.; Xie, G. A Regularised Model-based Pan-sharpening Method for Remote Sensing Images with Local Dissimilarities. Int. J. Remote Sens. 2019, 40, 3029–3054. [Google Scholar] [CrossRef]
- Molina, R.; Vega, M.; Mateos, J.; Katsaggelos, A.K. Variational Posterior Distribution Approximation in Bayesian Super Resolution Reconstruction of Multispectral Images. Appl. Comput. Harmon. Anal. 2008, 24, 251–267. [Google Scholar] [CrossRef] [Green Version]
- Palsson, F.; Sveinsson, J.R.; Ulfarsson, M.O. A New Pansharpening Algorithm Based on Total Variation. IEEE Geosci. Remote Sens. Lett. 2014, 11, 318–322. [Google Scholar] [CrossRef]
- Duran, J.; Buades, A.; Coll, B.; Sbert, C. A Nonlocal Variational Model for Pansharpening Image Fusion. SIAM J. Imaging Sci. 2014, 7, 761–796. [Google Scholar] [CrossRef]
- Liu, P.; Xiao, L.; Li, T. A Variational Pan-Sharpening Method Based on Spatial Fractional-Order Geometry and Spectral-Spatial Low-Rank Priors. IEEE Trans. Geosci. Remote Sens. 2018, 56, 1788–1802. [Google Scholar] [CrossRef]
- Li, S.; Yang, B. A New Pansharpening Method Using a Compressed Sensing Technique. IEEE Trans. Geosci. Remote Sens. 2011, 49, 738–746. [Google Scholar] [CrossRef]
- Zhu, X.X.; Bamler, R. A Sparse Image Fusion Algorithm with Application to Pan-sharpening. IEEE Trans. Geosci. Remote Sens. 2013, 51, 2827–2836. [Google Scholar] [CrossRef]
- Jiang, C.; Zhang, H.; Shen, H.; Zhang, L. A Practical Compressed Sensing-based Pansharpening Method. IEEE Geosci. Remote Sens. Lett. 2012, 9, 629–633. [Google Scholar] [CrossRef]
- Wang, W.; Jiao, L.; Yang, S. Fusion of Multispectral and Panchromatic Images via Sparse Representation and Local Autoregressive Model. Inf. Fusion 2014, 20, 73–87. [Google Scholar] [CrossRef]
- Li, S.; Yin, H.; Fang, L. Remote Sensing Image Fusion via Sparse Representations Over Learned Dictionaries. IEEE Trans. Geosci. Remote Sens. 2013, 51, 4779–4789. [Google Scholar] [CrossRef]
- Ayas, S.; Gormus, E.T.; Ekinci, M. An Efficient Pan Sharpening via Texture Based Dictionary Learning and Sparse Representation. IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens. 2018, 7, 2448–2460. [Google Scholar] [CrossRef]
- Zhu, X.X.; Grohnfeldt, C.; Bamler, R. Exploiting Joint Sparsity for Pansharpening: The J-SparseFI Algorithm. IEEE Trans. Geosci. Remote Sens. 2016, 54, 2664–2681. [Google Scholar] [CrossRef] [Green Version]
- Jiang, C.; Zhang, H.; Shen, H.; Zhang, L. Two-Step Sparse Coding for the Pan-Sharpening of Remote Sensing Images. IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens. 2014, 7, 1792–1805. [Google Scholar] [CrossRef]
- Guo, M.; Zhang, H.; Li, J.; Zhang, L.; Shen, H. An Online Coupled Dictionary Learning Approach for Remote Sensing Image Fusion. IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens. 2014, 7, 1284–1294. [Google Scholar] [CrossRef]
- Vicinanza, M.R.; Restaino, R.; Vivone, G.; Mura, M.D.; Chanussot, J. A Pansharpening Method Based on the Sparse Representation of Injected Details. IEEE Geosci. Remote Sens. Lett. 2015, 12, 180–184. [Google Scholar] [CrossRef]
- Tian, X.; Chen, Y.; Yang, C.; Gao, X.; Ma, J. A Variational Pansharpening Method Based on Gradient Sparse Representation. IEEE Signal Process. Lett. 2020, 27, 1180–1184. [Google Scholar] [CrossRef]
- Tian, X.; Chen, Y.; Yang, C.; Ma, J. Variational Pansharpening by Exploiting Cartoon-Texture Similarities. IEEE Trans. Geosci. Remote Sens. 2021, 1–16. [Google Scholar] [CrossRef]
- Fei, R.; Zhang, J.; Liu, J.; Du, F.; Chang, P.; Hu, J. Convolutional Sparse Representation of Injected Details for Pansharpening. IEEE Geosci. Remote Sens. Lett. 2019, 16, 1595–1599. [Google Scholar] [CrossRef]
- Yin, H. Sparse Representation Based Pansharpening with Details Injection Model. Signal Process. 2015, 113, 218–227. [Google Scholar] [CrossRef]
- Ghahremani, M.; Ghassemian, H. Remote Sensing Image Fusion Using Ripplet Transform and Compressed Sensing. IEEE Geosci. Remote Sens. Lett. 2015, 12, 502–506. [Google Scholar] [CrossRef]
- Ghahremani, M.; Ghassemian, H.A. Compressed-Sensing-Based Pan-Sharpening Method for Spectral Distortion Reduction. IEEE Trans. Geosci. Remote Sens. 2016, 54, 2194–2206. [Google Scholar] [CrossRef]
- Ghahremani, M.; Liu, Y.; Yuen, P.; Behera, A. Remote Sensing Image Fusion via Compressive Sensing. ISPRS J. Photogramm. Remote Sens. 2019, 152, 34–48. [Google Scholar] [CrossRef] [Green Version]
- Dong, C.; Loy, C.C.; He, K.; Tang, X. Image Super-Resolution Using Deep Convolutional Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2016, 38, 295–307. [Google Scholar] [CrossRef] [Green Version]
- Masi, G.; Cozzolino, D.; Verdoliva, L.; Scarpa, G. Pansharpening by Convolutional Neural Networks. Remote Sens. 2016, 8, 594. [Google Scholar] [CrossRef] [Green Version]
- Yuan, Q.; Wei, Y.; Meng, X.; Shen, H.; Zhang, L. A Multiscale and Multidepth Convolutional Neural Network for Remote Sensing Imagery Pan-Sharpening. IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens. 2018, 11, 978–989. [Google Scholar] [CrossRef] [Green Version]
- Choi, J.; Kim, Y.; Kim, M. S3: A Spectral-Spatial Structure Loss for Pan-Sharpening Networks. IEEE Geosci. Remote Sens. Lett. 2020, 17, 829–833. [Google Scholar] [CrossRef] [Green Version]
- Ma, J.; Yu, W.; Chen, C.; Liang, P.; Guo, X.; Jiang, J. Pan-GAN: An Unsupervised Pan-sharpening Method for Remote Sensing Image Fusion. Inf. Fusion 2020, 62, 110–120. [Google Scholar] [CrossRef]
- Zhang, H.; Ma, J. GTP-PNet: A Residual Learning Network Based on Gradient Transformation Prior for Pansharpening. ISPRS J. Photogramm. Remote Sens. 2021, 172, 223–239. [Google Scholar] [CrossRef]
- Wang, W.; Liu, H.; Liang, L.; Liu, Q. Pan-sharpening of Remote Sensing Images via Graph Regularized Sparse Coding. In Proceedings of the 2018 13th IEEE Conference on Industrial Electronics and Applications (ICIEA) IEEE, Wuhan, China, 31 May–2 June 2018. [Google Scholar]
- Zheng, M.; Bu, J.; Chen, C.; Wang, C.; Zhang, L.; Qiu, G.; Cai, D. Graph Regularized Sparse Coding for Image Representation. IEEE Trans. Image Process. 2011, 20, 1327–1336. [Google Scholar] [CrossRef]
- Elad, M.; Figueiredo, M.A.T.; Ma, Y. On the Role of Sparse and Redundant Representations in Image Processing. Proc. IEEE 2010, 98, 972–982. [Google Scholar] [CrossRef]
- Lee, H.; Battle, A.; Raina, R.; Ng, A.Y. Efficient Sparse Coding Algorithms. Adv. Neural Inf. Process. Syst. 2007, 20, 801–808. [Google Scholar]
- Chavez, P.S., Jr.; Sides, S.C.; Anderson, A. Comparison of Three Different Methods to Merge Multiresolution and Multispectral Data: Landsat TM and SPOT Panchromatic. Photogramm. Eng. Remote Sens. 1991, 57, 295–303. [Google Scholar]
- Garzelli, A.; Nencini, F.; Capobianco, L. Optimal MMSE Pansharpening of Very High Resolution Multispectral Images. IEEE Trans. Geosci. Remote Sens. 2008, 46, 228–236. [Google Scholar] [CrossRef]
- Vivone, G.; Alparone, L.; Garzelli, A.; Lolli, S. Fast Reproducible Pansharpening Based on Instrument and Acquisition Modeling: AWLP Revisited. Remote Sens. 2019, 11, 2315. [Google Scholar] [CrossRef] [Green Version]
- Vivone, G. Robust Band-Dependent Spatial-Detail Approaches for Panchromatic Sharpening. IEEE Trans. Geosci. Remote Sens. 2019, 57, 6421–6433. [Google Scholar] [CrossRef]
- Yuhas, R.H.; Goetz, A.F.H.; Boardman, J.W. Discrimination among Semi-arid Landscape Endmembers Using the Spectral Angle Mapper (SAM) Algorithm. In Proceedings of the Summaries of the Third Annual JPL Airborne Geoscience Workshop, AVIRIS Workshop, Pasadena, CA, USA, 1–5 June 1992; pp. 147–149. [Google Scholar]
- Wald, L. Data Fusion: Definitions and Architectures-Fusion of Images of Different Spatial Resolutions; Presses des Mines: Paris, France, 2002. [Google Scholar]
- Wang, Z.; Bovik, A. A Universal Image Quality Index. IEEE Signal Process. Lett. 2002, 9, 81–84. [Google Scholar] [CrossRef]
- Wang, Z.; Bovik, A.; Sheikh, H.; Simoncelli, E. Image Quality Assessment: From Error Visibility to Structural Similarity. IEEE Trans. Image Process. 2004, 13, 1–13. [Google Scholar] [CrossRef] [Green Version]
- Garzelli, A.; Nencini, F. Hypercomplex Quality Assessment of Multi/Hyper-spectral Images. IEEE Geosci. Remote Sens. Lett. 2009, 6, 662–665. [Google Scholar] [CrossRef]
- Alparone, L.; Aiazzi, B.; Baronti, S.; Garzelli, A.; Nencini, F.; Selva, M. Multispectral and Panchromatic Data Fusion Assessment without Reference. Photogramm. Eng. Remote Sens. 2008, 74, 193–200. [Google Scholar] [CrossRef] [Green Version]
 | PAN | Coastal | Blue | Green | Yellow | Red | Red Edge | NIR1 | NIR2 |
---|---|---|---|---|---|---|---|---|---|
PAN | 1.0000 | 0.7493 | 0.8811 | 0.9629 | 0.9636 | 0.9535 | 0.9493 | 0.8253 | 0.8193 |
Coastal | 0.7493 | 1.0000 | 0.9494 | 0.8317 | 0.7464 | 0.6974 | 0.6337 | 0.5072 | 0.4973 |
Blue | 0.8811 | 0.9494 | 1.0000 | 0.9552 | 0.8897 | 0.8636 | 0.7584 | 0.6020 | 0.5887 |
Green | 0.9629 | 0.8317 | 0.9552 | 1.0000 | 0.9675 | 0.9578 | 0.8679 | 0.7110 | 0.6982 |
Yellow | 0.9636 | 0.7464 | 0.8897 | 0.9675 | 1.0000 | 0.9843 | 0.8825 | 0.6886 | 0.6855 |
Red | 0.9535 | 0.6974 | 0.8636 | 0.9578 | 0.9843 | 1.0000 | 0.8620 | 0.6772 | 0.6682 |
Red Edge | 0.9493 | 0.6337 | 0.7584 | 0.8679 | 0.8825 | 0.8620 | 1.0000 | 0.9357 | 0.9352 |
NIR1 | 0.8253 | 0.5072 | 0.6020 | 0.7110 | 0.6886 | 0.6772 | 0.9357 | 1.0000 | 0.9928 |
NIR2 | 0.8193 | 0.4973 | 0.5887 | 0.6982 | 0.6855 | 0.6682 | 0.9352 | 0.9928 | 1.0000 |
Methods | RMSE | ERGAS | SAM | Q | SSIM | Q2n | Time (s) |
---|---|---|---|---|---|---|---|
EXP | 48.7750 | 5.7132 | 5.0109 | 0.8222 | 0.8441 | 0.8145 | 0.004 |
GS | 41.9282 | 5.0331 | 6.5103 | 0.8735 | 0.9207 | 0.8574 | 0.09 |
HPF | 35.7340 | 4.2492 | 4.9089 | 0.9139 | 0.9306 | 0.9102 | 0.06 |
MTF-GLP-HPM | 31.2260 | 3.6602 | 4.0269 | 0.9289 | 0.9475 | 0.9274 | 0.28 |
PRACS | 36.2372 | 4.3248 | 5.0098 | 0.9102 | 0.9251 | 0.9108 | 0.22 |
BDSD | 48.0075 | 4.6408 | 6.3423 | 0.8961 | 0.9216 | 0.8871 | 0.09 |
RBDSD | 45.6704 | 5.4831 | 6.1062 | 0.8817 | 0.9053 | 0.8680 | 0.19 |
AWLPH | 38.5274 | 4.5192 | 4.7565 | 0.9193 | 0.9331 | 0.9148 | 0.17 |
OCDL | 36.4954 | 4.3003 | 4.8793 | 0.8767 | 0.9252 | 0.8648 | 67.16 |
PN-TSSC | 44.4959 | 4.2806 | 4.9471 | 0.9069 | 0.9278 | 0.9065 | 4.22 |
GRSC | 30.5811 | 3.5347 | 4.9572 | 0.9293 | 0.9401 | 0.9223 | 142.53 |
GRSC-ACD | 28.5203 | 3.3265 | 4.1763 | 0.9353 | 0.9484 | 0.9305 | 147.63 |
Methods | RMSE | ERGAS | SAM | Q | SSIM | Q2n | Time (s) |
---|---|---|---|---|---|---|---|
EXP | 66.1508 | 6.9654 | 5.2397 | 0.7948 | 0.7749 | 0.7931 | 0.004 |
GS | 48.1403 | 5.1607 | 5.6282 | 0.8957 | 0.9245 | 0.8876 | 0.08 |
HPF | 39.4451 | 4.2435 | 4.3694 | 0.9340 | 0.9336 | 0.9322 | 0.07 |
MTF-GLP-HPM | 36.1436 | 3.8433 | 4.2311 | 0.9451 | 0.9463 | 0.9437 | 0.29 |
PRACS | 34.8559 | 3.9944 | 4.6390 | 0.9429 | 0.9436 | 0.9474 | 0.23 |
BDSD | 44.9257 | 4.6012 | 4.9426 | 0.9306 | 0.9301 | 0.9297 | 0.09 |
RBDSD | 38.6037 | 4.0112 | 4.0298 | 0.9468 | 0.9421 | 0.9454 | 0.18 |
AWLPH | 40.7509 | 4.3785 | 4.3772 | 0.9498 | 0.9432 | 0.9492 | 0.15 |
OCDL | 44.6863 | 4.7780 | 4.7649 | 0.9006 | 0.9181 | 0.8987 | 54.14 |
PN-TSSC | 44.4959 | 2.2806 | 4.9471 | 0.9069 | 0.9278 | 0.9065 | 4.98 |
GRSC | 28.8187 | 3.0460 | 3.9993 | 0.9656 | 0.9587 | 0.9615 | 139.24 |
GRSC-ACD | 27.5646 | 2.9205 | 3.6299 | 0.9659 | 0.9625 | 0.9672 | 151.27 |
Methods | Real Dataset 1: Dλ | Ds | QNR | Time (s) | Real Dataset 2: Dλ | Ds | QNR | Time (s) |
---|---|---|---|---|---|---|---|---|
EXP | 0.0000 | 0.1403 | 0.8597 | 0.005 | 0.0000 | 0.1263 | 0.8737 | 0.005 |
GS | 0.0148 | 0.0785 | 0.9078 | 0.23 | 0.0122 | 0.0722 | 0.9165 | 0.21 |
HPF | 0.0312 | 0.0530 | 0.9175 | 0.21 | 0.0360 | 0.0505 | 0.9153 | 0.18 |
MTF-GLP-HPM | 0.0184 | 0.0489 | 0.9335 | 0.43 | 0.0265 | 0.0495 | 0.9253 | 0.42 |
PRACS | 0.0139 | 0.0539 | 0.9330 | 0.80 | 0.0133 | 0.0530 | 0.9344 | 0.74 |
BDSD | 0.0145 | 0.0725 | 0.9141 | 0.13 | 0.0183 | 0.0472 | 0.9354 | 0.15 |
RBDSD | 0.0081 | 0.0592 | 0.9332 | 0.49 | 0.0144 | 0.0597 | 0.9268 | 0.34 |
AWLPH | 0.0279 | 0.0388 | 0.9344 | 0.27 | 0.0282 | 0.0371 | 0.9357 | 0.26 |
OCDL | 0.0145 | 0.0426 | 0.9435 | 145.70 | 0.0195 | 0.0442 | 0.9371 | 163.12 |
PN-TSSC | 0.0164 | 0.0619 | 0.9228 | 28.23 | 0.0153 | 0.0518 | 0.9337 | 26.08 |
GRSC | 0.0133 | 0.0541 | 0.9332 | 222.08 | 0.0278 | 0.0435 | 0.9299 | 223.86 |
GRSC-ACD | 0.0055 | 0.0501 | 0.9447 | 399.01 | 0.0128 | 0.0467 | 0.9411 | 353.69 |
Methods | Real Dataset 3: Dλ | Ds | QNR | Time (s) |
---|---|---|---|---|
EXP | 0.0000 | 0.1155 | 0.8845 | 0.002 |
GS | 0.0318 | 0.0987 | 0.8727 | 0.28 |
HPF | 0.0309 | 0.0623 | 0.9087 | 0.20 |
MTF-GLP-HPM | 0.0403 | 0.0782 | 0.8847 | 0.56 |
PRACS | 0.0204 | 0.0721 | 0.9090 | 0.81 |
BDSD | 0.0288 | 0.0453 | 0.9272 | 0.14 |
RBDSD | 0.0257 | 0.0950 | 0.8817 | 0.58 |
AWLPH | 0.0427 | 0.0689 | 0.8914 | 0.30 |
OCDL | 0.0295 | 0.0614 | 0.9109 | 159.05 |
PN-TSSC | 0.0264 | 0.0505 | 0.9245 | 22.31 |
GRSC | 0.0385 | 0.0514 | 0.9121 | 217.87 |
GRSC-ACD | 0.0121 | 0.0481 | 0.9403 | 318.06 |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, W.; Liu, H.; Xie, G. Pansharpening of WorldView-2 Data via Graph Regularized Sparse Coding and Adaptive Coupled Dictionary. Sensors 2021, 21, 3586. https://doi.org/10.3390/s21113586