An Enhanced Single-Pair Learning-Based Reflectance Fusion Algorithm with Spatiotemporally Extended Training Samples
Abstract
1. Introduction
2. Methodology
2.1. Proposed Fusion Scheme with Enhanced Dictionary-Training Process
2.2. Assessment Indices of the Proposed Fusion Scheme
3. Results
3.1. Datasets
3.2. Experimental Results with the Rural Dataset
3.2.1. Experiments with Spatially Extended Training Samples
3.2.2. Experiments with Temporally Extended Training Samples
3.3. Experimental Results with the Urban Dataset
3.3.1. Experiments with Spatially Extended Training Samples
3.3.2. Experiments with Temporally Extended Training Data
4. Discussion
4.1. Fusion Quality with Spatially Extended Training Samples
4.2. Fusion Quality with Temporally Extended Training Samples
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
| Landsat-8 OLI Date | MODIS Date | Landsat-8 OLI Date | MODIS Date |
|---|---|---|---|
| 31 July 2013 | 28 July 2013 | 21 April 2017 | 23 April 2017 |
| 1 September 2013 | 29 August 2013 | 7 May 2017 | 9 May 2017 |
| 19 August 2014 | 21 August 2014 | 23 May 2017 | 25 May 2017 |
| 4 September 2014 | 6 September 2014 | 10 July 2017 | 12 July 2017 |
| 22 August 2015 | 21 August 2015 | 12 September 2017 | 14 September 2017 |
| 7 September 2015 | 6 September 2015 | 28 September 2017 | 30 September 2017 |
| 20 May 2016 | 16 May 2016 | 30 October 2017 | 1 November 2017 |
| 11 October 2016 | 7 October 2016 | 15 November 2017 | 17 November 2017 |
| 31 January 2017 | 2 February 2017 | 1 December 2017 | 3 December 2017 |
| 4 March 2017 | 6 March 2017 | 17 December 2017 | 19 December 2017 |

Landsat-8 OLI data: orbit 123–32; bands 3–5; resolution 30 m. MODIS MOD09A1/Q1 data: orbit 26–04,05; bands 1 and 2 (MOD09Q1) and band 4 (MOD09A1); resolution 250 m (MOD09Q1) and 500 m (MOD09A1).
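The result tables below examine how fusion quality responds when the training image is enlarged spatially and when image pairs from additional dates are added to the dictionary-training step. As an illustration only (this is not the authors' code; the patch size, atom count, and per-patch preprocessing are assumptions), the following Python sketch shows how patch samples drawn from such a spatiotemporally extended training set could be fed to the online dictionary learning of Mairal et al., here via scikit-learn's MiniBatchDictionaryLearning:

```python
# Illustrative sketch only (assumed patch size, atom count, and preprocessing;
# not the authors' code): learning a sparse dictionary from patches sampled
# over a spatially/temporally extended training set with online dictionary
# learning (Mairal et al.), as implemented in scikit-learn.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d

def train_dictionary(training_images, patch_size=(7, 7), n_atoms=256,
                     patches_per_image=20000, seed=0):
    """training_images: list of 2-D float arrays (e.g., difference or detail
    bands) drawn from the extended training region and/or extra dates."""
    rng = np.random.RandomState(seed)
    samples = []
    for img in training_images:
        patches = extract_patches_2d(img, patch_size,
                                     max_patches=patches_per_image,
                                     random_state=rng)
        samples.append(patches.reshape(patches.shape[0], -1))
    X = np.vstack(samples)
    X = X - X.mean(axis=1, keepdims=True)    # remove per-patch mean (common practice)
    learner = MiniBatchDictionaryLearning(n_components=n_atoms, alpha=1.0,
                                          batch_size=256, random_state=seed)
    learner.fit(X)
    return learner.components_               # dictionary atoms, one per row
```

In this sketch, enlarging the training image or appending images from extra acquisition dates simply contributes more rows to X before fitting; the learned atoms would then play the role of the dictionary in a sparse-representation fusion step.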
| Methods | Training Image Size | AAD × 10² (G) | AAD × 10² (R) | AAD × 10² (NIR) | RMSE × 10² (G) | RMSE × 10² (R) | RMSE × 10² (NIR) | SSIM × 10² (G) | SSIM × 10² (R) | SSIM × 10² (NIR) | SAM | ERGAS |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Original algorithm | 500 × 500 | 0.46 | 0.79 | 1.74 | 0.66 | 1.16 | 2.63 | 96.16 | 90.79 | 79.73 | 1.8065 | 18.8364 |
| Proposed algorithm | 600 × 600 | 0.43 | 0.75 | 1.55 | 0.63 | 1.11 | 2.33 | 96.53 | 91.51 | 83.17 | 1.8116 | 17.7550 |
| | 700 × 700 | 0.42 | 0.72 | 1.37 | 0.60 | 1.08 | 2.07 | 96.83 | 91.89 | 86.52 | 1.8151 | 16.9564 |
| | 800 × 800 | 0.41 | 0.70 | 1.26 | 0.59 | 1.05 | 1.84 | 96.95 | 92.26 | 89.18 | 1.8175 | 16.3198 |
| | 900 × 900 | 0.39 | 0.68 | 1.19 | 0.58 | 1.02 | 1.75 | 97.10 | 92.66 | 90.16 | 1.8198 | 15.8187 |
| | 1000 × 1000 | 0.39 | 0.67 | 1.16 | 0.57 | 1.01 | 1.69 | 97.13 | 92.82 | 90.80 | 1.8206 | 15.6352 |
| | 1100 × 1100 | 0.38 | 0.66 | 1.13 | 0.56 | 0.99 | 1.63 | 97.2 | 92.95 | 91.32 | 1.8215 | 15.3832 |
| | 1200 × 1200 | 0.38 | 0.64 | 1.11 | 0.56 | 0.98 | 1.63 | 97.21 | 93.03 | 91.37 | 1.8219 | 15.2739 |
| STARFM | – | 0.42 | 0.69 | 1.78 | 0.60 | 1.08 | 2.65 | 97.01 | 92.11 | 88.31 | 1.8123 | 17.0671 |
| SPFM | – | 0.41 | 0.71 | 1.68 | 0.59 | 1.10 | 2.47 | 96.49 | 91.99 | 88.52 | 1.8163 | 16.5105 |
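For reference, the assessment indices tabulated throughout (AAD, RMSE, SSIM, SAM, and ERGAS) can be computed along the lines of the sketch below. It is a minimal illustration using common definitions, not the paper's implementation; the exact normalizations, the SAM convention (the tabulated SAM values increase as quality improves, unlike the standard angle form used here), and the resolution ratio assumed for ERGAS may differ from the paper.

```python
# Minimal sketch of the assessment indices (not the authors' implementation).
# Inputs are float reflectance arrays of shape (bands, rows, cols); the tables
# report AAD, RMSE, and SSIM multiplied by 10^2.
import numpy as np
from skimage.metrics import structural_similarity as ssim

def fusion_metrics(fused, reference, resolution_ratio=30.0 / 250.0):
    diff = fused - reference
    aad = np.mean(np.abs(diff), axis=(1, 2))             # per-band average absolute difference
    rmse = np.sqrt(np.mean(diff ** 2, axis=(1, 2)))      # per-band root-mean-square error
    ssim_bands = np.array([
        ssim(f, r, data_range=float(r.max() - r.min()))  # per-band structural similarity
        for f, r in zip(fused, reference)
    ])
    # Spectral angle mapper: mean angle (radians) between per-pixel spectra.
    dot = np.sum(fused * reference, axis=0)
    norms = np.linalg.norm(fused, axis=0) * np.linalg.norm(reference, axis=0)
    sam = np.mean(np.arccos(np.clip(dot / (norms + 1e-12), -1.0, 1.0)))
    # ERGAS: relative dimensionless global error in synthesis.
    band_means = np.mean(reference, axis=(1, 2))
    ergas = 100.0 * resolution_ratio * np.sqrt(np.mean((rmse / band_means) ** 2))
    return {"AAD": aad, "RMSE": rmse, "SSIM": ssim_bands, "SAM": sam, "ERGAS": ergas}
```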
| Methods | Training Image Size | AAD × 10² (G) | AAD × 10² (R) | AAD × 10² (NIR) | RMSE × 10² (G) | RMSE × 10² (R) | RMSE × 10² (NIR) | SSIM × 10² (G) | SSIM × 10² (R) | SSIM × 10² (NIR) | SAM | ERGAS |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Original algorithm | 500 × 500 | 0.45 | 0.67 | 2.26 | 0.60 | 1.00 | 3.23 | 96.43 | 92.29 | 79.17 | 1.8050 | 20.4154 |
| Proposed algorithm | 600 × 600 | 0.43 | 0.62 | 2.17 | 0.60 | 0.88 | 3.15 | 96.61 | 93.93 | 79.75 | 1.8102 | 18.7794 |
| | 700 × 700 | 0.43 | 0.61 | 2.13 | 0.60 | 0.87 | 3.06 | 96.86 | 94.06 | 80.90 | 1.8130 | 18.2613 |
| | 800 × 800 | 0.40 | 0.56 | 1.89 | 0.55 | 0.80 | 2.71 | 97.36 | 94.97 | 84.44 | 1.8190 | 16.5935 |
| | 900 × 900 | 0.38 | 0.55 | 1.82 | 0.54 | 0.78 | 2.61 | 97.49 | 95.18 | 85.52 | 1.8206 | 16.1860 |
| | 1000 × 1000 | 0.38 | 0.54 | 1.79 | 0.53 | 0.77 | 2.58 | 97.59 | 95.34 | 85.78 | 1.8215 | 15.9259 |
| | 1100 × 1100 | 0.37 | 0.53 | 1.77 | 0.52 | 0.75 | 2.52 | 97.65 | 95.45 | 86.29 | 1.8223 | 15.6691 |
| | 1200 × 1200 | 0.37 | 0.52 | 1.75 | 0.52 | 0.75 | 2.51 | 97.66 | 95.52 | 86.33 | 1.8226 | 15.5429 |
| STARFM | – | 0.50 | 0.68 | 2.03 | 0.70 | 1.06 | 2.83 | 96.74 | 92.54 | 84.02 | 1.8172 | 16.4957 |
| SPFM | – | 0.41 | 0.74 | 1.91 | 0.59 | 1.08 | 2.79 | 97.10 | 92.19 | 84.81 | 1.8171 | 16.5169 |
| Modeled Dates | AAD × 10² (G) | AAD × 10² (R) | AAD × 10² (NIR) | RMSE × 10² (G) | RMSE × 10² (R) | RMSE × 10² (NIR) | SSIM × 10² (G) | SSIM × 10² (R) | SSIM × 10² (NIR) | SAM | ERGAS |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 24 May | 0.38 | 0.65 | 1.13 | 0.57 | 1.00 | 1.66 | 97.15 | 92.99 | 91.11 | 1.8213 | 15.4358 |
| 11 July | 0.38 | 0.53 | 1.77 | 0.53 | 0.76 | 2.55 | 97.59 | 95.38 | 85.83 | 1.8217 | 15.7507 |
| Methods | Training Image Size | AAD × 10² (G) | AAD × 10² (R) | AAD × 10² (NIR) | RMSE × 10² (G) | RMSE × 10² (R) | RMSE × 10² (NIR) | SSIM × 10² (G) | SSIM × 10² (R) | SSIM × 10² (NIR) | SAM | ERGAS |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Original algorithm | 500 × 500 | 1.76 | 2.11 | 3.54 | 2.41 | 2.96 | 4.71 | 84.48 | 81.45 | 77.53 | 1.7854 | 26.2220 |
| Proposed algorithm | 600 × 600 | 1.71 | 2.02 | 3.42 | 2.29 | 2.79 | 4.58 | 86.66 | 84.08 | 79.49 | 1.7946 | 24.9515 |
| | 700 × 700 | 1.70 | 1.99 | 3.40 | 2.27 | 2.73 | 4.56 | 87.34 | 84.8 | 79.87 | 1.7973 | 24.6278 |
| | 800 × 800 | 1.69 | 1.98 | 3.39 | 2.26 | 2.70 | 4.54 | 87.56 | 85.64 | 80.35 | 1.7986 | 24.3276 |
| | 900 × 900 | 1.68 | 1.97 | 3.38 | 2.25 | 2.69 | 4.53 | 87.75 | 85.72 | 80.78 | 1.8002 | 24.2388 |
| | 1000 × 1000 | 1.67 | 1.96 | 3.38 | 2.24 | 2.68 | 4.51 | 87.81 | 85.88 | 80.94 | 1.8010 | 24.1858 |
| | 1100 × 1100 | 1.67 | 1.96 | 3.37 | 2.23 | 2.68 | 4.50 | 87.89 | 85.91 | 81.09 | 1.8009 | 24.1359 |
| | 1200 × 1200 | 1.66 | 1.95 | 3.36 | 2.22 | 2.66 | 4.47 | 87.97 | 85.94 | 81.27 | 1.8025 | 24.0686 |
| | 1300 × 1300 | 1.66 | 1.94 | 3.35 | 2.22 | 2.66 | 4.46 | 87.97 | 85.99 | 81.35 | 1.8024 | 24.0527 |
| | 1400 × 1400 | 1.65 | 1.93 | 3.35 | 2.21 | 2.64 | 4.46 | 88.02 | 86.03 | 81.42 | 1.8028 | 24.0454 |
| | 1500 × 1500 | 1.65 | 1.92 | 3.35 | 2.21 | 2.64 | 4.44 | 88.02 | 86.05 | 81.47 | 1.8031 | 23.9964 |
| | 1600 × 1600 | 1.65 | 1.92 | 3.35 | 2.21 | 2.64 | 4.46 | 88.03 | 86.05 | 81.46 | 1.8030 | 24.0518 |
| | 1700 × 1700 | 1.66 | 1.94 | 3.37 | 2.22 | 2.65 | 4.51 | 88.03 | 86.03 | 81.44 | 1.8027 | 24.0764 |
| | 1800 × 1800 | 1.65 | 1.92 | 3.35 | 2.22 | 2.64 | 4.43 | 88.03 | 86.04 | 81.45 | 1.8030 | 24.0452 |
| | 1900 × 1900 | 1.65 | 1.92 | 3.34 | 2.21 | 2.63 | 4.44 | 88.04 | 86.07 | 81.47 | 1.8032 | 23.9916 |
| | 2000 × 2000 | 1.65 | 1.92 | 3.35 | 2.22 | 2.64 | 4.45 | 88.03 | 86.06 | 81.45 | 1.8028 | 24.0490 |
| STARFM | – | 1.66 | 1.95 | 3.50 | 2.22 | 2.67 | 4.63 | 87.96 | 85.95 | 78.39 | 1.8016 | 24.7963 |
| SPFM | – | 1.65 | 2.00 | 3.61 | 2.21 | 2.95 | 4.86 | 88.01 | 85.81 | 77.28 | 1.7953 | 25.1976 |
| Methods | Training Image Size | AAD × 10² (G) | AAD × 10² (R) | AAD × 10² (NIR) | RMSE × 10² (G) | RMSE × 10² (R) | RMSE × 10² (NIR) | SSIM × 10² (G) | SSIM × 10² (R) | SSIM × 10² (NIR) | SAM | ERGAS |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Original algorithm | 500 × 500 | 1.73 | 2.00 | 3.45 | 2.18 | 2.64 | 4.64 | 87.55 | 85.01 | 79.01 | 1.7850 | 29.3382 |
| Proposed algorithm | 600 × 600 | 1.70 | 1.97 | 3.40 | 2.14 | 2.59 | 4.47 | 88.61 | 86.35 | 82.89 | 1.7922 | 28.3268 |
| | 700 × 700 | 1.70 | 1.96 | 3.40 | 2.14 | 2.56 | 4.35 | 88.63 | 86.47 | 82.88 | 1.7924 | 28.2852 |
| | 800 × 800 | 1.68 | 1.94 | 3.38 | 2.12 | 2.53 | 4.31 | 88.95 | 87.12 | 83.30 | 1.7943 | 27.8460 |
| | 900 × 900 | 1.69 | 1.95 | 3.40 | 2.13 | 2.54 | 4.33 | 88.83 | 86.79 | 82.89 | 1.7886 | 27.8857 |
| | 1000 × 1000 | 1.68 | 1.93 | 3.39 | 2.12 | 2.53 | 4.31 | 89.02 | 87.21 | 83.49 | 1.7947 | 27.7921 |
| | 1100 × 1100 | 1.68 | 1.92 | 3.37 | 2.12 | 2.53 | 4.31 | 89.03 | 87.44 | 83.54 | 1.7947 | 27.7726 |
| | 1200 × 1200 | 1.68 | 1.91 | 3.37 | 2.12 | 2.52 | 4.30 | 89.06 | 87.56 | 83.62 | 1.7949 | 27.6851 |
| | 1300 × 1300 | 1.68 | 1.90 | 3.35 | 2.12 | 2.52 | 4.29 | 89.07 | 87.61 | 83.63 | 1.7952 | 27.5976 |
| | 1400 × 1400 | 1.68 | 1.90 | 3.34 | 2.12 | 2.51 | 4.28 | 89.09 | 87.63 | 83.67 | 1.7964 | 27.5520 |
| | 1500 × 1500 | 1.68 | 1.90 | 3.34 | 2.12 | 2.51 | 4.28 | 89.10 | 87.66 | 83.70 | 1.7985 | 27.5435 |
| | 1600 × 1600 | 1.68 | 1.90 | 3.34 | 2.12 | 2.51 | 4.29 | 89.09 | 87.64 | 83.69 | 1.7980 | 27.5481 |
| | 1700 × 1700 | 1.68 | 1.89 | 3.32 | 2.12 | 2.49 | 4.21 | 89.12 | 87.71 | 83.78 | 1.8000 | 27.4747 |
| | 1800 × 1800 | 1.69 | 1.91 | 3.35 | 2.12 | 2.53 | 4.31 | 89.10 | 87.65 | 83.71 | 1.7975 | 27.5554 |
| | 1900 × 1900 | 1.68 | 1.90 | 3.34 | 2.12 | 2.51 | 4.29 | 89.11 | 87.66 | 83.69 | 1.7989 | 27.5174 |
| | 2000 × 2000 | 1.68 | 1.90 | 3.33 | 2.12 | 2.51 | 4.25 | 89.13 | 87.69 | 83.75 | 1.7991 | 27.4951 |
| STARFM | – | 1.70 | 1.95 | 3.43 | 2.16 | 2.56 | 4.51 | 88.51 | 86.68 | 82.79 | 1.7939 | 28.5247 |
| SPFM | – | 1.68 | 2.01 | 3.44 | 2.17 | 2.63 | 4.50 | 87.68 | 84.83 | 82.45 | 1.7901 | 29.2313 |
| Added Training Dates | AAD × 10² (G) | AAD × 10² (R) | AAD × 10² (NIR) | RMSE × 10² (G) | RMSE × 10² (R) | RMSE × 10² (NIR) | SSIM × 10² (G) | SSIM × 10² (R) | SSIM × 10² (NIR) | SAM | ERGAS |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 31 January 2017 | 1.76 | 2.08 | 3.57 | 2.40 | 2.90 | 4.83 | 84.55 | 82.25 | 76.36 | 1.7857 | 26.1273 |
| 4 March 2017 | 1.74 | 2.03 | 3.48 | 2.37 | 2.82 | 4.63 | 85.23 | 83.41 | 78.58 | 1.7918 | 25.4679 |
| 21 April 2017 | 1.75 | 2.01 | 3.49 | 2.37 | 2.79 | 4.59 | 85.2 | 83.97 | 79.05 | 1.7930 | 25.2690 |
| 7 May 2017 | 1.75 | 2.03 | 3.52 | 2.37 | 2.83 | 4.65 | 85.04 | 83.24 | 78.44 | 1.7910 | 25.5186 |
| 23 May 2017 | 1.73 | 1.96 | 3.46 | 2.35 | 2.72 | 4.74 | 85.63 | 84.96 | 77.24 | 1.7947 | 25.1566 |
| 28 September 2017 | 1.73 | 1.97 | 3.46 | 2.36 | 2.73 | 4.58 | 85.35 | 84.73 | 78.92 | 1.7953 | 25.0129 |
| 30 October 2017 | 1.72 | 2.00 | 3.47 | 2.36 | 2.78 | 4.67 | 85.38 | 84.10 | 78.06 | 1.7935 | 25.3112 |
| 15 November 2017 | 1.76 | 2.06 | 3.46 | 2.41 | 2.88 | 4.61 | 84.59 | 82.61 | 78.75 | 1.7901 | 25.7875 |
| 1 December 2017 | 1.73 | 2.07 | 3.51 | 2.37 | 2.89 | 4.69 | 85.05 | 82.45 | 77.85 | 1.7903 | 25.8093 |
| 17 December 2017 | 1.75 | 2.05 | 3.63 | 2.38 | 2.84 | 5.00 | 85.19 | 83.22 | 72.14 | 1.7888 | 26.0933 |
| Added Training Dates | AAD × 10² (G) | AAD × 10² (R) | AAD × 10² (NIR) | RMSE × 10² (G) | RMSE × 10² (R) | RMSE × 10² (NIR) | SSIM × 10² (G) | SSIM × 10² (R) | SSIM × 10² (NIR) | SAM | ERGAS |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 31 January 2017 | 1.72 | 2.02 | 3.48 | 2.17 | 2.65 | 4.70 | 87.22 | 84.73 | 78.46 | 1.7836 | 29.4916 |
| 4 March 2017 | 1.72 | 1.98 | 3.69 | 2.17 | 2.59 | 4.86 | 87.38 | 85.26 | 73.24 | 1.7813 | 29.4935 |
| 21 April 2017 | 1.71 | 2.00 | 3.4 | 2.15 | 2.60 | 4.45 | 87.95 | 85.46 | 81.61 | 1.7885 | 28.7471 |
| 7 May 2017 | 1.71 | 1.97 | 3.39 | 2.15 | 2.56 | 4.35 | 87.98 | 85.95 | 82.39 | 1.7900 | 28.3728 |
| 23 May 2017 | 1.72 | 1.95 | 3.40 | 2.15 | 2.53 | 4.35 | 87.96 | 86.15 | 82.38 | 1.7902 | 28.2556 |
| 28 September 2017 | 1.70 | 1.92 | 3.35 | 2.12 | 2.48 | 4.34 | 88.30 | 86.80 | 82.27 | 1.7916 | 27.8873 |
| 30 October 2017 | 1.72 | 1.98 | 3.39 | 2.18 | 2.59 | 4.53 | 87.49 | 85.58 | 80.27 | 1.7869 | 28.9585 |
| 15 November 2017 | 1.71 | 2.00 | 3.46 | 2.16 | 2.64 | 4.69 | 87.68 | 84.99 | 77.91 | 1.7844 | 29.3909 |
| 1 December 2017 | 1.73 | 2.01 | 4.63 | 2.19 | 2.62 | 6.42 | 87.32 | 85.25 | 38.30 | 1.7599 | 32.9495 |
| 17 December 2017 | 1.72 | 1.99 | 3.40 | 2.18 | 2.61 | 4.37 | 87.40 | 85.15 | 82.36 | 1.7878 | 28.7952 |
| Added Training Years | AAD × 10² (G) | AAD × 10² (R) | AAD × 10² (NIR) | RMSE × 10² (G) | RMSE × 10² (R) | RMSE × 10² (NIR) | SSIM × 10² (G) | SSIM × 10² (R) | SSIM × 10² (NIR) | SAM | ERGAS |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 2013 | 1.73 | 2.01 | 3.47 | 2.36 | 2.78 | 4.62 | 85.43 | 84.17 | 78.66 | 1.7927 | 25.2327 |
| 2013 and 2014 | 1.69 | 1.99 | 3.45 | 2.29 | 2.75 | 4.62 | 86.58 | 84.69 | 78.72 | 1.7960 | 24.8839 |
| 2013 to 2015 | 1.66 | 1.93 | 3.38 | 2.25 | 2.66 | 4.45 | 87.53 | 85.81 | 80.64 | 1.8020 | 24.1563 |
| 2013 to 2016 | 1.66 | 1.93 | 3.37 | 2.25 | 2.66 | 4.47 | 87.58 | 85.81 | 80.37 | 1.8018 | 24.2050 |
| Added Training Years | AAD × 10² (G) | AAD × 10² (R) | AAD × 10² (NIR) | RMSE × 10² (G) | RMSE × 10² (R) | RMSE × 10² (NIR) | SSIM × 10² (G) | SSIM × 10² (R) | SSIM × 10² (NIR) | SAM | ERGAS |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 2013 | 1.69 | 1.95 | 3.38 | 2.13 | 2.52 | 4.4 | 88.55 | 86.89 | 81.97 | 1.7921 | 28.1927 |
| 2013 and 2014 | 1.69 | 1.94 | 3.36 | 2.13 | 2.5 | 4.3 | 88.76 | 87.18 | 83.32 | 1.7941 | 27.8998 |
| 2013 to 2015 | 1.68 | 1.91 | 3.34 | 2.12 | 2.46 | 4.25 | 88.98 | 87.58 | 83.86 | 1.7955 | 27.6131 |
| 2013 to 2016 | 1.69 | 1.92 | 3.36 | 2.13 | 2.48 | 4.41 | 88.87 | 87.29 | 82.07 | 1.7934 | 28.0341 |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).