Deep Learning Application for Crop Classification via Multi-Temporal Remote Sensing Images
Abstract
1. Introduction
2. Materials
2.1. Study Area
2.2. Data
2.2.1. Remote Sensing Images
2.2.2. Training and Validation Samples
3. Methodology
3.1. Methodological Overview
3.2. Temporal Phenological Patterns
3.3. Deep Learning Models
3.4. Sample Dimensions
3.5. Deep Learning Architectures
3.6. Experiment Design
4. Results
4.1. Classification Based on VI Time Series
4.2. Classification Based on Multi-Spectral Time Series
5. Discussion
5.1. Analysis of Time-Series Profile
5.2. Effects of Temporal, Spectral, and Spatial Features
5.3. Comparison of Deep Learning Models
5.4. Potential of 3D-CNN and ConvLSTM2D for Crop Classification from Multi-Temporal Images
6. Conclusions
- (1) Greater data diversity (temporal, spectral, and spatial information) is effective in improving crop classification accuracy. Temporal features alone provide only a limited improvement in the accuracy of crop classification from multi-temporal images. As more spectral information is added, accuracy improves further and the impact of salt-and-pepper noise is alleviated. The inclusion of spatial information can eliminate salt-and-pepper noise, although its contribution to accuracy decreases as the number of input features increases.
- (2) Each deep learning model has limitations in crop classification from multi-temporal images. The 1D-CNN and LSTM models cannot extract spatial features while integrating temporal and spectral features. A 2D-CNN is suitable only when the time series consists of a single feature, such as a vegetation index or a single band, because it cannot exploit multi-spectral information while combining temporal and spatial information. The 3D-CNN and ConvLSTM2D models achieve the highest accuracy and are better suited to multi-temporal crop classification than the other deep learning models.
- (3) The deep learning models based on Conv3D and ConvLSTM2D, which integrate temporal, spectral, and spatial information, are the most accurate for multi-temporal crop classification. Moreover, because ConvLSTM combines the advantages of RNNs and CNNs and offers a more flexible structure, it warrants further investigation; a minimal sketch of such an architecture is given after this list.
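As an illustration of conclusion (3), the following is a minimal TensorFlow/Keras sketch of a ConvLSTM2D-based classifier for 4-D time-series samples (time × height × width × bands). The 13 time steps, 10 bands, and 5 classes match the data tables below, but the patch size, filter counts, and layer arrangement are placeholder assumptions and do not reproduce the architecture actually used in this study.

```python
# Minimal sketch (not the authors' exact architecture): a ConvLSTM2D classifier
# for multi-temporal, multi-spectral patches shaped (time, height, width, bands).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_STEPS, PATCH, NUM_BANDS, NUM_CLASSES = 13, 9, 10, 5  # patch size is assumed

inputs = tf.keras.Input(shape=(NUM_STEPS, PATCH, PATCH, NUM_BANDS))
x = layers.ConvLSTM2D(32, kernel_size=3, padding="same",
                      return_sequences=True)(inputs)   # joint temporal-spatial features
x = layers.ConvLSTM2D(64, kernel_size=3, padding="same",
                      return_sequences=False)(x)       # collapse the time dimension
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dense(128, activation="relu")(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Replacing the two ConvLSTM2D layers with Conv3D layers applied over the (time, height, width) axes gives the analogous 3D-CNN variant discussed in conclusion (3).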
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Band Names | Spectral Band | Central Wavelength (nm) | Band Names | Spectral Band | Central Wavelength (nm) |
---|---|---|---|---|---|
Blue | B2 | 490 | Red-Edge | B7 | 775 |
Green | B3 | 560 | NIR | B8 | 842 |
Red | B4 | 665 | NIR | B8a | 865 |
Red-Edge | B5 | 705 | SWIR | B11 | 1610 |
Red-Edge | B6 | 740 | SWIR | B12 | 2190 |
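The two vegetation indices used as single-feature inputs in the experiments below (NDVI and EVI) are computed from the red, near-infrared, and blue bands listed above. The sketch below is a minimal illustration using the standard NDVI formula and the common EVI coefficients (G = 2.5, C1 = 6, C2 = 7.5, L = 1); it assumes surface reflectance already scaled to 0–1 and resampled to a common resolution, and does not reproduce the exact preprocessing used in this study.

```python
# Sketch: NDVI and EVI from Sentinel-2 surface-reflectance arrays (assumed 0-1 scale).
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), computed from B8 and B4."""
    return (nir - red) / (nir + red + 1e-10)

def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """EVI with the commonly used coefficients, computed from B8, B4, and B2."""
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L + 1e-10)

# Tiny synthetic patch as a usage example
nir = np.array([[0.45, 0.50], [0.48, 0.52]])
red = np.array([[0.08, 0.10], [0.09, 0.07]])
blue = np.array([[0.04, 0.05], [0.05, 0.04]])
print(ndvi(nir, red))
print(evi(nir, red, blue))
```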
Day of Year (DOY) | Acquisition Time | Day of Year (DOY) | Acquisition Time |
---|---|---|---|
112 | 22 April 2021 | 230 | 18 August 2021 |
137 | 17 May 2021 | 235 | 23 August 2021 |
150 | 30 May 2021 | 242 | 30 August 2021 |
165 | 14 June 2021 | 257 | 14 September 2021 |
192 | 11 July 2021 | 270 | 27 September 2021 |
207 | 26 July 2021 | 295 | 22 October 2021 |
225 | 13 August 2021 | | |
Sample Type | Training and Validation Samples | Testing Samples |
---|---|---|
Corn | 1481 | 4096 |
Soybeans | 1487 | 4738 |
Spring Wheat | 1445 | 4674 |
Sugarbeets | 1471 | 4167 |
Others | 1546 | 5210 |
Number | Features | Sample Dimensions | Models |
---|---|---|---|
E1 | NDVI | 1-D time-series | 1D-CNN, LSTM |
E2 | EVI | 1-D time-series | 1D-CNN, LSTM |
E3 | B2348 | 2-D time-series | 1D-CNN, LSTM |
E4 | B2348 + B11 + B12 | 2-D time-series | 1D-CNN, LSTM |
E5 | B2345678 | 2-D time-series | 1D-CNN, LSTM |
E6 | All Bands | 2-D time-series | 1D-CNN, LSTM |
E7 | NDVI | 3-D time-series | 2D-CNN |
E8 | EVI | 3-D time-series | 2D-CNN |
E9 | B2348 | 4-D time-series | 3D-CNN, ConvLSTM2D |
E10 | All Bands | 4-D time-series | 3D-CNN, ConvLSTM2D |
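As a reading aid for the experiment design above, the sketch below spells out the assumed array shape of a single training sample for each dimensionality. The 13 time steps and 10 bands follow the acquisition-date and band tables above; the spatial patch size is a placeholder assumption, since the patch size used in the study is not restated here.

```python
# Sketch: assumed per-sample array shapes for the four dimensionalities in the
# experiment design (patch size is a placeholder, not taken from the paper).
import numpy as np

T, H, W, B = 13, 9, 9, 10  # time steps, patch height/width (assumed), bands

sample_1d = np.zeros((T,))          # E1-E2: one VI value per date (1-D time series)
sample_2d = np.zeros((T, B))        # E3-E6: multi-spectral vector per date (2-D)
sample_3d = np.zeros((T, H, W))     # E7-E8: single-feature image patch per date (3-D)
sample_4d = np.zeros((T, H, W, B))  # E9-E10: multi-spectral patch per date (4-D)

# 1D-CNN/LSTM consume the 1-D or 2-D samples, a 2D-CNN consumes the 3-D samples,
# and 3D-CNN/ConvLSTM2D consume the 4-D samples (with a leading batch axis added).
for name, arr in [("1-D", sample_1d), ("2-D", sample_2d),
                  ("3-D", sample_3d), ("4-D", sample_4d)]:
    print(name, arr.shape)
```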
Number | Model | OA (%) | Kappa |
---|---|---|---|
E1 | RF | 91.02 | 0.891 |
E1 | 1D-CNN | 92.50 | 0.906 |
E1 | LSTM | 93.25 | 0.915 |
E2 | RF | 91.24 | 0.893 |
E2 | 1D-CNN | 92.76 | 0.909 |
E2 | LSTM | 93.94 | 0.924 |
E7 | 2D-CNN | 94.74 | 0.934 |
E8 | 2D-CNN | 94.76 | 0.934 |
Number | Model | OA (%) | Kappa |
---|---|---|---|
E3 | RF | 93.48 | 0.918 |
E3 | 1D-CNN | 94.89 | 0.936 |
E3 | LSTM | 95.31 | 0.941 |
E4 | 1D-CNN | 96.28 | 0.953 |
E4 | LSTM | 96.72 | 0.959 |
E5 | 1D-CNN | 96.02 | 0.950 |
E5 | LSTM | 96.37 | 0.955 |
E6 | RF | 95.51 | 0.944 |
E6 | 1D-CNN | 96.84 | 0.960 |
E6 | LSTM | 96.94 | 0.962 |
E9 | 3D-CNN | 96.77 | 0.960 |
E9 | ConvLSTM2D | 96.56 | 0.957 |
E10 | 3D-CNN | 97.43 | 0.968 |
E10 | ConvLSTM2D | 97.25 | 0.966 |