A Dual Attention Convolutional Neural Network for Crop Classification Using Time-Series Sentinel-2 Imagery
Abstract
1. Introduction
- (I) Most crop mapping studies have relied on conventional machine learning methods (e.g., RF and SVM). These algorithms usually do not achieve the highest possible accuracies because of several factors, such as climatic conditions and fluctuations in planting times.
- (II) Many studies have used only spectral-temporal information for crop mapping. However, spatial information should also be incorporated into the classification algorithm to produce highly accurate maps.
- (III) Many state-of-the-art deep learning methods for crop mapping have used only 2D/3D convolution blocks to extract deep features. Not all of the extracted deep features are informative for crop mapping, and many are redundant. Attention blocks should therefore be implemented to select the most informative features.
- (I) Proposing a novel framework for mapping crop types based on a two-stream CNN with a dual attention module (DAM).
- (II) Introducing novel spatial and spectral attention mechanisms (AMs) to extract informative deep features for crop mapping (an illustrative sketch follows this list).
- (III) Utilizing multi-scale and residual blocks to increase the accuracy of the proposed network.
- (IV) Evaluating the sensitivity of the proposed method during the growing season of crops based on time-series normalized difference vegetation index (NDVI) data.
- (V) Evaluating the performance of commonly used machine learning and deep learning methods for crop type mapping.
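The spatial and spectral (channel) attention mechanisms referred to in contribution (II) can be illustrated with a minimal sketch. The code below is an assumption-based illustration in TensorFlow/Keras (framework choice, layer sizes, and the `reduction` parameter are hypothetical, not taken from the paper), in the spirit of CBAM-style attention rather than the authors' exact DAM: the channel branch re-weights spectral feature maps using globally pooled statistics and a small shared MLP, while the spatial branch re-weights pixel locations using channel-pooled maps and a convolution.

```python
# Hedged sketch of channel (spectral) and spatial attention blocks;
# NOT the authors' exact dual attention module.
import tensorflow as tf
from tensorflow.keras import layers

def channel_attention(x, reduction=8):
    """Re-weight each feature map (spectral/channel dimension)."""
    channels = x.shape[-1]
    mlp_hidden = layers.Dense(channels // reduction, activation="relu")
    mlp_out = layers.Dense(channels)
    avg = layers.GlobalAveragePooling2D()(x)                  # (B, C)
    mx = layers.GlobalMaxPooling2D()(x)                       # (B, C)
    scores = mlp_out(mlp_hidden(avg)) + mlp_out(mlp_hidden(mx))
    weights = tf.reshape(tf.sigmoid(scores), (-1, 1, 1, channels))
    return x * weights                                        # scaled features

def spatial_attention(x, kernel_size=7):
    """Re-weight each pixel location (spatial dimension)."""
    avg = tf.reduce_mean(x, axis=-1, keepdims=True)           # (B, H, W, 1)
    mx = tf.reduce_max(x, axis=-1, keepdims=True)             # (B, H, W, 1)
    weights = layers.Conv2D(1, kernel_size, padding="same",
                            activation="sigmoid")(tf.concat([avg, mx], axis=-1))
    return x * weights

# Toy usage on a random image patch (batch, height, width, channels).
patch = tf.random.normal((4, 13, 13, 64))
refined = spatial_attention(channel_attention(patch))
print(refined.shape)                                          # (4, 13, 13, 64)
```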
2. Study Area and Datasets
2.1. Study Area
2.2. Sentinel-2 Imagery
2.3. Reference Samples
3. Method
3.1. Data Preparation and Time-Series NDVI Calculation
3.2. Proposed Deep Learning Architecture
- (1) Utilizing a double-stream framework for spatial/spectral deep feature extraction.
- (2) Proposing a novel AM framework for extracting informative deep features, with higher efficiency than the convolutional block attention module (CBAM).
- (3) Taking advantage of residual, depth-wise, and separable convolution blocks, as well as their combination, for deep feature extraction (see the sketch after this list).
- (4) Employing separable convolution (point-wise/depth-wise convolution layers), which offers better performance.
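As a concrete illustration of items (3) and (4), the hedged sketch below (TensorFlow/Keras; layer sizes and normalization choices are assumptions, not the paper's exact design) combines a depth-wise convolution, a point-wise 1×1 convolution, and a residual skip connection: the depth-wise step filters each channel spatially, the point-wise step mixes channels with far fewer parameters than a full convolution, and the skip connection eases gradient flow.

```python
# Hedged sketch of a residual block built from depth-wise + point-wise
# (separable) convolutions; not the paper's exact architecture.
import tensorflow as tf
from tensorflow.keras import layers

def separable_residual_block(x, filters, kernel_size=3):
    shortcut = x
    y = layers.DepthwiseConv2D(kernel_size, padding="same")(x)   # per-channel spatial filtering
    y = layers.BatchNormalization()(y)
    y = layers.Activation("relu")(y)
    y = layers.Conv2D(filters, 1, padding="same")(y)             # 1x1 point-wise channel mixing
    y = layers.BatchNormalization()(y)
    if shortcut.shape[-1] != filters:                            # project shortcut if widths differ
        shortcut = layers.Conv2D(filters, 1, padding="same")(shortcut)
    return layers.Activation("relu")(y + shortcut)               # residual connection

x = tf.random.normal((4, 13, 13, 32))
print(separable_residual_block(x, filters=64).shape)             # (4, 13, 13, 64)
```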
3.2.1. Attention Mechanism (AM)
3.2.2. Convolution Layer
3.3. Model Training
3.4. Accuracy Assessment
3.5. Comparison with Other Classification Methods
4. Experiments and Results
4.1. Parameter Setting
4.2. Classification Results
4.3. Impacts of the Time-Series NDVI on the Classification Results
4.4. Ablation Analysis
5. Discussion
5.1. Accuracy
5.2. Sensitivity Analysis
5.3. Proposed Architecture and Deep Feature Extraction
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- United Nations, Department of Economic and Social Affairs, Population Division. World Population Prospects: The 2015 Revision; Key Findings and Advance Tables; United Nations: New York, NY, USA, 2015. [Google Scholar]
- Waldner, F.; Canto, G.S.; Defourny, P. Automated annual cropland mapping using knowledge-based temporal features. ISPRS J. Photogramm. Remote Sens. 2015, 110, 1–13. [Google Scholar] [CrossRef]
- Khan, M.A.; Tahir, A.; Khurshid, N.; Ahmed, M.; Boughanmi, H. Economic effects of climate change-induced loss of agricultural production by 2050: A case study of Pakistan. Sustainability 2020, 12, 1216. [Google Scholar] [CrossRef] [Green Version]
- Shi, W.; Wang, M.; Liu, Y. Crop yield and production responses to climate disasters in China. Sci. Total Environ. 2021, 750, 141147. [Google Scholar] [CrossRef]
- Shelestov, A.; Lavreniuk, M.; Kussul, N.; Novikov, A.; Skakun, S. Exploring Google Earth Engine platform for big data processing: Classification of multi-temporal satellite imagery for crop mapping. Front. Earth Sci. 2017, 5, 17. [Google Scholar] [CrossRef] [Green Version]
- Agovino, M.; Casaccia, M.; Ciommi, M.; Ferrara, M.; Marchesano, K. Agriculture, climate change and sustainability: The case of EU-28. Ecol. Indic. 2019, 105, 525–543. [Google Scholar] [CrossRef]
- Anwar, M.R.; Li Liu, D.; Macadam, I.; Kelly, G. Adapting agriculture to climate change: A review. Theor. Appl. Climatol. 2013, 113, 225–245. [Google Scholar] [CrossRef]
- Amani, M.; Kakooei, M.; Moghimi, A.; Ghorbanian, A.; Ranjgar, B.; Mahdavi, S.; Davidson, A.; Fisette, T.; Rollin, P.; Brisco, B. Application of google earth engine cloud computing platform, sentinel imagery, and neural networks for crop mapping in canada. Remote Sens. 2020, 12, 3561. [Google Scholar] [CrossRef]
- Bégué, A.; Arvor, D.; Bellon, B.; Betbeder, J.; De Abelleyra, D.; Ferraz, R.P.D.; Lebourgeois, V.; Lelong, C.; Simões, M.; Verón, S.R. Remote sensing and cropping practices: A review. Remote Sens. 2018, 10, 99. [Google Scholar] [CrossRef] [Green Version]
- Karthikeyan, L.; Chawla, I.; Mishra, A.K. A review of remote sensing applications in agriculture for food security: Crop growth and yield, irrigation, and crop losses. J. Hydrol. 2020, 586, 124905. [Google Scholar] [CrossRef]
- Orynbaikyzy, A.; Gessner, U.; Conrad, C. Crop type classification using a combination of optical and radar remote sensing data: A review. Int. J. Remote Sens. 2019, 40, 6553–6595. [Google Scholar] [CrossRef]
- Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402. [Google Scholar] [CrossRef]
- Di, Y.; Zhang, G.; You, N.; Yang, T.; Zhang, Q.; Liu, R.; Doughty, R.B.; Zhang, Y. Mapping Croplands in the Granary of the Tibetan Plateau Using All Available Landsat Imagery, A Phenology-Based Approach, and Google Earth Engine. Remote Sens. 2021, 13, 2289. [Google Scholar] [CrossRef]
- Ren, S.; An, S. Temporal Pattern Analysis of Cropland Phenology in Shandong Province of China Based on Two Long-Sequence Remote Sensing Data. Remote Sens. 2021, 13, 4071. [Google Scholar] [CrossRef]
- Mutanga, O.; Dube, T.; Galal, O. Remote sensing of crop health for food security in Africa: Potentials and constraints. Remote Sens. Appl. Soc. Environ. 2017, 8, 231–239. [Google Scholar] [CrossRef]
- Cai, Y.; Guan, K.; Peng, J.; Wang, S.; Seifert, C.; Wardlow, B.; Li, Z. A high-performance and in-season classification system of field-level crop types using time-series Landsat data and a machine learning approach. Remote Sens. Environ. 2018, 210, 35–47. [Google Scholar] [CrossRef]
- Johnson, D.M.; Mueller, R. Pre-and within-season crop type classification trained with archival land cover information. Remote Sens. Environ. 2021, 264, 112576. [Google Scholar] [CrossRef]
- Kenduiywo, B.K.; Bargiel, D.; Soergel, U. Crop-type mapping from a sequence of Sentinel 1 images. Int. J. Remote Sens. 2018, 39, 6383–6404. [Google Scholar] [CrossRef]
- Donohue, R.J.; Lawes, R.A.; Mata, G.; Gobbett, D.; Ouzman, J. Towards a national, remote-sensing-based model for predicting field-scale crop yield. Field Crops Res. 2018, 227, 79–90. [Google Scholar] [CrossRef]
- Kern, A.; Barcza, Z.; Marjanović, H.; Árendás, T.; Fodor, N.; Bónis, P.; Bognár, P.; Lichtenberger, J. Statistical modelling of crop yield in Central Europe using climate data and remote sensing vegetation indices. Agric. For. Meteorol. 2018, 260, 300–320. [Google Scholar] [CrossRef]
- Son, N.-T.; Chen, C.-F.; Chen, C.-R.; Guo, H.-Y. Classification of multitemporal Sentinel-2 data for field-level monitoring of rice cropping practices in Taiwan. Adv. Space Res. 2020, 65, 1910–1921. [Google Scholar] [CrossRef]
- Zhang, H.; Kang, J.; Xu, X.; Zhang, L. Accessing the temporal and spectral features in crop type mapping using multi-temporal Sentinel-2 imagery: A case study of Yi’an County, Heilongjiang province, China. Comput. Electron. Agric. 2020, 176, 105618. [Google Scholar] [CrossRef]
- Dey, S.; Mandal, D.; Robertson, L.D.; Banerjee, B.; Kumar, V.; McNairn, H.; Bhattacharya, A.; Rao, Y. In-season crop classification using elements of the Kennaugh matrix derived from polarimetric RADARSAT-2 SAR data. Int. J. Appl. Earth Obs. Geoinf. 2020, 88, 102059. [Google Scholar] [CrossRef]
- Planque, C.; Lucas, R.; Punalekar, S.; Chognard, S.; Hurford, C.; Owers, C.; Horton, C.; Guest, P.; King, S.; Williams, S. National crop mapping using sentinel-1 time series: A knowledge-based descriptive algorithm. Remote Sens. 2021, 13, 846. [Google Scholar] [CrossRef]
- Prins, A.J.; Van Niekerk, A. Regional Mapping of Vineyards Using Machine Learning and LiDAR Data. Int. J. Appl. Geospatial Res. (IJAGR) 2020, 11, 1–22. [Google Scholar] [CrossRef]
- ten Harkel, J.; Bartholomeus, H.; Kooistra, L. Biomass and crop height estimation of different crops using UAV-based LiDAR. Remote Sens. 2020, 12, 17. [Google Scholar] [CrossRef] [Green Version]
- Meng, S.; Wang, X.; Hu, X.; Luo, C.; Zhong, Y. Deep learning-based crop mapping in the cloudy season using one-shot hyperspectral satellite imagery. Comput. Electron. Agric. 2021, 186, 106188. [Google Scholar] [CrossRef]
- Moriya, É.A.S.; Imai, N.N.; Tommaselli, A.M.G.; Berveglieri, A.; Santos, G.H.; Soares, M.A.; Marino, M.; Reis, T.T. Detection and mapping of trees infected with citrus gummosis using UAV hyperspectral data. Comput. Electron. Agric. 2021, 188, 106298. [Google Scholar] [CrossRef]
- Chandel, A.K.; Molaei, B.; Khot, L.R.; Peters, R.T.; Stöckle, C.O. High resolution geospatial evapotranspiration mapping of irrigated field crops using multispectral and thermal infrared imagery with metric energy balance model. Drones 2020, 4, 52. [Google Scholar] [CrossRef]
- James, K.; Nichol, C.J.; Wade, T.; Cowley, D.; Gibson Poole, S.; Gray, A.; Gillespie, J. Thermal and Multispectral Remote Sensing for the Detection and Analysis of Archaeologically Induced Crop Stress at a UK Site. Drones 2020, 4, 61. [Google Scholar] [CrossRef]
- Kyere, I.; Astor, T.; Graß, R.; Wachendorf, M. Agricultural crop discrimination in a heterogeneous low-mountain range region based on multi-temporal and multi-sensor satellite data. Comput. Electron. Agric. 2020, 179, 105864. [Google Scholar] [CrossRef]
- Pott, L.P.; Amado, T.J.C.; Schwalbert, R.A.; Corassa, G.M.; Ciampitti, I.A. Satellite-based data fusion crop type classification and mapping in Rio Grande do Sul, Brazil. ISPRS J. Photogramm. Remote Sens. 2021, 176, 196–210. [Google Scholar] [CrossRef]
- Hasanlou, M.; Shah-Hosseini, R.; Seydi, S.T.; Karimzadeh, S.; Matsuoka, M. Earthquake Damage Region Detection by Multitemporal Coherence Map Analysis of Radar and Multispectral Imagery. Remote Sens. 2021, 13, 1195. [Google Scholar] [CrossRef]
- Seydi, S.; Rastiveis, H. A Deep Learning Framework for Roads Network Damage Assessment Using Post-Earthquake Lidar Data. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 955–961. [Google Scholar] [CrossRef] [Green Version]
- Seydi, S.T.; Hasanlou, M.; Amani, M.; Huang, W. Oil Spill Detection Based on Multi-Scale Multi-Dimensional Residual CNN for Optical Remote Sensing Imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 10941–10952. [Google Scholar] [CrossRef]
- Lary, D.J.; Alavi, A.H.; Gandomi, A.H.; Walker, A.L. Machine learning in geosciences and remote sensing. Geosci. Front. 2016, 7, 3–10. [Google Scholar] [CrossRef] [Green Version]
- Maxwell, A.E.; Warner, T.A.; Fang, F. Implementation of machine-learning classification in remote sensing: An applied review. Int. J. Remote Sens. 2018, 39, 2784–2817. [Google Scholar] [CrossRef] [Green Version]
- Zhang, H.; Li, Q.; Liu, J.; Shang, J.; Du, X.; McNairn, H.; Champagne, C.; Dong, T.; Liu, M. Image classification using rapideye data: Integration of spectral and textual features in a random forest classifier. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 5334–5349. [Google Scholar] [CrossRef]
- Mandal, D.; Kumar, V.; Rao, Y.S. An assessment of temporal RADARSAT-2 SAR data for crop classification using KPCA based support vector machine. Geocarto Int. 2020, 1–13. [Google Scholar] [CrossRef]
- Maponya, M.G.; Van Niekerk, A.; Mashimbye, Z.E. Pre-harvest classification of crop types using a Sentinel-2 time-series and machine learning. Comput. Electron. Agric. 2020, 169, 105164. [Google Scholar] [CrossRef]
- Saini, R.; Ghosh, S.K. Crop classification in a heterogeneous agricultural environment using ensemble classifiers and single-date Sentinel-2A imagery. Geocarto Int. 2019, 36, 2141–2159. [Google Scholar] [CrossRef]
- Seydi, S.T.; Hasanlou, M.; Chanussot, J. DSMNN-Net: A Deep Siamese Morphological Neural Network Model for Burned Area Mapping Using Multispectral Sentinel-2 and Hyperspectral PRISMA Images. Remote Sens. 2021, 13, 5138. [Google Scholar] [CrossRef]
- Koirala, A.; Walsh, K.B.; Wang, Z.; McCarthy, C. Deep learning–Method overview and review of use for fruit detection and yield estimation. Comput. Electron. Agric. 2019, 162, 219–234. [Google Scholar] [CrossRef]
- Wan, X.; Zhao, C.; Wang, Y.; Liu, W. Stacked sparse autoencoder in hyperspectral data classification using spectral-spatial, higher order statistics and multifractal spectrum features. Infrared Phys. Technol. 2017, 86, 77–89. [Google Scholar] [CrossRef]
- Zhong, L.; Hu, L.; Zhou, H. Deep learning based multi-temporal crop classification. Remote Sens. Environ. 2019, 221, 430–443. [Google Scholar] [CrossRef]
- Bhosle, K.; Musande, V. Evaluation of CNN model by comparing with convolutional autoencoder and deep neural network for crop classification on hyperspectral imagery. Geocarto Int. 2020, 1–15. [Google Scholar] [CrossRef]
- Ji, S.; Zhang, C.; Xu, A.; Shi, Y.; Duan, Y. 3D convolutional neural networks for crop classification with multi-temporal remote sensing images. Remote Sens. 2018, 10, 75. [Google Scholar] [CrossRef] [Green Version]
- Li, Z.; Chen, G.; Zhang, T. A CNN-Transformer Hybrid Approach for Crop Classification Using Multitemporal Multisensor Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 847–858. [Google Scholar] [CrossRef]
- Mazzia, V.; Khaliq, A.; Chiaberge, M. Improvement in land cover and crop classification based on temporal features learning from Sentinel-2 data using recurrent-convolutional neural network (R-CNN). Appl. Sci. 2020, 10, 238. [Google Scholar] [CrossRef] [Green Version]
- Yang, S.; Gu, L.; Li, X.; Jiang, T.; Ren, R. Crop classification method based on optimal feature selection and hybrid CNN-RF networks for multi-temporal remote sensing imagery. Remote Sens. 2020, 12, 3119. [Google Scholar] [CrossRef]
- Zhao, H.; Duan, S.; Liu, J.; Sun, L.; Reymondin, L. Evaluation of Five Deep Learning Models for Crop Type Mapping Using Sentinel-2 Time Series Images with Missing Information. Remote Sens. 2021, 13, 2790. [Google Scholar] [CrossRef]
- Akbari, E.; Darvishi Boloorani, A.; Neysani Samany, N.; Hamzeh, S.; Soufizadeh, S.; Pignatti, S. Crop mapping using random forest and particle swarm optimization based on multi-temporal Sentinel-2. Remote Sens. 2020, 12, 1449. [Google Scholar] [CrossRef]
- Asgarian, A.; Soffianian, A.; Pourmanafi, S. Crop type mapping in a highly fragmented and heterogeneous agricultural landscape: A case of central Iran using multi-temporal Landsat 8 imagery. Comput. Electron. Agric. 2016, 127, 531–540. [Google Scholar] [CrossRef]
- Saadat, M.; Hasanlou, M.; Homayouni, S. Rice Crop Mapping Using SENTINEL-1 Time Series Images (case Study: Mazandaran, Iran). Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 897–904. [Google Scholar] [CrossRef] [Green Version]
- Rezaei, E.E.; Ghazaryan, G.; Moradi, R.; Dubovyk, O.; Siebert, S. Crop harvested area, not yield, drives variability in crop production in Iran. Environ. Res. Lett. 2021, 16, 064058. [Google Scholar] [CrossRef]
- Maghrebi, M.; Noori, R.; Bhattarai, R.; Mundher Yaseen, Z.; Tang, Q.; Al-Ansari, N.; Danandeh Mehr, A.; Karbassi, A.; Omidvar, J.; Farnoush, H. Iran’s Agriculture in the Anthropocene. Earth’s Future 2020, 8, e2020EF001547. [Google Scholar] [CrossRef]
- Karandish, F. Socioeconomic benefits of conserving Iran’s water resources through modifying agricultural practices and water management strategies. Ambio 2021, 50, 1824–1840. [Google Scholar] [CrossRef]
- Momm, H.G.; ElKadiri, R.; Porter, W. Crop-type classification for long-term modeling: An integrated remote sensing and machine learning approach. Remote Sens. 2020, 12, 449. [Google Scholar] [CrossRef] [Green Version]
- Boali, H.; Asgari, H.; Mohammadian Behbahani, A.; Salmanmahiny, A.; Naimi, B. Provide early desertification warning system based on climate and groundwater criteria (Study area: Aq Qala and Gomishan counties). Geogr. Dev. Iran. J. 2021, 19, 285–306. [Google Scholar] [CrossRef]
- Nasrollahi, N.; Kazemi, H.; Kamkar, B. Feasibility of ley-farming system performance in a semi-arid region using spatial analysis. Ecol. Indic. 2017, 72, 239–248. [Google Scholar] [CrossRef]
- Seydi, S.T.; Akhoondzadeh, M.; Amani, M.; Mahdavi, S. Wildfire damage assessment over Australia using sentinel-2 imagery and MODIS land cover product within the google earth engine cloud platform. Remote Sens. 2021, 13, 220. [Google Scholar] [CrossRef]
- Pan, L.; Xia, H.; Zhao, X.; Guo, Y.; Qin, Y. Mapping winter crops using a phenology algorithm, time-series Sentinel-2 and Landsat-7/8 images, and Google Earth Engine. Remote Sens. 2021, 13, 2510. [Google Scholar] [CrossRef]
- Lambert, M.-J.; Traoré, P.C.S.; Blaes, X.; Baret, P.; Defourny, P. Estimating smallholder crops production at village level from Sentinel-2 time series in Mali’s cotton belt. Remote Sens. Environ. 2018, 216, 647–657. [Google Scholar] [CrossRef]
- Morais, C.L.; Santos, M.C.; Lima, K.M.; Martin, F.L. Improving data splitting for classification applications in spectrochemical analyses employing a random-mutation Kennard-Stone algorithm approach. Bioinformatics 2019, 35, 5257–5263. [Google Scholar] [CrossRef] [PubMed]
- Butcher, B.; Smith, B.J. Feature Engineering and Selection: A Practical Approach for Predictive Models; Kuhn, M., Johnson, K., Eds.; Chapman & Hall/CRC Press: Boca Raton, FL, USA, 2019; ISBN 978-1-13-807922-9. [Google Scholar]
- Ghorbanian, A.; Kakooei, M.; Amani, M.; Mahdavi, S.; Mohammadzadeh, A.; Hasanlou, M. Improved land cover map of Iran using Sentinel imagery within Google Earth Engine and a novel automatic workflow for land cover classification using migrated training samples. ISPRS J. Photogramm. Remote Sens. 2020, 167, 276–288. [Google Scholar] [CrossRef]
- Ghorbanian, A.; Zaghian, S.; Asiyabi, R.M.; Amani, M.; Mohammadzadeh, A.; Jamali, S. Mangrove ecosystem mapping using Sentinel-1 and Sentinel-2 satellite images and random forest algorithm in Google Earth Engine. Remote Sens. 2021, 13, 2565. [Google Scholar] [CrossRef]
- Main-Knorn, M.; Pflug, B.; Louis, J.; Debaecker, V.; Müller-Wilm, U.; Gascon, F. Sen2Cor for sentinel-2. In Proceedings of the Image and Signal Processing for Remote Sensing XXIII, Warsaw, Poland, 11–14 September 2017; p. 1042704. [Google Scholar]
- Pettorelli, N. The Normalized Difference Vegetation Index; Oxford University Press: Oxford, UK, 2013. [Google Scholar]
- Townshend, J.R.; Justice, C. Analysis of the dynamics of African vegetation using the normalized difference vegetation index. Int. J. Remote Sens. 1986, 7, 1435–1445. [Google Scholar] [CrossRef]
- Wold, S.; Esbensen, K.; Geladi, P. Principal component analysis. Chemom. Intell. Lab. Syst. 1987, 2, 37–52. [Google Scholar] [CrossRef]
- Jakubauskas, M.E.; Legates, D.R.; Kastens, J.H. Crop identification using harmonic analysis of time-series AVHRR NDVI data. Comput. Electron. Agric. 2002, 37, 127–139. [Google Scholar] [CrossRef]
- Pan, Z.; Huang, J.; Zhou, Q.; Wang, L.; Cheng, Y.; Zhang, H.; Blackburn, G.A.; Yan, J.; Liu, J. Mapping crop phenology using NDVI time-series derived from HJ-1 A/B data. Int. J. Appl. Earth Obs. Geoinf. 2015, 34, 188–197. [Google Scholar] [CrossRef] [Green Version]
- Skakun, S.; Franch, B.; Vermote, E.; Roger, J.-C.; Becker-Reshef, I.; Justice, C.; Kussul, N. Early season large-area winter crop mapping using MODIS NDVI data, growing degree days information and a Gaussian mixture model. Remote Sens. Environ. 2017, 195, 244–258. [Google Scholar] [CrossRef]
- Wardlow, B.D.; Egbert, S.L. Large-area crop mapping using time-series MODIS 250 m NDVI data: An assessment for the US Central Great Plains. Remote Sens. Environ. 2008, 112, 1096–1116. [Google Scholar] [CrossRef]
- Li, F.; Ren, J.; Wu, S.; Zhao, H.; Zhang, N. Comparison of regional winter wheat mapping results from different similarity measurement indicators of NDVI time series and their optimized thresholds. Remote Sens. 2021, 13, 1162. [Google Scholar] [CrossRef]
- Wu, Z.; Liu, Y.; Han, Y.; Zhou, J.; Liu, J.; Wu, J. Mapping farmland soil organic carbon density in plains with combined cropping system extracted from NDVI time-series data. Sci. Total Environ. 2021, 754, 142120. [Google Scholar] [CrossRef]
- Ghaffarian, S.; Valente, J.; Van Der Voort, M.; Tekinerdogan, B. Effect of Attention Mechanism in Deep Learning-Based Remote Sensing Image Processing: A Systematic Literature Review. Remote Sens. 2021, 13, 2965. [Google Scholar] [CrossRef]
- Li, M.; Wang, Y.; Wang, Z.; Zheng, H. A deep learning method based on an attention mechanism for wireless network traffic prediction. Ad Hoc Netw. 2020, 107, 102258. [Google Scholar] [CrossRef]
- Li, X.; Zhang, W.; Ding, Q. Understanding and improving deep learning-based rolling bearing fault diagnosis with attention mechanism. Signal Process. 2019, 161, 136–154. [Google Scholar] [CrossRef]
- Niu, Z.; Zhong, G.; Yu, H. A review on the attention mechanism of deep learning. Neurocomputing 2021, 452, 48–62. [Google Scholar] [CrossRef]
- Woo, S.; Park, J.; Lee, J.-Y.; Kweon, I.S. Cbam: Convolutional block attention module. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 3–19. [Google Scholar]
- Huang, G.; Zhu, J.; Li, J.; Wang, Z.; Cheng, L.; Liu, L.; Li, H.; Zhou, J. Channel-attention U-Net: Channel attention mechanism for semantic segmentation of esophagus and esophageal cancer. IEEE Access. 2020, 8, 122798–122810. [Google Scholar] [CrossRef]
- Li, H.; Qiu, K.; Chen, L.; Mei, X.; Hong, L.; Tao, C. SCAttNet: Semantic segmentation network with spatial and channel attention mechanism for high-resolution remote sensing images. IEEE Geosci. Remote Sens. Lett. 2020, 18, 905–909. [Google Scholar] [CrossRef]
- Tong, W.; Chen, W.; Han, W.; Li, X.; Wang, L. Channel-attention-based DenseNet network for remote sensing image scene classification. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 4121–4132. [Google Scholar] [CrossRef]
- Zhou, T.; Canu, S.; Ruan, S. Automatic COVID-19 CT segmentation using U-Net integrated spatial and channel attention mechanism. Int. J. Imaging Syst. Technol. 2021, 31, 16–27. [Google Scholar] [CrossRef]
- Mohanty, A.; Gitelman, D.R.; Small, D.M.; Mesulam, M.M. The spatial attention network interacts with limbic and monoaminergic systems to modulate motivation-induced attention shifts. Cereb. Cortex 2008, 18, 2604–2613. [Google Scholar] [CrossRef] [Green Version]
- Mou, L.; Zhao, Y.; Chen, L.; Cheng, J.; Gu, Z.; Hao, H.; Qi, H.; Zheng, Y.; Frangi, A.; Liu, J. CS-Net: Channel and spatial attention network for curvilinear structure segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Shenzhen, China, 13–17 October 2019; pp. 721–730. [Google Scholar]
- Sun, H.; Zheng, X.; Lu, X.; Wu, S. Spectral–spatial attention network for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2019, 58, 3232–3245. [Google Scholar] [CrossRef]
- Seydi, S.T.; Hasanlou, M.; Amani, M. A new end-to-end multi-dimensional CNN framework for land cover/land use change detection in multi-source remote sensing datasets. Remote Sens. 2020, 12, 2010. [Google Scholar] [CrossRef]
- Seydi, S.T.; Hasanlou, M. A New Structure for Binary and Multiple Hyperspectral Change Detection Based on Spectral Unmixing and Convolutional Neural Network. Measurement 2021, 186, 110137. [Google Scholar] [CrossRef]
- Dobrinić, D.; Medak, D.; Gašparović, M. Integration of Multitemporal SENTINEL-1 and SENTINEL-2 Imagery for Land-Cover Classification Using Machine Learning Methods. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, 43, 91–98. [Google Scholar] [CrossRef]
- Zhang, W.; Liu, H.; Wu, W.; Zhan, L.; Wei, J. Mapping rice paddy based on machine learning with Sentinel-2 multi-temporal data: Model comparison and transferability. Remote Sens. 2020, 12, 1620. [Google Scholar] [CrossRef]
- Tatsumi, K.; Yamashiki, Y.; Torres, M.A.C.; Taipe, C.L.R. Crop classification of upland fields using Random forest of time-series Landsat 7 ETM+ data. Comput. Electron. Agric. 2015, 115, 171–179. [Google Scholar] [CrossRef]
- Chuc, M.D.; Anh, N.H.; Thuy, N.T.; Hung, B.Q.; Thanh, N.T.N. Paddy rice mapping in red river delta region using landsat 8 images: Preliminary results. In Proceedings of the 2017 9th International Conference on Knowledge and Systems Engineering (KSE), Hue, Vietnam, 19–21 October 2017; pp. 209–214. [Google Scholar]
- Naboureh, A.; Ebrahimy, H.; Azadbakht, M.; Bian, J.; Amani, M. RUESVMs: An Ensemble Method to Handle the Class Imbalance Problem in Land Cover Mapping Using Google Earth Engine. Remote Sens. 2020, 12, 3484. [Google Scholar] [CrossRef]
- Naboureh, A.; Li, A.; Bian, J.; Lei, G.; Amani, M. A hybrid data balancing method for classification of imbalanced training data within google earth engine: Case studies from mountainous regions. Remote Sens. 2020, 12, 3301. [Google Scholar] [CrossRef]
- Xu, J.; Yang, J.; Xiong, X.; Li, H.; Huang, J.; Ting, K.; Ying, Y.; Lin, T. Towards interpreting multi-temporal deep learning models in crop mapping. Remote Sens. Environ. 2021, 264, 112599. [Google Scholar] [CrossRef]
- Wei, S.; Zhang, H.; Wang, C.; Wang, Y.; Xu, L. Multi-temporal SAR data large-scale crop mapping based on U-Net model. Remote Sens. 2019, 11, 68. [Google Scholar] [CrossRef] [Green Version]
- Tamiminia, H.; Homayouni, S.; McNairn, H.; Safari, A. A particle swarm optimized kernel-based clustering method for crop mapping from multi-temporal polarimetric L-band SAR observations. Int. J. Appl. Earth Obs. Geoinf. 2017, 58, 201–212. [Google Scholar] [CrossRef]
- Hamidi, M.; Safari, A.; Homayouni, S. An auto-encoder based classifier for crop mapping from multitemporal multispectral imagery. Int. J. Remote Sens. 2021, 42, 986–1016. [Google Scholar] [CrossRef]
- Kwak, G.-H.; Park, N.-W. Two-stage Deep Learning Model with LSTM-based Autoencoder and CNN for Crop Classification Using Multi-temporal Remote Sensing Images. Korean J. Remote Sens. 2021, 37, 719–731. [Google Scholar]
- Virnodkar, S.; Pachghare, V.K.; Murade, S. A Technique to Classify Sugarcane Crop from Sentinel-2 Satellite Imagery Using U-Net Architecture. In Progress in Advanced Computing and Intelligent Engineering; Springer: Berlin/Heidelberg, Germany, 2021; pp. 322–330. [Google Scholar]
Data | Date | Description |
---|---|---|
Dataset–Time-1 | November 2017 | The first two weeks |
Dataset–Time-2 | November 2017 | The second two weeks |
Dataset–Time-3 | December 2017 | The first two weeks |
Dataset–Time-4 | December 2017 | The second two weeks |
Dataset–Time-5 | January 2018 | The first two weeks |
Dataset–Time-6 | January 2018 | The second two weeks |
Dataset–Time-7 | February 2018 | The first two weeks; highly cloudy, not used |
Dataset–Time-8 | February 2018 | The second two weeks |
Dataset–Time-9 | March 2018 | The first two weeks |
Dataset–Time-10 | March 2018 | The second two weeks; highly cloudy, not used |
Dataset–Time-11 | April 2018 | The first two weeks |
Dataset–Time-12 | April 2018 | The second two weeks |
Dataset–Time-13 | May 2018 | The first two weeks |
Dataset–Time-14 | May 2018 | The second two weeks |
Dataset–Time-15 | June 2018 | The first two weeks |
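The usable biweekly composites above (two cloud-affected periods excluded) feed the time-series NDVI described in Section 3.1. The snippet below is a minimal sketch of that calculation, assuming Sentinel-2 band B4 (red) and B8 (NIR) reflectance arrays; array shapes and band ordering are illustrative only.

```python
# Illustrative time-series NDVI computation; shapes and band layout
# are assumptions, not taken from the paper.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red); eps guards against division by zero."""
    return (nir - red) / (nir + red + eps)

# Toy stack of 13 usable biweekly composites: (time, height, width, 2),
# where band 0 = B4 (red) and band 1 = B8 (NIR).
composites = np.random.rand(13, 64, 64, 2).astype(np.float32)
ndvi_series = np.stack([ndvi(c[..., 0], c[..., 1]) for c in composites])
print(ndvi_series.shape)   # (13, 64, 64) time-series NDVI cube
```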
ID | Crop Type | All Samples | Training (3%) | Validation (0.1%) | Test (96.9%) |
---|---|---|---|---|---|
1 | Arboretum | 9336 | 306 | 67 | 8963 |
2 | Agricultural-Vegetable | 1618 | 53 | 12 | 1553 |
3 | Broad Bean | 71 | 3 | 1 | 67 |
4 | Barren | 58,604 | 1922 | 422 | 56,260 |
5 | Built-Up | 43,252 | 1419 | 311 | 41,522 |
6 | Barley | 17,363 | 569 | 125 | 16,669 |
7 | Water | 5813 | 191 | 42 | 5580 |
8 | Wheat | 58,701 | 1925 | 423 | 56,353 |
9 | Canola | 8282 | 271 | 60 | 7951 |
10 | Alfalfa | 17,995 | 590 | 130 | 17,275 |
Total | | 221,035 | 7249 | 1593 | 212,193 |
Model | Parameter Settings |
---|---|
RF | number of estimators = 105, number of features considered at each node split = 3 |
XGBoost | number of rounds = 500, max. depth = 5, subsample = 1, min. child weight = 1, lambda = 1, colsample bytree = 0.8 |
Deep Learning Models | dropout rate = 0.1, epochs = 500, initial learning rate = 10^−4, mini-batch size = 550, weight initializer = He normal |
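For context, the tabulated settings map onto common library calls roughly as sketched below; the library choices (scikit-learn, XGBoost) and argument names are assumptions, since the paper does not state its implementation.

```python
# Hedged mapping of the tabulated hyper-parameters onto common libraries;
# the authors' actual implementation is not specified in the text.
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBClassifier

rf = RandomForestClassifier(
    n_estimators=105,      # number of estimators
    max_features=3,        # features considered at each node split
)

xgb_clf = XGBClassifier(
    n_estimators=500,      # number of boosting rounds
    max_depth=5,
    subsample=1.0,
    min_child_weight=1,
    reg_lambda=1.0,        # lambda
    colsample_bytree=0.8,
)

# Deep learning models (per the table): dropout = 0.1, 500 epochs,
# initial learning rate = 1e-4, mini-batch size = 550, He-normal initialization.
```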
Method | Index | Arboretum | Agricultural-Vegetable | Broad-Bean | Barren | Built-Up | Barley | Water | Wheat | Canola | Alfalfa |
---|---|---|---|---|---|---|---|---|---|---|---|
RF | PA | 98.63 | 18.19 | 0.00 | 86.53 | 67.47 | 57.93 | 26.62 | 90.97 | 69.89 | 98.34 |
UA | 40.08 | 20.61 | 0.00 | 69.51 | 85.62 | 78.88 | 88.62 | 76.61 | 44.17 | 75.21 | |
OE | 1.37 | 81.81 | 100 | 13.47 | 32.53 | 42.07 | 73.38 | 9.03 | 30.11 | 1.66 | |
CE | 59.92 | 79.39 | 100 | 30.49 | 14.38 | 21.12 | 11.38 | 23.39 | 55.83 | 24.79 | |
OA | 73.68 | ||||||||||
KC | 0.678 | ||||||||||
XGBOOST | PA | 84.85 | 94.99 | 87.50 | 81.86 | 84.74 | 95.97 | 98.50 | 89.86 | 92.26 | 94.13 |
UA | 71.74 | 56.15 | 10.29 | 86.80 | 85.24 | 83.09 | 92.90 | 94.93 | 76.32 | 89.69 | |
OE | 15.15 | 5.01 | 12.50 | 18.14 | 15.26 | 4.03 | 1.50 | 10.14 | 7.74 | 5.87 | |
CE | 28.26 | 43.85 | 89.71 | 13.20 | 14.76 | 16.91 | 7.10 | 5.07 | 23.68 | 10.31 | |
OA | 87.48 | ||||||||||
KC | 0.844 | ||||||||||
R-CNN | PA | 82.95 | 70.50 | 0.00 | 79.97 | 87.90 | 90.14 | 96.49 | 86.54 | 78.68 | 86.05 |
UA | 62.09 | 52.90 | 0.00 | 85.78 | 77.22 | 79.35 | 91.77 | 94.19 | 77.77 | 91.22 | |
OE | 17.05 | 29.50 | 0.00 | 20.03 | 12.10 | 9.86 | 3.51 | 13.46 | 21.32 | 13.95 | |
CE | 37.91 | 47.10 | 100 | 14.22 | 22.78 | 20.65 | 8.23 | 5.81 | 22.23 | 8.78 | |
OA | 84.87 | ||||||||||
KC | 0.810 | ||||||||||
2D-CNN | PA | 94.31 | 93.80 | 0.00 | 95.86 | 97.40 | 98.30 | 99.73 | 95.16 | 90.99 | 92.62 |
UA | 81.22 | 78.98 | 0.00 | 96.22 | 97.64 | 91.54 | 99.78 | 96.28 | 84.97 | 98.57 | |
OE | 5.69 | 6.20 | 0.00 | 4.14 | 2.60 | 1.70 | 0.27 | 4.84 | 9.01 | 7.38 | |
CE | 18.78 | 24.02 | 100 | 3.78 | 2.36 | 8.46 | 0.22 | 1.72 | 15.03 | 1.43 | |
OA | 95.73 | ||||||||||
KC | 0.947 | ||||||||||
3D-CNN | PA | 93.09 | 97.61 | 100 | 96.87 | 98.62 | 97.97 | 99.61 | 98.09 | 92.52 | 97.77 |
UA | 94.20 | 89.44 | 77.94 | 97.68 | 97.69 | 95.60 | 99.84 | 98.54 | 91.76 | 98.66 | |
OE | 6.91 | 2.39 | 0.00 | 3.13 | 1.38 | 2.03 | 0.39 | 1.91 | 7.48 | 2.23 | |
CE | 5.80 | 10.56 | 22.06 | 2.32 | 2.31 | 4.40 | 0.16 | 1.46 | 8.24 | 1.34 | |
OA | 97.45 | ||||||||||
KC | 0.968 | ||||||||||
CBAM | PA | 95.65 | 96.30 | 78.48 | 97.96 | 97.91 | 97.71 | 99.28 | 97.77 | 96.31 | 96.23 |
UA | 92.98 | 93.75 | 91.18 | 96.55 | 98.84 | 96.44 | 99.07 | 98.81 | 92.67 | 99.72 | |
OE | 4.35 | 3.70 | 21.52 | 2.04 | 2.09 | 2.29 | 0.72 | 2.23 | 3.69 | 3.77 | |
CE | 7.02 | 6.25 | 8.82 | 3.45 | 1.16 | 3.56 | 0.93 | 1.19 | 7.33 | 0.28 | |
OA | 97.59 | ||||||||||
KC | 0.970 | ||||||||||
Proposed Method | PA | 94.64 | 95.63 | 100 | 98.77 | 98.50 | 98.76 | 99.82 | 99.02 | 94.46 | 99.46 |
UA | 96.47 | 95.75 | 76.47 | 98.46 | 99.33 | 97.14 | 99.82 | 98.73 | 96.42 | 99.15 | |
OE | 5.36 | 4.37 | 0.00 | 1.23 | 1.50 | 1.24 | 0.18 | 0.98 | 5.54 | 0.54 | |
CE | 3.53 | 4.25 | 23.53 | 1.54 | 0.67 | 2.86 | 0.18 | 1.27 | 3.58 | 0.85 | |
OA | 98.54 | ||||||||||
KC | 0.981 |
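All per-class indices in the tables (producer's accuracy PA, user's accuracy UA, omission error OE, commission error CE) and the overall measures (overall accuracy OA, kappa coefficient KC) derive from the confusion matrix. A compact sketch of these standard calculations (generic code, not taken from the paper) is given below.

```python
# Standard accuracy indices computed from a confusion matrix C,
# with rows = reference classes and columns = predicted classes.
import numpy as np

def accuracy_indices(C: np.ndarray):
    C = C.astype(float)
    total = C.sum()
    diag = np.diag(C)
    pa = diag / C.sum(axis=1)                    # producer's accuracy (per class)
    ua = diag / C.sum(axis=0)                    # user's accuracy (per class)
    oe, ce = 1.0 - pa, 1.0 - ua                  # omission / commission errors
    oa = diag.sum() / total                      # overall accuracy (OA)
    pe = (C.sum(axis=1) * C.sum(axis=0)).sum() / total ** 2
    kc = (oa - pe) / (1.0 - pe)                  # kappa coefficient (KC)
    return pa, ua, oe, ce, oa, kc
```

The confusion matrix itself can be built from the test samples with, e.g., sklearn.metrics.confusion_matrix(y_true, y_pred).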
Time | Index | Arboretum | Agricultural-Vegetable | Broad Bean | Barren | Built-Up | Barley | Water | Wheat | Canola | Alfalfa |
---|---|---|---|---|---|---|---|---|---|---|---|
Two months after planting | PA | 88.49 | 99.14 | 88.24 | 96.56 | 97.43 | 97.69 | 99.01 | 96.09 | 91.09 | 96.36 |
UA | 86.59 | 89.18 | 66.18 | 96.80 | 97.22 | 94.89 | 98.12 | 98.53 | 81.54 | 97.75 | |
OE | 11.51 | 0.86 | 11.76 | 3.44 | 2.57 | 2.31 | 0.99 | 3.91 | 8.91 | 3.65 | |
CE | 13.41 | 10.82 | 33.82 | 3.20 | 2.78 | 5.11 | 1.88 | 1.47 | 18.46 | 2.25 |
OA | 96.23 | ||||||||||
KC | 0.953 | ||||||||||
Three months after planting | PA | 94.68 | 98.23 | 84.09 | 96.56 | 98.48 | 98.85 | 99.71 | 96.69 | 91.75 | 98.59 |
UA | 90.90 | 89.31 | 54.41 | 98.07 | 98.04 | 95.31 | 98.98 | 98.83 | 86.58 | 96.68 | |
OE | 5.32 | 1.77 | 15.91 | 3.44 | 1.52 | 1.15 | 0.29 | 3.31 | 8.25 | 1.41 | |
CE | 9.10 | 10.69 | 45.59 | 1.93 | 1.96 | 4.69 | 1.02 | 1.17 | 13.42 | 3.32 | |
OA | 97.15 | ||||||||||
KC | 0.964 | ||||||||||
Four months after planting | PA | 94.45 | 99.17 | 97.44 | 98.10 | 98.65 | 98.28 | 99.82 | 97.86 | 94.19 | 98.70 |
UA | 95.0 | 91.82 | 55.88 | 97.83 | 98.90 | 96.32 | 99.75 | 98.75 | 92.99 | 99.04 | |
OE | 5.55 | 0.83 | 2.56 | 1.90 | 1.35 | 1.72 | 0.18 | 2.14 | 5.81 | 1.30 | |
CE | 4.96 | 8.18 | 44.12 | 2.17 | 1.10 | 3.68 | 0.25 | 1.25 | 7.01 | 0.96 |
OA | 97.96 | ||||||||||
KC | 0.974 | ||||||||||
Five months after planting | PA | 94.66 | 93.82 | 100 | 97.78 | 99.23 | 98.62 | 99.62 | 97.80 | 95.93 | 98.81 |
UA | 94.14 | 94.85 | 22.06 | 98.50 | 98.41 | 96.40 | 99.75 | 99.14 | 90.69 | 99.06 | |
OE | 5.34 | 6.18 | 0.00 | 2.22 | 0.77 | 1.38 | 0.38 | 2.20 | 4.07 | 1.19 | |
CE | 5.86 | 5.15 | 7.94 | 1.50 | 1.59 | 3.60 | 0.25 | 0.86 | 9.31 | 0.94 | |
OA | 98.04 | ||||||||||
KC | 0.975 | ||||||||||
Six months after planting | PA | 97.24 | 98.23 | 54.17 | 98.67 | 98.52 | 98.79 | 99.87 | 98.63 | 93.80 | 99.22 |
UA | 96.69 | 96.72 | 57.35 | 98.44 | 99.35 | 95.84 | 99.25 | 98.86 | 96.11 | 99.54 | |
OE | 2.76 | 1.77 | 45.83 | 1.33 | 1.48 | 1.21 | 0.13 | 1.37 | 6.20 | 0.78 | |
CE | 3.31 | 3.28 | 42.65 | 1.56 | 0.65 | 4.16 | 0.75 | 1.14 | 3.89 | 0.46 | |
OA | 98.45 | ||||||||||
KC | 0.980 | ||||||||||
Seven months after planting | PA | 94.64 | 95.63 | 100 | 98.77 | 98.50 | 98.76 | 99.82 | 99.02 | 94.46 | 99.46 |
UA | 96.47 | 95.75 | 76.47 | 98.46 | 99.33 | 97.14 | 99.82 | 98.73 | 96.42 | 99.15 | |
OE | 5.36 | 4.37 | 0.00 | 1.23 | 1.50 | 1.24 | 0.18 | 0.98 | 5.54 | 0.54 | |
CE | 3.53 | 4.25 | 23.53 | 1.54 | 0.67 | 2.86 | 0.18 | 1.27 | 3.58 | 0.85 | |
OA | 98.54 | ||||||||||
KC | 0.981 |
Reference | OA% | Method |
---|---|---|
Zhong, Hu, and Zhou [45] | 85.54 | Deep learning |
Xu et al. [98] | 98.3 | Deep learning |
Wei et al. [99] | 85.01 | Deep learning |
Tamiminia et al. [100] | 80.48 | Kernel-based clustering |
Hamidi et al. [101] | 95.26 | Deep learning |
Kwak and Park [102] | 96.37 | Deep learning |
Proposed | 98.54 | Deep learning |
Index | Value |
---|---|
OA | 98.50, 98.51, 98.52, 98.54, 98.40, 98.46, 98.48, 98.52, 98.47, 98.49 |
Mean | 98.49 |
Standard Deviation | ±0.04 |
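The reported mean and standard deviation follow directly from the ten tabulated OA values; a quick check (assuming the population standard deviation) reproduces them.

```python
# Quick check of the ablation statistics over the ten reported OA values.
import numpy as np

oa_runs = np.array([98.50, 98.51, 98.52, 98.54, 98.40,
                    98.46, 98.48, 98.52, 98.47, 98.49])
print(round(oa_runs.mean(), 2))   # 98.49
print(round(oa_runs.std(), 2))    # 0.04
```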
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Seydi, S.T.; Amani, M.; Ghorbanian, A. A Dual Attention Convolutional Neural Network for Crop Classification Using Time-Series Sentinel-2 Imagery. Remote Sens. 2022, 14, 498. https://doi.org/10.3390/rs14030498