Multisource High-Resolution Remote Sensing Image Vegetation Extraction with Comprehensive Multifeature Perception
Abstract
1. Introduction
- We constructed a convolutional network for vegetation extraction, MSICN, built around comprehensive perception of multiple features. Simplified dense connections and cross-attention mechanisms enhance information sharing and weighting across multiscale, multi-feature layers, enabling multi-feature representation, enhancement, fusion, and extraction of vegetation.
- Random forests were used to select vegetation index features, quantifying how these index features affect vegetation extraction accuracy across different data sources.
- The universality and generalization of the network across different data sources were verified.
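The feature-selection step in the second contribution can be sketched with a random forest's impurity-based importances. This is a minimal illustration on synthetic data: the actual training samples, candidate index set, forest hyperparameters, and the number of indices retained are assumptions, not taken from the paper.

```python
# Hypothetical sketch of random-forest ranking of vegetation-index features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

index_names = ["NDVI", "GNDVI", "EVI", "OSAVI", "RVI", "DVI",
               "TVI", "GVI", "GI", "NDGI", "MCARI", "TCARI"]

# Synthetic per-pixel feature matrix: 500 pixels x 12 candidate indices.
X = rng.normal(size=(500, len(index_names)))
# Make the vegetation label depend mostly on the first feature ("NDVI")
# so the forest has a clear signal to rank.
y = (X[:, 0] + 0.1 * rng.normal(size=500) > 0).astype(int)

forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Rank indices by mean decrease in impurity; keep the top-k as network inputs.
ranking = sorted(zip(index_names, forest.feature_importances_),
                 key=lambda t: -t[1])
top_k = [name for name, _ in ranking[:8]]  # k = 8 is an arbitrary choice here
print(top_k)
```

In practice the per-pixel index values would be computed from the imagery bands rather than drawn at random, and the retained subset would differ per data source, as the ablation in Section 5.2.1 examines.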
2. Related Work
2.1. Pixel-Based Inversion Method for Vegetation Extraction
2.2. Vegetation Index Feature Fusion Method
3. Materials and Methods
3.1. Vegetation Index Selection
3.2. Method
- (1) Simplified Dense Block
- (2) Dual-Path Multihead Cross-Attention Feature Fusion
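The idea behind the dual-path fusion module can be illustrated with a minimal single-head cross-attention sketch in numpy: each path's features attend to the other path's features, and the two attended outputs are fused. All shapes, projections, the single-head simplification, and concatenation-based fusion are assumptions for illustration; the paper's actual multihead module is not specified in this outline.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_feats, context_feats, d_k=32, seed=0):
    """query_feats attends to context_feats: (N_q, C) x (N_c, C) -> (N_q, d_k)."""
    rng = np.random.default_rng(seed)  # stand-in for learned weights
    C = query_feats.shape[1]
    W_q = rng.normal(scale=C ** -0.5, size=(C, d_k))
    W_k = rng.normal(scale=C ** -0.5, size=(C, d_k))
    W_v = rng.normal(scale=C ** -0.5, size=(C, d_k))
    Q, K, V = query_feats @ W_q, context_feats @ W_k, context_feats @ W_v
    attn = softmax(Q @ K.T / np.sqrt(d_k), axis=-1)  # (N_q, N_c) weights
    return attn @ V

# Dual path: spectral tokens attend to index tokens and vice versa,
# then the two attended outputs are fused (here simply by concatenation).
spectral = np.random.default_rng(1).normal(size=(64, 16))  # 64 tokens, 16 ch
indices = np.random.default_rng(2).normal(size=(64, 16))
fused = np.concatenate([cross_attention(spectral, indices),
                        cross_attention(indices, spectral)], axis=1)
print(fused.shape)  # (64, 64)
```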
4. Experiments and Results
4.1. Dataset Acquisition and Preprocessing
4.1.1. Study Area
4.1.2. Dataset Acquisition
4.1.3. Data Preprocessing
4.2. Experimental Settings
4.3. Evaluation Metrics
4.4. Experimental Results
5. Discussion and Analysis
5.1. Comparative Experimental Results Analysis
5.2. Ablation and Analysis
5.2.1. The Effectiveness of Vegetation Feature Selection
5.2.2. The Effectiveness of Spectral and Index Feature Fusion
5.2.3. The Role of Simplified Dense Connections and Dual-Path Cross-Attention Mechanisms in the Network
5.3. Universal Validation Analysis
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Wu, J.; Zhang, L.; Zhao, B.; Yang, N.; Gao, P. Remote sensing monitoring of vegetation and its resilience based on the critical slowdown model and GLASS LAI: A case study of the Three Gorges Reservoir area. Acta Ecol. Sin. 2023, 12, 1–12.
- Guo, J.; Xu, Q.; Zeng, Y.; Liu, Z.; Zhu, X.X. Nationwide urban tree canopy mapping and coverage assessment in Brazil from high-resolution remote sensing images using deep learning. ISPRS J. Photogramm. Remote Sens. 2023, 198, 1–15.
- Zhang, Z.; Liu, X.; Zhu, L.; Li, J.; Zhang, Y. Remote Sensing Extraction Method of Illicium verum Based on Functional Characteristics of Vegetation Canopy. Remote Sens. 2022, 14, 6248.
- Askam, E.; Nagisetty, R.M.; Crowley, J.; Bobst, A.L.; Shaw, G.; Fortune, J. Satellite and sUAS Multispectral Remote Sensing Analysis of Vegetation Response to Beaver Mimicry Restoration on Blacktail Creek, Southwest Montana. Remote Sens. 2022, 14, 6199.
- Zhang, Y.; Wang, H.; Li, H.; Sun, J.; Liu, H.; Yin, Y. Optimization Model of Signal-to-Noise Ratio for a Typical Polarization Multispectral Imaging Remote Sensor. Sensors 2022, 22, 6624.
- Wang, R.; Shi, F.; Xu, D. The Extraction Method of Alfalfa (Medicago sativa L.) Mapping Using Different Remote Sensing Data Sources Based on Vegetation Growth Properties. Land 2022, 11, 1996.
- Zhang, J.; Li, J.; Wang, X.; Wu, P.; Liu, X.; Ji, Y. Remote sensing imaging: A useful method for assessing wetland vegetation evolution processes in the Nanjishan Wetland National Nature Reserve, Lake Poyang. IOP Conf. Ser. Earth Environ. Sci. 2019, 349, 012011.
- Moesinger, L.; Zotta, R.-M.; van der Schalie, R.; Scanlon, T.; de Jeu, R.; Dorigo, W. Monitoring vegetation condition using microwave remote sensing: The standardized vegetation optical depth index (SVODI). Biogeosciences 2022, 19, 5107–5123.
- El-Mezouar, M.C.; Taleb, N.; Kpalma, K.; Ronsin, J. Vegetation extraction from IKONOS imagery using high spatial resolution index. J. Appl. Remote Sens. 2011, 5, 053543.
- Yao, F.; Luo, J.; Shen, Z.; Dong, D.; Yang, K. Automatic urban vegetation extraction method using high resolution imagery. J. Geo Inf. Sci. 2018, 18, 248–254.
- Li, W.; Fu, H.; Yu, L.; Gong, P.; Feng, D.; Li, C.; Clinton, N. Stacked Autoencoder-based deep learning for remote-sensing image classification: A case study of African land-cover mapping. Int. J. Remote Sens. 2016, 37, 5632–5646.
- Feng, S.; Zhao, J.; Liu, T.; Zhang, H.; Zhang, Z.; Guo, X. Crop Type Identification and Mapping Using Machine Learning Algorithms and Sentinel-2 Time Series Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 3295–3306.
- Xi, Y.; Ren, C.; Tian, Q.; Ren, Y.; Dong, X.; Zhang, Z. Exploitation of Time Series Sentinel-2 Data and Different Machine Learning Algorithms for Detailed Tree Species Classification. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 7589–7603.
- Gašparović, M.; Dobrinić, D. Comparative Assessment of Machine Learning Methods for Urban Vegetation Mapping Using Multitemporal Sentinel-1 Imagery. Remote Sens. 2020, 12, 1952.
- Hoeser, T.; Kuenzer, C. Object Detection and Image Segmentation with Deep Learning on Earth Observation Data: A Review-Part I: Evolution and Recent Trends. Remote Sens. 2020, 12, 1667.
- He, H.; Qian, H.; Xie, L.; Duan, P. Interchange recognition method based on CNN. Acta Geod. Cartogr. Sin. 2018, 47, 385.
- Barbosa, A.; Trevisan, R.; Hovakimyan, N.; Martin, N.F. Modeling yield response to crop management using convolutional neural networks. Comput. Electron. Agric. 2020, 170, 105197.
- Fricker, G.A.; Ventura, J.D.; Wolf, J.A.; North, M.P.; Davis, F.W.; Franklin, J. A Convolutional Neural Network Classifier Identifies Tree Species in Mixed-Conifer Forest from Hyperspectral Imagery. Remote Sens. 2019, 11, 2326.
- Torres, D.L.; Feitosa, R.Q.; Happ, P.N.; La Rosa, L.E.C.; Junior, J.M.; Martins, J.; Bressan, P.O.; Gonçalves, W.N.; Liesenberg, V. Applying Fully Convolutional Architectures for Semantic Segmentation of a Single Tree Species in Urban Environment on High Resolution UAV Optical Imagery. Sensors 2020, 20, 563.
- Freudenberg, M.; Nölke, N.; Agostini, A.; Urban, K.; Wörgötter, F.; Kleinn, C. Large Scale Palm Tree Detection in High Resolution Satellite Images Using U-Net. Remote Sens. 2019, 11, 312.
- López-Jiménez, E.; Vasquez-Gomez, J.I.; Sanchez-Acevedo, M.A.; Herrera-Lozada, J.C.; Uriarte-Arcia, A.V. Columnar cactus recognition in aerial images using a deep learning approach. Ecol. Inform. 2019, 52, 131–138.
- Liu, R.; Lehman, J.; Molino, P.; Petroski Such, F.; Frank, E.; Sergeev, A.; Yosinski, J. An intriguing failing of convolutional neural networks and the CoordConv solution. In Proceedings of the 32nd Annual Conference on Neural Information Processing Systems (NeurIPS 2018), Montréal, QC, Canada, 2–8 December 2018; pp. 9628–9639.
- Zou, J.; Dado, W.T.; Pan, R. Early Crop Type Image Segmentation from Satellite Radar Imagery. 2020. Available online: https://api.semanticscholar.org/corpusid:234353421 (accessed on 25 December 2023).
- Mo, Y.; Wu, Y.; Yang, X.; Liu, F.; Liao, Y. Review the state-of-the-art technologies of semantic segmentation based on deep learning. Neurocomputing 2022, 493, 626–646.
- Rußwurm, M.; Körner, M. Multi-Temporal Land Cover Classification with Sequential Recurrent Encoders. ISPRS Int. J. Geo-Inf. 2018, 7, 129.
- Sun, W.; Wang, R. Fully Convolutional Networks for Semantic Segmentation of Very High Resolution Remotely Sensed Images Combined With DSM. IEEE Geosci. Remote Sens. Lett. 2018, 15, 474–478.
- Kattenborn, T.; Leitloff, J.; Schiefer, F.; Hinz, S. Review on Convolutional Neural Networks (CNN) in vegetation remote sensing. ISPRS J. Photogramm. Remote Sens. 2021, 173, 24–49.
- Jin, X.; Bagavathiannan, M.; Maity, A.; Chen, Y.; Yu, J. Deep learning for detecting herbicide weed control spectrum in turfgrass. Plant Methods 2022, 18, 1–11.
- de Camargo, T.; Schirrmann, M.; Landwehr, N.; Dammer, K.-H.; Pflanz, M. Optimized Deep Learning Model as a Basis for Fast UAV Mapping of Weed Species in Winter Wheat Crops. Remote Sens. 2021, 13, 1704.
- Jegou, S.; Drozdzal, M.; Vazquez, D.; Romero, A.; Bengio, Y. The one hundred layers tiramisu: Fully convolutional DenseNets for semantic segmentation. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Honolulu, HI, USA, 21–26 July 2017; pp. 1175–1183.
- Xu, Z.; Zhou, Y.; Wang, S.; Wang, L.; Li, F.; Wang, S.; Wang, Z. A Novel Intelligent Classification Method for Urban Green Space Based on High-Resolution Remote Sensing Images. Remote Sens. 2020, 12, 3845.
- Chen, L.-C.; Hermans, A.; Papandreou, G.; Schroff, F.; Wang, P.; Adam, H. MaskLab: Instance Segmentation by Refining Object Detection with Semantic and Direction Features. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 4013–4022.
- Mazzia, V.; Khaliq, A.; Chiaberge, M. Improvement in Land Cover and Crop Classification based on Temporal Features Learning from Sentinel-2 Data Using Recurrent-Convolutional Neural Network (R-CNN). Appl. Sci. 2019, 10, 238.
- Chen, S.T.; Yu, H. High resolution remote sensing image classification based on multi-scale and multi-feature fusion. Chin. J. Quantum Electron. 2016, 33, 420–426.
- Radke, D.; Radke, D.; Radke, J. Beyond measurement: Extracting vegetation height from high resolution imagery with deep learning. Remote Sens. 2020, 12, 3797.
- Woo, S.; Park, J.; Lee, J.Y.; Kweon, I.S. CBAM: Convolutional block attention module. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 3–19.
- Kalita, I.; Kumar, R.N.S.; Roy, M. Deep learning-based cross-sensor domain adaptation under active learning for land cover classification. IEEE Geosci. Remote Sens. Lett. 2021, 19, 1–5.
- Dobrinić, D.; Gašparović, M.; Medak, D. Evaluation of Feature Selection Methods for Vegetation Mapping Using Multitemporal Sentinel Imagery. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2022, 43, 485–491.
- Luo, C.; Meng, S.; Hu, X.; Wang, X.; Zhong, Y. CropNet: Deep spatial-temporal-spectral feature learning network for crop classification from time-series multi-spectral images. In Proceedings of the IGARSS 2020—2020 IEEE International Geoscience and Remote Sensing Symposium, Waikoloa, HI, USA, 26 September–2 October 2020; pp. 4187–4190.
- Ju, Y.; Bohrer, G. Classification of wetland vegetation based on NDVI time series from the HLS dataset. Remote Sens. 2022, 14, 2107.
- Fang, S.; Li, K.; Shao, J.; Li, Z. SNUNet-CD: A densely connected Siamese network for change detection of VHR images. IEEE Geosci. Remote Sens. Lett. 2021, 19, 1–5.
- Liu, Y.; Pang, C.; Zhan, Z.; Zhang, X.; Yang, X. Building change detection for remote sensing images using a dual-task constrained deep siamese convolutional network model. IEEE Geosci. Remote Sens. Lett. 2020, 18, 811–815.
- Huang, Z.; Wang, X.; Huang, L.; Huang, C.; Wei, Y.; Liu, W. CCNet: Criss-cross attention for semantic segmentation. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Republic of Korea, 27 October–2 November 2019; pp. 603–612.
- Lee, P.H.; Yu, P.L. Distance-based tree models for ranking data. Comput. Stat. Data Anal. 2010, 54, 1672–1682.
- Zeng, F.; Yang, B.; Zhao, M.; Xing, Y.; Ma, Y. MASANet: Multi-Angle Self-Attention Network for Semantic Segmentation of Remote Sensing Images. Teh. Vjesn. 2022, 29, 1567–1575.
- Abraham, A.; Pedregosa, F.; Eickenberg, M.; Gervais, P.; Mueller, A.; Kossaifi, J.; Gramfort, A.; Thirion, B.; Varoquaux, G. Machine learning for neuroimaging with scikit-learn. Front. Neurosci. 2014, 8, 14.
- Wójcik-Gront, E.; Gozdowski, D.; Stępień, W. UAV-Derived Spectral Indices for the Evaluation of the Condition of Rye in Long-Term Field Experiments. Agriculture 2022, 12, 1671.
- Bernardes, T.; Moreira, M.A.; Adami, M.; Giarolla, A.; Rudorff, B.F.T. Monitoring Biennial Bearing Effect on Coffee Yield Using MODIS Remote Sensing Imagery. Remote Sens. 2012, 4, 2492–2509.
- Manna, S.; Mondal, P.P.; Mukhopadhyay, A.; Akhand, A.; Harza, S.; Mitra, D. Vegetation cover change analysis from multi-temporal satellite data in Jharkhali Island, Sundarbans, India. Indian J. Geo Mar. Sci. 2013, 42, 331–342.
- Szigarski, C.; Jagdhuber, T.; Baur, M.; Thiel, C.; Parrens, M.; Wigneron, J.-P.; Piles, M.; Entekhabi, D. Analysis of the Radar Vegetation Index and Potential Improvements. Remote Sens. 2018, 10, 1776.
- Naji, T.A.H. Study of vegetation cover distribution using DVI, PVI, WDVI indices with 2D-space plot. J. Phys. Conf. Ser. 2018, 1003, 012083.
- Meivel, S.; Maheswari, S. Remote Sensing Analysis of Agricultural Drone. J. Indian Soc. Remote Sens. 2020, 49, 689–701.
- Suppakittpaisarn, P.; Jiang, B.; Slavenas, M.; Sullivan, W.C. Does density of green infrastructure predict preference? Urban For. Urban Green. 2018, 40, 236–244.
- Nedkov, R. Normalized differential greenness index for vegetation dynamics assessment. Comptes Rendus L'Academie Bulg. Sci. 2017, 70, 1143–1146.
- Meng, Q.Y.; Dong, H.; Qin, Q.M.; Wang, J.L.; Zhao, J.H. MTCARI: A kind of vegetation index monitoring vegetation leaf chlorophyll content based on hyperspectral remote sensing. Spectrosc. Spectr. Anal. 2012, 32, 2218–2222.
- Ingram, R.E. Self-focused attention in clinical disorders: Review and a conceptual model. Psychol. Bull. 1990, 107, 156–176.
- Wu, B.; Huang, W.; Ye, H.; Luo, P.; Ren, Y.; Kong, W. Using multi-angular hyperspectral data to estimate the vertical distribution of leaf chlorophyll content in wheat. Remote Sens. 2021, 13, 1501.
Index | Name | Formulation | Reference |
---|---|---|---|
NDVI | Normalized Difference VI | (NIR − Red)/(NIR + Red) | [9] |
GNDVI | Green Normalized Difference VI | (NIR − Green)/(NIR + Green) | [48] |
EVI | Enhanced VI | 2.5 × (NIR − Red)/(NIR + 6 × Red − 7.5 × Blue + 1) | [49]
OSAVI | Optimized Soil Adjusted VI | (NIR − Red)/(NIR + Red + 0.16) | [50] |
RVI | Ratio VI | NIR/Red | [51] |
DVI | Difference VI | NIR − Red | [52] |
TVI | Transform VI | 0.5 × [120 × (NIR − Green) − 200 × (Red − Green)] | [53] |
GVI | Green VI | (NIR/(NIR + Green)) − (Red/(Red + Green)) | [54] |
GI | Green Index | (NIR/Green) − 1 | [55] |
NDGI | Normalized Difference Green Index | (Green − SWIR)/(Green + SWIR) | [56] |
MCARI | Modified Chlorophyll Absorption Ratio Index | (1.2 × (NIR − Red) − 2.5 × (Blue − Red))/sqrt((2 × NIR + 1) ^2 − (6 × NIR − 5 × sqrt(Red)) − 0.5) | [57]
TCARI | Transformed Chlorophyll Absorption Ratio Index | 3 × ((NIR − Red) − 0.2 × (NIR − Green) × (NIR/Red)) | [57]
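The tabulated formulas translate directly into per-pixel array arithmetic. The sketch below implements a subset of them for synthetic reflectance values; the small `eps` guard against division by zero is an implementation choice, not part of the published formulas.

```python
import numpy as np

def vegetation_indices(nir, red, green, blue):
    """Compute a subset of the tabulated indices from reflectance arrays."""
    eps = 1e-9  # guard against division by zero (implementation choice)
    return {
        "NDVI": (nir - red) / (nir + red + eps),
        "GNDVI": (nir - green) / (nir + green + eps),
        "EVI": 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1),
        "OSAVI": (nir - red) / (nir + red + 0.16),
        "RVI": nir / (red + eps),
        "DVI": nir - red,
        "TVI": 0.5 * (120 * (nir - green) - 200 * (red - green)),
        "GVI": nir / (nir + green + eps) - red / (red + green + eps),
        "GI": nir / (green + eps) - 1,
    }

# Synthetic reflectances for a healthy-vegetation pixel.
idx = vegetation_indices(nir=np.array([0.5]), red=np.array([0.1]),
                         green=np.array([0.15]), blue=np.array([0.05]))
print(round(float(idx["NDVI"][0]), 3))  # 0.667
```

Stacking these index arrays with the spectral bands yields the multi-feature input that the network consumes.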
Dataset | Band No. | Band Name | Resolution (m) | Wavelength (nm) |
---|---|---|---|---|
GF-2 | Pan | Pan | 1 | 450–900 |
GF-2 | B1 | Blue | 4 | 450–520 |
GF-2 | B2 | Green | 4 | 520–590 |
GF-2 | B3 | Red | 4 | 630–690 |
GF-2 | B4 | Near-infrared | 4 | 770–890 |
GF-7 | Pan | Pan | ≤0.8 | 450–900 |
GF-7 | B1 | Blue | ≥3.2 | 450–520 |
GF-7 | B2 | Green | ≥3.2 | 520–590 |
GF-7 | B3 | Red | ≥3.2 | 630–690 |
GF-7 | B4 | Near-infrared | ≥3.2 | 770–890 |
Planet | B1 | Blue | 3 | 465–515 |
Planet | B2 | Green | 3 | 547–593 |
Planet | B3 | Red | 3 | 650–680 |
Planet | B4 | Near-infrared | 3 | 845–885 |
Test Area | Imagery | Method | F1 | Precision | IOU | Recall | OA | Kappa |
---|---|---|---|---|---|---|---|---|
A,B | GF2 | MSICN | 0.9243 | 0.9253 | 0.8646 | 0.9285 | 0.9285 | 0.8558 |
C,D | GF7 | MSICN | 0.9133 | 0.9405 | 0.8421 | 0.8734 | 0.9203 | 0.8681 |
E,F | Planet | MSICN | 0.9219 | 0.9135 | 0.8696 | 0.9196 | 0.9182 | 0.8328 |
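All accuracy measures reported in these tables (Precision, Recall, F1, IoU, OA, Kappa) follow from the binary confusion matrix of the predicted and reference vegetation masks. A minimal sketch of the standard definitions (the tiny example masks are illustrative, not the paper's data):

```python
import numpy as np

def binary_metrics(y_true, y_pred):
    """Vegetation-class metrics from flat binary masks (1 = vegetation)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives
    fp = np.sum((y_pred == 1) & (y_true == 0))  # false positives
    fn = np.sum((y_pred == 0) & (y_true == 1))  # false negatives
    tn = np.sum((y_pred == 0) & (y_true == 0))  # true negatives
    n = tp + fp + fn + tn
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    iou = tp / (tp + fp + fn)          # intersection over union
    oa = (tp + tn) / n                 # overall accuracy
    # Cohen's kappa: observed agreement corrected for chance agreement.
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (oa - pe) / (1 - pe)
    return {"Precision": precision, "Recall": recall, "F1": f1,
            "IOU": iou, "OA": oa, "Kappa": kappa}

m = binary_metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 1])
print({k: round(float(v), 3) for k, v in m.items()})
```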
Test Area | Imagery | Method | F1 | Precision | IOU | Recall | OA | Kappa |
---|---|---|---|---|---|---|---|---|
A,B | GF2 | NDVI | 0.8934 | 0.8913 | 0.8343 | 0.9084 | 0.9103 | 0.8433 |
A,B | GF2 | RFC | 0.8715 | 0.8213 | 0.7398 | 0.8503 | 0.8640 | 0.7283 |
A,B | GF2 | OCBDL | 0.8824 | 0.8911 | 0.7911 | 0.8540 | 0.8853 | 0.7649 |
A,B | GF2 | HRnet | 0.8969 | 0.8662 | 0.8139 | 0.9054 | 0.9023 | 0.8032 |
A,B | GF2 | MSICN | 0.9243 | 0.9253 | 0.8646 | 0.9285 | 0.9285 | 0.8558 |
C,D | GF7 | NDVI | 0.8207 | 0.8342 | 0.7967 | 0.8136 | 0.8518 | 0.7123 |
C,D | GF7 | RFC | 0.6399 | 0.6235 | 0.5526 | 0.6879 | 0.7234 | 0.5330 |
C,D | GF7 | OCBDL | 0.7905 | 0.9349 | 0.7590 | 0.7672 | 0.8169 | 0.7423 |
C,D | GF7 | HRnet | 0.7798 | 0.7112 | 0.7036 | 0.7289 | 0.8067 | 0.6660 |
C,D | GF7 | MSICN | 0.9133 | 0.9405 | 0.8421 | 0.8734 | 0.9203 | 0.8681 |
E,F | Planet | NDVI | 0.8998 | 0.8740 | 0.8339 | 0.8275 | 0.7868 | 0.7216 |
E,F | Planet | RFC | 0.8170 | 0.8909 | 0.6907 | 0.6952 | 0.7337 | 0.5714 |
E,F | Planet | OCBDL | 0.8802 | 0.8852 | 0.7860 | 0.7955 | 0.8190 | 0.6907 |
E,F | Planet | HRnet | 0.9171 | 0.9465 | 0.8470 | 0.8899 | 0.8579 | 0.7841 |
E,F | Planet | MSICN | 0.9219 | 0.9135 | 0.8696 | 0.9196 | 0.9182 | 0.8328 |
Imagery | Method | F1 | IOU | Recall | OA | Kappa |
---|---|---|---|---|---|---|
GF2 | Spectral + 8 indices | 0.8634 | 0.8111 | 0.8642 | 0.8613 | 0.8124 |
GF2 | Spectral + 12 indices | 0.9131 | 0.8588 | 0.9042 | 0.8921 | 0.8343 |
GF2 | Ours | 0.9243 | 0.8646 | 0.9285 | 0.9285 | 0.8558 |
GF7 | Spectral + 8 indices | 0.8612 | 0.7617 | 0.7911 | 0.8536 | 0.7008 |
GF7 | Spectral + 12 indices | 0.8903 | 0.8357 | 0.8549 | 0.8917 | 0.7731 |
GF7 | Ours | 0.9133 | 0.8421 | 0.8734 | 0.9203 | 0.8681 |
Planet | Spectral + 8 indices | 0.7586 | 0.7125 | 0.7269 | 0.8407 | 0.6651 |
Planet | Spectral + 12 indices | 0.8405 | 0.7682 | 0.8701 | 0.8533 | 0.7381 |
Planet | Ours | 0.9219 | 0.8696 | 0.9196 | 0.9182 | 0.8328 |
Imagery | Method | F1 | IOU | Recall | OA | Kappa |
---|---|---|---|---|---|---|
GF2 | Only image data | 0.8855 | 0.7958 | 0.8548 | 0.8834 | 0.7629 |
GF2 | Only vegetation indices | 0.8870 | 0.8005 | 0.8978 | 0.8936 | 0.7831 |
GF2 | Ours | 0.9243 | 0.8646 | 0.9285 | 0.9285 | 0.8558 |
GF7 | Only image data | 0.7759 | 0.6393 | 0.6471 | 0.8003 | 0.6145 |
GF7 | Only vegetation indices | 0.8443 | 0.7322 | 0.8205 | 0.8744 | 0.7649 |
GF7 | Ours | 0.9133 | 0.8421 | 0.8734 | 0.9203 | 0.8681 |
Planet | Only image data | 0.8164 | 0.6898 | 0.6943 | 0.7327 | 0.6993 |
Planet | Only vegetation indices | 0.8007 | 0.7254 | 0.8179 | 0.7723 | 0.7279 |
Planet | Ours | 0.9219 | 0.8696 | 0.9196 | 0.9182 | 0.8328 |
Imagery | Method | F1 | IOU | Recall | OA | Kappa |
---|---|---|---|---|---|---|
GF2 | No simplified dense block | 0.8963 | 0.8052 | 0.8833 | 0.8980 | 0.7937 |
GF2 | No cross-attention feature fusion | 0.8869 | 0.7799 | 0.8339 | 0.8783 | 0.7582 |
GF2 | Ours | 0.9243 | 0.8646 | 0.9285 | 0.9285 | 0.8558 |
GF7 | No simplified dense block | 0.8346 | 0.7168 | 0.7573 | 0.8767 | 0.7379 |
GF7 | No cross-attention feature fusion | 0.7953 | 0.6673 | 0.6823 | 0.8237 | 0.6524 |
GF7 | Ours | 0.9133 | 0.8421 | 0.8734 | 0.9203 | 0.8681 |
Planet | No simplified dense block | 0.9014 | 0.8207 | 0.8488 | 0.8747 | 0.7298 |
Planet | No cross-attention feature fusion | 0.8649 | 0.7477 | 0.8281 | 0.8220 | 0.6150 |
Planet | Ours | 0.9219 | 0.8696 | 0.9196 | 0.9182 | 0.8328 |
Imagery | Method | F1 | Precision | IOU | Recall | OA | Kappa |
---|---|---|---|---|---|---|---|
GF2 | RFC | 0.7720 | 0.7546 | 0.6887 | 0.8109 | 0.8237 | 0.7283 |
GF2 | OCBDL | 0.8502 | 0.7908 | 0.7395 | 0.9014 | 0.8530 | 0.7411 |
GF2 | HRnet | 0.8479 | 0.8217 | 0.7666 | 0.8594 | 0.8859 | 0.7680 |
GF2 | MSICN | 0.8862 | 0.8842 | 0.7957 | 0.8882 | 0.8965 | 0.8113 |
GF7 | RFC | 0.6318 | 0.6617 | 0.5618 | 0.6704 | 0.7103 | 0.5217 |
GF7 | OCBDL | 0.7211 | 0.9145 | 0.7590 | 0.7155 | 0.7325 | 0.7390 |
GF7 | HRnet | 0.8339 | 0.7704 | 0.7564 | 0.7010 | 0.8307 | 0.7219 |
GF7 | MSICN | 0.9058 | 0.8805 | 0.8972 | 0.8995 | 0.9072 | 0.9167 |
Planet | RFC | 0.7745 | 0.7868 | 0.7770 | 0.7851 | 0.7106 | 0.6778 |
Planet | OCBDL | 0.8109 | 0.8843 | 0.7820 | 0.7927 | 0.7224 | 0.6923 |
Planet | HRnet | 0.8241 | 0.9465 | 0.8108 | 0.8669 | 0.8579 | 0.7841 |
Planet | MSICN | 0.8964 | 0.8883 | 0.8740 | 0.8948 | 0.8702 | 0.8258 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Li, Y.; Min, S.; Song, B.; Yang, H.; Wang, B.; Wu, Y. Multisource High-Resolution Remote Sensing Image Vegetation Extraction with Comprehensive Multifeature Perception. Remote Sens. 2024, 16, 712. https://doi.org/10.3390/rs16040712