Individual Tree Species Classification Based on a Hierarchical Convolutional Neural Network and Multitemporal Google Earth Images
Abstract
1. Introduction
2. Study Area and Materials
2.1. Study Area
2.2. Materials
2.2.1. Multitemporal GE Data
2.2.2. Field Dataset
3. Experimental Process
3.1. GE Preprocessing
3.2. Construction of Remote Sensing Image Sets
3.2.1. Individual Tree Crown Delineation Using Multitemporal GE Images
RGB Vegetation Indices Used for Multitemporal GE Images
Individual Tree Crown Delineation
3.2.2. Sample Dataset Construction
3.3. H-CNN Classification
3.3.1. CNN Algorithms
3.3.2. Hierarchical Classification Structure Design
3.3.3. Experimental Environment
3.3.4. Training and Prediction
4. Results
4.1. Classification Accuracy
4.2. H-CNN vs. CNN Classification Results
4.3. Comparison of the Classification Results of Different Tree Species
4.4. Comparison of the Classification Results of the Different Sample Sets
4.5. Classification Map
5. Discussion
5.1. Effect of Combining Multitemporal Data with RGB Vegetation Indices on Crown Delineation
5.2. Influence of the Selection of Multitemporal Data on the Classification of ITS
5.3. Design of the Hierarchical Classification Structure
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Torabzadeh, H.; Leiterer, R.; Hueni, A.; Schaepman, M.E.; Morsdorf, F. Tree species classification in a temperate mixed forest using a combination of imaging spectroscopy and airborne laser scanning. Agric. For. Meteorol. 2019, 279, 107744. [Google Scholar] [CrossRef]
- Komura, R.; Muramoto, K. Classification of forest stand considering shapes and sizes of tree crown calculated from high spatial resolution satellite image. In Proceedings of the 2007 IEEE International Geoscience and Remote Sensing Symposium, Barcelona, Spain, 23–28 July 2007; pp. 4356–4359. [Google Scholar]
- Wang, M.; Zheng, Y.; Huang, C.; Meng, R.; Pang, Y.; Jia, W.; Zhou, J.; Huang, Z.; Fang, L.; Zhao, F. Assessing Landsat-8 and Sentinel-2 spectral-temporal features for mapping tree species of northern plantation forests in Heilongjiang Province, China. For. Ecosyst. 2022, 9, 100032. [Google Scholar] [CrossRef]
- Kamińska, A.; Lisiewicz, M.; Stereńczak, K. Single tree classification using multi-temporal ALS data and CIR imagery in mixed old-growth forest in Poland. Remote Sens. 2021, 13, 5101. [Google Scholar] [CrossRef]
- Fricker, G.A.; Ventura, J.D.; Wolf, J.A.; North, M.P.; Davis, F.W.; Franklin, J. A convolutional neural network classifier identifies tree species in mixed-conifer forest from hyperspectral imagery. Remote Sens. 2019, 11, 2326. [Google Scholar] [CrossRef] [Green Version]
- Chemura, A.; van Duren, I.; van Leeuwen, L.M. Determination of the age of oil palm from crown projection area detected from WorldView-2 multispectral remote sensing data: The case of Ejisu-Juaben district, Ghana. ISPRS J. Photogramm. Remote Sens. 2015, 100, 118–127. [Google Scholar] [CrossRef]
- Yin, W.; Yang, J.; Yamamoto, H.; Li, C. Object-based larch tree-crown delineation using high-resolution satellite imagery. Int. J. Remote Sens. 2015, 36, 822–844. [Google Scholar] [CrossRef]
- Allouis, T.; Durrieu, S.; Vega, C.; Couteron, P. Stem volume and above-ground biomass estimation of individual pine trees from LiDAR data: Contribution of full-waveform signals. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2013, 6, 924–934. [Google Scholar] [CrossRef]
- Jamal, J.; Zaki, N.A.M.; Talib, N.; Saad, N.M.; Mokhtar, E.S.; Omar, H.; Latif, Z.A.; Suratman, M.N. Dominant tree species classification using remote sensing data and object-based image analysis. IOP Conf. Ser. Earth Environ. Sci. 2022, 1019, 012018. [Google Scholar] [CrossRef]
- Qin, H.; Zhou, W.; Yao, Y.; Wang, W. Individual tree segmentation and tree species classification in subtropical broadleaf forests using UAV-based LiDAR, hyperspectral, and ultrahigh-resolution RGB data. Remote Sens. Environ. 2022, 280, 113143. [Google Scholar] [CrossRef]
- Bergmüller, K.O.; Vanderwel, M.C. Predicting tree mortality using spectral indices derived from multispectral UAV imagery. Remote Sens. 2022, 14, 2195. [Google Scholar] [CrossRef]
- Illarionova, S.; Trekin, A.; Ignatiev, V.; Oseledets, I. Neural-based hierarchical approach for detailed dominant forest species classification by multispectral satellite imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 1810–1820. [Google Scholar] [CrossRef]
- Liu, H. Classification of urban tree species using multi-features derived from four-season RedEdge-MX data. Comput. Electron. Agric. 2022, 194, 106794. [Google Scholar] [CrossRef]
- Guo, X.; Li, H.; Jing, L.; Wang, P. Individual tree species classification based on convolutional neural networks and multitemporal high-resolution remote sensing images. Sensors 2022, 22, 3157. [Google Scholar] [CrossRef] [PubMed]
- Hill, R.A.; Wilson, A.K.; George, M.; Hinsley, S.A. Mapping tree species in temperate deciduous woodland using time-series multi-spectral data. Appl. Veg. Sci. 2010, 13, 86–99. [Google Scholar] [CrossRef]
- Fang, F.; McNeil, B.E.; Warner, T.A.; Maxwell, A.E.; Dahle, G.A.; Eutsler, E.; Li, J. Discriminating tree species at different taxonomic levels using multi-temporal WorldView-3 imagery in Washington D.C., USA. Remote Sens. Environ. 2020, 246, 111811. [Google Scholar] [CrossRef]
- Xie, Z.; Chen, Y.; Lu, D.; Li, G.; Chen, E. Classification of land cover, forest, and tree species classes with ZiYuan-3 multispectral and Stereo data. Remote Sens. 2019, 11, 164. [Google Scholar] [CrossRef] [Green Version]
- Franklin, S.E.; Ahmed, O.S. Deciduous tree species classification using object-based analysis and machine learning with unmanned aerial vehicle multispectral data. Int. J. Remote Sens. 2018, 39, 5236–5245. [Google Scholar] [CrossRef]
- Sun, Y.; Xin, Q.; Huang, J.; Huang, B.; Zhang, H. Characterizing tree species of a tropical wetland in southern China at the individual tree level based on convolutional neural network. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 4415–4425. [Google Scholar] [CrossRef]
- Zhang, L.; Zhang, L.; Du, B. Deep learning for remote sensing data: A technical tutorial on the state of the art. IEEE Geosci. Remote Sens. Mag. 2016, 4, 22–40. [Google Scholar] [CrossRef]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. Commun. ACM 2017, 60, 84–90. [Google Scholar] [CrossRef] [Green Version]
- de Souza, I.E.; Falcao, A.X. Learning CNN filters from user-drawn image markers for coconut-tree image classification. IEEE Geosci. Remote Sens. Lett. 2022, 19, 1–5. [Google Scholar] [CrossRef]
- Zhang, C.; Sargent, I.; Pan, X.; Li, H.; Gardiner, A.; Hare, J.; Atkinson, P.M. An object-based convolutional neural network (OCNN) for urban land use classification. Remote Sens. Environ. 2018, 216, 57–70. [Google Scholar] [CrossRef] [Green Version]
- Zhang, C.; Sargent, I.; Pan, X.; Li, H.; Gardiner, A.; Hare, J.; Atkinson, P.M. Joint deep learning for land cover and land use classification. Remote Sens. Environ. 2019, 221, 173–187. [Google Scholar] [CrossRef] [Green Version]
- Zhang, C.; Zhou, J.; Wang, H.; Tan, T.; Cui, M.; Huang, Z.; Wang, P.; Zhang, L. Multi-species individual tree segmentation and identification based on improved mask R-CNN and UAV imagery in mixed forests. Remote Sens. 2022, 14, 874. [Google Scholar] [CrossRef]
- Rezaee, M.; Zhang, Y.; Mishra, R.; Tong, F.; Tong, H. Using a VGG-16 network for individual tree species detection with an object-based approach. In Proceedings of the 2018 10th IAPR Workshop on Pattern Recognition in Remote Sensing (PRRS), Beijing, China, 19–20 August 2018; pp. 1–7. [Google Scholar]
- Chen, C.; Jing, L.; Li, H.; Tang, Y. A new individual tree species classification method based on the ResU-Net model. Forests 2021, 12, 1202. [Google Scholar] [CrossRef]
- Yu, K.; Hao, Z.; Post, C.J.; Mikhailova, E.A.; Lin, L.; Zhao, G.; Tian, S.; Liu, J. Comparison of classical methods and mask R-CNN for automatic tree detection and mapping using UAV imagery. Remote Sens. 2022, 14, 295. [Google Scholar] [CrossRef]
- Apostol, B.; Petrila, M.; Lorenţ, A.; Ciceu, A.; Gancz, V.; Badea, O. Species discrimination and individual tree detection for predicting main dendrometric characteristics in mixed temperate forests by use of airborne laser scanning and ultra-high-resolution imagery. Sci. Total Environ. 2020, 698, 134074. [Google Scholar] [CrossRef]
- Hologa, R.; Scheffczyk, K.; Dreiser, C.; Gärtner, S. Tree species classification in a temperate mixed mountain forest landscape using random forest and multiple datasets. Remote Sens. 2021, 13, 4657. [Google Scholar] [CrossRef]
- Yan, Z.; Zhang, H.; Piramuthu, R.; Jagadeesh, V.; DeCoste, D.; Di, W.; Yu, Y. HD-CNN: Hierarchical deep convolutional neural networks for large scale visual recognition. In Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV), Santiago, Chile, 13–16 December 2015; pp. 2740–2748. [Google Scholar]
- Zheng, Y.; Chen, Q.; Fan, J.; Gao, X. Hierarchical convolutional neural network via hierarchical cluster validity based visual tree learning. Neurocomputing 2020, 409, 408–419. [Google Scholar] [CrossRef]
- Waśniewski, A.; Hościło, A.; Chmielewska, M. Can a hierarchical classification of Sentinel-2 data improve land cover mapping? Remote Sens. 2022, 14, 989. [Google Scholar] [CrossRef]
- Fan, J.; Zhang, J.; Mei, K.; Peng, J.; Gao, L. Cost-sensitive learning of hierarchical tree classifiers for large-scale image classification and novel category detection. Pattern Recognit. 2015, 48, 1673–1687. [Google Scholar] [CrossRef]
- Zhang, H.; Xu, D.; Luo, G.; He, K. Learning multi-level representations for affective image recognition. Neural Comput. Applic. 2022, 34, 14107–14120. [Google Scholar] [CrossRef]
- Qiu, Z.; Hu, M.; Zhao, H. Hierarchical classification based on coarse- to fine-grained knowledge transfer. Int. J. Approx. Reason. 2022, 149, 61–69. [Google Scholar] [CrossRef]
- Liu, Y.; Suen, C.Y.; Liu, Y.; Ding, L. Scene classification using hierarchical wasserstein CNN. IEEE Trans. Geosci. Remote Sens. 2019, 57, 2494–2509. [Google Scholar] [CrossRef]
- Zhao, S.; Jiang, X.; Li, G.; Chen, Y.; Lu, D. Integration of ZiYuan-3 multispectral and stereo imagery for mapping urban vegetation using the hierarchy-based classifier. Int. J. Appl. Earth Obs. Geoinf. 2021, 105, 102594. [Google Scholar] [CrossRef]
- Jiang, X.; Zhao, S.; Chen, Y.; Lu, D. Exploring tree species classification in subtropical regions with a modified hierarchy-based classifier using high spatial resolution multisensor data. J. Remote Sens. 2022, 2022, 1–16. [Google Scholar] [CrossRef]
- Fan, J.; Zhou, N.; Peng, J.; Gao, L. Hierarchical learning of tree classifiers for large-scale plant species identification. IEEE Trans. Image Process. 2015, 24, 4172–4184. [Google Scholar] [CrossRef]
- Xing, X.; Hao, P.; Dong, L. Color characteristics of Beijing’s regional woody vegetation based on natural color system. Color Res. Appl. 2019, 44, 595–612. [Google Scholar] [CrossRef]
- Batalova, A.Y.; Putintseva, Y.A.; Sadovsky, M.G.; Krutovsky, K.V. Comparative genomics of seasonal senescence in forest trees. Int. J. Mol. Sci. 2022, 23, 3761. [Google Scholar] [CrossRef]
- Li, W.; Dong, R.; Fu, H.; Wang, J.; Yu, L.; Gong, P. Integrating Google Earth imagery with Landsat data to improve 30-m resolution land cover mapping. Remote Sens. Environ. 2020, 237, 111563. [Google Scholar] [CrossRef]
- Jing, L.; Hu, B.; Li, J.; Noland, T.; Guo, H. Automated tree crown delineation from imagery based on morphological techniques. IOP Conf. Ser. Earth Environ. Sci. 2014, 17, 012066. [Google Scholar] [CrossRef] [Green Version]
- Woebbecke, D.M.; Meyer, G.E.; von Bargen, K.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1995, 38, 259–269. [Google Scholar] [CrossRef]
- Shimada, S.; Matsumoto, J.; Sekiyama, A.; Aosier, B.; Yokohana, M. A new spectral index to detect Poaceae grass abundance in Mongolian grasslands. Adv. Space Res. 2012, 50, 1266–1273. [Google Scholar] [CrossRef]
- Du, M.; Noguchi, N. Monitoring of wheat growth status and mapping of wheat yield’s within-field spatial variations using color images acquired from UAV-camera system. Remote Sens. 2017, 9, 289. [Google Scholar] [CrossRef] [Green Version]
- Xie, J.; Zhou, Z.; Zhang, H.; Zhang, L.; Li, M. Combining canopy coverage and plant height from UAV-based RGB images to estimate spraying volume on potato. Sustainability 2022, 14, 6473. [Google Scholar] [CrossRef]
- Wan, L.; Li, Y.; Cen, H.; Zhu, J.; Yin, W.; Wu, W.; Zhu, H.; Sun, D.; Zhou, W.; He, Y. Combining UAV-based vegetation indices and image classification to estimate flower number in oilseed rape. Remote Sens. 2018, 10, 1484. [Google Scholar] [CrossRef] [Green Version]
- Guo, Z.; Wang, T.; Liu, S.; Kang, W.; Chen, X.; Feng, K.; Zhang, X.; Zhi, Y. Biomass and vegetation coverage survey in the Mu Us sandy land-based on unmanned aerial vehicle RGB images. Int. J. Appl. Earth Obs. Geoinf. 2021, 94, 102239. [Google Scholar] [CrossRef]
- Li, H.; Hu, B.; Li, Q.; Jing, L. CNN-Based individual tree species classification using high-resolution satellite imagery and airborne LiDAR data. Forests 2021, 12, 1697. [Google Scholar] [CrossRef]
- Chen, L.; Wei, Y.; Yao, Z.; Chen, E.; Zhang, X. Data augmentation in prototypical networks for forest tree species classification using airborne hyperspectral images. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–16. [Google Scholar] [CrossRef]
- Moreno-Barea, F.J.; Jerez, J.M.; Franco, L. Improving classification accuracy using data augmentation on small data sets. Expert Syst. Appl. 2020, 161, 113696. [Google Scholar] [CrossRef]
- Yan, S.; Jing, L.; Wang, H. A new individual tree species recognition method based on a convolutional neural network and high-spatial resolution remote sensing imagery. Remote Sens. 2021, 13, 479. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; pp. 1–9. [Google Scholar]
- Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely connected convolutional networks. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 2261–2269. [Google Scholar]
- Zeiler, M.D.; Fergus, R. Visualizing and understanding convolutional networks. In Computer Vision–ECCV 2014; Fleet, D., Pajdla, T., Schiele, B., Tuytelaars, T., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2014; Volume 8689, pp. 818–833. ISBN 978-3-319-10589-5. [Google Scholar]
- Minowa, Y.; Kubota, Y.; Nakatsukasa, S. Verification of a deep learning-based tree species identification model using images of broadleaf and coniferous tree leaves. Forests 2022, 13, 943. [Google Scholar] [CrossRef]
- Nezami, S.; Khoramshahi, E.; Nevalainen, O.; Pölönen, I.; Honkavaara, E. Tree species classification of drone hyperspectral and RGB imagery with deep learning convolutional neural networks. Remote Sens. 2020, 12, 1070. [Google Scholar] [CrossRef]
| Tree Species Shorthand | Flowering Time | Flower Color | Fruiting Period | Fruit Color | Leaf Color (Growth Period) | Leaf Color (Senescence Period) | Leaf Shape |
|---|---|---|---|---|---|---|---|
| Pl. o | March–April | Yellow/Cyan | October | Brown | Green | Green | Coniferous |
| Pi. t | April–May | Yellow | October | Brown | Green | Green | Coniferous |
| Ro. p | June–July | Light Yellow | August–October | Green | Green | Green to Yellow | Broadleaf |
| Ac. t | April–May | Light Green | September–October | Brown | Green | Green to Yellow to Red | Broadleaf |
| Qu. v | April–May | Green | September | Brown | Green (from light to dark) | Green to Brown to Yellow | Broadleaf |
| Gi. b | April–May | Green | September–October | White | Green (from light to dark) | Green to Yellow | Broadleaf |
| Ko. b | June–August | Yellow | September–October | Brown | Green (from light to dark) | Green to Brown to Red | Broadleaf |
| Abbreviation | Vegetation Index Name | Formula | Reference |
|---|---|---|---|
| ExG | Excess green | 2G − R − B | [45] |
| NGRDI | Normalized green–red difference index | (G − R)/(G + R) | [46] |
| NGBDI | Normalized green–blue difference index | (G − B)/(G + B) | [47] |
| ExGR | Excess green minus excess red | 2G − R − B − (1.4R − G) | [48] |
| MGRVI | Modified green–red vegetation index | (G² − R²)/(G² + R²) | [49] |
| RGBVI | Red–green–blue vegetation index | (G² − B × R)/(G² + B × R) | [50] |
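The six indices above can be computed directly from the red, green, and blue bands of a GE image. The following is a minimal NumPy sketch, not the authors' implementation; the `rgb_indices` function name and the epsilon guard against zero denominators are assumptions (the paper does not state how zero denominators are handled).

```python
import numpy as np

def rgb_indices(img):
    """Compute the six RGB vegetation indices from the table above for a
    float RGB image of shape (H, W, 3). Returns a dict of (H, W) arrays.
    The epsilon guard against division by zero is an assumption."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    eps = 1e-9
    return {
        "ExG":   2 * g - r - b,
        "NGRDI": (g - r) / (g + r + eps),
        "NGBDI": (g - b) / (g + b + eps),
        "ExGR":  (2 * g - r - b) - (1.4 * r - g),       # ExG minus ExR
        "MGRVI": (g**2 - r**2) / (g**2 + r**2 + eps),
        "RGBVI": (g**2 - b * r) / (g**2 + b * r + eps),
    }
```

For a pure-green pixel (R = 0, G = 1, B = 0), ExG evaluates to 2 and the three normalized indices approach 1, which is why vegetated crowns stand out against soil and shadow in these index images.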
| Tree Species | Training Samples | Validation Samples | Test Samples | Total Samples |
|---|---|---|---|---|
| Pl. o | 366 | 120 | 120 | 606 |
| Pi. t | 342 | 114 | 114 | 570 |
| Ro. p | 372 | 120 | 120 | 612 |
| Ac. t | 246 | 78 | 78 | 402 |
| Qu. v | 336 | 108 | 108 | 552 |
| Gi. b | 198 | 60 | 60 | 318 |
| Ko. b | 246 | 78 | 78 | 402 |
| Class I Category | Conifers (Evergreens) | | Broadleaf (Deciduous) Trees | | | | |
|---|---|---|---|---|---|---|---|
| Class II Category | Pl. o | Pi. t | Ro. p | Ac. t | Qu. v | Gi. b | Ko. b |
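The two-level structure in the table above implies a simple decision rule: a Class I model first separates conifers from broadleaf trees, then a Class II model restricted to that group assigns the species. The sketch below illustrates this rule only; the function names and the `.predict()` interface of the model objects are assumptions, not the authors' implementation.

```python
# Species groups taken from the Class I / Class II table above.
CONIFERS = ["Pl. o", "Pi. t"]
BROADLEAF = ["Ro. p", "Ac. t", "Qu. v", "Gi. b", "Ko. b"]

def classify_crown(crown_image, level1_model, conifer_model, broadleaf_model):
    """Two-stage hierarchical classification of one tree-crown image.

    level1_model decides the Class I group ("conifer" or "broadleaf");
    the matching Class II model then predicts the species within that
    group. All three models are assumed to expose a .predict() method.
    """
    group = level1_model.predict(crown_image)
    if group == "conifer":
        return conifer_model.predict(crown_image)   # one of CONIFERS
    return broadleaf_model.predict(crown_image)     # one of BROADLEAF
```

Because each Class II model only sees species from its own group, a misclassification across groups can occur only at the first level, which is the motivation for training the Class I model on the easier conifer-versus-broadleaf distinction.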
| CNN | Tree Species | Pl. o | Pi. t | Ro. p | Ac. t | Qu. v | Gi. b | Ko. b | PA | UA | OA%/Kappa% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| ResNet-18 | Pl. o | 118 | 2 | 0 | 0 | 0 | 0 | 0 | 0.98 | 0.86 | 73.98/69.38 |
| | Pi. t | 19 | 80 | 0 | 7 | 0 | 1 | 7 | 0.70 | 0.98 | |
| | Ro. p | 0 | 0 | 59 | 6 | 0 | 1 | 18 | 0.70 | 0.54 | |
| | Ac. t | 0 | 0 | 18 | 27 | 3 | 5 | 1 | 0.50 | 0.47 | |
| | Qu. v | 0 | 0 | 6 | 1 | 97 | 3 | 1 | 0.90 | 0.86 | |
| | Gi. b | 0 | 0 | 16 | 12 | 1 | 62 | 11 | 0.61 | 0.85 | |
| | Ko. b | 0 | 0 | 10 | 5 | 12 | 1 | 32 | 0.53 | 0.46 | |
| GoogLeNet | Pl. o | 107 | 13 | 0 | 0 | 0 | 0 | 0 | 0.89 | 0.85 | 72.74/67.71 |
| | Pi. t | 19 | 79 | 4 | 5 | 0 | 1 | 6 | 0.69 | 0.86 | |
| | Ro. p | 0 | 0 | 65 | 1 | 2 | 13 | 3 | 0.77 | 0.59 | |
| | Ac. t | 0 | 0 | 6 | 19 | 9 | 15 | 5 | 0.35 | 0.61 | |
| | Qu. v | 0 | 0 | 3 | 0 | 103 | 1 | 1 | 0.95 | 0.76 | |
| | Gi. b | 0 | 0 | 19 | 2 | 7 | 68 | 6 | 0.67 | 0.69 | |
| | Ko. b | 0 | 0 | 14 | 4 | 15 | 1 | 26 | 0.43 | 0.55 | |
| DenseNet-40 | Pl. o | 109 | 11 | 0 | 0 | 0 | 0 | 0 | 0.91 | 0.86 | 77.73/73.56 |
| | Pi. t | 18 | 90 | 0 | 2 | 0 | 0 | 4 | 0.79 | 0.86 | |
| | Ro. p | 0 | 0 | 72 | 0 | 0 | 1 | 11 | 0.86 | 0.73 | |
| | Ac. t | 0 | 0 | 6 | 13 | 11 | 24 | 0 | 0.24 | 0.57 | |
| | Qu. v | 0 | 0 | 0 | 0 | 108 | 0 | 0 | 1.00 | 0.82 | |
| | Gi. b | 0 | 0 | 11 | 8 | 0 | 78 | 5 | 0.76 | 0.72 | |
| | Ko. b | 0 | 4 | 9 | 0 | 12 | 6 | 29 | 0.48 | 0.59 | |
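The evaluation indicators reported in these tables (producer's accuracy PA, user's accuracy UA, overall accuracy OA, and the Kappa coefficient) all derive from the confusion matrix. The sketch below uses the standard remote-sensing formulation with rows as reference classes and columns as predictions; that row/column convention, and the `accuracy_metrics` function itself, are assumptions rather than the authors' code, and the test data here are synthetic, not taken from the tables.

```python
import numpy as np

def accuracy_metrics(cm):
    """PA, UA, OA, and Cohen's kappa from a confusion matrix whose rows
    are reference (test) classes and columns are predicted classes."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    pa = np.diag(cm) / cm.sum(axis=1)    # producer's accuracy (recall)
    ua = np.diag(cm) / cm.sum(axis=0)    # user's accuracy (precision)
    oa = np.diag(cm).sum() / n           # overall accuracy
    # Expected chance agreement from the row/column marginals.
    pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n**2
    kappa = (oa - pe) / (1 - pe)
    return pa, ua, oa, kappa
```

For example, a 2×2 matrix [[50, 10], [5, 35]] yields OA = 0.85 and kappa ≈ 0.69, illustrating how kappa discounts the agreement expected by chance.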
| CNN | Tree Species | Pl. o | Pi. t | Ro. p | Ac. t | Qu. v | Gi. b | Ko. b | PA | UA | OA%/Kappa% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| ResNet-18 | Pl. o | 93 | 25 | 1 | 0 | 0 | 0 | 1 | 0.78 | 0.64 | 64.60/58.21 |
| | Pi. t | 25 | 82 | 1 | 0 | 0 | 6 | 0 | 0.72 | 0.62 | |
| | Ro. p | 1 | 19 | 79 | 14 | 0 | 2 | 5 | 0.66 | 0.72 | |
| | Ac. t | 0 | 4 | 15 | 48 | 3 | 5 | 3 | 0.62 | 0.51 | |
| | Qu. v | 3 | 0 | 0 | 7 | 85 | 3 | 10 | 0.79 | 0.90 | |
| | Gi. b | 4 | 2 | 2 | 14 | 0 | 26 | 12 | 0.43 | 0.55 | |
| | Ko. b | 19 | 0 | 12 | 11 | 6 | 5 | 25 | 0.32 | 0.45 | |
| GoogLeNet | Pl. o | 94 | 20 | 0 | 6 | 0 | 0 | 0 | 0.78 | 0.79 | 70.21/64.91 |
| | Pi. t | 7 | 96 | 0 | 0 | 0 | 11 | 0 | 0.84 | 0.66 | |
| | Ro. p | 0 | 26 | 82 | 6 | 0 | 6 | 0 | 0.68 | 0.76 | |
| | Ac. t | 0 | 0 | 11 | 52 | 0 | 9 | 6 | 0.67 | 0.62 | |
| | Qu. v | 0 | 0 | 0 | 8 | 90 | 6 | 4 | 0.83 | 0.94 | |
| | Gi. b | 2 | 4 | 6 | 5 | 0 | 28 | 15 | 0.47 | 0.42 | |
| | Ko. b | 16 | 0 | 9 | 7 | 6 | 6 | 34 | 0.44 | 0.58 | |
| DenseNet-40 | Pl. o | 90 | 24 | 0 | 6 | 0 | 0 | 0 | 0.75 | 0.73 | 67.70/61.88 |
| | Pi. t | 16 | 91 | 7 | 0 | 0 | 0 | 0 | 0.80 | 0.62 | |
| | Ro. p | 0 | 25 | 85 | 4 | 0 | 6 | 0 | 0.71 | 0.70 | |
| | Ac. t | 0 | 0 | 12 | 44 | 5 | 3 | 14 | 0.56 | 0.56 | |
| | Qu. v | 6 | 0 | 0 | 6 | 84 | 0 | 12 | 0.78 | 0.94 | |
| | Gi. b | 6 | 6 | 0 | 10 | 0 | 32 | 6 | 0.53 | 0.59 | |
| | Ko. b | 5 | 1 | 17 | 9 | 0 | 13 | 33 | 0.42 | 0.51 | |
| CNN | Tree Species | Pl. o | Pi. t | Ro. p | Ac. t | Qu. v | Gi. b | Ko. b | PA | UA | OA%/Kappa% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| ResNet-18 | Pl. o | 90 | 21 | 1 | 0 | 1 | 5 | 2 | 0.75 | 0.69 | 55.31/47.40 |
| | Pi. t | 29 | 76 | 0 | 0 | 9 | 0 | 0 | 0.67 | 0.63 | |
| | Ro. p | 1 | 11 | 55 | 7 | 17 | 7 | 22 | 0.46 | 0.57 | |
| | Ac. t | 0 | 0 | 7 | 46 | 3 | 2 | 20 | 0.59 | 0.48 | |
| | Qu. v | 10 | 5 | 11 | 12 | 62 | 4 | 4 | 0.57 | 0.59 | |
| | Gi. b | 0 | 6 | 9 | 6 | 2 | 24 | 13 | 0.40 | 0.52 | |
| | Ko. b | 0 | 2 | 14 | 25 | 11 | 4 | 22 | 0.28 | 0.27 | |
| GoogLeNet | Pl. o | 94 | 12 | 1 | 10 | 1 | 1 | 1 | 0.78 | 0.72 | 58.70/51.40 |
| | Pi. t | 29 | 70 | 9 | 0 | 6 | 0 | 0 | 0.64 | 0.72 | |
| | Ro. p | 0 | 6 | 54 | 2 | 17 | 4 | 37 | 0.45 | 0.48 | |
| | Ac. t | 0 | 0 | 19 | 51 | 0 | 4 | 4 | 0.65 | 0.47 | |
| | Qu. v | 6 | 6 | 1 | 16 | 74 | 4 | 1 | 0.69 | 0.67 | |
| | Gi. b | 2 | 2 | 6 | 6 | 1 | 36 | 7 | 0.60 | 0.73 | |
| | Ko. b | 0 | 1 | 23 | 23 | 12 | 0 | 19 | 0.24 | 0.28 | |
| DenseNet-40 | Pl. o | 97 | 16 | 0 | 7 | 0 | 0 | 0 | 0.81 | 0.75 | 62.98/56.35 |
| | Pi. t | 24 | 76 | 2 | 0 | 5 | 7 | 0 | 0.67 | 0.66 | |
| | Ro. p | 2 | 13 | 68 | 5 | 11 | 7 | 14 | 0.57 | 0.57 | |
| | Ac. t | 0 | 0 | 12 | 53 | 6 | 0 | 7 | 0.68 | 0.51 | |
| | Qu. v | 6 | 6 | 18 | 2 | 72 | 1 | 3 | 0.67 | 0.67 | |
| | Gi. b | 0 | 4 | 7 | 12 | 2 | 31 | 4 | 0.52 | 0.67 | |
| | Ko. b | 0 | 0 | 24 | 12 | 12 | 0 | 30 | 0.38 | 0.52 | |
| CNN | Tree Species | Pl. o | Pi. t | Ro. p | Ac. t | Qu. v | Gi. b | Ko. b | PA | UA | OA%/Kappa% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| ResNet-18 | Pl. o | 84 | 5 | 5 | 1 | 11 | 7 | 7 | 0.70 | 0.61 | 54.28/46.19 |
| | Pi. t | 32 | 46 | 13 | 2 | 8 | 4 | 9 | 0.40 | 0.54 | |
| | Ro. p | 1 | 18 | 66 | 0 | 6 | 4 | 25 | 0.55 | 0.58 | |
| | Ac. t | 0 | 8 | 6 | 22 | 23 | 6 | 13 | 0.28 | 0.47 | |
| | Qu. v | 10 | 3 | 0 | 16 | 73 | 0 | 6 | 0.68 | 0.54 | |
| | Gi. b | 0 | 0 | 6 | 0 | 0 | 54 | 0 | 0.90 | 0.69 | |
| | Ko. b | 11 | 5 | 17 | 6 | 13 | 3 | 23 | 0.29 | 0.28 | |
| GoogLeNet | Pl. o | 77 | 2 | 10 | 9 | 12 | 8 | 2 | 0.64 | 0.68 | 55.31/47.55 |
| | Pi. t | 23 | 35 | 14 | 0 | 14 | 8 | 20 | 0.31 | 0.58 | |
| | Ro. p | 1 | 11 | 78 | 2 | 6 | 2 | 20 | 0.65 | 0.61 | |
| | Ac. t | 0 | 6 | 10 | 28 | 24 | 1 | 9 | 0.36 | 0.48 | |
| | Qu. v | 7 | 2 | 0 | 8 | 76 | 6 | 9 | 0.70 | 0.53 | |
| | Gi. b | 0 | 0 | 0 | 5 | 0 | 52 | 3 | 0.87 | 0.63 | |
| | Ko. b | 6 | 4 | 15 | 6 | 12 | 6 | 29 | 0.37 | 0.32 | |
| DenseNet-40 | Pl. o | 85 | 7 | 4 | 6 | 10 | 7 | 1 | 0.71 | 0.72 | 59.00/51.88 |
| | Pi. t | 17 | 53 | 10 | 13 | 6 | 4 | 11 | 0.46 | 0.58 | |
| | Ro. p | 0 | 20 | 61 | 13 | 0 | 3 | 23 | 0.51 | 0.60 | |
| | Ac. t | 0 | 11 | 9 | 35 | 22 | 0 | 1 | 0.45 | 0.40 | |
| | Qu. v | 10 | 0 | 0 | 2 | 77 | 0 | 6 | 0.71 | 0.59 | |
| | Gi. b | 0 | 0 | 0 | 0 | 0 | 56 | 4 | 0.93 | 0.78 | |
| | Ko. b | 6 | 0 | 17 | 5 | 15 | 2 | 33 | 0.42 | 0.42 | |
| CNN | Tree Species | Pl. o | Pi. t | Ro. p | Ac. t | Qu. v | Gi. b | Ko. b | PA | UA | OA%/Kappa% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| ResNet-18 | Pl. o | 99 | 12 | 6 | 0 | 0 | 0 | 3 | 0.82 | 0.68 | 73.45/68.67 |
| | Pi. t | 25 | 81 | 8 | 0 | 0 | 0 | 0 | 0.71 | 0.76 | |
| | Ro. p | 11 | 4 | 90 | 0 | 0 | 0 | 15 | 0.75 | 0.75 | |
| | Ac. t | 0 | 8 | 6 | 62 | 16 | 0 | 0 | 0.79 | 0.91 | |
| | Qu. v | 6 | 1 | 1 | 3 | 83 | 0 | 14 | 0.77 | 0.75 | |
| | Gi. b | 0 | 0 | 0 | 0 | 0 | 54 | 6 | 0.90 | 0.90 | |
| | Ko. b | 4 | 9 | 15 | 3 | 12 | 6 | 29 | 0.37 | 0.43 | |
| GoogLeNet | Pl. o | 92 | 25 | 3 | 0 | 0 | 0 | 0 | 0.77 | 0.63 | 70.80/65.50 |
| | Pi. t | 23 | 86 | 1 | 0 | 0 | 0 | 0 | 0.75 | 0.61 | |
| | Ro. p | 17 | 22 | 73 | 0 | 0 | 0 | 8 | 0.61 | 0.75 | |
| | Ac. t | 0 | 0 | 0 | 66 | 12 | 0 | 0 | 0.85 | 0.85 | |
| | Qu. v | 6 | 0 | 2 | 2 | 92 | 0 | 6 | 0.85 | 0.79 | |
| | Gi. b | 0 | 0 | 0 | 0 | 0 | 51 | 9 | 0.85 | 0.89 | |
| | Ko. b | 4 | 8 | 18 | 10 | 12 | 6 | 20 | 0.26 | 0.47 | |
| DenseNet-40 | Pl. o | 120 | 0 | 0 | 0 | 0 | 0 | 0 | 1.00 | 0.73 | 77.29/73.21 |
| | Pi. t | 23 | 85 | 6 | 0 | 0 | 0 | 0 | 0.75 | 0.93 | |
| | Ro. p | 10 | 6 | 87 | 0 | 0 | 0 | 17 | 0.72 | 0.84 | |
| | Ac. t | 0 | 0 | 9 | 62 | 16 | 0 | 0 | 0.79 | 0.84 | |
| | Qu. v | 6 | 0 | 0 | 2 | 85 | 0 | 8 | 0.79 | 0.73 | |
| | Gi. b | 0 | 0 | 0 | 0 | 0 | 48 | 12 | 0.80 | 0.91 | |
| | Ko. b | 6 | 0 | 11 | 3 | 16 | 5 | 37 | 0.47 | 0.50 | |
| CNN | Tree Species | Pl. o | Pi. t | Ro. p | Ac. t | Qu. v | Gi. b | Ko. b | PA | UA | OA%/Kappa% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| ResNet-18 | Pl. o | 110 | 6 | 0 | 0 | 4 | 0 | 0 | 0.93 | 0.88 | 84.37/81.65 |
| | Pi. t | 13 | 89 | 8 | 0 | 0 | 0 | 4 | 0.78 | 0.89 | |
| | Ro. p | 0 | 7 | 90 | 0 | 5 | 0 | 18 | 0.72 | 0.91 | |
| | Ac. t | 0 | 0 | 1 | 67 | 3 | 0 | 7 | 0.91 | 0.84 | |
| | Qu. v | 0 | 0 | 0 | 2 | 103 | 0 | 3 | 0.94 | 0.88 | |
| | Gi. b | 0 | 0 | 0 | 0 | 0 | 54 | 6 | 0.95 | 0.85 | |
| | Ko. b | 0 | 2 | 4 | 2 | 0 | 7 | 63 | 0.72 | 0.63 | |
| GoogLeNet | Pl. o | 109 | 5 | 0 | 0 | 6 | 0 | 0 | 0.91 | 0.88 | 88.35/86.32 |
| | Pi. t | 15 | 88 | 11 | 0 | 0 | 0 | 0 | 0.77 | 0.89 | |
| | Ro. p | 0 | 6 | 84 | 4 | 2 | 0 | 24 | 0.70 | 0.88 | |
| | Ac. t | 0 | 0 | 0 | 78 | 0 | 0 | 0 | 1.00 | 0.95 | |
| | Qu. v | 0 | 0 | 0 | 0 | 108 | 0 | 0 | 1.00 | 0.93 | |
| | Gi. b | 0 | 0 | 0 | 0 | 0 | 60 | 0 | 1.00 | 0.91 | |
| | Ko. b | 0 | 0 | 0 | 0 | 0 | 6 | 72 | 0.92 | 0.75 | |
| DenseNet-40 | Pl. o | 108 | 6 | 0 | 0 | 6 | 0 | 0 | 0.90 | 0.86 | 89.38/87.53 |
| | Pi. t | 18 | 84 | 12 | 0 | 0 | 0 | 0 | 0.74 | 0.88 | |
| | Ro. p | 0 | 5 | 96 | 6 | 0 | 0 | 13 | 0.80 | 0.89 | |
| | Ac. t | 0 | 0 | 0 | 78 | 0 | 0 | 0 | 1.00 | 0.93 | |
| | Qu. v | 0 | 0 | 0 | 0 | 108 | 0 | 0 | 1.00 | 0.95 | |
| | Gi. b | 0 | 0 | 0 | 0 | 0 | 60 | 0 | 1.00 | 0.91 | |
| | Ko. b | 0 | 0 | 0 | 0 | 0 | 6 | 72 | 0.92 | 0.85 | |
| Model | H-CNN | Tree Species / Indicator | Pl. o | Pi. t | Ro. p | Ac. t | Qu. v | Gi. b | Ko. b |
|---|---|---|---|---|---|---|---|---|---|
| GoogLeNet + ResNet-18 + DenseNet-40 | H-CNN1 | Pl. o | 98 | 5 | 11 | 0 | 6 | 0 | 0 |
| | | Pi. t | 13 | 94 | 7 | 0 | 0 | 0 | 0 |
| | | Ro. p | 0 | 0 | 108 | 12 | 0 | 0 | 0 |
| | | Ac. t | 0 | 0 | 0 | 78 | 0 | 0 | 0 |
| | | Qu. v | 0 | 0 | 0 | 6 | 102 | 0 | 0 |
| | | Gi. b | 0 | 0 | 0 | 0 | 0 | 58 | 2 |
| | | Ko. b | 0 | 0 | 0 | 0 | 0 | 0 | 78 |
| | | PA | 0.82 | 0.82 | 0.90 | 1.00 | 0.94 | 0.97 | 1.00 |
| | | UA | 0.88 | 0.95 | 0.86 | 0.81 | 0.94 | 1.00 | 0.97 |
| | | OA%/Kappa% | 90.86/89.25 | | | | | | |
| DenseNet-40 + ResNet-18 + DenseNet-40 | H-CNN2 | Pl. o | 100 | 11 | 0 | 0 | 6 | 0 | 3 |
| | | Pi. t | 10 | 103 | 1 | 0 | 0 | 0 | 0 |
| | | Ro. p | 0 | 6 | 102 | 12 | 0 | 0 | 0 |
| | | Ac. t | 0 | 0 | 0 | 78 | 0 | 0 | 0 |
| | | Qu. v | 0 | 0 | 0 | 6 | 102 | 0 | 0 |
| | | Gi. b | 0 | 0 | 0 | 0 | 0 | 58 | 2 |
| | | Ko. b | 0 | 0 | 0 | 0 | 0 | 0 | 78 |
| | | PA | 0.83 | 0.90 | 0.85 | 1.00 | 0.94 | 0.97 | 1.00 |
| | | UA | 0.91 | 0.86 | 0.99 | 0.81 | 0.94 | 1.00 | 0.94 |
| | | OA%/Kappa% | 91.59/90.12 | | | | | | |
| DenseNet-40 + DenseNet-40 + DenseNet-40 | H-CNN3 | Pl. o | 113 | 1 | 0 | 0 | 6 | 0 | 0 |
| | | Pi. t | 13 | 101 | 0 | 0 | 0 | 0 | 0 |
| | | Ro. p | 5 | 12 | 84 | 0 | 6 | 0 | 13 |
| | | Ac. t | 0 | 0 | 0 | 78 | 0 | 0 | 0 |
| | | Qu. v | 0 | 0 | 0 | 0 | 108 | 0 | 0 |
| | | Gi. b | 0 | 0 | 0 | 0 | 0 | 60 | 0 |
| | | Ko. b | 0 | 0 | 2 | 0 | 0 | 0 | 76 |
| | | PA | 0.94 | 0.89 | 0.70 | 1.00 | 1.00 | 1.00 | 0.97 |
| | | UA | 0.86 | 0.89 | 0.98 | 1.00 | 0.90 | 1.00 | 0.85 |
| | | OA%/Kappa% | 91.45/89.94 | | | | | | |
| DenseNet-40 + DenseNet-40 + GoogLeNet | H-CNN4 | Pl. o | 105 | 6 | 0 | 0 | 6 | 0 | 3 |
| | | Pi. t | 10 | 104 | 0 | 0 | 0 | 0 | 0 |
| | | Ro. p | 0 | 4 | 97 | 0 | 0 | 0 | 19 |
| | | Ac. t | 0 | 0 | 0 | 78 | 0 | 0 | 0 |
| | | Qu. v | 0 | 0 | 0 | 1 | 106 | 0 | 1 |
| | | Gi. b | 0 | 0 | 0 | 0 | 0 | 60 | 0 |
| | | Ko. b | 0 | 0 | 1 | 0 | 0 | 0 | 77 |
| | | PA | 0.88 | 0.91 | 0.81 | 1.00 | 0.98 | 1.00 | 0.99 |
| | | UA | 0.91 | 0.91 | 0.99 | 0.99 | 0.95 | 1.00 | 0.77 |
| | | OA%/Kappa% | 92.48/91.67 | | | | | | |
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Lei, Z.; Li, H.; Zhao, J.; Jing, L.; Tang, Y.; Wang, H. Individual Tree Species Classification Based on a Hierarchical Convolutional Neural Network and Multitemporal Google Earth Images. Remote Sens. 2022, 14, 5124. https://doi.org/10.3390/rs14205124