Multi-Source Image Fusion Based Regional Classification Method for Apple Diseases and Pests
Abstract
1. Introduction
- Firstly, visible-light and multispectral image data of an apple orchard canopy are collected by UAV, and a multi-source, multi-label image dataset of apple canopy diseases and pests is constructed, addressing the shortage of multi-label classification training data for large-scale orchards.
- Secondly, a vegetation index (VI) selection method based on a saliency attention module is proposed. Targeting the habitat information of apple plants, a feature selection method weights the 22 selected vegetation indices to improve the accuracy of regional multi-label classification.
- Finally, a multi-label classification model, AMMFNet, based on joint prediction from RGB and multispectral images, is constructed. It effectively exploits the complementary features of RGB images and VIs, allowing the model to automatically attend to disease- and pest-related features while reducing the impact of redundancy in multi-source data on prediction performance.
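The data-level fusion followed by channel re-weighting described in the last contribution can be sketched as follows. This is an illustrative NumPy sketch, not the authors' AMMFNet code: the helper names (`channel_attention`, `fuse_sources`), the squeeze-and-excitation-style weighting, and the random stand-in weights are all assumptions made for demonstration.

```python
import numpy as np

def channel_attention(x, reduction=4):
    """Squeeze-and-excitation-style channel re-weighting.
    x: feature map of shape (C, H, W); returns the re-weighted map."""
    c = x.shape[0]
    squeezed = x.mean(axis=(1, 2))                      # global average pool -> (C,)
    rng = np.random.default_rng(0)                      # random stand-in for learned FC weights
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    hidden = np.maximum(w1 @ squeezed, 0)               # ReLU
    weights = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))      # sigmoid -> per-channel weight in (0, 1)
    return x * weights[:, None, None]

def fuse_sources(rgb, vis):
    """Data-level fusion: stack RGB channels and vegetation-index maps,
    then let channel attention emphasise disease-related channels."""
    stacked = np.concatenate([rgb, vis], axis=0)        # (3 + n_vi, H, W)
    return channel_attention(stacked)

rgb = np.random.rand(3, 64, 64)    # RGB image channels
vis = np.random.rand(22, 64, 64)   # 22 selected vegetation-index maps
fused = fuse_sources(rgb, vis)
print(fused.shape)                 # (25, 64, 64)
```

In a trained network the two fully connected projections would be learned end to end; here they only show where the per-channel weights come from.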
2. Materials and Methods
2.1. Diseases and Pests Multi-Source Image Dataset of Apple Canopy
2.1.1. Image Data Collection
2.1.2. Construction of Multi-Source Image Dataset of Apple Diseases and Pests Canopy
2.2. Vegetation Indices Calculation and Selection
2.2.1. Vegetation Indices of Apple Diseases and Pests
2.2.2. Vegetation Indices Selection Algorithm
2.3. Multi-Label Classification Model for Apple Disease and Pest Areas
2.3.1. Data Input Layer
2.3.2. Data Fusion Layer
2.3.3. Feature Extraction Layer
2.3.4. Classifier
3. Results and Discussion
3.1. Experimental Environment Configuration
3.2. Evaluation Metrics
3.2.1. Comparison of Model Classification Accuracy
3.2.2. Effectiveness Analysis of Multi-Source Fusion Prediction Model
3.2.3. Ablation Experiment
3.2.4. Impact of Input Data on Model Performance
3.2.5. Impact of RF-ML Algorithm on Model Accuracy
3.2.6. Influence of Multi-Source Data Fusion Method on Model Accuracy
3.2.7. Influence of Noise on Model Performance
3.2.8. Main Contributions of This Work
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Guo, H.; Cao, Y.; Wang, C.; Rong, L.; Li, Y.; Wang, T.; Yang, F. Recognition and application of apple defoliation disease based on transfer learning. Trans. CSAE 2024, 40, 184–192. [Google Scholar]
- Zhou, M.; Ou, Y.; Zhang, L.; Tong, G.; Wang, Y. Effect of dielectric properties on radio frequency heating uniformity of apple. Trans. CSAE 2019, 35, 273–279. [Google Scholar]
- Qiu, M.; Liu, B.; Liu, Y.; Wang, K.; Pang, J.; Zhang, X. Simulation of first flowering date for apple and risk assessment of late frost in main producing areas of northern China. Trans. CSAE 2020, 36, 154–163. [Google Scholar]
- Zhong, Y.; Zhao, M. Research on deep learning in apple leaf disease recognition. Comput. Electron. Agric. 2020, 168, 105146. [Google Scholar] [CrossRef]
- Mahmud, M.S.; He, L.; Zahid, A.; Heinemann, P.; Choi, D.; Krawczyk, G.; Zhu, H. Detection and infected area segmentation of apple fire blight using image processing and deep transfer learning for site-specific management. Comput. Electron. Agric. 2023, 209, 107862. [Google Scholar] [CrossRef]
- Liu, B.; Jia, R.; Zhu, X.; Yu, C.; Yao, Z.; Zhang, H.; He, D. A lightweight identification model for apple leaf diseases for mobile terminals. Trans. CSAE 2022, 38, 130–139. [Google Scholar]
- Ahad, M.T.; Li, Y.; Song, B.; Bhuiyan, T. Comparison of CNN-based deep learning architectures for rice diseases classification. Artif. Intell. Agric. 2023, 9, 22–35. [Google Scholar] [CrossRef]
- Liu, B.; Ding, Z.; Tian, L.; He, D.; Li, S.; Wang, H. Grape leaf disease identification using improved deep convolutional neural networks. Front. Plant Sci. 2020, 11, 1082. [Google Scholar] [CrossRef]
- Xie, X.Y.; Ma, Y.; Liu, B.; He, J.; Li, S.; Wang, H. A deep-learning-based real-time detector for grape leaf diseases using improved convolutional neural networks. Front. Plant Sci. 2020, 11, 751. [Google Scholar] [CrossRef]
- Zeng, T.; Li, S.; Song, Q.; Zhong, F.; Wei, X. Lightweight tomato real-time detection method based on improved YOLO and mobile deployment. Comput. Electron. Agric. 2023, 205, 107625. [Google Scholar] [CrossRef]
- Tang, Z.; Lu, J.; Chen, Z.; Qi, F.; Zhang, L. Improved Pest-YOLO: Real-time pest detection based on efficient channel attention mechanism and transformer encoder. Ecol. Inform. 2023, 78, 102340. [Google Scholar] [CrossRef]
- Tian, L.; Zhang, H.; Liu, B.; Qi, F.; Zhang, L. VMF-SSD: A novel V-space based multi-scale feature fusion SSD for apple leaf disease detection. IEEE/ACM Trans. Comput. Biol. Bioinform. 2022, 20, 2016–2028. [Google Scholar] [CrossRef] [PubMed]
- Liu, B.; Ren, H.; Li, J.; Duan, N.; Yuan, A.; Zhang, H. RE-RCNN: A novel representation-enhanced RCNN model for early apple leaf disease detection. ACM Trans. Sens. Netw. 2023, 1550–4867. [Google Scholar] [CrossRef]
- Cob-Parro, A.C.; Lalangui, Y.; Lazcano, R. Fostering Agricultural Transformation through AI: An Open-Source AI Architecture Exploiting the MLOps Paradigm. Agronomy 2024, 14, 259. [Google Scholar] [CrossRef]
- Singha, A.; Moon, M.S.H.; Dipta, S.R. An End-to-End Deep Learning Method for Potato Blight Disease Classification Using CNN. In Proceedings of the 2023 International Conference on Computational Intelligence, Networks and Security (ICCINS), Mylavaram, India, 22–23 December 2023; Volume 12, pp. 1–6. [Google Scholar]
- Mohite, J.; Sawant, S.; Agarrwal, R.; Pandit, A.; Pappula, S. Detection of crop water stress in maize using drone based hyperspectral imaging. In Proceedings of the IGARSS 2022—2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17–22 July 2022; pp. 5957–5960. [Google Scholar]
- Enciso, J.; Avila, C.A.; Jung, J.; Elsayed-Farag, S.; Chang, A.; Yeom, J.; Landivar, J.; Maeda, M.; Chavez, J.C. Validation of agronomic UAV and field measurements for tomato varieties. Comput. Electron. Agric. 2019, 158, 278–283. [Google Scholar] [CrossRef]
- Theau, J.; Gavelle, E.; Menard, P. Crop scouting using UAV imagery: A case study for potatoes. J. Unmanned Veh. Syst. 2020, 8, 99–118. [Google Scholar] [CrossRef]
- Brewer, K.; Clulow, A.; Sibanda, M.; Gokool, S.; Odindi, J.; Mutanga, O.; Naiken, V.; Chimonyo, V.G.P.; Mabhaudhi, T. Estimation of maize foliar temperature and stomatal conductance as indicators of water stress based on optical and thermal imagery acquired using an Unmanned Aerial Vehicle platform. Drones 2022, 6, 169. [Google Scholar] [CrossRef]
- Jiang, Y.; Liu, B.; Zhang, C.; Zhao, D.; Chen, R.Q.; Xu, B.; Long, H.L.; Yang, G.J.; Yang, H. Monitoring the maturity of multi-variety corn using multi-spectral imagery from an unmanned aerial vehicle. Trans. CSAE 2023, 39, 84–91. [Google Scholar]
- Fei, S.; Hassan, M.A.; Xiao, Y.; Su, X.; Chen, Z.; Cheng, Q.; Duan, F.; Chen, R.; Ma, Y. UAV-based multi-sensor data fusion and machine learning algorithm for yield prediction in wheat. Precis. Agric. 2023, 24, 187–212. [Google Scholar] [CrossRef]
- Yokoyama, Y.; De Wit, A.; Matsui, T.; Tanaka, T.S.T. Predicting plant-level cabbage yield by assimilating UAV-derived LAI into a crop simulation model. Precis. Agric. 2023, 1043–1048. [Google Scholar]
- Šupčík, A.; Beranová, V. Grape Yield Prediction Based on Vine Canopy Morphology Obtained by 3D Point Clouds from UAV Images; Wageningen Academic: Wageningen, The Netherlands, 2023; pp. 619–625. [Google Scholar]
- Yan, H.; Zhuo, Y.; Li, M.; Wang, Y.; Guo, H.; Wang, J.; Li, C.; Ding, F. Prediction of alfalfa yield based on machine learning and remote sensing of multi-spectral images from an unmanned aerial vehicle. Trans. CSAE 2022, 38, 64–71. [Google Scholar]
- Kent, O.W.; Chun, T.W.; Choo, T.L.; Lai, W.K. Early symptom detection of basal stem rot disease in oil palm trees using a deep learning approach on UAV images. Comput. Electron. Agric. 2023, 213, 108192. [Google Scholar] [CrossRef]
- Antolínez García, A.; Cáceres Campana, J.W. Identification of pathogens in corn using near-infrared UAV imagery and deep learning. Precis. Agric. 2023, 24, 783–806. [Google Scholar] [CrossRef]
- Das, A.K.; Mathew, J.; Zhang, Z.; Friskop, A.; Huang, Y.; Flores, P.; Han, X. Corn goss’s wilt disease assessment based on UAV imagery. In Unmanned Aerial Systems in Precision Agriculture: Technological Progresses and Applications; Springer: Berlin/Heidelberg, Germany, 2022; pp. 123–136. [Google Scholar]
- Zhao, J.; Jin, Y.; Ye, H.; Huang, W.; Dong, Y.; Fan, L.; Ma, H. Remote sensing monitoring of areca nut yellowing disease based on multi-spectral images from an unmanned aerial vehicle. Trans. CSAE 2020, 36, 54–61. [Google Scholar]
- Wang, C.; Liu, Y.; Zhang, Z.; Han, L.; Li, Y.; Zhang, H.; Wongsuk, S.; Li, Y.; Wu, X.; He, X. Spray performance evaluation of a six-rotor unmanned aerial vehicle sprayer for pesticide application using an orchard operation mode in apple orchards. Pest Manag. Sci. 2022, 78, 2449–2466. [Google Scholar] [CrossRef] [PubMed]
- Huang, J.; Luo, Y.; Quan, Q.; Wang, B.; Xue, X.; Zhang, Y. An autonomous task assignment and decision-making method for coverage path planning of multiple pesticide spraying UAVs. Comput. Electron. Agric. 2023, 212, 108128. [Google Scholar] [CrossRef]
- Zeng, W.; Deng, J.; Gao, Q. Pest control of rice leaf folder with reduced pesticide application using the P20 type plant protection UAV. Trans. CSAE 2021, 37, 53–59. [Google Scholar]
- Ye, H.; Huang, W.; Huang, S.; Cui, B.; Dong, Y.; Guo, A.; Ren, Y.; Jin, Y. Recognition of banana fusarium wilt based on UAV remote sensing. Remote Sens. 2020, 12, 938. [Google Scholar] [CrossRef]
- Chen, T.; Yang, W.; Zhang, H.; Zhu, B.; Zeng, R.; Wang, X.; Wang, S.; Wang, L.; Qi, H.; Lan, Y.; et al. Early detection of bacterial wilt in peanut plants through leaf-level hyperspectral and unmanned aerial vehicle data. Comput. Electron. Agric. 2020, 177, 105708. [Google Scholar] [CrossRef]
- Bhandari, M.; Ibrahim, A.M.; Xue, Q.; Jung, J.; Chang, A.; Rudd, J.C.; Maeda, M.; Rajan, N.; Neely, H.; Landivar, J. Assessing winter wheat foliage disease severity using aerial imagery acquired from small Unmanned Aerial Vehicle. Comput. Electron. Agric. 2020, 176, 105665. [Google Scholar] [CrossRef]
- Musci, M.A.; Persello, C.; Lingua, A.M. UAV images and deep-learning algorithms for detecting flavescence doree disease in grapevine orchards. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, XLIII-B3-2020, 1483–1489. [Google Scholar]
- Wu, G.; Fang, Y.; Jiang, Q.; Cui, M.; Li, N.; Ou, Y.; Diao, Z.; Zhang, B. Early identification of strawberry leaves disease utilizing hyperspectral imaging combing with spectral features, multiple vegetation indices and textural features. Comput. Electron. Agric. 2023, 204, 107553. [Google Scholar] [CrossRef]
- Huang, S.; Tang, L.; Hupy, J.P.; Wang, Y.; Shao, G. A commentary review on the use of normalized difference vegetation index (NDVI) in the era of popular remote sensing. J. For. Res. 2021, 32, 1–6. [Google Scholar] [CrossRef]
- Guo, Z.; Xu, H.; Ma, J.; Ning, H.; Shen, J.; Zhang, Z. Construction of three-dimensional remote sensing ecological index (TRSEI) based on stereopair images: A case study of Miaodao Archipelago in China. Ecol. Indic. 2024, 159, 111737. [Google Scholar] [CrossRef]
- Cardoso, L.A.S.; Farias, P.R.S.; Soares, J.A.C.; Caldeira, C.R.T.; de Oliveira, F.J. Use of a UAV for statistical-spectral analysis of vegetation indices in sugarcane plants in the Eastern Amazon. Int. J. Environ. Sci. Technol. 2024, 21, 6947–6964. [Google Scholar] [CrossRef]
- Zhao, X.; Qi, J.; Xu, H.; Yu, Z.; Yuan, L.; Chen, Y.; Huang, H. Evaluating the potential of airborne hyperspectral LiDAR for assessing forest insects and diseases with 3D Radiative Transfer Modeling. Remote Sens. Environ. 2023, 297, 113759. [Google Scholar] [CrossRef]
- Lanucara, S.; Praticò, S.; Pioggia, G.; Di Fazio, S.; Modica, G. Web-based spatial decision support system for precision agriculture: A tool for delineating dynamic management unit zones (MUZs). Smart Agric. Technol. 2024, 8, 100444. [Google Scholar] [CrossRef]
- Fu, X.; Bao, Y.; Tubuxin, B.; Bao, Y. Bias Correction of Sentinel-2 MSI Vegetation Indices in a Desert Steppe with Original Assembled Field Online Multi-Angle Spectrometers. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–11. [Google Scholar]
- Han, D.; Cai, H.; Zhang, L.; Wen, Y. Multi-sensor high spatial resolution leaf area index estimation by combining surface reflectance with vegetation indices for highly heterogeneous regions: A case study of the Chishui River Basin in southwest China. Ecol. Inform. 2024, 80, 102489. [Google Scholar] [CrossRef]
- Sun, H. Crop vegetation indices. In Encyclopedia of Smart Agriculture Technologies; Springer International Publishing: Cham, Switzerland, 2023; pp. 1–7. [Google Scholar]
- Marcello, J.; Eugenio, F.; Rodriguez-Esparragón, D.; Marqués, F. Assessment of forest degradation using multitemporal and multisensor very high resolution satellite imagery. In Proceedings of the International Geoscience and Remote Sensing Symposium, Pasadena, CA, USA, 16–21 July 2023; pp. 3233–3236. [Google Scholar]
- Zhao, R.; Tang, W.; An, L.; Qiao, L.; Wang, N.; Sun, H.; Li, M.; Liu, G.; Liu, Y. Solar-induced chlorophyll fluorescence extraction based on heterogeneous light distribution for improving in-situ chlorophyll content estimation. Comput. Electron. Agric. 2023, 215, 108405. [Google Scholar] [CrossRef]
- Kadakci Koca, T. A statistical approach to site-specific thresholding for burn severity maps using bi-temporal Landsat-8 images. Earth Sci. Inform. 2023, 16, 1313–1327. [Google Scholar] [CrossRef]
- Fu, Z.; Zhang, J.; Jiang, J.; Zhang, Z.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Using the time series nitrogen diagnosis curve for precise nitrogen management in wheat and rice. Field Crops Res. 2024, 307, 109259. [Google Scholar] [CrossRef]
- Gao, S.; Zhong, R.; Yan, K.; Ma, X.; Chen, X.; Pu, J.; Gao, S.; Qi, J.; Yin, G. Evaluating the saturation effect of vegetation indices in forests using 3D radiative transfer simulations and satellite observations. Remote Sens. Environ. 2023, 295, 113665. [Google Scholar]
- Huang, X.; Lin, D.; Mao, X.; Zhao, Y. Multi-source data fusion for estimating maize leaf area index over the whole growing season under different mulching and irrigation conditions. Field Crops Res. 2023, 303, 109111. [Google Scholar] [CrossRef]
- Sun, X.; Zhou, Y.; Jia, S.; Shao, H.; Liu, M.; Tao, S.; Dai, X. Impacts of mining on vegetation phenology and sensitivity assessment of spectral vegetation indices to mining activities in arid/semi-arid areas. J. Environ. Manag. 2024, 356, 120678. [Google Scholar] [CrossRef] [PubMed]
- Hu, Y.; Han, C.; Li, W.; Hu, Q.; Wu, H. Experimental evaluation of SOFC fuel adaptability and power generation performance based on MSR. Fuel Process. Technol. 2023, 250, 107919. [Google Scholar] [CrossRef]
- Jemaa, H.; Bouachir, W.; Leblon, B.; LaRocque, A.; Haddadi, A.; Bouguila, N. UAV-based computer vision system for orchard apple tree detection and health assessment. Remote Sens. 2023, 15, 3558. [Google Scholar] [CrossRef]
- Trubin, A.; Kozhoridze, G.; Zabihi, K.; Modlinger, R.; Singh, V.V.; Surový, P.; Jakuš, R. Detection of green attack and bark beetle susceptibility in Norway Spruce: Utilizing PlanetScope Multispectral Imagery for Tri-Stage spectral separability analysis. For. Ecol. Manag. 2024, 560, 121838. [Google Scholar] [CrossRef]
- Kesselring, J.; Morsdorf, F.; Kükenbrink, D.; Gastellu-Etchegorry, J.P.; Damm, A. Diversity of 3D APAR and LAI dynamics in broadleaf and coniferous forests: Implications for the interpretation of remote sensing-based products. Remote Sens. Environ. 2024, 306, 114116. [Google Scholar] [CrossRef]
- Spolaôr, N.; Cherman, E.A.; Monard, M.C.; Lee, H.D. ReliefF for multi-label feature selection. In Proceedings of the Brazilian Conference on Intelligent Systems, Fortaleza, Brazil, 19–24 October 2013; pp. 6–11. [Google Scholar]
- Kononenko, I.; Šikonja, M.R. Non-myopic feature quality evaluation with (R) ReliefF. In Computational Methods of Feature Selection; Chapman and Hall/CRC: Boca Raton, FL, USA, 2007; pp. 185–208. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
- Woo, S.; Park, J.; Lee, J.Y.; Kweon, I.S. CBAM: Convolutional block attention module. In Proceedings of the European Conference on Computer Vision, Munich, Germany, 8–14 September 2018; pp. 3–19. [Google Scholar]
- Wang, F.; Jiang, M.; Qian, C.; Yang, S.; Li, C.; Zhang, H.; Wang, X.; Tang, X. Residual attention network for image classification. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 3156–3164. [Google Scholar]
- Brock, A.; De, S.; Smith, S.L.; Simonyan, K. High-performance large-scale image recognition without normalization. In Proceedings of the International Conference on Machine Learning, Virtual, 21–24 July 2021; PMLR: London, UK, 2021; pp. 1059–1071. [Google Scholar]
- Liu, S.; Zhang, L.; Yang, X.; Su, H.; Zhu, J. Query2label: A simple transformer way to multi-label classification. arXiv 2021, arXiv:2107.10834. [Google Scholar]
Camera Category | Band Name | Wavelength/nm | Minimum | Mean | Maximum | Standard Deviation | Variance |
---|---|---|---|---|---|---|---|
RGB | Red | 660 | 5.07 | 109.74 | 255.99 | 33.32 | 1110.50 |
RGB | Green | 550 | 14.84 | 115.87 | 254.94 | 27.86 | 776.29 |
RGB | Blue | 470 | 0.00 | 76.97 | 254.80 | 26.20 | 686.36 |
Multispectral | Blue | 450 | 0.20 | 2.42 | 87.85 | 1.07 | 1.15 |
Multispectral | Green | 560 | 0.62 | 12.18 | 359.16 | 4.85 | 23.56 |
Multispectral | Red | 650 | 0.21 | 11.01 | 457.65 | 7.49 | 56.06 |
Multispectral | Red edge | 730 | 1.27 | 35.67 | 397.91 | 11.09 | 122.95 |
Multispectral | NIR | 840 | 1.77 | 50.89 | 458.85 | 16.35 | 267.19 |
Disease and Pest | Train Dataset | Val Dataset |
---|---|---|
Aphids | 6012 | 1503 |
Alternaria | 600 | 150 |
Mosaic | 432 | 108 |
Brown spot | 6156 | 1539 |
Gray spot | 1116 | 279 |
Index Name | Equation | Introduction |
---|---|---|
NDVI [37] | (NIR − R)/(NIR + R) | Normalized Difference Vegetation Index: a commonly used index to measure vegetation growth and coverage |
TNDVI [38] | √(NDVI + 0.5) | Transformed Normalized Difference Vegetation Index: a modified NDVI more sensitive to soil brightness, for arid and semi-arid regions |
GNDVI [39] | (NIR − G)/(NIR + G) | Green Normalized Difference Vegetation Index: emphasizes the green vegetation component; suitable for high-biomass areas |
RENDVI [40] | – | Red Edge Normalized Difference Vegetation Index: uses red edge bands to be more sensitive to vegetation cover and chlorophyll content |
MSAVI2 [41] | (2·NIR + 1 − √((2·NIR + 1)² − 8(NIR − R)))/2 | Modified Soil-Adjusted Vegetation Index 2: reduces the influence of soil background; suitable for mixed bare-soil and vegetation areas |
RVI [42] | NIR/R | Ratio Vegetation Index: the ratio of near-infrared to red reflectance, reflecting vegetation density |
DVI [43] | NIR − R | Difference Vegetation Index: simple vegetation/non-vegetation reflectance difference for rapid assessment of vegetation cover |
GDVI [44] | – | Green Difference Vegetation Index: considers differences across multiple bands for assessing complex vegetation environments |
GRVI [44] | – | Green–Red Vegetation Index: uses the ratio of green and red bands to reflect chlorophyll content |
WDRVI [45] | (α·NIR − R)/(α·NIR + R) | Wide Dynamic Range Vegetation Index: a weighted adjustment of NDVI that remains sensitive in high vegetation cover areas |
MSAVI [41] | – | Modified Soil-Adjusted Vegetation Index: an improved version that reduces the influence of soil background on the vegetation index |
OSAVI [46] | (NIR − R)/(NIR + R + 0.16) | Optimized Soil-Adjusted Vegetation Index: further corrects for soil background by introducing a soil adjustment factor |
GOSAVI [47] | (NIR − G)/(NIR + G + 0.16) | Green Optimized Soil-Adjusted Vegetation Index: an OSAVI variant using the green band to improve accuracy |
NDRE [48] | (NIR − RE)/(NIR + RE) | Normalized Difference Red Edge index: uses the red edge band; sensitive to vegetation structure and health status |
SR [49] | NIR/R | Simple Ratio: used to reflect complex trade-offs in vegetation canopy structure |
NLI [50] | (NIR² − R)/(NIR² + R) | Non-Linear Index: used to assess vegetation canopy health and chlorophyll content |
RDVI [51] | (NIR − R)/√(NIR + R) | Renormalized Difference Vegetation Index: improves the accuracy of vegetation cover assessment |
MSR [52] | (NIR/R − 1)/√(NIR/R + 1) | Modified Simple Ratio: comprehensive assessment of vegetation growth using band ratios |
NG [53] | G/(NIR + R + G) | Normalized Greenness: emphasizes green band reflectance for assessing chlorophyll content |
NR [53] | R/(NIR + R + G) | Normalized Redness: based on red band reflectance, reflecting vegetation health status |
IPVI [54] | NIR/(NIR + R) | Infrared Percentage Vegetation Index: uses the infrared band to reflect vegetation moisture content and health status |
MTVI2 [55] | 1.5·(1.2(NIR − G) − 2.5(R − G))/√((2·NIR + 1)² − (6·NIR − 5·√R) − 0.5) | Modified Triangular Vegetation Index 2: sensitive to chlorophyll content; suitable for high-biomass areas |

R, G, NIR, and RE denote red, green, near-infrared, and red edge band reflectance; "–" marks equations not reproduced here.
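A few of the tabulated indices can be computed directly from band reflectances. The sketch below uses the standard published definitions of NDVI, OSAVI (soil factor 0.16), and NDRE rather than this paper's exact implementation, and the reflectance values are illustrative.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def osavi(nir, red, soil_factor=0.16):
    """Optimized Soil-Adjusted Vegetation Index (standard soil factor 0.16)."""
    return (nir - red) / (nir + red + soil_factor)

def ndre(nir, red_edge):
    """Normalized Difference Red Edge index."""
    return (nir - red_edge) / (nir + red_edge)

# Illustrative reflectances for the paper's 650/730/840 nm bands
red      = np.array([0.08, 0.10])
red_edge = np.array([0.30, 0.28])
nir      = np.array([0.45, 0.40])
print(ndvi(nir, red))   # roughly [0.698, 0.6]
```

In the paper's pipeline, per-pixel index maps like these would be stacked into the 22-channel VI input of the fusion network.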
Configuration Item | Value |
---|---|
CPU | Intel(R) Xeon(R) CPU E5-2650 v3 (Intel, Santa Clara, CA, USA) |
GPU | NVIDIA Tesla T4 16 GB (NVIDIA, Santa Clara, CA, USA) |
CUDA | 11.3 |
Operating system | Ubuntu 18.04.2 LTS (64-bit) |
Memory | 128 GB |
Hard drive | 2 TB |
Deep learning framework | PyTorch 1.11.0 |
Language | Python 3.8 |
Input Image | Model | Subsetacc/% | Accuracy/% | Recall/% | Precision/% | F1/% | Forward Pass Size (MB) | Params Size (MB) |
---|---|---|---|---|---|---|---|---|
RGB | Attention-92 | 39.19 | 61.29 | 89.71 | 61.30 | 72.81 | 723.53 | 192.37 |
RGB | NFNet | 80.04 | 75.61 | 76.62 | 77.11 | 76.92 | 199.65 | 161.76 |
RGB | Query2Label | 79.71 | 77.95 | 80.14 | 79.91 | 79.95 | 924.47 | 367.24 |
RGB + VIs | Attention-92 | 35.49 | 59.51 | 88.80 | 59.73 | 71.36 | 723.53 | 192.64 |
RGB + VIs | NFNet | 88.18 | 82.00 | 82.69 | 83.60 | 83.11 | 199.65 | 162.03 |
RGB + VIs | Query2Label | 86.16 | 81.49 | 82.89 | 83.22 | 83.10 | 924.47 | 367.51 |
RGB + VIs | AMMFNet | 92.92 | 85.43 | 85.89 | 86.54 | 86.21 | 192.75 | 164.63 |
Input Image | Model | Aphid | Alternaria | Mosaic | Brown Spot | Gray Spot |
---|---|---|---|---|---|---|
RGB | Attention-92 | 68.51 | 87.31 | 99.60 | 72.02 | 86.33 |
RGB | NFNet | 94.91 | 97.92 | 100.0 | 89.60 | 96.82 |
RGB | Query2Label | 94.95 | 97.31 | 100.0 | 90.59 | 96.45 |
RGB + VIs | Attention-92 | 73.82 | 84.54 | 99.41 | 72.33 | 81.60 |
RGB + VIs | NFNet | 96.50 | 98.59 | 100.0 | 95.11 | 97.67 |
RGB + VIs | Query2Label | 95.81 | 98.33 | 99.90 | 93.86 | 97.17 |
RGB + VIs | AMMFNet | 97.54 | 99.01 | 100.0 | 97.41 | 98.47 |
Model | Input | Subsetacc/% | Accuracy/% | Recall/% | Precision/% | F1/% |
---|---|---|---|---|---|---|
ResNet18 | RGB | 83.99 | 78.26 | 78.53 | 79.92 | 79.22 |
ResNet18 | MS | 82.02 | 76.34 | 77.69 | 77.59 | 77.64 |
ResNet18 | VIs | 87.58 | 80.71 | 80.46 | 81.77 | 81.11 |
ResNet18 | AMMFNet | 92.92 | 85.43 | 85.89 | 86.54 | 86.21 |
ResNet34 | RGB | 82.38 | 77.64 | 78.06 | 79.83 | 78.93 |
ResNet34 | MS | 83.10 | 77.37 | 78.04 | 79.38 | 78.70 |
ResNet34 | VIs | 86.20 | 80.15 | 80.95 | 81.93 | 81.44 |
ResNet34 | AMMFNet | 87.02 | 81.88 | 82.71 | 83.99 | 83.34 |
ResNet50 | RGB | 83.83 | 78.56 | 79.56 | 81.04 | 80.29 |
ResNet50 | MS | 85.80 | 79.51 | 80.10 | 81.20 | 80.72 |
ResNet50 | VIs | 86.40 | 80.28 | 80.97 | 82.09 | 81.53 |
ResNet50 | AMMFNet | 87.38 | 81.80 | 82.61 | 83.65 | 83.13 |
Evaluation Metrics | Aphid | Alternaria | Mosaic | Brown Spot | Gray Spot |
---|---|---|---|---|---|
Accuracy/% | 97.54 | 99.01 | 100.0 | 97.41 | 98.47 |
Recall/% | 79.63 | 78.75 | 66.77 | 95.64 | 86.09 |
Precision/% | 94.02 | 95.91 | 100.0 | 97.49 | 85.64 |
Model | RGB | VIs | RF-ML | Channel Attention | Data-Level Fusion | Subsetacc/% |
---|---|---|---|---|---|---|
ResNet18 | ✓ | 83.99 | ||||
✓ | 87.58 | |||||
✓ | ✓ | 86.40 | ||||
✓ | ✓ | 88.14 | ||||
✓ | ✓ | ✓ | ✓ | 90.20 | ||
✓ | ✓ | ✓ | ✓ | ✓ | 92.92 |
Model | Input Image | Subsetacc/% | Accuracy/% | Recall/% | Precision/% | F1/% |
---|---|---|---|---|---|---|
ResNet18 | RGB | 83.99 | 78.26 | 78.53 | 79.92 | 79.22 |
ResNet18 | MS | 82.02 | 76.34 | 77.69 | 77.59 | 77.64 |
ResNet18 | VIs | 87.58 | 80.71 | 82.05 | 81.97 | 81.90 |
ResNet34 | RGB | 82.38 | 77.64 | 78.06 | 79.83 | 78.93 |
ResNet34 | MS | 83.10 | 77.37 | 78.04 | 79.38 | 78.70 |
ResNet34 | VIs | 86.20 | 80.15 | 80.95 | 81.93 | 81.44 |
ResNet50 | RGB | 83.83 | 78.56 | 79.56 | 81.04 | 80.29 |
ResNet50 | MS | 85.80 | 79.51 | 80.10 | 81.20 | 80.72 |
ResNet50 | VIs | 86.40 | 80.28 | 80.97 | 82.09 | 81.53 |
ResNet101 | RGB | 86.61 | 81.14 | 81.85 | 83.01 | 82.50 |
ResNet101 | MS | 87.32 | 80.67 | 81.43 | 82.05 | 81.74 |
ResNet101 | VIs | 89.26 | 82.29 | 82.83 | 83.61 | 83.22 |
Value | Subsetacc/% | Accuracy/% | Recall/% | Precision/% | F1/% |
---|---|---|---|---|---|
88.14 | 81.13 | 81.84 | 82.40 | 82.22 | |
85.64 | 79.06 | 80.08 | 80.34 | 80.21 | |
86.89 | 80.05 | 81.33 | 81.24 | 81.28 |
Method | Model | Subsetacc/% | Accuracy/% | Recall/% | Precision/% | F1/% |
---|---|---|---|---|---|---|
Data-level fusion | ResNet18 | 90.02 | 83.60 | 84.51 | 84.90 | 84.71 |
Feature-level fusion | ResNet18 | 85.64 | 79.06 | 80.08 | 80.34 | 80.21 |
Evaluation Metrics | Aphid | Alternaria | Mosaic | Brown Spot | Gray Spot |
---|---|---|---|---|---|
Accuracy/% | 94.83 | 97.41 | 99.98 | 93.94 | 96.21 |
Recall/% | 73.33 | 59.72 | 60.83 | 91.75 | 76.32 |
Precision/% | 91.15 | 93.90 | 94.68 | 87.02 | 92.62 |
Contribution Aspect | This Work | Literature References |
---|---|---|
Methodology | Developed a multi-source image fusion approach combining RGB and multispectral images. | Utilized single-source RGB images or multispectral images for classification [4,17]. |
Feature selection | Implemented global feature selection using ReliefF algorithm to enhance multi-source data fusion. | Did not include feature selection steps or use artificial feature screening [5,17]. |
Attention mechanism | Applied channel attention to re-weight fused features, improving model performance by 2.72%. | Used different attention mechanisms [5], which did not achieve comparable improvements. |
Input data combination | Evaluated the effectiveness of different input data combinations on various model architectures, demonstrating the superiority of the vegetation indices combination. | Compared single-source input methods without exploring the impact of different input data combinations on model performance [4,5,17]. |
Accuracy | Achieved a subset accuracy of 92.92%, outperforming single-source methods by 8.93% for RGB and 10.9% for multispectral images. | Reported subset accuracies of 83.99% for RGB images and 82.02% for multispectral images [5]. |
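The ReliefF-based global feature selection summarized above can be illustrated with a minimal single-label sketch. The paper uses a multi-label variant (RF-ML); this simplified version, with the hypothetical helper `relieff_weights`, only shows the core near-hit/near-miss idea behind weighting the vegetation indices.

```python
import numpy as np

def relieff_weights(X, y, n_iters=100, seed=0):
    """Core ReliefF idea: features that differ on the nearest same-class
    neighbour (hit) are penalised; features that differ on the nearest
    other-class neighbour (miss) are rewarded."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        i = rng.integers(n)
        dist = np.abs(X - X[i]).sum(axis=1)     # L1 distance to every sample
        dist[i] = np.inf                        # never pick the sample itself
        same = (y == y[i])
        hit = np.argmin(np.where(same, dist, np.inf))
        miss = np.argmin(np.where(~same, dist, np.inf))
        w += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    return w / n_iters

# Toy data: feature 0 separates the classes, feature 1 is noise
X = np.array([[0.0, 0.5], [0.1, 0.4], [1.0, 0.5], [0.9, 0.6]])
y = np.array([0, 0, 1, 1])
w = relieff_weights(X, y)
print(w.argmax())  # 0 -> the discriminative feature gets the largest weight
```

In the paper's setting, `X` would hold the candidate vegetation-index features and the resulting weights would re-weight the 22 selected indices before fusion.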
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Li, H.; Tan, B.; Sun, L.; Liu, H.; Zhang, H.; Liu, B. Multi-Source Image Fusion Based Regional Classification Method for Apple Diseases and Pests. Appl. Sci. 2024, 14, 7695. https://doi.org/10.3390/app14177695