Article

Detection of Crop Damage in Maize Using Red–Green–Blue Imagery and LiDAR Data Acquired Using an Unmanned Aerial Vehicle

by Barbara Dobosz 1, Dariusz Gozdowski 1, Jerzy Koronczok 2, Jan Žukovskis 3 and Elżbieta Wójcik-Gront 1,*

1 Department of Biometry, Institute of Agriculture, Warsaw University of Life Sciences, Nowoursynowska 159, 02-776 Warsaw, Poland
2 Agrocom Polska, Strzelecka 47, 47-120 Żędowice, Poland
3 Department of Business and Rural Development Management, Vytautas Magnus University, 53361 Kaunas, Lithuania
* Author to whom correspondence should be addressed.
Agronomy 2025, 15(1), 238; https://doi.org/10.3390/agronomy15010238
Submission received: 23 December 2024 / Revised: 10 January 2025 / Accepted: 17 January 2025 / Published: 18 January 2025
(This article belongs to the Special Issue Remote Sensing Applications in Crop Monitoring and Modelling)

Abstract:
Crop damage caused by wild animals, particularly wild boars (Sus scrofa), significantly impacts agricultural yields, especially in maize fields. This study evaluates two methods for assessing maize crop damage using UAV-acquired data: (1) a deep learning-based approach employing the Deepness plugin in QGIS, utilizing high-resolution RGB imagery; and (2) a method based on digital surface models (DSMs) derived from LiDAR data. Manual visual assessment, supported by ground-truthing, served as the reference for validating these methods. This study was conducted in 2023 in a maize field in Central Poland, where UAV flights captured high-resolution RGB imagery and LiDAR data. Results indicated that the DSM-based method achieved higher accuracy (94.7%) and sensitivity (69.9%) compared to the deep learning method (accuracy: 92.9%, sensitivity: 35.3%), which exhibited higher precision (92.2%) and specificity (99.7%). The DSM-based method provided a closer estimation of the total damaged area (9.45% of the field) compared to the reference (10.50%), while the deep learning method underestimated damage (4.01%). Discrepancies arose from differences in how partially damaged areas were classified; the deep learning approach excluded these zones, focusing on fully damaged areas. The findings suggest that while DSM-based methods are well-suited for quantifying extensive damage, deep learning techniques detect only completely damaged crop areas. Combining these methods could enhance the accuracy and efficiency of crop damage assessments. Future studies should explore integrated approaches across diverse crop types and damage patterns to optimize wild animal damage evaluation.

1. Introduction

Damage to crops caused by wild animals is a common cause of significant reductions in crop yields. One of the wild animal species that causes the most damage in Europe is the wild boar (Sus scrofa) [1,2,3]. Maize is one of the crops most frequently damaged by wild boars, as it provides them with a food source from sowing to harvest. Immediately after sowing, maize seeds are often eaten by wild boars, reducing the plant population, which in maize leads to a significant reduction in yield. The greatest feeding activity of wild boars in crops is most often observed in the summer, from June to September [4]. Damage to croplands is often intense, causing significant economic losses in many European countries [5,6,7]. Depending on the country, farmers may apply for compensation for damage caused by wild animals, including wild boars, from insurance companies, public administration institutions, or other institutions responsible for paying compensation for crop damage. However, this requires a fairly accurate and objective determination of the extent of damage, which is often difficult due to the pattern of the damage, its dispersion over the entire field area, and the irregular shape of the damaged areas. Therefore, photos taken from UAVs are often used to assess the damage, and various methods of processing these photos are then applied to obtain a quick and accurate assessment of its extent [8,9,10,11,12,13].
Due to the low cost of acquisition, RGB photos are most often used in practice; orthophotos generated from them help to manually delineate the areas damaged by wild animals [12,14,15]. When multispectral cameras are used, vegetation indices such as NDVI can be applied, as long as the damaged areas are characterized by lower index values. This approach is only usable during the growing season, while the plants are still green, because at harvest time, when the plant canopy is dry over the entire cultivated field, vegetation index values are uniformly very low. In addition to optical imaging, digital surface models (DSMs) can be used to assess the crop damage area. Such models can be prepared from RGB photos acquired by UAV or from a LiDAR sensor [12,15,16,17]. The use of LiDAR sensors to assess areas of crop damage is currently limited due to the much higher price of such sensors compared to RGB cameras.
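As a minimal illustration of the vegetation-index approach mentioned above, the following sketch computes NDVI from hypothetical near-infrared and red reflectance bands and flags low-index pixels; the 0.4 threshold is an assumption for illustration only and was not used in this study.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype("float32")
    red = red.astype("float32")
    return (nir - red) / np.maximum(nir + red, 1e-6)  # guard against division by zero

# Hypothetical reflectance values from a multispectral orthomosaic
# (healthy maize in the first column, a damaged patch in the second).
nir_band = np.array([[0.45, 0.10], [0.50, 0.08]])
red_band = np.array([[0.08, 0.09], [0.07, 0.07]])

index = ndvi(nir_band, red_band)
possibly_damaged = index < 0.4  # assumed threshold; damaged areas show lower NDVI
print(index.round(2))
print(possibly_damaged)
```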
Depending on the input data, crop species, growth stage, the pattern of crop damage, and other factors, various image classification methods are applied to evaluate the extent of the crop damage area [14,15,16,18]. When one variable is used for classification, for example, the value of a vegetation index (e.g., NDVI) or plant height assessed on the basis of a DSM, classification can be performed by setting a threshold value for that variable. When many variables are used for classification, for example, individual bands of RGB or multispectral images, multivariate classification methods are applied. Such methods range from simple decision trees [19] to more complex methods such as cluster analysis [20], machine learning [21], or deep learning techniques [22,23]. The choice of classification method often depends on the crop, type of damage, and plant development stage. Due to their ease of application, automatic classification with deep learning models that have been trained on crop damage data for specific crops is becoming increasingly popular. One such model is the Corn Field Damage Segmentation deep learning model in the ONNX (Open Neural Network Exchange) format [24], which can be applied via the Deepness plugin for QGIS software [25]. This classification method uses RGB orthophotos and allows very quick detection of the areas damaged by wild boars in maize fields.
The aim of this study is to compare the results of maize crop damage classification obtained using two methods: a deep learning method using the Deepness plugin for QGIS software based on RGB imagery, and classification based on plant canopy height evaluated using a digital surface model (DSM) derived from LiDAR data. The results were compared with the reference crop damage area delineated by visual assessment of the UAV imagery supported by on-site inspection of the field.

2. Materials and Methods

2.1. Study Area and Data Acquisition

This study was conducted in 2023 in Central Poland in a maize field located at the Experimental Station of Warsaw University of Life Sciences, Wilanów-Obory (52°04′2″ N 21°08′55″ E) (Figure 1). According to the Köppen–Geiger classification, the climate of the location is Dfb (humid continental) [26]. The yearly average temperature in 2023 was 10.7 °C; in the warmest months, July and August, the average monthly temperature was about 21 °C. Total yearly precipitation was about 570 mm. The predominant soil type of the studied field is Luvisols (https://soilgrids.org/, accessed on 3 December 2024).
Data for this analysis were collected during UAV flights conducted on September 18, 2023. High-resolution RGB imagery was captured using a DJI Mavic 3 UAV flying at an altitude of 80 m, with a forward overlap of 80% and a side overlap of 80%, ensuring comprehensive coverage of the study area. The DSM was generated using data collected with the DJI L1 sensor mounted on a Matrice 350 RTK UAV flying at altitudes of 60, 70, and 80 m with a forward overlap of 50%. The DJI L1 sensor achieved up to two returns per pulse and a point density of 480 points per square meter.
The total area of the field was 18.49 ha, but it had an irregular shape and no clear border; therefore, a rectangular part of the field with an area of 8.23 ha was used for the analyses. This area was selected because it contains different patterns of crop damage, is more convenient for calculating the percentage of damage, and clearly defines the boundaries of the area taken into account in the analyses.

2.2. Data Analysis

The crop damage areas were delineated using three methods. The first method was manual selection of crop damage areas based on a visual assessment of high-resolution RGB orthophoto imagery supported by ground-truthing (Figure 2). The areas of crop damage were hand-drawn on the map at high magnification. This was performed by an expert experienced in identifying crop damage who had previously visited the field and directly observed the damage caused by animals. This method allowed for precise identification of damage caused by wildlife across the studied field. The total reference crop damage area was determined to be 8642 m2, which represented 10.50% of the total analyzed area. The damaged areas varied significantly in size: the smallest patches covered approximately 1 m2, whereas the largest contiguous damaged area, defined as an individual (island) polygon separated from the other crop damage polygons, reached 3775 m2. Such variability highlights the irregular nature of crop damage patterns caused by wild animals, which often exhibit a scattered distribution across the field.
The second method of crop damage detection utilized a digital surface model (DSM) derived from LiDAR data acquired during a UAV flight conducted at the end of September 2023. Additionally, a digital elevation model (DEM) was created from airborne LiDAR data obtained in May 2022. These datasets were processed and analyzed using DJI Terra software, which facilitated the generation of DSMs and DEMs from the LiDAR point clouds. The final crop height map was created by subtracting the DEM (representing the bare ground surface) from the DSM (representing the top of the plant canopy), enabling the estimation of maize plant heights and the identification of areas affected by damage (Figure 3). The analysis was performed using raster layers with a spatial resolution of 1 m (1 × 1 m pixels). This resolution was used because the distance between the maize rows was 0.75 m, and a higher resolution, e.g., a pixel size of 0.5 m or smaller, could cause the areas between the maize rows to be treated as areas without plants. The derived height layer effectively delineated areas where the plant canopy had been reduced due to wildlife activity, as evidenced by the lower height values relative to surrounding undamaged crop areas.
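In this study, the raster processing was performed in DJI Terra and QGIS; the following minimal sketch only illustrates the same height-difference logic with open-source tools, assuming two co-registered 1 m rasters with hypothetical file names (dsm_2023.tif, dem_2022.tif) and using the 1.5 m threshold reported in Table 1.

```python
import numpy as np
import rasterio

HEIGHT_THRESHOLD_M = 1.5  # canopy lower than this is treated as damaged or bare

# Hypothetical file names; both rasters are assumed to be co-registered
# on the same 1 x 1 m grid and expressed in meters above sea level.
with rasterio.open("dsm_2023.tif") as dsm_src, rasterio.open("dem_2022.tif") as dem_src:
    dsm = dsm_src.read(1).astype("float32")   # top of the plant canopy
    dem = dem_src.read(1).astype("float32")   # bare ground surface
    profile = dsm_src.profile

crop_height = dsm - dem                                    # estimated plant height
damage_mask = (crop_height < HEIGHT_THRESHOLD_M).astype("uint8")

# Save the binary mask (1 = damaged / no crop, 0 = undamaged maize).
profile.update(dtype="uint8", count=1, nodata=None)
with rasterio.open("damage_mask.tif", "w", **profile) as dst:
    dst.write(damage_mask, 1)

print(f"Estimated damaged share of the raster: {damage_mask.mean():.1%}")
```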
The third method involved automated crop damage detection using deep convolutional neural networks (CNNs) and transformers trained on imagery from maize fields across Poland [27]. The model was developed based on RGB imagery, at an average spatial resolution of 3 cm per pixel, from 14 different locations covering over 360 ha in total. This makes it possible to use the method on images collected using inexpensive, popular, consumer-grade UAVs. The model can be applied using the Deepness plugin: https://plugins.qgis.org/plugins/deepness/ (accessed on 3 December 2024). The Deepness plugin allows users to easily perform segmentation, detection, and regression on raster orthophotos with custom ONNX neural network models. The analysis was performed using the Deepness plugin integrated into QGIS 3.40 software. The input data consisted of an RGB raster with a high spatial resolution of 2.1 cm (pixel size 2.1 × 2.1 cm), which is similar to the spatial resolution of the orthophotos used for the development of the model, enabling detailed identification of crop damage patterns. The RGB orthophoto used as input for the deep learning model was generated using DJI Terra software. The deep learning model segmented the input imagery into damaged and undamaged areas, generating a damage mask for further comparison with the reference dataset. Figure 4 presents the areas of crop damage identified by the neural network alongside the manually selected reference areas. While the automated segmentation successfully delineated damaged zones, partial crop damage, where individual plants or small groups of plants remained intact, posed challenges for accurate classification, highlighting the need for further refinement of the model.
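In this study the model was run through the Deepness plugin's graphical interface in QGIS, which handles tiling and georeferencing internally. For illustration only, the sketch below shows how a segmentation model in ONNX format can be applied to a single RGB tile with onnxruntime; the tile size, input normalization, NCHW layout, and damage class index are assumptions rather than the documented interface of the Corn Field Damage Segmentation model.

```python
import numpy as np
import onnxruntime as ort
from PIL import Image

# Hypothetical paths; the actual model is distributed via the Deepness Model ZOO [25].
session = ort.InferenceSession("corn_field_damage_segmentation.onnx")
input_name = session.get_inputs()[0].name

# Assumed preprocessing: a 512 x 512 RGB tile scaled to [0, 1], batch-first NCHW layout.
tile = np.asarray(Image.open("orthophoto_tile.png").convert("RGB"), dtype=np.float32) / 255.0
tile = np.transpose(tile, (2, 0, 1))[np.newaxis, ...]      # shape (1, 3, H, W)

logits = session.run(None, {input_name: tile})[0]           # assumed (1, n_classes, H, W)
class_map = np.argmax(logits, axis=1)[0]                    # per-pixel class index

damage_pixels = int((class_map == 1).sum())                 # class 1 = damage (assumed)
print(f"Pixels classified as damaged in this tile: {damage_pixels}")
```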

3. Results

For each method of crop damage evaluation, the total damaged area was calculated. The results are presented in Table 1. The largest area of crop damage was obtained by manual selection based on visual assessment (10.50% of the total studied area). The main reason for this result is the selection of continuous areas of crop damage in which not all plants were damaged by wild animals. In such areas, individual plants or groups of a few plants are still growing and are not damaged. An example of such areas is presented in Figure 5. It is not clear whether the entire area should be treated as damaged or whether small undamaged patches should be distinguished within such areas and treated as undamaged crop. The best approach in such a case would probably be detailed crop yield mapping, but this is outside the scope of this study. The method based on the difference between the LiDAR-derived DSM and the DEM yielded a damaged area similar to the reference (9.45% of the total studied area). The smallest damaged area was found for the method using deep neural networks (4.01% of the total studied area), mainly because areas where the crop was only partially damaged were treated as undamaged.
Assuming that the manually selected areas of crop damage represent true damage and can be treated as the reference damaged area, parameters of classification accuracy were calculated. Table 2 presents the areas of crop damage classified correctly, i.e., in accordance with the reference crop damage; such areas are reported as true positives (TPs). In addition, false positives (FPs) are areas incorrectly identified as damaged; true negatives (TNs) are areas correctly identified as undamaged; and false negatives (FNs) are areas incorrectly identified as undamaged.
Based on the results presented in Table 2, the classification performance parameters were calculated as follows (a short computational sketch is given after this list):
  • Accuracy = (TP + TN)/(TP + TN + FP + FN);
  • Precision (Positive Predictive Value) = TP/(TP + FP);
  • Sensitivity (True Positive Rate) = TP/(TP + FN);
  • Specificity = TN/(TN + FP).
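As a minimal illustration, the following sketch computes the four parameters from the confusion areas reported in Table 2; the printed values reproduce those in Table 3.

```python
def classification_metrics(tp: float, fp: float, tn: float, fn: float) -> dict:
    """Compute area-based classification parameters from confusion areas (in m2)."""
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "precision": tp / (tp + fp),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Confusion areas (m2) taken from Table 2.
dsm_metrics = classification_metrics(tp=6037, fp=1744, tn=71_924, fn=2605)
dnn_metrics = classification_metrics(tp=3047, fp=256, tn=73_412, fn=5595)

for label, metrics in (("DSM", dsm_metrics), ("Deep neural networks", dnn_metrics)):
    print(label, {name: round(value, 3) for name, value in metrics.items()})
# DSM:                  accuracy 0.947, precision 0.776, sensitivity 0.699, specificity 0.976
# Deep neural networks: accuracy 0.929, precision 0.922, sensitivity 0.353, specificity 0.997
```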
The method of crop damage evaluation based on a DSM was characterized by higher accuracy (0.947) than the method using deep neural networks (0.929) (Table 3). Accuracy indicates the proportion of correctly classified area relative to the total study area. In this case, the higher accuracy of the DSM-based method was due to the larger TP area, i.e., the area correctly classified as crop damage in accordance with the reference. Higher precision was observed for the method based on deep neural networks (0.922) than for the method based on a DSM (0.776). This was mainly caused by the very small FP area, i.e., the area incorrectly classified as crop damage by the deep neural network method. Sensitivity was far from optimal for both methods: 0.699 for the DSM-based method and 0.353 for the deep neural network method. The low sensitivity of the deep neural network method was caused mainly by the small area correctly classified as crop damage (3047 m2), which was about half that of the DSM-based method (6037 m2). Sensitivity is a very important parameter in such evaluations because it reflects how much of the damaged area was omitted in the classification. Both methods omitted a fairly large part of the damaged area due to the presence of groups of individual plants that were not damaged. Such groups were not treated as crop-damaged areas, and the deep neural network method in particular excluded these areas from the damaged class. This is visible in Figure 6, where the crop damage selected using the DSM and the DNN is presented together with the reference crop damage for a selected part of the study area. Specificity was high for both methods, 0.976 for the DSM-based method and 0.997 for the deep neural network method, owing to the large TN areas, i.e., areas correctly identified as undamaged, for both classification methods.

4. Discussion

The results of this study demonstrated that the DSM-based method for crop damage evaluation exhibited higher accuracy and greater sensitivity than the method utilizing deep neural networks; however, it showed lower precision and slightly reduced specificity. Both methods performed similarly in areas where the crops were entirely damaged, but discrepancies occurred in regions where undamaged plants remained. In particular, the deep neural network-based method failed to classify areas containing partially damaged crops as "damaged". This suggests that while the deep neural network method is suitable for detecting areas of total crop loss, it is less effective when some plants remain intact. Nonetheless, this method offers significant advantages: it does not require costly LiDAR sensors and relies on RGB imagery, which can be captured using low-cost UAV-mounted cameras. Furthermore, the data analysis process is largely automated, making crop damage detection quick, simple, and cost-effective. In contrast, the DSM-based method achieves higher accuracy but demands more expensive LiDAR sensors and an accurate digital elevation model (DEM) of the field area, which may not always be readily available. A comparable study on maize crop damage caused by wild animals was conducted in Poland by Aszkowski et al. [27], where deep convolutional neural networks were applied to analyze 13 maize fields. The reference damage areas were manually delineated through visual assessment. That study reported sensitivity (the true positive rate) ranging from 0.29 to 0.90 (mean: 0.69) and precision between 0.56 and 0.85 (mean: 0.73). The higher sensitivity observed in that study compared to our results is likely due to the uniformity of the crop damage patterns; the damaged areas lacked individual or small groups of undamaged plants, making the classification task simpler. This highlights that detecting damaged areas without undamaged plants is considerably easier. The crop damage patterns observed in our study were more complex and heterogeneous, as the damage occurred throughout the entire vegetation period, from sowing to full plant maturity. Wild boars consumed maize seeds immediately after sowing and later destroyed entire plants, contributing to the irregular damage patterns observed in our study. A similar study was conducted by Rutten et al. [8] in Belgium, where crop damage by wild boar was analyzed across 79 fields. The researchers applied machine learning methods with GEOBIA (Geographic Object-Based Image Analysis) on UAV RGB imagery. The reported overall accuracy ranged between 84.5% and 96.5%, depending on the reference data used, which aligns closely with the results of our study. However, key parameters such as sensitivity, which is crucial for assessing the effectiveness of damage detection methods, were not provided, limiting the ability to make detailed comparisons between studies. Additionally, Samiappan et al. [28] conducted a study in the USA on maize crop damage caused by wild boar. The study employed segmentation-based fractal texture analysis and support vector machines for the automatic classification of UAV RGB imagery. Reference damage areas were collected using ground-truthing with GPS receivers. Five maize fields were analyzed, and the reported overall accuracy ranged from 64.9% to 77.7%. This accuracy was notably lower than in our study, primarily due to a high error of omission, which ranged between 28.5% and 40.8%.
In our previous study on the evaluation of crop damage in maize using UAV-derived data, a DSM was successfully used to evaluate areas damaged by wild boar [15]. In that study, a spatial filter with the "edge" option was applied, which allowed the selection of most of the areas in which maize was damaged. The pattern of the damaged crop differed from that of the present study, as there were many damaged areas scattered throughout the field; however, each damaged area was relatively small, usually from a few to a dozen or so square meters. This allowed filtering to be used to determine the damaged areas. In the current study, the pattern of the damage was different because some damaged areas were large, covering even thousands of square meters. In this case, the use of a spatial filter did not allow for precise selection of the damaged areas of maize. This is mainly because the spatial filter detects pixels that are atypical in comparison to neighboring pixels within a certain distance; when the area of crop damage is very large, the damaged crop becomes "typical" within that area and the remaining undamaged crop becomes "atypical". Therefore, the method based on applying a spatial filter to the DSM is not universal, and other methods, e.g., thresholding a DSM based on LiDAR data, should be used in such cases.
The problem of correctly assessing the area of crop damage caused by wild animals, especially wild boar, concerns not only maize but also many other crop species [29,30,31,32,33]. Other crops that are very often damaged by wild boar include wheat, potato, rapeseed, and grassland. In the case of wild boar, large seasonal and spatial variations in crop damage are observed, affecting different crops at different growth stages and resulting in highly heterogeneous types of damage [34,35]. This is why developing automatic damage detection is difficult, especially if the crop damage is not total and only some of the plants are damaged [36]. In recent years, different methods of automatic crop damage detection based on UAV-derived data, including RGB imagery and LiDAR data, have been developed for different types of crop damage and crop species [37,38,39,40,41,42]. Various methods of machine learning, deep learning, convolutional neural networks, and other classification methods are applied to the classification of imagery for the automatic detection of damaged field areas. Depending on the type of crop damage, crop species, and growth stage, different approaches can be more efficient in the automatic classification of crop damage. Automatic methods of crop damage evaluation should not only be highly accurate but also low cost, i.e., cheaper than methods based on ground-truthing measurements [8,27,43].
Table 4 summarizes the advantages and disadvantages of various methods of damage classification in maize based on drone-derived data, taking into account both the authors’ own results and those of other researchers.
In summary, each method has certain drawbacks, and it is difficult to indicate the best method for detecting damage in maize. In small areas or fields, damage may be estimated by drawing damaged areas manually on drone-derived orthophotos. In larger areas, the most cost- and time-effective method is the use of deep neural network models; however, these may be unreliable in the case of atypical patterns of crop damage. Damage estimation using a DSM may become widespread in the future if the cost of LiDAR sensors decreases. Other approaches can also be considered, e.g., based on variables such as vegetation indices like NDVI, provided that the damage occurs during the period of intensive crop growth and the plants are completely damaged (no longer green) [44].

5. Conclusions

The purpose of this study was to evaluate damage to maize crops caused by wild animals, specifically wild boars, using UAV-based RGB imagery and LiDAR data. Manual selection based on visual assessment, classification using digital surface model (DSM) data derived from LiDAR data, and automated segmentation using a deep learning model were examined.
Based on the manual method, which served as the reference, 10.50% of the field area was damaged. The DSM-based method showed high accuracy (94.7%) and sensitivity (69.9%) in identifying damaged areas, with a total damage estimate of 9.45%. However, its precision was slightly lower because some undamaged areas were misclassified as damaged. The deep learning model achieved the highest precision (92.2%) and specificity (99.7%), but its sensitivity was significantly lower (35.3%), and it underestimated the total damaged area at 4.01%, mainly because partially damaged areas in which undamaged plants remained were excluded.
DSM-based methods are more accurate in quantifying large damaged areas, even where some groups of plants within such areas remain undamaged. The combination of both methods, the DSM and a deep learning model, as a spatial union of the two delineated damage areas (see the sketch below) could provide a comprehensive approach to crop damage assessment, taking advantage of each method's strengths. Future studies should combine these approaches and evaluate their effectiveness across various crop types and damage patterns. Another future direction is the development of a more comprehensive deep learning model based not only on RGB data but also on multispectral and/or LiDAR data.
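A minimal sketch of such a combination, assuming both damage masks have already been polygonized into vector layers with hypothetical file names and share a projected CRS in meters, could use geopandas as follows:

```python
import geopandas as gpd

# Hypothetical, already polygonized damage layers in the same projected CRS (meters).
dsm_damage = gpd.read_file("damage_dsm.gpkg")
dnn_damage = gpd.read_file("damage_dnn.gpkg")

# Spatial union: an area counts as damaged if either method flagged it.
combined_geom = dsm_damage.unary_union.union(dnn_damage.unary_union)
combined = gpd.GeoDataFrame(geometry=[combined_geom], crs=dsm_damage.crs)
combined.to_file("damage_combined.gpkg")

print(f"Combined damaged area: {combined.area.sum():.0f} m2")
```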
The results of other studies confirm that the evaluation of crop damage using various machine learning methods can be an effective tool for detecting maize crop damage caused by wildlife; however, differing patterns of crop damage may make its precise determination difficult.

Author Contributions

Conceptualization, B.D. and D.G.; methodology, B.D. and D.G.; validation, B.D., J.K., J.Ž., E.W.-G. and D.G.; formal analysis, B.D., E.W.-G. and D.G.; investigation, B.D.; resources, B.D.; data curation, B.D., E.W.-G. and D.G.; writing—original draft preparation, B.D., J.K., J.Ž., E.W.-G. and D.G.; writing—review and editing, B.D., J.K., J.Ž., E.W.-G. and D.G.; visualization, B.D. and D.G.; supervision, J.K. and D.G.; project administration, B.D., J.K. and D.G.; funding acquisition, B.D. and J.Ž. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Dataset available on request from the authors.

Acknowledgments

The authors would like to express their gratitude to TPI Sp. z o.o. for providing access to the equipment necessary for data collection. Special thanks are extended to Łukasz Piecyk, Karol Rosiak, and Artur Malczewski for their technical support and assistance during the UAV flights and LiDAR data acquisition. Their contribution was invaluable to the successful completion of this study.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Herrero, J.; García-Serrano, A.; Couto, S.; Ortuño, V.M.; García-González, R. Diet of Wild Boar Sus scrofa L. and Crop Damage in an Intensive Agroecosystem. Eur. J. Wildl. Res. 2006, 52, 245–250. [Google Scholar] [CrossRef]
  2. Schley, L.; Roper, T.J. Diet of Wild Boar Sus scrofa in Western Europe, with Particular Reference to Consumption of Agricultural Crops. Mammal Rev. 2003, 33, 43–56. [Google Scholar] [CrossRef]
  3. Amici, A.; Serrani, F.; Rossi, C.M.; Primi, R. Increase in Crop Damage Caused by Wild Boar (Sus scrofa L.): The “Refuge Effect”. Agron. Sustain. Dev. 2012, 32, 683–692. [Google Scholar] [CrossRef]
  4. Mackin, R. Dynamics of Damage Caused by Wild Boar to Different Agricultural Crops. Acta Theriol. 1970, 15, 447–458. [Google Scholar] [CrossRef]
  5. Schley, L.; Dufrêne, M.; Krier, A.; Frantz, A.C. Patterns of Crop Damage by Wild Boar (Sus scrofa) in Luxembourg over a 10-Year Period. Eur. J. Wildl. Res. 2008, 54, 589–599. [Google Scholar] [CrossRef]
  6. Frackowiak, W.; Gorczyca, S.; Merta, D.; Wojciuch-Ploskonka, M. Factors Affecting the Level of Damage by Wild Boar in Farmland in North-eastern Poland. Pest Manag. Sci. 2013, 69, 362–366. [Google Scholar] [CrossRef] [PubMed]
  7. Laznik, Ž.; Trdan, S. Evaluation of Different Soil Parameters and Wild Boar (Sus scrofa [L.]) Grassland Damage. Ital. J. Anim. Sci. 2014, 13, 3434. [Google Scholar] [CrossRef]
  8. Rutten, A.; Casaer, J.; Vogels, M.F.A.; Addink, E.A.; Vanden Borre, J.; Leirs, H. Assessing Agricultural Damage by Wild Boar Using Drones. Wildl. Soc. Bull. 2018, 42, 568–576. [Google Scholar] [CrossRef]
  9. Fischer, J.W.; Greiner, K.; Lutman, M.W.; Webber, B.L.; Vercauteren, K.C. Use of Unmanned Aircraft Systems (UAS) and Multispectral Imagery for Quantifying Agricultural Areas Damaged by Wild Pigs. Crop Prot. 2019, 125, 104865. [Google Scholar] [CrossRef]
  10. Michez, A.; Morelle, K.; Lehaire, F.; Widar, J.; Authelet, M.; Vermeulen, C.; Lejeune, P. Use of Unmanned Aerial System to Assess Wildlife (Sus scrofa) Damage to Crops (Zea mays). J. Unmanned Veh. Syst. 2016, 4, 266–275. [Google Scholar] [CrossRef]
  11. Drimaj, J.; Skoták, V.; Kamler, J.; Plhal, R.; Adamec, Z.; Mikulka, O.; Janata, P. Comparison of Methods for Estimating Damage by Wild Ungulates on Field Crops. Agriculture 2023, 13, 1184. [Google Scholar] [CrossRef]
  12. Kuželka, K.; Surový, P. Automatic Detection and Quantification of Wild Game Crop Damage Using an Unmanned Aerial Vehicle (UAV) Equipped with an Optical Sensor Payload: A Case Study in Wheat. Eur. J. Remote Sens. 2018, 51, 241–250. [Google Scholar] [CrossRef]
  13. Friesenhahn, B.A.; Massey, L.D.; DeYoung, R.W.; Cherry, M.J.; Fischer, J.W.; Snow, N.P.; VerCauteren, K.C.; Perotto-Baldivieso, H.L. Using Drones to Detect and Quantify Wild Pig Damage and Yield Loss in Corn Fields throughout Plant Growth Stages. Wildl. Soc. Bull. 2023, 47, e1437. [Google Scholar] [CrossRef]
  14. Jełowicki, Ł.; Sosnowicz, K.; Ostrowski, W.; Osińska-Skotak, K.; Bakuła, K. Evaluation of Rapeseed Winter Crop Damage Using UAV-Based Multispectral Imagery. Remote Sens. 2020, 12, 2618. [Google Scholar] [CrossRef]
  15. Dobosz, B.; Gozdowski, D.; Koronczok, J.; Žukovskis, J.; Wójcik-Gront, E. Evaluation of Maize Crop Damage Using UAV-Based RGB and Multispectral Imagery. Agriculture 2023, 13, 1627. [Google Scholar] [CrossRef]
  16. Garcia Millan, V.E.; Rankine, C.; Sanchez-Azofeifa, G.A. Crop Loss Evaluation Using Digital Surface Models from Unmanned Aerial Vehicles Data. Remote Sens. 2020, 12, 981. [Google Scholar] [CrossRef]
  17. Debnath, S.; Paul, M.; Debnath, T. Applications of LiDAR in Agriculture and Future Research Directions. J. Imaging 2023, 9, 57. [Google Scholar] [CrossRef] [PubMed]
  18. Puig Garcia, E.; Gonzalez, F.; Hamilton, G.; Grundy, P. Assessment of Crop Insect Damage Using Unmanned Aerial Systems: A Machine Learning Approach; Modelling and Simulation Society of Australia and New Zealand Inc. (MSSANZ): Gold Coast, Australia, 2015; pp. 1420–1426. [Google Scholar]
  19. Zhang, M.; Wu, B.; Yu, M.; Zou, W.; Zheng, Y. Crop Condition Assessment with Adjusted NDVI Using the Uncropped Arable Land Ratio. Remote Sens. 2014, 6, 5774–5794. [Google Scholar] [CrossRef]
  20. Marino, S.; Alvino, A. Detection of Homogeneous Wheat Areas Using Multi-Temporal UAS Images and Ground Truth Data Analyzed by Cluster Analysis. Eur. J. Remote Sens. 2018, 51, 266–275. [Google Scholar] [CrossRef]
  21. Kwak, G.-H.; Park, N.-W. Impact of Texture Information on Crop Classification with Machine Learning and UAV Images. Appl. Sci. 2019, 9, 643. [Google Scholar] [CrossRef]
  22. Bouguettaya, A.; Zarzour, H.; Kechida, A.; Taberkit, A.M. Deep Learning Techniques to Classify Agricultural Crops Through UAV Imagery: A Review. Neural Comput. Appl. 2022, 34, 9511–9536. [Google Scholar] [CrossRef] [PubMed]
  23. Teixeira, I.; Morais, R.; Sousa, J.J.; Cunha, A. Deep Learning Models for the Classification of Crops in Aerial Imagery: A Review. Agriculture 2023, 13, 965. [Google Scholar] [CrossRef]
  24. Aszkowski, P.; Ptak, B.; Kraft, M.; Pieczyński, D.; Drapikowski, P. Deepness: Deep Neural Remote Sensing Plugin for QGIS. SoftwareX 2023, 23, 101495. [Google Scholar] [CrossRef]
  25. Aszkowski, P.; Ptak, B. Deepness Model ZOO 2022. Available online: https://qgis-plugin-deepness.readthedocs.io/en/latest/main/main_model_zoo.html (accessed on 3 December 2024).
  26. Beck, H.E.; Zimmermann, N.E.; McVicar, T.R.; Vergopolan, N.; Berg, A.; Wood, E.F. Present and Future Köppen-Geiger Climate Classification Maps at 1-Km Resolution. Sci. Data 2018, 5, 180214. [Google Scholar] [CrossRef] [PubMed]
  27. Aszkowski, P.; Kraft, M.; Drapikowski, P.; Pieczyński, D. Estimation of Corn Crop Damage Caused by Wildlife in UAV Images. Precis. Agric. 2024, 25, 2505–2530. [Google Scholar] [CrossRef]
  28. Samiappan, S.; Prince Czarnecki, J.M.; Foster, H.; Strickland, B.K.; Tegt, J.L.; Moorhead, R.J. Quantifying Damage from Wild Pigs with Small Unmanned Aerial Systems. Wildl. Soc. Bull. 2018, 42, 304–309. [Google Scholar] [CrossRef]
  29. Sarwar, M. Raiding of Agricultural Crops and Forests by Wild Boar (Sus scrofa L.) and Its Mitigation Tricks. J. Sci. Agric. 2019, 3, 01–05. [Google Scholar] [CrossRef]
  30. Kleijkers, Y.I.M. Wild Boar Damage Mapping in Agricultural Grass and Wheatlands Using Unmanned Aerial Vehicle (UAV) Data. Master’s Thesis, Lund University, Lund, Sweden, 2024. [Google Scholar]
  31. Rutten, A.; Casaer, J.; Strubbe, D.; Leirs, H. Agricultural and Landscape Factors Related to Increasing Wild Boar Agricultural Damage in a Highly Anthropogenic Landscape. Wildl. Biol. 2019, 2020, 1–11. [Google Scholar] [CrossRef]
  32. Cai, J.; Jiang, Z.; Zeng, Y.; Li, C.; Bravery, B.D. Factors Affecting Crop Damage by Wild Boar and Methods of Mitigation in a Giant Panda Reserve. Eur. J. Wildl. Res. 2008, 54, 723–728. [Google Scholar] [CrossRef]
  33. Morelle, K.; Lejeune, P. Seasonal Variations of Wild Boar Sus scrofa Distribution in Agricultural Landscapes: A Species Distribution Modelling Approach. Eur. J. Wildl. Res. 2015, 61, 45–56. [Google Scholar] [CrossRef]
  34. Bobek, B.; Furtek, J.; Bobek, J.; Merta, D.; Wojciuch-Ploskonka, M. Spatio-Temporal Characteristics of Crop Damage Caused by Wild Boar in North-Eastern Poland. Crop Prot. 2017, 93, 106–112. [Google Scholar] [CrossRef]
  35. Ficetola, G.F.; Bonardi, A.; Mairota, P.; Leronni, V.; Padoa-Schioppa, E. Predicting Wild Boar Damages to Croplands in a Mosaic of Agricultural and Natural Areas. Curr. Zool. 2014, 60, 170–179. [Google Scholar] [CrossRef]
  36. Johenneken, M.; Drak, A.; Herpers, R. Damage Analysis of Grassland from Aerial Images Applying Convolutional Neural Networks. In Proceedings of the 2020 International Conference on Software, Telecommunications and Computer Networks (SoftCOM), Split, Croatia, 17–19 September 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6. [Google Scholar]
  37. Soner, A.; Chourasiya, D.; Rathore, P.; Nikam, G. A Survey on Automatic Crops Damage Assessment Using Remote Sensing. In Proceedings of the 2020 International Conference on Innovative Computing & Communications, Delhi, India, 21–23 February 2020. [Google Scholar] [CrossRef]
  38. Mitra, A.; Singhal, A.; Mohanty, S.P.; Kougianos, E.; Ray, C. eCrop: A Novel Framework for Automatic Crop Damage Estimation in Smart Agriculture. SN Comput. Sci. 2022, 3, 319. [Google Scholar] [CrossRef]
  39. Karimzadeh, R.; Naharki, K.; Park, Y.-L. Detection of Bean Damage Caused by Epilachna varivestis (Coleoptera: Coccinellidae) Using Drones, Sensors, and Image Analysis. J. Econ. Entomol. 2024, 117, 2143–2150. [Google Scholar] [CrossRef] [PubMed]
  40. Tian, F.; Vieira, C.C.; Zhou, J.; Zhou, J.; Chen, P. Estimation of Off-Target Dicamba Damage on Soybean Using UAV Imagery and Deep Learning. Sensors 2023, 23, 3241. [Google Scholar] [CrossRef]
  41. Azizi, A.; Zhang, Z.; Rui, Z.; Li, Y.; Igathinathane, C.; Flores, P.; Mathew, J.; Pourreza, A.; Han, X.; Zhang, M. Comprehensive Wheat Lodging Detection after Initial Lodging Using UAV RGB Images. Expert Syst. Appl. 2024, 238, 121788. [Google Scholar] [CrossRef]
  42. Jurado, J.M.; Pádua, L.; Feito, F.R.; Sousa, J.J. Automatic Grapevine Trunk Detection on UAV-Based Point Cloud. Remote Sens. 2020, 12, 3043. [Google Scholar] [CrossRef]
  43. Sibanda, M.; Ndlovu, H.S.; Brewer, K.; Buthelezi, S.; Matongera, T.N.; Mutanga, O.; Odidndi, J.; Clulow, A.D.; Chimonyo, V.G.P.; Mabhaudhi, T. Remote Sensing Hail Damage on Maize Crops in Smallholder Farms Using Data Acquired by Remotely Piloted Aircraft System. Smart Agric. Technol. 2023, 6, 100325. [Google Scholar] [CrossRef]
  44. Mohammad, L.; Bandyopadhyay, J.; Sk, R.; Mondal, I.; Nguyen, T.T.; Lama, G.F.C.; Anh, D.T. Estimation of Agricultural Burned Affected Area Using NDVI and dNBR Satellite-Based Empirical Models. J. Environ. Manag. 2023, 343, 118226. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Field of maize (border of the field marked in red) with the part of the field analyzed in this study (border of the field marked in blue) (a) and location of the field on the map of Poland (b).
Figure 2. Part of the field analyzed in this study (border of the field marked in blue) with the reference crop-damage areas selected manually based on visual assessment (a) and photos presenting examples of crop damage by wild animals (b,c).
Figure 3. Estimated crop height as a difference between DSM (for August 2023) and DEM (for May 2022).
Figure 4. Estimated crop damage area using deep neural networks together with reference area of crop damage (manual selection based on visual assessment).
Figure 5. Part of the field presenting crop damage areas using manual selection (areas marked by blue line) based on visual assessment.
Figure 6. Part of the field presenting crop damage classified by DSM (a) and selected using DNN—deep neural networks (b) together with the reference area selected by manual selection based on visual assessment.
Table 1. The total estimated area of crop damage of maize based on three different methods.
Method of Evaluation | Total Area (in m2) | Percentage of the Field
Manual (based on visual assessment) | 8642 | 10.50%
Based on DSM and DEM (difference below 1.5 m *) | 7781 | 9.45%
Using deep neural networks | 3303 | 4.01%
* The typical height of maize plants in late growth stages is about 2–3 m. Maize plants are much taller than most weeds, whose height usually does not exceed 1 m. Because of this, 1.5 m was selected as the threshold, which allows undamaged maize plants to be distinguished from areas without any plants or areas with only weeds.
Table 2. Areas (in m2) of damaged or undamaged crops in accordance with the reference area, which was selected manually as crop damage.
Classification outcome | Based on DSM | Using Deep Neural Networks
True positive (TP): classified correctly as crop damage | 6037 | 3047
False positive (FP): classified incorrectly as crop damage | 1744 | 256
True negative (TN): correctly identified as undamaged | 71,924 | 73,412
False negative (FN): incorrectly identified as undamaged | 2605 | 5595
Table 3. Parameters of classification in accordance with the reference area, which was selected manually as crop damage.
Parameter | Based on DSM | Using Deep Neural Networks
Accuracy = (TP + TN)/(TP + TN + FP + FN) | 0.947 | 0.929
Precision (Positive Predictive Value) = TP/(TP + FP) | 0.776 | 0.922
Sensitivity (True Positive Rate) = TP/(TP + FN) | 0.699 | 0.353
Specificity = TN/(TN + FP) | 0.976 | 0.997
Table 4. Main advantages and disadvantages of various methods of damage classification in maize based on drone-derived data.
Criterion | Manual Selection | Based on DSM | Using Deep Neural Networks
Accuracy | high | moderate to high | low to high
Time consumption | very high | moderate | low
Costs of data acquisition | low | moderate to high | low
Costs of data analysis | high (due to high time consumption) | moderate | low (assuming that the model is already developed)
Overall assessment of the method | easy to apply only in small areas, time-consuming on large fields | good efficiency but requires expensive LiDAR sensors | low costs and easy application, but accuracy can be insufficient for atypical crop damage
