Article

Incorporating Surface Elevation Information in UAV Multispectral Images for Mapping Weed Patches

by Theodota Zisi 1, Thomas K. Alexandridis 1,*, Spyridon Kaplanis 1, Ioannis Navrozidis 1, Afroditi-Alexandra Tamouridou 2, Anastasia Lagopodi 3, Dimitrios Moshou 2 and Vasilios Polychronos 4
1 Laboratory of Remote Sensing, Spectroscopy and GIS, Faculty of Agriculture, Aristotle University of Thessaloniki, 541 24 Thessaloniki, Greece
2 Laboratory of Agricultural Engineering, Faculty of Agriculture, Aristotle University of Thessaloniki, 541 24 Thessaloniki, Greece
3 Laboratory of Phytopathology, Faculty of Agriculture, Aristotle University of Thessaloniki, 541 24 Thessaloniki, Greece
4 Geosense S.A., Filikis Etairias 15-17, Pylaia, 555 35 Thessaloniki, Greece
* Author to whom correspondence should be addressed.
J. Imaging 2018, 4(11), 132; https://doi.org/10.3390/jimaging4110132
Submission received: 11 September 2018 / Revised: 16 October 2018 / Accepted: 2 November 2018 / Published: 9 November 2018
(This article belongs to the Special Issue Remote and Proximal Sensing Applications in Agriculture)

Abstract
Accurate mapping of weed distribution within a field is a first step towards effective weed management. The aim of this work was to improve the mapping of milk thistle (Silybum marianum) weed patches through unmanned aerial vehicle (UAV) images using auxiliary layers of information, such as spatial texture and vegetation height estimated from the UAV digital surface model. UAV multispectral images acquired in the visible and near-infrared parts of the spectrum were used as the main source of data, together with texture that was estimated for the image bands using a local variance filter. The digital surface model was created with structure from motion algorithms using the UAV image stereopairs. From this layer, the terrain elevation was estimated using a focal minimum filter followed by a low-pass filter. The plant height was computed by subtracting the terrain elevation from the digital surface model. Three classification algorithms (maximum likelihood, minimum distance and an object-based image classifier) were used to identify S. marianum among other vegetation using various combinations of inputs: image bands, texture and plant height. The resulting weed distribution maps were evaluated for their accuracy using field-surveyed data. Both texture and plant height helped improve the classification accuracy of S. marianum, increasing the overall accuracy from 70% to 87% in 2015, and from 82% to 95% in 2016. As texture is easier to compute than plant height from a digital surface model, it may be preferable for future weed mapping applications.

1. Introduction

Understanding and limiting the damage that weeds can cause to a crop, either within the current or an upcoming growing season, is central to the effective overall management of the crop. The most common way of treating weeds is the application of herbicides, but this method exposes both consumers and the environment to potential risks. Silybum marianum, or milk thistle, is a weed akin to the common thistle. It belongs to the Asteraceae family and can be either an annual or biennial plant. It has a characteristic purple-coloured flower, while its leaves are green with white veins. It is hard to control with herbicides in fields, and its thorny leaves and tall stature are a nuisance to grazing animals in pastures. It grows in dense stands and is more competitive than grass weeds, growing taller to overshadow adjacent species that compete for light [1].
In recent years, unmanned aerial vehicles (UAVs) have proven useful in agricultural management. Their applications vary, but usually involve remote sensing, mapping, land modelling and precision agriculture. The most common use of UAVs in weed management is to acquire multispectral images in the visible and infrared spectrum for weed identification and mapping; however, a number of other potential uses have emerged, including the application of pesticides and other agrochemicals by the UAV itself [2]. Mapping weeds using UAVs provides high spatial resolutions, usually on the order of a few centimetres. This is vital for detecting small weeds, weeds at a young stage, or weeds within low crop cover where the background soil signal is high. In addition to the advantage of spatial resolution, UAVs offer temporal flexibility and can collect data at any time that fits the user's requirements, in contrast to satellite images. Moreover, UAVs can acquire images during overcast conditions, thus bypassing a frequent problem of satellite remote sensing, although cloud shadows and varying illumination conditions during image acquisition may affect data quality [3,4].
A challenge encountered in remote sensing of weeds is low identification accuracy. The mixing of the spectral reflectance of weeds, crop and bare soil within pixels confuses the classification algorithm, an issue that has been ignored in most applications of remote sensing to weed detection [5]. Medium to low overall accuracies (53–69%), although still satisfactory for site-specific weed management, were reported when mapping weeds (mainly Cyperus rotundus L.) in four maize fields in central Italy [6]. The lowest accuracies were attributed to the spectral resolution and spectral sensitivity of the near-infrared (NIR) band. The accuracy of mapping broadleaved weeds (Chenopodium album L. and Convolvulus arvensis L.) in a sunflower field in southern Spain ranged from 19% to 100%, with the lower performances attributed to the use of a UAV camera capturing images in the visible wavelengths only [7]. Tamouridou et al. [8] achieved an accuracy of 87.04% and a Kappa coefficient of 0.74 using the maximum likelihood algorithm on multispectral images (green–red–near-infrared) along with a texture layer, for the delineation of S. marianum clusters in a field where the dominant species was Avena sterilis L. Pantazi et al. [9] used the Supervised Kohonen Network, Counter-Propagation Artificial Network and XY-Fusion Network on a multispectral UAV image (green–red–near-infrared) along with texture, for the delineation of S. marianum clusters in the same field. The levels of accuracy achieved were high (>95% overall accuracy), partially attributed to the near-infrared band and the texture layer. Using machine learning algorithms (support vector machines), it was possible to map broadleaved weeds in maize and sunflower fields with high accuracy even from RGB images acquired by a UAV [10]. The features offering the highest discrimination capacity were selected from a set of several statistics and measures of different natures, and the resulting map achieved an accuracy as high as 95.5%.
With the increasing availability of very-fine-resolution UAV imagery, the question of the optimum spatial resolution for weed mapping has been examined [8,11]. When mapping the early stages of broadleaved weeds, spatial resolutions finer than 2 cm performed better than 4 cm [7]. However, the highest resolutions did not necessarily produce the best results, as some of the detailed information was merely noise to the classification algorithm. This is the case for large broadleaved weeds (e.g., S. marianum) or weed patches [8]. The use of lower resolutions, and the resulting faster processing, could lead to effective real-time decision support systems [11].
Object-based image analysis (OBIA) is an image processing concept that treats groups of adjacent pixels as objects, taking into consideration parameters of object shape and homogeneity on top of the spectral information. As weeds tend to grow in patches, OBIA has been used for weed mapping with UAV images. Multispectral UAV images of a maize field, acquired in visible and near-infrared wavelengths, were used to identify a broadleaved weed (Amaranthus blitoides) and a grass weed (Sorghum halepense) through OBIA. The resulting map achieved very high weed detection accuracy (86%) and very high agreement with observed field densities (R2 = 0.89) [3]. A UAV was used to collect visible and near-infrared multispectral images over two sunflower fields infested with broadleaved weeds (Amaranthus blitoides S. Wats, Sinapis arvensis L., Convolvulus arvensis L., and Chenopodium album L.). Using OBIA, the weeds were mapped with variable accuracy, ranging from 50% for images in the visible wavelengths up to almost 100% for visible and near-infrared images, for both flying heights tested (30–60 m above ground) [12].
In several studies, vegetation canopy height has been used to map biomass, identify vegetation type, map ecological focus areas for the European Common Agricultural Policy, and for other applications. Lidar (light detection and ranging) is an accurate method of measuring canopy height through terrestrial and aerial laser scanners onboard manned aircraft and UAVs; however, it requires expensive equipment compared with other methods, and well-trained personnel to achieve accurate results. The use of remotely sensed image stereopairs is a lower-cost alternative, based on photogrammetric algorithms that estimate object height, such as structure from motion (SfM) and multiview stereo [13,14]. SfM has been popular for estimating object heights and volumes in various applications including urban mapping, archaeological surveys and quarry monitoring [15]. In vegetation mapping, SfM has been used for mapping orchard tree height [14], maize height and above-ground biomass [16], wheat height for fast field phenotyping [17], and barley biomass estimation [18]. Recently, plant height was incorporated into a random forest–OBIA algorithm for the segmentation of objects containing weeds within rows of cotton and sunflower [19]. The authors report an improved weed detection accuracy compared with previous work on maize [3]. Plant height in conjunction with a vegetation index has also been thresholded into binary maps to identify weeds in maize and sugar beet fields and produce site-specific herbicide application maps [20]. This body of work provides ample precedent for the efficacy of vegetation canopy height data for a variety of purposes, yet its contribution has not been adequately tested in weed mapping.
The aim of this work was to demonstrate the improvement of accuracy of S. marianum weed mapping through UAV images with the use of auxiliary layers of information, such as spatial texture and estimated vegetation height from a UAV digital surface model.

2. Materials and Methods

2.1. Study Area

The research study to map S. marianum took place in a 10.1 ha field in the southeast suburbs of the city of Thessaloniki, Greece, with central coordinates (WGS’84) 40°34′14″ N, 22°59′44″ E (Figure 1), where biological control of S. marianum using the fungus Microbotryum silybum had previously been applied. The climate is Mediterranean, with an average annual precipitation of 450 mm and an average annual temperature of 20 °C.
The elevation is approximately 75 m a.m.s.l. and the topography is slightly inclined towards the southwest. The field was previously cultivated with cereals, but has not been used since 1990. Over these years, the S. marianum infestation has intensified. Other weeds that appear in the field include Avena sterilis L., Bromus sterilis L., Solanum elaeagnifolium Cav., Conium maculatum L., Cardaria draba L. and Rumex sp. L.
Figure 1 shows the orthomosaics of UAV images acquired in 2015 and 2016 (RGB = NIR, R, G). Pink hues appear in areas where vegetation is healthy and chlorophyll is the dominant pigment, found mainly in the central north and central south parts of the field in 2015 and central north and eastern parts in 2016. Cyan hues appear in areas where herbaceous vegetation has senesced, mainly towards the western part of the study area and other scattered locations. The cyan rectangular patch at the southern tip in 2016 is a section where tillage was applied but not cultivated further.
Normally, S. marianum grows up to 1.5–2.0 m × 0.4–1.2 m (height × width) in the study area. A photo of an S. marianum patch in the study area is displayed in Figure 2.

2.2. Data

2.2.1. UAV Data

The UAV images were acquired on two sunny days (19 May 2015 and 22 April 2016) using a Canon S110 NIR camera (12 Mpixels) attached to a fixed-wing UAV (senseFly eBee). The camera's sensor size was 7.44 × 5.58 mm, providing images of 4000 × 3000 pixels in a 4:3 aspect ratio; during the acquisitions the focal length was 5.2 mm, the shutter speed 1/2000 s, and the aperture and ISO were set to Auto. The image bands include green (560 nm, full-width half-maximum (FWHM): 50 nm), red (625 nm, FWHM: 90 nm) and near-infrared (850 nm, FWHM: 100 nm). Fifty-five images were taken during each flight from an altitude of 115 m above ground, with 75% forward overlap and 70% lateral overlap, each covering an area of 120 m by 160 m with a ground sampling distance of 0.04 m (Figure 3). The individual images were imported using the original geocoding information from the UAV GPS stored in the metadata of each image, and numerous tie-points were generated automatically to match the image mosaic in Pix4Dmapper Pro software (http://pix4d.com/). Ground control point locations were collected with a dual-frequency Spectra Precision SP80 GNSS receiver using a real-time kinematic (RTK) method. The final orthomosaics had a pixel size of 0.1 m (horizontal RMSE 0.08 m) (Figure 1). The digital surface models (DSMs) of the area were produced from the point cloud using the SfM technique of Pix4Dmapper Pro.
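As a quick sanity check, the reported ground sampling distance follows directly from the camera geometry and flying height stated above. A minimal calculation (Python; all values taken from the text, assuming the 7.44 mm sensor dimension spans the 4000-pixel image width):

```python
# Ground sampling distance (GSD) from camera geometry and flying height.
sensor_width_mm = 7.44     # Canon S110 sensor width
image_width_px = 4000      # image width in pixels
focal_length_mm = 5.2      # focal length during acquisition
altitude_m = 115.0         # flying height above ground

pixel_pitch_mm = sensor_width_mm / image_width_px        # ~0.00186 mm per pixel
gsd_m = altitude_m * pixel_pitch_mm / focal_length_mm    # ~0.041 m per pixel

print(f"GSD = {gsd_m:.3f} m/pixel")  # ~0.041 m, matching the reported 0.04 m
```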

2.2.2. In-Situ Data

A Trimble GeoXH 2008 GPS (Sunnyvale, CA, USA) with EGNOS correction was used for in-situ recording of vegetation types with an accuracy better than 0.3 m. Plant patches were preferred over individual plants to overcome the mismatch in accuracy between the orthomosaic and the GPS receiver. Thus, the in-situ field surveys recorded locations of large homogeneous patches of S. marianum and patches of other vegetation types. Due to the large density and variation of other vegetation, more than one species could be found in each patch of the other vegetation type, excluding S. marianum. These locations were verified visually on the orthomosaics before further use in the analyses. The field surveys took place on 19 and 29 May 2015, 12 and 22 April 2016, and 20 May 2016, during which 245 points were collected randomly along transects of the test site. Using a subset of these points (57), several polygons were created that described the locations of S. marianum patches and other vegetation patches. The area covered by the training polygons of the two categories was the same, to avoid biasing the statistical classification algorithms (142,000 pixels each). The remaining 188 points were used for evaluating the accuracy of the classification. This exceeded the minimum sample size for validation (N = 51), which was estimated using binomial probability theory, as applied previously in land cover classification [21,22]:
N = Z² p q / E²,
where p is the expected percent accuracy, q = 100 − p, E is the allowable margin of error, and Z = 1.96 is the standard normal deviate for the 95% two-sided confidence level. An expected accuracy of 95% and an allowable error of 5% were selected for this standard field survey, resulting in a minimum sample size of 51 points.
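A minimal sketch of this sample-size calculation (Python). Note that the stated inputs (Z = 1.96, p = 95, E = 5) evaluate to N ≈ 73; the reported minimum of 51 points is reproduced with a slightly wider allowable error (E = 6), so one of the stated inputs presumably differs from the value actually used:

```python
import math

def min_sample_size(p: float, e: float, z: float = 1.96) -> int:
    """Binomial minimum sample size N = Z^2 * p * q / E^2, with p the
    expected percent accuracy, q = 100 - p, and E the allowable error
    in percent (Cochran 1977 [21])."""
    q = 100.0 - p
    return math.ceil(z ** 2 * p * q / e ** 2)

print(min_sample_size(p=95, e=5))  # 73 with the stated inputs
print(min_sample_size(p=95, e=6))  # 51, the reported minimum
```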

2.3. Digital Image Processing and Geographic Analysis

2.3.1. Surface Elevation Information and Texture

Auxiliary layers of information were produced to increase the level of description of vegetation features, and eventually the accuracy of classification. These layers comprise image texture and vegetation height across the field.
The layers of texture information were created by applying a local variance filter (7 × 7) to the UAV image bands (Figure 4a–c,e–g); a sketch of this computation is given below. The texture from the near-infrared band showed the highest ability to discriminate S. marianum, as reported for similar data in the same area [8], and was therefore retained for use in the image classification.
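The local variance filter is not tied to any particular software. A minimal sketch (Python with NumPy/SciPy; band is assumed to be a 2-D array holding one orthomosaic band, e.g. the near-infrared), using the identity var(x) = E[x²] − (E[x])² computed with two mean filters:

```python
import numpy as np
from scipy import ndimage

def local_variance(band: np.ndarray, size: int = 7) -> np.ndarray:
    """Local variance in a size x size moving window."""
    band = band.astype(np.float64)
    mean = ndimage.uniform_filter(band, size=size)          # E[x]
    mean_sq = ndimage.uniform_filter(band ** 2, size=size)  # E[x^2]
    return np.clip(mean_sq - mean ** 2, 0.0, None)          # clip rounding negatives

# texture = local_variance(nir_band, size=7)
```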
The DSM was employed to produce the plant height layer. In this study, a 40 × 40 focal minimum filter was first applied to the DSM to locate local low points. The filter size was chosen after experimentation with various dimensions, so as to cover the largest S. marianum patches of the study area. A 40 × 40 low-pass filter was then applied to eliminate extreme values. The resulting layer represented the terrain elevation (digital elevation model, DEM) where the ground was visible among the plants, or the height of the lowest plants in locations with a consistent vegetation cover larger than 40 × 40 pixels. The DEM layer was subtracted from the original DSM to estimate the plant height [14,18] (Figure 4d,h); a sketch of the procedure follows.
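A minimal sketch of this two-step estimation (Python with SciPy; dsm is assumed to be a 2-D array of surface elevations in metres, and a uniform mean filter stands in for the unspecified low-pass filter):

```python
import numpy as np
from scipy import ndimage

def plant_height(dsm: np.ndarray, window: int = 40) -> np.ndarray:
    """Estimate plant height as the DSM minus a filtered terrain surface."""
    ground = ndimage.minimum_filter(dsm, size=window)   # focal minimum: local low points
    dem = ndimage.uniform_filter(ground, size=window)   # low-pass: smooth extreme values
    return np.clip(dsm - dem, 0.0, None)                # height above estimated terrain

# height = plant_height(dsm_2016, window=40)
```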
The three texture layers (Figure 4a–c,e–g) appear rather similar on the north and east sides of the field. However, the texture from the green band (Figure 4a,e) shows less contrast among vegetation patches, especially in the central and southern parts of the field. Several similarities are also evident between the three texture layers and the plant height layer of each year (Figure 4d,h), as taller vegetation exhibits more pronounced texture.

2.3.2. Image Classification and Accuracy Assessment

Image classification in two classes (S. marianum weed and other vegetation) was performed with three standard classifiers that have been used in weed mapping, two pixel-based and an object-based classifier:
  • The maximum likelihood (ML) algorithm is based on assessing the likelihood that a pixel belongs to a specific category. The method uses the training-area data to estimate the class centres, and the covariance of each spectral class to estimate the probabilities. In addition to the mean values, the variability of reflectance values within each spectral class is therefore considered.
  • The minimum distance (MD) algorithm uses the mean vectors of each pure pixel class from the training areas (the centre of each spectral class) and calculates the Euclidean distance from each unknown pixel to each centre. Pixels are classified to the nearest centre unless a standard deviation or distance threshold is set (a minimal sketch of this rule is given after the list).
  • For the object-based image analysis (OBIA), the commercial software eCognition Developer 9.0 (Trimble GeoSpatial, Munich, Germany) was used. To create the object-oriented environment, segmentation was applied using the ‘multiresolution segmentation’ algorithm at a scale parameter of 40, which after testing proved most appropriate for the size of the weed patches. The parameters determining object homogeneity, namely shape and compactness, were assigned the values 0.1 and 1, respectively, which produced objects in a more meaningful way [23]. Objects were classified using the nearest neighbour algorithm [24]. The in-situ collected samples were assigned to objects that defined the classes of interest, and the mean values of each input layer were used as the features in each respective case of object-based classification. The algorithm uses the distance between the feature values of the object being classified and the range of feature values of each class of interest to define the degree of membership in each class, and eventually assigns the object to a class.
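Of the three classifiers, the minimum distance rule is the simplest to express compactly. A minimal sketch (Python with NumPy; it assumes the input layers, i.e. bands, texture and plant height, are stacked into an (n_layers, H, W) array and that per-class training pixels are available; the optional distance threshold is omitted):

```python
import numpy as np

def minimum_distance(stack: np.ndarray, *class_means: np.ndarray) -> np.ndarray:
    """Assign each pixel to the class with the nearest mean (Euclidean)."""
    n_layers, h, w = stack.shape
    pixels = stack.reshape(n_layers, -1).T            # (H*W, n_layers)
    means = np.stack(class_means)                     # (n_classes, n_layers)
    dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
    return dists.argmin(axis=1).reshape(h, w)         # class index per pixel

# labels = minimum_distance(stack,
#                           weed_train.mean(axis=1),   # S. marianum class centre
#                           other_train.mean(axis=1))  # other vegetation centre
```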
In addition to the image bands, the auxiliary layers of information (texture and plant height) were used as classification inputs to improve performance. A 3 × 3 majority filter was applied to the classified image to remove scattered individual pixels and improve the homogeneity of the result (a sketch is given below). The filter size was selected following the suggestion of previous work on similar data in the same area [8].
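A minimal sketch of the majority filter (Python with SciPy; labels is the classified map as an integer array):

```python
import numpy as np
from scipy import ndimage

def majority_filter(labels: np.ndarray, size: int = 3) -> np.ndarray:
    """Replace each pixel with the most frequent class in its window."""
    def vote(window: np.ndarray) -> float:
        values, counts = np.unique(window.astype(int), return_counts=True)
        return values[counts.argmax()]
    return ndimage.generic_filter(labels, vote, size=size)

# smoothed = majority_filter(labels, size=3)
```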
The classification evaluation was based on a comparison between the pixel classes derived by the classification algorithms and those identified in-situ (ground truth), by creating an error matrix for each classification, from which the user's, producer's and overall accuracies and the Kappa statistic were calculated [25].
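A minimal sketch of these error-matrix metrics (Python with NumPy; truth and predicted are assumed to be integer label arrays at the validation points, with reference classes in the matrix rows and mapped classes in the columns):

```python
import numpy as np

def error_matrix_stats(truth: np.ndarray, predicted: np.ndarray, n_classes: int = 2):
    """Error matrix with overall, user's and producer's accuracies and Kappa [25]."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    np.add.at(cm, (truth, predicted), 1)           # rows: reference, cols: mapped
    total = cm.sum()
    overall = np.trace(cm) / total
    users = np.diag(cm) / cm.sum(axis=0)           # 1 - commission error
    producers = np.diag(cm) / cm.sum(axis=1)       # 1 - omission error
    chance = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2
    kappa = (overall - chance) / (1.0 - chance)
    return cm, overall, users, producers, kappa
```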

3. Results

After applying the three classification algorithms to the various combinations of input data layers, nine maps of the distribution of S. marianum weeds were produced for each year (Figure 5 and Figure 6). In all of the classifications, large patches of S. marianum were identified along the eastern border and in the centre of the study area. In the remaining part, other vegetation types dominate, with a few scattered small S. marianum patches. A location with variable results appears in the northern central and western parts of the study area, where vegetation with high texture (Figure 4a–c,e–g) but low height (Figure 4d,h) appears, yet no S. marianum patches are present (Figure 1).
Table 1 summarises the error matrices produced during the accuracy assessment of the resulting maps, showing the following performance metrics: overall classification accuracy, Kappa statistic, and user's and producer's accuracies for S. marianum and other vegetation. Overall classification accuracies were lower in 2015 for all combinations of input data. This could be due to the later acquisition date in 2015, which, together with the different meteorological conditions, led to more advanced senescence of all vegetation types and thus hindered the distinction among them. When compared on the same input data, the pixel-based classifiers generally performed better than the object-based classifier in 2015, while the object-based classifier performed equally well in 2016. Between the two pixel-based classifiers, higher accuracy levels were achieved with the MD classifier than with the ML classifier in both years. Examining the user's and producer's accuracies of the two pixel-based classifiers, lower user's accuracies were consistently noted for MD in 2015 and in one case in 2016. This is due to the high commission errors caused by the large number of false positives (overestimation) of the other vegetation class, an expected result for this rather generic class with a classifier that does not take class spectral variation into account. As another general finding, higher accuracy was achieved by using the auxiliary layers of information (texture and plant height), accompanied by the highest reliability rates and user's and producer's accuracies.
Examining the spatial distribution of S. marianum together with the error matrices for 2015, all algorithms overestimated the S. marianum distribution in the western side of the study area when using the image bands only as input (Figure 5a,d,g), producing low user's accuracies (Table 1). After inserting texture as an additional input dataset, there was a notable increase in overall accuracy for the MD algorithm (87.04%), owing to the elimination of false positive S. marianum in the western and northern parts of the study area (Figure 5b,e,h). The overall accuracy was equally high when plant height was added as an input dataset for both pixel-based classifiers, providing a similar pattern of S. marianum distribution in the study area (Figure 5c,f,i). User's and producer's accuracies also increased (66–96%). Despite the clear segmentation of weed patches in the area (Figure 5g–i), OBIA did not perform as well as the other algorithms, reaching only 75.9% overall accuracy.
For 2016, ML with image bands only (Figure 6a) underestimated S. marianum in several locations, providing low overall accuracy and very low producer's accuracy (43.75%). On the contrary, MD overestimated S. marianum in the central-eastern part of the image (Figure 6d), providing higher overall accuracy (82.09%) but lower user's accuracy (83.33%). OBIA performed even better (88.81% overall accuracy), with very high user's and producer's accuracies (88.37% and 79.16%, respectively). With the texture layer included in the input data, ML again underestimated S. marianum in several locations (Figure 6b), as evident from the low producer's accuracy (43.75%), while MD identified several small patches at the northern and southern ends of the study area, reaching the highest overall accuracy of the study (95.52%) (Figure 6e). OBIA performed slightly worse, reaching an overall accuracy of 92.53%. After including the plant height layer together with the three image bands (Figure 6c,f,i), all classifiers produced good results, reaching high overall, user's and producer's accuracies.

4. Discussion

This study tested the use of vegetation canopy height together with multispectral UAV images for mapping mature S. marianum weeds. Relevant work has used plant height indirectly, through the incorporation of image texture for weed mapping with several classifiers [9,26] and object-based image analysis [27], as well as through the use of estimated plant height to segment crops and weeds into objects, though not directly in the classification algorithm [19]. However, none of the above-mentioned studies has evaluated the improvement in multispectral image classification achieved by using the vegetation elevation information.
Using surface elevation information in combination with the image bands notably improved the weed mapping performance, increasing the overall classification accuracy from 70% to 87% in 2015, and from 82% to 95% in 2016. The reliability also increased notably (the Kappa statistic rose from 0.4 to 0.7 in 2015, and from 0.5 to 0.9 in 2016), indicating that the results were unlikely to be due to chance. A similar increase in biomass model accuracy was reported when using combinations of barley height and vegetation indices from UAV images, rather than vegetation indices alone [18]. The combination of multispectral UAV imagery with cotton and sunflower height was reported to have improved the accuracy of weed mapping to an R2 of 0.91, compared with an R2 of 0.89 achieved in previous work on maize [3,19].
The highest accuracy in this study (95.5%) was achieved with the image bands and the texture layer classified using the MD algorithm. The positive influence of the texture layer stems from the strong contrast between the S. marianum patches, where the alternation of tall plants and the gaps between them produces very high texture, and the relatively low and uniform other types of vegetation, which are characterised by very low texture. A positive influence of including a texture layer together with UAV bands was also noted when evaluating the feature weights in the hidden neurons of an artificial neural network (a multilayer perceptron with automatic relevance determination) tested for weed mapping in the study area [28].
The incorporation of the plant height layer proved as effective as the incorporation of texture, achieving a total accuracy as high as 94.8%, consistently across the two classifiers. This slightly lower performance, compared with the 95.5% of the previous combination, is due to the fact that this classification failed to recognise one additional validation point in the category 'other vegetation'. Because the two results are so similar, it can be concluded that the plant height layer provided accuracy equal to that of the frequently used texture layer. This is probably because the DSM and texture layers carry analogous information: both reflect the height of the plants, the former directly and the latter indirectly, through the alternation of tall plants, their shadows and the gaps between them.
Overall, the MD algorithm produced better results than ML; the difference between the two was greatest for the combination of image bands and texture layer. The increased weed detection performance of MD is probably due to the strongly non-normal distribution of the texture data (skewness = 6.74), which negatively affects only the ML algorithm, since ML assumes normally distributed class data whereas MD makes no such distributional assumption. The OBIA classifier produced results equal to the pixel-based classifiers in 2016, but achieved consistently lower accuracies in 2015, probably due to the lack of regular objects (patches) that favour this classifier [3,27].
The plant height information proved an effective addition for improving the classification accuracy of S. marianum in this study. S. marianum grows taller than the crop species (Triticum spp.) as well as the predominant grass weeds (e.g., A. sterilis) throughout its development, making it possible to distinguish it by height from the young rosette to the mature dry plant [29]. The plant height information could be useful for other combinations of weeds and crops or pastures with pronounced height differences. Early detection could be possible for broadleaved competitive weed species that grow taller and faster to overshadow adjacent species and successfully compete for light, such as Carduus sp., Onopordum sp. and other thistles [1,29].

5. Conclusions

In this work, the mapping of S. marianum weed clusters was achieved with high accuracy using UAV images and complementary auxiliary layers of information related to plant height. Both texture and plant height helped improve the classification accuracy of S. marianum, enabled by the relatively tall stature of S. marianum. The minimum distance classifier provided equally accurate results whether texture or plant height was combined with the multispectral bands. Thus, as texture is easier to compute than plant height from a DSM, it may be preferable for future weed mapping applications.
The conclusions derived from this work can be applicable to other combinations of weeds and crops or pasture with pronounced height difference. Future work could study the effect of different crops, phenological stages, illumination and cloud cover conditions on the use of surface elevation information.
The produced weed maps are a first step towards site-specific weed management. This is one of the common applications of precision farming, helping to reduce the amount of herbicides used in the field and, in the long term, benefiting farmers, consumers and the environment [30].

Author Contributions

T.K.A. conceived, designed and supervised the work; T.Z., T.K.A., S.K., I.N., A.-A.T. and V.P. collected and analysed the data; T.Z., T.K.A. and S.K. wrote the paper; A.L. and D.M. reviewed and corrected the paper.

Funding

This research received no external funding.

Acknowledgments

Authors are grateful to the American Farm School of Thessaloniki for providing access to the study area.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Anderson, W. Weed Science, 3rd ed.; Waveland Press Inc.: Long Grove, IL, USA, 1996.
  2. Puri, V.; Nayyar, A.; Raja, L. Agriculture drones: A modern breakthrough in precision agriculture. J. Stat. Manag. Syst. 2017, 20, 507–518.
  3. Peña, J.M.; Torres-Sánchez, J.; de Castro, A.I.; Kelly, M.; López-Granados, F. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. PLoS ONE 2013, 8, e77151.
  4. Torres-Sánchez, J.; Peña, J.; de Castro, A.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113.
  5. Thorp, K.; Tian, L. A review on remote sensing of weeds in agriculture. Precis. Agric. 2004, 5, 477–508.
  6. Castaldi, F.; Pelosi, F.; Pascucci, S.; Casa, R. Assessing the potential of images from unmanned aerial vehicles (UAV) to support herbicide patch spraying in maize. Precis. Agric. 2017, 18, 76–94.
  7. Peña, J.; Torres-Sánchez, J.; Serrano-Pérez, A.; de Castro, A.; López-Granados, F. Quantifying efficacy and limits of unmanned aerial vehicle (UAV) technology for weed seedling detection as affected by sensor resolution. Sensors 2015, 15, 5609–5626.
  8. Tamouridou, A.A.; Alexandridis, T.K.; Pantazi, X.E.; Lagopodi, A.L.; Kashefi, J.; Moshou, D. Evaluation of UAV imagery for mapping Silybum marianum weed patches. Int. J. Remote Sens. 2017, 38, 2246–2259.
  9. Pantazi, X.E.; Tamouridou, A.A.; Alexandridis, T.K.; Lagopodi, A.L.; Kashefi, J.; Moshou, D. Evaluation of hierarchical self-organising maps for weed mapping using UAS multispectral imagery. Comput. Electron. Agric. 2017, 130, 224–230.
  10. Pérez-Ortiz, M.; Peña, J.M.; Gutiérrez, P.A.; Torres-Sánchez, J.; Hervás-Martínez, C.; López-Granados, F. Selecting patterns and features for between- and within-crop-row weed mapping using UAV imagery. Expert Syst. Appl. 2016, 47, 85–94.
  11. Gebhardt, S.; Kühbauch, W. A new algorithm for automatic Rumex obtusifolius detection in digital images using colour and texture features and the influence of image resolution. Precis. Agric. 2007, 8, 1–13.
  12. López-Granados, F.; Torres-Sánchez, J.; Serrano-Pérez, A.; de Castro, A.I.; Mesas-Carrascosa, F.-J.; Peña, J.-M. Early season weed mapping in sunflower using UAV technology: Variability of herbicide treatment maps against weed thresholds. Precis. Agric. 2016, 17, 183–199.
  13. Küng, O.; Strecha, C.; Beyeler, A.; Zufferey, J.-C.; Floreano, D.; Fua, P.; Gervaix, F. The accuracy of automatic photogrammetric techniques on ultra-light UAV imagery. In Proceedings of UAV-g 2011: Unmanned Aerial Vehicle in Geomatics; ETH: Zurich, Switzerland, 2011.
  14. Zarco-Tejada, P.J.; Diaz-Varela, R.; Angileri, V.; Loudjani, P. Tree height quantification using very high resolution imagery acquired from an unmanned aerial vehicle (UAV) and automatic 3D photo-reconstruction methods. Eur. J. Agron. 2014, 55, 89–99.
  15. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97.
  16. Li, W.; Niu, Z.; Chen, H.; Li, D.; Wu, M.; Zhao, W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol. Indic. 2016, 67, 637–648.
  17. Holman, F.H.; Riche, A.B.; Michalski, A.; Castle, M.; Wooster, M.J.; Hawkesford, M.J. High throughput field phenotyping of wheat plant height and growth rate in field plot trials using UAV based remote sensing. Remote Sens. 2016, 8, 1031.
  18. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Observ. Geoinf. 2015, 39, 79–87.
  19. De Castro, A.I.; Torres-Sánchez, J.; Peña, J.M.; Jiménez-Brenes, F.M.; Csillik, O.; López-Granados, F. An automatic random forest-OBIA algorithm for early weed mapping between and within crop rows using UAV imagery. Remote Sens. 2018, 10, 285.
  20. Mink, R.; Dutta, A.; Peteinatos, G.G.; Sökefeld, M.; Engels, J.J.; Hahn, M.; Gerhards, R. Multi-temporal site-specific weed control of Cirsium arvense (L.) Scop. and Rumex crispus L. in maize and sugar beet using unmanned aerial vehicle based mapping. Agriculture 2018, 8, 65.
  21. Cochran, W.G. Sampling Techniques, 3rd ed.; Wiley: New York, NY, USA, 1977.
  22. Fitzpatrick-Lins, K. Comparison of sampling procedures and data analysis for a land-use and land-cover map. Photogramm. Eng. Remote Sens. 1981, 47, 343–351.
  23. Li, Q.; Wang, C.; Zhang, B.; Lu, L. Object-based crop classification with Landsat-MODIS enhanced time-series data. Remote Sens. 2015, 7, 16091–16107.
  24. Trimble. Trimble Documentation: eCognition Developer 9.0 User Guide; Trimble: Munich, Germany, 2014; p. 258.
  25. Congalton, R.G. A review of assessing the accuracy of classifications of remotely sensed data. Remote Sens. Environ. 1991, 37, 35–46.
  26. Alexandridis, T.; Tamouridou, A.A.; Pantazi, X.E.; Lagopodi, A.; Kashefi, J.; Ovakoglou, G.; Polychronos, V.; Moshou, D. Novelty detection classifiers in weed mapping: Silybum marianum detection on UAV multispectral images. Sensors 2017, 17, 2007.
  27. David, L.C.G.; Ballado, A.H. Vegetation indices and textures in object-based weed detection from UAV imagery. In Proceedings of the 2016 6th IEEE International Conference on Control System, Computing and Engineering (ICCSCE), Penang, Malaysia, 25–27 November 2016; pp. 273–278.
  28. Tamouridou, A.; Alexandridis, T.; Pantazi, X.; Lagopodi, A.; Kashefi, J.; Kasampalis, D.; Kontouris, G.; Moshou, D. Application of multilayer perceptron with automatic relevance determination on weed mapping using UAV multispectral imagery. Sensors 2017, 17, 2307.
  29. DiTomaso, J.M.; Healy, E.A. Weeds of California and Other Western States; UCANR Publications: Berkeley, CA, USA, 2007; Volume 3488.
  30. López-Granados, F. Weed detection for site-specific weed management: Mapping and real-time approaches. Weed Res. 2011, 51, 1–11.
Figure 1. Location of study area, unmanned aerial vehicle (UAV) orthomosaics acquired on 19 May 2015 and 22 April 2016, and field-surveyed data.
Figure 2. Photo of S. marianum patch in the study area, taken on 22 April 2016.
Figure 3. A raw image acquired by the UAV on 22 April 2016, depicting weed patches (in dark) in the centre of the study area (not to scale).
Figure 4. Surface elevation information of the 2015 (a–d) and 2016 (e–h) imagery, in the form of texture from the green (a,e), red (b,f) and near-infrared (c,g) bands, and plant height (d,h).
Figure 5. Weed distribution maps using three classifiers on various input layers acquired on 19 May 2015. G, R and NIR are the green, red and near-infrared bands of the UAV orthomosaic, respectively, OBIA is object-based image analysis.
Figure 6. Weed distribution maps using three classifiers on various input layers acquired on 22 April 2016. G, R and NIR are the green, red and near-infrared bands of the UAV orthomosaic, respectively.
Table 1. Classification accuracy using independent validation samples (ML is maximum likelihood, MD is minimum distance, G, R and NIR are the green, red and near-infrared bands of the UAV orthomosaics, respectively).
| Input Data (Classifier) | Overall Accuracy (%) | Kappa Statistic (−) | User's Accuracy, S. marianum (%) | Producer's Accuracy, S. marianum (%) | User's Accuracy, Other Vegetation (%) | Producer's Accuracy, Other Vegetation (%) |
|---|---|---|---|---|---|---|
| 19/05/2015 | | | | | | |
| G, R, NIR (ML) | 70.37 | 0.41 | 65.52 | 76 | 76 | 65.5 |
| G, R, NIR (MD) | 70.37 | 0.4 | 69.57 | 64 | 70.9 | 75.8 |
| G, R, NIR (OBIA) | 57.4 | 0.17 | 52.6 | 80 | 68.7 | 37.3 |
| G, R, NIR, texture (ML) | 79.63 | 0.71 | 75 | 84 | 96.1 | 86.2 |
| G, R, NIR, texture (MD) | 87.04 | 0.73 | 87.5 | 84 | 86.6 | 89.6 |
| G, R, NIR, texture (OBIA) | 75.9 | 0.53 | 66.6 | 96 | 94.4 | 58.6 |
| G, R, NIR, plant height (ML) | 87.04 | 0.71 | 77.78 | 95 | 92.3 | 82.7 |
| G, R, NIR, plant height (MD) | 87.04 | 0.73 | 87.5 | 84 | 86.6 | 89.6 |
| G, R, NIR, plant height (OBIA) | 75.92 | 0.52 | 68.75 | 88 | 86.3 | 65.5 |
| 22/04/2016 | | | | | | |
| G, R, NIR (ML) | 79.85 | 0.5 | 100 | 43.75 | 76.1 | 100 |
| G, R, NIR (MD) | 82.09 | 0.58 | 83.33 | 62.5 | 81.6 | 93 |
| G, R, NIR (OBIA) | 88.81 | 0.75 | 88.37 | 79.16 | 89 | 94.1 |
| G, R, NIR, texture (ML) | 79.85 | 0.5 | 100 | 43.75 | 76.1 | 100 |
| G, R, NIR, texture (MD) | 95.52 | 0.9 | 100 | 87.5 | 93.4 | 100 |
| G, R, NIR, texture (OBIA) | 92.53 | 0.83 | 93.18 | 85.41 | 92.2 | 96.5 |
| G, R, NIR, plant height (ML) | 93.28 | 0.85 | 89.8 | 91.67 | 95.3 | 94.2 |
| G, R, NIR, plant height (MD) | 94.78 | 0.88 | 97.67 | 87.5 | 93.4 | 98.8 |
| G, R, NIR, plant height (OBIA) | 91.79 | 0.81 | 97.43 | 79.16 | 89.4 | 98.8 |
