Article

Vineyard Zoning and Vine Detection Using Machine Learning in Unmanned Aerial Vehicle Imagery

1 Faculty of Technical Sciences, University of Novi Sad, Trg Dositeja Obradovića 6, 21000 Novi Sad, Serbia
2 Faculty of Agriculture, University of Novi Sad, Trg Dositeja Obradovića 8, 21000 Novi Sad, Serbia
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(3), 584; https://doi.org/10.3390/rs16030584
Submission received: 10 January 2024 / Revised: 29 January 2024 / Accepted: 31 January 2024 / Published: 3 February 2024
(This article belongs to the Special Issue Remote Sensing in Viticulture II)

Abstract

Precision viticulture systems are essential for enhancing traditional intensive viticulture, achieving high-quality results, and minimizing costs. This study explores the integration of Unmanned Aerial Vehicles (UAVs) and artificial intelligence in precision viticulture, focusing on vine detection and vineyard zoning. Vine detection employs the YOLO (You Only Look Once) deep learning algorithm, achieving 90% accuracy by analysing UAV imagery across multiple spectral bands and phenological stages. Vineyard zoning, achieved through the application of the K-means algorithm, incorporates geospatial data such as the Normalized Difference Vegetation Index (NDVI) and the nitrogen, phosphorus, and potassium content in leaf blades and petioles. This approach enables efficient resource management tailored to each zone’s specific needs. The research aims to develop a decision-support model for precision viticulture. The proposed model demonstrates high vine detection accuracy and defines management zones with variable weighting factors assigned to each variable while preserving location information, revealing significant differences in the variables between zones. The model’s advantages lie in its rapid results and minimal data requirements, offering insights into the benefits of UAV application for precise vineyard management. This approach has the potential to expedite decision making, allowing for adaptive strategies based on the unique conditions of each zone.

1. Introduction

Thanks to very favourable agroecological conditions, a long history, and tradition, viticulture has always represented an important branch of agriculture in Serbia. The period of transition, economic instability, wars, the abandonment of villages, the decline of large viticulture and winemaking complexes, and market instability at the end of the 20th century have contributed to the current, very difficult situation in the viticulture and winemaking sector [1]. An agricultural farm in the Republic of Serbia that produces grapes has an average of 0.28 hectares under vines. The average vineyard area cultivated by one farm in Central Serbia is 0.23 hectares, while in Vojvodina, it is significantly larger at 0.85 hectares [1]. At the end of 2020, the Republic of Serbia published the “Program for the Development of Winemaking and Viticulture for the Period from 2021 to 2031”. With the adoption of this document, the conditions for the development of viticulture and winemaking have been created. Geographic Information Systems (GISs), sensors, and the use of artificial intelligence in this field can make a significant contribution to the further development of viticulture.
Remote sensing in agricultural production serves diverse purposes, such as detecting chlorophyll content in plants, assessing plant health and water status, measuring soil moisture, detecting weeds and pests, and mapping areas for selective spraying and fertilization, among others [2]. Precision viticulture represents the integration of traditional grape production concepts with new technologies, empowering winegrowers and winemakers to make informed decisions to optimize vineyard performance [3]. The application of artificial intelligence in the viticulture sector is still in the early stages of development. Many processes in viticulture can be significantly enhanced through the utilization of artificial intelligence. In recent years, the viticulture industry has encountered challenges arising from a shortage of skilled labour and rising labour costs, which have impacted productivity, quality, and timely harvest [4]. Additionally, manual agricultural tasks are time-consuming and susceptible to the subjective decisions made by workers. These challenges have spurred the development of new technologies that use data from advanced sensors, including data from Unmanned Aerial Vehicles (UAVs), and the application of artificial intelligence. These advancements aim to enhance productivity, improve quality, and boost economic competitiveness.
Over the past few years, the proliferation of modern sensors and the application of new technologies in precision viticulture have resulted in a significant increase in data generation per hectare. This has provided winegrowers with a wealth of information [5]. Data acquisition techniques commonly used in viticulture can be categorized into three groups: satellite imagery, data acquired using aerial platforms (most frequently UAVs), and data collected from ground-based (terrestrial) platforms. Given that vineyards typically cover relatively small areas, the latter two categories play a more substantial role in assessing various parameters in precision viticulture [6]. However, satellite imagery still has its utility in this domain [7]. Depending on the spatial resolution of the images, it can be applied to management at different levels, such as the plot, row, or individual plant level. Due to their coarser spatial resolution, satellite images are typically used for vineyard-level management, where distinguishing rows or individual plants is impractical [8]. In contrast to satellite sensors, UAVs offer several advantages, notably their ability to capture images with a higher spatial resolution than satellites. This high spatial resolution enables the identification of fine details and features that are often indiscernible in satellite imagery [9]. This becomes particularly important when the pixel size is larger than the objects of interest, as is often the case in vineyards [10]. In such situations, mixed pixels emerge, where a single pixel encompasses various elements, including the above-ground parts of cultivated plants, weeds, soil, and shadows [11]. Given the narrow width of vine canopies, using images with resolutions coarser than 25 cm presents challenges for the accurate classification of vine canopies, weeds, soil, and shadows [12].
Precision viticulture encompasses a range of methodologies, analyses, and processes tailored for vineyard management. Precision viticulture aims to address challenges such as efficient resource utilization, disease and pest management, optimal harvest timing, water management, and more. In addition to these objectives, precision viticulture studies spatial variability within vineyards and defines zones that differ in terms of soil, microclimate, and morphology [3]. These zones require specific management approaches. The primary goal of precision viticulture is to maximize agronomic potential in terms of yield and production quality while enhancing ecological sustainability, reducing costs, and minimizing the use of harmful substances, with a strong emphasis on environmental protection [13]. As such, precision viticulture aims to craft location-specific management plans for each segment of the vineyard, and even individual plants, with a keen focus on resource optimization. In doing so, unnecessary, potentially harmful treatments can be avoided, leading to cost reductions. In the field of precision viticulture, remote sensing is used for numerous purposes, including yield estimation [14,15,16,17,18]. Additionally, computer image processing and artificial intelligence techniques are applied for detecting inflorescences [19,20], vines [4,5,21,22,23,24], and even cluster berries [25], as well as for disease detection [26]. A central aim of precision viticulture is to provide grape growers and wine producers with accurate, real-time, or near-real-time data to facilitate efficient and sustainable vineyard management [27]. Some authors [2,28,29,30] propose a three-step cyclical process in order to adopt precision viticulture:
  • The collection of data on vineyards;
  • The interpretation of data;
  • The development and implementation of a targeted management plan based on the analysis.
Utilizing modern technologies in vine cultivation is imperative for unlocking the full potential of precision agriculture and viticulture, aiming to boost income and preserve the environment through site-specific resource management [31,32]. An essential aspect of precision viticulture involves the individualization of vines, facilitating plant-level management [33]. This approach empowers vineyard managers to implement treatments tailored to the unique needs of each plant or location. Over the years, issues such as diseases and mechanical damage have caused the loss of plants, resulting in a decrease in the initial number of vines per hectare. As a result, farmers experience a significant reduction in potential wine production. To precisely estimate vineyard yield, calculations are based on the number of vines and the average yield per plant. However, since each individual vine may be affected by diseases to varying degrees, identifying and tallying the live and missing vines in each vineyard is necessary to assess yield based on the actual number of vines. This task is fundamental because vineyards typically lose vines every year to aging, diseases, mechanical damage, or other factors. Given that the number of missing plants can exceed 20% [34], calculating yield by multiplying the average yield per plant by the theoretical number of plants can lead to significant overestimations. Visual counting is a time-consuming process, underscoring the need for a fully automated method capable of detecting and locating vines. Such a method holds the potential for generating base maps for various precision viticulture applications.
Various methods that can be found in the literature can estimate the position of trees, using prior knowledge regarding the number of plants in a row and the distance between plants [35,36,37,38]. One approach for identifying missing plants would be to detect areas that are not covered by the canopy along the row. However, vertical aerial photography cannot capture below the canopy, and, when a plant is missing, neighbouring plants can expand their shoots and leaves to fill the nearby adjacent space [35,39]. Regarding counting and detecting vines, most of the available literature relies on 3D point clouds for vine location [35]. Several studies have explored 3D point clouds for identifying vineyards, e.g., Comba et al. [37] proposed an unsupervised algorithm for vineyard detection and the evaluation of vine characteristics based on 3D point cloud processing. Jurado et al. [36] proposed an automatic method for grapevine trunk detection using point clouds. The proposed method focuses on the recognition of key geometric parameters to ensure the existence of each plant in the 3D model. Studies that use image analysis for mapping are rare, and the proposed methods in these studies need further improvement. de Castro et al. [38] proposed and developed an object-oriented method of image analysis for the evaluation of grapevine canopy and applied it to high-resolution digital models of the surface. Then, the individual positions of the vines were marked, assuming a constant distance between the plants, and in this way, the missing plants were estimated.
The widely adopted practice of uniform vineyard management results in lower productivity, inefficient resource use, and adverse environmental effects [40,41,42]. The delimitation of management zones enables the management of spatial variability within the field by dividing it into homogeneous zones, which is of particular interest for the implementation of variable nutrient and water management schemes [43,44,45,46,47]. It can be assumed that, based on the analysis of vegetation indices and the processing of time series of satellite images, zones of soil degradation [48,49,50] in agriculture can be distinguished. Various methods have been developed to measure the spatial variability within the field for the delimitation of management zones. Most of these methods rely on a single data type for analysis and zoning, such as yield data [51,52,53], soil characteristic data [54,55,56], or remote sensing data [57,58,59]. Numerous authors [60,61,62,63] have proposed zoning through the simple categorization of values, such as vegetation indices, into a specific number of categories or classes, each containing an equal number of objects (pixels). Another approach to delimiting management zones found in the literature is zoning using unsupervised classification [64,65,66,67,68,69]. The K-means algorithm clearly stands out as the most frequently applied machine learning method for defining management zones in agriculture and thus finds application in viticulture as well. To ensure that the obtained zones depend only on the condition of the vines, the influence of the inter-rows and the vegetation within them must be removed. Many authors [62,70,71] suggest reading the values from the raster (e.g., vegetation indices) using points in the vine rows and creating a new raster via interpolation based on those values. In this way, the new raster represents the spatial distribution of the state of the vines only, and such data can be used to define management zones. Notably, the literature offers no rule on how such locations are chosen; they are selected randomly. Also, although one study [72] deals with defining standardized methods for delineating zones and provides recommendations on which data to use in zone delimitation, the methodology proposed in this paper is valuable because it provides the locations of grapevines with high accuracy without the need for processing point clouds. These locations are ultimately used for vineyard zoning.
Therefore, considering the mentioned shortcomings found in the available literature, the authors of this study propose a method that utilizes only UAV images in the RGB + NIR spectrum to detect, locate, and count living and wilted vines (vines that are dead but whose trunks still remain in the rows). Based on these data, along with the results of plant sample analyses within the vineyard (the nitrogen (N), phosphorus (P), and potassium (K) content in leaf blades and petioles), the method employs machine learning techniques to define management zones. These zones distinctly separate parts of the vineyard with different characteristics. The developed model will give winemakers and grape growers a reliable, fast, and automated procedure for counting, detecting, and localizing living and wilted vines, as well as for zoning the vineyard, a heterogeneous structure, into a specific number of homogeneous zones, each managed in its own way to maximize agronomic potential.

2. Materials and Methods

2.1. Study Areas and Datasets

The study area was located in Sremski Karlovci, in the province of Vojvodina in northern Serbia. It included the valleys of Fruška Gora Mountain, characterized by meadows and pastures, while the slopes are covered with orchards and vineyards. Some parts of the mountain rise to heights exceeding 300 m above sea level (ASL) and are covered with dense deciduous forest. Viticulture on Fruška Gora Mountain has a long tradition. The earliest records are 1700 years old, but it is reasonable to believe that vineyards existed in the region even before the Roman Emperor Probus. The vineyard zone spreads between the forest zone on the mountain crest and the zone of field crops at the foot of the mountain.
The experiment was conducted in a vineyard block with an area of 2 ha (45°11′16″N, 19°55′44″E) at an altitude of 120 m ASL (Figure 1). This vineyard block is part of the Experimental Field for Viticulture of the Faculty of Agriculture, University of Novi Sad. The vineyard was planted in 1996 (cv. Riesling Italico, clone SK 54, grafted on Kober 5BB) by planting two grafts in one planting place with a planting distance of 2.8 × 1.6 m. Initially, the vineyard block consisted of 5880 vines (2940 locations with two grafts each). However, the 1.2 ha test area where the controls were conducted (marked in red in Figure 1) contained a total of 2442 locations with two grafts each. The actual presence of vines was determined manually by walking through the vineyard and recording the status of each vine. Three situations were observed: live vines, missing vines, and dead vines whose trunks were still present (wilted vines).
For this area, a series of surveys was conducted in 2020 and 2022 during different phases of the vine growth cycle. The research utilized a DJI Phantom P4 v2.0 drone (DJI Sky City, Shenzhen, China) equipped with a MicaSense RedEdge-M multispectral camera (MicaSense, Inc., Seattle, WA, USA). This sensor captures images in five bands: Blue (475 nm ± 20 nm), Green (560 nm ± 20 nm), Red (668 nm ± 10 nm), Red Edge (717 nm ± 10 nm), and Near Infrared (840 nm ± 40 nm) [73]. To count the vines and locate wilted vines according to the proposed model, it was necessary to capture the vineyard at two different time points: before the start of the vegetative cycle (first half of April; 2 April 2020 and 13 April 2022) and after the flowering stage (between the end of flowering and the onset of veraison; 21 August 2020 and 11 August 2022). The images taken before vegetative growth were later used to identify the grapevines (using their shadows), a process that is difficult during and after the flowering stage due to leaf density. In both periods, the imaging covered the vineyard and its surroundings, and the data from the surrounding vineyards were used to train a neural network, enabling the automatic recognition of vines within the analysed vineyard.
In order to precisely position the obtained photogrammetric products (orthophoto, Digital Surface Model (DSM), Digital Terrain Model (DTM), and reflectance maps for individual bands), 5 markers were placed before each flight (in the corners and in the middle of the surveyed area), and their positions were measured with a Trimble R2 GNSS (Global Navigation Satellite System) receiver (Trimble Germany GmbH, Raunheim, Germany) using the Real-Time Kinematic (RTK) positioning method. The markers served as Ground Control Points (GCPs); in this way, all maps obtained from different time periods could be overlapped with sufficient accuracy.
The flight plan was formed to obtain crosshatch trajectories with a longitudinal overlap of 75% and a transverse overlap of 70% at a height of 60 m above ground. During each flight, two sets of images were obtained: RGB images from the camera built into the UAV and multispectral images from the MicaSense RedEdge-M camera, which was additionally mounted on the UAV together with a GNSS receiver. Using the data from this receiver, information about the position of the camera at the moment of capture (EXIF data) was attached to the multispectral images. The transverse overlap of images was achieved through the paths defined in the flight plan. To obtain a correct reflectance map for each camera band, the calibration panel supplied with the multispectral camera was imaged before each flight.
Photogrammetric image processing was conducted for each imaging campaign separately, and as a result of processing RGB images from the DJI camera, multilayered rasters with an average ground sampling distance (GSD) of approximately 1.5 cm were obtained. For multispectral images captured with the MicaSense RedEdge-M, five different orthophotos were generated for each multispectral band (Blue, Green, Red, Red Edge (RE), Near Infrared (NIR)) with a GSD of around 3.5 cm.
In addition to UAV data used for vineyard zoning with the aim of better defining management zones, an analysis of plant samples within the vineyard was conducted to incorporate these data into the zoning process. The vineyard block was initially divided into 20 subplots. On two occasions (end of flowering and at veraison), one average sample of leaves (30 leaves) was taken from each subplot. In the laboratory, the leaf blades and petioles were separated and submitted for a chemical analysis. The nitrogen (N) content in the samples was determined using the AOAC Official Method 972.43 [74], and phosphorus (P) and potassium (K) were determined via ICP-OES [74].

2.2. Methodology

To conduct a rapid analysis of the vineyard and obtain quality results for timely management, it is necessary to create a model that incorporates all essential elements in the vineyard, from data collection to the final definition of management zones.
For the purpose of this research, field data on the analysed vineyard were collected, including UAV images from multiple time periods and samples for the analysis of N, P, and K content in leaf blades and petioles. Figure 2 illustrates the system encompassing all the steps of processing the previously collected data, highlighting several key steps:
  • The detection and localization of vines;
  • The exclusion of inter-rows from the analysis;
  • The zoning of vineyards into a certain number of homogeneous zones.

2.2.1. Detection of Vines—You Only Look Once (YOLO) Algorithm

Object detection is a computer vision task that involves detecting instances of semantic objects of a certain class (such as people, buildings, or cars) in digital images and videos. It is applicable in many areas, including precision agriculture.
Deep learning-based object detection algorithms can be divided into two types: two-stage networks and one-stage networks. The first group includes algorithms that perform detection in two stages ((1) the detection of potential regions containing objects; (2) the classification of images based on those regions), such as RCNNs (Region-Based Convolutional Neural Networks) and Fast RCNNs. A drawback of these algorithms can be the long image processing time, potentially hindering real-time detection [75]. The second group of detection algorithms includes YOLO, SSD (Single-Shot Detector), and others.
One of the primary contributions of this study is manifested in its initial step, namely, the detection and localization of live vines. The outcomes of this step also serve as input data for the subsequent stages of the proposed algorithm, making it evident that the accurate identification of live grapevines will lead to appropriately defined management zones. Figure 3 depicts the segment that deals with the detection and localization of vines from the model presented earlier (Figure 2).
In 2016, Redmon and Farhadi [76] proposed the YOLO model, which is a one-stage network. YOLO is a neural network-based algorithm used for object detection. It distinguishes itself from other object detection algorithms in that it observes the image only once. The neural network’s design is implemented through a feedforward network, where information flows from the input layer to the hidden layers, ultimately reaching the output layer. This means that object detection for the entire image is performed in a single pass of the algorithm, simultaneously predicting the probability of an object belonging to a certain class and the bounding boxes that specify its location in the image (Figure 4) [77].
The YOLO algorithm family consists of multiple models, with YOLOv5 being easy to train and offering good reliability and stability [78]. A Web of Science search shows that YOLOv5-based publications predominate and that the model has been widely used in recent years [79]. The selection of YOLOv5 for this research is based on its proven simplicity and speed, its widespread use in previous studies [5,27,78,79], and its availability across a wide range of applications in both industry and academia, confirming its relevance in the research community. YOLOv5 therefore remains highly competitive and was utilized in this study. YOLOv5 is a popular deep learning framework that includes five network models of different sizes, YOLOv5n, YOLOv5s, YOLOv5m, YOLOv5l, and YOLOv5x [80], representing different depths and widths of the network [79].
The detection of vines inside a vineyard using UAV images is challenging because, in the later stages of vineyard development, the vegetation is so dense that the trunk cannot be seen, while in the early stages, when there is no vegetation, the trunk merges with the soil and is difficult to identify in RGB images. This problem can be overcome by first detecting the shadow cast by the vine trunk (Figure 5) and then, in the next phase, localizing the vines.
In the experimental field, within the vineyard used for training, the vines were marked. Figure 6 shows part of the training dataset, for which bounding boxes were drawn manually.
The training dataset created in this way and used to train the neural network enables transfer: vines can be detected in another vineyard without re-marking or creating new training sets, so there is no requirement to retrain the network. When the previously trained model is applied to a new dataset, results are obtained rapidly, with bounding boxes marked around each vine. The number of bounding boxes corresponds to the number of vines.
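To illustrate this reuse, a minimal inference sketch follows, using the public Ultralytics YOLOv5 torch.hub interface; the weight-file and image names are hypothetical placeholders, not the study’s artefacts.

```python
import torch

# Load custom-trained weights through the documented YOLOv5 torch.hub entry
# point ('vine_shadows_best.pt' is a hypothetical path to trained weights).
model = torch.hub.load("ultralytics/yolov5", "custom", path="vine_shadows_best.pt")

# Run detection on an orthophoto tile of a new vineyard (hypothetical file).
results = model("vineyard_tile.png")

# Each row is one detection: x1, y1, x2, y2, confidence, class.
boxes = results.xyxy[0]
print(f"Detected {len(boxes)} vine shadows")  # box count = vine count
```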
The first result obtained after applying the YOLO algorithm is the number of identified vine trees. However, several events, such as diseases or mechanical damage, can lead to the drying of vine trees, thus reducing the number of living vines that will develop later. Accordingly, the next step is the localization of living vines and the determination of the number of dead vines. To accomplish this, it is necessary to collect new images of the analysed vineyard in the next phenological period, when the green shoots begin to develop. Then, the Normalized Difference Vegetation Index (NDVI) is calculated to differentiate between living and dead vines. In this step, it is necessary to define the threshold value of the NDVI, which is used to represent living vines (the threshold value in this work was estimated manually). An additional challenge in this step is inter-row vegetation that can introduce errors into the detection results.
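A minimal sketch of this NDVI step, assuming the NIR and Red reflectance maps are available as NumPy arrays; the 0.4 default threshold below is purely illustrative, since the study’s threshold was estimated manually.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel Normalized Difference Vegetation Index."""
    # A small epsilon guards against division by zero over no-data pixels.
    return (nir - red) / (nir + red + 1e-9)

def live_canopy_mask(nir: np.ndarray, red: np.ndarray,
                     threshold: float = 0.4) -> np.ndarray:
    """Boolean mask of pixels treated as living canopy.

    The default threshold is illustrative only; in the study the value
    was estimated manually for the analysed vineyard.
    """
    return ndvi(nir, red) > threshold
```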
Figure 7 illustrates the algorithm for the identification and localization of living vines. Initially, the bounding boxes (blue boxes) containing the identified shadows (Figure 7a) are adjusted to include the entire width of the vine row (Figure 7b). Then, using the NDVI, the parts of the image belonging to the vine rows are extracted (Figure 7c). If those polygons (red lines inside the red rectangle) fall within the adjusted bounding box (red boxes), a live vine is present; conversely, if the box is empty, the vine is dead. The last step is the localization of live vines, i.e., the determination of the coordinates of all living vines. The position of each vine (yellow dots) is obtained by calculating the centre point of the part of the row that is extracted using the NDVI and located within the translated bounding box (Figure 7d).
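Under the same assumptions as above (a boolean canopy mask and pixel-coordinate bounding boxes), the localization step could be sketched as:

```python
import numpy as np

def vine_position(mask: np.ndarray,
                  box: tuple[int, int, int, int]) -> tuple[float, float] | None:
    """Locate a live vine inside an adjusted bounding box.

    mask : boolean live-canopy mask from the NDVI step.
    box  : (row0, col0, row1, col1) bounding box widened to the row width.
    Returns the centre (row, col) of the canopy pixels in the box, or None
    when the box contains no canopy, i.e., a dead (wilted) vine.
    """
    r0, c0, r1, c1 = box
    rows, cols = np.nonzero(mask[r0:r1, c0:c1])
    if rows.size == 0:  # empty box -> no living vine here
        return None
    return (r0 + rows.mean(), c0 + cols.mean())
```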
To ensure the high accuracy of the presented model for identifying living vines, it is crucial that the UAV images are captured during a period when vegetation is not developed to the extent that it obscures the space above the dead vine, but is sufficiently advanced to identify the canopy of each individual vine using the NDVI (Figure 7d). Additionally, when planning the flight, consideration should be given to the time of day when the capture will take place to identify vine shadows in the image. If the images are captured around noon, the shadows will be almost invisible, causing errors in the algorithm’s operation, and some vines will not be identified.

2.2.2. Management Zones—K-Means Algorithm

The K-means clustering algorithm is considered one of the most powerful and popular data analysis algorithms in the research community. It is an algorithm for clustering n objects into k clusters based on attributes, where k < n. The classification starts from a predetermined number of clusters defined by the user. The K-means algorithm is based on the principle of iterative relocation: after an initial solution is established, subsequent moves are made (i.e., observations are reassigned to clusters), and the algorithm stops when no further improvement is possible.
The K-means clustering algorithm is well suited to precision agriculture problems in vineyards using UAV images [81,82,83,84]. Being an unsupervised learning method, it does not require labelled data for training. The K-means algorithm is also computationally efficient and can handle large datasets; with the increasing availability of high-resolution UAV imagery, the use of efficient algorithms is crucial for timely analysis. Finally, K-means is robust to noise and outliers in the data (any noise present is simply assigned to one of the defined clusters).
To zone the vineyard based on the condition of the grapevines, the inter-rows (the space between rows, which may contain non-vineyard plants or bare soil) must be excluded before applying the K-means algorithm in further analyses. This intermediate step needs to be performed on the UAV imaging data (i.e., the parts of the NDVI raster related to inter-rows need to be removed) to obtain data related only to the vines, as is the case with the other data involved in the zoning process (the results of the chemical analyses of vine leaf blades and petioles). The removal of inter-rows is conducted using the locations of the previously detected live vines, at which NDVI values are read from the image captured after the flowering stage. Using only points defining the locations of live vines eliminates the possibility of readings from row segments without vegetation caused by withered vines. Through the further interpolation of these values, the spatial distribution of the NDVI of the grapevines within the analysed vineyard is obtained, without the influence of inter-row vegetation or soil. Although logical, this choice of points has not been applied in the literature so far, where it is common to select data collection locations in vineyards randomly, regardless of whether they lie between rows or within rows.
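A sketch of this sampling-and-interpolation step using SciPy; the choice of a linear interpolator with nearest-neighbour fill is our assumption for illustration, as the study does not commit to a specific interpolation method.

```python
import numpy as np
from scipy.interpolate import griddata

def vine_only_ndvi(ndvi_raster: np.ndarray, vine_rc: np.ndarray) -> np.ndarray:
    """Rebuild an NDVI raster from values sampled only at live-vine locations.

    ndvi_raster : 2D NDVI raster captured after the flowering stage.
    vine_rc     : (n, 2) integer (row, col) positions of detected live vines.
    Returns a raster of the same shape representing vine condition without
    inter-row vegetation or bare soil.
    """
    samples = ndvi_raster[vine_rc[:, 0], vine_rc[:, 1]]
    rows, cols = np.mgrid[0:ndvi_raster.shape[0], 0:ndvi_raster.shape[1]]
    # Linear interpolation inside the convex hull of the vine locations,
    # nearest-neighbour fill outside it (both are illustrative choices).
    linear = griddata(vine_rc, samples, (rows, cols), method="linear")
    nearest = griddata(vine_rc, samples, (rows, cols), method="nearest")
    return np.where(np.isnan(linear), nearest, linear)
```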
Based on the newly created raster representing the NDVI without the influence of inter-row vegetation and the rasters obtained by interpolating the nitrogen, phosphorus, and potassium contents of the leaf blade and petiole samples, management zones are defined using the K-means clustering algorithm (Figure 8).
Standard clustering algorithms, including the previously mentioned K-means, use only attribute values when defining clusters. Therefore, the resulting clusters are not guaranteed to be spatially contiguous, and these algorithms were not designed to ensure this. If spatial information is desired for grouping, the data concerning object locations must be appropriately adapted for algorithmic processing.
When defining the management zones, the location information (coordinates) was incorporated along with the previously calculated attributes (NDVI and the N, P, and K content in leaf blades and petioles). This approach treats coordinates separately from regular attributes and seeks a compromise between attribute similarity and location similarity. In the general case, in which there are p attributes in addition to the X, Y coordinates, if the coordinates receive a weight of w1 ≤ 1, the regular attributes receive a weight of w2 = 1 − w1. Specifically, weight w1 is split over the coordinates (w1/2 each for X and Y), while weight w2 is split over the regular attributes (w2/p each). As the weight for the X, Y coordinates increases, the fit in the attribute dimensions becomes worse. A large weight for the X, Y coordinates essentially forces a spatially contiguous solution, similar to what would follow if only the coordinates were considered; although this enforces contiguity, it does not provide a meaningful result in terms of attribute similarity.
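A sketch of this weighting scheme with scikit-learn’s KMeans; standardizing the columns first is our assumption, as the study specifies only the w1/2 and w2/p split (with w1 = 0.15 and p = 7 in this study).

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def spatially_weighted_kmeans(xy: np.ndarray, attrs: np.ndarray,
                              k: int = 2, w1: float = 0.15) -> np.ndarray:
    """K-means on coordinates plus attributes with an explicit weight split.

    xy    : (n, 2) X, Y coordinates of the observations.
    attrs : (n, p) regular attributes (e.g., NDVI and N, P, K contents).
    w1    : weight assigned to the coordinates (w1/2 per coordinate);
            the attributes share w2 = 1 - w1 (w2/p per attribute).
    """
    p = attrs.shape[1]
    w2 = 1.0 - w1
    # Standardize so the weights, not the raw measurement scales,
    # drive the cluster distances (an assumption for this sketch).
    xy_s = StandardScaler().fit_transform(xy)
    attrs_s = StandardScaler().fit_transform(attrs)
    # Scaling a column by sqrt(w) weights its squared Euclidean
    # contribution by w, which is the quantity K-means minimizes.
    features = np.hstack([xy_s * np.sqrt(w1 / 2.0),
                          attrs_s * np.sqrt(w2 / p)])
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
```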

2.2.3. Metrics

In addition to visual assessment, numerical and quantitative evaluations play an important role in assessing the accuracy of the obtained results. The performance of the model was evaluated using a confusion matrix, and the metrics presented in this section were used to determine the performance of the vine detection model. Accuracy is often used in evaluating the performance of classification models in machine learning and is one of the basic indicators of their quality. It is a metric that shows the percentage of correctly classified instances relative to the total number of instances [85]:
$$\text{Accuracy} = \frac{\text{Number of correct predictions}}{\text{Total number of predictions}}$$
Recall is another metric used to assess model accuracy and is calculated as the ratio of the number of correctly classified instances of a certain class to the total number of instances that actually belong to that class. This parameter indicates whether false negatives have a large impact on model performance; a higher value indicates better performance. Recall can be expressed by the following formula [85]:
$$\text{Recall} = \frac{\text{True Positive}}{\text{True Positive} + \text{False Negative}} \times 100\%$$
Precision is a measure used to evaluate the performance of classification models. It represents the percentage of correctly classified positive instances out of all instances that the model classified as positive. Precision is calculated only on positively classified examples using the following formula [86]:
$$\text{Precision} = \frac{\text{True Positive}}{\text{True Positive} + \text{False Positive}} \times 100\%$$
The F1 score is the harmonic mean between precision and recall, representing a balance between these two measures. The harmonic mean is useful because it assigns more weight to the smaller value, meaning that the F1 score will be high only if both the precision and recall are high, indicating a model with a good balance between detecting true positive results and avoiding false positive results [86].
$$F1\ \text{score} = \frac{2 \times \text{Precision} \times \text{Recall}}{\text{Precision} + \text{Recall}}$$
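These four metrics reduce to a few lines of code operating on raw confusion-matrix counts; the counts in the usage example are illustrative only, not the study’s results.

```python
def detection_metrics(tp: int, fp: int, fn: int, tn: int) -> dict[str, float]:
    """Accuracy, precision, recall, and F1 score from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)  # correct among all predicted positives
    recall = tp / (tp + fn)     # correct among all actual positives
    f1 = 2 * precision * recall / (precision + recall)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# Example with illustrative counts (not the study's confusion matrix):
# 900 live vines found, 40 false alarms, 60 missed, 100 true wilted vines.
print(detection_metrics(tp=900, fp=40, fn=60, tn=100))
```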

3. Results

As already mentioned, the first step in the proposed algorithm is the detection of the grapevine trunk shadow using the YOLO algorithm. In this step, training was initially performed on a dataset created for the surrounding vineyards. A pre-trained YOLOv5s model (chosen for its speed) was used as the starting point, with a training image size of 640 × 640, a batch size of 32, and 60 epochs.
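For reference, a sketch of how this configuration maps onto the standard YOLOv5 repository training entry point; the dataset YAML name is a hypothetical placeholder.

```python
import subprocess

# Reproducing the reported training configuration with the standard
# Ultralytics YOLOv5 repository CLI (run from inside a yolov5 checkout).
subprocess.run([
    "python", "train.py",
    "--img", "640",                  # training image size from the study
    "--batch", "32",                 # batch size from the study
    "--epochs", "60",                # number of epochs from the study
    "--weights", "yolov5s.pt",       # pre-trained YOLOv5s starting point
    "--data", "vine_shadows.yaml",   # hypothetical dataset definition
], check=True)
```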
After applying the YOLO algorithm and all the steps of the proposed algorithm for the detection and localization of vines in UAV images, a high detection rate for individual trunks was obtained. An example of the results for a part of the analysed vineyard is provided in Figure 9. The results show a high level of prediction accuracy, with no instances of misidentification (pillars are distinguished from vines).
To fully test the vine detection algorithm, it was run with three separate combinations. The first involved training on the 2020 image (green part of Figure 1) and performing vine recognition on the 2020 image (red part of Figure 1). The second combination consisted of training on the 2020 image and applying vine detection to the 2022 image. The third combination used the 2022 image for training and subsequently detected vines within the same 2022 image. This type of algorithm testing was chosen because the analysed vineyard is small and because chemical analysis results, required for zoning after vine detection, were not available for the other vineyards. The results were obtained through the detection of living and dead vines using the algorithm described previously, in combination with field analysis (vine counting) in the vineyard.
The confusion matrix for all three combinations of training and detection for both years is provided in Table 1, and the metrics used to determine the performance of the vine detection model (accuracy, precision, recall, and F1 score), as explained in Section 2.2.3, are provided in Table 2.
The data obtained from the applied vine detection were compared with reference data to visually represent the number of true positive (TP), true negative (TN), false positive (FP), and false negative (FN) objects. Figure 10 illustrates these results as follows: TP represents correctly identified living vines; FP denotes instances where the algorithm mistakenly recognized non-existent vines in the field; FN represents vines that exist in the field but were not recognized by the model; and TN indicates vines that do not exist in the field and were correctly identified as wilted vines.
Using the locations of living vines eliminates the possibility of reading from row segments without vegetation, which can be caused by wilted vines. In this way, the spatial distribution of the NDVI of the vines is obtained without the influence of inter-row vegetation. Since the previously presented results show that the best performance was achieved by the model trained and tested on 2020 data, the points obtained with that combination were used in further work for removing inter-rows and interpolating NDVI values. By interpolating the NDVI values, a new raster representing the vegetation status within the entire vineyard was obtained (Figure 11).
After removing the inter-rows from the NDVI raster to obtain the vegetation status in the analysed vineyard, and after all other data (N, P, and K content in leaf blades and petioles) had been interpolated into corresponding rasters with a continuous spatial distribution within the vineyard, the next step of the proposed model, the definition of management zones, was implemented. When defining the management zones, information about location (coordinates) was used in addition to the attributes (NDVI and N, P, and K content in leaf blades and petioles). In this study, a weight of 0.15 was assigned to the coordinates (0.15/2 for each coordinate), and the remaining 0.85 was assigned to the other attributes (0.85/7 for each of the 7 attributes: NDVI and the N, P, and K content in leaf blades and petioles). This approach aims to create management zones in vineyards that preserve both spatial and spectral properties. The results of the grouping (zoning) approach with variable weights for coordinates and attributes are illustrated in Figure 12.
By analysing the statistical values of the variables within each zone (Zone 1 is the red area and Zone 2 is the green area in Figure 12), the differences in the mean values for all the variables participating in the zoning process can be clearly observed (Table 3).
Figure 13 shows box plot diagrams for each variable, graphically presenting the differences between the mean values of the zones, whose numerical values are shown in Table 3. A box plot provides information about the mean, range, dispersion, and presence of extreme values in a dataset. Figure 13 shows that the data have a certain variability within the zones and that the two zones differ from each other in all variables; this is particularly pronounced for phosphorus, whose values differ significantly between the created management zones.
Based on the previously presented box plot diagrams and the final appearance of the management zones, it can be concluded that this approach to grouping, i.e., zoning, in which different weights are assigned to attributes and coordinates, provides good results in terms of the spatial connectivity of zones, and also in terms of spectral properties.

4. Discussion

The incorporation of emerging technologies into viticulture enables automation and faster decision making [87,88,89,90,91]. The assessment of missing vines proves valuable in determining the number of gaps along grapevine rows that could be filled with new plants. Detecting missing vines in the vineyard is a key challenge that can be addressed with new technologies. While detecting and locating individual plants are crucial for plant or group-level (zone) management, there currently exists no fully automated method utilizing UAV imagery for this purpose. Various methods found in the literature can estimate tree positions using UAV images, but they rely on prior knowledge of the number of plants in a row and the spacing between plants [35,38] or use laser scanning data, i.e., point clouds [92,93]. The existing literature [35,36,37,94,95] suggests that vine detection involves robots, laser scanners, and position extraction from point clouds, which can be time consuming and require significant financial investment. Therefore, one of the tasks of this study was to develop an automated method capable of detecting and locating grapevine trees, offering the potential to generate basic maps (maps of missing plants detected throughout the vineyard, valuable for farmers when replacing missing plants) for various applications in precision viticulture, including vineyard zoning.
The results of point cloud processing and tree identification from the 3D model in a study [36] revealed that, out of 221 estimated plants, 197 were correctly detected and classified (89%). Concerning missing plants, 41 (64%) were accurately classified, resulting in an overall accuracy of about 84%. Aguiar et al. [96] proposed tree detection using high-performance deep learning suitable for real-time applications in robotics. Pinto de Aguiar et al. [97] addressed the feature extraction problem in the vineyard context using deep learning to detect grapevine trees with the YOLO algorithm. Several solutions for grapevine tree detection have been developed using range sensors [98] or camera systems [99], but such systems, despite their high performance, are still not widely applied in vineyards, mainly due to the high costs of robots and autonomous vehicles. Hence, a motivating factor for this study was to find methods that, without substantial equipment and resource investments, can swiftly deliver quality results and precisely detect individual vines.
Aerial imaging using UAVs has emerged as an expeditious method for object capture, with increasing significance in contemporary remote sensing due to its cost-effectiveness and high-resolution capabilities. Nevertheless, a substantial limitation of vertical aerial imaging lies in the challenge of accurately discerning conditions beneath the canopy. In the absence of a plant, neighbouring plants can extend their shoots and leaves to fill the adjacent unoccupied space. Consequently, an essential consideration lies in the careful selection of the imaging period [35]. Moreno et al. [92] proposed creating 3D models of the vineyard during the winter season, when only branches without vegetation are present, and determining the position of each tree on this basis. This approach of imaging the vineyard before the start of the vegetative cycle was also applied in our study, and tree identification was performed on these photographs without the need to create a 3D model. The advantage of the vine detection algorithm proposed in this study is that it provides fast results without requiring extensive data such as point clouds, and data acquisition is fast and straightforward. For this step of the proposed algorithm (vine detection), only UAV images of the vineyard with R, G, B, and NIR bands from two periods (before the start of the vegetative cycle and after the flowering stage) were used.
In the context of image object detection within the scope of this investigation, a fundamental principle of deep learning methodologies lies in the use of a pre-trained model for the classification of novel images. The knowledge acquired by a previously trained model is applied to new images, an approach particularly valuable when neural networks trained on expansive datasets are to be used on datasets with significantly reduced data volumes. The adoption of pre-trained models mitigates the extended training periods typically associated with neural networks [100]. It is noteworthy that the application of transfer learning is still uncommon within the agricultural domain [96]. The empirical validation advanced in this study establishes its viability and efficacy in this specific sector. The vine detection results show that the model trained on 2020 data and tested on 2022 data performs almost identically to the model trained and tested on 2022 data. This confirms the earlier claim that there is no need to retrain the model with new data. Using transfer learning and a pre-trained model can yield quality results, reducing the time required to obtain them and enabling the application of the YOLO algorithm in near real time, facilitating timely and accurate decision making.
The fact that high accuracy in vine detection is quickly achieved by analysing UAV images distinguishes this methodology from studies that use point clouds [36], 3D models [92], or oblique imagery [101] for the same purpose. The results obtained in this study using the proposed detection algorithm show an accuracy of approximately 90%, with similar metrics observed for each of the three combinations of images analysed. The critical consideration when applying the proposed algorithm is the proper selection of the vineyard imaging period, necessitating capture in two periods: before the start of the vegetative cycle and after the flowering stage. Additionally, when planning the flight, attention should be given to the time of day when the imaging will take place to identify shadows of grapevine trees in images from pre-vegetative cycle periods. For imaging in the second phenological period, attention should be paid to the development of grapevines, choosing a period when the vegetation is sufficiently developed to identify it in the image yet not obscuring the space above the canopy of withered grapevines.
The second objective addressed in this study is the delineation of management zones within the analysed vineyard. Various methods have been developed for measuring spatial variability within fields for delineating management zones, but they have predominantly focused on arable crops and much less on orchards, particularly vineyards [102]. Most of these methods rely on individual types of data in the analysis and zoning process. These data can include yield data [51,52,53], soil characteristic data [54,55,56], remote sensing data [57,58,59], etc. This research proposes a method for zoning based on a larger dataset composed of diverse geospatial data.
Our research aimed to zone a vineyard into two classes based on the grapevines’ condition (NDVI) and the results of chemical analyses of leaf blade and petiole samples (nitrogen, phosphorus, and potassium content). The initial challenge was addressing the impact of inter-row vegetation on determining the grapevines’ condition using the NDVI. To ensure that the obtained zones depend only on the grapevines’ condition, it is imperative to eliminate the influence of inter-rows and the vegetation within them. Although some studies suggest applying the K-means algorithm to separate grapevine rows and then determining the grapevines’ condition for the entire vineyard based on the results [103], such an approach is not applicable in vineyards in southeastern Europe due to the challenging conditions that grape growers face. Primarily, grape growers confront a shortage of labour and of resources that could be invested in robotics and autonomous machinery, resulting in the presence of weeds and grass in the inter-rows. This significantly hampers the accurate definition of the grapevines’ condition, since further analysis after this type of classification uses vegetation related to the grapevine together with weeds and grass. Many authors [70,71] propose reading values from the raster (e.g., vegetation indices) using points in grapevine rows and creating new rasters by interpolation based on these readings. In this way, the new raster represents the spatial arrangement of the grapevines’ condition only, and such data can be used for defining management zones. It has been noticed that the literature offers no rule for determining the locations of the points used for the interpolation; rather, they are chosen randomly. In our study, the grapevines’ condition without the influence of inter-rows was determined by using the previously obtained locations of live vines to read NDVI values and interpolate them. The raster obtained by interpolating values from these locations provides a more evenly distributed spatial representation of vineyard conditions than the usual method of random sampling from rows. This approach ensures that interpolation values are taken only from locations with live vines. Such data form the basis for proper vineyard zoning, and these management zones are one of the foundations of precision viticulture.
As there is still no universally accepted method for delineating management zones, cluster analysis can be the basis for resolving open issues in this field [65]. While many studies have addressed determining zones in vineyards [60,61,62], few use artificial intelligence for delineating management zones within vineyards; most proposed methods rely on statistical analysis. Delineating management zones can enable the management of spatial variability within fields by dividing vineyards into homogeneous zones, of particular interest for implementing variable nutrient and water management schemes [43,44,45,46,47]. It can be assumed that the analysis of vegetation indices can identify soil degradation zones [48,49,50] in agriculture. Many authors [60,61,62,63] propose zoning through the simple division of raster values, e.g., vegetation indices, into a certain number of classes, with an equal number of objects (pixels) in each class. Another approach to delineating management zones found in the literature is zoning through unsupervised classification [64,65,66,67,68,69]. The K-means algorithm stands out as the most commonly applied machine learning method for defining management zones in agriculture, and it is also used in viticulture. An analysis of the literature revealed a lack of a standardized way to define management zones and an absence of recommendations on which data to use when delineating zones.
As mentioned earlier, this research proposes a method for zoning based on a larger, comprehensive dataset composed of a newly created raster representing the NDVI without the influence of inter-row vegetation and rasters obtained by interpolating sample data (N, P, and K content in leaf blades and petioles). It is essential to note that the proposed model is flexible regarding the amount of input data for zoning; it can be expanded with additional data, or some data can be excluded in case of unavailability. To achieve a clearer separation of management zones, we used all the available data (NDVI and the N, P, and K content in leaf blades and petioles) in the analysis on which zoning was based. Management zones were defined using the K-means clustering algorithm, and the principle of defining management zones based on the weighted attributes assigned in this study yielded good results in terms of the spatial connectivity of zones, the elimination of small and scattered zones, and the achievement of differences in spectral properties indicating diversity between zones.
The vineyard block in this research exhibited variability in the parameters used for the analysis (NDVI and N, P, and K content in leaves and petioles). Using the proposed model, the vineyard block was divided into two zones, differing from each other and showing a certain variability within. Generally, the parameter values in both zones, in other words, in the whole vineyard block, were lower than those reported in the literature [104,105,106]. The lower NDVI values primarily resulted from a lower N status in the plant tissue [104]. An unfavourable nutrition status was especially pronounced for K, more so in Zone 2. In terms of P status, the defined zones differed significantly. The results obtained in this step can be used to define sampling methods (soil, yield, pruning residues). For making decisions regarding vineyard management (fertilizing, for example), additional attributes need to be included in the presented model (topography, soil characteristics, yield parameters).
In future research, it is essential to expand the proposed zoning algorithm by incorporating additional geospatial data and examining how, and to what extent, these additional data influence the separation of management zones. Alternative methods for defining homogeneous zones should also be explored to establish a standardized zoning model. Besides UAV-acquired images, the potential use of Sentinel-2 or PlanetScope images for zoning should be investigated, as they are believed to accurately represent the vine status within the vineyard despite the presence of mixed pixels. Furthermore, the utilization of very high spatial resolution satellite images should be considered, given the increasing availability of such data, with the assumption that their results could closely align with those of UAV images. Additionally, the proposed vine detection and zoning algorithm should be tested on other vineyards with different row orientations and slope directions, confirming the feasibility of transfer learning in this field. Based on the processed data, analyses, and obtained results, appropriate conclusions can then be drawn regarding potential extensions and further development of the algorithm.

5. Conclusions

Zoning in the context of agriculture, including viticulture, is a key concept enabling the efficient management of agricultural resources. The idea behind zoning is to divide agricultural parcels or vineyards into smaller, homogeneous zones that share similar characteristics and conditions. This approach brings numerous benefits in more effective resource management and a tailored approach to the needs of each zone. As an alternative to traditional methods of generating zoning maps, information obtained from remote sensors allows the quick and easy definition of management zones. In the field of remote sensing, unmanned aerial vehicles (UAVs) represent a significant advancement due to their characteristics, including cost-effectiveness, high resolution, and flexibility, making them widely used in various areas, including precision agriculture.
This study proposes a model to address zoning issues with a focus on precision viticulture. The research aims to create a decision support model for defining management zones in viticulture, assisting winemakers and grape producers in adjusting the use of resources such as water, fertilizers, and pesticides according to the specific needs of each zone. The information obtained from this process is valuable for both producers and the entire agricultural industry.
The results of this research provide deep insight into the advantages of precision viticulture achieved through the development of an innovative decision-support model. This study emphasizes the key steps in the zoning process, highlighting the importance of automatic vine detection and the precise definition of homogeneous zones, which form the basis for the application of modern agronomic measures. The implementation of new technologies, especially UAV imagery and innovative analysis algorithms, enables not only efficient automation but also fast decision making in viticulture. The results confirm that the proposed vine detection model, based on transfer learning and the analysis of RGB + NIR images, achieves high detection accuracy while reducing the need for extensive training data. The proposed algorithm is simple and efficient, and it detects individual vines with high precision without requiring additional information such as LiDAR data or digital terrain models. With such data, the influence of inter-row vegetation can be precisely eliminated, revealing the true vegetation status and the spread of individual phenomena within the vineyard. Ultimately, this leads to well-defined management zones that serve as the foundation for implementing precision viticulture.
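Since the detection step is summarized only at a high level here, a hedged sketch may help place it: the snippet below uses the open-source ultralytics YOLO package as a stand-in for the transfer-learning workflow (pretrained weights fine-tuned on UAV tiles annotated with vine-shadow bounding boxes). The base checkpoint, dataset configuration, and file names are assumptions, not the study's actual settings.

```python
# Hedged sketch of the transfer-learning detection step. The ultralytics
# package is used here as a stand-in; the study's actual framework, base
# checkpoint, dataset layout, and hyperparameters may differ.
from ultralytics import YOLO

# Start from pretrained weights and fine-tune on annotated UAV tiles
# (bounding boxes drawn around vine shadows, cf. Figure 6).
model = YOLO("yolov8n.pt")                    # hypothetical base checkpoint
model.train(data="vine_shadows.yaml",         # hypothetical dataset config
            epochs=100, imgsz=640)

# Inference on a new orthomosaic tile: each detected box marks a shadow,
# which post-processing translates to the vine position (cf. Figure 7).
results = model.predict("vineyard_tile.png", conf=0.25)
for box in results[0].boxes.xyxy:             # pixel coords: x1, y1, x2, y2
    print(box.tolist())
```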
The proposed model can significantly speed up the decision-making process, regardless of the number of objects to be identified or the size of the analysed vineyard. The division of the vineyard block into homogeneous zones, guided by parameters such as the NDVI and the content of N, P, and K in leaf blades and petioles, reveals the variability of conditions within the vineyard. This information not only aids in understanding the state of the grapevines but also provides a basis for precise vineyard management, from sampling to fertilization strategies. Further improvements to the proposed model will explore the use of other geospatial data and consider alternative algorithms that could increase the model's speed or accuracy.
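The inter-row removal underpinning these zones can likewise be illustrated with a short raster sketch: NDVI is computed from the red and NIR bands, and pixels outside the vine rows are masked out before any zone statistics are derived. The band file names and the origin of the row mask (here assumed to be rasterized buffers around the detected vine positions) are illustrative assumptions.

```python
# Minimal sketch of NDVI computation with inter-row pixels masked out.
# Assumptions: the red and NIR bands are co-registered GeoTIFFs, and a
# boolean vine-row mask (e.g., rasterized buffers around detected vines)
# has been prepared; all file names are hypothetical.
import numpy as np
import rasterio

with rasterio.open("red.tif") as r, rasterio.open("nir.tif") as n:
    red = r.read(1).astype("float32")
    nir = n.read(1).astype("float32")
    profile = r.profile

# NDVI = (NIR - Red) / (NIR + Red); keep NaN where the denominator is zero.
den = nir + red
ndvi = np.full(den.shape, np.nan, dtype="float32")
np.divide(nir - red, den, out=ndvi, where=den > 0)

# Keep only vine-row pixels so inter-row vegetation cannot bias the zones.
row_mask = np.load("vine_row_mask.npy")       # True on vine rows
ndvi_vines_only = np.where(row_mask, ndvi, np.nan).astype("float32")

profile.update(dtype="float32", count=1, nodata=np.nan)
with rasterio.open("ndvi_vines_only.tif", "w", **profile) as dst:
    dst.write(ndvi_vines_only, 1)
```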

Author Contributions

Conceptualization, M.G. (Milan Gavrilović) and D.J.; methodology, M.G. (Milan Gavrilović) and D.J.; software, M.G. (Milan Gavrilović); validation, D.J., P.B. (Predrag Božović), and P.B. (Pavel Benka); formal analysis, M.G. (Milan Gavrilović), D.J., P.B. (Predrag Božović), and P.B. (Pavel Benka); investigation, M.G. (Milan Gavrilović) and D.J.; resources, P.B. (Predrag Božović) and P.B. (Pavel Benka); data curation, P.B. (Predrag Božović) and P.B. (Pavel Benka); writing—original draft preparation, M.G. (Milan Gavrilović) and D.J.; writing—review and editing, M.G. (Milan Gavrilović), D.J., P.B. (Predrag Božović), P.B. (Pavel Benka), and M.G. (Miro Govedarica); visualization, M.G. (Milan Gavrilović); supervision, M.G. (Miro Govedarica); project administration, M.G. (Milan Gavrilović), D.J., and M.G. (Miro Govedarica). All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data presented in this study are available in the article.

Acknowledgments

This research was supported by the Ministry of Science, Technological Development and Innovation through project no. 451-03-47/2023-01/200156, “Innovative scientific and artistic research from the FTS (activity) domain”, and through the project “Optimisation of grape production by comprehensive vineyard mapping (VinMap)”, PoC 5154, funded by the Innovation Fund, Republic of Serbia.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Korać, N.; Cindrić, P.; Medić, M.; Ivanišević, D. Voćarstvo i Vinogradarstvo (Deo Vinogradarstvo); Univerzitet u Novom Sadu, Poljoprivredni Fakultet: Novi Sad, Serbia, 2016; ISBN 978-86-7520-383-4.
2. Tardaguila, J.; Stoll, M.; Gutiérrez, S.; Proffitt, T.; Diago, M.P. Smart Applications and Digital Technologies in Viticulture: A Review. Smart Agric. Technol. 2021, 1, 100005.
3. Lyu, H.; Grafton, M.; Ramilan, T.; Irwin, M.; Wei, H.-E.; Sandoval, E. Using Remote and Proximal Sensing Data and Vine Vigor Parameters for Non-Destructive and Rapid Prediction of Grape Quality. Remote Sens. 2023, 15, 5412.
4. Pérez-Zavala, R.; Torres-Torriti, M.; Cheein, F.A.; Troni, G. A Pattern Recognition Strategy for Visual Grape Bunch Detection in Vineyards. Comput. Electron. Agric. 2018, 151, 136–149.
5. Sozzi, M.; Cantalamessa, S.; Cogato, A.; Kayad, A.; Marinello, F. Automatic Bunch Detection in White Grape Varieties Using YOLOv3, YOLOv4, and YOLOv5 Deep Learning Algorithms. Agronomy 2022, 12, 319.
6. Squeri, C.; Poni, S.; Di Gennaro, S.F.; Matese, A.; Gatti, M. Comparison and Ground Truthing of Different Remote and Proximal Sensing Platforms to Characterize Variability in a Hedgerow-Trained Vineyard. Remote Sens. 2021, 13, 2056.
7. Cogato, A.; Meggio, F.; Collins, C.; Marinello, F. Medium-Resolution Multispectral Data from Sentinel-2 to Assess the Damage and the Recovery Time of Late Frost on Vineyards. Remote Sens. 2020, 12, 1896.
8. Giovos, R.; Tassopoulos, D.; Kalivas, D.; Lougkos, N.; Priovolou, A. Remote Sensing Vegetation Indices in Viticulture: A Critical Review. Agriculture 2021, 11, 457.
9. Atencia Payares, L.K.; Tarquis, A.M.; Hermoso Peralo, R.; Cano, J.; Cámara, J.; Nowack, J.; Gómez del Campo, M. Multispectral and Thermal Sensors Onboard UAVs for Heterogeneity in Merlot Vineyard Detection: Contribution to Zoning Maps. Remote Sens. 2023, 15, 4024.
10. de Castro, A.I.; Peña, J.M.; Torres-Sánchez, J.; Jiménez-Brenes, F.M.; Valencia-Gredilla, F.; Recasens, J.; López-Granados, F. Mapping Cynodon Dactylon Infesting Cover Crops with an Automatic Decision Tree-OBIA Procedure and UAV Imagery for Precision Viticulture. Remote Sens. 2020, 12, 56.
11. Meyers, J.M.; Dokoozlian, N.; Ryan, C.; Bioni, C.; Vanden Heuvel, J.E. A New, Satellite NDVI-Based Sampling Protocol for Grape Maturation Monitoring. Remote Sens. 2020, 12, 1159.
12. Poblete-Echeverría, C.; Olmedo, G.; Ingram, B.; Bardeen, M. Detection and Segmentation of Vine Canopy in Ultra-High Spatial Resolution RGB Imagery Obtained from Unmanned Aerial Vehicle (UAV): A Case Study in a Commercial Vineyard. Remote Sens. 2017, 9, 268.
13. Escolà, A.; Peña, J.M.; López-Granados, F.; Rosell-Polo, J.R.; de Castro, A.I.; Gregorio, E.; Jiménez-Brenes, F.M.; Sanz, R.; Sebé, F.; Llorens, J.; et al. Mobile Terrestrial Laser Scanner vs. UAV Photogrammetry to Estimate Woody Crop Canopy Parameters—Part 1: Methodology and Comparison in Vineyards. Comput. Electron. Agric. 2023, 212, 108109.
14. Aquino, A.; Millan, B.; Gaston, D.; Diago, M.-P.; Tardaguila, J. vitisFlower®: Development and Testing of a Novel Android-Smartphone Application for Assessing the Number of Grapevine Flowers per Inflorescence Using Artificial Vision Techniques. Sensors 2015, 15, 21204–21218.
15. Dunn, G.M.; Martin, S.R. Yield Prediction from Digital Image Analysis: A Technique with Potential for Vineyard Assessments Prior to Harvest. Aust. J. Grape Wine Res. 2008, 10, 196–198.
16. Liu, S.; Whitty, M. Automatic Grape Bunch Detection in Vineyards with an SVM Classifier. J. Appl. Log. 2015, 13, 643–653.
17. Liakos, K.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine Learning in Agriculture: A Review. Sensors 2018, 18, 2674.
18. Maxwell, A.E.; Warner, T.A.; Fang, F. Implementation of Machine-Learning Classification in Remote Sensing: An Applied Review. Int. J. Remote Sens. 2018, 39, 2784–2817.
19. Palacios, F.; Bueno, G.; Salido, J.; Diago, M.P.; Hernández, I.; Tardaguila, J. Automated Grapevine Flower Detection and Quantification Method Based on Computer Vision and Deep Learning from On-the-Go Imaging Using a Mobile Sensing Platform under Field Conditions. Comput. Electron. Agric. 2020, 178, 105796.
20. Rahim, U.F.; Utsumi, T.; Mineno, H. Deep Learning-Based Accurate Grapevine Inflorescence and Flower Quantification in Unstructured Vineyard Images Acquired Using a Mobile Sensing Platform. Comput. Electron. Agric. 2022, 198, 107088.
21. Aguiar, A.S.; Magalhães, S.A.; dos Santos, F.N.; Castro, L.; Pinho, T.; Valente, J.; Martins, R.; Boaventura-Cunha, J. Grape Bunch Detection at Different Growth Stages Using Deep Learning Quantized Models. Agronomy 2021, 11, 1890.
22. Jaramillo, J.; Vanden Heuvel, J.; Petersen, K.H. Low-Cost, Computer Vision-Based, Prebloom Cluster Count Prediction in Vineyards. Front. Agron. 2021, 3, 648080.
23. Font, D.; Pallejà, T.; Tresanchez, M.; Teixidó, M.; Martinez, D.; Moreno, J.; Palacín, J. Counting Red Grapes in Vineyards by Detecting Specular Spherical Reflection Peaks in RGB Images Obtained at Night with Artificial Illumination. Comput. Electron. Agric. 2014, 108, 105–111.
24. Torres-Sánchez, J.; Mesas-Carrascosa, F.J.; Santesteban, L.-G.; Jiménez-Brenes, F.M.; Oneka, O.; Villa-Llop, A.; Loidi, M.; López-Granados, F. Grape Cluster Detection Using UAV Photogrammetric Point Clouds as a Low-Cost Tool for Yield Forecasting in Vineyards. Sensors 2021, 21, 3083.
25. Nuske, S.; Wilshusen, K.; Achar, S.; Yoder, L.; Narasimhan, S.; Singh, S. Automated Visual Yield Estimation in Vineyards. J. Field Robot. 2014, 31, 837–860.
26. Imran, H.A.; Zeggada, A.; Ianniello, I.; Melgani, F.; Polverari, A.; Baroni, A.; Danzi, D.; Goller, R. Low-Cost Handheld Spectrometry for Detecting Flavescence Dorée in Vineyards. Appl. Sci. 2023, 13, 2388.
27. Lu, S.; Liu, X.; He, Z.; Zhang, X.; Liu, W.; Karkee, M. Swin-Transformer-YOLOv5 for Real-Time Wine Grape Bunch Detection. Remote Sens. 2022, 14, 5853.
28. Bramley, R.G.V. 12—Precision Viticulture: Managing Vineyard Variability for Improved Quality Outcomes. In Managing Wine Quality, 2nd ed.; Reynolds, A.G., Ed.; Woodhead Publishing Series in Food Science, Technology and Nutrition; Woodhead Publishing: Cambridge, UK, 2022; pp. 541–586. ISBN 978-0-08-102067-8.
29. Bhat, S.A.; Huang, N.-F. Big Data and AI Revolution in Precision Agriculture: Survey and Challenges. IEEE Access 2021, 9, 110209–110222.
30. Morais, R.; Fernandes, M.A.; Matos, S.G.; Serôdio, C.; Ferreira, P.J.S.G.; Reis, M.J.C.S. A ZigBee Multi-Powered Wireless Acquisition Device for Remote Sensing Applications in Precision Viticulture. Comput. Electron. Agric. 2008, 62, 94–106.
31. Cambouris, A.N.; Zebarth, B.J.; Ziadi, N.; Perron, I. Precision Agriculture in Potato Production. Potato Res. 2014, 57, 249–262.
32. Ahmad, A.; Ordoñez, J.; Cartujo, P.; Martos, V. Remotely Piloted Aircraft (RPA) in Agriculture: A Pursuit of Sustainability. Agronomy 2021, 11, 7.
33. Ariza-Sentís, M.; Baja, H.; Vélez, S.; Valente, J. Object Detection and Tracking on UAV RGB Videos for Early Extraction of Grape Phenotypic Traits. Comput. Electron. Agric. 2023, 211, 108051.
34. Matese, A.; Di Gennaro, S.F. Beyond the Traditional NDVI Index as a Key Factor to Mainstream the Use of UAV in Precision Viticulture. Sci. Rep. 2021, 11, 2721.
35. Di Gennaro, S.F.; Matese, A. Evaluation of Novel Precision Viticulture Tool for Canopy Biomass Estimation and Missing Plant Detection Based on 2.5D and 3D Approaches Using RGB Images Acquired by UAV Platform. Plant Methods 2020, 16, 91.
36. Jurado, J.M.; Pádua, L.; Feito, F.R.; Sousa, J.J. Automatic Grapevine Trunk Detection on UAV-Based Point Cloud. Remote Sens. 2020, 12, 3043.
37. Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Gay, P. Unsupervised Detection of Vineyards by 3D Point-Cloud UAV Photogrammetry for Precision Agriculture. Comput. Electron. Agric. 2018, 155, 84–95.
38. de Castro, A.; Jiménez-Brenes, F.; Torres-Sánchez, J.; Peña, J.; Borra-Serrano, I.; López-Granados, F. 3-D Characterization of Vineyards Using a Novel UAV Imagery-Based OBIA Procedure for Precision Viticulture Applications. Remote Sens. 2018, 10, 584.
39. Moreno, H.; Andújar, D. Proximal Sensing for Geometric Characterization of Vines: A Review of the Latest Advances. Comput. Electron. Agric. 2023, 210, 107901.
40. Muschietti-Piana, M.d.P.; Cipriotti, P.A.; Urricariet, S.; Peralta, N.R.; Niborski, M. Using Site-Specific Nitrogen Management in Rainfed Corn to Reduce the Risk of Nitrate Leaching. Agric. Water Manag. 2018, 199, 61–70.
41. Haghverdi, A.; Leib, B.G.; Washington-Allen, R.A.; Ayers, P.D.; Buschermohle, M.J. Perspectives on Delineating Management Zones for Variable Rate Irrigation. Comput. Electron. Agric. 2015, 117, 154–167.
42. Oshunsanya, S.O.; Oluwasemire, K.O.; Taiwo, O.J. Use of GIS to Delineate Site-Specific Management Zone for Precision Agriculture. Commun. Soil Sci. Plant Anal. 2017, 48, 565–575.
43. Schenatto, K.; de Souza, E.G.; Bazzi, C.L.; Gavioli, A.; Betzek, N.M.; Beneduzzi, H.M. Normalization of Data for Delineating Management Zones. Comput. Electron. Agric. 2017, 143, 238–248.
44. Servadio, P.; Bergonzoli, S.; Verotti, M. Delineation of Management Zones Based on Soil Mechanical-Chemical Properties to Apply Variable Rates of Inputs throughout a Field (VRA). Eng. Agric. Environ. Food 2017, 10, 20–30.
45. Jiang, G.; Grafton, M.; Pearson, D.; Bretherton, M.; Holmes, A. Predicting Spatiotemporal Yield Variability to Aid Arable Precision Agriculture in New Zealand: A Case Study of Maize-Grain Crop Production in the Waikato Region. N. Z. J. Crop Hortic. Sci. 2021, 49, 41–62.
46. Memiaghe, J.N.; Cambouris, A.N.; Ziadi, N.; Karam, A. Soil Phosphorus Distribution under Two Contrasting Grassland Fields in Eastern Canada; ASA-CSSA-SSSA: Madison, WI, USA, 2019.
47. Priori, S.; Martini, E.; Andrenelli, M.C.; Magini, S.; Agnelli, A.E.; Bucelli, P.; Biagi, M.; Pellegrini, S.; Costantini, E.A.C. Improving Wine Quality through Harvest Zoning and Combined Use of Remote and Soil Proximal Sensing. Soil Sci. Soc. Am. J. 2013, 77, 1338–1348.
48. Kamble, B.; Kilic, A.; Hubbard, K. Estimating Crop Coefficients Using Remote Sensing-Based Vegetation Index. Remote Sens. 2013, 5, 1588–1602.
49. Park, J.; Baik, J.; Choi, M. Satellite-Based Crop Coefficient and Evapotranspiration Using Surface Soil Moisture and Vegetation Indices in Northeast Asia. CATENA 2017, 156, 305–314.
50. Pôças, I.; Paço, T.A.; Paredes, P.; Cunha, M.; Pereira, L.S. Estimation of Actual Crop Coefficients Using Remotely Sensed Vegetation Indices and Soil Water Balance Modelled Data. Remote Sens. 2015, 7, 2373–2400.
51. Ali, A.; Martelli, R.; Scudiero, E.; Lupia, F.; Falsone, G.; Rondelli, V.; Barbanti, L. Soil and Climate Factors Drive Spatio-Temporal Variability of Arable Crop Yields under Uniform Management in Northern Italy. Arch. Agron. Soil Sci. 2023, 69, 75–89.
52. Borgogno-Mondino, E.; Lessio, A.; Tarricone, L.; Novello, V.; de Palma, L. A Comparison between Multispectral Aerial and Satellite Imagery in Precision Viticulture. Precis. Agric. 2018, 19, 195–217.
53. Anastasiou, E.; Balafoutis, A.; Darra, N.; Psiroukis, V.; Biniari, A.; Xanthopoulos, G.; Fountas, S. Satellite and Proximal Sensing to Estimate the Yield and Quality of Table Grapes. Agriculture 2018, 8, 94.
54. Corwin, D.L.; Lesch, S.M.; Shouse, P.J.; Soppe, R.; Ayars, J.E. Identifying Soil Properties That Influence Cotton Yield Using Soil Sampling Directed by Apparent Soil Electrical Conductivity. Agron. J. 2003, 95, 352–364.
55. Johnson, C.K.; Doran, J.W.; Duke, H.R.; Wienhold, B.J.; Eskridge, K.M.; Shanahan, J.F. Field-Scale Electrical Conductivity Mapping for Delineating Soil Condition. Soil Sci. Soc. Am. J. 2001, 65, 1829–1837.
56. Tagarakis, A.; Liakos, V.; Fountas, S.; Koundouras, S.; Gemtos, T.A. Management Zones Delineation Using Fuzzy Clustering Techniques in Grapevines. Precis. Agric. 2013, 14, 18–39.
57. Nahry, A.H.E.; Ali, R.R.; Baroudy, A.A.E. An Approach for Precision Farming under Pivot Irrigation System Using Remote Sensing and GIS Techniques. Agric. Water Manag. 2011, 98, 517–531.
58. Hansen, P.M.; Schjoerring, J.K. Reflectance Measurement of Canopy Biomass and Nitrogen Status in Wheat Crops Using Normalized Difference Vegetation Indices and Partial Least Squares Regression. Remote Sens. Environ. 2003, 86, 542–553.
59. Li, W.; Huang, J.; Yang, L.; Chen, Y.; Fang, Y.; Jin, H.; Sun, H.; Huang, R. A Practical Remote Sensing Monitoring Framework for Late Frost Damage in Wine Grapes Using Multi-Source Satellite Data. Remote Sens. 2021, 13, 3231.
60. Matese, A.; Di Gennaro, S.F.; Santesteban, L.G. Methods to Compare the Spatial Variability of UAV-Based Spectral and Geometric Information with Ground Autocorrelated Data. A Case of Study for Precision Viticulture. Comput. Electron. Agric. 2019, 162, 931–940.
61. Pádua, L.; Marques, P.; Adão, T.; Guimarães, N.; Sousa, A.; Peres, E.; Sousa, J.J. Vineyard Variability Analysis through UAV-Based Vigour Maps to Assess Climate Change Impacts. Agronomy 2019, 9, 581.
62. Matese, A.; Di Gennaro, S.F.; Berton, A. Assessment of a Canopy Height Model (CHM) in a Vineyard Using UAV-Based Multispectral Imaging. Int. J. Remote Sens. 2017, 38, 2150–2160.
63. Campos, J.; García-Ruíz, F.; Gil, E. Assessment of Vineyard Canopy Characteristics from Vigour Maps Obtained Using UAV and Satellite Imagery. Sensors 2021, 21, 2363.
64. Lajili, A.; Cambouris, A.N.; Chokmani, K.; Duchemin, M.; Perron, I.; Zebarth, B.J.; Biswas, A.; Adamchuk, V.I. Analysis of Four Delineation Methods to Identify Potential Management Zones in a Commercial Potato Field in Eastern Canada. Agronomy 2021, 11, 432.
65. Ali, A.; Rondelli, V.; Martelli, R.; Falsone, G.; Lupia, F.; Barbanti, L. Management Zones Delineation through Clustering Techniques Based on Soils Traits, NDVI Data, and Multiple Year Crop Yields. Agriculture 2022, 12, 231.
66. Bramley, R.G.V.; Hamilton, R.P. Understanding Variability in Winegrape Production Systems. Aust. J. Grape Wine Res. 2004, 10, 32–45.
67. Mazzia, V.; Comba, L.; Khaliq, A.; Chiaberge, M.; Gay, P. UAV and Machine Learning Based Refinement of a Satellite-Driven Vegetation Index for Precision Agriculture. Sensors 2020, 20, 2530.
68. Bonilla, I.; Martínez De Toda, F.; Martínez-Casasnovas, J.A. Vineyard Zonal Management for Grape Quality Assessment by Combining Airborne Remote Sensed Imagery and Soil Sensors; Neale, C.M.U., Maltese, A., Eds.; SPIE: Amsterdam, The Netherlands, 2014; p. 92390S.
69. Martinez-Casasnovas, J.A.; Agelet-Fernandez, J.; Arno, J.; Ramos, M.C. Analysis of Vineyard Differential Management Zones and Relation to Vine Development, Grape Maturity and Quality. Span. J. Agric. Res. 2012, 10, 326–337.
70. Hall, A.; Lamb, D.W.; Holzapfel, B.P.; Louis, J.P. Within-Season Temporal Variation in Correlations between Vineyard Canopy and Winegrape Composition and Yield. Precis. Agric. 2011, 12, 103–117.
71. Pádua, L.; Marques, P.; Hruška, J.; Adão, T.; Peres, E.; Morais, R.; Sousa, J.J. Multi-Temporal Vineyard Monitoring through UAV-Based RGB Imagery. Remote Sens. 2018, 10, 1907.
72. Vélez, S.; Ariza-Sentís, M.; Valente, J. Mapping the Spatial Variability of Botrytis Bunch Rot Risk in Vineyards Using UAV Multispectral Imagery. Eur. J. Agron. 2023, 142, 126691.
73. RedEdge-M User Manual (PDF)—Legacy. Available online: https://support.micasense.com/hc/en-us/articles/115003537673-RedEdge-M-User-Manual-PDF-Legacy (accessed on 23 November 2023).
74. AOAC International. AOAC Official Method 972.43, Microchemical Determination of Carbon, Hydrogen and Nitrogen, Automated Method. In Official Methods of Analysis of AOAC International; AOAC International: Gaithersburg, MD, USA, 2006; Volume 18, pp. 5–6.
75. Li, Q.; Luo, Z.; He, X.; Chen, H. LA_YOLOx: Effective Model to Detect the Surface Defects of Insulative Baffles. Electronics 2023, 12, 2035.
76. Redmon, J.; Farhadi, A. YOLOv3: An Incremental Improvement. arXiv 2018.
77. Noh, C.-M.; Jang, J.-G.; Kim, S.-S.; Lee, S.-S.; Shin, S.-C.; Lee, J.-C. A Study on the Optimization of the Coil Defect Detection Model Based on Deep Learning. Appl. Sci. 2023, 13, 5200.
78. Liu, Z.; Gao, X.; Wan, Y.; Wang, J.; Lyu, H. An Improved YOLOv5 Method for Small Object Detection in UAV Capture Scenes. IEEE Access 2023, 11, 14365–14374.
79. Sun, Z.; Li, P.; Meng, Q.; Sun, Y.; Bi, Y. An Improved YOLOv5 Method to Detect Tailings Ponds from High-Resolution Remote Sensing Images. Remote Sens. 2023, 15, 1796.
80. Zhang, L.; Zhao, C.; Feng, Y.; Li, D. Pests Identification of IP102 by YOLOv5 Embedded with the Novel Lightweight Module. Agronomy 2023, 13, 1583.
81. Gavioli, A.; De Souza, E.G.; Bazzi, C.L.; Schenatto, K.; Betzek, N.M. Identification of Management Zones in Precision Agriculture: An Evaluation of Alternative Cluster Analysis Methods. Biosyst. Eng. 2019, 181, 86–102.
82. Arango, R.B.; Campos, A.M.; Combarro, E.F.; Canas, E.R.; Díaz, I. Identification of Agricultural Management Zones Through Clustering Algorithms with Thermal and Multispectral Satellite Imagery. Int. J. Unc. Fuzz. Knowl. Based Syst. 2017, 25, 121–140.
83. Arno, J.; Martinez-Casasnovas, J.A.; Ribes-Dasi, M.; Rosell, J.R. Clustering of Grape Yield Maps to Delineate Site-Specific Management Zones. Span. J. Agric. Res. 2011, 9, 721.
84. Agati, G.; Soudani, K.; Tuccio, L.; Fierini, E.; Ben Ghozlen, N.; Fadaili, E.M.; Romani, A.; Cerovic, Z.G. Management Zone Delineation for Winegrape Selective Harvesting Based on Fluorescence-Sensor Mapping of Grape Skin Anthocyanins. J. Agric. Food Chem. 2018, 66, 5778–5789.
85. Jin, Y.; Xu, W.; Zhang, C.; Luo, X.; Jia, H. Boundary-Aware Refined Network for Automatic Building Extraction in Very High-Resolution Urban Aerial Images. Remote Sens. 2021, 13, 692.
86. Uzun Saylan, B.C.; Baydar, O.; Yeşilova, E.; Kurt Bayrakdar, S.; Bilgir, E.; Bayrakdar, İ.Ş.; Çelik, Ö.; Orhan, K. Assessing the Effectiveness of Artificial Intelligence Models for Detecting Alveolar Bone Loss in Periodontal Disease: A Panoramic Radiograph Study. Diagnostics 2023, 13, 1800.
87. Arnó, J.; Escolà, A.; Rosell-Polo, J.R. Setting the Optimal Length to Be Scanned in Rows of Vines by Using Mobile Terrestrial Laser Scanners. Precis. Agric. 2017, 18, 145–151.
88. Matese, A.; Di Gennaro, S.F. Practical Applications of a Multisensor UAV Platform Based on Multispectral, Thermal and RGB High Resolution Images in Precision Viticulture. Agriculture 2018, 8, 116.
89. Terrón, J.M.; Blanco, J.; Moral, F.J.; Mancha, L.A.; Uriarte, D.; Marques da Silva, J.R. Evaluation of Vineyard Growth under Four Irrigation Regimes Using Vegetation and Soil On-the-Go Sensors. SOIL 2015, 1, 459–473.
90. Llorens, J.; Gil, E.; Llop, J.; Escolà, A. Ultrasonic and LIDAR Sensors for Electronic Canopy Characterization in Vineyards: Advances to Improve Pesticide Application Methods. Sensors 2011, 11, 2177–2194.
91. Arnó, J.; Escolà, A.; Vallès, J.M.; Llorens, J.; Sanz, R.; Masip, J.; Palacín, J.; Rosell-Polo, J.R. Leaf Area Index Estimation in Vineyards Using a Ground-Based LiDAR Scanner. Precis. Agric. 2013, 14, 290–306.
92. Moreno, H.; Valero, C.; Bengochea-Guevara, J.M.; Ribeiro, Á.; Garrido-Izard, M.; Andújar, D. On-Ground Vineyard Reconstruction Using a LiDAR-Based Automated System. Sensors 2020, 20, 1102.
93. Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Tortia, C.; Mania, E.; Guidoni, S.; Gay, P. Leaf Area Index Evaluation in Vineyards Using 3D Point Clouds from UAV Imagery. Precis. Agric. 2020, 21, 881–896.
94. Vidoni, R.; Gallo, R.; Ristorto, G.; Carabin, G.; Mazzetto, F.; Scalera, L.; Gasparetto, A. ByeLab: An Agricultural Mobile Robot Prototype for Proximal Sensing and Precision Farming; American Society of Mechanical Engineers Digital Collection; American Society of Mechanical Engineers: New York, NY, USA, 2018.
95. Sanz, R.; Rosell, J.R.; Llorens, J.; Gil, E.; Planas, S. Relationship between Tree Row LIDAR-Volume and Leaf Area Density for Fruit Orchards and Vineyards Obtained with a LIDAR 3D Dynamic Measurement System. Agric. For. Meteorol. 2013, 171–172, 153–162.
96. Aguiar, A.S.; Santos, F.N.D.; De Sousa, A.J.M.; Oliveira, P.M.; Santos, L.C. Visual Trunk Detection Using Transfer Learning and a Deep Learning-Based Coprocessor. IEEE Access 2020, 8, 77308–77320.
97. Pinto de Aguiar, A.S.; Neves dos Santos, F.B.; Feliz dos Santos, L.C.; de Jesus Filipe, V.M.; Miranda de Sousa, A.J. Vineyard Trunk Detection Using Deep Learning—An Experimental Device Benchmark. Comput. Electron. Agric. 2020, 175, 105535.
98. Neves Dos Santos, F.; Sobreira, H.M.P.; Campos, D.F.B.; Morais, R.; Moreira, A.P.G.M.; Contente, O.M.S. Towards a Reliable Monitoring Robot for Mountain Vineyards. In Proceedings of the 2015 IEEE International Conference on Autonomous Robot Systems and Competitions, Vila Real, Portugal, 8–10 April 2015; pp. 37–43.
99. Mendes, J.M.; dos Santos, F.N.; Ferraz, N.A.; do Couto, P.M.; dos Santos, R.M. Localization Based on Natural Features Detector for Steep Slope Vineyards. J. Intell. Robot. Syst. 2019, 93, 433–446.
100. Pereira, C.S.; Morais, R.; Reis, M.J.C.S. Deep Learning Techniques for Grape Plant Species Identification in Natural Images. Sensors 2019, 19, 4850.
101. Di Gennaro, S.F.; Vannini, G.L.; Berton, A.; Dainelli, R.; Toscano, P.; Matese, A. Missing Plant Detection in Vineyards Using UAV Angled RGB Imagery Acquired in Dormant Period. Drones 2023, 7, 349.
102. Uribeetxebarria, A.; Daniele, E.; Escolà, A.; Arnó, J.; Martínez-Casasnovas, J.A. Spatial Variability in Orchards after Land Transformation: Consequences for Precision Agriculture Practices. Sci. Total Environ. 2018, 635, 343–352.
103. Valente, J.; Kooistra, L.; Mücher, S. Fast Classification of Large Germinated Fields Via High-Resolution UAV Imagery. IEEE Robot. Autom. Lett. 2019, 4, 3216–3223.
104. Moghimi, A.; Pourreza, A.; Zuniga-Ramirez, G.; Williams, L.E.; Fidelibus, M.W. A Novel Machine Learning Approach to Estimate Grapevine Leaf Nitrogen Concentration Using Aerial Multispectral Imagery. Remote Sens. 2020, 12, 3515.
105. Paprić, Đ.; Korać, N.; Kuljančić, I.; Medić, M. Foliar Analysis of Riesling Italien Clones on Different Grapevine Rootstocks. Letop. Naučnih Rad. Poljopr. Fak. 2009, 33, 43–49.
106. Schreiner, R.P.; Scagel, C.F. Leaf Blade versus Petiole Nutrient Tests as Predictors of Nitrogen, Phosphorus, and Potassium Status of ‘Pinot Noir’ Grapevines. HortScience 2017, 52, 174–184.
Figure 1. Area of interest, vineyards for testing and training.
Figure 2. Graphical representation of the proposed model for the zoning of vineyards.
Figure 3. Details of the proposed model for the detection of living vines.
Figure 4. Defining bounding boxes and the determination of the final bounding box.
Figure 5. The appearance of the vine and its shadow before the development of vegetation (a), and the identified shadow cast by the vine (b).
Figure 6. Example of training dataset with drawn bounding boxes for vine shadows and the coordinates of each bounding box.
Figure 7. Determining the positions of live vines: bounding boxes containing identified shadows (a), translated bounding boxes (b), extracted parts of the vine row (c), and the localized vine (d).
Figure 8. K-means clustering-based pixel classification.
Figure 9. Part of the results after applying the YOLO algorithm (a) and after completing all steps of the proposed algorithm for vine detection (b) in the analysed vineyard.
Figure 10. Results of the detection of vines for all three combinations: 2020–2020 (a), 2020–2022 (b), and 2022–2022 (c).
Figure 11. NDVI before (a) and after (b) inter-row removal.
Figure 12. The result of clustering using the K-means method (a) and the final (filtered) management zones (b).
Figure 13. Box plot diagrams for the obtained clusters. Values of N, P, and K content are in % of dry wt.
Table 1. Confusion matrix for all three combinations for training and detection for both years.

2020–2020           Detected Live   Detected Dead   Total
Counted Live             2100              38        2138
Counted Dead               48             263         311
Total                    2148             301        2449

2020–2022           Detected Live   Detected Dead   Total
Counted Live             1857             296        2153
Counted Dead               84             279         363
Total                    1941             575        2516

2022–2022           Detected Live   Detected Dead   Total
Counted Live             1861             292        2153
Counted Dead               43             282         325
Total                    1904             574        2478
Table 2. Accuracy, precision, recall, and F1 score of the vine detection.

             2020–2020   2020–2022   2022–2022
Accuracy        0.96        0.85        0.86
Precision       0.98        0.96        0.98
Recall          0.98        0.86        0.86
F1 score        0.98        0.91        0.92
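The scores in Table 2 follow directly from the confusion matrices in Table 1 when live vines are treated as the positive class; the short check below recomputes them and reproduces the reported values after rounding to two decimals.

```python
# Recomputing Table 2 from the confusion matrices in Table 1
# (positive class = live vine).
cases = {
    "2020-2020": dict(tp=2100, fn=38,  fp=48, tn=263),
    "2020-2022": dict(tp=1857, fn=296, fp=84, tn=279),
    "2022-2022": dict(tp=1861, fn=292, fp=43, tn=282),
}
for name, m in cases.items():
    accuracy  = (m["tp"] + m["tn"]) / sum(m.values())
    precision = m["tp"] / (m["tp"] + m["fp"])
    recall    = m["tp"] / (m["tp"] + m["fn"])
    f1        = 2 * precision * recall / (precision + recall)
    print(f"{name}: acc={accuracy:.2f} prec={precision:.2f} "
          f"rec={recall:.2f} f1={f1:.2f}")
# Output matches Table 2: 0.96/0.98/0.98/0.98, 0.85/0.96/0.86/0.91,
# and 0.86/0.98/0.86/0.92.
```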
Table 3. Mean values of all variables involved in zoning. Values of N, P, and K content are in % of dry wt.

Mean of         Zone 1     Zone 2
NDVI            0.64659    0.70601
N Petiole       0.64613    0.61632
N Leaf blade    2.17218    2.11042
P Petiole       0.16819    0.32651
P Leaf blade    0.16039    0.18214
K Petiole       0.58260    0.47835
K Leaf blade    0.39985    0.38451