Article

A Novel Technique Using Planar Area and Ground Shadows Calculated from UAV RGB Imagery to Estimate Pistachio Tree (Pistacia vera L.) Canopy Volume

1 Information Technology Group, Wageningen University & Research, 6708 PB Wageningen, The Netherlands
2 Instituto Tecnológico Agrario de Castilla y León (ITACyL), Unidad de Cultivos Leñosos y Hortícolas, 47071 Valladolid, Spain
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(23), 6006; https://doi.org/10.3390/rs14236006
Submission received: 11 November 2022 / Revised: 24 November 2022 / Accepted: 25 November 2022 / Published: 27 November 2022
(This article belongs to the Special Issue Proximal and Remote Sensing for Precision Crop Management)

Abstract:
Interest in pistachios has increased in recent years due to their healthy nutritional profile and high profitability. In pistachio trees, as in other woody crops, the volume of the canopy is a key factor that affects the pistachio crop load, water requirements, and quality. However, canopy/crown monitoring is time-consuming and labor-intensive, as it is traditionally carried out by measuring tree dimensions in the field. Therefore, methods for rapid tree canopy characterization are needed for providing accurate information that can be used for management decisions. The present study focuses on developing a new, fast, and low-cost technique, based on two main steps, for estimating the canopy volume in pistachio trees. The first step is based on adequately planning the UAV (unmanned aerial vehicle) flight according to light conditions and segmenting the RGB (Red, Green, Blue) imagery using machine learning methods. The second step is based on measuring vegetation planar area and ground shadows using two methodological approaches: a pixel-based classification approach and an OBIA (object-based image analysis) approach. The results show statistically significant linear relationships (p < 0.05) between the ground-truth data and the estimated volume of pistachio tree crowns, with R2 > 0.8 (pixel-based classification) and R2 > 0.9 (OBIA). The proposed methodologies show potential benefits for accurately monitoring the vegetation of the trees. Moreover, the method is compatible with other remote sensing techniques, usually performed at solar noon, so UAV operators can plan a flexible working day. Further research is needed to verify whether these results can be extrapolated to other woody crops.

1. Introduction

Over the past few years, interest in pistachio nuts has increased due to the high economic revenues they provide to farmers and their numerous beneficial effects. Pistachios improve the quality of the diet and supply a number of bioactive compounds with a healthy nutritional profile, including fiber, healthy fats, phytosterols, and antioxidant compounds, which help to reduce the likelihood of heart disease [1]. Pistachio trees are dioecious plants. Therefore, both male and female trees are needed to produce pistachio nuts. Consequently, male trees must be planted between the female trees, generally in a ratio of 1 to 25 in commercial orchards [2]. The flowers have no petals, are unisexual and are clustered in bunches, similar to other woody crops, such as grapevines [3].
Tree-crown size influences the quality of pistachio nuts since canopy volume affects the crop load, impacting the fruit set and ovary growth, leading to embryo development disorders, and leaving the nut empty [4]. Moreover, as in other crops, it is essential to estimate the canopy volume since it determines the amount of solar radiation that the canopy intercepts, which is directly related to transpiration [5]. Guidance documents for irrigation management in agriculture, such as FAO 56 [6], use evapotranspiration, a concept that combines evaporation and transpiration, two phenomena that occur simultaneously. Unfortunately, there is no simple way to distinguish between these two processes. On the one hand, the tree's water requirements are determined by the transpiration of the canopy leaves, which is the vaporization of liquid water contained in plant tissues into the atmosphere, predominantly through the stomata. On the other hand, soil water evaporation is mainly determined by the fraction of solar radiation reaching the soil surface. This fraction decreases over the growing season as canopy size increases (and consequently the shadow it generates), significantly reducing evaporation rates [7] because this process mainly occurs on the wet soil surface that is not shaded by the canopy [2]. Moreover, FAO 56 employs factors such as the crop coefficient Kc, the ratio between the actual crop evapotranspiration and the reference evapotranspiration, an essential input for evaluating crop evapotranspiration. Kc is related to leaf area [8,9] and changes during the growing season according to leaf area, the solar radiation intercepted by the canopy, and crop phenology [10]. These components are closely related to the plant's genetics.
Therefore, knowledge of the genetic variability on canopy features, such as crown volume, is necessary to conduct breeding programs that aim to create new cultivars suitable for changing environmental conditions or different planting systems, such as planting in high-density hedgerows [11]. For all these reasons, it is important to characterize the size of pistachio canopies.
However, canopy monitoring is time-consuming and labor-intensive as it is traditionally carried out by measuring tree dimensions in the field, both crown height and canopy diameter. First, traditional methods need measurements of the dimensions of the crown, such as the radius. Then, the volume is calculated by assuming the tree crown is a classic 3D solid, such as a cylinder, cone, sphere, ellipsoid or paraboloid, which can be a precise estimation [12]. Still, as farm sizes increase, it becomes more challenging to perform this task effectively without the support of new technologies [13]. A quick and accurate way to estimate canopy diameter might involve remote sensing, which helps to analyze different traits and the within-field spatial variability of the crops and allows the development of site-specific strategies for the entire orchard, group of plants or even the single plant [14]. Remote sensing has proven its value in precision agriculture, a concept that seeks the best possible agricultural production, in terms of quality and quantity, by implementing site-specific management in farms through the use of a wide range of technologies and techniques focused on the assessment and management of spatial variability within the plot. Handling this variability is a critical factor in agriculture, since the needs of plants in some parts of the field may be very different from those in other areas. In this way, the inputs, such as the amount of water or fertilizer, can be adapted to the needs of the plants [15]. The application of precision agriculture is possible thanks to the combination of a variety of advanced technologies, such as the use of global navigation satellite systems (GNSS), the development of software to manage and analyze farms in the context of geographic information systems (GIS), and, of course, the development of technologies related to the acquisition of data remotely [16]. 
In this way, remote sensing allows plant parameters to be determined non-destructively, with multiple applications such as crop yield prediction [17], disease detection [18] or vigor estimation [19]. It has already proven its worth in pistachio orchards, since satellite remote sensing can be used to determine the spatial variability of pistachio trees, as well as their vigor and phenology [20].
Remote sensing encompasses a wide range of techniques involving various platforms and sensors. Unmanned aerial vehicles (UAVs) represent one of these platforms. UAVs are a game-changer in precision agriculture, with several unique features that enable them to provide unrivaled remote sensing data, such as high-resolution imagery [21]. They can carry a variety of sensors, such as thermal, multispectral or hyperspectral cameras, which have successfully demonstrated their value in agricultural applications [22]. Moreover, recent advances in sensor miniaturization, standardization, and cost reduction have democratized the use of UAVs and expanded their possibilities, giving a wide variety of options to choose from, ranging from high-tech hyperspectral and thermal cameras to simpler RGB (Red, Green, Blue) cameras. These include different technologies, platforms, sensors, and methodologies for extracting traits and features of the crop [23]. The use of aerial vehicles, combined with image and data processing based on photogrammetric methods to generate orthomosaics, allows the production of maps with high spatial resolution, providing high added value, simplifying crop management to a certain extent, and supporting the farm manager's decision-making process [24].
There are several other methods based on remote sensing for estimating the vegetative development of the plant, such as analyzing the 3D point clouds generated from images by photogrammetric processes [25,26] or assessing the tree crown volume from airborne LiDAR (Light Detection and Ranging) data using computational geometry methods [27]. In addition, other methods measure the distribution of biomass in woody plant canopies, such as the average leaf inclination angle (ALIA), canopy gap fraction (% sky in all viewing directions), or canopy closure (% of ground shaded by foliage) methods [28]. These methods can be helpful but often require time-consuming calculations, expensive equipment, or powerful computers for data processing. Moreover, each combination of sensor and platform entails a series of essential adjustments and different practices for planning the mission properly. Thus, radiometric and geometric calibrations and adjustments of all kinds must be considered depending on the sensor and platform used, leading to different exposure to error, extra costs, and additional time, both during the flight mission itself and in the computational demands of image pre- and post-processing [29]. It is, therefore, essential to follow best practices in mission planning and to choose the sensor and platform best suited to the needs of the work.
In addition, a common approach in remote sensing is using multispectral images to calculate vegetation indices, which have proven useful because of their relationship with critical crop parameters, such as health status or leaf area [30,31,32,33]. These images allow the calculation of vegetation indices such as NDVI (normalized difference vegetation index), which can be used to calculate the planar or planimetric canopy area, that is, the number of vegetation pixels across a single row [34]. Although in this approach shadows are often handled as undesirable information and are usually removed from image sets [35,36], previous work has demonstrated that it is possible to estimate the LAI (leaf area index) of the vineyard from shadows projected on the ground [37]. However, in this previous work, vineyards were managed with a different training system and, moreover, pistachio orchards have a lower density (fewer trees per hectare) and higher tree height than vines. Therefore, it is interesting to understand whether the projected shadows could also be used to estimate the canopy volume of pistachio trees quickly and accurately for informing farm managers.
As mentioned above, since field methods are labor-intensive, obtaining canopy geometry data using remote sensing and image segmentation techniques can significantly speed up the process and allow for more effective tree monitoring. Therefore, a remote sensing-based technique with two methodological approaches for obtaining crown volumes is proposed in this paper. Furthermore, even when the crowns are irregular, they can be modeled as spheres or ellipsoids, since such classic 3D solids are commonly employed [12]. Moreover, since the leaf area of the plant is correlated with intercepted light [38], a rapid and low-cost technique with two methodological approaches for estimating the canopy volume (or crown volume) of pistachio trees is developed based on the analysis of the ground shadows projected by the plants. To improve the method, these shadows are combined with the planar area of the vegetation. To our knowledge, this is the first study that presents and evaluates a field protocol for assessing the crown volume of pistachio trees using UAV RGB imagery to measure their projected shadows on the ground. In other woody crops, canopy attributes are assessed through the LAI [37]. However, in pistachio trees, the canopy is measured by calculating crown volume [39], and therefore the main objective of the proposed technique is to estimate the canopy volume of the plant. Crown volume is the apparent geometric volume that includes all the branches and leaves, even the holes among them [12,40]. The canopy volume is then calculated following the same procedures as for the ground-truth data, assuming that the tree crown is a sphere or an ellipsoid.

2. Materials and Methods

2.1. Experimental Site

The pistachio orchard is a 2.03 ha plot located in ‘La Seca’, Valladolid, in the region of Castilla y León, Spain (X: 341450.3, Y: 4589731.8; ETRS89/UTM zone 30N; Figure 1). It was planted in 2016 with a NE–SW orientation and a 7 × 6 m triangular planting pattern.
The pistachio trees (Pistacia vera L. cv. Kerman grafted on UCB rootstock) were irrigated periodically during the vegetative cycle, using two different irrigation treatments in order to generate canopy volume variability and thus obtain a more representative model calibration. They were irrigated from February to October 2021 using a computer-controlled drip irrigation system, adjusting the duration of each irrigation episode to vary the amount of water applied. The quantity of irrigation water applied during 2021 was 278 mm and 418 mm for the two treatments, respectively. All the trees were irrigated daily, using subsurface drip irrigation with twelve pressure-compensated drippers per tree (2 L·h−1).
The phenology dynamics were similar to previous years, with the beginning of flowering around the third week of April, full flowering in the last week of April, and the end of flowering the first week of May. Furthermore, the harvest took place at the beginning of October. Regulatory pruning was carried out to manage vegetation growth while leaving the entire quantity of flower buds. In addition, the soil was kept free of any weeds that would have affected image processing.
Climate data were collected from the closest weather station (VA103, Rueda, Valladolid, Figure 2), which belongs to the ITACyL and SIAR Network (Agroclimatic information system for irrigation). The data are provided on the ‘inforiego’ web service (https://www.inforiego.org/, accessed on 9 July 2022).

2.2. Methodology

The steps of the methodology were:
  1. Georeferencing the ground control points (GCPs).
  2. Performing the flight mission and capturing UAV RGB images.
  3. Photogrammetric processing, including orthomosaic building.
  4. Orthomosaic reclassification, segmenting the orthomosaic using random forest into three classes: vegetation, soil, and shadows.
  5. Analysis in two different ways: a raster analysis approach (pixel-based classification) and a vector analysis approach, assessing geometries in an object-based image analysis (OBIA) framework.
Figure 3 shows the last three steps.

2.3. Ground-Truth Data

The ground-truth data were taken on 29 July 2021, just before the flight mission. Two ROIs (Regions of Interest) were employed (Figure 1). The red one defines the UAV flight area, and the blue ROI indicates the area where the ground-truth measures were taken.
In order to ensure enough canopy volume variability, twenty pistachio trees were selected, ten from each irrigation treatment. Measurements were taken following commonly used field methodologies [41,42] and in accordance with the instructions of the technicians of the agricultural holding. In this respect, each plant was accurately measured to define its dimensions (Figure 4a), i.e., the distance from the canopy to the ground (h) and the height (a), length (b), and width (c) of the canopy. Subsequently, the canopy volume was calculated using the sphere formula:
V = (4/3) · π · r³
and the ellipsoid formula:
V = (4/3) · π · (a/2) · (b/2) · (c/2)
These geometries were chosen because the field technicians determined that they fitted the canopy shape of pistachio trees best. The radius r for the sphere formula is the average of a/2, b/2, and c/2 (Figure 4b).
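These two formulas can be expressed in a minimal Python sketch (the study itself used R and QGIS; the function names and the sample dimensions below are hypothetical and purely illustrative):

```python
import math

def sphere_volume(a, b, c):
    """Crown volume as a sphere; r is the mean of the three semi-axes a/2, b/2, c/2."""
    r = (a / 2 + b / 2 + c / 2) / 3
    return 4 / 3 * math.pi * r ** 3

def ellipsoid_volume(a, b, c):
    """Crown volume as an ellipsoid with semi-axes a/2, b/2, c/2."""
    return 4 / 3 * math.pi * (a / 2) * (b / 2) * (c / 2)

# Hypothetical crown dimensions (m): height a, length b, width c
print(round(sphere_volume(2.0, 2.5, 2.2), 2))
print(round(ellipsoid_volume(2.0, 2.5, 2.2), 2))
```

Note that, by the inequality of arithmetic and geometric means, the sphere built from the averaged radius is always at least as large as the corresponding ellipsoid, which is consistent with the slope reported in Figure 12.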

2.4. UAV Imagery Acquisition

The aerial survey was performed on the same day as the ground-truth measurements, 29 July 2021, over the pistachio field using a UAV DJI Phantom Advance quadcopter (Figure 5) equipped with a DJI FC6310 RGB camera with a 1-inch 20-megapixel sensor, an aperture range from F2.8 to F11, a focal length of 8.8 mm (35 mm equivalent: 24 mm), a horizontal field of view of 84°, and a mechanical shutter for rolling shutter distortion reduction. For this research, the camera was configured using ISO 400 and an aperture of F7.1, and 248 nadir images (image size 5475 px × 3078 px) in JPEG format were captured with a side and frontal overlap ratio of 80%. This UAV model was chosen because DJI is the world's leading drone manufacturer [43] and, according to the manufacturer's Agricultural Drone Industry Insights Report for 2021 [44], the Phantom series is one of the company's best-selling agricultural drones.
The flight mapping survey (Figure 6a) was designed using DJI Pilot App (v1.9.0), setting up the UAV horizontal speed of 4 m/s and a flying height of 55 m above ground level (AGL), resulting in a theoretical average ground sample distance (GSD) of 1.53 cm/px. In addition, a set of five GCPs were located in the field and georeferenced using a real-time kinematic (RTK) GPS Triumph-2 JAVAD to enhance the geometric accuracy of the image mosaicking process, optimize camera positions, and improve the orientation of the data. The dataset is freely available in a public repository [45]. The survey was planned to obtain the shadows of the trees projected on the ground (Figure 6b).
To this end, the flight was scheduled using the NOAA Solar Calculator (https://www.esrl.noaa.gov/gmd/grad/solcalc/, accessed on 9 July 2022), which implements the equations described by Jean Meeus [46] and makes it possible to estimate the position of the sun on a given date at any place on Earth, in this case, the location of the pistachio orchard (Figure 7). Table 1 shows the calculated solar azimuth, elevation, and shadow ratio for the flight date, 29 July 2021, at 11:23, solar noon, sunset, and sunrise (local time).
In this study, the desired sun elevation was β = 45°, aiming to obtain the actual size of the plant (1:1 shadow ratio) from the projected shadows. Therefore, the flight on the survey day (29 July 2021) was scheduled between 11:15 and 11:30 a.m., when the azimuth angle was α = 106° (Table 1). Solar noon at the location of the pistachio orchard was at 14:26 (local time). The azimuth is measured clockwise from true north to the point on the horizon directly below the object, and the elevation is measured vertically from that point on the horizon up to the object.
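The shadow ratio follows directly from the sun elevation: a vertical object of height h casts a ground shadow of length h/tan(β), so at β = 45° the shadow equals the object's height. A small sketch of this relation (illustrative only; in the study the values were read from the NOAA calculator):

```python
import math

def shadow_ratio(sun_elevation_deg):
    """Shadow length per unit of object height: 1 / tan(elevation)."""
    return 1.0 / math.tan(math.radians(sun_elevation_deg))

def shadow_length(object_height_m, sun_elevation_deg):
    """Length of the ground shadow cast by a vertical object."""
    return object_height_m * shadow_ratio(sun_elevation_deg)

print(round(shadow_ratio(45.0), 3))        # 1:1 ratio at beta = 45 degrees
print(round(shadow_length(2.5, 45.0), 2))  # a 2.5 m tree casts a ~2.5 m shadow
```

Lower sun elevations lengthen the shadows (ratio > 1), which is why the flight window, rather than solar noon, was chosen.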

2.5. Orthomosaic Processing

The first step was to build the orthomosaic, taking into account the locations of the GCPs, georeferenced in ETRS89/UTM 30N. They were indicated in the images to improve orthophoto accuracy. Then, the orthomosaic was generated from the RGB imagery using Agisoft Metashape Professional software, v1.7.6 (Agisoft LLC, St. Petersburg, Russia), following the software provider guidelines.
Then, the image was segmented, aiming to extract the shadows and vegetation planar area (Figure 8) using random forest as described by Vélez et al. [37].
Random forest is a supervised machine learning method, implemented in this work for segmentation purposes, that uses decision trees for classification and prediction. The algorithm combines the predictions to give a final output using the bootstrap technique [47]. In this study, the number of trees was set to 500. Since it is a supervised algorithm, fifteen polygons (five per class) were located within the orthomosaic over areas with vegetation, soil, and shadows, segmenting the orthomosaic into three classes: soil, ground shadows, and vegetation. The classification was very accurate, with an OOB (out of bag) estimate of the error rate of 0.46%. After segmentation, a sieve filter was applied (threshold of 20) to remove noise from the raster image. This process discards raster polygons below a given threshold size (in pixels) and substitutes them using the pixel value of the largest neighbor polygon.
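The segmentation step can be sketched as follows. The study performed it with the R randomForest package, so this scikit-learn version is only an illustrative stand-in; the synthetic RGB class values and sample counts below are invented:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic RGB training samples standing in for the pixels inside the
# fifteen hand-drawn polygons (five per class). Values are illustrative.
veg = rng.normal([60, 110, 50], 10, size=(200, 3))     # green vegetation
soil = rng.normal([160, 130, 100], 10, size=(200, 3))  # bright bare soil
shadow = rng.normal([40, 40, 45], 8, size=(200, 3))    # dark ground shadow

X = np.vstack([veg, soil, shadow])
y = np.array([0] * 200 + [1] * 200 + [2] * 200)  # 0=veg, 1=soil, 2=shadow

# 500 trees, as in the paper; oob_score reports the out-of-bag accuracy.
clf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
clf.fit(X, y)
print(f"OOB error rate: {1 - clf.oob_score_:.2%}")

# Classify every pixel of a (rows, cols, 3) orthomosaic tile at once.
tile = rng.normal([60, 110, 50], 10, size=(4, 4, 3))
classes = clf.predict(tile.reshape(-1, 3)).reshape(4, 4)
```

The sieve filter applied afterwards is a raster operation (e.g. GDAL's SieveFilter) that removes connected components below the 20-pixel threshold.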
Both methodologies employ a grid based on the previously defined angles (planting pattern and solar azimuth), generating parallelepiped polygons (Figure 9), and assigning a correlative number to each polygon (id).
Therefore, in this specific work, the angles were 25° due to the orientation of the planting pattern and 106° due to the aforementioned solar azimuth (Table 1), resulting in a single-polygon area of 43.7 m2. The grid was used on the reclassified image to identify each tree, assigning a plant id from the polygon id.
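The grid construction can be sketched with shapely as a hypothetical helper (the study built its grid in QGIS; the origin, spacings, and grid size below are illustrative, and angles follow the compass convention, clockwise from north):

```python
import math
from shapely.geometry import Polygon
from shapely.affinity import translate

def parallelogram_grid(origin, row_angle_deg, shadow_angle_deg,
                       row_spacing, tree_spacing, n_rows, n_cols):
    """Build a grid of parallelepiped (parallelogram) cells whose sides follow
    the planting rows and the solar azimuth, so that each cell contains one
    tree together with its projected shadow."""
    def unit(angle_deg):
        a = math.radians(angle_deg)
        return (math.sin(a), math.cos(a))  # compass angle -> (east, north)

    u = unit(row_angle_deg)     # along the planting row (25 deg here)
    v = unit(shadow_angle_deg)  # along the shadow direction (106 deg here)
    ux, uy = u[0] * tree_spacing, u[1] * tree_spacing
    vx, vy = v[0] * row_spacing, v[1] * row_spacing

    base = Polygon([(0, 0), (ux, uy), (ux + vx, uy + vy), (vx, vy)])
    cells = []
    for i in range(n_cols):
        for j in range(n_rows):
            cell = translate(base,
                             xoff=origin[0] + i * ux + j * vx,
                             yoff=origin[1] + i * uy + j * vy)
            cells.append(cell)  # the cell index doubles as the plant id
    return cells

cells = parallelogram_grid((341450.0, 4589731.0), 25, 106, 7.0, 6.0, 4, 5)
print(len(cells), round(cells[0].area, 1))
```

With these illustrative spacings the cell area works out to row_spacing × tree_spacing × sin(106° − 25°); the exact 43.7 m² in the study depends on its actual grid dimensions.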
Once the orthomosaic is segmented, it is analyzed in two different ways: using a raster analysis approach (pixel-based classification) and a vector analysis approach, examining the geometries in an OBIA approach. In the raster analysis approach, the number of pixels of each class within the polygon corresponding to each plant is counted directly. In the OBIA approach, the pixels corresponding to each category are transformed into objects, enveloping them within several geometries.

2.5.1. Pixel-Based Methodological Approach

In the raster analysis approach, or pixel-based classification, the grid was used to calculate the statistics of each plant, that is, the number of pixels belonging to the vegetation class and shadow class within each polygon. Since the area of the polygon is known, the percentage of the pixels of a class in relation to the total is the planar vegetation area or the area belonging to the shadows, respectively. Subsequently, an average area was calculated by averaging the shadow area (pixels classified as shadows on the ground) and the planar area of the vegetation (pixels classified as vegetation) per plant. The planar area, calculated from nadir images, captures dimensions in the horizontal plane (the plane parallel to the Earth’s surface), while the shadow projected on the ground captures dimensions in a different plane, in this case at 45° to the vertical (because it is the sun’s elevation angle, Figure 6b). From now on, the vegetation is contained in the ‘vegetation plane’ and the shadows in the ‘shadow plane’.
Finally, in the raster analysis approach, the averaged area was considered a circle and the equivalent radius was estimated in order to calculate the crown volume as a sphere. These results were validated against the values of the sphere volumes computed from ground-truth data.
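A compact sketch of this pixel-counting step (the study performed it in QGIS/R; the class labels, toy cell, and function name below are hypothetical):

```python
import math
import numpy as np

def crown_volume_pixel_based(classified_cell, gsd_m, veg=0, shadow=2):
    """Estimate crown volume from one grid cell of the classified raster.
    classified_cell: 2-D array of class labels for a single plant's cell.
    gsd_m: ground sample distance in metres per pixel."""
    pixel_area = gsd_m ** 2
    veg_area = np.count_nonzero(classified_cell == veg) * pixel_area
    shadow_area = np.count_nonzero(classified_cell == shadow) * pixel_area

    # Average the planar (vegetation) and shadow areas, treat the result as
    # a circle, and derive an equivalent radius for the sphere model.
    mean_area = (veg_area + shadow_area) / 2
    r = math.sqrt(mean_area / math.pi)
    return 4 / 3 * math.pi * r ** 3

# Toy 10x10 cell: labels 0=vegetation, 1=soil, 2=shadow; GSD ~1.53 cm/px
cell = np.ones((10, 10), dtype=int)
cell[2:6, 2:6] = 0  # 16 vegetation pixels
cell[6:9, 2:6] = 2  # 12 shadow pixels
print(crown_volume_pixel_based(cell, gsd_m=0.0153))
```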

2.5.2. OBIA Methodological Approach

In the OBIA approach, a raster vectorization process was performed on the segmented orthomosaic, transforming the raster image into polygons (Figure 10).
The same procedure was performed twice to exploit the dimensions captured in vegetation and shadow planes. Therefore, each plant had two linked polygons (with the same plant id), one from the vegetation plane and one from the shadow plane. Hence, as a first step, the pixels classified as shadows were converted into polygons.
Subsequently, the polygons were filtered using a 0.2-m2 threshold, aiming to reduce the noise, and several types of geometries were used to envelop the crown area. In this way, the minimum bounding geometry algorithm included in the QGIS algorithm provider (QGIS version 3.22.X, QGIS developer team 2022) was used to create geometries enveloping the areas. The same approach was followed for the vegetation plane (Figure 11), using a 0.05 m2 threshold for filtering the polygons.
In both OBIA workflows, the four geometries provided by the QGIS minimum bounding geometry algorithm were employed: circles, bounding boxes (envelopes), oriented rectangles, and convex hulls. Finally, the spatial statistics were calculated using the polygon shape indices algorithm from SAGA [48], and the crown volume was computed as an ellipsoid or a sphere depending on the geometry type:
  • Circle: The crown volume was calculated as a sphere using an equivalent radius obtained by averaging the radius of the polygons of the vegetation plane and the shadow plane. These results were validated against the values of the sphere volumes computed from ground-truth data.
  • Bounding boxes (envelopes) and oriented rectangles: The crown volume was calculated as an ellipsoid. The equivalent three radii were the length of each plane’s polygon and the polygons’ averaged width. These results were validated against the values of the ellipsoid volumes computed from ground-truth data.
  • Convex hulls: As in the raster analysis approach, an averaged area was calculated by averaging the area of the two polygons (from the vegetation and the shadow planes). Then, the averaged area was considered a circle and the equivalent radius was estimated to calculate the crown volume as a sphere. These results were validated against the values of the sphere volumes computed from ground-truth data.
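The four enclosing geometries have direct equivalents in shapely (version 2.0 or later), which can stand in for the QGIS minimum bounding geometry algorithm used in the study; the point set below is a hypothetical stand-in for one plant's vectorised pixels:

```python
import math
import shapely
from shapely.geometry import MultiPoint

# Hypothetical vertices standing in for one plant's vectorised shadow pixels.
pts = MultiPoint([(0, 0), (3, 0.5), (2.5, 2.8), (0.4, 2.2), (1.5, 1.0)])

hull = pts.convex_hull                          # convex hull
bbox = pts.envelope                             # bounding box (envelope)
obb = pts.minimum_rotated_rectangle             # oriented rectangle
circle = shapely.minimum_bounding_circle(pts)   # enclosing circle

# Sphere volume from the averaged area of the vegetation- and shadow-plane
# geometries (for brevity the same hull stands in for both planes here).
mean_area = (hull.area + hull.area) / 2
r = math.sqrt(mean_area / math.pi)
volume = 4 / 3 * math.pi * r ** 3
print(round(hull.area, 3), round(volume, 2))
```

By construction the convex hull is the tightest of the four envelopes, which is consistent with it producing the smallest overestimation in the results.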
Finally, root mean square error (RMSE) was calculated to assess the differences between predicted and observed values. RMSE is one of the most commonly used measures to evaluate the error in the environmental sciences [49]:
RMSE = √( Σᵢ₌₁ⁿ (ŷᵢ − yᵢ)² / n )
where n is the total sample size, yᵢ is the actual value (ground truth), and ŷᵢ is the estimated value.
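The RMSE formula above can be computed with a few lines of Python (the study used R; the sample volumes below are invented for illustration):

```python
import math

def rmse(estimated, observed):
    """Root mean square error between estimated and ground-truth values."""
    n = len(observed)
    return math.sqrt(sum((e - o) ** 2 for e, o in zip(estimated, observed)) / n)

# Hypothetical crown volumes (m^3): estimates vs. field measurements
print(round(rmse([10.2, 7.9, 12.5], [9.8, 8.4, 11.9]), 3))
```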
Image, statistical, and data analyses were conducted using QGIS (version 3.22.X, QGIS developer team 2022) and R software (version 4.2.X, R Foundation for Statistical Computing, R Core Team 2019, Vienna, Austria, https://www.R-project.org/, accessed on 9 July 2022), using packages randomForest, raster, rgdal, sf, rgeos, and caret, obtained from the Comprehensive R Archive Network (CRAN).

3. Results

The crown volume of each tree was calculated using ground-truth data (Table 2) and assuming that the tree crown is a sphere (V.s.) or an ellipsoid (V.e.), using the distance from the canopy to the ground (h), the height (a), the length (b), and the width (c).
In the current research, the canopy volume calculated using the sphere approximation (V.s.) was strongly correlated to the volume calculated assuming the canopy is an ellipsoid (V.e.), showing an R2 very close to 1 (Figure 12). However, the slope of the linear equation is 1.08. Therefore, the larger the canopy volume of the pistachio tree, the higher the difference in canopy values when using different geometries. In this case, the canopy volumes obtained using the sphere approximation become proportionally larger than those estimated using the ellipsoid approach.
The linear regression between the estimated crown volume using the raster analysis approach and the ground-truth data was calculated, assuming the crown as a sphere (Figure 13a) or an ellipsoid (Figure 13b). Both approaches showed a clear relationship with ground-truth data (R2 > 0.8; p < 0.05), although they also slightly underestimated them. However, when assuming the pistachio tree crown as a sphere, the RMSE value is slightly higher (RMSE = 2.82 m3) than when assuming an ellipsoid (RMSE = 2.73 m3).
Regarding the OBIA analysis, linear regressions were calculated between the estimated and the ground-truth data values (Figure 14).
First, the estimated values were calculated by enclosing the polygons using several geometries: bounding box (Figure 14a), oriented rectangle (Figure 14b), circle (Figure 14c), and convex hull (Figure 14d). Then, the estimated volumes were compared to the measured ones, assuming the tree crown was an ellipsoid (Figure 14a,b) and a sphere (Figure 14c,d).
The results showed a clear relationship (R2 > 0.9), as in the raster analysis approach. However, in the OBIA approach, the methodology overestimates the values of the pistachio tree canopy in all cases, with higher volume resulting in higher overestimation (slope > 1). The convex hull geometry (Figure 14d) showed less error than the others with RMSE = 5.06 m3, whereas the circle geometry showed the highest error (RMSE = 12.22 m3).

4. Discussion

Crown volume is related to light interception and, therefore, to the quantity and quality of the produced fruit [50], so it is reasonable to consider that crown volume could be estimated by directly measuring the shadows (which are the result of the light intercepted by the plant), and the planar area of the vegetation (which intercepts that light). Vélez et al. [37] demonstrated that it is possible to precisely plan a flight over vineyards by calculating the azimuth and elevation of the sun, aiming to obtain an optimal projection of plant shadows on the ground for obtaining information about a dimension of the canopy other than the horizontal plane (the plane parallel to the Earth surface). This information is often difficult to get in aerial orthophotography applications. However, shadows projected on the ground capture dimensions in a different plane, depending on the sun’s elevation angle.
The most appropriate time to capture the images following the present methodology is defined by the day of the year, the height of the trees, and the amount of available ground on which the shade will be projected, which will depend on the planting position of the trees. This article shows how to calculate flight time using a solar calculator based on astronomical algorithms. The proposed technique was successful, looking for a β = 45° sun elevation angle to calculate the plant’s actual dimensions from the projected shadows.
Overall, the results showed a significant positive relationship between the estimated and measured canopy volumes. Therefore, both methodologies have proved to be effective to a limited extent, and under certain conditions. As a result, the estimated crown volume using the raster analysis approach and the ground-truth data showed a strong relationship (R2 > 0.8). Moreover, the estimated crown volume using the methodology based on the OBIA procedure also showed a clear relationship with the ground-truth data (R2 > 0.9). The estimated crown volumes using the raster analysis approach were slightly underestimated compared to the ground-truth data. Conversely, the estimated crown volumes using the OBIA procedure were largely overestimated compared to the ground-truth data. This result may be because the OBIA geometries are generated by enveloping the polygons corresponding to each class, that is, shadows and vegetation, enclosing the existing gaps in the tree canopy vegetation. Furthermore, the perimeter of the projected tree crown does not fit exactly with the polygon that encloses it, whether bounding boxes, oriented rectangles, enclosing circles, or convex hulls; the enclosing polygon smooths the irregular surface of the crown, leaving indentations between the generated polygon and the actual projected surface. Other authors have faced similar issues in temperate old-growth forests, proposing index-based techniques to estimate the relative amount of indentations to counterbalance their effect [51]. In this work, the data were fitted using linear regression, producing a good adjustment and suggesting that the number of gaps and indentations was proportional to the crown size, with increasing crown volumes resulting in a bigger overestimation.
Bearing in mind that all values estimated using the OBIA procedure were overestimated, the values calculated using the convex hull procedure were the least overestimated, while those obtained with the bounding box and oriented rectangle geometries fell between the best and the worst results. In contrast, the circle overestimated the canopy size more than the other geometries. This finding makes sense since the objective of using geometries is to envelop all pixels belonging to the canopy of the tree, and the circle occupies the most extensive area because its geometry is defined by a single parameter, the radius, whereas the other geometries employed in the OBIA approach have more degrees of freedom for adjustment. In the pixel-based analysis, by contrast, each pixel is counted to obtain the total number of pixels belonging to a class, which is closer to the measured values, as the slope of the regression equation is closer to 1 than in the OBIA procedure.
An analysis of the error generated by the different procedures shows that the methodological approach using raster analysis has higher predictive accuracy [49], as its RMSE is at most half that of the OBIA approach. A more in-depth analysis of the OBIA approach shows that the RMSE values of the circle were the highest, confirming that this type of geometry is the worst option. On the other hand, the RMSE values of the convex hull were the lowest, indicating that the convex hull is the most suitable geometry to wrap the shape of the crown.
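The slope, RMSE, and R2 comparisons above can be reproduced with a few lines of code. This is a minimal sketch operating on synthetic, hypothetical volume data, not the study's measurements:

```python
import math

def fit_through_origin(measured, estimated):
    """Slope k of the least-squares fit estimated = k * measured (regression
    through the origin), the RMSE of the estimates against the measurements,
    and the squared Pearson correlation between the two series."""
    n = len(measured)
    k = sum(m * e for m, e in zip(measured, estimated)) / sum(m * m for m in measured)
    rmse = math.sqrt(sum((e - m) ** 2 for m, e in zip(measured, estimated)) / n)
    mean_m = sum(measured) / n
    mean_e = sum(estimated) / n
    cov = sum((m - mean_m) * (e - mean_e) for m, e in zip(measured, estimated))
    var_m = sum((m - mean_m) ** 2 for m in measured)
    var_e = sum((e - mean_e) ** 2 for e in estimated)
    r2 = cov * cov / (var_m * var_e)
    return k, rmse, r2

# Hypothetical example: a method that consistently overestimates by 2x
# yields a high R2 (the relationship is perfectly linear) but a slope far
# from 1 and a large RMSE, mirroring the OBIA behaviour discussed above.
k, rmse, r2 = fit_through_origin([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```

This illustrates why R2 alone is insufficient and why the slope and RMSE are also reported [49].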
This result was expected: circles and bounding boxes are simple geometric shape approximations, whereas the convex hull is a computational geometry approach that makes no assumptions about the shape of the crown [12]. The convex hull of a set of points is the smallest convex set that contains them [52], so its area will be lower than that of any simple geometric shape enclosing the same points. Several other authors have also used convex hulls in their research. Yan et al. [53] found it difficult to deal with the gaps and holes within the crown and, as in the present work, obtained computed values that were overestimated compared to the actual values. Wang et al. [54] employed convex hulls to reconstruct occluded apples, obtaining good results in extracting their real shape and showing that convex hulls are a good choice for shape representation and analysis. Finally, Zhu et al. [55] used convex hulls to calculate the tree crown volume, increasing their number and decreasing their size with each iteration, and found a higher accuracy by summing the volumes of all these convex hulls; this could be a way to improve the methodology presented in this paper. Furthermore, the convex hull algorithm is likely to be improved and fine-tuned to provide a better approach for calculating crown volumes [56].
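The geometric intuition can be demonstrated with a short sketch. The snippet below is a simplified stand-in for the OBIA software used in the study: it computes the convex hull with Andrew's monotone chain (a relative of the Quickhull approach in [52]) and compares its area with a bounding box and an enclosing circle. For simplicity the circle is centred on the centroid, which gives an upper bound on the true minimum enclosing circle.

```python
import math

def convex_hull(points):
    """Andrew's monotone chain: returns hull vertices in counterclockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(poly):
    """Shoelace formula for a simple polygon."""
    return 0.5 * abs(sum(poly[i][0] * poly[(i + 1) % len(poly)][1]
                         - poly[(i + 1) % len(poly)][0] * poly[i][1]
                         for i in range(len(poly))))

def bounding_box_area(points):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

def enclosing_circle_area(points):
    # Circle centred on the centroid, radius = farthest point; an upper
    # bound on the true minimum enclosing circle.
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    r = max(math.hypot(p[0] - cx, p[1] - cy) for p in points)
    return math.pi * r * r
```

Because the hull is the smallest convex set containing the points, its area never exceeds that of any convex enclosing shape, which is exactly the ranking observed in the results.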
The results of the present work are relevant since it is key to estimate the crown volume accurately and quickly. Crown volume is related to the leaf area and, therefore, to the photosynthetic capacity of the plant, which defines the plant's ability to produce photoassimilates and determines the maximum crop load, limiting the source-sink ratio of the plant since these compounds move from leaves to nuts [57]. As a general rule, in all woody crops, the crop load is crucial as it affects the correct development of the crop [58]. In pistachio, it is even more critical, as the crop load also affects the percentage of nuts with split shells and the percentage of blank nuts: with increasing crop load, the rate of nuts with open shells decreases, as does the ratio of empty nuts [4]. Moreover, the dimensions of the canopy and the crown volume are valuable phenotyping traits for breeding programs that aim to create new cultivars adapted to climate change and new cropping systems [11].
A wide variety of remote sensing technologies are available for canopy monitoring. These techniques are often associated with, or restricted to, digital photogrammetry reconstructions and LiDAR data, as they are widely available on the market and have proven their worth in a range of applications [59]. Thus, UAV RGB images have been widely used in woody crops for 3D point cloud generation [25,60,61], showing potential in agriculture for assessing diverse parameters. Gómez-Gálvez et al. [42] generated 3D point clouds from UAV RGB images captured over olive trees for accurate, high-throughput measurement of canopy traits, revealing differences across cultivars and between trees of different ages within the same cultivar. The results of the present work agree with theirs, suggesting that UAVs combined with diverse image processing techniques make it possible to rapidly characterize large numbers of trees and to assist in the identification and definition of tree crowns as phenotypic traits. Other authors have employed sensors more complex than simple RGB cameras, such as multispectral cameras for 3D point cloud generation [62] or LiDAR [27,51,53]. Certainly, more complex sensors can provide additional data to improve the technique presented in this work, e.g., by extracting the canopy dimensions from slices of geometries computed from LiDAR instead of RGB imagery. However, using more complex technology does not always guarantee better results: Ghanbari Parmehr and Amati found that point clouds obtained from photogrammetric and LiDAR canopy scans are nearly identical and provide comparable tree characteristics [63]. Therefore, all remote-sensing-based methods can be valuable and useful depending on the context.
Furthermore, since precision agriculture seeks the implementation of site-specific management [15] using a wide range of combined technologies [16], certain remote sensing techniques may be more compatible than others for integration. A specific remote sensing technique that generates accurate results, such as the generation and evaluation of 3D point clouds [25,42], may be suitable when enough processing time is available, but not when near-real-time information is required, e.g., to generate information from UAVs flying just ahead of a tractor or other agricultural machinery. It is therefore important to have a wide range of techniques available to choose from. In other fields, architectures for UAV image processing are already being developed that deliver near-real-time results with sufficient accuracy [64], as well as allowing images to be classified according to their contents [65].
In this sense, this work presents a novel technique with potential benefits in terms of compatibility with other remote sensing techniques, increasing the range of tools available for technicians to obtain useful agronomic information on pistachio orchards. The rationale behind the method is straightforward and robust: the approach measures the dimensions directly from the images, that is, from the physical object (the canopy), taking advantage of the shadows generated by the object through adequate planning of the flight mission. In this case, the flight was performed between 11:15 and 11:30, whereas solar noon occurred at 14:26 local time at the location of the pistachio orchard on that specific day. Therefore, the proposed approach requires a different time of day than other methods, which must fly at solar noon [29], making it compatible with other remote sensing techniques and enabling integration into a flexible working day.
Finally, it should be noted that, like other methodologies based on remote sensing, the proposed approach has its limitations:
  • First, spheres and ellipsoids were used in the present work as models of the tree crowns because the studied pistachio canopies resembled these geometries. However, this assumption can lead to errors because a canopy never has a perfect geometric shape. The present technique could be adjusted to other shapes (for instance, using the formula for a cone or a paraboloid [12]). In any case, this article aims to present a novel technique to estimate pistachio tree (Pistacia vera L.) canopy volume by analyzing ground shadows in UAV RGB imagery, so it is open to further modification and improvement in future research.
  • The shadows must be projected correctly on the ground. That is, adequate direct lighting is required, without clouds obstructing the sun’s rays on the Earth’s surface and generating diffuse illumination.
  • Another constraint could be the amount of ground available for the shadow projection. This should not be an issue in woody crop plantations because they are planned to ensure good solar illumination, avoiding the shading of some trees over others (a typical planting pattern is 7 × 6 m or 7 × 7 m). However, some new plantations are more intensive, with spacings as low as 4 × 1.25 m. This implies planting in hedgerow systems, which can be approached in a similar way to that reported by Vélez et al. [37].
  • Shadows vary throughout the day depending on the sun’s position. Therefore, it is essential to plan the mission accordingly to obtain the best information on the projection of shadows on the ground. Yet, this is not a limitation of this methodology alone, as all methods based on remote sensing with solar illumination must consider natural lighting conditions to obtain good results.
  • The values for filtering the noise (Figure 10 and Figure 11) were chosen after several trials with different values, seeking the best visual fit to remove the weeds from the image and isolate the pistachio tree crowns. This aspect could be improved in future versions of the approach, since this work seeks not to develop an algorithm to identify weeds in pistachio orchards but to provide a technique to size the tree canopy.
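The crown-shape formulas mentioned in the first limitation above are straightforward to encode. The sketch below assumes, as in Figure 4, that the field measurements are full axis lengths (diameters rather than radii); the cone and paraboloid variants follow the review in [12]:

```python
import math

def sphere_volume(d):
    """Sphere of diameter d: V = (pi/6) d^3."""
    return math.pi * d ** 3 / 6.0

def ellipsoid_volume(a, b, c):
    """Ellipsoid whose full axis lengths (height, length, width)
    are a, b, c: V = (pi/6) a b c."""
    return math.pi * a * b * c / 6.0

def cone_volume(d, h):
    """Cone with base diameter d and height h: V = (pi/12) d^2 h."""
    return math.pi * d ** 2 * h / 12.0

def paraboloid_volume(d, h):
    """Paraboloid of revolution with base diameter d and
    height h: V = (pi/8) d^2 h."""
    return math.pi * d ** 2 * h / 8.0
```

Swapping the shape model only changes the constant factor, so the technique can be re-fitted to crowns that are better described by a cone or a paraboloid without altering the image processing pipeline.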
In future research, it would be interesting to assess the applicability of this methodology to other woody crops planted individually, such as olive or almond trees. It would also be worth exploring whether this methodology can be applied to analyze the density of leaves within that volume, as the gaps within the tree canopy can be observed when analyzing the shadows after the segmentation process (Figure 8). Finally, using image sensors other than RGB, such as multispectral sensors, could provide useful information. They usually include the NIR (near-infrared) band, which is highly relevant in agriculture for physicochemical and morphophysiological analysis, as it can capture spectral information on the main photosynthetic pigments and characterize the internal structure of the leaves through light scattering [66]. Moreover, the NIR channel is commonly used to isolate vegetation pixels from the background [67,68]. Therefore, multispectral imagery could increase the accuracy of the methodology in capturing crop information and segmenting the image more precisely.
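NIR-based vegetation masking of the kind cited here [67,68] typically relies on an NDVI threshold. A minimal sketch follows; the 0.4 threshold is an arbitrary illustrative value, and a real pipeline would operate on raster arrays rather than nested lists:

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index, (NIR - Red) / (NIR + Red),
    for a single pixel; returns 0.0 when both bands are zero."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

def vegetation_mask(nir_band, red_band, threshold=0.4):
    """Boolean mask flagging pixels whose NDVI exceeds a tunable threshold.
    Bands are equally sized 2D lists of reflectance values."""
    return [[ndvi(n, r) > threshold for n, r in zip(nir_row, red_row)]
            for nir_row, red_row in zip(nir_band, red_band)]
```

Such a mask could replace (or complement) the RGB random forest segmentation when multispectral imagery is available.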

5. Conclusions

This study developed a new, rapid, and low-cost technique, with two methodological approaches (one based on raster analysis and the other on OBIA), for estimating the canopy volume of pistachio trees. To this end, the RGB orthomosaic was segmented using machine learning methods to measure and combine the planar area (vegetation plane) and the ground shadow area (shadow plane). All approaches showed statistically significant linear relationships. However, the methodological approach based on raster analysis (pixel-based classification) fitted the ground-truth data better and had lower error and higher predictive accuracy than the OBIA approach. Moreover, within the OBIA approach, the circle was the worst geometry to envelop the tree crown, whereas the convex hull was the best.
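As an illustrative sketch only, one hypothetical way to combine the two planes is to convert each classified area into an equivalent circular diameter and insert the diameters into the ellipsoid formula; this simplified reading is an assumption for illustration, not necessarily the authors' exact formulation:

```python
import math

def equivalent_diameter(area):
    """Diameter of a circle with the same area: d = 2 * sqrt(A / pi)."""
    return 2.0 * math.sqrt(area / math.pi)

def crown_volume_from_planes(vegetation_area, shadow_area):
    """Illustrative ellipsoid estimate (hypothetical combination): the
    vegetation (top-view) plane provides the horizontal extent and the
    45-degree shadow plane the vertical extent (1:1 shadow ratio)."""
    d_horizontal = equivalent_diameter(vegetation_area)
    d_vertical = equivalent_diameter(shadow_area)
    # Ellipsoid with two horizontal axes d_horizontal and a vertical
    # axis d_vertical: V = (pi/6) * a * b * c.
    return math.pi * d_horizontal * d_horizontal * d_vertical / 6.0
```

For equal vegetation and shadow areas, the estimate reduces to the volume of a sphere with the same equivalent diameter, matching the sphere/ellipsoid crown models used in the study.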
The proposed methodology gives UAV operators more versatility to plan a flexible working day, demonstrating that, on any given day, images of the projected shadows can be captured at a sun elevation angle of β = 45° to obtain the actual size of the plant (1:1 shadow ratio), using astronomical algorithms to plan the UAV flight adequately according to the lighting conditions.

Author Contributions

Conceptualization: S.V.; data curation: S.V. and R.V.; formal analysis: S.V.; investigation: S.V.; methodology: S.V., S.Á. and R.V.; project administration: S.Á.; resources: S.V., H.M., D.R.-R., R.V. and S.Á.; software: S.V. and R.V.; supervision: S.V.; visualization: S.V. and R.V.; writing—original draft: S.V.; writing—review and editing: S.Á. and R.V.; funding acquisition: S.Á. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the project CDTI (IDI-20200822) and FEADER funds.

Data Availability Statement

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Bulló, M.; Juanola-Falgarona, M.; Hernández-Alonso, P.; Salas-Salvadó, J. Nutrition Attributes and Health Effects of Pistachio Nuts. Br. J. Nutr. 2015, 113, S79–S93. [Google Scholar] [CrossRef] [Green Version]
  2. Steduto, P.; Hsiao, T.C.; Fereres, E.; Raes, D. Crop Yield Response to Water, 1st ed.; Food and Agriculture Organization of the United Nations: Rome, Italy, 2012; ISBN 978-92-5-107274-5. [Google Scholar]
  3. Mandalari, G.; Barreca, D.; Gervasi, T.; Roussell, M.A.; Klein, B.; Feeney, M.J.; Carughi, A. Pistachio Nuts (Pistacia vera L.): Production, Nutrients, Bioactives and Novel Health Effects. Plants 2021, 11, 18. [Google Scholar] [CrossRef]
  4. Ferguson, L.; Polito, V.; Kallsen, C. The Pistachio Tree; Botany and Physiology and Factors That Affect Yield. In Pistachio Production Manual; University of California: Davis, CA, USA, 2005; pp. 31–39. [Google Scholar]
  5. Charles-Edwards, D.A.; Thornley, J.H.M. Light Interception by an Isolated Plant A Simple Model. Ann. Bot. New Ser. 1973, 37, 919–928. [Google Scholar] [CrossRef]
  6. Allen, R.G.; Pereira, L.S.; Raes, D.; Smith, M. Crop Evapotranspiration: Guidelines for Computing Crop Water Requirements; Food and Agriculture Organization of the United Nations: Rome, Italy, 1998; ISBN 92-5-104219-5. [Google Scholar]
  7. Todd, R.W.; Klocke, N.L.; Hergert, G.W.; Parkhurst, A.M. Evaporation from Soil Influenced by Crop Shading, Crop Residue, and Wetting Regime. Trans. ASAE 1991, 34, 461–466. [Google Scholar] [CrossRef]
  8. Bandyopadhyay, P.K.; Mallick, S. Actual Evapotranspiration and Crop Coefficients of Wheat (Triticum Aestivum) under Varying Moisture Levels of Humid Tropical Canal Command Area. Agric. Water Manag. 2003, 59, 33–47. [Google Scholar] [CrossRef]
  9. Jia, Q.; Wang, Y.-P. Relationships between Leaf Area Index and Evapotranspiration and Crop Coefficient of Hilly Apple Orchard in the Loess Plateau. Water 2021, 13, 1957. [Google Scholar] [CrossRef]
  10. Netzer, Y.; Yao, C.; Shenker, M.; Bravdo, B.-A.; Schwartz, A. Water Use and the Development of Seasonal Crop Coefficients for Superior Seedless Grapevines Trained to an Open-Gable Trellis System. Irrig. Sci. 2009, 27, 109–120. [Google Scholar] [CrossRef]
  11. León, L.; Díaz-Rueda, P.; Belaj, A.; De la Rosa, R.; Carrascosa, C.; Colmenero-Flores, J.M. Evaluation of Early Vigor Traits in Wild Olive Germplasm. Sci. Hortic. 2020, 264, 109157. [Google Scholar] [CrossRef] [Green Version]
  12. Zhu, Z.; Kleinn, C.; Nölke, N. Assessing Tree Crown Volume—A Review. For. Int. J. For. Res. 2021, 94, 18–35. [Google Scholar] [CrossRef]
  13. Balafoutis, A.T.; Beck, B.; Fountas, S.; Tsiropoulos, Z.; Vangeyte, J.; Gómez-Barbero, M.; Pedersen, S.M. Smart Farming Technologies–Description, Taxonomy and Economic Impact. In Precision Agriculture: Technology and Economic Perspectives; Progress in Precision Agriculture; Springer: Berlin, Germany, 2017; p. 58. ISBN 978-3-319-68715-5. [Google Scholar]
  14. Pierce, F.J.; Clay, D. GIS Applications in Agriculture; CRC Press: Boca Raton, FL, USA, 2007; ISBN 978-0-8493-7526-2. [Google Scholar]
  15. Gebbers, R.; Adamchuk, V.I. Precision Agriculture and Food Security. Science 2010, 327, 828–831. [Google Scholar] [CrossRef]
  16. Santesteban, L.G. Precision Viticulture and Advanced Analytics. A Short Review. Food Chem. 2019, 279, 58–62. [Google Scholar] [CrossRef]
  17. Muruganantham, P.; Wibowo, S.; Grandhi, S.; Samrat, N.H.; Islam, N. A Systematic Literature Review on Crop Yield Prediction with Deep Learning and Remote Sensing. Remote Sens. 2022, 14, 1990. [Google Scholar] [CrossRef]
  18. Albetis, J.; Duthoit, S.; Guttler, F.; Jacquin, A.; Goulard, M.; Poilvé, H.; Féret, J.-B.; Dedieu, G. Detection of Flavescence Dorée Grapevine Disease Using Unmanned Aerial Vehicle (UAV) Multispectral Imagery. Remote Sens. 2017, 9, 308. [Google Scholar] [CrossRef] [Green Version]
  19. Pádua, L.; Marques, P.; Adão, T.; Guimarães, N.; Sousa, A.; Peres, E.; Sousa, J.J. Vineyard Variability Analysis through UAV-Based Vigour Maps to Assess Climate Change Impacts. Agronomy 2019, 9, 581. [Google Scholar] [CrossRef] [Green Version]
  20. Barajas, E.; Álvarez, S.; Fernández, E.; Vélez, S.; Rubio, J.A.; Martín, H. Sentinel-2 Satellite Imagery for Agronomic and Quality Variability Assessment of Pistachio (Pistacia vera L.). Sustainability 2020, 12, 8437. [Google Scholar] [CrossRef]
  21. Maes, W.H.; Steppe, K. Perspectives for Remote Sensing with Unmanned Aerial Vehicles in Precision Agriculture. Trends Plant Sci. 2019, 24, 152–164. [Google Scholar] [CrossRef]
  22. Kim, J.; Kim, S.; Ju, C.; Son, H.I. Unmanned Aerial Vehicles in Agriculture: A Review of Perspective of Platform, Control, and Applications. IEEE Access 2019, 7, 105100–105115. [Google Scholar] [CrossRef]
  23. Xie, C.; Yang, C. A Review on Plant High-Throughput Phenotyping Traits Using UAV-Based Sensors. Comput. Electron. Agric. 2020, 178, 105731. [Google Scholar] [CrossRef]
  24. Matese, A.; Di Gennaro, S. Practical Applications of a Multisensor UAV Platform Based on Multispectral, Thermal and RGB High Resolution Images in Precision Viticulture. Agriculture 2018, 8, 116. [Google Scholar] [CrossRef] [Green Version]
  25. García-Fernández, M.; Sanz-Ablanedo, E.; Pereira-Obaya, D.; Rodríguez-Pérez, J.R. Vineyard Pruning Weight Prediction Using 3D Point Clouds Generated from UAV Imagery and Structure from Motion Photogrammetry. Agronomy 2021, 11, 2489. [Google Scholar] [CrossRef]
  26. Pagliai, A.; Ammoniaci, M.; Sarri, D.; Lisci, R.; Perria, R.; Vieri, M.; D’Arcangelo, M.E.M.; Storchi, P.; Kartsiotis, S.-P. Comparison of Aerial and Ground 3D Point Clouds for Canopy Size Assessment in Precision Viticulture. Remote Sens. 2022, 14, 1145. [Google Scholar] [CrossRef]
  27. Korhonen, L.; Vauhkonen, J.; Virolainen, A.; Hovi, A.; Korpela, I. Estimation of Tree Crown Volume from Airborne Lidar Data Using Computational Geometry. Int. J. Remote Sens. 2013, 34, 7236–7248. [Google Scholar] [CrossRef]
  28. Seidel, D.; Fleck, S.; Leuschner, C.; Hammett, T. Review of Ground-Based Methods to Measure the Distribution of Biomass in Forest Canopies. Ann. For. Sci. 2011, 68, 225–244. [Google Scholar] [CrossRef] [Green Version]
  29. Eltner, A.; Hoffmeister, D.; Kaiser, A.; Karrasch, P.; Klingbeil, L.; Stöcker, C.; Rovere, A. UAVs for the Environmental Sciences: Methods and Applications; WBG Academic: Darmstadt, Germany, 2022; ISBN 978-3-534-40588-6. [Google Scholar]
  30. Haboudane, D. Hyperspectral Vegetation Indices and Novel Algorithms for Predicting Green LAI of Crop Canopies: Modeling and Validation in the Context of Precision Agriculture. Remote Sens. Environ. 2004, 90, 337–352. [Google Scholar] [CrossRef]
  31. Towers, P.C.; Strever, A.; Poblete-Echeverría, C. Comparison of Vegetation Indices for Leaf Area Index Estimation in Vertical Shoot Positioned Vine Canopies with and without Grenbiule Hail-Protection Netting. Remote Sens. 2019, 11, 1073. [Google Scholar] [CrossRef] [Green Version]
  32. Vélez, S.; Barajas, E.; Rubio, J.A.; Vacas, R.; Poblete-Echeverría, C. Effect of Missing Vines on Total Leaf Area Determined by NDVI Calculated from Sentinel Satellite Data: Progressive Vine Removal Experiments. Appl. Sci. 2020, 10, 3612. [Google Scholar] [CrossRef]
  33. Giovos, R.; Tassopoulos, D.; Kalivas, D.; Lougkos, N.; Priovolou, A. Remote Sensing Vegetation Indices in Viticulture: A Critical Review. Agriculture 2021, 11, 457. [Google Scholar] [CrossRef]
  34. Hall, A.; Louis, J.P.; Lamb, D.W. Low-Resolution Remotely Sensed Images of Winegrape Vineyards Map Spatial Variability in Planimetric Canopy Area Instead of Leaf Area Index. Aust. J. Grape Wine Res. 2008, 14, 9–17. [Google Scholar] [CrossRef]
  35. Aboutalebi, M.; Torres-Rua, A.F.; Kustas, W.P.; Nieto, H.; Coopmans, C.; McKee, M. Assessment of Different Methods for Shadow Detection in High-Resolution Optical Imagery and Evaluation of Shadow Impact on Calculation of NDVI, and Evapotranspiration. Irrig. Sci. 2019, 37, 407–429. [Google Scholar] [CrossRef]
  36. Jiang, H.; Wang, S.; Cao, X.; Yang, C.; Zhang, Z.; Wang, X. A Shadow- Eliminated Vegetation Index (SEVI) for Removal of Self and Cast Shadow Effects on Vegetation in Rugged Terrains. Int. J. Digit. Earth 2019, 12, 1013–1029. [Google Scholar] [CrossRef]
  37. Vélez, S.; Poblete-Echeverría, C.; Rubio, J.A.; Vacas, R.; Barajas, E. Estimation of Leaf Area Index in Vineyards by Analysing Projected Shadows Using UAV Imagery. OENO One 2021, 55, 159–180. [Google Scholar] [CrossRef]
  38. Baeza, P.; Sánchez-De-Miguel, P.; Lissarrague, J.R. Radiation Balance in Vineyards. In Methodologies and Results in Grapevine Research; Delrot, S., Medrano, H., Or, E., Bavaresco, L., Grando, S., Eds.; Springer: Dordrecht, The Netherlands, 2010; pp. 21–29. ISBN 978-90-481-9282-3. [Google Scholar]
  39. Ferguson, L.; Haviland, D.R. Pistachio Production Manual; University of California: Davis, CA, USA, 2016; ISBN 1-60107-877-3. [Google Scholar]
  40. Hamilton, G.J. The Dependence of Volume Increment of Individual Trees on Dominance, Crown Dimensions, and Competition. Forestry 1969, 42, 133–144. [Google Scholar] [CrossRef]
  41. Caruso, G.; Zarco-Tejada, P.J.; González-Dugo, V.; Moriondo, M.; Tozzini, L.; Palai, G.; Rallo, G.; Hornero, A.; Primicerio, J.; Gucci, R. High-Resolution Imagery Acquired from an Unmanned Platform to Estimate Biophysical and Geometrical Parameters of Olive Trees under Different Irrigation Regimes. PLoS ONE 2019, 14, e0210804. [Google Scholar] [CrossRef] [Green Version]
  42. Gómez-Gálvez, F.J.; Pérez-Mohedano, D.; de la Rosa-Navarro, R.; Belaj, A. High-Throughput Analysis of the Canopy Traits in the Worldwide Olive Germplasm Bank of Córdoba Using Very High-Resolution Imagery Acquired from Unmanned Aerial Vehicle (UAV). Sci. Hortic. 2021, 278, 109851. [Google Scholar] [CrossRef]
  43. Sesar Joint Undertaking. European Drones Outlook Study: Unlocking the Value for Europe; Publications Office: London, UK, 2017. [Google Scholar]
  44. DJI Sciences and Technologies Ltd. Agricultural Drone Industry Insights Report (2021). Available online: https://www.dji.com/newsroom/news/agricultural-drone-industry-insights-report-2021 (accessed on 7 October 2022).
  45. Vélez, S.; Vacas, R.; Martín, H.; Ruano-Rosa, D.; Álvarez, S. High-Resolution UAV RGB Imagery Dataset for Precision Agriculture and 3D Photogrammetric Reconstruction Captured over a Pistachio Orchard (Pistacia vera L.) in Spain. Data 2022, 7, 157. [Google Scholar] [CrossRef]
  46. Meeus, J. Astronomical Algorithms, 2nd ed.; Willmann-Bell: Richmond, VA, USA, 1998; ISBN 978-0-943396-61-3. [Google Scholar]
  47. Kuhn, M. Building Predictive Models in R Using the Caret Package. J. Stat. Softw. 2008, 28, 5. [Google Scholar] [CrossRef] [Green Version]
  48. Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4. Geosci. Model Dev. 2015, 8, 1991–2007. [Google Scholar] [CrossRef] [Green Version]
  49. Li, J. Assessing the Accuracy of Predictive Models for Numerical Data: Not r nor R2, Why Not? Then What? PLoS ONE 2017, 12, e0183250. [Google Scholar] [CrossRef] [Green Version]
  50. Wagenmakers, P.S.; Callesen, O. Light Distribution in Apple Orchard Systems in Relation to Production and Fruit Quality. J. Hortic. Sci. 1995, 70, 935–948. [Google Scholar] [CrossRef]
  51. Fleck, S.; Mölder, I.; Jacob, M.; Gebauer, T.; Jungkunst, H.F.; Leuschner, C. Comparison of Conventional Eight-Point Crown Projections with LIDAR-Based Virtual Crown Projections in a Temperate Old-Growth Forest. Ann. For. Sci. 2011, 68, 1173–1185. [Google Scholar] [CrossRef]
  52. Barber, C.B.; Dobkin, D.P.; Huhdanpaa, H. The Quickhull Algorithm for Convex Hulls. ACM Trans. Math. Softw. 1996, 22, 469–483. [Google Scholar] [CrossRef] [Green Version]
  53. Yan, Z.; Liu, R.; Cheng, L.; Zhou, X.; Ruan, X.; Xiao, Y. A Concave Hull Methodology for Calculating the Crown Volume of Individual Trees Based on Vehicle-Borne LiDAR Data. Remote Sens. 2019, 11, 623. [Google Scholar] [CrossRef] [Green Version]
  54. Wang, D.; Song, H.; Tie, Z.; Zhang, W.; He, D. Recognition and Localization of Occluded Apples Using K-Means Clustering Algorithm and Convex Hull Theory: A Comparison. Multimed. Tools Appl. 2016, 75, 3177–3198. [Google Scholar] [CrossRef]
  55. Zhu, Z.; Kleinn, C.; Nölke, N. Towards Tree Green Crown Volume: A Methodological Approach Using Terrestrial Laser Scanning. Remote Sens. 2020, 12, 1841. [Google Scholar] [CrossRef]
  56. Lin, W.; Meng, Y.; Qiu, Z.; Zhang, S.; Wu, J. Measurement and Calculation of Crown Projection Area and Crown Volume of Individual Trees Based on 3D Laser-Scanned Point-Cloud Data. Int. J. Remote Sens. 2017, 38, 1083–1100. [Google Scholar] [CrossRef]
  57. Raven, P.H.; Evert, R.F.; Eichhorn, S.E. Biology of Plants, 8th ed.; W.H. Freeman and Company Publishers: New York, NY, USA, 2013; ISBN 978-1-4292-1961-7. [Google Scholar]
  58. Keller, M. The Science of Grapevines: Anatomy and Physiology, 2nd ed.; Elsevier: Amsterdam, The Netherlands, 2015; ISBN 978-0-12-419987-3. [Google Scholar]
  59. Faridhouseini, A.; Mianabadi, A.; Bannayan, M.; Alizadeh, A. Lidar Remote Sensing for Forestry and Terrestrial Applications. Int. J. Appl. Environ. Sci. 2011, 74, 99–114. [Google Scholar]
  60. Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Tortia, C.; Mania, E.; Guidoni, S.; Gay, P. Leaf Area Index Evaluation in Vineyards Using 3D Point Clouds from UAV Imagery. Precis. Agric. 2020, 21, 881–896. [Google Scholar] [CrossRef] [Green Version]
  61. Jurado, J.M.; Pádua, L.; Feito, F.R.; Sousa, J.J. Automatic Grapevine Trunk Detection on UAV-Based Point Cloud. Remote Sens. 2020, 12, 3043. [Google Scholar] [CrossRef]
  62. Jurado, J.M.; Ortega, L.; Cubillas, J.J.; Feito, F.R. Multispectral Mapping on 3D Models and Multi-Temporal Monitoring for Individual Characterization of Olive Trees. Remote Sens. 2020, 12, 1106. [Google Scholar] [CrossRef] [Green Version]
  63. Ghanbari Parmehr, E.; Amati, M. Individual Tree Canopy Parameters Estimation Using UAV-Based Photogrammetric and LiDAR Point Clouds in an Urban Park. Remote Sens. 2021, 13, 2062. [Google Scholar] [CrossRef]
  64. Xie, X.; Yang, W.; Cao, G.; Yang, J.; Zhao, Z.; Chen, S.; Liao, Q.; Shi, G. Real-Time Vehicle Detection from UAV Imagery. In Proceedings of the 2018 IEEE Fourth International Conference on Multimedia Big Data (BigMM), Xi’an, China, 13–16 September 2018; pp. 1–5. [Google Scholar]
  65. Sheppard, C.; Rahnemoonfar, M. Real-Time Scene Understanding for UAV Imagery Based on Deep Convolutional Neural Networks. In Proceedings of the 2017 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017; pp. 2243–2246. [Google Scholar]
  66. Mishra, P.; Sadeh, R.; Bino, E.; Polder, G.; Boer, M.P.; Rutledge, D.N.; Herrmann, I. Complementary Chemometrics and Deep Learning for Semantic Segmentation of Tall and Wide Visible and Near-Infrared Spectral Images of Plants. Comput. Electron. Agric. 2021, 186, 106226. [Google Scholar] [CrossRef]
  67. Chen, C.J.; Zhang, Z. GRID: A Python Package for Field Plot Phenotyping Using Aerial Images. Remote Sens. 2020, 12, 1697. [Google Scholar] [CrossRef]
  68. Mardanisamani, S.; Eramian, M. Segmentation of Vegetation and Microplots in Aerial Agriculture Images: A Survey. Plant Phenome J. 2022, 5, 42. [Google Scholar] [CrossRef]
Figure 1. Location of the pistachio orchard in the ‘Castilla y León’ region, Spain. Coordinates: X: 341450.3, Y: 4589731.8; ETRS89/UTM zone 30N. Red polygon: UAV flight area. Blue polygon: area where the ground-truth measures were taken.
Figure 2. Weather variables over the period from 1 January to 29 July 2021 at the experimental site.
Figure 3. Workflow. Both methodologies employ a grid based on the planting pattern and the solar azimuth to identify each tree, assigning a plant id from the polygon id. The orthomosaic is generated using RGB images, and segmented using random forest algorithm into three classes: vegetation, soil and shadows. The product is analyzed in two ways: (a) using a raster approach (pixel-based classification), analyzing the number of pixels of each class within each polygon and (b) an OBIA (Object-Based Image Analysis) approach, enveloping the pixels within several geometries.
Figure 4. (a) Ground-truth measurements. The canopy’s height, length and width are a, b and c, respectively. h is the distance from the canopy to the ground. (b) Pistachio crown shape, considered as an ellipsoid.
Figure 5. (a) DJI Phantom 4 Unmanned Aerial Vehicle (UAV), (b) UAV flight over the pistachio trees.
Figure 6. (a) Planned flight mission. Yellow line: unmanned aerial vehicle (UAV) path. (b) Azimuth and relationship between sun elevation and shadows, adapted from Vélez et al. [37].
Figure 7. Position of the sun throughout the day at the experimental site. Red polygon: UAV flight area.
Figure 8. Random forest classification results. White color: vegetation; black color: shadows; grey color: soil. Blue polygon: area where the ground-truth measures were taken.
Figure 9. Grid generated from the planting pattern and solar azimuth angles (red polygons), used to mask the reclassified image (background), identify each tree, and calculate per-plant statistics.
Figure 10. OBIA approach in the shadow plane. Several types of geometries were used to envelop the area detected in the images (bounding boxes, oriented rectangles, enclosing circles and convex hulls).
Figure 11. OBIA approach in the vegetation plane. Several types of geometries were used to envelop the area detected in the images (bounding boxes, oriented rectangles, enclosing circles and convex hulls).
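Two of the enveloping geometries used in the OBIA approach can be computed without any imaging library. The sketch below is an illustration, not the authors' code (which could equally use OpenCV routines such as cv2.boundingRect and cv2.convexHull): it derives the axis-aligned bounding box and the convex hull of a toy set of object pixel coordinates via Andrew's monotone chain algorithm.

```python
# Envelop one detected object (vegetation or shadow pixels) with a bounding
# box and a convex hull; point coordinates are illustrative.

def bounding_box(points):
    """Axis-aligned bounding box as (xmin, ymin, xmax, ymax)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def convex_hull(points):
    """Hull vertices in counter-clockwise order (Andrew's monotone chain)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):  # z-component of (a-o) x (b-o)
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

pixels = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1), (1, 2)]  # object pixels
print(bounding_box(pixels))  # (0, 0, 4, 3)
print(convex_hull(pixels))   # [(0, 0), (4, 0), (4, 3), (0, 3)]
```

The oriented (minimum-area) rectangle and minimum enclosing circle follow from the hull vertices, e.g. via rotating calipers.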
Figure 12. Scatter plot and linear fit between volume estimations from ground-truth data, assuming the tree crown is a sphere or an ellipsoid. The dashed line is the fitted linear function passing through the origin; the solid line is the 1:1 line.
Figure 13. Scatter plot and linear fit of the estimated volume vs. observed values (ground-truth), assuming the crown is (a) a sphere or (b) an ellipsoid. Estimated values obtained using the raster approach with random forest classification. Linear regression models with p < 0.05. The dashed line is the fitted linear function passing through the origin; the solid line is the 1:1 line.
Figure 14. Scatter plot and linear fit of the estimated volume vs. observed values (ground-truth), assuming the crown is an ellipsoid (a,b) or a sphere (c,d). Estimated values: (a) bounding box, (b) oriented rectangle, (c) circle, and (d) convex hull. Linear regression models with p < 0.05. The dashed line is the fitted linear function passing through the origin; the solid line is the 1:1 line.
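The regressions in Figures 12–14 are linear fits forced through the origin, for which the least-squares slope reduces to b = Σxy / Σx². A minimal sketch with hypothetical observed/estimated volume pairs (note that R² conventions differ for no-intercept models; the centred definition is used here):

```python
def fit_through_origin(x, y):
    """Least-squares slope for y = b*x (no intercept) and centred R^2."""
    b = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
    ss_res = sum((yi - b * xi) ** 2 for xi, yi in zip(x, y))
    y_mean = sum(y) / len(y)
    ss_tot = sum((yi - y_mean) ** 2 for yi in y)
    return b, 1 - ss_res / ss_tot

# Hypothetical observed vs. estimated canopy volumes (m^3), for illustration.
observed  = [2.4, 3.5, 5.4, 7.8, 9.2, 10.7]
estimated = [2.6, 3.3, 5.8, 7.5, 9.6, 10.4]
slope, r2 = fit_through_origin(observed, estimated)
print(round(slope, 2), round(r2, 2))  # slope near 1, R^2 near 0.99
```

A slope close to 1 combined with a high R² indicates agreement with the 1:1 line, which is how the figures compare each estimation approach against the ground truth.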
Table 1. Solar azimuth, elevation, and shadow ratio, calculated using the NOAA Solar Calculator for the flight date (29 July 2021) at flight time (11:23, local time), solar noon, sunset, and sunrise.

| Parameter | Flight Time (11:23) | Solar Noon | Sunset | Sunrise |
|---|---|---|---|---|
| Azimuth (α) | 106.24° | 180° | 295.94° | 63.94° |
| Elevation (β) | 45.0° | 67.18° | −0.46° | −0.34° |
| Shadow ratio | 1:1 | 1:0.42 | Not visible | Not visible |
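The shadow ratios in Table 1 follow directly from the solar elevation β: a vertical object of unit height casts a shadow of length 1/tan(β). A quick check against the tabulated values (the helper function and its below-horizon handling are illustrative):

```python
import math

def shadow_ratio(elevation_deg):
    """Shadow length per unit object height: 1 / tan(beta)."""
    if elevation_deg <= 0:
        return None  # sun at or below the horizon: no shadow visible
    return 1 / math.tan(math.radians(elevation_deg))

print(round(shadow_ratio(45.0), 2))   # 1.0  -> ratio 1:1 at flight time
print(round(shadow_ratio(67.18), 2))  # 0.42 -> ratio 1:0.42 at solar noon
print(shadow_ratio(-0.46))            # None -> not visible at sunset
```

This is why the flight was scheduled for 11:23: at β = 45° the shadow length equals the tree height, which simplifies recovering the vertical dimension from the shadow area.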
Table 2. Ground-truth values. h: distance from the canopy to the ground; a: canopy height; b: canopy length; c: canopy width; Av.d.: average diameter; V.s.: volume of the sphere; V.e.: volume of the ellipsoid.

| Tree | h (m) | a (m) | b (m) | c (m) | Av.d. (m) | V.s. (m³) | V.e. (m³) |
|---|---|---|---|---|---|---|---|
| 1 | 1.02 | 1.86 | 2.81 | 2.86 | 2.51 | 8.28 | 7.83 |
| 2 | 1.1 | 1.83 | 2.92 | 2.77 | 2.51 | 8.25 | 7.75 |
| 3 | 0.98 | 1.64 | 2.86 | 2.65 | 2.38 | 7.09 | 6.51 |
| 4 | 1.01 | 1.36 | 1.92 | 2.06 | 1.78 | 2.95 | 2.82 |
| 5 | 1.02 | 1.79 | 2.98 | 2.76 | 2.51 | 8.28 | 7.71 |
| 6 | 0.9 | 1.56 | 2.68 | 2.45 | 2.23 | 5.81 | 5.36 |
| 7 | 0.99 | 1.96 | 2.8 | 2.99 | 2.58 | 9.03 | 8.59 |
| 8 | 0.97 | 1.38 | 2.27 | 2.36 | 2.00 | 4.21 | 3.87 |
| 9 | 0.98 | 1.91 | 2.78 | 3.32 | 2.67 | 9.97 | 9.23 |
| 10 | 0.99 | 1.26 | 1.86 | 1.97 | 1.70 | 2.56 | 2.42 |
| 11 | 0.95 | 1.21 | 1.98 | 2.12 | 1.77 | 2.90 | 2.66 |
| 12 | 0.81 | 1.08 | 1.6 | 1.88 | 1.52 | 1.84 | 1.70 |
| 13 | 0.97 | 1.26 | 1.74 | 2.07 | 1.69 | 2.53 | 2.38 |
| 14 | 0.81 | 1.34 | 2.11 | 2.34 | 1.93 | 3.76 | 3.46 |
| 15 | 1.02 | 1.43 | 2.28 | 2.18 | 1.96 | 3.96 | 3.72 |
| 16 | 0.92 | 1.5 | 2.07 | 1.96 | 1.84 | 3.28 | 3.19 |
| 17 | 0.88 | 1.82 | 2.51 | 3.46 | 2.60 | 9.17 | 8.28 |
| 18 | 0.89 | 1.52 | 2.15 | 1.95 | 1.87 | 3.44 | 3.34 |
| 19 | 0.9 | 1.94 | 3.42 | 2.92 | 2.76 | 11.01 | 10.14 |
| 20 | 0.9 | 1.88 | 3.32 | 3.27 | 2.82 | 11.78 | 10.69 |
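The V.s. and V.e. columns of Table 2 follow from the measured axes: the ellipsoid volume is V = (4/3)π(a/2)(b/2)(c/2) = πabc/6, and the sphere volume uses the average of the three axes as its diameter. A quick check against tree 1 (the helper names are illustrative):

```python
import math

def ellipsoid_volume(a, b, c):
    """Ellipsoid with full axes a, b, c: V = pi*a*b*c/6."""
    return math.pi * a * b * c / 6

def sphere_volume(a, b, c):
    """Sphere whose diameter is the average of the three axes."""
    d = (a + b + c) / 3
    return math.pi * d ** 3 / 6

# Tree 1 from Table 2: a = 1.86, b = 2.81, c = 2.86 m.
print(round(sphere_volume(1.86, 2.81, 2.86), 2))     # 8.28 (V.s.)
print(round(ellipsoid_volume(1.86, 2.81, 2.86), 2))  # 7.83 (V.e.)
```

Both values match the tabulated V.s. and V.e. for tree 1, and the same formulas reproduce the remaining rows.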
Share and Cite

MDPI and ACS Style

Vélez, S.; Vacas, R.; Martín, H.; Ruano-Rosa, D.; Álvarez, S. A Novel Technique Using Planar Area and Ground Shadows Calculated from UAV RGB Imagery to Estimate Pistachio Tree (Pistacia vera L.) Canopy Volume. Remote Sens. 2022, 14, 6006. https://doi.org/10.3390/rs14236006