Article

Automated Inventory of Broadleaf Tree Plantations with UAS Imagery

Aishwarya Chandrasekaran, Guofan Shao, Songlin Fei, Zachary Miller and Joseph Hupy
1 Department of Forestry and Natural Resources, Purdue University, 715 West State Street, West Lafayette, IN 47906, USA
2 School of Aviation Technology, Purdue University, 715 West State Street, West Lafayette, IN 47906, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(8), 1931; https://doi.org/10.3390/rs14081931
Submission received: 7 March 2022 / Revised: 6 April 2022 / Accepted: 14 April 2022 / Published: 16 April 2022
(This article belongs to the Special Issue UAV Applications for Forest Management: Wood Volume, Biomass, Mapping)

Abstract

With the increased availability of unmanned aerial system (UAS) imagery, digitalized forest inventory has gained prominence in recent years. This paper presents a methodology for automated measurement of tree height and crown area in two broadleaf tree plantations of different species and ages using two different UAS platforms. Using structure from motion (SfM), we generated canopy height models (CHMs) for each broadleaf plantation in Indiana, USA. From the CHMs, we calculated individual tree parameters automatically through an open-source web tool developed with the Shiny R package and assessed their accuracy against field measurements. Our analysis shows higher tree measurement accuracy with the datasets derived from the multi-rotor platform (M600) than with the fixed-wing platform (Bramor). The results show that our automated method could identify individual trees (F-score > 90%) and estimate tree biometrics (root mean square error < 1.2 m for height and < 1 m2 for crown area) with reasonably good accuracy. Moreover, our automated tool can efficiently calculate tree-level biometric estimates for 4600 trees within 30 min based on a CHM from UAS-SfM-derived images. This automated UAS imagery approach for tree-level forest measurements will benefit landowners and forest managers by streamlining their broadleaf forest measurement and monitoring efforts.

1. Introduction

Forest inventory provides critical information for sustainable forest management [1]. Rapidly changing forest conditions require remote sensing methods to routinely monitor and inventory forests [2,3]. Satellite and aerial remote sensing have played a major role in forest mapping and inventory over the past decades, but their ability to map individual trees is restricted by factors such as low spatial resolution, cloud cover, and acquisition timing [4,5,6,7]. Recent advances in unmanned aerial systems (UASs) and digital aerial photogrammetry (DAP) have made routine, digitalized tree measurements and forest inventories possible [8,9,10,11]. A UAS platform can be coupled with RGB, multispectral, or LiDAR sensors, offering flexibility in data collection [12,13,14,15,16]. This mode of data acquisition is cost-effective and provides imagery of high spatial and temporal resolution for precision monitoring and inventorying of forests [16,17].
UAS platforms are broadly categorized as fixed-wing or multi-rotor. Plot area and the availability of open space for take-off and landing are among the main considerations when choosing a UAS platform [10,12,16]. Multi-rotor platforms are affordable and face fewer challenges in forested areas than fixed-wing ones [17,18,19]. The take-off and landing of a multi-rotor platform do not require broad open space [20,21]. However, a fixed-wing platform can image larger areas in a relatively short period of time, thereby reducing survey time [22]. Both types of platforms are widely used in forestry (Appendix A).
Accurate tree-level information, such as tree height, crown area, and crown closure, is essential to derive datasets useful for disease mapping [23], invasive species mapping [24,25], forest fire monitoring [26,27], and estimating forest growth and health [28], structure [29], biomass [30], carbon stock [31], and forestland productivity [32,33,34]. A variety of algorithms have been proposed in the past two decades for individual tree delineation (ITD). Most of the previous work employed local maxima filtering on a canopy height model (CHM) using a fixed window to detect individual tree points [13,16,35,36,37]. The local maxima filtering approach assumes that pixels with higher values correspond to treetops, meaning that treetops can be delineated by identifying the brightest pixel of a tree crown in the CHM [37,38]. To delineate the crown area, previous studies employed a variety of region growing [35,39], valley following [40], template matching [41], and watershed segmentation algorithms [42]. These approaches usually worked best for coniferous trees with their distinct apex but had lower accuracies in broadleaf forests [43]. Broadleaf trees display asymmetrical, fuzzy, and branching crowns, making traditional fixed-window-based algorithms perform poorly [43]. The accuracy of ITD algorithms depends largely upon the appropriate selection of the window size. Larger windows might accommodate multiple tree crowns, while smaller ones may not include any tree apex [44,45]. Without prior in situ knowledge, selecting the right window can be a tedious and challenging task [35]. This limitation can be overcome by using a variable window filter [46] coupled with the marker-controlled watershed segmentation (MCWS) method, as the size of the moving window varies according to the brightest pixel, assuming that taller trees have larger crowns in a broadleaf forest. This technique also uses a minimum tree height as a filter to omit codominant branches and underlying bushes, thereby reducing oversegmentation [47].
Many studies have employed ITD algorithms to segment trees successfully. Birdal et al. [13] employed a 3 × 3 local maxima framework to obtain tree heights in an urban coniferous setting and achieved a 94% correlation with ground measurements. A similar study by Bonnet et al. [17] applied local maxima detection to different photogrammetric products, such as individual rectified, orthorectified, and orthomosaic images, and observed an 84% correlation with ground measurements (Appendix A). That study supported local maxima detection as a robust and versatile method but did not address the limited ability of a fixed window to delineate complex tree canopies in mixed forest areas. Another study, by Carr and Slyder [14], explored a temperate deciduous forest during the leaf-off season by manually identifying and segmenting trees using a region growing approach, and achieved a segmentation accuracy of 90.9%. Although most of these works employed local maxima filtering with a fixed window size, a few studies have explored a variable window filter to accommodate the varying crown size and spacing of trees [47,48,49]. These studies on individual tree measurements concentrated predominantly on semiautomated techniques using local maxima for mixed coniferous stands in boreal forests [15,22,38,50] (Appendix A).
Despite considerable research with UAS applications in forestry, UAS technology has not been consistently successful for broadleaf trees, primarily because of varying fuzzy crown structures. Here, we investigate the applicability of a UAS-based automated tree-level information extraction framework using a variable window filter with MCWS in a fairly closed canopy of oak and walnut plantations of differing ages. The main objectives are the following:
(1) To test the accuracy of ITD with UAS-derived CHMs.
(2) To test the accuracy of biometric measurements (tree height and crown area) in two broadleaf tree plantations of different species and ages using datasets from two different UAS platforms. In addition, to facilitate the application of automated tree-level measurements using UAS imagery, we developed a web-based application for deriving tree height and crown area for broadleaf tree plantations (https://feilab.shinyapps.io/Crown/ accessed on 12 December 2021).

2. Materials and Methods

2.1. Study Area and UAS Image Acquisition

The research was conducted in two broadleaf tree plantations at Martell Forest in West Lafayette, Indiana, USA (Figure 1a and Appendix B). One was a planted forest consisting of red oak (Quercus rubra) and bur oak (Q. macrocarpa) trees of varying ages (10–12 years). With a total area of 7 ha (17 ac), the stand was planted on three plots identified as 119, 115, and 112. Plots 119 and 115 contain red oaks bordered by bur oaks planted in 50 rows × 22 columns with a spacing of 4.8 m × 2.4 m. Plot 112 contains alternating red oak and bur oak trees in 50 rows × 50 columns. The main reason for choosing this study area was to experiment on broadleaf trees with different degrees of canopy overlap while ground measurements were still feasible. The other plantation consists of black walnut (Juglans nigra) planted in the 1960s (Figure 1b). We selected 224 trees (red dots in Figure 1a) for ground measurement using random sampling in R [51]. For each tree, we measured tree height using a hypsometer (Haglöf Vertex IV, www.haglof.se, accessed on 12 December 2021) and crown diameter using a diameter tape (Table 1). Tree locations and measurements were stored as a point feature class in ArcGIS 10.6 [52].
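The sampling step itself can be reproduced with a short base R sketch; the tree identifiers below are hypothetical placeholders, since the plantation records are not reproduced here.

```r
# Minimal sketch of simple random sampling for ground measurement.
# The IDs are hypothetical placeholders for the planted-tree records.
set.seed(2021)                                   # reproducible draw
oak_ids <- sprintf("oak_%04d", seq_len(4668))    # one ID per planted oak
field_sample <- sample(oak_ids, size = 224)      # 224 trees selected for measurement
head(field_sample)
```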
Image acquisition for these plantations was carried out using two UAS platforms: a fixed-wing C-Astral Bramor PPX and a multi-rotor DJI M600. Both systems were equipped with a post-processing kinematic (PPK) GPS that corrects geolocation errors after collection [53]. The flights were conducted at 2:00 p.m. (EST) to ensure a consistent sun angle, with wind speeds below 10 knots. Cloud cover was zero over the study area during the flights. We used multiple platforms to test the reliability and consistency of the algorithm in extracting tree-level information from different datasets. A detailed summary of the platform and image characteristics, processing times, and output accuracies is presented in Table 2.

2.2. UAS Data Processing

All raw images were post-processed by correcting the PPK rover log file against an established fixed location from the Continuously Operating Reference Station (CORS) network (Site ID: INWL) using EZSurv software (https://effigis.com/en/solutions/onpoz/ezsurv/ accessed on 10 October 2021). The corrected imagery was then processed in Pix4D software. The workflow includes three major stages: initial processing, point cloud generation, and building a digital surface model (DSM) and orthomosaic (Figure 2). The initial processing stage comprises camera location optimization and internal orientation checks using the PPK data. During this stage, the algorithm identifies image features as tie points and performs dense stereo matching for image alignment. After initial processing, a dense 3D point cloud was generated through image registration. These two stages require minimal to no manual input. Noise filtering and surface smoothing were enabled when filtering the DSM, and the raster DSM was interpolated from the point cloud using the triangulation method [54]. From the extracted point cloud, a digital elevation model (DEM) was derived by separating ground points from tree points using Pix4D interpolation (Appendix B).

2.3. Individual Crown Detection and Height Measurement

To delineate accurate tree locations, we subtracted the DEM from the DSM to generate a CHM and resampled the CHM to 5 cm using the nearest-neighbor resampling method in R 3.6.2. The term canopy refers to the upper layer of a forest formed by tree crowns [13]; the CHM used in this study is therefore a measure of the aboveground height of trees. We also applied a 3 × 3 Gaussian filter to the generated CHM to remove negative values and keep the workflow consistent. A Gaussian filter is a low-pass filter used to remove extremities along edges and boundaries, thereby reducing high-frequency components [17].
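A minimal sketch of this CHM preparation with the raster package is shown below; the file names and the exact 3 × 3 kernel weights are illustrative assumptions, not the processing script used in the study.

```r
library(raster)

dsm <- raster("dsm.tif")   # hypothetical file names for the Pix4D outputs
dem <- raster("dem.tif")

# Canopy height model: aboveground height = surface model minus terrain model
chm <- dsm - dem

# Resample to a 5 cm grid with nearest-neighbour resampling, as described above
target <- raster(ext = extent(chm), resolution = 0.05, crs = crs(chm))
chm5   <- resample(chm, target, method = "ngb")

# 3 x 3 Gaussian-style kernel (weights chosen for illustration), normalised to sum to 1
k <- matrix(c(1, 2, 1,
              2, 4, 2,
              1, 2, 1), nrow = 3)
k <- k / sum(k)
chm_smooth <- focal(chm5, w = k, pad = TRUE, padValue = 0)

# Remove negative heights left over from interpolation artefacts
chm_smooth[chm_smooth < 0] <- 0
```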
Dominant treetops on the smoothed CHM were detected using a variable window filter algorithm. A circular search window was used to match the generally circular crown shape in these plantations. The moving window marks a local maximum at each tree apex. Local maxima were selected based on a minimum height and on the relationship between a maximum's height and its distance to the nearest brighter pixel. Maxima that did not fulfill the following criteria were discarded [37], which minimizes false-positive treetops.
dm ≥ dmin + dprop × hm,  hm ≥ hmin
where dmin and dprop are the selection parameters that define the variable window size, hmin is the minimum canopy height, hm is the maximum height within the search window, and dm is the distance to the nearest brighter pixel.
For this study, the selection parameters for the window size were set to 0.05 and 0.6 (dmin and dprop, respectively) to accommodate the small GSD. The minimum canopy height, hmin, was set to 3 m, meaning that any pixel below this value was not considered part of a crown [37,55]. This value avoided an increase in false positives resulting from understory vegetation pixels. The output of this filtering procedure is a spatially referenced point file indicating the location of individual treetops and their corresponding heights. With the identified treetop locations, we then employed MCWS to delineate individual crowns within the study area. The MCWS method treats a tree crown as an inverted watershed and uses markers within a window for segmentation. Neighboring pixels around each local maximum are given priority based on their gradient magnitude. A pixel is labeled with a neighboring marker if it exhibits the lowest gradient magnitude [44,56]. Non-labeled pixels, including newly added ones, are re-evaluated for association with a marker based on their magnitude. More pixels are associated with higher-value markers, assuming that taller trees have a wider crown spread. The crown diameter was then calculated from the crown area and compared with ground measurements obtained through the average crown spread method ((longest spread + longest cross spread)/2). A spatially referenced point file for treetops and a polygon file for the crown area were created for further analysis against the ground measurements. All image analyses were performed in R (ForestTools) [57], and the output files were assigned the WGS 84/UTM zone 16N coordinate system, allowing their use in ArcGIS 10.6 and other GIS software.
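These steps map closely onto the vwf() and mcws() functions of the ForestTools package used in this study. The sketch below is not the authors' code; it assumes that dmin and dprop translate directly into the window-radius function, and that a suitable CHM raster (chm_smooth) is already in memory (newer ForestTools releases expect a terra::SpatRaster and return sf objects).

```r
library(ForestTools)

# Variable window: the search radius grows linearly with pixel height, assuming
# the dmin/dprop values above map directly onto vwf()'s window function.
win_fun <- function(h) 0.05 + 0.6 * h

# Treetop detection: local maxima above the 3 m minimum canopy height.
treetops <- vwf(CHM = chm_smooth, winFun = win_fun, minHeight = 3)

# Marker-controlled watershed segmentation seeded by the detected treetops.
crowns <- mcws(treetops = treetops, CHM = chm_smooth,
               minHeight = 3, format = "polygons")

# Crown diameter derived from crown area, assuming an approximately circular crown.
crowns$crownDiameter <- 2 * sqrt(crowns$crownArea / pi)
```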
We evaluated the proposed workflow using three measures: the coefficient of determination (R2), the root mean square error (RMSE), and tree detection accuracy against ground points, to validate the derived tree locations and height-crown measures (Figure 3). Tree detection accuracy was determined by manual interpretation against reference tree points observed in the ground data. The total error of the derived tree heights and crown diameters was expressed as RMSE. A tree segment was considered a correct match if it contained a ground-observed tree position. Trees crossing the plot boundary were removed. The segmentation accuracy of individual crowns was assessed using the F-score, the tree detection rate or recall (rc), and the correctness of detected trees or precision (pr). These measures were calculated using the true positive (TP), false negative (FN), and false positive (FP) detections, which indicate perfect segmentation, undersegmentation, and oversegmentation, respectively [38,58]. The following equations were used to calculate these metrics [58,59] (Appendix C).
rc = TP / (TP + FN)
pr = TP / (TP + FP)
F-score = 2 × (rc × pr) / (rc + pr)
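A small helper, written for illustration rather than taken from the study's code, computes these detection metrics and the RMSE used for the biometric comparison:

```r
# Detection metrics from TP, FP, and FN counts, plus RMSE for height/crown comparisons.
detection_metrics <- function(tp, fp, fn) {
  rc <- tp / (tp + fn)                 # recall / tree detection rate
  pr <- tp / (tp + fp)                 # precision / correctness of detections
  f  <- 2 * rc * pr / (rc + pr)        # F-score
  c(recall = rc, precision = pr, f_score = f,
    omission = 1 - rc, commission = 1 - pr)
}

rmse <- function(observed, predicted) sqrt(mean((observed - predicted)^2, na.rm = TRUE))

# Example: counts for the M600-walnut 5 x 5 window in Table 3
round(detection_metrics(tp = 128, fp = 10, fn = 10), 3)
```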

2.4. Web-Based Automation

To allow users to take advantage of our tree-level information extraction algorithm, we used Shiny R to develop a web application that automates the individual tree detection and segmentation processes. Users can input height models such as a DSM and DEM through this interactive user interface developed with Shiny R [60]. The overall workflow described in the "Individual Crown Detection and Height Measurement" section is implemented in the backend of the Shiny server. User-provided inputs are processed, and the crown-segmented shapefile is made available for download (Appendix D).
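The general structure of such an application can be illustrated with the minimal Shiny sketch below. It is not the published app (https://feilab.shinyapps.io/Crown/); the widget names, layout, and upload limit are assumptions made only to show how a DSM/DEM upload can drive the detection workflow.

```r
library(shiny)
library(raster)
library(ForestTools)

options(shiny.maxRequestSize = 100 * 1024^2)  # allow uploads up to ~100 MB (assumed limit)

ui <- fluidPage(
  titlePanel("Tree-level inventory from a UAS CHM (illustrative sketch)"),
  fileInput("dsm", "Upload DSM (GeoTIFF)", accept = ".tif"),
  fileInput("dem", "Upload DEM (GeoTIFF)", accept = ".tif"),
  numericInput("hmin", "Minimum tree height (m)", value = 3, min = 0),
  actionButton("run", "Detect trees"),
  tableOutput("summary")
)

server <- function(input, output, session) {
  result <- eventReactive(input$run, {
    req(input$dsm, input$dem)
    # CHM from the uploaded height models; their extents must match, as noted in the text
    chm <- raster(input$dsm$datapath) - raster(input$dem$datapath)
    ttops <- vwf(CHM = chm, winFun = function(h) 0.05 + 0.6 * h,
                 minHeight = input$hmin)
    data.frame(trees_detected = nrow(ttops),
               mean_height_m  = round(mean(ttops$height), 2))
  })
  output$summary <- renderTable(result())
}

shinyApp(ui, server)
```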

3. Results

Our automated analysis identified a total of 4449 trees (95.3%) of the 4668 ground trees in the oak plantation with the Bramor data and 4608 trees (98.7%) with the M600 data. The automated approach computed individual tree parameters for 4600 trees in approximately 10 min, whereas ground measurements of 240 trees took three people three days. The minimum and maximum heights derived from the Bramor dataset were 3.88 m and 12.98 m, respectively, with a mean of 8.66 m, and the crown diameter ranged from 1.10 m to 6.67 m, with a mean of 3.85 m. The red oak plantation was also surveyed with the DJI M600 platform, from which the derived tree height ranged from 3.4 m to 10.7 m and the crown diameter ranged from 1.1 m to 6.6 m. By repeating the same procedure in the relatively old, closed-canopy black walnut plantation, we detected 204 treetops (95.8%) of the 213 trees. The UAS-derived tree height ranged from 9.36 m to 35.81 m with a mean of 18.5 m, and the crown diameter ranged from 3.61 m to 8.85 m with a mean of 5.97 m.
With the fixed window filter, smaller window sizes produced higher commission error, whereas larger window sizes produced higher omission error (Figure 4a–d). Treetops detected using the variable window filter had lower commission and omission errors for both datasets (Bramor and M600) in both plantations (young red oak and mature black walnut). Among the fixed window filters, the 5 × 5 filter detected the most trees (80% of trees located), while the 3 × 3 filter had a high commission rate because it detected all possible maxima. Increasing the window size from 5 × 5 to 7 × 7 resulted in poorer ITD results for the younger oak plantation. The variable window filter achieved an ITD rate of 83% for the oak plantation (Figure 4e,f). Tree detection rates were also higher when using the variable window filter (83–95%). Detailed information on TP, FP, FN, recall, precision, omission, and commission errors is presented in Table 3.
Tree detection accuracy using the variable window filter coupled with the MCWS approach was reasonably high for the oak plantation (F-score = 0.91 with the Bramor; F-score = 0.93 with the M600), and the precision of individual tree detection was better with the M600 dataset (pr = 0.88 with the Bramor; pr = 0.90 with the M600), indicating the stability of a multi-rotor platform (Table 3). For the walnut plantation, tree detection accuracy was 0.95, while recall and precision were 0.95 and 0.96, respectively. The omission and commission errors were relatively low for both plantations with both datasets. The omission error for the treetops detected in the red oak plantation was 3% with the M600 data and 5% with the Bramor data. Likewise, the commission error for the oak plantation was 10% with the M600 dataset and 12% with the Bramor dataset. For the walnut plantation, both the commission (4%) and omission (5%) errors were considerably lower.
In general, UAS-based tree height and crown diameter estimates were highly correlated with ground-based measurements (Figure 5a,b). The correlations between UAS-based and ground measurements of tree height and crown diameter were higher with the Bramor dataset (0.93 and 0.78, respectively), while the adjusted coefficients of determination for the tree measurements from the M600 were relatively low (R2 = 0.67 for height; R2 = 0.78 for crown). The RMSE observed with the Bramor data was 0.727 m for tree height and 0.434 m for crown diameter, lower than the RMSE of tree height and crown diameter measured with the M600 data (RMSE = 1.4 m for height; RMSE = 1.2 m for crown). Similarly, the slope and intercept values for crown diameter were less than 1 for both platforms (Crn = 0.79 with the Bramor; Crn = 0.84 with the M600), indicating that either the automated procedure underestimated crown diameter or the field measurements overestimated it.
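For readers reproducing this comparison, the evaluation reduces to a simple linear regression and an RMSE between matched field and UAS values; the sketch below uses synthetic stand-in numbers purely to illustrate the calculation, not the study's measurements.

```r
# Illustration of the height accuracy assessment with synthetic example data.
set.seed(1)
field_height <- runif(224, 4, 13)                     # field-measured heights (m), simulated
uas_height   <- field_height + rnorm(224, sd = 0.7)   # UAS estimates with noise, simulated
fit <- lm(uas_height ~ field_height)
summary(fit)$adj.r.squared                            # adjusted R-squared
coef(fit)                                             # intercept and slope
sqrt(mean((uas_height - field_height)^2))             # RMSE (m)
```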

4. Discussion

In this study, we present an automated framework using MCWS to delineate individual trees and estimate tree height and crown area. The procedure was tested with UAS imagery collected over oak and walnut plantations of varying tree sizes and ages. The results were validated using TP, FP, FN, R2, and RMSE. We also provide a shiny app that can be utilized by non-UAS professionals to conduct automated broadleaf forest inventory.
Tree positions, heights, and crown areas estimated by the proposed framework were well correlated with ground measurements for both types of plantations (Figure 5, Table 3). Our results are comparable to the best results from previous studies using UAS for forest inventory (Appendix A). Fankhauser et al. (2018) [15] achieved an R2 of 0.82 and an RMSE of 2.92 m for tree height measurement in a boreal forest. Carr and Slyder (2018) [14] employed a manual approach to segment individual trees and measure tree height, obtaining an R2 of 0.82 and an RMSE of 1.06 m for a deciduous forest during the leaf-off season. Our proposed methodology achieved a higher correlation for tree height and crown diameter than other studies (R2 = 0.93 for height; R2 = 0.79 for crown) during the leaf-on season (Figure 5). The RMSE was less than a meter for both tree height and crown diameter (RMSE = 0.727 m for height; RMSE = 0.434 m for crown), owing to quality height models derived from high-precision, PPK-corrected UAS images, the use of a variable window filter, and MCWS for accurate segmentation. Using the proposed approach, this study detected 90% of the treetops accurately across varying ages and crown sizes. Given the relatively high accuracy achieved in this study, we believe our approach can be applied to other broadleaf tree plantations.
The individual tree detection results show fewer false positives for the walnut plantation than for the oak plantation (Table 3). This is due to the periodic thinning and distinct crown structure of the 50-year-old walnut plantation. The tallest tree observed in the walnut plantation was 35 m, compared with 12 m for the red oak. This large difference in height helps distinguish mature trees from underlying vegetation, thereby decreasing the commission error. When comparing the two UAS platform datasets (Bramor and M600), the CHM generated from the M600 had higher tree detection accuracy (98%) and lower commission (10%) and omission (3%) errors, even though the sensor had a relatively low resolution. The stability of a multi-rotor platform during flight minimizes wind effects and thus can capture good-quality imagery, making it an ideal platform for forest monitoring (Appendix A). In contrast, a fixed-wing platform is nimble and fast; however, it is prone to wind effects, resulting in wind-swayed treetops [20,21,54], which is probably responsible for the lower ITD accuracy compared with the multi-rotor-platform-based inventory. It is important to note that the combination of a variable window filter with MCWS resulted in higher individual tree detection and measurement accuracy than the fixed window filter for both datasets in both plantations (Table 3). It is also important to note that when using fixed window filters, identifying the optimal window size requires trial and error. The variable window filter used in this study for detecting local maxima can alter future tree measurement practices, as its varying window size and hmin accommodate the fuzzy crown structure of broadleaf trees. The user-defined hmin filters out underlying bushes and codominant branches, thereby controlling oversegmentation [35,37].
The web tool developed to automate the workflow is a straightforward, three-click, open-source application that provides users with accurate tree positions, effectively reducing the need for biophysical ground measurements. The minimum tree height input (hmin) in this application acts as a filter to eliminate underlying vegetation and background noise from the input image. This minimum height filtering, along with the variable window, objectively detects tree points and can be replicated for various broadleaf tree species without prior in situ knowledge. Accurate determination of such biophysical parameters will also prove useful for varied forestry applications such as tree age determination [17], biomass calculation [61], and timber quantification [62]. Tree parameters for 4600 trees were calculated from the CHM within 10 min through the web application, indicating the efficiency of this automated technique in measuring thousands of trees within a short amount of time. This method provides a supportive basis for accurate remote measurement of trees in the future.
Although our study provides acceptably accurate estimates of tree height and crown diameter, a few studies have reported higher accuracies in conifer forests [13,50]. The lower accuracies in broadleaf forests using this methodology could be due to uncertainty in the field measurements of tree height and crown diameter. Indirect field measurements using a vertex hypsometer tend to have a larger error range due to higher offsets [22,63]. In addition, crown measurements are labor-intensive and prone to human error, as overlapping crowns are difficult to measure [47]. More reliable field measurements can be obtained by using LiDAR instruments [64] or by directly measuring felled trees [22]. In addition, UAS-based treetop position estimates may not align perfectly with the ground-observed tree positions, which usually mark the trunk location rather than the crown top, and treetop positions are harder to estimate from the ground because of overlapping branches and leaves. It will be important to set thresholds for ITD results in dense canopy forests [48,65].
This study focuses on broadleaf tree plantations of varying size, age, and species. While it explores automating forest inventory procedures with datasets from consumer-grade UAS systems, the optimal image overlap for cost-efficient forest management has not been tested. It is worth noting that UAS data processing costs can be minimized by using open-source software such as OpenDroneMap [66] and VisualSfM [67]. The Shiny application is also constrained in terms of data size and extent matching. At this point, the application performs optimally for datasets smaller than 100 MB, provided the extents of the DSM and the DEM match. To produce an accurate DEM for closed-canopy forest systems, it is preferable to use a high-resolution LiDAR DEM or to obtain UAS datasets during the leaf-off season [14,68]. In addition, environmental factors such as wind speed, temperature variation, and cloud cover (including fog, mist, and snow) must be considered as variables that can diminish the quality of a photogrammetrically derived DEM [54].
Future forest management requires information in a computerized format for continuous and repeatable workflow execution, and UAS-derived data offer a promising path in that direction. Although the integrative methodology adopted here is an initial step toward employing UAS for studying broadleaf tree plantations, additional structural complexity must be considered before extending it to structurally complex, naturally regenerated, mixed forests. Based on the findings of this study, future research should be directed towards decreasing the uncertainties in reference data and implementing an automated workflow for tree detection and measurement in complex forest types.

5. Conclusions

This study demonstrates the ability of UASs to capture imagery for deriving accurate height models for individual tree parameter estimation in broadleaf tree plantations of differing ages and crown sizes. Biophysical parameters derived from imagery captured by multiple platforms were evaluated against ground measurements. Our results show that a UAS-SfM-based height model coupled with a variable window filter and the MCWS technique can identify individual trees with high accuracy (F-score > 0.90) in two broadleaf tree plantations varying in size and age. The proposed method estimated tree height and crown measures with sub-meter accuracy (RMSE ~0.73 m). Our open-source web application can be used with ease by forest managers and landowners. This study focused mainly on ITD for broadleaf tree plantations and has not been tested on natural forest systems. Although the results show promise for extending UAS-image-based inventory to older broadleaf tree plantations, further research should address the transferability of this approach to other forest plantations.

Author Contributions

A.C. and G.S. conceived and designed the experiments. A.C., Z.M. and J.H. collected and processed the UAS data. A.C. analyzed the data with guidance from G.S. and S.F. A.C. wrote the paper, and G.S., S.F. and J.H. edited and finalized the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the USDA National Institute of Food and Agriculture McIntire Stennis Project (IND011523MS).

Data Availability Statement

Restrictions apply to the availability of these data. Data was obtained from Dr. Joseph Hupy and are available [from the authors ([email protected])] with the permission of Dr. Joseph Hupy.

Acknowledgments

We would like to thank the Department of Forestry and Natural Resources at Purdue University for its immense support of this project. We are very grateful to James Warren of the USDA Forest Service for his accurate tree identification and help with ground measurements. We also thank Zhaofei Wen, a postdoctoral scholar, and Rachel Brummet, an undergraduate student, for collecting data essential for the analysis.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Summary of selected works to illustrate forest measurement practices and the accuracy using UAS data.

Source | Application | UAS Model | UAS Platform | Software Used | Data Used and Steps | Automation Level | Assessment of Tree Height Measurement
[39] | Tree crown delineation; mixed forest | senseFly eBee | Fixed-wing | Agisoft | UAV-SfM; region growing | Automated | Segment accuracy: 0.85–0.88
[3] | Tree delineation and measurement; conifer stands | Phantom 4 Pro | Multi-rotor | Agisoft | UAV-SfM | Semi-automated | RMSE: 0.62 m
[11] | Forest structure; subtropical dry forest | Phantom 4 Pro | Multi-rotor | Agisoft | UAV-SfM | Semi-automated | r: 0.94; RMSE: 2.15 m
[13] | Tree height; coniferous trees | senseFly eBee | Fixed-wing | Pix4D-GCP | DSM point clouds and LMF | Semi-automated | R2: 0.94; RMSE: 28 cm
[17] | Tree detection; coniferous stands | Gatewing X100 | Fixed-wing | Micmac | DSM point clouds and LMF | Semi-automated | R2: 0.83; RMSE: 1.39 m *
[14] | Tree segmentation; deciduous forest | DJI P3 | Multi-rotor | Pix4D | LiDAR point cloud | Manual | R2: 0.82; RMSE: 0.106 m *
[28] | Tree height growth; temperate mixed forest | DJI P3 Pro | Multi-rotor | Agisoft | Orthoimage | Manual | -
[38] | Tree detection; mixed conifer forest | DJI P3 Quadcopter | Multi-rotor | Agisoft | Point cloud to generate CHM | Semi-automated | Overall tree detection accuracy: 0.85
[50] | Tree height; Scots pine | OctoXL octocopter | Multi-rotor | Pix4D | Point clouds and orthomosaic; LMF | Semi-automated | R2: 0.971; RMSE: 0.34 m
[15] | Tree height; pine trees | 3D Robotics Solo | Multi-rotor | Agisoft | LiDAR point clouds and UAS imagery; LMF | Semi-automated | R2: 0.82; RMSE: 2.92 m
[22] | Tree height; Douglas fir | Gyrocopter | Multi-rotor | SURE | Aerial LiDAR and UAV point clouds | Semi-automated | RMSE: 1.09 m
* RMSE was converted from cm to meters. LMF refers to the local maxima filtering algorithm.

Appendix B

Figure A1. Location map of the two plantation areas. Top: black walnut; bottom: red oak; located in Indiana, USA.
Table A2. Pix4D processing workflow settings for the datasets.
Initial Processing | User Settings
Input image coordinate system | WGS84 EGM Geoid
Output image coordinate system | WGS84/UTM zone 16N (EGM 96 Geoid)
Key point image scale | Full, image scale = 0.5
Matching image pairs | Aerial grid or corridor
Key point extraction: targeted number of key points | Automatic
Calibration method | Standard
Internal parameters optimization | All
External parameters optimization | All
Point Cloud Optimization
Image scale | 1/2 image size; multiscale
Point density | Optimal
Minimum number of matches | 3
3D textured mesh resolution | Medium resolution (default)
DSM, Orthomosaic and Index
DSM and orthomosaic resolution | 1 × GSD
Noise filtering | Yes
Surface smoothing | Yes; type: sharp
Raster DSM generation method | Triangulation
Orthomosaic | Generate, merge tiles and GeoTIFF without transparency

Appendix C

Figure A2. Illustration for calculating the TP, FP, and FN from the segmented data.

Appendix D

User manual and instructions to use the app are available online: https://feilab.shinyapps.io/Crown/ (accessed on 12 December 2021).
Sample DSM and DEM for the website have been collected from the following:
Hudak, Andrew T.; Liebermann, Robert J.; Moreira, Eder P.; Gessler, Paul E. 2013. Digital surface, terrain, and canopy height models for Priest River Experimental Forest in 2002. 1st Edition. Fort Collins, CO: U.S. Department of Agriculture, Forest Service, Rocky Mountain Research Station. https://doi.org/10.2737/RDS-2013-0001.

References

  1. Němec, P. Comparison of Modern Forest Inventory Method with the Common Method for Management of Tropical Rainforest in the Peruvian Amazon. J. Trop. For. Sci. 2015, 27, 80–91. [Google Scholar]
  2. Gougeon, F.A.; Leckie, D.G. The Individual Tree Crown Approach Applied to Ikonos Images of a Coniferous Plantation Area. Photogramm. Eng. Remote Sens. 2006, 72, 1287–1297. [Google Scholar] [CrossRef]
  3. Creasy, M.B.; Tinkham, W.T.; Hoffman, C.M.; Vogeler, J.C. Potential for Individual Tree Monitoring in Ponderosa Pine Dominated Forests Using Unmanned Aerial System Structure from Motion Point Clouds. Can. J. For. Res. 2021, 51, 1093–1105. [Google Scholar] [CrossRef]
  4. Stoffels, J.; Mader, S.; Hill, J.; Werner, W.; Ontrup, G. Satellite-Based Stand-Wise Forest Cover Type Mapping Using a Spatially Adaptive Classification Approach. Eur. J. For. Res. 2012, 131, 1071–1089. [Google Scholar] [CrossRef]
  5. Marchetti, F.; Arbelo, M.; Moreno-Ruíz, J.A.; Hernández-Leal, P.A.; Alonso-Benito, A. Multitemporal WorldView Satellites Imagery for Mapping Chestnut Trees. In Proceedings of the SPIE—The International Society for Optical Engineering, Warsaw, Poland, 11–14 September 2017; Volume 10421. [Google Scholar] [CrossRef]
  6. Scheer, L.; Sltko, R. Assessment of Some Forest Characteristics Employing IKONOS Satellite Data. J. For. Sci. 2007, 53, 345–351. [Google Scholar] [CrossRef] [Green Version]
  7. Filewod, B.; Kant, S. Identifying Economically Relevant Forest Types from Global Satellite Data. For. Policy Econ. 2021, 127, 102452. [Google Scholar] [CrossRef]
  8. North, E.A.; D’Amato, A.W.; Russell, M.B. Performance Metrics for Street and Park Trees in Urban Forests. J. For. 2018, 116, 547–554. [Google Scholar] [CrossRef] [Green Version]
  9. Goodbody, T.R.H.; Coops, N.C.; Marshall, P.L.; Tompalski, P.; Crawford, P. Unmanned Aerial Systems for Precision Forest Inventory Purposes: A Review and Case Study. For. Chron. 2017, 93, 71–81. [Google Scholar] [CrossRef] [Green Version]
  10. Frayer, W.E.; Furnival, G.M. Forest Survey Sampling Designs: A History. J. For. 1999, 97, 4–10. [Google Scholar] [CrossRef]
  11. Gobbi, B.; van Rompaey, A.; Loto, D.; Gasparri, I.; Vanacker, V. Comparing Forest Structural Attributes Derived from UAV-Based Point Clouds with Conventional Forest Inventories in the Dry Chaco. Remote Sens. 2020, 12, 4005. [Google Scholar] [CrossRef]
  12. Tang, L.; Shao, G. Drone Remote Sensing for Forestry Research and Practices. J. For. Res. 2015, 26, 791–797. [Google Scholar] [CrossRef]
  13. Birdal, A.C.; Avdan, U.; Türk, T. Estimating Tree Heights with Images from an Unmanned Aerial Vehicle. Geomat. Nat. Hazards Risk 2017, 8, 1144–1156. [Google Scholar] [CrossRef] [Green Version]
  14. Carr, J.C.; Slyder, J.B. Individual Tree Segmentation from a Leaf-off Photogrammetric Point Cloud. Int. J. Remote Sens. 2018, 39, 5195–5210. [Google Scholar] [CrossRef]
  15. Fankhauser, K.E.; Strigul, N.S.; Gatziolis, D. Augmentation of Traditional Forest Inventory and Airborne Laser Scanning with Unmanned Aerial Systems and Photogrammetry for Forest Monitoring. Remote Sens. 2018, 10, 1562. [Google Scholar] [CrossRef] [Green Version]
  16. Caruso, G.; Zarco-Tejada, P.J.; González-Dugo, V.; Moriondo, M.; Tozzini, L.; Palai, G.; Rallo, G.; Hornero, A.; Primicerio, J.; Gucci, R. High-Resolution Imagery Acquired from an Unmanned Platform to Estimate Biophysical and Geometrical Parameters of Olive Trees under Different Irrigation Regimes. PLoS ONE 2019, 14, e0210804. [Google Scholar] [CrossRef] [Green Version]
  17. Bonnet, S.; Lisein, J.; Lejeune, P. Comparison of UAS Photogrammetric Products for Tree Detection and Characterization of Coniferous Stands. Int. J. Remote Sens. 2017, 38, 5310–5337. [Google Scholar] [CrossRef]
  18. Cromwell, C.; Giampaolo, J.; Hupy, J.; Miller, Z.; Chandrasekaran, A. A Systematic Review of Best Practices for UAS Data Collection in Forestry-Related Applications. Forests 2021, 12, 957. [Google Scholar] [CrossRef]
  19. Kuželka, K.; Surový, P. Mapping Forest Structure Using UAS inside Flight Capabilities. Sensors 2018, 18, 2245. [Google Scholar] [CrossRef] [Green Version]
  20. Guimarães, N.; Pádua, L.; Marques, P.; Silva, N.; Peres, E.; Sousa, J.J. Forestry Remote Sensing from Unmanned Aerial Vehicles: A Review Focusing on the Data, Processing and Potentialities. Remote Sens. 2020, 12, 1046. [Google Scholar] [CrossRef] [Green Version]
  21. Torresan, C.; Berton, A.; Carotenuto, F.; di Gennaro, S.F.; Gioli, B.; Matese, A.; Miglietta, F.; Vagnoli, C.; Zaldei, A.; Wallace, L. Forestry Applications of UAVs in Europe: A Review. Int. J. Remote Sens. 2017, 38, 2427–2447. [Google Scholar] [CrossRef]
  22. Ganz, S.; Käber, Y.; Adler, P. Measuring Tree Height with Remote Sensing—A Comparison of Photogrammetric and LiDAR Data with Different Field Measurements. Forests 2019, 10, 694. [Google Scholar] [CrossRef] [Green Version]
  23. Pasher, J.; King, D.J. Mapping Dead Wood Distribution in a Temperate Hardwood Forest Using High Resolution Airborne Imagery. For. Ecol. Manag. 2009, 258, 1536–1548. [Google Scholar] [CrossRef]
  24. Dorigo, W.; Lucieer, A.; Podobnikar, T.; Carni, A. Mapping Invasive Fallopia Japonica by Combined Spectral, Spatial, and Temporal Analysis of Digital Orthophotos. Int. J. Appl. Earth Obs. Geoinf. 2012, 19, 185–195. [Google Scholar] [CrossRef]
  25. Dash, J.P.; Watt, M.S.; Paul, T.S.H.; Morgenroth, J.; Pearse, G.D. Early Detection of Invasive Exotic Trees Using UAV and Manned Aircraft Multispectral and LiDAR Data. Remote Sens. 2019, 11, 1812. [Google Scholar] [CrossRef] [Green Version]
  26. Shin, J.; Seo, W.; Kim, T.; Park, J.; Woo, C. Using UAV Multispectral Images for Classification of Forest Burn Severity—A Case Study of the 2019 Gangneung Forest Fire. Forests 2019, 10, 1025. [Google Scholar] [CrossRef] [Green Version]
  27. Meng, R.; Wu, J.; Zhao, F.; Cook, B.D.; Hanavan, R.P.; Serbin, S.P. Measuring Short-Term Post-Fire Forest Recovery across a Burn Severity Gradient in a Mixed Pine-Oak Forest Using Multi-Sensor Remote Sensing Techniques. Remote Sens. Environ. 2018, 210, 282–296. [Google Scholar] [CrossRef]
  28. Dempewolf, J.; Nagol, J.; Hein, S.; Thiel, C.; Zimmermann, R. Measurement of Within-Season Tree Height Growth in a Mixed Forest Stand Using UAV Imagery. Forests 2017, 8, 231. [Google Scholar] [CrossRef] [Green Version]
  29. Panagiotidis, D.; Abdollahnejad, A.; Surový, P.; Chiteculo, V. Determining Tree Height and Crown Diameter from High-Resolution UAV Imagery. Int. J. Remote Sens. 2017, 38, 2392–2410. [Google Scholar] [CrossRef]
  30. Kachamba, J.D.; Ørka, O.H.; Gobakken, T.; Eid, T.; Mwase, W. Biomass Estimation Using 3D Data from Unmanned Aerial Vehicle Imagery in a Tropical Woodland. Remote Sens. 2016, 8, 968. [Google Scholar] [CrossRef] [Green Version]
  31. Jones, A.R.; Raja Segaran, R.; Clarke, K.D.; Waycott, M.; Goh, W.S.H.; Gillanders, B.M. Estimating Mangrove Tree Biomass and Carbon Content: A Comparison of Forest Inventory Techniques and Drone Imagery. Front. Mar. Sci. 2020, 6, 784. [Google Scholar] [CrossRef] [Green Version]
  32. Guerra-Hernández, J.; González-Ferreiro, E.; Monleón, V.J.; Faias, S.P.; Tomé, M.; Díaz-Varela, R.A. Use of Multi-Temporal UAV-Derived Imagery for Estimating Individual Tree Growth in Pinus Pinea Stands. Forests 2017, 8, 300. [Google Scholar] [CrossRef]
  33. Iizuka, K.; Yonehara, T.; Itoh, M.; Kosugi, Y. Estimating Tree Height and Diameter at Breast Height (DBH) from Digital Surface Models and Orthophotos Obtained with an Unmanned Aerial System for a Japanese Cypress (Chamaecyparis Obtusa) Forest. Remote Sens. 2018, 10, 13. [Google Scholar] [CrossRef] [Green Version]
  34. Piermattei, L.; Karel, W.; Wang, D.; Wieser, M.; Mokroš, M.; Surový, P.; Koreň, M.; Tomaštík, J.; Pfeifer, N.; Hollaus, M. Terrestrial Structure from Motion Photogrammetry for Deriving Forest Inventory Data. Remote Sens. 2019, 11, 950. [Google Scholar] [CrossRef] [Green Version]
  35. Novotný, J.; Hanuš, J.; Lukeš, P.; Kaplan, V. Individual Tree Crowns Delineation Using Local Maxima Approach and Seeded Region Growing Technique. In Proceedings of the Symposium GIS Ostrava, Ostrava, Czech Republic, 24–26 January 2011; pp. 23–26. [Google Scholar]
  36. Korpela, I.; Anttila, P.; Pitkänen, J. The Performance of a Local Maxima Method for Detecting Individual Tree Tops in Aerial Photographs. Int. J. Remote Sens. 2006, 27, 1159–1175. [Google Scholar] [CrossRef]
  37. Monnet, J.-M.; Mermin, E.; Chanussot, J.; Berger, F. Tree Top Detection Using Local Maxima Filtering: A Parameter Sensitivity Analysis. In Proceedings of the 10th International Conference on LiDAR Applications for Assessing Forest Ecosystems (Silvilaser 2010), Freiburg, Germany, 14–17 September 2010. [Google Scholar]
  38. Mohan, M.; Silva, A.C.; Klauberg, C.; Jat, P.; Catts, G.; Cardil, A.; Hudak, T.A.; Dia, M. Individual Tree Detection from Unmanned Aerial Vehicle (UAV) Derived Canopy Height Model in an Open Canopy Mixed Conifer Forest. Forests 2017, 8, 340. [Google Scholar] [CrossRef] [Green Version]
  39. Gu, J.; Congalton, R.G. Individual Tree Crown Delineation from UAS Imagery Based on Region Growing by Over-Segments With a Competitive Mechanism. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–11. [Google Scholar] [CrossRef]
  40. Gougeon, F.A. A Crown-Following Approach to the Automatic Delineation of Individual Tree Crowns in High Spatial Resolution Aerial Images. Can. J. Remote Sens. 1995, 21, 274–284. [Google Scholar] [CrossRef]
  41. Pirotti, F. Assessing a Template Matching Approach for Tree Height and Position Extraction from Lidar-Derived Canopy Height Models of Pinus Pinaster Stands. Forests 2010, 1, 194–208. [Google Scholar] [CrossRef]
  42. Wang, L.; Gong, P.; Biging, G.S. Individual Tree-Crown Delineation and Treetop Detection in High-Spatial-Resolution Aerial Imagery. Photogramm. Eng. Remote Sens. 2004, 70, 351–357. [Google Scholar] [CrossRef] [Green Version]
  43. Jaskierniak, D.; Lucieer, A.; Kuczera, G.; Turner, D.; Lane, P.N.J.; Benyon, R.G.; Haydon, S. Individual Tree Detection and Crown Delineation from Unmanned Aircraft System (UAS) LiDAR in Structurally Complex Mixed Species Eucalypt Forests. ISPRS J. Photogramm. Remote Sens. 2021, 171, 171–187. [Google Scholar] [CrossRef]
  44. Parvati, K.; Prakasa Rao, B.S.; Mariya Das, M. Image Segmentation Using Gray-Scale Morphology and Marker-Controlled Watershed Transformation. Discret. Dyn. Nat. Soc. 2008, 2008, 384346. [Google Scholar] [CrossRef] [Green Version]
  45. Li, B.; Pan, M.; Wu, Z. An Improved Segmentation of High Spatial Resolution Remote Sensing Image Using Marker-Based Watershed Algorithm. In Proceedings of the 2012 20th International Conference on Geoinformatics, Hong Kong, China, 15–17 June 2012; pp. 1–5. [Google Scholar] [CrossRef]
  46. Yin, D.; Wang, L. Individual Mangrove Tree Measurement Using UAV-Based LiDAR Data: Possibilities and Challenges. Remote Sens. Environ. 2019, 223, 34–49. [Google Scholar] [CrossRef]
  47. Nuijten, R.J.G.; Coops, N.C.; Goodbody, T.R.H.; Pelletier, G. Examining the Multi-Seasonal Consistency of Individual Tree Segmentation on Deciduous Stands Using Digital Aerial Photogrammetry (DAP) and Unmanned Aerial Systems (UAS). Remote Sens. 2019, 11, 739. [Google Scholar] [CrossRef] [Green Version]
  48. Zhen, Z.; Quackenbush, L.J.; Zhang, L. Impact of Tree-Oriented Growth Order in Marker-Controlled Region Growing for Individual Tree Crown Delineation Using Airborne Laser Scanner (ALS) Data. Remote Sens. 2014, 6, 555–579. [Google Scholar] [CrossRef] [Green Version]
  49. Amiri, N. Assessment of Marker-Controlled Watershed Segmentation Algorithm for Individual Tree Top Detection and Crown Delineation. Ph.D. Thesis, University of Twente Faculty of Geo-Information and Earth Observation (ITC), Enschede, The Netherlands, 2014. [Google Scholar]
  50. Krause, S.; Sanders, G.M.T.; Mund, J.-P.; Greve, K. UAV-Based Photogrammetric Tree Height Measurement for Intensive Forest Monitoring. Remote Sens. 2019, 11, 758. [Google Scholar] [CrossRef] [Green Version]
  51. R core Team. R: A Language and Environment for Statistical Computing. 2014. Available online: https://www.r-project.org/ (accessed on 15 December 2021).
  52. ESRI. ArcGIS Pro Desktop. Redlands, CA: Environmental Systems Research Institute. 2011. Available online: https://www.esri.com/en-us/arcgis/products/arcgis-pro/ (accessed on 15 December 2021).
  53. Miller, Z.M.; Hupy, J.; Chandrasekaran, A.; Shao, G.; Fei, S. Application of Postprocessing Kinematic Methods with UAS Remote Sensing in Forest Ecosystems. J. For. 2021, 119, 454–466. [Google Scholar] [CrossRef]
  54. Iglhaut, J.; Cabo, C.; Puliti, S.; Piermattei, L.; O′Connor, J.; Rosette, J. Structure from Motion Photogrammetry in Forestry: A Review. Curr. For. Rep. 2019, 5, 155–168. [Google Scholar] [CrossRef] [Green Version]
  55. Pirotti, F.; Kobal, M.; Roussel, J. A Comparison of Tree Segmentation Methods Using very High Density Airborne Laser Scanner Data. ISPRS—Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 285–290. [Google Scholar] [CrossRef] [Green Version]
  56. Wang, D.; Vallotton, P. Improved Marker-Controlled Watershed Segmentation with Local Boundary Priors. In Proceedings of the 2010 25th International Conference of Image and Vision Computing New Zealand, Queenstown, New Zealand, 8–9 November 2010; pp. 1–6. [Google Scholar] [CrossRef]
  57. Plowright, A.; Plowright, M.A. Package ‘ForestTools’. CRAN. 2018. Available online: https://github.com (accessed on 20 July 2020).
  58. Goutte, C.; Gaussier, E. A Probabilistic Interpretation of Precision, Recall and F-Score, with Implication for Evaluation. In European Conference on Information Retrieval; Springer: Berlin/Heidelberg, Germany, 2005; Volume 3408. [Google Scholar] [CrossRef]
  59. Sokolova, M.; Japkowicz, N.; Szpakowicz, S. Beyond Accuracy, F-Score and ROC: A Family of Discriminant Measures for Performance Evaluation BT-AI 2006: Advances in Artificial Intelligence; Sattar, A., Kang, B., Eds.; Springer: Berlin/Heidelberg, Germany, 2006; pp. 1015–1021. [Google Scholar]
  60. Chang, W.; Cheng, J.; Allaire, J.; Xie, Y.; McPherson, J. Shiny: Web Application Framework for R. R Package Version 2017, 1, 2017. [Google Scholar]
  61. Mao, P.; Qin, L.; Hao, M.; Zhao, W.; Luo, J.; Qiu, X.; Xu, L.; Xiong, Y.; Ran, Y.; Yan, C.; et al. An Improved Approach to Estimate Above-Ground Volume and Biomass of Desert Shrub Communities Based on UAV RGB Images. Ecol. Indic. 2021, 125, 107494. [Google Scholar] [CrossRef]
  62. Moe, T.K.; Owari, T.; Furuya, N.; Hiroshima, T. Comparing Individual Tree Height Information Derived from Field Surveys, LiDAR and UAV-DAP for High-Value Timber Species in Northern Japan. Forests 2020, 11, 223. [Google Scholar] [CrossRef] [Green Version]
  63. Luoma, V.; Saarinen, N.; Wulder, M.A.; White, J.C.; Vastaranta, M.; Holopainen, M.; Hyyppä, J. Assessing Precision in Conventional Field Measurements of Individual Tree Attributes. Forests 2017, 8, 38. [Google Scholar] [CrossRef] [Green Version]
  64. Sullivan, F.B.; Ducey, M.J.; Orwig, D.A.; Cook, B.; Palace, M.W. Comparison of Lidar- and Allometry-Derived Canopy Height Models in an Eastern Deciduous Forest. For. Ecol. Manag. 2017, 406, 83–94. [Google Scholar] [CrossRef]
  65. Strîmbu, V.F.; Strîmbu, B.M. A Graph-Based Segmentation Algorithm for Tree Crown Extraction Using Airborne LiDAR Data. ISPRS J. Photogramm. Remote Sens. 2015, 104, 30–43. [Google Scholar] [CrossRef] [Green Version]
  66. OpenDroneMap. WebODM: Drone Mapping Software (Version 1.1.0). Available online: https://www.opendronemap.org/webodm/ (accessed on 1 August 2021).
  67. Wu, C. VisualSFM: A Visual Structure from Motion System. Available online: http://ccwu.me/vsfm/ (accessed on 15 December 2021).
  68. Berra, E.F. Individual Tree Crown Detection and Delineation across a Woodland Using Leaf-on and Leaf-off Imagery from a UAV Consumer-Grade Camera. J. Appl. Remote Sens. 2020, 14, 34501. [Google Scholar] [CrossRef]
Figure 1. Orthomosaic and point cloud illustration of the two plantations in this study, (a) red oak plantation and (b) black walnut plantation, with their tree height statistics, at Martell Forest, Indiana.
Figure 2. Workflow of tree-level information extraction using UAS-based imagery from broadleaf tree plantations.
Figure 3. Illustration of issues involved in treetop detection and crown segmentation of individual trees performed using the automated technique proposed in this study. Panels (a,d) represent perfect segmentation. Panels (b,e) represent undersegmentation. Panels (c,f) represent oversegmentation with orthophoto and CHM.
Figure 4. Comparison of different accuracy parameters for the two plantations using different UAS platforms. (a) Recall, (b) Precision, (c) Omission error, (d) Commission error, (e) F-score and (f) True positive rates are presented by using manually detected treetops as a reference for the accuracy assessment.
Figure 5. Results of regression analysis between ground-measured and algorithm-derived estimates for the red oak plantation. (a) Crown diameter measured using different UAS datasets (M600 and Bramor, respectively). (b) Tree height measured using different UAS datasets (M600 and Bramor, respectively). *** indicates p ≤ 0.001 (statistical significance).
Table 1. Plantation layout characteristics for the study area.

Field Characteristics | Red Oak Plantation | Black Walnut Plantation | Method of Ground Measurement
Top-left coordinates | 40°26′36″ N, −87°1′51″ W | 42°25′55″ N, −87°2′27″ W | -
Species | Red oak | Black walnut | Visual recognition on the ground
Tree count | 4668 | 213 | Manual count from orthomosaic and on the ground
Height (m) | 3.3–15.6 * | - | Vertex IV hypsometer on the ground
Crown size (m) | 1.7–6.7 * | - | Measuring tape on the ground
* Based on 224 trees that were randomly sampled and measured.
Table 2. Summary of parameter specifications for two platforms and sensors used for this study.

Specifications | C-Astral Bramor PPX | DJI M600
Flight properties
Platform | Fixed-wing | Hex-rotor
Altitude (m) | 122 | 120
Maximum payload (kg) | 4 | 6
Area covered (square m) * | 1,331,905.24 | 207,290
Flight time (min) | 25 | 24
Camera and sensor properties
Sensor | Sony RX1R II | Sony A6000
Resolution (MP) | 42.4 | 24.2
Shutter speed (s) | 1/1600 | 1/1600
Focal length (mm) | 35 | 21
Aperture | F 4.5 | F 3.5
Image characteristics
Photo overlap (%) | 80 | 80
Images captured | 1124 | 343
Images calibrated | 1113 | 341
Time
Flight (hh:mm:ss) | 00:25:00 | 00:24:00
Processing (hh:mm:ss) | 16:02:58 | 03:48:59
Output
Coordinate system | WGS 84/UTM zone 16N | WGS 84/UTM zone 16N
Average point density (per m2) | 508 | 255
Number of 3D points | 282,244,973 | 51,100,953
Ground sampling distance (cm) | 1.67 | 2.14
DSM accuracy (m)
RMSEx | 0.0503 | 0.002
RMSEy | 0.0149 | 0.0017
RMSEz | 0.133 | 0.0126
* The area covered by the Bramor platform was cropped to the plot extent for further analysis.
Table 3. Comparison of different accuracy parameters for the two plantations using different UAS platforms. Recall, precision, and detection rates are presented by using manually detected treetops as a reference for the accuracy assessment.

Dataset | Window Type | Recall (rc) | Precision (pr) | TP | FP | FN | Omission Error 1 | Commission Error 2 | F-Score | Trees Detected 3
Bramor-oak | 3 × 3 | 0.860 | 0.683 | 3416 | 1589 | 556 | 0.140 | 0.317 | 0.761 | 0.851
Bramor-oak | 5 × 5 | 0.940 | 0.875 | 3954 | 564 | 253 | 0.060 | 0.125 | 0.906 | 0.968
Bramor-oak | 7 × 7 | 0.598 | 0.884 | 2467 | 324 | 1658 | 0.402 | 0.116 | 0.713 | 0.598
Bramor-oak | Variable | 0.950 | 0.883 | 3913 | 421 | 219 | 0.050 | 0.117 | 0.916 | 0.953
M600-oak | 3 × 3 | 0.895 | 0.724 | 3627 | 1382 | 425 | 0.105 | 0.276 | 0.801 | 0.868
M600-oak | 5 × 5 | 0.947 | 0.908 | 4012 | 407 | 225 | 0.053 | 0.092 | 0.927 | 0.947
M600-oak | 7 × 7 | 0.794 | 0.812 | 2983 | 690 | 776 | 0.206 | 0.188 | 0.803 | 0.787
M600-oak | Variable | 0.950 | 0.901 | 4057 | 549 | 53 | 0.030 | 0.100 | 0.934 | 0.987
M600-walnut | 3 × 3 | 0.751 | 0.607 | 145 | 94 | 48 | 0.249 | 0.393 | 0.671 | 1.122
M600-walnut | 5 × 5 | 0.928 | 0.928 | 128 | 10 | 10 | 0.072 | 0.072 | 0.928 | 0.648
M600-walnut | 7 × 7 | 0.880 | 0.863 | 176 | 28 | 24 | 0.120 | 0.137 | 0.871 | 0.958
M600-walnut | Variable | 0.950 | 0.960 | 203 | 7 | 10 | 0.050 | 0.040 | 0.950 | 0.958
1 Omission error is calculated from recall 100 × (1-rc). 2 Commission error is calculated from precision 100 × (1-pr). 3 Trees detected was calculated for the correctly detected trees from the proposed method against manually detected trees from the orthomosaic.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
