Article

Plant Height Estimation in Corn Fields Based on Column Space Segmentation Algorithm

1 College of Information Technology and Engineering, Guangzhou College of Commerce, Guangzhou 511363, China
2 College of Electronic Engineering, South China Agricultural University, Guangzhou 510642, China
* Author to whom correspondence should be addressed.
Agriculture 2025, 15(3), 236; https://doi.org/10.3390/agriculture15030236
Submission received: 3 December 2024 / Revised: 27 December 2024 / Accepted: 21 January 2025 / Published: 22 January 2025

Abstract

Plant genomics has progressed significantly due to advances in information technology, but phenotypic measurement technology has not kept pace, hindering plant breeding. As maize is one of China’s three main grain crops, accurately measuring plant height is crucial for assessing crop growth and productivity. This study addresses the challenges of plant segmentation and inaccurate plant height extraction in maize populations under field conditions. A three-dimensional dense point cloud was reconstructed using the structure from motion–multi-view stereo (SFM-MVS) method, based on multi-view image sequences captured by an unmanned aerial vehicle (UAV). To improve plant segmentation, we propose a column space approximate segmentation algorithm, which combines the column space method with the enclosing box technique. The proposed method achieved a segmentation accuracy exceeding 90% in dense canopy conditions, significantly outperforming traditional algorithms, such as region growing (80%) and Euclidean clustering (75%). Furthermore, the extracted plant heights demonstrated a high correlation with manual measurements, with R2 values ranging from 0.8884 to 0.9989 and RMSE values as low as 0.0148 m. However, the scalability of the method for larger agricultural operations may face challenges due to computational demands when processing large-scale datasets and potential performance variability under different environmental conditions. Addressing these issues through algorithm optimization, parallel processing, and the integration of additional data sources such as multispectral or LiDAR data could enhance its scalability and robustness. The results demonstrate that the method can accurately reflect the heights of maize plants, providing a reliable solution for large-scale, field-based maize phenotyping. The method has potential applications in high-throughput monitoring of crop phenotypes and precision agriculture.

1. Introduction

Plant height, one of the most significant phenotypic indicators, reflects the development and vigor of crops and is crucial for variety selection and productivity prediction [1,2,3]. Crop phenotypic data form the foundation of crop selection and field management [4]. Conventional techniques for measuring plant height rely on manual labor or two-dimensional photographs, both of which are limited in accuracy and efficiency. Recently, UAV-based remote-sensing technologies have demonstrated an ability to improve the efficiency and accuracy of plant phenotyping, particularly for large-scale applications [5]. However, challenges such as variability in environmental conditions, dense canopy structures, and overlapping plants remain major hurdles for accurate plant height measurement in the field.
With the advancement of three-dimensional reconstruction technology, point cloud data can now be used to determine crop plant height [6]. Studies such as those by José et al. [7] and Bendig et al. [8] have successfully applied UAV-based 3D reconstruction methods to estimate plant height in wheat and maize, respectively, using structure from motion–multi-view stereo (SFM-MVS) techniques. These methods achieved high accuracy while maintaining cost-efficiency compared to active sensors like LiDAR. Maize, in particular, is one of China’s primary grain crops, and measuring its plant height precisely is essential to increasing productivity and satisfying demand [9,10]. The two primary categories of 3D reconstruction approaches are active and passive. Of these, passive reconstruction techniques, such as structure from motion (SFM), which are based on multi-view geometry, offer the benefits of low cost, high accuracy, and ease of use [11]. When paired with the multi-view geometry technique, a UAV fitted with a high-resolution camera to capture image sequences can effectively and non-destructively acquire 3D point cloud data on crops [12,13,14]. However, further work is required to address the limitations of SFM-based methods in handling complex environments, such as dense crop canopies and variable lighting conditions, which can affect segmentation accuracy and height estimation [15].
Although point cloud-based crop height measurement has gradually gained popularity in agricultural research, most applications are currently limited to single plants at small scales, in the field or indoors; far fewer measurements are made on field populations with heavier shading and closer spacing between plants [16,17]. Recent studies have demonstrated that integrating multispectral or LiDAR data with SFM techniques can improve segmentation accuracy in complex field conditions [18,19]. Additionally, algorithmic improvements, such as region-growing segmentation and machine-learning-based approaches, have been explored to enhance point cloud processing in dense crop populations [20].
The scalability of UAV-based point cloud methods, including the proposed approach, is a critical factor for their application in larger agricultural fields and diverse crop types. For larger fields, the main challenges include processing increasingly large datasets and ensuring consistent accuracy under varying environmental conditions (e.g., uneven terrain and heterogeneous lighting). These challenges may be addressed by integrating parallel processing frameworks, optimizing data acquisition flight paths, and leveraging cloud-based data-processing platforms. Furthermore, the proposed column space approximate segmentation algorithm can be adapted to other crop types by adjusting parameters such as point cloud density thresholds or incorporating additional input data, such as multispectral or LiDAR data, to handle differences in canopy structure and plant morphology. Future studies should focus on validating the method across various crops, such as wheat, rice, or soybean, to ensure broader applicability and scalability.
Studies conducted both domestically and internationally have demonstrated that point cloud-based plant height measurement is more accurate and efficient than conventional techniques [21]. For instance, researchers have obtained impressive results in measuring the plant height of various crops by reconstructing point cloud data using LiDAR or multi-view geometry. However, the widespread use of LiDAR equipment is restricted by its high cost [22]. In contrast, the multi-view geometry approach achieves low-cost, high-quality point cloud data capture, which is particularly well suited to studies of field crop populations [23]. Nevertheless, because of plant shading and close spacing, plant height extraction and point cloud segmentation are more difficult for field crop populations [24]. Consequently, research on point cloud-based techniques for measuring plant height in field populations may offer new approaches for digital agriculture.
In this study, we gather field image sequences of maize populations and create point cloud data by combining UAV remote sensing with multi-view geometric 3D reconstruction technology [25,26]. Based on this, a plant height estimation model is developed, a new technique for measuring plant height in field populations is proposed, and the pre-processing and segmentation algorithms of the point cloud data are examined [27]. This study not only addresses the limitations of conventional plant height measurement techniques but also contributes to advancing UAV-based phenotyping methods by proposing a novel column space approximate segmentation algorithm [28]. This algorithm is designed to overcome the challenges associated with dense canopy structures and overlapping plants, improving the accuracy of plant height estimation at the field scale.
In addition to promoting the use of point cloud technology in agriculture and providing technical support for the informatization and unmanned measurement of crop phenotypes, this study aims to close the gap in point cloud-based plant height measurement for crop populations at the field scale. It is also of considerable importance for the collection of maize growth parameters and for variable-rate management.

2. Materials and Methods

2.1. Study Area and Experimental Design

Maize plants cultivated at the teaching and research base of South China Agricultural University were used as the experimental materials for this study. A site map of the maize test fields, showing the data collected from two different test fields, is provided in Figure 1. Field 1 covers an area of approximately 0.3 hectares, and Field 2 covers approximately 0.2 hectares. The aerial images shown in Figure 1 were captured using a UAV system on day 30 after maize emergence. These data provide a comprehensive overview of the experimental setup. This study involved five sets of trials conducted in three distinct maize plant environments across two fields. The fields are referred to as Fields 1 and 2 and the maize plant groups as a, b, c, d, and e. Four composite trial groups were designed for this study: trial 1 included plants b and c; trial 2 included plants d and e; trial 3 included plants a, b, and c; and trial 4 included all five sets (a, b, c, d, and e). In trials 1 and 2, the experimental conditions were the same maize plants in the same field. In trials 3 and 4, the conditions involved different maize plants in different fields. The composite trial groups are summarized in Table 1.

2.2. Field Maize Crop Population Collection System

The field maize acquisition system uses UAVs with monocular cameras to capture 2D image sequences and create 3D models, leveraging multi-view geometric 3D reconstruction technology, which is extensively utilized in localization, navigation, and virtual reality. Structure from motion–multi-view stereo (SFM-MVS), feature extraction, and feature matching are the main technologies used in the sparse and dense reconstruction stages of 3D reconstruction.
The study’s field crop population data collection system included a desktop computer running Windows 10 (Intel Core i5-10500, 32 GB RAM, Gigabyte 3070 graphics card); a DJI Phantom 4 Pro, including the drone, remote controller, and 20-megapixel CMOS camera; a tape measure; and additional tools. The DJI Phantom 4 Pro was chosen for this study due to its high-resolution 20 MP camera, flight stability with GPS-assisted navigation, and cost-effectiveness compared to more expensive UAVs with LiDAR or multispectral sensors. It offers sufficient accuracy for 3D reconstruction and plant height measurement while being easy to operate, and it provides a good balance of performance, ease of use, and affordability, making it well suited for large-scale, field-scale phenotyping data collection. Crop field point cloud data were generated using Pix4Dmapper 4.5.6 for 3D reconstruction, processed using the C++ Point Cloud Library (PCL 1.8.0), and visualized using MeshLab 1.3.2 and CloudCompare 2.9.0. Pix4Dmapper was configured with the following parameters to ensure high-quality 3D reconstruction:
(1)
Image alignment settings: The alignment settings were configured to “High” to ensure accurate feature detection and matching between overlapping UAV images. The “Keypoint Image Scale” was set to 1 (full image resolution) for improved alignment precision;
(2)
Reconstruction resolution: the “Image Scale” for dense point cloud reconstruction was set to “Full Scale” to utilize the original image resolution and enhance the level of detail in the resulting point cloud;
(3)
Point cloud densification: the “Densification Quality” parameter was set to “High” to increase the density of the point cloud, capturing fine details of maize canopy structures;
(4)
Noise filtering: noise filtering was enabled with the “Aggressive” setting to minimize irrelevant or erroneous points caused by environmental factors, such as shadows and lighting variations;
(5)
Ground control points (GCPs): GCPs were used to georeference the data accurately and distributed evenly across the study area. The positional accuracy of the GCPs was ±1 cm;
(6)
Output coordinate system: the coordinate system was set to WGS84/UTM Zone 50 N to align the data with standard geospatial reference systems for further analysis.
These parameter configurations were carefully selected to balance processing efficiency and reconstruction accuracy, ensuring the generation of high-quality point cloud data for field-scale maize population analysis.

2.3. Point Cloud Data

A point cloud is a vast collection of target surface information points that is typically acquired using multi-view geometry or laser technology. Point cloud data may accurately depict the actual condition of the ground surface, including the reflecting characteristics of ground objects and the ground state. Based on the 3D reconstruction technique of multi-view geometry, we were able to generate corn population point cloud data at the field scale for this study. The point cloud data includes 3D coordinate information (X, Y, Z), color information (RGB), and intensity information (Intensity).
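To make the data layout concrete, the following minimal C++/PCL sketch loads a reconstructed field cloud and reads the per-point coordinates and color. The file name is hypothetical, and the pcl::PointXYZRGB type covers only the XYZ and RGB fields; keeping the intensity channel as well would require a different point type.

```cpp
#include <iostream>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>

int main()
{
    // "maize_field.pcd" is a hypothetical export of the reconstructed field cloud.
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZRGB>);
    if (pcl::io::loadPCDFile<pcl::PointXYZRGB>("maize_field.pcd", *cloud) == -1)
    {
        std::cerr << "Failed to load point cloud" << std::endl;
        return -1;
    }

    std::cout << "Loaded " << cloud->size() << " points" << std::endl;

    // Each point stores 3D coordinates (X, Y, Z) and packed RGB color.
    const pcl::PointXYZRGB& p = cloud->front();
    std::cout << "First point: (" << p.x << ", " << p.y << ", " << p.z << "), RGB = ("
              << static_cast<int>(p.r) << ", " << static_cast<int>(p.g) << ", "
              << static_cast<int>(p.b) << ")" << std::endl;
    return 0;
}
```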

2.4. Image Acquisition and Plant Height Measurement in Maize Populations

In this investigation, a DJI Phantom 4 Pro (drone, remote controller, and 20-megapixel CMOS camera) was used to acquire a multi-view image sequence of a maize population in a field setting at a resolution of 6000 × 4000. The UAV followed a tic-tac-toe flight route to gather oblique data in four directions and to ensure that the overlap between two adjacent photographs was 70–80%, so as to capture the 3D information of the corn more thoroughly (Figure 2). In Figure 2, each red dot in (a) denotes a photo taken along the UAV flight route, while (b) indicates the angle and position of image capture. The selection of 70–80% image overlap was based on its proven effectiveness in improving 3D reconstruction accuracy, particularly in dense canopy conditions:
(1)
Enhanced feature matching: A high overlap percentage ensures that a sufficient number of shared feature points are detected across adjacent images. This improves the alignment accuracy during the structure from motion (SfM) process, resulting in a more reliable reconstruction of the maize canopy;
(2)
Improved spatial consistency: Overlap levels of 70–80% reduce gaps or inconsistencies in the point cloud data, particularly in areas with complex structures like overlapping leaves or varying canopy heights. This ensures a more continuous and detailed representation of the field;
(3)
Minimized reconstruction errors: In cases of lower overlap, key areas may be under-represented, leading to incomplete reconstructions or errors in dense regions. By maintaining 70–80% overlap, the software (Pix4Dmapper) is better able to handle variations in lighting, perspective, and texture across the field, reducing potential noise in the resulting 3D model;
(4)
Balancing processing efficiency: While higher overlap percentages (e.g., above 80%) can theoretically improve accuracy, they significantly increase data storage and processing time. The 70–80% overlap strikes a balance between achieving high accuracy and maintaining reasonable computational requirements.
This overlap configuration is widely adopted in UAV-based 3D reconstruction studies and has been validated in similar agricultural applications. The resulting point cloud generated in this study achieved sufficient density and precision to accurately measure plant height and analyze maize populations.

2.5. Point Cloud Segmentation for Corn Populations

2.5.1. Region Growing Algorithm

The region-growing algorithm first identifies seed points and then progressively merges neighboring points with similar characteristics to form complete regions. Normals and curvatures are computed and sorted in ascending order of curvature, and the point with the lowest curvature is chosen as the initial seed. A normal-angle threshold is set and eligible neighboring points are added; a curvature threshold is applied, and growth continues until all points have been processed. The enclosing box algorithm approximates complex objects with simple geometries (such as the AABB, the enclosing sphere, and the OBB). The AABB, a minimal axis-aligned hexahedron, has a simple structure but is poorly suited to complex shapes, particularly because it must be recalculated when the object rotates.
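As a concrete illustration of the region-growing procedure described above, the following C++/PCL sketch estimates normals and curvature and then grows smooth clusters with pcl::RegionGrowing. The function name and all thresholds are placeholders rather than the parameter values used in this study; the color-based variant used later in Section 2.6.2 would instead rely on PCL's color-aware region growing with additional color thresholds.

```cpp
#include <vector>
#include <pcl/point_types.h>
#include <pcl/features/normal_estimation.h>
#include <pcl/search/kdtree.h>
#include <pcl/segmentation/region_growing.h>

// Grow smooth regions from low-curvature seed points, as outlined in Section 2.5.1.
std::vector<pcl::PointIndices> segmentByRegionGrowing(
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& cloud)
{
    pcl::search::KdTree<pcl::PointXYZ>::Ptr tree(new pcl::search::KdTree<pcl::PointXYZ>);

    // Estimate per-point normals and curvature; the algorithm seeds from low curvature.
    pcl::PointCloud<pcl::Normal>::Ptr normals(new pcl::PointCloud<pcl::Normal>);
    pcl::NormalEstimation<pcl::PointXYZ, pcl::Normal> ne;
    ne.setInputCloud(cloud);
    ne.setSearchMethod(tree);
    ne.setKSearch(30);                              // neighborhood size (placeholder)
    ne.compute(*normals);

    const float kPi = 3.14159265f;
    pcl::RegionGrowing<pcl::PointXYZ, pcl::Normal> rg;
    rg.setInputCloud(cloud);
    rg.setInputNormals(normals);
    rg.setSearchMethod(tree);
    rg.setNumberOfNeighbours(30);                   // placeholder
    rg.setMinClusterSize(100);                      // placeholder cluster-size limits
    rg.setMaxClusterSize(100000);
    rg.setSmoothnessThreshold(5.0f / 180.0f * kPi); // normal-angle threshold (placeholder)
    rg.setCurvatureThreshold(1.0f);                 // curvature threshold (placeholder)

    std::vector<pcl::PointIndices> clusters;        // one index set per grown region
    rg.extract(clusters);
    return clusters;
}
```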

2.5.2. Algorithm for Approximate Segmentation of Column Space

In order to extract single-plant maize point clouds and determine their heights, this work proposes a column space approximate segmentation approach that combines the column space algorithm with AABB enclosing box segmentation. To remove the leaf point cloud and retain the stalk point cloud, the angle between each point's normal vector and the horizontal direction is first determined. The height of a single plant is then estimated by segmenting the stalk point cloud using an AABB bounding box. The specific process is shown in Figure 3.
The steps are as follows. Determine the angle between each point's normal vector and the horizontal direction. Set an angle threshold to extract the stalk point cloud. Set the width and length of the AABB bounding box to obtain the per-plant boxes, and use the vertical extent of each box to calculate the height of a single plant. This technique effectively reduces both incorrect and excessive segmentation.
In three-dimensional space, the normal vectors of crop leaves and stalks differ, so segmentation can be achieved by thresholding the normal vectors' angle. Normal vectors must be computed from the 3D coordinates; the corn population point cloud data reconstructed from the image sequences contain both color information and 3D coordinates. To approximate the surface normal vectors and provide the base data for segmentation, normal estimation fits the local surface of the point cloud to a plane and derives the normals using the least squares approach. The plane equation is assumed to be
cos α · x + cos β · y + cos γ · z + p = 0
ax + by + cz = d (d ≥ 0, a² + b² + c² = 1)
where cos α, cos β, and cos γ are the direction cosines of the normal vector at the point (x, y, z) on the plane, and |p| is the distance from the origin to the plane. Finding the equation of the plane is thus transformed into finding the four parameters a, b, c, and d.
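The following C++/PCL sketch shows how the normal-based stalk extraction could be implemented. It assumes the "direction angle" is measured between each estimated normal and the vertical (z) axis, so that near-vertical stalk surfaces fall around π/2 radians and are retained by the 1–2 rad window used later in Section 2.6.2; the function name, the search radius, and this angle convention are assumptions for illustration rather than the study's exact implementation.

```cpp
#include <algorithm>
#include <cmath>
#include <pcl/point_types.h>
#include <pcl/features/normal_estimation.h>
#include <pcl/search/kdtree.h>

// Keep points whose normal direction angle lies within [minAngle, maxAngle] radians.
// The angle is taken between the estimated surface normal and the vertical (z) axis,
// so near-vertical stalk surfaces (horizontal normals) cluster around pi/2.
pcl::PointCloud<pcl::PointXYZ>::Ptr extractStalkPoints(
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& cloud,
    float minAngle = 1.0f, float maxAngle = 2.0f)
{
    pcl::search::KdTree<pcl::PointXYZ>::Ptr tree(new pcl::search::KdTree<pcl::PointXYZ>);
    pcl::PointCloud<pcl::Normal>::Ptr normals(new pcl::PointCloud<pcl::Normal>);

    // Least-squares plane fit in a local neighborhood, as described above.
    pcl::NormalEstimation<pcl::PointXYZ, pcl::Normal> ne;
    ne.setInputCloud(cloud);
    ne.setSearchMethod(tree);
    ne.setRadiusSearch(0.03);                       // fitting radius (placeholder)
    ne.compute(*normals);

    pcl::PointCloud<pcl::PointXYZ>::Ptr stalks(new pcl::PointCloud<pcl::PointXYZ>);
    for (std::size_t i = 0; i < cloud->size(); ++i)
    {
        const pcl::Normal& n = normals->points[i];
        if (!std::isfinite(n.normal_z))
            continue;
        float nz = std::max(-1.0f, std::min(1.0f, n.normal_z));
        float angle = std::acos(nz);                // angle to the vertical axis, in [0, pi]
        if (angle >= minAngle && angle <= maxAngle)
            stalks->push_back(cloud->points[i]);
    }
    return stalks;
}
```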

2.6. Point Cloud Preprocessing and Segmentation Analysis for Corn Populations

The point cloud of the maize population was preprocessed and segmented in this investigation using the Point Cloud Library (PCL) [21]. PCL is a cross-platform, open-source C++ point cloud template library that originated in the Robot Operating System (ROS) ecosystem and is widely used for point cloud acquisition, filtering, segmentation, retrieval, feature extraction, and visualization.

2.6.1. Corn Population Point Cloud Preprocessing Results

In this study, we performed preprocessing of the maize population point cloud, which included downsampling, statistical filtering, kd-tree construction, and visualization. Figure 4 illustrates the various stages of preprocessing applied to the maize population point cloud derived from a single test set. The figure series shows the transformation of the point cloud data through different stages, as described below. Figure 4a presents a partial view of the large-scale field point cloud generated using CloudCompare 2.9.0 software. It represents the initial, raw point cloud data before any filtering or processing, providing an overview of the entire dataset. Figure 4b depicts the maize target point cloud extracted from the field. It represents the refined version of the point cloud, focusing specifically on the maize population by filtering out irrelevant background information. Figure 4c shows the point cloud after the ground points have been removed, leaving only the points associated with the maize population. The point cloud contains 6,122,494 points, representing the dataset after ground extraction and indicating the initial clean-up of the data. Figure 4d displays the point cloud after statistical filtering has been applied. Statistical filtering reduces noise and removes irrelevant points, resulting in a more accurate dataset. The processed point cloud contains 5,361,509 points. Figure 4e presents the final point cloud after downsampling, which reduces the point count to 29,939. The downsampling process makes the dataset more manageable for further analysis while retaining the key features of the maize population.
Each figure represents a distinct stage in the preprocessing pipeline, clearly demonstrating the evolution of the point cloud as it undergoes various transformations. These steps significantly improve the quality and usability of the maize population point cloud for subsequent analysis.
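A minimal C++/PCL sketch of the preprocessing chain described above (statistical filtering, downsampling, and kd-tree construction) is given below; the filter parameters and the voxel size are placeholders, not the values used in the study.

```cpp
#include <pcl/point_types.h>
#include <pcl/filters/statistical_outlier_removal.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/kdtree/kdtree_flann.h>

// Statistical filtering followed by voxel-grid downsampling; a kd-tree is then
// built on the reduced cloud for later neighborhood queries.
pcl::PointCloud<pcl::PointXYZ>::Ptr preprocess(
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& input)
{
    // Remove sparse outliers (noise from shadows, moving leaves, etc.).
    pcl::PointCloud<pcl::PointXYZ>::Ptr filtered(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::StatisticalOutlierRemoval<pcl::PointXYZ> sor;
    sor.setInputCloud(input);
    sor.setMeanK(50);                 // neighbors used for the mean-distance statistic (placeholder)
    sor.setStddevMulThresh(1.0);      // distance threshold in standard deviations (placeholder)
    sor.filter(*filtered);

    // Downsample with a voxel grid to make the cloud manageable.
    pcl::PointCloud<pcl::PointXYZ>::Ptr downsampled(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::VoxelGrid<pcl::PointXYZ> voxel;
    voxel.setInputCloud(filtered);
    voxel.setLeafSize(0.05f, 0.05f, 0.05f);   // voxel edge length (placeholder)
    voxel.filter(*downsampled);

    // Build a kd-tree over the reduced cloud for subsequent searches.
    pcl::KdTreeFLANN<pcl::PointXYZ> kdtree;
    kdtree.setInputCloud(downsampled);

    return downsampled;
}
```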

2.6.2. Point Cloud Preprocessing and Segmentation Methods

In this study, the point cloud data of maize seedlings were preprocessed and segmented using two primary methods: the color-based region-growing algorithm and the column space approximate segmentation algorithm. These steps were critical for separating individual plants and preparing the data for subsequent analysis.
The point cloud data were first processed using the color-based region-growing algorithm to segment the maize seedlings into individual plants. Key parameters, such as distance threshold, color threshold, and the minimum and maximum number of clustered points, were established. The algorithm performed well in simpler scenarios, achieving clear single-plant segmentation. However, under conditions of dense planting or occlusion, the segmentation results were less effective, requiring additional adjustments to the parameters.
The direction angle of each point in the point cloud was calculated by analyzing the angle between the point’s normal vector and the horizontal plane. This analysis helped to isolate stalk points from leaf points. Points with a direction angle between 1 and 2 radians were retained as part of the stalk, while others were excluded. This threshold-based approach effectively separated stalk point clouds, enabling more accurate plant height extraction.
To address the limitations of the region-growing algorithm in dense field conditions, a column space approximate segmentation algorithm was developed. This method combined the column space segmentation approach with the axis-aligned bounding box (AABB) method to approximate the segmentation of individual plants. By adjusting the edge length of the bounding boxes, the algorithm ensured effective segmentation of the point cloud into single plants. This method is particularly suited for estimating plant height at the field scale, despite the challenges of overlapping plants and shading in dense canopies.

2.7. Methods for Measuring Plant Height and Validation

2.7.1. Measurement of Plant Height in Corn Populations

In this study, approximate single-plant segmentation of the maize population point cloud was achieved as described in the previous sections. Calculating maize single-plant height from the segmented point cloud then only required batch extraction using the AABB (axis-aligned bounding box) enclosing box algorithm. The batch height extraction algorithm was implemented in C++ in Visual Studio 2017 (VS2017). The process flowchart for height extraction is shown in Figure 5.
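The per-plant height computation can be sketched as follows in C++/PCL: for each segmented single-plant cluster, the AABB is obtained with pcl::getMinMax3D and its vertical extent is taken as the plant height. This assumes the ground points have already been removed, as in Section 2.6.1; the function name is hypothetical, and the sketch omits the batch I/O of the actual VS2017 implementation.

```cpp
#include <vector>
#include <pcl/point_types.h>
#include <pcl/common/common.h>   // pcl::getMinMax3D

// For each single-plant cluster, compute the axis-aligned bounding box and
// take its vertical extent (max z - min z) as the plant height.
std::vector<float> extractPlantHeights(
    const std::vector<pcl::PointCloud<pcl::PointXYZ>::Ptr>& plants)
{
    std::vector<float> heights;
    heights.reserve(plants.size());

    for (const auto& plant : plants)
    {
        if (plant->empty())
            continue;

        pcl::PointXYZ minPt, maxPt;
        pcl::getMinMax3D(*plant, minPt, maxPt);   // AABB corners of the cluster

        heights.push_back(maxPt.z - minPt.z);     // plant height in point-cloud units
    }
    return heights;
}
```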

2.7.2. Validation of Measured Data

Due to the large number of maize plants in the field, it was impractical to measure the heights of each plant manually. To validate the accuracy of the extracted point cloud height data, this study compared the extracted data of individual plants with the manually measured height data. The validation process involved the following steps:
(1)
Sort and analyze curve consistency: Both datasets, the extracted point cloud height data and the manually measured height data, were sorted from low to high. The sorted data were then plotted on a curve graph to observe the consistency of their patterns;
(2)
Polar deviation definition and calculation: Polar deviation was defined as a metric to quantify the differences between the corresponding height values in the point cloud dataset and the manually measured dataset. It provides a normalized measure of the differences, allowing for comparison across datasets with varying ranges and distributions. If the curves were consistent, the polar deviation was calculated to assess the degree of similarity between the two datasets. If the curves were inconsistent, the raised parts of the curve were segmented for separate polar deviation calculations;
(3)
Further analysis based on polar deviation: If the polar deviation was close (indicating a high degree of similarity), the average height of both datasets was calculated in the same unit. If the polar deviation was significantly different, the deviation was analyzed in multiples of the unit;
(4)
Correlation analysis: For cases where the point cloud dataset contained more data points than the manually measured dataset, average heights were calculated in multiplier units. Correlation points between the two datasets, which referred to non-null values within the same unit interval, were extracted. After merging the null rows, the calculations were repeated. This process was continued until no further correlation points remained.
The analysis of five test datasets demonstrated that the extreme differences between the point cloud height data and the manually measured data were minimal. Furthermore, the average heights calculated in the same units exhibited a high degree of consistency. These results validated the feasibility and accuracy of the maize population height extraction method using polar deviation as a metric to assess and quantify the consistency between the datasets. This method, based on multi-view geometric 3D reconstruction at the field scale, provides effective methodological support for high-throughput crop phenotype monitoring.
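For illustration, the sketch below computes the comparison metrics used in the validation: the range of a sorted height series (one plausible reading of the "polar deviation" used here, i.e., maximum minus minimum) and the RMSE and R2 reported in Section 3.3. The function names are illustrative, and the polar-deviation interpretation is an assumption.

```cpp
#include <algorithm>
#include <cmath>
#include <numeric>
#include <vector>

// Range of a height series (interpreted here as the "polar deviation": max - min).
double polarDeviation(const std::vector<double>& v)
{
    const double mn = *std::min_element(v.begin(), v.end());
    const double mx = *std::max_element(v.begin(), v.end());
    return mx - mn;
}

// Root mean square error between paired extracted and measured heights.
double rmse(const std::vector<double>& extracted, const std::vector<double>& measured)
{
    double sum = 0.0;
    for (std::size_t i = 0; i < extracted.size(); ++i)
        sum += (extracted[i] - measured[i]) * (extracted[i] - measured[i]);
    return std::sqrt(sum / extracted.size());
}

// Coefficient of determination (R^2) of extracted heights against measurements.
double rSquared(const std::vector<double>& extracted, const std::vector<double>& measured)
{
    const double mean = std::accumulate(measured.begin(), measured.end(), 0.0) / measured.size();
    double ssRes = 0.0, ssTot = 0.0;
    for (std::size_t i = 0; i < measured.size(); ++i)
    {
        ssRes += (measured[i] - extracted[i]) * (measured[i] - extracted[i]);
        ssTot += (measured[i] - mean) * (measured[i] - mean);
    }
    return 1.0 - ssRes / ssTot;
}
```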

3. Results and Discussion

3.1. Plant Height Estimation in Maize Populations

3.1.1. Analysis of Regional Growth Segmentation in Point Cloud Populations of Maize Seedlings

The segmentation of the maize population was achieved using the color-based region growth algorithm. Key parameters such as distance threshold, color threshold, and the minimum and maximum number of clustered points were established. Figure 6 provides a top-down view of the segmentation results for experiment f and experiment g. In trial f, the segmentation successfully isolated the single-plant point clouds, achieving clear boundaries. In contrast, trial g exhibited poor segmentation due to plant occlusion. Under certain circumstances, the segmentation effect can be improved by adjusting the threshold values. In Figure 6, different colors represent different plants or plant clusters during the segmentation process. In Experiment f, where the segmentation was successful, the point clouds of individual plants were clearly separated, with each plant being assigned a distinct color. In Experiment g, due to plant occlusion, the segmentation was suboptimal, leading to multiple plants or clusters being incorrectly grouped together under the same color, resulting in unclear boundaries.
Further segmentation features for individual maize plants were observed, as shown in Figure 7. By adjusting key parameters, Figure 7a illustrates a scenario where single-plant clusters correspond to isolated monocultures with optimal segmentation. In Figure 7b, multiple clusters needed to be combined to represent individual plants. In cases where single-plant clusters were not achieved, the point clouds were manually merged using CloudCompare software to enhance the segmentation quality. In Figure 7, distinct colors are used to represent different clusters of maize plants, where each color corresponds to an individual plant or a group of plants identified by the segmentation algorithm. In Figure 7a, each individual maize plant is assigned a unique color, indicating successful segmentation where the plants are isolated, with clear boundaries between them. In Figure 7b, multiple plant clusters have been incorrectly merged into the same color due to occlusion or similarity in point cloud data, resulting in blurred boundaries and suboptimal segmentation outcomes.

3.1.2. Effect of AABB Enclosing Box Side Length on Height Estimation

In order to study the effect of AABB enclosing box side length on the point cloud segmentation of maize monocots, this study conducted experiments with side lengths set to one, two, five, and the maximum value, respectively, and counted the extracted height data (Table 2). The results show that, as the side length decreases, the total number of extracted bounding boxes increases, and the maximum, minimum, and average height values decrease. Conversely, as the side length increases, the number of extracted bounding boxes decreases, and the height values increase. If the side length is too small, the original single-plant point cloud is over-split, resulting in low extracted heights. If the side length is too large, the multi-plant point clouds merge, leading to higher extracted heights.
Figure 8 shows the histograms of maize height extracted using bounding boxes with edge lengths of one, two, and five. The histograms indicate that the extracted heights with different side lengths exhibit similar curve patterns, but the distribution and frequency of the values differ. When the side length is two, the range of extracted heights more closely matches the number of single plants in the field, as it can approximately encompass the single-plant point cloud. This results in extracted heights that better simulate the actual height of maize plants. Therefore, subsequent studies uniformly set the AABB enclosing box edge length to two to ensure the reasonableness and accuracy of the results.

3.1.3. Normal Vector Analysis and Column Space Approximate Segmentation Algorithm

The column space approximate segmentation algorithm was employed to segment maize stalks from the point cloud and improve height estimation accuracy. By analyzing the normal vectors, points with direction angles between one and two radians were identified as stalk points, effectively excluding leaf points and irrelevant data.
Figure 9 presents a histogram of the direction angle distribution, where the horizontal coordinate represents the angle in radians and shows a near-normal distribution. Based on these thresholds, Figure 10 illustrates the segmented point cloud with the stalk points preserved. Figure 10a provides an overall view, while Figure 10b,c show detailed close-ups.
Using the column space approximation approach combined with AABB segmentation, single-plant point clouds were extracted for subsequent height measurements. Figure 11 demonstrates this segmentation process, with Figure 11c showing the single-plant regions after segmentation. This segmentation algorithm was particularly effective in dense canopies with overlapping plants, achieving high precision in isolating individual maize stalks.

3.1.4. Distribution of Extracted Heights

In this study, the population point cloud was segmented into AABB enclosing boxes containing single-plant point clouds, and their heights were extracted. For statistical analysis and comparative validation, the extracted height data were multiplied by 100 to convert the measurements to centimeters. The experiment consisted of five groups, namely a, b, c, d, and e, with 765, 775, 805, 875, and 889 extracted height values, respectively.
A box plot was used to describe the distribution of the data for each test. The box plots show the overall data distribution through statistics such as the median, quartiles, and upper and lower bounds, and identify outliers (if present). As shown in Figure 12, the extracted height data for trials a, b, and c are distributed between 0 and 150 cm, with trials b and c concentrated between 0 and 125 cm, trial d between 0 and 150 cm, and trial e between 0 and 175 cm. The absence of outliers indicates the high quality of the extracted point cloud. Outliers, when present, may be caused by missing boundary point clouds (if smaller than the lower bound) or by a very small number of exceptionally tall plants (if larger than the upper bound). In the box plot, triangles represent outliers, i.e., data points outside the box range, above the upper bound or below the lower bound; these can be extreme values due to data errors or exceptional cases. The lines represent the upper and lower quartiles (the boundaries of the box) and the median (the horizontal line inside the box).
The statistics in Table 3 show that the average heights of trials a, b, and c were about 55 cm, trial d about 78 cm, and trial e about 105 cm. These results reflect the differences in the maize growth periods. Trials a, b, and c had similar growth stages. Trial d was later, and trial e was the latest, which aligns with the actual growth situation. A histogram (Figure 13) was plotted using the five sets of data, revealing distinct differences in maize height distributions across the groups. Trial a was concentrated at 30 cm and 75 cm, with trial b at 37 cm and 70 cm, trial c at 40 cm and 70 cm, trial d at 75 cm and 112 cm, and trial e at 75 to 150 cm.

3.2. Field-Measured Data Collection and Analysis

According to Table 4 and Figure 14, the manually measured maize plant heights across the five experimental groups showed distinct distribution patterns. The plant heights in trials a, b, and c were concentrated in similar ranges, around 129–160 cm for trial b and 120–160 cm for trial c, indicating similar growth stages. Trial d had an average plant height of about 167 cm, with a distribution range of 140–200 cm, suggesting a longer growing period compared to trials a, b, and c. Trial e had an average height of approximately 204 cm, with a distribution range of 200–220 cm, indicating a significantly longer growing period than the other trials.
Combined with the histograms of point cloud-extracted height data from each experiment, the results show that the distribution curve of point cloud-extracted maize heights closely matches the manually measured heights. This further validates that the point cloud extraction method, based on multi-view geometric 3D reconstruction, can accurately reflect maize plant heights with high measurement accuracy and field applicability.

3.3. Analysis of Results

In this study, the feasibility and accuracy of extracting maize plant height at the field scale from point cloud data were verified through a correlation analysis with the manually measured height data.

3.3.1. Analysis of the Results of the Single-Trial Group

From the data analysis of 215 correlation points across five experimental groups (a, b, c, d, e), the data comparison results are shown in Figure 15. A linear regression analysis indicated that the extracted point cloud height data were highly correlated with the manually measured height data, with an R2 ranging from 0.9880 to 0.9989, averaging 0.9943. The root mean square error (RMSE) ranged from 0.0148 m to 0.029 m, with an average of 0.0222 m (as shown in Table 5). The fitting line was nearly parallel to the 1:1 line, indicating a high degree of compatibility and superior accuracy of the extraction method compared to other plant height measurement methods. The k value of the fitted line parameters was close to one, indicating model stability. However, the estimated point cloud heights were consistently lower than the manually measured heights, which can be attributed to several factors. The limitations of the point cloud extraction method, including occlusion, uneven plant surfaces, and resolution issues, may prevent an accurate capture of the true plant height. Additionally, environmental factors such as lighting, plant positioning, sensor calibration, and variability in the plant surface structure (e.g., leaves, branches, and stalk curvature) can further affect the accuracy of height extraction. Manual measurement errors, such as difficulties in locating the plant top, also contribute to these discrepancies. The differences in b values reflected error sources in the various trial groups, including measurement environment, corn surface variability, and calculation errors. Environmental factors such as lighting conditions (e.g., overexposure or shadowing) and wind speed may have introduced variability during UAV data collection, leading to slight deviations in reconstruction quality. For example, high wind speeds could cause UAV instability, affecting the alignment of multi-view images and the accuracy of the point cloud reconstruction. To minimize the impact of these factors, several strategies were implemented in this study:
(1)
Environmental control during data collection: UAV flights were conducted during the early morning or late afternoon to ensure stable lighting conditions and avoid overexposure caused by midday sunlight. Flights were avoided during strong wind conditions to ensure UAV stability. Additionally, the flights were timed to avoid periods of high humidity, such as early mornings with dew or afternoons with elevated soil evaporation;
(2)
Flight path optimization: A tic-tac-toe flight pattern was used to ensure a 70–80% image overlap. This overlap minimized inconsistencies caused by variations in wind speed or sunlight conditions across the images;
(3)
Post-processing and noise correction: During point cloud processing, noise filtering techniques, such as statistical outlier removal and aggressive noise filtering in Pix4Dmapper, were applied to correct for irregularities caused by plant movement or lighting variations;
(4)
Validation against ground-truth data: UAV-derived height measurements were validated against manually measured ground-truth data to quantify and adjust for any deviations introduced by environmental factors.
Although these measures effectively reduced the impact of environmental factors, some sources of error remained, such as image misalignment caused by high wind speeds or reduced point cloud resolution due to uneven lighting conditions. These issues may result in slight decreases in accuracy under specific conditions. However, the overall results demonstrate that this method is highly applicable and robust for field-scale height estimation in maize populations.
Overall, the results confirm the feasibility and accuracy of extracting point cloud heights for individual maize plants at the field scale. However, certain environmental challenges, such as varying light intensities or weather conditions, may still require adjustments to preprocessing or data acquisition techniques.

3.3.2. Analysis of the Results of the Composite Test Group

The five experimental groups in this study were conducted in two fields and three different maize plant environments, referred to as fields 1 and 2 and maize plants a, b, and c. The composite test groups are specified as shown in Table 6.
The composite test group involved two fields and three different maize plant environments, consisting of four sets of trials: trials 1 and 2, conducted with the same maize plant in the same field; trial 3, conducted with a different maize plant in the same field; and trial 4, conducted with a different plant in a different field, as shown in Figure 16 and Table 7.
For tests 1 and 2, conducted with the same maize plants in the same field, the R2 values were all above 0.99 and the RMSE was below 0.0274 m, indicating that the data were extremely correlated and that the error was smaller than that of a single test group. For test 3, with different maize plants in the same field, the R2 was 0.91 and the RMSE was 0.121 m; the correlation was weaker than that of tests 1 and 2 but still maintained high accuracy, showing that the model has high feasibility. For test 4, with different plants in different fields, the R2 was lowest at 0.8884 and the RMSE largest at 0.1692 m; a strong correlation remained, but the error was at the decimeter level, indicating that the model is feasible under these conditions, although with a larger error.
The main reasons for the correlation differences include the effects of environmental and temporal differences (e.g., light, wind speed, etc.) on the point cloud reconstruction, and the error superposition effect under different test conditions. Although the correlation and precision of the composite test group were lower than those of the single test group, the correlation and precision were close to those of the single test group under the condition of the same maize plant in the same field, which indicated that the method of this study was highly reliable for field application under specific conditions. For instance, during the field tests conducted under strong sunlight, overexposure may have caused a loss of fine details in the images, reducing point cloud resolution. Similarly, dense canopy structures in later maize growth stages, such as at maturity, may introduce shading effects that hinder the accurate segmentation of individual plants. Conversely, in early growth stages with sparser canopies, noise introduced by weeds or soil may impact segmentation accuracy. These factors demonstrate that the method’s accuracy can vary depending on the environmental conditions and maize growth stages.
Future research could explore integrating multispectral data to improve plant segmentation accuracy in dense canopy conditions or incorporating LiDAR data to reduce errors in height estimation caused by shading and occlusion. Further optimization of preprocessing algorithms, such as advanced image alignment and filtering techniques, may also enhance point cloud quality under challenging environmental conditions. In addition, we compared the proposed column space approximation segmentation algorithm with commonly used segmentation techniques, such as the region-growing algorithm and Euclidean clustering, to further validate its effectiveness.
The region-growing algorithm, which segments point clouds based on curvature and normal vector consistency, performs well for sparse datasets but tends to merge multiple plants into a single cluster in densely planted or shaded environments. Similarly, Euclidean clustering is effective in simple geometric arrangements but struggles to separate overlapping plants in field populations. By contrast, the proposed column space approximation algorithm leverages normal vector orientation and AABB bounding box constraints to achieve more accurate segmentation, even in challenging field conditions. For example, in shaded areas with overlapping canopies, the region-growing algorithm often fails to isolate individual maize plants, where it produces merged clusters. In comparison, the proposed algorithm successfully segmented distinct point clouds for each plant. Quantitatively, the proposed method achieved higher segmentation accuracy under dense canopy conditions, with the accuracy exceeding 90%, compared to approximately 80% for the region-growing algorithm and 75% for Euclidean clustering. Furthermore, the computational efficiency of the proposed algorithm was significantly higher, reducing processing time while maintaining segmentation accuracy.
Despite these advantages, the proposed algorithm may encounter challenges in cases of extreme plant morphology irregularity, or substantial environmental noise. Future research could integrate multispectral or LiDAR data to enhance segmentation robustness, particularly in complex environments. Moreover, refining preprocessing algorithms, such as advanced filtering techniques, could further improve point cloud quality and reduce errors caused by shading or occlusion.
In summary, the maize single-plant height extracted based on point cloud data has strong feasibility and accuracy at the field scale. However, the study highlights the need for further improvements to address the challenges posed by varying environmental conditions (e.g., lighting and wind) and maize growth stages (e.g., dense canopies and shading). Incorporating additional data sources like LiDAR or multispectral imagery and optimizing the segmentation process could further improve the method’s robustness and applicability in diverse field scenarios.

4. Conclusions

In this study, we conducted a systematic investigation of plant height measurement for maize populations at the field scale using the UAV-based structure from motion–multi-view stereo technique, and reached the following main conclusions:
(1)
Data acquisition and 3D reconstruction: We designed an image acquisition method suitable for maize populations in the field, utilized a drone to capture multi-view image sequences, and successfully reconstructed a dense 3D point cloud using the SFM-MVS method. This established a solid foundation for point cloud segmentation and plant height measurement;
(2)
Point cloud processing and segmentation: Through point cloud pre-processing consisting of cropping, filtering, ground removal, and down-sampling, we proposed a column space approximation segmentation algorithm. This effectively addresses the segmentation challenges posed by plant shading and the branching complexity of maize populations in the field, enabling high-precision segmentation of individual plants;
(3)
Plant height measurement and modeling: We proposed a plant height extraction method based on a bounding box and verified the high degree of consistency between the plant height extracted from the point cloud and the manually measured data through linear regression analysis. In multi-group tests, the average fitting coefficient (R2) for single tests reached 0.9943, with a root mean square error (RMSE) of less than 0.029 m. For the composite tests, the average R2 was 0.9591, and the average RMSE was 0.1014 m. These results indicate that the method exhibits excellent measurement accuracy and applicability.
In summary, the point cloud processing and segmentation method based on multi-view geometric 3D reconstruction technology provides a feasible approach for the efficient and accurate measurement of phenotypic parameters of crop populations at the field scale. This study offers technical support for developing maize plant height estimation models and advancing digital agriculture and contributes to the promotion of unmanned crop phenotype monitoring systems in agriculture.
Moreover, the findings of this study provide important insights for the design of future unmanned crop monitoring systems. As to the hardware requirements, the results underscore the importance of using UAVs equipped with high-resolution cameras and stable flight systems to ensure accurate data collection under varying environmental conditions. Future systems may also benefit from incorporating advanced sensors such as LiDAR or multispectral cameras for enhanced data quality in dense canopy conditions or complex terrains. As to the operational protocols, the findings emphasize the need for optimized flight path planning (e.g., ensuring adequate image overlap and appropriate flight altitudes) and robust data-processing algorithms to handle environmental variability. These protocols can be adapted to suit different geographical regions with distinct environmental challenges.
By addressing hardware and operational requirements, this study lays the groundwork for designing more reliable and adaptable unmanned crop monitoring systems, thereby advancing precision agriculture globally.

Author Contributions

Conceptualization, H.Z. and N.L.; methodology, J.X.; software, L.C.; validation, H.Z., N.L., and J.X.; formal analysis, L.C.; investigation, N.L.; resources, S.C.; data curation, L.C.; writing—original draft preparation, H.Z.; writing—review and editing, H.Z.; visualization, N.L.; supervision, S.C.; project administration, N.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Guangdong Provincial Youth Innovation Talent Program, grant number 2021KQNCX121, and the Guangzhou College of Commerce Research Project, grant number 2021XJYB05.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

The authors would like to acknowledge the support of this study provided by the Guangdong Provincial Youth Innovation Talent Program and the Guangzhou College of Commerce Research Project.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichhorn, K.; Bareth, G. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 2014, 6, 10395–10412.
2. Araus, J.L.; Kefauver, S.C.; Vergara-Díaz, O.; Gracia-Romero, A.; Rezzouk, F.Z.; Segarra, J.; Buchaillot, M.L.; Chang-Espino, M.; Vatter, T.; Sanchez-Bragado, R.; et al. Crop phenotyping in a context of global change: What to measure and how to do it. J. Integr. Plant Biol. 2022, 64, 592–618.
3. Ninomiya, S. High-throughput field crop phenotyping: Current status and challenges. Breed. Sci. 2022, 72, 3–18.
4. Mulla, D.J. Twenty-five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst. Eng. 2013, 114, 358–371.
5. Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X.; et al. Unmanned Aerial Vehicle Remote Sensing for Field-Based Crop Phenotyping: Current Status and Perspectives. Front. Plant Sci. 2017, 8, 1111.
6. Li, D.; Wang, Z.; Zhang, J. Estimation of crop height using UAV-based RGB and LiDAR data fusion. Remote Sens. 2020, 12, 865.
7. Peña, J.M.; Torres-Sánchez, J.; Castro, A.I.D.; Kelly, M.; López-Granados, F. Weed Mapping in Early-Season Maize Fields Using Object-Based Analysis of Unmanned Aerial Vehicle (UAV) Images. PLoS ONE 2013, 8, e77151.
8. Bendig, J.; Yu, K.; Aasen, H.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
9. Maes, W.H.; Steppe, K. The science of plant phenotyping: From roots to shoots. Front. Plant Sci. 2019, 10, 240.
10. Presti, D.L.; Di Tocco, J.; Massaroni, C.; Cimini, S.; De Gara, L.; Singh, S.; Raucci, A.; Manganiello, G.; Woo, S.L.; Cinti, S. Current understanding, challenges and perspective on portable systems applied to plant monitoring and precision agriculture. Biosens. Bioelectron. 2023, 222, 115005.
11. Mishra, A.; Soni, M.; Gupta, R. A review of Structure from Motion (SFM) algorithms for 3D reconstruction. Int. J. Comput. Appl. 2013, 71, 13–18.
12. Holman, F.H.; Riche, A.B.; Michalski, A.; Castle, M.; Wooster, M.J.; Hawkesford, M.J. High Throughput Field Phenotyping of Wheat Plant Height and Growth Rate in Field Plot Trials Using UAV-Based Remote Sensing. Remote Sens. 2016, 8, 1031.
13. Hassler, S.C.; Baysal-Gurel, F. Unmanned Aircraft System (UAS) technology and applications in agriculture. Agronomy 2021, 9, 618.
14. Nielsen, K.M.; Duddu, H.S.; Bett, K.E.; Shirtliffe, S.J. UAV image-based crop growth analysis of 3D-reconstructed crop canopies. Plants 2022, 11, 2691.
15. Tunca, E.; Köksal, E.S.; Taner, S.E.; Akay, H. Crop height estimation of sorghum from high resolution multispectral images using the structure from motion (SfM) algorithm. Int. J. Environ. Sci. Technol. 2024, 21, 1981–1992.
16. Watanabe, K.; Guo, W.; Ninomiya, S. Modeling and estimation of plant height using photogrammetry and machine learning. Front. Plant Sci. 2017, 8, 2002.
17. Haug, S.; Michaels, A.; Schmitt, M. Crop height and density estimation using UAV-based RGB imagery. ISPRS J. Photogramm. 2014, 96, 149–162.
18. Chroni, A.; Vasilakos, C.; Christaki, M.; Soulakellis, N. Fusing Multispectral and LiDAR Data for CNN-Based Semantic Segmentation in Semi-Arid Mediterranean Environments: Land Cover Classification and Analysis. Remote Sens. 2024, 16, 2729.
19. Xiao, K.; Qian, J.; Li, T.; Peng, Y. Multispectral LiDAR Point Cloud Segmentation for Land Cover Leveraging Semantic Fusion in Deep Learning Network. Remote Sens. 2023, 15, 243.
20. Li, D.; Cao, Y.; Tang, X.S.; Yan, S.; Cai, X. Leaf Segmentation on Dense Plant Point Clouds with Facet Region Growing. Sensors 2018, 18, 3625.
21. Huang, Y.; Lan, Y.; Wu, W.; Thomson, S.J. Spatial variability analysis of crop height based on LiDAR and multispectral UAV imagery. Precis. Agric. 2021, 22, 999–1015.
22. Zhou, Z.; Guo, X.; Wang, J. A review of LiDAR-based crop height measurement. Comput. Electron. Agric. 2021, 181, 105965.
23. Schirrmann, M.; Giebel, A.; Pflanz, M.; Löpmeier, F.J.; Dammer, K.H.; Bareth, G. Monitoring agronomic parameters of winter wheat crops with low-cost UAV imagery. Remote Sens. 2016, 8, 706.
24. Roth, L.; Streit, U. Analysis of plant growth using segmentation algorithms for high-throughput phenotyping. Plant Methods 2019, 15, 5.
25. Sun, Q.; Wang, H.; Wang, S. UAV-based photogrammetric modeling of plant height and its potential for yield prediction. Agronomy 2021, 11, 758.
26. Zermas, D.; Morellas, V.; Mulla, D.; Papanikolopoulos, N. 3D model processing for high throughput phenotype extraction: The case of corn. Comput. Electron. Agric. 2020, 172, 105465.
27. Singh, R.; Mosavi, A. A review on algorithms for segmentation of point clouds from 3D scans. Comput. Electron. Agric. 2020, 174, 105465.
28. Rusu, R.B.; Cousins, S. 3D is here: Point Cloud Library (PCL). In Proceedings of the IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 1–4.
Figure 1. Aerial views of the experimental maize test fields. (a) Test Plot 1: covers an area of approximately 0.3 hectares, and (b) Test Plot 2: covers an area of approximately 0.2 hectares. Images were captured on day 30 after maize emergence using UAV-based multi-view imaging.
Figure 2. UAV planning diagram.
Figure 3. The flowchart of corn single-plant height estimation based on column space segmentation.
Figure 4. Point cloud preprocessing.
Figure 5. Flowchart of height extraction.
Figure 6. Algorithm segmentation effect.
Figure 7. Clustering of different experimental monocultures.
Figure 8. Histogram of estimated maize plant height using the point cloud extraction method for different sizes of bounding boxes.
Figure 9. Histogram of direction angle.
Figure 10. Normal vector point cloud map.
Figure 11. Example diagram of the column space algorithm.
Figure 12. Box plot of estimated height data for each test.
Figure 13. Histogram of estimated maize plant height data for five sets of tests using the point cloud extraction method. The heights for each test (a–e) were derived from the processed point cloud data generated by the multi-view reconstruction method.
Figure 14. Histogram of manually measured maize plant height data for five sets of tests. The heights for each test (a–e) were obtained through manual measurement, serving as a ground-truth comparison to the point cloud estimation.
Figure 15. Comparative validation plot of plant height for the five experimental groups.
Figure 16. Validation plot of plant height comparison for composite test groups.
Table 1. Composite group.

Composite Test Group | Maize Plant Groups Included | Remarks
1 | b, c | Same corn plants in the same field
2 | d, e | Same corn plants in the same field
3 | a, b, c | Different maize plants in the same field
4 | a, b, c, d, e | Different plants in different fields
Table 2. Summary statistics of estimated corn plant heights using the point cloud extraction method.

Bounding Box Side Length | Total | Maximum Value | Minimum Value | Average Value
1 | 3319 | 1.6657 | 0.0367 | 0.6505
2 | 889 | 1.7429 | 0.0864 | 1.0506
5 | 165 | 2.1483 | 0.1770 | 1.3676
greatest | 1 | 2.4706 | 2.4706 | 2.4706
Table 3. Summary statistics of estimated height data for each test.

Test No. | Total | Minimum Value | Maximum Value | Average | Range
a | 765 | 2.5787 | 139.7850 | 55.8984 | 137.2063
b | 775 | 2.3823 | 118.7557 | 55.1277 | 116.3734
c | 805 | 4.0375 | 124.8680 | 55.8835 | 120.8305
d | 872 | 5.4728 | 143.9440 | 78.7779 | 138.4712
e | 889 | 8.6357 | 174.2878 | 105.1655 | 165.6521
Table 4. Statistics of measured heights for the five groups of tests.

Test No. | Total | Minimum Value | Maximum Value | Average | Range
a | 86 | 80.0 | 232.2 | 150.3471 | 152.2
b | 64 | 75.6 | 164.0 | 129.0714 | 88.4
c | 60 | 64.7 | 178.1 | 134.2102 | 113.4
d | 60 | 109.7 | 204.5 | 167.7627 | 94.8
e | 60 | 156.6 | 232.1 | 204.43 | 75.5
Table 5. Linear statistics for the five groups of tests.

Test No. | Associated Points | Equation for Estimating Plant Height | R2 | RMSE
a | 59 | 1.023x + 73.71 | 0.9989 | 0.0163
b | 42 | 0.9183x + 56.87 | 0.9909 | 0.0283
c | 41 | 1.031x + 49.57 | 0.9984 | 0.0148
d | 38 | 1.017x + 89.23 | 0.9952 | 0.0226
e | 39 | 0.9845x + 93.56 | 0.9880 | 0.0290

Note. x corresponds to the manually measured plant height.
Table 6. Distribution of composite test groups.

Test No. | Field Number | Corn Plant Number
a | 1 | a
b | 1 | b
c | 1 | b
d | 2 | c
e | 2 | c
Table 7. Linear statistics for the composite test group situation.

Test No. | Associated Points | Equation for Estimating Plant Height | R2 | RMSE
a | 59 | 1.023x + 73.71 | 0.9989 | 0.0163
bc | 83 | 0.9842x + 52.63 | 0.99347 | 0.0274
de | 77 | 1.017x + 89.57 | 0.9956 | 0.0265
abc | 142 | 0.9825x + 62.74 | 0.91889 | 0.1210
abcde | 215 | 1.118x + 61.79 | 0.8884 | 0.1692

Note. x corresponds to the manually measured plant height.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
