Article

Mapping Gaps in Sugarcane by UAV RGB Imagery: The Lower and Earlier the Flight, the More Accurate

by Marcelo Rodrigues Barbosa Júnior *, Danilo Tedesco, Rafael de Graaf Corrêa, Bruno Rafael de Almeida Moreira, Rouverson Pereira da Silva and Cristiano Zerbato

Department of Engineering and Mathematical Sciences, School of Veterinarian and Agricultural Sciences, São Paulo State University (Unesp), Jaboticabal, São Paulo 14884-900, Brazil

* Author to whom correspondence should be addressed.
Agronomy 2021, 11(12), 2578; https://doi.org/10.3390/agronomy11122578
Submission received: 9 November 2021 / Revised: 14 December 2021 / Accepted: 16 December 2021 / Published: 18 December 2021
(This article belongs to the Special Issue Imaging Technology for Detecting Crops and Agricultural Products)

Abstract
Imagery data prove useful for mapping gaps in sugarcane. However, if the quality of the data is poor or the timing of the flight is incompatible with crop phenology, prediction becomes inaccurate. We therefore analyzed how the combination of pixel size (3.5, 6.0 and 8.2 cm) and plant height (0.5, 0.9, 1.0, 1.2 and 1.7 m) affects the mapping of gaps from unmanned aerial vehicle (UAV) RGB imagery. Both factors significantly influenced mapping: the larger the pixel or the plant, the less accurate the prediction. Errors were most likely in regions of the field where actively growing vegetation overlapped 0.5-m gaps; even the 3.5-cm pixel did not capture them. Overall, the combination of 3.5-cm pixels and 0.5-m plants outperformed all others, providing the most accurate (absolute error ~0.015 m) solution for remote mapping of gaps. Our insights provide timely, forward-looking knowledge on flying a UAV to map gaps, and will enable producers to make decisions on replanting and fertilizing using site-specific, high-resolution imagery data.


1. Introduction

Mapping gaps in sugarcane is instrumental for its sustainable production on an industrial scale, as it supports producers' decisions on replanting and fertilizing [1,2,3]. The method by Stolf [4] consists of measuring, every 0.5 m along a 100-m sampling line, the spaces between stalks larger than 0.5 m. This pioneering approach is easy to set up and conduct. However, it is labor-intensive, time-consuming and often requires an expert (subjectivity) for accuracy and reliability. Another limitation is scale: extensive areas often make its application unfeasible [5]. The development of an alternative is therefore necessary and challenging, and a cost-effective option to map gaps in sugarcane would be remote sensing with UAVs.
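Stolf's field procedure can be summarized computationally. The sketch below is illustrative only (function name, sample positions and the simplification of skipping the 0.5-m discretization step are ours, not from the paper): spaces between consecutive stalks larger than 0.5 m count as gaps, and the gap percentage is their summed length relative to the line length.

```python
def stolf_gap_percentage(stalk_positions, line_length=100.0, min_gap=0.5):
    """Return (gap lengths, gap percentage) for stalk positions (m) along a row."""
    positions = sorted(stalk_positions)
    # Include the row start and end so boundary gaps are counted too.
    points = [0.0] + positions + [line_length]
    gaps = [b - a for a, b in zip(points, points[1:]) if b - a > min_gap]
    return gaps, 100.0 * sum(gaps) / line_length

# Hypothetical 10-m row with stalks at five positions (m).
gaps, pct = stolf_gap_percentage([0.3, 0.6, 2.1, 2.4, 5.0], line_length=10.0)
```

In this toy row, three spaces exceed 0.5 m and the gap percentage is 91% of the line, which illustrates why such an index drives replanting decisions.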
UAVs are the next generation of data-acquisition platforms for agriculture. They can capture information about an object with greater richness of detail than satellites [6,7,8,9,10,11] and, unlike orbital sensors, they are not dependent on weather (e.g., cloudiness). Numerous agricultural applications exist for UAVs; the literature focuses on counting plant populations [12,13,14,15], predicting yield [16,17,18,19] and monitoring evaporative or transpiratory streams [20,21,22,23] in croplands. The operation is automatic and can save time, labor and costs. UAVs are versatile and can improve the cost-effectiveness of commercially producing sugarcane [1,2,3]. Plainly, UAVs prove useful for sugarcane production.
The contemporary literature on mapping gaps in sugarcane includes both UAV and non-UAV solutions. A non-UAV approach by Molin and Veiga [5] operates a ground-level photoelectric sensor. The device fits easily on the front wheel of a tractor and uses an infrared light-emitting diode (IR LED) system to detect and measure stalks, sending real-time information to an onboard data logger. A global navigation satellite system (GNSS) transmits signals to the device, making it possible to georeference the data. The method enables sampling at full scale and is therefore an advance over Stolf [4]. A clustering method by Luna and Lobo [1] also maps gaps in sugarcane-producing areas from UAV imagery. It differentiates image regions containing plants from those containing soil, and the clustering output feeds a discriminant analysis that classifies the pattern adequately (RMSE ~5.05) through predictive equations for continuous independent variables.
The contribution of Souza et al. [2] to remotely mapping gaps in sugarcane is an object-detection analytical approach. The method processes UAV imagery in the visible (R and G) and near-infrared (NIR) regions of the spectrum and achieves an accuracy of about 0.35 m. An onboard ultrasonic sensor can map gaps in sugarcane as accurately as a UAV, but it is not as competent as an onboard photoelectric sensor [3]; the RMSE for predicting gap length from photoelectric-sensor data is about 0.20 m. That technology can outperform both the ultrasonic sensor and vegetation indices in mapping gaps. However, it requires an agricultural machine to operate, potentially making the project costly. If guidance is manual, the tractor can damage ratoons by trampling. Autosteering with 'just-in-time' telemetric correction by a real-time kinematic (RTK) or real-time eXtended (RTX) device could compensate for driver errors, but it further adds to costs. A UAV can monitor the field without touching the crop and, most notably, without inducing intensive traffic that might force plants to not grow or to grow irregularly. To the best of our knowledge, however, no in-depth investigation exists on whether pixel size or plant height interferes with the accuracy of predicting gaps in sugarcane from UAV RGB imagery, which emphasizes the need for further systematic studies and confirmatory experiments.
Therefore, to advance research on remotely mapping gaps in sugarcane, we analyzed how pixel size and plant height impact their prediction from UAV RGB imagery. A secondary objective was to quantitatively analyze whether gap length influences imaging accuracy.

2. Materials and Methods

2.1. Site

We conducted our study in an open field of 10 ha at 21°14′29″ S and 48°17′18″ W, São Paulo, Brazil. The regional climate is tropical, with rainy summer and dry winter [24]. The soil is an Oxisol with predominantly clayey texture and 0–8% relief [25]. The cultivar of sugarcane was CTC-4. The inter-row spacing for its planting was 1.4 m in order to resemble conditions at an industrial scale.

2.2. Aerial Platform and Sensor

The aerial platform for remote sensing was a UAV (DJI Inspire 1, Shenzhen, China). The UAV is powered by a battery with a flight autonomy of 20 min. It can operate at up to 4500 m of altitude and 5 km of horizontal distance from the user, and it is stable against winds of up to 36 km h−1. The UAV has a GNSS receiver, which guarantees vertical and horizontal precision of 0.5 and 2.5 m, respectively. The flights were performed automatically at midday (±1 h) using an application (DroneDeploy, San Francisco, CA, USA) on a tablet.
An RGB sensor (DJI Zenmuse X3 camera—FC350, Shenzhen, China) was coupled to the UAV. The camera is stabilized by a 3-axis gimbal and has 12 megapixels of resolution, a focal length of 20 mm (35-mm format equivalent) and a 1/2.3″ CMOS sensor. The images were captured at a shutter speed of 1/8000 s to avoid errors due to vibration; exposure and focus were configured automatically in the flight-plan application, and the images were stored on a memory card inserted in the camera. The images were captured with a front overlap of 75%, a side overlap of 70% and a flight speed of 7.0 m/s. Flight altitudes of 80, 150 and 200 m were used to image the field at corresponding pixel sizes of 3.5, 6.0 and 8.2 cm.
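The pixel size (ground sample distance, GSD) scales linearly with flight altitude. As a rough cross-check, the sketch below computes the GSD from assumed physical specifications typical of a 12-MP 1/2.3″ sensor (≈6.17 mm sensor width, ≈4000 px image width, ≈3.61 mm true focal length); these values are our assumptions, not specifications stated in the paper.

```python
def ground_sample_distance(altitude_m, focal_mm=3.61,
                           sensor_width_mm=6.17, image_width_px=4000):
    """Approximate GSD (cm/pixel) for a nadir image at the given altitude."""
    gsd_m = (altitude_m * sensor_width_mm) / (focal_mm * image_width_px)
    return 100.0 * gsd_m

# The three flight altitudes used in the study.
gsd_by_altitude = {h: round(ground_sample_distance(h), 1) for h in (80, 150, 200)}
```

With these assumed specs the function returns roughly 3.4, 6.4 and 8.5 cm for 80, 150 and 200 m, close to the reported 3.5, 6.0 and 8.2 cm; small differences follow from the exact sensor parameters.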

2.3. On-Field Sampling and Biometrics

We performed the first flight, together with height measurements, on 28 September 2019, to capture spectral and morphological data from the crop after sprouting leveled off at the 30th day of cultivation. To complete data acquisition and to broaden and strengthen our approach, we also surveyed the field at the 45th, 60th and 75th days after harvest, corresponding to mid-stage tillering, and finally at the 90th day of cultivation to study canopy changes at the latest stage of elongation. Actively growing plants at the boundary of the target region were biometrically assessed for height, expressed in centimeters, to establish any effect of the vegetation on the UAV's detection ability: such plants might structurally cover the gaps, making it difficult for the aerial platform to detect them.
Gaps of 0.5, 1.0, 1.5, 2.0 and 2.5 m were analyzed, with five replicates to reduce systematic errors. They were artificially created in the field by manually removing seedlings from the planted lines, and were then marked with visible targets for identification by the aerial platform (Figure 1). The range of gaps was based on records from commercial fields and, most importantly, on the literature: gaps of 0.5–2.5 m are frequent in sugarcane-producing areas in Brazil [4]. Studies on determining gaps in sugarcane as a function of UAV RGB image resolution and phenological condition are seldom available in the literature, so further in-depth investigation of the impact of pixel size and plant height on mapping is necessary. Pixel size is proportional to altitude; since the distance of the sensor from the ground changes the pixel's properties, the takeoff point for the series of flights was standardized to prevent bias. Figure 2 illustrates the impact of pixel size and phenology on the resolution of 2.5-m gaps.

2.4. Processing of Image for Identification of Gaps

Processing of the original images in the photogrammetry software DroneDeploy (San Francisco, CA, USA) started with alignment, which translates homologous information from multiple overlapping images into a point cloud. The second step interpolated the point cloud into a digital elevation model used to build an orthomosaic map for gap identification by the artificial intelligence of the commercial software Inforow (Piracicaba, Brazil). The algorithm processes single-band orthomosaic maps and iteratively identifies and distinguishes plants from soil by grouping and segmenting low-intensity pixels according to chromatic similarity and topological proximity, without bias or computational unfeasibility. After identification, segments with information on the dimension and geographical position of gaps were converted into vector files and organized by the date of on-field sampling for further validation.
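Inforow is proprietary and its algorithm is not public. Purely to illustrate the general plant/soil segmentation idea described above, here is a minimal sketch using the Excess Green index, a common vegetation/soil discriminator in the RGB literature; it is our assumption for illustration, not necessarily what Inforow uses.

```python
import numpy as np

def vegetation_mask(rgb, threshold=0.1):
    """Boolean mask of vegetation pixels via the Excess Green index,
    ExG = 2g - r - b, computed on chromaticity-normalized channels."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=-1) + 1e-9          # avoid division by zero on black pixels
    r, g, b = (rgb[..., i] / total for i in range(3))
    return (2 * g - r - b) > threshold

# Toy 1x2 image: one green (plant-like) and one brown (soil-like) pixel.
img = np.array([[[40, 120, 30], [120, 90, 60]]], dtype=np.uint8)
mask = vegetation_mask(img)                  # True for the green pixel only
```

Inverting such a mask along a crop row yields candidate soil segments, whose lengths and positions correspond to the gap vectors described above.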

2.5. Validation of Protocol

A linear regression analysis was performed to test our hypothesis on the impact of pixel size and plant height on the mapping of gaps. The length of the real gap was plotted on the x-axis as the independent variable against the length of the corresponding digital pattern on the y-axis as the dependent variable. The metrics for analyzing the precision and accuracy of the model were the coefficient of determination (R²; Equation (1)) and the mean absolute error (MAE; Equation (2)), respectively:
$$R^2 = \frac{\sum_{i=1}^{n} (\hat{y}_i - \bar{y})^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2} \quad (1)$$

$$\mathrm{MAE} = \frac{1}{n} \sum_{i=1}^{n} |y_i - \hat{y}_i| \quad (2)$$

where $y_i$ and $\hat{y}_i$ are the observed and predicted gap lengths, respectively, and $\bar{y}$ is the mean of the observations.
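Both metrics translate directly into code. The sketch below implements Equations (1) and (2) as written; the gap values used in the example are illustrative, not data from the study.

```python
def r_squared(y_obs, y_pred):
    """R² as in Equation (1): sum of (yhat_i - ybar)^2 over sum of (y_i - ybar)^2."""
    y_bar = sum(y_obs) / len(y_obs)
    ss_model = sum((yp - y_bar) ** 2 for yp in y_pred)
    ss_total = sum((yo - y_bar) ** 2 for yo in y_obs)
    return ss_model / ss_total

def mean_absolute_error(y_obs, y_pred):
    """MAE as in Equation (2): mean of |y_i - yhat_i|."""
    return sum(abs(yo - yp) for yo, yp in zip(y_obs, y_pred)) / len(y_obs)

# Illustrative field-measured vs UAV-estimated gap lengths (m).
gaps_field = [0.5, 1.0, 1.5, 2.0, 2.5]
gaps_uav   = [0.52, 0.95, 1.48, 2.05, 2.45]
r2  = r_squared(gaps_field, gaps_uav)
mae = mean_absolute_error(gaps_field, gaps_uav)
```

For these illustrative values, R² is close to 1 and the MAE is a few centimeters, the kind of agreement reported for the best pixel/height combination.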

3. Results

3.1. Crop’s Growth and Development

The sugarcane grew and developed healthily. A boxplot diagram (Figure 3) adequately describes the temporal variability in plant height. Height increased significantly, from 0.5 ± 0.05 m at the 30th to 0.9 ± 0.05 m at the 45th day of cultivation, which is typical of sugarcane in the transition from sprouting to tillering. No substantive change in height emerged after tillering leveled off and the plants started elongating; height was 1.00 ± 0.10 m 60 days after harvest. During the latest stage of elongation, height increased considerably, from 1.20 ± 0.15 m at the 75th to 1.70 ± 0.15 m at the 90th day of cultivation, with some discrepancies in stalk biometry reflecting the heterogeneity and complexity of the field at the end of the experiment. The central points of the biometric ranges are nominal height values, defined 'a posteriori' for the analysis of potential interference of actively growing vegetation with imaging quality. They helped us deduce the most reliable combination of pixel size (or, equivalently, flight altitude) and crop phenology for identifying and predicting gaps by strategically flying the UAV over the field, which traditional approaches do not allow.

3.2. Prediction of Gaps

Functional relationships existed between gap length, plant height and pixel size, and a first-order polynomial model adequately predicted them (Figure 4). The smaller the pixel or the plant, the more accurate the prediction of gap length, as the gap becomes more visible from the data-acquisition platform. Gaps as large as 2.5 m were easy to capture from the UAV, regardless of resolution. By contrast, smaller gaps added complexity to the remote sensing; the linear model did not describe them adequately on imagery of plants taller than 0.5 m. The pixel size that maximized prediction was 3.5 cm: the combination of 3.5-cm pixels with 0.5-m plants produced the largest R², 0.95. For plants of 0.9–1.2 m, R² remained appreciable at 0.8–0.9, but it decreased considerably, to 0.2, for plants of 1.7 m; plainly, denser vegetation overlapped the gaps. Another accurate resolution could be 6.0 cm for sprouting plants. Later stages of tillering and elongation deteriorated image quality, making it impossible to predict 0.5-m gaps at lower-resolution pixels. Overall, the inaccuracy in modeling gaps of 0.5–1.0 m with pixels larger than 3.5 cm from imagery acquired between the 60th and 90th days supported our hypothesis, and an optimal combination of flight altitude and phenological stage for remotely detecting gaps can be drawn from it. A user who wishes to map gaps in sugarcane should therefore plan to fly a UAV over the field no later than 45 days from planting; otherwise, remote sensing is not feasible and systematic error will occur (Table 1).

4. Discussion

In this study, we analyzed whether a UAV RGB platform can predict gaps as a function of phenology. Our approach proved useful: the optical sensor adequately detected the gaps, and both pixel size and plant height determined its performance. Pixels larger than 3.5 cm could not capture gaps of 0.5–1.0 m for plants at tillering and elongation, and denser vegetation can block a UAV's view of gaps smaller than 1.0 m even at the highest spatial resolution of 3.5 cm. Clearly, pixels larger than 3.5 cm cannot provide spatial resolution sufficient to distinguish actively growing vegetation from soil, so linear regression predictions of gaps can be rather inaccurate; we found errors of up to 100% where overlap existed. As sugarcane grows, tillers and leaves emerge upwards, covering the target object partly or completely. Sprouting is thus more favorable for accurate remote detection than tillering or elongation; later phenological stages simply add errors to the orthomosaic map. Our guidance for prospective users (e.g., producers, consultants and service-delivering companies) is therefore to fly at an altitude that yields 3.5-cm pixel-wise data, at the 30th–45th day of cultivation. Otherwise, poorer resolution and denser vegetation will make it challenging to capture images useful for computational processing. Another reliable spatial resolution could be 6.0 cm for plants at the transitional sprouting–tillering stage; its accuracy was 6% lower than that of the 3.5-cm pixel.
An advantage of our study over the existing literature [1,2,3] is the possibility of realistically mapping gaps without an onboard photoelectric sensor or on-site calibration. The onboard photoelectric sensor of Maldaner et al. [3] can perform accurate mapping, but it requires an agricultural machine and experts for calibration. No such support is necessary to operate a UAV, which improves the autonomy and cost-effectiveness of the project. Another benefit is planning the flights for compatible phenological stages: we can capture the pattern at high resolution without negative impacts of weather (e.g., cloudiness) or canopy (e.g., architecture) on imaging quality. Thirdly, flying a UAV prevents the trampling damage to ratoons that machinery can cause when guidance is manual and the driver lacks the expertise and hands-on skill to align the ground-level operation with the trajectories on the field.
Considering gap length and plant height in detection and prediction allows accurate mapping of the pattern through its phenology-dependent dimension. Understanding the dynamics of growth and development, and how they impact imaging, is instrumental for analyzing the crop at specific stages, not only for one portion of the cycle but for the entire cycle without distinction. These are apparent limitations of the earlier studies by Luna and Lobo [1], Souza et al. [2] and Maldaner et al. [3]. A series of flights to monitor the crop at sprouting, tillering and elongation would require additional investment. However, since an optimal combination of pixel size and plant height exists for mapping, a single flight at sprouting is sufficient to acquire useful imagery; the operational cost is therefore not likely to limit the utility of the technology. A direction for future in-depth studies is an economic analysis of mapping gaps by UAV.
Studies on mapping gaps in sugarcane as a function of UAV RGB image resolution and phenological condition are seldom available in the literature. However, positive relationships between image resolution and predictive performance can be read from experiments on remotely estimating plant density and vegetation cover in breeding [26,27,28]. A higher-resolution optical sensor or a lower altitude can improve the data for algorithmic processing [2]. In mapping sugarcane, however, setting the flight to pixels no larger than 3.5 cm is not by itself sufficient to control imprecision. Phenology influences the visibility of the target by covering it at stages later than sprouting. Even a pixel at the optimal size of 10% of the object's size [26] cannot capture the pattern when denser vegetation at tillering and elongation covers the target area; hence, gaps of 0.5–1.0 m are unlikely to be detected remotely (Figure 5). Other factors potentially limiting the sensor's access to gaps include weed spread in the regions of interest [29] and trampling by machinery [30]. Neither was a source of error in our study: the field was free of weeds, and we found no traffic damage that might have forced the plants to not grow or to grow irregularly. The sugarcane itself is therefore architecturally complex with respect to imaging. Even so, our approach proves useful for predicting gaps; it is timely and can assist users in making decisions on replanting or fertilizing.

5. Conclusions

Our study clearly demonstrated the impact of pixel size and plant height on the mapping of gaps using UAV RGB imagery. High-resolution 3.5-cm pixels and plants no taller than 0.5 m are the optimal combination for fitting first-order models of gap length. The user should therefore fly the UAV over the field at 30–45 days of cultivation; otherwise, phenological stages later than sprouting, namely tillering and elongation, will cover the target area and make it challenging for the aerial platform to access the gaps. Our insights provide timely, forward-looking knowledge on flying UAVs to map gaps and will enable producers to make decisions on replanting and fertilizing using site-specific, high-resolution imagery data.

Author Contributions

Conceptualization, M.R.B.J. and C.Z.; methodology, M.R.B.J. and C.Z.; validation, M.R.B.J., D.T. and C.Z.; formal analysis, M.R.B.J.; investigation, M.R.B.J. and R.d.G.C.; resources, C.Z.; data curation, M.R.B.J. and D.T.; writing—original draft preparation, M.R.B.J. and D.T.; writing—review and editing, M.R.B.J., D.T., R.d.G.C., B.R.d.A.M., R.P.d.S. and C.Z.; visualization, M.R.B.J., D.T., R.d.G.C., B.R.d.A.M., R.P.d.S. and C.Z.; supervision, C.Z.; project administration, C.Z. All authors have read and agreed to the published version of the manuscript.

Funding

We acknowledge support for the publication of this work by the Publishing Fund of the Graduate Program in Agronomy (Plant Production).

Acknowledgments

We would like to acknowledge the Coordination for the Improvement of Higher Education Personnel (CAPES) for the financial support (code 001) to the first author and the Laboratory of Machinery and Agricultural Mechanization (LAMMA) of the Department of Engineering and Mathematical Sciences for the infrastructural support.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Luna, I.; Lobo, A. Mapping Crop Planting Quality in Sugarcane from UAV Imagery: A Pilot Study in Nicaragua. Remote Sens. 2016, 8, 500.
  2. Souza, C.H.W.; Lamparelli, R.A.C.; Rocha, J.V.; Magalhães, P.S.G. Mapping skips in sugarcane fields using object-based analysis of unmanned aerial vehicle (UAV) images. Comput. Electron. Agric. 2017, 143, 49–56.
  3. Maldaner, L.F.; Molin, J.P.; Martello, M.; Tavares, T.R.; Dias, F.L.F. Identification and measurement of gaps within sugarcane rows for site-specific management: Comparing different sensor-based approaches. Biosyst. Eng. 2021, 209, 64–73.
  4. Stolf, R. Methodology for gap evaluation on sugarcane lines. STAB 1986, 4, 12–20.
  5. Molin, J.P.; Veiga, J.P.S. Spatial variability of sugarcane row gaps: Measurement and mapping. Ciência Agrotecnol. 2016, 40, 347–355.
  6. Amaral, L.R.; Zerbato, C.; Freitas, R.G.; Barbosa Júnior, M.R.; Simões, I.O.P.S. UAV applications in Agriculture 4.0. Rev. Cienc. Agron. 2020, 51, 1–15.
  7. Xie, C.; Yang, C. A review on plant high-throughput phenotyping traits using UAV-based sensors. Comput. Electron. Agric. 2020, 178, 105731.
  8. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A Review on UAV-Based Applications for Precision Agriculture. Information 2019, 10, 349.
  9. Yao, H.; Qin, R.; Chen, X. Unmanned Aerial Vehicle for Remote Sensing Applications—A Review. Remote Sens. 2019, 11, 1443.
  10. Librán-Embid, F.; Klaus, F.; Tscharntke, T.; Grass, I. Unmanned aerial vehicles for biodiversity-friendly agricultural landscapes—A systematic review. Sci. Total Environ. 2020, 732, 139204.
  11. Neupane, K.; Baysal-Gurel, F. Automatic Identification and Monitoring of Plant Diseases Using Unmanned Aerial Vehicles: A Review. Remote Sens. 2021, 13, 3841.
  12. Osco, L.P.; de Arruda, M.d.S.; Gonçalves, D.N.; Dias, A.; Batistoti, J.; de Souza, M.; Gomes, F.D.G.; Ramos, A.P.M.; de Castro Jorge, L.A.; Liesenberg, V.; et al. A CNN approach to simultaneously count plants and detect plantation-rows from UAV imagery. ISPRS J. Photogramm. Remote Sens. 2021, 174, 1–17.
  13. Bahuguna, S.; Anchal, S.; Guleria, D.; Devi, M.; Kumar, D.; Kumar, R.; Murthy, P.V.S.; Kumar, A. Unmanned Aerial Vehicle-Based Multispectral Remote Sensing for Commercially Important Aromatic Crops in India for Its Efficient Monitoring and Management. J. Indian Soc. Remote Sens. 2021, 1–11.
  14. Valente, J.; Sari, B.; Kooistra, L.; Kramer, H.; Mücher, S. Automated crop plant counting from very high-resolution aerial imagery. Precis. Agric. 2020, 21, 1366–1384.
  15. Guo, W.; Zheng, B.; Potgieter, A.B.; Diot, J.; Watanabe, K.; Noshita, K.; Jordan, D.R.; Wang, X.; Watson, J.; Ninomiya, S.; et al. Aerial imagery analysis—Quantifying appearance and number of sorghum heads for applications in breeding and agronomy. Front. Plant Sci. 2018, 871, 1544.
  16. Feng, L.; Zhang, Z.; Ma, Y.; Du, Q.; Williams, P.; Drewry, J.; Luck, B. Alfalfa Yield Prediction Using UAV-Based Hyperspectral Imagery and Ensemble Learning. Remote Sens. 2020, 12, 2028.
  17. Cholula, U.; da Silva, J.A.; Marconi, T.; Thomasson, J.A.; Solorzano, J.; Enciso, J. Forecasting yield and lignocellulosic composition of energy cane using unmanned aerial systems. Agronomy 2020, 10, 718.
  18. Ramos, A.P.M.; Osco, L.P.; Furuya, D.E.G.; Gonçalves, W.N.; Santana, D.C.; Teodoro, L.P.R.; da Silva Junior, C.A.; Capristo-Silva, G.F.; Li, J.; Baio, F.H.R.; et al. A random forest ranking approach to predict yield in maize with UAV-based vegetation spectral indices. Comput. Electron. Agric. 2020, 178, 105791.
  19. Ge, H.; Ma, F.; Li, Z.; Du, C. Grain Yield Estimation in Rice Breeding Using Phenological Data and Vegetation Indices Derived from UAV Images. Agronomy 2021, 11, 2439.
  20. Poblete, T.; Ortega-Farías, S.; Ryu, D. Automatic Coregistration Algorithm to Remove Canopy Shaded Pixels in UAV-Borne Thermal Images to Improve the Estimation of Crop Water Stress Index of a Drip-Irrigated Cabernet Sauvignon Vineyard. Sensors 2018, 18, 397.
  21. de Jesús Marcial-Pablo, M.; Ontiveros-Capurata, R.E.; Jiménez-Jiménez, S.I.; Ojeda-Bustamante, W. Maize Crop Coefficient Estimation Based on Spectral Vegetation Indices and Vegetation Cover Fraction Derived from UAV-Based Multispectral Images. Agronomy 2021, 11, 668.
  22. Mokhtari, A.; Ahmadi, A.; Daccache, A.; Drechsler, K. Actual Evapotranspiration from UAV Images: A Multi-Sensor Data Fusion Approach. Remote Sens. 2021, 13, 2315.
  23. Santos, R.A.; Mantovani, E.C.; Filgueiras, R.; Fernandes-Filho, E.I.; Silva, A.C.B.; Venancio, L.P. Actual Evapotranspiration and Biomass of Maize from a Red–Green-Near-Infrared (RGNIR) Sensor on Board an Unmanned Aerial Vehicle (UAV). Water 2020, 12, 2359.
  24. Alvares, C.A.; Stape, J.L.; Sentelhas, P.C.; Gonçalves, J.L.M.; Sparovek, G. Köppen’s climate classification map for Brazil. Meteorol. Z. 2013, 22, 711–728.
  25. Rossi, M. Mapa Pedológico do Estado de São Paulo: Revisado e Ampliado; Instituto Florestal: São Paulo, Brazil, 2017; Volume 1, ISBN 9788564808164.
  26. Hu, P.; Guo, W.; Chapman, S.C.; Guo, Y.; Zheng, B. Pixel size of aerial imagery constrains the applications of unmanned aerial vehicle in crop breeding. ISPRS J. Photogramm. Remote Sens. 2019, 154, 1–9.
  27. Yan, G.; Li, L.; Coy, A.; Mu, X.; Chen, S.; Xie, D.; Zhang, W.; Shen, Q.; Zhou, H. Improving the estimation of fractional vegetation cover from UAV RGB imagery by colour unmixing. ISPRS J. Photogramm. Remote Sens. 2019, 158, 23–34.
  28. Zhang, D.; Mansaray, L.R.; Jin, H.; Sun, H.; Kuang, Z.; Huang, J. A universal estimation model of fractional vegetation cover for different crops based on time series digital photographs. Comput. Electron. Agric. 2018, 151, 93–103.
  29. Ranđelović, P.; Đorđević, V.; Milić, S.; Balešević-Tubić, S.; Petrović, K.; Miladinović, J.; Đukić, V. Prediction of Soybean Plant Density Using a Machine Learning Model and Vegetation Indices Extracted from RGB Images Taken with a UAV. Agronomy 2020, 10, 1108.
  30. Xu, J.X.; Ma, J.; Tang, Y.N.; Wu, W.X.; Shao, J.H.; Wu, W.B.; Wei, S.Y.; Liu, Y.F.; Wang, Y.C.; Guo, H.Q. Estimation of sugarcane yield using a machine learning approach based on UAV-LiDAR data. Remote Sens. 2020, 12, 2823.
Figure 1. Illustrative information. (a) Disposition of the gap in the experimental plot. (b) Location of the gap identification target. (c) Gap end for measuring plant height. (d) Ruler used to measure plant height.
Figure 2. Examples of a 2.5-m long gap. Columns represent images captured at different plant heights, and rows represent the pixel sizes used to capture the images. All images refer to the same gap under different evaluation conditions.
Figure 3. Sugarcane growth pattern. Each box represents plant growth at one sampling time. On average, plant height reached 0.5, 0.9, 1.0, 1.2 and 1.7 m for each sampling date, respectively.
Figure 4. Scatterplots comparing field gap lengths and gap estimated by UAV images. The comparison corresponds to the plant’s height (rows: 0.5, 0.9, 1.0, 1.2 and 1.7 m), pixel size (columns: 3.5, 6.0 and 8.2 cm) and gap lengths (x-axis: field gaps and y-axis: gaps estimated by UAV images; 0.5, 1.0, 1.5, 2.0 and 2.5 m). The points represent the mean of data (±standard deviation).
Figure 5. Example of gap overlap caused by crop leaves. A 0.50-m ruler is in front of the gap.
Table 1. Mean absolute error (MAE, m) between gaps estimated from UAV images and field gaps, by pixel size, plant height and gap length. The "Total" column gives the error for all gaps considered together (7.5 m of gaps).

| Pixel Size (cm) | Plant Height (m) | Gap 0.5 m | Gap 1.0 m | Gap 1.5 m | Gap 2.0 m | Gap 2.5 m | Total |
|---|---|---|---|---|---|---|---|
| 3.5 | 0.5 | 0.50 | 0.38 | 0.24 | 0.26 | 0.02 | 1.44 |
| 3.5 | 0.9 | 0.50 | 1.00 | 0.48 | 0.43 | 0.46 | 2.87 |
| 3.5 | 1.0 | 0.50 | 1.00 | 0.53 | 0.48 | 0.56 | 3.07 |
| 3.5 | 1.2 | 0.50 | 1.00 | 0.72 | 0.89 | 0.69 | 3.80 |
| 3.5 | 1.7 | 0.50 | 1.00 | 1.01 | 1.10 | 0.71 | 4.32 |
| 6.0 | 0.5 | 0.50 | 0.41 | 0.50 | 0.33 | 0.13 | 1.89 |
| 6.0 | 0.9 | 0.50 | 1.00 | 0.79 | 0.89 | 0.60 | 3.78 |
| 6.0 | 1.0 | 0.50 | 1.00 | 0.99 | 1.09 | 0.87 | 4.45 |
| 6.0 | 1.2 | 0.50 | 1.00 | 1.01 | 1.17 | 0.95 | 4.64 |
| 6.0 | 1.7 | 0.50 | 1.00 | 1.22 | 1.19 | 1.12 | 5.03 |
| 8.2 | 0.5 | 0.50 | 0.55 | 0.55 | 0.55 | 0.04 | 2.49 |
| 8.2 | 0.9 | 0.50 | 1.00 | 1.22 | 1.59 | 1.27 | 5.58 |
| 8.2 | 1.0 | 0.50 | 1.00 | 1.36 | 1.68 | 1.82 | 6.36 |
| 8.2 | 1.2 | 0.50 | 1.00 | 1.41 | 1.68 | 1.94 | 6.54 |
| 8.2 | 1.7 | 0.50 | 1.00 | 1.41 | 1.80 | 2.06 | 6.77 |

