Crop Production Parameter Estimation through Remote Sensing Data

A special issue of Agronomy (ISSN 2073-4395). This special issue belongs to the section "Precision and Digital Agriculture".

Deadline for manuscript submissions: closed (30 September 2024) | Viewed by 6557

Special Issue Editors

Department of Agricultural and Biological Engineering, Mississippi State University, Starkville, MS 39762, USA
Interests: smart/digital agriculture; artificial intelligence in agriculture; crop prediction models; UAV/UGV swarm

Guest Editor
USDA-ARS Crop Production Systems Research Unit/Department of Plant and Soil Sciences, Mississippi State University, Starkville, MS 39762, USA
Interests: geospatial analysis; agricultural remote sensing; machine learning

Special Issue Information

Dear Colleague,

Highly accurate and reliable estimation of crop production parameters, such as biomass and yield, is critical for improved crop production management and strategic planning. Remote sensing has long been studied and developed for estimating plant biomass and crop yield, yet improving the accuracy and reliability of these estimates remains an active area of investigation. This Special Issue aims to provide a comprehensive view of the development and application of crop production parameter estimation using remote sensing, from satellite and airborne platforms (both manned and unmanned aerial vehicles) to ground-based systems. In recent years, machine and deep learning methods have been developed and applied to increase the accuracy and reliability of crop production parameter estimation from remotely sensed data. This Special Issue seeks to explore achievements in, but is not limited to, the following areas of crop production parameter estimation for biomass, yield, or other related parameters using remote sensing: (1) estimation at the national or regional scale for crop production planning; (2) estimation at the farm or field scale for precision agriculture operations; (3) assimilation of remote sensing data into crop models; and (4) development of specialized machine/deep learning schemes and algorithms.

Dr. Yanbo Huang
Dr. Xin Zhang
Dr. Chandan Kumar
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Agronomy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • crop production parameter estimation
  • plant biomass
  • crop yield
  • remote sensing
  • machine learning

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.


Published Papers (4 papers)


Research

16 pages, 6949 KiB  
Article
Study on the Method of Vineyard Information Extraction Based on Spectral and Texture Features of GF-6 Satellite Imagery
by Xuemei Han, Huichun Ye, Yue Zhang, Chaojia Nie and Fu Wen
Agronomy 2024, 14(11), 2542; https://doi.org/10.3390/agronomy14112542 - 28 Oct 2024
Viewed by 530
Abstract
Accurately identifying the distribution of vineyard cultivation is of great significance for the development of the grape industry and the optimization of planting structures. Traditional remote sensing techniques for vineyard identification primarily depend on machine learning algorithms based on spectral features. However, the spectral reflectance similarities between grapevines and other orchard vegetation lead to persistent misclassification and omission errors across various machine learning algorithms. As a perennial vine plant, grapes are cultivated using trellis systems, displaying regular row spacing and distinctive strip-like texture patterns in high-resolution satellite imagery. This study selected the main oasis area of Turpan City in Xinjiang, China, as the research area. First, this study extracted both spectral and texture features based on GF-6 satellite imagery, subsequently employing the Boruta algorithm to discern the relative significance of these remote sensing features. Then, this study constructed vineyard information extraction models by integrating spectral and texture features, using machine learning algorithms including Naive Bayes (NB), Support Vector Machines (SVMs), and Random Forests (RFs). The efficacy of various machine learning algorithms and remote sensing features in extracting vineyard information was subsequently evaluated and compared. The results indicate that three spectral features and five texture features under a 7 × 7 window have significant sensitivity to vineyard recognition. These spectral features include the Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), and Normalized Difference Water Index (NDWI), while texture features include contrast statistics in the near-infrared band (B4_CO) and the variance statistic, contrast statistic, heterogeneity statistic, and correlation statistic derived from NDVI images (NDVI_VA, NDVI_CO, NDVI_DI, and NDVI_COR). 
The RF algorithm significantly outperforms both the NB and SVM models in extracting vineyard information, boasting an impressive accuracy of 93.89% and a Kappa coefficient of 0.89. This marks a 12.25% increase in accuracy and a 0.11 increment in the Kappa coefficient over the NB model, as well as an 8.02% enhancement in accuracy and a 0.06 rise in the Kappa coefficient compared to the SVM model. Moreover, the RF model, which amalgamates spectral and texture features, exhibits a notable 13.59% increase in accuracy versus the spectral-only model and a 14.92% improvement over the texture-only model. This underscores the efficacy of the RF model in harnessing the spectral and textural attributes of GF-6 imagery for the precise extraction of vineyard data, offering valuable theoretical and methodological insights for future vineyard identification and information retrieval efforts. Full article
(This article belongs to the Special Issue Crop Production Parameter Estimation through Remote Sensing Data)
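The texture step described in this abstract — GLCM statistics computed from NDVI images in a 7 × 7 window — can be sketched as follows. This is a minimal illustration with synthetic reflectance data, not the authors' pipeline; the grey-level count (8) and the horizontal-neighbour co-occurrence direction are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.20, (32, 32))   # synthetic red-band reflectance
nir = rng.uniform(0.30, 0.60, (32, 32))   # synthetic near-infrared reflectance

ndvi = (nir - red) / (nir + red)          # Normalized Difference Vegetation Index

# Quantize NDVI to 8 grey levels for the co-occurrence matrix
levels = 8
edges = np.linspace(ndvi.min(), ndvi.max(), levels + 1)[1:-1]
q = np.digitize(ndvi, edges)              # integer image, values 0..7

def glcm_contrast(patch, levels=8):
    """Contrast of a symmetric horizontal-neighbour co-occurrence matrix."""
    glcm = np.zeros((levels, levels))
    for a, b in zip(patch[:, :-1].ravel(), patch[:, 1:].ravel()):
        glcm[a, b] += 1
        glcm[b, a] += 1                   # count both directions (symmetric)
    glcm /= glcm.sum()
    i, j = np.indices((levels, levels))
    return float(((i - j) ** 2 * glcm).sum())

# Contrast statistic for the 7x7 window centred on pixel (16, 16) -- cf. NDVI_CO
ndvi_co = glcm_contrast(q[13:20, 13:20])
```

In practice the window would be slid across the whole scene to produce a texture band per statistic (contrast, variance, dissimilarity, correlation), which is then stacked with the spectral indices as classifier input.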

18 pages, 9046 KiB  
Article
Application of UAV Multispectral Imaging to Monitor Soybean Growth with Yield Prediction through Machine Learning
by Sadia Alam Shammi, Yanbo Huang, Gary Feng, Haile Tewolde, Xin Zhang, Johnie Jenkins and Mark Shankle
Agronomy 2024, 14(4), 672; https://doi.org/10.3390/agronomy14040672 - 26 Mar 2024
Cited by 3 | Viewed by 1801
Abstract
The application of remote sensing, which is non-destructive and cost-efficient, has been widely used in crop monitoring and management. This study used a built-in multispectral imager on a small unmanned aerial vehicle (UAV) to capture multispectral images in five different spectral bands (blue, green, red, red edge, and near-infrared), instead of satellite-captured data, to monitor soybean growth in a field. The field experiment was conducted in a soybean field at the Mississippi State University Experiment Station near Pontotoc, MS, USA. The experiment consisted of five cover crops (Cereal Rye, Vetch, Wheat, Mustard plus Cereal Rye, and native vegetation) planted in the winter and three fertilizer treatments (Fertilizer, Poultry Litter, and None) applied before planting the soybean. During the soybean growing season in 2022, eight UAV imaging flyovers were conducted, spread across the season. UAV image-derived vegetation indices (VIs) coupled with machine learning (ML) models were computed for characterizing soybean growth at different stages across the season. This study focuses on monitoring soybean growth to predict yield, using 14 VIs including CC (Canopy Cover), NDVI (Normalized Difference Vegetation Index), GNDVI (Green Normalized Difference Vegetation Index), EVI2 (Enhanced Vegetation Index 2), and others. Different machine learning algorithms including Linear Regression (LR), Support Vector Machine (SVM), and Random Forest (RF) are used for this purpose. The stage of initial pod development was shown to have the best predictability for the earliest soybean yield prediction. CC, NDVI, and NAVI (Normalized Area Vegetation Index) were shown to be the best VIs for yield prediction. The RMSE was found to be about 134.5 to 511.11 kg ha−1 in the different yield models, whereas it was 605.26 to 685.96 kg ha−1 in the cross-validated models.
Due to the limited number of training and testing samples in the K-fold cross-validation, the models’ results changed to some extent. Nevertheless, the results of this study will be useful for the application of UAV remote sensing to provide information for soybean production and management. This study demonstrates that VIs coupled with ML models can be used in multistage soybean yield prediction at a farm scale, even with a limited number of training samples. Full article
(This article belongs to the Special Issue Crop Production Parameter Estimation through Remote Sensing Data)
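The VI-plus-ML workflow in this abstract can be sketched with synthetic plot data: derive NDVI, GNDVI, and EVI2 from band reflectance, then fit a Random Forest yield model and report RMSE. This is an illustrative sketch, not the authors' model; plot counts, the synthetic yield relationship, and hyperparameters are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 120                                       # synthetic plots
green = rng.uniform(0.05, 0.15, n)
red = rng.uniform(0.03, 0.12, n)
nir = rng.uniform(0.30, 0.60, n)

# Three of the 14 vegetation indices named in the abstract
ndvi = (nir - red) / (nir + red)
gndvi = (nir - green) / (nir + green)
evi2 = 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)

X = np.column_stack([ndvi, gndvi, evi2])
# Synthetic yield (kg/ha) loosely driven by NDVI, plus noise
y = 2000 + 3000 * ndvi + rng.normal(0, 150, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5   # kg/ha
```

With only a few dozen plots per treatment, as the abstract notes for its K-fold cross-validation, held-out RMSE can vary considerably between splits.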

14 pages, 3712 KiB  
Article
Prediction of Daily Ambient Temperature and Its Hourly Estimation Using Artificial Neural Networks in Urban Allotment Gardens and an Urban Park in Valladolid, Castilla y León, Spain
by Francisco Tomatis, Francisco Javier Diez, Maria Sol Wilhelm and Luis Manuel Navas-Gracia
Agronomy 2024, 14(1), 60; https://doi.org/10.3390/agronomy14010060 - 26 Dec 2023
Viewed by 1742
Abstract
Urban green spaces improve quality of life by mitigating urban temperatures. However, there are challenges in obtaining urban data to analyze and understand their influence. With the aim of developing innovative methodologies for this type of research, Artificial Neural Networks (ANNs) were developed to predict daily and hourly temperatures in urban green spaces from sensors placed in situ for 41 days. The study areas were four urban allotment gardens (with dynamic and productive vegetation) and a forested urban park in the city of Valladolid, Spain. ANNs were built and evaluated from various combinations of inputs (X), hidden neurons (Y), and outputs (Z) under the practical rule of "making networks simple, to obtain better results". Seven ANN architectures were tested: 7-Y-5 (Y = 6, 7, …, 14), 6-Y-5 (Y = 6, 7, …, 14), 7-Y-1 (Y = 2, 3, …, 8), 6-Y-1 (Y = 2, 3, …, 8), 4-Y-1 (Y = 1, 2, …, 7), 3-Y-1 (Y = 1, 2, …, 7), and 2-Y-1 (Y = 2, 3, …, 8). The best-performing model was the 6-Y-1 ANN architecture, with a Root Mean Square Error (RMSE) of 0.42 °C for the urban garden called Valle de Arán. The results demonstrate that, even from short in situ data series, ANN predictions achieve acceptable accuracy and reflect the usefulness of the methodology. These predictions were more accurate in urban gardens than in the urban park, where the type of existing vegetation can be a decisive factor. This study can contribute to the development of a sustainable and smart city, and has the potential to be replicated in cities where the influence of urban green spaces on urban temperatures is studied with traditional methodologies. Full article
(This article belongs to the Special Issue Crop Production Parameter Estimation through Remote Sensing Data)
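The X-Y-Z notation above (X inputs, Y hidden neurons, Z outputs) maps onto a single-hidden-layer feed-forward network. A minimal sketch of the best-performing 6-Y-1 shape, using scikit-learn's MLPRegressor as a stand-in for the authors' ANN implementation and synthetic temperature data (the input features and Y = 8 are assumptions):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n = 200
# 6 inputs, e.g. recent temperatures and calendar features (illustrative only)
X = rng.uniform(size=(n, 6))
# Synthetic daily temperature (deg C) driven mostly by the first feature
y = 15 + 10 * X[:, 0] + rng.normal(0, 0.3, n)

Y_hidden = 8                               # one of the tested Y = 2, 3, ..., 8
ann = MLPRegressor(hidden_layer_sizes=(Y_hidden,), max_iter=3000, random_state=0)
ann.fit(X[:150], y[:150])                  # train on the first 150 samples

# RMSE on the held-out 50 samples, as in the paper's evaluation metric
rmse = float(np.sqrt(np.mean((ann.predict(X[150:]) - y[150:]) ** 2)))
```

The "keep networks simple" rule quoted in the abstract corresponds here to sweeping `Y_hidden` over a small range and keeping the smallest network whose validation RMSE stops improving.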

18 pages, 4469 KiB  
Article
Defining the Ideal Phenological Stage for Estimating Corn Yield Using Multispectral Images
by Carlos Alberto Matias de Abreu Júnior, George Deroco Martins, Laura Cristina Moura Xavier, João Vitor Meza Bravo, Douglas José Marques and Guilherme de Oliveira
Agronomy 2023, 13(9), 2390; https://doi.org/10.3390/agronomy13092390 - 15 Sep 2023
Cited by 1 | Viewed by 1362
Abstract
Image-based spectral models assist in estimating the yield of maize. During the vegetative and reproductive phenological phases, the corn crop undergoes changes caused by biotic and abiotic stresses. These variations can be quantified using spectral models, which are tools that help producers to manage crops. However, defining the correct time to obtain these images remains a challenge. In this study, we hypothesize that corn yield can be estimated using multispectral images, provided the optimal timing for detecting the differences caused by the various phenological stages is identified. Thus, the main objective of this work was to define the ideal phenological stage for taking multispectral images to estimate corn yield. Multispectral bands and vegetation indices derived from the Planet satellite were considered as predictor variables for the input data of the models. We used root mean square error percentage and mean absolute percentage error to evaluate the accuracy and trend of the yield estimates. The reproductive phenological stage R2 was found to be optimal for determining the spectral models based on the images, which obtained the best root mean square error percentage of 9.17% and the second-best mean absolute percentage error of 7.07%. Here, we demonstrate that it is possible to estimate yield in a corn plantation at a stage before harvest through Planet multispectral satellite images. Full article
(This article belongs to the Special Issue Crop Production Parameter Estimation through Remote Sensing Data)
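The two evaluation metrics named in this abstract, under their common definitions (assumed here — the paper may normalize differently), are RMSE expressed as a percentage of the mean observed yield, and the mean absolute percentage error:

```python
import numpy as np

def rmse_percent(obs, pred):
    """Root mean square error as a percentage of the mean observed value."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    rmse = np.sqrt(np.mean((obs - pred) ** 2))
    return 100.0 * rmse / obs.mean()

def mape(obs, pred):
    """Mean absolute percentage error."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return 100.0 * np.mean(np.abs((obs - pred) / obs))

# Synthetic corn yields (t/ha) for illustration only
observed = [9.8, 10.5, 11.2, 10.0]
predicted = [10.0, 10.2, 11.0, 10.4]
rmse_pct = rmse_percent(observed, predicted)
mape_pct = mape(observed, predicted)
```

RMSE% penalizes large misses more heavily (errors are squared before averaging), while MAPE weights all relative errors equally — which is why the paper reports both to characterize accuracy and bias.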
