Sensors in Agriculture 2018

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: closed (30 November 2018) | Viewed by 220218

Special Issue Editor


Prof. Dr. Dimitrios Moshou
Guest Editor
Head of Agricultural Engineering Laboratory, Faculty of Agriculture, Aristotle University of Thessaloniki (A.U.Th.), P.O. 275, 54124 Thessaloniki, Greece
Interests: remote sensing; multiscale fusion; robotic agriculture; sensor networks; robotics; development of cognitive abilities; fusion of global and local cognition

Special Issue Information

Dear Colleagues,

Agriculture requires technical solutions that increase production while reducing environmental impact, chiefly by limiting the application of agro-chemicals and adopting environmentally friendly management practices; a further benefit is the reduction of production costs. Sensor technologies provide the tools to achieve these goals. The rapid technological advances of recent years have greatly facilitated their attainment, removing many barriers to implementation, including the reservations expressed by farmers. Precision agriculture is an emerging area in which sensor-based technologies play an important role.

Farmers, researchers, and technology manufacturers are joining efforts to find efficient solutions, improve production, and reduce costs. This Special Issue aims to bring together recent research and developments concerning novel sensors and their applications in agriculture. Sensors in agriculture are driven by farmers' requirements and by the farming operations that need to be addressed. Papers addressing sensor development for a wide range of agricultural tasks are expected, including, but not limited to, recent research and developments in the following areas:

  • Optical sensors: Hyperspectral, multispectral, fluorescence and thermal sensing
  • Sensors for crop health status determination
  • Sensors for crop phenotyping, germination, emergence and determination of the different growth stages of crops
  • Sensors for detection of microorganisms and pest management
  • Airborne sensors (UAV)
  • Multisensor systems, sensor fusion
  • Non-destructive soil sensing
  • Yield estimation and prediction
  • Detection and identification of crops and weeds
  • Sensors for detection of fruits
  • Sensors for fruit quality determination
  • Sensors for weed control
  • Volatile components detection, electronic noses and tongues
  • Sensors for robot navigation, localization and mapping and environmental awareness
  • Sensors for robotic applications in crop management
  • Sensors for positioning, navigation and obstacle detection
  • Sensor networks in agriculture, wearable sensors, the Internet of Things
  • Low energy, disposable and energy harvesting sensors in agriculture

Prof. Dr. Dimitrios Moshou
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • agricultural sensors
  • sensor information acquisition
  • sensor information processing
  • sensor-based decision making
  • sensors in agriculture
  • precision agriculture
  • sensors in agricultural production
  • sensor technologies
  • sensor applications
  • robot sensors
  • processing of sensed data

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (17 papers)


Research


14 pages, 6840 KiB  
Article
LiDAR-Based 3D Scans of Soil Surfaces and Furrows in Two Soil Types
by Frederik F. Foldager, Johanna Maria Pedersen, Esben Haubro Skov, Alevtina Evgrafova and Ole Green
Sensors 2019, 19(3), 661; https://doi.org/10.3390/s19030661 - 6 Feb 2019
Cited by 15 | Viewed by 7871
Abstract
Soil surface measurements play an important role in the performance assessment of tillage operations and are relevant in both academic and industrial settings. Manual soil surface measurements are time-consuming and laborious, which often limits the amount of data collected. An experiment was conducted to compare two approaches for measuring and analysing the cross-sectional area and geometry of a furrow after a trailing shoe sweep. The compared approaches in this study were a manual pinboard and a Light Detection and Ranging (LiDAR) sensor. The experiments were conducted in coarse sand and loamy sand soil bins exposed to three levels of irrigation. Using the LiDAR, a system for generating 3D scans of the soil surface was obtained and a mean furrow geometry was introduced to study the geometrical variations along the furrows. A comparison of the cross-sectional area measurements by the pinboard and the LiDAR showed up to 41% difference between the two methods. The relation between irrigation and the resulting furrow area of a trailing shoe sweep was investigated using the LiDAR measurements. The furrow cross-sectional area increased by 11% and 34% under 20 mm and 40 mm irrigation compared to non-irrigated in the coarse sand experiment. In the loamy sand, the cross-sectional area increased by 17% and 15% by irrigation of 20 mm and 40 mm compared to non-irrigated measured using the LiDAR. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
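
The central quantity in this study is the furrow cross-sectional area extracted from LiDAR profiles. As an illustration only, not the authors' processing chain, the following minimal numpy sketch integrates such an area from a single measured profile; the reference surface level and the toy profile are assumptions.

import numpy as np

def furrow_cross_section_area(x, z, z_ref=None):
    """Area (m^2) between a reference soil surface and the measured furrow
    profile. x, z are 1-D arrays of lateral position and height (m)."""
    x = np.asarray(x, dtype=float)
    z = np.asarray(z, dtype=float)
    if z_ref is None:
        # assume the undisturbed surface level is given by the profile's edges
        z_ref = 0.5 * (z[0] + z[-1])
    depth = np.clip(z_ref - z, 0.0, None)  # only count material below the reference
    return np.trapz(depth, x)

# toy profile: a 0.1 m deep, ~0.3 m wide furrow sampled every 5 mm
x = np.arange(0.0, 0.6, 0.005)
z = -0.1 * np.exp(-((x - 0.3) / 0.08) ** 2)
print(f"estimated furrow area: {furrow_cross_section_area(x, z):.4f} m^2")

Averaging many such profiles along the direction of travel would give the mean furrow geometry mentioned in the abstract.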

22 pages, 10195 KiB  
Article
Comparing RGB-D Sensors for Close Range Outdoor Agricultural Phenotyping
by Adar Vit and Guy Shani
Sensors 2018, 18(12), 4413; https://doi.org/10.3390/s18124413 - 13 Dec 2018
Cited by 83 | Viewed by 8593
Abstract
Phenotyping is the task of measuring plant attributes for analyzing the current state of the plant. In agriculture, phenotyping can be used to make decisions concerning the management of crops, such as the watering policy, or whether to spray for a certain pest. Currently, large scale phenotyping in fields is typically done using manual labor, which is a costly, low throughput process. Researchers often advocate the use of automated systems for phenotyping, relying on the use of sensors for making measurements. The recent rise of low cost, yet reasonably accurate, RGB-D sensors has opened the way for using these sensors in field phenotyping applications. In this paper, we investigate the applicability of four different RGB-D sensors for this task. We conduct an outdoor experiment, measuring plant attributes at various distances and under various light conditions. Our results show that modern RGB-D sensors, in particular the Intel D435 sensor, provide a viable tool for close range phenotyping tasks in fields. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
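
For readers unfamiliar with RGB-D phenotyping, the sketch below shows the generic step of back-projecting a depth image to a point cloud and taking a crude plant-size measurement. The intrinsics, depth scale and height heuristic are illustrative assumptions, not the sensors' calibrated values or the authors' pipeline.

import numpy as np

# Assumed pinhole intrinsics (illustrative values, not calibrated D435 parameters).
FX, FY, CX, CY = 615.0, 615.0, 320.0, 240.0
DEPTH_SCALE = 0.001  # raw depth units -> metres (assumed)

def depth_to_points(depth_raw):
    """Back-project a HxW depth image to an (N, 3) point cloud in metres."""
    h, w = depth_raw.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_raw.astype(float) * DEPTH_SCALE
    valid = z > 0
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x[valid], y[valid], z[valid]], axis=1)

def plant_height(points, ground_percentile=99, top_percentile=1):
    """Crude height estimate: spread along the camera's vertical axis, with
    percentiles used to suppress outliers (assumes the y-axis is roughly vertical)."""
    y = points[:, 1]
    return np.percentile(y, ground_percentile) - np.percentile(y, top_percentile)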

14 pages, 2658 KiB  
Article
Hyperspectral Imaging for Evaluating Impact Damage to Mango According to Changes in Quality Attributes
by Duohua Xu, Huaiwen Wang, Hongwei Ji, Xiaochuan Zhang, Yanan Wang, Zhe Zhang and Hongfei Zheng
Sensors 2018, 18(11), 3920; https://doi.org/10.3390/s18113920 - 14 Nov 2018
Cited by 15 | Viewed by 4319
Abstract
Evaluation of impact damage to mango (Mangifera indica Linn) as a result of dropping from three different heights, namely, 0.5, 1.0 and 1.5 m, was conducted by hyperspectral imaging (HSI). Reflectance spectra in the 900–1700 nm region were used to develop prediction models for pulp firmness (PF), total soluble solids (TSS), titratable acidity (TA) and chroma (∆b*) by a partial least squares (PLS) regression algorithm. The results showed that the changes in the mangoes’ quality attributes, which were also reflected in the spectra, had a strong relationship with dropping height. The best predictive performance measured by coefficient of determination (R2) and root mean square errors of prediction (RMSEP) values were: 0.84 and 31.6 g for PF, 0.9 and 0.49 °Brix for TSS, 0.65 and 0.1% for TA, 0.94 and 0.96 for chroma, respectively. Classification of the degree of impact damage to mango achieved an accuracy of more than 77.8% according to ripening index (RPI). The results show the potential of HSI to evaluate impact damage to mango by combining with changes in quality attributes. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
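
The prediction step described here is standard partial least squares regression on reflectance spectra. A hedged sketch using scikit-learn's PLSRegression follows, with synthetic data standing in for real mango scans; the component count and split ratio are arbitrary choices, not the paper's settings.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# X: (n_samples, n_bands) reflectance spectra in 900-1700 nm; y: one quality
# attribute such as pulp firmness. Synthetic placeholder data only.
rng = np.random.default_rng(0)
X = rng.random((120, 200))
y = X[:, 50] * 3.0 - X[:, 150] * 1.5 + rng.normal(0, 0.05, 120)

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=10).fit(X_cal, y_cal)
y_hat = pls.predict(X_val).ravel()
print(f"R2 = {r2_score(y_val, y_hat):.2f}, "
      f"RMSEP = {mean_squared_error(y_val, y_hat) ** 0.5:.3f}")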

12 pages, 2592 KiB  
Article
On-Barn Pig Weight Estimation Based on Body Measurements by Structure-from-Motion (SfM)
by Andrea Pezzuolo, Veronica Milani, DeHai Zhu, Hao Guo, Stefano Guercini and Francesco Marinello
Sensors 2018, 18(11), 3603; https://doi.org/10.3390/s18113603 - 24 Oct 2018
Cited by 60 | Viewed by 8515
Abstract
Information on the body shape of pigs is a key indicator to monitor their performance and health and to control or predict their market weight. Manual measurements are among the most common ways to obtain an indication of animal growth. However, this approach is laborious and difficult, and it may be stressful for both the pigs and the stockman. The present paper proposes the implementation of a Structure from Motion (SfM) photogrammetry approach as a new tool for on-barn animal reconstruction applications. This is made possible also by new software tools that allow automatic estimation of camera parameters during the reconstruction process, even without a preliminary calibration phase. An analysis of pig body 3D SfM characterization is proposed here, carried out under different conditions in terms of number of camera poses and animal movements. The work takes advantage of the total reconstructed surface as a reference index to quantify the quality of the achieved 3D reconstruction, showing that as much as 80% of the total animal area can be characterized. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
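
The quality index used here is the fraction of the animal's surface that the SfM reconstruction recovers. A minimal sketch of how such a metric can be computed from a triangulated mesh, assuming the mesh is available as vertex and face arrays and that a reference body area is supplied (both are assumptions, not the paper's workflow):

import numpy as np

def mesh_area(vertices, faces):
    """Total surface area of a triangle mesh given (V, 3) vertex coordinates and
    (F, 3) integer face indices: half the norm of each triangle's edge cross product."""
    v = np.asarray(vertices, float)
    f = np.asarray(faces, int)
    a, b, c = v[f[:, 0]], v[f[:, 1]], v[f[:, 2]]
    return 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1).sum()

def coverage_fraction(vertices, faces, reference_area):
    """Share of the expected body surface captured by the reconstruction."""
    return mesh_area(vertices, faces) / reference_area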

21 pages, 3977 KiB  
Article
Development of an Apparatus for Crop-Growth Monitoring and Diagnosis
by Jun Ni, Jingchao Zhang, Rusong Wu, Fangrong Pang and Yan Zhu
Sensors 2018, 18(9), 3129; https://doi.org/10.3390/s18093129 - 17 Sep 2018
Cited by 20 | Viewed by 5086
Abstract
To non-destructively acquire leaf nitrogen content (LNC), leaf nitrogen accumulation (LNA), leaf area index (LAI), and leaf dry weight (LDW) data at high speed and low cost, a portable apparatus for crop-growth monitoring and diagnosis (CGMD) was developed according to the spectral monitoring mechanisms of crop growth. According to the canopy characteristics of crops and actual requirements of field operation environments, splitting light beams by using an optical filter and proper structural parameters were determined for the sensors. Meanwhile, an integral-type weak optoelectronic signal processing circuit was designed, which changed the gain of the system and guaranteed the high resolution of the apparatus by automatically adjusting the integration period based on the irradiance received from ambient light. In addition, a coupling processor system for a sensor information and growth model based on the microcontroller chip was developed. Field experiments showed that normalised vegetation index (NDVI) measured separately through the CGMD apparatus and the ASD spectrometer showed a good linear correlation. For measurements of canopy reflectance spectra of rice and wheat, their linear determination coefficients (R2) were 0.95 and 0.92, respectively while the root mean square errors (RMSEs) were 0.02 and 0.03, respectively. NDVI value measured by using the CGMD apparatus and growth indices of rice and wheat exhibited a linear relationship. For the monitoring models for LNC, LNA, LAI, and LDW of rice based on linear fitting of NDVI, R2 were 0.64, 0.67, 0.63 and 0.70, and RMSEs were 0.31, 2.29, 1.15 and 0.05, respectively. In addition, R2 of the models for monitoring LNC, LNA, LAI, and LDW of wheat on the basis of linear fitting of NDVI were 0.82, 0.71, 0.72 and 0.70, and RMSEs were 0.26, 2.30, 1.43, and 0.05, respectively. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
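
The apparatus relies on NDVI computed from canopy reflectance and on linear fitting against growth indices. A minimal illustration of both steps is given below, with made-up calibration numbers rather than the paper's rice and wheat data.

import numpy as np

def ndvi(nir, red):
    """Normalised difference vegetation index from NIR and red reflectance."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

# Illustrative linear calibration of a growth index (e.g., LAI) against NDVI,
# mirroring the linear-fitting step described in the abstract (values assumed).
ndvi_obs = np.array([0.35, 0.48, 0.55, 0.62, 0.70, 0.76])
lai_obs = np.array([1.1, 2.0, 2.6, 3.3, 4.1, 4.8])
slope, intercept = np.polyfit(ndvi_obs, lai_obs, 1)
lai_pred = slope * ndvi_obs + intercept
rmse = np.sqrt(np.mean((lai_pred - lai_obs) ** 2))
print(f"LAI ~ {slope:.2f} * NDVI + {intercept:.2f} (RMSE {rmse:.2f})")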

21 pages, 7598 KiB  
Article
Non-Contact Body Measurement for Qinchuan Cattle with LiDAR Sensor
by Lvwen Huang, Shuqin Li, Anqi Zhu, Xinyun Fan, Chenyang Zhang and Hongyan Wang
Sensors 2018, 18(9), 3014; https://doi.org/10.3390/s18093014 - 9 Sep 2018
Cited by 49 | Viewed by 15618
Abstract
The body dimension measurement of large animals plays a significant role in quality improvement and genetic breeding, and the non-contact measurements by computer vision-based remote sensing could represent great progress in the case of dangerous stress responses and time-costing manual measurements. This paper presents a novel approach for three-dimensional digital modeling of live adult Qinchuan cattle for body size measurement. On the basis of capturing the original point data series of live cattle by a Light Detection and Ranging (LiDAR) sensor, the conditional, statistical outliers and voxel grid filtering methods are fused to cancel the background and outliers. After the segmentation of K-means clustering extraction and the RANdom SAmple Consensus (RANSAC) algorithm, the Fast Point Feature Histogram (FPFH) is put forward to get the cattle data automatically. The cattle surface is reconstructed to get the 3D cattle model using fast Iterative Closest Point (ICP) matching with Bi-directional Random K-D Trees and a Greedy Projection Triangulation (GPT) reconstruction method by which the feature points of cattle silhouettes could be clicked and calculated. Finally, the five body parameters (withers height, chest depth, back height, body length, and waist height) are measured in the field and verified within an accuracy of 2 mm and an error close to 2%. The experimental results show that this approach could be considered as a new feasible method towards the non-contact body measurement for large physique livestock. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
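
The processing chain combines outlier filtering, RANSAC segmentation and ICP registration. The sketch below shows generic equivalents of those stages using Open3D; it is a stand-in under assumed thresholds, not the authors' implementation (which also uses K-means, FPFH features and greedy projection triangulation).

import numpy as np
import open3d as o3d  # generic stand-in library for the point cloud stages

def preprocess(points_xyz, voxel=0.02):
    """Downsample, drop statistical outliers, and remove the dominant plane
    (e.g., the ground) from a raw LiDAR scan given as an (N, 3) array."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points_xyz)
    pcd = pcd.voxel_down_sample(voxel_size=voxel)
    pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
    _, ground_idx = pcd.segment_plane(distance_threshold=0.03, ransac_n=3,
                                      num_iterations=1000)  # RANSAC plane fit
    return pcd.select_by_index(ground_idx, invert=True)

def align(source, target, max_dist=0.05):
    """Rigidly align two partial scans with point-to-point ICP."""
    result = o3d.pipelines.registration.registration_icp(
        source, target, max_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return source.transform(result.transformation)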

14 pages, 3045 KiB  
Article
Olive-Fruit Mass and Size Estimation Using Image Analysis and Feature Modeling
by Juan Manuel Ponce, Arturo Aquino, Borja Millán and José Manuel Andújar
Sensors 2018, 18(9), 2930; https://doi.org/10.3390/s18092930 - 3 Sep 2018
Cited by 27 | Viewed by 4916
Abstract
This paper presents a new methodology for the estimation of olive-fruit mass and size, characterized by its major and minor axis length, by using image analysis techniques. First, different sets of olives from the varieties Picual and Arbequina were photographed in the laboratory. An original algorithm based on mathematical morphology and statistical thresholding was developed for segmenting the acquired images. The estimation models for the three targeted features, specifically for each variety, were established by linearly correlating the information extracted from the segmentations to objective reference measurement. The performance of the models was evaluated on external validation sets, giving relative errors of 0.86% for the major axis, 0.09% for the minor axis and 0.78% for mass in the case of the Arbequina variety; analogously, relative errors of 0.03%, 0.29% and 2.39% were annotated for Picual. Additionally, global feature estimation models, applicable to both varieties, were also tried, providing comparable or even better performance than the variety-specific ones. Attending to the achieved accuracy, it can be concluded that the proposed method represents a first step in the development of a low-cost, automated and non-invasive system for olive-fruit characterization in industrial processing chains. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
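
The segmentation and feature-extraction idea (thresholding plus morphology, then axis lengths fed into a linear mass model) can be illustrated with OpenCV as follows. The inverse Otsu threshold assumes dark fruit on a light background, and the pixel-to-millimetre scale and mass coefficients are placeholders, not the calibrated models reported in the paper.

import cv2
import numpy as np

MM_PER_PX = 0.12                       # assumed spatial calibration of the setup
MASS_SLOPE, MASS_OFFSET = 0.85, -0.4   # illustrative linear mass model (hypothetical)

def olive_axes_and_mass(bgr_image):
    """Segment fruits by Otsu thresholding plus a morphological opening and return
    (major_axis_mm, minor_axis_mm, estimated_mass_g) for each detected fruit."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    results = []
    for cnt in contours:
        if len(cnt) < 5 or cv2.contourArea(cnt) < 100:  # fitEllipse needs >= 5 points
            continue
        minor_px, major_px = sorted(cv2.fitEllipse(cnt)[1])
        major_mm, minor_mm = major_px * MM_PER_PX, minor_px * MM_PER_PX
        mass_g = MASS_SLOPE * major_mm + MASS_OFFSET  # placeholder calibration
        results.append((major_mm, minor_mm, mass_g))
    return results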

12 pages, 4807 KiB  
Article
Fast Phenomics in Vineyards: Development of GRover, the Grapevine Rover, and LiDAR for Assessing Grapevine Traits in the Field
by Matthew H. Siebers, Everard J. Edwards, Jose A. Jimenez-Berni, Mark R. Thomas, Michael Salim and Rob R. Walker
Sensors 2018, 18(9), 2924; https://doi.org/10.3390/s18092924 - 3 Sep 2018
Cited by 27 | Viewed by 8038
Abstract
This paper introduces GRover (the grapevine rover), an adaptable mobile platform for the deployment and testing of proximal imaging sensors in vineyards for the non-destructive assessment of trunk and cordon volume and pruning weight. A SICK LMS-400 light detection and ranging (LiDAR) radar mounted on GRover was capable of producing precise (±3 mm) 3D point clouds of vine rows. Vineyard scans of the grapevine variety Shiraz grown under different management systems at two separate locations have demonstrated that GRover is able to successfully reproduce a variety of vine structures. Correlations of pruning weight and vine wood (trunk and cordon) volume with LiDAR scans have resulted in high coefficients of determination (R2 = 0.91 for pruning weight; 0.76 for wood volume). This is the first time that a LiDAR of this type has been extensively tested in vineyards. Its high scanning rate, eye safe laser and ability to distinguish tissue types make it an appealing option for further development to offer breeders, and potentially growers, quantified measurements of traits that otherwise would be difficult to determine. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
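
The validation reported here is a linear correlation between LiDAR-derived quantities and manual reference measurements. As a generic illustration, with hypothetical per-vine values rather than the Shiraz data, the fit and its coefficient of determination can be obtained as follows.

import numpy as np
from scipy.stats import linregress

# Hypothetical values: LiDAR-derived wood volume (m^3) vs. measured pruning weight (kg)
lidar_volume = np.array([0.012, 0.018, 0.021, 0.027, 0.033, 0.040])
pruning_weight = np.array([0.9, 1.4, 1.6, 2.1, 2.6, 3.2])

fit = linregress(lidar_volume, pruning_weight)
print(f"pruning weight ~ {fit.slope:.1f} * volume + {fit.intercept:.2f} "
      f"(R2 = {fit.rvalue ** 2:.2f})")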

14 pages, 3928 KiB  
Article
A Sound Source Localisation Analytical Method for Monitoring the Abnormal Night Vocalisations of Poultry
by Xiaodong Du, Fengdan Lao and Guanghui Teng
Sensors 2018, 18(9), 2906; https://doi.org/10.3390/s18092906 - 1 Sep 2018
Cited by 41 | Viewed by 6844
Abstract
Due to the increasing scale of farms, it is increasingly difficult for farmers to monitor their animals in an automated way. Because of this problem, we focused on a sound technique to monitor laying hens. Sound analysis has become an important tool for studying the behaviour, health and welfare of animals in recent years. A surveillance system using microphone arrays of Kinects was developed for automatically monitoring birds’ abnormal vocalisations during the night. Based on the principle of time-difference of arrival (TDOA) of sound source localisation (SSL) method, Kinect sensor direction estimations were very accurate. The system had an accuracy of 74.7% in laboratory tests and 73.6% in small poultry group tests for different area sound recognition. Additionally, flocks produced an average of 40 sounds per bird during feeding time in small group tests. It was found that, on average, each normal chicken produced more than 53 sounds during the daytime (noon to 6:00 p.m.) and less than one sound at night (11:00 p.m.–3:00 a.m.). This system can be used to detect anomalous poultry status at night by monitoring the number of vocalisations and area distributions, which provides a practical and feasible method for the study of animal behaviour and welfare. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
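
Sound source localisation from microphone arrays typically starts from time-difference-of-arrival estimates. A minimal GCC-PHAT sketch for a single microphone pair is given below as background; it is not the Kinect-array implementation used in the study, and the microphone spacing is an assumption.

import numpy as np

def gcc_phat_tdoa(sig, ref, fs, max_tau=None):
    """Time difference of arrival between two microphone signals using the
    GCC-PHAT cross-correlation, a common building block of TDOA-based SSL."""
    n = sig.size + ref.size
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-12                     # PHAT weighting: keep phase only
    cc = np.fft.irfft(R, n=n)
    max_shift = n // 2 if max_tau is None else min(int(fs * max_tau), n // 2)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / fs                          # delay in seconds

def doa_from_tdoa(tau, mic_distance, speed_of_sound=343.0):
    """Broadside direction-of-arrival angle (radians) for a two-microphone pair."""
    x = np.clip(speed_of_sound * tau / mic_distance, -1.0, 1.0)
    return np.arcsin(x)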

12 pages, 3564 KiB  
Article
Spectral Identification of Disease in Weeds Using Multilayer Perceptron with Automatic Relevance Determination
by Afroditi Alexandra Tamouridou, Xanthoula Eirini Pantazi, Thomas Alexandridis, Anastasia Lagopodi, Giorgos Kontouris and Dimitrios Moshou
Sensors 2018, 18(9), 2770; https://doi.org/10.3390/s18092770 - 23 Aug 2018
Cited by 15 | Viewed by 5142
Abstract
Microbotryum silybum, a smut fungus, is studied as an agent for the biological control of Silybum marianum (milk thistle) weed. Confirmation of the systemic infection is essential in order to assess the effectiveness of the biological control application and assist decision-making. Nonetheless, in situ diagnosis is challenging. The presently demonstrated research illustrates the identification process of systemically infected S. marianum plants by means of field spectroscopy and the multilayer perceptron/automatic relevance determination (MLP-ARD) network. Leaf spectral signatures were obtained from both healthy and infected S. marianum plants using a portable visible and near-infrared spectrometer (310–1100 nm). The MLP-ARD algorithm was applied for the recognition of the infected S. marianum plants. Pre-processed spectral signatures served as input features. The spectra pre-processing consisted of normalization, and second derivative and principal component extraction. MLP-ARD reached a high overall accuracy (90.32%) in the identification process. The research results establish the capacity of MLP-ARD to precisely identify systemically infected S. marianum weeds during their vegetative growth stage. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
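
The classification pipeline (normalisation, second-derivative spectra, principal components, neural network) can be sketched as follows. Note that scikit-learn's MLPClassifier is used here only as a stand-in for the Bayesian MLP-ARD network, and the data and hyperparameters are placeholders.

import numpy as np
from scipy.signal import savgol_filter
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def preprocess_spectra(X):
    """Normalise each leaf spectrum and take its Savitzky-Golay second derivative,
    mirroring the pre-processing steps listed in the abstract."""
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    return savgol_filter(X, window_length=11, polyorder=2, deriv=2, axis=1)

# X: (n_leaves, n_wavelengths) signatures over 310-1100 nm; y: 0 = healthy, 1 = infected.
rng = np.random.default_rng(1)
X = rng.random((80, 256))                       # placeholder spectra
y = rng.integers(0, 2, 80)                      # placeholder labels
clf = make_pipeline(StandardScaler(), PCA(n_components=10),
                    MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000,
                                  random_state=0))
scores = cross_val_score(clf, preprocess_spectra(X), y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")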

14 pages, 3662 KiB  
Article
Non-Destructive Classification of Diversely Stained Capsicum annuum Seed Specimens of Different Cultivars Using Near-Infrared Imaging Based Optical Intensity Detection
by Jyothsna Konkada Manattayil, Naresh Kumar Ravichandran, Ruchire Eranga Wijesinghe, Muhammad Faizan Shirazi, Seung-Yeol Lee, Pilun Kim, Hee-Young Jung, Mansik Jeon and Jeehyun Kim
Sensors 2018, 18(8), 2500; https://doi.org/10.3390/s18082500 - 1 Aug 2018
Cited by 8 | Viewed by 4300
Abstract
The non-destructive classification of plant materials using optical inspection techniques has been gaining much recent attention in the field of agriculture research. Among them, a near-infrared (NIR) imaging method called optical coherence tomography (OCT) has become a well-known agricultural inspection tool over the last decade. Here we investigated the non-destructive identification capability of OCT to classify diversely stained (with various staining agents) Capsicum annuum seed specimens of different cultivars. A swept source (SS-OCT) system with a spectral band of 1310 nm was used to image unstained control C. annuum seeds along with diversely stained Capsicum seeds belonging to different cultivars, such as C. annuum cv. PR Ppareum, C. annuum cv. PR Yeol, and C. annuum cv. Asia Jeombo. The obtained cross-sectional images were further analyzed for changes in the intensity of back-scattered light (resulting from dye pigment material and internal morphological variations) using a depth scan profiling technique to identify the differences among the seed categories. The depth scan profiling results revealed that the control specimens exhibit less back-scattered light intensity in depth scan profiles when compared to the stained seed specimens. Furthermore, a significant back-scattered light intensity difference among the different cultivar groups can be identified as well. Thus, the potential capability of the OCT-based depth scan profiling technique for the non-destructive classification of diversely stained C. annuum seed specimens of different cultivars is confirmed through the proposed scheme. Hence, when compared to conventional seed sorting techniques, OCT offers multipurpose advantages by sorting seeds with respect to their dye staining while providing internal structural images non-destructively. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
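
The depth scan profiling technique amounts to comparing back-scattered intensity as a function of depth between seed groups. A minimal numpy illustration is given below, under the assumption that each OCT cross-section is available as a 2-D intensity array of equal depth length; it is not the authors' analysis code.

import numpy as np

def mean_depth_profile(bscan):
    """Average back-scattered intensity as a function of depth for one OCT
    cross-section (B-scan), given as a (depth, lateral) intensity array."""
    return np.asarray(bscan, float).mean(axis=1)

def group_separation(profiles_a, profiles_b):
    """Crude separation score between two seed groups: absolute difference of
    the group-mean depth profiles, summed over depth."""
    a = np.mean([mean_depth_profile(p) for p in profiles_a], axis=0)
    b = np.mean([mean_depth_profile(p) for p in profiles_b], axis=0)
    return float(np.abs(a - b).sum())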

35 pages, 11914 KiB  
Article
An Ultra-Wideband Frequency System for Non-Destructive Root Imaging
by Thomas Truong, Anh Dinh and Khan Wahid
Sensors 2018, 18(8), 2438; https://doi.org/10.3390/s18082438 - 26 Jul 2018
Cited by 10 | Viewed by 4070
Abstract
Understanding the root system architecture of plants as they develop is critical for increasing crop yields through plant phenotyping, and ultra-wideband imaging systems have shown potential as a portable, low-cost solution to non-destructive imaging root system architectures. This paper presents the design, implementation, and analysis of an ultra-wideband imaging system for use in imaging potted plant root system architectures. The proposed system is separated into three main subsystems: a Data Acquisition module, a Data Processing module, and an Image Processing and Analysis module. The Data Acquisition module consists of simulated and experimental implementations of a non-contact synthetic aperture radar system to measure ultra-wideband signal reflections from concealed scattering objects in a pot containing soil. The Data Processing module is responsible for interpreting the measured ultra-wideband signals and producing an image using a delay-and-sum beamforming algorithm. The Image Processing and Analysis module is responsible for improving image quality and measuring root depth and average root diameter in an unsupervised manner. The Image Processing and Analysis module uses a modified top-hat transformation alongside quantization methods based on energy distributions in the image to isolate the surface of the imaged root. Altogether, the proposed subsystems are capable of imaging and measuring concealed taproot system architectures with controlled soil conditions; however, the performance of the system is highly dependent on knowledge of the soil conditions. Smaller roots in difficult imaging conditions require future work into understanding and compensating for unwanted noise. Ultimately, this paper sought to provide insight into improving imaging quality of ultra-wideband (UWB) imaging systems for plant root imaging for other works to be followed. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
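
The image formation step named in the abstract is delay-and-sum beamforming. The following numpy sketch shows the generic algorithm for a monostatic scan; the array geometry, sampling rate and soil propagation speed are assumptions, not the paper's values.

import numpy as np

def delay_and_sum_image(traces, antenna_positions, pixels, fs, velocity):
    """Minimal delay-and-sum reconstruction for a monostatic UWB scan.

    traces:            (n_positions, n_samples) received time-domain signals
    antenna_positions: (n_positions, 2) antenna positions (m)
    pixels:            (n_pixels, 2) image-point coordinates (m)
    fs:                sampling rate (Hz)
    velocity:          assumed propagation speed in the soil (m/s)
    """
    n_pos, n_samples = traces.shape
    image = np.zeros(len(pixels))
    for p, pix in enumerate(pixels):
        # two-way travel time from each antenna position to this pixel
        dist = np.linalg.norm(antenna_positions - pix, axis=1)
        idx = np.round(2.0 * dist / velocity * fs).astype(int)
        valid = idx < n_samples
        image[p] = traces[np.arange(n_pos)[valid], idx[valid]].sum()
    return image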

28 pages, 4439 KiB  
Article
Optimal Polygon Decomposition for UAV Survey Coverage Path Planning in Wind
by Matthew Coombes, Tom Fletcher, Wen-Hua Chen and Cunjia Liu
Sensors 2018, 18(7), 2132; https://doi.org/10.3390/s18072132 - 3 Jul 2018
Cited by 53 | Viewed by 12357
Abstract
In this paper, a new method for planning coverage paths for fixed-wing Unmanned Aerial Vehicle (UAV) aerial surveys is proposed. Instead of the more generic coverage path planning techniques presented in previous literature, this method specifically concentrates on decreasing the flight time of fixed-wing aircraft surveys. This is achieved in three ways: by adding wind to the survey flight-time model, by accounting for the fact that fixed-wing aircraft are not constrained to fly within the polygon of the region of interest, and by an intelligent method for decomposing the region into convex polygons conducive to quick flight times. It is shown that wind can make a large difference to survey time, and that flying perpendicular to the wind can confer a flight-time advantage. Small UAVs, which have very slow airspeeds, can easily find themselves flying in wind that is 50% of their airspeed, which is why the technique is so effective: ignoring wind for small, slow, fixed-wing aircraft is a considerable oversight. Comparing this method to previous techniques using a Monte Carlo simulation on randomised polygons shows a significant reduction in flight time. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
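
The flight-time argument rests on the standard wind-triangle relation between airspeed, wind and the achievable ground speed along a transect. A small sketch of that model follows (turn time ignored, all numbers illustrative).

import numpy as np

def ground_speed(airspeed, track_unit, wind):
    """Achievable ground speed along a unit track vector for a fixed-wing UAV
    holding constant airspeed in a steady wind (wind-triangle relation)."""
    w_along = float(np.dot(wind, track_unit))
    w_cross = float(wind[0] * track_unit[1] - wind[1] * track_unit[0])
    if airspeed <= abs(w_cross):
        raise ValueError("wind too strong to hold this track")
    return np.sqrt(airspeed ** 2 - w_cross ** 2) + w_along

def back_and_forth_time(airspeed, track_unit, wind, leg_length, n_legs):
    """Total time for n_legs parallel transects flown alternately up- and
    down-track, illustrating why wind direction matters for survey time."""
    t_fwd = leg_length / ground_speed(airspeed, track_unit, wind)
    t_back = leg_length / ground_speed(airspeed, -track_unit, wind)
    n_fwd = (n_legs + 1) // 2
    return n_fwd * t_fwd + (n_legs - n_fwd) * t_back

track = np.array([1.0, 0.0])   # east-west transects (assumed)
wind = np.array([4.0, 3.0])    # 5 m/s wind (assumed)
print(f"{back_and_forth_time(14.0, track, wind, 500.0, 10):.0f} s for 10 x 500 m legs")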

15 pages, 3682 KiB  
Article
Detecting and Monitoring the Flavor of Tomato (Solanum lycopersicum) under the Impact of Postharvest Handlings by Physicochemical Parameters and Electronic Nose
by Sai Xu, Xiuxiu Sun, Huazhong Lu, Hui Yang, Qingsong Ruan, Hao Huang and Minglin Chen
Sensors 2018, 18(6), 1847; https://doi.org/10.3390/s18061847 - 6 Jun 2018
Cited by 34 | Viewed by 4571
Abstract
The objective of this study was to detect and monitor the flavor of tomatoes, as impacted by different postharvest handlings, including chilling storage (CS) and blanching treatment (BT). CS tomatoes were stored in a refrigerator at 5 °C and tested at storage day 0, 3, and 7. BT tomatoes were dipped in 50 or 100 °C water for 1 min, and tested immediately. The taste, mouth feel, and aroma of tomatoes were evaluated by testing the total soluble solid content (TSS), titratable acidity (TA), ratio of TSS and TA (TSS/TA), firmness, and electronic nose (E-nose) response to tomatoes. The experimental results showed that CS can prevent taste and firmness loss to a certain extent, but the sensory results indicated that CS accelerated flavor loss because the TSS/TA of CS tomatoes increased more slowly than that of the control. The taste and firmness of tomatoes were impacted slightly by 50 °C BT, and were significantly impacted by 100 °C BT. Based on physicochemical parameters, different postharvest handling treatments for tomatoes could not be classified, except for the 100 °C BT treated tomatoes, which were significantly impacted in terms of taste and mouth feel. The E-nose is an efficient way to detect differences in postharvest handling treatments for tomatoes, and indicated significant aroma changes for CS and BT treated tomato fruit. The classification of tomatoes after different postharvest handling treatments based on comprehensive flavor (physicochemical parameters and E-nose combined data) is better than that based on physicochemical parameters or E-nose data alone, and the comprehensive flavor of 100 °C BT tomatoes changed the most. Even so, it is suggested that tomato flavor change during postharvest handling be detected and monitored using E-nose data alone. The E-nose also proved to be a feasible way to predict the TSS and firmness of tomato fruit, rather than TA or TSS/TA, during the postharvest handling process. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)

24 pages, 7553 KiB  
Article
A Kinect-Based Segmentation of Touching-Pigs for Real-Time Monitoring
by Miso Ju, Younchang Choi, Jihyun Seo, Jaewon Sa, Sungju Lee, Yongwha Chung and Daihee Park
Sensors 2018, 18(6), 1746; https://doi.org/10.3390/s18061746 - 29 May 2018
Cited by 40 | Viewed by 5380
Abstract
Segmenting touching-pigs in real-time is an important issue for surveillance cameras intended for the 24-h tracking of individual pigs. However, methods to do so have not yet been reported. We particularly focus on the segmentation of touching-pigs in a crowded pig room with low-contrast images obtained using a Kinect depth sensor. We reduce the execution time by combining object detection techniques based on a convolutional neural network (CNN) with image processing techniques instead of applying time-consuming operations, such as optimization-based segmentation. We first apply the fastest CNN-based object detection technique (i.e., You Only Look Once, YOLO) to solve the separation problem for touching-pigs. If the quality of the YOLO output is not satisfied, then we try to find the possible boundary line between the touching-pigs by analyzing the shape. Our experimental results show that this method is effective to separate touching-pigs in terms of both accuracy (i.e., 91.96%) and execution time (i.e., real-time execution), even with low-contrast images obtained using a Kinect depth sensor. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
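
When the detector output is unsatisfactory, the authors search for a boundary line between the touching animals by shape analysis. A common generic alternative for splitting touching blobs in a depth-derived binary mask is a distance-transform watershed; the sketch below uses scikit-image and is a stand-in for illustration, not the paper's YOLO-plus-shape method, and the peak separation is an assumed parameter.

import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.feature import peak_local_max

def split_touching_blobs(mask):
    """Split a binary mask (True where any animal is present) of touching objects
    into labelled regions using a distance-transform watershed."""
    distance = ndi.distance_transform_edt(mask)
    # seed the watershed from local maxima of the distance map (assumed 20 px separation)
    peaks = peak_local_max(distance, min_distance=20, labels=mask.astype(int))
    markers = np.zeros(mask.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-distance, markers, mask=mask)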

Review


29 pages, 1430 KiB  
Review
Machine Learning in Agriculture: A Review
by Konstantinos G. Liakos, Patrizia Busato, Dimitrios Moshou, Simon Pearson and Dionysis Bochtis
Sensors 2018, 18(8), 2674; https://doi.org/10.3390/s18082674 - 14 Aug 2018
Cited by 1690 | Viewed by 103712
Abstract
Machine learning has emerged with big data technologies and high-performance computing to create new opportunities for data intensive science in the multi-disciplinary agri-technologies domain. In this paper, we present a comprehensive review of research dedicated to applications of machine learning in agricultural production systems. The works analyzed were categorized into (a) crop management, including applications on yield prediction, disease detection, weed detection, crop quality, and species recognition; (b) livestock management, including applications on animal welfare and livestock production; (c) water management; and (d) soil management. The filtering and classification of the presented articles demonstrate how agriculture will benefit from machine learning technologies. By applying machine learning to sensor data, farm management systems are evolving into real-time, artificial intelligence enabled programs that provide rich recommendations and insights for farmer decision support and action. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)

Other


7 pages, 986 KiB  
Case Report
Spectral Data Collection by Dual Field-of-View System under Changing Atmospheric Conditions—A Case Study of Estimating Early Season Soybean Populations
by Ittai Herrmann, Steven K. Vosberg, Philip A. Townsend and Shawn P. Conley
Sensors 2019, 19(3), 457; https://doi.org/10.3390/s19030457 - 23 Jan 2019
Cited by 7 | Viewed by 3229
Abstract
There is an increasing interest in using hyperspectral data for phenotyping and crop management while overcoming the challenge of changing atmospheric conditions. The Piccolo dual field-of-view system collects up- and downwelling radiation nearly simultaneously with one spectrometer. Such systems offer great promise for crop monitoring under highly variable atmospheric conditions. Here, the system’s utility from a tractor-mounted boom was demonstrated for a case study of estimating soybean plant populations in early vegetative stages. The Piccolo system is described and its performance under changing sky conditions is assessed for two replicates of the same experiment. Plant population was estimated by partial least squares regression (PLSR), resulting in stable estimations by models calibrated and validated under sunny and cloudy or cloudy and sunny conditions, respectively. We conclude that the Piccolo system is effective for data collection under variable atmospheric conditions, and we show its feasibility of operation for precision agriculture research and potential commercial applications. Full article
(This article belongs to the Special Issue Sensors in Agriculture 2018)
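
The dual field-of-view idea is that dividing near-simultaneous upwelling and downwelling spectra largely cancels changes in illumination. A minimal sketch of that ratio step follows; the optional panel calibration is an assumption, and the resulting spectra could then feed a PLSR model such as the one sketched earlier for the mango study.

import numpy as np

def reflectance_factor(upwelling, downwelling, cal=None):
    """Approximate canopy reflectance from near-simultaneous up- and downwelling
    spectra: dividing by the downwelling signal suppresses much of the variation
    caused by changing sky conditions.

    upwelling, downwelling: (n_scans, n_bands) raw spectra
    cal: optional (n_bands,) calibration derived from a reference panel (assumed)
    """
    r = np.asarray(upwelling, float) / np.clip(np.asarray(downwelling, float), 1e-9, None)
    return r * cal if cal is not None else r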