
Sensors and Robotics for Digital Agriculture

A topical collection in Sensors (ISSN 1424-8220). This collection belongs to the section "Sensors and Robotics".

Viewed by 123455
Printed Edition Available!
A printed edition of this Special Issue is available here.

Editors


Prof. Dr. Dionysis Bochtis
Guest Editor
Institute for Bio-Economy and Agri-Technology (iBO), Centre for Research & Technology Hellas (CERTH), 38333 Volos, Greece
Interests: operation management; supply chain automation; agri-robotics; ICT-agri

Dr. Aristotelis C. Tagarakis
Guest Editor
Institute for Bio-Economy and Agri-Technology (iBO), Centre for Research & Technology Hellas (CERTH), 38333 Volos, Greece
Interests: precision agriculture; remote sensing; sensor networks; IoT; digital farming; decision support systems; agricultural engineering; agricultural automations

Topical Collection Information

Dear Colleagues,

In recent years, there has been growing interest in sensors and robotic systems as part of the digitalization of agriculture, which has the potential to vastly increase the efficiency and sustainability of agricultural systems. Agricultural robots (including automation and intelligent IT systems) can accomplish various tasks, leading to more efficient farm management and improved profitability. The sensors deployed on agricultural robots are an essential component of both the robots' autonomy and their agronomic functions. With regard to autonomy, these include navigation sensors, context- and situation-awareness sensors, and sensors ensuring safe execution of the operation; with regard to agronomic functions, they include sensor technologies for yield mapping and measurement, soil sensing, nutrient and pesticide application, irrigation control, selective harvesting, and more, all within the framework of precision agriculture applications.

The purpose of this Topical Collection is to publish research articles, as well as review articles, addressing recent advances in systems and processes in the field of sensors and robotics within the concept of precision agriculture. Original, high-quality contributions that have not yet been published and that are not currently under review by other journals or peer-reviewed conferences are sought.

Indicatively, research topics include:

  • Human–robot interaction;
  • Computer vision;
  • Robot sensing systems;
  • Artificial intelligence and machine learning;
  • Sensor fusion in agri-robotics;
  • Variable rate applications;
  • Farm management information systems;
  • Remote sensing;
  • ICT applications;
  • UAVs in agriculture;
  • Agri-robotics navigation and awareness;
  • SLAM—Simultaneous localization and mapping;
  • Resource-constrained navigation in agricultural environments;
  • Mapping and obstacle avoidance in agricultural environments.

Prof. Dr. Dionysis Bochtis
Dr. Aristotelis C. Tagarakis
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the collection website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (12 papers)

2024


23 pages, 12038 KiB  
Article
Research on Assimilation of Unmanned Aerial Vehicle Remote Sensing Data and AquaCrop Model
by Wei Li, Manpeng Li, Muhammad Awais, Leilei Ji, Haoming Li, Rui Song, Muhammad Jehanzeb Masud Cheema and Ramesh Agarwal
Sensors 2024, 24(10), 3255; https://doi.org/10.3390/s24103255 - 20 May 2024
Viewed by 1082
Abstract
This study takes the AquaCrop crop model as its research object. Owing to the complexity and uncertainty of the crop growth process, the crop model alone achieves accurate simulation only at the single-point scale. To extend the model's application scale, this study inverted the canopy coverage of a tea garden using UAV multispectral technology, adopted the particle swarm optimization algorithm to assimilate the canopy coverage into the crop model, constructed the AquaCrop-PSO assimilation model, and compared the canopy coverage and yield simulation results with those of the localized model. A significant regression relationship was found between all vegetation indices and canopy coverage. Among the single vegetation index regression models, the logarithmic model constructed from OSAVI had the highest inversion accuracy, with an R2 of 0.855 and an RMSE of 5.75. For tea yield simulated by the AquaCrop-PSO model against measured values, the R2 and RMSE were 0.927 and 0.12, respectively. The canopy coverage R2 for each simulated growth period generally exceeded 0.9, and simulation accuracy improved by about 19.8% compared with that of the localized model. The results show that crop model simulation accuracy can be effectively improved by retrieving crop parameters via UAV remote sensing and assimilating them into the crop model. Full article
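To make the assimilation idea above concrete, here is a minimal Python sketch of the two steps the abstract describes: fitting a logarithmic OSAVI-to-canopy-coverage inversion model and then calibrating crop-model parameters with a basic particle swarm optimizer. This is not the authors' AquaCrop-PSO implementation; the data are placeholders and simulate_cc merely stands in for an AquaCrop run.

```python
import numpy as np
from scipy.optimize import curve_fit

# --- Step 1: invert canopy coverage (CC) from a vegetation index (placeholder data) ---
def log_model(osavi, a, b):
    return a * np.log(osavi) + b          # logarithmic inversion model, as in the paper

osavi = np.array([0.25, 0.35, 0.45, 0.55, 0.65])    # UAV-derived OSAVI (placeholder values)
cc_obs = np.array([38.0, 52.0, 63.0, 71.0, 78.0])   # observed canopy coverage [%]
(a, b), _ = curve_fit(log_model, osavi, cc_obs)

# --- Step 2: assimilate CC into a crop model with a basic particle swarm optimizer ---
def simulate_cc(params):
    """Placeholder for an AquaCrop run returning canopy coverage per observation date."""
    cgc, ccx = params                     # e.g., canopy growth coefficient, max canopy cover
    t = np.arange(len(cc_obs))
    return np.minimum(ccx, 100 * (1 - np.exp(-cgc * (t + 1))))

def cost(params):
    return np.sqrt(np.mean((simulate_cc(params) - log_model(osavi, a, b)) ** 2))  # RMSE

rng = np.random.default_rng(0)
bounds = np.array([[0.05, 1.0], [40.0, 100.0]])      # parameter search ranges (assumed)
pos = rng.uniform(bounds[:, 0], bounds[:, 1], size=(20, 2))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_cost)]

for _ in range(100):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, bounds[:, 0], bounds[:, 1])
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[np.argmin(pbest_cost)]

print("calibrated parameters:", gbest)
```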

2023


5 pages, 380 KiB  
Editorial
Sensors and Robotics for Digital Agriculture
by Aristotelis C. Tagarakis and Dionysis Bochtis
Sensors 2023, 23(16), 7255; https://doi.org/10.3390/s23167255 - 18 Aug 2023
Viewed by 1773
Abstract
The latest advances in innovative sensing and data technologies have led to an increasing implementation of autonomous systems in agricultural production processes [...] Full article

23 pages, 2014 KiB  
Review
Human–Robot Interaction in Agriculture: A Systematic Review
by Lefteris Benos, Vasileios Moysiadis, Dimitrios Kateris, Aristotelis C. Tagarakis, Patrizia Busato, Simon Pearson and Dionysis Bochtis
Sensors 2023, 23(15), 6776; https://doi.org/10.3390/s23156776 - 28 Jul 2023
Cited by 16 | Viewed by 3951
Abstract
In the pursuit of optimizing the efficiency, flexibility, and adaptability of agricultural practices, human–robot interaction (HRI) has emerged in agriculture. Enabled by the ongoing advancement in information and communication technologies, this approach aspires to overcome the challenges originating from the inherently complex agricultural environments. This paper systematically reviews the scholarly literature to capture the current progress and trends in this promising field as well as to identify future research directions. It can be inferred that there is a growing interest in this field, which relies on combining perspectives from several disciplines to obtain a holistic understanding. The subject of the selected papers is mainly synergistic target detection, while simulation was the main methodology. Furthermore, melons, grapes, and strawberries were the crops with the highest interest for HRI applications. Finally, collaboration and cooperation were the most preferred interaction modes, with various levels of automation being examined. On all occasions, the synergy of humans and robots demonstrated the best results in terms of system performance, physical workload of workers, and time needed to execute the performed tasks. However, despite the associated progress, there is still a long way to go towards establishing viable, functional, and safe human–robot interactive systems. Full article

2022


20 pages, 6689 KiB  
Article
Agrobot Lala—An Autonomous Robotic System for Real-Time, In-Field Soil Sampling, and Analysis of Nitrates
by Goran Kitić, Damir Krklješ, Marko Panić, Csaba Petes, Slobodan Birgermajer and Vladimir Crnojević
Sensors 2022, 22(11), 4207; https://doi.org/10.3390/s22114207 - 31 May 2022
Cited by 14 | Viewed by 7069
Abstract
This paper presents an autonomous robotic system, an unmanned ground vehicle (UGV), for in-field soil sampling and analysis of nitrates. Compared to standard methods of soil analysis, it has several advantages: each sample is analyzed individually rather than as an averaged composite sample; each sample is georeferenced, providing a map for precision base fertilization; the process is fully autonomous; samples are analyzed in real time, at approximately 30 min per sample; and the system is lightweight, causing less soil compaction. The robotic system comprises several modules: a commercial robotic platform, an anchoring module, a sampling module, a sample preparation module, a sample analysis module, and a communication module. The system is augmented with an in-house developed cloud-based platform. This platform uses satellite images and a proprietary artificial intelligence (AI) algorithm to divide the target field into representative zones for sampling, thus reducing and optimizing the number and locations of the samples. Based on this, a task is created for the robot to sample automatically at those locations. The user is provided with an in-house developed smartphone app for overviewing and monitoring the task and for changing, removing, and adding sampling points. The measurement results are uploaded to the cloud for further analysis and for the creation of prescription maps for variable-rate base fertilization. Full article
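The zone-delineation step described above can be illustrated with a rough sketch: the proprietary AI algorithm is replaced here by plain k-means clustering of hypothetical per-pixel NDVI and position features, and one sampling point is chosen per zone. All names and values below are invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-pixel features for one field: NDVI plus x/y coordinates (metres).
rng = np.random.default_rng(1)
ndvi = rng.uniform(0.2, 0.9, size=(100, 100))
xs, ys = np.meshgrid(np.arange(100.0), np.arange(100.0))
# Down-weight the coordinates so NDVI dominates the clustering.
features = np.column_stack([ndvi.ravel(), 0.01 * xs.ravel(), 0.01 * ys.ravel()])

# Delineate a handful of sampling zones (a simple stand-in for the proprietary AI step).
n_zones = 5
zones = KMeans(n_clusters=n_zones, n_init=10, random_state=0).fit(features)

# Choose one sampling location per zone: the pixel closest to the zone centroid.
sampling_points = []
for z in range(n_zones):
    idx = np.where(zones.labels_ == z)[0]
    d = np.linalg.norm(features[idx] - zones.cluster_centers_[z], axis=1)
    best = idx[np.argmin(d)]
    sampling_points.append((xs.ravel()[best], ys.ravel()[best]))

print("robot sampling targets (x, y):", sampling_points)
```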

18 pages, 1229 KiB  
Article
Behavioural Classification of Cattle Using Neck-Mounted Accelerometer-Equipped Collars
by Dejan Pavlovic, Mikolaj Czerkawski, Christopher Davison, Oskar Marko, Craig Michie, Robert Atkinson, Vladimir Crnojevic, Ivan Andonovic, Vladimir Rajovic, Goran Kvascev and Christos Tachtatzis
Sensors 2022, 22(6), 2323; https://doi.org/10.3390/s22062323 - 17 Mar 2022
Cited by 14 | Viewed by 3822
Abstract
Monitoring and classification of dairy cattle behaviours is essential for optimising milk yields. Early detection of illness, days before the critical conditions occur, together with automatic detection of the onset of oestrus cycles is crucial for obviating prolonged cattle treatments and improving the pregnancy rates. Accelerometer-based sensor systems are becoming increasingly popular, as they are automatically providing information about key cattle behaviours such as the level of restlessness and the time spent ruminating and eating, proxy measurements that indicate the onset of heat events and overall welfare, at an individual animal level. This paper reports on an approach to the development of algorithms that classify key cattle states based on a systematic dimensionality reduction process through two feature selection techniques. These are based on Mutual Information and Backward Feature Elimination and applied on knowledge-specific and generic time-series extracted from raw accelerometer data. The extracted features are then used to train classification models based on a Hidden Markov Model, Linear Discriminant Analysis and Partial Least Squares Discriminant Analysis. The proposed feature engineering methodology permits model deployment within the computing and memory restrictions imposed by operational settings. The models were based on measurement data from 18 steers, each animal equipped with an accelerometer-based neck-mounted collar and muzzle-mounted halter, the latter providing the truthing data. A total of 42 time-series features were initially extracted and the trade-off between model performance, computational complexity and memory footprint was explored. Results show that the classification model that best balances performance and computation complexity is based on Linear Discriminant Analysis using features selected through Backward Feature Elimination. The final model requires 1.83 ± 1.00 ms to perform feature extraction with 0.05 ± 0.01 ms for inference with an overall balanced accuracy of 0.83. Full article
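A rough scikit-learn sketch of the "backward feature elimination + LDA" combination reported above is shown below. The 42-feature count mirrors the paper, but the data are synthetic, and the number of retained features, the folds, and the labels are arbitrary choices, not the authors' configuration.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in: windows of collar accelerometer data summarised into 42 time-series
# features, with behaviour labels (0 = ruminating, 1 = eating, 2 = other).
rng = np.random.default_rng(2)
X = rng.normal(size=(600, 42))
y = rng.integers(0, 3, size=600)

# Backward feature elimination wrapped around an LDA classifier.
lda = LinearDiscriminantAnalysis()
selector = SequentialFeatureSelector(lda, n_features_to_select=10, direction="backward", cv=3)
model = make_pipeline(selector, lda)

scores = cross_val_score(model, X, y, cv=5, scoring="balanced_accuracy")
print("balanced accuracy: %.2f ± %.2f" % (scores.mean(), scores.std()))
```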

14 pages, 28849 KiB  
Article
Proposing UGV and UAV Systems for 3D Mapping of Orchard Environments
by Aristotelis C. Tagarakis, Evangelia Filippou, Damianos Kalaitzidis, Lefteris Benos, Patrizia Busato and Dionysis Bochtis
Sensors 2022, 22(4), 1571; https://doi.org/10.3390/s22041571 - 17 Feb 2022
Cited by 27 | Viewed by 4970
Abstract
During the last decades, consumer-grade RGB-D (red green blue-depth) cameras have gained popularity for several applications in agricultural environments. Interestingly, these cameras are used for spatial mapping that can serve for robot localization and navigation. Mapping the environment for targeted robotic applications in agricultural fields is a particularly challenging task, owing to the high spatial and temporal variability, the possible unfavorable light conditions, and the unpredictable nature of these environments. The aim of the present study was to investigate the use of RGB-D cameras and an unmanned ground vehicle (UGV) for autonomously mapping the environment of commercial orchards as well as providing information about the tree height and canopy volume. The results from the ground-based mapping system were compared with the three-dimensional (3D) orthomosaics acquired by an unmanned aerial vehicle (UAV). Overall, both sensing methods led to similar height measurements, while the tree volume was more accurately calculated by the RGB-D cameras, as the 3D point cloud captured by the ground system was far more detailed. Finally, fusion of the two datasets provided the most precise representation of the trees. Full article
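As a simple illustration of how tree height and canopy volume might be derived from such a point cloud, the sketch below uses a synthetic single-tree cloud, takes height as the highest return above an estimated ground level, and approximates canopy volume with a convex hull. The thresholds and data are invented; the study's actual processing pipeline is not reproduced here.

```python
import numpy as np
from scipy.spatial import ConvexHull

# Hypothetical fused UGV/UAV point cloud for one tree, as N x 3 (x, y, z) in metres.
rng = np.random.default_rng(3)
canopy = rng.normal(loc=[0.0, 0.0, 2.5], scale=[1.0, 1.0, 0.8], size=(2000, 3))
ground = np.column_stack([rng.uniform(-2, 2, 300), rng.uniform(-2, 2, 300),
                          rng.normal(0.0, 0.02, 300)])
points = np.vstack([canopy, ground])

# Tree height: highest return above the local ground level (here the 5th-percentile z).
ground_z = np.percentile(points[:, 2], 5)
height = points[:, 2].max() - ground_z

# Canopy volume: convex hull of the points well above ground (a common simple approximation).
canopy_pts = points[points[:, 2] > ground_z + 0.5]
volume = ConvexHull(canopy_pts).volume

print(f"tree height ≈ {height:.2f} m, canopy volume ≈ {volume:.2f} m^3")
```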

16 pages, 1052 KiB  
Review
Toward the Next Generation of Digitalization in Agriculture Based on Digital Twin Paradigm
by Abozar Nasirahmadi and Oliver Hensel
Sensors 2022, 22(2), 498; https://doi.org/10.3390/s22020498 - 10 Jan 2022
Cited by 129 | Viewed by 14838
Abstract
Digitalization has impacted agricultural and food production systems, making the application of technologies and advanced data processing techniques in the agricultural field possible. Digital farming aims to use the information available from agricultural assets to address several existing challenges related to food security, climate protection, and resource management. However, the agricultural sector is complex and dynamic and requires sophisticated management systems. Digital approaches are expected to provide further optimization and decision-making support. A digital twin in agriculture is a virtual representation of a farm with great potential for enhancing productivity and efficiency while reducing energy usage and losses. This review describes the state of the art of digital twin concepts along with the different digital technologies and techniques used in agricultural contexts. It presents a general framework for digital twins in soil, irrigation, robotics, farm machinery, and post-harvest food processing. Data recording, modeling (including artificial intelligence, big data, and simulation), analysis, prediction, and communication aspects (e.g., Internet of Things, wireless technologies) of digital twins in agriculture are discussed. Digital twin systems can support farmers, as the next generation of the digitalization paradigm, through continuous and real-time monitoring of the physical world (the farm) and updating of the state of the virtual world. Full article
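A toy sketch of the monitoring-and-update loop that the digital twin paradigm implies is given below: sensor readings refresh the virtual state, which can then be queried for a what-if recommendation. The FieldTwin class and its irrigation rule are invented for illustration and are not taken from the review.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class FieldTwin:
    """Toy virtual representation of one field; state is refreshed from sensor readings."""
    state: Dict[str, float] = field(default_factory=dict)

    def update(self, reading: Dict[str, float]) -> None:
        self.state.update(reading)                # mirror the physical world into the twin

    def recommend_irrigation(self) -> float:
        # Simple rule on the virtual state; a real twin would run a simulation model here.
        deficit = max(0.0, 0.30 - self.state.get("soil_moisture", 0.30))
        return round(deficit * 100.0, 1)          # mm of water to apply (illustrative only)

twin = FieldTwin()
for reading in [{"soil_moisture": 0.27, "air_temp": 24.1},
                {"soil_moisture": 0.22, "air_temp": 27.8}]:   # stand-in IoT messages
    twin.update(reading)
    print("twin state:", twin.state, "-> irrigate", twin.recommend_irrigation(), "mm")
```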

2021


16 pages, 8395 KiB  
Article
Detecting Animal Contacts—A Deep Learning-Based Pig Detection and Tracking Approach for the Quantification of Social Contacts
by Martin Wutke, Felix Heinrich, Pronaya Prosun Das, Anita Lange, Maria Gentz, Imke Traulsen, Friederike K. Warns, Armin Otto Schmitt and Mehmet Gültas
Sensors 2021, 21(22), 7512; https://doi.org/10.3390/s21227512 - 12 Nov 2021
Cited by 30 | Viewed by 5829
Abstract
The identification of social interactions is of fundamental importance for animal behavioral studies, addressing numerous problems like investigating the influence of social hierarchical structures or the drivers of agonistic behavioral disorders. However, the majority of previous studies rely on the manual determination of the number and types of social encounters by direct observation, which requires a large amount of personnel and economic effort. To overcome this limitation and increase research efficiency and, thus, contribute to animal welfare in the long term, we propose in this study a framework for the automated identification of social contacts. In this framework, we apply a convolutional neural network (CNN) to detect the location and orientation of pigs within a video and track their movement trajectories over a period of time using a Kalman filter (KF) algorithm. Based on the tracking information, we automatically identify social contacts in the form of head–head and head–tail contacts. Moreover, by using the individual animal IDs, we construct a network of social contacts as the final output. We evaluated the performance of our framework based on two distinct test sets for pig detection and tracking. Consequently, we achieved a Sensitivity, Precision, and F1-score of 94.2%, 95.4%, and 95.1%, respectively, and a MOTA score of 94.4%. The findings of this study demonstrate the effectiveness of our keypoint-based tracking-by-detection strategy and can be applied to enhance animal monitoring systems. Full article
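The tracking step can be illustrated with a textbook constant-velocity Kalman filter that smooths per-frame detections of a single keypoint. This is a generic sketch, not the authors' implementation; detection, data association, and ID management are assumed to happen elsewhere, and the detections below are synthetic.

```python
import numpy as np

# Constant-velocity Kalman filter for one tracked keypoint with state (x, y, vx, vy).
dt = 1.0
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)  # motion model
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)                                 # we observe x, y
Q = 0.01 * np.eye(4)                                                              # process noise
R = 1.0 * np.eye(2)                                                               # measurement noise

x = np.zeros(4)            # initial state
P = 10.0 * np.eye(4)       # initial uncertainty

for z in np.array([[1.0, 0.9], [2.1, 2.0], [2.9, 3.2]]):   # synthetic per-frame detections
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the assigned detection
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    print("estimated position:", x[:2])
```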

18 pages, 17540 KiB  
Article
Weed Classification Using Explainable Multi-Resolution Slot Attention
by Sadaf Farkhani, Søren Kelstrup Skovsen, Mads Dyrmann, Rasmus Nyholm Jørgensen and Henrik Karstoft
Sensors 2021, 21(20), 6705; https://doi.org/10.3390/s21206705 - 9 Oct 2021
Cited by 8 | Viewed by 2722
Abstract
In agriculture, explainable deep neural networks (DNNs) can be used to pinpoint the discriminative part of weeds for an imagery classification task, albeit at a low resolution, to control the weed population. This paper proposes the use of a multi-layer attention procedure based on a transformer combined with a fusion rule to present an interpretation of the DNN decision through a high-resolution attention map. The fusion rule is a weighted average method that is used to combine attention maps from different layers based on saliency. Attention maps with an explanation for why a weed is or is not classified as a certain class help agronomists to shape the high-resolution weed identification keys (WIK) that the model perceives. The model is trained and evaluated on two agricultural datasets that contain plants grown under different conditions: the Plant Seedlings Dataset (PSD) and the Open Plant Phenotyping Dataset (OPPD). The model represents attention maps with highlighted requirements and information about misclassification to enable cross-dataset evaluations. State-of-the-art comparisons represent classification developments after applying attention maps. Average accuracies of 95.42% and 96% are gained for the negative and positive explanations of the PSD test sets, respectively. In OPPD evaluations, accuracies of 97.78% and 97.83% are obtained for negative and positive explanations, respectively. The visual comparison between attention maps also shows high-resolution information. Full article
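The saliency-weighted fusion of per-layer attention maps can be sketched in a few lines of NumPy. For simplicity, all maps here share one resolution and saliency is taken as total activation; both simplifications are assumptions made for illustration and do not reproduce the paper's exact fusion rule.

```python
import numpy as np

def fuse_attention(maps):
    """Weighted average of per-layer attention maps, weighting each layer by a simple
    saliency score (its total activation). A sketch of the fusion idea only."""
    maps = [m / (m.max() + 1e-8) for m in maps]            # normalise each map to [0, 1]
    weights = np.array([m.sum() for m in maps])
    weights = weights / weights.sum()
    fused = sum(w * m for w, m in zip(weights, maps))
    return fused / (fused.max() + 1e-8)

rng = np.random.default_rng(4)
layer_maps = [rng.random((32, 32)) for _ in range(4)]      # stand-ins for transformer layers
high_res_map = fuse_attention(layer_maps)
print(high_res_map.shape, high_res_map.max())
```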

20 pages, 7240 KiB  
Article
Orchard Mapping with Deep Learning Semantic Segmentation
by Athanasios Anagnostis, Aristotelis C. Tagarakis, Dimitrios Kateris, Vasileios Moysiadis, Claus Grøn Sørensen, Simon Pearson and Dionysis Bochtis
Sensors 2021, 21(11), 3813; https://doi.org/10.3390/s21113813 - 31 May 2021
Cited by 36 | Viewed by 5293
Abstract
This study aimed to propose an approach for orchard tree segmentation using aerial images based on a deep learning convolutional neural network variant, namely the U-net network. The purpose was the automated detection and localization of the canopy of orchard trees under various conditions (i.e., different seasons, different tree ages, different levels of weed coverage). The implemented dataset was composed of images from three different walnut orchards. The variability achieved in the dataset resulted in images that fell under seven different use cases. The best-trained model achieved 91%, 90%, and 87% accuracy for training, validation, and testing, respectively. The trained model was also tested on never-before-seen orthomosaic images of orchards using two methods (oversampling and undersampling) in order to tackle issues with transparent pixels outside the field boundary. Even though the training dataset did not contain orthomosaic images, the model achieved performance levels that reached up to 99%, demonstrating the robustness of the proposed approach. Full article
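Applying a trained segmentation model to large, never-before-seen orthomosaics typically involves tiled (sliding-window) inference; the sketch below illustrates that pattern with a dummy stand-in for the U-net model. The tile size, zero-padding of edge tiles, and the model_predict interface are assumptions for illustration, not the study's implementation.

```python
import numpy as np

def tiled_segmentation(orthomosaic, model_predict, tile=256):
    """Slide a window over a large orthomosaic and stitch per-tile canopy masks together.
    `model_predict` stands in for a trained segmentation model: (tile, tile, 3) -> (tile, tile)."""
    h, w, _ = orthomosaic.shape
    mask = np.zeros((h, w), dtype=np.float32)
    for top in range(0, h, tile):
        for left in range(0, w, tile):
            patch = orthomosaic[top:top + tile, left:left + tile]
            ph, pw, _ = patch.shape
            padded = np.zeros((tile, tile, 3), dtype=patch.dtype)
            padded[:ph, :pw] = patch                 # pad edge tiles to the model input size
            pred = model_predict(padded)
            mask[top:top + ph, left:left + pw] = pred[:ph, :pw]
    return mask

def dummy_model(patch):
    return (patch[..., 1] > 0.5).astype(np.float32)  # flags "green-ish" pixels as canopy

ortho = np.random.default_rng(5).random((600, 800, 3))
canopy_mask = tiled_segmentation(ortho, dummy_model)
print(canopy_mask.shape, canopy_mask.mean())
```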

55 pages, 7851 KiB  
Review
Machine Learning in Agriculture: A Comprehensive Updated Review
by Lefteris Benos, Aristotelis C. Tagarakis, Georgios Dolias, Remigio Berruto, Dimitrios Kateris and Dionysis Bochtis
Sensors 2021, 21(11), 3758; https://doi.org/10.3390/s21113758 - 28 May 2021
Cited by 376 | Viewed by 55192
Abstract
The digital transformation of agriculture has evolved various aspects of management into artificially intelligent systems in order to create value from the ever-increasing data originating from numerous sources. A subset of artificial intelligence, namely machine learning, has considerable potential to handle numerous challenges in the establishment of knowledge-based farming systems. The present study aims at shedding light on machine learning in agriculture by thoroughly reviewing the recent scholarly literature based on keyword combinations of “machine learning” with “crop management”, “water management”, “soil management”, and “livestock management”, in accordance with PRISMA guidelines. Only journal papers published within 2018–2020 were considered eligible. The results indicated that this topic pertains to different disciplines that favour convergence research at the international level. Furthermore, crop management was observed to be at the centre of attention. A plethora of machine learning algorithms were used, with those belonging to artificial neural networks being more efficient. In addition, maize and wheat as well as cattle and sheep were the most investigated crops and animals, respectively. Finally, a variety of sensors, mounted on satellites and on unmanned ground and aerial vehicles, have been utilized as a means of obtaining reliable input data for the analyses. It is anticipated that this study will constitute a beneficial guide for all stakeholders, enhancing awareness of the potential advantages of using machine learning in agriculture and contributing to more systematic research on this topic. Full article

27 pages, 2821 KiB  
Review
Soft Grippers for Automatic Crop Harvesting: A Review
by Eduardo Navas, Roemi Fernández, Delia Sepúlveda, Manuel Armada and Pablo Gonzalez-de-Santos
Sensors 2021, 21(8), 2689; https://doi.org/10.3390/s21082689 - 11 Apr 2021
Cited by 120 | Viewed by 13542
Abstract
Agriculture 4.0 is transforming farming livelihoods thanks to the development and adoption of technologies such as artificial intelligence, the Internet of Things and robotics, traditionally used in other productive sectors. Soft robotics and soft grippers in particular are promising approaches to lead to new solutions in this field due to the need to meet hygiene and manipulation requirements in unstructured environments and in operation with delicate products. This review aims to provide an in-depth look at soft end-effectors for agricultural applications, with a special emphasis on robotic harvesting. To that end, the current state of automatic picking tasks for several crops is analysed, identifying which of them lack automatic solutions, and which methods are commonly used based on the botanical characteristics of the fruits. The latest advances in the design and implementation of soft grippers are also presented and discussed, studying the properties of their materials, their manufacturing processes, the gripping technologies and the proposed control methods. Finally, the challenges that have to be overcome to boost its definitive implementation in the real world are highlighted. Therefore, this review intends to serve as a guide for those researchers working in the field of soft robotics for Agriculture 4.0, and more specifically, in the design of soft grippers for fruit harvesting robots. Full article
