Article

Development of Deep Learning-Based Variable Rate Agrochemical Spraying System for Targeted Weeds Control in Strawberry Crop

1 School of Agricultural Equipment Engineering, Jiangsu University, Zhenjiang 212013, China
2 Department of Agricultural, Biological, Environment and Energy Engineering, College of Engineering, Northeast Agricultural University, Harbin 150030, China
* Author to whom correspondence should be addressed.
Agronomy 2021, 11(8), 1480; https://doi.org/10.3390/agronomy11081480
Submission received: 21 May 2021 / Revised: 24 June 2021 / Accepted: 20 July 2021 / Published: 26 July 2021
(This article belongs to the Special Issue Farmers’ Adoption of Agricultural Innovations and Their Impact)

Abstract:
Agrochemical application is an important tool in the agricultural industry for the protection of crops. Conventional sprayers waste a large share of the applied agrochemicals, which not only increases financial losses but also contaminates the environment. Targeted agrochemical sprayers using smart control systems can substantially decrease chemical input, weed control cost, and destructive environmental contamination. In this study, a variable rate spraying system was developed that uses deep learning models to classify weeds and to spray accurately on the desired weed targets. Laboratory and field experiments were conducted to assess the sprayer's weed classification and precision spraying performance using three classification CNN (convolutional neural network) models. The DCNN models (AlexNet, VGG-16, and GoogleNet) were trained on a dataset of 12,443 images captured in a strawberry field (4200 images of spotted spurge, 4265 images of Shepherd's purse, and 4178 images of strawberry plants). The VGG-16 model attained higher precision, recall, and F1-score than AlexNet and GoogleNet, and recorded the highest percentage of completely sprayed weed targets (CS = 93%). Overall, VGG-16 performed better than AlexNet and GoogleNet for real-time weed classification and precision spraying in all experiments. The results also revealed that sprayer performance decreased when the traveling speed rose above 3 km/h. The experimental results indicate that the sprayer with the VGG-16 model can achieve performance high enough to make it suitable for real-time spraying applications. It is concluded that the developed variable rate spraying system has the potential for spot application of agrochemicals to control weeds in a strawberry field, reducing crop input costs and environmental pollution risks.

1. Introduction

Agrochemical application plays an important role in agricultural production. Agrochemicals are essential for controlling the weeds, diseases, and pests that limit crop yield. When weeds are present in large quantities, crop development suffers: yields fall, harvesting becomes difficult, and produce quality may be affected. Spraying therefore plays an important role in reducing crop losses and increasing productivity. Cho et al. [1] stated that spraying reduces dangerous insects and diseases and can eliminate production losses of 30-35%. Plant protection is important to ensure the quantity and quality of the crop. The use of herbicides is the most preferred method for weed control because manual weeding is a laborious operation. Herbicides are the most widely used type of pesticide today, as weeds are an important factor limiting productivity in many crops [2].
The main goal of agricultural chemical application techniques is to control plant diseases, pests, and weeds with maximum agrochemical efficiency and minimum effort, while ensuring minimum pollution. Uniform agrochemical application is effective in removing weeds, but it is a labor-intensive and time-consuming practice. Hydraulic sprayers are generally used for uniform agrochemical applications: the spray fluid is pressurized by a pump and then forced through the spray nozzles toward the foliage. The boom sprayer is a type of hydraulic sprayer with multiple spray tips spread out along both boom arms at even spacing and pointed straight down toward the target. Air-blast sprayers are also often used, in which the spray liquid is carried by a high volume of airflow generated by a fan. Agrochemical application by hydraulic sprayers is highly inefficient. These commonly used uniform sprayers over-apply harmful chemicals, increase crop input costs, deteriorate the environment [3,4,5], put human health at risk through contact [6], and result in low application efficiency [7]. Excessive use of agrochemicals is therefore one of the factors affecting economic, environmental, and production parameters. Due to this negative impact, governments and farmers are trying to reduce the entry of herbicides into agricultural activities [8,9].
Precision agriculture offers a solution to this problem by applying weed control treatments at the single-plant level or to small groups of weeds [10]. In developed countries, there is a strong trend toward decreasing the use of herbicides in agricultural production [11,12,13,14]. Spraying methods should ideally be targeted to ensure the safety of non-targets and the environment. All pesticides and herbicides pose risks to the environment and the user, so there is a serious need to decrease dependence on traditional agricultural chemicals (and sprayers) without disturbing crop production. In the future, herbicide applications will likely need to be targeted more precisely at weeds. Intelligent weed removal, comprising weed recognition and elimination, has gained wide acceptance among farmers. Variable rate agrochemical (VRA) application has become popular in recent years because of its abundant potential to increase weed control efficiency while decreasing environmental risks and economic expenses [15,16].
Variable rate agrochemical (VRA) application can greatly decrease the quantity of chemicals used and reduce the cost of weed control. Intelligent agrochemical spraying systems based on real-time sensing technology have emerged rapidly in recent decades. Spraying the crop at an appropriate application rate supports precision farming and precision fruit growing. It has been reported that only 30-40% of the applied agrochemical reaches the targets; the rest is lost to the environment, exposing workers and causing pollution, even though the purpose of protecting plants with chemicals is to prevent crop damage [17]. Selective agrochemical spraying is one of the most satisfactory weed control mechanisms today. To decrease these adverse effects, targeted agrochemical spraying methods have delivered essential improvements in effectiveness and safety by incorporating modern developments in microelectronics, artificial intelligence, and robotics [18]. Jianjun et al. [19] reported an infrared target detection system for variable rate spraying. The sprayer meets the design requirements for targeted spraying and can measure the distance between the target and the sprayer.
Bargen et al. [20] developed a near-infrared echo recognition method to identify plants. Detection is based on differences in canopy size; however, target detection systems built on infrared equipment cannot capture detailed characteristic information. Ultrasonic sensors have also been used in variable rate spraying systems. Giles et al. [21] developed an ultrasonic sensor-based conventional air-blast variable rate sprayer, but ultrasonic sensor performance decreases with background noise. Palacin et al. [22] used a laser scanner to measure plant canopy volume in real time for variable rate spraying. Lasers are among the sensors most commonly used for variable rate spraying but also have drawbacks: they cannot be used in snowy or foggy weather, the laser beams cannot penetrate vegetation, and they perform poorly at edge detection.
Oberti et al. [23] presented a targeted agrochemical spraying robot that used crop image recognition to find plants infected with powdery mildew, which first appears as a powdery coating on the plant along with other visible symptoms. Berenstein et al. [24] showed that targeted agrochemical spraying can decrease the amount of agrochemical used in present-day farming by 30%. Lee et al. [25] proposed a targeted spraying system based on computer vision and accurate agrochemical use to control weeds, with which agrochemical savings of up to 97% can be achieved.
Machine vision is an effective means of recognizing the position, size, shape, color, and texture of vegetation, and its use in automated farming is increasingly popular [26]. Lamm et al. [27] developed a computer vision-based spraying prototype capable of differentiating weeds in a cotton crop; 88.8% of the weeds and 21.3% of the crop were sprayed. Sun et al. and Song et al. [28,29] studied a variety of weed detection sensors and techniques, such as computer vision, remote sensing, thermal imaging, and spectral analysis. Computer vision has been used for several years and can separate weeds and plants from the background through image segmentation procedures that exploit the color differences among them. Hijazi et al. [30] presented high-speed imaging methods for targeted chemical spraying. Another machine vision system was developed for the automatic segmentation of plants at different growth stages under different imaging situations and lighting conditions, achieving high accuracy and speed [31]. Burgos et al. [32] presented several machine vision-based image processing approaches for calculating the proportions of weed, crop, and soil present in an image covering an area of interest in the crop field.
The use of artificial intelligence can considerably increase the proficiency of spraying systems [33]. Artificial intelligence applies data-driven modeling techniques, taking processed and labeled images of the targets as input for the development of a computer vision method. Target plants are identified on the basis of plant morphology and texture [34]. In the past few years, machine vision and deep learning methods have shown significant progress, and various machine learning applications have been developed to detect and distinguish weeds from real crops in coordination with sensors [35,36].
Machine vision using deep learning artificial neural networks (DL-ANNs) is a relatively new and very effective tool for classifying digital images and identifying objects within them. Deep learning convolutional neural networks (CNNs) are an advanced form of image processing that can quickly and accurately classify images or objects within an image [37]. A major benefit of deep learning is that features are extracted automatically from the raw data. Among deep learning methods, CNNs have shown great progress in large-scale image and live-video target identification [38]. CNNs were originally inspired by the visual system of animals. The rise of deep CNNs (DL-CNNs) has been driven by inexpensive multiprocessor graphics cards and graphics processing units (GPUs). GPUs excel at the rapid matrix and vector multiplications required for DL-CNN training, which can greatly accelerate learning [39].
In recent years, DL-ANNs have been widely tested and deployed in agricultural applications such as weed, pest, and disease detection, fruit and plant counting for yield estimation, and intelligent navigation in autonomous vehicles [40,41,42]. Deep learning methods have become the most popular today due to the state-of-the-art results achieved in image classification, target detection, and natural language processing. The performance of deep learning techniques depends more on the quality of the dataset than that of other, traditional machine learning techniques.
Dos Santos Ferreira et al. [43] developed a convolutional neural network model that classifies various broadleaf and grassy weeds in a soybean crop; the model performed well and achieved an overall accuracy of >99%. Other researchers combined small components of a precise convolutional neural network (CNN) to obtain accurate weed classification, achieving 90% accuracy while processing 1.07-1.83 frames per second [44]. Sa et al. [45] used an encoder-decoder cascaded convolutional neural network (CNN) to perform semantic classification of dense weeds with high classification accuracy.
Yang and Sun [46] reported that DCNNs are well suited to agricultural applications and used DCNN models to classify weeds with >95% accuracy. Milioto et al. [47] reported achieving high classification accuracy (99%) using CNNs to distinguish sugar beets from weeds. Object detection models have also been used for detection and real-time agrochemical application with high accuracy [48]. Lee et al. [49] introduced a DL-CNN method for plant identification based on leaf characteristics. The development of DL-CNN models is the first step toward a machine vision detection system for precision agrochemical application. Different convolutional neural network (CNN) models have been established to deliver consistent crop/weed recognition outcomes. Recently, CNNs have been applied in in-depth agricultural studies; a CNN model's main advantage is its strong performance in target recognition and automated feature extraction. Two CNN models (AlexNet and GoogleNet) were used for classification in [50], and their performance was assessed in terms of accuracy, recall, and precision; the results showed that GoogleNet performed better than the AlexNet architecture. In [51], the VGGNet model achieved higher classification accuracy (0.95). Among the several CNN models, VGGNet is the most accurate in object classification.
This study evaluates the possibility of using three DL-CNN architectures, GoogleNet [52], AlexNet [53], and Visual Geometry Group Net (VGG-16) [36], for precision spraying in the strawberry field. The literature review revealed that the use of CNN models for weed classification in variable rate spraying still requires research to deliver a deep learning-based spraying system. Therefore, the purpose of this study was to evaluate the effectiveness of three deep learning models (GoogleNet, AlexNet, and VGG-16) for targeted weed spraying. The effect of sprayer ground speed on the spraying accuracy of the CNN models was also investigated. This article introduces and evaluates an economical variable rate sprayer for precise weed management. The three trained models (GoogleNet, AlexNet, and VGG-16) were deployed and tested on the sprayer in laboratory and field experiments for targeted weed spraying. In this paper, deep learning convolutional neural networks were applied in a new approach to precise agrochemical spraying for weed control (spotted spurge, Shepherd's purse) in the strawberry field.

2. Materials and Methods

2.1. The Spraying Machine

A deep learning-based variable rate spraying system was designed and developed for targeted spraying. The sprayer consists of an electric four-wheeled chassis frame vehicle. The front two hub-motor wheels serve as the driving wheels, and the rear two wheels are independent casters used for turning the sprayer. The sprayer had 0.60 m of ground clearance and a tire spacing of 0.70 m; both wheel spacing and sprayer height were adjustable to site conditions. The sprayer prototype, with a size of 1.2 m (length) × 0.80 m (width) × 0.75 m (height), is powered by a 24-V lithium battery, is driven by remote control (SAGA1-L8B), and is rated for a carrying capacity of up to 100 kg.

2.2. The Deep Learning-Based Variable Rate Spraying System

The developed variable rate spraying system consists of the following components for targeted spraying. A 50 L tank was used for agrochemical storage and was connected to a diaphragm water pump with a maximum discharge flow of 15 Lpm and a maximum pressure of 4 bar. The pump's main task was to supply liquid to the nozzles continuously. A pressure relief bypass valve was used to avoid backflow toward the agrochemical tank and to maintain flow to the remaining nozzles when one or more nozzles close because the system has not detected weeds. Three 12 V solenoid valves (ASCO 8262H002), with response times below 60 ms, actuated the spraying nozzles through relay switches. The distance between two adjacent spraying nozzles was kept at 0.7 m to cover one row of the strawberry crop, and a camera was fixed with each nozzle for image acquisition.
A microcontroller (Arduino ATmega328) panel was used to control all spraying system devices. The spraying nozzles (Solo 4900654-P) were flexible, so that the height and angle of each nozzle could be adjusted to site conditions. Three low-cost webcams (Aluratek AWC01F) were used for real-time image acquisition in the experimental field. The cameras were mounted on an iron bar at the heights and positions of the spray nozzles to capture images in real time for further processing by the spraying system. The bar was 1.40 m wide, allowing three cameras and three nozzles to be mounted at the same time. The cameras were adjusted so that the overlap between their fields of view was minimized. A schematic working diagram of the spraying system is presented in Figure 1.
An Nvidia GeForce GTX 1080 was used to run the convolutional neural networks (CNNs) in this study. The GTX 1080 is a particularly powerful graphics processing unit with 8 GB of GDDR5X RAM on a 256-bit memory bus and a Pascal GP104 GPU operating at 1733 MHz. Three CNN models were selected for weed identification: this study uses the AlexNet, VGG-16, and GoogleNet deep learning architectures to classify spotted spurge and Shepherd's purse weeds in the strawberry crop for real-time spraying. AlexNet was first introduced in the ImageNet Large Scale Visual Recognition Challenge (ILSVRC-2012) in 2012 as part of the image classification competition, which it won by a large margin. The AlexNet architecture consists of eight layers and 61 million parameters. VGG-16 was a top performer in ILSVRC-2014; its architecture consists of 16 layers and 138 million parameters. GoogleNet has seven million parameters and a total of 22 layers. Improved versions of these modern models have been found to have remarkable target detection accuracy [54].

2.3. Data Acquisition and Image Processing

The images used in this study were collected from strawberry fields in southern Punjab (31°21′41.99″ N, 70°58′10.99″ E), Pakistan, during the 2020 crop season. Two digital cameras with resolutions ranging from 3000 × 2000 to 1500 × 1000 pixels captured color images of two weeds (spotted spurge, Shepherd's purse) and strawberry plants. Approximately 70% of the images were captured from waist height (0.99 ± 0.09 m), 15% from knee height (0.52 ± 0.04 m), and 15% from chest height (1.35 ± 0.07 m) to allow the CNNs to recognize targets at a range of distances. The images were captured at different times of day, under varying light intensities, and from different angles to cover every setting the CNN models might encounter.
After applying an augmentation process to the captured images, a total of 12,443 images were generated: 4200 of spotted spurge, 4265 of Shepherd's purse, and 4178 of strawberry plants (Figure 2). The dataset was randomly subdivided into three sub-datasets for training, validation, and testing of the classification CNN models. Around 70% (8500 images) was used for training, of which 4200 were of spotted spurge and Shepherd's purse weeds and 2300 were of strawberry plants. Approximately 20% (2488 images) was used for validation (1660 of spotted spurge and Shepherd's purse, and 830 of strawberry plants), and 10% (1240 images), including both weeds and strawberry plants, was used for testing.
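The random 70/20/10 partition described above can be sketched as follows (a minimal illustration; the function name and fixed seed are assumptions, not the authors' code):

```python
import random

def split_dataset(image_paths, seed=42):
    """Shuffle and partition image paths into ~70% training, ~20% validation,
    and ~10% testing subsets, as described for the 12,443-image dataset."""
    rng = random.Random(seed)
    shuffled = list(image_paths)
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(0.7 * n)
    n_val = int(0.2 * n)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]
    return train, val, test
```

A fixed seed makes the split reproducible across training runs, which is useful when comparing several models on the same data.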

2.4. DCNN Models Training and Testing

For training and testing the DCNN models, all images were re-scaled and cropped to 224 × 224 pixels using IrfanView (Version 4.54) as the input dataset, to ease the learning, validation, and testing processes. The ratio of training images to testing/validation images was 70:30 for both the weed and plant image datasets. Images used for training were not reused during testing, to give a fair measure of model performance.
All training experiments were performed on a GPU (Nvidia GTX 1080) under Ubuntu 16.04. The TensorFlow framework, developed by Google and initially designed for numerical calculations over large datasets [55], was used for model training. Figure 3 shows the convolutional neural network algorithm for weed classification. During training of the convolutional neural network models, several hyper-parameters were tuned to obtain high precision: the momentum was set to 0.95, the input image size to 224 × 224 pixels, the base learning rate to 0.001, the weight decay to 0.0005, and the number of iterations to 6000.
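As an illustration of how the reported hyper-parameters interact, a single SGD update with momentum 0.95, base learning rate 0.001, and weight decay 0.0005 can be sketched as follows (a minimal, framework-free sketch of one scalar weight; the exact update rule implemented by the TensorFlow optimizer may differ in detail):

```python
def sgd_momentum_step(weight, grad, velocity,
                      lr=0.001, momentum=0.95, weight_decay=0.0005):
    """One SGD update using the hyper-parameters reported for training."""
    g = grad + weight_decay * weight        # L2 weight decay folded into gradient
    velocity = momentum * velocity - lr * g # momentum accumulates past updates
    return weight + velocity, velocity
```

With a high momentum of 0.95, each gradient's influence persists over many iterations, smoothing the 6000-iteration training trajectory.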
The VGG-16, AlexNet, and GoogleNet DCNN models were trained on the weed and plant classification datasets using the TensorFlow framework. After training, the DCNN models were ready to classify weeds from real-time input images from the camera (Figure 4). Precision, recall, F1-score, and overall accuracy were used to assess the performance of the convolutional neural network models used in this study. Performance results for all CNN models were summarized in binary confusion matrices containing true positives (Tp), false positives (Fp), true negatives (Tn), and false negatives (Fn) [56]. In this framework, Tp represents images with correctly classified weeds; Tn represents correctly classified images of strawberry plants without weeds; Fp is the number of images incorrectly classified as weed images; and Fn represents images incorrectly classified as non-weed images, such as strawberry plants. The precision, recall, accuracy, and F1-score values range from 0 to 1; the higher the value, the better the network's performance. Precision represents the accuracy of a neural network in the event of a positive classification and is measured by Equation (1):
Precision = Tp/(Tp + Fp)
Recall represents the effectiveness of the neural network in classifying the target objects and is calculated by Equation (2):
Recall = Tp/(Tp + Fn)
Overall accuracy is the proportion of correctly classified observations among all observations and is measured by Equation (3):
Overall accuracy = (Tp + Tn)/(Tp + Fp + Fn + Tn)
The F1 score is the harmonic mean of precision and recall and is calculated by Equation (4):
F1 score = (2 × precision × recall)/(precision + recall)
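Equations (1)-(4) can be computed directly from the four confusion counts; a minimal sketch (function and variable names are illustrative):

```python
def classification_metrics(tp, fp, tn, fn):
    """Precision, recall, overall accuracy, and F1-score (Equations (1)-(4))
    computed from binary confusion matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, accuracy, f1
```

For example, with 90 true positives, 10 false positives, 85 true negatives, and 15 false negatives, precision is 0.90 and recall is about 0.857.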

2.5. Electronic Mechanism

The spraying system's electronic mechanism was automated with the controller system and relays. To act on the weed identification results, an Arduino script was developed to read the signal data coming from the processing unit containing the weed classifications. Data communication between the processing unit (GPU) and the Arduino microcontroller (ATmega328) was done through a universal serial bus (USB) connection. In operation, the trained CNN models produce classification results from the images acquired through the camera. The processed image results from the neural network model are transmitted from the processing unit to the microcontroller unit; the microcontroller then generates the spraying control signals, directed through the USB port, that activate the relays and open the solenoid valves. As a solenoid valve opens, the spray liquid starts flowing toward the corresponding nozzle and is applied to the desired weeds.
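The GPU-to-Arduino link described above might be sketched as follows on the processing-unit side using the pyserial package (the one-byte nozzle codes, the baud rate, and the function names are assumptions for illustration, not the authors' actual firmware protocol):

```python
# Hypothetical one-byte command codes, one per spraying nozzle.
NOZZLE_CODES = {0: b'A', 1: b'B', 2: b'C'}

def nozzle_command(detections):
    """Encode per-nozzle weed detections (a list of three booleans) as the
    byte string to be written over the USB serial link."""
    return b''.join(NOZZLE_CODES[i] for i, hit in enumerate(detections) if hit)

def send_spray_command(port, detections):
    """Transmit the encoded command to the ATmega328 over USB serial."""
    import serial  # pyserial; imported lazily so encoding is testable offline
    with serial.Serial(port, 9600, timeout=1) as link:
        link.write(nozzle_command(detections))
```

On the Arduino side, the sketch would read each byte and energize the matching relay, opening that nozzle's solenoid valve.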

2.6. Research Plan

The experiments were designed to evaluate both the performance of the CNN models when acquiring input images in real time from the camera and the working performance of the variable rate sprayer. The experiments were conducted first under simulated lab conditions and later under actual field conditions. Two weeds, spotted spurge and Shepherd's purse, were selected as targets, and strawberry plants as non-targets. In all experiments, the performance of the variable rate sprayer was assessed after spraying the target weeds by manually checking each target for the red-colored spray liquid to determine whether it had been sprayed. Red paint was added to the spray water so that sprayed and unsprayed weeds could easily be identified visually in all experiments. The performance of the spraying system was calculated with the assessment model described in Table 1, in which states A, B, C, and D indicate completely sprayed, incompletely sprayed, not sprayed, and mistakenly sprayed targets, respectively. The percentages of states A, B, C, and D were also calculated, as shown in Table 1.
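The state percentages of Table 1 amount to simple ratios over the number of targets; a minimal sketch (the text does not specify whether state D, mistakenly sprayed, is normalized by targets or non-targets, so targets are used here as an assumption):

```python
def spray_assessment(state_counts, total_targets):
    """Percentage of targets in each assessment state:
    A = completely sprayed, B = incompletely sprayed,
    C = not sprayed, D = mistakenly sprayed."""
    return {state: 100.0 * state_counts.get(state, 0) / total_targets
            for state in 'ABCD'}
```

For instance, 27 completely sprayed targets out of 30 correspond to CS = 90%.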

2.6.1. Lab Experiment

In the lab experiment, the real-time classification accuracy of the three DCNN models (AlexNet, VGG-16, and GoogleNet) with camera input and the precision spraying of the variable rate sprayer were assessed. The effect of sprayer (chassis) ground speed on classification accuracy and precision spraying was analyzed: three ground speeds (1, 3, and 5 km/h) were applied to evaluate the trained models' classification accuracy in real-time image acquisition and precision spraying. During the laboratory experiments, the wind speed, relative humidity, and air temperature were 2-4 km h−1, 20-25%, and 25-35 °C, respectively. For the lab experiment, a track was developed to simulate a strawberry field (Figure 5). The trial track contained the two weeds (spotted spurge, Shepherd's purse) placed randomly among the strawberry plants in a straight line; the weeds served as targets and the strawberry plants as non-targets. The track held three parallel rows of weeds and strawberry plants, each row with ten targets (both weeds) and ten non-targets (strawberry plants). The track was 10 m long and 1.4 m wide (adjacent rows 0.7 m apart). The experiment was repeated ten times and the average values were calculated for all three DCNN models; for each repetition, the positions of the target weeds and plants were rearranged manually.

2.6.2. Field Experiment

Field performance evaluation of the sprayer was conducted in the strawberry field (Figure 6) so that the variable rate sprayer could be assessed under the actual complex conditions of the field. In the field experiments, the wind speed was 2-5 km h−1, the ambient temperature 27-32 °C, and the relative humidity 14-20%. The classification accuracy and spraying precision of the system using the three neural network models (AlexNet, VGG-16, and GoogleNet) were evaluated. The most promising traveling speed from the lab experiments (1 km/h) was applied in the field evaluation. Three adjacent rows of the strawberry field were randomly selected for the experiments, with each nozzle of the sprayer covering one row of strawberry crop. The selected rows were 10 m long, and the three adjacent rows spanned a width of 1.5 m (adjacent rows 0.7 m apart). The experiment was repeated five times with 30 target weeds in each run, and average values were calculated for all three models to evaluate the sprayer's performance. For good performance, the sprayer should spray only the weeds (targets) and not the strawberry plants.

3. Results

3.1. Validation Dataset Results of DCNNs Models

The performance results (accuracy, precision, recall, F1-score) of the CNN models on the validation dataset are provided in Table 2. For all CNN models, classification accuracy values were in the range of 0.95 to 0.97. The validation results show that the VGG-16 model outperformed the other two CNN models, achieving the highest values of accuracy, precision, recall, and F1-score: precision 0.98, recall 0.97, F1-score 0.97, and accuracy 0.97 for weed classification. GoogleNet achieved precision 0.96, recall 0.97, F1-score 0.96, and accuracy 0.96 for weed classification on the validation dataset. The AlexNet model recorded lower values (precision 0.95, recall 0.96, F1-score 0.95, accuracy 0.95) compared to the other two models.

3.2. Laboratory Experiments

3.2.1. CNNs Models Performance Results for Weeds Classification

The DCNN models (AlexNet, VGG-16, and GoogleNet) were successfully trained to classify the target weeds in the strawberry crop. The performance of the deep learning models was analyzed by acquiring input images in real time from the camera during the lab experiment. Table 3 shows the real-time weed classification performance of the AlexNet, VGG-16, and GoogleNet models. The VGG-16 model worked better than AlexNet and GoogleNet at precisely classifying the weeds: its F1-score, recall, precision, and accuracy values were all higher, which means that VGG-16 can identify spotted spurge and Shepherd's purse in the strawberry crop significantly better than the other two models.
The maximum values of precision (0.96), recall (0.94), F1-score (0.94), and accuracy (0.95) were achieved by the VGG-16 model on real-time camera input with the sprayer running at 1 km/h; the lowest values (precision 0.88, recall 0.85, F1-score 0.86, accuracy 0.87) were recorded at 5 km/h, because the images were blurry and model performance decreased. Similarly, the GoogleNet model reached its peak values (precision 0.94, recall 0.92, F1-score 0.92, accuracy 0.93) with the sprayer running at 1 km/h. The AlexNet model recorded lower values (precision 0.92, recall 0.90, F1-score 0.90, accuracy 0.91) for real-time weed classification at 1 km/h than the other two models. The experimental results revealed that increasing the sprayer's forward speed reduces the classification accuracy of the system, due to blurry images in real-time acquisition from the camera. Only minor performance differences were recorded between sprayer speeds of 1 km/h and 3 km/h.

3.2.2. Performance Evaluation of Spraying System

The precision of the variable rate sprayer was evaluated in the laboratory experiment, where it performed well in terms of both precision and accuracy. The number of target weeds in the experiment was 30. The sprayer using the VGG-16 model achieved better results than the other two models (GoogleNet, AlexNet), specifically when comparing completely sprayed weeds (CS) and missed target weeds (NS). The laboratory precision-spraying results are shown in Table 4. The highest percentage of completely sprayed weeds (CS = 93%) was noted for the VGG-16 model at 1 km/h, and the lowest (CS = 80%) was recorded at 5 km/h; this decrease in performance was due to classification failures caused by blurry image acquisition. Similarly, for the GoogleNet model the highest percentage of completely sprayed weeds (CS = 90%) was achieved at 1 km/h and the lowest (CS = 76%) at 5 km/h. The AlexNet model recorded the lowest 1 km/h percentage of completely sprayed weeds (CS = 87%) of the three models.
Some incompletely sprayed (IS) target weeds were noted with the GoogleNet and AlexNet models; this effect was reduced with the VGG-16 model. The sprayer also did not mistakenly spray any non-weed target in the laboratory experiments. Overall, the VGG-16 model outperformed the GoogleNet and AlexNet models for precision spraying in the laboratory. The experiments further showed that the percentage of completely sprayed weed targets decreased as the sprayer's ground speed increased, because blurry image acquisition lowered the classification accuracy of the models. Only minor differences in sprayer performance were recorded between 1 and 3 km/h, while the percentage of missed weed targets (NS) increased once the ground speed rose above 3 km/h. Consequently, this study suggests a real-time sprayer ground speed of 1 km/h for the highest percentage of completely sprayed targets.
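The CS/IS/NS/MS percentages follow directly from the counts defined in Table 1. A small sketch of the computation, using the VGG-16 lab run at 1 km/h (28 of the 30 weeds completely sprayed) as the example:

```python
def spray_metrics(a, b, c, d, total_weeds):
    """Table 1 metrics: A = weeds completely sprayed, B = incompletely
    sprayed, C = missed weeds, D = mistakenly sprayed non-weeds."""
    pct = lambda n: round(100 * n / total_weeds)
    return {"CS": pct(a), "IS": pct(b), "NS": pct(c), "MS": pct(d)}

print(spray_metrics(28, 1, 1, 0, 30))
# -> {'CS': 93, 'IS': 3, 'NS': 3, 'MS': 0}
```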

3.3. Field Experiments

3.3.1. Deep Learning Models Results in The Field Experiment

The deep learning models (AlexNet, VGG-16 and GoogleNet) successfully classified the weeds in live camera images during the field experiments. In the complex field environment, model performance was assessed on input images acquired in real time from the camera. Both VGG-16 and GoogleNet successfully classified the weeds (spotted spurge, shepherd's purse) in the field experiments, with VGG-16 producing significantly better results than the GoogleNet and AlexNet models.
The VGG-16 model achieved the highest weed classification values: precision (0.90), recall (0.88), F1-score (0.88) and accuracy (0.89). GoogleNet was the second-best model, recording precision (0.88), recall (0.85), F1-score (0.86) and accuracy (0.87) in the real-time field experiments. The AlexNet model was the least accurate at classifying the weeds (spotted spurge, shepherd's purse) in the field, with the lowest values of precision (0.85), recall (0.81), F1-score (0.82) and accuracy (0.83).
Overall, the VGG-16 model performed best, achieving the highest precision, recall and F1-score values of the three models. Table 5 shows the field classification results of the AlexNet, VGG-16 and GoogleNet models.

3.3.2. Performance Evaluation of Spraying System in Field

The performance of the variable rate sprayer was assessed under complex field conditions. During the field experiments, the VGG-16 model clearly performed better than the GoogleNet and AlexNet models, particularly when comparing the percentages of completely sprayed weed targets (spotted spurge, shepherd's purse) (CS) and missed weed targets (NS) (Table 6). The VGG-16 model achieved a higher percentage of completely sprayed weed targets (CS = 86%) than GoogleNet (CS = 83%), a difference of 3 percentage points, while the AlexNet model recorded the lowest percentage (CS = 77%).
Figure 7 also shows that the AlexNet model missed more targets than the GoogleNet and VGG-16 models; its performance was therefore inadequate under complex field conditions. Another important observation is that the VGG-16 model did not record a single mistakenly sprayed (MS) non-weed target. Overall, the sprayer performed well with the VGG-16 model in complex field conditions.

4. Discussion

Among the image classification neural networks, VGG-16 performed better than AlexNet and GoogleNet. Its precision, recall, F1-score, and accuracy values were higher than those of the other two models, indicating that VGG-16 classifies the weeds (spotted spurge and shepherd's purse) in strawberry fields considerably better. Other researchers [51,57,58] have likewise reported that VGG-16 achieves better precision than the GoogleNet and AlexNet models in weed classification. The CNN models (AlexNet, VGG-16, and GoogleNet) were trained on a dataset of 12,443 images captured in the strawberry field (4200 images of spotted spurge, 4265 of shepherd's purse, and 4178 of strawberry plants). The ratio of training to validation images was 70:30 for both the weed and plant image datasets.
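The 70:30 split can be sketched as a per-class shuffle-and-cut over the image file lists; the file names below are hypothetical placeholders, not the study's actual data paths:

```python
import random

def split_dataset(image_paths, train_ratio=0.7, seed=42):
    """Shuffle one class's image list and split it 70:30
    into training and validation subsets."""
    paths = list(image_paths)
    random.Random(seed).shuffle(paths)  # deterministic shuffle for reproducibility
    cut = int(len(paths) * train_ratio)
    return paths[:cut], paths[cut:]

# e.g. the 4200 spotted-spurge images -> 2940 training, 1260 validation
train, val = split_dataset([f"spurge_{i:04d}.jpg" for i in range(4200)])
```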
The validation results showed that the VGG-16 model outperformed the other two CNN models, achieving the highest values of accuracy, precision, recall, and F1-score. Laboratory and field experiments were then performed to evaluate the smart sprayer's performance in classifying weeds from real-time camera input and precisely spraying the target weeds using the three trained CNN classification models.
In the laboratory experiments, the VGG-16 model showed the highest classification values of precision (0.96), recall (0.94) and F1-score (0.94), as well as the highest percentage of completely sprayed weed targets (CS = 93%). The second-best model was GoogleNet, with precision 0.94, recall 0.92, F1-score 0.92 and a completely sprayed (CS) percentage of 90%. The AlexNet model recorded the lowest classification values of precision (0.92), recall (0.90) and F1-score (0.90) and the lowest percentage of completely sprayed weed targets (CS = 87%).
In the complex field environment, the peak values of precision (0.90), recall (0.88) and F1-score (0.88) were again achieved by the VGG-16 model. The VGG-16 model also recorded a 3% and 9% higher percentage of completely sprayed weed targets than the GoogleNet and AlexNet models, respectively, and reduced the percentage of missed weeds (spotted spurge, shepherd's purse) from 17% (AlexNet) to 10%. Both the VGG-16 and GoogleNet models delivered strong performance in the field experiments. The AlexNet model was the least accurate at classifying the weeds in real time, recording the lowest values of precision (0.85), recall (0.81), F1-score (0.82) and percentage of completely sprayed weed targets (CS = 77%).
The experimental results also showed that the sprayer's performance decreased when its traveling speed rose above 3 km/h: at the highest ground speed (5 km/h), weeds were misclassified because of blurry image quality in real-time acquisition from the camera. Overall, across all experiments the VGG-16 model provided precise and accurate results for real-time classification of the weeds (shepherd's purse and spotted spurge) and for precision spraying.

5. Conclusions

A variable rate sprayer using deep learning methods to classify weeds in real time was developed, and its performance was evaluated by precisely spraying the desired target weeds. The sprayer comprises webcams for capturing images, a computing unit for image processing, a microcontroller board to control system operation, and spray nozzles with solenoid valves. It captures and processes images and sends trigger signals to open the nozzles and spray the targeted weeds. This study applied the AlexNet, VGG-16, and GoogleNet deep learning architectures to weed classification: the three CNN classification models were successfully trained and tested with images of the weeds (spotted spurge, shepherd's purse) and strawberry plants. The goal was to evaluate the potential of convolutional neural network models for identifying spotted spurge and shepherd's purse in real time for spraying. VGG-16 was more effective than AlexNet and GoogleNet in terms of both model accuracy and precision spraying, and the experimental results indicate that the sprayer equipped with the VGG-16 model can achieve the high performance required for real-time spraying in the field.
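The trigger logic described above reduces to a small decision function per frame. This is an illustrative sketch only: the class names are the study's, but the confidence threshold and the function's shape are assumptions, not the authors' published code:

```python
WEED_CLASSES = {"spotted spurge", "shepherd's purse"}

def spray_decision(label: str, confidence: float, threshold: float = 0.5) -> bool:
    """Return True if the nozzle's solenoid valve should be triggered:
    only confidently classified weeds are sprayed, never strawberry plants."""
    return label in WEED_CLASSES and confidence >= threshold

# Control loop, roughly: capture frame -> CNN inference (label, confidence)
# -> spray_decision() -> trigger signal from the microcontroller to the valve.
```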
In all experiments, the VGG-16 model demonstrated strong performance. Based on the laboratory experiments and real-time field assessments, it can be concluded that the developed variable rate sprayer is capable of precisely differentiating between the weeds (shepherd's purse, spotted spurge) and non-weed targets (strawberry plants) and of spraying only the target weeds. The developed system offers a potential solution for preventing the waste of agrochemical inputs, thereby increasing farmers' productivity and reducing environmental contamination.
Future research can focus on identifying disease-affected plants, and additional CNNs will be trained at lower resolutions and for other target weeds. The field experiments also showed that sprayer performance was better under tree-shade conditions than in the open field; therefore, artificial lighting will be added in future work to improve performance.

Author Contributions

Conceptualization, J.L.; methodology, J.L.; software, I.A.; validation, I.A.; formal analysis, I.A. and R.S.N.; investigation, I.A. and R.S.N.; resources, I.A.; data curation, I.A.; writing—original draft preparation, I.A.; writing—review and editing, J.L. and R.S.N.; visualization, I.A.; supervision, J.L.; project administration, I.A.; funding acquisition, I.A. All authors have read and agreed to the published version of the manuscript.

Funding

Primary Research & Development Plan of Changzhou City (Modern Agriculture) (No. CE20202021), Primary Research & Development Plan of Jiangsu Province-Modern Agriculture (No. BE2020383), Priority Academic Program Development of Jiangsu Higher Education Institutions (No. PAPD-2018-87).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Cho, S.I.; Ki, N.H. Autonomous speed sprayer using machine vision and fuzzy logic. Trans. Am. Soc. Agric. Eng. 1999, 42, 1137–1143.
2. Dessalegn, A.; Habtamu, A.; Ibrahim, H. Effect of Weeding Frequency on Weed Density, Intensity, Relative Yield Loss and Yield of Food Barley (Hordeum vulgare L.) Variety at Amuru District, Western Oromia. Am. J. Plant Biol. 2021.
3. Swanson, N.L.; Leu, A.; Abrahamson, J.; Wallet, B. Genetically engineered crops, glyphosate and the deterioration of health in the United States of America. J. Org. Syst. 2014, 9, 6–37.
4. López-Granados, F.; Torres-Sánchez, J.; Serrano-Pérez, A.; de Castro, A.; Mesas-Carrascosa, F.; Peña, J.M. Early season weed mapping in sunflower using UAV technology: Variability of herbicide treatment maps against weed thresholds. Precis. Agric. 2016, 17, 183–199.
5. Jurado-Expósito, M.; López-Granados, F.; Gonzalez-Andujar, J.L.; Torres, L.G. Characterizing Population Growth Rate of in Wheat-Sunflower No-Tillage Systems. Crop Sci. 2005, 45, 2106–2112.
6. Lamichhane, J.R.; Dachbrodt-Saaydeh, S.; Kudsk, P.; Messéan, A. Toward a reduced reliance on conventional pesticides in European agriculture. Plant Dis. 2016, 100, 10–24.
7. Creech, C.F.; Henry, R.S.; Werle, R.; Sandell, L.D.; Hewitt, A.J.; Kruger, G.R. Performance of Postemergence Herbicides Applied at Different Carrier Volume Rates. Weed Technol. 2015, 29, 611–624.
8. Jorgensen, L.N.; Hovmøller, M.S.; Hansen, J.G.; Lassen, P.; Clark, B.; Bayles, R.; Berg, G. IPM Strategies and Their Dilemmas Including an Introduction to www.eurowheat.org. J. Integr. Agric. 2014, 13, 265–281.
9. Hillocks, R.J. Farming with fewer pesticides: EU pesticide review and resulting challenges for UK agriculture. Crop Prot. 2012, 31, 85–93.
10. Weis, M.; Gutjahr, C.; Ayala, V.R.; Gerhards, R.; Ritter, C.; Scholderle, F. Precision farming for weed management: Techniques. Gesunde Pflanz. 2008, 60, 171–181.
11. Blasco, J.; Aleixos, N.; Roger, J.M.; Rabatel, G.; Molto, E. Robotic weed control using machine vision. Biosyst. Eng. 2002, 83, 149–157.
12. Tian, L. Development of a sensor-based precision herbicide application system. Comput. Electron. Agric. 2002, 36, 133–149.
13. Tellaeche, A.; Burgos-Artizzu, X.P.; Pajares, G.; Ribeiro, A. A vision-based method for weeds identification through the Bayesian decision theory. Pattern Recognit. 2008, 41, 521–530.
14. Sabancı, K.; Aydın, C. Smart robotic weed control system for sugar beet. J. Agric. Sci. Technol. 2017, 19, 73–83.
15. Fennimore, S.A.; Slaughter, D.C.; Siemens, M.C.; Leon, R.G.; Saber, M.N. Technology for automation of weed control in specialty crops. Weed Technol. 2016, 30, 823–837.
16. Bechar, A.; Vigneault, C. Agricultural robots for field operations. Part 2: Operations and systems. Biosyst. Eng. 2017, 153, 110–128.
17. Yuan, H.; He, X. Herbicides deposit distribution with knapsack sprayer spraying. Plant Prot. 1998, 24, 41–42.
18. Abdulridha, J.; Ehsani, R.; Abd-Elrahman, A.; Ampatzidis, Y. A remote sensing technique for detecting laurel wilt disease in avocado in presence of other biotic and abiotic stresses. Comput. Electron. Agric. 2019, 156, 549–557.
19. Zou, J.; Zeng, A.; He, X. Research and development of infrared detection system for automatic target sprayer used in orchard. Trans. Chin. Soc. Agric. Eng. 2007, 23, 129–132.
20. Bargen, K.V.; Meyer, G.E.; Mortensen, D.A. Red/near-infrared reflectance sensor system for detecting plants. Optics in Agriculture and Forestry. Int. Soc. Opt. Photonics 1993, 1836, 231–239.
21. Giles, D.K.; Delwiche, M.J.; Dodd, R.B. Sprayer control by sensing orchard crop characteristics: Orchard architecture and spray liquid savings. J. Agric. Eng. Res. 1989, 43, 271–289.
22. Palacin, J.; Palleja, T.; Tresanchez, M.; Sanz, R. Real-time tree-foliage surface estimation using a ground laser scanner. IEEE Trans. Instrum. Meas. 2007, 56, 1377–1383.
23. Oberti, R.; Marchi, M.; Tirelli, P.; Calcante, A.; Iriti, M.; Tona, E.; Ulbrich, H. Selective spraying of grapevines for disease control using a modular agricultural robot. Biosyst. Eng. 2016, 146, 203–215.
24. Berenstein, R.; Shahar, O.; Shapiro, A.; Edan, Y. Grape clusters and foliage detection algorithms for autonomous selective vineyard sprayer. Intell. Serv. Robot. 2010, 3, 233–243.
25. Lee, W.S.; Slaughter, D.; Giles, D. Robotic weed control system for tomatoes. Precis. Agric. 1999, 1, 95–113.
26. Pajares, G.; García-Santillán, I.; Campos, Y.; Montalvo, M.; Guerrero, J.M.; Emmi, L. Machine-vision systems selection for agricultural vehicles: A guide. J. Imaging 2016, 2, 34.
27. Lamm, R.D.; Slaughter, D.C.; Giles, D.K. Precision weed control system for cotton. Trans. ASAE 2002, 45, 231.
28. Sun, H.; Li, M.Z.; Zhang, Q. Detection system of smart sprayer: Status, challenges, and perspectives. Int. J. Agric. Biol. Eng. 2012, 5, 10–23.
29. Song, Y.; Sun, H.; Li, M.; Zhang, Q. Technology application of smart spray in agriculture: A review. Intell. Automat. Soft Comput. 2015, 21, 319–333.
30. Hijazi, B.; Decourselle, T.; Vulgarakis Minov, S.; Nuyttens, D.; Cointault, F.; Pieters, J. The use of high-speed imaging systems for applications in precision agriculture. In New Technologies—Trends, Innovations and Research; Volosencu, C., Ed.; INTECH: Rijeka, Croatia, 2012; pp. 279–296.
31. Sabzi, S.; Abbaspour-Gilandeh, Y.; Javadikia, H. Machine vision system for the automatic segmentation of plants under different lighting conditions. Biosyst. Eng. 2017, 161, 157–173.
32. Burgos-Artizzu, X.P.; Ribeiro, A.; Tellaeche, A.; Pajares, G.; Fernández-Quintanilla, C. Analysis of natural images processing for the extraction of agricultural elements. Image Vis. Comput. 2010, 28, 138–149.
33. Abbas, I.; Liu, J.; Faheem, M.; Noor, R.S.; Shaikh, S.A.; Solangi, K.A.; Raza, S.M. Different sensor based intelligent spraying systems in Agriculture. Sens. Actuators A Phys. 2020, 112265.
34. Zaman, Q.U.; Esau, T.J.; Schumann, A.W.; Percival, D.C.; Chang, Y.K.; Read, S.M.; Farooque, A.A. Development of prototype automated variable rate sprayer for real-time spot application of agrochemicals in wild blueberry fields. Comput. Electron. Agric. 2011, 76, 175–182.
35. Gu, J.; Wang, Z.; Kuen, J.; Ma, L.; Shahroudy, A.; Shuai, B. Recent advances in convolutional neural networks. Pattern Recog. 2018, 77, 354–377.
36. Steward, B.L.; Tian, F.L.; Tang, L. Distance-based control system for machine vision-based selective spraying. Trans. ASAE 2002, 45, 1255.
37. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444.
38. Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. In Proceedings of the International Conference on Learning Representations (ICLR), San Diego, CA, USA, 7–9 May 2015.
39. Schmidhuber, J. Deep learning in neural networks: An overview. Neural Netw. 2015, 61, 85–117.
40. Dyrmann, M.; Jørgensen, R.N.; Midtiby, H.S. RoboWeedSupport—Detection of weed locations in leaf occluded cereal crops using a fully convolutional neural network. Adv. Anim. Biosci. 2017, 8, 842–847.
41. Sa, I.; Ge, Z.; Dayoub, F.; Upcroft, B.; Perez, T.; McCool, C. DeepFruits: A Fruit Detection System Using Deep Neural Networks. Sensors 2016, 16, 1222.
42. Padhy, R.P.; Verma, S.; Ahmad, S.; Choudhury, S.K.; Sa, P.K. Deep Neural Network for Autonomous UAV Navigation in Indoor Corridor Environments. Procedia Comput. Sci. 2018, 133, 643–650.
43. Dos Santos Ferreira, A.; Freitas, D.M.; da Silva, G.G.; Pistori, H. Weed detection in soybean crops using ConvNets. Comput. Electron. Agric. 2017, 143, 314–324.
44. McCool, C.; Perez, T.; Upcroft, B. Mixtures of lightweight deep convolutional neural networks: Applied to agricultural robotics. IEEE Robot. Autom. Lett. 2017, 2, 1344–1351.
45. Sa, I.; Chen, Z.; Popovic, M.; Khanna, R.; Liebisch, F.; Nieto, J.; Siegwart, R. Weed net: Dense semantic weed classification using multispectral images and mav for smart farming. IEEE Robot. Autom. Lett. 2018, 3, 588–595.
46. Yang, X.; Sun, M. A Survey on Deep Learning in Crop Planting. In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2019; Volume 490, p. 062053.
47. Milioto, A.; Lottes, P.; Stachniss, C. Real-time blob-wise sugar beets vs weeds classification for monitoring fields using convolutional neural networks. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 4, 41–48.
48. Hirz, M.; Walzel, B. Sensor and object recognition technologies for self-driving cars. Comput. Aided. Des. Appl. 2018, 15, 501–508.
49. Lee, S.H.; Chan, C.S.; Mayo, S.J.; Remagnino, P. How deep learning extracts and learns leaf features for plant classification. Pattern Recog. 2017, 71, 1–13.
50. Mohanty, S.P.; Hughes, D.P.; Salathé, M. Using deep learning for image-based plant disease detection. Front. Plant Sci. 2016, 7, 1419.
51. Ha, J.G.; Moon, H.; Kwak, J.T.; Hassan, S.I.; Dang, M.; Lee, O.N.; Park, H.Y. Deep convolutional neural network for classifying Fusarium wilt of radish from unmanned aerial vehicles. J. Appl. Remote Sens. 2017, 11, 042621.
52. Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015.
53. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. In Advances in Neural Information Processing Systems; Curran Associates, Inc.: New York, NY, USA, 2012; pp. 1097–1105.
54. Atila, Ü.; Uçar, M.; Akyol, K.; Uçar, E. Plant leaf disease classification using EfficientNet deep learning model. Ecol. Inform. 2019.
55. Abadi, M.; Agarwal, A.; Barham, P.; Brevdo, E.; Chen, Z.; Citro, C.; Corrado, G.S.; Davis, A.; Dean, J.; Devin, M.; et al. TensorFlow: Large-scale machine learning on heterogeneous distributed systems. arXiv 2016, arXiv:1603.04467.
56. Sokolova, M.; Lapalme, G. A systematic analysis of performance measures for classification tasks. Inf. Process. Manag. 2009, 45, 427–437.
57. Yu, J.; Sharpe, S.M.; Schumann, A.W.; Boyd, N.S. Detection of broadleaf weeds growing in turfgrass with convolutional neural networks. Pest Manag. Sci. 2019, 75, 2211–2218.
58. Xiao, J.-R.; Chung, P.-C.; Wu, H.-Y.; Phan, Q.-H.; Yeh, J.-L.A.; Hou, M.T.-K. Detection of Strawberry Diseases Using a Convolutional Neural Network. Plants 2021, 10, 31.
Figure 1. The overall configuration of the variable rate spraying system.
Figure 2. Sample of dataset images. (a) Strawberry plant (b) shepherd purse weed and (c) spotted spurge weed.
Figure 3. Diagram of a general convolutional neural network algorithm for weed classification.
Figure 4. Schematic diagram of weed classification model.
Figure 5. Design of laboratory experiment trajectory for testing spraying system.
Figure 6. Schematic diagram of variable rate sprayer in field evaluation experiment.
Figure 7. Graphical representation of precision spraying results with all three models.
Table 1. Detail of the spraying system assessment models.

State | Description | Metric | Definition
A | weeds completely sprayed | CS = % of weeds completely sprayed | State A / total weeds (%)
B | weeds incompletely sprayed | IS = % of weeds incompletely sprayed | State B / total weeds (%)
C | weeds not sprayed | NS = % of missed weeds | State C / total weeds (%)
D | non-weeds are sprayed | MS = % of mistakenly sprayed | State D / total weeds (%)
Table 2. Validation results of DCNNs models (AlexNet, VGG-16 and GoogleNet).

Neural Network | Overall Accuracy | Precision | Recall | F1-Score
VGG-16 | 0.97 | 0.98 | 0.97 | 0.97
GoogleNet | 0.96 | 0.96 | 0.97 | 0.96
AlexNet | 0.95 | 0.95 | 0.96 | 0.95
Table 3. CNNs models performance results for weeds classification in real-time lab experiment.

Neural Network | Sprayer Speed (km/h) | Overall Accuracy | Precision | Recall | F1-Score
VGG-16 | 1 | 0.95 | 0.96 | 0.94 | 0.94
VGG-16 | 3 | 0.94 | 0.95 | 0.92 | 0.93
VGG-16 | 5 | 0.87 | 0.88 | 0.85 | 0.86
GoogleNet | 1 | 0.93 | 0.94 | 0.92 | 0.92
GoogleNet | 3 | 0.93 | 0.93 | 0.91 | 0.91
GoogleNet | 5 | 0.85 | 0.86 | 0.83 | 0.84
AlexNet | 1 | 0.91 | 0.92 | 0.90 | 0.90
AlexNet | 3 | 0.90 | 0.91 | 0.88 | 0.89
AlexNet | 5 | 0.83 | 0.84 | 0.81 | 0.82
Table 4. Performance evaluation results of spraying system in lab experiment.

Neural Network | Sprayer Speed (km/h) | State A avg. | State B avg. | State C avg. | State D avg. | CS% | IS% | NS% | MS%
VGG-16 | 1 | 28 | 1 | 1 | 0 | 93 | 3 | 3 | 0
VGG-16 | 3 | 28 | 0 | 2 | 0 | 93 | 0 | 7 | 0
VGG-16 | 5 | 24 | 0 | 6 | 0 | 80 | 0 | 20 | 0
GoogleNet | 1 | 27 | 1 | 2 | 0 | 90 | 3 | 6 | 0
GoogleNet | 3 | 26 | 1 | 3 | 0 | 87 | 3 | 10 | 0
GoogleNet | 5 | 23 | 1 | 6 | 0 | 76 | 3 | 20 | 0
AlexNet | 1 | 26 | 1 | 3 | 0 | 87 | 3 | 10 | 0
AlexNet | 3 | 25 | 1 | 4 | 0 | 83 | 3 | 13 | 0
AlexNet | 5 | 21 | 1 | 8 | 0 | 70 | 3 | 26 | 0
Table 5. CNNs models performance results for weeds classification in real-time field experiment.

Neural Network | Overall Accuracy | Precision | Recall | F1-Score
VGG-16 | 0.89 | 0.90 | 0.88 | 0.88
GoogleNet | 0.87 | 0.88 | 0.85 | 0.86
AlexNet | 0.83 | 0.85 | 0.81 | 0.82
Table 6. Performance evaluation results of spraying system in field experiment.

Neural Network | State A avg. | State B avg. | State C avg. | State D avg. | CS% | IS% | NS% | MS%
VGG-16 | 26 | 1 | 3 | 0 | 86 | 3 | 10 | 0
GoogleNet | 25 | 1 | 3 | 1 | 83 | 3 | 10 | 3
AlexNet | 23 | 1 | 5 | 1 | 77 | 3 | 17 | 3
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
