Communication

A True-Color Sensor and Suitable Evaluation Algorithm for Plant Recognition

by
Oliver Schmittmann
* and
Peter Schulze Lammers
Institute of Agricultural Engineering, University Bonn, 53115 Bonn, Germany
*
Author to whom correspondence should be addressed.
Sensors 2017, 17(8), 1823; https://doi.org/10.3390/s17081823
Submission received: 12 May 2017 / Revised: 31 July 2017 / Accepted: 4 August 2017 / Published: 8 August 2017
(This article belongs to the Special Issue Sensors in Agriculture)

Abstract

Plant-specific herbicide application requires sensor systems for plant recognition and differentiation. A literature review reveals a lack of sensor systems that can recognize small weeds in early stages of development (in the two- or four-leaf stage) as well as crop plants, that can make spraying decisions in real time and that are, in addition, inexpensive and ready for practical use in sprayers. The system described in this work is based on freely cascadable and programmable true-color sensors for real-time recognition and identification of individual weed and crop plants. This type of sensor is suitable for municipal areas and for farmland with and without crops to perform site-specific application of herbicides. Initially, databases with the reflection properties of plants and of natural and artificial backgrounds were created. Crop and weed plants are recognized by mathematical algorithms and decision models based on these data. They include the characteristic color spectrum as well as the reflectance characteristics of unvegetated areas and areas with organic material. The CIE-Lab color space was chosen for color matching because it contains information not only about coloration (a- and b-channel), but also about luminance (L-channel), thus increasing accuracy. Four decision-making algorithms based on different parameters are explained: (i) color similarity (ΔE); (ii) color similarity split into ΔL, Δa and Δb; (iii) a virtual channel ‘d’ and (iv) the statistical distribution of the differences in reflection between backgrounds and plants. Afterwards, the detection success of the recognition system is described. Furthermore, the minimum weed/plant coverage of the measuring spot was calculated by a mathematical model. Plants with a size of 1–5% of the spot can be recognized, and weeds in the two-leaf stage can be identified with a measuring spot size of 5 cm. By selecting a suitable decision model in advance, the detection quality can be increased. Depending on the characteristics of the background, different models are suitable. Finally, the results of field trials on municipal areas (with models of plants), winter wheat fields (with artificial plants) and grassland (with dock) are shown. In each experimental variant, objects and weeds could be recognized.

1. Introduction

The use of pesticides in agriculture and green areas is regarded critically. The possible impact of residues on human health and the environment causes decreasing societal acceptance. High costs of pesticides, as well as political regulations call for a reduction of herbicide application in agriculture [1,2]. For the chemical industry, it has become more expensive to develop new agents, but alternative weeding methods have deficiencies, e.g., mechanical methods do not allow weed control on the total crop area and thermal and applications with bio-herbicides are uneconomic [3].
One approach to reduce the amount of pesticides is spot application by decentralized injection of agents into the individual nozzles of a sprayer boom [4,5,6,7,8,9,10]. These treatments can be applied with total herbicides, corresponding selective herbicides, with bio-herbicides or with alternative mechanical or thermal measures [11]. However, a precondition is the availability of a high-resolution plant recognition system with real-time capability for triggering the actuators. Imaging methods do not have this ability yet. The assignment of plants is difficult because plant contours can overlap, and the processing and response times are too long for real-time operation [12].
Conventional RGB color sensors consisting of optical components are real-time capable, but do not have sufficient color accuracy. IR-sensors are not able to distinguish between plant types. High-precision spectrometers are too expensive for agricultural use and are not real-time capable.
The aim of the research is to develop a programmable sensor system, consisting of different single sensors, for the identification of individual crop plants or weeds in municipal and agricultural land and to initiate site-specific treatments.
Each sensor should cover a small spot of about 20 cm2, analyze it and make a decision for or against spraying. Based on this decision, the sensor should switch a valve at one nozzle in real time. It is expected that small plants, for example in the two-leaf stage, can be detected. The advantage is that small plants can be eliminated very effectively with a small dosage, so that costs and impacts on the environment are reduced.
As a first step, each plant has to be detected. This is sufficient if no differentiation is needed and when unselective treatments are intended (e.g., with glyphosate or bio-herbicides). The second step is to distinguish between crop plants, weeds and plants which can be tolerated.
The smart element of the system lies in the fact that each sensor can be programmed individually with different decision models for different tasks. Before starting the application, each sensor has to be adjusted: databases with reflection properties are uploaded to each sensor, and finally the appropriate decision model and algorithm are selected and uploaded as well.
- Positive recognition:
Spraying is performed if any plant is detected. The reflection properties are within a specified range. Application areas are weed destruction on municipal land or the use as an alternative to glyphosate.
- Negative recognition:
No spraying is performed if a crop plant is detected. The reflection properties are out of a specified range. Areas of application are weed destruction in row crops like sugar beet or maize.
- Recognition and differentiation between crops and weeds:
The range of reflection properties of different plants is compared with the database. Plants are assigned as crop, weed and harmless weed. Application areas are weed destruction on farmland. For example:
Arable land with crops like wheat or rape
Green areas with grass and broadleaf dock or lawn/golf courses with clover
For this purpose, databases with reflection properties (average and range of values) of different backgrounds (e.g., gravel, stones, soil, grassland) are determined and compiled. Different shades of green, characterizing the spectrum of plant colors, are selected and used for developing algorithms and decision models. These models are based on the color similarity of mixed areas in regard to ΔE, which includes all reflection values, different color and luminance similarity (ΔL, Δa and Δb) and a virtual color channel. The modeling results of the recognition of the greens on different backgrounds are presented.
Additionally, the results of field trials with real and artificial plants are presented. As examples, the results of plant detection on stones, in winter wheat (Triticum aestivum L.) and of dock (Rumex acetosa L.) in grassland by the use of these algorithms are shown. Finally, the suitability of true-color sensor systems for plant recognition is evaluated.

2. State of the Art

Site-specific plant protection has been an important field of research over the last few years. Different sensors and methods are used to detect and recognize plants in order to make a decision on the use of herbicides. In this section, an overview of the state of the art of real-time-capable sensors is presented, focusing on their suitability for use in sprayers. Airborne methods and systems that do not refer to single plants or small spots are not taken into account. The systems can be divided into opto-electronic sensors, imaging techniques and contour-based systems.

2.1. Opto-Electronic Sensors

DetectSpray, Weed-Seeker and Green-Seeker are systems for the detection of herbaceous plants on bare soil in reflection mode [13]. The detection principle is based on the fact that green plants absorb red light in the wavelength range between 630 and 660 nm and are highly reflective in the NIR range between 750 and 1200 nm. Basically, in all systems, two monochromatic diodes are used for the R- and the IR-range. For this basic plant identification, the ratio of the R channel to the IR channel is used as a decision criterion [14,15]. When exceeding a threshold value, the existence of plants can be concluded. In contrast to DetectSpray, the Green- and Weed-Seeker [16] devices use an active light source [14].
Approaches for assigning indices to plant groups are reported in the literature [17]. For the differentiation of plants, Biller [14] used five sensitive photodiodes with different wavebands to obtain a ‘spectral fingerprint’ for specific plant types.
Studies on the response accuracy of monochromatic sensors in comparison to real crop plants with weed population showed correlation coefficients of 0.6–0.9 [18]. These systems are described by the Alberta Farm Machinery Research Centre to be negatively affected by sunlight, preventing correct detections. Further, shadowing can hamper the application, especially in row crops [19].
Weed-IT is an Australian system using a sensor scanning a strip of 1 m [20]. It contains an NIR sensor with a light source. According to the manufacturer, it is applicable at high speeds up to 25 km·h−1 on stubble fields [12,21].
Crop Circle (Holland Scientific, Lincoln, NE, USA) uses reflections in three different wavebands (670 nm, 730 nm and NIR) for determining the nitrogen supply of plants. Using a calculation model with various indices, including a preliminary calibration, the green color can be determined as well. The manufacturer specifies the size of the measuring area as 20 cm in diameter [22].
Finally, traded under the name ‘AmaSpot’, a sensor-nozzle unit, which is based on the Weed-IT system, was awarded as a novelty in 2015 [23].
In the commercial systems mentioned, a 30 × 30 cm2 area is scanned, and at least 3% of the scanned surface has to be green for a successful recognition. A scanning area of 50 × 50 cm2 requires weeds with a size of more than 75 cm2. Consequently, either weeds have to be well developed (extended phenotype), or a high degree of weed coverage is needed for a successful recognition and herbicide application [24]. Distinguishing the plant type and varying the scan area size is not possible. Additionally, the system is not ‘open’ to the user.
Kluge [25] has stated that existing opto-electronic systems are not capable of distinguishing between plants, but only generate information on the existence of plants. Therefore, the application of these sensor systems in agriculture is restricted to the period before crops emerge.
In laboratory experiments, positive results for the determination of plant species by spectrometers are reported. Feyaert [26] stated that the differences in the reflection characteristics of plant species refer to their physical differences: in the red wavelengths to the chlorophyll content, which, however, depends on external factors (diseases, water and nutrients), and in the NIR wavelengths to the internal structure of the plant, such as cell size and cell wall texture, waxes and trichomes.

2.2. Imaging Techniques

Imaging techniques are based on camera systems (CCD camera, bi-, multi- or hyperspectral) with appropriate optical components and post-processing software. Plant contours can be detected if plants are freestanding [27]. The use of the IR channel shows good results in differentiating between soil and plants [28,29,30]. Imaging techniques are highly sensitive to varying external conditions. Extended computing power is required for weed detection based on shape factors [31,32,33,34]. Overlapping of plant parts makes recognition and plant type differentiation more difficult. 3D camera systems (time-of-flight cameras) actively emit light with a defined wavelength and receive the reflection of the object. They generate real-time images of all three dimensions and, additionally, a grey-scale image [35]. Time-of-flight cameras are a technical advancement and improve the quality of plant identification, but due to their low resolution and high costs, they are less suitable for practical use.

2.3. Plant Identification through Plant Contours without a Camera

Various sensors for plant phenotyping are known. Light grid sensors, consisting of horizontally-cascaded light barriers, were successfully tested [36]. In the described trial, these sensors have been mounted on a carrier vehicle, and measurements have been conducted in a maize field with approximately 20 cm-high plants. Plant identification with light grid sensors in row crops only works for large, non-herbaceous weeds under undisturbed conditions. These sensors are not suitable for narrow-spaced crop recognition.
Other methods are based on distance sensors (laser and ultrasonic), which determine the contour of the crop plant [37]. Due to the measurement speed, dynamic oscillation of the carrier vehicle and the sensitivity to small changes in distance, this method is not effective [38].
In conclusion, it can be pointed out that the opto-electronic sensors mentioned have been evaluated thoroughly in the past, but they are not able to detect small plants in early leaf stages (smaller than 3% of the measuring spot). Differentiation of plants appears to be very difficult. Imaging technologies are more complicated, need more computing performance and are too expensive for practical use. Such systems measure the reflection, but not the real coloration of the object. Identification by the use of plant contours is complicated, as movements of the sprayer or overlapping plants interfere with the measurement of crops. The use of true-color sensors in combination with algorithms is a further development of the presented opto-electronic sensors.

3. Material and Methods

3.1. Materials

3.1.1. Sensor Technology

Our sensor development is based on the true-color PR0126C sensor of Premosys GmbH (Wiesbaum, Germany). Compared to other sensor systems, true-color sensors represent a compromise between expensive spectrometers with high color accuracy and low-cost but imprecise RGB sensors. The measuring speed of true-color sensors is much higher than that of spectrometers [39]. The PR0126C sensor can be equipped with different lenses, which determine the measuring spot size.
For true-color determination and the technical implementation of color standards, true-color sensors are coated with interference filters. Because of these filter characteristics, they are highly capable of color measuring and more sensitive than the human eye (standardized according to the German Institute for Standardization norm DIN 5033). The sensitivity of the filters is related to a defined spectral wavelength. After normalizing the sensor, the color values are assigned to XYZ coordinates. The XYZ space provides the basis for the conversion into other color spaces.
The obtained color information is then converted into the CIE-Lab color space. L characterizes the luminance (L: 0 = black, 100 = white). Channels a and b refer to the coloration (a: −128 = green, 127 = red; b: −128 = blue, 127 = yellow). The color space was introduced in 1976 by the Commission Internationale de l’Éclairage (CIE) and is frequently used by 3D color systems. CIE-Lab is device independent. For plant recognition, its wide green range is a major advantage.
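The XYZ-to-Lab conversion mentioned above follows the standard CIE formulas. The following Python sketch (an illustration, not part of the authors' sensor software) shows the transform for a D65 reference white:

```python
def xyz_to_lab(x, y, z, white=(95.047, 100.0, 108.883)):
    """Convert CIE XYZ tristimulus values to CIE-Lab relative to a
    reference white (default: D65)."""
    def f(t):
        # Cube root above the CIE threshold, linear segment below it
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116

    fx, fy, fz = (f(v / w) for v, w in zip((x, y, z), white))
    L = 116 * fy - 16    # luminance: 0 (black) .. 100 (white)
    a = 500 * (fx - fy)  # negative = green, positive = red
    b = 200 * (fy - fz)  # negative = blue, positive = yellow
    return L, a, b
```

Feeding the reference white itself into the conversion yields (100, 0, 0), i.e., pure white with no coloration.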
The true-color sensor consists of 19-diode hexagon color ICs (integrated circuits, Figure 1) supplied by Mazet GmbH (Jena, Germany) [40]. Each diode has three segments with interference filters for the colors red, green and blue. The components of the sensor are the color-IC, a trans-impedance amplifier, a light-emitting diode with a defined wavelength, a fiber optic for emitting light and receiving reflection and an optical lens. The dimension of the lens (focal length range, measuring spot size and form (point or rod lens)) influences the characteristics of the sensor system in regard to the resolution.
The sensor was equipped with a double concave lens with a screen diameter of 50 mm (Figure 2). The spot size is about 20 cm2.

3.1.2. Objects and Backgrounds

For mobile measurements, the backgrounds were divided into anthropogenic and natural. The artificial backgrounds include gravel, concrete slab or paving stones. Natural backgrounds are arable land without vegetation, with stubble or mulch. Green areas were also assigned to this category (grassland with and without dew). The backgrounds are displayed in Figure 3. To characterize different plants by color, four different green color cards from bright to dark green were used.

3.1.3. Test Facilities

(a) Stationary Carrier:
Dynamic measurements on anthropogenic backgrounds were carried out by means of a driven rail system (Figure 4). The sensor was positioned at a predefined height and moved along a distance of 150 cm at a constant forward speed of 0.1 m s−1. Different objects, backgrounds and mixed areas were placed below the sensor line. The measuring frequency was 10 Hz (recording of ten color values per second).
(b) Mobile Field Carrier:
For measurements in the field, a mobile carrier was built (Figure 5) for spray application. It can be trailed manually or by a vehicle. Time- and distance-based recordings of color values were performed.

3.2. Methods

3.2.1. Characterization of Backgrounds

In all experiments, objects (plants characterized by cards) and backgrounds (artificial and natural soils with and without vegetation) are characterized by their Lab values and the variation (= noise) of these values. For this purpose, statistical indicators were used and presented by means of averages and frequency distributions. Each background description contains more than 500 values. The backgrounds are divided into anthropogenic and natural, vegetation-influenced ones (Figure 5).

3.2.2. Data Management and Processing

Data were collected by means of proprietary software. The color values of objects and backgrounds were compiled and implemented into a database, which is used for the decision models and for (real-time) plant identification. The setup and adjustment of the sensors as well as the acquisition, visualization and recording of the data were realized with this software. It also covers further aspects, such as course-controlled data acquisition, recording of data in a defined format, visualization of measured values and testing of algorithms.

3.2.3. Decision Making Based on Analyzing the Color Similarity of Mixed Areas

Mixed areas are defined as backgrounds covered with a large or small proportion of objects (plants). The database serves to estimate the potential of plant recognition and identification on different backgrounds. ∆E describes the color distance between two color values in the Lab-color space. According to ISO 12647 [41] and ISO 13655 [42], ∆E for this study is calculated by Equation (1):
ΔE_BG,MA = √[(L_BG − L_MA)² + (a_BG − a_MA)² + (b_BG − b_MA)²]   (1)
where:
ΔE_BG,MA = color similarity of background and mixed area (background with objects)
L_BG = luminance value of the background
L_MA = luminance value of the mixed area
a_BG = Channel a value of the background
a_MA = Channel a value of the mixed area
b_BG = Channel b value of the background
b_MA = Channel b value of the mixed area
It describes the color similarity between the background without plants and the mixed area: background with plants (Table 1). The range of ΔE is used as a trigger for the decision procedure.
To get more information about color similarity, each channel was analyzed individually. The formulas are given in Equations (2)–(4):
ΔL_BG,MA = √[(L_BG − L_MA)²]   (2)
Δa_BG,MA = √[(a_BG − a_MA)²]   (3)
Δb_BG,MA = √[(b_BG − b_MA)²]   (4)
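Equations (1)–(4) can be sketched in Python as follows (a minimal illustration, not the authors' implementation):

```python
import math

def delta_e(bg, ma):
    """ΔE of Equation (1): Euclidean distance between the Lab value of the
    background (bg) and that of the mixed area (ma), each an (L, a, b) tuple."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(bg, ma)))

def channel_deltas(bg, ma):
    """ΔL, Δa, Δb of Equations (2)-(4); the square root of a squared
    difference reduces to the absolute per-channel difference."""
    return tuple(abs(x - y) for x, y in zip(bg, ma))
```

For example, a hypothetical background of (10, 2, 20) against a mixed area of (13, −2, 20) yields ΔE = 5 and per-channel deltas of (3, 4, 0).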

3.2.4. Decision Making Based on Modeling

A statistical model is based on the assumption that the difference between the color values of the background and of the mixed area should be bigger than the sum of the standard deviation of the background without objects and the standard deviation of the object (Formula (5)). For decision making, a difference in only one channel may be sufficient. The information about the differences in each channel could be used to classify the plants.
An object exists if:
|L_MA − L_BG| ≥ σ_LBG + σ_LObj  or  |a_MA − a_BG| ≥ σ_aBG + σ_aObj  or  |b_MA − b_BG| ≥ σ_bBG + σ_bObj   (5)
The required relative coverage area A is calculated as follows:
A_L = 100 · (σ_LBG + σ_LObj) / √[(L_Obj − L_BG)²]   (6)
A_a = 100 · (σ_aBG + σ_aObj) / √[(a_Obj − a_BG)²]   (7)
A_b = 100 · (σ_bBG + σ_bObj) / √[(b_Obj − b_BG)²]   (8)
A_L, A_a, A_b = minimal relative area covered with plants for Channels L, a and b
L_Obj, a_Obj, b_Obj = color values of the object for Channels L, a and b
L_BG, a_BG, b_BG = color values of the background for Channels L, a and b
σ_LObj, σ_aObj, σ_bObj = standard deviation of the object for Channels L, a and b
σ_LBG, σ_aBG, σ_bBG = standard deviation of the background for Channels L, a and b
As an additional identification unit, a virtual ‘d channel’ is defined as the difference between the a and b value.
d = √[(a − b)²]
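The decision rule (5), the coverage model (6)–(8) and the virtual d channel can be sketched as follows (an illustrative Python version under the assumption that Lab values and standard deviations are passed as tuples in channel order L, a, b):

```python
def object_present(ma, bg, sigma_bg, sigma_obj):
    """Decision rule of Formula (5): an object is assumed if, in at least one
    of the Channels L, a, b, the mixed area deviates from the background mean
    by more than the summed standard deviations."""
    return any(abs(m - b) >= s_bg + s_obj
               for m, b, s_bg, s_obj in zip(ma, bg, sigma_bg, sigma_obj))

def min_coverage(obj, bg, sigma_bg, sigma_obj):
    """Minimum relative coverage A_L, A_a, A_b of Equations (6)-(8): the share
    (in %) of the measuring spot a plant must occupy to be detectable."""
    return tuple(100 * (s_bg + s_obj) / abs(o - b)
                 for o, b, s_bg, s_obj in zip(obj, bg, sigma_bg, sigma_obj))

def d_channel(a, b):
    """Virtual Channel d: the (absolute) difference between the a and b value."""
    return abs(a - b)
```

With hypothetical values, a green object at (29, −15, 25) on a background of (10, 2, 20) is clearly detected; its required coverage in Channel L is 100·(0.5 + 1)/19 ≈ 7.9% for the assumed standard deviations.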

4. Results and Discussion

4.1. Database to Characterize Backgrounds

In Table 2, the reflection values of the anthropogenic and natural backgrounds are listed and characterized by their means (Ø) and standard deviations (σ).
- Luminance:
The L-channel values of the anthropogenic surfaces are located in a range from 6.39 to 19.31 (Table 2 and Appendix Figure A1). In comparison, the L-channel values of natural backgrounds are in a range from 7.57 to 12.03; anthropogenic and natural backgrounds thus lie in the same range in this channel. Also important is the variation of these values: it is conspicuous that arable land has the smallest standard deviation (0.13). This could be an indicator that even small differences in this channel may be sufficient to detect plants. The largest deviations occur for gravel (1.05, dark and bright stones), red paving stones (1.28, red stone and grey joints) and grassland with dew (2.23).
- Coloration:
Channel a and Channel b give information about the coloration of the backgrounds. Excluding the red paving stones, all tested anthropogenic backgrounds are in a range of 0.45–6.52 in Channel a. The range of natural backgrounds is from −8.86 to 5.51. The negative values are caused by the green color of grassland and can be an indicator of the presence of plants. It can be concluded that both kinds of background overlap.
The b-channel values of the paved surfaces are in a range from 10.78 to 25.03 and those of the natural backgrounds from 16.16 to 24.80. The range of natural backgrounds lies within the range of anthropogenic surfaces.

4.2. Characterization of Different Green Tones in Regard to Plant Recognition

In Table 3, the Lab values of four different green tones are listed as examples:
- Luminance:
The variation of the selected greens is higher than the variation of the backgrounds. The range of Channel L is between 29 and 45. In comparison to the backgrounds, luminance is an outstanding criterion to detect plants. Furthermore, the magnitude of L could be a criterion to differentiate crop plants and weeds.
- Coloration:
The Channel a values of the greens are between −15 and −62 and differ significantly from those of the chosen backgrounds. This wide range of values supports the idea of differentiating plants by the use of true-color sensors and the CIE-Lab color space; Channel a alone would be sufficient as a criterion for triggering a further evaluation step. In conclusion, the different shades of green can be clearly distinguished by the Lab channels.

4.3. Object Identification by Analyzing Color Similarity

The following figures display the described delta values depending on the background characteristics and scattering. For discrimination, a threshold has to be determined. In Figure 6, the delta signals for a solid background are shown as an example. This setting represents the most promising task for weed recognition.
The four objects (r = 1 cm) are detected accurately. ΔE and ΔL are highly responsive to colored points. By setting a threshold of five, all green objects can be filtered out by the use of ΔE and ΔL. The threshold for Δa and Δb is two; if a measured value is higher, an object/plant exists. To determine the color (kind of plant), the relation of the delta values has to be assessed. The relation between ΔL, Δa and Δb is suitable for the characterization and identification of objects.
The data in Figure 7 were recorded in a young wheat stand (BBCH 13 [44]; Figure 6). Herbicide application at this point in time is common, so this is a good example of plant recognition and site-specific spraying in existing crops with selective herbicides. In contrast to the literature [25], it can be shown that the differentiation of crop plants and weeds is possible by the use of a threshold. With a threshold of 12, all six objects can be identified correctly.
Figure 8 shows the results of the detection of broadleaf dock on grassland (dock plants are toxic for some animals [45]). Detecting green plants on green areas is a special task. ΔE and ΔL are suitable to recognize dock by the use of a threshold of about seven. Δa provides no significant signals, since the green color of both plants is quite similar. The higher differences in ΔL are caused by the leaf position: the horizontal arrangement of dock leaves causes a higher reflection intensity than the vertical arrangement of grass leaves.

4.4. Object Identification by Modeling

The results of the decision-based mathematical model (Section 3.2.4) are displayed in Table 4. For each background, the relative object (plant) size is calculated for each of the four greens. As an example, the evaluation is displayed for lens diameters of five centimeters (spot size 20 cm2) and ten centimeters (spot size ~80 cm2).
According to Table 4, it can be concluded that the quality of recognition differs between the channels. As described previously, the influence of the background is relatively low. Green 1 can be identified very well by the use of the luminance; a coverage of 0.5% on fields should be enough for detection. Channel a is also quite suitable, except on red paving stones.
For Channel L, the sufficient coverage of the measuring spot is between 0.5% on fields (Green 1) and up to 13.3% on grassland with dew (Green 4). It is evident that soil without vegetation yields the best detection success. The virtual channel d (the difference between a and b) does not show a big advantage: d is more suitable than b, but has no advantage in comparison to a.

5. Summary and Conclusions

True-color sensors are a good compromise between inexpensive RGB sensors and spectrometers with high spectral resolution. The objective was to study the suitability of this kind of sensor for the detection and differentiation of crop and weed plants in agricultural and municipal areas.
For plant-specific spraying, the boom of a sprayer can be equipped with these sensors. Sensor, valve and nozzle together make up an independent sensor-valve-nozzle unit. Depending on the sensor spot and the spraying angle of the nozzles, the distance between the nozzles can be up to 50 cm. Each true-color sensor is able to control one valve for one nozzle. If spot diameter and nozzle distance differ, several sensors (= sensor array) can control the same valve. The possible detection success and identification of small plants were calculated and tested by the use of different algorithms. The spot sizes should be between 20 cm2 and 80 cm2 to detect plants in the two-leaf stage. The presented detection method contains the following steps:
  • Normalization and adjustment of the sensor in the field: on a spot without weeds, the background properties are determined, and a threshold for discrimination is defined.
  • Selection of the mode: positive or negative detection.
  • Selection and upload of the algorithms and database with reflection properties to each sensor-valve-nozzle unit.
  • Running the plant recognition and differentiation process.
  • Control of the sprayer valve.
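The steps above can be sketched as one sensor-valve-nozzle unit (a hypothetical Python illustration; class and method names are not the vendor's API, and a simple ΔE threshold stands in for the selectable decision models):

```python
class SensorValveUnit:
    """Sketch of one sensor-valve-nozzle unit following the steps above."""

    def __init__(self, positive_mode=True):
        # positive mode: spray when a plant is detected;
        # negative mode: spray only when no crop plant is detected
        self.positive_mode = positive_mode
        self.bg_mean = None
        self.threshold = None

    def calibrate(self, bg_samples, k=3.0):
        """Field adjustment on a weed-free spot: store the background mean
        and derive a discrimination threshold from its scatter (here k sigma)."""
        n = len(bg_samples)
        channels = list(zip(*bg_samples))
        self.bg_mean = tuple(sum(c) / n for c in channels)
        sigma = tuple((sum((v - m) ** 2 for v in c) / n) ** 0.5
                      for c, m in zip(channels, self.bg_mean))
        self.threshold = k * max(sigma)

    def spray_decision(self, lab):
        """Delta-E of the current Lab reading against the background mean
        triggers (or suppresses) the valve."""
        de = sum((v - m) ** 2 for v, m in zip(lab, self.bg_mean)) ** 0.5
        detected = de > self.threshold
        return detected if self.positive_mode else not detected
```

After calibrating on a few background readings, a strongly green reading far outside the background scatter opens the valve in positive mode, while a reading close to the background keeps it closed.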
The CIE-Lab color space used is suitable for plant recognition due to its distinction between luminance and coloration. The a-channel (green-red) of the color space is very sensitive for the discrimination of green-colored plants. Databases with reflection properties were compiled. It was shown that backgrounds and green objects differ considerably, especially in luminance and Channel a.
Four methods for detection are presented:
  • The detection based on color similarity ΔE,
  • The splitting color similarity into ΔL, Δa and Δb,
  • A Virtual Channel d and
  • A modeling algorithm based on statistics.
The suitability of the methods was introduced, and they were applied to selected real backgrounds and different green tones. Methodically, the detection quality and the potential of the sensor were tested under defined conditions on fields with and without vegetation. Based on the experiments carried out, the following statements can be made:
- Due to the built-in single-sensor controller, actuators can be addressed and activated in real time.
- The specified evaluation algorithm for plant identification, the decision model and the current calibration values have to be uploaded to each sensor unit from a central computer.
- Minimal differences in coloration (a- and b-channel) and reflectivity can be detected with the sensor. Coloration properties are applicable for plant identification.
- The differentiation of plants into crop and weed appears to be possible with a multistage model. The procedure is as follows:
  • Extraction of suspicious points by the consideration of ∆E,
  • Targeted analysis of these signals by the use of the individual Channels L, a and b and
  • Comparison of all four channels (including Virtual Channel d) relative to each other to decrease the influence of the object size.
- The relative weed/plant coverage of the measuring spot was calculated by a mathematical model. By selecting a decision model in advance, the detection quality can be increased. Depending on the background characteristics, different models are more suitable.
- Plants with a size of 1–5% of the measuring spot can be recognized. Weeds in the two-leaf stage can be identified.
- The detection success of the recognition system was demonstrated in field tests. Field trials on municipal areas (with models of plants), winter wheat fields (with models of plants) and grassland (with dock) are shown. In all experimental variants, objects and weeds could be recognized.
It can be stated that true-color sensors are able to detect small differences in luminance and coloration of objects. They are real-time capable, easy to use and inexpensive. The sensor system is open to the user and can be adapted to individual conditions. In combination with the presented algorithms, it was shown that the sensor has the potential to differentiate between crop and weed.
Compared with the existing optoelectronic systems presented in this paper, true-color sensors represent a further development. Plants can be detected and discriminated; even the discrimination of green plants from one another is possible in some cases. An important next step is to evaluate the sensor system under real field conditions in different crops. The amount of herbicide saved will be the most convincing evaluation parameter.

Acknowledgments

This project was funded by the Deutsche Bundesstiftung Umwelt (DBU). We thank Matthias Kuhl from the Premosys company for the cooperation and for providing the CIE-Lab sensors and technical devices.

Author Contributions

Oliver Schmittmann designed the sensor system and performed the experiments; Peter Schulze Lammers supported the cooperation with the Premosys company and writing of the paper.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Figure A1. Signal distribution of backgrounds for Channels L, a and b. (a) Asphalt; (b) Fine chippings; (c) Red paving stones; (d) Stone steps; (e) Gray paving stones; (f) Gravel; (g) Sandy path; (h) Arable land; (i) Grazing land; (j) Grassland with dew.

References

  1. Bundesministerium für Ernährung und Landwirtschaft. Nationaler Aktionsplan zur Nachhaltigen Anwendung von Pflanzenschutzmitteln. Federal Ministry for Consumer Protection, Food and Agriculture of Germany, 2013. Available online: http://www.bmel.de/SharedDocs/Downloads/Broschueren/Nationaler AktionsplanPflanzenschutz.pdf?__blob=publicationFile (accessed on 5 May 2017).
  2. International Institute for Sustainable Development. FAO, WHO Join Forces to Reduce Risks Posed by Pesticides. 2017. Available online: http://sdg.iisd.org/news/fao-who-join-forces-to-reduce-risks-posed-by-pesticides/ (accessed on 18 June 2017).
  3. Schmittmann, O.; Kam, H.; Schulze Lammers, P. Position steered sowing of sugar beet: Technology and precision. In Proceedings of the 2nd International Conference on Machine Control & Guidance (MCG), Bonn, Germany, 9–11 March 2010.
  4. Hloben, P. Study on the Response Time of Direct Injection Systems for Variable Rate Application of Herbicides (VDI-MEG Schriftenreihe 459). Ph.D. Thesis, University of Bonn, Bonn, Germany, 2007.
  5. Vondricka, J. Study on the Process of Direct Nozzle Injection for Real-Time Site Specific Pesticide Application (VDI-MEG Schriftenreihe 465). Ph.D. Thesis, University of Bonn, Bonn, Germany, 2008.
  6. Walgenbach, M. Aufbau und Untersuchung eines Versuchsträgers zur Direkteinspeisung an der Düse (VDI-MEG Schriftenreihe 533). Ph.D. Thesis, University of Bonn, Bonn, Germany, 2014.
  7. Vondricka, J.; Schulze Lammers, P. Evaluation of a carrier control valve for a direct nozzle injection system. Biosyst. Eng. 2009, 103, 43–46.
  8. Rockwell, A.D.; Ayers, P.D. A variable rate, direct nozzle injection field sprayer. Appl. Eng. Agric. 1996, 12, 531–538.
  9. Paice, M.E.R.; Miller, P.C.H.; Bodle, J.D. An experimental sprayer for the spatially selective application of herbicides. J. Agric. Eng. Res. 1995, 60, 107–116.
  10. Downey, D.; Crowe, T.G.; Giles, D.K.; Slaughter, D.C. Direct Nozzle Injection of Pesticide Concentrate into Continuous Flow for Intermittent Spray Applications. Trans. ASABE 2006, 49, 865–873.
  11. Cordeau, S.; Triolet, M.; Wayman, S.; Steinberg, C.; Guillemin, J.-P. Bioherbicides. Dead in the water? A review of the existing products for integrated weed management. Crop Prot. 2016, 87, 44–49.
  12. Kempenaar, C.; Groeneveld, R.M.W.; Uffing, A.J.M. Evaluation of Weed IT Model 2006 MKII. Spray Volume and Dose Response Tests; Plant Research International Wageningen UR: Wageningen, The Netherlands, 2006.
  13. Knittel, G.; Stahli, W. Spritz- und Sprühverfahren in Pflanzenschutz und Flüssigdüngung bei Flächenkulturen; Books on Demand: Berlin, Germany, 2001.
  14. Biller, R. Optoelektronik zur Unkrauterkennung—Erste Erfahrungen beim Test unter simulierten Bedingungen und beim Einsatz auf Versuchsflächen Innovative Verfahren zur Unkrauterkennung. KTBL Arb. 1996, 236, 75–85.
  15. Gibson, P.; Power, C. Introductory Remote Sensing: Digital Image Processing and Applications; Routledge: London, UK, 2000.
  16. Trimble. 2017. Available online: http://www.trimble.com/agriculture/weedseeker.aspx (accessed on 24 April 2017).
  17. Vrindts, E.; de Baerdemaeker, J. Optical discrimination of crop, weed and soil for on-line weed detection. In Proceedings of the First European Conference on Precision Agriculture, Warwick University Conference Centre, Coventry, UK, 7–10 September 1997.
  18. Wartenberg, G.; Langner, H.-R.; Böttger, H.; Schmidt, H.; Ruckelshausen, A. Messsystem zur Bewertung des Unkrautvorkommens; Bornimer Agrartechnische Berichte: Potsdam, Germany, 2005.
  19. Hanks, J.E.; Beck, J.L. Sensor-controlled hooded sprayer for row crops. Weed Technol. 1998, 12, 308–314.
  20. WeedIT. 2017. Available online: www.weedit.com.au (accessed on 24 April 2017).
  21. Visser, R.; Timmermans, A.J.M. Weed-It: A new selective weed control system. In Proceedings of SPIE 2907, Optics in Agriculture, Forestry, and Biological Processing II, 120; Boston, MA, USA, 18 November 1996.
  22. Holland Scientific. Crop-Cycle ACS 470 Manual. 2014. Available online: http://hollandscientific.com/wp-content/uploads/files/ACS430_Manual.pdf (accessed on 15 December 2014).
  23. Köller, K.H. Landtechnische Innovationen auf der Agritechnica 2015. Available online: http://www.dlg.org/aktuell_landwirtschaft.html?detail/2015dlg/1/1/7922 (accessed on 18 December 2015).
  24. Department of Agriculture and Rural Development, Alberta, Canada, 2013. The Detectspray Spraying System. Available online: http://www1.agric.gov.ab.ca/$department/deptdocs.nsf/all/eng7995 (accessed on 24 February 2015).
  25. Kluge, A. Methoden zur Automatischen Unkrauterkennung für die Prozesssteuerung von Herbizidmaßnahmen. Ph.D. Thesis, University of Stuttgart, Stuttgart, Germany, 2011.
  26. Feyaert, F.; Pollet, P.; van Gool, L.; Wambact, P. Vision system for weed detection using hyper-spectral imaging, structural field information and unsupervised training sample collection. In Proceedings of the British Crop Protection Conference, Brighton, UK, 15–18 November 1999; pp. 607–614.
  27. Choi, K.H.; Han, A.H.; Han, S.H.; Park, K.-H.; Kim, K.S.; Kim, S. Morphology-based guidance line extraction for an autonomous weeding robot in paddy fields. Comput. Electron. Agric. 2015, 113, 266–274.
  28. Brivot, R.; Marchant, J.A. Segmentation of plants and weeds for a precision crop protection robot using infrared images. IEE Proc.-Vis. Image Signal Process. 1996, 143, 118–124.
  29. Dammer, K.-H.; Kim, D.-S. Real-time variable-rate herbicide application for weed control in carrots. Weed Res. 2016, 56, 237–246.
  30. Gerhards, R.; Christensen, S. Real-time weed detection, decision making and patch spraying in maize, sugarbeet, winter wheat and winter barley. Weed Res. 2003, 43, 385–392.
  31. Gerhards, R.; Oebel, H. Practical experiences with a system for site-specific weed control in arable crops using real-time image analysis and GPS-controlled patch spraying. Weed Res. 2006, 46, 185–193.
  32. Laursen, M.S.; Jørgensen, R.N.; Midtiby, H.S.; Jensen, K.; Christiansen, M.P.; Giselsson, T.M.; Mortensen, A.K.; Jensen, P.K. Dicotyledon Weed Quantification Algorithm for Selective Herbicide Application in Maize Crops. Sensors 2016, 16, 1848.
  33. Sökefeld, M.; Gerhards, R.; Oebel, H.; Therburg, R.-D. Image acquisition for weed detection and identification by digital image analysis. In Precision Agriculture '07; Stafford, J.V., Ed.; Academic Publishers: Wageningen, The Netherlands, 2007; pp. 523–528.
  34. Sökefeld, M.; Gerhards, R.; Kühbauch, W. Teilschlagspezifische Unkrautkontrolle—Von der Unkrauterfassung bis zur Herbizidapplikation. J. Plant Dis. Prot. 2000, 17, 227–233.
  35. Klose, R.; Penlington, J.; Ruckelshausen, A. Usability study of 3D Time-of-Flight cameras for automatic plant phenotyping. In Proceedings of the 1st International Workshop on Computer Image Analysis in Agriculture, Potsdam, Germany, 27–28 August 2009.
  36. Fender, F.; Hanneken, M.; Linz, A.; Ruckelshausen, A.; Spicer, M. Messende Lichtgitter und Multispektralkameras als bildgebende Systeme zur Pflanzenerkennung. Bornimer Agrartech. Ber. 2005, 40, 7–16.
  37. Lee, W.S.; Alchanatis, V.; Yang, C.; Hirafuji, M.; Moshou, D.; Li, C. Sensing technologies for precision specialty crop production. Comput. Electron. Agric. 2010, 74, 2–33.
  38. Schmittmann, O. Teilflächenspezifische Ertragsmessung von Zuckerrüben in Echtzeit unter besonderer Berücksichtigung der Einzelrübenmasse (VDI-MEG Schriftenreihe 401). Ph.D. Thesis, University of Bonn, Bonn, Germany, 2002.
  39. Jensen, K.; Nimz, T. Welche Genauigkeit Erreicht Man Mit Farbsensoren und Mini-Spektrometern? 2015. Available online: http://www.mazet.de/de/downloads/produkt-kunden informationen/white-paper/item/download/39_9302d1d2d74f08ece546cc7c63cf0386.html (accessed on 24 April 2017).
  40. Mazet. 2017. Available online: www.mazet.de (accessed on 24 April 2017).
  41. ISO 12647. Graphic Technology—Process Control for the Production of Half-Tone Colour Separations, Proof and Production Prints; International Organization for Standardization: Geneva, Switzerland, 2013.
  42. ISO 13655. Graphic Technology—Spectral Measurement and Colorimetric Computation for Graphic Arts Images; International Organization for Standardization: Geneva, Switzerland, 2009.
  43. EN ISO 11664-4, 1976. Colorimetry—Part 4: CIE 1976 L*a*b* Colour Space. Available online: https://de.wikipedia.org/wiki/Delta_E (accessed on 17 July 2017).
  44. Meier, U.; Bleiholder, H.; Buhr, L.; Feller, C.; Hack, H.; Heß, M.; Lancashire, P.D.; Schnock, U.; Stau, R.; van den Boom, T.; et al. The BBCH system to coding the phenological growth stages of plants—History and publications. J. Kulturpflanzen 2009, 61, 41–52.
  45. Van Evert, F.K.; Samsom, J.; Polder, G.; Vijn, M.; van Dooren, H.-J.; Lamaker, A. A robot to detect and control broad-leaved dock (Rumex obtusifolius L.) in grassland. J. Field Robot. 2011, 28, 264–277.
Figure 1. Nineteen-diode hexagon sensor IC with three segment photodiodes for color detection.
Figure 2. Design of the sensor-lens unit prepared for use in the experiments. Components: 1. True-color sensor with PC interface (RS232), power supply, switching outputs and two optical fibers (emitter and receiver); 2. Lens (double concave, diameter 5 cm, angle 22.5°, connector for optical fiber); 3. Space for valve and nozzle.
Figure 3. Representation of backgrounds. (a) Gray paving stones; (b) Stone steps; (c) Asphalt; (d) Fine chippings; (e) Gravel; (f) Red paving stones; (g) Sandy path; (h) Arable land with shadows; (i) Grassland; (j) Grassland with dew.
Figure 4. Stationary carrier for dynamic sensor tests.
Figure 5. Concept (Left) and prototype (Right) of the mobile field carrier with spraying device.
Figure 6. Reflection of paved ground with different colored cards, ∆E, ∆L, ∆a, ∆b and the defined threshold for detection.
Figure 7. Reflection of a winter wheat field with different colored cards, ∆E, ∆L, ∆a, ∆b and the defined threshold for detection.
Figure 8. Reflection of grassland with dock, ∆E, ∆L, ∆a, ∆b and the defined threshold for detection.
Table 1. Interpretation and evaluation of the ∆E values ([43], translated).
∆E           Evaluation
0.0–0.5      no to almost no difference
0.5–1.0      difference may be noticeable to the trained eye
1.0–2.0      weak perceptible color difference
2.0–4.0      perceived color difference
4.0–5.0      substantial difference in color, which is rarely tolerated
above 5.0    high difference, defined as a different color
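For reference, the categories of Table 1 can be encoded as a simple lookup. This is a sketch; treating each upper bound as inclusive is an assumption, since the table gives open ranges.

```python
# Map a CIE76 color difference dE to the perceptual categories of
# Table 1. Boundary handling (upper bounds inclusive) is an assumption.
def delta_e_category(dE):
    if dE <= 0.5:
        return "no to almost no difference"
    if dE <= 1.0:
        return "noticeable to the trained eye"
    if dE <= 2.0:
        return "weak perceptible color difference"
    if dE <= 4.0:
        return "perceived color difference"
    if dE <= 5.0:
        return "substantial difference, rarely tolerated"
    return "different color"
```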
Table 2. Statistical description of reflection values of selected anthropogenic and natural backgrounds by the mean Ø and standard deviation σ.
                       L                               a                               b
                    Ø      σ     Ø−σ    Ø+σ      Ø      σ     Ø−σ    Ø+σ      Ø      σ     Ø−σ    Ø+σ
Anthropogenic backgrounds
Gray paving stones  18.69  0.62  18.07  19.31    2.80   1.00   1.80   3.80   20.81  0.54  20.27  21.35
Asphalt             12.91  0.34  12.57  13.25    2.16   1.71   0.45   3.87   13.53  1.12  12.41  14.65
Gravel               7.44  1.05   6.39   8.49    3.15   1.00   2.15   4.15   12.24  1.46  10.78  13.70
Stone steps         13.55  0.50  13.05  14.05    2.34   0.52   1.82   2.86   15.44  0.76  14.68  16.20
Fine chippings      15.55  0.55  15.00  16.10    3.90   0.50   3.40   4.40   18.52  0.84  17.68  19.36
Red paving stones   15.28  1.28  14.00  16.56   13.61   2.89  10.72  16.50   24.12  0.91  23.21  25.03
Sandy path          11.45  0.41  11.04  11.86    4.90   1.62   3.28   6.52   19.72  1.18  18.54  20.90
Natural backgrounds
Arable land         11.32  0.13  11.19  11.45    4.81   0.70   4.11   5.51   20.10  0.47  19.63  20.57
Grassland           10.35  0.96   9.39  11.31   −2.37   2.40  −4.77   0.03   22.74  2.06  20.68  24.80
Grassland with dew   9.80  2.23   7.57  12.03   −6.32   2.54  −8.86  −3.78   19.47  3.32  16.15  22.79
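The Ø ± σ bands of Table 2 suggest a simple per-channel screening rule: a reading is suspicious when any channel leaves the calibrated background band. The sketch below is not the paper's decision model; the arable-land statistics come from Table 2, and the tolerance factor k is a hypothetical tuning parameter.

```python
# Background statistics (mean, sigma) per channel for arable land,
# taken from Table 2. Values for other backgrounds would be swapped in
# during calibration.
ARABLE = {"L": (11.32, 0.13), "a": (4.81, 0.70), "b": (20.10, 0.47)}

def is_suspicious(lab, stats=ARABLE, k=2.0):
    """Flag a (L, a, b) reading whose distance from the background
    mean exceeds k standard deviations in any channel."""
    for value, (mean, sigma) in zip(lab, (stats["L"], stats["a"], stats["b"])):
        if abs(value - mean) > k * sigma:
            return True
    return False
```

A grassland-green reading (negative a*) immediately leaves the arable-land a-band, while a reading close to the background means stays unflagged.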
Table 3. Description of the reflection properties of different selected greens.
           Green 1                  Green 2                  Green 3                  Green 4
           L       a       b        L       a       b        L       a       b        L       a       b
Median     44.78  −15.21  44.47    40.45  −26.27  51.75    33.95  −49.86  33.39    29.16  −61.65  −4.33
Mean       44.80  −15.24  44.39    40.45  −26.18  51.80    33.95  −49.95  33.54    29.15  −61.51  −4.22
Minimum    44.71  −14.85  43.79    40.39  −16.83  50.82    32.95  −49.61  32.95    28.98  −63.01  −3.25
Maximum    44.84  −15.73  44.69    40.55  −25.60  52.91    34.47  −50.53  34.47    29.32  −60.35  −4.96
Table 4. Required relative coverage of different shades of green on backgrounds for Channels L, a, b and Virtual Channel d.
                 Green 1                  Green 2                  Green 3                  Green 4
                 L     a     b     d     L     a     b     d     L     a     b     d     L     a     b     d
Gray pavings     2.5   7.1   3.5   3.0   3.6   7.7   8.5   7.1   5.9   3.6  16.4   5.2   9.2   5.7   6.8  12.2
Asphalt cover    1.2  11.5   4.6   4.4   1.8  10.3   8.4   7.7   2.9   5.0  13.3   5.9   4.2   6.8  12.8  12.3
Gravel           2.9   7.0   5.4   3.5   3.7   7.6   9.0   6.9   5.0   3.6  14.1   5.3   6.4   5.6  15.9  11.0
Flagged floor    1.8   4.6   3.6   3.4   2.5   6.1   7.8   7.1   3.8   2.8  12.7   5.3   5.4   5.0   9.8  11.6
Concrete         2.1   4.1   4.3   2.5   2.9   5.7   8.8   6.6   4.5   2.6  15.9   4.8   6.5   4.8   8.8  10.9
Red pavings      4.5  18.0   5.9   5.6   5.7  15.4  10.9   8.6   8.4   9.2  26.2   6.7  11.7  10.0   7.3  13.4
Sandy path       1.4   9.5   5.9   4.9   2.0   9.1  10.2   8.2   3.1   4.6  19.8   6.3   4.2   6.4   9.8  13.5
Field            0.5   4.9   3.1   2.5   1.0   6.2   8.1   6.6   1.8   3.0  15.0   4.8   2.6   5.1   6.7   8.0
Grassland        2.9  21.0  10.8   8.8   3.7  15.2  14.3  11.5   5.3   7.0  33.6   8.9   6.9   8.5  11.9  20.4
Grassland + dew  6.5  31.8  14.4  15.2   7.8  18.9  16.8  15.6  10.4   7.9  34.8  12.7  13.3   9.4  18.8  25.2

Spot diameter 5 cm (~20 cm²) to 10 cm (~80 cm²):
Required coverage (%)   Plant size, 5 cm spot   Plant size, 10 cm spot
0–3.0                   less than 0.6 cm²       less than 2.4 cm²
3.1–5.0                 1 cm²                   4 cm²
5.1–7.5                 1.5 cm²                 6 cm²
7.6–10                  2 cm²                   8 cm²
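The size legend at the bottom of Table 4 is the required relative coverage applied to the (rounded) spot area, about 20 cm² for a 5 cm spot and about 80 cm² for a 10 cm spot. A minimal sketch of this conversion:

```python
# Convert a required relative coverage (%, Table 4) into the minimum
# detectable plant area for a given measuring-spot area in cm^2.
# The spot areas ~20 cm^2 (5 cm diameter) and ~80 cm^2 (10 cm diameter)
# follow Table 4's footnote, which rounds pi*r^2.
def min_plant_area(coverage_pct, spot_area_cm2):
    """Minimum plant area (cm^2) needed to trigger detection."""
    return coverage_pct / 100.0 * spot_area_cm2
```

For example, a 3% required coverage corresponds to roughly 0.6 cm² on a 5 cm spot and 2.4 cm² on a 10 cm spot, matching the legend.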
