Distance-Dependent Multimodal Image Registration for Agriculture Tasks
Abstract
1. Introduction
Task | Sensors | Description of Research | Reference
---|---|---|---
Target detection | RGB, spectral | Survey on fruit detection using computer vision | [4]
 | RGB camera | Detection of grape clusters | [5]
 | Thermal camera | Detection of oranges using thermal imagery | [6]
 | RGB camera + thermal camera | Detection of oranges using an RGB camera and a thermal camera, with fusion between the two | [2]
Precision spraying | RGB camera | Selective sprayer for weed control using machine vision, a real-time controller, and a controllable spraying system | [7]
 | CCD camera | Tree-crown recognition and smart spraying system | [8]
 | BW camera + ultrasonic | Speed spraying using fuzzy-logic control of machine vision and ultrasonic sensors | [9]
 | Ultrasonic | Spraying robot for vine production | [10]
Robotic harvesting | Stereo vision | Apple harvester robot | [11]
 | Color camera (HSV) | Robot for greenhouse operations | [12]
 | Color camera | Apple harvester robot | [13]
 | Two wavelengths of color camera | Strawberry harvester robot | [14]
2. Methods
2.1. Image Registration Model
- (i) Capture a scene at varying distances between the sensors and the CPs;
- (ii) Calculate the TM for each scene at its corresponding distance;
- (iii) Construct a collection of all the calculated TMs (Figure 2);
- (iv) Collect the corresponding values from each element of each matrix, use them as samples of a distance-dependent function, and perform a regression R on these samples to obtain a functional representation of the element;
- (v) Combine all the element functions to construct the DDTM.
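The procedure above can be sketched in code. The following is a minimal illustration in Python/NumPy; the distances, matrices, and linear (degree-1) regression are synthetic stand-ins, not the experimental values:

```python
import numpy as np

def build_ddtm(distances, tms, deg=1):
    """Fit one regression per TM element across capture distances (steps iii-iv)."""
    distances = np.asarray(distances, dtype=float)
    tms = np.asarray(tms, dtype=float)          # shape (n, 3, 3): one TM per distance
    ddtm = np.empty((3, 3), dtype=object)
    for r in range(3):
        for c in range(3):
            # treat element (r, c) of every TM as samples of a distance-dependent function
            ddtm[r, c] = np.polyfit(distances, tms[:, r, c], deg)
    return ddtm

def evaluate_ddtm(ddtm, d):
    """Evaluate the DDTM at distance d to recover a concrete 3x3 TM (step v)."""
    return np.array([[np.polyval(ddtm[r, c], d) for c in range(3)]
                     for r in range(3)])

# synthetic example: element (0, 2) drifts linearly with distance
distances = [1000.0, 2000.0, 3000.0]
tms = [np.array([[0.37, 0.0, -0.01 * d],
                 [0.0, 0.36, 0.0],
                 [0.0, 0.0, 1.0]]) for d in distances]
ddtm = build_ddtm(distances, tms)
H = evaluate_ddtm(ddtm, 2500.0)   # interpolated TM at a distance not captured directly
```

The key point is that registration at an unseen distance reduces to evaluating eight small regression models instead of re-estimating a TM from new control points.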
2.2. Target Rail and Sensor Position
2.3. Artificial Control Points
- (i) Capture the input RGB image (Figure 5a);
- (ii) Convert the RGB image to HSV representation and isolate the hue and saturation channels (Figure 5b,d, where (b) is the hue and (d) is the saturation);
- (iii) Threshold the hue and the saturation channels (thresholds were set according to the detected color) (Figure 5c,e);
- (iv) Merge (logical OR) the resulting binary images from step (iii) (Figure 5f);
- (v) Isolate the ACPs plate using the RGB image (Figure 5g); the ACPs plate is brighter than its environment, so by thresholding the R, G, and B channels the plate can be identified and isolated;
- (vi) Combine (logical AND) the results of steps (iv) and (v) (Figure 5h);
- (vii) Remove small clusters (<500 pixels) considered as noise (Matlab command bwareaopen) (Figure 5i);
- (viii) Fill holes in the image using morphological operations (Matlab command imfill) (Figure 5j);
- (ix) Apply erosion followed by dilation using a disk mask (r = 15 pixels) (Figure 5k); the disk-shaped mask helps remove small, non-ellipse shapes;
- (x) Filter remaining identification errors by searching for saturated pixels at the shape center (expected due to the light bulb) (Figure 5l).
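The cleanup stages of this pipeline, steps (vii) through (ix), map onto standard morphological operations. The sketch below uses SciPy equivalents of the Matlab commands named above (bwareaopen, imfill), applied to a synthetic binary mask rather than the real Figure 5 imagery:

```python
import numpy as np
from scipy import ndimage

def clean_acp_mask(mask, min_size=500, disk_radius=15):
    """Steps (vii)-(ix): remove small clusters (Matlab bwareaopen), fill holes
    (Matlab imfill), then erode/dilate with a disk to drop non-ellipse shapes."""
    # remove connected components smaller than min_size pixels
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep = np.zeros(n + 1, dtype=bool)
    keep[1:] = sizes >= min_size
    mask = keep[labels]
    # fill interior holes
    mask = ndimage.binary_fill_holes(mask)
    # erosion followed by dilation (opening) with a disk structuring element
    yy, xx = np.mgrid[-disk_radius:disk_radius + 1, -disk_radius:disk_radius + 1]
    disk = xx ** 2 + yy ** 2 <= disk_radius ** 2
    mask = ndimage.binary_erosion(mask, structure=disk)
    return ndimage.binary_dilation(mask, structure=disk)

# synthetic mask: one ACP-sized blob with a central hole, plus a small noise cluster
mask = np.zeros((200, 200), dtype=bool)
yy, xx = np.mgrid[:200, :200]
mask[(xx - 100) ** 2 + (yy - 100) ** 2 <= 40 ** 2] = True   # candidate ACP
mask[(xx - 100) ** 2 + (yy - 100) ** 2 <= 5 ** 2] = False   # hole left by the bulb
mask[10:13, 10:13] = True                                    # 9-pixel noise cluster
cleaned = clean_acp_mask(mask)
```

After cleaning, the small noise cluster is gone and the hole inside the blob is filled; only shapes that survive the disk-shaped opening remain as ACP candidates.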
- (i) Capture a thermal image;
- (ii) Threshold the image for high temperature values;
- (iii) Remove small clusters (<10 pixels) considered as noise (Matlab command bwareaopen);
- (iv) Apply erosion followed by dilation using a disk mask (r = 5 pixels); the disk-shaped mask helps remove small, non-ellipse shapes.
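A minimal sketch of this thermal pipeline follows, using a synthetic temperature array and an illustrative threshold; the SciPy calls stand in for the Matlab commands named above:

```python
import numpy as np
from scipy import ndimage

def detect_thermal_acps(thermal, temp_threshold, min_size=10, disk_radius=5):
    """Steps (ii)-(iv): threshold high temperatures, drop clusters smaller
    than min_size pixels, then open the mask with a disk structuring element."""
    mask = thermal > temp_threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep = np.zeros(n + 1, dtype=bool)
    keep[1:] = sizes >= min_size
    mask = keep[labels]
    yy, xx = np.mgrid[-disk_radius:disk_radius + 1, -disk_radius:disk_radius + 1]
    disk = xx ** 2 + yy ** 2 <= disk_radius ** 2
    mask = ndimage.binary_erosion(mask, structure=disk)
    return ndimage.binary_dilation(mask, structure=disk)

# synthetic 100x100 "thermal image": 20 C background with one hot bulb region
img = np.full((100, 100), 20.0)
yy, xx = np.mgrid[:100, :100]
img[(xx - 50) ** 2 + (yy - 50) ** 2 <= 12 ** 2] = 60.0
acp_mask = detect_thermal_acps(img, temp_threshold=40.0)
```

Because the bulbs are the only strongly hot objects in the scene, a single temperature threshold plus size filtering is usually enough to isolate them.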
2.4. ACP Analysis
Sensor | Average (pixel) | Standard Deviation (pixel)
---|---|---
RGB | 1.36 | 0.88
Thermal | 0.87 | 0.32
Thermal (normalized) | 2.7 | 1.03
3. Experimental Evaluation
3.1. Experiment 1: Estimation of the DDTM
Element | Regression Result | Approximation as Constant
---|---|---
h00 | f(D) = 2.15E−06·D + 3.72E−01 | 0.3729
h01 | f(D) = 1.06E−06·D − 1.41E−02 | −0.014085
h02 | f(D) = −0.012784·D − 16.110487 |
h10 | f(D) = −1.44E−06·D − 2.54E−02 | 0.025381
h11 | f(D) = 2.44E−06·D − 3.63E−01 | 0.363118
h12 | f(D) = −0.000845·D − 17.503239 |
h20 | f(D) = −5.56E−09·D + 9.00E−05 | 0.000090
h21 | f(D) = 3.71E−09·D − 1.45E−05 | −0.000014
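With h22 fixed to 1, the fitted elements in the table define the homography at any distance D. The sketch below assembles H(D) from the table's linear coefficients and maps a pixel through it; the distance value and pixel coordinates are arbitrary examples, not measurements from the paper:

```python
import numpy as np

# (slope, intercept) per element, taken from the regression results above
COEFFS = {
    "h00": (2.15e-06, 3.72e-01),
    "h01": (1.06e-06, -1.41e-02),
    "h02": (-0.012784, -16.110487),
    "h10": (-1.44e-06, -2.54e-02),
    "h11": (2.44e-06, -3.63e-01),
    "h12": (-0.000845, -17.503239),
    "h20": (-5.56e-09, 9.00e-05),
    "h21": (3.71e-09, -1.45e-05),
}

def homography_at(distance):
    """Assemble the 3x3 distance-dependent TM at the given distance."""
    h = {k: a * distance + b for k, (a, b) in COEFFS.items()}
    return np.array([[h["h00"], h["h01"], h["h02"]],
                     [h["h10"], h["h11"], h["h12"]],
                     [h["h20"], h["h21"], 1.0]])

def map_point(H, x, y):
    """Apply the homography to pixel (x, y) via homogeneous coordinates."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

H = homography_at(2000.0)        # distance in the same units as D in the table
x, y = map_point(H, 320.0, 240.0)
```

This is exactly how the DDTM is meant to be consumed at run time: evaluate the eight element functions at the measured distance, then warp pixels with the resulting matrix.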
3.2. Experiment 2: DDTM Evaluation, Straight Vertical Plane
3.3. Experiment 3: DDTM Evaluation, Oblique ACPs Plane and Sensor Vibration
Vibration Frequency, Hz (Swing Arm Rotation Frequency, Hz) | Acceleration (g) | Number of Captured Images | Vertical Plane (CP1, CP2, CP5, CP6) Mean Error (pixel) | Vertical Plane Standard Deviation (pixel) | Oblique Plane (CP3, CP4, CP7, CP8) Mean Error (pixel) | Oblique Plane Standard Deviation (pixel)
---|---|---|---|---|---|---
0 | 0 | 118 | 3.68 | 0.67 | 4.02 | 0.86
2.18 | 0.011 | 118 | 3.34 | 0.79 | 4.18 | 1.20
2.49 | 0.0577 | 118 | 3.29 | 0.72 | 4.42 | 1.21
2.80 | 0.0595 | 118 | 3.38 | 0.62 | 3.62 | 0.98
3.11 | 0.1117 | 118 | 3.07 | 0.70 | 3.81 | 1.04
3.42 | 0.1941 | 118 | 3.38 | 0.84 | 3.89 | 1.19
3.73 | 0.3467 | 118 | 4.07 | 1.33 | 4.19 | 1.51
4.04 | 0.3252 | 118 | 3.91 | 1.23 | 4.11 | 1.43
4.35 | 0.2433 | 118 | 3.45 | 0.87 | 3.72 | 1.32
4.66 | 0.2596 | 118 | 3.74 | 1.10 | 3.76 | 1.13
4.97 | 0.3575 | 118 | 3.55 | 0.96 | 4.20 | 1.33
5.28 | 0.4519 | 118 | 3.68 | 0.88 | 4.13 | 1.20
4. Discussion and Conclusions
Acknowledgments
Conflicts of Interest
References
1. Kapach, K.; Barnea, E.; Mairon, R.; Edan, Y.; Ben-Shahar, O. Computer vision for fruit harvesting robots: state of the art and challenges ahead. Int. J. Comput. Vis. Robot. 2012, 3, 4–34.
2. Bulanon, D.; Burks, T.; Alchanatis, V. Image fusion of visible and thermal images for fruit detection. Biosyst. Eng. 2009, 103, 12–22.
3. Stajnko, D.; Lakota, M.; Hocevar, M. Application of Thermal Imaging for Visualization of High Density Orchards and Counting Apple Fruits. In Proceedings of the VII International Symposium on Modelling in Fruit Research and Orchard Management, Copenhagen, Denmark, 1 April 2006; pp. 211–215.
4. Jimenez, A.; Ceres, R.; Pons, J. A survey of computer vision methods for locating fruit on trees. Trans. Am. Soc. Agric. Eng. 2000, 43, 1911–1920.
5. Berenstein, R.; Shahar, O.B.; Shapiro, A.; Edan, Y. Grape clusters and foliage detection algorithms for autonomous selective vineyard sprayer. Intell. Serv. Robot. 2010, 3, 233–243.
6. Bulanon, D.; Burks, T.; Alchanatis, V. Study on temporal variation in citrus canopy using thermal imaging for citrus fruit detection. Biosyst. Eng. 2008, 101, 161–171.
7. Steward, B.L.; Tian, L.F.; Tang, L. Distance-based control system for machine vision-based selective spraying. Trans. Am. Soc. Agric. Eng. 2002, 45, 1255.
8. Zheng, J. Intelligent Pesticide Spraying Aims for Tree Target. Resource: Engineering & Technology for a Sustainable World; ASABE: St. Joseph, MI, USA, 2005.
9. Shin, B.S.; Kim, S.H.; Park, J.U. Autonomous Agricultural Vehicle Using Overhead Guide. In Automation Technology for Off-Road Equipment; Chicago, IL, USA, 2002.
10. Ogawa, Y.; Kondo, N.; Monta, M.; Shibusawa, S. Spraying robot for grape production. In Springer Tracts in Advanced Robotics; Springer: Berlin, Germany, 2006; pp. 539–548.
11. Kassay, L. Hungarian robotic apple harvester. In Proceedings of the International Summer Meeting of the American Society of Agricultural Engineers, Charlotte, NC, USA, 21–24 June 1992.
12. Buemi, F.; Massa, M.; Sandini, G. Agrobot: A Robotic System for Greenhouse Operations. In Proceedings of the 4th Workshop on Robotics in Agriculture, IARP, Toulouse, France, 30–31 October 1995; pp. 172–184.
13. Baeten, J.; Donné, K.; Boedrij, S.; Beckers, W.; Claesen, E. Autonomous fruit picking machine: A robotic apple harvester. In Field and Service Robotics; Springer: Berlin, Germany, 2008; pp. 531–539.
14. Yamamoto, S.; Hayashi, S.; Yoshida, H.; Kobayashi, K. Development of a stationary robotic strawberry harvester with a picking mechanism that approaches the target fruit from below. Jpn. Agric. Res. Q. 2014, 48, 261–269.
15. Zitova, B.; Flusser, J. Image registration methods: A survey. Image Vis. Comput. 2003, 21, 977–1000.
16. Lester, H.; Arridge, S.R. A survey of hierarchical non-linear medical image registration. Pattern Recognit. 1999, 32, 129–149.
17. Zheng, Z.; Wang, H.; Khwang Teoh, E. Analysis of gray level corner detection. Pattern Recognit. Lett. 1999, 20, 149–162.
18. Ton, J.; Jain, A.K. Registering Landsat images by point matching. IEEE Trans. Geosci. Remote Sens. 1989, 27, 642–651.
19. Stockman, G.; Kopstein, S.; Benett, S. Matching images to models for registration and object detection via clustering. IEEE Trans. Pattern Anal. Mach. Intell. 1982, PAMI-4, 229–241.
20. Pratt, W.K. Correlation techniques of image registration. IEEE Trans. Aerosp. Electron. Syst. 1974, AES-10, 353–358.
21. Barnea, D.I.; Silverman, H.F. A class of algorithms for fast digital image registration. IEEE Trans. Comput. 1972, 100, 179–186.
22. Althof, R.J.; Wind, M.G.J.; Dobbins, J.T., III. A rapid and automatic image registration algorithm with subpixel accuracy. IEEE Trans. Med. Imaging 1997, 16, 308–316.
23. Jarc, A.; Pers, J.; Rogelj, P.; Perse, M.; Kovacic, S. Texture features for affine registration of thermal (FLIR) and visible images. In Proceedings of the Computer Vision Winter Workshop, St. Lambrecht, Austria, 6–8 February 2007.
24. Davis, J.W.; Sharma, V. Background-subtraction using contour-based fusion of thermal and visible imagery. Comput. Vis. Image Underst. 2007, 106, 162–182.
25. Istenic, R.; Heric, D.; Ribaric, S.; Zazula, D. Thermal and visual image registration in Hough parameter space. In Proceedings of the 14th International Workshop on Systems, Signals and Image Processing, Maribor, Slovenia, 27–30 June 2007; pp. 106–109.
26. Schaefer, G.; Tait, R.; Zhu, S.Y. Overlay of thermal and visual medical images using skin detection and image registration. In Proceedings of the 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS'06), New York, NY, USA, 30 August–3 September 2006; pp. 965–967.
27. Tait, R.J.; Schaefer, G.; Hopgood, A.A.; Zhu, S.Y. Efficient 3-D medical image registration using a distributed blackboard architecture. In Proceedings of the 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS'06), New York, NY, USA, 30 August–3 September 2006; pp. 3045–3048.
28. Szeliski, R. Computer Vision: Algorithms and Applications; Springer: Berlin, Germany, 2010.
29. Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision; Cambridge University Press: New York, NY, USA, 2003.
© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).
Berenstein, R.; Hočevar, M.; Godeša, T.; Edan, Y.; Ben-Shahar, O. Distance-Dependent Multimodal Image Registration for Agriculture Tasks. Sensors 2015, 15, 20845-20862. https://doi.org/10.3390/s150820845