Visual Detection and Tracking System for a Spherical Amphibious Robot
Abstract
1. Introduction
2. Previous Work and Application Requirements
2.1. An Amphibious Spherical Robot
2.2. Vision Application Requirements
3. Visual Detection and Tracking System
3.1. Workflow of the System
3.2. Structure of the System
4. Image Pre-Processing Subsystem
4.1. Principle of the Image Pre-Processing Algorithm
4.2. Image Pre-Processing Subsystem
5. Detection and Tracking Subsystem
5.1. Moving Target Detection Subsystem
Algorithm 1. Gaussian mixture model-based moving target detection
Input: the enhanced image R(x, y) and the Gaussian mixture model parameters μx,y,k, σx,y,k and ωx,y,k, where x ∈ [1, Width], y ∈ [1, Height], k ∈ [1, K]
Output: the foreground image F(x, y), where x ∈ [1, Width], y ∈ [1, Height]
procedure GaussianMixtureModelDetection(R, μ, σ, ω)
Step #1 Initialize the parameters of the Gaussian mixture model:
  μx,y,k ← rand(), σx,y,k ← σ0, ωx,y,k ← 1/K
Step #2 Try to match the Gaussian mixture model against the n-th image:
  for k = 1 to K do
    if |Rx,y,n − μx,y,k| < d·σx,y,k then
      matchk = 1
      ωx,y,k = (1 − α)·ωx,y,k + α
      μx,y,k = (1 − α/ωx,y,k)·μx,y,k + (α/ωx,y,k)·Rx,y,n
      σ²x,y,k = (1 − α/ωx,y,k)·σ²x,y,k + (α/ωx,y,k)·(Rx,y,n − μx,y,k)²
    else
      ωx,y,k = (1 − α)·ωx,y,k
    end if
  end for
Step #3 Normalize the weights ωx,y,k and sort the components by ωx,y,k/σx,y,k
Step #4 Reinitialize the component with the minimum weight if no component matched:
  if matchk = 0 for all k ∈ [1, K] then
    μx,y,0 = pixelx,y,n
    σx,y,0 = σ0
  end if
Step #5 Classify each pixel as background (0) or foreground (255):
  for k = 1 to K do
    if ωx,y,k > T and |Rx,y,n − μx,y,k| < d·σx,y,k then
      F(x, y) = 0
      break
    else
      F(x, y) = 255
    end if
  end for
Step #6 Execute 3 × 3 erosion and dilation operations over F(x, y)
Step #7 Execute connected-region analysis and list the potential moving targets
Step #8 Specify the objects larger than AreaThresh as the target
end procedure
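The per-frame update of Algorithm 1 (Steps #1–#5) can be sketched in NumPy, vectorised over pixels instead of looped as in the pseudocode. The parameter values (K, α, d, T, σ0) are illustrative assumptions, not the paper's settings, and the morphology and connected-region steps (Steps #6–#8) are omitted for brevity.

```python
import numpy as np

K = 3          # Gaussian components per pixel (assumed value)
ALPHA = 0.01   # learning rate alpha (assumed value)
D = 2.5        # match threshold in standard deviations (d)
T = 0.25       # weight threshold for background components (T)
SIGMA0 = 15.0  # initial standard deviation (sigma_0)

def init_model(height, width, rng):
    # Step #1: random means, fixed variance, equal weights.
    mu = rng.uniform(0.0, 255.0, size=(height, width, K))
    sigma = np.full((height, width, K), SIGMA0)
    omega = np.full((height, width, K), 1.0 / K)
    return mu, sigma, omega

def gmm_detect(frame, mu, sigma, omega):
    """One iteration of Algorithm 1 (Steps #2-#5) on a grayscale frame."""
    r = frame[..., None].astype(float)          # broadcast to (H, W, K)
    match = np.abs(r - mu) < D * sigma          # Step #2: match test
    omega = np.where(match, (1 - ALPHA) * omega + ALPHA, (1 - ALPHA) * omega)
    rho = np.minimum(ALPHA / omega, 1.0)        # per-component learning rate
    mu = np.where(match, (1 - rho) * mu + rho * r, mu)
    var = np.where(match, (1 - rho) * sigma**2 + rho * (r - mu)**2, sigma**2)
    sigma = np.sqrt(var)
    # Step #4: reinitialise the weakest component where nothing matched.
    ys, xs = np.nonzero(~match.any(axis=-1))
    weakest = np.argmin(omega, axis=-1)[ys, xs]
    mu[ys, xs, weakest] = frame[ys, xs]
    sigma[ys, xs, weakest] = SIGMA0
    omega /= omega.sum(axis=-1, keepdims=True)  # Step #3: normalise weights
    # Step #5: background (0) if a heavy component matches, else foreground (255).
    background = (match & (omega > T)).any(axis=-1)
    fg = np.where(background, 0, 255).astype(np.uint8)
    return fg, mu, sigma, omega
```

On a static scene the model converges within a few frames: unmatched pixels are reinitialised on the first frame, match on the second, and are classified as background thereafter.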
5.2. Visual Tracking Subsystem
6. Experimental Results
7. Conclusions and Future Work
Acknowledgments
Author Contributions
Conflicts of Interest
References
- Thompson, D.; Caress, D.; Thomas, H.; Conlin, D. MBARI mapping AUV operations in the gulf of California 2015. In Proceedings of the OCEANS 2015 - MTS/IEEE Washington, Washington, DC, USA, 19–22 October 2015. [Google Scholar]
- Tran, N.-H.; Choi, H.-S.; Bae, J.-H.; Oh, J.-Y.; Cho, J.-R. Design, control, and implementation of a new AUV platform with a mass shifter mechanism. Int. J. Precis. Eng. Manuf. 2015, 16, 1599–1608. [Google Scholar] [CrossRef]
- Ribas, D.; Palomeras, N.; Ridao, P.; Carreras, M.; Mallios, A. Girona 500 auv: From survey to intervention. IEEE ASME Trans. Mechatron. 2012, 17, 46–53. [Google Scholar] [CrossRef]
- Shi, L.; Guo, S.; Mao, S.; Yue, C.; Li, M.; Asaka, K. Development of an amphibious turtle-inspired spherical mother robot. J. Bionic Eng. 2013, 10, 446–455. [Google Scholar] [CrossRef]
- Kaznov, V.; Seeman, M. Outdoor navigation with a spherical amphibious robot. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Taipei, Taiwan, 18–22 October 2010. [Google Scholar]
- Jia, L.; Hu, Z.; Geng, L.; Yang, Y.; Wang, C. The concept design of a mobile amphibious spherical robot for underwater operation. In Proceedings of the 2016 IEEE International Conference on Cyber Technology in Automation, Control, and Intelligent Systems (CYBER), Chengdu, China, 19–22 June 2016. [Google Scholar]
- Chen, W.-H.; Chen, C.-P.; Tsai, J.-S.; Yang, J.; Lin, P.-C. Design and implementation of a ball-driven omnidirectional spherical robot. Mech. Mach. Theory 2013, 68, 35–48. [Google Scholar] [CrossRef]
- Guo, S.; He, Y.; Shi, L.; Pan, S.; Tang, K.; Xiao, R.; Guo, P. Modal and fatigue analysis of critical components of an amphibious spherical robot. Microsyst. Technol. 2016, 1–15. [Google Scholar] [CrossRef]
- Paull, L.; Saeedi, S.; Seto, M.; Li, H. AUV navigation and localization: A review. IEEE J. Ocean. Eng. 2014, 39, 131–149. [Google Scholar] [CrossRef]
- Grothues, T.M.; Dobarro, J.; Eiler, J. Collecting, interpreting, and merging fish telemetry data from an AUV: Remote sensing from an already remote platform. In Proceedings of the 2010 IEEE/OES Autonomous Underwater Vehicles, Monterey, CA, USA, 1–3 September 2010. [Google Scholar]
- Bosch Alay, J.; Grácias, N.R.E.; Ridao Rodríguez, P.; Istenič, K.; Ribas Romagós, D. Close-range tracking of underwater vehicles using light beacons. Sensors 2016, 16, 429. [Google Scholar] [CrossRef] [PubMed]
- Massot-Campos, M.; Oliver-Codina, G. Optical sensors and methods for underwater 3D reconstruction. Sensors 2015, 15, 31525–31557. [Google Scholar] [CrossRef] [PubMed]
- Yahya, M.; Arshad, M. Tracking of multiple light sources using computer vision for underwater docking. Procedia Comput. Sci. 2015, 76, 192–197. [Google Scholar] [CrossRef]
- Zhang, L.; He, B.; Song, Y.; Yan, T. Consistent target tracking via multiple underwater cameras. In Proceedings of the OCEANS 2016 - Shanghai, Shanghai, China, 10–13 April 2016. [Google Scholar]
- Chen, Z.; Shen, J.; Fan, T.; Sun, Z.; Xu, L. Single-camera three-dimensional tracking of underwater objects. Int. J. Signal Process. Image Process. Pattern Recognit. 2015, 8, 89–104. [Google Scholar] [CrossRef]
- Chuang, M.C.; Hwang, J.N.; Williams, K.; Towler, R. Multiple fish tracking via Viterbi data association for low-frame-rate underwater camera systems. In Proceedings of the 2013 IEEE International Symposium on Circuits and Systems (ISCAS2013), Beijing, China, 19–23 May 2013. [Google Scholar]
- Chuang, M.-C.; Hwang, J.-N.; Ye, J.-H.; Huang, S.-C.; Williams, K. Underwater Fish Tracking for Moving Cameras Based on Deformable Multiple Kernels. IEEE Trans. Syst. Man Cybern. Syst. 2016, PP, 1–11. [Google Scholar] [CrossRef]
- Lee, D.; Kim, G.; Kim, D.; Myung, H.; Choi, H.-T. Vision-based object detection and tracking for autonomous navigation of underwater robots. Ocean Eng. 2012, 48, 59–68. [Google Scholar] [CrossRef]
- Shiau, Y.-H.; Chen, C.-C.; Lin, S.-I. Using bounding-surrounding boxes method for fish tracking in real world underwater observation. Int. J. Adv. Robot. Syst. 2013, 10, 261–270. [Google Scholar] [CrossRef]
- Li, M.; Guo, S.; Hirata, H.; Ishihara, H. A roller-skating/walking mode-based amphibious robot. Rob. Comput. Integr. Manuf. 2017, 44, 17–29. [Google Scholar] [CrossRef]
- Li, Y.; Guo, S. Communication between spherical underwater robots based on the acoustic communication methods. In Proceedings of the 2016 IEEE International Conference on Mechatronics and Automation (ICMA), Harbin, China, 7–10 August 2016. [Google Scholar]
- Pan, S.; Shi, L.; Guo, S.; Guo, P.; He, Y.; Xiao, R. A low-power SoC-based moving target detection system for amphibious spherical robots. In Proceedings of the 2015 IEEE International Conference on Mechatronics and Automation (ICMA), Beijing, China, 2–5 August 2015. [Google Scholar]
- Pan, S.; Shi, L.; Guo, S. A Kinect-based real-time compressive tracking prototype system for amphibious spherical robots. Sensors 2015, 15, 8232–8252. [Google Scholar] [CrossRef] [PubMed]
- Crockett, L.H.; Elliot, R.A.; Enderwitz, M.A.; Stewart, R.W. The Zynq Book: Embedded Processing with the ARM Cortex-A9 on the Xilinx Zynq-7000 All Programmable SoC; Strathclyde Academic Media: Strathclyde, Scotland, 2014; pp. 15–21. [Google Scholar]
- Schechner, Y.Y.; Karpel, N. Clear underwater vision. In Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2004), Washington, DC, USA, 27 June–2 July 2004. [Google Scholar]
- Roser, M.; Dunbabin, M.; Geiger, A. Simultaneous underwater visibility assessment, enhancement and improved stereo. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014. [Google Scholar]
- Jobson, D.J.; Rahman, Z.U.; Woodell, G.A. A multiscale retinex for bridging the gap between color images and the human observation of scenes. IEEE Trans. Image Process. 1997, 6, 965–976. [Google Scholar] [CrossRef] [PubMed]
- Xiao, S.; Li, Y. Fast multiscale Retinex algorithm of image haze removal with color fidelity. Comput. Eng. Appl. 2015, 51, 176–179. [Google Scholar]
- Liu, Y.; Yao, H.; Gao, W.; Chen, X.; Zhao, D. Nonparametric background generation. J. Vis. Commun. Image Represent. 2007, 18, 253–263. [Google Scholar] [CrossRef]
- Negrea, C.; Thompson, D.E.; Juhnke, S.D.; Fryer, D.S.; Loge, F.J. Automated detection and tracking of adult pacific lampreys in underwater video collected at snake and Columbia River fishways. North Am. J. Fish. Manag. 2014, 34, 111–118. [Google Scholar] [CrossRef]
- KaewTraKulPong, P.; Bowden, R. An improved adaptive background mixture model for real-time tracking with shadow detection. In Video-Based Surveillance Systems; Springer: Berlin, Germany, 2002; pp. 135–144. [Google Scholar]
- Mukherjee, D.; Wu, Q.J.; Nguyen, T.M. Gaussian mixture model with advanced distance measure based on support weights and histogram of gradients for background suppression. IEEE Trans. Ind. Inform. 2014, 10, 1086–1096. [Google Scholar] [CrossRef]
- Ibarguren, A.; Martínez-Otzeta, J.M.; Maurtua, I. Particle filtering for industrial 6DOF visual servoing. J. Intell. Robot. Syst. 2014, 74, 689–696. [Google Scholar] [CrossRef]
- Yang, S.; Scherer, S.A.; Schauwecker, K.; Zell, A. Autonomous landing of MAVs on an arbitrarily textured landing site using onboard monocular vision. J. Intell. Robot. Syst. 2014, 74, 27–43. [Google Scholar] [CrossRef]
- Zhang, K.; Song, H. Real-time visual tracking via online weighted multiple instance learning. Pattern Recognit. 2013, 46, 397–411. [Google Scholar] [CrossRef]
- Henriques, J.F.; Caseiro, R.; Martins, P.; Batista, J. High-speed tracking with kernelized correlation filters. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 37, 583–596. [Google Scholar] [CrossRef] [PubMed]
- Zhang, K.; Liu, Q.; Wu, Y.; Yang, M.-H. Robust visual tracking via convolutional networks without training. IEEE Trans. Image Process. 2016, 25, 1779–1792. [Google Scholar] [CrossRef] [PubMed]
- Zhang, K.; Zhang, L.; Yang, M.-H. Fast compressive tracking. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 36, 2002–2015. [Google Scholar] [CrossRef] [PubMed]
- Wu, Y.; Lim, J.; Yang, M. Object tracking benchmark. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 37, 1834–1848. [Google Scholar] [CrossRef] [PubMed]
- Visual Tracker Benchmark. Available online: http://www.visual-tracking.net (accessed on 26 April 2017).
- Lei, F.; Zhang, X. Underwater target tracking based on particle filter. In Proceedings of the 2012 7th International Conference on Computer Science & Education (ICCSE), Melbourne, Australia, 14–17 July 2012. [Google Scholar]
- Walther, D.; Edgington, D.R.; Koch, C. Detection and tracking of objects in underwater video. In Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2004), Washington, DC, USA, 27 June–2 July 2004. [Google Scholar]
- Wang, N.; Shi, J.; Yeung, D.-Y.; Jia, J. Understanding and Diagnosing Visual Tracking Systems. In Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV 2015), Santiago, Chile, 7–13 December 2015. [Google Scholar]
- An Open Source Tracking Testbed and Evaluation Web Site. Available online: http://vision.cse.psu.edu/publications/pdfs/opensourceweb.pdf (accessed on 15 December 2016).
Vision System | Hardware Platform | Image Size | Maximum Frame Rate | Working Scenarios |
---|---|---|---|---|
Proposed System | SoC | 320 × 240 | 56.3 fps | Static and dynamic background |
Shiau et al. [19] | PC | 640 × 480 | 20.0 fps | Static background |
Chuang et al. [15] | PC | 2048 × 2048 | 5.0 fps | Dark environment |
Lei et al. [41] | PC | 352 × 288 | 3.3 fps | Swimming pool |
Walther et al. [42] | PC | 720 × 480 | 30.0 fps | Dark environment |
Sequence | PWC (Proposed) | Pr (Proposed) | PWC (GBM) | Pr (GBM) |
---|---|---|---|---|
Sequence 1 | 0.018 | 0.821 | 0.092 | 0.733 |
Sequence 2 | 0.069 | 0.675 | 0.183 | 0.484 |
Sequence 3 | 0.030 | 0.985 | 0.052 | 0.924 |
Sequence 4 | 0.254 | 0.784 | 0.382 | 0.564 |
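The detection scores above can be reproduced from binary foreground masks. This sketch assumes the conventional change-detection definitions of the two headers: PWC as the fraction of wrongly classified pixels and Pr as precision; the table itself does not define them, so treat these formulas as an assumption.

```python
import numpy as np

def pwc_and_precision(pred, truth):
    """Per-frame detection scores from binary foreground masks.

    pred, truth: arrays where nonzero marks foreground pixels.
    Returns (PWC, Pr) with PWC = (FP + FN) / total pixels (assumed
    definition) and Pr = TP / (TP + FP).
    """
    pred = np.asarray(pred).astype(bool)
    truth = np.asarray(truth).astype(bool)
    tp = np.count_nonzero(pred & truth)    # correctly detected foreground
    fp = np.count_nonzero(pred & ~truth)   # false alarms
    fn = np.count_nonzero(~pred & truth)   # missed foreground
    pwc = (fp + fn) / pred.size
    pr = tp / (tp + fp) if (tp + fp) > 0 else 0.0
    return pwc, pr
```

For a sequence, the per-frame scores would be averaged to give a single row of the table.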
Algorithm | Criteria | Sequence 1 | Sequence 2 | Sequence 3 | Sequence 4 |
---|---|---|---|---|---|
Proposed | SR (CLE) | 100 (11.7) | 91.8 (21.2) | 100 (17.8) | 100 (6.6) |
CT | SR (CLE) | 98.8 (13.8) | 87.1 (27.1) | 71.3 (27.1) | 100 (8.6) |
WMIL | SR (CLE) | 100 (12.2) | 77.3 (29.3) | 98.7 (20.1) | 92.1 (12.8) |
HOG-SVM | SR (CLE) | 100 (6.5) | 85.1 (27.4) | 100 (18.7) | 100 (3.7) |
TemplateMatch | SR (CLE) | 100 (6.1) | 84.7 (28.1) | 80.3 (23.1) | 100 (9.8) |
MeanShift | SR (CLE) | 10.3 (58.1) | 14.3 (53.2) | 35.4 (62.3) | 100 (9.8) |
VarianceRatio | SR (CLE) | 12.2 (59.2) | 50.6 (33.2) | 56.3 (36.7) | 100 (9.7) |
PeakDifference | SR (CLE) | 12.0 (58.9) | 72.1 (31.7) | 16.3 (67.2) | 100 (7.2) |
RatioShift | SR (CLE) | 11.9 (45.6) | 67.4 (28.2) | 3.2 (87.3) | 100 (8.4) |
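The tracking criteria in the table above can be computed from predicted and ground-truth target centers. This sketch assumes the common benchmark reading of the headers, SR as the success rate (percentage of frames whose error is under a pixel threshold) and CLE as the mean center location error; the 20-pixel threshold is an illustrative assumption, not a value taken from the paper.

```python
import numpy as np

def cle(pred_centers, gt_centers):
    """Center location error per frame: Euclidean distance between the
    predicted and ground-truth target centers (N x 2 arrays)."""
    diff = np.asarray(pred_centers, dtype=float) - np.asarray(gt_centers, dtype=float)
    return np.linalg.norm(diff, axis=1)

def success_rate(errors, threshold=20.0):
    """Percentage of frames whose CLE falls below the pixel threshold
    (threshold value is an assumption for illustration)."""
    errors = np.asarray(errors, dtype=float)
    return 100.0 * np.count_nonzero(errors < threshold) / errors.size
```

A table entry "SR (CLE)" would then correspond to `success_rate(errs)` and `errs.mean()` over one sequence.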
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Guo, S.; Pan, S.; Shi, L.; Guo, P.; He, Y.; Tang, K. Visual Detection and Tracking System for a Spherical Amphibious Robot. Sensors 2017, 17, 870. https://doi.org/10.3390/s17040870