Computer-Vision-Based Vibration Tracking Using a Digital Camera: A Sparse-Optical-Flow-Based Target Tracking Method
Abstract
1. Introduction
2. Target Tracking Methods: Background Literature
2.1. Sparse Optical Flow
2.2. Feature Matching
2.3. Dense Optical Flow
2.4. Template Matching
3. Proposed Method
4. Vision-Based Sensing System
4.1. Camera Calibration and Scale Conversion
4.2. Frame Tracking Strategies and Displacement Calculation
5. Experimental Setup for Measurement
5.1. Experimental Three-Story Shear Building Structure
5.2. Excitation System
5.3. Reference System
5.4. Vision Sensor System
5.4.1. Data Acquisition
5.4.2. Data Processing
6. Qualitative and Quantitative Assessment of Proposed Method
6.1. Implementation of the Proposed Method
6.2. Comparison with Existing Sparse Optical Flow Tracking Methods
6.3. Comparison with Existing Feature-Matching-Based Tracking Methods
6.4. Comparison with Existing Dense-Optical-Flow-Based Target Tracking Methods
6.5. Comparison with Existing Template-Matching-Based Target Tracking Methods
7. Discussion of Results
7.1. Effect of ROI Selection and Outlier Removal Methods
7.2. Effect of Excitation Frequency on the Accuracy of Vision-Based Methods
7.3. Effect of Frame Rate and Frame Strategy
7.4. Effect of Sparse Optical Flow versus Feature-Matching Technique on Keypoint Tracking
8. Summary and Conclusions
- The proposed sparse-optical-flow-based target tracking method is enhanced by several components: the ORB keypoint detector, a multi-level (pyramidal) optical flow algorithm, and outlier removal techniques. Existing sparse-optical-flow-based computer vision methods are known to struggle with tracking large displacements. This limitation is addressed through the use of two outlier removal methods and multi-point movement tracking, which together provide a comprehensive assessment of the structural response. The comparison results in Table 1 show that the proposed method achieves higher accuracy than existing methods for cases with larger displacement amplitudes and comparable accuracy in all other cases.
- The proposed vision-based target tracking method is validated on a three-story shear building experimental setup. The structure is subjected to transient vibrations at three excitation frequencies with varying amplitudes. Figure 8 compares the calculated and measured acceleration time histories; the target tracking method detects the structural motion and calculates its acceleration at numerous points on the structure with high accuracy.
- Other computer-vision-based methods, namely dense-optical-flow-based, feature-matching-based, and template-matching-based target tracking, are compared with the proposed methodology to assess its accuracy. The limitations of the existing methodologies and the proposed enhancements are summarized as follows:
- Template-matching-based target tracking approaches have an inherent disadvantage: adverse factors such as partial occlusion, shape deformation, and rotation can hinder the detection of predefined templates. The proposed sparse-optical-flow-based method therefore tracks various keypoints on the vibrating structure without the use of external templates. As shown in Table 4, the proposed method achieves higher accuracy than the existing template-matching-based target tracking approaches.
- A related keypoint tracking approach, the feature-matching-based target tracking method, is also compared. However, the keypoint detectors implemented as part of this existing method are sensitive to the illumination, scale, blur, and compression of the images captured during structural vibrations. In the proposed sparse-optical-flow-based method, these limitations are mitigated by using the ORB (FAST-based) keypoint detector in combination with the LK algorithm, which yields a higher number of matched keypoints (Table 2).
- Additionally, Table 3 shows that the proposed method performs similarly to the existing dense-optical-flow-based technique, which compares predefined ROI templates without any outlier removal. At lower excitation frequencies, however, the proposed technique with outlier removal outperforms the existing dense-optical-flow-based method.
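The core tracking operation shared by the sparse-optical-flow methods compared above is the Lucas–Kanade (LK) step: within a window around each keypoint, the flow vector is the least-squares solution of the brightness-constancy equations. As an illustration only — a single-level, pure-NumPy sketch, not the pyramidal OpenCV implementation used in the study — one LK step can be written as:

```python
import numpy as np

def lk_step(I1, I2, center, win=15):
    """One Lucas-Kanade step: least-squares flow (u, v) for the window
    around `center` = (row, col), from frame I1 to frame I2."""
    r, c = center
    sl = (slice(r - win, r + win + 1), slice(c - win, c + win + 1))
    Iy, Ix = np.gradient(I1)          # spatial gradients of the first frame
    It = I2 - I1                      # temporal gradient
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()               # brightness constancy: [Ix Iy] @ [u v] = -It
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v
```

For larger motions, this step is applied at several pyramid levels (the "multi-level" algorithm) so that the displacement at each level stays small enough for the linearization to hold.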
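The outlier removal discussed above relies on consensus: under a rigid structural motion, most keypoint displacement vectors in an ROI should agree, and vectors that disagree are rejected. A minimal sketch of this idea, in the spirit of MLESAC/RANSAC but simplified to a pure-translation motion model (the function name and parameters are illustrative, not the study's implementation):

```python
import numpy as np

def consensus_translation(vecs, eps=1.0, iters=100, seed=0):
    """RANSAC-style outlier rejection for keypoint displacement vectors:
    sample a hypothesis vector, count vectors within `eps` of it, and
    keep the largest consensus set. Returns (mean inlier motion, mask)."""
    rng = np.random.default_rng(seed)
    vecs = np.asarray(vecs, dtype=float)
    best = np.zeros(len(vecs), dtype=bool)
    for _ in range(iters):
        h = vecs[rng.integers(len(vecs))]              # one-sample hypothesis
        inliers = np.linalg.norm(vecs - h, axis=1) < eps
        if inliers.sum() > best.sum():                 # keep largest consensus
            best = inliers
    return vecs[best].mean(axis=0), best
```

A bidirectional (forward–backward) error check serves the same purpose per keypoint: a point tracked forward and then backward should return to its starting position, and points that do not are discarded.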
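The accelerations compared in Figure 8 are obtained from the tracked displacement time histories. One common route — shown here as a hedged sketch, since the authors' exact differentiation and filtering procedure is described in the body of the paper — is double numerical differentiation of the scaled displacement signal:

```python
import numpy as np

def acceleration_from_displacement(disp, fs):
    """Differentiate a tracked displacement time history twice
    (central differences) to obtain the acceleration time history.
    `disp` is in physical units (after scale conversion), `fs` in Hz."""
    dt = 1.0 / fs
    vel = np.gradient(disp, dt)   # displacement -> velocity
    acc = np.gradient(vel, dt)    # velocity -> acceleration
    return acc
```

In practice the displacement signal is low-pass filtered before differentiation, since differentiation amplifies high-frequency tracking noise.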
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Dong, C.Z.; Celik, O.; Catbas, F.N.; O’Brien, E.J.; Taylor, S. Structural displacement monitoring using deep learning-based full field optical flow methods. Struct. Infrastruct. Eng. 2020, 16, 51–71. [Google Scholar] [CrossRef]
- Fradelos, Y.; Thalla, O.; Biliani, I.; Stiros, S. Study of lateral displacements and the natural frequency of a pedestrian bridge using low-cost cameras. Sensors 2020, 20, 3217. [Google Scholar] [CrossRef] [PubMed]
- Kalybek, M.; Bocian, M.; Nikitas, N. Performance of Optical Structural Vibration Monitoring Systems in Experimental Modal Analysis. Sensors 2021, 21, 1239. [Google Scholar] [CrossRef] [PubMed]
- Feng, D.; Feng, M.Q. Computer Vision for Structural Dynamics and Health Monitoring; John Wiley & Sons: Hoboken, NJ, USA, 2021. [Google Scholar]
- Chen, X.; Ling, J.; Wang, S.; Yang, Y.; Luo, L.; Yan, Y. Ship detection from coastal surveillance videos via an ensemble Canny-Gaussian-morphology framework. J. Navig. 2021, 74, 1252–1266. [Google Scholar] [CrossRef]
- Zhang, G.; Li, H.; Shen, S.; Trinh, T.; He, F.; Talke, F.E. Effect of Track-Seeking Motion on Off-Track Vibrations of the Head-Gimbal Assembly in HDDs. IEEE Trans. Magn. 2018, 54, 1–6. [Google Scholar] [CrossRef]
- Khuc, T.; Catbas, F.N. Computer vision-based displacement and vibration monitoring without using physical target on structures. Struct. Infrastruct. Eng. 2017, 13, 505–516. [Google Scholar] [CrossRef]
- Dong, C.Z.; Bas, S.; Catbas, F.N. Investigation of vibration serviceability of a footbridge using computer vision-based methods. Eng. Struct. 2020, 224, 111224. [Google Scholar] [CrossRef]
- Hoskere, V.; Park, J.W.; Yoon, H.; Spencer, B.F., Jr. Vision-based modal survey of civil infrastructure using unmanned aerial vehicles. J. Struct. Eng. 2019, 145, 04019062. [Google Scholar] [CrossRef]
- Choi, H.; Kang, B.; Kim, D. Moving Object Tracking Based on Sparse Optical Flow with Moving Window and Target Estimator. Sensors 2022, 22, 2878. [Google Scholar] [CrossRef]
- Lucas, B.D.; Kanade, T. An Iterative Image Registration Technique with an Application to Stereo Vision. In Proceedings of the DARPA Image Understanding Workshop, Vancouver, BC, Canada, 24–28 August 1981; pp. 121–130. [Google Scholar]
- Xu, Y.; Brownjohn, J.M. Review of machine-vision based methodologies for displacement measurement in civil structures. J. Civ. Struct. Health Monit. 2018, 8, 91–110. [Google Scholar] [CrossRef]
- Khaloo, A.; Lattanzi, D. Pixel-wise structural motion tracking from rectified repurposed videos. Struct. Control Health Monit. 2017, 24, e2009. [Google Scholar] [CrossRef]
- Kalybek, M.; Bocian, M.; Pakos, W.; Grosel, J.; Nikitas, N. Performance of Camera-Based Vibration Monitoring Systems in Input-Output Modal Identification Using Shaker Excitation. Remote Sens. 2021, 13, 3471. [Google Scholar] [CrossRef]
- Hosseinzadeh, A.Z.; Harvey, P., Jr. Pixel-based operating modes from surveillance videos for structural vibration monitoring: A preliminary experimental study. Measurement 2019, 148, 106911. [Google Scholar] [CrossRef]
- Lydon, D.; Lydon, M.; Taylor, S.; Del Rincon, J.M.; Hester, D.; Brownjohn, J. Development and field testing of a vision-based displacement system using a low cost wireless action camera. Mech. Syst. Signal Process. 2019, 121, 343–358. [Google Scholar] [CrossRef]
- Yoon, H.; Elanwar, H.; Choi, H.; Golparvar-Fard, M.; Spencer, B.F., Jr. Target-free approach for vision-based structural system identification using consumer-grade cameras. Struct. Control Health Monit. 2016, 23, 1405–1416. [Google Scholar] [CrossRef]
- Dong, C.Z.; Celik, O.; Catbas, F.N. Marker-free monitoring of the grandstand structures and modal identification using computer vision methods. Struct. Health Monit. 2019, 18, 1491–1509. [Google Scholar] [CrossRef]
- Song, Y.Z.; Bowen, C.R.; Kim, A.H.; Nassehi, A.; Padget, J.; Gathercole, N. Virtual visual sensors and their application in structural health monitoring. Struct. Health Monit. 2014, 13, 251–264. [Google Scholar] [CrossRef]
- Dong, C.Z.; Catbas, F.N. A non-target structural displacement measurement method using advanced feature matching strategy. Adv. Struct. Eng. 2019, 22, 3461–3472. [Google Scholar] [CrossRef]
- Khuc, T.; Catbas, F.N. Completely contactless structural health monitoring of real-life structures using cameras and computer vision. Struct. Control Health Monit. 2017, 24, e1852. [Google Scholar] [CrossRef]
- Ehrhart, M.; Lienhart, W. Development and evaluation of a long range image-based monitoring system for civil engineering structures. In Proceedings of the Structural Health Monitoring and Inspection of Advanced Materials, Aerospace, and Civil Infrastructure 2015, International Society for Optics and Photonics, San Diego, CA, USA, 9–12 March 2015; Volume 9437, p. 94370K. [Google Scholar]
- Celik, O.; Dong, C.Z.; Catbas, F.N. A computer vision approach for the load time history estimation of lively individuals and crowds. Comput. Struct. 2018, 200, 32–52. [Google Scholar] [CrossRef]
- Horn, B.K.; Schunck, B.G. Determining optical flow. Artif. Intell. 1981, 17, 185–203. [Google Scholar] [CrossRef]
- Black, M.J.; Anandan, P. The robust estimation of multiple motions: Parametric and piecewise-smooth flow fields. Comput. Vis. Image Underst. 1996, 63, 75–104. [Google Scholar] [CrossRef]
- Sun, D.; Roth, S.; Black, M.J. Secrets of optical flow estimation and their principles. In Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Francisco, CA, USA, 13–18 June 2010; pp. 2432–2439. [Google Scholar]
- Farneback, G. Very high accuracy velocity estimation using orientation tensors, parametric motion, and simultaneous segmentation of the motion field. In Proceedings of the Eighth IEEE International Conference on Computer Vision, ICCV 2001, Vancouver, BC, Canada, 7–14 July 2001; Volume 1, pp. 171–177. [Google Scholar]
- Farneback, G. Fast and accurate motion estimation using orientation tensors and parametric motion models. In Proceedings of the 15th International Conference on Pattern Recognition, ICPR-2000, Barcelona, Spain, 3–8 September 2000; Volume 1, pp. 135–139. [Google Scholar]
- Farnebäck, G. Two-frame motion estimation based on polynomial expansion. In Scandinavian Conference on Image Analysis; Springer: Berlin/Heidelberg, Germany, 2003; pp. 363–370. [Google Scholar]
- Won, J.; Park, J.W.; Park, K.; Yoon, H.; Moon, D.S. Non-target structural displacement measurement using reference frame-based deepflow. Sensors 2019, 19, 2992. [Google Scholar] [CrossRef]
- Revaud, J.; Weinzaepfel, P.; Harchaoui, Z.; Schmid, C. Deepmatching: Hierarchical deformable dense matching. Int. J. Comput. Vis. 2016, 120, 300–323. [Google Scholar] [CrossRef]
- Weinzaepfel, P.; Revaud, J.; Harchaoui, Z.; Schmid, C. DeepFlow: Large displacement optical flow with deep matching. In Proceedings of the IEEE International Conference on Computer Vision, Sydney, Australia, 1–8 December 2013; pp. 1385–1392. [Google Scholar]
- Ilg, E.; Mayer, N.; Saikia, T.; Keuper, M.; Dosovitskiy, A.; Brox, T. Flownet 2.0: Evolution of optical flow estimation with deep networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 2462–2470. [Google Scholar]
- Guo, J. Dynamic displacement measurement of large-scale structures based on the Lucas–Kanade template tracking algorithm. Mech. Syst. Signal Process. 2016, 66, 425–436. [Google Scholar] [CrossRef]
- Xu, Y.; Brownjohn, J.M.; Huseynov, F. Accurate deformation monitoring on bridge structures using a cost-effective sensing system combined with a camera and accelerometers: Case study. J. Bridge Eng. 2019, 24, 05018014. [Google Scholar] [CrossRef]
- Xu, Y.; Brownjohn, J.; Hester, D.; Koo, K. Dynamic displacement measurement of a long span bridge using vision-based system. In Proceedings of the 8th European Workshop On Structural Health Monitoring (EWSHM 2016), Bilbao, Spain, 5–8 July 2016. [Google Scholar]
- Stephen, G.; Brownjohn, J.; Taylor, C. Measurements of static and dynamic displacement from visual monitoring of the Humber Bridge. Eng. Struct. 1993, 15, 197–208. [Google Scholar] [CrossRef]
- Feng, D.; Feng, M.Q.; Ozer, E.; Fukuda, Y. A vision-based sensor for noncontact structural displacement measurement. Sensors 2015, 15, 16557–16575. [Google Scholar] [CrossRef]
- Ngeljaratan, L.; Moustafa, M.A. Structural health monitoring and seismic response assessment of bridge structures using target-tracking digital image correlation. Eng. Struct. 2020, 213, 110551. [Google Scholar] [CrossRef]
- Poozesh, P.; Sabato, A.; Sarrafi, A.; Niezrecki, C.; Avitabile, P.; Yarala, R. Multicamera measurement system to evaluate the dynamic response of utility-scale wind turbine blades. Wind Energy 2020, 23, 1619–1639. [Google Scholar] [CrossRef]
- Liu, B.; Zhang, D.; Guo, J.; Zhu, C. Vision-based displacement measurement sensor using modified Taylor approximation approach. Opt. Eng. 2016, 55, 114103. [Google Scholar] [CrossRef]
- Omidalizarandi, M.; Kargoll, B.; Paffenholz, J.A.; Neumann, I. Accurate vision-based displacement and vibration analysis of bridge structures by means of an image-assisted total station. Adv. Mech. Eng. 2018, 10, 1687814018780052. [Google Scholar] [CrossRef]
- Guo, J.; Jiao, J.; Fujita, K.; Takewaki, I. Damage identification for frame structures using vision-based measurement. Eng. Struct. 2019, 199, 109634. [Google Scholar] [CrossRef]
- Zhang, D.; Guo, J.; Lei, X.; Zhu, C. A high-speed vision-based sensor for dynamic vibration analysis using fast motion extraction algorithms. Sensors 2016, 16, 572. [Google Scholar] [CrossRef]
- Zhong, J.; Zhong, S.; Zhang, Q.; Zhuang, Y.; Lu, H.; Fu, X. Vision-based measurement system for structural vibration monitoring using non-projection quasi-interferogram fringe density enhanced by spectrum correction method. Meas. Sci. Technol. 2016, 28, 015903. [Google Scholar] [CrossRef]
- Alipour, M.; Washlesky, S.J.; Harris, D.K. Field deployment and laboratory evaluation of 2D digital image correlation for deflection sensing in complex environments. J. Bridge Eng. 2019, 24, 04019010. [Google Scholar] [CrossRef]
- Harmanci, Y.E.; Gülan, U.; Holzner, M.; Chatzi, E. A novel approach for 3D-structural identification through video recording: Magnified tracking. Sensors 2019, 19, 1229. [Google Scholar] [CrossRef]
- Aoyama, T.; Li, L.; Jiang, M.; Takaki, T.; Ishii, I.; Yang, H.; Umemoto, C.; Matsuda, H.; Chikaraishi, M.; Fujiwara, A. Vision-based modal analysis using multiple vibration distribution synthesis to inspect large-scale structures. J. Dyn. Syst. Meas. Control 2019, 141. [Google Scholar] [CrossRef]
- Rublee, E.; Rabaud, V.; Konolige, K.; Bradski, G. ORB: An efficient alternative to SIFT or SURF. In Proceedings of the 2011 International Conference on Computer Vision, Barcelona, Spain, 6–13 November 2011; pp. 2564–2571. [Google Scholar]
- Nie, G.Y.; Cheng, M.M.; Liu, Y.; Liang, Z.; Fan, D.P.; Liu, Y.; Wang, Y. Multi-Level Context Ultra-Aggregation for Stereo Matching. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019. [Google Scholar]
- Rosten, E.; Drummond, T. Machine Learning for High-Speed Corner Detection; European Conference on Computer Vision; Springer: Berlin/Heidelberg, Germany, 2006; pp. 430–443. [Google Scholar]
- Calonder, M.; Lepetit, V.; Strecha, C.; Fua, P. Brief: Binary Robust Independent Elementary Features; European Conference on Computer Vision; Springer: Berlin/Heidelberg, Germany, 2010; pp. 778–792. [Google Scholar]
- Adelson, E.H.; Anderson, C.H.; Bergen, J.R.; Burt, P.J.; Ogden, J.M. Pyramid methods in image processing. RCA Eng. 1984, 29, 33–41. [Google Scholar]
- Pulli, K.; Baksheev, A.; Kornyakov, K.; Eruhimov, V. Real-time computer vision with OpenCV. Commun. ACM 2012, 55, 61–69. [Google Scholar] [CrossRef]
- Han, L.; Li, Z.; Zhong, K.; Cheng, X.; Luo, H.; Liu, G.; Shang, J.; Wang, C.; Shi, Y. Vibration detection and motion compensation for multi-frequency phase-shifting-based 3d sensors. Sensors 2019, 19, 1368. [Google Scholar] [CrossRef]
- Sandhu, H.K. Artificial Intelligence Based Condition Monitoring of Nuclear Piping-Equipment Systems. Ph.D. Thesis, North Carolina State University, Raleigh, NC, USA, 2021. [Google Scholar]
- Bodda, S.S.; Keller, M.; Gupta, A.; Senfaute, G. A Methodological Approach to Update Ground Motion Prediction Models Using Bayesian Inference. Pure Appl. Geophys. 2022, 179, 247–264. [Google Scholar] [CrossRef]
- Jiang, S.; Campbell, D.; Lu, Y.; Li, H.; Hartley, R. Learning to Estimate Hidden Motions with Global Motion Aggregation. arXiv 2021, arXiv:2104.02409. [Google Scholar]
- Garrido-Jurado, S.; Muñoz-Salinas, R.; Madrid-Cuevas, F.J.; Marín-Jiménez, M.J. Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recognit. 2014, 47, 2280–2292. [Google Scholar] [CrossRef]
- Zamora, J.; Fortino, G. Tracking algorithms for TPCs using consensus-based robust estimators. Nucl. Instrum. Methods Phys. Res. Sect. A Accel. Spectrometers Detect. Assoc. Equip. 2021, 988, 164899. [Google Scholar] [CrossRef]
- Liu, C.; Xu, J.; Wang, F. A Review of Keypoints’ Detection and Feature Description in Image Registration. Sci. Program. 2021, 2021, 8509164. [Google Scholar] [CrossRef]
| Freq. | Method | Bottom (%) | Middle (%) | Top (%) |
|---|---|---|---|---|
| 2 Hz | Shi–Tomasi corner + LK [10,14] | 0.0184 (+5.747) | 0.0170 (+11.842) | 0.0216 (+4.854) |
| | Harris corner + LK [9] | 0.0181 (+4.023) | 0.0170 (+11.842) | 0.0212 (+2.913) |
| | SURF + LK [15] | 0.0253 (+45.402) | 0.0165 (+8.553) | 0.0226 (+9.709) |
| | SURF + LK + MLESAC [16] | 0.0178 (+2.299) | 0.0166 (+9.211) | 0.0226 (+9.709) |
| | SURF + LK + Bidir. error [18] | 0.0177 (+1.724) | 0.0166 (+9.211) | 0.0224 (+8.738) |
| | SURF + MLK | 0.0321 (+84.483) | 0.0162 (+6.579) | 0.0213 (+3.398) |
| | Proposed | 0.0174 (+0) | 0.0152 (+0) | 0.0206 (+0) |
| 5 Hz | Shi–Tomasi corner + LK [10,14] | 0.0217 (+1.878) | 0.1225 (+151.540) | 0.2668 (+352.971) |
| | Harris corner + LK [9] | 0.0218 (+2.347) | 0.1267 (+160.164) | 0.2851 (+384.041) |
| | SURF + LK [15] | 0.0272 (+27.700) | 0.1307 (+168.378) | 0.2815 (+377.929) |
| | SURF + LK + MLESAC [16] | 0.0222 (+4.225) | 0.1338 (+174.743) | 0.2799 (+375.212) |
| | SURF + LK + Bidir. error [18] | 0.0223 (+4.695) | 0.1320 (+171.047) | 0.2740 (+365.195) |
| | SURF + MLK | 0.0264 (+23.944) | 0.1246 (+155.852) | 0.1492 (+153.311) |
| | Proposed | 0.0213 (+0) | 0.0487 (+0) | 0.0589 (+0) |
| 10 Hz | Shi–Tomasi corner + LK [10,14] | 0.5975 (+150.945) | 0.0639 (+0.157) | 0.3693 (+122.336) |
| | Harris corner + LK [9] | 0.6058 (+154.431) | 0.0638 (+0) | 0.3731 (+124.624) |
| | SURF + LK [15] | 0.5944 (+149.643) | 0.0692 (+8.464) | 0.4062 (+144.551) |
| | SURF + LK + MLESAC [16] | 0.6112 (+156.699) | 0.0686 (+7.524) | 0.4113 (+147.622) |
| | SURF + LK + Bidir. error [18] | 0.6070 (+154.935) | 0.0687 (+7.680) | 0.4097 (+146.659) |
| | SURF + MLK | 0.2830 (+18.858) | 0.0661 (+3.605) | 0.1886 (+13.546) |
| | Proposed | 0.2381 (+0) | 0.0647 (+1.411) | 0.1661 (+0) |
| Freq. | Method | Bottom (%) | Middle (%) | Top (%) |
|---|---|---|---|---|
| 2 Hz | FM-Fixed | 0.0243 (+39.655) | 0.0235 (+54.605) | 0.0266 (+29.126) |
| | FM-Updated | 0.0407 (+133.908) | 0.0349 (+129.605) | 0.0366 (+77.670) |
| | Proposed | 0.0174 (+0) | 0.0152 (+0) | 0.0206 (+0) |
| 5 Hz | FM-Fixed | 0.0839 (+293.897) | 0.0862 (+77.002) | 0.0950 (+61.290) |
| | FM-Updated | 0.0725 (+240.376) | 0.0798 (+63.860) | 0.0813 (+38.031) |
| | Proposed | 0.0213 (+0) | 0.0487 (+0) | 0.0589 (+0) |
| 10 Hz | FM-Fixed | 0.2499 (+4.956) | 0.1071 (+65.533) | 0.1841 (+10.837) |
| | FM-Updated | 0.2441 (+2.520) | 0.0977 (+51.005) | 0.1759 (+5.900) |
| | Proposed | 0.2381 (+0) | 0.0647 (+0) | 0.1661 (+0) |
| Freq. | Method | Bottom (%) | Middle (%) | Top (%) |
|---|---|---|---|---|
| 2 Hz | DOF-Updated | 0.0178 (+2.299) | 0.0173 (+13.816) | 0.0221 (+7.282) |
| | Proposed | 0.0174 (+0) | 0.0152 (+0) | 0.0206 (+0) |
| 5 Hz | DOF-Updated | 0.0247 (+15.962) | 0.0582 (+19.507) | 0.0619 (+5.093) |
| | Proposed | 0.0213 (+0) | 0.0487 (+0) | 0.0589 (+0) |
| 10 Hz | DOF-Updated | 0.2382 (+0.042) | 0.0637 (+0) | 0.1670 (+0.542) |
| | Proposed | 0.2381 (+0) | 0.0647 (+1.570) | 0.1661 (+0) |
| Freq. | Method | Bottom (%) | Middle (%) | Top (%) |
|---|---|---|---|---|
| 2 Hz | Marker-Fixed | 0.0186 (+6.897) | 0.0173 (+13.816) | 0.0219 (+6.311) |
| | Marker-Updated | 0.0341 (+95.977) | 0.0414 (+172.368) | 0.0245 (+18.932) |
| | Proposed | 0.0174 (+0) | 0.0152 (+0) | 0.0206 (+0) |
| 5 Hz | Marker-Fixed | 0.0357 (+67.606) | 0.0583 (+19.713) | 0.0699 (+18.676) |
| | Marker-Updated | 0.0512 (+140.376) | 0.1938 (+297.947) | 0.1472 (+149.915) |
| | Proposed | 0.0213 (+0) | 0.0487 (+0) | 0.0589 (+0) |
| 10 Hz | Marker-Fixed | 0.2391 (+0.420) | 0.0707 (+9.274) | 0.1704 (+2.589) |
| | Marker-Updated | 0.6249 (+162.453) | 0.1367 (+111.283) | 0.1697 (+2.167) |
| | Proposed | 0.2381 (+0) | 0.0647 (+1.570) | 0.1661 (+0) |
| Method | Freq. | Bottom (%) | Middle (%) | Top (%) | ROI Size (Pixels) | Image Processing Speed (fps) |
|---|---|---|---|---|---|---|
| ORB + MLK + MLESAC | 2 Hz | 0.0177 (+2.299) | 0.0152 (+0) | 0.0204 (+0) | | 17.13 |
| | 5 Hz | 0.0211 (+1.442) | 0.0721 (+48.049) | 0.0608 (+3.932) | | |
| | 10 Hz | 0.2477 (+4.032) | 0.0740 (+16.352) | 0.1685 (+1.445) | | |
| ORB + MLK + Bidir. error | 2 Hz | 0.0173 (+0) | 0.0152 (+0) | 0.0204 (+0) | | 8.89 |
| | 5 Hz | 0.0208 (+0) | 0.0656 (+34.702) | 0.0585 (+0) | | |
| | 10 Hz | 0.2476 (+3.990) | 0.0636 (+0) | 0.1681 (+1.204) | | |
| Proposed | 2 Hz | 0.0174 (+0.578) | 0.0152 (+0) | 0.0206 (+0.980) | | 13.77 |
| | 5 Hz | 0.0213 (+2.404) | 0.0487 (+0) | 0.0589 (+0.684) | | |
| | 10 Hz | 0.2381 (+0) | 0.0647 (+1.700) | 0.1661 (+0) | | |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Nie, G.-Y.; Bodda, S.S.; Sandhu, H.K.; Han, K.; Gupta, A. Computer-Vision-Based Vibration Tracking Using a Digital Camera: A Sparse-Optical-Flow-Based Target Tracking Method. Sensors 2022, 22, 6869. https://doi.org/10.3390/s22186869