Sparse Optical Flow Implementation Using a Neural Network for Low-Resolution Thermal Aerial Imaging
Abstract
1. Introduction
2. Related Work
3. Motivations and Contribution
4. Optical Flow with Thermal Imaging
4.1. Thermal Sensor
4.2. Automatic Gain Control
5. Sparse and Dense Optical Flow Techniques in UAV Navigation
Feature Extraction
6. Thermal Dataset Availability
6.1. Dataset 1
6.2. Dataset 2
6.3. Training and Validating Sets
6.4. Generated Ground Truth from the Thermal Dataset
7. The RAFT_s Model
7.1. RGB Optical Flow Dataset
7.1.1. MPI-Sintel
7.1.2. Flying Chairs
7.2. Train the Model
7.3. Evaluation Methodology
8. Results
8.1. Signal Accuracy
8.1.1. Dataset 1
8.1.2. Dataset 2 during High Thermal Contrast Conditions
8.1.3. Dataset 2 during Cold-Soaked Low Thermal Conditions
8.2. Effect of Cropping Window and Number of Features on Accuracy and Processing Time
8.2.1. Case 1: Constant Cropping Window
8.2.2. Case 2: Constant Number of Features
9. Discussion
10. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Sample Availability
Abbreviations
Abbreviation | Definition
---|---
LWIR | Long Wavelength Infrared
AGC | Automatic Gain Control
FFC | Flat Field Correction
 | The Image Interpolation Algorithm
UAV | Unmanned Aerial Vehicle
LK | The Lucas–Kanade Algorithm
References
- Nguyen, T.X.B.; Rosser, K.; Chahl, J. A Review of Modern Thermal Imaging Sensor Technology and Applications for Autonomous Aerial Navigation. J. Imaging 2021, 7, 217. [Google Scholar] [CrossRef] [PubMed]
- Chahl, J.S.; Srinivasan, M.V.; Zhang, S.W. Landing strategies in honeybees and applications to uninhabited airborne vehicles. Int. J. Robot. Res. 2004, 23, 101–110. [Google Scholar] [CrossRef]
- Chahl, J.; Mizutani, A. Biomimetic attitude and orientation sensors. IEEE Sens. J. 2010, 12, 289–297. [Google Scholar] [CrossRef]
- Conroy, J.; Gremillion, G.; Ranganathan, B.; Humbert, J.S. Implementation of wide-field integration of optic flow for autonomous quadrotor navigation. Auton. Robot. 2009, 27, 189. [Google Scholar] [CrossRef]
- Rosser, K.; Chahl, J. Reducing the complexity of visual navigation: Optical track controller for long-range unmanned aerial vehicles. J. Field Robot. 2019, 36, 1118–1140. [Google Scholar] [CrossRef]
- Miller, A.; Miller, B.; Popov, A.; Stepanyan, K. Optical Flow as a navigation means for UAV. In Proceedings of the 2018 Australian & New Zealand Control Conference (ANZCC), Melbourne, Australia, 7–8 December 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 302–307. [Google Scholar]
- Gan, S.K.; Sukkarieh, S. Multi-UAV target search using explicit decentralized gradient-based negotiation. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 751–756. [Google Scholar]
- Perera, A.G.; Law, Y.W.; Chahl, J. Human pose and path estimation from aerial video using dynamic classifier selection. Cogn. Comput. 2018, 10, 1019–1041. [Google Scholar] [CrossRef] [Green Version]
- Luo, F.; Jiang, C.; Yu, S.; Wang, J.; Li, Y.; Ren, Y. Stability of cloud-based UAV systems supporting big data acquisition and processing. IEEE Trans. Cloud Comput. 2017, 7, 866–877. [Google Scholar] [CrossRef]
- Wang, J.; Jiang, C.; Ni, Z.; Guan, S.; Yu, S.; Ren, Y. Reliability of cloud controlled multi-UAV systems for on-demand services. In Proceedings of the GLOBECOM 2017–2017 IEEE Global Communications Conference, Singapore, 4–8 December 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1–6. [Google Scholar]
- Itkin, M.; Kim, M.; Park, Y. Development of cloud-based UAV monitoring and management system. Sensors 2016, 16, 1913. [Google Scholar] [CrossRef]
- Lee, J.; Wang, J.; Crandall, D.; Šabanović, S.; Fox, G. Real-time, cloud-based object detection for unmanned aerial vehicles. In Proceedings of the 2017 First IEEE International Conference on Robotic Computing (IRC), Taichung, Taiwan, 10–12 April 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 36–43. [Google Scholar]
- Horn, B.K.; Schunck, B.G. Determining optical flow. In Techniques and Applications of Image Understanding; International Society for Optics and Photonics: Bellingham, WA, USA, 1981; Volume 281, pp. 319–331. [Google Scholar]
- Srinivasan, M.V.; Zhang, S.W.; Chahl, J.S.; Stange, G.; Garratt, M. An overview of insect-inspired guidance for application in ground and airborne platforms. Proc. Inst. Mech. Eng. Part G J. Aerosp. Eng. 2004, 218, 375–388. [Google Scholar]
- Srinivasan, M.V.; Zhang, S.; Altwein, M.; Tautz, J. Honeybee navigation: Nature and calibration of the “odometer”. Science 2000, 287, 851–853. [Google Scholar] [CrossRef]
- Srinivasan, M.V. Honey bees as a model for vision, perception, and cognition. Annu. Rev. Entomol. 2010, 55, 267–284. [Google Scholar] [CrossRef] [PubMed]
- Garratt, M.A.; Chahl, J.S. Vision-based terrain following for an unmanned rotorcraft. J. Field Robot. 2008, 25, 284–301. [Google Scholar] [CrossRef]
- Esch, H.; Burns, J. Distance estimation by foraging honeybees. J. Exp. Biol. 1996, 199, 155–162. [Google Scholar] [CrossRef]
- Chahl, J.; Mizutani, A.; Strens, M.; Wehling, M. Autonomous Navigation Using Passive Sensors and Small Computers; Infotech@Aerospace: Arlington, VA, USA, 2005; p. 7013. [Google Scholar]
- Honegger, D.; Meier, L.; Tanskanen, P.; Pollefeys, M. An open source and open hardware embedded metric optical flow cmos camera for indoor and outdoor applications. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 1736–1741. [Google Scholar]
- Barrows, G.L.; Chahl, J.S.; Srinivasan, M.V. Biomimetic visual sensing and flight control. In Proceedings of the Seventeenth International Unmanned Air Vehicle Systems Conference, Bristol, UK, 30 March–1 April 2002; pp. 1–15. [Google Scholar]
- Franz, M.O.; Chahl, J.S.; Krapp, H.G. Insect-inspired estimation of egomotion. Neural Comput. 2004, 16, 2245–2260. [Google Scholar] [CrossRef]
- Garratt, M.; Chahl, J. Visual control of an autonomous helicopter. In Proceedings of the 41st Aerospace Sciences Meeting and Exhibit, Reno, NV, USA, 6–9 January 2003; p. 460. [Google Scholar]
- Farnebäck, G. Two-frame motion estimation based on polynomial expansion. In Proceedings of the Scandinavian Conference on Image Analysis, Halmstad, Sweden, 29 June–2 July 2003; Springer: Berlin/Heidelberg, Germany, 2003; pp. 363–370. [Google Scholar]
- Lucas, B.D.; Kanade, T. An iterative image registration technique with an application to stereo vision. In Proceedings of the 7th International Joint Conference on Artificial Intelligence (IJCAI), Vancouver, BC, Canada, 24–28 August 1981. [Google Scholar]
- Dabov, K.; Foi, A.; Katkovnik, V.; Egiazarian, K. Image denoising with block-matching and 3D filtering. In Image Processing: Algorithms and Systems, Neural Networks, and Machine Learning; International Society for Optics and Photonics: Bellingham, WA, USA, 2006; Volume 6064, p. 606414. [Google Scholar]
- Srinivasan, M.V. An image-interpolation technique for the computation of optic flow and egomotion. Biol. Cybern. 1994, 71, 401–415. [Google Scholar] [CrossRef]
- Dosovitskiy, A.; Fischer, P.; Ilg, E.; Hausser, P.; Hazirbas, C.; Golkov, V.; Van Der Smagt, P.; Cremers, D.; Brox, T. Flownet: Learning optical flow with convolutional networks. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 2758–2766. [Google Scholar]
- Ilg, E.; Mayer, N.; Saikia, T.; Keuper, M.; Dosovitskiy, A.; Brox, T. Flownet 2.0: Evolution of optical flow estimation with deep networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 2462–2470. [Google Scholar]
- Ranjan, A.; Black, M.J. Optical flow estimation using a spatial pyramid network. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 4161–4170. [Google Scholar]
- Grover, N.; Agarwal, N.; Kataoka, K. liteFlow: Lightweight and distributed flow monitoring platform for SDN. In Proceedings of the 2015 1st IEEE Conference on Network Softwarization (NetSoft), London, UK, 13–17 April 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 1–9. [Google Scholar]
- Sun, D.; Yang, X.; Liu, M.Y.; Kautz, J. Pwc-net: Cnns for optical flow using pyramid, warping, and cost volume. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 8934–8943. [Google Scholar]
- Teed, Z.; Deng, J. Raft: Recurrent all-pairs field transforms for optical flow. In Proceedings of the European Conference on Computer Vision, Glasgow, UK, 23–28 August 2020; Springer: Berlin/Heidelberg, Germany, 2020; pp. 402–419. [Google Scholar]
- Kwasniewska, A.; Szankin, M.; Ruminski, J.; Sarah, A.; Gamba, D. Improving Accuracy of Respiratory Rate Estimation by Restoring High Resolution Features with Transformers and Recursive Convolutional Models. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 3857–3867. [Google Scholar]
- Zhou, H.; Sun, M.; Ren, X.; Wang, X. Visible-Thermal Image Object Detection via the Combination of Illumination Conditions and Temperature Information. Remote Sens. 2021, 13, 3656. [Google Scholar] [CrossRef]
- Stypułkowski, K.; Gołda, P.; Lewczuk, K.; Tomaszewska, J. Monitoring system for railway infrastructure elements based on thermal imaging analysis. Sensors 2021, 21, 3819. [Google Scholar] [CrossRef]
- Gonzalez-Dugo, V.; Zarco-Tejada, P.; Nicolás, E.; Nortes, P.A.; Alarcón, J.; Intrigliolo, D.S.; Fereres, E. Using high resolution UAV thermal imagery to assess the variability in the water status of five fruit tree species within a commercial orchard. Precis. Agric. 2013, 14, 660–678. [Google Scholar] [CrossRef]
- Meron, M.; Sprintsin, M.; Tsipris, J.; Alchanatis, V.; Cohen, Y. Foliage temperature extraction from thermal imagery for crop water stress determination. Precis. Agric. 2013, 14, 467–477. [Google Scholar] [CrossRef]
- Alchanatis, V.; Cohen, Y.; Cohen, S.; Moller, M.; Sprinstin, M.; Meron, M.; Tsipris, J.; Saranga, Y.; Sela, E. Evaluation of different approaches for estimating and mapping crop water status in cotton with thermal imaging. Precis. Agric. 2010, 11, 27–41. [Google Scholar] [CrossRef]
- Weiss, C.; Kirmas, A.; Lemcke, S.; Böshagen, S.; Walter, M.; Eckstein, L.; Leonhardt, S. Head tracking in automotive environments for driver monitoring using a low resolution thermal camera. Vehicles 2022, 4, 219–233. [Google Scholar] [CrossRef]
- Szankin, M.; Kwasniewska, A.; Ruminski, J. Influence of thermal imagery resolution on accuracy of deep learning based face recognition. In Proceedings of the 2019 12th International Conference on Human System Interaction (HSI), Richmond, VA, USA, 25–27 June 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–6. [Google Scholar]
- Kwaśniewska, A.; Rumiński, J.; Rad, P. Deep features class activation map for thermal face detection and tracking. In Proceedings of the 2017 10th International Conference on Human System Interactions (HSI), Ulsan, Korea, 17–19 July 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 41–47. [Google Scholar]
- Khaksari, K.; Nguyen, T.; Hill, B.Y.; Quang, T.; Perrault, J.; Gorti, V.; Malpani, R.; Blick, E.; Cano, T.G.; Shadgan, B.; et al. Review of the efficacy of infrared thermography for screening infectious diseases with applications to COVID-19. J. Med. Imaging 2021, 8, 010901. [Google Scholar] [CrossRef]
- Khanam, F.T.Z.; Chahl, L.A.; Chahl, J.S.; Al-Naji, A.; Perera, A.G.; Wang, D.; Lee, Y.; Ogunwa, T.T.; Teague, S.; Nguyen, T.X.B.; et al. Noncontact sensing of contagion. J. Imaging 2021, 7, 28. [Google Scholar] [CrossRef]
- Maddern, W.; Vidas, S. Towards robust night and day place recognition using visible and thermal imaging. In Proceedings of the RSS 2012 Workshop: Beyond Laser and Vision: Alternative Sensing Techniques for Robotic Perception; University of Sydney: Camperdown, NSW, Australia, 2012; pp. 1–6. [Google Scholar]
- Brunner, C.; Peynot, T.; Vidal-Calleja, T.; Underwood, J. Selective combination of visual and thermal imaging for resilient localization in adverse conditions: Day and night, smoke and fire. J. Field Robot. 2013, 30, 641–666. [Google Scholar] [CrossRef] [Green Version]
- Mouats, T.; Aouf, N.; Sappa, A.D.; Aguilera, C.; Toledo, R. Multispectral stereo odometry. IEEE Trans. Intell. Transp. Syst. 2014, 16, 1210–1224. [Google Scholar] [CrossRef]
- Mouats, T.; Aouf, N.; Chermak, L.; Richardson, M.A. Thermal stereo odometry for UAVs. IEEE Sens. J. 2015, 15, 6335–6347. [Google Scholar] [CrossRef] [Green Version]
- FREE Teledyne FLIR Thermal Dataset for Algorithm Training. Available online: https://www.flir.eu/oem/adas/adas-dataset-form/ (accessed on 30 March 2022).
- Khattak, S.; Papachristos, C.; Alexis, K. Visual-thermal landmarks and inertial fusion for navigation in degraded visual environments. In Proceedings of the 2019 IEEE Aerospace Conference, Big Sky, MT, USA, 2–9 March 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–9. [Google Scholar]
- Khattak, S.; Papachristos, C.; Alexis, K. Keyframe-based direct thermal–inertial odometry. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 3563–3569. [Google Scholar]
- Khattak, S.; Papachristos, C.; Alexis, K. Marker based thermal-inertial localization for aerial robots in obscurant filled environments. In International Symposium on Visual Computing; Springer: Berlin/Heidelberg, Germany, 2018; pp. 565–575. [Google Scholar]
- Khattak, S.; Papachristos, C.; Alexis, K. Keyframe-based thermal–inertial odometry. J. Field Robot. 2020, 37, 552–579. [Google Scholar] [CrossRef]
- Khattak, S.; Nguyen, H.; Mascarich, F.; Dang, T.; Alexis, K. Complementary multi–modal sensor fusion for resilient robot pose estimation in subterranean environments. In Proceedings of the 2020 International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece, 1–4 September 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1024–1029. [Google Scholar]
- Weinmann, M.; Leitloff, J.; Hoegner, L.; Jutzi, B.; Stilla, U.; Hinz, S. Thermal 3D mapping for object detection in dynamic scenes. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 2, 53–60. [Google Scholar] [CrossRef] [Green Version]
- Brook, A.; Vandewal, M.; Ben-Dor, E. Fusion of optical and thermal imagery and LiDAR data for application to 3-D urban environment and structure monitoring. Remote Sens.—Adv. Tech. Platforms 2012, 2012, 29–50. [Google Scholar]
- O’Donohue, D.; Mills, S.; Kingham, S.; Bartie, P.; Park, D. Combined thermal-LIDAR imagery for urban mapping. In Proceedings of the 2008 23rd International Conference Image and Vision Computing New Zealand, Christchurch, New Zealand, 26–28 November 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 1–6. [Google Scholar]
- Lu, Y.; Lu, G. An alternative of lidar in nighttime: Unsupervised depth estimation based on single thermal image. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Waikoloa, HI, USA, 3–8 January 2021; pp. 3833–3843. [Google Scholar]
- Shin, Y.S.; Kim, A. Sparse depth enhanced direct thermal-infrared SLAM beyond the visible spectrum. IEEE Robot. Autom. Lett. 2019, 4, 2918–2925. [Google Scholar] [CrossRef] [Green Version]
- Rosser, K.; Nguyen, T.X.B.; Moss, P.; Chahl, J. Low complexity visual UAV track navigation using long-wavelength infrared. J. Field Robot. 2021, 38, 882–897. [Google Scholar] [CrossRef]
- Nguyen, T.X.B.; Rosser, K.; Perera, A.; Moss, P.; Teague, S.; Chahl, J. Characteristics of optical flow from aerial thermal imaging, “thermal flow”. J. Field Robot. 2022, 39, 580–599. [Google Scholar] [CrossRef]
- Nguyen, T.X.B.; Rosser, K.; Chahl, J. A Comparison of Dense and Sparse Optical Flow Techniques for Low-Resolution Aerial Thermal Imagery. J. Imaging 2022, 8, 116. [Google Scholar] [CrossRef]
- Sheen, D.M.; McMakin, D.L.; Hall, T.E. Cylindrical millimeter-wave imaging technique and applications. In Passive Millimeter-Wave Imaging Technology IX; SPIE: Bellingham, WA, USA, 2006; Volume 6211, pp. 58–67. [Google Scholar]
- Sheen, D.M.; McMakin, D.L.; Collins, H.D. Circular scanned millimeter-wave imaging system for weapon detection. In Law Enforcement Technologies: Identification Technologies and Traffic Safety; SPIE: Bellingham, WA, USA, 1995; Volume 2511, pp. 122–130. [Google Scholar]
- Zhang, R.; Cao, S. 3D imaging millimeter wave circular synthetic aperture radar. Sensors 2017, 17, 1419. [Google Scholar] [CrossRef]
- Smith, J.W.; Yanik, M.E.; Torlak, M. Near-field MIMO-ISAR millimeter-wave imaging. In Proceedings of the 2020 IEEE Radar Conference (RadarConf20), Florence, Italy, 21–25 September 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6. [Google Scholar]
- Li, R.; Liu, J.; Zhang, L.; Hang, Y. LIDAR/MEMS IMU integrated navigation (SLAM) method for a small UAV in indoor environments. In Proceedings of the 2014 DGON Inertial Sensors and Systems (ISS), Karlsruhe, Germany, 16–17 September 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 1–15. [Google Scholar]
- Chen, D.; Gao, G.X. Probabilistic graphical fusion of LiDAR, GPS, and 3D building maps for urban UAV navigation. Navigation 2019, 66, 151–168. [Google Scholar] [CrossRef] [Green Version]
- Kumar, G.A.; Patil, A.K.; Patil, R.; Park, S.S.; Chai, Y.H. A LiDAR and IMU integrated indoor navigation system for UAVs and its application in real-time pipeline classification. Sensors 2017, 17, 1268. [Google Scholar] [CrossRef] [Green Version]
- Chiang, K.W.; Tsai, G.J.; Li, Y.H.; El-Sheimy, N. Development of LiDAR-based UAV system for environment reconstruction. IEEE Geosci. Remote Sens. Lett. 2017, 14, 1790–1794. [Google Scholar] [CrossRef]
- Shah, S.T.H.; Xuezhi, X. Traditional and modern strategies for optical flow: An investigation. SN Appl. Sci. 2021, 3, 1–14. [Google Scholar] [CrossRef]
- Shi, J.; Tomasi, C. Good features to track. In Proceedings of the 1994 IEEE Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 21–23 June 1994; IEEE: Piscataway, NJ, USA, 1994; pp. 593–600. [Google Scholar]
- FLIR Corp. FLIR Lepton Engineering Data Sheet; FLIR Corp.: Wilsonville, OR, USA, 2018. [Google Scholar]
- Bradski, G. The OpenCV library. Dr. Dobb’s J. Softw. Tools Prof. Program. 2000, 25, 120–123. [Google Scholar]
- Papachristos, C.; Mascarich, F.; Alexis, K. Thermal-inertial localization for autonomous navigation of aerial robots through obscurants. In Proceedings of the 2018 International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, TX, USA, 12–15 June 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 394–399. [Google Scholar]
- Lu, Y.; Xue, Z.; Xia, G.S.; Zhang, L. A survey on vision-based UAV navigation. Geo-Spat. Inf. Sci. 2018, 21, 21–32. [Google Scholar] [CrossRef] [Green Version]
- Engel, J.; Koltun, V.; Cremers, D. Direct sparse odometry. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 40, 611–625. [Google Scholar] [CrossRef]
- Arbelaez, P.; Maire, M.; Fowlkes, C.; Malik, J. Contour detection and hierarchical image segmentation. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 33, 898–916. [Google Scholar] [CrossRef] [Green Version]
- Dollár, P.; Zitnick, C.L. Fast edge detection using structured forests. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 37, 1558–1570. [Google Scholar] [CrossRef]
- Endres, F.; Hess, J.; Sturm, J.; Cremers, D.; Burgard, W. 3-D mapping with an RGB-D camera. IEEE Trans. Robot. 2013, 30, 177–187. [Google Scholar] [CrossRef]
- Poma, X.S.; Riba, E.; Sappa, A. Dense extreme inception network: Towards a robust CNN model for edge detection. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Snowmass Village, CO, USA, 1–5 March 2020; pp. 1923–1932. [Google Scholar]
- Harris, C.; Stephens, M. A combined corner and edge detector. In Proceedings of the Alvey Vision Conference, Manchester, UK, 31 August–2 September 1988; pp. 147–151. [Google Scholar]
- Bureau of Meteorology. Woomera Weather. 2020. Available online: http://www.bom.gov.au/places/sa/woomera/ (accessed on 22 August 2020).
- Butler, D.J.; Wulff, J.; Stanley, G.B.; Black, M.J. A naturalistic open source movie for optical flow evaluation. In Proceedings of the European Conference on Computer Vision, Florence, Italy, 7–13 October 2012; Springer: Berlin/Heidelberg, Germany, 2012; pp. 611–625. [Google Scholar]
Feature Detection Setting | Value
---|---
Maximum corners | 1000
Quality level | 0.02
Minimum distance | 5
Block size | 5
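These settings match the parameters of a Shi–Tomasi "good features to track" detector of the kind provided by OpenCV. As a rough illustration (not the authors' implementation), a minimal pure-NumPy sketch of the same idea, using the table's values as defaults and a hypothetical synthetic frame, might look like:

```python
import numpy as np

def shi_tomasi_response(img, block=5):
    """Shi-Tomasi corner response: the smaller eigenvalue of the 2x2
    structure tensor, with gradient products summed over a block x block window."""
    img = img.astype(np.float64)
    Iy, Ix = np.gradient(img)  # image gradients (axis 0 = rows/y, axis 1 = cols/x)

    def box(a):
        # block x block box filter via a summed-area table (valid region only)
        c = np.pad(np.cumsum(np.cumsum(a, axis=0), axis=1), ((1, 0), (1, 0)))
        return (c[block:, block:] - c[:-block, block:]
                - c[block:, :-block] + c[:-block, :-block])

    Sxx, Syy, Sxy = box(Ix * Ix), box(Iy * Iy), box(Ix * Iy)
    tr, det = Sxx + Syy, Sxx * Syy - Sxy ** 2
    # smaller eigenvalue of [[Sxx, Sxy], [Sxy, Syy]]
    return tr / 2 - np.sqrt(np.maximum((tr / 2) ** 2 - det, 0.0))

def good_features(img, max_corners=1000, quality_level=0.02,
                  min_distance=5, block_size=5):
    """Greedy corner selection mirroring the settings in the table above."""
    r = shi_tomasi_response(img, block_size)
    ys, xs = np.where(r > quality_level * r.max())
    picked = []
    for i in np.argsort(-r[ys, xs]):  # strongest responses first
        y, x = int(ys[i]), int(xs[i])
        if all((y - py) ** 2 + (x - px) ** 2 >= min_distance ** 2
               for py, px in picked):
            picked.append((y, x))
            if len(picked) >= max_corners:
                break
    return picked

# Synthetic 60 x 80 "thermal" frame with one warm block; its corners score highest.
frame = np.zeros((60, 80))
frame[20:40, 30:50] = 255.0
print(len(good_features(frame)), "features")
```

In practice a call such as OpenCV's `cv2.goodFeaturesToTrack(frame, maxCorners=1000, qualityLevel=0.02, minDistance=5, blockSize=5)` plays the same role; the sketch above only makes the eigenvalue test and the minimum-distance suppression explicit.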
 | Source | Training Set | Evaluation Set | Site Condition | Total Images
---|---|---|---|---|---
Dataset 1 | [60] | Yes: 10,894 | Yes: 2000 | High contrast | 12,894
Dataset 2 | [62] | No | Yes: 2800 | High and low contrast | 2800
Total images | | 10,894 | 4800 | | 15,694
No Features | Cropping | Cross-Correlation X | Cross-Correlation Y | Sparse FPS | Dense FPS | Difference |
---|---|---|---|---|---|---|
1 | 40 × 40 | 0.381 | 0.219 | 29.2 | 11 | +165.45% |
2 | 40 × 40 | 0.412 | 0.371 | 24.3 | 11 | +120.9% |
3 | 40 × 40 | 0.741 | 0.673 | 22.5 | 11 | +104.54%
4 | 40 × 40 | 0.988 | 0.969 | 19.2 | 11 | +74.54% |
5 | 40 × 40 | 0.991 | 0.983 | 16.3 | 11 | +48.18% |
6 | 40 × 40 | 0.967 | 0.931 | 12.3 | 11 | +11.81% |
7 | 40 × 40 | 0.961 | 0.981 | 8 | 11 | −27.27% |
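The Difference column appears to be the sparse-pipeline frame rate relative to the fixed dense-pipeline rate of 11 FPS, i.e. (sparse / dense − 1) × 100%. A quick check with the table's values (an illustrative recomputation, not from the paper) reproduces the reported figures to within rounding:

```python
# Reproduce the "Difference" column: sparse FPS relative to the dense
# pipeline's fixed 11 FPS, expressed as a percentage change.
dense_fps = 11.0
rows = [(29.2, "+165.45%"), (24.3, "+120.9%"), (22.5, "+104.54%"),
        (19.2, "+74.54%"), (16.3, "+48.18%"), (12.3, "+11.81%"),
        (8.0, "-27.27%")]
for sparse_fps, reported in rows:
    diff = (sparse_fps / dense_fps - 1.0) * 100.0
    print(f"{sparse_fps:5.1f} FPS -> {diff:+8.2f}%  (reported {reported})")
```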
No Features | Cropping | Cross-Correlation X | Cross-Correlation Y | Sparse FPS | Dense FPS | Difference |
---|---|---|---|---|---|---|
4 | 20 × 20 | 0.123 | 0.07 | 34.5 | 11 | +213.64% |
4 | 30 × 30 | 0.126 | 0.05 | 26.9 | 11 | +144.54% |
4 | 35 × 35 | 0.642 | 0.694 | 21.7 | 11 | +97.27% |
4 | 40 × 40 | 0.988 | 0.969 | 19.2 | 11 | +74.54% |
4 | 45 × 45 | 0.963 | 0.953 | 17.5 | 11 | +59.09%
4 | 50 × 50 | 0.983 | 0.943 | 14.5 | 11 | +31.81% |
4 | 55 × 55 | 0.921 | 0.953 | 10.2 | 11 | −7.27% |
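The cross-correlation scores in both tables measure how closely the estimated optical flow signal tracks a reference signal along each axis. One plausible reading, assuming a zero-lag normalized cross-correlation (equivalently, the Pearson correlation coefficient) between two equally long 1-D signals, can be sketched as follows; the signal values below are purely illustrative:

```python
import numpy as np

def normalized_cross_correlation(a, b):
    """Zero-lag normalized cross-correlation (Pearson correlation)
    between two equally long 1-D signals; the result lies in [-1, 1]."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    a = a - a.mean()
    b = b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Illustrative signals: an estimated x-flow trace vs. a reference x-flow trace.
est = [0.1, 0.4, 0.9, 1.2, 0.8, 0.3]
ref = [0.0, 0.5, 1.0, 1.1, 0.7, 0.2]
print(round(normalized_cross_correlation(est, ref), 3))
```

A score near 1 (as in the 40 × 40, four-feature rows) indicates the sparse estimate tracks the reference almost perfectly; scores near 0 (the 20 × 20 rows) indicate the signals are essentially unrelated.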
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Nguyen, T.X.B.; Chahl, J. Sparse Optical Flow Implementation Using a Neural Network for Low-Resolution Thermal Aerial Imaging. J. Imaging 2022, 8, 279. https://doi.org/10.3390/jimaging8100279