Photogrammetric Measurement of Grassland Fire Spread: Techniques and Challenges with Low-Cost Unmanned Aerial Vehicles
Abstract
1. Introduction
2. Materials and Methods
- The first flight (Figure 3) was conducted while attempting to ignite the meadow and waiting for suitable wind conditions—the aim was to create a cohesive block of images to which images from the subsequent monitoring flight could later be aligned.
- The second flight (Figure 4) monitored the development of the fire itself at an interval of 2 s between images while the UAV gradually moved along the fire line—the goal was to automatically align these images with the first flight and to project them onto the model from the third flight.
- The third flight (Figure 5) was conducted immediately after the fire to map the entire burned area and create a 3D model of the surface without grassland cover.
3. Photogrammetric Processing
- Processing of images from the 3rd flight—significant features in the surface texture were automatically detected with the SIFT algorithm, matched using the nearest neighbor approach, and the matches were then geometrically verified with epipolar geometry using random sample consensus (RANSAC) [52]; a simplified matching sketch is given after this list. The model coordinates of tie points and the elements of the interior and relative orientation of the images were adjusted during bundle adjustment, resulting in a reconstructed 3D scene. Georeferencing was also performed as part of SfM using manually measured GCPs. Finally, a detailed surface reconstruction based on multi-view stereo (MVS) algorithms was carried out and a 3D model was generated (Figure 7).
- Processing of images from the 1st flight—identical to the 3rd flight, with the difference that the elements of interior orientation were fixed to the values obtained from camera calibration on the images from the 3rd flight. This step was necessary due to the reduced reliability of the results from the 1st flight caused by the high grass cover—similar problems are also described in [53].
- Processing of images from the 2nd flight—the UAV moved along the fire line, but its movement was not continuous, and it sometimes remained stationary in one place. This created a significant problem: series of images taken from a single position during monitoring. For reliable processing using SfM, baselines must exist between adjacent images, i.e., the projection center of the camera must change its position in space; otherwise, the intersection angle of the determining rays at a given point becomes too small and the scene reconstruction is unstable. In itself this need not be a problem—in photogrammetry it is often recommended to take several images from one camera position for a higher degree of redundancy—however, when too many such images are combined with the RANSAC algorithm, image orientation may fail. It was therefore necessary to divide the images from the 2nd flight into 13 subgroups, within which the images were oriented separately (Table 2); a simple sketch of how such grouping can be automated is given after this list. Subgroup 13 contained only images during which the camera changed its position (dynamic flight), and these were oriented directly with the images from the 1st flight without problems. Subgroups 1 to 12 contained images taken from almost static UAV positions (Figure 8); to each of these subgroups, the closest moving images from subgroup 13 (selected according to overlap) were added to ensure that the respective scene had reconstructible 3D geometry. If the reconstruction failed in one calculation, it was repeated until RANSAC found a solution for the relative orientation of the images that satisfied the largest number of already matched tie points, which in some cases required up to five repeated calculations.
- Sequential relative orientation of the partial image blocks 1 to 12 with block 13, which was connected to the 1st flight. Subsequently, all blocks were merged using the “Merge chunks” function in the Agisoft Metashape software into one common image block containing 40 images from the 1st flight (mapping) and 290 images from the 2nd flight (monitoring). This was possible because all image blocks were in the same reference coordinate system.
- Import of the 3D model from the 3rd flight into the merged image block from the 1st and 2nd flights.
- Creation and export of orthoimages from the 2nd flight (monitoring) based on the 3D model from the 3rd flight (Figure 9). A scripted sketch of these final steps (merging, model import, and orthoimage export) is given below.
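The following minimal Python/OpenCV sketch illustrates the feature detection, nearest-neighbor matching, and RANSAC-based epipolar verification described above for the 3rd flight. It shows the general technique only, not the internal implementation of the SfM software used in this study; the image paths, the ratio-test threshold, and the RANSAC settings are illustrative assumptions (the 40,000 key point limit follows the processing settings in Table 3).

```python
# Minimal sketch of tie-point candidate generation between two images:
# SIFT detection -> nearest-neighbour matching -> RANSAC epipolar verification.
import cv2
import numpy as np

img1 = cv2.imread("flight3_0001.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder paths
img2 = cv2.imread("flight3_0002.jpg", cv2.IMREAD_GRAYSCALE)

# Detect SIFT key points and descriptors (key point limit as in Table 3)
sift = cv2.SIFT_create(nfeatures=40000)
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Nearest-neighbour matching with Lowe's ratio test
matcher = cv2.BFMatcher(cv2.NORM_L2)
raw_matches = matcher.knnMatch(des1, des2, k=2)
good = []
for pair in raw_matches:
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])

# Geometric verification with epipolar geometry (fundamental matrix + RANSAC),
# using a 1-pixel reprojection threshold as an assumed setting
pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
inliers = [m for m, ok in zip(good, inlier_mask.ravel()) if ok]
print(f"{len(inliers)} verified tie-point candidates out of {len(good)} matches")
```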
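The division of the monitoring images into near-static subgroups and a dynamic subgroup could in principle be automated from approximate projection centers (e.g., from the images' GNSS tags). The sketch below shows one possible grouping rule; the 0.5 m baseline threshold and the example coordinates are assumptions—in this study the grouping was performed manually in the photogrammetric software.

```python
# Group an ordered image sequence into near-static subgroups (short baselines)
# and a set of dynamic images (camera clearly moving between exposures).
import numpy as np

def group_static_images(positions, min_baseline=0.5):
    """positions: ordered (N, 3) array of projection centres in metres.
    Returns (static_groups, dynamic): lists of image indices."""
    static_groups, dynamic = [], []
    current = [0]
    for i in range(1, len(positions)):
        if np.linalg.norm(positions[i] - positions[i - 1]) < min_baseline:
            current.append(i)            # baseline too short -> same static spot
        else:
            if len(current) > 1:
                static_groups.append(current)
            else:
                dynamic.extend(current)  # isolated image -> dynamic flight
            current = [i]
    if len(current) > 1:
        static_groups.append(current)
    else:
        dynamic.extend(current)
    return static_groups, dynamic

# Hypothetical example: two hover spots followed by continuous motion
centres = np.array([[0, 0, 30], [0.1, 0, 30], [0.05, 0.1, 30],
                    [5, 0, 30], [5.1, 0.1, 30],
                    [10, 0, 30], [15, 0, 30]])
groups, moving = group_static_images(centres)
print(groups, moving)   # -> [[0, 1, 2], [3, 4]] [5, 6]
```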
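For completeness, the merging of the sub-blocks, the import of the post-fire 3D model, and the orthoimage export can also be scripted with the Agisoft Metashape Python API instead of the graphical interface. The sketch below is only an outline under stated assumptions: the project and file paths are placeholders, and the exact function and argument names (mergeChunks, importModel, buildOrthomosaic, exportRaster) differ between API versions and should be verified against the reference manual of the installed Metashape release.

```python
# Hedged outline of the final workflow steps in the Metashape Python API.
import Metashape

doc = Metashape.Document()
doc.open("fire_project.psx")          # placeholder project path

# Merge the separately oriented sub-blocks (chunks) into one common block;
# depending on the API version, 'chunks' may expect chunk objects or chunk keys.
doc.mergeChunks(chunks=doc.chunks, merge_tiepoints=True)
merged = doc.chunks[-1]               # the merged chunk is appended last

# Import the 3D surface model reconstructed from the 3rd flight
merged.importModel(path="flight3_surface_model.obj")

# Build orthoimages of the monitoring images on that model and export them
merged.buildOrthomosaic(surface_data=Metashape.ModelData)
merged.exportRaster(path="fire_orthomosaic.tif",
                    source_data=Metashape.OrthomosaicData)
doc.save()
```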
4. Analysis of the Results
4.1. The Reliability of Image Orientation
4.2. The Reliability of the Digital Surface Model
4.3. The Reliability of Orthoimages
5. Discussion and Conclusions
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Gill, A.M.; Stephens, S.L.; Cary, G.J. The worldwide “wildfire” problem. Ecol. Appl. 2013, 23, 438–454. [Google Scholar] [CrossRef] [PubMed]
- Pechony, O.; Shindell, D.T. Driving forces of global wildfires over the past millennium and the forthcoming century. Proc. Natl. Acad. Sci. USA 2010, 107, 19167–19170. [Google Scholar] [CrossRef] [PubMed]
- Wang, Z.; Chappellaz, J.; Park, K.; Mak, J.E. Large variations in Southern Hemisphere biomass burning during the last 650 years. Science 2010, 330, 1663–1666. [Google Scholar] [CrossRef] [PubMed]
- Van Hees, P. Validation and verification of fire models for fire safety engineering. Procedia Eng. 2013, 62, 154–168. [Google Scholar] [CrossRef]
- Artés, T.; Oom, D.; De Rigo, D.; Durrant, T.H.; Maianti, P.; Libertà, G.; San-Miguel-Ayanz, J. A global wildfire dataset for the analysis of fire regimes and fire behaviour. Sci. Data 2019, 6, 296. [Google Scholar] [CrossRef] [PubMed]
- Lopes, A.M.G.; Sousa, A.C.M.; Viegas, D.X. Numerical simulation of turbulent flow and fire propagation in complex topography. Numer. Heat Transf. Part A Appl. 1995, 27, 229–253. [Google Scholar] [CrossRef]
- Boboulos, M.; Purvis, M.R.I. Wind and slope effects on ROS during the fire propagation in East-Mediterranean pine forest litter. Fire Saf. J. 2009, 44, 764–769. [Google Scholar] [CrossRef]
- Sullivan, A.L. Wildland surface fire spread modelling, 1990–2007. 1: Physical and quasi-physical models. Int. J. Wildland Fire 2009, 18, 349–368. [Google Scholar] [CrossRef]
- Dickson, B.G.; Prather, J.W.; Xu, Y.; Hampton, H.M.; Aumack, E.N.; Sisk, T.D. Mapping the probability of large fire occurrence in northern Arizona, USA. Landsc. Ecol. 2006, 21, 747–761. [Google Scholar] [CrossRef]
- Syphard, A.D.; Radeloff, V.C.; Keuler, N.S.; Taylor, R.S.; Hawbaker, T.J.; Stewart, S.I.; Clayton, M.K. Predicting spatial patterns of fire on a southern California landscape. Int. J. Wildland Fire 2008, 17, 602–613. [Google Scholar] [CrossRef]
- Semeraro, T.; Mastroleo, G.; Aretano, R.; Facchinetti, G.; Zurlini, G.; Petrosillo, I. GIS Fuzzy Expert System for the assessment of ecosystems vulnerability to fire in managing Mediterranean natural protected areas. J. Environ. Manag. 2016, 168, 94–103. [Google Scholar] [CrossRef] [PubMed]
- West, A.M.; Kumar, S.; Jarnevich, C.S. Regional modeling of large wildfires under current and potential future climates in Colorado and Wyoming, USA. Clim. Change 2016, 134, 565–577. [Google Scholar] [CrossRef]
- Li, Z.; Nadon, S.; Cihlar, J.; Stocks, B. Satellite-based mapping of Canadian boreal forest fires: Evaluation and comparison of algorithms. Int. J. Remote Sens. 2000, 21, 3071–3082. [Google Scholar] [CrossRef]
- Li, Y.; Vodacek, A.; Kremens, R.L.; Ononye, A.; Tang, C. A hybrid contextual approach to wildland fire detection using multispectral imagery. IEEE Trans. Geosci. Remote Sens. 2005, 43, 2115–2126. [Google Scholar] [CrossRef]
- Stow, D.A.; Riggan, P.J.; Storey, E.J.; Coulter, L.L. Measuring fire spread rates from repeat pass airborne thermal infrared imagery. Remote Sens. Lett. 2014, 5, 803–812. [Google Scholar] [CrossRef]
- Ambrosia, V.G.; Wegener, S.S.; Sullivan, D.V.; Buechel, S.W.; Dunagan, S.E.; Brass, J.A.; Stoneburner, J.; Schoenung, S.M. Demonstrating UAV-acquired real-time thermal data over fires. Photogramm. Eng. Remote Sens. 2003, 69, 391–402. [Google Scholar] [CrossRef]
- Sherstjuk, V.; Zharikova, M.; Sokol, I. Forest fire-fighting monitoring system based on UAV team and remote sensing. In Proceedings of the 2018 IEEE 38th International Conference on Electronics and Nanotechnology (ELNANO), Kyiv, Ukraine, 24–26 April 2018; pp. 663–668. [Google Scholar] [CrossRef]
- Gomez, C.; Purdie, H. UAV-based photogrammetry and geocomputing for hazards and disaster risk monitoring—A review. Geoenviron. Disasters 2016, 3, 23. [Google Scholar] [CrossRef]
- Afghah, F.; Razi, A.; Chakareski, J.; Ashdown, J. Wildfire monitoring in remote areas using autonomous unmanned aerial vehicles. In Proceedings of the IEEE INFOCOM 2019-IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS), Paris, France, 29 April–2 May 2019; pp. 835–840. [Google Scholar] [CrossRef]
- Ambroz, M.; Balažovjech, M.; Medľa, M.; Mikula, K. Numerical modeling of wildland surface fire propagation by evolving surface curves. Adv. Comput. Math. 2019, 45, 1067–1103. [Google Scholar] [CrossRef]
- Ambroz, M.; Mikula, K.; Fraštia, M.; Marčiš, M. Parameter estimation for the forest fire propagation model. Tatra Mt. Math. Publ. 2020, 75, 1–22. [Google Scholar] [CrossRef]
- Martinez-de Dios, J.R.; Arrue, B.C.; Ollero, A.; Merino, L.; Gómez-Rodríguez, F. Computer vision techniques for forest fire perception. Image Vis. Comput. 2008, 26, 550–562. [Google Scholar] [CrossRef]
- Toulouse, T.; Rossi, L.; Akhloufi, M.A.; Pieri, A.; Maldague, X. A multimodal 3D framework for fire characteristics estimation. Meas. Sci. Technol. 2018, 29, 025404. [Google Scholar] [CrossRef]
- Förstner, W.; Wrobel, B.P. Photogrammetric Computer Vision; Springer: Cham, Switzerland, 2016. [Google Scholar] [CrossRef]
- Ciullo, V.; Rossi, L.; Pieri, A. Experimental fire measurement with UAV multimodal stereovision. Remote Sens. 2020, 12, 3546. [Google Scholar] [CrossRef]
- Zhao, Y.; Ma, J.; Li, X.; Zhang, J. Saliency detection and deep learning-based wildfire identification in UAV imagery. Sensors 2018, 18, 712. [Google Scholar] [CrossRef] [PubMed]
- Tomaštík, J.; Mokroš, M.; Surový, P.; Grznárová, A.; Merganič, J. UAV RTK/PPK method—An optimal solution for mapping inaccessible forested areas? Remote Sens. 2019, 11, 721. [Google Scholar] [CrossRef]
- Giordan, D.; Hayakawa, Y.; Nex, F.; Remondino, F.; Tarolli, P. The use of remotely piloted aircraft systems (RPASs) for natural hazards monitoring and management. Nat. Hazards Earth Syst. Sci. 2018, 18, 1079–1096. [Google Scholar] [CrossRef]
- Štroner, M.; Urban, R.; Seidl, J.; Reindl, T.; Brouček, J. Photogrammetry using UAV-mounted GNSS RTK: Georeferencing strategies without GCPs. Remote Sens. 2021, 13, 1336. [Google Scholar] [CrossRef]
- Ullman, S. The interpretation of structure from motion. Proc. R. Soc. Lond. Ser. B Biol. Sci. 1979, 203, 405–426. [Google Scholar] [CrossRef]
- Lowe, G. SIFT—The scale invariant feature transform. Int. J. 2004, 2, 2. [Google Scholar]
- Morelli, L.; Ioli, F.; Maiwald, F.; Mazzacca, G.; Menna, F.; Remondino, F. Deep-Image-Matching: An open-source toolbox for multi-view image matching of complex geomorphological scenarios. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2024, 48, 309–316. [Google Scholar] [CrossRef]
- Schonberger, J.L.; Frahm, J.M. Structure-from-motion revisited. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 4104–4113. [Google Scholar] [CrossRef]
- Agarwal, S.; Furukawa, Y.; Snavely, N.; Simon, I.; Curless, B.; Seitz, S.M.; Szeliski, R. Building Rome in a day. Commun. ACM 2011, 54, 105–112. [Google Scholar] [CrossRef]
- Gao, X.S.; Hou, X.R.; Tang, J.; Cheng, H.F. Complete solution classification for the perspective-three-point problem. IEEE Trans. Pattern Anal. Mach. Intell. 2003, 25, 930–943. [Google Scholar] [CrossRef]
- Crandall, D.; Owens, A.; Snavely, N.; Huttenlocher, D. Discrete-continuous optimization for large-scale structure from motion. In Proceedings of the CVPR 2011, Colorado Springs, CO, USA, 20–25 June 2011; pp. 3001–3008. [Google Scholar] [CrossRef]
- Marčiš, M.; Barták, P.; Valaška, D.; Fraštia, M.; Trhan, O. Use of image based modelling for documentation of intricately shaped objects. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 327–334. [Google Scholar] [CrossRef]
- Qureshi, A.H.; Alaloul, W.S.; Murtiyoso, A.; Saad, S.; Manzoor, B. Comparison of Photogrammetry Tools Considering Rebar Progress Recognition. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2022, 43, 141–146. [Google Scholar] [CrossRef]
- Pukanská, K.; Bartoš, K.; Bella, P.; Sabová, J. Comparison of non-contact surveying technologies for modelling underground morphological structures. Acta Montan. Slovaca 2017, 22, 246. [Google Scholar]
- Haala, N. Multiray photogrammetry and dense image matching. In Photogrammetric Week; Fritsch, E.D., Ed.; VDE Verlag: Heidelberg, Germany, 2011; Volume 11, pp. 185–195. [Google Scholar]
- Triggs, B.; McLauchlan, P.F.; Hartley, R.I.; Fitzgibbon, A.W. Bundle adjustment—A modern synthesis. In Vision Algorithms: Theory and Practice: International Workshop on Vision Algorithms, Corfu, Greece, 21–22 September 1999, Proceedings; Springer: Berlin/Heidelberg, Germany, 2002; pp. 298–372. [Google Scholar] [CrossRef]
- Schneider, J.; Schindler, F.; Läbe, T.; Förstner, W. Bundle adjustment for multi-camera systems with points at infinity. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 3, 75–80. [Google Scholar] [CrossRef]
- Börlin, N.; Grussenmeyer, P. Experiments with metadata-derived initial values and linesearch bundle adjustment in architectural photogrammetry. In Proceedings of the XXIV International CIPA Symposium, Strasbourg, France, 2–6 September 2013; Copernicus Publications: Enschede, The Netherlands, 2013; Volume 2, pp. 43–48. [Google Scholar] [CrossRef]
- Blanch, X.; Eltner, A.; Guinau, M.; Abellan, A. Multi-Epoch and Multi-Imagery (MEMI) photogrammetric workflow for enhanced change detection using time-lapse cameras. Remote Sens. 2021, 13, 1460. [Google Scholar] [CrossRef]
- Pukanská, K.; Bartoš, K.; Bakoň, M.; Papčo, J.; Kubica, L.; Barlák, J.; Rovňák, M.; Kseňak, Ľ.; Zelenakova, M.; Savchyn, I.; et al. Multi-sensor and multi-temporal approach in monitoring of deformation zone with permanent monitoring solution and management of environmental changes: A case study of Solotvyno salt mine, Ukraine. Front. Earth Sci. 2023, 11, 1167672. [Google Scholar] [CrossRef]
- McRae, R.H.; Sharples, J.J.; Wilkes, S.R.; Walker, A. An Australian pyro-tornadogenesis event. Nat. Hazards 2013, 65, 1801–1811. [Google Scholar] [CrossRef]
- Eltner, A.; Kaiser, A.; Abellan, A.; Schindewolf, M. Time lapse structure-from-motion photogrammetry for continuous geomorphic monitoring. Earth Surf. Process. Landf. 2017, 42, 2240–2253. [Google Scholar] [CrossRef]
- Ioli, F.; Bruno, E.; Calzolari, D.; Galbiati, M.; Mannocchi, A.; Manzoni, P.; Martini, M.; Bianchi, A.; Cina, A.; De Michele, C.; et al. A replicable open-source multi-camera system for low-cost 4d glacier monitoring. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2023, 48, 137–144. [Google Scholar] [CrossRef]
- Cucchiaro, S.; Cavalli, M.; Vericat, D.; Crema, S.; Llena, M.; Beinat, A.; Marchi, L.; Cazorzi, F. Monitoring topographic changes through 4D-structure-from-motion photogrammetry: Application to a debris-flow channel. Environ. Earth Sci. 2018, 77, 632. [Google Scholar] [CrossRef]
- Pacheco-Ruiz, R.; Adams, J.; Pedrotti, F. 4D modelling of low visibility Underwater Archaeological excavations using multi-source photogrammetry in the Bulgarian Black Sea. J. Archaeol. Sci. 2018, 100, 120–129. [Google Scholar] [CrossRef]
- Sherwood, C.R.; Warrick, J.A.; Hill, A.D.; Ritchie, A.C.; Andrews, B.D.; Plant, N.G. Rapid, remote assessment of Hurricane Matthew impacts using four-dimensional structure-from-motion photogrammetry. J. Coast. Res. 2018, 34, 1303–1316. [Google Scholar] [CrossRef]
- Fischler, M.A.; Bolles, R.C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 1981, 24, 381–395. [Google Scholar] [CrossRef]
- Harwin, S.; Lucieer, A.; Osborn, J. The impact of the calibration method on the accuracy of point clouds derived using unmanned aerial vehicle multi-view stereopsis. Remote Sens. 2015, 7, 11933–11953. [Google Scholar] [CrossRef]
- Gouverneur, B.; Verstockt, S.; Pauwels, E.J.E.M.; Han, J.; de Zeeuw, P.M.; Vermeiren, J. Archeological treasures protection based on early forest wildfire multi-band imaging detection system. In Electro-Optical and Infrared Systems: Technology and Applications IX; SPIE: Bellingham, WA, USA, 2012; Volume 8541, pp. 104–119. [Google Scholar] [CrossRef]
- Dlesk, A.; Vach, K.; Pavelka, K. Photogrammetric co-processing of thermal infrared images and RGB images. Sensors 2022, 22, 1655. [Google Scholar] [CrossRef]
| Camera Parameter | Value |
|---|---|
| Image sensor size | 13.13 × 8.76 mm |
| Image sensor resolution | 5472 × 3648 pixels |
| Pixel size | 2.4 μm |
| Focal length | 10.26 mm |
| Shutter type | Electronic rolling shutter |
| Aperture | f/2.8 to f/11 |
| Flight | Subset | Images in Block | Static/Moving Camera | Purpose |
|---|---|---|---|---|
| 1 (before fire) | - | 40 | Moving | Main image block |
| 2 | 1 | 14 | Static | Monitoring |
| 2 | 2 | 9 | Static | Monitoring |
| 2 | 3 | 37 | Static | Monitoring |
| 2 | 4 | 28 | Static | Monitoring |
| 2 | 5 | 10 | Static | Monitoring |
| 2 | 6 | 18 | Static | Monitoring |
| 2 | 7 | 31 | Static | Monitoring |
| 2 | 8 | 13 | Static | Monitoring |
| 2 | 9 | 6 | Static | Monitoring |
| 2 | 10 | 5 | Static | Monitoring |
| 2 | 11 | 38 | Static | Monitoring |
| 2 | 12 | 6 | Static | Monitoring |
| 2 | 13 | 76 | Moving | Monitoring |
| 3 (after fire) | - | 28 | Moving | 3D model |
| Accuracy | Generic Preselection | Key Point Limit | Tie Point Limit |
|---|---|---|---|
| High | Yes | 40,000 per image | 10,000 per image |
| Flight | Images | Tie Points | Tie Point RMS Reprojection Error [pix] | GCPs X RMSE [m] | GCPs Y RMSE [m] | GCPs Z RMSE [m] | GCPs RMS Reprojection Error [pix] |
|---|---|---|---|---|---|---|---|
| 1 (before fire) | 40 | 112,627 | 0.75 | 0.08 | 0.11 | 0.10 | 0.57 |
| 1 + 2 (monitor.) | 331 | 165,025 | 1.23 | 0.03 | 0.05 | 0.03 | 0.42 |
| 3 (after fire) | 28 | 64,666 | 0.94 | 0.02 | 0.04 | 0.05 | 0.36 |
| Parameter | Flight 1 | Flight 3 | Difference [%] |
|---|---|---|---|
| f [pix] | 4939.29 | 4298.16 | −14.9 |
| cx [pix] | −55.58 | 22.17 | 350.7 |
| cy [pix] | −691.08 | 12.56 | 5603.1 |
| K1 | 0.000441 | −0.008228 | 105.4 |
| K2 | 0.017576 | 0.036370 | 51.7 |
| K3 | −0.023303 | −0.043687 | 46.7 |
| P1 | 0.002480 | 0.001253 | −97.9 |
| P2 | −0.001990 | −0.002049 | 2.9 |
| Flight | GCPs in BA | Precalib. Camera | Images | Tie Points | Tie Point RMS Reprojection Error [pix] | GCPs X RMSE [m] | GCPs Y RMSE [m] | GCPs Z RMSE [m] | GCPs RMS Reprojection Error [pix] |
|---|---|---|---|---|---|---|---|---|---|
| 1 (before fire) | NO | NO | 40 | 111,123 | 0.73 | 0.38 | 0.27 | 0.35 | 0.43 |
| 1 (before fire) | YES | NO | 40 | 112,738 | 0.75 | 0.08 | 0.11 | 0.10 | 0.57 |
| 1 + 2 (monitor.) | NO | YES | 331 | 237,373 | 0.85 | 0.22 | 0.26 | 0.05 | 0.62 |
| 3 (after fire) | NO | NO | 28 | 63,934 | 0.50 | 0.20 | 0.14 | 0.10 | 0.32 |
| 3 (after fire) | YES | NO | 28 | 64,666 | 0.94 | 0.02 | 0.04 | 0.05 | 0.36 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).