An Optimized Smoke Segmentation Method for Forest and Grassland Fire Based on the UNet Framework
Abstract
1. Introduction
2. Materials and Methods
2.1. Dataset Preparation
2.1.1. Data and Preprocessing
2.1.2. Dataset Overview
2.2. Methodology
2.2.1. Module Introduction
1. HardRes
2. DSASPP
   - (1) Depthwise separable convolution: DSASPP builds on depthwise separable convolutions, in which a depthwise convolution is followed by a pointwise (1 × 1) convolution. Factoring the standard convolution this way reduces computational cost while preserving representational power, allowing the module to extract multiscale features efficiently.
   - (2) Atrous convolution: Atrous (dilated) convolution is employed within the DSASPP module to expand the receptive field without increasing the number of parameters. By applying convolutional filters with varying dilation rates, the module captures contextual information at different scales, perceiving both local and global features.
   - (3) Spatial pyramid pooling: The spatial pyramid pooling component within DSASPP divides the input feature map into grids of different sizes and pools independently within each grid. This captures features at multiple spatial resolutions, enabling the module to handle objects of various sizes within an image.
3. SCSE
   - (1) CSE: CSE recalibrates channel-wise features by summarizing each channel's global context through Global Average Pooling (GAP). The resulting channel gates suppress redundant information while emphasizing the most informative channels.
   - (2) SCE: Complementing CSE, SCE operates at the spatial level, dynamically adjusting the importance of spatial regions within an image. Learned weights modulate the significance of each location, so the network prioritizes regions of interest while de-emphasizing irrelevant areas.
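The cost saving from factoring a convolution, the parameter-free receptive-field growth under dilation, and grid-wise pooling can all be illustrated with a small, self-contained sketch. This is plain Python with illustrative layer sizes and a 1-D convolution, not the paper's implementation:

```python
def standard_conv_params(k, c_in, c_out):
    """Weight count of a standard k x k convolution."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Depthwise k x k filter per channel, then a 1 x 1 pointwise convolution."""
    return k * k * c_in + c_in * c_out

def dilated_conv1d(x, w, dilation):
    """'Valid' 1-D atrous convolution: taps are spaced `dilation` apart,
    so the receptive field grows with no extra weights."""
    k = len(w)
    span = (k - 1) * dilation + 1  # effective receptive field
    return [sum(w[j] * x[i + j * dilation] for j in range(k))
            for i in range(len(x) - span + 1)]

def grid_pool(x, grid):
    """Average-pool a 2-D map into a grid x grid summary (one pyramid level)."""
    h, w = len(x), len(x[0])
    out = []
    for gi in range(grid):
        row = []
        for gj in range(grid):
            cells = [x[i][j]
                     for i in range(gi * h // grid, (gi + 1) * h // grid)
                     for j in range(gj * w // grid, (gj + 1) * w // grid)]
            row.append(sum(cells) / len(cells))
        out.append(row)
    return out

# Example: a 3x3 layer mapping 64 -> 128 channels.
std = standard_conv_params(3, 64, 128)        # 73,728 weights
dsc = depthwise_separable_params(3, 64, 128)  # 8,768 weights
print(dsc / std)  # ~0.12, i.e. roughly 1/c_out + 1/k**2

# Dilation 2 with 3 taps spans 5 inputs using only 3 weights.
print(dilated_conv1d([1, 2, 3, 4, 5, 6], [1, 1, 1], 2))  # [9, 12]

# Coarsest pyramid level: the global average of the map.
print(grid_pool([[1.0, 2.0], [3.0, 4.0]], 1))  # [[2.5]]
```

A pyramid is then simply the collection of `grid_pool(fmap, g)` summaries over several grid sizes, concatenated with the convolutional branches.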
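The two recalibration paths can be sketched minimally in pure Python on nested lists. Here single scalar gating weights stand in for the learned fully connected and 1 × 1 convolution layers of the actual module, and the element-wise maximum used to merge the two paths is one common choice, not necessarily the authors':

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cse(feat, channel_weights):
    """Channel gate: GAP per channel, a learned scalar, sigmoid, rescale.
    `feat` is [channel][row][col]."""
    gates = []
    for c, channel in enumerate(feat):
        gap = sum(sum(row) for row in channel) / (len(channel) * len(channel[0]))
        gates.append(sigmoid(channel_weights[c] * gap))
    return [[[v * gates[c] for v in row] for row in channel]
            for c, channel in enumerate(feat)]

def sse(feat, spatial_weights):
    """Spatial gate: a 1x1 convolution across channels yields one gate per
    pixel, which reweights every channel at that location."""
    h, w = len(feat[0]), len(feat[0][0])
    gate = [[sigmoid(sum(spatial_weights[c] * feat[c][i][j]
                         for c in range(len(feat))))
             for j in range(w)] for i in range(h)]
    return [[[feat[c][i][j] * gate[i][j] for j in range(w)]
             for i in range(h)] for c in range(len(feat))]

def scse(feat, channel_weights, spatial_weights):
    """Keep the stronger of the two recalibrations per element."""
    a, b = cse(feat, channel_weights), sse(feat, spatial_weights)
    return [[[max(x, y) for x, y in zip(r1, r2)]
             for r1, r2 in zip(c1, c2)] for c1, c2 in zip(a, b)]

feat = [[[1.0, 2.0], [3.0, 4.0]]]  # 1 channel, 2 x 2 map
print(cse(feat, [0.0]))  # zero weight -> sigmoid(0) = 0.5 gate: values halved
```

With trained (non-zero) weights, the channel path scales whole channels while the spatial path scales individual locations, which is why their combination highlights thin smoke plumes both as a spectral signature and as a spatial region.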
2.2.2. Model Optimization
2.3. Experimental Environment
2.4. Accuracy Assessment
3. Results
3.1. Comparative Experiment
3.2. Ablation Experiment
3.3. Combination Analysis
3.4. Transferability Analysis
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Bowman, D.M.; Kolden, C.A.; Abatzoglou, J.T.; Johnston, F.H.; van der Werf, G.R.; Flannigan, M. Vegetation fires in the Anthropocene. Nat. Rev. Earth Environ. 2020, 1, 500–515.
- Keith, D.A.; Ferrer-Paris, J.R.; Nicholson, E.; Bishop, M.J.; Polidoro, B.A.; Ramirez-Llodra, E.; Tozer, M.G.; Nel, J.L.; Mac Nally, R.; Gregr, E.J. A function-based typology for Earth’s ecosystems. Nature 2022, 610, 513–518.
- Pausas, J.G.; Bond, W.J. On the three major recycling pathways in terrestrial ecosystems. Trends Ecol. Evol. 2020, 35, 767–775.
- Hutto, R.L.; Keane, R.E.; Sherriff, R.L.; Rota, C.T.; Eby, L.A.; Saab, V.A. Toward a more ecologically informed view of severe forest fires. Ecosphere 2016, 7, e01255.
- Chowdary, V.; Gupta, M.K.; Singh, R. A review on forest fire detection techniques: A decadal perspective. Networks 2018, 4, 12.
- Tedim, F.; Leone, V.; Amraoui, M.; Bouillon, C.; Coughlan, M.R.; Delogu, G.M.; Fernandes, P.M.; Ferreira, C.; McCaffrey, S.; McGee, T.K. Defining extreme wildfire events: Difficulties, challenges, and impacts. Fire 2018, 1, 9.
- Martell, D.L. A review of recent forest and wildland fire management decision support systems research. Curr. For. Rep. 2015, 1, 128–137.
- Xianlin, Q.; Xiaotong, L.; Shuchao, L. Forest fire early warning and monitoring techniques using satellite remote sensing in China. J. Remote Sens. 2020, 5, 511–520.
- Dewanti, R.; Lolitasari, I. Detection of forest fire, smoke source locations in Kalimantan during the dry season for the year 2015 using Landsat 8 from the threshold of brightness temperature algorithm. Int. J. Remote Sens. Earth Sci. 2017, 12, 151–160.
- Fischer, C.; Halle, W.; Säuberlich, T.; Frauenberger, O.; Hartmann, M.; Oertel, D.; Terzibaschian, T. Small satellite tools for high-resolution infrared fire monitoring. J. Imaging 2022, 8, 49.
- Hua, L.; Shao, G. The progress of operational forest fire monitoring with infrared remote sensing. J. For. Res. 2017, 28, 215–229.
- Wang, Z.; Yang, P.; Liang, H.; Zheng, C.; Yin, J.; Tian, Y.; Cui, W. Semantic segmentation and analysis on sensitive parameters of forest fire smoke using Smoke-Unet and Landsat-8 imagery. Remote Sens. 2021, 14, 45.
- Geetha, S.; Abhishek, C.; Akshayanat, C. Machine vision based fire detection techniques: A survey. Fire Technol. 2021, 57, 591–623.
- Qin, X.-L.; Zhu, X.; Yang, F.; Zhao, K.-R.; Pang, Y.; Li, Z.-Y.; Li, X.-Z.; Zhang, J.-X. Analysis of sensitive spectral bands for burning status detection using hyper-spectral images of Tiangong-01. Spectrosc. Spectr. Anal. 2013, 33, 1908–1911.
- Çetin, A.E.; Dimitropoulos, K.; Gouverneur, B.; Grammalidis, N.; Günay, O.; Habiboǧlu, Y.H.; Töreyin, B.U.; Verstockt, S. Video fire detection—Review. Digit. Signal Process. 2013, 23, 1827–1843.
- Zhan, J.; Hu, Y.; Cai, W.; Zhou, G.; Li, L. PDAM–STPNNet: A small target detection approach for wildland fire smoke through remote sensing images. Symmetry 2021, 13, 2260.
- Chaturvedi, S.; Khanna, P.; Ojha, A. A survey on vision-based outdoor smoke detection techniques for environmental safety. ISPRS J. Photogramm. Remote Sens. 2022, 185, 158–187.
- Gaur, A.; Singh, A.; Kumar, A.; Kumar, A.; Kapoor, K. Video flame and smoke based fire detection algorithms: A literature review. Fire Technol. 2020, 56, 1943–1980.
- Shi, J.; Yuan, F.; Xia, X. Video smoke detection: A literature survey. J. Image Graph. 2018, 23, 303–322.
- Xia, X.; Yuan, F.; Zhang, L.; Yang, L.; Shi, J. From traditional methods to deep ones: Review of visual smoke recognition, detection, and segmentation. J. Image Graph. 2019, 24, 1627–1647.
- Barmpoutis, P.; Papaioannou, P.; Dimitropoulos, K.; Grammalidis, N. A review on early forest fire detection systems using optical remote sensing. Sensors 2020, 20, 6442.
- Sun, X.; Sun, L.; Huang, Y. Forest fire smoke recognition based on convolutional neural network. J. For. Res. 2021, 32, 1921–1927.
- Christopher, S.; Chou, J. The potential for collocated AGLP and ERBE data for fire, smoke, and radiation budget studies. Int. J. Remote Sens. 1997, 18, 2657–2676.
- Chung, Y.-S.; Le, H. Detection of forest-fire smoke plumes by satellite imagery. Atmos. Environ. 1984, 18, 2143–2151.
- Chrysoulakis, N.; Herlin, I.; Prastacos, P.; Yahia, H.; Grazzini, J.; Cartalis, C. An improved algorithm for the detection of plumes caused by natural or technological hazards using AVHRR imagery. Remote Sens. Environ. 2007, 108, 393–406.
- Lu, X.; Zhang, X.; Li, F.; Cochrane, M.A.; Ciren, P. Detection of fire smoke plumes based on aerosol scattering using VIIRS data over global fire-prone regions. Remote Sens. 2021, 13, 196.
- Xie, Y. Detection of Smoke and Dust Aerosols Using Multi-Sensor Satellite Remote Sensing Measurements; George Mason University: Fairfax, VA, USA, 2009.
- Xie, Y.; Qu, J.; Xiong, X.; Hao, X.; Che, N.; Sommers, W. Smoke plume detection in the eastern United States using MODIS. Int. J. Remote Sens. 2007, 28, 2367–2374.
- Ko, B.; Park, J.; Nam, J.-Y. Spatiotemporal bag-of-features for early wildfire smoke detection. Image Vis. Comput. 2013, 31, 786–795.
- Li, Z.; Khananian, A.; Fraser, R.H.; Cihlar, J. Automatic detection of fire smoke using artificial neural networks and threshold approaches applied to AVHRR imagery. IEEE Trans. Geosci. Remote Sens. 2001, 39, 1859–1870.
- Xiong, D.; Yan, L. Early smoke detection of forest fires based on SVM image segmentation. J. For. Sci. 2019, 65, 150–159.
- Ba, R.; Chen, C.; Yuan, J.; Song, W.; Lo, S. SmokeNet: Satellite smoke scene detection using convolutional neural network with spatial and channel-wise attention. Remote Sens. 2019, 11, 1702.
- Chen, S.; Cao, Y.; Feng, X.; Lu, X. Global2Salient: Self-adaptive feature aggregation for remote sensing smoke detection. Neurocomputing 2021, 466, 202–220.
- Ismanto, H.; Marfai, M. Classification tree analysis (Gini-Index) smoke detection using Himawari_8 satellite data over Sumatera-Borneo maritime continent South East Asia. In Proceedings of the IOP Conference Series: Earth and Environmental Science: The 2nd International Conference on Environmental Resources Management in Global Region (ICERM 2018), Yogyakarta, Indonesia, 22–23 October 2018; IOP Publishing: Bristol, UK, 2019; p. 012043.
- Li, X.; Song, W.; Lian, L.; Wei, X. Forest fire smoke detection using back-propagation neural network based on MODIS data. Remote Sens. 2015, 7, 4473–4498.
- Li, X.; Wang, J.; Song, W.; Ma, J.; Telesca, L.; Zhang, Y. Automatic smoke detection in MODIS satellite data based on K-means clustering and Fisher linear discrimination. Photogramm. Eng. Remote Sens. 2014, 80, 971–982.
- Leinonen, J.; Hamann, U.; Sideris, I.V.; Germann, U. Thunderstorm nowcasting with deep learning: A multi-hazard data fusion model. Geophys. Res. Lett. 2023, 50, e2022GL101626.
- Yang, L.; Cervone, G. Analysis of remote sensing imagery for disaster assessment using deep learning: A case study of flooding event. Soft Comput. 2019, 23, 13393–13408.
- Zhang, Y.; Xie, D.; Tian, W.; Zhao, H.; Geng, S.; Lu, H.; Ma, G.; Huang, J.; Choy Lim Kam Sian, K.T. Construction of an integrated drought monitoring model based on deep learning algorithms. Remote Sens. 2023, 15, 667.
- Liu, Y.; Wu, L. Geological disaster recognition on optical remote sensing images using deep learning. Procedia Comput. Sci. 2016, 91, 566–575.
- Shafapourtehrany, M.; Rezaie, F.; Jun, C.; Heggy, E.; Bateni, S.M.; Panahi, M.; Özener, H.; Shabani, F.; Moeini, H. Mapping post-earthquake landslide susceptibility using U-Net, VGG-16, VGG-19, and metaheuristic algorithms. Remote Sens. 2023, 15, 4501.
- Pyo, J.; Park, L.J.; Pachepsky, Y.; Baek, S.-S.; Kim, K.; Cho, K.H. Using convolutional neural network for predicting cyanobacteria concentrations in river water. Water Res. 2020, 186, 116349.
- Shamsudeen, T.Y. Advances in remote sensing technology, machine learning and deep learning for marine oil spill detection, prediction and vulnerability assessment. Remote Sens. 2020, 12, 3416.
- Mi, W.; Beibei, G.; Xiaoxiang, L.; Lin, X.; Yufeng, C.; Shuying, J.; Xiao, Z. On-orbit geometric calibration and accuracy verification of GF-6 WFV camera. Acta Geod. Cartogr. Sin. 2020, 49, 171.
- Xu, B.; Wang, N.; Chen, T.; Li, M. Empirical evaluation of rectified activations in convolutional network. arXiv 2015, arXiv:1505.00853.
- Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. MobileNets: Efficient convolutional neural networks for mobile vision applications. arXiv 2017, arXiv:1704.04861.
- Ramachandran, P.; Zoph, B.; Le, Q.V. Searching for activation functions. arXiv 2017, arXiv:1710.05941.
- Howard, A.; Sandler, M.; Chu, G.; Chen, L.-C.; Chen, B.; Tan, M.; Wang, W.; Zhu, Y.; Pang, R.; Vasudevan, V. Searching for MobileNetV3. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Seoul, Republic of Korea, 27 October–2 November 2019; pp. 1314–1324.
- Chen, L.-C.; Papandreou, G.; Kokkinos, I.; Murphy, K.; Yuille, A.L. DeepLab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected CRFs. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 40, 834–848.
- Roy, A.G.; Navab, N.; Wachinger, C. Concurrent spatial and channel ‘squeeze & excitation’ in fully convolutional networks. In Proceedings of the Medical Image Computing and Computer Assisted Intervention–MICCAI 2018: 21st International Conference, Granada, Spain, 16–20 September 2018; pp. 421–429.
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional networks for biomedical image segmentation. In Proceedings of the Medical Image Computing and Computer-Assisted Intervention–MICCAI 2015: 18th International Conference, Munich, Germany, 5–9 October 2015; pp. 234–241.
- Long, J.; Shelhamer, E.; Darrell, T. Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 3431–3440.
- Badrinarayanan, V.; Kendall, A.; Cipolla, R. SegNet: A deep convolutional encoder-decoder architecture for image segmentation. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 2481–2495.
| NO. | Band Name | Spectral Range/μm | Spatial Resolution/m | Revisit Cycle/d | Width/km |
|---|---|---|---|---|---|
| 1 | Blue | 0.45–0.52 | 16 | 4 | ≥800 |
| 2 | Green | 0.52–0.59 | 16 | 4 | ≥800 |
| 3 | Red | 0.63–0.69 | 16 | 4 | ≥800 |
| 4 | NIR | 0.77–0.89 | 16 | 4 | ≥800 |
| 5 | Red-Edge1 | 0.69–0.73 | 16 | 4 | ≥800 |
| 6 | Red-Edge2 | 0.73–0.77 | 16 | 4 | ≥800 |
| 7 | Coastal Blue | 0.40–0.45 | 16 | 4 | ≥800 |
| 8 | Yellow | 0.59–0.63 | 16 | 4 | ≥800 |
| Methods | Stage | Precision/% | Recall/% | F1-Score/% | Jaccard/% |
|---|---|---|---|---|---|
| SegNet | train | 82.39 | 77.70 | 78.39 | 66.61 |
| SegNet | test | 86.77 | 70.66 | 72.38 | 60.70 |
| FCN | train | 87.21 | 83.75 | 84.84 | 74.41 |
| FCN | test | 92.59 | 69.35 | 75.28 | 63.41 |
| UNet | train | 84.08 | 84.22 | 83.36 | 72.44 |
| UNet | test | 89.01 | 73.34 | 74.62 | 63.56 |
| SmokeUNet | train | 86.54 | 83.70 | 84.43 | 73.76 |
| SmokeUNet | test | 94.94 | 70.14 | 76.87 | 66.01 |
| GFUNet | train | 89.15 | 89.41 | 88.95 | 80.55 |
| GFUNet | test | 94.00 | 80.25 | 85.50 | 75.76 |
| Methods | Precision/% | Recall/% | F1-Score/% | Jaccard/% |
|---|---|---|---|---|
| GFUNet | 94.00 | 80.25 | 85.50 | 75.76 |
| GFUNet-DSASPP | 94.00 | 77.47 | 83.18 | 72.85 |
| GFUNet-HardRes | 91.14 | 78.20 | 81.60 | 70.78 |
| GFUNet-SCSE | 84.64 | 80.58 | 77.70 | 66.65 |
| Combinations | Band Composition | Precision/% | Recall/% | F1-Score/% | Jaccard/% |
|---|---|---|---|---|---|
| C1 | b1, b2, b3 | 91.29 | 77.51 | 80.58 | 70.06 |
| C2 | b1, b2, b3, b4 | 91.03 | 78.20 | 81.51 | 70.78 |
| C3 | b1, b2, b3, b5 | 92.48 | 76.93 | 81.04 | 70.79 |
| C4 | b1, b2, b3, b7 | 91.38 | 80.47 | 83.39 | 73.18 |
| C5 | b1, b2, b3, b8 | 92.35 | 79.44 | 83.30 | 73.06 |
| C6 | b1, b2, b3, b4, b5 | 91.27 | 78.46 | 81.99 | 71.39 |
| C7 | b1, b2, b3, b4, b7 | 94.84 | 77.21 | 83.83 | 73.19 |
| C8 | b1, b2, b3, b4, b8 | 92.33 | 79.85 | 84.25 | 73.95 |
| C9 | All bands | 94.00 | 80.25 | 85.50 | 75.76 |
| Data | Location | Date | Precision/% | Recall/% | F1-Score/% | Jaccard/% |
|---|---|---|---|---|---|---|
| GF-1 WFV | Alberta, Canada | 7 June 2023 | 85.78 | 89.10 | 87.41 | 77.63 |
| GF-1 WFV | Kigoma, Tanzania | 25 June 2023 | 99.31 | 82.44 | 90.09 | 81.97 |
| GF-1 WFV | Amurskaya, Russia | 9 July 2023 | 98.44 | 93.33 | 95.82 | 91.97 |
| GF-1 WFV | Average | | 94.51 | 88.29 | 91.11 | 83.86 |
| HJ-2A/B CCD | Alberta, Canada | 4 July 2023 | 99.83 | 57.90 | 73.30 | 57.85 |
| HJ-2A/B CCD | Tanganyika, Congo | 15 July 2023 | 83.67 | 90.90 | 87.13 | 77.20 |
| HJ-2A/B CCD | Çanakkale, Turkey | 23 August 2023 | 98.45 | 84.04 | 90.67 | 82.94 |
| HJ-2A/B CCD | Average | | 93.98 | 77.61 | 83.70 | 72.66 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Hu, X.; Jiang, F.; Qin, X.; Huang, S.; Yang, X.; Meng, F. An Optimized Smoke Segmentation Method for Forest and Grassland Fire Based on the UNet Framework. Fire 2024, 7, 68. https://doi.org/10.3390/fire7030068