Methodology for Severe Convective Cloud Identification Using Lightweight Neural Network Model Ensembling
Abstract
1. Introduction
2. Study Area and Data
2.1. Study Area
2.2. FY-4B Data
- Multi-channel imaging. AGRI is equipped with multiple observation channels spanning the visible, near-infrared, shortwave-infrared, midwave-infrared, and longwave-infrared bands, supporting wide-band observations of the atmosphere, clouds, and the surface. This multi-channel capability allows AGRI to capture detailed features under different meteorological and environmental conditions (a data-loading sketch follows this list).
- High spatial resolution. AGRI provides a spatial resolution of up to 500 m in the visible band, 1 km in the near-infrared channels, and 2–4 km in the infrared channels, enabling it to capture fine-grained meteorological and surface information.
- Fast scanning capability. AGRI provides rapid observation updates, including full-disk scans every 15 min, regional scans every 5 min, and rapid scans of key areas as often as every minute, greatly improving the speed of monitoring and response to extreme weather events.
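To make the channel structure concrete, here is a minimal sketch of assembling several AGRI channels into a model-ready array. The file name and HDF dataset keys are hypothetical placeholders (actual FY-4B L1 products have their own naming conventions), and bands 9–14 are chosen because, per the channel table, they share a common 4 km grid and therefore stack without resampling.

```python
import h5py
import numpy as np

# Hypothetical file name and dataset keys -- adapt to the actual
# FY-4B/AGRI L1 product layout at hand.
AGRI_FILE = "fy4b_agri_l1_fulldisk.hdf"
CHANNEL_KEYS = [f"NOMChannel{b:02d}" for b in (9, 10, 11, 12, 13, 14)]

def load_agri_stack(path, keys):
    """Read the selected channels and stack them into a (C, H, W) array."""
    with h5py.File(path, "r") as f:
        bands = [np.asarray(f[k], dtype=np.float32) for k in keys]
    cube = np.stack(bands, axis=0)  # bands 9-14 share the 4 km grid
    # Min-max normalize each channel so all bands lie in [0, 1].
    lo = cube.min(axis=(1, 2), keepdims=True)
    hi = cube.max(axis=(1, 2), keepdims=True)
    return (cube - lo) / (hi - lo + 1e-6)
```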
2.3. Data Preprocessing
2.4. FY-4B Data Labeling: Severe Convective Cloud Dataset Construction
3. Method
3.1. Parameter Configuration
3.2. Loss Function
3.3. Model Evaluation Index
The following metrics are used, where TP, TN, FP, and FN denote the numbers of true positive, true negative, false positive, and false negative predictions:
1. Accuracy: the ratio of correct predictions to total predictions, Accuracy = (TP + TN)/(TP + TN + FP + FN). It offers a straightforward measure of overall model effectiveness and ranges from 0 to 1, where 1 indicates perfect accuracy and 0 complete inaccuracy.
2. Precision: the accuracy of positive predictions, Precision = TP/(TP + FP), which quantifies the incidence of false positives. It is particularly useful when the cost of a false positive is high. Like accuracy, precision ranges from 0 to 1, with 1 being perfect (no false positives) and 0 indicating that all positive predictions are incorrect.
3. Recall: also known as sensitivity, Recall = TP/(TP + FN) measures the model's ability to detect all actual positives. It also ranges from 0 to 1, where 1 means all true positives are correctly identified and 0 means none are detected.
4. F1 Score: the harmonic mean of precision and recall, F1 = 2 × Precision × Recall/(Precision + Recall), which balances the two when they are given equal importance. It is particularly useful when the costs of false positives and false negatives are both high. The F1 score also ranges from 0 to 1, where 1 indicates perfect precision and recall.
5. Intersection over Union (IoU): also known as the Jaccard index, IoU = TP/(TP + FP + FN) is the ratio of the intersection to the union of the predicted and true labels. An IoU of 1 indicates a perfect prediction in which the predicted labels coincide completely with the ground truth, while an IoU of 0 signifies no overlap at all.
6. Overall Performance (OP): to synthesize the insights provided by the individual metrics into a single indicator, we introduce the OP index, computed as the sum of the scores of the five aforementioned metrics, each of which already lies on a common [0, 1] scale: OP = Accuracy + Precision + Recall + F1 + IoU. The higher the OP score, the better the model performs across all key aspects of binary classification. A sketch of computing these metrics is given after this list.
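The following is a minimal sketch of these definitions, assuming pixel-level binary masks; the function name and the zero-division guard are our own. Summing the five scores reproduces the OP column of the results tables, e.g., for the E + F combination: 0.9941 + 0.9391 + 0.9201 + 0.9295 + 0.8684 = 4.6512.

```python
import numpy as np

def convection_metrics(pred, truth):
    """Pixel-level metrics for binary masks (1 = severe convective cloud)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    tp = np.sum(pred & truth)    # true positives
    tn = np.sum(~pred & ~truth)  # true negatives
    fp = np.sum(pred & ~truth)   # false positives
    fn = np.sum(~pred & truth)   # false negatives
    eps = 1e-12  # guard against division by zero on empty masks
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    f1 = 2 * precision * recall / (precision + recall + eps)
    iou = tp / (tp + fp + fn + eps)
    op = accuracy + precision + recall + f1 + iou  # the OP index
    return {"accuracy": accuracy, "precision": precision, "recall": recall,
            "f1": f1, "iou": iou, "op": op}
```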
3.4. Training
3.5. Building the EF Network via Model Ensembling and Ablation Studies
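Matching the letter codes used in the results tables (E = ENet, ES = ESPNet, F = Fast-SCNN, I = ICNet, M = MobileNetV2), the EF network ensembles ENet and Fast-SCNN, the combination with the highest OP. The fusion rule is not reproduced in this outline; the sketch below uses softmax-probability averaging as one plausible choice (hard voting over per-model class maps is another), and the function name is ours.

```python
import torch

@torch.no_grad()
def ensemble_predict(models, x):
    """Fuse segmentation models by averaging their softmax probability maps.

    Each model maps a (B, C, H, W) batch to (B, num_classes, H, W) logits.
    """
    probs = torch.stack([torch.softmax(m(x), dim=1) for m in models])
    return probs.mean(dim=0).argmax(dim=1)  # (B, H, W) fused class map
```

A call such as ensemble_predict([enet, fast_scnn], batch), with enet and fast_scnn standing in for trained member networks, yields one fused mask per scene.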
3.6. Parallel Computing
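The parallelization details are likewise not spelled out in this outline. As one illustrative PyTorch pattern, the sketch below wraps a throwaway stand-in for a lightweight segmenter in nn.DataParallel so that each forward pass is split across available GPUs; every name and shape here is a placeholder.

```python
import torch
from torch import nn

# Stand-in for a lightweight segmentation network (6 input channels,
# 2 output classes: severe convective cloud vs. background).
model = nn.Sequential(
    nn.Conv2d(6, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 2, 1),
)
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)  # replicate across GPUs, split each batch
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

x = torch.randn(8, 6, 256, 256, device=device)  # dummy 6-channel batch
logits = model(x)                               # forward pass runs in parallel
```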
4. Results
4.1. Qualitative Analysis
4.2. Performance Metrics Analysis
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Channel Type | Band | Center Wavelength (μm) | Spectral Range (μm) | Spatial Resolution (km) | Main Applications |
|---|---|---|---|---|---|
| VIS/NIR | 1 | 0.47 | 0.45–0.49 | 1 | Small-particle aerosols, true-color synthesis |
| VIS/NIR | 2 | 0.65 | 0.55–0.75 | 0.5 | Vegetation, image navigation and registration, stellar observations |
| VIS/NIR | 3 | 0.825 | 0.75–0.90 | 1 | Vegetation, aerosols over water surfaces |
| Shortwave IR | 4 | 1.379 | 1.371–1.386 | 2 | Cirrus clouds |
| Shortwave IR | 5 | 1.61 | 1.58–1.64 | 2 | Low cloud/snow identification, water/ice cloud discrimination |
| Shortwave IR | 6 | 2.25 | 2.10–2.35 | 2 | Cirrus, aerosols, particle size |
| Midwave IR | 7 | 3.75 | 3.50–4.0 | 2 | Clouds and high-albedo targets, fire spots |
| Midwave IR | 8 | 3.75 | 3.50–4.0 | 4 | Low-albedo targets, surface |
| Water vapor | 9 | 6.25 | 5.80–6.70 | 4 | Upper-level water vapor |
| Water vapor | 10 | 6.95 | 6.75–7.15 | 4 | Mid-level water vapor |
| Water vapor | 11 | 7.42 | 7.24–7.60 | 4 | Low-level water vapor |
| Longwave IR | 12 | 8.55 | 8.3–8.8 | 4 | Clouds |
| Longwave IR | 13 | 10.80 | 10.30–11.30 | 4 | Clouds, surface temperature, etc. |
| Longwave IR | 14 | 12.00 | 11.50–12.50 | 4 | Clouds, total water vapor, surface temperature |
| Longwave IR | 15 | 13.3 | 13.00–13.60 | 4 | Clouds, water vapor |
Model | Avg Train Time (s/Epoch) | Avg Val Time (s/Epoch) | Number of Parameters | Model Size (MB) | Batch Size | Initial Learning Rate |
---|---|---|---|---|---|---|
ENet | 75.21 | 13.63 | 214,465 | 0.95 | 8 | 0.001 |
ESPNet | 26.80 | 12.99 | 103,784 | 0.42 | 8 | 0.001 |
Fast-SCNN | 118.45 | 13.52 | 1,112,350 | 4.12 | 8 | 0.0005 |
ICNet | 21.11 | 13.43 | 3741 | 0.02 | 4 | 0.0005 |
MobileNetV2 | 221.24 | 15.65 | 3,363,602 | 11.39 | 16 | 0.0005 |
DeepLabV3 | 536.48 | 28.91 | 41,999,209 | 160.56 | 4 | 0.0001 |
U-Net | 483.16 | 18.57 | 7,383,234 | 28.22 | 4 | 0.001 |
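As an illustration only, the sketch below wires one row of the configuration table above (ENet: batch size 8, initial learning rate 0.001) into a PyTorch training loop, assuming the Adam optimizer and a pixel-wise cross-entropy loss; the one-layer model and random tensors are placeholders for the real network and the FY-4B dataset.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

model = nn.Conv2d(6, 2, 3, padding=1)                       # placeholder net
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)  # ENet row above
criterion = nn.CrossEntropyLoss()                           # pixel-wise CE
dataset = TensorDataset(torch.randn(32, 6, 64, 64),         # fake patches
                        torch.randint(0, 2, (32, 64, 64)))  # fake masks
loader = DataLoader(dataset, batch_size=8, shuffle=True)    # batch size 8

for x, y in loader:  # one epoch over the toy dataset
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```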
Model Combination | Test Time (s) | Test Loss | Accuracy | Precision | Recall | F1 | IoU | OP |
---|---|---|---|---|---|---|---|---|
E + F | 18.65 | 0.0156 | 0.9941 | 0.9391 | 0.9201 | 0.9295 | 0.8684 | 4.6512 |
E + F + M | 25.27 | 0.0202 | 0.994 | 0.9442 | 0.9122 | 0.9279 | 0.8655 | 4.6438 |
E + ES + F + M | 24.95 | 0.0258 | 0.9939 | 0.9606 | 0.8911 | 0.9246 | 0.8597 | 4.6299 |
E + ES + F | 19.63 | 0.0215 | 0.9938 | 0.959 | 0.89 | 0.9232 | 0.8574 | 4.6234 |
ES + F + M | 21.24 | 0.0215 | 0.9935 | 0.9538 | 0.8891 | 0.9203 | 0.8524 | 4.6105 |
E + F + I + M | 25.05 | 0.0258 | 0.9936 | 0.9628 | 0.8814 | 0.9203 | 0.8524 | 4.6091 |
E + M | 19.51 | 0.0184 | 0.9935 | 0.9553 | 0.8869 | 0.9199 | 0.8516 | 4.6072 |
ES + F | 17.23 | 0.0176 | 0.9933 | 0.9511 | 0.8871 | 0.918 | 0.8485 | 4.5983 |
E + F + I | 20.76 | 0.0215 | 0.9934 | 0.9623 | 0.877 | 0.9177 | 0.8479 | 4.598 |
E + ES + F + I + M | 25.76 | 0.0337 | 0.9932 | 0.9704 | 0.8659 | 0.9151 | 0.8436 | 4.5882 |
F + I + M | 21.56 | 0.0228 | 0.9928 | 0.9538 | 0.8726 | 0.9114 | 0.8372 | 4.5678 |
F + M | 20.73 | 0.0203 | 0.9925 | 0.9083 | 0.9139 | 0.9111 | 0.8367 | 4.5666 |
E + ES + M | 22.44 | 0.0265 | 0.9929 | 0.9673 | 0.8601 | 0.9105 | 0.8358 | 4.5646 |
E | 15.29 | 0.0173 | 0.9927 | 0.9437 | 0.8794 | 0.9105 | 0.8356 | 4.5625 |
E + ES + F + I | 20.67 | 0.03 | 0.9928 | 0.9698 | 0.8571 | 0.91 | 0.8349 | 4.5619 |
ES + F + I + M | 23.18 | 0.029 | 0.9928 | 0.9694 | 0.856 | 0.9092 | 0.8335 | 4.5609 |
F + I | 16.98 | 0.0199 | 0.9922 | 0.9503 | 0.8591 | 0.9024 | 0.8222 | 4.5281 |
E + I + M | 19.28 | 0.0271 | 0.9923 | 0.9712 | 0.8416 | 0.9018 | 0.8212 | 4.5262 |
ES + F + I | 18.09 | 0.0259 | 0.9921 | 0.9694 | 0.8394 | 0.8997 | 0.8177 | 4.5183 |
ES + M | 17.15 | 0.0225 | 0.992 | 0.9656 | 0.8409 | 0.899 | 0.8165 | 4.514 |
F | 18.37 | 0.0208 | 0.9912 | 0.8686 | 0.9313 | 0.8989 | 0.8164 | 4.5064 |
M | 16.72 | 0.0199 | 0.9915 | 0.9194 | 0.8762 | 0.8973 | 0.8137 | 4.4988 |
E + ES | 16.31 | 0.0252 | 0.9917 | 0.9612 | 0.8383 | 0.8956 | 0.8109 | 4.4981 |
E + ES + I + M | 20.59 | 0.037 | 0.9918 | 0.9752 | 0.8269 | 0.895 | 0.8099 | 4.4977 |
E + I | 15.86 | 0.0271 | 0.9905 | 0.9662 | 0.803 | 0.8771 | 0.7811 | 4.4179 |
ES + I + M | 17.93 | 0.0338 | 0.9905 | 0.9771 | 0.7937 | 0.8759 | 0.7792 | 4.4164 |
I + M | 17.21 | 0.0257 | 0.9904 | 0.9679 | 0.7982 | 0.8749 | 0.7777 | 4.4091 |
E + ES + I | 17.87 | 0.0373 | 0.9903 | 0.9709 | 0.793 | 0.873 | 0.7746 | 4.4018 |
ES | 16.39 | 0.0272 | 0.9884 | 0.9502 | 0.7659 | 0.8481 | 0.7363 | 4.2889 |
ES + I | 14.51 | 0.0402 | 0.9863 | 0.9684 | 0.6979 | 0.8112 | 0.6824 | 4.1462 |
I | 11.95 | 0.0464 | 0.9786 | 0.9156 | 0.5442 | 0.6826 | 0.5182 | 3.6392 |
Model Combination | Test Time (s) | Test Loss | Accuracy | Precision | Recall | F1 | IoU | OP |
---|---|---|---|---|---|---|---|---|
E + F | 18.65 | 0.0156 | 0.9941 | 0.9391 | 0.9201 | 0.9295 | 0.8684 | 4.6512 |
E | 15.29 | 0.0173 | 0.9927 | 0.9437 | 0.8794 | 0.9105 | 0.8356 | 4.5625 |
U-Net | 20.41 | 0.0173 | 0.9926 | 0.955 | 0.854 | 0.9017 | 0.8209 | 4.5242 |
F | 18.37 | 0.0208 | 0.9912 | 0.8686 | 0.9313 | 0.8989 | 0.8164 | 4.5064 |
M | 16.72 | 0.0199 | 0.9915 | 0.9194 | 0.8762 | 0.8973 | 0.8137 | 4.4988 |
DeepLabV3 | 43.15 | 0.0256 | 0.9912 | 0.9412 | 0.8323 | 0.8834 | 0.7911 | 4.4392 |
ES | 16.39 | 0.0272 | 0.9884 | 0.9502 | 0.7659 | 0.8481 | 0.7363 | 4.2889 |
I | 11.95 | 0.0464 | 0.9786 | 0.9156 | 0.5442 | 0.6826 | 0.5182 | 3.6392 |