Two-Phase Flow Pattern Identification by Embedding Double Attention Mechanisms into a Convolutional Neural Network
Abstract
1. Introduction
2. Methodology
2.1. Convolutional Neural Network: ResNet50
2.2. Attention Mechanism
2.2.1. CBAM Attention Mechanism
2.2.2. ECA Mechanism
2.3. Principle of CBAM-ECA-ResNet50
3. Validation of CBAM-ECA-ResNet50
3.1. Dataset and Data Augmentation
3.2. Model Training
3.3. Results and Model Performance Analysis
- (1)
- (2) For annular flow, one image is often misidentified as slug flow: it contains many bubbles, so its phase interface resembles that of slug flow. However, both CBAM-ECA-ResNet50 and ECA-ECA-ResNet50 classify this difficult annular flow image correctly, and their recall for annular flow reaches 100%. This shows that introducing the attention mechanisms strengthens feature extraction and improves identification performance, allowing these two models to classify the more difficult images correctly.
- (3) In the comparison experiments, the misclassifications are concentrated between slug flow and dense bubbly flow, because the slug flows in the dataset contain gas slugs of different sizes and the images can be filled with foggy bubbles covering different areas, which closely resembles dense bubbly flow. Figure 14a shows that CBAM-ECA-ResNet50 misclassifies only slug flow images, giving a slug flow recall of 98.90%; the other three flow patterns are classified correctly with a recall of 100%. Of the 365 slug flow images, one is misclassified as annular flow, so the precision of CBAM-ECA-ResNet50 for annular flow is 99.59%; in that image the gas slug is so large that the instantaneous void fraction of the pipe increases and a long liquid film forms on the pipe wall, making the image features resemble annular flow. In addition, three slug flow images are misclassified as dense bubbly flow. These results show that CBAM-ECA-ResNet50 achieves the best identification performance for gas–liquid two-phase flow pattern images among the compared models.
- (4) Precision, recall and F1 score are the key indicators for evaluating how well the model identifies each class of labeled data. ResNet50-CBAM has the lowest average precision, recall and F1 score: 97.64%, 98.35% and 97.95%, respectively. CBAM-ECA-ResNet50 has the highest: 99.54%, 99.73% and 99.63%, respectively. In a comprehensive comparison, the two attention modules in CBAM-ECA-ResNet50 amplify the important information in the feature maps, further optimizing the model's iterative learning; both the overall identification accuracy and the average per-class metrics improve, so this model has the best comprehensive performance among the compared models. A sketch of how these per-class indicators can be computed from a confusion matrix follows this list.
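As a cross-check of the per-class indicators in points (2)–(4), the sketch below computes precision, recall and F1 score from a confusion matrix. The matrix shown is a reconstruction consistent with the misclassification counts described above (one slug flow image confused with annular flow, three with dense bubbly flow), not a figure taken from the paper, and the class order is an assumption.

```python
import numpy as np

# Class order is an assumption for illustration.
CLASSES = ["annular", "sparse bubbly", "dense bubbly", "slug"]

def per_class_metrics(cm: np.ndarray):
    """Precision, recall and F1 per class from a confusion matrix
    whose rows are true labels and columns are predicted labels."""
    tp = np.diag(cm).astype(float)
    precision = tp / np.clip(cm.sum(axis=0), 1, None)  # column sums = predicted counts
    recall = tp / np.clip(cm.sum(axis=1), 1, None)     # row sums = true counts
    f1 = 2 * precision * recall / np.clip(precision + recall, 1e-12, None)
    return precision, recall, f1

# Reconstructed matrix consistent with the counts described above;
# an illustration, not a figure from the paper.
cm = np.array([
    [242,   0,   0,   0],
    [  0, 245,   0,   0],
    [  0,   0, 204,   0],
    [  1,   0,   3, 361],
])

p, r, f1 = per_class_metrics(cm)
for name, pi, ri, fi in zip(CLASSES, p, r, f1):
    print(f"{name:>13s}: precision={pi:.4f}  recall={ri:.4f}  F1={fi:.4f}")
print(f"{'macro avg':>13s}: precision={p.mean():.4f}  recall={r.mean():.4f}  F1={f1.mean():.4f}")
```

The macro averages printed by this sketch agree with the averaged values in the per-class table below up to rounding.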
3.4. Applicability Analysis
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Part | Layer Name | Output Size | Operator | Channels |
---|---|---|---|---|
1 | Conv | 112 × 112 | conv7 × 7 | 64 |
2 | Max Pooling | 112 × 112 | max pool3 × 3 | - |
3 | CBAM | 112 × 112 | channel attention module, spatial attention module | 64, 64
4 | Conv Stage 1 | 56 × 56 | bottleneck (conv1 × 1, conv3 × 3, conv1 × 1) × 3 | 64, 64, 256
5 | Conv Stage 2 | 28 × 28 | bottleneck (conv1 × 1, conv3 × 3, conv1 × 1) × 4 | 128, 128, 512
6 | Conv Stage 3 | 14 × 14 | bottleneck (conv1 × 1, conv3 × 3, conv1 × 1) × 6 | 256, 256, 1024
7 | Conv Stage 4 | 7 × 7 | bottleneck (conv1 × 1, conv3 × 3, conv1 × 1) × 3 | 512, 512, 2048
8 | ECA | 7 × 7 | ECA attention module | 2048 |
9 | Global Average Pooling | 1 × 1 | global average pool | - |
10 | Fully Connected | 1 × 1 | fully connected | 4 |
11 | Output | - | - | -
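The placement summarized in this table (CBAM on the 64-channel feature map after the initial convolution and max pooling, ECA on the 2048-channel feature map after the last convolution stage, then global average pooling and a 4-way fully connected layer) can be expressed compactly in PyTorch. The sketch below follows the standard CBAM and ECA formulations; the reduction ratio, the ECA kernel size and the use of `torchvision`'s `resnet50` constructor (recent torchvision, `weights=None`) are assumptions rather than settings confirmed by the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet50


class ChannelAttention(nn.Module):
    """CBAM channel attention: shared MLP over average- and max-pooled descriptors."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )

    def forward(self, x):
        avg = self.mlp(F.adaptive_avg_pool2d(x, 1))
        mx = self.mlp(F.adaptive_max_pool2d(x, 1))
        return x * torch.sigmoid(avg + mx)


class SpatialAttention(nn.Module):
    """CBAM spatial attention: 7x7 convolution over channel-wise average and max maps."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))


class CBAM(nn.Module):
    """Channel attention followed by spatial attention."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention()

    def forward(self, x):
        return self.sa(self.ca(x))


class ECA(nn.Module):
    """Efficient channel attention: 1D convolution across the pooled channel descriptor."""
    def __init__(self, kernel_size: int = 5):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        y = F.adaptive_avg_pool2d(x, 1)               # (N, C, 1, 1)
        y = self.conv(y.squeeze(-1).transpose(1, 2))  # (N, 1, C)
        y = torch.sigmoid(y.transpose(1, 2).unsqueeze(-1))
        return x * y


class CBAMECAResNet50(nn.Module):
    """ResNet50 backbone with CBAM after the stem and ECA after the last stage."""
    def __init__(self, num_classes: int = 4):
        super().__init__()
        backbone = resnet50(weights=None)
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1, backbone.relu, backbone.maxpool)
        self.cbam = CBAM(64)       # 64 channels after the stem
        self.stages = nn.Sequential(backbone.layer1, backbone.layer2,
                                    backbone.layer3, backbone.layer4)
        self.eca = ECA()           # 2048 channels after Conv Stage 4
        self.gap = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(2048, num_classes)

    def forward(self, x):
        x = self.eca(self.stages(self.cbam(self.stem(x))))
        return self.fc(torch.flatten(self.gap(x), 1))


if __name__ == "__main__":
    model = CBAMECAResNet50()
    print(model(torch.randn(1, 3, 224, 224)).shape)   # torch.Size([1, 4])
```

A forward pass on a 224 × 224 × 3 input yields four outputs, one logit per flow pattern.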
Classification | Original Dataset | Training Dataset | Validation Dataset |
---|---|---|---|
Annular flow | 807 | 565 | 242 |
Sparse bubbly flow | 818 | 573 | 245 |
Dense bubbly flow | 681 | 477 | 204 |
Slug flow | 1216 | 851 | 365 |
Total | 3522 | 2466 | 1056 |
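The training and validation columns correspond to roughly a 70/30 split of the original dataset. Below is a minimal sketch of such a split, assuming the images are stored in one folder per flow pattern; the directory name, resize step and random seed are illustrative, not details from the paper.

```python
import torch
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, transforms

# Illustrative preprocessing; the paper's exact augmentation pipeline may differ.
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical directory with one subfolder per flow pattern
# (annular, sparse bubbly, dense bubbly, slug).
full_set = datasets.ImageFolder("flow_pattern_images", transform=transform)

# Roughly 70/30 train/validation split, as in the table above.
n_train = int(0.7 * len(full_set))
train_set, val_set = random_split(
    full_set,
    [n_train, len(full_set) - n_train],
    generator=torch.Generator().manual_seed(0),
)

train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
val_loader = DataLoader(val_set, batch_size=32, shuffle=False)
```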
Parameter | Value
---|---
Optimizer | SGDM |
Momentum | 0.9 |
Weight decay | 0.0001 |
Initial learning rate | 0.1 |
Learning rate decay step | 30/60/90 (Epoch) |
Decay rate | 0.1 |
Batch size | 32 |
Max epoch | 100 |
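These hyperparameters map directly onto a standard PyTorch training setup: SGD with momentum 0.9 and weight decay 1e-4, an initial learning rate of 0.1 decayed by a factor of 0.1 at epochs 30, 60 and 90, and 100 epochs with batch size 32. The sketch below reuses `model` and `train_loader` from the earlier sketches; the cross-entropy loss is an assumption, as the table does not state the loss function.

```python
import torch

# `model` and `train_loader` are assumed to come from the sketches above
# (e.g. CBAMECAResNet50 and the 70/30 split loader).
optimizer = torch.optim.SGD(
    model.parameters(),
    lr=0.1,             # initial learning rate
    momentum=0.9,       # SGDM momentum
    weight_decay=1e-4,  # weight decay
)

# Learning rate decays by a factor of 0.1 at epochs 30, 60 and 90.
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[30, 60, 90], gamma=0.1)

# Cross-entropy loss is an assumption; the table does not state the loss function.
criterion = torch.nn.CrossEntropyLoss()

for epoch in range(100):                   # max epoch = 100
    model.train()
    for images, labels in train_loader:    # batch size 32
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    scheduler.step()
```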
Model Name | Number of Attention Modules | Simplified Structure of the Model |
---|---|---|
ECA-CBAM-ResNet50 | 2 |
CBAM-CBAM-ResNet50 | 2 |
ECA-ECA-ResNet50 | 2 |
CBAM-ResNet50 | 1 |
ECA-ResNet50 | 1 |
ResNet50-CBAM | 1 |
ResNet50-ECA | 1 |
Models | Image Size | Batch Size | Memory/MB | Loss | Validation Accuracy/%
---|---|---|---|---|---|
CBAM-ECA-ResNet50 | 224 | 32 | 3014 | 0.05197 | 99.62 (↑) |
ECA-CBAM-ResNet50 | 224 | 32 | 2993 | 0.11292 | 98.48 (↓) |
CBAM-CBAM-ResNet50 | 224 | 32 | 3020 | 0.11136 | 98.48 (↓) |
ECA-ECA-ResNet50 | 224 | 32 | 2988 | 0.06250 | 99.43 (↑) |
CBAM-ResNet50 | 224 | 32 | 3018 | 0.09436 | 98.67 (↓) |
ECA-ResNet50 | 224 | 32 | 2992 | 0.08481 | 99.15 (↑) |
ResNet50-CBAM | 224 | 32 | 2969 | 0.09603 | 98.67 (↓) |
ResNet50-ECA | 224 | 32 | 2963 | 0.06239 | 99.34 (↑) |
ResNet50 | 224 | 32 | 2967 | 0.10124 | 98.96 |
Per-class precision, recall and F1 score (all in %); AF = annular flow, SBF = sparse bubbly flow, DBF = dense bubbly flow, SF = slug flow, AVG = average:

Models | Precision AF | Precision SBF | Precision DBF | Precision SF | Precision AVG | Recall AF | Recall SBF | Recall DBF | Recall SF | Recall AVG | F1 AF | F1 SBF | F1 DBF | F1 SF | F1 AVG
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
CBAM-ECA-ResNet50 | 99.59 | 100 | 98.55 | 100 | 99.54 | 100 | 100 | 100 | 98.90 | 99.73 | 99.79 | 100 | 99.27 | 99.45 | 99.63 |
ECA-CBAM-ResNet50 | 98.77 | 100 | 97.06 | 98.07 | 98.48 | 99.59 | 100 | 97.06 | 97.53 | 98.55 | 99.18 | 100 | 97.06 | 97.80 | 98.51 |
CBAM-CBAM-ResNet50 | 99.59 | 100 | 95.26 | 98.61 | 98.37 | 99.17 | 100 | 98.53 | 96.99 | 98.67 | 99.38 | 100 | 96.87 | 97.80 | 98.51 |
ECA-ECA-ResNet50 | 99.59 | 100 | 98.07 | 99.72 | 99.35 | 100 | 100 | 99.51 | 98.63 | 99.54 | 99.79 | 100 | 98.78 | 99.17 | 99.44 |
CBAM-ResNet50 | 100 | 100 | 94.01 | 99.72 | 98.43 | 99.59 | 100 | 100 | 96.44 | 99.01 | 99.79 | 100 | 96.91 | 98.05 | 98.69 |
ECA-ResNet50 | 99.59 | 100 | 97.13 | 99.44 | 99.04 | 99.59 | 100 | 99.51 | 98.08 | 99.30 | 99.59 | 100 | 98.31 | 98.76 | 99.16 |
ResNet50-CBAM | 99.59 | 100 | 91.82 | 99.14 | 97.64 | 99.59 | 100 | 99.02 | 94.79 | 98.35 | 99.59 | 100 | 95.28 | 96.92 | 97.95 |
ResNet50-ECA | 99.59 | 100 | 98.07 | 99.45 | 99.28 | 99.59 | 100 | 99.51 | 98.63 | 99.43 | 99.59 | 100 | 98.78 | 99.04 | 99.35 |
ResNet50 | 100 | 100 | 96.19 | 99.17 | 98.84 | 99.59 | 100 | 99.02 | 97.81 | 99.11 | 99.79 | 100 | 97.58 | 98.49 | 98.97 |
Models | Input Shape | Batch Size | Model Size | FLOPs | Validation Accuracy/%
---|---|---|---|---|---|
CBAM-ECA-ResNet50 | (224, 224, 3) | 32 | 23.521 M | 4.140 G | 99.62 (↑) |
CBAM-SE-ResNet50 | (224, 224, 3) | 32 | 24.045 M | 4.141 G | 99.43 (↑) |
CBAM-BAM-ResNet50 | (224, 224, 3) | 32 | 24.787 M | 4.159 G | 97.92 (↓) |
CBAM-Shuffle-ResNet50 | (224, 224, 3) | 32 | 23.521 M | 4.140 G | 98.67 (↓) |
CBAM-CoT-ResNet50 | (224, 224, 3) | 32 | 60.250 M | 5.940 G | 99.15 (↑) |
Parameter quantity (upper row per model) and proportion of the total (lower row) for each part of the models:

Models | Conv + Max Pooling | Conv Stage 1 | Conv Stage 2 | Conv Stage 3 | Conv Stage 4 | Attention Modules | GAP + FC
---|---|---|---|---|---|---|---
CBAM-ECA-ResNet50 | 0.009 M | 0.216 M | 1.220 M | 7.098 M | 14.965 M | 0.005 M | 0.008 M
(proportion) | 0.038% | 0.918% | 5.187% | 30.177% | 63.624% | 0.021% | 0.034%
CBAM-SE-ResNet50 | 0.009 M | 0.216 M | 1.220 M | 7.098 M | 14.965 M | 0.529 M | 0.008 M
(proportion) | 0.037% | 0.898% | 5.074% | 29.520% | 62.237% | 2.200% | 0.033%
CBAM-BAM-ResNet50 | 0.009 M | 0.216 M | 1.220 M | 7.098 M | 14.965 M | 1.271 M | 0.008 M
(proportion) | 0.036% | 0.871% | 4.922% | 28.636% | 60.374% | 5.128% | 0.032%
CBAM-Shuffle-ResNet50 | 0.009 M | 0.216 M | 1.220 M | 7.098 M | 14.965 M | 0.005 M | 0.008 M
(proportion) | 0.038% | 0.918% | 5.187% | 30.177% | 63.624% | 0.021% | 0.034%
CBAM-CoT-ResNet50 | 0.009 M | 0.216 M | 1.220 M | 7.098 M | 14.965 M | 36.734 M | 0.008 M
(proportion) | 0.015% | 0.359% | 2.025% | 11.781% | 24.838% | 60.969% | 0.013%
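Per-part parameter counts and proportions such as those above can be obtained by grouping a model's parameters by top-level submodule. The sketch below does this for the `CBAMECAResNet50` class from the earlier architecture sketch; the grouping therefore follows that sketch's module names (stem, cbam, stages, eca, gap, fc) rather than the exact column layout of this table.

```python
import torch

def parameter_breakdown(model: torch.nn.Module):
    """Count parameters per top-level child module and print their share of the total."""
    counts = {name: sum(p.numel() for p in child.parameters())
              for name, child in model.named_children()}
    total = sum(counts.values())
    for name, n in counts.items():
        print(f"{name:>8s}: {n / 1e6:7.3f} M  ({100 * n / total:6.3f}%)")
    print(f"{'total':>8s}: {total / 1e6:7.3f} M")

# Uses the CBAMECAResNet50 class defined in the architecture sketch above.
parameter_breakdown(CBAMECAResNet50())
```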
FLOPs (upper row per model) and proportion of the total (lower row) for each part of the models:

Models | Conv + Max Pooling | Conv Stage 1 | Conv Stage 2 | Conv Stage 3 | Conv Stage 4 | Attention Modules | GAP + FC
---|---|---|---|---|---|---|---
CBAM-ECA-ResNet50 | 0.122 G | 0.680 G | 1.037 G | 1.471 G | 0.811 G | 0.015 G | 0.004 G
(proportion) | 2.947% | 16.425% | 25.048% | 35.531% | 19.589% | 0.362% | 0.097%
CBAM-SE-ResNet50 | 0.122 G | 0.680 G | 1.037 G | 1.471 G | 0.811 G | 0.016 G | 0.004 G
(proportion) | 2.946% | 16.421% | 25.042% | 35.523% | 19.585% | 0.386% | 0.097%
CBAM-BAM-ResNet50 | 0.122 G | 0.680 G | 1.037 G | 1.471 G | 0.811 G | 0.034 G | 0.004 G
(proportion) | 2.933% | 16.350% | 24.934% | 35.369% | 19.500% | 0.818% | 0.096%
CBAM-Shuffle-ResNet50 | 0.122 G | 0.680 G | 1.037 G | 1.471 G | 0.811 G | 0.015 G | 0.004 G
(proportion) | 2.947% | 16.425% | 25.048% | 35.531% | 19.589% | 0.362% | 0.097%
CBAM-CoT-ResNet50 | 0.122 G | 0.680 G | 1.037 G | 1.471 G | 0.811 G | 1.815 G | 0.004 G
(proportion) | 2.054% | 11.448% | 17.458% | 24.764% | 13.653% | 30.556% | 0.067%