Future Definition and Extraction of the Blast Furnace 3D Burden Surface Based on Intelligent Algorithms
Abstract
1. Introduction
2. 3D Burden Surface Feature Parameter Definition
2.1. Phased Array Radar
2.2. 3D Burden Surface Feature Parameter Definition
3. 3D Burden Surface Feature Extraction
3.1. Voxel Feature Extraction by BNVGG
3.2. Point Cloud Feature Extraction
3.2.1. PointNet
1. It uses a max pooling layer as a symmetric function that aggregates information from all points. The high-dimensional features extracted from the point cloud are aggregated by this symmetric function, which finally approximates a function defined on the point set:

   $$f(\{x_1, \dots, x_n\}) \approx g(h(x_1), \dots, h(x_n))$$

   Here, $h$ is simulated by a multilayer perceptron (MLP), which is combined with the max pooling function to simulate $g$. With a collection of such $h$, the network can learn $f$ to capture different properties of the point cloud data.
2. The structure combines local and global features. After computing the global feature of the point cloud, the global feature is concatenated with the local features of each point to form a new feature matrix, and new per-point features are then extracted from this matrix, so that both local and global information is considered for every point.
3. Two joint alignment networks rotate and align the points and features in the point cloud data. PointNet uses a T-net to learn the structure of the point cloud data and obtain a spatial transformation matrix. This method can be extended to feature-space alignment: another T-net is added after the convolutional layer to learn the feature structure of that layer and spatially transform the current high-dimensional features. However, the dimensionality of these features is much higher than that of the point cloud data, which increases computational complexity and makes the optimization hard to converge. Since an orthogonal transformation does not lose information, PointNet adds an orthogonality term for the transformation matrix to the final loss function:

   $$L_{reg} = \left\| I - AA^{T} \right\|_F^2$$

   Here, $A$ is the feature-space transformation matrix learned by the T-net; this term encourages $A$ to stay close to orthogonal, which has the effect of aligning features.
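The two core ideas above, a permutation-invariant max-pooled global feature and the orthogonality regularizer, can be sketched in a few lines of numpy. This is a toy illustration, not the authors' implementation: the single affine-plus-ReLU layer standing in for the MLP $h$ and all shapes are illustrative.

```python
import numpy as np

def shared_mlp(points, W, b):
    # The same per-point map h applied to every point independently
    # (here a single affine layer with ReLU, standing in for an MLP).
    return np.maximum(points @ W + b, 0.0)

def pointnet_global_feature(points, W, b):
    # g = max pooling over the point axis: a symmetric function, so the
    # global feature is invariant to any permutation of the input points.
    return shared_mlp(points, W, b).max(axis=0)

def orthogonality_loss(A):
    # ||I - A A^T||_F^2, the regularizer PointNet adds so the learned
    # feature-space transformation stays close to an orthogonal matrix.
    I = np.eye(A.shape[0])
    return np.linalg.norm(I - A @ A.T, ord="fro") ** 2
```

Because the max is taken over the point axis, shuffling the rows of `points` leaves the global feature unchanged, which is exactly why the network can consume unordered point clouds.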
3.2.2. PointCNN
1. Subtract the coordinates of point p from the coordinates of each neighboring point to obtain the relative coordinates of the points.
2. Convert the location information of the neighboring points into feature vectors through an MLP.
3. Splice the transformed features with the points' own features to obtain new features.
4. Calculate the 𝒳-matrix through the network, corresponding to the current input.
5. Use the 𝒳-matrix to process the feature matrix in a specific order.
6. Perform a convolution operation to obtain the feature of the point set around p.
Algorithm 1: The 𝒳-Conv operator.

Input: K, p, P, F
Output: F_p ▹ Features "projected", or "aggregated", into representative point p
1: P′ ← P − p ▹ Move P to the local coordinate system of p
2: F_δ ← MLP_δ(P′) ▹ Individually lift each point into C_δ-dimensional space
3: F_∗ ← [F_δ, F] ▹ Concatenate F_δ and F; F_∗ is a K × (C_δ + C₁) matrix
4: 𝒳 ← MLP(P′) ▹ Learn the K × K 𝒳-transformation matrix
5: F_𝒳 ← 𝒳 × F_∗ ▹ Weight and permute F_∗ with the learned 𝒳
6: F_p ← Conv(K, F_𝒳) ▹ Finally, typical convolution between K and F_𝒳
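The steps of the 𝒳-Conv operator can be sketched in numpy as follows. This is a shape-level illustration under assumed conventions, not the PointCNN implementation: the MLPs are passed in as plain callables, and the final "convolution" is written as a flattened weighted sum over the K × (C_δ + C₁) window.

```python
import numpy as np

def x_conv(p, P, F, mlp_delta, mlp_x, K_w):
    """One X-Conv step (illustrative shapes).

    p: (3,) representative point; P: (K, 3) neighbor coordinates;
    F: (K, C1) neighbor features; K_w: (K * (C_delta + C1), C_out) kernel.
    """
    K = P.shape[0]
    P_local = P - p                                 # step 1: local coordinates of p
    F_delta = mlp_delta(P_local)                    # step 2: lift points to C_delta dims
    F_star = np.concatenate([F_delta, F], axis=1)   # step 3: K x (C_delta + C1)
    X = mlp_x(P_local).reshape(K, K)                # step 4: learn the K x K X-matrix
    F_X = X @ F_star                                # step 5: weight and permute F_star
    return F_X.reshape(-1) @ K_w                    # step 6: conv as a weighted sum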
1. Neighbor sampling: The KNN algorithm finds the nearest neighbors of each representative point q, and sampling at interval D decimates them to obtain the nearest-neighbor point set matrix.
2. Decentralization: Subtract the corresponding representative point coordinates from each set of neighbor coordinates to obtain the relative-coordinate matrix of the neighboring points.
3. Feature splicing: The relative-coordinate matrix is lifted to higher dimensions through two fully connected layers and spliced with the input feature vector set F to obtain a new set of features.
4. Transformation 𝒳-matrix calculation: The relative coordinates obtained in step 2 are processed, and the coordinate-related features are extracted to obtain a set of transformation 𝒳-matrices.
5. Transformation 𝒳-matrix application: The features extracted in step 3 are combined with the transformation 𝒳-matrix to obtain a new feature matrix.
6. Feature transformation: New features are extracted and elevated to the C dimension.
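Steps 1 and 2 above (dilated neighbor sampling followed by decentralization) can be sketched as below. This is an illustrative numpy version, assuming the K·D nearest candidates are found first and then decimated at interval D, in the style of PointCNN's dilated sampling; the function name and shapes are our own.

```python
import numpy as np

def dilated_knn_relative(q, points, K, D):
    """Return the relative-coordinate matrix of K dilated nearest neighbors.

    q: (3,) representative point; points: (N, 3) point cloud, N >= K * D.
    """
    dists = np.linalg.norm(points - q, axis=1)  # distance from q to every point
    idx = np.argsort(dists)[: K * D : D]        # interval-D sampling of the K*D nearest
    return points[idx] - q                      # decentralize: subtract q, shape (K, 3)
```

The dilation D enlarges the receptive field of each representative point without increasing K, which is why the sampling is decimated rather than simply truncated.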
4. Results
4.1. Feature Parameter Results
4.2. BNVGG Results
4.3. PointNet Results
4.4. PointCNN Results
5. Discussion
1. The 18 feature parameters based on expert experience can accurately classify burden surfaces with obvious differences, but they struggle to distinguish similar shapes; the test set accuracy was about 65.37%. They define the shape of the burden surface from a data perspective and have a certain reference value.
2. We extracted the voxel features of the burden surfaces with the optimized BNVGG network. The accuracy of burden surface recognition increased (95.67%), and the network training time was reduced (238 s). The comparative experiment verified the effectiveness of the network structure, which identified and classified the burden surface data well.
3. The two neural network models, PointNet (96.5%) and PointCNN (94.4%), both achieve good results on point cloud data. PointNet extracts and classifies global features, while PointCNN extracts both global and local features of the burden surface data. However, due to the particularity of the burden surface data, the local-feature part of PointCNN has little effect. For global feature extraction, PointNet performs better and is more suitable for feature extraction and classification of the burden surface.
- Increase the scale and the number of categories of the data before training. Greater data complexity better reveals the advantages and disadvantages of deep neural networks.
- Because the original data are difficult to obtain, a generative adversarial network could be used to learn the features of the original data and generate more realistic simulated data.
- For the three types of features extracted in this manuscript, feature fusion could be performed via an attention mechanism, and the fused features used to test the effect on burden surface recognition.
- Through effective fusion of the burden surface features and the furnace parameters, matching and fault diagnosis of new burden surface data could be achieved.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Zhang, H.; Sun, S.; Zhang, S. Precise Burden Charging Operation During Iron-Making Process in Blast Furnace. IEEE Access 2021, 9, 45655–45667.
2. Zhao, J.; Zuo, H.; Wang, Y.; Wang, J.; Xue, Q. Review of green and low-carbon ironmaking technology. Ironmak. Steelmak. 2020, 47, 296–306.
3. Zankl, D.; Schuster, S.; Feger, R.; Stelzer, A.; Scheiblhofer, S.; Schmid, C.M.; Ossberger, G.; Stegfellner, L.; Lengauer, G.; Feilmayr, C.; et al. BLASTDAR—A large radar sensor array system for blast furnace burden surface imaging. IEEE Sens. J. 2015, 15, 5893–5909.
4. Xu, T.; Chen, Z.; Jiang, Z.; Huang, J.; Gui, W. A real-time 3D measurement system for the blast furnace burden surface using high-temperature industrial endoscope. Sensors 2020, 20, 869.
5. An, J.Q.; Peng, K.; Cao, W.H.; Wu, M. Modeling of high temperature gas flow 3D distribution in BF throat based on the computational fluid dynamics. J. Adv. Comput. Intell. Intell. Inform. 2015, 19, 269–276.
6. Li, Y.; Zhang, S.; Zhang, J.; Yin, Y.; Xiao, W.; Zhang, Z. Data-driven multiobjective optimization for burden surface in blast furnace with feedback compensation. IEEE Trans. Ind. Inform. 2019, 16, 2233–2244.
7. Shi, Q.; Wu, J.; Ni, Z.; Lv, X.; Ye, F.; Hou, Q.; Chen, X. A blast furnace burden surface deep-learning detection system based on radar spectrum restructured by entropy weight. IEEE Sens. J. 2020, 21, 7928–7939.
8. Liu, Y.C. The Law of Blast Furnace. Metall. Ind. Press 2006, 25, 54–57.
9. Guan, X.; Yin, Y. Multi-point radar detection method to reconstruct the shape of blast furnace material line. Autom. Instrum. 2015, 36, 19–21.
10. Zhu, Q.; Lü, C.L.; Yin, Y.X.; Chen, X.Z. Burden distribution calculation of bell-less top of blast furnace based on multi-radar data. J. Iron Steel Res. Int. 2013, 20, 33–37.
11. Matsuzaki, S. Estimation of stack profile of burden at peripheral zone of blast furnace top. ISIJ Int. 2003, 43, 620–629.
12. Li, C.; An, M.; Gao, Z. Research and Practice of Burden Distribution in BF. Iron Steel Beijing 2006, 41, 6.
13. Zhang, H.; Zhang, S.; Yin, Y.; Zhang, X. Blast furnace material surface feature extraction and cluster analysis. Control Theory Appl. 2017, 34, 938–946.
14. Ding, X.; Zhang, X.; Ma, N.; Han, J.; Ding, G.; Sun, J. RepVGG: Making VGG-style convnets great again. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 13733–13742.
15. Li, Y.; Bu, R.; Sun, M.; Wu, W.; Di, X.; Chen, B. PointCNN: Convolution on 𝒳-transformed points. In Proceedings of the Advances in Neural Information Processing Systems, Montréal, QC, Canada, 3–8 December 2018; Volume 31.
16. Wei, J.D.; Chen, X.Z. Blast furnace gas flow strength prediction using FMCW radar. Trans. Iron Steel Inst. Jpn. 2015, 55, 600–604.
17. Kundu, C.; Patra, D.; Patra, P.; Tudu, B. Burden Profile Measurement System for Blast Furnaces Using Phased Array Radar. Int. J. Recent Eng. Res. Dev. 2021, 6, 24–47.
18. Sun, S.S.; Yu, Z.J.; Zhang, S.; Xiao, W.D.; Yang, Y.L. Reconstruction and classification of 3D burden surfaces based on two model drived data fusion. Expert Syst. Appl. 2022, 119406.
19. Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556.
20. Santurkar, S.; Tsipras, D.; Ilyas, A.; Madry, A. How does batch normalization help optimization? In Proceedings of the Advances in Neural Information Processing Systems, Montréal, QC, Canada, 3–8 December 2018; Volume 31.
21. Charles, R.Q.; Su, H.; Kaichun, M.; Guibas, L.J. PointNet: Deep learning on point sets for 3D classification and segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 652–660.
22. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.
23. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. Commun. ACM 2017, 60, 84–90.
24. Esfe, M.H.; Eftekhari, S.A.; Hekmatifar, M.; Toghraie, D. A well-trained artificial neural network for predicting the rheological behavior of MWCNT–Al2O3 (30–70%)/oil SAE40 hybrid nanofluid. Sci. Rep. 2021, 11, 17696.
25. Qing, H.; Hamedi, S.; Eftekhari, S.A.; Alizadeh, S.M.; Toghraie, D.; Hekmatifar, M.; Ahmed, A.N.; Khan, A. A well-trained feed-forward perceptron Artificial Neural Network (ANN) for prediction of the dynamic viscosity of Al2O3–MWCNT (40:60)-Oil SAE50 hybrid nano-lubricant at different volume fractions of nanoparticles, temperatures, and shear rates. Int. Commun. Heat Mass Transf. 2021, 128, 105624.
| Category | Normal Surface | Surface Center That Is Too High |
|---|---|---|
| Highest point of the central coke area | (−0.0149, 0.0603, 8.3764) | (−0.0078, 0.0371, 8.8423) |
| Center coke area width | 4.4337 | 3.5887 |
| Average height of central coke area | 7.3073 | 8.0187 |
| Lower funnel center | (−0.1030, −0.0579) | (−0.3332, 0.0131) |
| Average height of the lower part of the funnel | 6.2381 | 7.1951 |
| Funnel width | 7.0114 | 5.7969 |
| Upper funnel center | (−0.0665, −0.1135) | (−0.0396, −0.0030) |
| Average height of the upper part of the funnel | 7.1065 | 8.0944 |
| Average depth of funnel | 0.8684 | 0.8993 |
| Outer edge center | (0.0090, 0.1101) | (−0.0263, −0.0171) |
| Outer edge radius of burden surface | 6.1137 | 6.1334 |
| Average height of the outer edge of the burden surface | 5.986 | 6.0198 |
| Average width of the platform | 2.6135 | 3.235 |

| Category | Oversized Funnel | Oversized Central Coke Area |
|---|---|---|
| Highest point of the central coke area | (−0.0767, −0.0379, 7.3416) | (−0.0232, −0.0657, 8.8423) |
| Center coke area width | 4.1924 | 7.9719 |
| Average height of central coke area | 6.6158 | 7.1492 |
| Lower funnel center | (−0.0232, −0.0657) | (−0.0109, 0.1026) |
| Average height of the lower part of the funnel | 5.89 | 5.8666 |
| Funnel width | 11.1124 | 9.9707 |
| Upper funnel center | (0.0014, −0.0049) | (0.4436, −0.0149) |
| Average height of the upper part of the funnel | 8.2377 | 6.0718 |
| Average depth of funnel | 2.3477 | 1.8051 |
| Outer edge center | (−0.0514, −0.0299) | (−0.0299, −0.0350) |
| Outer edge radius of burden surface | 6.103 | 6.0781 |
| Average height of the outer edge of the burden surface | 8.0259 | 7.9719 |
| Average width of the platform | 0.5483 | 1.1433 |
| Network | Train Accuracy (%) | Test Accuracy (%) | Loss | Run Time (s) |
|---|---|---|---|---|
| VGG16 | 80.00 | 71.67 | 0.4555 | 259 |
| VGG16 | 96.00 | 95.25 | 0.0088 | 792 |
| ResNET [22] | 96.88 | 94.00 | 0.1087 | 36 |
| AlexNET [23] | 99.22 | 94.11 | 0.0423 | 133 |
| BNVGG | 99.22 | 95.67 | 0.0016 | 238 |
| Algorithm | Feature Parameters | BNVGG | PointNet | PointCNN |
|---|---|---|---|---|
| Train Accuracy (%) | 73.0 | 99.2 | 99.7 | 99.5 |
| Test Accuracy (%) | 65.4 | 95.7 | 96.5 | 94.4 |
Share and Cite

Sun, S.; Yu, Z.; Zhang, S.; Xiao, W. Future Definition and Extraction of the Blast Furnace 3D Burden Surface Based on Intelligent Algorithms. Appl. Sci. 2022, 12, 12860. https://doi.org/10.3390/app122412860