Plastic Constitutive Training Method for Steel Based on a Recurrent Neural Network
Abstract
1. Introduction
2. Basic Principles
2.1. The Computational Procedure of the Constitutive Model in the FEM
2.2. Linear Hardening Constitutive Model for Metals
3. Basic Structure and Characteristics of the Recurrent Neural Network
3.1. Original Recurrent Neural Network
3.2. Long Short-Term Memory
3.3. Gated Recurrent Unit
4. Data Generation and Model Training
4.1. Data Preparation
4.1.1. Data Generation
4.1.2. Data Pre-Processing
4.2. Model Training Method
5. Influence of Model Parameters and Prediction Results
5.1. Influence of Model Parameters
5.1.1. Influence of Recurrent Neural Network Structure
5.1.2. Number of Training Batches and Batch Size
5.1.3. Learning Rate of the Optimizer
5.2. Prediction Results
5.2.1. Model Prediction Ability
5.2.2. Model Prediction Performance
6. Conclusions
- (1) In data generation and pre-processing, the way the stress–strain curves are pre-processed strongly affects model accuracy. Scaling the data to a common order of magnitude with the nonlinear square reduction method helps prevent exploding and vanishing gradients during training. Generating strain sequences from a Gaussian random distribution and computing the corresponding stress sequences is a sound strategy, and the resulting curves reflect the underlying constitutive model. A minimal data-generation sketch follows this list.
- (2) Regarding model type and structure, plain RNNs perform poorly on stress–strain prediction, with long training times and low accuracy, whereas the GRU and LSTM are well suited to this class of problem. Simply stacking deeper and wider layers yields diminishing returns: it does not appreciably improve performance and only lengthens training. A two-layer structure is sufficient for such problems; the optimal LSTM and GRU configurations are A50-100 and B150-150, respectively.
- (3) Regarding model parameters and training, accuracy improves as the batch size decreases and the number of training batches increases, at the cost of longer training times. The optimal batch size and number of training batches are 60 and 100, respectively. The learning rate was also examined: an initial learning rate of 0.001 combined with a decaying schedule improves model accuracy.
- (4) The predictive ability of the model is positively correlated with the size of the training set and negatively correlated with the length of the curve to be predicted. The GRU is less dependent on the dataset than LSTM, and both models are more accurate for short sequences. For short, medium, and long sequences, the predictive ability of the GRU is 6.13, 6.7, and 3.3 times that of LSTM, respectively. Moreover, the proposed method learns the behaviour of both linear hardening constitutive models well and predicts hysteretic curves accurately.
- (5) The proposed method is mainly applicable to predicting the linear hardening constitutive relationship of steel and can handle data series of up to 2000 entries. To be applied in existing commercial or open-source numerical simulation software, the trained constitutive neural network must be combined with a user-defined material interface, which will be investigated in future research.
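As a concrete illustration of conclusion (1), the sketch below generates random strain histories from Gaussian increments, computes the matching stress histories with a one-dimensional linear isotropic hardening return-mapping update, and compresses both channels with a signed square-root scaling. It is a minimal sketch rather than the authors' implementation: the material constants, the Gaussian increment scale, and the use of a signed square root as the "nonlinear square reduction" step are assumptions made for demonstration.

```python
import numpy as np

E = 206e3        # Young's modulus [MPa]; illustrative value for steel
SIGMA_Y = 345.0  # initial yield stress [MPa]; illustrative
H = 2.0e3        # linear isotropic hardening modulus [MPa]; illustrative

def stress_from_strain(strain):
    """1D elastoplastic stress history for a linear isotropic hardening model."""
    sigma, alpha = 0.0, 0.0            # current stress and accumulated plastic strain
    out = np.empty_like(strain)
    eps_prev = 0.0
    for i, eps in enumerate(strain):
        d_eps = eps - eps_prev
        eps_prev = eps
        trial = sigma + E * d_eps                  # elastic predictor
        f = abs(trial) - (SIGMA_Y + H * alpha)     # yield function
        if f <= 0.0:
            sigma = trial                          # elastic step
        else:
            d_gamma = f / (E + H)                  # plastic corrector (return mapping)
            sigma = trial - np.sign(trial) * E * d_gamma
            alpha += d_gamma
        out[i] = sigma
    return out

def make_sample(length, rng):
    """Random strain path from Gaussian increments, plus its stress response."""
    d_eps = rng.normal(loc=0.0, scale=2e-4, size=length)  # assumed increment scale
    strain = np.cumsum(d_eps)
    return strain, stress_from_strain(strain)

def squash(x):
    """Signed square-root scaling to bring strain and stress to a similar range."""
    return np.sign(x) * np.sqrt(np.abs(x))

rng = np.random.default_rng(0)
pairs = [make_sample(200, rng) for _ in range(1000)]
X = squash(np.stack([p[0] for p in pairs]))[..., None]  # (samples, timesteps, 1)
Y = squash(np.stack([p[1] for p in pairs]))[..., None]
```

The resulting arrays X and Y follow the (samples, timesteps, features) layout expected by recurrent layers.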
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A
References
| Parameter Class | Parameter Settings |
|---|---|
| Type of model | RNN/LSTM/GRU |
| Sequence length | 200/400/600/1000/1200/1400/1600/1800/2000 |
| Training data size | 500/1000/2000/4000/6000/8000/10,000 |
| Number of network layers | 1/2/3/4/5 |
| Number of neurons | 50/100/150/200/250/300 |
| Learning rate | Constant learning rate/Decay learning rate |
| Training batch | 0–100 |
| Batch size | 10/20/40/60/80/100/120/140/160/180/200 |
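The parameter study above can be tied to the settings the conclusions identify as optimal (a two-layer GRU, read here as the "B150-150" configuration with 150 units per layer, batch size 60, 100 training batches treated as epochs, and an initial learning rate of 0.001 with decay). The sketch below shows one way to assemble that configuration; the framework (TensorFlow/Keras), the Adam optimizer, and the exponential form of the decay schedule are assumptions, not details confirmed by the paper.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(None, 1)),                     # scaled strain sequence
    tf.keras.layers.GRU(150, return_sequences=True),            # first recurrent layer
    tf.keras.layers.GRU(150, return_sequences=True),            # second recurrent layer
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1)),  # scaled stress at each step
])

# Decaying learning rate starting from 0.001; the exponential form and its
# decay parameters are assumptions, not the paper's exact schedule.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3, decay_steps=1000, decay_rate=0.9)
model.compile(optimizer=tf.keras.optimizers.Adam(lr_schedule), loss="mse")

# X, Y: (num_samples, sequence_length, 1) arrays such as those produced by the
# data-generation sketch after the conclusions.
# model.fit(X, Y, batch_size=60, epochs=100, validation_split=0.1)
```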
Forecast result by stress–strain sequence length (rows) and training set size (columns):

| Sequence Length | 500 | 1000 | 2000 | 4000 | 6000 | 8000 | 10,000 |
|---|---|---|---|---|---|---|---|
| 200 | × | — | — | — | √ | √ | √ |
| 400 | × | × | — | — | √ | √ | √ |
| 600 | × | × | × | — | √ | √ | √ |
| 1000 | × | × | × | — | — | √ | √ |
| 1200 | × | × | × | — | — | — | √ |
| 1400 | × | × | × | — | — | — | √ |
| 1600 | × | × | × | — | — | — | √ |
| 1800 | × | × | × | × | — | — | √ |
| 2000 | × | × | × | × | — | — | √ |
Forecast result by stress–strain sequence length (rows) and training set size (columns):

| Sequence Length | 500 | 1000 | 2000 | 4000 | 6000 | 8000 | 10,000 |
|---|---|---|---|---|---|---|---|
| 200 | — | — | √ | √ | √ | √ | √ |
| 400 | — | — | — | √ | √ | √ | √ |
| 600 | × | — | — | √ | √ | √ | √ |
| 1000 | × | — | — | √ | √ | √ | √ |
| 1200 | × | — | — | √ | √ | √ | √ |
| 1400 | × | — | — | √ | √ | √ | √ |
| 1600 | × | — | — | √ | √ | √ | √ |
| 1800 | × | × | — | √ | √ | √ | √ |
| 2000 | × | × | — | √ | √ | √ | √ |