Article

Interpretable Machine Learning Models for Punching Shear Strength Estimation of FRP Reinforced Concrete Slabs

School of Civil Engineering and Architecture, Zhejiang Sci-Tech University, Hangzhou 310018, China
*
Author to whom correspondence should be addressed.
Crystals 2022, 12(2), 259; https://doi.org/10.3390/cryst12020259
Submission received: 26 January 2022 / Revised: 11 February 2022 / Accepted: 13 February 2022 / Published: 14 February 2022
(This article belongs to the Special Issue Advances in Sustainable Concrete System)

Abstract

Fiber reinforced polymer (FRP) serves as a prospective alternative to steel reinforcement in concrete slabs. However, like traditional reinforced concrete slabs, FRP reinforced concrete slabs are susceptible to punching shear failure. Owing to insufficient consideration of influential factors, existing empirical models and design provisions for the punching shear strength of FRP reinforced concrete slabs suffer from high bias and variance. This study established machine learning-based models to accurately predict the punching shear strength of FRP reinforced concrete slabs. A database of 121 groups of experimental results on FRP reinforced concrete slabs was collected from the literature. Several machine learning algorithms, namely artificial neural network, support vector machine, decision tree, and adaptive boosting, were selected to build models and compare their performance. To demonstrate the predictive accuracy of machine learning, this paper also introduces six empirical models and design codes for comparative analysis. The comparative results demonstrate that adaptive boosting has the highest predictive precision, with a root mean squared error, mean absolute error and coefficient of determination of 29.83, 23.00 and 0.99, respectively. GB 50010-2010 (2015) has the best predictive performance among the design codes, and ACI 318-19 yields similar results. In addition, among the empirical models, the model proposed by El-Ghandour et al. (1999) has the highest predictive accuracy. Based on the results obtained above, SHapley Additive exPlanation (SHAP) is adopted to illustrate the prediction process of AdaBoost. SHAP not only provides global and individual interpretations, but also carries out feature dependency analysis for each input variable.
The interpretation results reflect the importance and contribution of the factors that influence the punching shear strength in the machine learning model.

1. Introduction

Reinforced concrete slabs are among the most common horizontal load-carrying members in civil engineering, and are widely applied in bridges, ports and hydraulic structures. Since a flat slab has no beams, punching failure occurs easily under longitudinal load. Many researchers have carried out numerous experiments on the punching shear resistance of reinforced concrete slabs and obtained valuable results [1,2,3]. However, steel bars are prone to corrosion, which shortens the actual service life of the structure [4]. In recent years, with the development of technology, structural durability has attracted increasing attention. In coastal areas and areas where chlorides such as deicing salts are used, the actual service life of structures is often much shorter than their design service life, resulting in massive losses [5].
Fiber reinforced polymer (FRP) is a material with many advantages, such as light weight, high strength and corrosion resistance. In a corrosive environment, FRP bars can be applied as an alternative to steel bars in concrete structures to solve the problem of short actual service life. This is because FRP bars can meet the service demands of concrete structures in severe environments containing abundant chloride and sulfate ions. However, FRP bars have a low elastic modulus, which makes FRP reinforced concrete slabs more prone to punching failure than steel reinforced slabs (Figure 1). Many experimental studies [6,7,8,9,10,11] show that, under the same conditions, the punching shear strength and stiffness of cracked FRP reinforced concrete slabs decrease faster than those of steel reinforced concrete slabs. Matthys and Taerwe [6] concluded from experimental results that the crack development of slabs and the brittleness of punching failure were significantly affected by the bond performance of FRP grid reinforcement. The experimental results of Ospina et al. [7] indicated that the bond behavior between FRP bars and concrete has a significant impact on the punching shear capacity of FRP reinforced concrete slabs. Bouguerra et al. [9] showed that the slab thickness and the concrete compressive strength have considerable influence on the punching shear capacities of FRP reinforced concrete slabs. Ramzy et al. [12] reported that the punching shear behavior of flat slabs is related to the size effect of the construction and the Young's modulus of the FRP reinforcement.
In terms of theoretical models, most computational formulas for the punching shear strength of FRP reinforced concrete slabs were derived from those for traditional reinforced concrete flat slabs, with modifications to account for FRP [13]. Some current design specifications, such as GB 50010-2010 [14] and ACI 318-14 [15], take the eccentric shear stress model as their theoretical basis. Based on ACI 318-11, El-Ghandour et al. [16] suggested considering the impact of the elastic modulus of FRP bars, and proposed an improved equation for punching shear strength. Later, El-Ghandour et al. [17] used the same method to modify the design formula of BS 8110-97. Matthys and Taerwe [6] considered the effect of the equivalent reinforcement ratio, and proposed a modification of BS 8110-97. On this basis, Ospina et al. [7] proposed an empirical model for computing the punching shear strength of FRP slabs by modifying the relation between the punching shear capacity and the equivalent reinforcement ratio. Based on the probability of exceedance, Ju et al. [18] proposed a new approach for analyzing the punching shear strength of FRP reinforced two-way concrete slabs using Monte Carlo simulations. However, the aforementioned empirical models adopted simplifications during their theoretical derivations, and thus were unable to consider all of the influential factors. Moreover, the parameters in these models were determined by traditional regression analyses of experimental results. Therefore, the accuracy of the models depends heavily on the choice of theoretical model and the quality of the database.
In recent years, with the development of artificial intelligence, data-driven algorithms have emerged [19]. Among them, machine learning has received remarkable attention from researchers, with many successful applications [20,21,22,23,24]. In structural engineering, Hoang et al. [25] constructed machine learning based alternatives for estimating the punching shear capacity of steel fiber reinforced concrete (SFRC) flat slabs. Hoang et al. [26] presented an ensemble machine learning model to predict the punching shear resistance of R/C interior slabs. Mangalathu et al. [27] built an explainable machine learning model to predict the punching shear strength of flat slabs without transverse reinforcement. In addition, some researchers [28,29] have even used atomistic simulations as the input parameters of machine learning to predict the performance of materials and structures, also with success. To the best of the authors' knowledge, however, no study has examined interpretable machine learning models for predicting the punching shear strength of FRP reinforced concrete slabs. Therefore, this study intends to fill this research gap. Machine learning research on FRP reinforced concrete slabs can reveal the deep relationship between material properties and punching shear strength, which is helpful for refining traditional empirical models and design codes and making them more accurate. In this paper, an experimental database for the punching shear strength of FRP reinforced concrete slabs is first compiled, and used for training, validating, and testing machine learning models.
Four machine learning algorithms, namely artificial neural network (ANN), support vector machine (SVM), decision tree (DT) and adaptive boosting (AdaBoost), are employed for punching shear strength prediction. By comparing the performance of the machine learning models, their efficiency and prediction ability can be determined. However, some problems need to be solved; for example, the values of hyper-parameters in machine learning are often difficult to determine. Therefore, to improve the predictive performance of the machine learning models, optimization methods such as particle swarm optimization (PSO) and manual tuning are used. In addition, since previous machine learning models found it difficult to explain their prediction mechanisms, their predicted results were somewhat unconvincing [27]. This paper uses SHapley Additive exPlanation (SHAP) [30] to explain the predicted results of the model. Distinct from other machine learning papers that use SHAP, the kernel explainer is used in this paper, which is appropriate for all machine learning models. Not only can SHAP carry out feature importance analysis for the input factors, it can also determine whether the impact of each input feature on the predictions is positive or negative. In addition, SHAP can reveal the interrelations between input features, and how each input feature influences the final predicted value for a single sample. Consequently, SHAP renders the predicted results of machine learning more convincing than before.

2. Experimental Database of FRP Reinforced Concrete Slab

To ensure the accuracy of the predicted results, a database with adequate samples is required for model training. In this paper, 121 groups of experimental results [6,7,8,9,10,12,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49] of FRP reinforced concrete slabs under punching shear tests were collected and compiled into a database. In the data set, 80% of the data was used as the training set (10% of which was used as the validation set), and the remaining 20% was used as the test set. In this study, 6 critical parameters are selected to characterize the punching shear strength of FRP slabs: the type of column section, the cross-section area of the column (A/cm2), the slab's effective depth (d/mm), the compressive strength of concrete (f'c/MPa), the Young's modulus of the FRP reinforcement (Ef/GPa), and the reinforcement ratio (ρf/%). According to the findings of Ju et al. [18], these 6 parameters have a great influence on the punching shear strength of FRP slabs. Therefore, they are taken as the input parameters, and the punching shear strength of the FRP slabs is taken as the output parameter of the machine learning algorithms. To simplify the names of the 6 input parameters and the output parameter, we use x1 to x6 and y to represent them, respectively. The column section has 3 types: square (x1 = 1), circle (x1 = 2), rectangle (x1 = 3). The distribution of the dataset is shown in Table 1 and Figure 2, and the detailed information is given in Appendix A.
Before starting the machine learning procedure, the data of input variables should be preprocessed. In this paper, we used deviation standardization as the preprocessing method; this can be written as:
$$x_i^* = \frac{x_i - \min_{1 \le j \le m} x_j}{\max_{1 \le j \le m} x_j - \min_{1 \le j \le m} x_j} \tag{1}$$
where x is the sample before feature scaling; x* is the sample after feature scaling; m is the total number of samples. Since the sample x contains features of several input parameters, it can also be named the feature vector.
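As a sketch, the deviation standardization defined above can be applied column-wise to the feature matrix; the array values below are purely illustrative, not entries from the paper's database:

```python
import numpy as np

def min_max_scale(X):
    """Scale each feature column to [0, 1]:
    x* = (x - min) / (max - min), computed per column."""
    X = np.asarray(X, dtype=float)
    x_min = X.min(axis=0)
    x_max = X.max(axis=0)
    return (X - x_min) / (x_max - x_min)

# Illustrative feature matrix: [column area (cm^2), effective depth (mm)]
X = np.array([[400.0, 100.0],
              [900.0, 150.0],
              [1600.0, 200.0]])
print(min_max_scale(X))
```

Each column is scaled independently, so features with very different units (cm2, mm, MPa, GPa, %) end up on a comparable scale before training.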

3. Machine Learning Algorithms

Generally, the implementation of machine learning has four stages: (a) divide the database into training set and test set; (b) apply the training set to model training; (c) check whether the accuracy requirements are met; (d) output the predicted model for testing, or adjust the values of the hyper-parameters. The flowchart of this procedure is shown in Figure 3. A total of 4 machine learning models, namely ANN, SVM, DT and AdaBoost, are selected in this study. In ANN, an input layer, hidden layers and an output layer, each containing several neurons, produce the predicted results. In SVM, a prediction equation is formed from the predicted output such that the error is smaller than a fixed value. In DT, the whole database is separated by tree-like decisions based on one or several input characteristics, and the output is the value of the leaf into which the data falls. In AdaBoost, a group of weak learners is combined into a strong learner that achieves better predicted results. Notably, the predictive performance of a model depends on the values of its hyper-parameters. Therefore, in this paper, several methods are used to determine the hyper-parameter values.

3.1. Artificial Neural Network

Artificial neural network (ANN) is a supervised learning algorithm that was originally used to simulate the neural network of the human brain [50]. Generally, a simple ANN model consists of an input layer, one or two hidden layers, and an output layer. Each layer contains several neurons, and the numbers of neurons in the input and output layers depend on the numbers of input and output parameters, respectively. In this paper, the numbers of neurons in the input layer and output layer, listed in Table 1, equal 6 and 1, respectively.
Except for the output layer, the neurons of each layer need to be activated by an activation function. In ANN, the sigmoid function is generally used as the activation function [51], whose expression can be written as:
$$g(z) = \frac{1}{1 + e^{-z}} \tag{2}$$
$$z = w^{\mathrm{T}} x^* + b \tag{3}$$
where b is the bias unit and w denotes the weight vector of the sample. As described by Rumelhart et al. [52], when ANN is used for regression forecasting, the cost function of the model is as follows:
$$J(w, b) = \frac{1}{m} \sum_{i=1}^{m} \left( y_{\mathrm{pred}}^{(i)} - y^{(i)} \right)^2 + \frac{C}{2m} \| w \|^2 \tag{4}$$
where ypred denotes the predicted value of machine learning; C is the regularization coefficient.
Since the cost function represents the error between the actual value and the predicted value, gradient descent is used to decrease the value of the cost function and thereby improve the accuracy of the predicted result. Gradient descent is expressed as:
$$w_j := w_j - \alpha \frac{\partial}{\partial w_j} J(w, b), \quad b := b - \alpha \frac{\partial}{\partial b} J(w, b) \tag{5}$$
where α is the learning rate.
Before starting the training procedure, several hyper-parameter values need to be set. For the number of neurons in the hidden layer, Shen et al. [53] mentioned a method to determine the value range, which is as follows:
$$\sum_{i=0}^{X} C_H^i > m, \quad H = \sqrt{X + O} + a, \quad H = \log_2 X \tag{6}$$
where X is the number of neurons in the input layer; H is the number of neurons in the hidden layer; O is the number of neurons in the output layer; a is a constant between 1 and 10. According to Equation (6), the number of neurons in the hidden layer ranges from 7 to 13.
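The quoted range can be checked with a few lines: the combination condition gives the lower bound, and taking the constant a at its upper limit of 10 gives the upper bound (a reading of Equation (6) assumed here, with X = 6, O = 1, m = 121 from the paper):

```python
from math import comb, sqrt, ceil

X, O, m = 6, 1, 121  # input neurons, output neurons, number of samples

# Lower bound: smallest H satisfying sum_{i=0}^{X} C(H, i) > m
H_low = next(H for H in range(1, 50)
             if sum(comb(H, i) for i in range(X + 1)) > m)

# Upper bound: H = sqrt(X + O) + a, with a at its maximum of 10
H_high = ceil(sqrt(X + O) + 10)

print(H_low, H_high)  # candidate range for the hidden-layer size
```

For H = 7 the cumulative combination count is 127 > 121, while H = 6 gives only 64, which is why the search over hidden-layer sizes starts at 7.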
In order to compare the performance of models with different numbers of neurons in the hidden layer, 3 indicators are introduced, namely the root mean squared error (RMSE), the mean absolute error (MAE) and the coefficient of determination (R2), which are respectively defined as:
$$\mathrm{RMSE} = \sqrt{\frac{1}{m} \sum_{i=1}^{m} \left( y_{\mathrm{pred}}^{(i)} - y^{(i)} \right)^2} \tag{7}$$
$$\mathrm{MAE} = \frac{1}{m} \sum_{i=1}^{m} \left| y_{\mathrm{pred}}^{(i)} - y^{(i)} \right| \tag{8}$$
$$R^2 = 1 - \frac{\sum_{i=1}^{m} \left( y_{\mathrm{pred}}^{(i)} - y^{(i)} \right)^2}{\sum_{i=1}^{m} \left( y^{(i)} - \frac{1}{m} \sum_{i=1}^{m} y^{(i)} \right)^2} \tag{9}$$
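The three indicators are straightforward to transcribe; the strength values below (in kN) are illustrative, not results from the paper:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error."""
    d = np.asarray(y_pred) - np.asarray(y_true)
    return np.sqrt(np.mean(d ** 2))

def mae(y_true, y_pred):
    """Mean absolute error."""
    return np.mean(np.abs(np.asarray(y_pred) - np.asarray(y_true)))

def r2(y_true, y_pred):
    """Coefficient of determination."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    ss_res = np.sum((y_pred - y_true) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

y_true = [200.0, 400.0, 600.0, 800.0]   # illustrative strengths (kN)
y_pred = [210.0, 390.0, 630.0, 770.0]
print(rmse(y_true, y_pred), mae(y_true, y_pred), r2(y_true, y_pred))
```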
The performance of ANNs with different numbers of neurons in the hidden layer is shown in Table 2. According to the comparative results, the number of neurons in the hidden layer H is determined as 8. In addition, the values of the other hyper-parameters, confirmed by manual adjustment, are: the number of hidden layers equals 1; the number of training iterations equals 5000; the learning rate α equals 5 × 10−5; the regularization coefficient C equals 20.
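As a sketch of such a network, scikit-learn's MLPRegressor can realize one hidden layer of 8 sigmoid neurons; this library choice is an assumption (the paper does not name an implementation), and the synthetic data and learning rate below are illustrative toys rather than the paper's database or its tuned value of 5 × 10−5:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(120, 6))          # 6 scaled input features
y = 5 * X[:, 2] + 2 * X[:, 5]           # toy strength surrogate

# One hidden layer of 8 neurons with the sigmoid ("logistic")
# activation, mirroring the architecture selected in the paper.
model = MLPRegressor(hidden_layer_sizes=(8,), activation='logistic',
                     max_iter=5000, learning_rate_init=5e-3,
                     alpha=1e-4, random_state=0)
model.fit(X, y)
print(model.score(X, y))  # R^2 on the toy data
```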

3.2. Support Vector Machine

Support vector machine (SVM) originated from statistical learning theory, and was first proposed by Cortes and Vapnik [54] in 1995. When SVM is used for regression analysis, it is also called support vector regression (SVR). SVR maps the data to a high-dimensional space and uses a hyperplane for fitting, since such data are generally hard to fit in a low-dimensional space [55]. The expression of SVR reflects the relation between the input factors and the predicted value [56], and can be written as:
$$y_{\mathrm{pred}} = w^{\mathrm{T}} \phi(x) + b \tag{10}$$
where ϕ(x) means feature vector after mapping. With the same as ANN, SVR also has the cost function to represents the performance of model. The cost function can be written as:
$$J(w, b, \xi_i, \xi_i^*) = \frac{1}{2} \| w \|^2 + C \sum_{i=1}^{m} \left( \xi_i + \xi_i^* \right) \tag{11}$$
where ξ and ξ* are the slack variables, they represent the errors of the sample exceeding the lower and upper bounds of the hyperplane, respectively.
To minimize the value of the cost function, the Lagrange multiplier method is used to transform the original problem into its dual problem. The Lagrange multiplier method is formulated in the following form:
$$L(w, b, \lambda, \lambda^*, \xi, \xi^*, \mu, \mu^*) = \frac{1}{2} \| w \|^2 + C \sum_{i=1}^{m} (\xi_i + \xi_i^*) - \sum_{i=1}^{m} \mu_i \xi_i - \sum_{i=1}^{m} \mu_i^* \xi_i^* + \sum_{i=1}^{m} \lambda_i \left( y_{\mathrm{pred}}^{(i)} - y^{(i)} - \varepsilon - \xi_i \right) + \sum_{i=1}^{m} \lambda_i^* \left( y^{(i)} - y_{\mathrm{pred}}^{(i)} - \varepsilon - \xi_i^* \right) \tag{12}$$
where μ, μ*, λ, λ* denote the Lagrange multipliers, and ε is the maximum allowable error. After the transformation, the dual problem is expressed as follows:
$$\text{Maximise} \quad \sum_{i=1}^{m} y^{(i)} (\lambda_i^* - \lambda_i) - \varepsilon \sum_{i=1}^{m} (\lambda_i^* + \lambda_i) - \frac{1}{2} \sum_{i=1}^{m} \sum_{j=1}^{m} (\lambda_i^* - \lambda_i)(\lambda_j^* - \lambda_j) x_i^{\mathrm{T}} x_j \tag{13}$$
subject to $\sum_{i=1}^{m} (\lambda_i^* - \lambda_i) = 0$ and $0 \le \lambda_i, \lambda_i^* \le C$.
The Karush-Kuhn-Tucker (KKT) conditions need to be met for Equation (13), which can be written as:
$$\lambda_i \left( y_{\mathrm{pred}}^{(i)} - y^{(i)} - \varepsilon - \xi_i \right) = 0, \quad \lambda_i^* \left( y^{(i)} - y_{\mathrm{pred}}^{(i)} - \varepsilon - \xi_i^* \right) = 0, \quad \lambda_i \lambda_i^* = 0, \quad \xi_i \xi_i^* = 0, \quad (C - \lambda_i) \xi_i = 0, \quad (C - \lambda_i^*) \xi_i^* = 0 \tag{14}$$
According to Equations (13) and (14), the weight vector w and the bias unit b can be acquired. Finally, the expression of SVR is as follows:
$$y_{\mathrm{pred}} = \sum_{i=1}^{m} (\lambda_i^* - \lambda_i) \kappa(x, x_i) + b \tag{15}$$
$$\kappa(x_i, x_j) = \phi(x_i)^{\mathrm{T}} \phi(x_j) \tag{16}$$
where κ(xi, xj) denotes the kernel function, which can be used to express the inner product of the eigenvectors in the high-dimensional space. There are many types of kernel function; the radial basis function (RBF) kernel is selected in this paper, which can be expressed as:
$$\kappa(x_i, x_j) = \exp\left( -\gamma \| x_i - x_j \|^2 \right) \tag{17}$$
where γ > 0 is the coefficient of the RBF kernel function, which is a hyper-parameter. In addition, the values of two other hyper-parameters in SVR need to be determined: the maximum allowable error ε and the regularization coefficient C. Among the many methods available for tuning hyper-parameters, particle swarm optimization is selected in this paper.
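A minimal sketch of an RBF-kernel SVR on synthetic data can be written with scikit-learn (an assumed implementation choice; the hyper-parameter values below are illustrative placeholders, not the PSO-optimized ones):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.uniform(size=(120, 6))          # 6 scaled input features
y = 300 * X[:, 2] + 80 * X[:, 5]        # toy strength surrogate (kN)

# epsilon is the maximum allowable error (the tube half-width),
# C the regularization coefficient, gamma the RBF coefficient.
model = SVR(kernel='rbf', C=1000.0, epsilon=10.0, gamma=2.88)
model.fit(X, y)
print(model.score(X, y))  # training R^2
```

Errors smaller than epsilon incur no loss, so epsilon directly trades off sparsity of the support vectors against fitting accuracy.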
Particle swarm optimization (PSO) is an optimization algorithm inspired by the foraging behavior of bird flocks [57,58]. Because PSO has many advantages, such as conceptual simplicity and ease of implementation, it is often used for hyper-parameter adjustment in SVM [59,60]. Every particle in PSO has two parameters that determine the search result: position p and velocity v. These 2 parameters are defined as [58]:
$$v_{d^*}(t+1) = W v_{d^*}(t) + r_1 C_1 \left( P_{d^*}(t) - p_{d^*}(t) \right) + r_2 C_2 \left( G_{d^*}(t) - p_{d^*}(t) \right) \tag{18}$$
$$p_{d^*}(t+1) = p_{d^*}(t) + v_{d^*}(t+1) \tag{19}$$
where t is the number of iterations; d* denotes the dimension of the search space; W is the inertial coefficient; C1 and C2 are the self-learning factor and the global learning factor; r1 and r2 are random numbers in (0, 1); P is the location with the best fitness among all the visited locations of each particle; G is the location with the best fitness among all the visited locations of all the particles.
In this paper, the hyper-parameters in Equations (18) and (19) are set as follows: the maximum number of iterations equals 150; the dimension of the search space d* equals 3; the inertial coefficient W equals 0.8; the total number of particles equals 40; the self-learning factor C1 and the global learning factor C2 both equal 2. The hybrid machine learning approach consisting of PSO and SVR is called PSO-SVR, and its best hyper-parameter values are determined by the optimization: the maximum allowable error ε equals 10; the regularization coefficient C equals 4238.58; the coefficient of the RBF kernel function γ equals 2.88.
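The update rules of Equations (18) and (19) can be sketched as a minimal optimizer with the parameter values listed above. Here a sphere function stands in for the SVR cross-validation error, the 3 dimensions mirror the (ε, C, γ) search, and the velocity clamp is an added safeguard not described in the paper:

```python
import numpy as np

def pso(objective, bounds, n_particles=40, n_iter=150,
        W=0.8, C1=2.0, C2=2.0, seed=0):
    """Minimal particle swarm optimizer (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    d = lo.size
    p = rng.uniform(lo, hi, size=(n_particles, d))       # positions
    v = np.zeros((n_particles, d))                       # velocities
    pbest = p.copy()                                     # per-particle best P
    pbest_val = np.array([objective(x) for x in p])
    g = pbest[pbest_val.argmin()].copy()                 # global best G
    for _ in range(n_iter):
        r1 = rng.uniform(size=(n_particles, d))
        r2 = rng.uniform(size=(n_particles, d))
        v = W * v + C1 * r1 * (pbest - p) + C2 * r2 * (g - p)
        v = np.clip(v, -(hi - lo), hi - lo)              # velocity clamp
        p = np.clip(p + v, lo, hi)
        vals = np.array([objective(x) for x in p])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = p[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy objective standing in for the SVR validation error
best_x, best_val = pso(lambda x: np.sum(x ** 2),
                       bounds=([-5, -5, -5], [5, 5, 5]))
print(best_x, best_val)
```

In the actual PSO-SVR workflow, the objective would train an SVR with the candidate (ε, C, γ) and return its validation error.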

3.3. Decision Tree

Decision tree (DT) is a supervised learning algorithm proposed by Quinlan [61]. Initially, DT could only be used for classification forecasting, and its variants were few, such as iterative dichotomiser 3 (ID3) and C4.5. In this paper, the classification and regression tree (CART) [62] is selected for regression prediction. The splitting criterion of CART is the variance of the samples, which can be written as:
$$\text{Minimise} \quad \sum_{x_i \in R_1} (y_i - c_1)^2 + \sum_{x_i \in R_2} (y_i - c_2)^2 \tag{20}$$
where R1 and R2 denote the sample subsets after splitting, and c1 and c2 are the means of the samples in R1 and R2, respectively. As the sample set is divided, the purity of the leaf nodes gradually improves.
Pruning is a method to avoid over-fitting in DT, and it comprises two basic approaches: pre-pruning and post-pruning [63,64]. In this paper, pruning is realized by limiting the maximum depth, which is set to 5.

3.4. Adaptive Boosting

Adaptive boosting (AdaBoost) is an ensemble learning algorithm that combines multiple weak learners into a strong learner [65]. Unlike the aforementioned machine learning algorithms, each sample in AdaBoost has a subsampling weight that is constantly adjusted [66]. At the beginning of the training process, the weights of the samples are initialized [67] as:
$$w_{ki}^* = \frac{1}{m} \tag{21}$$
where w*ki denotes the weight of the i-th sample in the k-th weak learner. The largest error of the k-th weak learner on the training set is:
$$E_k = \max_i \left| y^{(i)} - G_k(x_i) \right| \tag{22}$$
where Gk(xi) denotes the predicted value of the i-th sample in the k-th weak learner. The relative error eki of each sample and the error ratio ek of the k-th weak learner can be written as:
$$e_{ki} = \frac{\left( y^{(i)} - G_k(x_i) \right)^2}{E_k^2} \tag{23}$$
$$e_k = \sum_{i=1}^{m} w_{ki}^* e_{ki} \tag{24}$$
After that, the weight βk of the k-th weak learner can be calculated; its contribution to the final prediction, ln(1/βk), increases as the error ratio ek decreases. The weight βk is defined as:
$$\beta_k = \frac{e_k}{1 - e_k} \tag{25}$$
Afterwards, the error ratio of the next weak learner can be computed, and the weights of the samples are updated:
$$w_{k+1,i}^* = \frac{w_{ki}^*}{Z_k} \beta_k^{1 - e_{ki}} \tag{26}$$
$$Z_k = \sum_{i=1}^{m} w_{ki}^* \beta_k^{1 - e_{ki}} \tag{27}$$
where Zk is the normalization factor. Finally, the expression of the strong learner can be written as:
$$f(x) = \sum_{k=1}^{K} \ln\left( \frac{1}{\beta_k} \right) g(x) \tag{28}$$
where K is the total number of weak learners; g(x) is the median of all the βkGk(x). The algorithm used for the weak learner needs to be specified; PSO-SVR is chosen in this paper.

3.5. Predicted Results

The comparison of results between ANN, PSO-SVR, DT and AdaBoost is provided in Table 3. All four machine learning models show high predictive accuracy, as reflected in R2. In particular, AdaBoost has the highest accuracy among the four models on the training set, and PSO-SVR on the test set. It is noted that although the forecasting performance of ANN on the test set is better than that of DT and only slightly lower than those of PSO-SVR and AdaBoost, its performance on the training set is much lower than that of the other three models.
Figure 4, Figure 5, Figure 6 and Figure 7 depict the scatter distributions of the predicted results for each machine learning model. In Figure 4, it can be found that the fit of ANN is not very good when the true value is larger than 1400 kN. This may be related to the scarcity of data with punching shear strengths larger than 1400 kN. Therefore, if this ANN model is used to predict the punching shear strength of FRP reinforced concrete slabs, the value range of the predicted object should be restricted to under 1400 kN. In addition, the figure shows that ANN focuses more on fitting samples located in intervals with dense sample distribution. Consequently, one way to improve the overall predictive accuracy of ANN is to ensure that the samples are evenly distributed. Figure 5 demonstrates that PSO-SVR shows good predictive performance. Although some predicted values have relatively large errors with respect to the actual values, the overall accuracy is higher than those of ANN and DT. Figure 6 depicts the predictions of DT on the dataset; there is no doubt that the predictions of DT on the test set are worse than those of ANN and PSO-SVR. On the training set, the predictions of DT have almost no error when the punching shear strength is larger than 1200 kN, which is related to the prediction mechanism of DT [68]. The same mechanism can be seen in the predictions of DT for punching shear strengths below 1200 kN, where the predicted value is identical whenever the punching strength lies in the same range. Meanwhile, on the test set, DT has low accuracy for samples with punching shear strengths around 1200 kN, i.e., the DT model proposed in this paper has a risk of over-fitting when the punching shear strength is larger than 1200 kN. Obviously, for both ANN and DT, the predictive accuracy is closely related to the number of samples in each value range.
Furthermore, it can be concluded that neither ANN nor DT has mined the true intrinsic connections within the data. Figure 7 shows the predicted results of AdaBoost for the training set and the test set; all of the dots are closely distributed around the best fitting line. Since PSO-SVR is chosen as the weak learner of AdaBoost, some wildly inaccurate predicted values are improved, while the comparatively accurate ones are retained. Moreover, comparing Figure 4 to Figure 7, it is not hard to see that AdaBoost reveals the best predictive performance.

3.6. Comparison with Traditional Empirical Models

In order to assess the accuracy of the machine learning results, this paper introduces design specifications and empirical models proposed by researchers. Three punching shear strength models are selected from design codes and reviewed: GB 50010-2010 (2015) [14], ACI 318-19 [69] and BS 8110-97 [70]. Most design codes adopt the same function for both FRP slabs and reinforced concrete slabs, expressing the punching shear strength of slabs as a function of the concrete compressive strength. Some design codes also take the reinforcement ratio, the size effect, the depth of the slab and other factors into consideration. Three design specifications and three empirical models are adopted in the comparative analysis. The formulas of the selected traditional empirical models are shown in Table 4.
Table 5 summarizes the results of the machine learning models and the traditional models. Obviously, the predictive performance of AdaBoost is better than that of the other models, as reflected in RMSE, MAE and R2. Although the results of ANN, DT and PSO-SVR are not the best, they perform better than the traditional empirical models. Among the traditional models, Formula (1) (GB 50010-2010 (2015)) has the highest accuracy, with an RMSE, MAE and R2 of 150.41, 98.48 and 0.73, respectively. Notably, the predictions of Formulas (1) and (2) differ only slightly, perhaps because both are derived from the eccentric shear stress model [14,69]. In addition, the forecasting performance of the model proposed by El-Ghandour et al. is better than that of the other models proposed by researchers. In brief, the RMSE and MAE of the traditional empirical models are generally greater than 100, and their R2 ranges from 0.6 to 0.8, whereas the RMSE and MAE of the machine learning models are less than 100, and their R2 is generally greater than 0.9 and even close to 1.0. Consequently, the difference in accuracy between the traditional empirical models and the machine learning models is obvious.
Figure 8 shows the predicted results of the traditional empirical models and the machine learning models for the whole data set. To facilitate comparison between the different types of models, red dots represent the predicted results of the national codes, while blue and purple dots represent the predicted results of the empirical models proposed by researchers and of the machine learning models, respectively. It can be found that the prediction errors of the traditional models are relatively large, whereas those of the machine learning models are small. The forecasting result in Figure 8a is similar to that in Figure 8b, which is also reflected in Table 5. In addition, although the accuracy of Formulas (1) and (2) is better than that of the other traditional empirical models, their predictions are still unsafe. On the contrary, the predictions of Formulas (3) to (6) tend to be conservative, even though they do not have the best performance. From the performance of the machine learning models on the complete database, it can be found that the predictions of ANN and PSO-SVR are likely to be unsafe when the punching shear strength is smaller than 200 kN. On the other hand, when the punching shear strength is larger than 1200 kN, the forecast of ANN is conservative. Furthermore, except for Formula (4), all the empirical models are more accurate than ANN when the punching strength is larger than 1400 kN. Consequently, it can be concluded that ANN is highly dependent on the samples.

4. Model Interpretations

4.1. Shapley Additive Explanation

In this study, SHapley Additive exPlanation (SHAP) is applied to explain the predicted results given by the AdaBoost model. SHAP originates from game theory and is an additive model, i.e., the output of the model is a linear sum of the contributions of the input variables [71]. The expression of SHAP can be defined as:
$$y_{\mathrm{pred}}^{(i)} = y_{\mathrm{base}} + \sum_{j=1}^{n} f(x_{ij}) \tag{29}$$
where ybase is the baseline value of the model; n is the total number of input variables; f(xij) is the SHAP value of feature xij.
According to Equation (29), the predicted value of each sample depends on the SHAP value of each feature in the sample. In other words, each feature is a contributor to the final predicted value. The SHAP value can be positive or negative, depending on the direction in which each feature influences the predicted result. SHAP contains several explainers; current machine learning papers often use the tree explainer, which is only suitable for a subset of machine learning algorithms. Consequently, the kernel explainer, which is applicable to any model, is used in this paper to illustrate the machine learning model with the best predictive performance.
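To illustrate the additivity property of Equation (29), the sketch below computes exact Shapley values for a toy three-feature linear model standing in for the AdaBoost predictor; absent features are replaced by background (mean) values, as a kernel-style explainer does. All names and values here are illustrative assumptions:

```python
import numpy as np
from itertools import combinations
from math import factorial

def exact_shap(predict, x, background):
    """Exact Shapley values for one sample. Features outside the
    coalition S are replaced by the background values."""
    n = len(x)
    base = predict(background)          # y_base in Equation (29)
    phi = np.zeros(n)
    idx = list(range(n))
    for j in idx:
        others = [i for i in idx if i != j]
        for size in range(n):
            for S in combinations(others, size):
                # Shapley coalition weight |S|! (n-|S|-1)! / n!
                weight = factorial(size) * factorial(n - size - 1) / factorial(n)
                x_S = background.copy()
                x_S[list(S)] = x[list(S)]
                x_Sj = x_S.copy()
                x_Sj[j] = x[j]
                phi[j] += weight * (predict(x_Sj) - predict(x_S))
    return base, phi

# Toy surrogate model (3 features for brevity)
model = lambda z: 2.0 * z[0] + 5.0 * z[1] - 1.0 * z[2]
background = np.array([1.0, 1.0, 1.0])  # dataset means -> baseline
x = np.array([3.0, 0.0, 2.0])           # sample to explain

base, phi = exact_shap(model, x, background)
print(base, phi, base + phi.sum(), model(x))
```

The baseline plus the summed SHAP values exactly reproduces the model output for the sample, which is the additivity that Equation (29) expresses; the kernel explainer approximates these values without enumerating all coalitions.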

4.2. Global Interpretations

According to the experimental results obtained above, the predictions of AdaBoost, the best-performing model, are explained. SHAP interprets the predictions of AdaBoost in several ways. Firstly, the feature importance of the input parameters is shown in Figure 9a. The importance of every input parameter is non-negligible; that is, each parameter influences the predicted results to a different degree, which is consistent with existing experimental conclusions [4,72]. The mean absolute SHAP values of the parameters are 53.45 for x1, 44.50 for x2, 126.36 for x3, 21.57 for x4, 17.23 for x5 and 52.96 for x6. Obviously, the slab’s effective depth (x3) has the most critical influence on the punching shear capacity, more than twice that of the type of column section (x1). Moreover, although the Young’s modulus of FRP reinforcement (x5) is the least important feature, ignoring it would introduce unnecessary errors into the predictions. Consequently, using these 6 features as the input parameters of the machine learning algorithms is reasonable.
Figure 9b is a SHAP summary plot of the features, which displays the distribution of the SHAP values of each input variable over all samples and reveals the overall influence trends. In the figure, the x-axis is the SHAP value of the input parameters and the y-axis lists the input parameters ranked by importance. Each point represents a sample in the dataset, and its color represents the feature value, from blue (small) to red (large). A few samples have very high SHAP values for x3, around 600; in these samples, the value of x3 has a decisive effect on the prediction. Among the input variables, the slab’s effective depth (x3), type of column section (x1), reinforcement ratio (x6), cross-section area of the column (x2) and Young’s modulus of FRP reinforcement (x5) all have a positive influence on the punching shear strength. However, the influence trend of the compressive strength of concrete (x4) is complex and difficult to characterize.
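The importance scores in Figure 9a are the mean absolute SHAP values over all samples. A small sketch of that aggregation is shown below; the SHAP matrix is hypothetical, with numbers chosen only to mirror the ranking reported above (x3 first, then x1, x6, x2, x4, x5).

```python
# Hypothetical per-sample SHAP matrix (rows = samples, columns = x1..x6)
shap_matrix = [
    [ 40.0, -30.0, 150.0, -10.0,   5.0,  60.0],
    [-60.0,  50.0, -90.0,  30.0, -20.0, -40.0],
    [ 60.0, -55.0, 140.0, -25.0,  25.0,  55.0],
]
n_samples = len(shap_matrix)

# Global importance of feature j = mean absolute SHAP value over all samples
importance = [sum(abs(row[j]) for row in shap_matrix) / n_samples for j in range(6)]

# Rank features from most to least important (0-based column indices)
ranking = sorted(range(6), key=lambda j: importance[j], reverse=True)
```

Taking the absolute value before averaging matters: a feature that pushes some predictions up and others down would otherwise appear unimportant even though it strongly affects individual samples.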

4.3. Individual Interpretations

Beyond the global interpretations, SHAP can also provide an individual interpretation for each sample in the dataset. In this study, 2 samples are selected, and their prediction processes are illustrated in Figure 10. In the figure, the gray line represents the baseline value, which equals 403.62, and the colored lines represent the contribution of each feature. Starting from the baseline value, every feature shifts the prediction by a different amount, finally yielding predicted values of 419.27 and 501.00, respectively. For both samples, the type of column section (x1), the slab’s effective depth (x3) and the compressive strength of concrete (x4) have a negative influence on the prediction, whereas the cross-section area of the column (x2), the Young’s modulus of FRP reinforcement (x5) and the reinforcement ratio (x6) have a positive influence. Furthermore, the reinforcement ratio (x6) is the decisive feature in the forecasting process of sample 1, and the cross-section area of the column (x2) plays the essential role for sample 2. In summary, both the global and the individual interpretations give detailed insight into how each input feature affects the final predicted value.
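The force plots of Figure 10 can be read as a running total that starts at the baseline and adds each feature's SHAP value in turn. The sketch below rebuilds sample 1 this way; the baseline 403.62 and final value 419.27 are reported in the paper, but the per-feature contributions are hypothetical, chosen only to match the reported sign pattern and total.

```python
# Baseline value reported in the paper
y_base = 403.62

# Hypothetical SHAP values for one sample: x1, x3, x4 negative;
# x2, x5, x6 positive, as described for both selected samples.
contributions = {
    "x1": -20.00, "x2": 15.00, "x3": -30.00,
    "x4": -10.00, "x5": 5.00, "x6": 55.65,
}

steps = []                 # running totals = step positions of the force plot
total = y_base
for name, value in contributions.items():
    total += value
    steps.append((name, round(total, 2)))
y_pred = round(total, 2)   # final predicted punching shear strength (kN)
```

Because the decomposition is additive, the order in which features are applied changes the intermediate steps of the plot but never the final prediction.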

4.4. Feature Dependency

In addition to explaining the prediction mechanism of the model, SHAP can also reveal the interaction between a specific feature and its most closely related feature, as shown in Figure 11. As the values of x1, x2, x3 and x6 increase, their SHAP values also increase, whereas the relationships between x4, x5 and their SHAP values are complicated and nonlinear: as x4 and x5 increase, their SHAP values may either increase or decrease, so a concrete relationship is difficult to describe. The feature that interacts most closely with x1, x2, x5 and x6 is x3, and the features most closely related to x3 and x4 are x2 and x6, respectively. Based on these feature-dependency results, further experiments can be carried out to analyze the relationship between the two most closely interacting variables and their joint effect on the output. This might help improve the punching shear resistance of FRP reinforced concrete slab-column connections, which the traditional empirical models cannot accomplish because of their comparatively poor predictive performance.
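A dependence plot simply scatters a feature's value against its SHAP value, so the monotonic trends described above can be summarised by a correlation coefficient. The points below are hypothetical values for the slab effective depth x3, chosen only to illustrate the increasing trend seen in Figure 11c.

```python
# Hypothetical (feature value, SHAP value) pairs for x3 (slab depth, mm)
depth = [61.0, 95.0, 120.0, 160.0, 284.0]
shap_x3 = [-180.0, -90.0, -40.0, 30.0, 600.0]

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    var_a = sum((u - ma) ** 2 for u in a)
    var_b = sum((v - mb) ** 2 for v in b)
    return cov / (var_a * var_b) ** 0.5

trend = pearson(depth, shap_x3)   # > 0: deeper slabs push the prediction up
```

For features like x4 and x5, whose trends are nonlinear, such a single coefficient would be close to zero or unstable, which is exactly why the full scatter of the dependence plot is needed there.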

5. Conclusions

In this paper, several machine learning algorithms, namely ANN, SVR, DT and AdaBoost, are adopted to predict the punching shear strength of FRP reinforced concrete slabs. A total of 121 groups of experimental tests are collected, and the test variables are divided into 6 inputs (type of column section, cross-section area of column, slab’s effective depth, compressive strength of concrete, Young’s modulus of FRP reinforcement, reinforcement ratio) and 1 output (punching shear strength). Random subsets of 80% and 20% of the data were used for training and testing, respectively, and 10% of the training data were used for validation. In addition, 3 indicators (RMSE, MAE, R2) are introduced to assess the predictive performance. To improve the performance, the empirical method and PSO are utilized to tune the hyper-parameters of ANN and SVR, respectively. After tuning, both ANN and SVR display good predictive performance. According to the experimental results, the best-performing model is AdaBoost, whose RMSE, MAE and R2 are 29.83, 23.00 and 0.99, respectively. ANN, DT and PSO-SVR perform worse than AdaBoost, but all of them are better than the 6 traditional empirical models introduced in this paper. The predictions of ANN for the training set show a relatively large deviation when the punching shear strength exceeds 1400 kN, whereas DT has almost no error there. Among the traditional models, GB 50010-2010 (2015) has the least deviation in terms of RMSE and R2, with values of 150.41 and 0.73, respectively. Moreover, the model proposed by El-Ghandour et al. (1999) has the best forecasting performance among the three traditional models proposed by researchers.
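The data partition summarised above (80% training, 20% testing, with 10% of the training set held out for validation) can be sketched as follows. The shuffling procedure and seed are assumptions for illustration, since the paper does not specify them.

```python
import random

random.seed(42)                       # assumed seed, for reproducibility only
n = 121                               # specimens collected in the database
indices = list(range(n))
random.shuffle(indices)

n_test = round(0.2 * n)               # 20% held-out test set
test_idx, train_idx = indices[:n_test], indices[n_test:]

n_val = round(0.1 * len(train_idx))   # 10% of the training set for validation
val_idx, fit_idx = train_idx[:n_val], train_idx[n_val:]
```

With 121 specimens this yields 24 test, 10 validation and 87 fitting samples; the validation subset is what drives choices such as the hidden-layer size in Table 2.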
SHAP is used to interpret the prediction process of AdaBoost, the model with the best performance in this study. The global interpretations show that every input variable influences the predictions to a different degree, which demonstrates the rationality of the input variable selection. Furthermore, the slab’s effective depth has the highest impact factor on the predictions, equaling 126.36, more than twice that of the type of column section, which equals 53.45; the Young’s modulus of FRP reinforcement has the lowest impact factor, equaling 17.23. From the SHAP summary plot, all input variables have a positive influence on the predictions except the compressive strength of concrete, whose influence trend is relatively fuzzy. The individual interpretations show the forecasting process for single samples, which helps researchers understand the prediction process of the model thoroughly. In addition, SHAP also provides feature-dependency results, which help researchers understand the relationship between a specific feature and its most closely related feature. According to the analysis results of SHAP, the relationship between material properties and punching shear strength may be further revealed.

Author Contributions

Conceptualization, Y.S. and S.L.; software, Y.S. and J.S.; validation, Y.S. and S.L.; formal analysis, S.L.; writing—original draft preparation, Y.S.; writing—review and editing, S.L.; visualization, Y.S.; supervision, S.L.; project administration, S.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Science Foundation of Zhejiang Province of China, grant number LY22E080016; National Science Foundation of China, grant number 51808499.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

This study is supported by the Engineering Research Centre of Precast Concrete of Zhejiang Province. The help of all members of the Engineering Research Centre is sincerely appreciated. We would also like to express our sincere appreciation to the anonymous referee for valuable suggestions and corrections.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Test Data of FRP Reinforced Concrete Slabs

The sources of the database and the specific material properties are listed in Table A1. The input variables are the type of column section x1, cross-section area of column x2 (A/cm2), slab’s effective depth x3 (d/mm), compressive strength of concrete x4 (f′c/MPa), Young’s modulus of FRP reinforcement x5 (Ef/GPa), and reinforcement ratio x6 (ρf/%). The output variable is the experimental value of punching shear strength y (V/kN). The column section has three types: square (x1 = 1), circle (x1 = 2), rectangle (x1 = 3). In order to use the experimental data for comparative analysis between models, the concrete compressive strengths obtained from different specimen types (cylinder and cube) were unified. Furthermore, to prevent errors caused by differing data precision, the input parameters are rounded to two decimal places, except for the column section type x1.
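The encoding just described can be sketched as a small helper; the function name and interface are illustrative, not the authors' actual preprocessing code.

```python
# Integer codes for the column-section type, as defined for Table A1
SECTION_CODE = {"square": 1, "circle": 2, "rectangle": 3}

def encode_row(section, area_cm2, depth_mm, fc_mpa, ef_gpa, rho_pct):
    """Return the input vector [x1, x2, x3, x4, x5, x6] for one specimen.

    Continuous inputs are rounded to two decimal places; x1 stays an
    integer code.
    """
    x1 = SECTION_CODE[section]
    return [x1] + [round(v, 2) for v in (area_cm2, depth_mm, fc_mpa, ef_gpa, rho_pct)]

# First specimen of Table A1 (Ahmad et al. [31], SN1)
row = encode_row("square", 56.25, 61.0, 42.4, 113.0, 0.95)
```

Treating the section type as a small integer code keeps the feature space compact, though one-hot encoding would be an alternative for algorithms sensitive to ordinal assumptions.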
Table A1. Database of punching shear strength for FRP reinforced concrete slabs.
| Reference | Specimen | x1 | x2 | x3 | x4 | x5 | x6 | y |
|---|---|---|---|---|---|---|---|---|
| Ahmad et al. [31] (1994) | SN1 | 1 | 56.25 | 61.00 | 42.40 | 113.00 | 0.95 | 93.00 |
| | SN2 | 1 | 56.25 | 61.00 | 39.60 | 113.00 | 0.95 | 78.00 |
| | SN3 | 1 | 100.00 | 61.00 | 36.00 | 113.00 | 0.95 | 96.00 |
| | SN4 | 1 | 100.00 | 61.00 | 36.60 | 113.00 | 0.95 | 99.00 |
| Banthia et al. [32] (1995) | | 2 | 78.54 | 55.00 | 41.00 | 100.00 | 0.31 | 65.00 |
| | | 2 | 78.54 | 55.00 | 52.90 | 100.00 | 0.31 | 61.00 |
| Matthys et al. [6] (2000) | C1 | 2 | 176.72 | 96.00 | 36.70 | 91.80 | 0.27 | 181.00 |
| | C1′ | 2 | 415.48 | 96.00 | 37.30 | 91.80 | 0.27 | 189.00 |
| | C2 | 2 | 176.72 | 95.00 | 35.70 | 95.00 | 1.05 | 255.00 |
| | C2′ | 2 | 415.48 | 95.00 | 36.30 | 95.00 | 1.05 | 273.00 |
| | C3 | 2 | 176.72 | 126.00 | 33.80 | 92.00 | 0.52 | 347.00 |
| | C3′ | 2 | 415.48 | 126.00 | 34.30 | 92.00 | 0.52 | 343.00 |
| | CS | 2 | 176.72 | 95.00 | 32.60 | 147.00 | 0.19 | 142.00 |
| | CS′ | 2 | 415.48 | 95.00 | 33.20 | 147.00 | 0.19 | 150.00 |
| | H1 | 2 | 176.72 | 95.00 | 118.00 | 37.30 | 0.62 | 207.00 |
| | H2 | 2 | 176.72 | 89.00 | 35.80 | 40.70 | 3.76 | 231.00 |
| | H2′ | 2 | 50.27 | 89.00 | 35.90 | 40.70 | 3.76 | 171.00 |
| | H3 | 2 | 176.72 | 122.00 | 32.10 | 44.80 | 1.22 | 237.00 |
| | H3′ | 2 | 50.27 | 122.00 | 32.10 | 44.80 | 1.22 | 217.00 |
| Khanna et al. [33] (2000) | 1 | 3 | 1250.00 | 138.00 | 35.00 | 42.00 | 2.40 | 756.00 |
| El-Ghandour et al. [34] (2003) | SG1 | 1 | 400.00 | 142.00 | 32.00 | 45.00 | 0.18 | 170.00 |
| | SC1 | 1 | 400.00 | 142.00 | 32.80 | 110.00 | 0.15 | 229.00 |
| | SG2 | 1 | 400.00 | 142.00 | 46.40 | 45.00 | 0.38 | 271.00 |
| | SG3 | 1 | 400.00 | 142.00 | 30.40 | 45.00 | 0.38 | 237.00 |
| | SC2 | 1 | 400.00 | 142.00 | 29.60 | 110.00 | 0.35 | 317.00 |
| Ospina et al. [7] (2003) | GFR-1 | 1 | 625.00 | 120.00 | 29.50 | 34.00 | 0.73 | 217.00 |
| | GFR-2 | 1 | 625.00 | 120.00 | 28.90 | 34.00 | 1.46 | 260.00 |
| | NEF-1 | 1 | 625.00 | 120.00 | 37.50 | 28.40 | 0.87 | 206.00 |
| Hussein et al. [35] (2004) | G-S1 | 1 | 625.00 | 100.00 | 40.00 | 42.00 | 1.18 | 249.00 |
| | G-S3 | 1 | 625.00 | 100.00 | 29.00 | 42.00 | 1.67 | 240.00 |
| | G-S4 | 1 | 625.00 | 100.00 | 26.00 | 42.00 | 0.95 | 210.00 |
| El-Gamal et al. [36] (2005) | G-S1 | 3 | 1500.00 | 163.00 | 49.60 | 44.60 | 1.00 | 740.00 |
| | G-S2 | 3 | 1500.00 | 159.00 | 44.30 | 38.50 | 1.99 | 712.00 |
| | G-S3 | 3 | 1500.00 | 159.00 | 49.20 | 46.50 | 1.21 | 732.00 |
| | C-S1 | 3 | 1500.00 | 156.00 | 49.60 | 122.50 | 0.35 | 674.00 |
| | C-S2 | 3 | 1500.00 | 165.00 | 44.30 | 122.50 | 0.69 | 799.00 |
| Zhang et al. [37] (2005) | GS2 | 1 | 625.00 | 100.00 | 35.00 | 42.00 | 1.05 | 218.00 |
| | GSHS | 1 | 625.00 | 100.00 | 71.00 | 42.00 | 1.18 | 275.00 |
| Zaghloul [38] (2007) | ZJF5 | 1 | 625.00 | 75.00 | 44.80 | 100.00 | 1.33 | 234.00 |
| Ramzy et al. [12] (2008) | F1 | 1 | 400.00 | 82.00 | 37.40 | 46.00 | 1.10 | 165.00 |
| | F2 | 1 | 400.00 | 112.00 | 33.00 | 46.00 | 0.81 | 170.00 |
| | F3 | 1 | 400.00 | 82.00 | 38.20 | 46.00 | 1.29 | 210.00 |
| | F4 | 1 | 400.00 | 82.00 | 39.70 | 46.00 | 1.54 | 230.00 |
| Lee et al. [8] (2009) | GFU1 | 1 | 506.25 | 110.00 | 36.30 | 48.20 | 1.18 | 222.00 |
| | GFB2 | 1 | 506.25 | 110.00 | 36.30 | 48.20 | 2.15 | 246.00 |
| | GFB3 | 1 | 506.25 | 110.00 | 36.30 | 48.20 | 3.00 | 248.00 |
| Xiao [39] (2010) | A | 1 | 225.00 | 130.00 | 22.16 | 45.60 | 0.42 | 176.40 |
| | B-2 | 1 | 225.00 | 130.00 | 32.48 | 45.60 | 0.42 | 209.40 |
| | B-3 | 1 | 225.00 | 130.00 | 32.40 | 45.60 | 0.55 | 245.30 |
| | B-4 | 1 | 225.00 | 130.00 | 32.80 | 45.60 | 0.29 | 166.60 |
| | B-6 | 1 | 225.00 | 130.00 | 33.20 | 45.60 | 0.42 | 217.20 |
| | B-7 | 1 | 225.00 | 130.00 | 28.32 | 45.60 | 0.42 | 221.50 |
| | C | 1 | 225.00 | 130.00 | 46.05 | 45.60 | 0.42 | 252.50 |
| Bouguerra et al. [9] (2011) | G-200-N | 3 | 1500.00 | 155.00 | 49.10 | 43.00 | 1.20 | 732.00 |
| | G-175-N | 3 | 1500.00 | 135.00 | 35.20 | 43.00 | 1.20 | 484.00 |
| | G-150-N | 3 | 1500.00 | 110.00 | 35.20 | 43.00 | 1.20 | 362.00 |
| | G-175-H | 3 | 1500.00 | 135.00 | 64.80 | 43.00 | 1.20 | 704.00 |
| | G-175-N-0.7 | 3 | 1500.00 | 135.00 | 53.10 | 43.00 | 0.70 | 549.00 |
| | G-175-N-0.35 | 3 | 1500.00 | 137.00 | 53.10 | 43.00 | 0.35 | 506.00 |
| | C-175-N | 3 | 1500.00 | 140.00 | 40.30 | 122.00 | 0.40 | 530.00 |
| Hassan et al. [40] (2013) | G(0.7)30/20 | 1 | 900.00 | 134.00 | 34.30 | 48.20 | 0.71 | 329.00 |
| | G(1.6)30/20 | 1 | 900.00 | 131.50 | 38.60 | 48.10 | 1.56 | 431.00 |
| | G(0.7)45/20 | 1 | 2025.00 | 134.00 | 45.40 | 48.20 | 0.71 | 400.00 |
| | G(1.6)45/20 | 1 | 2025.00 | 131.50 | 32.40 | 48.10 | 1.56 | 504.00 |
| | G(0.3)30/35 | 1 | 900.00 | 284.00 | 34.30 | 48.20 | 0.34 | 825.00 |
| | G(0.7)30/35 | 1 | 900.00 | 284.00 | 39.40 | 48.10 | 0.73 | 1071.00 |
| | G(0.3)45/35 | 1 | 2025.00 | 284.00 | 48.60 | 48.20 | 0.34 | 911.00 |
| | G(0.7)45/35 | 1 | 2025.00 | 281.50 | 29.60 | 48.10 | 0.73 | 1248.00 |
| | G(1.6)30/20-H | 1 | 900.00 | 131.00 | 75.80 | 57.40 | 1.56 | 547.00 |
| | G(1.2)30/20 | 1 | 900.00 | 131.00 | 37.50 | 64.90 | 1.21 | 438.00 |
| | G(1.6)30/35 | 1 | 900.00 | 275.00 | 38.20 | 56.70 | 1.61 | 1492.00 |
| | G(1.6)30/35-H | 1 | 900.00 | 275.00 | 75.80 | 56.70 | 1.61 | 1600.00 |
| | G(0.7)30/20-B | 1 | 900.00 | 131.00 | 38.60 | 48.20 | 0.73 | 386.00 |
| | G(1.6)45/20-B | 1 | 2025.00 | 131.00 | 39.40 | 48.10 | 1.56 | 511.00 |
| | G(0.3)30/35-B | 1 | 900.00 | 284.00 | 39.40 | 48.20 | 0.34 | 781.00 |
| | G(1.6)30/20-B | 1 | 900.00 | 131.00 | 32.40 | 48.10 | 1.56 | 451.00 |
| | G(0.3)45/35-B | 1 | 2025.00 | 284.00 | 32.40 | 48.20 | 0.34 | 1020.00 |
| | G(0.7)30/35-B-1 | 1 | 900.00 | 281.00 | 29.60 | 48.10 | 0.73 | 1027.00 |
| | G(0.7)30/35-B-2 | 1 | 900.00 | 281.00 | 46.70 | 48.10 | 0.73 | 1195.00 |
| | G(0.7)37.5/27.5-B-2 | 1 | 1406.25 | 209.00 | 32.30 | 48.20 | 0.72 | 830.00 |
| Nguyen-Minh et al. [10] (2013) | GSL-0.4 | 1 | 400.00 | 129.00 | 39.00 | 48.00 | 0.48 | 180.00 |
| | GSL-0.6 | 1 | 400.00 | 129.00 | 39.00 | 48.00 | 0.68 | 212.00 |
| | GSL-0.8 | 1 | 400.00 | 129.00 | 39.00 | 48.00 | 0.92 | 244.00 |
| Elgabbas et al. [41] (2016) | S2-B | 3 | 1500.00 | 167.00 | 48.81 | 64.80 | 0.70 | 548.30 |
| | S3-B | 3 | 1500.00 | 169.00 | 42.20 | 69.30 | 0.69 | 664.60 |
| | S4-B | 3 | 1500.00 | 167.00 | 42.20 | 64.80 | 0.70 | 565.90 |
| | S5-B | 3 | 1500.00 | 167.00 | 47.90 | 64.80 | 0.99 | 716.40 |
| | S6-B | 3 | 1500.00 | 167.00 | 47.90 | 64.80 | 0.42 | 575.80 |
| | S7-B | 3 | 1500.00 | 167.00 | 47.90 | 64.80 | 0.42 | 436.40 |
| Gouda et al. [42,43] (2016) | GN-0.65 | 1 | 900.00 | 160.00 | 42.00 | 68.00 | 0.65 | 363.00 |
| | GN-0.98 | 1 | 900.00 | 160.00 | 38.00 | 68.00 | 0.98 | 378.00 |
| | GN-1.30 | 1 | 900.00 | 160.00 | 39.00 | 68.00 | 1.13 | 425.00 |
| | GH-0.65 | 1 | 900.00 | 160.00 | 70.00 | 68.00 | 0.65 | 380.00 |
| | G-00-XX | 1 | 900.00 | 160.00 | 38.00 | 68.00 | 0.65 | 421.00 |
| | G-30-XX | 1 | 900.00 | 160.00 | 42.00 | 68.00 | 0.65 | 296.00 |
| | R-15-XX | 1 | 900.00 | 160.00 | 40.00 | 63.10 | 0.65 | 320.00 |
| Hussein et al. [44] (2018) | H-1.0-XX | 1 | 900.00 | 160.00 | 80.00 | 65.00 | 0.98 | 461.00 |
| | H-1.5-XX | 1 | 900.00 | 160.00 | 84.00 | 65.00 | 1.46 | 541.00 |
| | H-2.0-XX | 1 | 900.00 | 160.00 | 87.00 | 65.00 | 1.93 | 604.00 |
| Eladawy et al. [45] (2019) | G1(1.06) | 1 | 900.00 | 151.00 | 52.00 | 62.60 | 1.06 | 140.00 |
| | G2(1.51) | 1 | 900.00 | 151.00 | 46.00 | 62.60 | 1.51 | 140.00 |
| | G3(1.06)-SL | 1 | 900.00 | 151.00 | 46.00 | 62.60 | 1.06 | 180.00 |
| Gu [46] (2020) | A30-1 | 1 | 900.00 | 88.00 | 27.40 | 51.10 | 1.28 | 191.00 |
| | A30-2 | 1 | 900.00 | 108.00 | 27.30 | 51.10 | 1.05 | 289.00 |
| | A30-3 | 1 | 900.00 | 138.00 | 26.20 | 51.10 | 0.82 | 413.00 |
| | A30-4 | 1 | 1225.00 | 86.00 | 26.80 | 51.10 | 1.31 | 209.00 |
| | A40-1 | 1 | 1225.00 | 88.00 | 28.20 | 51.10 | 1.28 | 232.00 |
| | A40-2 | 1 | 1225.00 | 88.00 | 26.40 | 54.10 | 0.89 | 221.00 |
| | A40-3 | 1 | 900.00 | 88.00 | 28.60 | 51.10 | 1.28 | 236.00 |
| | A50-1 | 1 | 900.00 | 88.00 | 29.20 | 51.10 | 1.28 | 253.00 |
| | A50-2 | 1 | 900.00 | 90.00 | 32.20 | 54.10 | 0.87 | 237.00 |
| | A50-3 | 1 | 1225.00 | 88.00 | 26.70 | 51.10 | 1.28 | 280.00 |
| Zhou [47] (2020) | S40-1 | 1 | 900.00 | 88.00 | 32.30 | 51.10 | 0.98 | 187.00 |
| | S50-1 | 1 | 900.00 | 86.00 | 43.20 | 54.40 | 0.70 | 134.00 |
| Eladawy et al. [48] (2020) | G4(1.06)-H | 1 | 900.00 | 151.00 | 92.00 | 62.60 | 1.06 | 140.00 |
| Mohammed et al. [49] (2021) | 0F-60S | 1 | 625.00 | 125.00 | 38.20 | 50.60 | 2.81 | 463.00 |
| | 0F-80F | 1 | 625.00 | 125.00 | 38.20 | 50.60 | 2.11 | 486.00 |
| | 0F-110S | 1 | 625.00 | 125.00 | 38.20 | 50.60 | 1.53 | 436.00 |
| | 1.25F-60S | 1 | 625.00 | 125.00 | 39.80 | 50.60 | 2.81 | 455.00 |
| | 1.25F-80S | 1 | 625.00 | 125.00 | 39.80 | 50.60 | 2.11 | 506.00 |
| | 1.25F-110S | 1 | 625.00 | 125.00 | 39.80 | 50.60 | 1.53 | 498.00 |

References

  1. Hawkins, N.M.; Criswell, M.E.; Roll, F. Shear Strength of Slabs without Shear Reinforcement. ACI. Struct. J. 1974, 42, 677–720. [Google Scholar]
  2. Muttoni, A. Punching shear strength of reinforced concrete slabs without transverse reinforcement. ACI. Struct. J. 2008, 105, 440–450. [Google Scholar]
  3. Shu, J.; Belletti, B.; Muttoni, A.; Scolari, M.; Plos, M. Internal force distribution in RC slabs subjected to punching shear. Eng. Struct. 2017, 153, 766–781. [Google Scholar] [CrossRef] [Green Version]
  4. Truong, G.T.; Choi, K.K.; Kim, C.S. Punching shear strength of interior concrete slab-column connections reinforced with FRP flexural and shear reinforcement. J. Build. Eng. 2022, 46, 103692. [Google Scholar] [CrossRef]
  5. Zhang, D.; Jin, W.; Gao, J. Investigation and detection on corrosion of concrete structure in marine environment. Proc. Spie 2009, 15, 317–324. [Google Scholar]
  6. Matthys, S.; Taerwe, L. Concrete Slabs Reinforced with FRP Grids. II: Punching Resistance. J. Compos. Constr. 2000, 4, 154–161. [Google Scholar] [CrossRef]
  7. Ospina, C.E.; Alexander, S.D.B.; Cheng, J.J.R. Punching of Two-Way Concrete Slabs with Fiber-Reinforced Polymer Reinforcing Bars or Grids. ACI. Struct. J. 2003, 100, 589–598. [Google Scholar]
  8. Lee, J.H.; Yoon, Y.S.; Cook, W.D.; Mitchell, D. Improving Punching Shear Behavior of Glass Fiber-Reinforced Polymer Reinforced Slabs. ACI. Struct. J. 2009, 106, 427–434. [Google Scholar]
  9. Bouguerra, K.; Ahmed, E.A.; El-Gamal, S.; Benmokrane, B. Testing of full-scale concrete bridge deck slabs reinforced with fiber-reinforced polymer (FRP) bars. Constr. Build. Mater. 2011, 25, 3956–3965. [Google Scholar] [CrossRef]
  10. Nguyen-Minh, L.; Rovnak, M. Punching Shear Resistance of Interior GFRP Reinforced Slab-Column Connections. J. Compos. Constr. 2013, 17, 2–13. [Google Scholar] [CrossRef]
  11. El-Gamal, S.E. Finite Element Analysis of Concrete Bridge Decks Reinforced with Fiber Reinforced Polymer Bars. J. Eng. Res-Kuwait. 2014, 11, 50. [Google Scholar]
  12. Mahmoud, Z.; Salma, T. Punching behavior and strength of slab-column connection reinforced with glass fiber rebars. In Proceedings of the 7th International Conference on FRP Composites in Civil Engineering, Vancouver, BC, Canada, 20–22 August 2014. [Google Scholar]
  13. Vu, D.T.; Hoang, N.D. Punching shear capacity estimation of FRP-reinforced concrete slabs using a hybrid machine learning approach. Struct. Infrastruct. Eng. 2015, 12, 1153–1161. [Google Scholar] [CrossRef]
  14. MOHURD (Ministry of Housing and Urban-Rural Development of the People’s Republic of China). GB 50010—2010 Code for Design of Concrete Structures; China Building Industry Press: Beijing, China, 2015; pp. 232–236. [Google Scholar]
  15. ACI (American Concrete Institute). Building Code Requirements for Structure Concrete (ACI 318-14) and Commentary; American Concrete Institute: Farmington Hills, MI, USA, 2014; Volume 69, p. 5. [Google Scholar]
  16. El-Ghandour, A.W.; Pilakoutas, K.; Waldron, P. New approach for punching shear capacity prediction of fiber reinforced polymer reinforced concrete flat slabs. ACI. Struct. J. 1999, 188, 135–144. [Google Scholar]
  17. El-Ghandour, A.W.; Pilakoutas, K.; Waldron, P. Punching shear behavior and design of FRP RC flat slabs. In Proceedings of the International Workshop on Punching Shear Capacity of RC Slabs, Stockholm, Sweden, 1 January 2000. [Google Scholar]
  18. Ju, M.; Ju, J.; Sim, J. A new formula of punching shear strength for fiber reinforced polymer (FRP) or steel reinforced two-way concrete slabs. Compos. Struct. 2021, 259, 113471. [Google Scholar] [CrossRef]
  19. Geetha, N.K.; Bridjesh, P. Overview of machine learning and its adaptability in mechanical engineering. Mater. Today 2020, 4. [Google Scholar] [CrossRef]
  20. Mangalathu, S.; Karthikeyan, K.; Feng, D.C.; Jeon, J.S. Machine-learning interpretability techniques for seismic performance assessment of infrastructure systems. Eng. Struct. 2022, 250, 112883. [Google Scholar] [CrossRef]
  21. Chen, S.Z.; Feng, D.C.; Han, W.S.; Wu, G. Development of data-driven prediction model for CFRP-steel bond strength by implementing ensemble learning algorithms. Constr. Build. Mater. 2021, 303, 124470. [Google Scholar] [CrossRef]
  22. Feng, D.C.; Wang, W.J.; Mangalathu, S.; Hu, G.; Wu, T. Implementing ensemble learning methods to predict the shear strength of RC deep beams with/without web reinforcements. Eng. Struct. 2021, 235, 111979. [Google Scholar] [CrossRef]
  23. Fu, B.; Chen, S.Z.; Liu, X.R.; Feng, D.C. A probabilistic bond strength model for corroded reinforced concrete based on weighted averaging of non-fine-tuned machine learning models. Constr. Build. Mater. 2022, 318, 125767. [Google Scholar] [CrossRef]
  24. Nasiri, S.; Khosravani, M.R. Machine learning in predicting mechanical behavior of additively manufactured parts. J. Mater. Res. Technol. 2021, 14, 1137–1153. [Google Scholar] [CrossRef]
  25. Hoang, N.D. Estimating punching shear capacity of steel fibre reinforced concrete slabs using sequential piecewise multiple linear regression and artificial neural network. Measurement 2019, 137, 58–70. [Google Scholar] [CrossRef]
  26. Nguyen, H.D.; Truong, G.T.; Shin, M. Development of extreme gradient boosting model for prediction of punching shear resistance of r/c interior slabs. Eng. Struct. 2021, 235, 112067. [Google Scholar] [CrossRef]
  27. Mangalathu, S.; Shin, H.; Choi, E.; Jeon, J.S. Explainable machine learning models for punching shear strength estimation of flat slabs without transverse reinforcement. J. Build. Eng. 2021, 39, 102300. [Google Scholar] [CrossRef]
  28. Rahman, A.; Deshpande, P.; Radue, M.S.; Odegard, G.M.; Gowtham, S.; Ghosh, S.; Spear, A.D. A machine learning framework for predicting the shear strength of carbon nanotube-polymer interfaces based on molecular dynamics simulation data. Compos. Sci. Technol. 2021, 207, 108627. [Google Scholar] [CrossRef]
  29. Ilawe, N.V.; Zimmerman, J.A.; Wong, B.M. Breaking Badly: DFT-D2 Gives Sizeable Errors for Tensile Strengths in Palladium-Hydride Solids. J. Chem. Theory Comput. 2015, 11, 5426–5435. [Google Scholar] [CrossRef]
  30. Lundberg, S.M.; Lee, S.I. A Unified Approach to Interpreting Model Predictions. In Proceedings of the 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 25 November 2017. [Google Scholar]
  31. Ahmad, S.H.; Zia, P.; Yu, T.J.; Xie, Y. Punching shear tests of slabs reinforced with 3 dimensional carbon fiber fabric. Concr. Int. 1994, 16, 36–41. [Google Scholar]
  32. Banthia, N.; Al-Asaly, M.; Ma, S. Behavior of Concrete Slabs Reinforced with Fiber-Reinforced Plastic Grid. J. Mater. Civil. Eng. 1995, 7, 252–257. [Google Scholar] [CrossRef]
  33. Khanna, O.S.; Mufti, A.A.; Bakht, B. Experimental investigation of the role of reinforcement in the strength of concrete deck slabs. Can. J. Civil. Eng. 2000, 27, 475–480. [Google Scholar] [CrossRef]
  34. El-Ghandour, A.W.; Pilakoutas, K.; Waldron, P. Punching Shear Behavior of Fiber Reinforced Polymers Reinforced Concrete Flat Slabs: Experimental Study. J. Compos. Constr. 2003, 7, 258–265. [Google Scholar] [CrossRef] [Green Version]
  35. Hussein, A.; Rashid, I. Two-way concrete slabs reinforced with GFRP bars. In Proceeding of the 4th International Conference on Advanced Composite Materials in Bridges and Structures, Calgary, AB, Canada, 10–12 July 2004. [Google Scholar]
  36. El-Gamal, S.; El-Salakawy, E.; Benmokrane, B. Behavior of Concrete Bridge Deck Slabs Reinforced with Fiber-Reinforced Polymer Bars Under Concentrated Loads. ACI Struct. J. 2005, 102, 727–735. [Google Scholar]
  37. Zhang, Q.; Marzouk, H.; Hussein, A. A preliminary study of high-strength concrete two-way slabs reinforced with GFRP bars. In Proceedings of the 33rd CSCE Annual Conference: General Conference and International History Symposium, Toronto, ON, Canada, 1 June 2005. [Google Scholar]
  38. Zaghloul, A. Punching Shear Strength of Interior and Edge Column-Slab Connections in CFRP Reinforced Flat Plate Structures Transferring Shear and Moment. Ph.D. Thesis, Carleton University, Ottawa, ON, Canada, 2007. [Google Scholar]
  39. Xiao, Z. Experimental Study on Two-Way Concrete Slab Subjected to Punching Shear. Master’s Thesis, Zhengzhou University, Zhengzhou, China, 2010. [Google Scholar]
  40. Hassan, M.A.W. Punching Shear Behaviour of Concrete Two-Way Slabs Reinforced with Glass Fiber-Reinforced Polymer (GFRP) Bars. Ph.D. Thesis, Université de Sherbrooke, Sherbrooke, QC, Canada, 2013. [Google Scholar]
  41. Elgabbas, F.; Ahmed, E.A.; Benmokrane, B. Experimental Testing of Concrete Bridge-Deck Slabs Reinforced with Basalt-FRP Reinforcing Bars under Concentrated Loads. J. Bridge. Eng. 2016, 21, 04016029. [Google Scholar] [CrossRef]
  42. Gouda, A.; El-Salakawy, E. Behavior of GFRP-RC Interior Slab-Column Connections with Shear Studs and High-Moment Transfer. J. Compos. Constr. 2016, 20, 04016005. [Google Scholar] [CrossRef]
  43. Gouda, A.; El-Salakawy, E. Punching Shear Strength of GFRP-RC Interior Slab–Column Connections Subjected to Moment Transfer. J. Compos. Constr. 2016, 20, 04015037. [Google Scholar] [CrossRef]
  44. Hussein, A.H.; El-Salakawy, E.F. Punching Shear Behavior of Glass Fiber-Reinforced Polymer–Reinforced Concrete Slab-Column Interior Connections. ACI Struct. J. 2018, 115, 1075–1088. [Google Scholar] [CrossRef]
  45. Eladawy, B.M.; Hassan, M.; Benmokrane, B. Experimental Study of Interior Glass Fiber-Reinforced Polymer-Reinforced Concrete Slab-Column Connections under Lateral Cyclic Load. ACI Struct. J. 2019, 116, 165–180. [Google Scholar] [CrossRef]
  46. Gu, S. Study on The Punching Shear Behavior of FRP Reinforced Concrete Slabs Subjected to Concentric Loading. Master’s Thesis, Zhejiang University of Technology, Zhejiang, China, 2020. [Google Scholar]
  47. Zhou, X. Experimental Study on the Punching Shear Behavior of Square GFRP Reinforced Concrete Slabs. Master’s Thesis, Zhejiang University of Technology, Zhejiang, China, 2020. [Google Scholar]
  48. Eladawy, M.; Hassan, M.; Benmokrane, B.; Ferrier, E. Lateral cyclic behavior of interior two-way concrete slab–column connections reinforced with GFRP bars. Eng. Struct. 2020, 209, 109978. [Google Scholar] [CrossRef]
  49. AlHamaydeh, M.; Anwar Orabi, M. Punching Shear Behavior of Synthetic Fiber–Reinforced Self-Consolidating Concrete Flat Slabs with GFRP Bars. J. Compos. Constr. 2021, 25, 04021029. [Google Scholar] [CrossRef]
  50. Kim, J.; Kim, D.K.; Feng, M.Q.; Yazdani, F. Application of Neural Networks for Estimation of Concrete Strength. J. Mater. Civil. Eng. 2004, 16, 257–264. [Google Scholar] [CrossRef]
  51. Yonaba, H.; Anctil, F.; Fortin, V. Comparing Sigmoid Transfer Functions for Neural Network Multistep Ahead Streamflow Forecasting. J. Hydrol. Eng. 2010, 15, 275–283. [Google Scholar] [CrossRef]
  52. Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning representations by back-propagating errors. Nature 1986, 323, 533–536. [Google Scholar] [CrossRef]
  53. Shen, H.; Wang, Z.; Gao, C.; Qin, J.; Yao, F.; Xu, W. Determining the number of BP neural network hidden layer units. J. Tianjin Univ. Technol. 2008, 24, 13–15. [Google Scholar]
  54. Cortes, C.; Vapnik, V. Support-Vector Networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
  55. Yu, Y.; Li, W.; Li, J.; Nguyen, T.N. A novel optimised self-learning method for compressive strength prediction of high performance concrete. Constr. Build. Mater. 2018, 184, 229–247. [Google Scholar] [CrossRef]
  56. Smola, A.J.; Schölkopf, B. A tutorial on support vector regression. Stat. Comput. 2004, 14, 199–222. [Google Scholar] [CrossRef] [Green Version]
  57. Lin, J.; Wang, G.; Xu, R. Particle Swarm Optimization–Based Finite-Element Analyses and Designs of Shear Connector Distributions for Partial-Interaction Composite Beams. J. Bridge. Eng. 2019, 24, 04019017. [Google Scholar] [CrossRef]
  58. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995. [Google Scholar]
  59. Huang, W.; Liu, H.; Zhang, Y.; Mi, R.; Tong, C.; Xiao, W.; Shuai, B. Railway dangerous goods transportation system risk identification: Comparisons among SVM, PSO-SVM, GA-SVM and GS-SVM. Appl. Soft. Comput. 2021, 109, 107541. [Google Scholar] [CrossRef]
  60. Jiang, W.; Xie, Y.; Li, W.; Wu, J.; Long, G. Prediction of the splitting tensile strength of the bonding interface by combining the support vector machine with the particle swarm optimization algorithm. Eng. Struct. 2021, 230, 111696. [Google Scholar] [CrossRef]
  61. Quinlan, J.R. Induction on decision tree. Mach. Learn. 1986, 1, 81–106. [Google Scholar] [CrossRef] [Green Version]
  62. Rutkowski, L.; Jaworski, M.; Pietruczuk, L.; Duda, P. The CART decision tree for mining data streams. Inform. Sci. 2014, 266, 1–15. [Google Scholar] [CrossRef]
  63. Almuallim, H. An efficient algorithm for optimal pruning of decision trees. Artif. Intell. 1996, 83, 347–362. [Google Scholar] [CrossRef] [Green Version]
  64. Qin, Z.; Lawry, J. ROC Analysis of a Linguistic Decision Tree Merging Algorithm. Comput. Intell. 2004, 33–42. [Google Scholar]
  65. Feng, D.; Cetiner, B.; Azadi Kakavand, M.; Taciroglu, E. Data-Driven Approach to Predict the Plastic Hinge Length of Reinforced Concrete Columns and Its Application. J. Struct. Eng. 2021, 147, 04020332. [Google Scholar] [CrossRef]
  66. Feng, D.; Liu, Z.; Wang, X.; Chen, Y.; Chang, J.; Wei, D.; Jiang, Z. Machine learning-based compressive strength prediction for concrete: An adaptive boosting approach. Constr. Build. Mater. 2020, 230, 117000. [Google Scholar] [CrossRef]
  67. Freund, Y.; Schapire, R.E. A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting. J. Comput. Syst. Sci. 1997, 55, 119–139. [Google Scholar] [CrossRef] [Green Version]
  68. Tiryaki, B. Predicting intact rock strength for mechanical excavation using multivariate statistics, artificial neural networks, and regression trees. Eng. Geol. 2008, 99, 51–60. [Google Scholar] [CrossRef]
  69. ACI (American Concrete Institute). Building Code Requirements for Structure Concrete (ACI 318-19) and Commentary; American Concrete Institute: Farmington Hills, MI, USA, 2019. [Google Scholar]
  70. BSI (British Standards Institution). Structure Use of Concrete BS 8110: Part 1: Code of Practice for Design and Construction: BS 8110: 1997; BSI: London, UK, 1997. [Google Scholar]
  71. Feng, D.; Wang, W.; Mangalathu, S.; Taciroglu, E. Interpretable XGBoost-SHAP Machine-Learning Model for Shear Strength Prediction of Squat RC Walls. J. Struct. Eng. 2021, 147, 04021173. [Google Scholar] [CrossRef]
  72. Deifalla, A. Punching shear strength and deformation for FRP-reinforced concrete slabs without shear reinforcements. Case Stud. Constr. Mat. 2022, 16, e00925. [Google Scholar] [CrossRef]
Figure 1. Punching shear failure mode of FRP reinforced concrete slabs. (a) Stereogram; (b) Profile.
Figure 2. Histograms of input and output variables. (a) x1; (b) x2; (c) x3; (d) x4; (e) x5; (f) x6; (g) y.
Figure 3. Flowchart for implementation of machine learning algorithm.
Figure 4. The predictions of ANN for data set. (a) Training set; (b) Test set.
Figure 5. The predictions of PSO-SVR for data set. (a) Training set; (b) Test set.
Figure 6. The predictions of DT for data set. (a) Training set; (b) Test set.
Figure 7. The predictions of AdaBoost for data set. (a) Training set; (b) Test set.
Figure 8. The predictions of models for the whole data set. (a) Formula (1); (b) Formula (2); (c) Formula (3); (d) Formula (4); (e) Formula (5); (f) Formula (6); (g) ANN; (h) PSO-SVR; (i) DT; (j) AdaBoost.
Figure 9. Global interpretations of PSO-SVR model by SHAP values. (a) SHAP feature importance; (b) SHAP summary plot.
Figure 10. Individual interpretations for selected samples. (a) Sample 1: Specimen G(1.6)30/20; (b) Sample 2: Specimen G(1.6)45/20-B.
Figure 11. Feature dependence plots. (a) x1; (b) x2; (c) x3; (d) x4; (e) x5; (f) x6.
Table 1. Parameters in the punching shear strength samples.
| Parameter | Unit | Minimum | Maximum | Std. Dev | Mean | Type |
|---|---|---|---|---|---|---|
| x1: type of column section | – | 1.00 | 3.00 | 0.75 | 1.44 | Input |
| x2: cross-section area of column | cm² | 50.27 | 2025.00 | 515.94 | 817.72 | Input |
| x3: effective depth of slab | mm | 55.00 | 284.00 | 52.71 | 136.38 | Input |
| x4: compressive strength of concrete | MPa | 22.16 | 118.00 | 14.53 | 41.17 | Input |
| x5: Young's modulus of FRP reinforcement | GPa | 28.40 | 147.00 | 24.34 | 60.36 | Input |
| x6: reinforcement ratio | % | 0.15 | 3.76 | 0.67 | 1.03 | Input |
| y: punching shear strength | kN | 61.00 | 1600.00 | 288.87 | 402.34 | Output |
Table 2. Performance and ranking of ANN which have different neuron numbers in hidden layer.
| Neurons in Hidden Layer | RMSE | MAE | R² | RMSE Rank | MAE Rank | R² Rank |
|---|---|---|---|---|---|---|
| 7 | 100.24 | 80.92 | 0.89 | 5 | 5 | 5 |
| 8 | 87.28 | 71.72 | 0.91 | 2 | 1 | 2 |
| 9 | 86.91 | 73.00 | 0.91 | 1 | 3 | 1 |
| 10 | 87.92 | 72.60 | 0.91 | 3 | 2 | 3 |
| 11 | 94.09 | 79.04 | 0.90 | 4 | 4 | 4 |
| 12 | 101.66 | 83.38 | 0.88 | 6 | 6 | 6 |
| 13 | 109.43 | 89.78 | 0.86 | 7 | 7 | 7 |

RMSE, MAE, and R² are evaluated on the validation set.
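The per-metric rankings in Table 2 can be reproduced mechanically from the validation values. A minimal Python sketch, using the RMSE column (the paper does not state how ties in R² are broken, so only the unambiguous RMSE ranking is shown):

```python
# Validation-set metrics from Table 2: neuron count -> (RMSE, MAE, R2)
val_metrics = {
    7: (100.24, 80.92, 0.89),
    8: (87.28, 71.72, 0.91),
    9: (86.91, 73.00, 0.91),
    10: (87.92, 72.60, 0.91),
    11: (94.09, 79.04, 0.90),
    12: (101.66, 83.38, 0.88),
    13: (109.43, 89.78, 0.86),
}

def rank_by(metric_idx, descending=False):
    """Rank candidate neuron counts by one metric (1 = best)."""
    order = sorted(val_metrics, key=lambda n: val_metrics[n][metric_idx],
                   reverse=descending)
    return {n: i + 1 for i, n in enumerate(order)}

rmse_rank = rank_by(0)                      # lower RMSE is better
best = min(rmse_rank, key=rmse_rank.get)
print(best)  # -> 9, matching the top-ranked configuration in Table 2
```

Selecting by validation RMSE alone already reproduces the table's choice of nine hidden neurons.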
Table 3. Result comparison between machine learning algorithms.
Table 3. Result comparison between machine learning algorithms.
Machine Learning ModelTraining SetTest Set
RMSEMAER2RMSEMAER2
ANN102.6860.090.8844.8038.590.97
PSO-SVR54.8527.920.9633.4626.270.99
DT59.5237.890.9666.5752.930.94
AdaBoost27.2920.700.9938.4032.280.98
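The three indicators reported in Table 3 (RMSE, MAE, R²) follow their standard definitions. A minimal pure-Python sketch; the sample values are illustrative, not drawn from the paper's database:

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Illustrative punching shear strengths (kN) and predictions
y_true = [250.0, 400.0, 620.0, 180.0]
y_pred = [240.0, 415.0, 600.0, 195.0]
print(round(rmse(y_true, y_pred), 2),   # 15.41
      round(mae(y_true, y_pred), 2),    # 15.0
      round(r2(y_true, y_pred), 3))     # 0.992
```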
Table 4. Calculation formulas of traditional empirical model.
| Formula | Origin | Expression |
|---|---|---|
| Formula (1) | GB 50010-2010 (2015) [14] | $V_1 = 0.7\,\beta_h f_t \eta\, b_{0,0.5d}\, d$, with $\eta = \min\!\left(\eta_1 = 0.4 + \dfrac{1.2}{\beta_s},\ \eta_2 = 0.5 + \dfrac{\alpha_s d}{4\, b_{0,0.5d}}\right)$ |
| Formula (2) | ACI 318-19 [62] | $V_2 = \min\!\left(\dfrac{1}{3},\ \dfrac{1}{6}\!\left(1 + \dfrac{2}{\beta_s}\right),\ \dfrac{1}{12}\!\left(2 + \dfrac{\alpha_s d}{b_{0,0.5d}}\right)\right)\lambda_s \sqrt{f_c'}\; b_{0,0.5d}\, d$, with $\lambda_s = \sqrt{\dfrac{2}{1 + 0.004 d}} \le 1$ |
| Formula (3) | BS 8110-97 [63] | $V_3 = 0.79\,(100\rho_f)^{1/3}\left(\dfrac{400}{d}\right)^{1/4}\left(\dfrac{f_{cu}}{25}\right)^{1/3} b_{0,1.5d}\, d$ |
| Formula (4) | El-Ghandour et al. (1999) [16] | $V_4 = 0.33\sqrt{f_c'}\left(\dfrac{E_f}{E_s}\right)^{1/3} b_{0,0.5d}\, d$ |
| Formula (5) | El-Ghandour et al. (2000) [17] | $V_5 = 0.79\left(100\rho_f \cdot 1.8\,\dfrac{E_f}{E_s}\right)^{1/3}\left(\dfrac{400}{d}\right)^{1/4}\left(\dfrac{f_{cu}}{25}\right)^{1/3} b_{0,1.5d}\, d$ |
| Formula (6) | Ospina et al. [7] | $V_6 = 2.77\,(\rho_f f_c')^{1/3}\sqrt{\dfrac{E_f}{E_s}}\; b_{0,1.5d}\, d$ |
β_h is the sectional depth influence coefficient; f_t is the design tensile strength of concrete; b_{0,0.5d} is the perimeter of the critical section for slabs and footings at a distance of d/2 from the column face; β_s is the ratio of the long side to the short side of the sectional shape under local load or concentrated reaction force; α_s is the influence coefficient of the column type; f_cu is the cube compressive strength of concrete; f_c′ is the cylinder compressive strength of concrete; b_{0,1.5d} is the perimeter of the critical section for slabs and footings at a distance of 1.5d from the loaded area; ρ_f is the FRP reinforcement ratio; E_f is the Young's modulus of the FRP reinforcement; E_s is the Young's modulus of steel reinforcement.
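Two of the simpler expressions in Table 4 can be sketched in code. The sketch below assumes SI units (MPa for strengths and moduli, mm for lengths, output in N), a reference steel modulus E_s = 200 GPa, and ρ_f as a dimensionless ratio; the function names and the example geometry are illustrative, not taken from the paper:

```python
def v_el_ghandour_1999(fc, ef, b0_05d, d, es=200_000.0):
    """Formula (4): V = 0.33*sqrt(fc)*(Ef/Es)^(1/3)*b0*d  (MPa, mm -> N)."""
    return 0.33 * fc ** 0.5 * (ef / es) ** (1 / 3) * b0_05d * d

def v_ospina(rho_f, fc, ef, b0_15d, d, es=200_000.0):
    """Formula (6): V = 2.77*(rho_f*fc)^(1/3)*sqrt(Ef/Es)*b0*d  (MPa, mm -> N)."""
    return 2.77 * (rho_f * fc) ** (1 / 3) * (ef / es) ** 0.5 * b0_15d * d

# Illustrative slab: 250 mm square column, d = 150 mm, fc = 40 MPa, Ef = 48 GPa
d = 150.0
b0_05d = 4 * (250.0 + d)   # critical perimeter at d/2 from the column face
v4 = v_el_ghandour_1999(40.0, 48_000.0, b0_05d, d)
print(v4 / 1000.0)          # predicted punching resistance in kN
```

The GB 50010 and ACI 318-19 expressions add section-shape and size-effect coefficients (η, λ_s) on top of the same basic √f_c′·b0·d form.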
Table 5. Result comparison between machine learning and traditional empirical models.
| Indicator | Formula (1) | Formula (2) | Formula (3) | Formula (4) | Formula (5) | Formula (6) | ANN | PSO-SVR | DT | AdaBoost |
|---|---|---|---|---|---|---|---|---|---|---|
| RMSE | 150.41 | 151.71 | 176.15 | 155.01 | 178.09 | 174.47 | 94.08 | 51.32 | 60.99 | 29.83 |
| MAE | 98.48 | 97.10 | 127.06 | 113.41 | 121.75 | 117.97 | 55.82 | 27.59 | 40.87 | 23.00 |
| R² | 0.73 | 0.72 | 0.63 | 0.71 | 0.62 | 0.63 | 0.89 | 0.97 | 0.96 | 0.99 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

Shen, Y.; Sun, J.; Liang, S. Interpretable Machine Learning Models for Punching Shear Strength Estimation of FRP Reinforced Concrete Slabs. Crystals 2022, 12, 259. https://doi.org/10.3390/cryst12020259