Article

Intelligent Design of Construction Materials: A Comparative Study of AI Approaches for Predicting the Strength of Concrete with Blast Furnace Slag

Xiangping Wu, Fei Zhu, Mengmeng Zhou, Mohanad Muayad Sabri Sabri and Jiandong Huang
1 Department of Jewelry Design, KAYA University, Gimhae 50830, Korea
2 School of Materials Engineering, Xuzhou College of Industrial Technology, Xuzhou 221116, China
3 Xuzhou Finance and Economics Branch, Jiangsu Union Technical Institute, Xuzhou 221116, China
4 School of Mines, China University of Mining and Technology, Xuzhou 221116, China
5 Peter the Great St. Petersburg Polytechnic University, 195251 St. Petersburg, Russia
* Authors to whom correspondence should be addressed.
Materials 2022, 15(13), 4582; https://doi.org/10.3390/ma15134582
Submission received: 15 May 2022 / Revised: 19 June 2022 / Accepted: 24 June 2022 / Published: 29 June 2022

Abstract

Concrete production in which cement is partially replaced with green materials has been pursued in recent years as part of the strategy of sustainable development. This study addresses the compressive strength of one type of green concrete containing blast furnace slag. Although some researchers have proposed machine learning models to predict the compressive strength of concrete, few have compared the prediction accuracy of different machine learning models for this task. Firstly, the hyperparameters of a BP neural network (BPNN), support vector machine (SVM), decision tree (DT), random forest (RF), K-nearest neighbor algorithm (KNN), logistic regression (LR), and multiple linear regression (MLR) are tuned by the beetle antennae search algorithm (BAS). Then, the prediction performance of the above seven machine learning models for the compressive strength of concrete is evaluated and compared. The comparison shows that KNN has higher R values and lower RMSE values in both the training set and the test set; that is, KNN is the best of the seven machine learning models for predicting the compressive strength of concrete.

1. Introduction

Concrete is a common building material; because of its low price, excellent performance, simple production process, and other characteristics, it has been widely used in industrial and civil buildings and has become one of the world’s most widely used building materials [1,2,3,4,5,6]. Over time, more and more infrastructure industries have given priority to concrete as a building material [7,8]. During pouring, cavities and poorly compacted zones often form inside the concrete. These defects reduce the strength, compactness, frost resistance, impermeability, and other properties of the concrete, shorten the service life of concrete structures to a certain extent, and may even threaten the safe operation of buildings [9,10,11,12]. Cement is an essential component of concrete, but its production emits a large amount of carbon, which places a burden on the environment [13,14,15]. With the wide application of concrete, its environmental impact has received more and more attention [16,17,18,19]. Under the strategy of sustainable development, there is an urgent need to mitigate the environmental pollution caused by cement production by replacing part of the cement with green materials [20,21,22,23,24,25,26].
Blast furnace slag is a type of industrial waste slag discharged from blast furnaces during pig iron smelting, and it contains a large amount of active substances [27,28,29,30]. Having recognized its value, researchers began to study its application and used it as a supplementary cementitious material to replace part of the cement in concrete, alleviating the environmental pollution caused by cement production and improving the performance of concrete [29,30,31,32]. Shi et al. studied the effect of blast furnace slag fine aggregate produced by three different steel mills on the mechanical properties of high-performance concrete, and the results showed that concrete with blast furnace slag fine aggregate could achieve improved compressive strength at a lower water–cement ratio [33]. Cvetkovic et al. proposed an adaptive network-based fuzzy inference system (ANFIS) to study the influence of blast furnace slag and fly ash on the strength of concrete; the results showed that the addition of blast furnace slag and fly ash is beneficial to concrete strength and that the curing time has the greatest influence on concrete strength [34]. Zhao et al. studied the mechanical and fresh properties of self-compacting concrete containing ground blast furnace slag and hooked-end steel fiber. The results showed that replacing 10% of the cement in self-compacting concrete with slag had a positive effect on the workability of the freshly mixed concrete, whereas replacing 30% of the cement with slag reduced the passing ability and filling ability of the fresh concrete. Blast furnace slag can also improve the bonding properties of the fiber–matrix interface and thus the mechanical properties of self-compacting concrete [35]. Liu et al. studied the influence of a composite admixture of steel slag and blast furnace slag on mortar and concrete; the results showed that the composite admixture may reduce the early compressive strength of concrete but promotes the development of concrete strength over time and is conducive to the self-shrinkage of concrete and the reduction of the adiabatic temperature rise, and these effects are more obvious when the water–solid ratio is low [36].
Engineers usually use laboratory testing to study the performance of concrete. However, laboratory testing has many disadvantages, such as low efficiency and high cost [37,38,39,40,41,42,43]. To find a more efficient and lower-cost way to predict the performance of concrete, many researchers have turned to machine learning models [44,45,46,47,48,49,50,51]. Salimbahrami et al. studied compressive strength prediction methods for recycled concrete based on the artificial neural network (ANN) and support vector machine (SVM), and the results show that machine learning models predict the compressive strength of recycled concrete well [52]. Al-Shamiri et al. established a prediction model for the compressive strength of high-performance concrete (HPC) using the regularized extreme learning machine (RELM) and compared the RELM model with other machine learning models; the results show that the RELM model predicts the compressive strength of HPC with higher accuracy [53]. Wang et al. proposed a model combining random forest and support vector machine (RF-SVM) to predict the impermeability of concrete and compared its predictions with those of a BP neural network model and a single SVM model; the results show that the RF-SVM model fits and predicts the impermeability of concrete better [54]. Nilsen et al. proposed using linear regression and random forest machine learning methods to predict the coefficient of thermal expansion (CTE) of concrete, to avoid time-consuming and expensive CTE measurements, and achieved a good prediction effect [55]. The machine learning models above have achieved good results in concrete performance prediction [56,57,58,59,60,61,62,63,64,65,66,67,68]. However, most researchers only consider the prediction performance of their own proposed model when studying the prediction of concrete properties by machine learning. Few researchers compare the prediction performance of various machine learning models and select the model with the best prediction performance to predict the properties of concrete.
Strength is an important index for measuring the quality of concrete with blast furnace slag. To ensure its quality, concrete must reach a certain strength, and the material composition of concrete determines its most critical mechanical index, the compressive strength. To predict the compressive strength of concrete with blast furnace slag more efficiently and economically, this study first uses the beetle antennae search algorithm (BAS) to tune the hyperparameters of the BPNN, SVM, DT, RF, KNN, LR, and MLR models; BAS is chosen for its simple implementation, fast convergence, and low risk of falling into local optima thanks to its decreasing step size strategy [69]. Then, the prediction performance of the above seven models for the compressive strength of concrete with blast furnace slag is compared, and the machine learning model with the best prediction performance is selected.

2. Methodology

2.1. Data Collection

In the past, many researchers focused only on developing new prediction models for the performance of concrete while ignoring the importance of a reliable database for verifying the accuracy of the developed models. In this study, the data on the compressive strength of concrete are collected from published articles to form a reliable database [70]. Cement, water, blast furnace slag, coarse aggregate, fine aggregate, and superplasticizer are the input variables, and the compressive strength of concrete is the output variable. The frequency distribution histogram of each variable in the database is shown in Figure 1. As shown in Figure 1, the frequency distributions of water, coarse aggregate, and superplasticizer are single-peaked, and those of fine aggregate and concrete compressive strength are double-peaked. In short, the frequency distribution histograms of these seven variables show that the data distribution of each variable in the database is reasonable and covers a wide range; that is, the database is suitable for predicting the compressive strength of concrete.
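As an illustration of how such a database can be organized for model training, the following minimal Python sketch loads the variables and creates a train/test split; the file name concrete_slag.csv, the column names, and the 70/30 split are assumptions made for this sketch, not details reported in the paper.

# A minimal sketch of preparing the database for model training.
# The file name and column names are assumed for illustration only.
import pandas as pd
from sklearn.model_selection import train_test_split

data = pd.read_csv("concrete_slag.csv")  # hypothetical export of the collected database
inputs = ["cement", "water", "blast_furnace_slag",
          "coarse_aggregate", "fine_aggregate", "superplasticizer"]
X = data[inputs]                  # mixture proportions (input variables)
y = data["compressive_strength"]  # compressive strength (output variable)

# Hold out part of the data as a test set for later evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)
print(X_train.shape, X_test.shape)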

2.2. Correlation Analysis

To prevent multicollinearity in the machine learning models for predicting the compressive strength of concrete, it is necessary to analyze the correlation between the input variables before training the models. The correlation analysis results for cement, water, blast furnace slag, coarse aggregate, superplasticizer, and fine aggregate are shown in Figure 2. As shown in Figure 2, the correlation coefficients on the diagonal are 1, while those in all other positions are less than 0.6; that is, the correlation coefficient of each variable with itself is 1, and the correlation coefficients between different variables are less than 0.6. These results show that the correlation between cement, water, blast furnace slag, coarse aggregate, superplasticizer, and fine aggregate is low. Therefore, using them as input variables to predict the compressive strength of concrete will not degrade the prediction performance through multicollinearity.
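A correlation matrix of this kind can be computed directly from the input-variable table, as in the following minimal sketch that assumes the DataFrame X from the data-preparation sketch above; the 0.6 threshold mirrors the value discussed in the text.

# A minimal sketch of the pairwise correlation analysis among the input variables.
corr = X.corr(method="pearson")          # 6 x 6 Pearson correlation matrix
print(corr.round(2))

# Flag variable pairs whose absolute correlation exceeds 0.6; an empty list
# indicates a low risk of multicollinearity.
high = [(a, b, corr.loc[a, b])
        for i, a in enumerate(corr.columns)
        for b in corr.columns[i + 1:]
        if abs(corr.loc[a, b]) > 0.6]
print(high)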

2.3. Algorithm

2.3.1. Beetle Antennae Search (BAS)

BAS is an efficient intelligent optimization algorithm. Compared with other optimization algorithms, it can optimize without knowing the specific function form or gradient information, so it has the advantages of low computational cost and fast optimization. The idea of the BAS algorithm is to perform optimization by simulating the foraging behavior of a beetle. The BAS algorithm regards the fitness function value as the concentration of a food odor, so the function takes different values at different positions. The beetle finds the optimum by comparing the odor concentrations received by its left and right antennae until it locates the food; that is, the algorithm finds the optimal function value through repeated iterations and comparisons. The optimization steps of the BAS algorithm are as follows:
(1)
Determine the initial search direction of each beetle. This random direction is determined by the following formula:
$$\vec{d} = \frac{\mathrm{rands}(K, 1)}{\left\| \mathrm{rands}(K, 1) \right\|}$$
where $\mathrm{rands}(\cdot)$ is a random function and $K$ is the spatial dimension.
(2)
Set the step size factor. The step size factor determines the search ability of the beetle, so choosing a larger initial step size helps to enlarge the search range. The step size factor is updated as follows:
$$\xi_{t+1} = \xi_t \cdot eta, \quad t = 1, \ldots, n$$
where $\xi$ is the step size, $eta \in (0, 1)$ is the decreasing factor, $t$ is the current iteration number, and $n$ is the total number of iterations.
The position coordinates of the two antennae of the beetle are updated by the following formula:
$$x_l = x_t - \frac{d_0 \vec{d}}{2}, \qquad x_r = x_t + \frac{d_0 \vec{d}}{2}, \qquad t = 1, 2, \ldots, n$$
where $x_l$ is the position of the left antenna, $x_r$ is the position of the right antenna, $x_t$ is the position of the beetle centroid at iteration $t$, and $d_0$ is the distance between the two antennae.
The fitness function (the mean square error, MSE) is expressed by the odor concentrations at the left and right antennae and is calculated as follows [71,72]:
$$\mathrm{fitness} = \frac{1}{n} \sum_{i=1}^{n} \left( d_{fi} - d_{vi} \right)^2$$
where $d_{fi}$ is the model output for the $i$th sample and $d_{vi}$ is the actual value of the $i$th sample. Minimizing the MSE between the predicted and observed outputs during this process is used to evaluate the predictive performance.
The fitness values at the two antennae are compared, and the beetle moves towards the antenna with the better fitness value, thereby updating its position. The position update formula is as follows:
$$x_{t+1} = x_t - \xi_t \cdot \vec{d} \cdot \mathrm{sign}\big( f(x_l) - f(x_r) \big)$$
where $\xi_t$ is the step size factor of the $t$th iteration and $\mathrm{sign}(\cdot)$ is the sign function.
The code of the BAS algorithm is shown in Algorithm 1 [5,69].
Algorithm 1 The framework of the BAS algorithm
Input: f(x): fitness function; K: dimension of the variables; eta: decrease factor; n: number of iterations; ξ: step factor
Output: optimal solution X_best, f_best
1: Initialize the position of the beetle X_0
2: Initialize a random orientation of the beetle d
3: Set the iteration counter t = 1
4: while (t ≤ n) or (stop criterion) do
5:   X_l, X_r ← calculate the positions of the beetle's antennae using Equation (3)
6:   f(X_l), f(X_r) ← calculate the fitness values using Equation (4)
7:   X_t ← update the position using Equation (5)
8:   f(X_t) ← calculate its fitness value
9:   if f(X_t) < f(X_best) then
10:    f_best ← f(X_t)   (update the current optimal value)
11:    X_best ← X_t   (update the current optimal position)
12:  end if
13:  t ← t + 1
14: end while
15: return X_best, f_best
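For readers who prefer runnable code to pseudocode, the following minimal Python sketch implements the loop of Equations (1)–(5) and Algorithm 1; it is an illustrative implementation, not the authors' original code, and the quadratic test function at the end is an arbitrary example.

# A minimal sketch of the BAS optimizer from Algorithm 1 (illustrative only).
import numpy as np

def bas(fitness, x0, n_iter=100, d0=2.0, step=1.0, eta=0.95, rng=None):
    """Minimize `fitness` starting from x0 using beetle antennae search."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    x_best, f_best = x.copy(), fitness(x)
    for _ in range(n_iter):
        d = rng.standard_normal(x.size)
        d /= np.linalg.norm(d)               # Equation (1): random unit direction
        x_left = x - d0 * d / 2.0            # Equation (3): left antenna
        x_right = x + d0 * d / 2.0           # Equation (3): right antenna
        # Equation (5): step away from the antenna with the worse (larger) fitness
        x = x - step * d * np.sign(fitness(x_left) - fitness(x_right))
        f_x = fitness(x)
        if f_x < f_best:                     # keep the best solution found so far
            x_best, f_best = x.copy(), f_x
        step *= eta                          # Equation (2): shrink the step size
    return x_best, f_best

# Example: minimize a simple quadratic function in two dimensions.
x_opt, f_opt = bas(lambda v: np.sum((v - 3.0) ** 2), x0=[0.0, 0.0], rng=0)
print(x_opt, f_opt)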

2.3.2. Backpropagation Neural Network (BPNN)

BPNN is a multilayer feedforward network prediction model trained with the error backpropagation algorithm. Without knowing an explicit mathematical relationship between the input data and the output data, it can learn the mapping between the input layer and the output layer, so that a prediction can be obtained for a given input. The structure of a BPNN consists of an input layer, hidden layers, and an output layer. The BPNN first takes the influencing variables as inputs, and the output layer then produces the final results after the calculations and adjustments of the hidden layers. Next, the error between the output value and the actual value is calculated and checked against the specified tolerance. If the error is not within the specified range, backpropagation is required to redistribute the weights, and the cycle is repeated until the error falls within the specified range. After the test error reaches the required accuracy, learning ends and a black-box model is obtained, around which testing and prediction are then carried out. The flow chart of the BPNN is shown in Figure 3.
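A BPNN of this kind can be fitted, for example, with scikit-learn's MLPRegressor, as in the minimal sketch below that assumes the X_train/y_train split from the data-preparation sketch; the single hidden layer of 20 neurons is a placeholder within the ranges later listed in Table 1, not the tuned architecture.

# A minimal sketch of a backpropagation neural network regressor.
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

bpnn = make_pipeline(
    StandardScaler(),                       # scale inputs before training the network
    MLPRegressor(hidden_layer_sizes=(20,),  # one hidden layer with 20 neurons (placeholder)
                 max_iter=2000, random_state=0))
bpnn.fit(X_train, y_train)
print("training R^2:", bpnn.score(X_train, y_train))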

2.3.3. Support Vector Machine (SVM)

SVM is a typical supervised learning technique. The basic idea of SVM is to find the maximum-margin hyperplane, that is, to maximize the margin between the samples of different classes and the classification hyperplane and thus obtain the optimal classification hyperplane. For a linearly separable sample set $(x_i, y_i)$, $i = 1, 2, \ldots, n$, $x \in \mathbb{R}^d$, $y \in \{-1, 1\}$, the linear discriminant function in the $d$-dimensional space is expressed as follows:
$$g(x) = w \cdot x + b$$
The equation for classifying hyperplanes is:
$$w \cdot x + b = 0$$
Next, the discriminant function is normalized so that the distance from the classification hyperplane to the nearest samples of the two classes is 1. The classification margin is then $2/\|w\|$, and the maximum margin is reached only when $\|w\|$ is minimized. To ensure that the classification hyperplane separates all samples correctly, the following condition must be satisfied:
$$y_i (w \cdot x_i + b) - 1 \geq 0, \quad i = 1, 2, \ldots, n$$
The classification hyperplane that minimizes $\|w\|$ and satisfies the above condition is called the optimal hyperplane.
Mapping the samples from the low-dimensional space to a high-dimensional space is one way to solve the linearly inseparable problem. In this way, the nonlinear problem is converted into a linear one, and the optimal classification hyperplane can then be solved. Suppose $\phi: \mathbb{R}^d \rightarrow H$ is a nonlinear mapping that transforms the input samples from the low-dimensional space to a high-dimensional feature space; the construction of the optimal hyperplane then only requires inner products in the high-dimensional space of the form $\phi(x_i) \cdot \phi(x_j)$, and $\phi(x_i)$ itself never needs to be computed explicitly. The inner product in the high-dimensional space can be represented by a kernel function $K$, which satisfies the following formula:
$$K(x_i, x_j) = \phi(x_i) \cdot \phi(x_j)$$
The key to converting the nonlinear problem into a linear one is to select an appropriate kernel function $K$. The objective function is then calculated as follows:
$$Q(\alpha) = \sum_{i=1}^{n} \alpha_i - \frac{1}{2} \sum_{i,j=1}^{n} \alpha_i \alpha_j y_i y_j K(x_i, x_j)$$
The classification function is calculated as follows:
$$f(x) = \operatorname{sgn}\left( \sum_{i=1}^{n} \alpha_i y_i K(x_i, x) + b \right)$$
When the above expression is used to calculate the classification function, the other conditions of the algorithm remain unchanged. In short, the main idea of SVM for constructing the optimal hyperplane in a high-dimensional space is to map the input vectors into that high-dimensional feature space through the kernel function.
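Since the target in this study is a continuous strength value, the SVM is applied in its regression form (support vector regression). The following minimal sketch, assuming the X_train/y_train split from the data-preparation sketch above, fits an SVR with the linear kernel listed later in Table 1; the C and tol values are placeholders, not the tuned hyperparameters.

# A minimal sketch of a support vector machine for regression (SVR).
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

svm = make_pipeline(
    StandardScaler(),
    SVR(kernel="linear", C=1.0, tol=1e-3))  # linear kernel as in Table 1; C, tol are placeholders
svm.fit(X_train, y_train)
print("training R^2:", svm.score(X_train, y_train))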

2.3.4. Decision Tree (DT)

The DT is a common prediction method in machine learning. It is a typical classification method that generates partition rules through continuous logical induction over the training data set, and its main forms are the binary tree and the multiway tree. Constructing a decision tree mainly involves tree generation and tree pruning. Tree generation refers to the process of learning decision rules from the training sample data. Tree pruning mainly refers to using a test data set to test, correct, and trim the rules generated above, so as to prevent over-fitting of the decision tree. The selection of the splitting feature is the most critical part of the division process, and the quality of the inductive classification after the tree is built largely depends on the chosen feature evaluation method. The DT algorithm is one of the most widely used inference algorithms because of its high classification accuracy, simple generation procedure, and high tolerance of noisy data.
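Because the target here is a continuous strength value, a regression tree is the relevant variant; the minimal sketch below, assuming the X_train/y_train split from the data-preparation sketch, fits one with placeholder depth and leaf-size settings within the ranges later listed in Table 1.

# A minimal sketch of a decision tree regressor; parameter values are placeholders.
from sklearn.tree import DecisionTreeRegressor

dt = DecisionTreeRegressor(max_depth=10, min_samples_split=2,
                           min_samples_leaf=1, random_state=0)
dt.fit(X_train, y_train)
print("training R^2:", dt.score(X_train, y_train))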

2.3.5. Random Forests (RF)

RF is an algorithm that integrates multiple trees through the bagging idea of ensemble learning. RF aggregates the results of a large number of decision trees built on the training sample set: if the problem studied is a classification problem, the result is the category predicted by the classification trees; if the problem studied is a regression problem, the result is the average of all regression trees. Ensemble learning is one of the most common machine learning ideas at present; its advantage is that combining several models effectively avoids the inherent defects of a single model or a single group of models. RF is a typical ensemble learning algorithm. By aggregating multiple decision trees, RF can overcome the low accuracy and over-fitting of a single decision tree. The construction process of RF is as follows:
(1)
Assuming that the size of the training set is N, m training sample sets are drawn from it with replacement (bootstrap sampling), and m regression trees are constructed from the extracted sample sets.
(2)
In the process of constructing each regression tree, a random subset of the independent variables (no more than the total number of variables) is drawn at each node as candidates for splitting, and each candidate split is scored to determine the optimal branch.
(3)
Each regression tree is grown from top to bottom, and parameters such as the number of RF subtrees, the depth of the RF subtrees, and the maximum and minimum numbers of subtrees are adjusted continuously during branching to optimize the accuracy of the model.
(4)
All the generated regression trees are aggregated to form the RF model, and the prediction performance of the model is assessed by the coefficient of determination and the root mean square error on the test set. If the prediction performance is not satisfactory, the parameters are adjusted continuously during random forest modeling until the expected performance is achieved.
The schematic diagram of random forest construction is shown in Figure 4.
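As an illustration of this aggregation of bootstrapped regression trees, the following minimal sketch, assuming the X_train/y_train split from the data-preparation sketch above, fits a random forest regressor; n_estimators is a placeholder within the range later listed in Table 1.

# A minimal sketch of a random forest regressor.
from sklearn.ensemble import RandomForestRegressor

rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)                   # bagging over many regression trees
print("training R^2:", rf.score(X_train, y_train))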

2.3.6. K-Nearest Neighbor (KNN)

KNN is one of the simplest machine learning methods. The core idea of KNN is that if the k nearest samples to a given sample in the feature space belong to the same category, the sample also belongs to that category; that is, the category of an unknown sample can be determined from the categories of the k samples closest to it. In simple terms, the KNN algorithm calculates the distance between the unknown input sample and all sample points and then takes the K samples with the smallest distances to make a joint decision. A graphical description of the KNN algorithm is shown in Figure 5.
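A minimal sketch of a K-nearest neighbor regressor is given below, assuming the X_train/y_train split from the data-preparation sketch above; the number of neighbors is a placeholder within the range later listed in Table 1, and feature scaling is added because neighbor distances are scale-sensitive.

# A minimal sketch of a K-nearest neighbor regressor.
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

knn = make_pipeline(
    StandardScaler(),                       # distances are scale-sensitive
    KNeighborsRegressor(n_neighbors=5))     # predict from the 5 nearest mixes (placeholder)
knn.fit(X_train, y_train)
print("training R^2:", knn.score(X_train, y_train))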

2.3.7. Logistic Regression (LR)

LR is a generalized linear regression model with the advantages of strong plasticity, fast calculation, and strong generalization ability. The LR algorithm can not only predict the probability of an event occurring under the action of a variety of input variables but also analyze two opposing events. Compared with SVM, ANN, and other self-optimizing training algorithms, LR has advantages in the training time of the model and the time required for prediction. The binary classification problem is the most important application field of LR. In a binary classification problem, LR only distinguishes class 0 and class 1, and its probability distribution formulas are as follows:
$$P(Y = 1 \mid x, w) = \frac{e^{w \cdot x + b}}{1 + e^{w \cdot x + b}} = \frac{1}{1 + e^{-(w \cdot x + b)}}$$
$$P(Y = 0 \mid x, w) = \frac{1}{1 + e^{w \cdot x + b}}$$
where $w \in \mathbb{R}^n$ is the model weight vector, $x \in \mathbb{R}^n$ is the input variable, $Y \in \{0, 1\}$ is the output variable, and $b$ is the bias.
In the case of multiple inputs, the weight vectors and input variables of the model need to be expanded. In this case, the mathematical expression of the LR algorithm is as follows:
$$P(Y = 1 \mid x, w) = \frac{1}{1 + e^{-w^T x}}$$
$$P(Y = 0 \mid x, w) = \frac{1}{1 + e^{w^T x}}$$
where $w = (w_1, w_2, \ldots, w_n)^T$, $x = (x_1, x_2, \ldots, x_n)^T$, and $w_i$ and $x_i$ represent the $i$th dimension of the $w$ and $x$ vectors, respectively.
The regression function is obtained by unifying the above two expressions as follows:
$$f(x) = \frac{1}{1 + e^{-w^T x}}$$
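Logistic regression is inherently a classifier, so one plausible way to apply it to these data, sketched below, is to predict whether a mixture exceeds a strength threshold; the 35 MPa cut-off and the use of scikit-learn here are illustrative assumptions of this sketch, not choices reported in the paper.

# A minimal sketch of logistic regression applied as a binary classifier,
# assuming X_train/X_test and y_train/y_test from the data-preparation sketch.
# The 35 MPa threshold is an arbitrary assumption used only for illustration.
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

labels_train = (y_train > 35.0).astype(int)   # class 1: strength above 35 MPa
labels_test = (y_test > 35.0).astype(int)

lr = make_pipeline(
    StandardScaler(),
    LogisticRegression(C=1.0, tol=1e-4, max_iter=1000))  # C, tol are placeholders
lr.fit(X_train, labels_train)
print("test accuracy:", lr.score(X_test, labels_test))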

2.3.8. Multiple Linear Regression (MLR)

Regression analysis is a common statistical method for dealing with correlated variables. Even for independent and dependent variables without a strictly deterministic functional relationship, regression analysis can determine an approximate functional relationship between them. A conceptual diagram of the regression model is shown in Figure 6.
Regression analysis problems can usually be divided into univariate regression and multiple regression. In real life, a phenomenon is often associated with multiple factors, so it is necessary to form an optimal combination of multiple independent variables to jointly predict the dependent variable. Therefore, multiple linear regression has a wider range of application than univariate regression. The general form of the multiple linear regression problem is as follows:
$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_n x_n + \varepsilon$$
where $\beta_0, \beta_1, \ldots, \beta_n$ are the regression coefficients ($\beta_0$ is the regression constant), $x_1, x_2, \ldots, x_n$ are the independent variables, $y$ is the dependent variable, and $\varepsilon$ is the random error.
In the actual problem, if there are m groups of data, the multiple linear regression model can be expressed as follows:
$$\begin{cases} y_1 = \beta_0 + \beta_1 x_{11} + \beta_2 x_{12} + \cdots + \beta_n x_{1n} + \varepsilon_1 \\ y_2 = \beta_0 + \beta_1 x_{21} + \beta_2 x_{22} + \cdots + \beta_n x_{2n} + \varepsilon_2 \\ \quad \vdots \\ y_m = \beta_0 + \beta_1 x_{m1} + \beta_2 x_{m2} + \cdots + \beta_n x_{mn} + \varepsilon_m \end{cases}$$
where the error terms must satisfy $E(\varepsilon_j) = 0$, $\mathrm{Var}(\varepsilon_j) = \sigma^2$, and $\mathrm{Cov}(\varepsilon_j, \varepsilon_k) = 0$ for $j \neq k$.
The above formula can be written as a matrix:
$$Y = X\beta + \varepsilon$$
where $X = \begin{pmatrix} 1 & x_{11} & \cdots & x_{1n} \\ 1 & x_{21} & \cdots & x_{2n} \\ \vdots & \vdots & & \vdots \\ 1 & x_{m1} & \cdots & x_{mn} \end{pmatrix}$, $Y = (y_1, y_2, \ldots, y_m)^T$, $\beta = (\beta_0, \beta_1, \ldots, \beta_n)^T$, and $\varepsilon = (\varepsilon_1, \varepsilon_2, \ldots, \varepsilon_m)^T$. The hyperparameters of the machine learning models tuned by the BAS algorithm are summarized in Table 1.
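The least-squares estimate of $\beta$ in the matrix form above can be computed as in the following minimal sketch, assuming the X_train/y_train split from the data-preparation sketch; the closed-form solution is shown alongside scikit-learn's LinearRegression for comparison.

# A minimal sketch of multiple linear regression by ordinary least squares.
import numpy as np
from sklearn.linear_model import LinearRegression

mlr = LinearRegression()
mlr.fit(X_train, y_train)
print("intercept (beta_0):", mlr.intercept_)
print("coefficients (beta_1..beta_n):", mlr.coef_)

# Equivalent closed-form least-squares estimate with a column of ones for beta_0.
X_aug = np.column_stack([np.ones(len(X_train)), X_train])
beta_hat = np.linalg.lstsq(X_aug, y_train, rcond=None)[0]
print("least-squares beta:", beta_hat)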

3. Results and Discussion

3.1. Hyperparameter Tuning

To optimize the hyperparameters of the BPNN, SVM, DT, RF, KNN, LR, and MLR models, the BAS algorithm is used in this study, and the optimization results are shown in Figure 7. It can be seen from Figure 7 that the RMSE values of the BPNN, SVM, RF, and KNN models first decline rapidly as the number of iterations increases and then stabilize at lower values, with the RMSE of SVM being the lowest. In contrast, although the RMSE values of the DT, LR, and MLR models are low before iteration, they do not change as the number of iterations increases. That is, BAS has a good hyperparameter tuning effect on the BPNN, SVM, RF, and KNN models, with the best effect on SVM, but no hyperparameter tuning effect on the DT, LR, and MLR models.
To further determine the optimal RMSE values of the above seven machine learning models, this study further tuned their hyperparameters through 10-fold cross-validation; the results are shown in Figure 8. The 10-fold cross-validation is a common method for testing the accuracy of an algorithm: the data set is divided into ten parts, each part is selected in turn as the testing data, and the remaining nine parts are used as the training data. It can be seen from the figures that after BAS hyperparameter tuning, the RMSE values of the BPNN, SVM, RF, and KNN models are lower; that is, their prediction performance for the compressive strength of concrete with blast furnace slag is better.
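As an illustration of this procedure, the following minimal sketch computes the 10-fold cross-validated RMSE for one candidate model, assuming the KNN pipeline and the X_train/y_train split from the earlier sketches; in the study itself, each such evaluation would be driven by the hyperparameters proposed by BAS.

# A minimal sketch of 10-fold cross-validation for one candidate model.
import numpy as np
from sklearn.model_selection import cross_val_score

scores = cross_val_score(knn, X_train, y_train,
                         scoring="neg_root_mean_squared_error", cv=10)
rmse_per_fold = -scores                       # one RMSE value per fold
print("RMSE per fold:", np.round(rmse_per_fold, 2))
print("mean RMSE:", rmse_per_fold.mean())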

3.2. Evaluation of the Model

The box plot of the prediction errors of the concrete compressive strength by the BPNN, SVM, DT, RF, KNN, LR, and MLR models optimized by BAS is shown in Figure 9. It can be seen that KNN performed the best among all the models, as indicated by the minimum residual values. The DT, BPNN, and MLR models showed large residuals, suggesting that they may not be suitable for predicting the compressive strength of the concrete with blast furnace slag. The remaining machine learning models (SVM and RF) showed moderate performance, predicting the mechanical properties of the concrete with blast furnace slag with a certain accuracy.
To select the model with the best prediction performance for the compressive strength of concrete, this study further evaluated the BPNN, SVM, DT, RF, KNN, LR, and MLR models tuned by BAS. The comparison between the predicted and actual values of the training set and test set for the seven models is shown in Figure 10. It can be seen from Figure 10 that the predicted values of the BPNN, SVM, RF, and KNN models agree closely with the actual values, while those of the DT, LR, and MLR models differ considerably from the actual values. For the training set, the RMSE values of the BPNN, SVM, DT, RF, KNN, LR, and MLR models are 5.8238, 0.2376, 11.1465, 3.4754, 1.0299, 9.642, and 7.5262, respectively, and the R values are 0.9306, 0.9999, 0.7007, 0.9809, 0.9978, 0.8658, and 0.8765, respectively; for the test set, the RMSE values are 19.8532, 10.968, 9.6954, 6.4661, 6.2801, 9.3101, and 7.4981, respectively, and the R values are 0.3485, 0.8358, 0.8197, 0.9173, 0.9165, 0.871, and 0.8836, respectively. In other words, BPNN, SVM, RF, and KNN all have higher R values and lower RMSE values in the training set, which shows again that the BPNN, SVM, RF, and KNN models fit the compressive strength of concrete better, while the LR and MLR models have poorer prediction performance. Although SVM has the highest R value and the lowest RMSE value in the training set, its R value decreases greatly and its RMSE value increases greatly in the test set; that is, SVM exhibits an overfitting phenomenon in this study.
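The R and RMSE values reported above can be computed for any fitted model as in the following minimal sketch, which assumes the KNN pipeline and the data split from the earlier sketches; R is taken here as the Pearson correlation coefficient between predicted and actual strengths, which is an assumption about the paper's definition.

# A minimal sketch of computing RMSE and R on the training and test sets.
import numpy as np
from sklearn.metrics import mean_squared_error

def evaluate(model, X, y):
    pred = model.predict(X)
    rmse = np.sqrt(mean_squared_error(y, pred))     # root mean square error
    r = np.corrcoef(y, pred)[0, 1]                  # Pearson correlation coefficient R
    return rmse, r

print("train RMSE/R:", evaluate(knn, X_train, y_train))
print("test RMSE/R:", evaluate(knn, X_test, y_test))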
Figure 11 shows the Taylor diagram of the R values and RMSE values of the BPNN, SVM, DT, RF, KNN, LR, and MLR models. It can be seen more intuitively from Figure 11 that KNN has both a lower RMSE value and a higher R value; that is, among the seven machine learning models, KNN has the highest prediction accuracy for the compressive strength of concrete. This may be because the KNN model makes no assumptions about the data set and is not sensitive to outliers, which can easily appear in concrete design. Moreover, the KNN model mainly relies on a limited number of neighboring samples, rather than on discriminating class domains, to determine the category. Therefore, the KNN model is more suitable for sample sets with more overlapping class domains (and such overlap may well occur in concrete design).
To further analyze the prediction performance of the seven models on the training and test sets, a Monte Carlo simulation was conducted on their RMSE values, and the simulation results are shown in Figure 12. Although the same data were employed for the training group and the test group, it can be seen that the prediction error shows obvious randomness during the Monte Carlo simulation. A possible reason is that these machine learning algorithms differ greatly in their underlying principles, so the run-to-run differences are almost statistically unrelated to the data. Moreover, it can be seen from Figure 12 that BPNN has the highest RMSE value in the training set; that is, the BPNN tuned by BAS has the worst prediction performance in the training set. Although SVM has the lowest RMSE value in the training set, its RMSE value in the test set is relatively high; that is, the prediction of the compressive strength of concrete by the SVM optimized by BAS is not stable. In contrast, the KNN tuned by BAS has lower RMSE values in both the training set and the test set, which again verifies that KNN has the best prediction performance for the compressive strength of concrete among the seven machine learning models mentioned above.

3.3. Importance of Variables

Due to the complexity of the compressive strength of concrete, different admixtures affect it to different degrees. The importance scores of the different input variables for the compressive strength of concrete are shown in Figure 13. It can be seen from Figure 13 that cement has the highest importance score (2.7171); that is, cement is the most important factor affecting the compressive strength of concrete among the input variables of this study. Fine aggregate has the lowest importance score (0.3128); that is, fine aggregate is the variable with the least influence on the compressive strength of concrete among the input variables of this study. The importance scores of cement, water, blast furnace slag, coarse aggregate, superplasticizer, and fine aggregate are all positive; that is, the compressive strength of concrete is proportional to the above six input variables.
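The paper does not state which importance measure produced Figure 13; as one hedged illustration, permutation importance on the fitted KNN model from the earlier sketches could be computed as follows.

# A minimal sketch of one possible way to score variable importance
# (permutation importance); the measure actually used for Figure 13 is not specified.
from sklearn.inspection import permutation_importance

result = permutation_importance(knn, X_test, y_test,
                                n_repeats=30, random_state=0)
for name, score in zip(X_test.columns, result.importances_mean):
    print(f"{name}: {score:.4f}")   # larger score = larger drop in model score when shuffled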

4. Conclusions

Concrete is one of the most widely used building materials, and strength is an important index of its comprehensive performance; to ensure that the quality of concrete meets the requirements, concrete must reach a certain compressive strength. To enhance the compressive strength of concrete and ensure the sustainable development of the concrete industry, the use of mineral admixtures to replace part of the cement in concrete has attracted more and more researchers’ attention. In this study, the prediction effect of BPNN, SVM, DT, RF, KNN, LR, and MLR models tuned by BAS on the compressive strength of concrete containing blast furnace slag was studied, and the following conclusions were obtained:
(1)
The BAS algorithm showed low computational cost, very fast convergence, and global optimization ability when tuning the machine learning models used to predict the mechanical properties of concrete. The comparison of the different machine learning models showed that BAS has a good hyperparameter tuning effect on the BPNN, SVM, RF, and KNN models, but a poor hyperparameter tuning effect on the DT, LR, and MLR models.
(2)
Among the seven machine learning models, SVM, RF, and KNN have higher prediction accuracy for the compressive strength of concrete, but SVM shows over-fitting in predicting the compressive strength of concrete. After further comparison, the KNN model is confirmed to be the model with the highest prediction accuracy for the compressive strength of concrete (R value of 0.9978 on the training set and 0.9165 on the test set).
(3)
Among all the design parameters of the concrete with blast furnace slag, cement has the highest importance score for the compressive strength of concrete, fine aggregate has the lowest, and the importance values of all six input variables are positive. In other words, among the six input variables, cement and fine aggregate have the greatest and least influence on the compressive strength of concrete, respectively, and the compressive strength of concrete is proportional to each of the six input variables in this study.
In future studies, more data can be collected to improve the reliability of the machine learning models, and the obtained prediction model should be applied in actual production practice to provide guidance for industrial production. Moreover, the performance of the models optimized with the BAS algorithm can be compared with that of models tuned by traditional methods to verify the effectiveness of the hyperparameter tuning. Models combining linear and nonlinear components using error series or residuals can also be proposed in the future to improve the efficiency and reliability of the computation.

Author Contributions

Conceptualization, X.W., F.Z. and J.H.; Supervision, J.H.; Writing—original draft, M.Z. and M.M.S.S.; Writing—review and editing, X.W., F.Z., M.Z. and M.M.S.S. All authors have read and agreed to the published version of the manuscript.

Funding

The research is partially funded by the Ministry of Science and Higher Education of the Russian Federation as part of the World-class Research Center program: Advanced Digital Technologies (contract No. 075-15-2022-311 dated 20.04.2022).

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Amario, M.; Pepe, M.; Toledo Filho, R.D. Influence of Recycled Concrete Aggregates on the Rheology of Concrete. In Proceedings of the 5th Iberoamerican Congress of Self-Compacting Concrete and Special Concrete, Valencia, Spain, 5–6 March 2018; pp. 85–94. [Google Scholar]
  2. Jeevanandan, K.; Sreevidya, V. Experimental Investigation on Concrete and Geopolymer Concrete. In Proceedings of the International Conference on Recent Trends in Nanomaterials for Energy, Environmental and Engineering Applications (ICONEEEA), Tiruchirappalli, India, 28–29 March 2020; pp. 307–312. [Google Scholar]
  3. Bressi, S.; Fiorentini, N.; Huang, J.; Losa, M. Crumb rubber modifier in road asphalt pavements: State of the art and statistics. Coatings 2019, 9, 384. [Google Scholar] [CrossRef] [Green Version]
  4. Huang, J.; Leandri, P.; Cuciniello, G.; Losa, M. Mix design and laboratory characterisation of rubberised mixture used as damping layer in pavements. Int. J. Pavement Eng. 2021, 23, 2746–2760. [Google Scholar] [CrossRef]
  5. Ma, H.; Liu, J.; Zhang, J.; Huang, J. Estimating the compressive strength of cement-based materials with mining waste using support vector machine, decision tree, and random forest models. Adv. Civ. Eng. 2021, 2021, 6629466. [Google Scholar] [CrossRef]
  6. Xu, W.; Huang, X.; Huang, J.; Yang, Z. Structural analysis of backfill highway subgrade on the lower bearing capacity foundation using the finite element method. Adv. Civ. Eng. 2021, 2021, 1690168. [Google Scholar] [CrossRef]
  7. Jian, S.-M.; Wu, B. Compressive behavior of compound concrete containing demolished concrete lumps and recycled aggregate concrete. Constr. Build. Mater. 2021, 272, 121624. [Google Scholar] [CrossRef]
  8. Liang, X.; Yu, X.; Chen, C.; Ding, G.; Huang, J. Towards the low-energy usage of high viscosity asphalt in porous asphalt pavements: A case study of warm-mix asphalt additives. Case Stud. Constr. Mater. 2022, 16, e00914. [Google Scholar] [CrossRef]
  9. Kim, C.-G.; Park, H.-G.; Hong, G.-H.; Lee, H.; Suh, J.-I. Shear strength of reinforced concrete-composite beams with prestressed concrete and non-prestressed concrete. ACI Struct. J. 2018, 115, 917–930. [Google Scholar]
  10. Huang, J.; Duan, T.; Lei, Y.; Hasanipanah, M. Finite element modeling for the antivibration pavement used to improve the slope stability of the open-pit mine. Shock Vib. 2020, 2020, 6650780. [Google Scholar] [CrossRef]
  11. Ahmad, M.; Tang, X.-W.; Ahmad, F.; Pirhadi, N.; Wan, X.; Cheng, K. Probabilistic evaluation of cpt-based seismic soil liquefaction potential: Towards the integration of interpretive structural modeling and bayesian belief network. Math. Biosci. Eng. MBE 2021, 18, 9233–9925. [Google Scholar] [CrossRef]
  12. Wang, Q.-A.; Zhang, C.; Ma, Z.-G.; Huang, J.; Ni, Y.-Q.; Zhang, C. Shm deformation monitoring for high-speed rail track slabs and bayesian change point detection for the measurements. Constr. Build. Mater. 2021, 300, 124337. [Google Scholar] [CrossRef]
  13. Pyataev, E.; Zhukov, A.; Vako, K.; Burtseva, M.; Mednikova, E.; Prusakova, M.; Izumova, E. Effective Polymer Concrete on Waste Concrete Production. In Proceedings of the 22nd International Scientific Conference on Construction—The Formation of Living Environment (FORM), Tashkent Inst Irrigat & Agr Mechanizat Engineers, Tashkent, Uzbekistan, 18–21 April 2019; Tashkent Inst Irrigat & Agr Mechanizat Engineers: Tashkent, Uzbekistan, 2019. [Google Scholar]
  14. Huang, J.; Alyousef, R.; Suhatril, M.; Baharom, S.; Alabduljabbar, H.; Alaskar, A.; Assilzadeh, H. Influence of porosity and cement grade on concrete mechanical properties. Adv. Concr. Constr. 2020, 10, 393–402. [Google Scholar]
  15. Huang, J.; Zhou, M.; Yuan, H.; Sabri, M.M.; Li, X. Towards sustainable construction materials: A comparative study of prediction models for green concrete with metakaolin. Buildings 2022, 12, 772. [Google Scholar]
  16. Rahman, S.S.; Khattak, M.J. Roller compacted geopolymer concrete using recycled concrete aggregate. Constr. Build. Mater. 2021, 283, 122624. [Google Scholar] [CrossRef]
  17. Wimalasiri, M.; Robert, D.; Li, C.-Q. Permeability degradation of stressed concrete considering concrete plasticity. J. Mater. Civ. Eng. 2020, 32, 04020265. [Google Scholar] [CrossRef]
  18. Zhang, S.; Fan, Y.; Huang, J.; Shah, S.P. Effect of nano-metakaolinite clay on hydration behavior of cement-based materials at early curing age. Constr. Build. Mater. 2021, 291, 123107. [Google Scholar] [CrossRef]
  19. Huang, J.; Sabri, M.M.; Ulrikh, D.V.; Ahmad, M.; Alsaffar, K.A. Predicting the compressive strength of the cement-fly ash–slag ternary concrete using the firefly algorithm (fa) and random forest (rf) hybrid machine-learning method. Materials 2022, 15, 4193. [Google Scholar] [CrossRef]
  20. Vavrus, M.; Kotes, P. Numerical Comparison of Concrete Columns Strengthened with Layer of Fiber Concrete and Reinforced Concrete. In Proceedings of the 13th International Scientific Conference on Sustainable, Modern and Safe Transport (TRANSCOM), Novy Smokovec, Slovakia, 29–31 May 2019; pp. 920–926. [Google Scholar]
  21. Esparham, A.; Moradikhou, A.B.; Andalib, F.K.; Avanaki, M.J. Strength characteristics of granulated ground blast furnace slag-based geopolymer concrete. Adv. Concr. Constr. 2021, 11, 219–229. [Google Scholar]
  22. Fikri, H.; Krisologus, Y.P.; Permana, R.; Raafidiani, R. Cyclic Behavior of Ground Granulated Blast Furnace Slag (ggbfs) Concrete Beams. In Proceedings of the 3rd International Conference on Innovation in Engineering and Vocational Education (ICIEVE), Bandung, Indonesia, 26 November 2020. [Google Scholar]
  23. Qi, A.; Liu, X.; Wang, Z.; Chen, Z. Mechanical properties of the concrete containing ferronickel slag and blast furnace slag powder. Constr. Build. Mater. 2020, 231, 117120. [Google Scholar] [CrossRef]
  24. Gao, Y.; Huang, J.; Li, M.; Dai, Z.; Jiang, R.; Zhang, J. Chemical modification of combusted coal gangue for u(vi) adsorption: Towards a waste control by waste strategy. Sustainability 2021, 13, 117120. [Google Scholar] [CrossRef]
  25. Huang, J.; Li, X.; Kumar, G.S.; Deng, Y.; Gong, M.; Dong, N. Rheological properties of bituminous binder modified with recycled waste toner. J. Clean. Prod. 2021, 317, 128415. [Google Scholar] [CrossRef]
  26. Huang, J.; Zhou, M.; Yuan, H.; Sabri, M.M.S.; Li, X. Prediction of the compressive strength for cement-based materials with metakaolin based on the hybrid machine learning method. Materials 2022, 15, 3500. [Google Scholar] [CrossRef] [PubMed]
  27. Ayano, T.; Fujii, T. Improvement of concrete properties using granulated blast furnace slag sand. J. Adv. Concr. Technol. 2021, 19, 118–132. [Google Scholar] [CrossRef]
  28. Breitenbuecher, R.; Baecker, J.; Kunz, S.; Ehrenberg, A.; Gerten, C. Optimizing the Acid Resistance of Concrete With Granulated Blast-Furnace Slag. In Proceedings of the 5th International Conference on Concrete Repair, Rehabilitation and Retrofitting (ICCRRR), Cape Town, South Africa, 19–21 November 2018. [Google Scholar]
  29. Salvador, R.P.; Rambo, D.A.S.; Bueno, R.M.; Silva, K.T.; de Figueiredo, A.D. On the use of blast-furnace slag in sprayed concrete applications. Constr. Build. Mater. 2019, 218, 543–555. [Google Scholar] [CrossRef]
  30. Saranya, P.; Nagarajan, P.; Shashikala, A.P. Development of ground-granulated blast-furnace slag-dolomite geopolymer concrete. ACI Mater. J. 2019, 116, 235–243. [Google Scholar]
  31. Topcu, I.B.; Unverdi, A. Properties of High Content Ground Granulated Blast Furnace Slag Concrete. In Proceedings of the 3rd International Sustainable Buildings Symposium (ISBS), Dubai, United Arab Emirates, 15–17 March 2018; pp. 114–126. [Google Scholar]
  32. Mendes, S.E.S.; Oliveira, R.L.N.; Cremonez, C.; Pereira, E.; Pereira, E.; Trentin, P.O.; Medeiros-Junior, R.A. Estimation of electrical resistivity of concrete with blast-furnace slag. ACI Mater. J. 2021, 118, 27–37. [Google Scholar]
  33. Shi, D.; Masuda, Y.; Lee, Y. Experimental Study on Mechanical Properties of High-Strength Concrete Using Blast Furnace Slag Fine Aggregate. In Proceedings of the 1st International Conference on High Performance Structures and Materials Engineering, Beijing, China, 5–6 May 2011; pp. 113–118. [Google Scholar]
  34. Cvetkovic, S. Estimation of factors affecting the concrete strength in presence of blast furnace slag and fly ash using adaptive neuro-fuzzy technique. Struct. Concr. 2021, 22, 1243. [Google Scholar]
  35. Zhao, Y.; Bi, J.H.; Sun, Y.T.; Wang, Z.Y.; Huo, L.Y.; Duan, Y.C. Synergetic effect of ground granulated blast-furnace slag and hooked-end steel fibers on various properties of steel fiber reinforced self-compacting concrete. Struct. Concr. 2022, 23, 268–284. [Google Scholar] [CrossRef]
  36. Liu, Z.G.; Huang, Z.X. Influence of steel slag-superfine blast furnace slag composite mineral admixture on the properties of mortar and concrete. Adv. Civ. Eng. 2021, 2021, 9983019. [Google Scholar] [CrossRef]
  37. Cui, X.; Wang, Q.; Zhang, R.; Dai, J.; Li, S. Machine learning prediction of concrete compressive strength with data enhancement. J. Intell. Fuzzy Syst. 2021, 41, 7219–7228. [Google Scholar] [CrossRef]
  38. Kumar, A.; Arora, H.C.; Kapoor, N.R.; Mohammed, M.A.; Kumar, K.; Majumdar, A.; Thinnukool, O. Compressive strength prediction of lightweight concrete: Machine learning models. Sustainability 2022, 14, 2404. [Google Scholar] [CrossRef]
  39. Pham, A.-D.; Ngo, N.-T.; Nguyen, Q.-T.; Truong, N.-S. Hybrid machine learning for predicting strength of sustainable concrete. Soft Comput. 2020, 24, 14965–14980. [Google Scholar] [CrossRef]
  40. Prayogo, D.; Santoso, D.I.; Wijaya, D.; Gunawan, T.; Widjaja, J.A. Prediction of Concrete Properties Using Ensemble Machine Learning Methods. In Proceedings of the 2nd International Conference on Sustainable Infrastructure (ICSI), Yogyakarta, Indonesia, 28–29 October 2020. [Google Scholar]
  41. Huang, J.; Duan, T.; Zhang, Y.; Liu, J.; Zhang, J.; Lei, Y. Predicting the permeability of pervious concrete based on the beetle antennae search algorithm and random forest model. Adv. Civ. Eng. 2020, 2020, 8863181. [Google Scholar] [CrossRef]
  42. Xu, W.; Huang, X.; Yang, Z.; Zhou, M.; Huang, J. Developing hybrid machine learning models to determine the dynamic modulus (e*) of asphalt mixtures using parameters in witczak 1-40d model: A comparative study. Materials 2022, 15, 1791. [Google Scholar] [CrossRef] [PubMed]
  43. Zhu, F.; Wu, X.; Zhou, M.; Sabri, M.M.; Huang, J. Intelligent design of building materials: Development of an ai-based method for cement-slag concrete design. Materials 2022, 15, 3833. [Google Scholar] [CrossRef] [PubMed]
  44. Van Quan, T.; Viet Quoc, D.; Ho, L.S. Evaluating compressive strength of concrete made with recycled concrete aggregates using machine learning approach. Constr. Build. Mater. 2022, 323, 126578. [Google Scholar] [CrossRef]
  45. Zhang, J.; Xu, J.; Liu, C.; Zheng, J. Prediction of rubber fiber concrete strength using extreme learning machine. Front. Mater. 2021, 7, 582635. [Google Scholar] [CrossRef]
  46. Ziolkowski, P.; Niedostatkiewicz, M. Machine learning techniques in concrete mix design. Materials 2019, 12, 1256. [Google Scholar] [CrossRef] [Green Version]
  47. Ziolkowski, P.; Niedostatkiewicz, M.; Kang, S.-B. Model-based adaptive machine learning approach in concrete mix design. Materials 2021, 14, 1661. [Google Scholar] [CrossRef]
  48. Huang, J.; Asteris, P.G.; Pasha, S.M.K.; Mohammed, A.S.; Hasanipanah, M. A new auto-tuning model for predicting the rock fragmentation: A cat swarm optimization algorithm. Eng. Comput. 2020, 38, 2209–2220. [Google Scholar] [CrossRef]
  49. Huang, J.; Zhang, J.; Gao, Y. Intelligently predict the rock joint shear strength using the support vector regression and firefly algorithm. Lithosphere 2021, 2021, 2467126. [Google Scholar] [CrossRef]
  50. Wang, Q.-A.; Zhang, J.; Huang, J. Simulation of the compressive strength of cemented tailing backfill through the use of firefly algorithm and random forest model. Shock Vib. 2021, 2021, 5536998. [Google Scholar] [CrossRef]
  51. Huang, J.; Zhou, M.; Sabri, M.M.S.; Yuan, H. A novel neural computing model applied to estimate the dynamic modulus (dm) of asphalt mixtures by the improved beetle antennae search. Sustainability 2022, 14, 5938. [Google Scholar] [CrossRef]
  52. Salimbahrami, S.R.; Shakeri, R. Experimental investigation and comparative machine-learning prediction of compressive strength of recycled aggregate concrete. Soft Comput. 2021, 25, 919–932. [Google Scholar] [CrossRef]
  53. Al-Shamiri, A.K.; Yuan, T.F.; Kim, J.H. Non-tuned machine learning approach for predicting the compressive strength of high-performance concrete. Materials 2020, 13, 1023. [Google Scholar] [CrossRef] [Green Version]
  54. Wang, L.; Wu, X.G.; Chen, H.Y.; Zeng, T.M.; IOP. Prediction of Impermeability of the Concrete Structure Based on Random Forest and Support Vector Machine. In Proceedings of the International Conference on Sustainable Development and Environmental Science (ICSDES), Zhengzhou, China, 19–21 June 2020. [Google Scholar]
  55. Nilsen, V.; Pham, L.T.; Hibbard, M.; Klager, A.; Cramer, S.M.; Morgan, D. Prediction of concrete coefficient of thermal expansion and other properties using machine learning. Constr. Build. Mater. 2019, 220, 587–595. [Google Scholar] [CrossRef]
  56. Mahdiyar, A.; Jahed Armaghani, D.; Koopialipoor, M.; Hedayat, A.; Abdullah, A.; Yahya, K. Practical risk assessment of ground vibrations resulting from blasting, using gene expression programming and monte carlo simulation techniques. Appl. Sci. 2020, 10, 472. [Google Scholar] [CrossRef] [Green Version]
  57. Lu, S.; Koopialipoor, M.; Asteris, P.G.; Bahri, M.; Armaghani, D.J. A novel feature selection approach based on tree models for evaluating the punching shear capacity of steel fiber-reinforced concrete flat slabs. Materials 2020, 13, 3902. [Google Scholar] [CrossRef]
  58. Koopialipoor, M.; Tootoonchi, H.; Armaghani, D.J.; Mohamad, E.T.; Hedayat, A. Application of deep neural networks in predicting the penetration rate of tunnel boring machines. Bull. Eng. Geol. Environ. 2019, 78, 6347–6360. [Google Scholar] [CrossRef]
  59. Koopialipoor, M.; Noorbakhsh, A.; Noroozi Ghaleini, E.; Jahed Armaghani, D.; Yagiz, S. A new approach for estimation of rock brittleness based on non-destructive tests. Nondestruct. Test. Eval. 2019, 34, 354–375. [Google Scholar] [CrossRef]
  60. Koopialipoor, M.; Nikouei, S.S.; Marto, A.; Fahimifar, A.; Armaghani, D.J.; Mohamad, E.T. Predicting tunnel boring machine performance through a new model based on the group method of data handling. Bull. Eng. Geol. Environ. 2019, 78, 3799–3813. [Google Scholar] [CrossRef]
  61. Hasanipanah, M.; Noorian-Bidgoli, M.; Armaghani, D.J.; Khamesi, H. Feasibility of pso-ann model for predicting surface settlement caused by tunneling. Eng. Comput. 2016, 32, 705–715. [Google Scholar] [CrossRef]
  62. Hasanipanah, M.; Monjezi, M.; Shahnazar, A.; Armaghani, D.J.; Farazmand, A. Feasibility of indirect determination of blast induced ground vibration based on support vector machine. Measurement 2015, 75, 289–297. [Google Scholar] [CrossRef]
  63. Hajihassani, M.; Armaghani, D.J.; Marto, A.; Mohamad, E.T. Ground vibration prediction in quarry blasting through an artificial neural network optimized by imperialist competitive algorithm. Bull. Eng. Geol. Environ. 2015, 74, 873–886. [Google Scholar] [CrossRef]
  64. Chen, W.; Hasanipanah, M.; Rad, H.N.; Armaghani, D.J.; Tahir, M. A new design of evolutionary hybrid optimization of svr model in predicting the blast-induced ground vibration. Eng. Comput. 2019, 37, 1455–1471. [Google Scholar] [CrossRef]
  65. Cai, M.; Koopialipoor, M.; Armaghani, D.J.; Thai Pham, B. Evaluating slope deformation of earth dams due to earthquake shaking using mars and gmdh techniques. Appl. Sci. 2020, 10, 1486. [Google Scholar] [CrossRef] [Green Version]
  66. Armaghani, D.J.; Raja, R.S.N.S.B.; Faizi, K.; Rashid, A.S.A. Developing a hybrid pso–ann model for estimating the ultimate bearing capacity of rock-socketed piles. Neural Comput. Appl. 2017, 28, 391–405. [Google Scholar] [CrossRef]
  67. Armaghani, D.J.; Mohamad, E.T.; Narayanasamy, M.S.; Narita, N.; Yagiz, S. Development of hybrid intelligent models for predicting tbm penetration rate in hard rock condition. Tunn. Undergr. Space Technol. 2017, 63, 29–43. [Google Scholar] [CrossRef]
  68. Armaghani, D.J.; Mirzaei, F.; Shariati, M.; Trung, N.T.; Shariati, M.; Trnavac, D. Hybrid ann-based techniques in predicting cohesion of sandy-soil combined with fiber. Geomech. Eng. 2020, 20, 191–205. [Google Scholar]
  69. Jiang, X.; Li, S. BAS: Beetle antennae search algorithm for optimization problems. arXiv 2017, arXiv:1710.10724. [Google Scholar] [CrossRef]
  70. Yeh, I.C. Modeling of strength of high-performance concrete using artificial neural networks. Cem. Concr. Res. 1998, 28, 1797–1808. [Google Scholar] [CrossRef]
  71. Silva, D.A.; Alves, G.I.; de Mattos Neto, P.S.; Ferreira, T.A. Measurement of fitness function efficiency using data envelopment analysis. Expert Syst. Appl. 2014, 41, 7147–7160. [Google Scholar] [CrossRef]
  72. Lima Junior, A.R.; Silva, D.A.; Mattos Neto, P.S.; Ferreira, T.A. An Experimental Study of Fitness Function and Time Series Forecasting Using Artificial Neural Networks. In Proceedings of the 12th Annual Conference Companion on Genetic and Evolutionary Computation, Portland, OR, USA, 7–11 July 2010; pp. 2015–2018. [Google Scholar]
Figure 1. Frequency distribution histogram of variables. (a) Cement; (b) Water; (c) Blast furnace slag; (d) Coarse aggregate; (e) Fine aggregate; (f) Superplasticizer; (g) Uniaxial compressive strength.
Figure 2. Correlation analysis results of input variables.
Figure 3. Flow chart of BPNN.
Figure 4. Construction schematic of RF.
Figure 5. Graphic description of KNN (samples are represented by different colors and shapes).
Figure 6. Graphic description of regression.
Figure 7. The relationship between RMSE values and the number of iterations for the different models.
Figure 8. RMSE values for different fold numbers of different models. (a) BPNN; (b) SVM; (c) DT; (d) RF; (e) KNN; (f) LR; (g) MLR (Red lines represent the minimum values of RMSE).
Figure 9. Box plot of the prediction error of different models (+ represents the maximum value).
Figure 10. Comparison of predicted value with the actual value of different models. (a) BPNN; (b) SVM; (c) DT; (d) RF; (e) KNN; (f) LR; (g) MLR.
Figure 11. Taylor diagrams of test sets for different models.
Figure 12. Monte Carlo simulation (number of Monte Carlo runs vs. RMSE value).
Figure 13. Importance scores of different input variables to the compressive strength of concrete.
Table 1. Hyperparameters of the machine learning models tuned by the BAS algorithm.

Machine Learning Model | Hyperparameter Tuned by the BAS Algorithm | Range (or Requirement) of the Hyperparameter
BPNN | hidden_layer_num | 1–3
BPNN | hidden_layer_size | 1–20
SVM | C_penalty | 0.1–10
SVM | kernel | Linear
SVM | tol | 1 × 10−4–1 × 10−2
DT | criterion | Gini, Entropy
DT | max_depth | 1–100
DT | min_samples_split | 2–10
DT | min_samples_leaf | 1–10
RF | criterion | Gini, Entropy
RF | n_estimators | 1–1000
KNN | neighbors_num | 1–10
LR | tol | 1 × 10−5–1 × 10−3
LR | C_inverse | 0.1–10
