Article

Research on Mining Maximum Subsidence Prediction Based on Genetic Algorithm Combined with XGBoost Model

School of Investigation and Surveying Engineering, Changchun Institute of Technology, Changchun 130021, China
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(16), 10421; https://doi.org/10.3390/su141610421
Submission received: 27 July 2022 / Revised: 16 August 2022 / Accepted: 19 August 2022 / Published: 22 August 2022
(This article belongs to the Special Issue Advances in Intelligent and Sustainable Mining)

Abstract

The extreme gradient boosting (XGBoost) ensemble learning algorithm excels at solving complex nonlinear relational problems. In order to accurately predict the surface subsidence caused by mining, this work introduces a combined genetic algorithm (GA) and XGBoost model for mining subsidence prediction and develops the GA-XGBoost combined model in the Python language. The hyperparameter vector of XGBoost is optimized by the genetic algorithm to improve the prediction accuracy and reliability of the XGBoost model. Model evaluation on a set of domestic (Chinese) mining subsidence cases shows that the GA-XGBoost model achieves an R2 (coefficient of determination) of 0.941, an RMSE (root mean square error) of 0.369, and an MAE (mean absolute error) of 0.308. Compared with classic ensemble learning models such as XGBoost, random forest, and gradient boosting, the GA-XGBoost model achieves higher prediction accuracy and better overall performance than any single machine learning model.

1. Introduction

Coal mining subsidence not only readily causes surface damage, building collapse, and damage to roads and railways, but also triggers a series of safety and environmental problems such as landslides and the loss of surface water, seriously damaging the ecological environment of the mining area [1,2,3]. Predicting the degree of surface damage, the maximum subsidence value, and other basic information in advance can not only guide mine design in terms of production planning and disaster-prevention measures, but can also reduce mining damage and, by optimizing the mining plan and treatment measures beforehand, alleviate the contradiction between resource development and utilization, environmental protection, and public safety. Subsidence damage caused by mining has long been a research hotspot for mining scientists. In recent years, the main research methods for mining subsidence have included similar-material (physical) simulation, computer numerical simulation, and other methods. Similar-material simulation can reproduce the stress distribution of the surrounding rock caused by excavation. It is intuitive and simple for studying complex problems such as mining and provides a valuable reference for engineering applications and scientific research. However, it has disadvantages such as high cost, long cycle times, and limited quantitative parameter output from the simulated excavation. In particular, when the model size is large and the mining conditions are complex, it is difficult to faithfully reproduce the real mining subsidence process [4,5,6,7]. Computer numerical simulation can reproduce the complex dynamic processes and mining conditions encountered during extraction, with low cost and quick results, and has also been widely used in the mining field. However, the reliability of its results is closely related to the analyst's mastery of the numerical software and understanding of the actual mining process [8,9,10]. At present, the main theoretical method for predicting mining subsidence in China is the probability integral method, which is widely used because of its rich theory, rigorous mathematical foundation, and convenient calculation. However, owing to complex mining and geological conditions, there is a certain level of error between the theoretical prediction of the probability integral method and real engineering cases, and its predicted parameters must be carefully determined to improve the accuracy and reliability of the overall prediction [11,12,13].
In recent years, with the development of computer science and technology, artificial intelligence methods have been successfully applied in the field of rock mass engineering [14,15]. More and more mining technicians have begun to introduce artificial intelligence technology to solve the highly nonlinear and complex problem of mining subsidence prediction and have achieved fruitful research results [16,17,18,19]. With the wide application of machine learning, more and more scholars have used it to evaluate land subsidence problems, analyzing large amounts of subsidence data, mining the subsidence laws hidden in the data, and studying the influencing factors and corresponding countermeasures [20,21]. The successful application of machine learning provides new ideas for the study of land subsidence, and its powerful handling of complex nonlinear problems shows great potential [22]. For the subsidence problem caused by coal mining, a major research direction is to use intelligent algorithms and machine learning methods to optimize the predicted parameters of traditional theoretical models and improve prediction accuracy and reliability [23,24]. There are many research methods for mining subsidence, involving a wide range of theories, and each method shows certain applicability and certain defects. With the wide application of machine learning, analyzing data to mine the essential principles of physical phenomena and to identify relevant influencing factors has become an effective way to study complex problems. In view of the above, this paper adopts the classic XGBoost ensemble algorithm to handle the highly nonlinear problem of mining subsidence prediction. Based on field observation data and a comprehensive consideration of mining and geological factors, the approach bypasses the theoretical assumptions of analytical mining models and studies mining subsidence directly from the data. At the same time, given that XGBoost has many hyperparameters and a complex structure, a genetic algorithm (GA) is introduced to optimize the hyperparameters of XGBoost. A GA-XGBoost combined model with adaptive adjustment of prediction accuracy is established, which improves the generalization ability and prediction accuracy of a single XGBoost model. The combined model proposed in this paper can help mining scientists use artificial intelligence to predict mining subsidence and provide a reference for the optimization of mining plans and the formulation of disaster-prevention measures.

2. Methodology

2.1. Model of Mining Subsidence

With the mining of the coal seam, the roof breaks and collapses, and the overlying strata deform, bend, and lose stability through a series of complex processes. The mining influence is transmitted from underground to the surface, eventually leading to significant surface subsidence [25]. The mining subsidence model is shown in Figure 1.
As shown in Figure 1, the movement and failure of the overlying strata caused by coal mining destroys the stress state and stable structure of the original strata and eventually evolves into severe land subsidence [26]. The main factors affecting mining subsidence are the mining parameters (mining height m, mining width l) and the geological conditions (mining depth H, topsoil thickness h, etc.).

2.2. Extreme Gradient Boosting (XGBoost)

XGBoost is a typical ensemble learning algorithm. Its basic idea is to combine several weak estimators into a strong estimator by continuously fitting the prediction residuals of the individual weak estimators [27]. The weak learner most commonly integrated by XGBoost is a tree model. Because both the traditional loss function and the model complexity are considered when establishing the objective function, and a regularization term is added to prevent overfitting, XGBoost achieves high computational performance and prediction accuracy [28]. XGBoost shows excellent performance and generalization ability on nonlinear problems and has been widely used in machine learning for classification and regression tasks. Its objective function is as follows [29]:
Obj^{(r)} = \sum_{i=1}^{m} L\left(y_i, \hat{y}_i^{(r)}\right) + \sum_{k=1}^{r} \Omega(g_k)
where i denotes the i-th sample of the data set; m is the total number of samples fed into the k-th tree; r is the number of trees built; y_i is the true value; ŷ_i^(r) is the predicted value at the r-th iteration; g_k denotes the structure of the k-th tree; L(y_i, ŷ_i^(r)) is the loss function, which describes the difference between the true and predicted values; and Ω(g_k) measures the complexity of the model. The model complexity is defined as follows:
\Omega(g_k) = \gamma T + \frac{1}{2} \lambda \sum_{j=1}^{T} w_j^2
where T is the number of leaf nodes of the tree, w_j is the weight of leaf j, γ is the parameter controlling the number of leaves, and λ is the regularization coefficient. On top of the traditional error term, XGBoost accounts for model complexity and thereby optimizes the generalization error, seeking a balance in the variance-bias trade-off so that the model generalizes with the smallest error and the fastest computation.
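As a small numerical illustration of the complexity term above (the values of γ, λ, and the leaf weights below are arbitrary and chosen only for demonstration), the penalty can be computed directly:

```python
import numpy as np

# Hypothetical tree with T = 3 leaves; gamma, lambda, and the leaf weights are assumed values.
gamma, lam = 1.0, 14.8
w = np.array([0.5, -1.2, 0.8])
T = w.size

# Omega(g) = gamma * T + 0.5 * lambda * sum_j w_j^2
omega = gamma * T + 0.5 * lam * np.sum(w ** 2)
print(f"Omega(g) = {omega:.3f}")  # 1.0*3 + 0.5*14.8*(0.25 + 1.44 + 0.64) = 20.242
```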
XGBoost has been recognized as an extremely high-performance estimator for classification and regression problems. However, because the algorithm is complex and has many hyperparameters, the prediction performance of the model depends directly on the choice of hyperparameters, which limits its use in engineering applications. The most influential hyperparameters in practice are the learning rate (eta), the number of weak learners (num_round), the maximum tree depth (max_depth), and the regularization parameter (lambda). Hyperparameter optimization is closely tied to the analyst's understanding of the physical meaning of the data and their modeling experience; in practice, tuning usually starts from the default parameters, so model performance depends heavily on the modeler's theoretical grounding and engineering background. Manual tuning is inefficient, rarely reaches a good configuration quickly, and struggles to meet the growing demands of engineering applications.
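For concreteness, the sketch below shows how these four hyperparameters appear when training an XGBoost regressor with the xgboost Python package; the toy data and the parameter values are placeholders rather than the settings used in this paper.

```python
import numpy as np
import xgboost as xgb

# Toy stand-in for the mining dataset: 4 features (H, m, s, h) and the label w.
rng = np.random.default_rng(0)
X, y = rng.random((78, 4)), rng.random(78)
dtrain = xgb.DMatrix(X, label=y)

# The four hyperparameters discussed above; the values here are just common defaults.
params = {
    "objective": "reg:squarederror",
    "eta": 0.3,       # learning rate
    "max_depth": 6,   # maximum depth of each tree
    "lambda": 1.0,    # L2 regularization coefficient
}
num_round = 100       # number of boosting rounds (weak learners)

booster = xgb.train(params, dtrain, num_boost_round=num_round)
```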

2.3. Genetic Algorithm (GA)

GA is a heuristic population-based search algorithm. Its basic idea is to simulate natural selection and survival of the fittest, using computer programs to mimic the selection, replication, crossover, and mutation operations of biological chromosomes. The search proceeds in a population-based manner, and the optimal solution is finally found in the solution space [30]. During GA optimization, the potential solutions are first treated as independent individuals and encoded to form an initial population, and an approximately optimal solution is then generated through generation-by-generation evolution. Selection is applied according to fitness values, and the crossover and mutation operators of genetics are used to produce a new generation. The whole process simulates natural evolution: the offspring population adapts better to the environment and gradually outperforms the previous generation, and the best individual of the final generation is decoded to obtain the optimal solution to the problem [31]. GA is highly versatile, simple to operate, and has been widely used in optimization problems in many fields [32]. In this work, the default parameter combination of the model is encoded as the initial solution, a fixed number of iterations is used as the stopping criterion, and different population sizes are tested to study the GA's ability to optimize the hyperparameters of the XGBoost model. The basic operation flow of the GA is shown in Figure 2.
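The following is a minimal, hand-rolled sketch of such a real-coded GA (lower fitness is better); the tournament selection, arithmetic crossover, and Gaussian mutation operators, as well as all default settings, are illustrative assumptions rather than the exact operators used in this study.

```python
import numpy as np

rng = np.random.default_rng(1)

def run_ga(fitness, bounds, pop_size=40, generations=50, cx_prob=0.8, mut_prob=0.1):
    """Minimal real-coded GA (lower fitness is better); all operator settings are assumptions."""
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        # Tournament selection: each parent slot keeps the better of two random individuals.
        idx = rng.integers(pop_size, size=(pop_size, 2))
        winners = np.where(scores[idx[:, 0]] < scores[idx[:, 1]], idx[:, 0], idx[:, 1])
        parents = pop[winners]
        # Arithmetic (blend) crossover between consecutive parents.
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            if rng.random() < cx_prob:
                a = rng.random()
                children[i] = a * parents[i] + (1 - a) * parents[i + 1]
                children[i + 1] = a * parents[i + 1] + (1 - a) * parents[i]
        # Gaussian mutation, clipped back into the search bounds.
        mask = rng.random(children.shape) < mut_prob
        children = np.clip(children + mask * rng.normal(0.0, 0.1 * (hi - lo), children.shape), lo, hi)
        pop = children
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmin()], scores.min()
```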

2.4. The Combined Model of GA-XGBoost

For the nonlinear regression problem of mining subsidence prediction, a combined prediction model, called the GA-XGBoost prediction model, is established on the basis of the XGBoost algorithm by introducing GA to optimize the hyperparameters of XGBoost. First, the dataset is divided into a training set and a test set: the training set is used to train the model and learn the nonlinear relationship between the features and the labels, and the test set is used to evaluate the prediction performance of the model. Then, the hyperparameter vector of XGBoost is randomly initialized as an individual and encoded, the prediction error of XGBoost is used as the objective function, and the genetic algorithm evolves the hyperparameter vector iteratively to search the solution space for the best hyperparameter combination. The research framework of the GA-XGBoost model is shown in Figure 3.
When running the GA-XGBoost combined model, the basic parameters of the GA, such as the population size, the number of iterations, and the mutation probability, must first be initialized. As shown in Figure 2, the key step in optimizing the XGBoost model with the GA is selecting the optimal XGBoost hyperparameter vector through the GA. During XGBoost training, the GA adaptively adjusts the training and, through its global search capability and implicit parallelism, speeds up the optimization of the XGBoost hyperparameters.
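A possible way to wire the two parts together is sketched below: each GA individual is decoded into the hyperparameter vector [num_round, max_depth, eta, λ], and the test-set RMSE of the resulting XGBoost model serves as the fitness. The placeholder data, the 80/20 split, and the search bounds are assumptions, and run_ga refers to the GA sketch above.

```python
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Placeholder data standing in for the 78-case dataset (features H, m, s, h; label w).
rng = np.random.default_rng(0)
X, y = rng.random((78, 4)), rng.random(78)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

def fitness(individual):
    """Decode a GA individual into XGBoost hyperparameters and return the test-set RMSE."""
    num_round, max_depth, eta, lam = individual
    params = {"objective": "reg:squarederror",
              "eta": float(eta),
              "max_depth": int(round(max_depth)),
              "lambda": float(lam)}
    booster = xgb.train(params, xgb.DMatrix(X_train, label=y_train),
                        num_boost_round=int(round(num_round)))
    pred = booster.predict(xgb.DMatrix(X_test))
    return mean_squared_error(y_test, pred) ** 0.5   # RMSE: lower is better

# Search bounds for [num_round, max_depth, eta, lambda] are illustrative guesses.
bounds = [(10, 300), (2, 10), (0.01, 1.0), (0.1, 20.0)]
best, best_rmse = run_ga(fitness, bounds, pop_size=40, generations=50)
```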

3. Materials

To verify the feasibility of the GA-XGBoost model for mining subsidence prediction, mining subsidence cases from studies by domestic (Chinese) scholars were collected for testing and analysis. In this paper, measured data from domestic mines are used as the research dataset: the features (mining depth H, mining height m, goaf area s, and thickness of the loose topsoil layer h) are used as input variables, and the label (maximum subsidence value w) is used as the output (dependent) variable, in order to study the relationship between mining subsidence and the mining and geological factors [16,33,34]. The GA is used to optimize the XGBoost hyperparameter vector [num_round, max_depth, eta, λ].

3.1. Data Set

A dataset containing 78 coal mining subsidence cases from previous studies was collected to evaluate the feasibility of the GA-XGBoost model in this work [33,35]. The maximum subsidence caused by mining is the main indicator analyzed in this paper and is used as the prediction output. The distribution characteristics of the dataset are shown in Figure 4.
As shown in Figure 4, the median value of H is about 150 m, with a lower quartile of 100 m and an upper quartile of 300 m. The median value of m is about 2 m, with a lower quartile of 1.9 m and an upper quartile of 4.3 m. The median value of s is about 5000 m2, with a lower quartile of 2000 m2 and an upper quartile of 130,000 m2. The median of h is 20 m, with a lower quartile of 5 m and an upper quartile of 90 m. The median of w is about 1.8 m, with a lower quartile of 0.8 m and an upper quartile of 2.2 m.
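A data-preparation sketch consistent with this setup might look as follows; the file name and the 80/20 split ratio are assumptions, since the paper does not specify them.

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# "mining_subsidence.csv" is a hypothetical file; column names follow the paper's notation.
df = pd.read_csv("mining_subsidence.csv")      # 78 collected subsidence cases
X = df[["H", "m", "s", "h"]].values            # mining depth, mining height, goaf area, topsoil thickness
y = df["w"].values                             # maximum subsidence value (prediction target)

# The train/test split ratio is not stated in the paper; 80/20 is assumed here.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
```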

3.2. Model Verification and Evaluation

In this work, a GA-XGBoost prediction model is built; the training set is used to train the model, and the test set is used to validate the trained model. The evaluation metrics R2, RMSE, and MAE are used to assess the reliability and accuracy of the predictive models developed in this article; they describe the relationship between the predicted and measured values of w. The evaluation indices are calculated as follows [36,37]:
R^2 = 1 - \frac{\sum_{i=1}^{N} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{N} (y_i - \bar{y})^2}

RMSE = \sqrt{\frac{1}{N} \sum_{i=1}^{N} (y_i - \hat{y}_i)^2}

MAE = \frac{1}{N} \sum_{i=1}^{N} \left| y_i - \hat{y}_i \right|
where y_i is the measured value, ŷ_i is the predicted value, ȳ is the mean of the measured values, and N is the number of test samples.
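These three indices can be computed with scikit-learn, for example as in the following sketch with placeholder values:

```python
import numpy as np
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

# Placeholder measured (y_true) and predicted (y_pred) maximum subsidence values, in metres.
y_true = np.array([1.2, 0.8, 2.1, 1.7, 0.9])
y_pred = np.array([1.1, 0.9, 2.4, 1.6, 1.0])

r2 = r2_score(y_true, y_pred)
rmse = mean_squared_error(y_true, y_pred) ** 0.5
mae = mean_absolute_error(y_true, y_pred)
print(f"R2 = {r2:.3f}, RMSE = {rmse:.3f}, MAE = {mae:.3f}")
```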

3.3. Results and Discussion

In this paper, the optimization of the XGBoost model by the GA is analyzed for different population sizes. As the GA converges, the error of the XGBoost model gradually decreases: each GA iteration searches for a hyperparameter combination that further reduces the XGBoost error. The fitness-value optimization curves of the model are shown in Figure 5.
For all population sizes, the fitness value of the model gradually decreases and the prediction performance gradually improves. To reasonably evaluate the GA's optimization performance for the XGBoost model under different population sizes, the TOPSIS method is further used to analyze and score the main evaluation indicators of the model [38,39]. When the population size is 40, the model's prediction performance score is 0.200, the best ranking. The detailed indicators and scores are shown in Table 1.
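For reference, a minimal TOPSIS scoring routine is sketched below; the equal criterion weights and the selected rows of Table 1 are assumptions, so the resulting closeness scores will not exactly reproduce the scores reported in Table 1.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Minimal TOPSIS: rows are alternatives, columns are criteria.
    benefit[j] is True when a larger value of criterion j is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)        # vector-normalize each criterion
    v = norm * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                        # closeness score, higher is better

# Three rows of Table 1 (population sizes 10, 30, 40) as an example; equal weights are assumed.
indicators = np.array([[0.462, 0.908, 0.394],
                       [0.386, 0.935, 0.287],
                       [0.369, 0.942, 0.308]])
scores = topsis(indicators, weights=np.ones(3) / 3, benefit=np.array([False, True, False]))
print(scores)   # closeness scores for population sizes 10, 30, and 40
```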
When the population size is 40, the model performs best, and the optimal hyperparameter vector of the XGBoost model is [num_round = 108, max_depth = 7, eta = 0.40148499, λ = 14.84633415]. To further examine the superiority of the GA-XGBoost combined model, the single XGBoost model as well as the RFR, AdaBoost, Bagging, and GradientBoost models were selected for comparison, as shown in Figure 6. With GA optimization, the prediction performance of the XGBoost model improves significantly: R2 increases from 0.819 to 0.941, RMSE decreases from 0.648 to 0.369, and MAE decreases from 0.38 to 0.308. It is therefore feasible to use the GA to optimize the XGBoost hyperparameters, find a better hyperparameter combination, and improve the prediction performance of the XGBoost model.
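Retraining XGBoost with this reported optimal hyperparameter vector could look like the following sketch, assuming the X_train, y_train, and X_test arrays from the data-preparation sketch in Section 3.1:

```python
import xgboost as xgb

# Retrain with the hyperparameters reported as optimal for population size 40.
best_params = {
    "objective": "reg:squarederror",
    "eta": 0.40148499,       # learning rate found by the GA
    "max_depth": 7,
    "lambda": 14.84633415,   # L2 regularization coefficient found by the GA
}
final_model = xgb.train(best_params, xgb.DMatrix(X_train, label=y_train), num_boost_round=108)
w_pred = final_model.predict(xgb.DMatrix(X_test))   # predicted maximum subsidence values
```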
To further compare the GA-XGBoost combined model with the other single ensemble models, the performance indicators of each model are comprehensively analyzed and evaluated [20]. The performance indicators of the GA-XGBoost model and the other single models, together with their rankings, are shown in Table 2.
To analyze the reliability of each model more comprehensively, a Taylor diagram [40] was used to evaluate and display the performance of all models, including GA-XGBoost, as shown in Figure 7. According to the Taylor diagram, GA-XGBoost lies closest to the reference point (REF), indicating that its prediction performance is better than that of the other single models.
To evaluate the generalization ability of the GA-XGBoost combined model, the test set was used for prediction and the results were compared with the measured values. Figure 8 shows the distribution of the predicted and measured values of the GA-XGBoost model. As the prediction curve in the figure shows, except for a few scattered points with large deviations, the predicted values agree well with the measured values. When the measured subsidence value (w) is relatively large, the predicted value deviates considerably, so the prediction performance of the GA-XGBoost model degrades noticeably for large subsidence values. The main reason is that the dataset studied in this paper is not large and contains few cases with large subsidence values, so during training it is difficult for the model to learn the regression relationship in that range.

4. Conclusions

In this study, the XGBoost model combined with the GA was applied to the prediction of mining subsidence. By combining the GA with XGBoost, a GA-XGBoost combined prediction model was constructed. In the modeling process, the dataset used consists of four inputs and one output, and RMSE, MAE, R2, and related indicators are used to evaluate the prediction performance of the model. In addition, the GA-XGBoost model proposed in this paper was compared with classic single ensemble models such as RFR, GradientBoost, XGBoost, AdaBoost, and Bagging. The proposed GA-XGBoost combined model outperforms the other single models; compared with the plain XGBoost model, its prediction performance is significantly improved (R2 = 0.941, RMSE = 0.369, MAE = 0.308). Therefore, it is feasible to apply the GA-XGBoost model introduced in this study to the prediction of mining subsidence.
(1)
The prediction accuracy of the GA-XGBoost model is higher than that of single ensemble models such as XGBoost, RFR, and GradientBoost, indicating that it is feasible to use the GA to optimize the hyperparameters of XGBoost and improve the prediction performance of the model, and that combining traditional machine learning models with intelligent algorithms is a feasible way to predict mining subsidence.
(2)
The essence of GA-XGBoost is to use the search ability of the GA to realize the self-adaptation and self-optimization of the XGBoost model, thereby improving its prediction performance. With the continuing enrichment and accumulation of mine datasets, the application scenarios of this model will become broader, and more influencing factors can be considered, such as key strata, old goaf areas, coal seam dip angle, dip change rate, and thickness change rate, giving the model application value in complex mining areas. It can supplement the prediction methods and theories in the field of mining subsidence and provide auxiliary support for the formulation and optimization of mining plans and control measures.

Author Contributions

Conceptualization, Z.G., N.Y. and H.Q.; Data curation, Z.G.; Funding acquisition, Z.G.; Investigation, C.W. and H.Q.; Methodology, Z.G. and M.C.; Writing—original draft, Z.G.; Writing—review & editing, M.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Jilin Province Educational Department Scientific Research Planning Project (JJKH20210669KJ), Jilin Province Education Science 14th Five-Year Plan Project (GH21352), Changchun Institute of Technology Science and Technology Foundation Project (320210001), and Changchun Institute of Technology Educational Reform Research Project (2021cit038).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lu, J.-X.; Tang, F.-Q.; Zhao, J.-Y.; Yan, Z.-C. Review of study on mining subsidence and ground surface damage in loess mining area. J. Xi’an Univ. Sci. Technol. 2019, 39, 8.
  2. Dinar, A.; Esteban, E.; Calvo, E.; Herrera, G.; Teatini, P.; Tomas, R.; Li, Y.; Ezquerro, P.; Albiac, J. We lose ground: Global assessment of land subsidence impact extent. Sci. Total Environ. 2021, 786, 147415.
  3. Li, Y.; An, S.; Zhou, D.; Zhan, S.; Gao, Y. Whole basin modeling and parameter inversion of mining subsidence based on UAV photogrammetry technology. Saf. Coal Mines 2022, 53, 179–186.
  4. Zheng, L.; Zhu, L.; Wang, W.; Guo, L.; Chen, B. Land Subsidence Related to Coal Mining in China Revealed by L-band InSAR Analysis. Int. J. Environ. Res. Public Health 2020, 17, 1170.
  5. Zhang, H.-J. Caving and subsidence characteristics and reclamation technology of shallow coal seam mining in the Yellow River Valley. Coal Eng. 2021, 53, 140–144.
  6. Yan, Y.; Yan, W.; Liu, J.; Guo, J. The Prediction Model of Super Large Subsidence in High Water Table Coal Mining Areas Covered with Thick Unconsolidated Layer. Geofluids 2021, 2021, 5520548.
  7. Zhu, X.-J. Study on Strata Movement Mechanism in Backfill-Strip Mining. Diploma Thesis, China University of Mining and Technology, Beijing, China, 2016.
  8. Deng, W.-N. Data Processing Method of Mining Subsidence Simulation Based on FISH Language. Coal Min. Technol. 2016, 21, 15–17.
  9. Sikora, P.; Wesolowski, M. Numerical assessment of the influence of former mining activities and plasticity of rock mass on deformations of terrain surface. Int. J. Min. Sci. Technol. 2021, 31, 209–214.
  10. Li, X.; Guo, W.; Zhao, G. Study on influence of compression characteristics of super-thick alluvium on mining subsidence. China Saf. Sci. J. 2018, 28, 135–141.
  11. Niu, Y.; Xu, L.; Zhang, K.; Ye, W.; Zhang, J.; Jiang, B. Study on the Predicted Parameters of Probability Integral Method Based on GA-BP Neural Network. Met. Mine 2019, 10, 93–100.
  12. Xing, X.; Zhu, Y.; Yuan, Z.; Xiao, L.; Liu, B. Predicting mining-induced dynamic deformations for drilling solution rock salt mine based on probability integral method and weibull temporal function. Int. J. Remote Sens. 2021, 42, 639–671.
  13. Xu, J.-Y. Research on Calculation Model of Mining Subsidence Prediction Parameters Based on Convolutional Neural Network. Diploma Thesis, Liaoning Technical University, Fuxin, China, 2021.
  14. Feng, T.; Wang, C.; Zhang, J. Prediction of Surface Settlement of Foundation Pit Based on ABC-BP Model. J. Hebei Univ. Eng. Nat. Sci. Ed. 2020, 37, 7–12.
  15. Xu, K. Deformation Prediction of Diaphragm Wall of Deep Foundation Pit based on BP Neural Network Improved by Genetic Algorithm. Fly Ash Compr. Util. 2021, 35, 6–11.
  16. Pan, H.; Zhao, Y.; Zhang, W.; Bai, Y.; Han, Y. Prediction of surface subsidence with improved BP neural network based on Adaboost. Coal Sci. Technol. 2019, 47, 161–167.
  17. Sui, L.C.; Ma, F.; Chen, N. Mining Subsidence Prediction by Combining Support Vector Machine Regression and Interferometric Synthetic Aperture Radar Data. ISPRS Int. J. Geo-Inf. 2020, 9, 390.
  18. Abidin, H.Z.; Djaja, R.; Darmawan, D.; Hadi, S.; Akbar, A.; Rajiyowiryono, H.; Sudibyo, Y.; Meilano, I.; Kasuma, M.A.; Kahar, J.; et al. Land subsidence of Jakarta (Indonesia) and its geodetic monitoring system. Nat. Hazards 2001, 23, 365–387.
  19. Wang, X.-Y. The Prediction of Mining Subsidence in Mountain Area Based on BP Neural. Diploma Thesis, Taiyuan University of Technology, Taiyuan, China, 2010.
  20. Zhou, J.; Shi, X.; Du, K.; Qiu, X.; Li, X.; Mitri, H.S. Feasibility of Random-Forest Approach for Prediction of Ground Settlements Induced by the Construction of a Shield-Driven Tunnel. Int. J. Geomech. 2017, 17, 04016129.
  21. Rahmati, O.; Golkarian, A.; Biggs, T.; Keesstra, S.; Mohammadi, F.; Daliakopoulos, I.N. Land subsidence hazard modeling: Machine learning to identify predictors and the role of human activities. J. Environ. Manag. 2019, 236, 466–480.
  22. Shi, L.; Gong, H.; Chen, B.; Zhou, C. Land Subsidence Prediction Induced by Multiple Factors Using Machine Learning Method. Remote Sens. 2020, 12, 4044.
  23. Mao, W. Mining Subsidence Prediction Method Based on Genetic BP Neural Network Model. Met. Mine 2016, 45, 164.
  24. Wei, T.; Wang, L.; Li, N.; Chi, S.S.; Jiang, C. Inverse Method of the Parameters of Probability Integral Method Based on Quantum Genetic Algorithm. Met. Mine 2018, 8, 118–122.
  25. Qian, M.; Shi, P.; Xu, J. Mine Pressure and Strata Control; China University of Mining and Technology Press: Beijing, China, 2010.
  26. Deng, K.; Tan, Z.; Jiang, Y.; Dai, H.Y.; Shi, Y. Deformation Monitoring and Subsidence Engineering; China University of Mining and Technology Press: Beijing, China, 2014.
  27. Sun, J. Design of Enterprise Credit Rating Prediction Scheme Based on Improved XGBoost Algorithm. Diploma Thesis, Shanghai Normal University, Shanghai, China, 2021.
  28. Lei, D. Prediction and Analysis of Passenger Flow Sent by a Railway Bureau Based on XGBOOST Algorithm. Diploma Thesis, Southwest Jiaotong University, Chengdu, China, 2020.
  29. Wu, M.-T.; Chen, Q.-S.; Qi, C.-C. Slope safety, stability evaluation, and protective measures based on machine learning. Chin. J. Eng. 2022, 44, 180–188.
  30. Liu, W.-X. Research on Optimization of A Company’s Workshop Layout Based on Genetic Algorithm. Diploma Thesis, China University of Mining and Technology, Beijing, China, 2021.
  31. Gu, X.-L. Application Research of Flexible Job-shop Scheduling Problem Based on Improved Genetic Algorithm. Diploma Thesis, Dalian Jiaotong University, Dalian, China, 2020.
  32. Luo, C.-H. Research on Voltage Optimization Method of Distribution Network Based on Parallel Improved Genetic Algorithm. Diploma Thesis, Harbin Institute of Technology, Harbin, China, 2021.
  33. Sheng, Y.-H. Study on Surface Movement of Coal Seam Group in Loess Mountain Area. Diploma Thesis, Xi’an University of Science and Technology, Xi’an, China, 2020.
  34. Zhao, Y. Research on Prediction of Surface Subsidence Based on the Adaboost Improved BP Neural Network. Diploma Thesis, Xi’an University of Science and Technology, Xi’an, China, 2019.
  35. Xing, L.; Yuan, X.-T.; Zhang, P. Mining subsidence prediction based on Adaboost-PSO-BP model. Coal Eng. 2020, 52, 141–144.
  36. Yin, Y. Study on Hyper-Spectral Models for Predicting Black Soil Organic Matter Content. Diploma Thesis, Jilin University, Changchun, China, 2015.
  37. Zhou, J.; Qiu, Y.-G.; Khandelwal, M.; Zhu, S.; Zhang, X. Developing a hybrid model of Jaya algorithm-based extreme gradient boosting machine to estimate blast-induced ground vibrations. Int. J. Rock Mech. Min. Sci. 2021, 145, 104856.
  38. Wang, X.; Qin, J.; Zhang, Q.; Chen, W.J.; Chen, X.L. Mining method optimization of Gu Mountain stay ore based on AHP-TOPSIS evaluation model. J. Cent. South Univ. Sci. Technol. 2013, 44, 1131–1137.
  39. Ganesh, S.; Ramakrishnan, S.K.; Palani, V.; Sundaram, M.; Sankaranarayanan, N.; Ganesan, S.P. Investigation on the mechanical properties of ramie/kenaf fibers under various parameters using GRA and TOPSIS methods. Polym. Compos. 2022, 43, 130–143.
  40. Jiang, W.; Chen, H. Assessment and projection of changes in temperature extremes over the mid-high latitudes of Asia based on CMIP6 models. Trans. Atmos. Sci. 2021, 44, 12.
Figure 1. Basic model of mining subsidence.
Figure 2. Basic flow of the genetic algorithm.
Figure 3. The research framework of GA-XGBoost.
Figure 4. The dataset used in the GA-XGBoost.
Figure 5. Optimization process of GA-XGBoost models with different population sizes.
Figure 6. Prediction performance comparison between the GA-XGBoost model and other single models.
Figure 7. Taylor plot of models’ performance.
Figure 8. Comparison between the predicted results of GA-XGBoost and the measured values.
Table 1. Scoring of GA-XGBoost models with different population sizes.

Swarm Size    RMSE     R2       MAE      Score    Rank
10            0.462    0.908    0.394    0.017    7
20            0.407    0.928    0.312    0.150    4
30            0.386    0.935    0.287    0.193    2
40            0.369    0.942    0.308    0.200    1
50            0.413    0.926    0.319    0.138    5
60            0.405    0.929    0.347    0.129    6
70            0.471    0.904    0.392    0.002    8
80            0.398    0.931    0.299    0.170    3
Table 2. Comprehensive evaluation of the GA-XGBoost model and other single models.

Model           RMSE     R2       MAE      Score    Rank
XGBoost         0.648    0.819    0.380    0.171    5
GA-XGBoost      0.369    0.941    0.308    0.371    1
RFR             0.593    0.849    0.412    0.189    3
GradientBoost   0.650    0.818    0.424    0.147    2
AdaBoost        0.765    0.749    0.553    0.000    6
Bagging         0.666    0.809    0.451    0.122    4