Artificial Neural Networks and Ensemble Learning for Enhanced Liquefaction Prediction in Smart Cities
Highlights
- The bagging prediction model demonstrated approximately 20% higher accuracy compared to the single ANN model.
- Accurate prediction of bearing layer depth is critical for improving urban resilience and infrastructure planning in smart cities.
- The improved accuracy of the bagging model supports more reliable geotechnical investigations, which can lead to safer urban development in earthquake-prone areas.
- Improved prediction models for bearing layer depth can reduce the need for extensive in situ testing, lowering costs and increasing the efficiency of construction projects.
Abstract
1. Introduction
2. Data and Methods
3. Building Artificial Neural Networks (ANNs)
3.1. Preparing the Dataset
3.2. Creating the Model
4. Building Bagging
5. Results and Discussion
5.1. Results on Predicting Bearing Layer Depth
5.2. Comparison of Prediction Results between ANNs and Bagging
- (1) Limited Generalization Ability: A single model is easily affected by data noise, outliers, and overfitting, resulting in poor performance on new data.
- (2) Low Stability: A single model is highly sensitive to data distribution and feature selection, meaning slight changes in the data or features can significantly alter the prediction results.
- (3) Limited Expressive Power: A single model can often only capture certain aspects of the data, making it difficult to represent the complexity and diversity inherent in the data.
- (1) Geological data analysis and forecasting: Smart cities rely on rich data for decision making. Bagging can improve the analysis of geological and other related data by reducing model variance and improving prediction performance (a minimal code sketch of this idea follows this list).
- (2) Hazard detection: Timely detection of anomalies is critical for smart cities. The predictive model of bearing layer depth created by bagging can serve as the basis for disaster maps, which in turn help detect and respond to abnormal situations more effectively.
- (3) Resource optimization: Based on the model developed in this study, bagging can help optimize resource allocation, such as establishing a trusted bearing layer depth, predicting unknown points before construction, and omitting geological survey steps when a trusted value is exceeded, thereby reducing costs.
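The article's implementation details are not reproduced on this page; the following is a minimal sketch, assuming a scikit-learn workflow with synthetic illustrative data, of how bagging an ensemble of small neural-network regressors can reduce variance relative to a single network.

```python
# Minimal sketch (not the authors' implementation): bagging an ensemble of small
# neural-network regressors and comparing it with a single network.
# Data are synthetic and purely illustrative.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Hypothetical stand-ins for latitude, longitude, and elevation; noisy target depth.
X = rng.normal(size=(500, 3))
y = 10 + 2 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=2.0, size=500)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Single ANN baseline.
single_ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
single_ann.fit(X_train, y_train)

# Bagging: each member is fit on a bootstrap resample of the training set and the
# ensemble prediction is the average over members, which reduces variance.
# (scikit-learn >= 1.2 uses `estimator=`; older versions use `base_estimator=`.)
bagged_ann = BaggingRegressor(
    estimator=MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
    n_estimators=20,
    bootstrap=True,
    random_state=0,
)
bagged_ann.fit(X_train, y_train)

print("Single ANN MAE:", mean_absolute_error(y_test, single_ann.predict(X_test)))
print("Bagged ANN MAE:", mean_absolute_error(y_test, bagged_ann.predict(X_test)))
```

Averaging over estimators trained on bootstrap resamples is what smooths out the noise- and outlier-driven fluctuations described in points (1) and (2) of the list above.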
6. Conclusions
- (1) By using “latitude”, “longitude”, and “altitude” as input features and “bearing layer depth” as the prediction target, high-precision prediction of bearing layer depth was achieved. This accuracy is critical for smart cities, as understanding the geotechnical properties of the ground can significantly impact infrastructure development, from building construction to transportation network design.
- (2) Compared with single models such as ANNs, ensemble learning using bagging demonstrated superior prediction performance, with an increase in accuracy of approximately 20%. Bagging enables better data analysis, promoting more effective urban planning.
- (3) When employing the ensemble learning method bagging to predict geotechnical engineering survey results, it was found that even small changes in the depth values of the training data could significantly affect model performance. This finding underscores the importance of ensuring data accuracy (a hypothetical sensitivity check illustrating this point is sketched below).
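Conclusion (3) can be probed with a simple perturbation experiment. The sketch below is hypothetical and not taken from the paper: it adds small random noise to synthetic training depth values, retrains a bagging-style ensemble, and compares the resulting test RMSE with the unperturbed baseline.

```python
# Hypothetical sensitivity check (not from the paper): perturb the training depth
# values slightly, retrain, and compare test RMSE against an unperturbed baseline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

def test_rmse(X_train, y_train, X_test, y_test, seed=0):
    """Fit a bagging-style tree ensemble and return its RMSE on the test set."""
    model = RandomForestRegressor(n_estimators=91, random_state=seed)
    model.fit(X_train, y_train)
    return mean_squared_error(y_test, model.predict(X_test)) ** 0.5

rng = np.random.default_rng(0)
X = rng.uniform(size=(400, 3))  # illustrative stand-ins for latitude/longitude/elevation
y = 10 + 8 * X[:, 0] - 5 * X[:, 1] + rng.normal(scale=1.0, size=400)
X_tr, X_te, y_tr, y_te = X[:320], X[320:], y[:320], y[320:]

baseline = test_rmse(X_tr, y_tr, X_te, y_te)
# Add roughly +/- 0.5 m of noise to the training depths and retrain.
y_tr_noisy = y_tr + rng.normal(scale=0.5, size=y_tr.shape)
perturbed = test_rmse(X_tr, y_tr_noisy, X_te, y_te)
print(f"Test RMSE: baseline {baseline:.2f} m, perturbed training depths {perturbed:.2f} m")
```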
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Causes of Liquefaction | Details |
|---|---|
| Loose ground | Sandy soils with an N-value (an index of soil hardness) of 20 or less and particle sizes between 0.03 mm and 0.5 mm. |
| High groundwater level | Groundwater level within 10 m of the ground surface. |
| Earthquake | Seismic intensity of 5 or higher; the longer the shaking lasts, the greater the damage. Liquefaction is commonly observed along coastlines, near river mouths, on reclaimed land, and in river alluvial fans. |
| Latitude | Longitude | Bearing Layer Depth (m) | Elevation (m) |
|---|---|---|---|
| 35.6290 | 139.674 | 13.38 | 38.8 |
| 35.6114 | 139.632 | 11.00 | 11.0 |
| 35.6582 | 139.649 | 12.80 | 37.3 |
| 35.6679 | 139.669 | 13.53 | 36.5 |
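As a minimal sketch, the sample boring records above can be arranged into a feature table and a prediction target; the column names below are illustrative and not taken from the original data source.

```python
# Minimal sketch: the sample boring records above as a feature table and target.
# Column names are illustrative, not taken from the original data source.
import pandas as pd

records = pd.DataFrame(
    {
        "latitude": [35.6290, 35.6114, 35.6582, 35.6679],
        "longitude": [139.674, 139.632, 139.649, 139.669],
        "bearing_layer_depth_m": [13.38, 11.00, 12.80, 13.53],
        "elevation_m": [38.8, 11.0, 37.3, 36.5],
    }
)

X = records[["latitude", "longitude", "elevation_m"]]  # input features
y = records["bearing_layer_depth_m"]                   # prediction target
print(X)
print(y.describe())
```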
| Area (km²) | Data Density (points/km²) | Standard Deviation of the Data |
|---|---|---|
| 58.1 | 7.46 | 9.53 |
| Hyperparameter | Value |
|---|---|
| n_estimators | 91 |
| max_features | auto |
| max_depth | None |
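These hyperparameter names match scikit-learn's tree-bagging interface; whether the authors used RandomForestRegressor or another bagging implementation is an assumption here. Note that max_features="auto" has been deprecated and removed in recent scikit-learn releases; for regression it meant using all features, which max_features=1.0 reproduces. A minimal sketch:

```python
# Minimal sketch (assumed scikit-learn interface; not confirmed as the authors' setup).
from sklearn.ensemble import RandomForestRegressor

bagging_model = RandomForestRegressor(
    n_estimators=91,   # number of bootstrap-aggregated trees, per the table above
    max_features=1.0,  # regression equivalent of the deprecated "auto" setting
    max_depth=None,    # grow each tree until leaves are pure or contain few samples
    random_state=0,    # illustrative; not specified in the table
)
# bagging_model.fit(X, y) would train on features X and bearing layer depths y.
```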
| Predicted Location | Error of Case 1 (m) | Error of Case 2 (m) |
|---|---|---|
| 1 | 1.40 | 0.75 |
| 2 | 0.80 | 0.53 |
| 3 | 5.30 | 3.41 |
| 4 | 0.70 | 1.95 |
| 5 | 0.89 | 0.09 |
| 6 | 1.40 | 0.22 |
| 7 | 0.56 | 0.02 |
| 8 | 0.70 | 0.10 |
| 9 | 0.26 | 0.26 |
| 10 | 0.78 | 1.26 |
| Average error | 1.27 | 0.86 |
| CI | 10.16 0.77 | 10.56 1.05 |
| Model | MAE | MSE | RMSE |
|---|---|---|---|
| ANNs | 1.07 | 2.89 | 1.70 |
| Bagging | 0.86 | 1.79 | 1.34 |
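The comparison metrics in the table above (MAE, MSE, and RMSE, where RMSE is the square root of MSE) can be computed from any pair of predicted and observed depths; the sketch below assumes scikit-learn and uses dummy values that are illustrative only, not the study's data.

```python
# Minimal sketch of the comparison metrics (MAE, MSE, RMSE), assuming scikit-learn.
# The arrays below are dummy values for illustration, not the study's data.
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error

def report(y_true, y_pred, label):
    mae = mean_absolute_error(y_true, y_pred)
    mse = mean_squared_error(y_true, y_pred)
    rmse = np.sqrt(mse)  # RMSE is the square root of MSE
    print(f"{label}: MAE={mae:.2f}, MSE={mse:.2f}, RMSE={rmse:.2f}")

y_true = np.array([13.4, 11.0, 12.8, 13.5])
y_ann = np.array([14.5, 10.1, 14.0, 12.4])
y_bag = np.array([13.9, 10.6, 13.4, 12.9])
report(y_true, y_ann, "ANNs")
report(y_true, y_bag, "Bagging")
```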