Unlocking the Potential of Wastewater Treatment: Machine Learning Based Energy Consumption Prediction
Abstract
1. Introduction
- First, we considered all input variables, including hydraulic data, wastewater characteristics, weather, and time data, to predict energy consumption (EC) in a WWTP. This study compared twenty-three machine learning models, including support vector regression (SVR) with different kernels, Gaussian process regression (GPR) with different kernels, boosted trees, bagged trees, decision trees, neural networks (NNs), random forest (RF), k-nearest neighbors (KNN), extreme gradient boosting (XGBoost), and LightGBM. Bayesian optimization was applied to calibrate and fine-tune the investigated machine learning models for efficient EC prediction. In addition, a 5-fold cross-validation technique was used to construct these models from the training data, and five performance evaluation metrics were employed to assess the goodness of the predictions. The results revealed that, when all input variables were used to predict EC, the machine learning models did not provide satisfactory predictions.
- Second, the aim is to construct reduced models by keeping only the pertinent input variables for predicting EC in the WWTP. To this end, the RF and XGBoost algorithms were applied to identify the important variables that considerably influence the prediction capability of the considered models. The results showed that the reduced models achieved slightly improved EC predictions compared with the full models.
- It is worth mentioning that the above methods do not take into account the time-dependent nature of energy consumption. Our final contribution addresses this limitation by constructing dynamic models that incorporate lagged measurements as inputs, enhancing the ML models’ ability to perform effectively. The results demonstrated that using lagged data improves the prediction quality of the ML models and highlighted the superior performance of the dynamic GPR and KNN models.
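The static-versus-dynamic comparison above can be sketched as follows. This is a minimal illustration on synthetic data, not the study's actual dataset, models, or tuned hyperparameters: the input variables (`flow`, `temp`), the autoregressive EC process, and the choice of two lags are all hypothetical, and a plain KNN regressor stands in for the full model suite.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
n = 500
# Hypothetical inputs: influent flow and air temperature; the simulated EC
# target carries temporal autocorrelation, as in the dynamic-model setting.
flow = rng.normal(100, 10, n)
temp = rng.normal(20, 5, n)
ec = np.empty(n)
ec[0] = 450.0  # start near the steady state of the simulated process
for t in range(1, n):
    ec[t] = 0.8 * ec[t - 1] + 0.5 * flow[t] + 2.0 * temp[t] + rng.normal(0, 5)

def make_lagged(X, y, lags=2):
    """Append y(t-1), ..., y(t-lags) as extra inputs; drops the first `lags` rows."""
    cols = [X[lags:]]
    for k in range(1, lags + 1):
        cols.append(y[lags - k:-k].reshape(-1, 1))
    return np.hstack(cols), y[lags:]

X_static = np.column_stack([flow, temp])
X_dyn, y_dyn = make_lagged(X_static, ec, lags=2)

# 5-fold cross-validated RMSE without and with lagged EC inputs.
knn = KNeighborsRegressor(n_neighbors=5)
rmse_static = -cross_val_score(knn, X_static, ec,
                               scoring="neg_root_mean_squared_error", cv=5).mean()
rmse_dyn = -cross_val_score(knn, X_dyn, y_dyn,
                            scoring="neg_root_mean_squared_error", cv=5).mean()
print(f"static RMSE: {rmse_static:.1f}, dynamic RMSE: {rmse_dyn:.1f}")
```

On autocorrelated data such as this simulated series, the lagged-input model typically attains a markedly lower cross-validated RMSE, which mirrors the improvement the dynamic models show in the paper.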
2. Related Works
3. Materials and Methods
3.1. Data Description and Analysis
3.2. Methodology
3.2.1. SVR Model
3.2.2. GPR Model
3.2.3. K-Nearest Neighbor
3.2.4. ANN Models
3.2.5. Decision Tree Regression
3.2.6. Ensemble Methods
3.2.7. Models Calibration via Bayesian Optimization
4. Machine Learning-Based EC Prediction Framework
- RMSE measures the root mean squared difference between predicted and true values.
- MAE measures the average absolute difference between predicted and true values.
- MAPE measures the average percentage difference between predicted and true values.
- J2 is the ratio of the squared RMSE on the testing data to the squared RMSE on the training data [28].
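The four metrics above can be written compactly as follows; this is a minimal sketch, and the `y_true`/`y_pred` arrays are illustrative values, not results from the study.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error."""
    return float(np.mean(np.abs(y_true - y_pred)))

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

def j2(rmse_test, rmse_train):
    """Ratio of squared test RMSE to squared train RMSE; values near 1
    suggest the model generalizes without over- or under-fitting."""
    return float(rmse_test ** 2 / rmse_train ** 2)

y_true = np.array([100.0, 120.0, 110.0])
y_pred = np.array([95.0, 125.0, 108.0])
print(rmse(y_true, y_pred), mae(y_true, y_pred), mape(y_true, y_pred))
```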
5. Results and Discussion
5.1. EC Prediction Using Full Models
5.2. EC Prediction Using Reduced Models
5.3. EC Prediction Using Dynamic Models
6. Conclusions
- Despite optimizing the prediction model through variable selection methods, it is important to acknowledge that our model’s predictive capability could be influenced by other variables that were not included due to data limitations. Future research should focus on exploring and incorporating a broader range of variables to enhance the accuracy and comprehensiveness of energy consumption prediction models in WWTPs. This could involve considering additional variables related to process conditions, influent characteristics, operational parameters, and external factors such as climate and regulatory changes. By incorporating these variables, we can improve the predictive power of the models and gain a more comprehensive understanding of the factors impacting energy consumption in WWTPs.
- In future work, we will emphasize the need for additional studies that focus on validating the feasibility and utility of these models in real-world scenarios. This will involve considering factors such as computational requirements and operational constraints commonly encountered in real WWTP settings.
- Deep learning models, known for their ability to handle time-series data, present an intriguing avenue for further exploration in forecasting energy consumption in WWTPs. These models, such as recurrent neural networks (RNNs) and long short-term memory (LSTM) networks, have demonstrated promising capabilities in capturing temporal dependencies and patterns within time-series data [61,62]. By leveraging their strengths, deep learning models could improve the accuracy and precision of energy consumption forecasts in WWTPs.
- Another possibility for improvement is integrating wavelet-based multiscale data representation with machine learning models. This approach would take into account the temporal and frequency characteristics of the data and could potentially improve the accuracy of the prediction models. Wavelet-based multiscale representation can also be used to extract relevant features and patterns from the data, which could be used to improve the performance of the machine learning models. This approach could potentially provide more accurate predictions and lead to further optimization of energy consumption in WWTPs.
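The wavelet idea in the last point can be illustrated with a one-level Haar transform, implemented directly in NumPy rather than with a wavelet library. This is a hedged sketch, not the authors' proposed method: the short `ec` series is made up, and a practical multiscale scheme would use more decomposition levels and richer wavelets. The transform splits a series into a smooth approximation and a detail signal, either of which could serve as input features for the ML models.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail); len(x) must be even."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)  # pairwise smooth component
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)  # pairwise fluctuation component
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt, reconstructing the original series exactly."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

# Illustrative daily EC values (hypothetical, in MWh).
ec = np.array([310.0, 305.0, 340.0, 338.0, 295.0, 300.0, 330.0, 332.0])
a, d = haar_dwt(ec)
assert np.allclose(haar_idwt(a, d), ec)  # transform is perfectly invertible
```

Because the Haar transform is orthonormal, it preserves the signal's energy while separating slow trends (in `a`) from rapid fluctuations (in `d`), which is the property that makes multiscale representations attractive as engineered features.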
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Gu, Y.; Li, Y.; Li, X.; Luo, P.; Wang, H.; Wang, X.; Wu, J.; Li, F. Energy self-sufficient wastewater treatment plants: Feasibilities and challenges. Energy Procedia 2017, 105, 3741–3751.
- Molinos-Senante, M.; Maziotis, A. Evaluation of energy efficiency of wastewater treatment plants: The influence of the technology and aging factors. Appl. Energy 2022, 310, 118535.
- Daw, J.; Hallett, K.; DeWolfe, J.; Venner, I. Energy Efficiency Strategies for Municipal Wastewater Treatment Facilities; Technical Report; National Renewable Energy Lab. (NREL): Golden, CO, USA, 2012.
- Goldstein, R.; Smith, W. Water & Sustainability: US Electricity Consumption for Water Supply & Treatment-the Next Half Century; Electric Power Research Institute: Palo Alto, CA, USA, 2002; Volume 4.
- Zhang, Z.; Kusiak, A.; Zeng, Y.; Wei, X. Modeling and optimization of a wastewater pumping system with data-mining methods. Appl. Energy 2016, 164, 303–311.
- Plappally, A.; Lienhard V, J.H. Energy requirements for water production, treatment, end use, reclamation, and disposal. Renew. Sustain. Energy Rev. 2012, 16, 4818–4848.
- Robescu, L.D.; Boncescu, C.; Bondrea, D.A.; Presura-Chirilescu, E. Impact of wastewater treatment plant technology on power consumption and carbon footprint. In Proceedings of the 2019 International Conference on ENERGY and ENVIRONMENT (CIEM), Timisoara, Romania, 17–18 October 2019; pp. 524–528.
- Chen, Y.; Song, L.; Liu, Y.; Yang, L.; Li, D. A review of the artificial neural network models for water quality prediction. Appl. Sci. 2020, 10, 5776.
- Harrou, F.; Cheng, T.; Sun, Y.; Leiknes, T.; Ghaffour, N. A data-driven soft sensor to forecast energy consumption in wastewater treatment plants: A case study. IEEE Sens. J. 2020, 21, 4908–4917.
- Cheng, T.; Harrou, F.; Kadri, F.; Sun, Y.; Leiknes, T. Forecasting of wastewater treatment plant key features using deep learning-based models: A case study. IEEE Access 2020, 8, 184475–184485.
- El-Rawy, M.; Abd-Ellah, M.K.; Fathi, H.; Ahmed, A.K.A. Forecasting effluent and performance of wastewater treatment plant using different machine learning techniques. J. Water Process Eng. 2021, 44, 102380.
- Hilal, A.M.; Althobaiti, M.M.; Eisa, T.A.E.; Alabdan, R.; Hamza, M.A.; Motwakel, A.; Al Duhayyim, M.; Negm, N. An Intelligent Carbon-Based Prediction of Wastewater Treatment Plants Using Machine Learning Algorithms. Adsorpt. Sci. Technol. 2022, 2022, 8448489.
- Safeer, S.; Pandey, R.P.; Rehman, B.; Safdar, T.; Ahmad, I.; Hasan, S.W.; Ullah, A. A review of artificial intelligence in water purification and wastewater treatment: Recent advancements. J. Water Process Eng. 2022, 49, 102974.
- Ly, Q.V.; Truong, V.H.; Ji, B.; Nguyen, X.C.; Cho, K.H.; Ngo, H.H.; Zhang, Z. Exploring potential machine learning application based on big data for prediction of wastewater quality from different full-scale wastewater treatment plants. Sci. Total Environ. 2022, 832, 154930.
- Cheng, T.; Harrou, F.; Sun, Y.; Leiknes, T. Monitoring influent measurements at water resource recovery facility using data-driven soft sensor approach. IEEE Sens. J. 2018, 19, 342–352.
- Haimi, H.; Mulas, M.; Corona, F.; Vahala, R. Data-derived soft-sensors for biological wastewater treatment plants: An overview. Environ. Model. Softw. 2013, 47, 88–107.
- Andreides, M.; Dolejš, P.; Bartáček, J. The prediction of WWTP influent characteristics: Good practices and challenges. J. Water Process Eng. 2022, 49, 103009.
- Guo, H.; Jeong, K.; Lim, J.; Jo, J.; Kim, Y.M.; Park, J.p.; Kim, J.H.; Cho, K.H. Prediction of effluent concentration in a wastewater treatment plant using machine learning models. J. Environ. Sci. 2015, 32, 90–101.
- Nadiri, A.A.; Shokri, S.; Tsai, F.T.C.; Moghaddam, A.A. Prediction of effluent quality parameters of a wastewater treatment plant using a supervised committee fuzzy logic model. J. Clean. Prod. 2018, 180, 539–549.
- Hernández-del Olmo, F.; Gaudioso, E.; Duro, N.; Dormido, R. Machine learning weather soft-sensor for advanced control of wastewater treatment plants. Sensors 2019, 19, 3139.
- Alali, Y.; Harrou, F.; Sun, Y. Predicting Energy Consumption in Wastewater Treatment Plants through Light Gradient Boosting Machine: A Comparative Study. In Proceedings of the 2022 10th International Conference on Systems and Control (ICSC), Marseille, France, 23–25 November 2022; pp. 137–142.
- Zhang, S.; Wang, H.; Keller, A.A. Novel Machine Learning-Based Energy Consumption Model of Wastewater Treatment Plants. ACS ES&T Water 2021, 1, 2531–2540.
- Boncescu, C.; Robescu, L.; Bondrea, D.; Măcinic, M. Study of energy consumption in a wastewater treatment plant using logistic regression. In IOP Conference Series: Earth and Environmental Science, Proceedings of the 4th International Conference on Biosciences (ICoBio 2021), Bogor, Indonesia, 11–12 August 2021; IOP Publishing: Bristol, UK, 2021; Volume 664, p. 012054.
- Ramli, N.A.; Abdul Hamid, M.F. Data Based Modeling of a Wastewater Treatment Plant by using Machine Learning Methods. J. Eng. Technol. 2019, 6, 14–21.
- Torregrossa, D.; Schutz, G.; Cornelissen, A.; Hernández-Sancho, F.; Hansen, J. Energy saving in WWTP: Daily benchmarking under uncertainty and data availability limitations. Environ. Res. 2016, 148, 330–337.
- Torregrossa, D.; Leopold, U.; Hernández-Sancho, F.; Hansen, J. Machine learning for energy cost modelling in wastewater treatment plants. J. Environ. Manag. 2018, 223, 1061–1067.
- Qiao, J.; Zhou, H. Modeling of energy consumption and effluent quality using density peaks-based adaptive fuzzy neural network. IEEE/CAA J. Autom. Sin. 2018, 5, 968–976.
- Bagherzadeh, F.; Nouri, A.S.; Mehrani, M.J.; Thennadil, S. Prediction of energy consumption and evaluation of affecting factors in a full-scale WWTP using a machine learning approach. Process Saf. Environ. Prot. 2021, 154, 458–466.
- Zhang, Z.; Zeng, Y.; Kusiak, A. Minimizing pump energy in a wastewater processing plant. Energy 2012, 47, 505–514.
- Oulebsir, R.; Lefkir, A.; Safri, A.; Bermad, A. Optimization of the energy consumption in activated sludge process using deep learning selective modeling. Biomass Bioenergy 2020, 132, 105420.
- Das, A.; Kumawat, P.K.; Chaturvedi, N.D. A Study to Target Energy Consumption in Wastewater Treatment Plant using Machine Learning Algorithms. In Computer Aided Chemical Engineering; Elsevier: Amsterdam, The Netherlands, 2021; Volume 50, pp. 1511–1516.
- Oliveira, P.; Fernandes, B.; Analide, C.; Novais, P. Forecasting energy consumption of wastewater treatment plants with a transfer learning approach for sustainable cities. Electronics 2021, 10, 1149.
- Yusuf, J.; Faruque, R.B.; Hasan, A.J.; Ula, S. Statistical and Deep Learning Methods for Electric Load Forecasting in Multiple Water Utility Sites. In Proceedings of the 2019 IEEE Green Energy and Smart Systems Conference (IGESSC), Long Beach, CA, USA, 4–5 November 2019; pp. 1–5.
- Filipe, J.; Bessa, R.J.; Reis, M.; Alves, R.; Póvoa, P. Data-driven predictive energy optimization in a wastewater pumping station. Appl. Energy 2019, 252, 113423.
- Parzen, E. On estimation of a probability density function and mode. Ann. Math. Stat. 1962, 33, 1065–1076.
- Yu, P.S.; Chen, S.T.; Chang, I.F. Support vector regression for real-time flood stage forecasting. J. Hydrol. 2006, 328, 704–716.
- Hong, W.C.; Dong, Y.; Chen, L.Y.; Wei, S.Y. SVR with hybrid chaotic genetic algorithms for tourism demand forecasting. Appl. Soft Comput. 2011, 11, 1881–1890.
- Smola, A.J.; Schölkopf, B. A tutorial on support vector regression. Stat. Comput. 2004, 14, 199–222.
- Lee, J.; Wang, W.; Harrou, F.; Sun, Y. Reliable solar irradiance prediction using ensemble learning-based models: A comparative study. Energy Convers. Manag. 2020, 208, 112582.
- Lee, J.; Wang, W.; Harrou, F.; Sun, Y. Wind power prediction using ensemble learning-based models. IEEE Access 2020, 8, 61517–61527.
- Harrou, F.; Saidi, A.; Sun, Y.; Khadraoui, S. Monitoring of photovoltaic systems using improved kernel-based learning schemes. IEEE J. Photovoltaics 2021, 11, 806–818.
- Williams, C.K.; Rasmussen, C.E. Gaussian Processes for Machine Learning; MIT Press: Cambridge, MA, USA, 2006; Volume 2.
- Williams, C.; Rasmussen, C. Gaussian processes for regression. Adv. Neural Inf. Process. Syst. 1995, 8, 514–520.
- Tang, L.; Yu, L.; Wang, S.; Li, J.; Wang, S. A novel hybrid ensemble learning paradigm for nuclear energy consumption forecasting. Appl. Energy 2012, 93, 432–443.
- James, G.; Witten, D.; Hastie, T.; Tibshirani, R. An Introduction to Statistical Learning; Springer: Berlin/Heidelberg, Germany, 2013; Volume 112.
- Loh, W.Y. Classification and regression trees. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2011, 1, 14–23.
- Zhou, Z. Ensemble Methods: Foundations and Algorithms; CRC Press: Boca Raton, FL, USA, 2012; pp. 15–55.
- Breiman, L. Bagging predictors. Mach. Learn. 1996, 24, 123–140.
- Bühlmann, P.; Hothorn, T. Boosting algorithms: Regularization, prediction and model fitting. Stat. Sci. 2007, 22, 477–505.
- Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32.
- Chen, T.; He, T.; Benesty, M.; Khotilovich, V.; Tang, Y.; Cho, H.; Chen, K.; Zhou, T. Xgboost: Extreme gradient boosting. R Package Version 0.4-2 2015, 1, 1–4. Available online: https://cran.r-project.org/web/packages/xgboost/vignettes/xgboost.pdf (accessed on 17 June 2023).
- Deng, H.; Yan, F.; Wang, H.; Fang, L.; Zhou, Z.; Zhang, F.; Xu, C.; Jiang, H. Electricity Price Prediction Based on LSTM and LightGBM. In Proceedings of the 2021 IEEE 4th International Conference on Electronics and Communication Engineering (ICECE), Xi’an, China, 17–19 December 2021; pp. 286–290.
- Ke, G.; Meng, Q.; Finley, T.; Wang, T.; Chen, W.; Ma, W.; Ye, Q.; Liu, T.Y. Lightgbm: A highly efficient gradient boosting decision tree. Adv. Neural Inf. Process. Syst. 2017, 30, 52.
- Bull, A.D. Convergence rates of efficient global optimization algorithms. J. Mach. Learn. Res. 2011, 12, 2879–2904.
- Bergstra, J.; Bengio, Y. Random search for hyper-parameter optimization. J. Mach. Learn. Res. 2012, 13, 281–305.
- Yang, L.; Shami, A. On hyperparameter optimization of machine learning algorithms: Theory and practice. Neurocomputing 2020, 415, 295–316.
- Snoek, J.; Larochelle, H.; Adams, R.P. Practical bayesian optimization of machine learning algorithms. Adv. Neural Inf. Process. Syst. 2012, 25, 1–12.
- Shahriari, B.; Swersky, K.; Wang, Z.; Adams, R.P.; De Freitas, N. Taking the human out of the loop: A review of Bayesian optimization. Proc. IEEE 2015, 104, 148–175.
- Hastie, T.; Tibshirani, R.; Friedman, J.H.; Friedman, J.H. The Elements of Statistical Learning: Data Mining, Inference, and Prediction; Springer: Amsterdam, The Netherlands, 2009; Volume 2.
- Wang, S.; Zou, L.; Li, H.; Zheng, K.; Wang, Y.; Zheng, G.; Li, J. Full-scale membrane bioreactor process WWTPs in East Taihu basin: Wastewater characteristics, energy consumption and sustainability. Sci. Total Environ. 2020, 723, 137983.
- Rathnayake, N.; Rathnayake, U.; Dang, T.L.; Hoshino, Y. A Cascaded Adaptive Network-Based Fuzzy Inference System for Hydropower Forecasting. Sensors 2022, 22, 2905.
- Harrou, F.; Dairi, A.; Kadri, F.; Sun, Y. Forecasting emergency department overcrowding: A deep learning framework. Chaos Solitons Fractals 2020, 139, 110247.
Reference | Dataset | Prediction Algorithm | Result |
---|---|---|---|
Zhang et al. [22] | 2387 records from the China Urban Drainage Yearbook | RF model | Achieved an R2 of 0.702 |
Bagherzadeh et al. [28] | Melbourne water company between 2014 and 2019 | ANN, GBM, and RF | GBM reached the lowest RMSE and MAE values in the test phase, 33.9 and 26.9, respectively |
Boncescu et al. [23] | Data collected between 2015 and 2017 with a total of 403 records | Logistic Regression | Accuracy around 80% |
Ramli et al. [24] | EC dataset collected from TNB electrical bills from March 2011 to February 2015 | NN, KNN, SVM, and Linear Regression | ANN reached an RMSE of 52,084 |
Torregrossa et al. [26] | WWTPs in Northwest Europe | ANN and RF | RF and ANN obtained an R2 of 0.82 and 0.81, respectively |
Torregrossa et al. [25] | Solingen-Burg dataset | SVR, ANN, and RF algorithms | RF obtained an R of 0.71 |
Oulebsir et al. [30] | 317 WWTPs located in the north-west of Europe | ANN model | R2 ranged between 74.2% and 82.4% |
Zhang et al. [29] | Dataset covering the period from July 2010 to January 2012 | ANN model | Achieved an MAE of 0.78 and a MAPE of 0.02 |
Das et al. [31] | 360 instances collected over one year | ANN, RNN, LSTM, and GRU | GRU achieves the lowest MAE of 0.43 |
Oliveira et al. [32] | Three datasets covering the period between January 2016 and May 2020 | LSTM, CNN, and GRU | CNN models produced higher performance results in both tests, achieving 690 kWh in the RMSE and 630 kWh in the MAE |
Yusuf et al. [33] | A water district in southern California | ARIMA and LSTM models | Achieved an RMSE of 20.96 and a MAPE of 5.51 |
Filipe et al. [34] | 70,962 data points collected from September 2013 to June 2017 | Linear Quantile Regression, and Gradient Boosting Trees (GBT) | GBT achieved MAE and RMSE scores of 2.43% and 3.31%, respectively |
Methods | RMSE (MWh) | MAE (MWh) | MAPE (%) | RMSE Train (MWh) | Train Time (s) | J2 |
---|---|---|---|---|---|---|
LSVR | 46.40 | 36.11 | 12.09 | 39.12 | 60.297 | 1.40 |
CSVR | 41.96 | 33.56 | 12.30 | 45.58 | 197.89 | 0.84 |
GSVR | 41.85 | 32.28 | 11.24 | 38.86 | 28.94 | 1.15 |
QSVR | 43.44 | 33.68 | 11.81 | 40.66 | 97.052 | 1.14 |
GPRE | 43.32 | 33.48 | 11.45 | 37.92 | 469.67 | 1.30 |
GPRM3/2 | 46.02 | 35.51 | 11.88 | 38.16 | 308.8 | 1.45 |
GPRRQ | 42.44 | 32.62 | 11.24 | 37.97 | 805.05 | 1.24 |
GPRSE | 46.23 | 35.65 | 11.92 | 38.15 | 482.8 | 1.46 |
GPRM5/2 | 46.00 | 35.48 | 11.88 | 38.18 | 527.09 | 1.45 |
BT | 41.46 | 32.23 | 11.36 | 35.67 | 442.61 | 1.35 |
BST | 41.37 | 31.97 | 11.25 | 35.77 | 28.376 | 1.33 |
ODT | 42.57 | 33.57 | 11.87 | 39.06 | 19.345 | 1.18 |
NNN | 60.05 | 47.08 | 14.58 | 44.06 | 13.64 | 1.85 |
MNN | 93.77 | 67.85 | 18.51 | 55.90 | 11.86 | 2.81 |
WNN | 194.31 | 131.10 | 27.91 | 145.03 | 16.16 | 1.79 |
BNN | 64.54 | 51.00 | 15.65 | 45.90 | 10.11 | 1.97 |
TNN | 67.59 | 49.76 | 15.69 | 51.95 | 12.14 | 1.69 |
ONN | 47.09 | 36.55 | 12.21 | 39.40 | 157.72 | 1.42 |
XGBoost | 41.38 | 32.23 | 12.07 | 42.36 | 375.00 | 0.95 |
RF | 42.43 | 33.16 | 12.07 | 44.90 | 75.00 | 0.89 |
KNN | 41.51 | 32.82 | 12.30 | 41.41 | 21.00 | 1.00 |
LightGBM | 41.76 | 32.34 | 12.40 | 41.76 | 107.00 | 1.00
Methods | RMSE (MWh) | MAE (MWh) | MAPE (%) | RMSE Train (MWh) | Train Time (s) | J2 |
---|---|---|---|---|---|---|
LSVR | 47.76 | 37.19 | 12.30 | 39.52 | 102.18 | 1.46 |
CSVR | 41.96 | 33.56 | 12.30 | 45.68 | 178.57 | 0.84 |
GSVR | 41.96 | 32.16 | 11.17 | 38.39 | 61.98 | 1.19 |
QSVR | 45.31 | 35.14 | 11.90 | 40.03 | 264.51 | 1.28 |
GPRE | 43.30 | 33.50 | 11.44 | 38.06 | 379.95 | 1.29 |
GPRM3/2 | 42.35 | 32.67 | 11.25 | 38.30 | 462.06 | 1.22 |
GPRRQ | 42.43 | 32.62 | 11.23 | 37.91 | 977.71 | 1.25 |
GPRSE | 46.21 | 35.67 | 11.92 | 38.39 | 467.66 | 1.45 |
GPRM5/2 | 42.22 | 32.58 | 11.24 | 38.42 | 420.51 | 1.21 |
BT | 41.70 | 32.35 | 11.43 | 35.51 | 450.59 | 1.38 |
BST | 43.67 | 34.26 | 12.56 | 40.01 | 30.56 | 1.04 |
ODT | 42.71 | 33.26 | 11.60 | 38.73 | 35.81 | 1.22 |
NNN | 60.88 | 48.96 | 15.29 | 42.79 | 10.27 | 2.02 |
MNN | 101.34 | 70.46 | 18.85 | 78.04 | 9.85 | 1.69 |
WNN | 140.76 | 92.65 | 26.12 | 227.77 | 13.33 | 0.38 |
BNN | 60.04 | 46.36 | 14.50 | 59.87 | 10.30 | 1.01 |
TNN | 62.43 | 49.07 | 15.06 | 46.86 | 11.64 | 1.78 |
ONN | 43.44 | 33.74 | 11.53 | 41.22 | 298.74 | 1.11 |
XGBoost | 41.77 | 32.80 | 12.20 | 42.91 | 376.00 | 0.95 |
RF | 41.61 | 32.31 | 12.25 | 41.48 | 64.00 | 1.01 |
KNN | 37.33 | 28.23 | 10.65 | 41.41 | 10.00 | 0.08 |
LightGBM | 41.27 | 32.00 | 12.27 | 41.27 | 103.00 | 1.00
Methods | RMSE (MWh) | MAE (MWh) | MAPE (%) | RMSE Train (MWh) | Train Time (s) | J2 |
---|---|---|---|---|---|---|
KNN | 37.33 | 28.23 | 10.65 | 37.48 | 13.00 | 0.99 |
XGBoost | 37.14 | 28.50 | 10.81 | 37.62 | 398.00 | 0.97 |
LightGBM | 37.38 | 28.63 | 10.96 | 37.11 | 150.00 | 1.01 |
GPRRQ | 37.45 | 28.65 | 10.04 | 34.17 | 936.09 | 1.20 |
RF | 37.86 | 28.73 | 10.91 | 41.73 | 66.00 | 0.82 |
GPRE | 37.36 | 28.74 | 10.05 | 33.88 | 549.21 | 1.22 |
BT | 37.56 | 28.75 | 10.27 | 33.99 | 332.30 | 1.22 |
GPRM5/2 | 37.42 | 28.81 | 10.07 | 33.93 | 502.68 | 1.22 |
BST | 37.83 | 28.86 | 10.25 | 34.92 | 58.89 | 1.17 |
ODT | 38.48 | 28.87 | 10.45 | 35.70 | 22.87 | 1.16 |
GSVR | 37.70 | 28.88 | 10.12 | 34.47 | 34.78 | 1.20 |
GPRSE | 37.59 | 28.92 | 10.10 | 33.96 | 784.89 | 1.23 |
QSVR | 38.62 | 29.30 | 10.24 | 34.38 | 210.03 | 1.26 |
ONN | 38.85 | 30.02 | 10.46 | 35.45 | 423.01 | 1.20 |
CSVR | 40.07 | 30.13 | 10.35 | 44.40 | 122.44 | 0.81 |
LSVR | 39.22 | 30.29 | 10.43 | 34.15 | 49.95 | 1.32 |
BNN | 46.96 | 36.31 | 12.43 | 39.83 | 11.41 | 1.39 |
TNN | 46.96 | 36.31 | 12.43 | 38.72 | 12.80 | 1.47 |
NNN | 53.83 | 39.83 | 12.55 | 57.08 | 10.52 | 0.89 |
MNN | 90.18 | 65.37 | 17.77 | 60.08 | 11.16 | 2.25 |
WNN | 152.43 | 111.08 | 26.80 | 210.20 | 14.63 | 0.53 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Alali, Y.; Harrou, F.; Sun, Y. Unlocking the Potential of Wastewater Treatment: Machine Learning Based Energy Consumption Prediction. Water 2023, 15, 2349. https://doi.org/10.3390/w15132349