PSO-Based Ensemble Meta-Learning Approach for Cloud Virtual Machine Resource Usage Prediction
Abstract
1. Introduction
- First, we propose an ensemble workload prediction method that combines recurrent neural network (RNN) variants through particle swarm optimization (PSO); a minimal illustrative sketch of the weighting idea appears after this list.
- Second, we compare the proposed model against PSO-based LSTM, Bi-LSTM, and GRU time series prediction models.
- Third, we evaluate the proposed PSO-based ensemble prediction model through a series of experiments on the GWA-T-12 and PlanetLab trace datasets.
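To make the combination step concrete, here is a minimal, self-contained sketch (not the authors' exact algorithm) of how PSO can search for the weights of a weighted-average ensemble over three base forecasters such as LSTM, Bi-LSTM, and GRU. The synthetic data, variable names (`base_preds`, `fitness`, `n_particles`), and PSO hyperparameters are illustrative assumptions; the full architecture is described in Section 4.

```python
# Minimal sketch (assumed, simplified): PSO searches for non-negative
# combination weights for three base forecasters (LSTM, Bi-LSTM, GRU) so that
# the weighted-average ensemble minimizes validation MSE.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in validation data: observed CPU usage and three imperfect base forecasts.
y_true = rng.uniform(0, 100, size=200)                  # observed CPU usage (%)
base_preds = np.stack([y_true + rng.normal(0, s, 200)   # one row per base model
                       for s in (5.0, 8.0, 6.0)])

def fitness(weights):
    """Validation MSE of the weight-normalized ensemble forecast."""
    w = np.clip(weights, 0, None)
    w = w / (w.sum() + 1e-12)            # normalize to a convex combination
    ensemble = w @ base_preds
    return np.mean((ensemble - y_true) ** 2)

# Plain global-best PSO over the three ensemble weights.
n_particles, n_dims, n_iters = 20, base_preds.shape[0], 100
w_inertia, c1, c2 = 0.7, 1.5, 1.5

pos = rng.uniform(0, 1, (n_particles, n_dims))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iters):
    r1, r2 = rng.uniform(size=pos.shape), rng.uniform(size=pos.shape)
    vel = w_inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

best_w = np.clip(gbest, 0, None)
best_w /= best_w.sum()
print("ensemble weights (LSTM, Bi-LSTM, GRU):", np.round(best_w, 3))
print("ensemble validation MSE:", round(fitness(gbest), 3))
```

Constraining the weights to a convex combination keeps the ensemble forecast on the same scale as the base forecasts; the meta-learning scheme in the paper may combine the experts differently.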
2. Problem Definition
3. Recent Key Contributions
3.1. Deep Learning-Based Approaches
3.2. Hybrid Approaches
4. Cloud Virtual Machine Resource Usage Prediction Based on PSO Ensemble Model
4.1. Particle Swarm Optimization
4.2. Ensemble Learning Techniques
4.3. Long Short-Term Memory Neural Networks
4.4. Bidirectional Long Short-Term Memory Neural Networks
4.5. Gated Recurrent Unit Neural Networks
4.6. Architecture Overview
Ensemble Expert Learning with PSO Optimization
5. Evaluation Metrics
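The tables in Section 6 report the mean absolute error (MAE), mean absolute percentage error (MAPE), and root-mean-square error (RMSE). Their standard definitions, for observed values \(y_t\), predictions \(\hat{y}_t\), and \(n\) test points, are given below; the paper may scale or present them slightly differently.

```latex
\mathrm{MAE}  = \frac{1}{n}\sum_{t=1}^{n}\bigl|y_t - \hat{y}_t\bigr|, \qquad
\mathrm{MAPE} = \frac{100}{n}\sum_{t=1}^{n}\left|\frac{y_t - \hat{y}_t}{y_t}\right|, \qquad
\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{t=1}^{n}\bigl(y_t - \hat{y}_t\bigr)^{2}}
```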
6. Experiment and Dataset
6.1. Experimental Results
VM Selection Results
6.2. Prediction Performance Results
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Wen, Y.; Wang, Y.; Liu, J.; Cao, B.; Fu, Q. CPU usage prediction for cloud resource provisioning based on deep belief network and particle swarm optimization. Concurr. Comput. 2020, 32, e5730. [Google Scholar] [CrossRef]
- Duggan, M.; Mason, K.; Duggan, J.; Howley, E.; Barrett, E. Predicting host CPU utilization in cloud computing using recurrent neural networks. In Proceedings of the 2017 12th International Conference for Internet Technology and Secured Transactions (ICITST), Cambridge, UK, 11–14 December 2017; pp. 67–72. [Google Scholar] [CrossRef]
- Devi, K.L.; Valli, S. Time series-based workload prediction using the statistical hybrid model for the cloud environment. Computing 2022, 105, 353–374. [Google Scholar] [CrossRef]
- Chen, Z.; Hu, J.; Min, G.; Zomaya, A.Y.; El-Ghazawi, T. Towards Accurate Prediction for High-Dimensional and Highly-Variable Cloud Workloads with Deep Learning. IEEE Trans. Parallel Distrib. Syst. 2020, 31, 923–934. [Google Scholar] [CrossRef] [Green Version]
- Wen, Y.; Wang, Z.; Zhang, Y.; Liu, J.; Cao, B.; Chen, J. Energy and cost aware scheduling with batch processing for instance-intensive IoT workflows in clouds. Future Gener. Comput. Syst. 2019, 101, 39–50. [Google Scholar] [CrossRef]
- Subirats, J.; Guitart, J. Assessing and forecasting energy efficiency on Cloud computing platforms. Future Gener. Comput. Syst. 2015, 45, 70–94. [Google Scholar] [CrossRef]
- Akram, U.; Fülöp, M.T.; Tiron-tudor, A.; Topor, D.I. Impact of Digitalization on Customers’ Well-Being in the Pandemic Period: Challenges and Opportunities for the Retail Industry. Int. J. Environ. Res. Public Health 2021, 18, 7533. [Google Scholar] [CrossRef]
- Yang, D.; Cao, J.; Fu, J.; Wang, J.; Guo, J. A pattern fusion model for multi-step-ahead CPU load prediction. J. Syst. Softw. 2013, 86, 1257–1266. [Google Scholar] [CrossRef]
- Hsu, C.H.; Slagter, K.D.; Chen, S.C.; Chung, Y.C. Optimizing energy consumption with task consolidation in clouds. Inf. Sci. 2014, 258, 452–462. [Google Scholar] [CrossRef]
- Rahmanian, A.A.; Ghobaei-Arani, M.; Tofighy, S. A learning automata-based ensemble resource usage prediction algorithm for cloud computing environment. Future Gener. Comput. Syst. 2018, 79, 54–71. [Google Scholar] [CrossRef]
- Qiu, F.; Zhang, B.; Guo, J. A Deep Learning Approach for VM Workload Prediction in the Cloud; IEEE: New York, NY, USA, 2016. [Google Scholar]
- Qiu, F.; Zhang, B.; Guo, J. A deep learning approach for VM workload prediction in the cloud. In Proceedings of the 2016 17th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD), Shanghai, China, 30 May–1 June 2016; pp. 319–324. [Google Scholar] [CrossRef]
- Zhang, Q.; Yang, L.T.; Yan, Z.; Chen, Z.; Li, P. An Efficient Deep Learning Model to Predict Cloud Workload for Industry Informatics. IEEE Trans. Ind. Inform. 2018, 14, 3170–3178. [Google Scholar] [CrossRef]
- Ruan, L.; Bai, Y.; Li, S.; He, S.; Xiao, L. Workload time series prediction in storage systems: A deep learning based approach. Clust. Comput. 2021, 26, 25–35. [Google Scholar] [CrossRef]
- Fülöp, M.T.; Breaz, T.O.; He, X.; Ionescu, C.A.; Cordo, G.S. The role of universities’ sustainability, teachers’ wellbeing, and attitudes toward e-learning during COVID-19. Front. Public Health 2022, 10, 981593. [Google Scholar] [CrossRef]
- Kumar, J.; Singh, A.K.; Buyya, R. Ensemble learning based predictive framework for virtual machine resource request prediction. Neurocomputing 2020, 397, 20–30. [Google Scholar] [CrossRef]
- Yazdanian, P.; Sharifian, S. E2LG: A multiscale ensemble of LSTM/GAN deep learning architecture for multistep-ahead cloud workload prediction. J. Supercomput. 2021, 77. [Google Scholar] [CrossRef]
- Kalekar, P. Time series forecasting using Holt-Winters exponential smoothing. Kanwal Rekhi Sch. Inf. Technol. 2004, 4329008, 1–13. Available online: http://www.it.iitb.ac.in/~praj/acads/seminar/04329008_ExponentialSmoothing.pdf (accessed on 10 December 2022).
- Weingärtner, R.; Bräscher, G.B.; Westphall, C.B. Cloud resource management: A survey on forecasting and profiling models. J. Netw. Comput. Appl. 2015, 47, 99–106. [Google Scholar] [CrossRef]
- Manvi, S.S.; Krishna Shyam, G. Resource management for Infrastructure as a Service (IaaS) in cloud computing: A survey. J. Netw. Comput. Appl. 2014, 41, 424–440. [Google Scholar] [CrossRef]
- Li, Z.; Yu, X.; Yu, L.; Guo, S.; Chang, V. Energy-efficient and quality-aware VM consolidation method. Future Gener. Comput. Syst. 2020, 102, 789–809. [Google Scholar] [CrossRef]
- Sun, X.; Ansari, N.; Wang, R. Optimizing Resource Utilization of a Data Center. IEEE Commun. Surv. Tutor. 2016, 18, 2822–2846. [Google Scholar] [CrossRef]
- Kumar, J.; Singh, A.K. Workload prediction in cloud using artificial neural network and adaptive differential evolution. Future Gener. Comput. Syst. 2018, 81, 41–52. [Google Scholar] [CrossRef]
- Kumar, K.D.; Umamaheswari, E. EWPTNN: An Efficient Workload Prediction Model in Cloud Computing Using Two-Stage Neural Networks. Procedia Comput. Sci. 2019, 165, 151–157. [Google Scholar] [CrossRef]
- Chouliaras, S.; Sotiriadis, S. Auto-scaling containerized cloud applications: A workload-driven approach. Simul. Model. Pract. Theory 2022, 121, 102654. [Google Scholar] [CrossRef]
- Kumaraswamy, S.; Nair, M.K. Intelligent VMs prediction in cloud computing environment. In Proceedings of the 2017 International Conference on Smart Technologies for Smart Nation (SmartTechCon), Bengaluru, India, 17–19 August 2017; pp. 288–294. [Google Scholar] [CrossRef]
- Khan, T.; Tian, W.; Ilager, S.; Buyya, R. Workload forecasting and energy state estimation in cloud data centres: ML-centric approach. Future Gener. Comput. Syst. 2022, 128, 320–332. [Google Scholar] [CrossRef]
- Deepika, T.; Prakash, P. Power consumption prediction in cloud data center using machine learning. Int. J. Electr. Comput. Eng. 2020, 10, 1524–1532. [Google Scholar] [CrossRef]
- Imdoukh, M.; Ahmad, I.; Alfailakawi, M.G. Machine learning-based auto-scaling for containerized applications. Neural Comput. Appl. 2020, 32, 9745–9760. [Google Scholar] [CrossRef]
- Tseng, F.H.; Wang, X.; Chou, L.D.; Chao, H.C.; Leung, V.C.M. Dynamic Resource Prediction and Allocation for Cloud Data Center Using the Multiobjective Genetic Algorithm. IEEE Syst. J. 2018, 12, 1688–1699. [Google Scholar] [CrossRef]
- Kumar, J.; Goomer, R.; Singh, A.K. Long Short Term Memory Recurrent Neural Network (LSTM-RNN) Based Workload Forecasting Model for Cloud Datacenters. Procedia Comput. Sci. 2018, 125, 676–682. [Google Scholar] [CrossRef]
- Varma, P.R.K.; Kumari, V.V.; Kumar, S.S. Progress in Computing, Analytics and Networking; Springer: Singapore, 2018; Volume 710, ISBN 978-981-10-7870-5. [Google Scholar]
- Dang-Quang, N.M.; Yoo, M. An Efficient Multivariate Autoscaling Framework Using Bi-LSTM for Cloud Computing. Appl. Sci. 2022, 12, 3523. [Google Scholar] [CrossRef]
- Gan, Z.; Chen, P.; Yu, C.; Chen, J.; Feng, K. Workload Prediction based on GRU-CNN in Cloud Environment. In Proceedings of the 2022 International Conference on Computer Engineering and Artificial Intelligence (ICCEAI), Shijiazhuang, China, 22–24 July 2022; pp. 472–476. [Google Scholar] [CrossRef]
- Chen, Z.; Zhu, Y.; Di, Y.; Feng, S. Self-adaptive prediction of cloud resource demands using ensemble model and subtractive-fuzzy clustering based fuzzy neural network. Comput. Intell. Neurosci. 2015, 2015, 17. [Google Scholar] [CrossRef] [Green Version]
- Singh, N.; Rao, S. Ensemble learning for large-scale workload prediction. IEEE Trans. Emerg. Top. Comput. 2014, 2, 149–165. [Google Scholar] [CrossRef]
- Kim, I.K.; Wang, W.; Qi, Y.; Humphrey, M. Forecasting Cloud Application Workloads with CloudInsight for Predictive Resource Management. IEEE Trans. Cloud Comput. 2020, 10, 1848–1863. [Google Scholar] [CrossRef]
- Kim, I.K.; Wang, W.; Qi, Y.; Humphrey, M. CloudInsight: Utilizing a Council of Experts to Predict Future Cloud Application Workloads. In Proceedings of the 2018 IEEE 11th International Conference on Cloud Computing (CLOUD), San Francisco, CA, USA, 2–7 July 2018; pp. 41–48. [Google Scholar] [CrossRef]
- Zhong, W.; Zhuang, Y.; Sun, J.; Gu, J. A load prediction model for cloud computing using PSO-based weighted wavelet support vector machine. Appl. Intell. 2018, 48, 4072–4083. [Google Scholar] [CrossRef]
- Ouhame, S.; Hadi, Y.; Ullah, A. An efficient forecasting approach for resource utilization in cloud data center using CNN-LSTM model. Neural Comput. Appl. 2021, 33, 10043–10055. [Google Scholar] [CrossRef]
- Shen, H.; Hong, X. Host Load Prediction with Bi-directional Long Short-Term Memory in Cloud Computing. arXiv 2020, arXiv:2007.15582. [Google Scholar]
- Okwu, M.O.; Tartibu, L.K. Particle Swarm Optimisation. Stud. Comput. Intell. 2021, 927, 5–13. [Google Scholar] [CrossRef]
- Band, S.S.; Janizadeh, S.; Pal, S.C.; Saha, A.; Chakrabortty, R.; Shokri, M.; Mosavi, A. Novel ensemble approach of deep learning neural network (Dlnn) model and particle swarm optimization (pso) algorithm for prediction of gully erosion susceptibility. Sensors 2020, 20, 5609. [Google Scholar] [CrossRef]
- Yuan, C.; Tang, Y.; Mei, R.; Mo, F.; Wang, H. A PSO-LSTM Model of Offshore Wind Power Forecast considering the Variation of Wind Speed in Second-Level Time Scale. Math. Probl. Eng. 2021, 2021, 2009062. [Google Scholar] [CrossRef]
- Yao, Y.; Han, L.; Wang, J. LSTM-PSO: Long Short-Term Memory Ship Motion Prediction Based on Particle Swarm Optimization. In Proceedings of the 2018 IEEE CSAA Guidance, Navigation and Control Conference (CGNCC), Xiamen, China, 10–12 August 2018. [Google Scholar] [CrossRef]
- Gudise, V.G.; Venayagamoorthy, G.K. Comparison of particle swarm optimization and backpropagation as training algorithms for neural networks. In Proceedings of the 2003 IEEE Swarm Intelligence Symposium (SIS'03); IEEE: New York, NY, USA, 2003. Available online: https://ieeexplore.ieee.org/document/1202255 (accessed on 3 January 2023).
- Hartmanis, J.; van Leeuwen, J. Multiple Classifier Systems—First International Workshop, MCS 2000, Proceedings; LNCS; Springer: Berlin/Heidelberg, Germany, 2000; Volume 1857. [Google Scholar]
- Ensemble Learning: 5 Main Approaches. 2019. Available online: https://www.kdnuggets.com/2019/01/ensemble-learning-5-main-approaches.html (accessed on 8 December 2022).
- Aksoy, A.; Ertürk, Y.E.; Erdoğan, S.; Eyduran, E.; Tariq, M.M. Estimation of honey production in beekeeping enterprises from eastern part of Turkey through some data mining algorithms. Pak. J. Zool. 2018, 50, 2199–2207. [Google Scholar] [CrossRef]
- Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
- Cho, K.; van Merriënboer, B.; Bahdanau, D.; Bengio, Y. On the properties of neural machine translation: Encoder–decoder approaches. In Proceedings of SSST-8, Eighth Workshop on Syntax, Semantics and Structure in Statistical Translation; Association for Computational Linguistics, Stroudsburg, PA, USA, 2014; pp. 103–111. [Google Scholar] [CrossRef]
- Brownlee, J. Better Deep Learning: Train Faster, Reduce Overfitting, and Make Better Predictions. Mach. Learn. Mastery Python 2018, 1, 539. [Google Scholar]
- Merkle, F.H.; Discher, C.A. Controlled-Potential Coulometric Analysis of N-Substituted Phenothiazine Derivatives. Anal. Chem. 1964, 36, 1639–1643. [Google Scholar] [CrossRef]
- Shi, R.; Jiang, C. Three-Way Ensemble Prediction for Workload in the Data Center. IEEE Access 2022, 10, 10021–10030. [Google Scholar] [CrossRef]
Notation | Description |
---|---|
P | Physical computers |
V | Virtual machines |
\|A\| | Applications hosted in the data center |
References | Technique | Resource for Prediction | Dataset | Metrics | Strengths |
---|---|---|---|---|---|
[33] | BiLSTM | Virtual machine | GWA-T-12 and GWA-T-13 dataset | Accuracy, memory usage, and network-received throughput | Experiments on different actual workload datasets |
[10] | Learning automata (LA) | Virtual machine | CoMon project | Accuracy, CPU load prediction | Virtual machine using LA to tune the weights of multiple prediction models |
[40] | CNN-LSTM | Virtual machine | GWA-T-12 dataset | Accuracy, (CPU, memory, and network usage) | Used filtering mechanism to reduce noise in the workload data before inputting into the CNN layer |
[41] | BiLSTM | Physical machine | Google load trace data | Accuracy, CPU | Multistep-ahead prediction |
[1] | DP-CUPA learning (AR, GM, DBN, and PSO) | Physical machine | Google cluster usage trace | Accuracy, CPU usage | PSO to improve the accuracy of prediction results |
[34] | GRU-CNN | Physical machine | All the request data of the Shanghai Telecom base station | Accuracy, request data | Hybrid method combining GRU and CNN
PSO-based ensemble model | PSO-based ensemble meta-learning approach (GRU, BiLSTM, LSTM) | Virtual machine | Google cluster and PlanetLab traces | Accuracy, CPU usage | PSO to improve the accuracy of prediction results and consider multiple VMs’ CPU usage unlike other studies |
| | VM1_CPU_usage | VM2_CPU_usage | VM3_CPU_usage | VM4_CPU_usage |
|---|---|---|---|---|
| VM1_CPU_usage | 1 | 0.529261 | 0.399846 | 0.702094 |
| VM2_CPU_usage | 0.529261 | 1 | 0.40572 | 0.764131 |
| VM3_CPU_usage | 0.399846 | 0.40572 | 1 | 0.458631 |
| VM4_CPU_usage | 0.702094 | 0.764131 | 0.458631 | 1 |
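The matrix above reports pairwise correlations between the CPU-usage series of the selected VMs. As an illustration only (not the paper's preprocessing code), such a matrix can be produced with pandas from a hypothetical CSV holding one CPU-usage column per VM:

```python
# Illustrative only: pairwise (Pearson) correlation matrix over per-VM
# CPU-usage columns. The file name and column names are hypothetical.
import pandas as pd

df = pd.read_csv("vm_cpu_usage.csv")   # columns: VM1_CPU_usage ... VM4_CPU_usage
corr = df[["VM1_CPU_usage", "VM2_CPU_usage",
           "VM3_CPU_usage", "VM4_CPU_usage"]].corr(method="pearson")
print(corr.round(6))
```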
Model | MAE | MAPE | RMSE |
---|---|---|---|
GRU | 89.19% | 93.82% | 91.95% |
LSTM | 79.42% | 94.94% | 79.72% |
BiLSTM | 85.05% | 94.59% | 86.19% |
Model | MAE | MAPE | RMSE |
---|---|---|---|
GRU | 86.13% | 88.06% | 86.91% |
LSTM | 82.98% | 84.55% | 82.64% |
BiLSTM | 86.2% | 88.01% | 86.88% |
Model | MAE | MAPE | RMSE |
---|---|---|---|
GRU | 20.423 | 8.826 | 39.111 |
LSTM | 10.725 | 10.771 | 15.514 |
BiLSTM | 14.766 | 10.079 | 22.79 |
PSO Ensemble | 2.207 | 0.545 | 3.146 |
Model | MAE | MAPE | RMSE |
---|---|---|---|
GRU | 3.15 | 9.709 | 4.143 |
LSTM | 2.568 | 7.502 | 3.123 |
BiLSTM | 3.167 | 9.67 | 4.132 |
PSO Ensemble | 0.437 | 1.159 | 0.542 |