Differentially Private Clustered Federated Load Prediction Based on the Louvain Algorithm
Abstract
1. Introduction
- (1) A federated learning load-forecasting framework tailored to residential energy users with heterogeneous data is proposed.
- (2) A privacy-preserving method for computing the similarity of load patterns is introduced. The similarities are used to construct a complex network, from which a novel federated clustering method based on the Louvain algorithm is developed; it has a lower computational cost and does not require pre-specifying the number of clusters (K).
- (3) An adaptive gradient-clipping, differentially private clustered federated averaging algorithm, pADP-FedAvg, is proposed together with its differential privacy analysis.
- (4) Weather and time factors are identified as key load-related variables, a user dataset is constructed, and an LSTM-based federated learning load-forecasting model is established. The effectiveness and advantages of the proposed methods are validated through case studies.
2. Related Technology
2.1. Federated Learning Model
2.2. Differential Privacy
3. Problem Description
- (1) The local model updates of all clients are zero, indicating that all clients have consistent data distributions and the optimal solution is reached simultaneously.
- (2) The local model updates are not all zero, which is caused by inconsistent data distributions during training. This scenario more accurately reflects practical applications of federated learning. In this case, the local model updates may not guide the global model toward the optimum, resulting in generally lower performance and accuracy of the global model.
4. Research Motivation
5. System Architecture and Design Scheme
5.1. System Architecture
5.2. Similarity Calculation
- Step 1
- Step 2: Equation (11) is used to calculate the cosine similarity between any two clients' local updates. Cosine similarity evaluates how similar two vectors are by computing the cosine of the angle between them; the closer the value is to 1, the more similar the two vectors are (a brief illustrative sketch follows this list).
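As a minimal illustration of this step, the sketch below computes pairwise cosine similarities between clients' local update vectors. The function names and the assumption that each update is supplied as a flattened NumPy array are illustrative, not taken from the paper.

```python
import numpy as np

def cosine_similarity(update_i, update_j):
    """Cosine similarity between two flattened client update vectors (cf. Equation (11))."""
    dot = float(np.dot(update_i, update_j))
    norm = float(np.linalg.norm(update_i) * np.linalg.norm(update_j))
    return dot / norm if norm > 0.0 else 0.0

def similarity_matrix(updates):
    """Pairwise similarity matrix over all clients' local updates."""
    n = len(updates)
    S = np.eye(n)
    for i in range(n):
        for j in range(i + 1, n):
            S[i, j] = S[j, i] = cosine_similarity(updates[i], updates[j])
    return S
```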
5.3. Complex Network Construction
5.4. Louvain Algorithm
5.5. Adaptive Differential Privacy Personalized Clustering Federated Learning Algorithm
Algorithm 1: pADP-FedAvg.

Input: number of communication rounds T; local iteration period L; initial personalized model; local update step size; noise parameter; initial adaptive clipping threshold; number r of clients selected in each round.
Output: personalized models.

Server executes:
1: initialize the global model
2: for each round do
3:   sample a set of r clients at random without replacement
4:   broadcast the current global model to all clients in the sampled set
5:   for each sampled client, in parallel, do
6:     ClientUpdate
7:   end for
8:   aggregate the returned local models to update the global model
9: end for
10: return the personalized models

ClientUpdate:
11: initialize the local model from the broadcast global model
12: for each local iteration do
13:   sample a mini-batch and compute its gradient
14:   adaptive gradient clipping:
15:     clip the gradient to the current clipping threshold
16:   add DP noise to the gradient:
17:     perturb the clipped gradient with Gaussian noise
18:   update the local model with the noisy gradient and the local step size
19: end for
20: compute the next clipping threshold by Equation (15)
21: upload the local model and clipping threshold to the server
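To make the ClientUpdate loop concrete, the following is a minimal NumPy sketch of local training with adaptive gradient clipping and Gaussian noise. The gradient oracle `grad_fn`, the mini-batch handling, and the median-based threshold update are illustrative assumptions; in particular, the threshold rule only stands in for Equation (15), which is not reproduced in this summary.

```python
import numpy as np

def client_update(w, batches, grad_fn, eta, sigma, C, rng=None):
    """Sketch of one client's local loop in the spirit of Algorithm 1 (ClientUpdate).

    w       : flattened model parameters (np.ndarray)
    batches : list of mini-batches, one per local iteration
    grad_fn : callable(w, batch) -> gradient of the local loss (stands in for line 13)
    eta     : local update step size
    sigma   : DP noise multiplier
    C       : current adaptive clipping threshold
    """
    rng = rng or np.random.default_rng()
    grad_norms = []
    for batch in batches:
        g = grad_fn(w, batch)
        norm = np.linalg.norm(g)
        grad_norms.append(norm)
        g = g * min(1.0, C / (norm + 1e-12))               # adaptive gradient clipping
        g = g + rng.normal(0.0, sigma * C, size=g.shape)   # Gaussian DP noise scaled by C
        w = w - eta * g                                    # noisy local SGD step
    C_next = float(np.median(grad_norms))                  # illustrative stand-in for Eq. (15)
    return w, C_next

# Example gradient oracle: least-squares loss on a mini-batch (illustrative only)
def ls_grad(w, batch):
    X, y = batch
    return 2.0 * X.T @ (X @ w - y) / len(y)
```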
5.6. Privacy Analysis
6. Electricity Load-Forecasting Model
6.1. Load Influencing Factors
6.2. Construction of User Dataset
6.3. Federated Learning Model Based on LSTM
| Feature Type | Feature Name | Data Type | Unit/Range | Transformation Method | Transformed Feature |
|---|---|---|---|---|---|
| Weather | Temperature | Continuous | °F | Normalization | temperature |
| Weather | Humidity | Continuous | % | Normalization | humidity |
| Weather | Pressure | Continuous | kPa | Normalization | pressure |
| Load | Load | Continuous | kW | Normalization | load |
| Time | Year | Discrete | 2016–2018 | One-hot encoding | oh-2016, oh-2017, oh-2018 |
| Time | Month | Discrete | 1–12 | sin/cos cyclic encoding | month-sin, month-cos |
| Time | Day | Discrete | 1–30 | sin/cos cyclic encoding | day-sin, day-cos |
| Time | Hour | Discrete | 0–23 | sin/cos cyclic encoding | hour-sin, hour-cos |
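The following pandas/NumPy sketch applies the transformations listed above to a raw hourly dataframe. The input column names (`temperature`, `year`, `month`, `day`, `hour`, etc.) and the cyclic periods are assumptions for illustration, not taken from the paper.

```python
import numpy as np
import pandas as pd

def transform_features(df):
    """Apply the feature transformations from the table above (column names are illustrative)."""
    out = pd.DataFrame(index=df.index)
    # Min-max normalization for continuous weather and load features
    for col in ["temperature", "humidity", "pressure", "load"]:
        lo, hi = df[col].min(), df[col].max()
        out[col] = (df[col] - lo) / (hi - lo)
    # One-hot encoding for the year -> oh-2016, oh-2017, oh-2018
    out = out.join(pd.get_dummies(df["year"], prefix="oh", prefix_sep="-").astype(float))
    # sin/cos cyclic encoding for month, day, and hour (periods chosen for illustration)
    for col, period in [("month", 12), ("day", 31), ("hour", 24)]:
        angle = 2.0 * np.pi * df[col] / period
        out[f"{col}-sin"] = np.sin(angle)
        out[f"{col}-cos"] = np.cos(angle)
    return out
```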
6.4. Clustered Federated Learning Load-Forecasting Process Based on the Louvain Algorithm
- Step 1: All clients participate in training with the classical differentially private federated averaging algorithm until convergence.
- Step 2: Based on the local update vectors at model convergence, the central server calculates the cosine similarity between clients.
- Step 3: Each client serves as a network node; the edge set and edge weights between nodes are constructed from the similarity of the learned client parameters, forming an undirected weighted complex network.
- Step 4: Community clustering is performed on this network using the Louvain algorithm (a sketch of Steps 3 and 4 is given after this list).
- Step 5: Within each resulting cluster, the pADP-FedAvg algorithm is applied for personalized, adaptive differentially private federated learning for electricity load forecasting.
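A minimal NetworkX sketch of Steps 3 and 4 is shown below: the pairwise similarity matrix from Section 5.2 is turned into an undirected weighted graph and partitioned with the built-in Louvain routine. The edge-pruning threshold and the random seed are illustrative choices, not values from the paper.

```python
import networkx as nx
import numpy as np
from networkx.algorithms.community import louvain_communities  # requires networkx >= 2.8

def cluster_clients(S, threshold=0.0, seed=0):
    """Group clients into communities from a pairwise similarity matrix S.

    S is the cosine-similarity matrix between clients' local updates.
    Edges with similarity <= threshold are dropped (illustrative pruning rule).
    """
    n = S.shape[0]
    G = nx.Graph()
    G.add_nodes_from(range(n))                               # one node per client
    for i in range(n):
        for j in range(i + 1, n):
            if S[i, j] > threshold:
                G.add_edge(i, j, weight=float(S[i, j]))      # edge weight = similarity
    # Louvain community detection; no need to pre-specify the number of clusters
    return louvain_communities(G, weight="weight", seed=seed)
```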
7. Case Study Analysis
7.1. Data Sources
7.2. Data Pre-Processing
7.3. Evaluation Metrics
7.4. Analysis of Simulation Results
8. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Parameter | Value |
|---|---|
| Number of training epochs on clients | 8 |
| Total communication rounds | 60 |
| LSTM hidden layers | 2 |
| Model structure | LSTM layer with 256 hidden states; LSTM layer with 128 hidden states; fully connected layer with 64 neurons; fully connected layer with 32 neurons |
| Input feature dimensions | 27 |
| Batch size | 128 |
| Learning rate | 0.0015 |
| Dropout probability | 0.2 |
| Output feature dimensions | 15 |
| Privacy budget | 0.4, 0.6, 0.8 |
| Loss function | MSE |
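For reference, a PyTorch sketch consistent with the structure in the table above is given below: two stacked LSTM layers with 256 and 128 hidden states, fully connected layers with 64 and 32 neurons, dropout 0.2, 27 input features, and a 15-dimensional output. The exact layer ordering, activations, and the final output projection are assumptions not specified in the table.

```python
import torch
import torch.nn as nn

class LoadForecastLSTM(nn.Module):
    """Sketch of the forecasting network summarized in the parameter table."""
    def __init__(self, n_features=27, n_outputs=15, dropout=0.2):
        super().__init__()
        self.lstm1 = nn.LSTM(n_features, 256, batch_first=True)
        self.lstm2 = nn.LSTM(256, 128, batch_first=True)
        self.head = nn.Sequential(
            nn.Dropout(dropout),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, n_outputs),       # assumed output projection to 15 dimensions
        )

    def forward(self, x):
        # x: (batch, sequence length, n_features)
        out, _ = self.lstm1(x)
        out, _ = self.lstm2(out)
        return self.head(out[:, -1, :])     # forecast from the last time step

model = LoadForecastLSTM()
criterion = nn.MSELoss()                                    # MSE loss, as in the table
optimizer = torch.optim.Adam(model.parameters(), lr=0.0015)  # learning rate from the table
```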
| Algorithm | MSE | RMSE | MAPE (%) |
|---|---|---|---|
| DP-FedAvg | 0.278 | 0.527 | 33.5 |
| pADP-FedAvg | 0.205 | 0.452 | 29.3 |
| Model | Evaluation Metric | ε = 0.4 | ε = 0.6 | ε = 0.8 |
|---|---|---|---|---|
| Global | MSE | 0.251 | 0.205 | 0.192 |
| Global | RMSE | 0.501 | 0.452 | 0.438 |
| Global | MAPE (%) | 35.5 | 29.3 | 19.7 |
| Cluster 1 | MSE | 0.042 | 0.017 | 0.009 |
| Cluster 1 | RMSE | 0.205 | 0.130 | 0.095 |
| Cluster 1 | MAPE (%) | 12.2 | 6.2 | 2.9 |
| Cluster 2 | MSE | 0.031 | 0.021 | 0.015 |
| Cluster 2 | RMSE | 0.176 | 0.145 | 0.122 |
| Cluster 2 | MAPE (%) | 15.1 | 8.5 | 3.3 |
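The three value columns above are labeled under the assumption that they correspond to the privacy budgets 0.4, 0.6, and 0.8 from the parameter table, with error decreasing as the budget grows; the original header left them unlabeled. For completeness, a small sketch of the reported metrics (MSE, RMSE, and MAPE in percent) is given below; the small epsilon guard in the MAPE denominator is an illustrative choice.

```python
import numpy as np

def evaluation_metrics(y_true, y_pred):
    """MSE, RMSE, and MAPE (%) — the metrics reported in the tables above."""
    err = np.asarray(y_pred) - np.asarray(y_true)
    mse = float(np.mean(err ** 2))
    rmse = float(np.sqrt(mse))
    mape = float(np.mean(np.abs(err) / np.clip(np.abs(y_true), 1e-8, None)) * 100.0)
    return {"MSE": mse, "RMSE": rmse, "MAPE": mape}
```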