Improving Electrical Fault Detection Using Multiple Classifier Systems
Abstract
1. Introduction
- The evaluation of well-established MCS approaches from the literature for the fault classification task in electrical transmission systems;
- The assessment of how different levels of noise affect the performance of static ensemble and dynamic selection approaches;
- A comparison of the MCSs with various single models from the literature across a total of 14 scenarios;
- A demonstration of the superior performance of dynamic selection approaches over single models in handling noise and improving classification accuracy.
2. Multiple Classifier Systems
2.1. Static Ensemble
2.2. Dynamic Ensemble
2.2.1. DCS-LA
2.2.2. DESP
2.2.3. KNORA-E
2.2.4. KNORA-U
2.2.5. MCB
2.2.6. META-DES
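The dynamic selection methods above all follow the same recipe: for each query, define a region of competence from its nearest validation (DSEL) neighbours and keep only the pool members judged competent there. A minimal from-scratch sketch of the KNORA-E rule (an illustration under assumed settings, not the authors' implementation; the synthetic data and tree pool are invented for the example):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier

def knora_e_predict(pool, X_dsel, y_dsel, X_query, k=7):
    """KNORA-E sketch: keep only classifiers that are perfect on the
    query's k nearest DSEL neighbours; shrink k until one survives."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_dsel)
    # Pre-compute each pool member's correctness on the DSEL set.
    correct = np.array([clf.predict(X_dsel) == y_dsel for clf in pool])
    preds = np.array([clf.predict(X_query) for clf in pool])
    _, idx = nn.kneighbors(X_query)
    out = np.empty(len(X_query), dtype=y_dsel.dtype)
    for i, neigh in enumerate(idx):
        kk = k
        while kk > 0:
            # Members correct on every point of the (shrinking) region.
            mask = correct[:, neigh[:kk]].all(axis=1)
            if mask.any():
                break
            kk -= 1
        # Fall back to the whole pool if no member is ever perfect.
        votes = preds[mask, i] if kk > 0 else preds[:, i]
        out[i] = np.bincount(votes).argmax()  # majority vote of the selected members
    return out

X, y = make_classification(n_samples=600, n_classes=3, n_informative=6, random_state=0)
X_tr, X_tmp, y_tr, y_tmp = train_test_split(X, y, test_size=0.5, random_state=0)
X_dsel, X_te, y_dsel, y_te = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)
pool = [DecisionTreeClassifier(max_depth=d, random_state=0).fit(X_tr, y_tr)
        for d in (2, 4, 6, 8)]
acc = (knora_e_predict(pool, X_dsel, y_dsel, X_te, k=7) == y_te).mean()
print(f"KNORA-E sketch accuracy: {acc:.3f}")
```

KNORA-U differs only in that every classifier votes with a weight equal to the number of neighbours it classifies correctly, rather than being required to be perfect on all of them.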
3. Experimental Protocol
3.1. Dataset Description
- Class 0: No faults;
- Class 1: Phase-to-ground faults;
- Class 2: Phase-to-phase faults;
- Class 3: Phase-to-phase-to-ground faults;
- Class 4: Three-phase faults.
- Class 0: Fault AB;
- Class 1: Fault ABC;
- Class 2: Fault ABG;
- Class 3: Fault AC;
- Class 4: Fault ACG.
3.2. Experimental Setup
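The results below evaluate each model under increasing noise; the exact noise model is not reproduced in this excerpt. A plausible sketch, assuming zero-mean Gaussian noise whose per-feature standard deviation is scaled by a hypothetical `level` factor (`add_noise` is an invented name):

```python
import numpy as np

def add_noise(X, level, rng=None):
    """Add zero-mean Gaussian noise, scaled per feature.

    `level` is a hypothetical scale factor multiplying each feature's
    standard deviation; the paper's exact noise model may differ.
    """
    rng = np.random.default_rng(rng)
    return X + rng.normal(0.0, level * X.std(axis=0), size=X.shape)

X = np.random.default_rng(0).normal(size=(1000, 6))  # stand-in for the measurements
X_noisy = add_noise(X, level=0.5, rng=1)
print(X_noisy.shape)
```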
4. Results
4.1. Dataset 1
4.2. Dataset 2
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Model | Parameter | Values
---|---|---
KNN | n_neighbors | {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}
 | weights | {‘uniform’, ‘distance’}
 | metric | {‘euclidean’, ‘manhattan’, ‘minkowski’}
Decision Tree | random_state | {0, 1, 2, 42}
 | criterion | {‘gini’, ‘entropy’}
 | max_depth | {2, 3, 4, 5, 6, 7, 8, 9, 10, 11}
 | min_samples_leaf | {1, 2, 4, 6}
Random Forest | n_estimators | {1, 10, 30, 100, 200}
 | random_state | {0, 42}
 | max_depth | {2, 10, 30, None}
 | max_features | {‘auto’, ‘sqrt’, ‘log2’}
 | min_samples_leaf | {1, 2}
XGBoost | learning_rate | {0.1, 0.547, 0.6427}
 | max_depth | {2, 4, 6, 8, 10}
 | n_estimators | {2, 4, 8, 10, 200}
 | min_child_weight | {1, 3, 5}
 | subsample | {0.7, 0.8, 0.9}
LightGBM | learning_rate | {0.001, 0.01, 0.1}
 | max_depth | {2, 4, 6, 8, 10}
 | min_child_samples | {20}
 | n_estimators | {2, 4, 8, 10, 200}
 | num_leaves | {7, 31}
 | boosting_type | {‘gbdt’, ‘goss’}
CatBoost | learning_rate | {0.1, 0.01, 0.001}
 | max_depth | {2, 4, 6, 8, 10}
 | n_estimators | {2, 4, 8, 10, 200}
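Grids such as the one above are typically searched exhaustively with cross-validation. A sketch of the KNN portion using scikit-learn's GridSearchCV, with synthetic data standing in for the fault dataset:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

# The KNN rows of the hyperparameter table, expressed as a search grid.
param_grid = {
    "n_neighbors": list(range(1, 11)),
    "weights": ["uniform", "distance"],
    "metric": ["euclidean", "manhattan", "minkowski"],
}
X, y = make_classification(n_samples=400, random_state=0)  # stand-in data
search = GridSearchCV(KNeighborsClassifier(), param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print(search.best_params_)
```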
Static Ensemble | Parameter | Value
---|---|---
Majority Vote | voting | {‘hard’}
Stacked DT | meta_classifier | {DecisionTree}
 | meta_classifier_criterion | {‘gini’}
 | meta_classifier_min_samples_leaf | {1}
 | meta_classifier_min_samples_split | {2}
 | meta_classifier_splitter | {‘best’}
Stacked LR | meta_classifier | {LogisticRegression}
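The three static ensembles in the table map directly onto scikit-learn estimators; a sketch with an assumed three-member base pool (the pool composition here is illustrative, not the paper's exact pool):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)  # stand-in data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
base = [("knn", KNeighborsClassifier()),
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=50, random_state=0))]

# Hard majority vote, as in the 'Majority Vote' row above.
vote = VotingClassifier(base, voting="hard").fit(X_tr, y_tr)
# Stacking with a logistic-regression meta-classifier ('Stacked LR').
stack = StackingClassifier(base, final_estimator=LogisticRegression(max_iter=1000)).fit(X_tr, y_tr)

v_acc, s_acc = vote.score(X_te, y_te), stack.score(X_te, y_te)
print(v_acc, s_acc)
```

Swapping the `final_estimator` for a `DecisionTreeClassifier` gives the 'Stacked DT' variant.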
Dynamic Selection | Parameter | Value
---|---|---
DESP | DFP | {False}
 | DESL_perc | {0.5}
 | IH_rate | {0.3}
 | k | {7}
 | knn_classifier | {‘knn’}
 | knn_metric | {‘minkowski’}
 | mode | {‘selection’}
KNORA-E, KNORA-U | DFP | {False}
 | DESL_perc | {0.5}
 | IH_rate | {0.3}
 | k | {7}
 | knn_classifier | {‘knn’}
 | knn_metric | {‘minkowski’}
MCB | DFP | {False}
 | DESL_perc | {0.5}
 | IH_rate | {0.3}
 | diff_thresh | {0.1}
 | k | {7}
 | knn_classifier | {‘knn’}
 | knn_metric | {‘minkowski’}
 | knne | {False}
M-DES | DFP | {False}
 | DESL_perc | {0.5}
 | Hc | {1.0}
 | IH_rate | {0.3}
 | Kp | {5}
 | k | {7}
 | knn_classifier | {‘knn’}
 | knn_metric | {‘minkowski’}
 | meta_classifier | {‘Multinomial naive Bayes’}
 | mode | {‘selection’}
OLA | DFP | {False}
 | DESL_perc | {0.5}
 | k | {7}
 | knn_classifier | {‘knn’}
 | knn_metric | {‘minkowski’}
 | knne | {False}
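All rows above share k = 7 with a Minkowski-metric KNN: every dynamic approach defines its region of competence as the seven DSEL points nearest the query. A sketch of that shared neighbourhood computation on invented data:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Region of competence: the k=7 Minkowski-nearest DSEL points per query,
# matching the shared k / knn_metric settings in the table above.
rng = np.random.default_rng(0)
X_dsel = rng.normal(size=(200, 4))   # stand-in DSEL (validation) set
X_query = rng.normal(size=(3, 4))    # stand-in queries
nn = NearestNeighbors(n_neighbors=7, metric="minkowski").fit(X_dsel)
dist, idx = nn.kneighbors(X_query)
print(idx.shape)  # one row of 7 DSEL indices per query
```

Each method then differs only in how it scores the pool members on those seven points (local accuracy for OLA, perfect agreement for KNORA-E, meta-features for M-DES, and so on).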
Metric | Acronym | Equation | Limits
---|---|---|---
Accuracy | A | (TP + TN) / (TP + TN + FP + FN) | [0, 1]
Precision | P | TP / (TP + FP) | [0, 1]
Recall | R | TP / (TP + FN) | [0, 1]
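The scores in the tables that follow can be reproduced with scikit-learn's metric functions. A sketch using weighted averaging for the multiclass case (an assumption; the averaging scheme is not stated in this excerpt), on a tiny invented label set:

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score

y_true = [0, 0, 1, 1, 2, 2, 2]
y_pred = [0, 1, 1, 1, 2, 2, 0]
A = accuracy_score(y_true, y_pred)
# Per-class precision/recall, averaged with weights equal to class support.
P = precision_score(y_true, y_pred, average="weighted", zero_division=0)
R = recall_score(y_true, y_pred, average="weighted", zero_division=0)
print(round(A, 3), round(P, 3), round(R, 3))
```

Note that support-weighted recall equals accuracy by construction, which is consistent with the A and R rows coinciding throughout the result tables.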
Single models:

Noise Level | Metric | CatBoost | DT | KNN | LightGBM | RF | XGBoost
---|---|---|---|---|---|---|---
Without noise | A | 99.80 | 74.58 | 99.90 | 99.85 | 99.90 | 99.83
 | P | 99.80 | 77.24 | 99.90 | 99.85 | 99.90 | 99.83
 | R | 99.80 | 74.58 | 99.90 | 99.85 | 99.90 | 99.83
10,000 | A | 99.67 | 71.58 | 94.40 | 99.75 | 99.80 | 99.62
 | P | 99.67 | 74.52 | 94.36 | 99.75 | 99.80 | 99.63
 | R | 99.67 | 71.58 | 94.40 | 99.75 | 99.80 | 99.62
20,000 | A | 99.55 | 69.10 | 88.72 | 99.65 | 99.58 | 99.62
 | P | 99.55 | 72.02 | 88.45 | 99.65 | 99.58 | 99.62
 | R | 99.55 | 69.10 | 88.72 | 99.65 | 99.58 | 99.62
30,000 | A | 99.50 | 66.07 | 84.52 | 99.55 | 99.08 | 99.62
 | P | 99.50 | 68.64 | 83.94 | 99.55 | 99.08 | 99.63
 | R | 99.50 | 66.07 | 84.52 | 99.55 | 99.08 | 99.62
40,000 | A | 99.58 | 69.15 | 78.75 | 99.48 | 98.50 | 99.35
 | P | 99.58 | 98.50 | 77.58 | 99.48 | 98.50 | 99.35
 | R | 99.58 | 98.50 | 78.75 | 99.48 | 98.50 | 99.35
50,000 | A | 99.42 | 67.42 | 75.25 | 99.50 | 98.02 | 99.48
 | P | 99.42 | 67.60 | 73.40 | 99.50 | 98.05 | 99.48
 | R | 99.42 | 67.42 | 75.25 | 99.50 | 98.02 | 99.48
60,000 | A | 99.22 | 67.22 | 70.17 | 99.20 | 97.70 | 99.17
 | P | 99.22 | 60.77 | 67.34 | 99.20 | 97.74 | 99.18
 | R | 99.22 | 67.22 | 70.17 | 99.20 | 97.70 | 99.17
Static ensemble:

Noise Level | Metric | Majority Vote | Stacked DT | Stacked LR
---|---|---|---|---
Without noise | A | 99.90 | 99.90 | 99.90
 | P | 99.90 | 99.90 | 99.90
 | R | 99.90 | 99.90 | 99.90
10,000 | A | 99.65 | 99.72 | 99.85
 | P | 99.65 | 99.73 | 99.85
 | R | 99.65 | 99.72 | 99.85
20,000 | A | 99.50 | 99.78 | 99.85
 | P | 99.50 | 99.78 | 99.85
 | R | 99.50 | 99.78 | 99.85
30,000 | A | 99.17 | 99.55 | 99.72
 | P | 99.18 | 99.55 | 99.73
 | R | 99.17 | 99.55 | 99.72
40,000 | A | 98.80 | 99.78 | 99.78
 | P | 98.82 | 99.78 | 99.78
 | R | 98.80 | 99.78 | 99.78
50,000 | A | 98.40 | 99.55 | 99.70
 | P | 98.44 | 99.55 | 99.70
 | R | 98.40 | 99.55 | 99.70
60,000 | A | 97.92 | 99.45 | 99.60
 | P | 98.00 | 99.45 | 99.60
 | R | 97.92 | 99.45 | 99.60
Dynamic selection approaches:

Noise Level | Metric | DESP | KNORA-E | KNORA-U | MCB | M-DES | OLA
---|---|---|---|---|---|---|---
Without noise | A | 99.85 | 99.85 | 99.93 | 99.88 | 99.85 | 99.90
 | P | 99.85 | 99.85 | 99.93 | 99.88 | 99.85 | 99.90
 | R | 99.85 | 99.85 | 99.93 | 99.88 | 99.85 | 99.90
10,000 | A | 99.65 | 99.70 | 99.80 | 99.98 | 99.75 | 98.72
 | P | 99.65 | 99.70 | 99.80 | 99.98 | 99.75 | 98.72
 | R | 99.65 | 99.70 | 99.80 | 99.98 | 99.75 | 98.72
20,000 | A | 99.52 | 99.67 | 99.70 | 98.52 | 99.75 | 97.20
 | P | 99.53 | 99.68 | 99.70 | 98.53 | 99.75 | 97.26
 | R | 99.52 | 99.67 | 99.70 | 98.52 | 99.75 | 97.20
30,000 | A | 99.12 | 99.48 | 99.50 | 98.32 | 99.70 | 96.83
 | P | 99.14 | 99.48 | 99.50 | 98.32 | 99.70 | 96.90
 | R | 99.12 | 99.48 | 99.50 | 98.32 | 99.70 | 96.83
40,000 | A | 98.67 | 99.48 | 99.58 | 97.15 | 99.70 | 95.55
 | P | 98.70 | 99.48 | 99.58 | 97.15 | 99.70 | 95.61
 | R | 98.67 | 99.48 | 99.58 | 97.15 | 99.70 | 95.55
50,000 | A | 98.78 | 99.40 | 99.48 | 97.82 | 99.65 | 95.60
 | P | 99.79 | 99.40 | 99.48 | 97.82 | 99.65 | 95.64
 | R | 99.78 | 99.40 | 99.48 | 97.82 | 99.65 | 95.60
60,000 | A | 98.60 | 99.33 | 99.42 | 96.95 | 99.62 | 94.97
 | P | 98.63 | 99.33 | 99.43 | 96.96 | 99.63 | 95.11
 | R | 98.60 | 99.33 | 99.42 | 96.95 | 99.62 | 94.97
Single models:

Noise Level | Metric | CatBoost | DT | KNN | LightGBM | RF | XGBoost
---|---|---|---|---|---|---|---
Without noise | A | 99.40 | 53.87 | 72.00 | 99.83 | 99.77 | 99.79
 | P | 99.40 | 43.66 | 71.95 | 99.83 | 99.77 | 99.79
 | R | 99.40 | 53.87 | 72.00 | 99.83 | 99.77 | 99.79
0.1 | A | 97.79 | 53.79 | 71.44 | 99.21 | 98.45 | 99.10
 | P | 97.79 | 43.58 | 71.44 | 99.21 | 98.45 | 99.10
 | R | 97.79 | 53.79 | 71.44 | 99.21 | 98.45 | 99.10
0.3 | A | 95.45 | 53.10 | 69.91 | 97.14 | 95.89 | 97.14
 | P | 95.45 | 42.84 | 69.84 | 97.14 | 95.89 | 97.14
 | R | 95.45 | 53.10 | 69.91 | 97.14 | 95.89 | 97.14
0.5 | A | 92.16 | 52.56 | 68.32 | 94.16 | 92.42 | 92.43
 | P | 92.16 | 48.32 | 68.22 | 94.16 | 92.43 | 94.38
 | R | 92.16 | 52.56 | 68.32 | 94.16 | 92.42 | 94.38
0.7 | A | 89.52 | 51.67 | 66.35 | 90.53 | 89.19 | 90.83
 | P | 89.52 | 48.01 | 66.40 | 90.54 | 89.22 | 90.84
 | R | 89.52 | 51.67 | 66.35 | 90.53 | 89.19 | 90.83
1 | A | 84.91 | 48.75 | 63.96 | 85.76 | 84.02 | 86.35
 | P | 84.91 | 49.19 | 63.83 | 85.81 | 84.12 | 86.39
 | R | 84.91 | 48.75 | 63.96 | 85.76 | 84.02 | 86.35
1.3 | A | 80.53 | 48.74 | 61.69 | 81.63 | 79.99 | 81.70
 | P | 80.53 | 48.98 | 61.53 | 81.68 | 80.06 | 81.73
 | R | 80.53 | 48.74 | 61.69 | 81.63 | 79.99 | 81.70
1.5 | A | 77.65 | 47.46 | 59.92 | 78.71 | 76.85 | 78.87
 | P | 77.65 | 46.72 | 59.75 | 78.82 | 76.92 | 78.91
 | R | 77.65 | 47.46 | 59.72 | 78.71 | 76.95 | 78.87
Static ensemble:

Noise Level | Metric | Majority Vote | Stacked DT | Stacked LR
---|---|---|---|---
Without noise | A | 99.66 | 99.77 | 99.80
 | P | 99.66 | 99.77 | 99.80
 | R | 99.66 | 99.77 | 99.80
0.1 | A | 98.92 | 98.75 | 99.23
 | P | 98.92 | 98.74 | 99.23
 | R | 98.92 | 98.75 | 99.23
0.3 | A | 96.65 | 95.62 | 97.05
 | P | 96.66 | 95.62 | 97.05
 | R | 96.65 | 95.62 | 97.05
0.5 | A | 93.64 | 91.21 | 93.84
 | P | 93.67 | 91.21 | 93.84
 | R | 93.64 | 91.21 | 93.84
0.7 | A | 90.45 | 86.52 | 90.76
 | P | 90.50 | 86.52 | 90.77
 | R | 90.45 | 86.52 | 90.76
1 | A | 85.80 | 80.25 | 85.61
 | P | 85.95 | 80.10 | 85.45
 | R | 85.80 | 80.25 | 85.61
1.3 | A | 81.27 | 74.15 | 81.27
 | P | 81.41 | 74.35 | 81.47
 | R | 81.27 | 74.15 | 81.27
1.5 | A | 78.58 | 70.55 | 78.34
 | P | 78.74 | 71.05 | 78.56
 | R | 78.58 | 70.55 | 78.34
Dynamic selection approaches:

Noise Level | Metric | DESP | KNORA-E | KNORA-U | MCB | M-DES | OLA
---|---|---|---|---|---|---|---
Without noise | A | 98.36 | 99.64 | 99.64 | 97.19 | 99.74 | 96.79
 | P | 98.36 | 99.64 | 99.64 | 97.19 | 99.74 | 96.79
 | R | 98.36 | 99.64 | 99.64 | 97.19 | 99.74 | 96.79
0.1 | A | 98.86 | 98.79 | 98.74 | 96.24 | 99.02 | 95.43
 | P | 98.96 | 98.79 | 98.74 | 96.25 | 99.02 | 95.43
 | R | 98.86 | 98.79 | 98.74 | 96.24 | 99.02 | 95.43
0.3 | A | 94.27 | 95.69 | 96.28 | 93.60 | 96.76 | 93.02
 | P | 94.41 | 95.71 | 96.30 | 93.62 | 96.77 | 93.02
 | R | 94.27 | 95.69 | 96.28 | 93.60 | 96.76 | 93.02
0.5 | A | 90.84 | 91.34 | 92.83 | 90.14 | 92.97 | 89.06
 | P | 91.87 | 91.40 | 92.89 | 90.19 | 93.02 | 89.09
 | R | 90.84 | 91.34 | 92.83 | 90.14 | 92.97 | 89.06
0.7 | A | 87.86 | 87.72 | 89.53 | 86.10 | 89.54 | 85.57
 | P | 88.10 | 87.80 | 89.61 | 86.15 | 89.62 | 85.61
 | R | 87.86 | 87.72 | 89.53 | 86.10 | 89.54 | 85.57
1 | A | 84.77 | 82.41 | 84.95 | 80.90 | 83.99 | 80.08
 | P | 84.30 | 82.25 | 84.19 | 79.91 | 83.61 | 80.00
 | R | 84.77 | 82.41 | 84.19 | 80.90 | 83.99 | 80.08
1.3 | A | 80.47 | 77.70 | 80.62 | 76.01 | 79.16 | 75.51
 | P | 80.67 | 77.90 | 80.92 | 76.20 | 79.35 | 75.71
 | R | 80.47 | 77.70 | 80.62 | 76.01 | 79.16 | 75.51
1.5 | A | 77.67 | 74.86 | 77.97 | 74.15 | 75.90 | 72.92
 | P | 77.87 | 74.98 | 78.05 | 74.35 | 75.90 | 72.98
 | R | 77.67 | 74.86 | 77.97 | 74.15 | 75.90 | 72.92
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Oliveira, J.; Passos, D.; Carvalho, D.; Melo, J.F.V.; Silva, E.G.; de Mattos Neto, P.S.G. Improving Electrical Fault Detection Using Multiple Classifier Systems. Energies 2024, 17, 5787. https://doi.org/10.3390/en17225787