Comparative Evaluation of Non-Intrusive Load Monitoring Methods Using Relevant Features and Transfer Learning
Abstract
1. Introduction
2. Problem Statement
2.1. Supervised HEA Identification
2.2. Feature Selection for HEA Identification
2.3. Transfer Learning
3. Materials
3.1. HEAs Datasets
3.1.1. PLAID Dataset
3.1.2. New Proposed Dataset
3.2. Electrical Features Computed from Current and Voltage Measurements
- The Root Mean Squared (RMS) values of the harmonic components of the voltage and current, $V_h$ and $I_h$, and their sums $V_H = \sqrt{\sum_{h \neq 1} V_h^2}$ and $I_H = \sqrt{\sum_{h \neq 1} I_h^2}$;
- The RMS voltage V and current I: $V = \sqrt{V_1^2 + V_H^2}$ and $I = \sqrt{I_1^2 + I_H^2}$;
- The harmonic components of the active, reactive, and apparent powers, $P_h = V_h I_h \cos\varphi_h$, $Q_h = V_h I_h \sin\varphi_h$, $S_h = V_h I_h$, and their sums $P_H$, $Q_H$, $S_H$;
- The active, reactive, apparent, and distortion powers P, Q, S, and D: $P = P_1 + P_H$, $Q = Q_1 + Q_H$, $S = VI$, and $D = \sqrt{S^2 - P^2 - Q^2}$;
- The voltage and current total harmonic distortions $THD_V = V_H / V_1$ and $THD_I = I_H / I_1$;
- The voltage and current distortion powers $D_V = V_H I_1$ and $D_I = V_1 I_H$;
- The non-fundamental apparent power $S_N = \sqrt{S^2 - (V_1 I_1)^2}$;
- The voltage and current crest factors $CF_V = V_{\mathrm{peak}} / V$ and $CF_I = I_{\mathrm{peak}} / I$;
- Finally, the global and harmonic power factors $PF = P / S$ and $PF_h = P_h / S_h$.
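Under the definitions above, these features can be estimated from sampled waveforms with an FFT. The following sketch is our own illustrative code, not the paper's implementation; function and key names are ours, and it assumes the analysis window spans an integer number of mains periods:

```python
import numpy as np

def electrical_features(v, i, fs, f0=50.0, n_harm=8):
    """Illustrative per-harmonic feature extraction from one analysis window.

    v, i : voltage and current samples spanning an integer number of
           fundamental periods; fs : sampling rate (Hz); f0 : mains frequency.
    """
    n = len(v)
    V = np.fft.rfft(v) / n * 2.0        # complex spectra scaled to peak amplitudes
    I = np.fft.rfft(i) / n * 2.0
    feats = {}
    for h in range(1, n_harm + 1):
        k = int(round(h * f0 * n / fs))  # FFT bin of harmonic h
        Vh, Ih = abs(V[k]) / np.sqrt(2), abs(I[k]) / np.sqrt(2)  # RMS values
        phi = np.angle(V[k]) - np.angle(I[k])                    # phase shift
        feats[f"V{h}"], feats[f"I{h}"] = Vh, Ih
        feats[f"P{h}"] = Vh * Ih * np.cos(phi)  # harmonic active power (W)
        feats[f"Q{h}"] = Vh * Ih * np.sin(phi)  # harmonic reactive power (VAR)
        feats[f"S{h}"] = Vh * Ih                # harmonic apparent power (VA)
    Vrms, Irms = np.sqrt(np.mean(v ** 2)), np.sqrt(np.mean(i ** 2))
    P, S = np.mean(v * i), Vrms * Irms          # total active and apparent powers
    feats.update(
        V=Vrms, I=Irms, P=P, S=S,
        THD_I=np.sqrt(max(Irms ** 2 - feats["I1"] ** 2, 0.0)) / feats["I1"],
        PF=P / S,                               # global power factor
    )
    return feats
```

For a purely sinusoidal load, `THD_I` is close to zero and `PF` reduces to the displacement factor $\cos\varphi_1$.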
4. Feature Selection
4.1. Investigated Feature Selection Methods
4.1.1. Existing Methods
- Linear Discriminant Analysis (LDA) can be used as a supervised filter-based method maximizing the separation between classes (see Section 5.1).
- Mutual Information (MI) is a filter-based method measuring the amount of information each feature conveys about the class labels [35].
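The MI filter can be illustrated with a small self-contained sketch. This is our own code, not the paper's: it discretizes each continuous feature by histogram binning (one of several possible MI estimators) and keeps the highest-scoring columns:

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Empirical mutual information between a continuous feature x and
    discrete class labels y, using histogram binning."""
    xd = np.digitize(x, np.histogram_bin_edges(x, bins=bins)[1:-1])  # bin index 0..bins-1
    classes = {c: j for j, c in enumerate(sorted(set(y)))}
    joint = np.zeros((bins, len(classes)))
    for xi, yi in zip(xd, y):
        joint[xi, classes[yi]] += 1
    p = joint / joint.sum()                    # joint distribution p(x, y)
    px = p.sum(axis=1, keepdims=True)          # marginal p(x)
    py = p.sum(axis=0, keepdims=True)          # marginal p(y)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

def rank_by_mi(X, y, n_keep=20):
    """Rank the feature columns of X by MI with the labels; keep the best n_keep."""
    scores = np.array([mutual_information(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:n_keep]   # most informative first
```

A feature that nearly determines the class label scores close to the label entropy, while an independent noise feature scores near zero.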
4.1.2. New Proposed Sequential Forward Method
Algorithm 1: Sequential forward FS algorithm
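A sequential forward loop of this kind can be sketched as follows. The code is our own illustration, not a reproduction of Algorithm 1: the `score` callback stands in for any cross-validated accuracy estimate (e.g., of the KNN or LDA classifier), and the stopping rule mirrors the additive criterion (stop when adding a feature no longer improves the score):

```python
import numpy as np

def sequential_forward_fs(X, y, score, max_feats=None):
    """Greedy forward selection: at each step, add the feature that most
    improves score(X_subset, y); stop when no candidate improves it."""
    remaining = list(range(X.shape[1]))
    selected, history = [], []
    best_so_far = -np.inf
    while remaining and (max_feats is None or len(selected) < max_feats):
        # Evaluate every candidate feature added to the current subset.
        scores = [(score(X[:, selected + [j]], y), j) for j in remaining]
        best_score, best_j = max(scores)
        if best_score <= best_so_far:   # additive criterion: no further gain
            break
        selected.append(best_j)
        remaining.remove(best_j)
        best_so_far = best_score
        history.append(best_score)
    return selected, history
```

The `history` list gives the accuracy-versus-number-of-features curve from which the plateau behavior discussed in Section 4.2 can be read off.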
4.1.3. New Proposed Deep Neural Network (DNN) Feature Selection Method
4.2. Feature Selection Results
- For both datasets, some features are selected regardless of the FS method used;
- For both datasets, the features selected by the DNN method for FS and by the PCA method are diversified in terms of harmonic orders;
- For both datasets, the features selected by the MI and LDA methods are related to odd-order harmonics, which describe the power supply structures included in most of the HEAs;
- For the sequential forward FS method, our experiments compare the results provided by the Euclidean-based KNN classifier (where the neighborhood parameter is set to K = 7) with those of the LDA classifier. The number of nearest neighbors is set to 7 because it is the closest odd number to the number of instances in a class in the proposed dataset, so that each neighborhood yields a majority vote. The selected features are those that reach the maximum accuracy [47] reported in Table 2 and Table 3. For our dataset, only 12 features are needed to maximize the KNN classifier accuracy and 18 features maximize the accuracy of the LDA classifier (see Figure 5). For the PLAID dataset, 25 features maximize the KNN classifier accuracy and 33 features maximize the LDA classifier accuracy (see Figure 6). The low accuracy reached by the LDA classifier on the PLAID dataset can be explained by its unbalanced training sets (where one or several classes outnumber the others) [48]. Indeed, LDA is known not to perform well in this setting, since classification is generally biased towards the majority classes. It is also observed that the accuracy reaches a plateau less rapidly for the LDA classifier than for the KNN. Indeed, KNN is known to be affected by overfitting [49,50]: irrelevant features increase the distance between individuals of the same class and decrease the accuracy. KNN cannot weight features by relevance and is therefore more sensitive to FS than classifiers such as LDA, which can handle irrelevant features [51].
5. Home Electrical Appliances Classification Results
5.1. Investigated Classification Methods
- The KNN method is widely used by the NILM community for HEAs’ identification [12,52]. We use the Euclidean distance and K = 7, which corresponds to the closest odd number to the number of instances in a class in the proposed dataset. Hence, the predicted class corresponds to the most represented one in the neighborhood through majority voting.
- The LDA method estimates the optimal linear combination of features using the eigenvectors of the projection matrix $S_w^{-1} S_b$ of dimension $d \times d$ (with $d$ the number of features), where $S_w = \sum_{k=1}^{K} \Sigma_k$ and $S_b = \sum_{k=1}^{K} N_k (\mu_k - \mu)(\mu_k - \mu)^{\top}$, where $\Sigma_k$ are the covariance matrices built from the corresponding individuals ($N_k$ being the number of individuals of class $k$); $\mu$ and $\mu_k$ correspond to the mean over all the individuals of the whole dataset and the mean over all the individuals in the class $k$, respectively. Then, the tested individuals are projected into the discriminative linear space before being assigned to the class whose centroid is the closest in terms of the Euclidean distance.
- The proposed DNN classification method uses the same fully connected DNN architecture as presented in Section 4.1.3 for FS. Our implementation is based on tensorflow/keras (https://keras.io/). The training is performed with a batch size of 64 and a maximal number of 350 epochs (one epoch is reached each time the whole training dataset has been processed once). The optimization is performed using the RMSprop algorithm [43] with a fixed learning rate.
- The Random Forest (RF) classification method creates a set of decision trees and aggregates the votes from the decision trees to predict the class of the tested individuals [53]. The number of trees was set to 5 after experimental tuning to get the best results.
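As an illustration of the first of these classifiers, the Euclidean KNN with majority voting can be sketched as follows. This is our own minimal code, not the authors' implementation:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=7):
    """Euclidean KNN with majority voting (K = 7 as in the paper's setup):
    the predicted class is the most represented one among the k nearest
    training individuals."""
    d = np.linalg.norm(X_train - x, axis=1)   # Euclidean distance to every training point
    nearest = np.argsort(d)[:k]               # indices of the k closest individuals
    votes = Counter(y_train[nearest])         # class counts in the neighborhood
    return votes.most_common(1)[0][0]
```

Choosing k odd (and close to the per-class instance count) avoids ties in the two-class-dominated neighborhoods discussed in Section 4.2.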
5.2. Classification Test Procedure
5.3. Self-Database Results
5.3.1. Proposed Dataset
5.3.2. PLAID Dataset
5.4. Transfer Learning Results
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
BN | Batch Normalization |
DA | Data Augmentation |
DNN | Deep Neural Network |
FS | Feature Selection |
HEA(s) | Home Electrical Appliance(s) |
KNN | K-nearest-neighbor |
LDA | Linear Discriminant Analysis |
MI | Mutual Information |
NILM | Non-Intrusive Load Monitoring |
PCA | Principal Component Analysis |
PCC | Point of Common Coupling |
PLAID | Public Dataset of High Resolution for Load Identification Research |
ReLU | Rectified Linear Unit |
RF | Random Forest |
SNR | Signal-to-Noise Ratio |
Appendix A. Confusion Matrices
References
- Darby, S. The Effectiveness of Feedback on Energy Consumption. In A Review for DEFRA of the Literature on Metering, Billing and Direct Displays; University of Oxford: Oxford, UK, 2006; Volume 486.
- Wang, Y.; Chen, Q.; Hong, T.; Kang, C. Review of smart meter data analytics: Applications, methodologies, and challenges. IEEE Trans. Smart Grid 2018, 10, 3125–3148.
- Hart, G.W. Non-intrusive appliance load monitoring. Proc. IEEE 1992, 80, 1870–1891.
- Zoha, A.; Gluhak, A.; Imran, M.A.; Rajasegarar, S. Non-intrusive load monitoring approaches for disaggregated energy sensing: A survey. Sensors 2012, 12, 16838–16866.
- Faustine, A.; Mvungi, N.H.; Kaijage, S.; Michael, K. A survey on non-intrusive load monitoring methodies and techniques for energy disaggregation problem. arXiv 2017, arXiv:1703.00785.
- Houidi, S.; Auger, F.; Sethom, H.B.A.; Fourer, D.; Miègeville, L. Multivariate event detection methods for non-intrusive load monitoring in smart homes and residential buildings. Energy Build. 2020, 208, 109624.
- Houidi, S.; Fourer, D.; Auger, F. On the Use of Concentrated Time-Frequency Representations as Input to a Deep Convolutional Neural Network: Application to Non Intrusive Load Monitoring. Entropy 2020, 22, 911.
- Liang, J.; Ng, S.K.K.; Kendall, G.; Cheng, J.W.M. Load Signature Study Part I: Basic Concept, Structure, and Methodology. IEEE Trans. Power Del. 2010, 25, 551–560.
- Zeifman, M.; Roth, K. Non intrusive appliance load monitoring: Review and outlook. IEEE Trans. Consum. Electron. 2011, 57, 76–84.
- Kalogridis, G.; Efthymiou, C.; Denic, S.; Cepeda, R. Privacy for smart meters: Towards undetectable appliance load signatures. In Proceedings of the First IEEE International Conference on Smart Grid Communications, Gaithersburg, MD, USA, 4–6 October 2010; pp. 232–237.
- Ruano, A.; Hernandez, A.; Ureña, J.; Ruano, M.; Garcia, J. NILM Techniques for Intelligent Home Energy Management and Ambient Assisted Living: A Review. Energies 2019, 12.
- Figueiredo, M.; De Almeida, A.; Ribeiro, B. An Experimental Study on Electrical Signature Identification of Non-Intrusive Load Monitoring (NILM) Systems. In Proceedings of the 10th International Conference on Adaptive and Natural Computing Algorithms (ICANNGA), Ljubljana, Slovenia, 14–16 April 2011; pp. 31–40.
- Do Nascimento, P.P.M. Applications of Deep Learning Techniques on NILM. Ph.D. Thesis, Universidade Federal do Rio de Janeiro, Rio de Janeiro, Brazil, 2016.
- De Paiva Penha, D.; Castro, A.R.G. Home appliance identification for NILM systems based on deep neural networks. Int. J. Artif. Intell. Appl. 2018, 9, 69–80.
- Piccialli, V.; Sudoso, A.M. Improving Non-Intrusive Load Disaggregation through an Attention-Based Deep Neural Network. Energies 2021, 14, 847.
- Cannas, B.; Carcangiu, S.; Carta, D.; Fanni, A.; Muscas, C. Selection of Features Based on Electric Power Quantities for Non-Intrusive Load Monitoring. Appl. Sci. 2021, 11.
- Guyon, I.; Elisseeff, A. An introduction to variable and feature selection. J. Mach. Learn. Res. 2003, 3, 1157–1182.
- Sadeghianpourhamami, N.; Ruyssinck, J.; Deschrijver, D.; Dhaene, T.; Develder, C. Comprehensive feature selection for appliance classification in NILM. Energy Build. 2017, 151, 98–106.
- Houidi, S.; Auger, F.; Ben Attia Sethom, H.; Fourer, D.; Miègeville, L. Relevant feature selection for home appliances recognition. In Proceedings of the Electrimacs 2017 Conference, Nancy, France, 4–6 July 2017.
- Kato, T.; Cho, H.S.; Lee, D.; Toyomura, T.; Yamazaki, T. Appliance Recognition from Electric Current Signals for Information-Energy Integrated Network in Home Environments. In Proceedings of the Smart Homes and Health Telematics (ICOST), Tours, France, 1–3 July 2009; pp. 150–157.
- D’Incecco, M.; Squartini, S.; Zhong, M. Transfer learning for non-intrusive load monitoring. IEEE Trans. Smart Grid 2019, 11, 1419–1429.
- Houidi, S. Classification des Charges Électriques Résidentielles en vue de Leur Gestion Intelligente et de Leur Comptabilisation. Ph.D. Thesis, University of Nantes, Saint-Nazaire, France, 2020.
- Houidi, S.; Auger, F.; Sethom, H.B.A.; Miègeville, L.; Fourer, D.; Jiang, X. Statistical Assessment of Abrupt Change Detectors for Non Intrusive Load Monitoring. In Proceedings of the 2018 IEEE International Conference on Industrial Technology (ICIT), Lyon, France, 20–22 February 2018.
- Molina, L.C.; Belanche, L.; Nebot, À. Feature selection algorithms: A survey and experimental evaluation. In Proceedings of the IEEE International Conference on Data Mining (ICDM), 2002; pp. 306–313.
- Jain, A.; Zongker, D. Feature selection: Evaluation, application, and small sample performance. IEEE Trans. Pattern Anal. Mach. Intell. 1997, 19, 153–158.
- Gao, J.; Giri, S.; Kara, E.C.; Bergés, M. PLAID: A Public Dataset of High-resolution Electrical Appliance Measurements for Load Identification Research: Demo Abstract. In Proceedings of the 2nd ACM International Conference on Embedded Systems for Energy-Efficient Built Environments, Seoul, Korea, 4–5 November 2014; pp. 198–199.
- Renaux, D.P.B.; Pottker, F.; Ancelmo, H.C.; Lazzaretti, A.E.; Lima, C.R.E.; Linhares, R.R.; Oroski, E.; Nolasco, L.d.S.; Lima, L.T.; Mulinari, B.M.; et al. A Dataset for Non-Intrusive Load Monitoring: Design and Implementation. Energies 2020, 13.
- Langella, R.; Testa, A. IEEE standard definitions for the measurement of electric power quantities under sinusoidal, nonsinusoidal, balanced, or unbalanced conditions. Rev. IEEE Std. 1459–2000 2010, 1–40.
- Eigeles, E.A. On the Assessment of Harmonic Pollution. IEEE Trans. Power Del. 1995, 10, 693–698.
- Aha, D.W.; Bankert, R.L. A comparative evaluation of sequential feature selection algorithms. In Learning from Data; Springer: New York, NY, USA, 1996; pp. 199–206.
- Yu, L.; Liu, H. Efficient feature selection via analysis of relevance and redundancy. J. Mach. Learn. Res. 2004, 5, 1205–1224.
- Cadenas, J.M.; Garrido, M.C.; Martínez, R. Feature subset selection filter-wrapper based on low quality data. Expert Syst. Appl. 2013, 40, 6241–6252.
- Anderson, T.W. An Introduction to Multivariate Statistical Analysis; John Wiley and Sons Inc.: New York, NY, USA, 1958; Volume 2.
- Boutsidis, C.; Mahoney, M.W.; Drineas, P. Unsupervised feature selection for principal components analysis. In Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (SIGKDD), Las Vegas, NV, USA, 24–27 August 2008; pp. 61–69.
- Pohjalainen, J.; Rasanen, O.; Kadioglu, S. Feature selection methods and their combinations in high-dimensional classification of speaker likability, intelligibility and personality traits. Comput. Speech Lang. 2015, 29, 145–171.
- Blanchet, F.G.; Legendre, P.; Borcard, D. Forward selection of explanatory variables. Ecology 2008, 89, 2623–2632.
- Marcano-Cedeño, A.; Quintanilla-Domínguez, J.; Cortina-Januchs, M.; Andina, D. Feature selection using sequential forward selection and classification applying artificial metaplasticity neural network. In Proceedings of the IECON 2010, 36th Annual Conference of the IEEE Industrial Electronics Society, Glendale, AZ, USA, 7–10 November 2010; pp. 2845–2850.
- Kozbur, D. Testing-Based Forward Model Selection. Am. Econ. Rev. 2017, 107, 266–269.
- Le, T.T.H.; Kim, Y.; Kim, H. Network intrusion detection based on novel feature selection model and various recurrent neural networks. Appl. Sci. 2019, 9, 1392.
- Pereira, L.; Nunes, N. Performance evaluation in non-intrusive load monitoring: Datasets, metrics, and tools-A review. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2018, 1–17.
- Gevrey, M.; Dimopoulos, I.; Leka, S. Review and comparison of methods to study the contribution of variables in artificial neural network models. Ecol. Modell. 2003, 160, 249–264.
- Peng, C.; Lin, G.; Zhai, S.; Ding, Y.; He, G. Non-Intrusive Load Monitoring via Deep Learning Based User Model and Appliance Group Model. Energies 2020, 13, 5629.
- Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016.
- Gao, B.; Pavel, L. On the properties of the softmax function with application in game theory and reinforcement learning. arXiv 2017, arXiv:1704.00805.
- Ioffe, S.; Szegedy, C. Batch normalization: Accelerating deep network training by reducing internal covariate shift. arXiv 2015, arXiv:1502.03167.
- Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958.
- Pereira, L.; Nunes, N. A comparison of performance metrics for event classification in Non-Intrusive Load Monitoring. In Proceedings of the 2017 IEEE International Conference on Smart Grid Communications (SmartGridComm), Dresden, Germany, 23–27 October 2017; pp. 159–164.
- Xue, J.; Titterington, M. Do unbalanced data have a negative effect on Linear Discriminant Analysis? Pattern Recognit. 2008, 41, 1558–1571.
- Duda, R.O.; Hart, P.E.; Stork, D.G. Pattern Classification; John Wiley and Sons Inc.: Hoboken, NJ, USA, 2001.
- Theodoridis, S.; Koutroumbas, K. Pattern Recognition, 2nd ed.; Academic Press: Cambridge, MA, USA, 2003.
- Lee, Y.; Lin, Y.; Wahba, G. Multicategory support vector machines: Theory and application to the classification of microarray data and satellite radiance data. J. Am. Stat. Assoc. 2004, 99, 67–81.
- Basu, K.; Debusschere, V.; Bacha, S.; Maulik, U.; Bondyopadhyay, S. Non intrusive load monitoring: A temporal multi-label classification approach. IEEE Trans. Ind. Inform. 2015, 11, 262–270.
- Wu, X.; Gao, Y.; Jiao, D. Multi-Label Classification Based on Random Forest Algorithm for Non-Intrusive Load Monitoring System. Processes 2019, 7, 337.
- Murata, H.; Onoda, T. Applying Kernel Based Subspace Classification to a Non-intrusive Monitoring for Household Electric Appliances. In Proceedings of the International Conference on Artificial Neural Networks (ICANN), Vienna, Austria, 21–25 August 2001.
- Makonin, S.; Popowich, F. Non-intrusive load monitoring performance evaluation. Energy Effic. 2015, 8, 809–814.
- Saitoh, T.; Osaki, T.; Konishi, R.; Sugahara, K. Current sensor based home appliance and state of appliance recognition. SICE J. Control Meas. Syst. Integr. 2010, 3, 86–93.
- Mignot, R.; Peeters, G. An Analysis of the Effect of Data Augmentation Methods: Experiments for a Musical Genre Classification Task. Trans. Int. Soc. Music Inf. Retr. 2019.
- De Baets, L.; Ruyssinck, J.; Develder, C.; Dhaene, T.; Deschrijver, D. Appliance classification using VI trajectories and convolutional neural networks. Energy Build. 2018, 158, 32–36.
Electrical Features Name | Number of Computed Features
---|---
Effective current I and its harmonics $I_h$ (A) | 17
Active power P and its harmonics $P_h$ (W) | 17
Reactive power Q and its harmonics $Q_h$ (VAR) | 17
Apparent power S, its harmonics $S_h$, and $S_N$ (VA) | 18
Current total harmonic distortion $THD_I$ | 1
Distortion powers D, $D_V$, $D_I$ (VAD) | 3
Power factor $PF$ and harmonic power factors $PF_h$ | 16
Current crest factor $CF_I$ | 1
Total | 90
Method | Features Meeting the Additive Criterion | |
---|---|---|
# Features | Selected Features | |
KNN based sequential forward FS method | 12 | |
LDA based sequential forward FS method | 18 | |
MI | 20 | |
PCA | 17 | |
LDA | 14 | |
DNN | 27 | |
Method | Features Meeting the Additive Criterion | |
---|---|---|
# Features | Selected Features | |
KNN based sequential forward FS method | 25 | |
LDA based sequential forward FS method | 33 | All features except |
MI | 20 | |
PCA | 31 | All features except |
LDA | 12 | |
DNN | 17 | |
Feature Selection Method | Average (%) |
---|---|
MI | 92.96 |
KNN based Seq. forw. FS method | 92.58 |
LDA | 92.42 |
All features | 90.92 |
LDA-based Seq. forw. FS method | 90.14 |
PCA | 87.64 |
P,Q features | 86.62 |
DNN | 84.87 |
Methods (Selected Features/Classifier) | Results | |||||||
---|---|---|---|---|---|---|---|---|
Feat. | # feat. | Classifier | D.A | Acc. | Rec. | Pre. | ||
MI | 20 | R.F | Yes | 99.18 | 99.17 | 99.18 | 99.30 | 4.95 |
LDA | 14 | R.F | Yes | 98.56 | 98.52 | 98.56 | 98.71 | 7.04 |
P,Q features | 2 | R.F | Yes | 98.15 | 98.15 | 98.15 | 98.24 | 49.07 |
All features | 34 | R.F | Yes | 98.15 | 98.14 | 98.15 | 98.28 | 2.88 |
KNN-based Seq. forw. FS (K = 7) | 12 | R.F | Yes | 97.74 | 97.71 | 97.74 | 97.99 | 8.14 |
LDA | 14 | R.F | No | 98.15 | 97.67 | 97.43 | 98.15 | 7.01 |
KNN-based Seq. forw. FS (K = 7) | 12 | R.F | No | 98.15 | 97.60 | 97.33 | 98.15 | 8.17 |
MI | 20 | R.F | No | 98.15 | 97.54 | 97.23 | 98.15 | 4.90 |
DNN | 27 | R.F | Yes | 97.34 | 97.32 | 97.34 | 97.60 | 3.60 |
P,Q features | 2 | R.F | No | 97.74 | 97.06 | 96.72 | 97.74 | 48.87 |
MI | 20 | LDA | Yes | 96.92 | 96.83 | 96.92 | 97.55 | 4.84 |
LDA-based Seq. forw. FS | 18 | LDA | No | 97.54 | 96.72 | 96.31 | 97.54 | 5.41 |
DNN | 27 | LDA | Yes | 96.72 | 96.62 | 96.72 | 97.49 | 3.58 |
LDA-based Seq. forw. FS | 18 | LDA | Yes | 96.72 | 96.57 | 96.72 | 97.43 | 5.37 |
LDA-based Seq. forw. FS | 18 | R.F | Yes | 96.31 | 96.29 | 96.31 | 96.77 | 5.35 |
PCA | 17 | R.F | Yes | 96.31 | 96.25 | 96.31 | 96.70 | 5.66 |
All features | 34 | LDA | Yes | 96.31 | 96.18 | 96.31 | 96.84 | 2.83 |
DNN | 27 | LDA | No | 96.93 | 96.14 | 95.77 | 96.91 | 3.59 |
LDA | 14 | LDA | Yes | 95.69 | 95.59 | 95.69 | 96.32 | 6.83 |
KNN-based Seq. forw. FS (K = 7) | 12 | KNN | No | 96.72 | 95.76 | 95.29 | 96.72 | 8.06 |
All features | 34 | KNN | Yes | 95.69 | 95.53 | 95.69 | 96.82 | 2.81 |
MI | 20 | KNN | Yes | 95.49 | 95.27 | 95.49 | 96.73 | 4.77 |
DNN | 27 | R.F | No | 96.31 | 95.22 | 94.67 | 96.31 | 3.57 |
MI | 20 | LDA | No | 96.31 | 95.21 | 94.67 | 96.31 | 4.81 |
LDA-based Seq. forw. FS | 18 | R.F | No | 96.31 | 95.21 | 94.67 | 96.31 | 5.35 |
MI | 20 | DNN | No | 96.31 | 95.08 | 94.47 | 96.31 | 4.81 |
KNN-based Seq. forw. FS (K = 7) | 12 | KNN | Yes | 95.28 | 95.02 | 95.32 | 96.68 | 7.94 |
LDA | 14 | KNN | Yes | 95.28 | 95.01 | 95.28 | 96.65 | 6.80 |
LDA-based Seq. forw. FS | 18 | KNN | Yes | 95.27 | 95.01 | 95.28 | 96.49 | 5.29 |
P,Q features | 2 | KNN | Yes | 95.28 | 94.97 | 95.28 | 96.66 | 47.64 |
All features | 34 | LDA | No | 96.10 | 94.94 | 94.36 | 96.10 | 2.82 |
All features | 34 | R.F | No | 96.10 | 94.91 | 94.33 | 96.10 | 2.82 |
PCA | 17 | R.F | No | 95.90 | 94.87 | 94.39 | 95.90 | 5.64 |
KNN-based Seq. forw. FS (K = 7) | 12 | LDA | Yes | 95.28 | 94.86 | 95.28 | 96.34 | 7.94 |
KNN-based Seq. forw. FS (K = 7) | 12 | LDA | No | 95.90 | 94.60 | 93.95 | 95.90 | 7.99 |
PCA | 17 | LDA | Yes | 94.46 | 94.37 | 94.46 | 95.48 | 5.55 |
LDA | 14 | LDA | No | 95.69 | 94.33 | 93.65 | 95.69 | 6.83 |
P,Q features | 2 | KNN | No | 95.08 | 93.44 | 92.62 | 95.08 | 47.54 |
PCA | 17 | LDA | No | 94.67 | 93.06 | 92.28 | 94.67 | 5.57 |
LDA | 14 | KNN | No | 94.46 | 92.96 | 92.21 | 94.46 | 6.74 |
PCA | 17 | DNN | No | 93.64 | 92.15 | 91.48 | 93.65 | 5.51 |
P,Q features | 2 | LDA | Yes | 93.23 | 91.85 | 93.23 | 93.69 | 46.61 |
All features | 34 | DNN | No | 93.24 | 91.34 | 90.44 | 93.24 | 2.74 |
P,Q features | 2 | LDA | No | 93.23 | 90.98 | 89.85 | 93.23 | 46.61 |
KNN-based Seq. forw. FS (K = 7) | 12 | DNN | No | 92.62 | 90.40 | 89.34 | 92.62 | 7.72 |
MI | 20 | KNN | No | 91.60 | 89.21 | 88.01 | 91.59 | 4.58 |
LDA-based Seq. forw. FS | 18 | DNN | No | 90.98 | 88.48 | 87.39 | 90.98 | 5.05 |
LDA | 14 | DNN | No | 92.01 | 89.89 | 88.99 | 92.01 | 6.57 |
DNN | 27 | KNN | Yes | 87.09 | 86.18 | 87.09 | 89.46 | 3.23 |
LDA-based Seq. forw. FS | 18 | KNN | No | 88.11 | 84.35 | 82.51 | 88.11 | 4.89 |
DNN | 27 | DNN | No | 86.27 | 82.70 | 88.99 | 92.01 | 3.20 |
DNN | 27 | KNN | No | 85.86 | 81.39 | 79.17 | 85.86 | 3.18 |
All features | 34 | KNN | No | 84.43 | 80.31 | 78.36 | 84.43 | 2.48 |
PCA | 17 | KNN | Yes | 81.14 | 79.87 | 81.14 | 82.58 | 4.77 |
PCA | 17 | KNN | No | 82.37 | 78.18 | 76.25 | 82.37 | 4.84 |
All features | 34 | DNN | Yes | 77.05 | 76.07 | 77.05 | 79.51 | 2.27 |
LDA | 14 | DNN | Yes | 76.84 | 75.40 | 76.84 | 77.41 | 5.49 |
MI | 20 | DNN | Yes | 76.84 | 75.33 | 76.84 | 77.18 | 3.84 |
KNN-based Seq. forw. FS (K = 7) | 12 | DNN | Yes | 76.02 | 74.70 | 76.02 | 77.51 | 6.33 |
PCA | 17 | DNN | Yes | 74.59 | 72.35 | 74.59 | 75.47 | 4.39 |
LDA-based Seq. forw. FS | 18 | DNN | Yes | 70.70 | 68.45 | 70.70 | 70.50 | 3.93 |
P,Q features | 2 | DNN | No | 71.52 | 66.43 | 64.28 | 71.52 | 35.76 |
P,Q features | 2 | DNN | Yes | 63.32 | 60.05 | 63.32 | 59.19 | 31.66 |
DNN | 27 | DNN | Yes | 45.08 | 43.36 | 45.08 | 45.34 | 1.67 |
Feature Selection Method | Average (%) |
---|---|
MI | 85.80 |
LDA based Seq. forw. FS method | 85.07 |
KNN based Seq. forw. FS method | 85.05 |
All features | 84.56 |
PCA | 83.53 |
DNN | 82.52 |
LDA | 81.61 |
P,Q features | 58.31 |
Methods (Selected Features/Classifier) | Results | |||||||
---|---|---|---|---|---|---|---|---|
Feat. | # Feat. | Classifier | D.A | Acc. | Rec. | Pre. | ||
KNN-based Seq. forw. FS (K = 7) | 25 | KNN | No | 99.19 | 98.63 | 98.81 | 98.56 | 3.97 |
KNN-based Seq. forw. FS (K = 7) | 25 | KNN | Yes | 99.09 | 98.54 | 98.25 | 98.90 | 3.96 |
LDA-based Seq. forw. FS | 33 | KNN | No | 99.07 | 98.50 | 98.71 | 98.39 | 3.00 |
LDA-based Seq. forw. FS | 33 | KNN | Yes | 99.02 | 98.50 | 98.21 | 98.83 | 3.00 |
All features | 34 | KNN | No | 99.07 | 98.49 | 98.69 | 98.40 | 2.91 |
PCA | 30 | KNN | No | 99.00 | 98.46 | 98.60 | 98.41 | 3.30 |
DNN | 17 | KNN | No | 98.99 | 98.45 | 98.62 | 98.41 | 5.82 |
MI | 20 | KNN | No | 99.13 | 98.43 | 98.43 | 98.53 | 4.96 |
DNN | 17 | KNN | Yes | 98.94 | 98.41 | 98.14 | 98.74 | 5.82 |
All features | 34 | KNN | Yes | 99.07 | 98.38 | 98.09 | 98.81 | 2.91 |
MI | 20 | KNN | Yes | 99.03 | 98.33 | 98.23 | 98.48 | 4.95 |
PCA | 30 | KNN | Yes | 98.95 | 98.28 | 98.00 | 98.69 | 3.30 |
MI | 20 | R.F | No | 98.91 | 98.27 | 98.50 | 98.18 | 4.95 |
LDA | 12 | KNN | No | 98.92 | 98.09 | 98.06 | 98.18 | 8.24 |
LDA-based Seq. forw. FS | 33 | R.F | No | 98.75 | 97.97 | 98.20 | 97.91 | 2.99 |
All features | 34 | R.F | No | 98.79 | 97.88 | 98.09 | 97.89 | 2.91 |
KNN-based Seq. forw. FS (K = 7) | 25 | R.F | No | 98.80 | 97.85 | 98.00 | 97.85 | 3.95 |
LDA | 12 | KNN | Yes | 98.76 | 97.84 | 97.64 | 98.06 | 8.23 |
DNN | 17 | R.F | No | 98.58 | 97.74 | 98.14 | 97.54 | 5.80 |
PCA | 30 | R.F | No | 98.67 | 97.66 | 97.84 | 97.60 | 3.29 |
LDA | 12 | R.F | No | 98.60 | 97.62 | 97.99 | 97.51 | 8.22 |
LDA-based Seq. forw. FS | 33 | R.F | Yes | 98.23 | 97.22 | 96.88 | 97.70 | 2.98 |
KNN-based Seq. forw. FS (K = 7) | 25 | R.F | Yes | 98.19 | 97.14 | 96.69 | 97.78 | 3.93 |
All features | 34 | R.F | Yes | 98.25 | 97.04 | 96.67 | 97.53 | 2.89 |
PCA | 30 | R.F | Yes | 98.16 | 96.90 | 95.53 | 97.43 | 3.27 |
MI | 20 | R.F | Yes | 98.19 | 96.89 | 96.48 | 97.34 | 4.91 |
LDA | 12 | R.F | Yes | 97.88 | 96.86 | 96.43 | 97.40 | 8.16 |
MI | 20 | DNN | Yes | 98.32 | 96.78 | 96.38 | 97.63 | 4.92 |
DNN | 17 | R.F | Yes | 97.96 | 96.61 | 96.07 | 97.62 | 5.76 |
All features | 34 | DNN | No | 98.04 | 96.35 | 96.64 | 96.67 | 2.88 |
KNN-based Seq. forw. FS (K = 7) | 25 | DNN | No | 97.56 | 96.30 | 96.59 | 96.58 | 3.90 |
MI | 20 | DNN | No | 96.26 | 96.05 | 96.99 | 96.28 | 4.81 |
LDA-based Seq. forw. FS | 33 | DNN | No | 96.29 | 94.40 | 95.42 | 94.85 | 2.89 |
LDA | 12 | DNN | No | 95.52 | 94.10 | 95.10 | 94.09 | 7.96 |
LDA | 12 | DNN | Yes | 95.51 | 92.88 | 91.57 | 95.88 | 7.96 |
DNN | 17 | DNN | No | 94.55 | 92.41 | 94.03 | 92.52 | 5.56 |
PCA | 30 | DNN | No | 94.45 | 92.15 | 92.85 | 92.63 | 3.15 |
LDA-based Seq. forw. FS | 33 | DNN | Yes | 94.81 | 91.66 | 90.72 | 94.43 | 2.87 |
KNN-based Seq. forw. FS (K = 7) | 25 | DNN | Yes | 94.87 | 91.22 | 90.29 | 93.27 | 3.79 |
P,Q features | 2 | KNN | Yes | 94.43 | 90.91 | 91.75 | 91.68 | 47.22 |
P,Q features | 2 | KNN | No | 94.58 | 90.82 | 91.39 | 90.86 | 47.29 |
PCA | 30 | DNN | Yes | 91.19 | 90.82 | 88.99 | 94.33 | 3.04 |
DNN | 17 | DNN | Yes | 93.28 | 90.52 | 88.14 | 94.84 | 5.49 |
P,Q features | 2 | R.F | No | 93.79 | 90.31 | 90.50 | 90.60 | 46.90 |
All features | 34 | DNN | Yes | 90.33 | 86.33 | 85.00 | 89.60 | 2.66 |
P,Q features | 2 | R.F | Yes | 90.63 | 85.84 | 86.14 | 85.70 | 45.32 |
LDA-based Seq. forw. FS | 33 | LDA | No | 45.52 | 53.46 | 59.45 | 54.25 | 1.38 |
All features | 34 | LDA | No | 45.31 | 53.25 | 59.42 | 54.05 | 1.33 |
MI | 20 | LDA | No | 44.55 | 53.11 | 58.49 | 53.84 | 2.23 |
KNN-based Seq. forw. FS (K = 7) | 25 | LDA | No | 44.37 | 51.50 | 57.62 | 52.32 | 1.77 |
P,Q features | 2 | DNN | Yes | 65.56 | 50.48 | 51.27 | 55.29 | 32.79 |
P,Q features | 2 | DNN | No | 65.52 | 49.75 | 52.58 | 52.45 | 32.76 |
PCA | 30 | LDA | No | 42.54 | 48.58 | 54.28 | 49.23 | 1.42 |
LDA-based Seq. forw. FS | 33 | LDA | Yes | 42.89 | 48.87 | 49.68 | 57.03 | 1.30 |
All features | 34 | LDA | Yes | 42.91 | 48.78 | 49.75 | 56.29 | 1.26 |
KNN-based Seq. forw. FS (K = 7) | 25 | LDA | Yes | 42.72 | 48.67 | 49.61 | 55.40 | 1.71 |
MI | 20 | LDA | Yes | 42.45 | 48.50 | 49.49 | 53.77 | 2.12 |
DNN | 17 | LDA | No | 40.65 | 46.79 | 54.21 | 46.85 | 2.39 |
PCA | 30 | LDA | Yes | 40.59 | 45.42 | 45.64 | 52.09 | 1.35 |
LDA | 12 | LDA | No | 38.66 | 40.40 | 47.43 | 40.75 | 3.22 |
DNN | 17 | LDA | Yes | 37.09 | 39.20 | 39.66 | 45.50 | 2.18 |
LDA | 12 | LDA | Yes | 36.08 | 35.88 | 35.49 | 41.93 | 3.00 |
P,Q features | 2 | LDA | Yes | 12.12 | 4.29 | 7.11 | 4.19 | 6.06 |
P,Q features | 2 | LDA | No | 11.59 | 4.05 | 3.65 | 7.55 | 5.80 |
Methods (Selected Features/Classifier) | Results | ||||||
---|---|---|---|---|---|---|---|
Feat. | # Feat. | Classifier | Acc. | Rec. | Pre. | ||
MI | 20 | KNN | 99.12 | 98.42 | 98.40 | 98.53 | 4.96 |
LDA | 14 | KNN | 99.10 | 98.36 | 98.53 | 98.31 | 7.08 |
KNN-based Seq. forw. FS (K = 7) | 12 | R.F | 98.99 | 98.36 | 98.48 | 98.35 | 8.25 |
KNN-based Seq. forw. FS (K = 7) | 12 | KNN | 99.04 | 98.29 | 98.39 | 98.30 | 8.25 |
PCA | 17 | KNN | 98.95 | 98.28 | 98.54 | 98.17 | 5.82 |
DNN | 27 | KNN | 98.95 | 98.20 | 98.42 | 98.09 | 3.66 |
LDA-based Seq. forw. FS | 18 | KNN | 98.92 | 98.14 | 98.33 | 98.13 | 5.49
LDA | 14 | R.F | 98.96 | 97.95 | 98.09 | 97.94 | 7.07 |
MI | 20 | R.F | 98.89 | 97.86 | 97.90 | 97.91 | 4.94
PCA | 17 | R.F | 98.77 | 97.84 | 98.04 | 97.79 | 5.81 |
LDA-based Seq. forw. FS | 18 | R.F | 98.76 | 97.64 | 97.79 | 97.62 | 5.49 |
DNN | 27 | R.F | 98.68 | 97.61 | 97.84 | 97.53 | 3.65 |
KNN-based Seq. forw. FS (K = 7) | 12 | DNN | 98.08 | 96.83 | 97.40 | 96.87 | 8.17 |
LDA | 14 | DNN | 97.85 | 96.35 | 97.10 | 96.32 | 6.99 |
MI | 20 | DNN | 94.85 | 94.82 | 95.86 | 95.10 | 4.74 |
PCA | 17 | DNN | 96.87 | 94.68 | 95.55 | 94.89 | 5.70 |
DNN | 27 | DNN | 93.76 | 93.66 | 94.86 | 94.17 | 3.47 |
LDA-based Seq. forw. FS | 18 | DNN | 95.76 | 93.18 | 94.22 | 93.43 | 6.32 |
MI | 20 | LDA | 44.14 | 52.43 | 58.21 | 53.18 | 2.21 |
DNN | 27 | LDA | 42.36 | 48.29 | 54.22 | 48.48 | 1.57 |
PCA | 17 | LDA | 41.84 | 43.63 | 50.10 | 43.87 | 2.46 |
LDA | 14 | LDA | 40.75 | 43.54 | 48.95 | 44.30 | 2.46
KNN-based Seq. forw. FS (K = 7) | 12 | LDA | 37.73 | 38.23 | 42.81 | 40.30 | 3.14 |
LDA-based Seq. forw. FS | 18 | LDA | 38.65 | 37.70 | 43.20 | 38.74 | 2.15 |
Methods (Selected features/Classifier) | Results | ||||||
---|---|---|---|---|---|---|---|
Feat. | # Feat. | Classifier | Acc. | Rec. | Pre. | ||
LDA-based Seq. forw. FS | 33 | R.F | 98.77 | 98.36 | 98.15 | 98.77 | 2.99 |
MI | 20 | R.F | 98.76 | 98.36 | 98.15 | 98.77 | 4.93 |
KNN-based Seq. forw. FS (K = 7) | 25 | R.F | 97.74 | 96.99 | 96.61 | 97.74 | 3.90 |
PCA | 30 | LDA | 96.51 | 95.46 | 94.95 | 96.51 | 3.21 |
LDA | 12 | R.F | 96.51 | 95.45 | 94.94 | 96.51 | 8.04 |
LDA-based Seq. forw. FS | 33 | LDA | 96.31 | 95.18 | 94.63 | 96.31 | 2.92 |
KNN-based Seq. forw. FS (K = 7) | 25 | LDA | 96.31 | 95.15 | 94.56 | 96.31 | 3.85 |
MI | 20 | LDA | 96.31 | 95.15 | 94.56 | 96.31 | 4.81 |
DNN | 17 | R.F | 95.70 | 94.33 | 93.65 | 95.70 | 5.63 |
PCA | 30 | R.F | 95.49 | 93.98 | 93.23 | 95.49 | 3.18 |
MI | 20 | DNN | 93.85 | 92.14 | 91.32 | 93.85 | 4.69 |
LDA | 12 | LDA | 93.64 | 92.07 | 91.29 | 93.64 | 7.80 |
DNN | 17 | LDA | 93.44 | 91.67 | 90.78 | 93.44 | 5.50 |
LDA | 12 | LDA | 92.62 | 90.50 | 89.48 | 92.62 | 7.72 |
KNN-based Seq. forw. FS (K = 7) | 25 | DNN | 93.24 | 91.02 | 89.92 | 93.24 | 3.73 |
LDA-based Seq. forw. FS | 33 | DNN | 92.62 | 90.67 | 89.75 | 92.62 | 2.80 |
MI | 20 | KNN | 91.80 | 89.27 | 88.01 | 91.80 | 4.59 |
DNN | 17 | DNN | 90.98 | 88.38 | 87.16 | 90.98 | 5.35 |
PCA | 30 | DNN | 90.37 | 87.56 | 86.27 | 90.37 | 3.01 |
PCA | 30 | KNN | 85.04 | 80.66 | 78.58 | 85.04 | 2.83 |
LDA-based Seq. forw. FS | 33 | KNN | 84.42 | 79.84 | 77.66 | 84.42 | 2.55 |
LDA | 12 | KNN | 83.81 | 79.57 | 77.52 | 83.81 | 6.98 |
KNN-based Seq. forw. FS (K = 7) | 25 | KNN | 84.01 | 79.37 | 77.11 | 84.01 | 3.36 |
DNN | 17 | KNN | 77.87 | 72.96 | 70.65 | 77.87 | 4.58 |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Houidi, S.; Fourer, D.; Auger, F.; Sethom, H.B.A.; Miègeville, L. Comparative Evaluation of Non-Intrusive Load Monitoring Methods Using Relevant Features and Transfer Learning. Energies 2021, 14, 2726. https://doi.org/10.3390/en14092726