A Neural Network-Based Weighted Voting Algorithm for Multi-Target Classification in WSN
Abstract
1. Introduction
2. Related Work
2.1. Performance of Different Classification Algorithms
2.2. NN-Based Classifier
3. Multiple Classifiers Weighted Voting Strategy
3.1. Strategy of Voting
3.2. Calculation of Classifier Weights
4. Engineering Implementation of the Algorithm
4.1. Training of Neural Networks
4.2. Transplanting of NN-Based Classifier
4.3. Training Data Preprocessing
4.4. The Flow of the NN-Based Weighted Voting Strategy
Algorithm 1: NN-Based Weighted Voting Algorithm

Input: extracted feature data, comprising the sample feature data F with its corresponding labels L and the feature data to be classified CF; the category labels are denoted Li (i = 0, …, N), where N is the number of categories.

Output: the corresponding target category label (denoted L).

(1) Preprocess the training dataset with the segmented-averaging method introduced in Section 4.3, obtaining the feature data and labels of the processed samples, (FT1, FT2, FT3) and (CF1, CF2, CF3).
(2) Train three NN-based classifiers on the acquired sample data: (FT1, CF1) trains classifier 1, (FT2, CF2) trains classifier 2, and (FT3, CF3) trains classifier 3.
(3) From the parameters of each trained NN-based classifier, extract the parameter matrices (iw, ib, hw, hb, ow, ob) used for subsequent classification.
(4) Compute the weights of the three classifiers, WC1, WC2, and WC3, with the Lance-Williams method described in Section 3.2, using the sample feature data (FT1, FT2, FT3) and the target feature data CF to be classified.
(5) Apply the parameter matrices of the three classifiers to the feature data to be classified according to the computational formula in Section 4.2, obtaining the classification results r(c1), r(c2), and r(c3).
(6) Derive the weighted classification result r from the classifier weights computed in step (4): the label Li that accumulates the highest weight gives the category of the input feature data.
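The following is a minimal Python sketch of the classification and voting stage, steps (3)–(6) above. It assumes a small feed-forward pass through the extracted parameter matrices (iw, ib, hw, hb, ow, ob) with a sigmoid activation, and a classifier weight derived from the Lance-Williams distance between the target features and each classifier's training features. The network depth, the activation function, and the exact distance-to-weight mapping are assumptions made for illustration and may differ from the implementation described in Sections 3.2 and 4.2; all function names here are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def classify(params, x):
    """Forward pass through one extracted classifier (step 5).

    params = (iw, ib, hw, hb, ow, ob): input-, hidden-, and output-layer
    weights/biases extracted from a trained NN-based classifier (step 3).
    Returns the index of the predicted category label.
    """
    iw, ib, hw, hb, ow, ob = params
    h1 = sigmoid(iw @ x + ib)       # input layer -> hidden layer
    h2 = sigmoid(hw @ h1 + hb)      # hidden layer -> hidden layer
    out = sigmoid(ow @ h2 + ob)     # output scores, one per category
    return int(np.argmax(out))

def lance_williams_distance(a, b):
    """Lance-Williams distance between two feature vectors."""
    return np.sum(np.abs(a - b)) / np.sum(np.abs(a) + np.abs(b))

def classifier_weight(train_features, cf):
    """Illustrative weight for step (4): closer training data -> larger weight.
    The paper's exact mapping from distance to weight may differ."""
    d = np.mean([lance_williams_distance(f, cf) for f in train_features])
    return 1.0 / (1.0 + d)

def weighted_vote(classifiers, train_sets, cf, n_categories):
    """Steps (4)-(6): weight each classifier, classify cf, accumulate weights per label."""
    scores = np.zeros(n_categories)
    for params, train_features in zip(classifiers, train_sets):
        w = classifier_weight(train_features, cf)   # WC1, WC2, WC3
        label = classify(params, cf)                # r(c1), r(c2), r(c3)
        scores[label] += w
    return int(np.argmax(scores))                   # label with the highest accumulated weight
```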
5. Performance of Multi-DBN Weighted Voting Algorithm
5.1. Performance Analysis of Multi-DBN Weighted Voting Algorithm
5.2. Effect of Network Size on Algorithm Classification Accuracy
- (1) The proposed voting classification strategy yields higher accuracy than the best of the three single classifiers. In addition, there is no discernible correlation between network size and the classification accuracy of the DBN classifier: larger networks do not necessarily yield higher accuracy.
- (2) The classification accuracy for DW vehicles decreases as the network size increases. In addition, the DBN classifier's classification accuracy for AAV vehicles is negatively correlated with the network size; when the classifier's accuracy for AAV vehicles improves, the corresponding accuracy for DW vehicles decreases.
5.3. Comparison with Other Multi-Objective Classification Methods
- (1) The NB method yields a lower average classification accuracy, approximately 63.85%, when classifying the multiple vehicle signals detected by the WSN system. The main reason is that, when the training dataset is small, the a priori probability model obtained by the Naive Bayes (NB) method from the training data is not accurate.
- (2) The AdaBoost method also yields lower accuracy than the other methods when classifying multiple target signals.
- (3) The neural-network-based methods (such as the multi-DBN voting method and the FFNN method) exhibit higher average classification accuracy when classifying the multiple vehicle signals detected by the WSN monitoring system. Moreover, their advantage in classification accuracy becomes more apparent after network optimization.
- (4) The NN-based weighted voting method achieves the highest classification accuracy, although its classification time is the longest among the compared methods.
5.4. Comparison of Classification Performance Using Different Neural Networks
- (1) The neural-network-based weighted voting algorithm yields different classification results when different neural networks are employed.
- (2) Classification accuracy improves significantly when DNN and DBN classifiers are used instead of the FFNN, reaching about 85%.
6. Conclusions
- (1) A more thorough investigation of the classification accuracy of the algorithm with different neural networks is needed to determine the optimal network structure.
- (2) The simulations show that the choice of training set has a significant impact on the classification performance of the neural networks, and further studies are needed to reveal the underlying patterns.
- (3) The size of the neural network, in particular the number of hidden layers, affects the classifier's accuracy; however, adding hidden layers increases the demand for computing power, and finding the right balance requires further research.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| WSNs | Wireless sensor networks |
| NN | Neural network |
| DBN | Deep belief network |
| FFNN | Feedforward neural network |
| DNN | Deep neural network |
| WCER | Wavelet coefficient energy ratio |
| AAV | Assault amphibian vehicle |
| DW | Dragon wagon |
| KNN | K-nearest neighbor |
| ELM | Extreme learning machine |
| NB | Naive Bayes |
| Category | Classifier 1 Voting | Classifier 2 Voting | Classifier 3 Voting | Voting Results |
|---|---|---|---|---|
| A | 1 | 1 | 0 | 2 |
| B | 0 | 0 | 0 | 0 |
| C | 0 | 0 | 0 | 0 |
| D | 0 | 0 | 1 | 1 |
| Category | Classifier 1 Voting | Classifier 2 Voting | Classifier 3 Voting | Voting Results |
|---|---|---|---|---|
| A | 0 | 1 | 0 | 1 |
| B | 1 | 0 | 0 | 1 |
| C | 0 | 0 | 0 | 0 |
| D | 0 | 0 | 1 | 1 |
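When every classifier votes for a different category, as in the second table above, the raw vote count ties at one vote per category and the classifier weights decide the outcome. The short sketch below tallies the weighted votes for that tie case; the weight values (0.8252, 0.7839, 0.7613) are borrowed from the classifier-weight table that follows purely for illustration.

```python
# Tie case from the second voting table: classifier 1 votes for B,
# classifier 2 votes for A, and classifier 3 votes for D (one raw vote each).
votes = {"classifier 1": "B", "classifier 2": "A", "classifier 3": "D"}

# Classifier weights, borrowed from the table below for illustration.
weights = {"classifier 1": 0.8252, "classifier 2": 0.7839, "classifier 3": 0.7613}

# Accumulate each classifier's weight onto the category it voted for.
scores = {}
for clf, category in votes.items():
    scores[category] = scores.get(category, 0.0) + weights[clf]

print(scores)                       # {'B': 0.8252, 'A': 0.7839, 'D': 0.7613}
print(max(scores, key=scores.get))  # 'B': the tie is resolved in favor of the
                                    # category backed by the highest-weighted classifier
```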
| Classifier | Classifier 1 | Classifier 2 | Classifier 3 | Voting Classification Results |
|---|---|---|---|---|
| Voting weighted value | 0.8252 | 0.7839 | 0.7613 | |
| Classification accuracy of AAV | 68.12% | 66.06% | 69.65% | 73.37% |
| Classification accuracy of DW | 80.21% | 83.11% | 72.35% | 85.42% |
| Classification accuracy of personnel | 87.24% | 84.63% | 81.18% | 90.76% |
| Classification accuracy of small vehicle | 68.12% | 66.06% | 69.65% | 88.96% |
| Algorithm | KNN | NB | ELM | NN-Based Weighted Voting |
|---|---|---|---|---|
| Accuracy | 61.71% | 63.85% | 71.88% | 84.63% |
| Time consumption (s) | 0.0043 | 0.0077 | 0.1783 | 0.4892 |
| Neural Network Type | DBN | FFNN | DNN |
|---|---|---|---|
| Average classification accuracy | 84.63% | 75.76% | 85.49% |