Enhancing Intrusion Detection in Wireless Sensor Networks Using a GSWO-CatBoost Approach
Abstract
1. Introduction
- We propose a new approach, Genetic Sacrificial Whale Optimization (GSWO), that combines a genetic algorithm (GA) with a whale optimization algorithm (WOA) modified by a new three-population division strategy and a proposed conditional inherited choice, and apply it to feature selection (FS). The proposed algorithm eliminates the premature convergence of the standard WOA and strikes a balance between exploration and exploitation.
- Moreover, we employ the CatBoost model as a classifier to distinguish benign traffic from diverse attack patterns within the dataset.
- In addition, we introduce a new method for fine-tuning CatBoost’s hyperparameters that combines quantization with the same GSWO optimization used for FS.
- Finally, we rigorously evaluate the proposed methodology on a comprehensive range of datasets, encompassing established benchmarks such as CICIDS2017 and NSL-KDD as well as specialized WSN datasets, namely the WSN-DS dataset and the WSN-BFSF dataset published in 2023. These evaluations underscore the accuracy and real-time applicability of the proposed method across various data sources.
2. Related Work
2.1. WSN Intrusion Detection
2.2. Metaheuristic Optimization Inspired Feature Selection for Intrusion Detection
2.3. Fine-Tuning Hyperparameters for Machine Learning Model
3. Proposed System
4. Proposed Work
4.1. Preliminaries
4.1.1. Whale Optimization Algorithm
- Encircling prey;
- Searching for prey;
- Bubble-net attacking.
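To make the three mechanisms listed above concrete, the following is a minimal Python sketch of the standard WOA position update as described by Mirjalili and Lewis [18]. The population size, bounds handling, and fitness function are omitted or assumed; this is an illustrative sketch, not the paper's implementation.

```python
import numpy as np

def woa_step(positions, best, a, b=1.0, rng=np.random.default_rng()):
    """One WOA iteration; `a` decreases linearly from 2 to 0 over the run."""
    new_pos = positions.copy()
    for i, x in enumerate(positions):
        A = 2 * a * rng.random() - a          # coefficient controlling step size
        C = 2 * rng.random()
        p = rng.random()
        l = rng.uniform(-1, 1)
        if p < 0.5:
            if abs(A) < 1:                    # encircling prey (exploitation)
                new_pos[i] = best - A * np.abs(C * best - x)
            else:                             # searching for prey (exploration)
                rand = positions[rng.integers(len(positions))]
                new_pos[i] = rand - A * np.abs(C * rand - x)
        else:                                 # bubble-net spiral attack
            D = np.abs(best - x)
            new_pos[i] = D * np.exp(b * l) * np.cos(2 * np.pi * l) + best
    return new_pos

# toy usage: 10 whales in 5 dimensions moving relative to an assumed best solution
whales = np.random.default_rng(0).random((10, 5))
print(woa_step(whales, best=whales[0], a=1.0).shape)
```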
4.1.2. Genetic Algorithm
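For reference, below is a minimal sketch of the basic GA operators (tournament selection, one-point crossover, and bit-flip mutation) on binary chromosomes, as commonly used in wrapper feature selection. The tournament size and mutation rate are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def tournament_select(pop, fitness, k=3):
    """Pick the fittest of k randomly chosen chromosomes."""
    idx = rng.choice(len(pop), size=k, replace=False)
    return pop[idx[np.argmax(fitness[idx])]].copy()

def one_point_crossover(p1, p2):
    """Swap tails of two parents at a random cut point."""
    cut = rng.integers(1, len(p1))
    return np.concatenate([p1[:cut], p2[cut:]]), np.concatenate([p2[:cut], p1[cut:]])

def mutate(chrom, rate=0.02):
    """Flip each bit independently with a small probability."""
    flip = rng.random(len(chrom)) < rate
    chrom = chrom.copy()
    chrom[flip] = 1 - chrom[flip]
    return chrom

# toy usage: recombine and mutate 18-bit feature masks
pop = rng.integers(0, 2, (20, 18))
fit = pop.sum(axis=1).astype(float)
c1, c2 = one_point_crossover(tournament_select(pop, fit), tournament_select(pop, fit))
print(mutate(c1))
```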
4.2. Proposed Genetic Sacrificial Whale Optimization
Algorithm 1 Pseudo-code for the GSWO
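Algorithm 1 itself is not reproduced in this extract. The Python outline below sketches a GA/WOA hybrid in the spirit of GSWO under explicit assumptions: the three-population division is approximated by a fitness-ranked three-way split, and the conditional inherited choice is approximated by accepting a GA offspring only when it improves fitness. These details are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def gswo_outline(fitness_fn, pop, n_iter=100, seed=0):
    """Speculative GA/WOA hybrid outline; pop is a (whales x dims) array, modified in place."""
    rng = np.random.default_rng(seed)
    for t in range(n_iter):
        fit = np.array([fitness_fn(x) for x in pop])
        order = np.argsort(-fit)                    # indices sorted best-first
        best = pop[order[0]].copy()
        groups = np.array_split(order, 3)           # assumed three-population division
        a = 2 - 2 * t / n_iter                      # WOA control parameter, 2 -> 0
        for g, group in enumerate(groups):
            for i in group:
                if g < 2:                           # WOA-style move toward the best whale
                    A = 2 * a * rng.random() - a
                    C = 2 * rng.random()
                    pop[i] = best - A * np.abs(C * best - pop[i])
                else:                               # GA operators on the weakest group
                    mate = pop[rng.choice(groups[0])]
                    cut = rng.integers(1, pop.shape[1])
                    child = np.concatenate([pop[i][:cut], mate[cut:]])
                    child = child + rng.normal(0, 0.1, size=child.shape)
                    # assumed "conditional inherited choice": keep the child only if fitter
                    if fitness_fn(child) > fit[i]:
                        pop[i] = child
    return best

# toy usage on a placeholder maximization problem
pop = np.random.default_rng(1).random((30, 5))
print(gswo_outline(lambda x: -np.sum((x - 0.5) ** 2), pop, n_iter=50))
```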
4.3. CatBoost Classification Model
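Below is a minimal usage sketch of CatBoost as the multi-class classifier, with the hyperparameters tuned later in the paper (iterations, learning rate, depth, L2 leaf regularization, random strength, and bagging temperature). The values are placeholders, the synthetic data merely stands in for the preprocessed IDS features, and reading the abbreviated keys 'iter', 'd', 'l2', 'r', and 'b' in the later tables as these CatBoost parameters is our assumption.

```python
from catboost import CatBoostClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# synthetic stand-in for the preprocessed features and attack labels
X, y = make_classification(n_samples=2000, n_features=18, n_informative=10,
                           n_classes=5, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    stratify=y, random_state=0)

model = CatBoostClassifier(
    iterations=300,            # number of boosting rounds
    learning_rate=0.1,
    depth=6,                   # tree depth
    l2_leaf_reg=3.0,           # L2 regularization on leaf values
    random_strength=1.0,       # randomness added to split scoring
    bagging_temperature=1.0,   # intensity of Bayesian bootstrap
    loss_function="MultiClass",
    verbose=False,
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```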
4.4. Applying GSWO for Feature Selection
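A wrapper-style fitness function of the kind GSWO can optimize for FS might look like the sketch below: a candidate position is binarized into a feature mask and scored by a cheap surrogate classifier's cross-validated accuracy minus a small penalty on subset size. The 0.5 threshold, penalty weight, and decision-tree surrogate are assumptions for illustration, not the paper's exact fitness.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_classification

def make_fs_fitness(X, y, alpha=0.01):
    def fitness(position):
        mask = position > 0.5                      # binarize the continuous position
        if not mask.any():
            return 0.0                             # an empty feature subset is invalid
        acc = cross_val_score(DecisionTreeClassifier(random_state=0),
                              X[:, mask], y, cv=3).mean()
        return acc - alpha * mask.mean()           # reward accuracy, penalize large subsets
    return fitness

# toy usage with synthetic data standing in for the preprocessed IDS features
X, y = make_classification(n_samples=500, n_features=18, n_informative=8, random_state=0)
fitness_fn = make_fs_fitness(X, y)
print(fitness_fn(np.random.default_rng(0).random(18)))
```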
4.5. Applying GSWO for Hyperparameter Optimization
- is illustrated by 14 bits.
- is illustrated by 3 bits.
- is illustrated by 6 bits.
- is illustrated by 3 bits.
- is illustrated by 8 bits.
- is illustrated by 8 bits.
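The sketch below illustrates the quantization idea behind this encoding: each hyperparameter occupies a fixed-width bit field, and a concatenated chromosome is decoded by linear de-quantization over a search range. The field order, ranges, and widths shown here are placeholders; only the general idea of bit-level encoding of CatBoost hyperparameters for GSWO comes from the paper.

```python
import numpy as np

# (name, bits, low, high, integer?) -- illustrative fields, not the paper's settings
FIELDS = [
    ("iterations",          10, 100,  1000, True),
    ("learning_rate",       14, 0.01, 0.5,  False),
    ("depth",                3, 4,    11,   True),
    ("l2_leaf_reg",          3, 1,    8,    True),
    ("random_strength",      8, 0.0,  10.0, False),
    ("bagging_temperature",  8, 0.0,  10.0, False),
]

def decode(bits):
    """Decode a flat 0/1 array into a CatBoost hyperparameter dict."""
    params, pos = {}, 0
    for name, width, low, high, is_int in FIELDS:
        field = bits[pos:pos + width]
        level = int("".join(map(str, field)), 2)            # bit field -> integer level
        value = low + (high - low) * level / (2 ** width - 1)
        params[name] = int(round(value)) if is_int else value
        pos += width
    return params

# toy usage: decode a random chromosome
rng = np.random.default_rng(0)
chromosome = rng.integers(0, 2, sum(w for _, w, *_ in FIELDS))
print(decode(chromosome))
```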
5. Experiments and Evaluations
5.1. Dataset Description
5.1.1. NSL-KDD Dataset
5.1.2. CICIDS2017 Dataset
5.1.3. WSN-DS Dataset
5.1.4. WSN-BFSF Dataset
5.2. Evaluation Metrics
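For clarity, the snippet below shows how accuracy and macro-averaged precision, recall, and F1 can be computed from predictions with scikit-learn; whether the paper uses macro averaging is our assumption, and the labels are illustrative.

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

y_true = ["Normal", "DoS", "Normal", "Probe", "DoS", "Normal"]
y_pred = ["Normal", "DoS", "DoS",    "Probe", "DoS", "Normal"]

acc = accuracy_score(y_true, y_pred)
prec, rec, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="macro", zero_division=0)
print(f"Acc {acc:.2%}  Prec {prec:.2%}  Rec {rec:.2%}  F1 {f1:.2%}")
```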
5.3. Results and Analysis
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Chithaluru, P.K.; Khan, M.S.; Kumar, M.; Stephan, T. ETH-LEACH: An energy enhanced threshold routing protocol for WSNs. Int. J. Commun. Syst. 2021, 34, e4881. [Google Scholar] [CrossRef]
- Zhao, R.; Yin, J.; Xue, Z.; Gui, G.; Adebisi, B.; Ohtsuki, T.; Gacanin, H.; Sari, H. An efficient intrusion detection method based on dynamic autoencoder. IEEE Wirel. Commun. Lett. 2021, 10, 1707–1711. [Google Scholar] [CrossRef]
- Liu, X.; Xu, B.; Zheng, K.; Zheng, H. Throughput maximization of wireless-powered communication network with mobile access points. IEEE Trans. Wirel. Commun. 2022, 22, 4401–4415. [Google Scholar] [CrossRef]
- Medeiros, D.d.F.; Souza, C.P.d.; Carvalho, F.B.S.d.; Lopes, W.T.A. Energy-saving routing protocols for smart cities. Energies 2022, 15, 7382. [Google Scholar] [CrossRef]
- Vidyapeeth, K.V.; Kalbhor, L. Secure and scalable data aggregation techniques for healthcare monitoring in WSN. J. Discret. Math. Sci. Cryptogr. 2024, 27, 441–452. [Google Scholar]
- Le, T.-T.-H.; Park, T.; Cho, D.; Kim, H. An effective classification for DoS attacks in wireless sensor networks. In Proceedings of the 2018 Tenth International Conference on Ubiquitous and Future Networks (ICUFN), Prague, Czech Republic, 3–6 July 2018; pp. 689–692. [Google Scholar]
- Butun, I.; Morgera, S.D.; Sankar, R. A survey of intrusion detection systems in wireless sensor networks. IEEE Commun. Surv. Tutor. 2013, 16, 266–282. [Google Scholar] [CrossRef]
- Sun, B.; Osborne, L.; Xiao, Y.; Guizani, S. Intrusion detection techniques in mobile ad hoc and wireless sensor networks. IEEE Wirel. Commun. 2007, 14, 56–63. [Google Scholar] [CrossRef]
- Magán-Carrión, R.; Urda, D.; Díaz-Cano, I.; Dorronsoro, B. Towards a reliable comparison and evaluation of network intrusion detection systems based on machine learning approaches. Appl. Sci. 2020, 10, 1775. [Google Scholar] [CrossRef]
- Sultana, N.; Chilamkurti, N.; Peng, W.; Alhadad, R. Survey on SDN based network intrusion detection system using machine learning approaches. Peer-Netw. Appl. 2019, 12, 493–501. [Google Scholar] [CrossRef]
- Tian, Z.; Li, J.; Liu, L.; Wu, H.; Hu, X.; Xie, M.; Zhu, Y.; Chen, X.; Ou-Yang, W. Machine learning-assisted self-powered intelligent sensing systems based on triboelectricity. Nano Energy 2023, 113, 108559. [Google Scholar] [CrossRef]
- Abdalzaher, M.S.; Elwekeil, M.; Wang, T.; Zhang, S. A deep autoencoder trust model for mitigating jamming attack in IoT assisted by cognitive radio. IEEE Syst. J. 2021, 16, 3635–3645. [Google Scholar] [CrossRef]
- Abdi, A.H.; Audah, L.; Salh, A.; Alhartomi, M.A.; Rasheed, H.; Ahmed, S.; Tahir, A. Security Control and Data Planes of SDN: A Comprehensive Review of Traditional, AI and MTD Approaches to Security Solutions. IEEE Access 2024, 12, 69941–69980. [Google Scholar] [CrossRef]
- Fährmann, D.; Martín, L.; Sánchez, L.; Damer, N. Anomaly Detection in Smart Environments: A Comprehensive Survey. IEEE Access 2024, 12, 64006–64049. [Google Scholar] [CrossRef]
- Cerdà-Alabern, L.; Iuhasz, G.; Gemmi, G. Anomaly detection for fault detection in wireless community networks using machine learning. Comput. Commun. 2023, 202, 191–203. [Google Scholar] [CrossRef]
- Gite, P.; Chouhan, K.; Krishna, K.M.; Nayak, C.K.; Soni, M.; Shrivastava, A. ML based intrusion detection scheme for various types of attacks in a WSN using C4.5 and CART classifiers. Mater. Today Proc. 2023, 80, 3769–3776. [Google Scholar] [CrossRef]
- Inuwa, M.M.; Das, R. A comparative analysis of various machine learning methods for anomaly detection in cyber attacks on IoT networks. Internet Things 2024, 26, 101162. [Google Scholar] [CrossRef]
- Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
- Nadimi-Shahraki, M.H.; Zamani, H.; Mirjalili, S. Enhanced whale optimization algorithm for medical feature selection: A COVID-19 case study. Comput. Biol. Med. 2022, 148, 105858. [Google Scholar] [CrossRef]
- Arden, F.; Safitri, C. Hyperparameter Tuning Algorithm Comparison with Machine Learning Algorithms. In Proceedings of the 2022 6th International Conference on Information Technology, Information Systems and Electrical Engineering (ICITISEE), Yogyakarta, Indonesia, 13–14 December 2022; pp. 183–188. [Google Scholar]
- Almomani, I.; Al-Kasasbeh, B.; Al-Akhras, M. WSN-DS: A dataset for intrusion detection systems in wireless sensor networks. J. Sens. 2016, 2016, 4731953. [Google Scholar] [CrossRef]
- Dener, M.; Okur, C.; Al, S.; Orman, A. WSN-BFSF: A New Dataset for Attacks Detection in Wireless Sensor Networks. IEEE Internet Things J. 2023, 11, 2109–2125. [Google Scholar] [CrossRef]
- Vinayakumar, R.; Alazab, M.; Soman, K.; Poornachandran, P.; Al-Nemrat, A.; Venkatraman, S. Deep learning approach for intelligent intrusion detection system. IEEE Access 2019, 7, 41525–41550. [Google Scholar] [CrossRef]
- Wazirali, R.; Ahmad, R. Machine Learning Approaches to Detect DoS and Their Effect on WSNs Lifetime. Comput. Mater. Contin. 2022, 70, 4922–4946. [Google Scholar] [CrossRef]
- Tabbaa, H.; Ifzarne, S.; Hafidi, I. An online ensemble learning model for detecting attacks in wireless sensor networks. arXiv 2022, arXiv:2204.13814. [Google Scholar] [CrossRef]
- Salmi, S.; Oughdir, L. Performance evaluation of deep learning techniques for DoS attacks detection in wireless sensor network. J. Big Data 2023, 10, 1–25. [Google Scholar] [CrossRef]
- Jiang, S.; Zhao, J.; Xu, X. SLGBM: An intrusion detection mechanism for wireless sensor networks in smart environments. IEEE Access 2020, 8, 169548–169558. [Google Scholar] [CrossRef]
- Liu, J.; Yang, D.; Lian, M.; Li, M. Research on intrusion detection based on particle swarm optimization in IoT. IEEE Access 2021, 9, 38254–38268. [Google Scholar] [CrossRef]
- Vijayanand, R.; Devaraj, D. A novel feature selection method using whale optimization algorithm and genetic operators for intrusion detection system in wireless mesh network. IEEE Access 2020, 8, 56847–56854. [Google Scholar] [CrossRef]
- Hussain, K.; Xia, Y.; Onaizah, A.N.; Manzoor, T.; Jalil, K. Hybrid of WOA-ABC and proposed CNN for intrusion detection system in wireless sensor networks. Optik 2022, 271, 170145. [Google Scholar] [CrossRef]
- Mohiuddin, G.; Lin, Z.; Zheng, J.; Wu, J.; Li, W.; Fang, Y.; Wang, S.; Chen, J.; Zeng, X. Intrusion detection using hybridized meta-heuristic techniques with Weighted XGBoost Classifier. Expert Syst. Appl. 2023, 232, 120596. [Google Scholar] [CrossRef]
- Kasongo, S.M. An advanced intrusion detection system for IIoT based on GA and tree based algorithms. IEEE Access 2021, 9, 113199–113212. [Google Scholar] [CrossRef]
- Bergstra, J.; Bengio, Y. Random search for hyper-parameter optimization. J. Mach. Learn. Res. 2012, 13, 281–305. [Google Scholar]
- Quitadamo, A.; Johnson, J.; Shi, X. Bayesian hyperparameter optimization for machine learning based eQTL analysis. In Proceedings of the 8th ACM International Conference on Bioinformatics, Computational Biology, and Health Informatics, Boston, MA, USA, 20–23 August 2017; pp. 98–106. [Google Scholar]
- Olson, R.S.; Moore, J.H. TPOT: A tree-based pipeline optimization tool for automating machine learning. In Proceedings of the Workshop on Automatic Machine Learning, New York, NY, USA, 24 June 2016; pp. 66–74. [Google Scholar]
- Hertel, L.; Collado, J.; Sadowski, P.; Ott, J.; Baldi, P. Sherpa: Robust hyperparameter optimization for machine learning. SoftwareX 2020, 12, 100591. [Google Scholar] [CrossRef]
- Akiba, T.; Sano, S.; Yanase, T.; Ohta, T.; Koyama, M. Optuna: A next-generation hyperparameter optimization framework. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA, 4–8 August 2019; pp. 2623–2631. [Google Scholar]
- Gabriel, J.J.; Anbarasi, L.J. Optimizing Coronary Artery Disease Diagnosis: A Heuristic Approach using Robust Data Preprocessing and Automated Hyperparameter Tuning of eXtreme Gradient Boosting. IEEE Access 2023, 11, 112988–113007. [Google Scholar] [CrossRef]
- Dorogush, A.V.; Ershov, V.; Gulin, A. CatBoost: Gradient boosting with categorical features support. arXiv 2018, arXiv:1810.11363. [Google Scholar]
- Prokhorenkova, L.; Gusev, G.; Vorobev, A.; Dorogush, A.V.; Gulin, A. CatBoost: Unbiased boosting with categorical features. Adv. Neural Inf. Process. Syst. 2018, 31, 6639–6649. [Google Scholar]
- Shapiro, J. Genetic algorithms in machine learning. In Advanced Course on Artificial Intelligence; Springer: Berlin/Heidelberg, Germany, 1999; pp. 146–168. [Google Scholar]
- Oliveto, P.S.; Paixão, T.; Pérez Heredia, J.; Sudholt, D.; Trubenová, B. How to escape local optima in black box optimisation: When non-elitism outperforms elitism. Algorithmica 2018, 80, 1604–1633. [Google Scholar] [CrossRef] [PubMed]
- Zhang, C.; Ji, C.; Hua, L.; Ma, H.; Nazir, M.S.; Peng, T. Evolutionary quantile regression gated recurrent unit network based on variational mode decomposition, improved whale optimization algorithm for probabilistic short-term wind speed prediction. Renew. Energy 2022, 197, 668–682. [Google Scholar] [CrossRef]
- Kundu, R.; Chattopadhyay, S.; Cuevas, E.; Sarkar, R. AltWOA: Altruistic Whale Optimization Algorithm for feature selection on microarray datasets. Comput. Biol. Med. 2022, 144, 105349. [Google Scholar] [CrossRef]
- Friedman, J.H. Greedy function approximation: A gradient boosting machine. Ann. Stat. 2001, 29, 1189–1232. [Google Scholar] [CrossRef]
- Tavallaee, M.; Bagheri, E.; Lu, W.; Ghorbani, A.A. A detailed analysis of the KDD CUP 99 data set. In Proceedings of the 2009 IEEE Symposium on Computational Intelligence for Security and Defense Applications, Ottawa, ON, Canada, 8–10 July 2009; pp. 1–6. [Google Scholar]
- Gray, R.M.; Neuhoff, D.L. Quantization. IEEE Trans. Inf. Theory 1998, 44, 2325–2383. [Google Scholar] [CrossRef]
- Thakkar, A.; Lohiya, R. Attack classification using feature selection techniques: A comparative study. J. Ambient Intell. Humaniz. Comput. 2021, 12, 1249–1266. [Google Scholar] [CrossRef]
- Ao, H. Using machine learning models to detect different intrusion on NSL-KDD. In Proceedings of the 2021 IEEE International Conference on Computer Science, Artificial Intelligence and Electronic Engineering (CSAIEE), Virtual, 20–22 August 2021; pp. 166–177. [Google Scholar]
- Karimi, Z.; Kashani, M.M.R.; Harounabadi, A. Feature ranking in intrusion detection dataset using combination of filtering methods. Int. J. Comput. Appl. 2013, 78, 21–27. [Google Scholar] [CrossRef]
- Panigrahi, R.; Borah, S. A detailed analysis of CICIDS2017 dataset for designing Intrusion Detection Systems. Int. J. Eng. Technol. 2018, 7, 479–482. [Google Scholar]
- Pelletier, Z.; Abualkibash, M. Evaluating the CIC IDS-2017 dataset using machine learning methods and creating multiple predictive models in the statistical computing language R. Science 2020, 5, 187–191. [Google Scholar]
- Powers, D.M.W. Evaluation: From precision, recall and F-measure to ROC, informedness, markedness & correlation. J. Mach. Learn. Technol. 2011, 2, 37–63. [Google Scholar]
- Seliya, N.; Khoshgoftaar, T.M.; Van Hulse, J. A study on the relationships of classifier performance metrics. In Proceedings of the 2009 21st IEEE International Conference on Tools with Artificial Intelligence, Newark, NJ, USA, 2–4 November 2009; pp. 59–66. [Google Scholar]
- He, H.; Ma, Y. Imbalanced Learning: Foundations, Algorithms, and Applications; Wiley: Hoboken, NJ, USA, 2013. [Google Scholar]
- Tang, C.; Luktarhan, N.; Zhao, Y. An efficient intrusion detection method based on LightGBM and autoencoder. Symmetry 2020, 12, 1458. [Google Scholar] [CrossRef]
Year | Authors | Feature Selection | Model | Parameters Fine-Tuning | Type of Classification | Dataset | Accuracy (%) |
---|---|---|---|---|---|---|---|
2023 | Salmi and Oughdir [26] | None | DNN, CNN, RNN, CNN+RNN | None | Multi-class | WSN-DS | 97.04, 98.79, 96.48, 96.86 |
2020 | Jiang et al. [27] | SBS | LightGBM | None | Multi-class | WSN-DS | 99.53 |
2019 | Vinayakumar et al. [23] | None | DNN | None | Multi-class & Binary | KDD Cup’99, NSL-KDD, UNSW-NB15, CICIDS2017, WSN-DS | 95.00–99.00, 95.00–99.00, 65.00–75.00, 93.00–96.00, 96.00–99.00 |
2018 | Le et al. [6] | None | Random Forest | None | Multi-class | WSN-DS | 98.00 |
2021 | Liu et al. [28] | PSO-LightGBM | OCSVM | None | Multi-class | UNSW-NB15 | 86.68 |
2020 | Vijayanand et al. [29] | WOA-GA (sequentially) | SVM | None | Multi-class | CICIDS2017, ADFA-LD | 95.91, 94.44 |
2023 | Mohiuddin et al. [31] | WOA-SCA | XGBoost | None | Multi-class & Binary | UNSW-NB15, CICIDS2017 | 91.00–99.00, 96.00–98.00
2023 | Kasongo [32] | GA | RF | None | Multi-class | UNSW-NB15 | 87.61
2022 | Hussain et al. [30] | WOA-ABC | DNN | None | Multi-class | NSL-KDD | 98.00
2024 | Ours | GSWO | CatBoost | GSWO | Multi-class | NSL-KDD, CICIDS2017, WSN-DS, WSN-BFSF | 99.79, 99.74, 99.62, 99.99
Criteria | Description |
---|---|
Dataset’s name | NSL-KDD |
Quantity of records | 149,470 |
Quantity of network features | 41 |
Quantity of attack categories | 4 (DoS, Probe, R2L, U2R) |
Characteristics of network features | Basic features, host features, traffic features, and content features |
Datasets | Traffic Type | Training Set | Testing Set
---|---|---|---
NSL-KDD dataset | Benign | 54,093 | 22,960
NSL-KDD dataset | DoS | 37,424 | 16,138
NSL-KDD dataset | Probe | 9780 | 4299
NSL-KDD dataset | R2L | 2490 | 1079
NSL-KDD dataset | U2R | 173 | 79
CICIDS2017 dataset | Normal | 260,415 | 65,266
CICIDS2017 dataset | Bot | 1561 | 382
CICIDS2017 dataset | Brute Force | 6824 | 1727
CICIDS2017 dataset | DoS/DDoS | 256,401 | 63,868
CICIDS2017 dataset | Infiltration | 27 | 9
CICIDS2017 dataset | Portscan | 45,779 | 11,526
CICIDS2017 dataset | Web Attack | 1716 | 402
WSN-DS dataset | Blackhole | 8104 | 1945
WSN-DS dataset | Grayhole | 10,598 | 2624
WSN-DS dataset | Flooding | 2405 | 597
WSN-DS dataset | TDMA | 5306 | 1322
WSN-DS dataset | Normal | 259,621 | 65,020
WSN-BFSF dataset | Normal | 210,223 | 52,628
WSN-BFSF dataset | Flooding | 23,913 | 5931
WSN-BFSF dataset | Blackhole | 9441 | 2325
WSN-BFSF dataset | Forwarding | 6108 | 1537
New Labels | Old Labels | Distribution | Percentage |
---|---|---|---|
Normal | Benign | 325,681 | 45.49 |
Bot | Bot | 1943 | 0.27 |
Brute Force | FTP-Patator, SSH-Patator | 8551 | 1.19 |
DoS/DDoS | DDoS, DoS Hulk, DoS GoldenEye, DoS slowloris, DoS Slowhttptest, Heartbleed | 320,269 | 44.74
Infiltration | Infiltration | 36 | 0.005 |
Portscan | Portscan | 57,305 | 8.01 |
Web Attack | Web Attack-Sql Injection, Web Attack-XSS, Web Attack-Brute Force | 2118 | 0.30
Datasets | Feature Selection Methods | Feature Orders
---|---|---
WSN-DS | WOA | [0, 5, 6, 7, 8, 9, 13, 14, 15, 17]
WSN-DS | GA | [3, 5, 6, 8, 9, 11, 12, 14, 15, 17]
WSN-DS | SCA | [2, 3, 5, 8, 9, 11, 14, 17]
WSN-DS | BA | [0, 4, 5, 6, 8, 9, 13, 14, 15, 17]
WSN-DS | Our GSWO | [0, 5, 6, 8, 9, 13, 14, 15, 17]
WSN-BFSF | WOA | [0, 1, 3, 4, 7, 9, 10, 12]
WSN-BFSF | GA | [0, 1, 2, 3, 4, 5, 6, 7, 9, 10, 12]
WSN-BFSF | SCA | [3, 4, 7, 8, 10]
WSN-BFSF | BA | [0, 2, 4, 6, 8, 14]
WSN-BFSF | Our GSWO | [0, 3, 4, 10, 12]
NSL-KDD | WOA | [0, 2, 4, 5, 7, 13, 14, 15, 17, 18, 21, 22, 23, 25, 26, 27, 29, 32, 34, 36, 37, 38, 41]
NSL-KDD | GA | [0, 2, 4, 5, 9, 12, 22, 23, 26, 27, 29, 31, 32, 33, 35, 36, 37, 39, 40, 41]
NSL-KDD | SCA | [2, 3, 4, 5, 19, 20, 22, 28, 29, 31, 35, 36, 38, 41]
NSL-KDD | BA | [0, 2, 4, 5, 9, 12, 16, 22, 23, 25, 27, 29, 31, 33, 35, 36, 39, 40, 41]
NSL-KDD | Our GSWO | [0, 1, 2, 4, 15, 21, 22, 23, 26, 27, 31, 32, 33, 34, 35, 36, 38, 40, 41]
CICIDS2017 | WOA | [0, 1, 3, 5, 6, 14, 16, 17, 18, 19, 20, 26, 28, 33, 34, 37, 40, 43, 45, 54, 55, 56, 58, 59, 61, 62, 66]
CICIDS2017 | GA | [0, 8, 12, 13, 15, 17, 19, 20, 24, 26, 30, 32, 34, 38, 39, 44, 48, 55, 56, 58, 59, 60, 62]
CICIDS2017 | SCA | [0, 15, 16, 24, 38, 41, 42, 48, 54, 55, 56, 62]
CICIDS2017 | BA | [0, 3, 4, 7, 10, 11, 12, 13, 19, 22, 23, 24, 27, 32, 33, 41, 42, 46, 51, 52, 55, 56, 58, 62, 65]
CICIDS2017 | Our GSWO | [0, 1, 8, 10, 13, 24, 25, 31, 34, 41, 42, 44, 57, 65]
Datasets | Method | Acc (%) | Prec (%) | Rec (%) | F1 (%)
---|---|---|---|---|---
WSN-DS | All features | 98.01 | 92.81 | 85.80 | 89.01
WSN-DS | WOA | 98.23 | 93.53 | 88.87 | 90.90
WSN-DS | GA | 98.24 | 93.58 | 88.86 | 90.92
WSN-DS | SCA | 98.23 | 93.49 | 88.86 | 90.88
WSN-DS | BA | 98.23 | 93.50 | 88.87 | 90.88
WSN-DS | Our GSWO | 98.25 | 93.62 | 88.85 | 90.93
WSN-BFSF | All features | 93.81 | 46.97 | 50.00 | 48.44
WSN-BFSF | WOA | 96.76 | 94.78 | 76.74 | 82.91
WSN-BFSF | GA | 96.76 | 94.66 | 77.74 | 82.89
WSN-BFSF | SCA | 99.13 | 96.43 | 95.95 | 96.18
WSN-BFSF | BA | 99.58 | 98.53 | 98.39 | 98.45
WSN-BFSF | Our GSWO | 99.68 | 98.64 | 98.92 | 98.77
NSL-KDD | All features | 98.40 | 96.29 | 77.10 | 77.78
NSL-KDD | WOA | 98.49 | 91.44 | 80.89 | 82.92
NSL-KDD | GA | 98.87 | 95.47 | 85.94 | 88.44
NSL-KDD | SCA | 98.42 | 93.62 | 78.87 | 79.97
NSL-KDD | BA | 98.86 | 95.34 | 81.63 | 83.31
NSL-KDD | Our GSWO | 98.89 | 94.33 | 86.92 | 89.19
CICIDS2017 | All features | 95.23 | 40.45 | 41.60 | 41.00
CICIDS2017 | WOA | 99.00 | 69.50 | 67.74 | 68.52
CICIDS2017 | GA | 99.13 | 69.17 | 68.58 | 68.84
CICIDS2017 | SCA | 99.35 | 82.54 | 77.83 | 79.81
CICIDS2017 | BA | 99.01 | 68.85 | 68.52 | 68.66
CICIDS2017 | Our GSWO | 99.37 | 98.35 | 81.90 | 86.22
Datasets | Techniques | Hyperparameter Values
---|---|---
WSN-DS | Grid search | ['iter': 211, '': 0.1, 'd': 10, 'l2': 1, 'r': 1, 'b': 1]
WSN-DS | Random search | ['iter': 204, '': 0.1, 'd': 10, 'l2': 3, 'r': 2, 'b': 2]
WSN-DS | Optuna | ['iter': 427, '': 0.46122, 'd': 11, 'l2': 2.5, 'r': 1, 'b': 2.748]
WSN-DS | Our method | ['iter': 370, '': 0.3033, 'd': 6, 'l2': 2.0, 'r': 8.9063, 'b': 0.7031]
WSN-BFSF | Grid search | ['iter': 179, '': 0.1, 'd': 10, 'l2': 1, 'r': 4.5, 'b': 2]
WSN-BFSF | Random search | ['iter': 216, '': 0.07, 'd': 10, 'l2': 3, 'r': 1, 'b': 1]
WSN-BFSF | Optuna | ['iter': 912, '': 0.3768, 'd': 10, 'l2': 6, 'r': 1, 'b': 7.577]
WSN-BFSF | Our method | ['iter': 270, '': 0.28817, 'd': 5, 'l2': 2.0, 'r': 0.15625, 'b': 1.09375]
NSL-KDD | Grid search | ['iter': 235, '': 0.1, 'd': 10, 'l2': 1, 'r': 5, 'b': 5]
NSL-KDD | Random search | ['iter': 284, '': 0.1, 'd': 10, 'l2': 3, 'r': 2, 'b': 2]
NSL-KDD | Optuna | ['iter': 515, '': 0.28, 'd': 6, 'l2': 2, 'r': 8.95, 'b': 3.34]
NSL-KDD | Our method | ['iter': 515, '': 0.3, 'd': 6, 'l2': 2, 'r': 8.9, 'b': 3.3]
CICIDS2017 | Grid search | ['iter': 220, '': 0.1, 'd': 10, 'l2': 3, 'r': 2.5, 'b': 3]
CICIDS2017 | Random search | ['iter': 220, '': 0.1, 'd': 10, 'l2': 3, 'r': 2.5, 'b': 3]
CICIDS2017 | Optuna | ['iter': 376, '': 0.3246, 'd': 10, 'l2': 5, 'r': 1, 'b': 8.31]
CICIDS2017 | Our method | ['iter': 194, '': 0.2783, 'd': 9, 'l2': 2, 'r': 7.707, 'b': 2]
Dataset | Method | Acc (%) | Prec (%) | Rec (%) | F1 (%)
---|---|---|---|---|---
WSN-DS | Grid search | 99.62 | 97.31 | 97.46 | 97.42
WSN-DS | Random search | 99.59 | 97.18 | 97.69 | 97.37
WSN-DS | Optuna | 99.60 | 97.39 | 97.51 | 97.40
WSN-DS | Our method | 99.62 | 97.27 | 97.78 | 97.47
WSN-BFSF | Grid search | 99.98 | 99.98 | 99.97 | 99.98
WSN-BFSF | Random search | 99.98 | 99.96 | 99.96 | 99.96
WSN-BFSF | Optuna | 99.97 | 99.94 | 99.82 | 99.88
WSN-BFSF | Our method | 99.99 | 99.99 | 99.99 | 99.99
NSL-KDD | Grid search | 97.66 | 95.33 | 93.76 | 94.89
NSL-KDD | Random search | 99.76 | 96.37 | 93.99 | 95.05
NSL-KDD | Optuna | 99.75 | 96.77 | 95.71 | 96.21
NSL-KDD | Our method | 99.79 | 96.46 | 96.48 | 96.47
CICIDS2017 | Grid search | 99.73 | 97.68 | 89.67 | 92.81
CICIDS2017 | Random search | 99.73 | 98.05 | 87.89 | 91.32
CICIDS2017 | Optuna | 99.70 | 97.66 | 90.02 | 92.88
CICIDS2017 | Our method | 99.74 | 97.39 | 93.68 | 95.32
Dataset | Detection Rate (%)
---|---
WSN-DS | 99.62 ± 0.04
WSN-BFSF | 99.99 ± 0.00
NSL-KDD | 99.82 ± 0.03
CICIDS2017 | 99.76 ± 0.01
Datasets | Method | Acc (%) | Prec (%) | Rec (%) | F1 (%) | Inference Time
---|---|---|---|---|---|---
WSN-DS | CNN [26] | 98.75 | 95.03 | 92.45 | 93.60 | 5.62 s
WSN-DS | DNN [23] | 96.40 | 97.00 | 96.40 | 96.60 | 2.90 s
WSN-DS | Our method | 99.65 | 97.27 | 97.78 | 97.47 | 16 ms
WSN-BFSF | LSTM-CNN [22] | 95.75 | 95.39 | 95.75 | 95.52 | 6.56 s
WSN-BFSF | GRU [22] | 99.01 | 99.00 | 99.01 | 98.99 | 5.91 s
WSN-BFSF | Our method | 99.99 | 99.99 | 99.99 | 99.99 | 13 ms
NSL-KDD | DNN [23] | 94.14 | 88.76 | 88.67 | 88.48 | 1.87 s
NSL-KDD | AE [56] | 89.82 | 91.81 | 90.16 | 90.98 | 3.51 s
NSL-KDD | Our method | 99.76 | 96.17 | 95.14 | 95.63 | 37 ms
CICIDS2017 | DNN [23] | 95.60 | 96.20 | 92.60 | 94.70 | 5.76 s
CICIDS2017 | CNN [26] | 98.62 | 93.20 | 78.34 | 81.62 | 25.36 s
CICIDS2017 | Our method | 99.74 | 97.39 | 93.68 | 95.32 | 73 ms
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).