Multi-Criteria Feature Selection Based Intrusion Detection for Internet of Things Big Data
Abstract
1. Introduction
- This paper introduces the HSACEC hybrid sampling algorithm to obtain a balanced dataset. It addresses the problem that conventional undersampling methods, which rely solely on the average classification error rate within clusters, discard an excessive number of majority-class samples when there is a substantial imbalance between the numbers of majority-class and minority-class samples.
- An improved LVW algorithm, called the M-LVW feature selection algorithm, is proposed in this study. M-LVW takes the evaluation criterion of the feature subset, i.e., a performance evaluation metric of the classifier, as an input parameter that can be set according to specific requirements. This paper then extends M-LVW to feature selection within the OVO framework and introduces the LVW-MECO algorithm. LVW-MECO first applies M-LVW to each individual base classifier in the OVO scheme, performing wrapper-based feature selection with the accuracy of the base classifier as the evaluation criterion, so that a distinct feature subset is identified for each base classifier. It then re-applies M-LVW to the base classifiers with the lowest F1 scores, using the F1 score as the evaluation criterion, and retains a re-selected feature subset only if it improves the overall accuracy of the OVO classifier.
- This paper integrates the LVW-MECO algorithm with the BP neural network to establish an LVW-MECO intrusion detection model (LVW-MECO-IDM) for network intrusion detection. Experimental evaluations were conducted on the publicly available IoT-23 network intrusion detection dataset to validate the superiority of the LVW-MECO algorithm. The results demonstrate that LVW-MECO-IDM effectively improves the accuracy and detection rate and reduces the false alarm rate.
2. Related Work
3. Preliminary
3.1. Intrusion Detection Mechanism
- IoT Information Collection Module: It collects intrusion data and performs statistical analysis based on the feature space of network intrusion detection.
- Network Intrusion Detection Module: This module uses the data collected by the information collection module to train intrusion detection algorithms. It then uses the trained model to determine whether network data are normal; if the data are abnormal, it further identifies the type of attack to which they belong.
- Response Module: This module responds accordingly to the detection results. If an attack is detected, it takes appropriate interception and handling measures.
3.1.1. Intrusion Detection Based on Misuse
3.1.2. Anomaly-Based Network Intrusion Detection
3.1.3. Edge Computing-Based Intrusion Detection
3.2. Feature Selection Methods
3.2.1. Feature Selection Based on Exhaustive Search Strategy
3.2.2. Feature Selection Based on Sequential Search Strategy
- Feature selection methods based on individual best feature search strategy: In this category of algorithms, the criterion values are first calculated for each feature used individually. Based on these criterion values, the features are sorted, and the top l features are selected as the output feature subset.
- Feature selection methods based on sequential forward search strategy: These algorithms use a “bottom-up” search approach. Initially, the target feature subset is initialized as an empty set. In each step, the feature that most improves the evaluation criterion is added to the target feature subset. The search ends when the termination condition is met, and the resulting target feature subset is the selection result (a minimal code sketch of this greedy procedure follows this list).
- Feature selection methods based on sequential backward search strategy: These algorithms use a “top-down” search approach. The target feature subset is initialized with all the features, and in each step, an irrelevant feature is removed until the termination condition is satisfied.
- Feature selection methods based on bidirectional search strategy: These algorithms simultaneously add relevant features and remove irrelevant features in each step.
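As an illustration of the sequential forward strategy in its wrapper form, the following minimal Python sketch greedily adds the feature that most improves cross-validated accuracy. The kNN classifier, the 5-fold split, and the function name `sequential_forward_selection` are illustrative assumptions, not part of the algorithms in this paper.

```python
# A minimal sketch of wrapper-based sequential forward selection, assuming scikit-learn.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def sequential_forward_selection(X, y, max_features, cv=5):
    """Greedily add the feature that most improves cross-validated accuracy."""
    n_features = X.shape[1]
    selected, best_score = [], -np.inf
    while len(selected) < min(max_features, n_features):
        scores = {}
        for f in range(n_features):
            if f in selected:
                continue
            trial = selected + [f]
            scores[f] = cross_val_score(KNeighborsClassifier(), X[:, trial], y, cv=cv).mean()
        f_best = max(scores, key=scores.get)
        if scores[f_best] <= best_score:   # termination: no candidate improves the criterion
            break
        selected.append(f_best)
        best_score = scores[f_best]
    return selected, best_score
```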
3.2.3. Feature Selection Based on Random Search Strategy
4. Algorithm Design
4.1. LVW Algorithm
Algorithm 1: LVW feature selection algorithm
Input: dataset D; feature set F; classifier algorithm h; and stop condition control parameter T.
Process:
1. Initialize: E = ∞; d = |F|; F* = F; t = 0
2. while t < T do
3.  Randomly generate a feature subset F′
4.  d′ = |F′|
5.  E′ = cross-validation error of classifier h on D using the features in F′
6.  if (E′ < E) or (E′ = E and d′ < d) then
7.   t = 0; E = E′; d = d′; F* = F′
8.  else
9.   t = t + 1
10.  end if
11. end while
Output: feature subset F*
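For concreteness, a minimal Python sketch of the LVW loop in Algorithm 1 is given below, assuming scikit-learn is available and using the cross-validated error as the evaluation criterion. The function name `lvw` and the 5-fold split are illustrative choices, not the authors' implementation.

```python
# A minimal sketch of the LVW (Las Vegas Wrapper) loop of Algorithm 1.
import random
from sklearn.model_selection import cross_val_score

def lvw(X, y, h, T):
    n = X.shape[1]
    best_subset = list(range(n))                              # F* initialized to the full feature set F
    best_err = 1.0 - cross_val_score(h, X, y, cv=5).mean()    # cross-validated error with all features
    t = 0
    while t < T:                                              # stop after T consecutive non-improving trials
        subset = sorted(random.sample(range(n), random.randint(1, n)))   # random candidate subset F'
        err = 1.0 - cross_val_score(h, X[:, subset], y, cv=5).mean()
        if err < best_err or (err == best_err and len(subset) < len(best_subset)):
            best_err, best_subset, t = err, subset, 0         # accept F' and reset the counter
        else:
            t += 1
    return best_subset
```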
4.2. OVO Decomposition Strategy
1. Each binary classifier returns a pair of confidences $r_{ij}, r_{ji} \in [0, 1]$ for an unknown sample, indicating the probability of the unknown sample being classified as class $i$ relative to class $j$; moreover, $r_{ij} + r_{ji} = 1$. If the classifier provides only one confidence $r_{ij}$, the other confidence can be calculated as $r_{ji} = 1 - r_{ij}$. All the confidences returned by the binary classifiers are combined to form a scoring matrix $L$:

$$L = \begin{pmatrix} - & r_{12} & \cdots & r_{1N} \\ r_{21} & - & \cdots & r_{2N} \\ \vdots & \vdots & \ddots & \vdots \\ r_{N1} & r_{N2} & \cdots & - \end{pmatrix}$$
2. Finally, a certain aggregation strategy is adopted to integrate the outputs of all base classifiers and obtain the predicted class for the unknown sample. Several commonly used aggregation strategies in OVO are as follows:
- Voting strategy (VOTE) [35]: This method obtains the final class label by selecting the class that receives the most votes from all base classifiers, and the predicted class label is the output result (a minimal code sketch of this aggregation is given after this list):

$$\text{Class} = \arg\max_{i=1,\dots,N} \sum_{1 \le j \ne i \le N} s_{ij}$$

In the equation, if $s_{ij}$ is 1, it indicates that the base classifier predicts the unknown sample as class $i$; if it is 0, it does not predict it as class $i$, and there is:

$$s_{ij} = \begin{cases} 1, & r_{ij} > r_{ji} \\ 0, & \text{otherwise} \end{cases}$$
- Learning Valued Preference for Classification (LVPC) [36]: This method introduces a conflict level, an absolute preference, and an unknown degree into the recognition process of the final class. Its decision rule is as follows:

$$\text{Class} = \arg\max_{i=1,\dots,N} \sum_{1 \le j \ne i \le N} \left( P_{ij} + \tfrac{1}{2} C_{ij} + \frac{N_i}{N_i + N_j} I_{ij} \right)$$

- In the equation, $P_{ij}$ and $P_{ji}$ represent the absolute preferences for class $i$ and class $j$, respectively; $C_{ij}$ represents the level of conflict; $N_i$ represents the number of samples of class $i$ in the training set; and $I_{ij}$ represents the unknown degree. The corresponding calculation methods are shown as follows:

$$P_{ij} = r_{ij} - \min(r_{ij}, r_{ji}), \quad C_{ij} = \min(r_{ij}, r_{ji}), \quad I_{ij} = 1 - \max(r_{ij}, r_{ji}).$$
- Preference Relations Solved by Non-dominance Criterion (ND) [37]: This method incorporates normalized fuzzy preference relations into the scoring table. The final output class is determined by selecting the class that is maximally non-dominated, and the decision rule is as follows:

$$\text{Class} = \arg\max_{i=1,\dots,N} \left\{ 1 - \sup_{j \ne i} r'_{ji} \right\}$$

In the equation, $\bar{r}$ is the normalized scoring table, and the calculation methods for $\bar{r}_{ij}$ and the strict preference $r'_{ji}$ are as follows:

$$\bar{r}_{ij} = \frac{r_{ij}}{r_{ij} + r_{ji}}, \quad r'_{ji} = \max\{\bar{r}_{ji} - \bar{r}_{ij},\, 0\}.$$
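The following minimal sketch shows how the voting (VOTE) aggregation can be computed from a precomputed scoring matrix. The function name `ovo_vote` and the example matrix are illustrative assumptions; the entries $r_{ij}$ are assumed to come from the OVO base classifiers described above.

```python
# A minimal sketch of the OVO voting (VOTE) aggregation from a scoring matrix R,
# where R[i, j] is the confidence r_ij for class i over class j (diagonal ignored).
import numpy as np

def ovo_vote(R):
    """Return the class with the most pairwise wins."""
    n = R.shape[0]
    votes = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i != j and R[i, j] > R[j, i]:   # s_ij = 1 when class i beats class j
                votes[i] += 1
    return int(np.argmax(votes))

# Example with three classes: class 1 wins both of its pairwise comparisons.
R = np.array([[0.0, 0.4, 0.7],
              [0.6, 0.0, 0.8],
              [0.3, 0.2, 0.0]])
print(ovo_vote(R))   # -> 1
```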
4.3. The Proposed LVW-MECO Algorithm
4.3.1. M-LVW Algorithm
Algorithm 2: M-LVW feature selection algorithm
Input: performance evaluation metric e of the classifier (the evaluation criterion for feature subsets); dataset D; feature set F; classification algorithm h; and stop condition control parameter T
Output: feature subset F*
Process:
1. Initialize: F* = F; d = |F|; e* = value of metric e obtained by cross-validating classifier h on D with all features in F; t = 0
2. if the value of evaluation metric e is positively correlated with the performance of the classifier then
3.  define “e′ is better than e*” as e′ > e*
4. else
5.  define “e′ is better than e*” as e′ < e*
6. end if
7. while t < T do
8.  Randomly generate a feature subset F′
9.  d′ = |F′|
10.  e′ = value of metric e obtained by cross-validating classifier h on D using the features in F′
11.  if (e′ is better than e*) or (e′ = e* and d′ < d) then
12.   t = 0; e* = e′; d = d′; F* = F′
13.  else
14.   t = t + 1
15.  end if
16. end while
return F*
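To illustrate how M-LVW parameterizes the evaluation criterion, the sketch below adapts the LVW loop so that the metric and its direction are inputs. The function name `m_lvw`, the scikit-learn scoring string, and the 5-fold split are illustrative assumptions rather than the authors' code.

```python
# A minimal sketch of the M-LVW idea: the evaluation metric e and its direction are parameters.
import random
from sklearn.model_selection import cross_val_score

def m_lvw(X, y, h, T, scoring="f1_macro", higher_is_better=True):
    sign = 1.0 if higher_is_better else -1.0   # flip sign so "larger is better" always holds
    n = X.shape[1]
    best_subset = list(range(n))
    best = sign * cross_val_score(h, X, y, cv=5, scoring=scoring).mean()
    t = 0
    while t < T:
        subset = sorted(random.sample(range(n), random.randint(1, n)))   # random candidate subset
        score = sign * cross_val_score(h, X[:, subset], y, cv=5, scoring=scoring).mean()
        if score > best or (score == best and len(subset) < len(best_subset)):
            best, best_subset, t = score, subset, 0
        else:
            t += 1
    return best_subset
```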
4.3.2. LVW-MECO Algorithm
Algorithm 3: LVW-MECO feature selection algorithm
Input: dataset D containing N classes; feature set F; base classifier (binary classifier) algorithm h in OVO; aggregation strategy s in OVO; stopping condition control parameter T in the M-LVW algorithm; and number of binary classifiers to be optimized r
Initialization: number of binary classifiers k = N(N − 1)/2
Step 1: Apply the M-LVW algorithm to the k binary classifiers h_1, h_2, …, h_k in the OVO setting individually, setting the evaluation criterion of the feature subset to the accuracy of the corresponding binary classifier. This process selects k different feature subsets F_1, F_2, …, F_k for the k binary classifiers.
Step 2: Using 10-fold cross-validation, calculate the F1 values of the k binary classifiers and sort all binary classifiers in ascending order of F1 value.
Step 3: for j = 1 to r /* Repeat the following Steps 4–5 sequentially for the r binary classifiers with the lowest F1 values. */
Step 4: Apply the M-LVW algorithm to binary classifier h_j, setting the evaluation criterion of the feature subset to the F1 value of the binary classifier, and select a candidate feature subset F′_j to serve as the feature set of h_j.
Step 5: Form a multi-classifier H from the k binary classifiers with their current feature subsets using aggregation strategy s; form a second multi-classifier H′ in which h_j uses F′_j while the other binary classifiers keep their current feature subsets, also aggregated by strategy s. Obtain the accuracy of H and H′ by 10-fold cross-validation; if the accuracy of H′ is greater than that of H, then set F_j = F′_j.
Output: the feature subset corresponding to each binary classifier in OVO
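A compact sketch of Steps 2–5 of Algorithm 3 is shown below. The helper callables `f1_of`, `ovo_accuracy`, and `m_lvw_select` are assumed to wrap the 10-fold cross-validation and M-LVW calls; these names are hypothetical and stand in for machinery the paper describes, not code it provides.

```python
# A minimal sketch of the refinement stage of LVW-MECO (Steps 2-5).
def lvw_meco_refine(subsets, f1_of, ovo_accuracy, m_lvw_select, r):
    k = len(subsets)
    # Step 2: rank base classifiers by their cross-validated F1 score (ascending).
    order = sorted(range(k), key=lambda i: f1_of(i, subsets[i]))
    # Steps 3-5: re-optimize the r weakest classifiers with F1 as the criterion.
    for j in order[:r]:
        candidate = m_lvw_select(j, metric="f1")      # M-LVW with F1 as the evaluation criterion
        trial = list(subsets)
        trial[j] = candidate
        # Keep the new subset only if the aggregated OVO accuracy improves.
        if ovo_accuracy(trial) > ovo_accuracy(subsets):
            subsets = trial
    return subsets
```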
4.3.3. HSACEC Algorithm
Algorithm 4: HSACEC hybrid sampling algorithm
Input: dataset D containing N classes; equilibrium sampling number m; classifier algorithm h; and the number of fractional samples T
Output: equilibrium (balanced) sample set Q
4.4. Network Intrusion Detection Model Based on LVW-MECO
- Collect network intrusion data.
- Preprocess the raw data. First, convert the categorical features of the data into numerical values. Then, perform Z-score normalization on the data (a minimal sketch of this step is given after this list).
- LVW-MECO-based feature selection. Use the LVW-MECO algorithm to select different feature subsets for each base classifier in the OVO setting.
- OVO-based multiclass classification. Train each base classifier in the OVO setting using the training set. Combine the base classifiers into a multiclass classifier using voting. Use this multiclass classifier to identify intrusion data.
- Output detection results and respond. Based on the results of network intrusion detection, when intrusion behavior is detected, execute various necessary response measures such as alarms, network disconnection, and other actions.
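As a minimal sketch of the preprocessing step (categorical encoding followed by Z-score normalization), assuming pandas and scikit-learn are available: the column names `proto`, `service`, and `conn_state` are taken from the feature table in Section 5, while the function name and the encoding choice are illustrative assumptions.

```python
# A minimal sketch of categorical encoding + Z-score normalization for flow records.
import pandas as pd
from sklearn.preprocessing import StandardScaler

def preprocess(df, categorical_cols=("proto", "service", "conn_state")):
    df = df.copy()
    # 1. Convert categorical features to numerical codes.
    for col in categorical_cols:
        df[col] = df[col].astype("category").cat.codes
    # 2. Z-score normalization: x' = (x - mean) / std, per numeric feature.
    numeric_cols = df.select_dtypes(include="number").columns
    df[numeric_cols] = StandardScaler().fit_transform(df[numeric_cols])
    return df
```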
5. Experimental Analysis
5.1. Experimental Data
5.2. Evaluation Criteria
5.3. Contrast Model
- OBPNN model: No feature selection is performed, and this model uses all features.
- LVW-OBPNN model: Each base classifier adopts the LVW algorithm for feature selection. The error rate of the base classifier serves as the sole evaluation criterion for the feature subset, which is equivalent to using the accuracy [38] of the base classifier as the evaluation criterion for the feature subset. Each base classifier selects a different feature subset.
- F1-LVW-OBPNN model: Each base classifier adopts an improved version of the LVW algorithm proposed in reference [39] for feature selection. The evaluation criterion for the feature subset in the LVW algorithm is changed to the F1 score of the classifier. Therefore, this model uses the F1 score of the base classifier as the sole evaluation criterion for the feature subset, resulting in different feature subsets selected by each base classifier.
- MFFS-OBPNN model: Each base classifier adopts the Multi-Filter Feature Selection (MFFS) algorithm [40] for feature selection. This algorithm ranks features using filter-based feature selection methods built on L1-LR (logistic regression), SVM (support vector machine), and RF (random forest). Features ranked below a threshold are removed, similar features are grouped into clusters, the highest-ranked feature is selected from each cluster, and finally the features selected by the three rankings are combined, as illustrated by the sketch after this list.
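For illustration, a simplified sketch of a multi-filter combination in the spirit of MFFS is given below. The clustering of correlated features is omitted for brevity, and the models, the `top_k` threshold, and the function name are assumptions rather than the exact procedure of [40].

```python
# A simplified multi-filter sketch: rank features with three models and take the union of top-k.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC
from sklearn.ensemble import RandomForestClassifier

def multi_filter_select(X, y, top_k=10):
    rankings = []
    lr = LogisticRegression(penalty="l1", solver="liblinear").fit(X, y)
    rankings.append(np.abs(lr.coef_).sum(axis=0))          # L1-LR feature importance
    svm = LinearSVC().fit(X, y)
    rankings.append(np.abs(svm.coef_).sum(axis=0))          # linear-SVM weight magnitudes
    rf = RandomForestClassifier(n_estimators=100).fit(X, y)
    rankings.append(rf.feature_importances_)                 # RF impurity-based importance
    selected = set()
    for scores in rankings:
        selected |= set(np.argsort(scores)[::-1][:top_k])    # union of the top-k of each ranking
    return sorted(selected)
```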
5.4. Experimental Parameter Setting
5.5. Analysis of Experimental Results
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Kong, X.; Chen, Q.; Hou, M.; Rahim, A.; Ma, K.; Xia, F. RMGen: A Tri-Layer Vehicular Trajectory Data Generation Model Exploring Urban Region Division and Mobility Pattern. IEEE Trans. Veh. Technol. 2022, 71, 9225–9238.
- Sarhan, M.; Layeghy, S.; Moustafa, N.; Gallagher, M.; Portmann, M. Feature extraction for machine learning-based intrusion detection in IoT Networks. Digit. Commun. Netw. 2022, in press.
- Nie, L.; Sun, W.; Wang, S.; Ning, Z.; Rodrigues, J.J.P.C.; Wu, Y.; Li, S. Intrusion Detection in Green Internet of Things: A Deep Deterministic Policy Gradient-Based Algorithm. IEEE Trans. Green Commun. Netw. 2021, 5, 778–788.
- Ning, Z.; Sun, S.; Wang, X.; Guo, L.; Wang, G.; Gao, X.; Kwok, R. Intelligent Resource Allocation in Mobile Blockchain for Privacy and Security Transactions: A Deep Reinforcement Learning based Approach. Sci. China Inf. Sci. 2020, 64, 162303.
- Qiu, T.; Liu, X.; Zhou, X.; Qu, W.; Ning, Z.; Chen, C.L.P. An Adaptive Social Spammer Detection Model with Semi-Supervised Broad Learning. IEEE Trans. Knowl. Data Eng. 2022, 34, 4622–4635.
- Liu, K.; Fu, Y.; Wu, L.; Li, X.; Aggarwal, C.; Xiong, H. Automated Feature Selection: A Reinforcement Learning Perspective. IEEE Trans. Knowl. Data Eng. 2023, 35, 2272–2284.
- Wang, X.; Ning, Z.; Guo, L.; Guo, S.; Gao, X.; Wang, G. Mean-Field Learning for Edge Computing in Mobile Blockchain Networks. IEEE Trans. Mob. Comput. 2022, 1–17.
- De la Hoz, E.; Hoz, E.d.l.; Ortiz, A.; Ortega, J.; Martínez-Álvarez, A. Feature Selection by Multi-Objective Optimisation: Application to Network Anomaly Detection by Hierarchical Self-Organising Maps. Knowl.-Based Syst. 2014, 71, 322–338.
- Eesa, A.S.; Orman, Z.; Brifcani, A.M.A. A novel feature-selection approach based on the cuttlefish optimization algorithm for intrusion detection systems. Expert Syst. Appl. 2015, 42, 2670–2679.
- Nie, L.; Ning, Z.; Wang, X.; Hu, X.; Cheng, J.; Li, Y. Data-Driven Intrusion Detection for Intelligent Internet of Vehicles: A Deep Convolutional Neural Network-Based Method. IEEE Trans. Netw. Sci. Eng. 2020, 7, 2219–2230.
- Kang, S.H.; Kim, K.J. A Feature Selection Approach to Find Optimal Feature Subsets for the Network Intrusion Detection System. Clust. Comput. 2016, 19, 325–333.
- Kanungo, T.; Mount, D.; Netanyahu, N.; Piatko, C.; Silverman, R.; Wu, A. An efficient k-means clustering algorithm: Analysis and implementation. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 881–892.
- Erfani, S.M.; Rajasegarar, S.; Karunasekera, S.; Leckie, C. High-dimensional and large-scale anomaly detection using a linear one-class SVM with deep learning. Pattern Recognit. 2016, 58, 121–134.
- Manzoor, I.; Kumar, N. A Feature Reduced Intrusion Detection System Using ANN Classifier. Expert Syst. Appl. 2017, 88, 249–257.
- Khammassi, C.; Krichen, S. A GA-LR wrapper approach for feature selection in network intrusion detection. Comput. Secur. 2017, 70, 255–277.
- Naqvi, S.S.; Browne, W.N.; Hollitt, C. Feature Quality-Based Dynamic Feature Selection for Improving Salient Object Detection. IEEE Trans. Image Process. 2016, 25, 4298–4313.
- Wang, R.; Bian, J.; Nie, F.; Li, X. Unsupervised Discriminative Projection for Feature Selection. IEEE Trans. Knowl. Data Eng. 2022, 34, 942–953.
- Ning, Z.; Hu, H.; Wang, X.; Guo, L.; Guo, S.; Wang, G.; Gao, X. Mobile Edge Computing and Machine Learning in the Internet of Unmanned Aerial Vehicles: A Survey. ACM Comput. Surv. 2023, just accepted.
- Lin, H.; Xue, Q.; Bai, D. Internet of things intrusion detection model and algorithm based on cloud computing and multi-feature extraction extreme learning machine. Digit. Commun. Netw. 2022, 9, 111–124.
- Shi, D.; Zhu, L.; Li, J.; Zhang, Z.; Chang, X. Unsupervised Adaptive Feature Selection with Binary Hashing. IEEE Trans. Image Process. 2023, 32, 838–853.
- Capó, M.; Pérez, A.; Lozano, J.A. A Cheap Feature Selection Approach for the K-Means Algorithm. IEEE Trans. Neural Netw. Learn. Syst. 2021, 32, 2195–2208.
- Ning, Z.; Chen, H.; Wang, X.; Wang, S.; Guo, L. Blockchain-Enabled Electrical Fault Inspection and Secure Transmission in 5G Smart Grids. IEEE J. Sel. Top. Signal Process. 2022, 16, 82–96.
- Butun, I.; Morgera, S.D.; Sankar, R. A Survey of Intrusion Detection Systems in Wireless Sensor Networks. IEEE Commun. Surv. Tutor. 2014, 16, 266–282.
- Tidjon, L.N.; Frappier, M.; Mammar, A. Intrusion Detection Systems: A Cross-Domain Overview. IEEE Commun. Surv. Tutor. 2019, 21, 3639–3681.
- Subhadrabandhu, D.; Sarkar, S.; Anjum, F. A framework for misuse detection in ad hoc networks—Part II. IEEE J. Sel. Areas Commun. 2006, 24, 290–304.
- Liang, W.; Xiao, L.; Zhang, K.; Tang, M.; He, D.; Li, K.C. Data Fusion Approach for Collaborative Anomaly Intrusion Detection in Blockchain-Based Systems. IEEE Internet Things J. 2022, 9, 14741–14751.
- Kong, X.; Wu, Y.; Wang, H.; Xia, F. Edge Computing for Internet of Everything: A Survey. IEEE Internet Things J. 2022, 9, 23472–23485.
- Ning, Z.; Dong, P.; Wang, X.; Hu, X.; Guo, L.; Hu, B.; Guo, Y.; Qiu, T.; Kwok, R.Y.K. Mobile Edge Computing Enabled 5G Health Monitoring for Internet of Medical Things: A Decentralized Game Theoretic Approach. IEEE J. Sel. Areas Commun. 2021, 39, 463–478.
- Wang, X.; Li, J.; Ning, Z.; Song, Q.; Guo, L.; Guo, S.; Obaidat, M.S. Wireless Powered Mobile Edge Computing Networks: A Survey. ACM Comput. Surv. 2023, 55, 263.
- Wang, X.; Ning, Z.; Guo, S.; Wang, L. Imitation Learning Enabled Task Scheduling for Online Vehicular Edge Computing. IEEE Trans. Mob. Comput. 2022, 21, 598–611.
- Ning, Z.; Chen, H.; Ngai, E.C.H.; Wang, X.; Guo, L.; Liu, J. Lightweight Imitation Learning for Real-Time Cooperative Service Migration. IEEE Trans. Mob. Comput. 2023, 1–18.
- Ning, Z.; Dong, P.; Kong, X.; Xia, F. A Cooperative Partial Computation Offloading Scheme for Mobile Edge Computing Enabled Internet of Things. IEEE Internet Things J. 2019, 6, 4804–4814.
- Naghibi, T.; Hoffmann, S.; Pfister, B. A Semidefinite Programming Based Search Strategy for Feature Selection with Mutual Information Measure. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 37, 1529–1541.
- Sun, L.; Wan, L.; Wang, J.; Lin, L.; Gen, M. Joint Resource Scheduling for UAV-Enabled Mobile Edge Computing System in Internet of Vehicles. IEEE Trans. Intell. Transp. Syst. 2022, 1–9.
- Elkano, M.; Galar, M.; Sanz, J.A.; Fernández, A.; Barrenechea, E.; Herrera, F.; Bustince, H. Enhancing Multiclass Classification in FARC-HD Fuzzy Classifier: On the Synergy between n-Dimensional Overlap Functions and Decomposition Strategies. IEEE Trans. Fuzzy Syst. 2015, 23, 1562–1580.
- Hüllermeier, E.; Vanderlooy, S. Combining predictions in pairwise classification: An optimal adaptive voting strategy and its relation to weighted voting. Pattern Recognit. 2010, 43, 128–142.
- Fernández, A.; Calderón, M.; Barrenechea, E.; Bustince, H.; Herrera, F. Solving multi-class problems with linguistic fuzzy rule based classification systems based on pairwise learning and preference relations. Fuzzy Sets Syst. 2010, 161, 3064–3080.
- Wan, L.; Li, X.; Xu, J.; Sun, L.; Wang, X.; Liu, K. Application of Graph Learning With Multivariate Relational Representation Matrix in Vehicular Social Networks. IEEE Trans. Intell. Transp. Syst. 2023, 24, 2789–2799.
- Wang, B.; Liem, C.C.S. TUD-MMC at MediaEval 2016: Context of Experience task. In Proceedings of the MediaEval Benchmarking Initiative for Multimedia Evaluation, Hilversum, The Netherlands, 20–21 October 2016; Volume 1739.
- Haq, A.U.; Zhang, D.; Peng, H.; Rahman, S.U. Combining Multiple Feature-Ranking Techniques and Clustering of Variables for Feature Selection. IEEE Access 2019, 7, 151482–151492.
 | Normal | Dos | Probe | R2L | U2R | Imbalance Ratio
---|---|---|---|---|---|---
Training set | — | 391,458 | 4107 | 1126 | 52 | 7528.04
Test set | — | 229,853 | 4166 | 16,189 | 228 | 1008.13
ID | Feature | Description |
---|---|---|
1 | uid | The unique ID of the stream |
2 | id.orig_h | Source IP address |
3 | id.orig_p | Source port number |
4 | id.resp_h | Destination IP address |
5 | id.resp_p | Destination port number
6 | proto | Transport-layer protocol
7 | service | Application-layer service (e.g., dhcp, dns, http, ssh)
8 | duration | Flow duration |
9 | orig_bytes | Payload bytes sent by the source
10 | resp_bytes | Payload bytes sent by the destination
11 | conn_state | Connection state
12 | local_orig | Flag indicating whether the connection originated locally
13 | local_resp | Flag indicating whether the connection was responded to locally
14 | missed_bytes | Number of bytes missed (lost) in the flow
15 | orig_pkts | Number of packets sent by the source
16 | orig_ip_bytes | Number of IP-layer bytes sent by the source
17 | resp_pkts | Number of packets sent by the destination
18 | resp_ip_bytes | Number of IP-layer bytes sent by the destination
Argument | Value |
---|---|
Data set | IoT-23 |
OVO’s aggregation strategy s | Voting strategy (VOTE)
Stop condition control parameter T in M-LVW algorithm | 100 |
The number of binary classifiers to be optimized in the LVW-MECO algorithm, r | 4
Structure of each base classifier’s BP neural network in OVO | |F_i| : 15 : 2
Base Classifier | Classification Category | Feature Selection Result |
---|---|---|
h1 | Dos, Normal | 2, 4, 5, 6, 8, 9, 10, 11, 13, 14, 15, 16, 17, 18, 19, 20, 21, 23, 25, 27, 29, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41 |
h2 | Dos, Probe | 3, 7, 8, 12, 23, 25, 27, 28, 29, 30, 32, 33, 34, 36, 38, 39, 40, 41 |
h3 | Dos, R2L | 1, 23, 31, 39, 40 |
h4 | Dos, U2R | 3, 8, 10, 12, 14, 17, 23, 25, 26, 28, 31, 32, 34, 35, 40 |
h5 | Normal, Probe | 2, 3, 7, 8, 12, 23, 24, 25, 26, 28, 32, 33, 34, 36, 38, 39, 40 |
h6 | Normal, R2L | 2, 3, 7, 11, 13, 14, 16, 18, 22, 24, 26, 27, 32, 34, 35, 36, 40 |
h7 | Normal, U2R | 2, 3, 10, 11, 12, 14, 17, 18, 23, 24, 25, 26, 31, 32, 33, 35, 36, 38 |
h8 | Probe, R2L | 2, 4, 5, 6, 7, 8, 11, 13, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 27, 28, 29, 30, 31, 33, 34, 36, 37, 38, 39, 40, 41 |
h9 | Probe, U2R | 2, 3, 14, 26, 30, 33, 40 |
h10 | R2L, U2R | 2, 3, 5, 10, 12, 16, 17, 18, 19, 22, 23, 29, 30, 32, 33, 34, 37, 39, 40, 41 |
Model | Normal Recall (%) | Dos Recall (%) | Probe Recall (%) | R2L Recall (%) | U2R Recall (%) | Precision (%) | Detection Rate (%) | False Alarm Rate (%)
---|---|---|---|---|---|---|---|---
OBPNN | 97.25 | 97.56 | 82.37 | 12.36 | 13.34 | 92.80 | 91.72 | 2.75 |
LVW-OBPNN | 99.13 | 99.11 | 95.17 | 22.04 | 18.60 | 94.99 | 93.99 | 0.87 |
F1-LVW-OBPNN | 98.72 | 98.85 | 95.06 | 23.24 | 20.36 | 94.78 | 93.83 | 1.28 |
MFFS-OBPNN | 98.12 | 97.94 | 93.31 | 20.64 | 15.62 | 93.83 | 92.79 | 1.88 |
LVW-MECO-IDM | 99.60 | 99.84 | 95.32 | 28.82 | 20.53 | 95.98 | 95.10 | 0.40 |
Model | Training Time/s |
---|---|
OBPNN | 180.25 |
LVW-OBPNN | 53.37 |
F1-LVW-OBPNN | 59.26 |
MFFS-OBPNN | 57.35 |
LVW-MECO-IDM | 55.34 |