A Detailed Inspection of Machine Learning Based Intrusion Detection Systems for Software Defined Networks
Abstract
1. Introduction
- An extensive benchmark dataset of DoS and DDoS attacks on SDN-IoT networks.
- A set of baseline machine learning models trained and tested on the benchmark dataset.
2. Related Work
2.1. General IDS Works
2.2. IDS That Are Based on ML/DL
2.3. SDN-IoT IDS
2.4. Dataset for IDS
3. Methodology
- On the same virtual machine (VM), install both Open vSwitch (OVS) 2.14.2 and Mininet 2.3.0.
- In the OVS-VM, create two adapters to represent the distinct network subnets. In the proposed configuration, the two newly created interfaces are named esp33 and esp34.
- On the same OpenFlow switch, create two OVS bridges and give them the names br1 and br2.
- Assign each data plane interface to the bridge that is most appropriate for it. In this work, esp34 was assigned to the br1 bridge, while esp33 was assigned to the br2 bridge.
- Remove the IP address from each data plane interface, or set it to zero. The removed address is later assigned to the bridge built on top of that interface. For instance, the IP address configured on the esp34 interface is deleted and assigned to its bridge (br1); br2 is configured in the same way.
- Connect the virtual machine running Kali Linux to the same network adapter as br1, then connect the target server to the same network adapter as br2.
- On the Linux machine running OVS, enable IP forwarding.
- Construct a Mininet topology consisting of four virtual hosts (h1 to h4). By default, Mininet’s virtual hosts attach to the primary bridge of the controller VM (a minimal Mininet sketch is given after this list).
- Attach the POX controller to each of the newly created bridges (br1, br2).
- At this point, ping can be used to verify connectivity between hosts located in the different subnets.
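The steps above can be reproduced with a few lines of the Mininet Python API. The sketch below is illustrative only: it assumes the bridges br1/br2 were already created with ovs-vsctl, that POX listens on its default port, and that the controller runs on the local VM; the IP address and class names are not taken from the paper.

```python
# Minimal sketch (not the authors' exact scripts): a four-host Mininet topology
# attached to a remote POX controller. Bridges br1/br2 and the interface-to-bridge
# mapping are assumed to have been created beforehand with ovs-vsctl
# (e.g., "ovs-vsctl add-br br1; ovs-vsctl add-port br1 esp34").
from mininet.net import Mininet
from mininet.node import RemoteController, OVSSwitch
from mininet.topo import Topo
from mininet.cli import CLI

class FourHostTopo(Topo):
    def build(self):
        s1 = self.addSwitch('s1')                      # switch backed by Open vSwitch
        for i in range(1, 5):
            self.addLink(self.addHost(f'h{i}'), s1)    # hosts h1..h4

if __name__ == '__main__':
    net = Mininet(topo=FourHostTopo(), switch=OVSSwitch,
                  controller=lambda name: RemoteController(name, ip='127.0.0.1', port=6633))
    net.start()
    net.pingAll()     # verify cross-subnet reachability, as in the last step above
    CLI(net)          # interactive prompt for further testing
    net.stop()
```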
- Identified open Docker daemons.
- Obtained Docker images with the needed capabilities.
- Used a Docker client to deploy the images from a local registry onto the remote daemons (a deployment sketch follows this list).
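For the traffic-generation side of the testbed, the image-deployment step can be sketched with the docker-py SDK. Every concrete value below (daemon addresses, registry URL, image and container names) is a placeholder and not taken from the paper.

```python
# Minimal sketch: deploy a prepared image from a local registry onto remotely
# reachable Docker daemons. Addresses and names are hypothetical examples.
import docker

DAEMONS = ["tcp://192.0.2.10:2375", "tcp://192.0.2.11:2375"]   # discovered open daemons (examples)
IMAGE = "registry.local:5000/traffic-bot:latest"               # image in the local registry (placeholder)

for base_url in DAEMONS:
    client = docker.DockerClient(base_url=base_url)
    client.images.pull(IMAGE)                                  # fetch the image onto the remote daemon
    client.containers.run(IMAGE, detach=True, name="traffic-bot")
```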
4. Datasets and Features Extraction
- Network-identifier attributes: detailed information about each flow, including the specific source and destination IP addresses, ports, and the protocol used.
- Packet-based attributes: these characteristics store data about the packets, such as the total number of packets in both directions.
- Byte-based attributes: characteristics based on the number of bytes transmitted or received.
- Inter-arrival-time attributes: timing information about the intervals between packets in both directions.
- Flow-timer attributes: these characteristics record the active and idle times of each flow.
- Flag-based attributes, such as the SYN flag, RST flag, etc.
- Flow attributes: these characteristics describe traffic flow data such as the number of packets and bytes in both directions.
- Sub-flow descriptor attributes: data about sub-flows, such as the total number of packets and bytes flowing in both directions (a short loading sketch follows this list).
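To illustrate how these attribute groups map onto the extracted flow records, the sketch below loads a CICFlowMeter-style CSV with pandas and buckets columns by name. The file name and the string-matching heuristics are assumptions, not the paper's code.

```python
# Minimal sketch, assuming flows were exported by CICFlowMeter to "flows.csv"
# (filename and column-name heuristics are illustrative).
import pandas as pd

df = pd.read_csv("flows.csv")

groups = {
    "flags":   [c for c in df.columns if "Flag" in c],        # SYN, RST, URG, ... counts
    "iat":     [c for c in df.columns if "IAT" in c],         # inter-arrival-time statistics
    "subflow": [c for c in df.columns if "Subflow" in c],     # sub-flow descriptors
    "timers":  [c for c in df.columns if c.startswith(("Active", "Idle"))],
}
for name, cols in groups.items():
    print(name, len(cols), "features")
```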
5. Results
- Feature 5—Total length of packet in forward direction.
- Feature 1—Destination port.
- Feature 70—Minimum segment size observed in forward direction.
- Feature 46—Number of packets with RST flag.
- Feature 49—Number of packets with URG flag.
- Feature 14—Standard deviation of the length of packets in backward direction.
- Feature 12—Minimum length of packets in backward direction.
- Feature 67—Total number of bytes sent in initial window in forward direction.
- Feature 72—Standard deviation of the time in which the flow was active before becoming idle.
- Feature 68—Total number of bytes sent in initial window in the backward direction.
- Feature 1—Destination Port.
- Feature 67—Total number of bytes sent in the initial window in the forward direction.
- Feature 7—Maximum size of the packet in the forward direction.
- Feature 64—The average number of bytes in a subflow in the forward direction.
- Feature 54—Average size observed in the forward direction (i.e., averaged by segment number).
- Feature 5—Total length of the packet in the forward direction.
- Feature 69—Count of packets with at least 1 byte of TCP data payload in the forward direction.
- Feature 9—Average size of packets in the forward direction (i.e., averaged by packet number).
- Feature 68—Total number of bytes sent in the initial window in the backward direction.
- Feature 37—Number of forward packets per second.
- Feature 1: Destination ports in the dataset vary, and this feature is considered important because strange ports that are not used by common protocols (e.g., HTTP 80, FTP 21, HTTPS 443) most likely indicate malicious connections.
- Features 67, 7, 64, 54, 5, 69 and 9: These features are all statistics on the forward packet length, which is a very important indicator because an attacker will intentionally increase it in order to waste the target's processing power and cause a denial of service.
- Feature 68: The backward packet length is also important because the attacker crafts requests that force the target to prepare and send more data, consuming as much of its processing power and network throughput as possible.
- Feature 37: The number of forward packets per second is a significant indicator because a normal connection generally sends a small number of packets to request server content, whereas an attacker sends a huge number of packets to request as much data as possible and exhaust server resources (a feature-ranking sketch follows this list).
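A minimal sketch of how such a top-10 ranking can be produced with a Random Forest importance score (scikit-learn) is shown below. The paper cites Random Forests [137] and mRMR [136] for feature selection, but the exact procedure, file name, and label column used here are assumptions.

```python
# Minimal sketch: rank flow features by Random Forest importance.
# Not necessarily the selection procedure used in the paper.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("flows.csv")
X = df.drop(columns=["Label"])
y = df["Label"]

rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(X, y)

top10 = pd.Series(rf.feature_importances_, index=X.columns).nlargest(10)
print(top10)   # e.g., Destination Port, forward packet-length statistics, ...
```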
- Feature 67—Total number of bytes sent in the initial window in the forward direction.
- Feature 1—Destination Port.
- Feature 25—Minimum time between two packets sent in the forward direction.
- Feature 49—Number of packets with URG.
- Feature 20—Minimum time between two packets sent in the flow.
- Feature 22—Mean time between two packets sent in the forward direction.
- Feature 41—Mean length of a packet.
- Feature 14—Standard deviation size of the packet in the backward direction.
- Feature 17—Mean time between two packets sent in the flow.
- Feature 44—Number of packets with FIN.
- Features 25, 20, 22 and 17: All of these features capture the intervals between packets, which are important for identifying a DoS attack. Feature 22 is the mean time between two packets transmitted in the forward direction; while this value will be relatively large for a regular user, it will be very small for suspicious activity that tries to soak up the target's resources. In contrast, the features chosen for DDoS do not depend on time, since a DDoS attack typically looks harmless from the standpoint of a single device: the attackers distribute the attack in a natural way so that their devices are difficult to block or detect, which in turn requires normal inter-packet times to avoid detection (see the inter-arrival-time sketch after this list).
- Feature 1: Destination ports in the dataset vary and are considered important because strange ports that are not used by common protocols (e.g., HTTP 80, FTP 21, HTTPS 443) most likely indicate malicious connections.
- Features 67, 41 and 14: Each of these features measures packet length. An attacker launching a denial of service attack on a target will purposefully increase the forward packet length. In addition, feature 14 is the standard deviation of packet sizes in the backward direction, and an increase in this standard deviation indicates improper behavior that results in abnormally large responses.
- Features 49 and 44: A single FIN packet and a negligible or nonexistent number of urgent (URG) packets are typical for a healthy connection. Numerous FIN packets indicate a high volume of connections being established and then torn down. Because URG packets are marked as urgent, the TCP/IP stack forwards them to the application even when the window (or buffer) is not yet full.
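To make the inter-arrival-time features concrete, the sketch below computes the forward IAT statistics (Features 22 and 25) from a short list of example packet timestamps. The timestamps are illustrative; CICFlowMeter derives these statistics per flow and per direction.

```python
# Minimal sketch of the inter-arrival-time (IAT) statistics referenced above.
import numpy as np

fwd_timestamps = np.array([0.000, 0.002, 0.004, 0.006])   # example forward-direction packet times (s)
iat = np.diff(fwd_timestamps)                              # time between consecutive packets

print("Fwd IAT Mean:", iat.mean())   # Feature 22
print("Fwd IAT Min:",  iat.min())    # Feature 25
# Features 17 and 20 (Flow IAT Mean/Min) use the timestamps of all packets in the flow.
```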
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Myint Oo, M.; Kamolphiwong, S.; Kamolphiwong, T.; Vasupongayya, S. Advanced Support Vector Machine- (ASVM-) Based Detection for Distributed Denial of Service (DDoS) Attack on Software Defined Networking (SDN). J. Comput. Networks Commun. 2019, 2019, 8012568. [Google Scholar] [CrossRef]
- ONF Newsletters. Software-Defined Networking (SDN) Definition. Available online: https://www.opennetworking.org/sdn-definition/ (accessed on 18 July 2020).
- Tang, T.A.; Mhamdi, L.; McLernon, D.; Zaidi, S.A.R.; Ghogho, M.; El Moussa, F. DeepIDS: Deep Learning Approach for Intrusion Detection in Software Defined Networking. Electronics 2020, 9, 1533. [Google Scholar] [CrossRef]
- Kreutz, D.; Ramos, F.M.; Verissimo, P. Towards secure and dependable software-defined networks. In Proceedings of the Second ACM SIGCOMM Workshop on Hot Topics in Software Defined Networking, Hong Kong, China, 16 August 2013; HotSDN ’13. Association for Computing Machinery: New York, NY, USA, 2013; pp. 55–60. [Google Scholar] [CrossRef]
- Okey, O.D.; Maidin, S.S.; Adasme, P.; Lopes Rosa, R.; Saadi, M.; Carrillo Melgarejo, D.; Zegarra Rodríguez, D. BoostedEnML: Efficient Technique for Detecting Cyberattacks in IoT Systems Using Boosted Ensemble Machine Learning. Sensors 2022, 22, 7409. [Google Scholar] [CrossRef]
- Tayyaba, S.K.; Shah, M.A.; Khan, O.A.; Ahmed, A.W. Software Defined Network (SDN) Based Internet of Things (IoT): A Road Ahead. In Proceedings of the International Conference on Future Networks and Distributed Systems, Cambridge, UK, 19–20 July 2017; ICFNDS ’17. Association for Computing Machinery: New York, NY, USA, 2017. [Google Scholar] [CrossRef]
- Vilalta, R.; Ciungu, R.; Mayoral, A.; Casellas, R.; Martinez, R.; Pubill, D.; Serra, J.; Munoz, R.; Verikoukis, C. Improving Security in Internet of Things with Software Defined Networking. In Proceedings of the 2016 IEEE Global Communications Conference (GLOBECOM), Washington, DC, USA, 4–8 December 2016; pp. 1–6. [Google Scholar] [CrossRef]
- Sri vidhya, G.; Nagarajan, R. A novel bidirectional LSTM model for network intrusion detection in SDN-IoT network. Computing 2024, 106, 2613–2642. [Google Scholar] [CrossRef]
- Bera, S.; Misra, S.; Vasilakos, A.V. Software-Defined Networking for Internet of Things: A Survey. IEEE Internet Things J. 2017, 4, 1994–2008. [Google Scholar] [CrossRef]
- Tang, T.A.; Mhamdi, L.; McLernon, D.; Zaidi, S.A.R.; Ghogho, M. Deep learning approach for Network Intrusion Detection in Software Defined Networking. In Proceedings of the 2016 International Conference on Wireless Networks and Mobile Communications (WINCOM), Fez, Morocco, 26–29 October 2016; pp. 258–263. [Google Scholar] [CrossRef]
- Divekar, A.; Parekh, M.; Savla, V.; Mishra, R.; Shirole, M. Benchmarking datasets for Anomaly-based Network Intrusion Detection: KDD CUP 99 alternatives. In Proceedings of the 2018 IEEE 3rd International Conference on Computing, Communication and Security (ICCCS), Kathmandu, Nepal, 25–27 October 2018; pp. 1–8. [Google Scholar] [CrossRef]
- M. R., G.R.; Ahmed, C.M.; Mathur, A. Machine learning for intrusion detection in industrial control systems: Challenges and lessons from experimental evaluation. Cybersecurity 2021, 4, 27. [Google Scholar] [CrossRef]
- Dini, P.; Elhanashi, A.; Begni, A.; Saponara, S.; Zheng, Q.; Gasmi, K. Overview on Intrusion Detection Systems Design Exploiting Machine Learning for Networking Cybersecurity. Appl. Sci. 2023, 13, 7507. [Google Scholar] [CrossRef]
- Musa, U.S.; Chhabra, M.; Ali, A.; Kaur, M. Intrusion Detection System using Machine Learning Techniques: A Review. In Proceedings of the 2020 International Conference on Smart Electronics and Communication (ICOSEC), Trichy, India, 10–12 September 2020; pp. 149–155. [Google Scholar] [CrossRef]
- Aljabri, M.; Altamimi, H.S.; Albelali, S.A.; Al-Harbi, M.; Alhuraib, H.T.; Alotaibi, N.K.; Alahmadi, A.A.; Alhaidari, F.; Mohammad, R.M.A.; Salah, K. Detecting Malicious URLs Using Machine Learning Techniques: Review and Research Directions. IEEE Access 2022, 10, 121395–121417. [Google Scholar] [CrossRef]
- Htun, H.H.; Biehl, M.; Petkov, N. Survey of feature selection and extraction techniques for stock market prediction. Financ. Innov. 2023, 9, 26. [Google Scholar] [CrossRef]
- Patcha, A.; Park, J.M. An overview of anomaly detection techniques: Existing solutions and latest technological trends. Comput. Networks 2007, 51, 3448–3470. [Google Scholar] [CrossRef]
- Bace, R. An Introduction to Intrusion Detection and Assessment for System and Network Security Management; ICSA Intrusion Detection Systems Consortium Technical Report; ICSA, Inc.: Ashburn, VA, USA, 1999; pp. 236–241. [Google Scholar]
- Anderson, J.P. Computer Security Threat Monitoring and Surveillance; Technical Report; James P. Anderson Company: Kent, OH, USA, 1980. [Google Scholar]
- Sobh, T.S. Wired and wireless intrusion detection system: Classifications, good characteristics and state-of-the-art. Comput. Stand. Interfaces 2006, 28, 670–694. [Google Scholar] [CrossRef]
- Valeur, F.; Vigna, G.; Kruegel, C.; Kemmerer, R.A. Comprehensive approach to intrusion detection alert correlation. IEEE Trans. Dependable Secur. Comput. 2004, 1, 146–169. [Google Scholar] [CrossRef]
- Wu, S.X.; Banzhaf, W. The use of computational intelligence in intrusion detection systems: A review. Appl. Soft Comput. 2010, 10, 1–35. [Google Scholar] [CrossRef]
- Hoang, X.D.; Hu, J.; Bertok, P. A program-based anomaly intrusion detection scheme using multiple detection engines and fuzzy inference. J. Netw. Comput. Appl. 2009, 32, 1219–1228. [Google Scholar] [CrossRef]
- Elshoush, H.T.; Osman, I.M. Alert correlation in collaborative intelligent intrusion detection systems—A survey. Appl. Soft Comput. 2011, 11, 4349–4365. [Google Scholar] [CrossRef]
- Shanbhag, S.; Wolf, T. Accurate anomaly detection through parallelism. IEEE Netw. 2009, 23, 22–28. [Google Scholar] [CrossRef]
- Cannady, J.; Harrell, J. A comparative analysis of current intrusion detection technologies. In Proceedings of the Fourth Technology for Information Security Conference, Atlanta, GA, USA, 1 September 1996. [Google Scholar]
- Bejtlich, R. The Tao of Network Security Monitoring: Beyond Intrusion Detection; Pearson Education: London, UK, 2004. [Google Scholar]
- Han, B.; Yang, X.; Sun, Z.; Huang, J.; Su, J. OverWatch: A cross-plane DDoS attack defense framework with collaborative intelligence in SDN. Secur. Commun. Netw. 2018, 2018, 9649643. [Google Scholar] [CrossRef]
- Phan, T.V.; Gias, T.R.; Islam, S.T.; Huong, T.T.; Thanh, N.H.; Bauschert, T. Q-MIND: Defeating stealthy DoS attacks in SDN with a machine-learning based defense framework. In Proceedings of the 2019 IEEE Global Communications Conference (GLOBECOM), Waikoloa, HI, USA, 9–13 December 2019; pp. 1–6. [Google Scholar]
- Chen, Z.; Jiang, F.; Cheng, Y.; Gu, X.; Liu, W.; Peng, J. XGBoost classifier for DDoS attack detection and analysis in SDN-based cloud. In Proceedings of the 2018 IEEE International Conference on Big Data and Smart Computing (Bigcomp), Shanghai, China, 15–17 January 2018; pp. 251–256. [Google Scholar]
- Nikoloudakis, Y.; Kefaloukos, I.; Klados, S.; Panagiotakis, S.; Pallis, E.; Skianis, C.; Markakis, E.K. Towards a machine learning based situational awareness framework for cybersecurity: An SDN implementation. Sensors 2021, 21, 4939. [Google Scholar] [CrossRef]
- Gadze, J.D.; Bamfo-Asante, A.A.; Agyemang, J.O.; Nunoo-Mensah, H.; Opare, K.A.B. An investigation into the application of deep learning in the detection and mitigation of DDOS attack on SDN controllers. Technologies 2021, 9, 14. [Google Scholar] [CrossRef]
- Wani, A.; S, R.; Khaliq, R. SDN-based intrusion detection system for IoT using deep learning classifier (IDSIoT-SDL). CAAI Trans. Intell. Technol. 2021, 6, 281–290. [Google Scholar] [CrossRef]
- Muthanna, M.S.A.; Alkanhel, R.; Muthanna, A.; Rafiq, A.; Abdullah, W.A.M. Towards SDN-Enabled, Intelligent Intrusion Detection System for Internet of Things (IoT). IEEE Access 2022, 10, 22756–22768. [Google Scholar] [CrossRef]
- Ram, A.; Zeadally, S. An intelligent SDN-IoT enabled intrusion detection system for healthcare systems using a hybrid deep learning and machine learning approach. China Commun. 2024, 21, 1–21. [Google Scholar] [CrossRef]
- Bontemps, L.; Cao, V.L.; McDermott, J.; Le-Khac, N.A. Collective anomaly detection based on long short-term memory recurrent neural networks. In Proceedings of the International Conference on Future Data and Security Engineering, Can Tho City, Vietnam, 23–25 November 2016; Springer: Berlin/Heidelberg, Germany, 2016; pp. 141–152. [Google Scholar]
- Tavallaee, M.; Bagheri, E.; Lu, W.; Ghorbani, A.A. A detailed analysis of the KDD CUP 99 data set. In Proceedings of the 2009 IEEE Symposium on Computational Intelligence for Security and Defense Applications, Ottawa, ON, Canada, 8–10 July 2009; pp. 1–6. [Google Scholar]
- McHugh, J. Testing intrusion detection systems: A critique of the 1998 and 1999 darpa intrusion detection system evaluations as performed by lincoln laboratory. ACM Trans. Inf. Syst. Secur. (TISSEC) 2000, 3, 262–294. [Google Scholar] [CrossRef]
- Tang, T.A.; Mhamdi, L.; McLernon, D.; Zaidi, S.A.R.; Ghogho, M. Deep recurrent neural network for intrusion detection in sdn-based networks. In Proceedings of the 2018 4th IEEE Conference on Network Softwarization and Workshops (NetSoft), Montreal, QC, Canada, 25–29 June 2018; pp. 202–206. [Google Scholar]
- Song, J.; Takakura, H.; Okabe, Y. Description of Kyoto University Benchmark Data. 2006. Available online: http://www.takakura.com/Kyoto_data/BenchmarkData-Description-v5.pdf (accessed on 15 March 2016).
- Haider, W.; Hu, J.; Slay, J.; Turnbull, B.P.; Xie, Y. Generating realistic intrusion detection system dataset based on fuzzy qualitative modeling. J. Netw. Comput. Appl. 2017, 87, 185–192. [Google Scholar] [CrossRef]
- Shiravi, A.; Shiravi, H.; Tavallaee, M.; Ghorbani, A.A. Toward developing a systematic approach to generate benchmark datasets for intrusion detection. Comput. Secur. 2012, 31, 357–374. [Google Scholar] [CrossRef]
- Sharafaldin, I.; Lashkari, A.H.; Ghorbani, A.A. Toward generating a new intrusion detection dataset and intrusion traffic characterization. ICISSp 2018, 1, 108–116. [Google Scholar]
- Koroniotis, N.; Moustafa, N.; Sitnikova, E.; Turnbull, B. Towards the development of realistic botnet dataset in the internet of things for network forensic analytics: Bot-iot dataset. Future Gener. Comput. Syst. 2019, 100, 779–796. [Google Scholar] [CrossRef]
- Panigrahi, R.; Borah, S. A detailed analysis of CICIDS2017 dataset for designing Intrusion Detection Systems. Int. J. Eng. Technol. 2018, 7, 479–482. [Google Scholar]
- A Realistic Cyber Defense Dataset (CSE-CIC-IDS2018). 2018. Available online: https://registry.opendata.aws/cse-cic-ids2018 (accessed on 8 August 2024).
- Firesmith, D. Virtualization via Containers. Línea. 2017. Available online: https://insights.sei.cmu.edu/sei_blog/2017/09/virtualization-via-containers.html (accessed on 8 August 2024).
- Why Docker. Available online: https://www.docker.com/why-docker/ (accessed on 8 August 2024).
- da Silva, V.G.; Kirikova, M.; Alksnis, G. Containers for virtualization: An overview. Appl. Comput. Syst. 2018, 23, 21–27. [Google Scholar] [CrossRef]
- Meadusani, S.R. Virtualization Using Docker Containers: For Reproducible Environments and Containerized Applications. Culminating Proj. Inf. Assur. 2018, 50, 1–79. [Google Scholar]
- Murray, A. Docker Container Security: Challenges and Best Practices. 2023. Available online: https://www.mend.io/resources/blog/docker-container-security/ (accessed on 8 February 2023).
- Lantz, B.; Heller, B.; McKeown, N. A network in a laptop: Rapid prototyping for software-defined networks. In Proceedings of the 9th ACM SIGCOMM Workshop on Hot Topics in Networks, Monterey, CA, USA, 20–21 October 2010; pp. 1–6. [Google Scholar]
- Mininet: An Instant Virtual Network on Your Laptop (or Other PC)—Mininet. 2020. Available online: http://mininet.org/ (accessed on 3 September 2024).
- Habibi Lashkari, A. CICFlowmeter-V4.0 (Formerly Known as ISCXFlowMeter) Is a Network Traffic Bi-Flow Generator and Analyser for Anomaly Detection. 2018. Available online: https://github.com/ISCX/CICFlowMeter (accessed on 3 September 2024).
- Peng, H.; Long, F.; Ding, C. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 2005, 27, 1226–1238. [Google Scholar] [CrossRef] [PubMed]
- Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
# | Feature Name | Description |
---|---|---|
1 | Destination Port | Port number at the victim side |
2 | Flow Duration | Duration of the flow in Microsecond |
3 | Total Fwd Packets | Total packets in the forward direction |
4 | Total Backward Packets | Total packets in the backward direction |
5 | Total Length of Fwd Packets | Total size of packet in forward direction |
6 | Total Length of Bwd Packets | Total size of packet in backward direction |
7 | Fwd Packet Length Max | Maximum size of packet in forward direction |
8 | Fwd Packet Length Min | Minimum size of packet in forward direction |
9 | Fwd Packet Length Mean | Mean size of packet in forward direction |
10 | Fwd Packet Length Std | Standard deviation size of packet in forward direction |
11 | Bwd Packet Length Max | Maximum size of packet in backward direction |
12 | Bwd Packet Length Min | Minimum size of packet in backward direction |
13 | Bwd Packet Length Mean | Mean size of packet in backward direction |
14 | Bwd Packet Length Std | Standard deviation size of packet in backward direction |
15 | Flow Bytes/s | Number of flow bytes per second
16 | Flow Packets/s | Number of flow packets per second
17 | Flow IAT Mean | Mean time between two packets sent in the flow |
18 | Flow IAT Std | Standard deviation time between two packets sent in the flow |
19 | Flow IAT Max | Maximum time between two packets sent in the flow |
20 | Flow IAT Min | Minimum time between two packets sent in the flow |
21 | Fwd IAT Total | Total time between two packets sent in the forward direction |
22 | Fwd IAT Mean | Mean time between two packets sent in the forward direction |
23 | Fwd IAT Std | Standard deviation time between two packets sent in the forward direction |
24 | Fwd IAT Max | Maximum time between two packets sent in the forward direction |
25 | Fwd IAT Min | Minimum time between two packets sent in the forward direction |
26 | Bwd IAT Total | Total time between two packets sent in the backward direction |
27 | Bwd IAT Mean | Mean time between two packets sent in the backward direction |
28 | Bwd IAT Std | Standard deviation time between two packets sent in the backward direction |
29 | Bwd IAT Max | Maximum time between two packets sent in the backward direction |
30 | Bwd IAT Min | Minimum time between two packets sent in the backward direction |
31 | Fwd PSH Flags | Number of times the PSH flag was set in packets travelling in the forward direction (0 for UDP) |
32 | Bwd PSH Flags | Number of times the PSH flag was set in packets travelling in the backward direction (0 for UDP) |
33 | Fwd URG Flags | Number of times the URG flag was set in packets travelling in the forward direction (0 for UDP) |
34 | Bwd URG Flags | Number of times the URG flag was set in packets travelling in the backward direction (0 for UDP) |
35 | Fwd Header Length | Total bytes used for headers in the forward direction |
36 | Bwd Header Length | Total bytes used for headers in the backward direction |
37 | Fwd Packets/s | Number of forward packets per second |
38 | Bwd Packets/s | Number of backward packets per second |
39 | Min Packet Length | Minimum length of a packet |
40 | Max Packet Length | Maximum length of a packet |
41 | Packet Length Mean | Mean length of a packet |
42 | Packet Length Std | Standard deviation length of a packet |
43 | Packet Length Variance | Variance length of a packet |
44 | FIN Flag Count | Number of packets with FIN |
45 | SYN Flag Count | Number of packets with SYN |
46 | RST Flag Count | Number of packets with RST |
47 | PSH Flag Count | Number of packets with PSH |
48 | ACK Flag Count | Number of packets with ACK |
49 | URG Flag Count | Number of packets with URG |
50 | CWR Flag Count | Number of packets with CWR |
51 | ECE Flag Count | Number of packets with ECE |
52 | Down/Up Ratio | Download and upload ratio |
53 | Average Packet Size | Average size of packet |
54 | Avg Fwd Segment Size | Average size observed in the forward direction |
55 | Avg Bwd Segment Size | Average size observed in the backward direction
56 | Fwd Header Length | Length of the forward packet header |
57 | Fwd Avg Bytes/Bulk | Average number of bytes bulk rate in the forward direction |
58 | Fwd Avg Packets/Bulk | Average number of packets bulk rate in the forward direction |
59 | Fwd Avg Bulk Rate | Average number of bulk rate in the forward direction |
60 | Bwd Avg Bytes/Bulk | Average number of bytes bulk rate in the backward direction |
61 | Bwd Avg Packets/Bulk | Average number of packets bulk rate in the backward direction |
62 | Bwd Avg Bulk Rate | Average number of bulk rate in the backward direction |
63 | Subflow Fwd Packets | The average number of packets in a sub flow in the forward direction |
64 | Subflow Fwd Bytes | The average number of bytes in a sub flow in the forward direction |
65 | Subflow Bwd Packets | The average number of packets in a sub flow in the backward direction |
66 | Subflow Bwd Bytes | The average number of bytes in a sub flow in the backward direction |
67 | Init_Win_bytes_forward | The total number of bytes sent in initial window in the forward direction |
68 | Init_Win_bytes_backward | The total number of bytes sent in initial window in the backward direction |
69 | act_data_pkt_fwd | Count of packets with at least 1 byte of TCP data payload in the forward direction |
70 | min_seg_size_forward | Minimum segment size observed in the forward direction |
71 | Active Mean | Mean time a flow was active before becoming idle |
72 | Active Std | Standard deviation time a flow was active before becoming idle |
73 | Active Max | Maximum time a flow was active before becoming idle |
74 | Active Min | Minimum time a flow was active before becoming idle |
75 | Idle Mean | Mean time a flow was idle before becoming active |
76 | Idle Std | Standard deviation time a flow was idle before becoming active |
77 | Idle Max | Maximum time a flow was idle before becoming active |
78 | Idle Min | Minimum time a flow was idle before becoming active |
Correlation Type | F1 | F2 | F3 | F4 | F5 |
---|---|---|---|---|---|
Pearson (Rho) | −0.5098 | −0.0509 | −0.3215 | −0.3197 | −0.356 |
Pearson (Pval) | 0 | 0 | 0 | 0 | 0 |
Spearman (Rho) | −0.2372 | 0.4457 | −0.4415 | −0.3237 | −0.4306 |
Spearman (Pval) | 0 | 0 | 0 | 0 | 0 |
Correlation Type | F6 | F7 | F8 | F9 | F10 |
---|---|---|---|---|---|
Pearson (Rho) | −0.3197 | −0.0020 | −0.3569 | −0.1207 | −0.1294 |
Pearson (Pval) | 0 | 0.3337 | 0 | 0 | 0 |
Spearman (Rho) | −0.3237 | 0.4241 | −0.4306 | 0.04220 | −0.2189 |
Spearman (Pval) | 0 | 0 | 0 | 0 | 0 |
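The Pearson and Spearman coefficients and p-values in the tables above can be reproduced with scipy.stats, as sketched below; the CSV name, label encoding, and example feature columns are assumptions.

```python
# Minimal sketch: correlation of each candidate feature with the attack label.
import pandas as pd
from scipy.stats import pearsonr, spearmanr

df = pd.read_csv("flows.csv")
y = (df["Label"] != "BENIGN").astype(int)   # 0 = benign, 1 = attack (assumed label values)

for col in ["Destination Port", "Total Length of Fwd Packets"]:   # example feature columns
    rho_p, pval_p = pearsonr(df[col], y)
    rho_s, pval_s = spearmanr(df[col], y)
    print(f"{col}: Pearson {rho_p:.4f} (p={pval_p:.3g}), Spearman {rho_s:.4f} (p={pval_s:.3g})")
```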
Naive Bayes | KNN—Euclidean | KNN—Correlation |
---|---|---|
Distribution: Gaussian; Prior: empirical; Kernel: normal; Cross-validation: false | Distance: ‘euclidean’; NumNeighbors: 5 | Distance: ‘Correlation’; NumNeighbors: 5
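The settings in this table read like MATLAB fitcnb/fitcknn options. A rough scikit-learn equivalent is sketched below as an assumed translation, not the authors' code; note that the correlation distance requires the brute-force neighbour search.

```python
# Rough scikit-learn equivalents of the three configurations above (assumed translation).
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

nb = GaussianNB()                                                     # Gaussian Naive Bayes
knn_euclidean = KNeighborsClassifier(n_neighbors=5, metric="euclidean")
knn_correlation = KNeighborsClassifier(n_neighbors=5, metric="correlation",
                                       algorithm="brute")             # correlation distance needs brute force
```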
ML Type | Accuracy | Classes | TP Rate | FP Rate | Precision | F1 Score |
---|---|---|---|---|---|---|
Naive Bayes | 98.52% | Benign | 0.9663 | 0.0004 | 0.9995 | 0.9826 |
| | DDoS Attack | 0.9996 | 0.0337 | 0.9749 | 0.9871
KNN—Euclidean | 99.99% | Benign | 0.9998 | 0.0001 | 0.9999 | 0.9999 |
| | DDoS Attack | 0.9999 | 0.0002 | 0.9999 | 0.9999
KNN—Correlation | 99.99% | Benign | 0.9999 | 0 | 1 | 0.9999 |
| | DDoS Attack | 1 | 0.0001 | 0.9999 | 0.9999
ML Type | Accuracy | Classes | TP Rate | FP Rate | Precision | F1 Score |
---|---|---|---|---|---|---|
Naive Bayes * | 95.95% | Benign | 0.9987 | 0.0977 | 0.9375 | 0.9671 |
| | DoS Slowloris | 0.4194 | 0.0001 | 0.9771 | 0.5868
| | DoS Hulk | 0.9444 | 0.001 | 0.9982 | 0.9706
| | DoS GoldenEye | 0.2488 | 0.0002 | 0.9646 | 0.3955
KNN—Euclidean | 99.94% | Benign | 0.9994 | 0.0005 | 0.9997 | 0.9995 |
| | DoS Slowloris | 0.9943 | 0.0001 | 0.9933 | 0.9938
| | DoS Hulk | 0.9996 | 0.0004 | 0.9994 | 0.9995
| | DoS GoldenEye | 0.9954 | 0.0001 | 0.9926 | 0.994
KNN—Correlation | 99.92% | Benign | 0.9991 | 0.0006 | 0.9996 | 0.9994 |
| | DoS Slowloris | 0.9947 | 0.0001 | 0.9922 | 0.9934
| | DoS Hulk | 0.9995 | 0.0005 | 0.9991 | 0.9993
| | DoS GoldenEye | 0.9954 | 0.0002 | 0.9903 | 0.9929
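For reference, the per-class metrics reported in these tables (TP rate, FP rate, precision, F1) can be derived from a confusion matrix as sketched below; the example labels and predictions are illustrative.

```python
# Minimal sketch: per-class TP rate (recall), FP rate, precision and F1 from predictions.
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = np.array(["Benign", "DDoS", "DDoS", "Benign", "DDoS"])   # illustrative labels
y_pred = np.array(["Benign", "DDoS", "Benign", "Benign", "DDoS"])

labels = ["Benign", "DDoS"]
cm = confusion_matrix(y_true, y_pred, labels=labels)

for i, cls in enumerate(labels):
    tp = cm[i, i]
    fn = cm[i].sum() - tp
    fp = cm[:, i].sum() - tp
    tn = cm.sum() - tp - fn - fp
    tpr = tp / (tp + fn)
    fpr = fp / (fp + tn)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    f1 = 2 * precision * tpr / (precision + tpr) if (precision + tpr) else 0.0
    print(f"{cls}: TPR={tpr:.4f} FPR={fpr:.4f} Precision={precision:.4f} F1={f1:.4f}")
```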
Dataset | # of Features | SDN Controller | Testbed Env. | IDS Domain | Labeled | Balanced
---|---|---|---|---|---|---|
KDD’99 [11,36] | 41 | None | TCP/IP suite | Packet-based | Yes | No
NSL-KDD [37] | 41 | None | TCP/IP suite | Packet-based | Yes | No
CICIDS 2017 [45] | 83 | None | TCP/IP suite | Packet-based | Yes | No
CIC-DDoS 2019 [46] | 83 | None | TCP/IP suite | Packet-based | Yes | No
This Work | 83 | POX | Mininet | Flow- and packet-based | Yes | No
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).