Smart Home IoT Network Risk Assessment Using Bayesian Networks
Abstract
1. Introduction
2. Smart Home IoT Networks
- Access Point: The attacker is positioned at this point.
- Management Level: This level consists of devices that issue commands, such as laptops and cellphones (CP).
- Router: The central router through which all devices communicate.
- Field Devices Level: At this level, all devices related to the automation of the smart home are located, such as shutters, smart lights, air conditioners and smart cameras.
- Wi-Fi Network 1: Provides connection between access point and management level.
- Wi-Fi Network 2: Provides connection between management level and router.
- Wi-Fi Network 3: Provides connection between router and field devices level.
3. Methods
3.1. Bayesian Networks
- Probability of evidence: P(e) for evidence e observed on the network.
- Posterior marginals: Let e be evidence of the Bayesian network; the posterior marginals are P(X | e) for each unobserved variable X.
- Most probable explanation: Let e be evidence of the Bayesian network; the most probable explanation is the instantiation x of the unobserved variables such that P(x | e) is maximum.
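The three inference tasks can be illustrated by brute-force enumeration on a minimal two-node network (the variable names and CPT values below are illustrative, not taken from the paper):

```python
# Illustrative two-node Bayesian network A -> B (made-up CPT values).
p_a = {0: 0.7, 1: 0.3}                      # P(A)
p_b_given_a = {0: {0: 0.9, 1: 0.1},         # P(B | A)
               1: {0: 0.2, 1: 0.8}}

def joint(a, b):
    """Joint probability P(A = a, B = b) via the chain rule."""
    return p_a[a] * p_b_given_a[a][b]

# Probability of evidence P(B = 1): sum the joint over the unobserved variable.
p_e = sum(joint(a, 1) for a in (0, 1))

# Posterior marginal P(A = 1 | B = 1) = P(A = 1, B = 1) / P(B = 1).
posterior = joint(1, 1) / p_e

# Most probable explanation: the instantiation of A maximizing P(A, B = 1).
mpe = max((0, 1), key=lambda a: joint(a, 1))
```

Exact inference tools such as those surveyed in the cited references replace this enumeration with variable elimination, but the quantities computed are the same.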
3.2. Simulation
Given a uniform random number U on (0, 1), a single binary variable with assigned probabilities p0 and p1 = 1 − p0 takes:
- A value of 0 if U ≤ p0;
- A value of 1 if U > p0.
A pair of binary variables with assigned probabilities p00, p01, p10 and p11 takes:
- (0,0) if U ≤ p00;
- (0,1) if p00 < U ≤ p00 + p01;
- (1,0) if p00 + p01 < U ≤ p00 + p01 + p10; or
- (1,1) if U > p00 + p01 + p10.
A triple of binary variables with assigned probabilities p000, p001, …, p111 takes:
- (0,0,0) if U ≤ p000;
- (0,0,1) if p000 < U ≤ p000 + p001;
- (0,1,0) if p000 + p001 < U ≤ p000 + p001 + p010;
- (0,1,1) if p000 + p001 + p010 < U ≤ p000 + ⋯ + p011;
- (1,0,0) if p000 + ⋯ + p011 < U ≤ p000 + ⋯ + p100;
- (1,0,1) if p000 + ⋯ + p100 < U ≤ p000 + ⋯ + p101;
- (1,1,0) if p000 + ⋯ + p101 < U ≤ p000 + ⋯ + p110; or
- (1,1,1) if U > p000 + ⋯ + p110.
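The case analysis above is the inverse transform method: a single uniform draw is mapped to the outcome whose cumulative-probability interval contains it. A minimal sketch (the function name is mine; the example probabilities are the Scenario 1 values from the tables below):

```python
import random

def sample_vector(outcomes, probs, u=None):
    """Inverse transform sampling: map a uniform draw to the outcome
    whose cumulative-probability interval contains it."""
    if u is None:
        u = random.random()
    cum = 0.0
    for outcome, p in zip(outcomes, probs):
        cum += p
        if u < cum:
            return outcome
    return outcomes[-1]  # guard against floating-point round-off

# Scenario 1 probabilities for a two-attack level.
outcomes = [(0, 0), (0, 1), (1, 0), (1, 1)]
probs = [0.00, 0.45, 0.45, 0.10]
```

For example, a draw of u = 0.60 falls in the interval (0.45, 0.90] and therefore selects (1, 0).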
4. Model Implementation
4.1. Bayesian Network Structure
4.1.1. Attack Graph
4.1.2. Directed Acyclic Graph
- SECP: A social engineering attack is performed from the access point to the management level, which consists of a cellphone.
- PHCP: A phishing attack is performed from the access point to the management level.
- MICP: A malware injection attack is performed from the access point to the management level.
- DOSR: A DoS attack is performed from the management level to the router.
- RTPR: A routing table poisoning attack is performed from the management level to the router.
- PAR: A persistent attack is performed from the management level to the router.
- DOSF: A DoS attack is performed from the router to the field devices level.
- MITMF: A MitM attack is performed from the router to the field devices level.
4.2. Attack Simulation
4.2.1. Attack Selection
- Attack selection at the administration level: a vector (a1, a2) is simulated, where a1 = 1 indicates that a phishing attack is selected and a2 = 1 indicates that a malware attack is selected.
- Attack selection at the router level: a vector (r1, r2, r3) is simulated, where r1 = 1 indicates that a DoS attack is selected, r2 = 1 indicates that an RTP attack is selected and r3 = 1 indicates that a persistent attack is selected.
- Attack selection at the field devices level: a vector (f1, f2) is simulated, where f1 = 1 indicates that a DoS attack is selected and f2 = 1 indicates that a MitM attack is selected.
4.2.2. Simulation Scenarios
- Scenario 1: At the administration and field devices levels, for simple attacks, i.e., when the vectors are (0,1) and (1,0), an equal probability of 0.45 is considered. At the router level, a probability of 0.25 is considered for simple attacks, i.e., (0,0,1), (0,1,0) and (1,0,0); and a probability of 0.07 is considered for double attacks, i.e., (0,1,1), (1,0,1) and (1,1,0).
- Scenarios 2 and 4: For both scenarios, the effect of assigning a higher probability to one of the two management-level attacks than to the other is analyzed. At the router level, the three simple attacks are ranked from highest to lowest probability, and the double attacks are ranked correspondingly. At the field devices level, one of the two attacks is likewise assigned a higher probability than the other. The difference between Scenario 2 and Scenario 4 is that Scenario 2 assigns a medium incidence and Scenario 4 a high incidence to the attacks with the highest probability at each level.
- Scenarios 3 and 5: For both scenarios, the probability orderings of Scenarios 2 and 4 are reversed at each level. The difference between Scenario 3 and Scenario 5 is that Scenario 3 assigns a medium incidence and Scenario 5 a high incidence to the attacks with the highest probability at each level.
4.2.3. Vulnerabilities and CVSS Scores
- Attack Vector (AV): Indicates the context in which the exploitation of vulnerabilities is possible. The base score is higher the more remote the attack is. The values it can take are: Network (N), Adjacent (A), Local (L) and Physical (P).
- Attack Complexity (AC): Describes the conditions beyond the attacker’s control that must be in place to complete the attack. The base score is higher for less complex attacks. The values it can take are: Low (L) and High (H).
- User Interaction (UI): Determines whether the vulnerability can be exploited solely at the will of the attacker, or whether a user must participate in some way. The base score is higher when no user interaction is required. The values it can take are: None (N) and Required (R).
- Privileges Required (PR): Describes the privileges an attacker must have before exploiting a vulnerability. The base score is higher if no privileges are required. The values it can take are: None (N), Low (L) and High (H).
- Scope (S): Determines if a vulnerability affects resources in components beyond its security scope. The base score is higher if the scope is modified. The values it can take are: Changed (C) and Unchanged (U).
- Confidentiality (C): Measures the impact on the confidentiality of information due to a successfully exploited vulnerability. Confidentiality refers to limiting access and disclosure of information to authorized users only, as well as preventing access or disclosure to unauthorized persons. The base score is higher when the loss of the affected component is higher. The values it can take are: None (N), Low (L) and High (H).
- Integrity (I): Measures the integrity impact of a successfully exploited vulnerability. Integrity refers to the reliability and truthfulness of information. It mainly refers to the modification of data and the type of control that the attacker has in the modification of the information. The base score is higher when the consequence for the affected component is higher. The values it can take are: None (N), Low (L) and High (H).
- Availability (A): Measures the impact on the availability of the affected component as a result of a successfully exploited vulnerability. This metric refers to the loss of availability of the affected component itself, such as a network service (web, database, email). The base score is higher when the consequence for the affected component is higher. The values it can take are: None (N), Low (L) and High (H).
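The CVSS base score is derived from these metrics by the CVSS v3.1 base equations. Since every node in this model has a changed scope (S:C), the scope-changed branch of the formula applies; a sketch using the metric weights from the FIRST v3.1 specification:

```python
import math

# CVSS v3.1 metric weights (PR weights are the scope-changed variants).
AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2}
AC = {"L": 0.77, "H": 0.44}
PR_CHANGED = {"N": 0.85, "L": 0.68, "H": 0.5}
UI = {"N": 0.85, "R": 0.62}
CIA = {"N": 0.0, "L": 0.22, "H": 0.56}

def roundup(x):
    """CVSS 'round up to one decimal' with the spec's floating-point guard."""
    i = round(x * 100000)
    return i / 100000 if i % 10000 == 0 else (math.floor(i / 10000) + 1) / 10

def base_score_changed(av, ac, ui, pr, c, i, a):
    """CVSS v3.1 base score for a vector with Scope: Changed."""
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    impact = 7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15
    exploitability = 8.22 * AV[av] * AC[ac] * PR_CHANGED[pr] * UI[ui]
    if impact <= 0:
        return 0.0
    return roundup(min(1.08 * (impact + exploitability), 10))

# SECP: AV:N / AC:L / UI:N / PR:N / S:C / C:H / I:N / A:N
print(base_score_changed("N", "L", "N", "N", "H", "N", "N"))  # → 8.6
```

Dividing the resulting base score by 10 gives the node parameters listed in the CVSS table below (e.g., 8.6 → 0.86 for SECP).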
4.2.4. Simulation Algorithm
Algorithm 1. Attack Simulation.
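The level-by-level flow of the attack simulation can be sketched as follows. The function names, the mapping of vector components to attacks, the use of the CVSS-derived parameters as success probabilities, and the rule that a level is only attacked once the previous level has been compromised are assumptions for illustration; the selection probabilities are the Scenario 1 values:

```python
import random

# CVSS-derived success probabilities (base score / 10) for each attack node.
P_SUCCESS = {"SECP": 0.86, "PHCP": 0.82, "MICP": 0.88,
             "DOSR": 0.90, "RTPR": 0.90, "PAR": 0.90,
             "DOSF": 0.90, "MITMF": 0.76}

def pick(outcomes, probs):
    """Inverse transform selection of an attack vector."""
    u, cum = random.random(), 0.0
    for outcome, p in zip(outcomes, probs):
        cum += p
        if u < cum:
            return outcome
    return outcomes[-1]

def simulate_once():
    """One run of the attack simulation (Scenario 1 probabilities)."""
    state = dict.fromkeys(P_SUCCESS, 0)
    # Entry: social engineering from the access point.
    state["SECP"] = int(random.random() <= P_SUCCESS["SECP"])
    if state["SECP"]:
        # Management level: phishing and/or malware injection.
        sel = pick([(0, 0), (0, 1), (1, 0), (1, 1)], [0.00, 0.45, 0.45, 0.10])
        for name, chosen in zip(("PHCP", "MICP"), sel):
            state[name] = int(chosen and random.random() <= P_SUCCESS[name])
    if state["PHCP"] or state["MICP"]:
        # Router level: DoS, routing table poisoning and/or persistent attack.
        sel = pick([(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)],
                   [0.00, 0.25, 0.25, 0.07, 0.25, 0.07, 0.07, 0.04])
        for name, chosen in zip(("DOSR", "RTPR", "PAR"), sel):
            state[name] = int(chosen and random.random() <= P_SUCCESS[name])
    if state["DOSR"] or state["RTPR"] or state["PAR"]:
        # Field devices level: DoS and/or MitM.
        sel = pick([(0, 0), (0, 1), (1, 0), (1, 1)], [0.00, 0.45, 0.45, 0.10])
        for name, chosen in zip(("DOSF", "MITMF"), sel):
            state[name] = int(chosen and random.random() <= P_SUCCESS[name])
    return state
```

Repeating `simulate_once` many times yields the empirical frequencies from which the Bayesian network's conditional probability tables can be estimated.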
4.3. Bayesian Network Parametrization
The parameter assigned to each node is its CVSS base score divided by 10:
- p(SECP) = 0.86;
- p(PHCP) = 0.82;
- p(MICP) = 0.88;
- p(DOSR) = 0.90;
- p(RTPR) = 0.90;
- p(PAR) = 0.90;
- p(DOSF) = 0.90; and
- p(MITMF) = 0.76.
4.4. Inferences for Risk Assessment
- Probabilities of evidence: the probabilities that the final attacks occur, P(DOSF = 1) and P(MITMF = 1).
- Posterior marginals: Considering DOSF = 1 as evidence, the posterior marginals P(DOSR | DOSF = 1), P(RTPR | DOSF = 1) and P(PAR | DOSF = 1) are calculated. P(PHCP, MICP | DOSF = 1) is also calculated for all values of PHCP and MICP.
- Most probable explanations: For the DoS attack, the values of SECP, PHCP, MICP, DOSR, RTPR and PAR are determined such that P(SECP, PHCP, MICP, DOSR, RTPR, PAR | DOSF = 1) is maximum. For the MitM attack, the values of SECP, PHCP, MICP, DOSR, RTPR and PAR are determined such that P(SECP, PHCP, MICP, DOSR, RTPR, PAR | MITMF = 1) is maximum.
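With only eight binary attack variables, all of these queries can be answered by brute-force enumeration of the 2^8 instantiations, given the network's joint distribution. A sketch with a generic `joint` callable (the placeholder joint below uses independent variables for illustration only; it is not the paper's parametrization):

```python
from itertools import product

NODES = ("SECP", "PHCP", "MICP", "DOSR", "RTPR", "PAR", "DOSF", "MITMF")

def enumerate_query(joint, evidence):
    """Return P(evidence) and the most probable explanation given evidence."""
    p_e, best, best_p = 0.0, None, -1.0
    for values in product((0, 1), repeat=len(NODES)):
        x = dict(zip(NODES, values))
        if any(x[k] != v for k, v in evidence.items()):
            continue  # skip instantiations inconsistent with the evidence
        p = joint(x)
        p_e += p
        if p > best_p:
            best, best_p = x, p
    return p_e, best

# Placeholder joint: independent Bernoulli(0.3) nodes (illustration only).
def toy_joint(x, p=0.3):
    prob = 1.0
    for v in x.values():
        prob *= p if v else (1 - p)
    return prob

p_dosf, mpe = enumerate_query(toy_joint, {"DOSF": 1})
```

For the real model, `joint` would be the product of the network's conditional probability tables; the enumeration itself is unchanged.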
4.5. Attack Impact
4.6. Risk Assessment
5. Results
6. Discussion
7. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Rajab, H.; Cinkler, T. IoT based smart cities. In Proceedings of the 2018 International Symposium on Networks, Computers and Communications (ISNCC), Rome, Italy, 12 November 2018.
- Oliva, S.L.; Palmieri, A.; Invidia, L.; Patrono, L.; Rametta, P. Rapid Prototyping of a Star Topology Network based on Bluetooth Low Energy Technology. In Proceedings of the 2018 3rd International Conference on Smart and Sustainable Technologies (SpliTech), Split, Croatia, 26–29 June 2018; pp. 1–6.
- Taştan, S.İ.; Dalkiliç, G. Smart Home System Using Internet of Things Devices and Mesh Topology. In Proceedings of the 2021 6th International Conference on Computer Science and Engineering (UBMK), Ankara, Turkey, 13 October 2021; pp. 407–412.
- Johnson, D.; Ketel, M. IoT: The Interconnection of Smart Cities. In Proceedings of the 2019 SoutheastCon, Huntsville, AL, USA, 11–14 April 2019; pp. 1–2.
- Kaiborta, A.K.; Samal, S. IoT based Voice Assistant for Home Automation. In Proceedings of the 2022 4th International Conference on Smart Systems and Inventive Technology (ICSSIT), Tirunelveli, India, 20–22 January 2022; pp. 165–172.
- Gavra, V.-D.; Po, O.A.P. Usage of ZigBee and LoRa wireless technologies in IoT systems. In Proceedings of the 2020 IEEE 26th International Symposium for Design and Technology in Electronic Packaging (SIITME), Pitesti, Romania, 21–24 October 2020; pp. 221–224.
- Unwala, I.; Taqvi, Z.; Lu, J. Thread: An IoT Protocol. In Proceedings of the 2018 IEEE Green Technologies Conference (GreenTech), Austin, TX, USA, 4–6 April 2018; pp. 161–167.
- Boualouache, A.E.; Nouali, O.; Moussaoui, S.; Derder, A. A BLE-based data collection system for IoT. In Proceedings of the 2015 First International Conference on New Technologies of Information and Communication (NTIC), Mila, Algeria, 8–9 November 2015; pp. 1–5.
- Varol, A.B. Compilation of Data Link Protocols: Bluetooth Low Energy (BLE), ZigBee and Z-Wave. In Proceedings of the 2019 4th International Conference on Computer Science and Engineering (UBMK), Samsun, Turkey, 21 November 2019; pp. 85–90.
- Unwala, I.; Taqvi, Z.; Lu, J. IoT Security: ZWave and Thread. In Proceedings of the 2018 IEEE Green Technologies Conference (GreenTech), Austin, TX, USA, 7 June 2018; pp. 176–182.
- Albalawi, A.; Almrshed, A.; Badhib, A.; Alshehri, S. A Survey on Authentication Techniques for the Internet of Things. In Proceedings of the 2019 International Conference on Computer and Information Sciences (ICCIS), Sakaka, Saudi Arabia, 16 May 2019; pp. 1–5.
- Kandasamy, K.; Srinivas, S.; Achuthan, K.; Rangan, V.P. IoT cyber risk: A holistic analysis of cyber risk assessment frameworks, risk vectors, and risk ranking process. EURASIP J. Inf. Secur. 2020, 2020, 8.
- Jacobsson, A.; Boldt, M.; Carlsson, B. A risk analysis of a smart home automation system. Future Gener. Comput. Syst. 2016, 56, 719–733.
- Kabir, S.; Papadopoulos, Y. Applications of Bayesian networks and Petri nets in safety, reliability, and risk assessments: A review. Saf. Sci. 2019, 115, 154–175.
- Sommestad, T.; Sandström, F. An empirical test of the accuracy of an attack graph analysis tool. Inf. Comput. Secur. 2015, 23, 516–531.
- Zeng, J.; Wu, S.; Chen, Y.; Zeng, R.; Wu, C. Survey of attack graph analysis methods from the perspective of data and knowledge processing. Secur. Commun. Netw. 2019, 2019, 1–16.
- Wu, J.; Yin, L.; Guo, Y. Cyber attacks prediction model based on Bayesian network. In Proceedings of the 2012 IEEE 18th International Conference on Parallel and Distributed Systems, Singapore, 17–19 December 2012; pp. 730–731.
- Liu, Y.; Man, H. Network vulnerability assessment using Bayesian networks. In Data Mining, Intrusion Detection, Information Assurance, and Data Networks Security 2005; Dasarathy, B.V., Ed.; International Society for Optics and Photonics, SPIE: Bellingham, WA, USA, 2005; Volume 5812, pp. 61–71.
- Frigault, M.; Wang, L. Measuring network security using Bayesian network-based attack graphs. In Proceedings of the 2008 32nd Annual IEEE International Computer Software and Applications Conference, Turku, Finland, 28 July–1 August 2008; pp. 698–703.
- Frigault, M.; Wang, L.; Singhal, A.; Jajodia, S. Measuring network security using dynamic Bayesian network. In Proceedings of the 4th ACM Workshop on Quality of Protection, QoP '08, Alexandria, VA, USA, 27 October 2008; Association for Computing Machinery: New York, NY, USA, 2008; pp. 23–30.
- Muñoz-González, L.; Sgandurra, D.; Barrère, M.; Lupu, E.C. Exact inference techniques for the analysis of Bayesian attack graphs. IEEE Trans. Dependable Secur. Comput. 2019, 16, 231–244.
- Muñoz-González, L.; Sgandurra, D.; Paudice, A.; Lupu, E.C. Efficient attack graph analysis through approximate inference. ACM Trans. Priv. Secur. 2017, 20, 1–30.
- Chockalingam, S.; Pieters, W.; Teixeira, A.; van Gelder, P. Bayesian Network Models in Cyber Security: A Systematic Review. In Secure IT Systems; Lipmaa, H., Mitrokotsa, A., Matulevicius, R., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 105–122.
- Bastos, D.; Shackleton, M.; El-Moussa, F. Internet of things: A survey of technologies and security risks in smart home and city environments. In Living in the Internet of Things: Cybersecurity of the IoT—2018; Institution of Engineering and Technology: London, UK, 2018.
- Ibrahim, M.; Nabulsi, I. Security analysis of smart home systems applying attack graph. In Proceedings of the 2021 Fifth World Conference on Smart Trends in Systems Security and Sustainability (WorldS4), London, UK, 29–30 July 2021.
- Darwiche, A. Modeling and Reasoning with Bayesian Networks; Cambridge University Press: Cambridge, UK, 2009.
- Koller, D.; Friedman, N. Probabilistic Graphical Models: Principles and Techniques; Adaptive Computation and Machine Learning series; MIT Press: Cambridge, MA, USA, 2009.
- Scutari, M.; Denis, J.-B. Bayesian Networks: With Examples in R; Chapman and Hall/CRC: Boca Raton, FL, USA, 2021.
- Ross, S. Simulation; Elsevier: Amsterdam, The Netherlands, 2013.
- Common Vulnerability Scoring System Version 3.1 Specification Document. Available online: https://www.first.org/cvss/v3.1/specification-document (accessed on 14 January 2022).
- Common Vulnerability Scoring System Version 3.1 Calculator. Available online: https://www.first.org/cvss/calculator/3.1 (accessed on 14 January 2022).
- Aksu, M.U.; Dilek, M.H.; Tatli, E.I.; Bicakci, K.; Dirik, H.I.; Demirezen, M.U.; Aykir, T. A quantitative CVSS-based cyber security risk assessment methodology for IT systems. In Proceedings of the 2017 International Carnahan Conference on Security Technology (ICCST), Madrid, Spain, 23–26 October 2017.
Variables | Scenario 1 | Scenario 2 | Scenario 3 | Scenario 4 | Scenario 5
---|---|---|---|---|---
(0, 0) | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
(0, 1) | 0.45 | 0.60 | 0.30 | 0.75 | 0.15
(1, 0) | 0.45 | 0.30 | 0.60 | 0.15 | 0.75
(1, 1) | 0.10 | 0.10 | 0.10 | 0.10 | 0.10
Variables | Scenario 1 | Scenario 2 | Scenario 3 | Scenario 4 | Scenario 5
---|---|---|---|---|---
(0, 0, 0) | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
(0, 0, 1) | 0.25 | 0.20 | 0.25 | 0.15 | 0.25
(0, 1, 0) | 0.25 | 0.25 | 0.30 | 0.25 | 0.35
(0, 1, 1) | 0.07 | 0.06 | 0.08 | 0.05 | 0.09
(1, 0, 0) | 0.25 | 0.30 | 0.20 | 0.35 | 0.15
(1, 0, 1) | 0.07 | 0.07 | 0.06 | 0.07 | 0.05
(1, 1, 0) | 0.07 | 0.08 | 0.07 | 0.09 | 0.07
(1, 1, 1) | 0.04 | 0.04 | 0.04 | 0.04 | 0.04
Variables | Scenario 1 | Scenario 2 | Scenario 3 | Scenario 4 | Scenario 5
---|---|---|---|---|---
(0, 0) | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
(0, 1) | 0.45 | 0.30 | 0.60 | 0.15 | 0.75
(1, 0) | 0.45 | 0.60 | 0.30 | 0.75 | 0.15
(1, 1) | 0.10 | 0.10 | 0.10 | 0.10 | 0.10
Node | AV | AC | UI | PR | S | C | I | A | CVSS | Parameter |
---|---|---|---|---|---|---|---|---|---|---|
SECP | N | L | N | N | C | H | N | N | 8.6 | 0.86 |
PHCP | N | L | R | N | C | H | L | N | 8.2 | 0.82 |
MICP | N | L | R | N | C | H | L | L | 8.8 | 0.88 |
DOSR | N | L | N | H | C | H | L | H | 9.0 | 0.90 |
RTPR | N | L | N | H | C | H | L | H | 9.0 | 0.90 |
PAR | N | L | N | H | C | H | L | H | 9.0 | 0.90 |
DOSF | N | H | N | N | C | H | H | H | 9.0 | 0.90 |
MITMF | A | H | R | L | C | H | H | H | 7.6 | 0.76 |
Node | C | I | A | CR | IR | AR | RL | Impact |
---|---|---|---|---|---|---|---|---|
SECP | H | N | N | M | L | L | U | 20.00 |
PHCP | H | L | N | M | L | L | OF | 21.25 |
MICP | H | L | L | M | M | M | OF | 34.00 |
DOSR | H | L | H | H | M | H | W | 72.83 |
RTPR | H | L | H | H | M | M | W | 60.17 |
PAR | H | L | H | H | L | M | W | 55.42 |
DOSF | H | H | H | H | M | H | U | 86.67 |
MITMF | H | H | H | M | M | H | U | 73.33 |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Flores, M.; Heredia, D.; Andrade, R.; Ibrahim, M. Smart Home IoT Network Risk Assessment Using Bayesian Networks. Entropy 2022, 24, 668. https://doi.org/10.3390/e24050668