Tackling Verification and Validation Techniques to Evaluate Cyber Situational Awareness Capabilities
Abstract
1. Introduction
- In-depth review of the state-of-the-art research in military CSA, as well as the evaluation methodologies for assessing the cyber defence tools for CSA acquisition, highlighting needs, technological gaps, challenges and opportunities.
- Introducing a core set of evaluation concepts and a testing workflow for validating CSA related tools.
- Proposing novel verification and validation guidelines for mission-centric assessing of the vertical/horizontal propagation of cyber threats between cyberspace and the mission plane.
- Describing CSA tool-evaluation guidelines and suggesting a reference questionnaire template.
2. A Review of The State-of-the-Art Research
2.1. Situational Awareness in Cyber-Physical Systems
- Observe. Perception of the operational environment through the aggregation and fusing of data from diverse sources, directly collected by sensors watching over cyberspace or physical features, or gathered from social-cognitive sources (media, etc.) and cyber-physical intelligence repositories or entities.
- Orient. The deduction of additional knowledge useful for the in-depth understanding of the operational environment, where diagnosis, prediction, simulation and related adaptive machine-learning capabilities shall allow the identification of cyber, physical and combined threats, as well as the anticipation of their evolution at different time horizons.
- Decide. Supports decision making and planning/conducting first-response actuations on the identified threats by taking advantage of the analytical capabilities previously adopted at the Orient step of the OODA loop. The suggested decisions will build on predictions and simulations able to define anticipatory actuations and to preliminarily assess the impact of the courses of action to be conducted.
- Act. Reactive/proactive responses to be conducted will take advantage of both technological enablers that form part of the protected tunnel infrastructure and those carried by first responders. Consequently, conventional actuators such as fire extinguishers, ventilation systems, signalling devices, etc., shall be combined with next-generation capabilities, e.g., unmanned vehicles, wearable devices or virtual network functions (VNF).
2.2. Managing Cyber Incidents in Military Operations
2.3. Situational Awareness Assessment
- SA requirement analysis. Requirement analysis identifies the components of situation awareness that commanders require in specific operational contexts and domains. This analysis is thought of as the preliminary action for all SA assessment activities. It may also involve questionnaires, goal-directed task analysis, and interviews with subject-matter experts [43].
- Freeze probe analysis. Freeze probe analysis is based on freezing or temporarily interrupting the testing activities to conduct direct queries to participants regarding their understanding of the operational domain. The responses are contrasted with a sequential ground truth, which allows quantitative/qualitative measurement of the acquired operational picture. The main benefit of this procedure is its direct nature, which allows the gathering of fresh feedback from the operational domain. However, the freezes may disrupt the conducted tasks, which may alter the forthcoming results. The situation awareness global assessment technique (SAGAT) [44] has been consolidated as one of the most effective freeze probe analysis techniques, and is used in many different contexts, such as military aviation, air traffic control, military operations, driving, industrial processes, etc. [45].
- Real-time probe techniques. In contrast with freeze probe analysis methods, real-time probe techniques query users without interrupting their tasks. For this purpose, non-intrusive questionnaires must be preliminarily developed by subject-matter experts, indicating the topics to be considered and the times at which questions should be conducted. There is controversy over the “non-intrusive” property of this approach, since most queries may directly attract the participants’ attention and potentially bias the data. The situation present assessment method (SPAM) is the most widely applied and adapted real-time probe technique [46], originally conceived for assessing the situational awareness acquired by air traffic controllers.
- Self-rating techniques. As a variation of the aforementioned methods, self-rating techniques assess the commander’s understanding of the operational picture through a rating scale. These questionnaires are typically conducted post-test, so they are not intrusive regarding the tasks to be conducted. As well as not disturbing the participants during the test execution, they are easy to administer. However, they are criticized from two perspectives: first, since the questionnaires are not conducted online, they may entail poor recall and bypass sensitivity/stress variables; second, the participants may misunderstand the situation or tamper with the self-rating scores. The situation awareness rating technique (SART) is the most consolidated self-rating method in the literature [3], and is administered post-trial. SART has 10 dimensions, each of which is rated on a seven-point scale.
- Observer rating techniques. The most popular field rating methods use observers who are typically subject-matter specialists. The fact that these methods are non-intrusive and adaptable to real-world settings is their fundamental benefit. However, it is debatable whether the subject-matter expert can rate the acquired situation awareness appropriately. For example, within this category, one extensively used observer evaluation method for infantrymen is the situation awareness behavioral rating scale (SABARS) [47]. This method consists of 28 observed behaviors that are scored on a five-point scale.
- Performance measures. Situation awareness can be measured indirectly using performance indicators. Depending on the task and circumstance, these metrics can range from hits and misses to false alarms, correct rejections, and response time. Although performance metrics are frequently simple to use and non-intrusive, the relationship between performance and situation awareness is not always evident [5], since these indicators may be directly provided by automatisms.
- Process indices. Process indices are based on assessing behavioral and/or biometrical features that may reveal the participant’s condition during decision making (e.g., stress, confidence, etc.). For example, this is addressed in [48] by eye-tracking the SA operator’s gaze. The primary criticism of this strategy is that the recorded data only document what participants observed, not what they perceived [49].
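As a minimal illustration of the performance measures named above, the following Python sketch derives hit rate, false-alarm rate and mean response time from logged detection trials. The trial tuple format and field names are illustrative assumptions, not part of any of the cited techniques.

```python
def performance_measures(trials):
    """Each trial: (threat_present: bool, detected: bool, response_time_s: float)."""
    hits = sum(1 for present, detected, _ in trials if present and detected)
    misses = sum(1 for present, detected, _ in trials if present and not detected)
    false_alarms = sum(1 for present, detected, _ in trials if not present and detected)
    correct_rejections = sum(1 for present, detected, _ in trials if not present and not detected)
    mean_rt = sum(rt for _, _, rt in trials) / len(trials)
    return {
        "hit_rate": hits / max(hits + misses, 1),
        "false_alarm_rate": false_alarms / max(false_alarms + correct_rejections, 1),
        "correct_rejections": correct_rejections,
        "mean_response_time_s": mean_rt,
    }

# Hypothetical logged trials: two with a real threat, two without
trials = [(True, True, 1.2), (True, False, 2.0), (False, False, 0.9), (False, True, 1.5)]
print(performance_measures(trials))
```

Such indicators can be collected automatically by the test harness, which is precisely why they are non-intrusive yet only an indirect proxy for the acquired situation awareness.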
- Establishing metrics that only measure the object the procedure is meant to assess.
- Employing sensitivity and diagnostic techniques to provide the necessary understanding.
- Using a probing technique that is balanced for each specific purpose.
- During the procedure, the construct should not be altered significantly.
2.4. Verification and Validation of the Acquired Cyber Situational Awareness
3. Verification and Validation Frameworks
3.1. Assumptions and Problem Statement
- What is the effect on the environment that is desired?
- What would the behaviour of the machine on the boundary (interface) be that would achieve the desired effect on the environment?
- What would the internal behaviour and structure of the machine be that would lead to the needed behaviour on the boundary?
- Functional requirements, related to the capability or functions required to directly support users in completing their goals and missions.
- Non-functional requirements emerge as system requirements to suit the users’ functional needs, and are often implicit and technical in nature. Examples include service quality, availability, timeliness, and accuracy.
- Measures of effectiveness (MOEs), i.e., metrics of mission success from the perspective of the stakeholders.
- Measures of performance (MOPs) used to assess whether the system satisfies the performance standards required to fulfill the MOE.
- Key performance parameters (KPPs) or indicators (KPIs) defined by stakeholders as measures of a minimal and critical system or project performance and level of acceptance.
- Validation: am I building the right product (according to end-user expectations)?
- Verification: am I building the product correctly?
3.2. Evaluation Model and Core Concepts
3.3. Limitations of the Proposal and Conducted Research
- The issues of responsible and safe data management are not specifically addressed by the concepts proposed. Therefore, prior to adoption, methods to enforce privacy and data protection shall be studied independently.
- The suggested methodology provides a broad overview but will need to be customized for the unique characteristics of each application scenario. The information presented above should help analysts evaluate and adjust cyber defense capabilities and the operational context in which they will be used.
- Time and resource restrictions have not been taken into account. Therefore, before adoption in situations with limited resources, a suitable frugal adaptation may be required.
- Prior research on the subject of cyber situational awareness from a mission-centric approach is seriously lacking in varied and strong contributions. This indicates that the proposal’s theoretical underpinnings are unstable, and, as a result, the proposal may need to be modified as the scientific community in this area of study makes further progress.
- The conducted research skips over potential adversarial tactics that could sabotage the proposed evaluation activities. These may range from direct assaults on sensors that monitor performance indicators to poisoning threats against human–computer interfaces.
3.4. Early Identified Operational Risks
- The sensors used, as well as how they communicate internally, may have an impact on the conducted measurements.
- It is possible that users lack the education and/or training necessary to use the cyber defense solutions that are being tested.
- The data gathered during the operation of the cyber defense technologies may report insufficient observations, which may result in misleading statistical conclusions.
- Models built from training data may be used by sensors, data fusion techniques, etc. The research community is aware of the problem of the lack of appropriate datasets in some study domains, which also arises from operational situations such as those to be evaluated by the methods developed.
- The proposed surveys and validations might not take into account the variations in cognition and skill of different users. Therefore, the evaluators’ capacity to handle this issue will ultimately determine how effectively the methodology is implemented.
- The user operation may be disrupted by probes, potential intermediary surveys, etc., creating a noticeably artificial operating context. This may impact the evaluation results.
- The users may behave differently if they are conscious of being monitored.
4. Cyber Situational Awareness Testing Concepts
4.1. Unit Tests
- Method stubs: they may act as a temporary stand-in for still-to-be-written code, or they may emulate the behavior of existing code.
- Mock objects: simulated items that accurately mirror the behavior of real objects. In order to test the behavior of another object, a programmer often builds a mock object.
- Fakes: a fake is closer to a real-world implementation than a stub. They can be useful for system tests as well as for unit testing purposes, but they are not intended for production use because of some limitation or quality requirement.
- Test harnesses: a collection of software and test data configured to test a program unit by running it under varying conditions and monitoring its behaviour and outputs. They are able to (1) call functions with given parameters and (2) output and compare the obtained values. The test harness serves as a hook to the developed code so that it may be tested automatically.
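For illustration, a minimal Python example of a mock object, using the standard unittest.mock module; the threat-feed interface, the asset name, and the alerting threshold are assumptions made for the sketch.

```python
from unittest.mock import Mock

def raise_alert(feed, asset):
    """Hypothetical unit under test: alert when the feed scores an asset above 0.8."""
    score = feed.risk_score(asset)
    return {"asset": asset, "alert": score > 0.8}

# A mock stands in for the real threat-intelligence feed, so the alert
# logic can be verified in isolation, including the interaction itself.
feed = Mock()
feed.risk_score.return_value = 0.9
result = raise_alert(feed, "srv-01")
assert result["alert"] is True
feed.risk_score.assert_called_once_with("srv-01")  # interaction check

# The same object, configured with a hard-coded low score, behaves as a stub.
stub = Mock()
stub.risk_score.return_value = 0.2
assert raise_alert(stub, "srv-01")["alert"] is False
```

The distinction in practice: the stub merely supplies canned answers, while the mock additionally lets the test assert how the unit interacted with it.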
4.2. Integration Tests
- Big-bang testing: the majority of the generated modules are connected together to create a complete software system, or a significant portion of it, which is then used for integration testing. This technique can speed up the integration-testing procedure significantly. However, if the test cases and their results are not properly documented, the integration process becomes more difficult and the testing team may not accomplish the purpose of integration testing.
- Bottom-up testing: the lowest-level components are tested first, before being used to facilitate the testing of higher-level components. The process is repeated, integrating and testing all the bottom-level or low-level modules, processes, or functions, until the component at the top of the hierarchy is tested. Once the lower-level integrated modules have undergone integration testing, the next level of integrated modules is generated and can be used for integration testing. This strategy only works when all or the majority of the modules at the same development stage are completed. It also makes it simpler to report testing progress as a percentage and aids in assessing the technology readiness levels of the generated applications.
- Top-down testing: the top integrated modules are tested and the branch of the module is tested step by step until the end of the related module. This is the opposite of bottom-up testing.
- Hybrid/sandwich testing: a mix of top-down and bottom-up strategies. Here, lower modules are integrated with top modules and tested at the same time that top modules are tested with lower modules. Stubs and drivers are both used in this method. The degree of hybridization used and the module scope for each technique determine the pros and cons.
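A compact bottom-up sketch in Python (both modules are hypothetical): the lowest-level parser is verified first, then integrated and tested under the higher-level filtering step that depends on it.

```python
def parse_event(line):
    """Lowest-level module: parse a 'source,severity' event record."""
    src, severity = line.split(",")
    return {"src": src, "severity": int(severity)}

def filter_severe(events):
    """Higher-level module built on parse_event: keep severity >= 3."""
    parsed = [parse_event(e) for e in events]
    return [e for e in parsed if e["severity"] >= 3]

# Step 1 (bottom level): test the parser in isolation.
assert parse_event("10.0.0.1,4") == {"src": "10.0.0.1", "severity": 4}

# Step 2 (next level up): integrate and test the combined behaviour.
assert filter_severe(["10.0.0.1,4", "10.0.0.2,1"]) == [{"src": "10.0.0.1", "severity": 4}]
```

In top-down testing the order would reverse: filter_severe would be exercised first against a stubbed parser, with the real parse_event integrated afterwards.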
4.3. Security Tests
- Discovery. To better understand the application’s scope and functionality, the technologies and design concepts being used, and any attack vectors, the application should be manually walked through.
- Configuration management. Analyse the servers that support the application, including any web servers, application servers, database servers, or load balancers that are accessible from the target system. These systems are analysed for missing patches, outdated software, and weak security-related configuration settings.
- Authentication. Ensure that the application properly verifies the user’s identity before granting access to restricted functionality and data within the application. Authentication testing also seeks to determine if the authentication process was coded and configured according to recommended best practices.
- Authorization. Assess the authorization controls by manipulating cookies, hidden parameters, and other identifiers, and by attempting to access resources and functionality without an active authenticated session.
- Session management. Check for issues that may allow a user’s session to be hijacked or otherwise compromised to permit an attacker to impersonate the victim within the application.
- Data validation. Ensure the application handles user input and output securely to prevent misinterpreting user input strings as executable commands or database queries. This also applies to the potential impacts of data forgeries and data-driven adversarial machine-learning tactics (e.g., mimicry). In the case of data feeds from perceptions of the operational environment, validations may include procedures to avoid cognitive bias or intentional actions.
- Error handling. All returned error messages will be checked for any sensitive or useful information.
- Data protection. Assessment of the effectiveness of sensitive data protection in storage or transit due to a lack of encryption; improper use of production data in a test environment; or displaying sensitive information to an unauthorized user.
- Reporting. Proper documentation of the security test results and any other evidence generated, ranging from human-readable reports to binaries, logs, etc.
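As an example of the data validation step above, the following sketch (with a hypothetical schema) contrasts parameterized queries against string interpolation: the placeholder keeps user input from ever being interpreted as SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'analyst')")

def find_user(name):
    # Unsafe alternative (vulnerable to injection):
    #   f"SELECT role FROM users WHERE name = '{name}'"
    # The parameterized form below sends input as data, not as SQL code.
    row = conn.execute("SELECT role FROM users WHERE name = ?", (name,)).fetchone()
    return row[0] if row else None

assert find_user("alice") == "admin"
# A classic injection payload is treated as a literal string and matches nothing:
assert find_user("' OR '1'='1") is None
```

The same separation-of-data-and-code principle generalizes to the other validation concerns mentioned above, such as command execution and output encoding.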
4.4. Reliability Tests
5. Cyber Situational Awareness Operational Concepts
5.1. Assessment of Cyber Threats
5.1.1. Accuracy
5.1.2. Performance and Efficiency
- Latency. Time delay between the cause and the effect of each process in the system. Latency is a result of the limited velocity with which any system interaction can take place (measured delays, packet jitter, etc.).
- Bandwidth. Measurement of the bit-rate of available or consumed data communication resources (Mbit/s, Gbit/s, etc.).
- Throughput. Rate of production (Packets/s, Transactions/s, Events/s, etc.).
- Channel capacity. Highest upper bound on the rate of information that can be reliably transmitted/processed over a communications channel, and/or without causing bottlenecks.
- Power consumption. The amount of electricity consumed (power use (watts), kWh, etc.). This becomes especially important for systems with limited power sources, such as those deployed at the edge.
- Compression ratio. Data compression is subject to a space–time-complexity trade-off, which aims to reduce resource usage (data compression ratio, space savings, transmission savings, losses, etc.).
- Environmental impact. Measurements (such as power usage effectiveness (PUE), compute power efficiency (CPE), etc.) that are taken with the goals of decreasing waste, hazardous chemicals, and a computer’s ecological footprint are referred to as “green computing” measures.
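Two of the indicators above, latency and throughput, can be sampled with a few lines of instrumentation; the processing function below is a stand-in for an arbitrary CSA processing step.

```python
import time

def process(event):
    return event.upper()  # stand-in for a real CSA processing step

events = ["scan", "probe", "login"] * 1000

latencies = []
start = time.perf_counter()
for e in events:
    t0 = time.perf_counter()
    process(e)
    latencies.append(time.perf_counter() - t0)   # per-event latency (seconds)
elapsed = time.perf_counter() - start

throughput = len(events) / elapsed               # events per second
avg_latency_ms = 1000 * sum(latencies) / len(latencies)
print(f"throughput={throughput:.0f} ev/s, avg latency={avg_latency_ms:.4f} ms")
```

In a real evaluation the instrumentation itself adds overhead, so measured latency should be treated as an upper bound on the cost of the wrapped step.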
5.1.3. Response Time
- Average response time. The average response time (ART) per CSA operation is typically affected when slow inference actions are being conducted. It serves as a finer-grain measurement, which usually requires additional indicators to consider non-obvious problems in depth.
- Peak response time. The peak response time (PRT) measures the longest response among all requests handled by the server. It is a reliable indicator of how well CSA-related functionalities are performing.
- Total raw response time. The total raw response time (TRRT) is the sum of the response times over a certain time interval, which becomes very useful for assessing finer-grain actions composed of several system procedures.
- Response time percentile. A percentile (or centile) is a statistical measure that indicates the value below which a given percentage of observations in a group falls. For example, the 80th percentile of the response time is the value below which 80% of the measured response times may be found. In the scope of threat/risk management and response decision making, this may be an indicator of the proper processing of potential incidents based on prioritization.
- Response time jitter. Jitter in electronics and telecommunications is the divergence of a supposedly periodic signal from genuine periodicity, frequently in connection to a reference clock signal. Overall, jitter measures how much a specific action’s response time can vary.
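The five response-time indicators above can all be computed from a log of samples. The plain-Python sketch below uses the nearest-rank method for the percentile and consecutive-sample differences for jitter; both choices are assumptions, since the text does not prescribe an estimator.

```python
import math

def response_time_stats(samples, pct=80):
    n = len(samples)
    ordered = sorted(samples)
    art = sum(samples) / n                          # average response time (ART)
    prt = max(samples)                              # peak response time (PRT)
    trrt = sum(samples)                             # total raw response time (TRRT)
    rank = max(math.ceil(pct / 100 * n), 1)         # nearest-rank percentile
    percentile = ordered[rank - 1]
    # Jitter as the mean absolute difference between consecutive samples.
    jitter = sum(abs(b - a) for a, b in zip(samples, samples[1:])) / (n - 1)
    return {"ART": art, "PRT": prt, "TRRT": trrt, f"P{pct}": percentile, "jitter": jitter}

print(response_time_stats([0.2, 0.5, 0.3, 0.9, 0.4]))
```

For the sample log above this yields ART 0.46 s, PRT 0.9 s, TRRT 2.3 s, P80 of 0.5 s and a mean jitter of 0.4 s.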
5.1.4. Updatability and Upgradability
- Commutativity. Adapted to systems, services, knowledge, configuration, etc.: when commuting components, their results shall be independent of order. For example, it shall be possible to digest new cyber threat intelligence (CTI) from a recent discovery source without its (spatio-temporal) order affecting the CSA risk-identification and management outcomes. Commutativity may be revealed in system tests, where the divergence between different CTI organizational settings shall approach zero. Regarding CSA capabilities, commutativity may be measured based on the similarity between organizational cases, potentially adopting similarity distances on the metrics adopted for assessing the capabilities of the compared cases separately (they can be evaluated in terms of accuracy, performance, etc.).
- Associativity. In the context of updating/upgrading cyber-situational-awareness-related capabilities, associativity refers to the degree to which their internal functionalities, seen as black-box views, are not affected by the order or prioritization of execution. For example, a decision-support system may analyze concurrent if-then scenarios before suggesting the most suitable CoAs. Under an optimal associative condition, the order in which the scenarios are simulated and evaluated shall not affect the suggested decisions. Consequently, a decision-support capability that meets the associativity property is easier to update/upgrade, since fewer conditions should be taken into account prior to conducting modifications. Similar to commutativity, associativity can be measured via the similarity between different execution cases, potentially adopting similarity distances on the metrics adopted for assessing the capabilities of the compared cases separately.
- Discernibility. Despite the proper satisfaction of the rest of the updatability/upgradability properties, a cyber-situational-awareness-acquisition solution can only be modified or updated when suitable factual and operational knowledge exists and may enhance its original capabilities. These “replacements” may be imported from external sources (e.g., COTS, OSINT, etc.) and/or manufactured/developed based on the previous capabilities. In this context, discernibility refers to the existence and availability of such prior knowledge. For example, the lack of data for training/validation purposes is a well-known obstacle to further machine-learning add-ons. The CSA components most affected by this problem will present a lower level of discernibility than those that leverage data-rich environments and use data-driven algorithms.
- Reversibility. As considered in maths, reversibility refers to the existence of a functionality able to reverse the preliminarily achieved results regarding domain/range. Extending this definition for evaluating the updatability/upgradability of a cyber-situational-awareness system/sub-system, reversibility refers to the capability of achieving changes driven by updates/upgrades on the target of the evaluation. It is assumed that the degree of reversibility is maximized when updates/upgrades can completely change a functionality outcome. Similar to commutativity, reversibility can be assessed by measuring the similarity between different execution cases, potentially adopting similarity distances on the metrics adopted for assessing the capabilities of the compared cases separately.
- Acquisition plan. The evaluation of the acquisition plan for an update/upgrade is decomposed into two main features: total cost of acquisition (TCA) and acquisition model (AM). TCA entails estimations of the optimistic, pessimistic, and average cost, which is the total of the closing, research, accounting, commissions, legal fees, shipping, preparation, and installation costs of the purchase. On the other hand, AM defines the business model associated with the provisioning of upgrading/updating material, which may provide a greater/lower value depending on the CSA tool capacitation context (ranging from free of economical and legal boundaries, up to direct sale of licenses, subscription models, or transaction-based charges).
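The commutativity check described above can be sketched as follows: ingest the same CTI items in every possible order and measure the worst-case divergence between the resulting risk scores. The ingestion rule (per-asset maximum) and all names are illustrative assumptions; a real CSA pipeline would substitute its own scoring logic and a suitable similarity distance.

```python
import itertools

def risk_scores(cti_items):
    """Toy order-insensitive ingestion: aggregate per-asset risk by maximum."""
    scores = {}
    for asset, score in cti_items:
        scores[asset] = max(scores.get(asset, 0.0), score)
    return scores

def commutativity_divergence(cti_items):
    """Worst-case per-asset score difference across all ingestion orders."""
    baseline = risk_scores(cti_items)
    worst = 0.0
    for perm in itertools.permutations(cti_items):
        other = risk_scores(list(perm))
        diff = max(abs(baseline[a] - other.get(a, 0.0)) for a in baseline)
        worst = max(worst, diff)
    return worst  # 0.0 means ingestion order never changes the outcome

cti = [("srv-01", 0.7), ("srv-02", 0.4), ("srv-01", 0.9)]
assert commutativity_divergence(cti) == 0.0
```

Exhaustive permutation only scales to small item sets; in practice a sample of randomized orders would be compared instead.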
5.1.5. Scalability
- Outrun. According to Little’s law [78], under steady-state conditions, the average number of items in a queuing system equals the average rate at which items arrive multiplied by the average time that an item spends in the system. Let us consider a queuing process where L is the number of users, transactions, processes, etc. scheduled; W is the average waiting time; and λ is the average number of items arriving per unit of time. Little’s law then describes the relationship L = λW. In the context of CIS, this expression is typically abstracted as a black-box view as X = N/R, where X returns the outrun metric associated with the scalability throughput, N is the average number of operations conducted by the system/functionality target of evaluation, and R is the average operation duration.
- Coefficient of efficiency. The coefficient of efficiency ρ acts as a stochastic representation of the outrun metric, detailed as ρ = λ/(mμ), where λ is the average total of requests per service, μ is the total capacity of the system/functionality per service, and m is the number of servers, resources, etc. with which the target system is able to scale. Note that when ρ ≥ 1, the evaluated capability is not able to properly serve all the requests, so the request queue grows.
- Zero delay rate. Furthermore, the probability that a request has zero delay in a queue before receiving service is given by 1 − C(c, r). Equivalently, C(c, r) is the probability that a customer has a nonzero delay. The formula that gives the probability that an arriving customer is delayed in the queue (i.e., has a positive, non-zero wait in the queue) as a function of the parameters c (number of servers) and r (offered load) is called the Erlang-C formula [79].
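Under the stated steady-state assumptions, the three scalability indicators above can be sketched in plain Python: Little's law L = λW, the black-box outrun X = N/R, the coefficient of efficiency ρ = λ/(mμ), and the Erlang-C delay probability C(c, r) with offered load r = λ/μ. Function names are illustrative.

```python
import math

def little_L(lam, W):
    return lam * W                       # average items in the system: L = lambda * W

def outrun(N, R):
    return N / R                         # black-box throughput: X = N / R

def efficiency(lam, mu, m):
    return lam / (m * mu)                # rho >= 1 => the request queue grows unboundedly

def erlang_c(c, r):
    """Probability that an arriving request is delayed (requires offered load r < c)."""
    top = (r ** c / math.factorial(c)) * (c / (c - r))
    bottom = sum(r ** k / math.factorial(k) for k in range(c)) + top
    return top / bottom

print(little_L(lam=5.0, W=2.0))          # 10 items in the system on average
print(efficiency(lam=5.0, mu=2.0, m=4))  # rho = 0.625, stable regime
print(erlang_c(c=4, r=2.5))              # delay probability, roughly 0.32
```

The zero delay rate discussed above then follows directly as 1 − erlang_c(c, r).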
5.1.6. Robustness against Adversarial Tactics
5.2. Cyber-Threat Mitigation on the Mission Plane
- P1. Definition of external environment. Statement of all those capabilities for the mission that have a direct dependence on the outside world, i.e., all those capabilities whose functioning may be altered by external elements to the mission.
- P2. Definition of internal environment. Statement of all mission capabilities whose behavior may be altered by an internal failure. Internal faults are considered as all those faults that do not have an external origin and can affect the rest of the elements of the mission.
- P3. Generation of risk management context. Statement of how risks and threats (external or internal) that have a direct impact on some mission capability (in the first instance) and/or on some of the elements of the mission will be managed.
- P4. Formulation of impact limit criteria. Statement of how risks are spread hierarchically and vertically/horizontally propagated. Thus, it is necessary to define, firstly, the propagation from the CIS capability plane to the operational task plane and, secondly, from the task plane to the mission plane. For this purpose, solutions such as Bayesian Networks are easily configurable, versatile and extensible.
- P5. Identification of risks. During this stage of the methodology, it is necessary to break down the mission into different tasks, and to identify the high-level risks that may affect the mission and how these risks are distributed among the different tasks.
- P6. Analysis of relevant risks. This task lowers the abstraction level in the methodology process and seeks to understand how CIS capabilities affect each of the elements involved in each of the tasks of the mission plane. To this end, it is proposed to make radar diagrams to identify the impact that different variables or dimensions have on each of the elements of the mission. The dimensions to be evaluated in order to quantify risks are the different dimensions of the vulnerability and threat-assessment methodology (CVSS v3.1) [85].
- P7. Evaluation of risks. Once the impact value of the different threats (P6) has been quantified, it is necessary to calculate the mission-level risks. To do this, it will be necessary to calculate not only the risk of each element but also its implication in the different tasks of the mission and its final impact on the mission goals, calculated as suggested by the target-of-evaluation tools.
- P8. Identification of options. Identification of how risk spreads between mission tasks, expressed as the severity of the aggregated risk at the mission level.
- P9. Development of action plan. Definition of action or contingency plans, i.e., business continuity and recovery plans (BCP), for each of the previously defined risks. These action plans should be defined for changes from lower levels to the CIS level, so that in the future the system can infer dependencies between different tasks.
- P10. Approval of action plan. Approval of the contingency action plan for the current active risk in order to minimize their impact on the mission.
- P11. Implementation of action plan. Implementation of the action plan approved in the previous stage.
- P12. Identification of residual risks. New analysis of the current situation to re-identify risks. It is especially important to analyze the impact between tasks and residual risks.
- P13. Risk acceptance. Definition of tolerance thresholds to discriminate whether or not to apply corrective measures (business plan actions) and whether or not to abort the mission.
- P14. Risk monitoring and reporting. This stage is related to the system for visualizing and controlling the state of the risks of the mission [87]. It is in charge of generating different reports of the current situation to facilitate the decision-making process. In addition, the system could suggest a set of decisions to be taken and the possible impact they will have on the correct development of the mission.
- P15. Risk communication, awareness and consulting. The last process of the methodology is related to the communication, awareness and consulting tasks. Hence, this module must interact with the different elements of the full system, intercommunicate, and propagate the risk information.
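The vertical propagation described in P4 and P7 can be sketched as follows: CIS-level risks roll up to tasks through dependency weights, and tasks roll up to the mission by taking the worst-case task risk. The weights, asset names, and both aggregation rules (capped weighted sum and maximum) are illustrative assumptions, not the paper's prescribed calculus; a full implementation would use the Bayesian-network propagation and CVSS v3.1 scoring suggested above.

```python
def propagate(cis_risks, task_deps, mission_tasks):
    """cis_risks: {asset: risk in [0,1]}; task_deps: {task: {asset: weight}}."""
    # CIS plane -> task plane: weighted sum of asset risks, capped at 1.0
    task_risk = {
        task: min(1.0, sum(w * cis_risks.get(a, 0.0) for a, w in deps.items()))
        for task, deps in task_deps.items()
    }
    # Task plane -> mission plane: worst-case task risk
    mission_risk = max(task_risk[t] for t in mission_tasks)
    return task_risk, mission_risk

cis = {"router": 0.8, "db": 0.3}
deps = {"recon": {"router": 0.9}, "logistics": {"db": 0.5, "router": 0.2}}
task_risk, mission_risk = propagate(cis, deps, ["recon", "logistics"])
print(task_risk, mission_risk)
```

Even this toy version exposes the property the methodology relies on: a single compromised CIS asset (the router) propagates into several tasks at once, and the mission-level figure is dominated by the most exposed task.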
5.3. Support to Decision-Making
6. Cyber Situational Awareness Application Concepts
6.1. Capability of Facilitating CSA Acquisition
6.2. User Acceptance
- Usability. Degree to which CSA tools can be used by end-users to achieve their tasks with effectiveness, efficiency, and satisfaction in a quantified context of use.
- Ease of Use. Learnability of the CSA tools, and their degree of intuitiveness for end users.
- Terminology and System Information. Degree to which the system notification, description and presentations are coherent with the CSA operational context.
- Functionality. The range and effectiveness of operations that can be conducted by the provided CSA.
- Satisfaction. Degree to which a CSA enabler fulfills the end-user expectations.
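One possible way to operationalize the five acceptance dimensions listed above is to rate each on a 5-point Likert scale and map the mean onto a 0-100 acceptance score. The scale, the equal weighting, and the dimension keys are illustrative assumptions, not a prescription of the reference questionnaire.

```python
DIMENSIONS = ["usability", "ease_of_use", "terminology", "functionality", "satisfaction"]

def acceptance_score(ratings):
    """ratings: {dimension: 1..5}; returns the overall acceptance as a 0-100 score."""
    missing = set(DIMENSIONS) - set(ratings)
    if missing:
        raise ValueError(f"unrated dimensions: {sorted(missing)}")
    overall = sum(ratings[d] for d in DIMENSIONS) / len(DIMENSIONS)
    return round(100 * (overall - 1) / 4, 1)   # map the 1..5 mean onto 0..100

print(acceptance_score({"usability": 4, "ease_of_use": 5, "terminology": 3,
                        "functionality": 4, "satisfaction": 4}))
```

Per-dimension scores would typically be reported alongside the aggregate, since a high overall figure can mask a single weak dimension such as terminology fit.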
6.3. Acceptance Questionnaire
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Disclaimer
References
- Adam, E. Fighter Cockpits of the Future. In Proceedings of the 12th IEEE Digital Avionics Systems Conference, Fort Worth, TX, USA, 25–28 October 1993; pp. 318–323. [Google Scholar]
- Dahal, N.; Abuomar, O.; King, R.; Madani, V. Event Stream Processing for Improved Situational Awareness in the Smart Grid. Expert Syst. Appl. 2015, 42, 6853–6863. [Google Scholar] [CrossRef]
- Endsley, M.; Selcon, S.; Hardiman, T.; Croft, D. A Comparative Analysis of Sagat and Sart for Evaluations of Situation Awareness. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Chicago, IL, USA, 5–9 October 1998; pp. 82–86. [Google Scholar]
- Bigelow, B. What are Military Cyberspace Operations Other Than War? In Proceedings of the 11th IEEE International Conference on Cyber Conflict (CyCon), Tallinn, Estonia, 28–31 May 2019; pp. 318–323. [Google Scholar]
- Lif, P.; Granasen, M.; Sommestad, T. Development and validation of technique to measure cyber situation awareness. In Proceedings of the International Conference On Cyber Situational Awareness, Data Analytics And Assessment (Cyber SA), London, UK, 19–20 June 2017; pp. 1–8. [Google Scholar]
- Maestre Vidal, J.; Orozco, A.; Villalba, L. Adaptive artificial immune networks for mitigating DoS flooding attacks. Swarm Evol. Comput. 2018, 38, 94–108. [Google Scholar] [CrossRef]
- Maestre Vidal, J.; Sotelo Monge, M. A novel Self-Organizing Network solution towards Crypto-ransomware Mitigation. In Proceedings of the 13th International Conference on Availability, Reliability and Security (ARES), Hamburg, Germany, 27–30 August 2018. [Google Scholar]
- Sandoval Rodriguez-Bermejo, D.; Daton Medenou, R.; Ramis Pasqual de Riquelme, G.; Maestre Vidal, J.; Torelli, F.; Llopis Sánchez, S. Evaluation methodology for mission-centric cyber situational awareness capabilities. In Proceedings of the 15th International Conference on Availability, Reliability and Security (ARES), Virtual Event, Ireland, 25–28 August 2020; pp. 1–9. [Google Scholar]
- Endsley, M. Towards a theory of situational awareness in dynamic systems. Hum. Factors 1995, 37, 32–64. [Google Scholar] [CrossRef]
- Chatzimichailidou, M.; Stanton, N.; Dokas, I. The Concept of Risk Situation Awareness Provision: Towards a New Approach for Assessing the DSA about the Threats and Vulnerabilities of Complex Socio-Technical Systems. Saf. Sci. 2015, 79, 126–138. [Google Scholar] [CrossRef]
- Barona López, L.; Valdivieso Caraguay, A.; Maestre Vidal, J.; Sotelo Monge, M. Towards Incidence Management in 5G Based on Situational Awareness. Future Internet 2017, 9, 3. [Google Scholar] [CrossRef]
- Lenders, V.; Tanner, A.; Blarer, A. Gaining an Edge in Cyberspace with Advanced Situational Awareness. IEEE Secur. Priv. 2015, 13, 65–74. [Google Scholar] [CrossRef]
- Franke, U.; Brynielsson, J. Cyber Situational Awareness—A Systematic Review of the Literature. Comput. Secur. 2014, 46, 18–31. [Google Scholar] [CrossRef]
- Saadou, A.; Chenji, H. Optimizing Situational Awareness in Disaster Response Networks. IEEE Access 2018, 6, 24625–24638. [Google Scholar] [CrossRef]
- Webb, J.; Ahmad, A.; Maynard, S.; Shank, G. A Situation Awareness Model for Information Security Risk Management. Comput. Secur. 2014, 44, 1–15. [Google Scholar] [CrossRef]
- Silva, J.A.H.; Lopez, L.; Caraguay, A.; Hernández-Álvarez, M. A Survey on Situational Awareness of Ransomware Attacks—Detection and Prevention Parameters. Remote Sens. 2019, 11, 1168. [Google Scholar] [CrossRef] [Green Version]
- Ioannou, G.; Louvieris, P.; Clewley, N. A Markov Multi-Phase Transferable Belief Model for Cyber Situational Awareness. IEEE Access 2019, 7, 39305–39320. [Google Scholar] [CrossRef]
- Elbez, H.; Keller, H.; Hagenmeyer, V. A New Classification of Attacks against the Cyber-Physical Security of Smart Grids. In Proceedings of the 13th International Conference on Availability, Reliability and Security, Hamburg, Germany, 27–30 August 2018; p. 63. [Google Scholar]
- Bolbot, V.; Theotokatos, G.; Bujorianu, K.; Boulougouris, E.; Vassalos, D. Vulnerabilities and Safety Assurance Methods in Cyber-Physical Systems: A Comprehensive Review. Reliab. Eng. Syst. Saf. 2019, 182, 179–193. [Google Scholar] [CrossRef] [Green Version]
- Demertzis, K.; Tziritas, N.; Kikiras, P.; Llopis Sanchez, S.; Iliadis, L. The Next Generation Cognitive Security Operations Center: Network Flow Forensics Using Cybersecurity Intelligence. Big Data Cogn. Comput. 2018, 2, 35. [Google Scholar] [CrossRef] [Green Version]
- Kriaa, S.; Petre-Cambacedes, L.; Bouissou, M.; Halgand, Y. A Survey of Approaches Combining Safety and Security for Industrial Control Systems. Reliab. Eng. Syst. Saf. 2015, 139, 156–176. [Google Scholar] [CrossRef]
- Fantini, P.; Pinzone, M.; Taisch, M. Placing the Operator at the Centre of Industry 4.0 Design: Modelling and Assessing Human Activities within Cyber-Physical Systems. Comput. Ind. Eng. 2020, 139, 105058. [Google Scholar] [CrossRef]
- Gharib, M.; Lollini, P.; Ceccarelli, A.; Bondavalli, A. Dealing with Functional Safety Requirements for Automotive Systems: A Cyber-Physical-Social Approach. In Proceedings of the 12th International Conference on Critical Information Infrastructures Security, Lucca, Italy, 8–13 October 2017; pp. 194–206. [Google Scholar]
- Sotelo Monge, M.A.; Maestre Vidal, J.; Martínez Pérez, G. Detection of economic denial of sustainability (EDoS) threats in self-organizing networks. Comput. Commun. 2019, 145, 284–308. [Google Scholar] [CrossRef]
- Zeng, J.; Yang, L.; Lin, M.; Ning, H.; Ma, J. A Survey: Cyber-Physical-Social Systems and their System-Level Design Methodology. Future Gener. Comput. Syst. 2020, 105, 1028–1042. [Google Scholar] [CrossRef]
- Wang, P.; Yang, L.; Li, J.; Hu, S. Data Fusion in Cyber-Physical-Social Systems: State-of-the-Art and Perspectives. Inf. Fusion 2016, 51, 42–57. [Google Scholar] [CrossRef]
- Llopis Sanchez, S.; Mazzolin, R.; Kechaoglou, I.; Wiemer, D.; Mees, W.; Muylaert, J. Cybersecurity Space Operation Center: Countering Cyber Threats in the Space Domain. In Handbook of Space Security; Springer: Cham, Switzerland, 2019. [Google Scholar] [CrossRef]
- Fortson, L.W. Towards the Development of a Defensive Cyber Damage and Mission Impact Methodology. Master’s Thesis, Air Force Institute of Technology, Wright-Patterson AFB, OH, USA, 2007. [Google Scholar]
- Demertzis, K.; Tziritas, N.; Kikiras, P.; Llopis Sanchez, S.; Iliadis, L. The Next Generation Cognitive Security Operations Center: Adaptive Analytic Lambda Architecture for Efficient Defense against Adversarial Attacks. Big Data Cogn. Comput. 2019, 3, 6. [Google Scholar] [CrossRef] [Green Version]
- Price, P.; Leyba, N.; Gondree, M.; Staples, Z.; Parker, T. Asset criticality in mission reconfigurable cyber systems and its contribution to key cyber terrain. In Proceedings of the 50th Hawaii International Conference on System Sciences (HICSS 2017), Waikoloa Village, HI, USA, 4–7 January 2017; pp. 446–456. [Google Scholar]
- Schulz, A.; Kotson, M.; Zipkin, J. Cyber Network Mission Dependencies; Technical Report 1189; Massachusetts Institute of Technology, Lincoln Laboratory: Lexington, MA, USA, 2015. [Google Scholar]
- Cheng, M.; Wang, B.; Zhao, S.; Zhai, Z.; Zhu, D.; Chen, J. Situation-Aware Dynamic Service Coordination in an IoT Environment. IEEE/ACM Trans. Netw. 2017, 25, 2082–2095. [Google Scholar] [CrossRef]
- Cohen, G.; Afshar, S.; Morreale, B.; Bessell, T.; Wabnitz, A.; Rutten, M.; van Schaik, A. Event-based Sensing for Space Situational Awareness. J. Astronaut. Sci. 2019, 66, 125–141. [Google Scholar] [CrossRef]
- Layton, P. Fifth-generation air warfare. Aust. Def. Force J. 2018, 204, 23–32. [Google Scholar]
- de Barros Barreto, A.; Costa, P.; Yano, E. Using a semantic approach to cyber impact assessment. In Proceedings of the 8th Conference on Semantic Technologies for Intelligence, Defense, and Security (STIDS 2013), Fairfax, VA, USA, 13–14 November 2013; pp. 101–108. [Google Scholar]
- D’Amico, A.; Buchanan, L.; Goodall, J.; Walczak, P. Mission Impact of Cyber Events: Scenarios and Ontology to Express the Relationships between Cyber Assets, Missions, and Users; Tech. Rep. OMB No. 0704-0188; AFRL/RIEF, US Defence Technical Information Center: Fort Belvoir, VA, USA, 2009. [Google Scholar]
- Endsley, M. Situational awareness misconceptions and misunderstanding. J. Cogn. Eng. Decis. Mak. 2016, 9, 4–32. [Google Scholar] [CrossRef]
- Brynielsson, J.; Franke, U.; Varga, S. Cyber Situational Awareness Testing. Combatting Cybercrime and Cyberterrorism; Springer: Cham, Switzerland, 2016; pp. 209–233. [Google Scholar]
- Stevens, S. Measurement, Statistics, and the Schemapiric View. Science 1968, 161, 849–856. [Google Scholar] [CrossRef]
- Parasuraman, R.; Sheridan, T.; Wickens, C. Situation Awareness, Mental Workload, and Trust in Automation: Viable, Empirically Supported Cognitive Engineering Constructs. J. Cogn. Eng. Decis. Mak. 2008, 2, 140–160. [Google Scholar] [CrossRef]
- Dekker, S.; Hummerdal, D.; Smith, K. Situation awareness: Some remaining questions. Theor. Issues Ergon. Sci. 2008, 11, 131–135. [Google Scholar] [CrossRef]
- Salmon, P.; Stanton, N.; Walker, G.; Baber, C.; Jenkins, D.; Mcmaster, R.; Young, M.S. What really is going on? Review of situation awareness models for individuals and teams. Theor. Issues Ergon. Sci. 2008, 9, 297–323. [Google Scholar] [CrossRef]
- Endsley, M. A Survey of Situation Awareness Requirements in Air-to-Air Combat Fighters. Int. J. Aviat. Psychol. 1993, 3, 157–168. [Google Scholar] [CrossRef]
- Endsley, M. Situation awareness global assessment technique (SAGAT). In Proceedings of the IEEE National Aerospace and Electronics Conference, Dayton, OH, USA, 23–27 May 1988; pp. 937–942. [Google Scholar]
- Salmon, P.; Stanton, N.; Jenkins, D. How Do We Know What They Know? Situation Awareness Measurement Methods Review. In Distributed Situation Awareness; CRC Press: Boca Raton, FL, USA, 2017; pp. 55–76. [Google Scholar]
- Miles, J.; Strybel, T. Measuring Situation Awareness of Student Air Traffic Controllers with Online Probe Queries: Are We Asking the Right Questions? Int. J. Hum. Comput. Interact. 2017, 33, 55–65. [Google Scholar] [CrossRef]
- Matthews, M.; Beal, S. Assessing Situation Awareness in Field Training Exercises; Research Report 1795; West Point US Military Academy: West Point, NY, USA, 2002. [Google Scholar]
- Tounsi, W.; Rais, H. A Survey on Technical Threat Intelligence in the Age of Sophisticated Cyber Attacks. Comput. Secur. 2018, 72, 212–233. [Google Scholar] [CrossRef]
- Trautsch, F.; Herbold, S.; Grabowski, J. Are unit and integration test definitions still valid for modern Java projects? An empirical study on open-source projects. J. Syst. Softw. 2020, 159, 110421. [Google Scholar] [CrossRef]
- Endsley, M. Measurement of situation awareness in dynamic systems. Hum. Factors J. Hum. Factors Ergon. Soc. 1995, 37, 65–84. [Google Scholar] [CrossRef]
- Buczak, A.; Guven, E. A Survey of Data Mining and Machine Learning Methods for Cyber Security Intrusion Detection. IEEE Commun. Surv. Tutor. 2015, 18, 1153–1176. [Google Scholar] [CrossRef]
- Gutzwiller, R.; Hunt, S.; Lange, D. A task analysis toward characterizing cyber-cognitive situation awareness (CCSA) in cyber defense analysts. In Proceedings of the IEEE International Multi-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support (CogSIMA), San Diego, CA, USA, 21–25 March 2016. [Google Scholar]
- Mahoney, S.; Roth, E.; Steinke, K.; Pfautz, J.; Wu, C.; Farry, M. A Cognitive Task Analysis for Cyber Situational Awareness. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, San Francisco, CA, USA, 27 September–1 October 2010; Volume 54, pp. 279–283. [Google Scholar]
- Ben-Asher, N.; Gonzalez, C. Effects of cyber security knowledge on attack detection. Comput. Hum. Behav. 2015, 48, 51–61. [Google Scholar] [CrossRef]
- Mancuso, V.; Christensen, J.; Cowley, J.; Finomore, V.; Gonzalez, C.; Knott, B. Human Factors in Cyber Warfare II. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Chicago, IL, USA, 27–31 October 2014; Volume 58, pp. 415–418. [Google Scholar]
- Tenney, Y.; Pew, R. Situation Awareness Catches On: What? So What? Now What? Hum. Factors Ergon. 2006, 2, 1–34. [Google Scholar] [CrossRef]
- Malviya, A.; Fink, G.; Sego, L.; Endicott-Popovsky, B. Situational Awareness as a Measure of Performance in Cyber Security Collaborative Work. In Proceedings of the 8th International Conference on Information Technology: New Generations, Las Vegas, NV, USA, 11–13 April 2011; pp. 937–942. [Google Scholar]
- Giacoe, N. Measuring the Effectiveness of Visual Analytics and Data Fusion Techniques on Situation Awareness in Cyber-Security. Ph.D. Thesis, The Pennsylvania State University, State College, PA, USA, 2012. [Google Scholar]
- Evangelopoulou, M.; Johnson, C. Attack Visualisation for Cyber-Security Situation Awareness. In Proceedings of the 9th IET International Conference on System Safety and Cyber Security, Manchester, UK, 15–16 October 2014; pp. 937–942. [Google Scholar]
- Fink, G.; Best, D.; Manz, D.; Popovsky, V.; Endicott-Popovsky, B. Gamification for Measuring Cyber Security Situational Awareness. In Proceedings of the International Conference on Augmented Cognition, Las Vegas, NV, USA, 21–26 July 2013. [Google Scholar]
- Shiravi, H.; Shiravi, A.; Ghorbani, A. A Survey of Visualization Systems for Network Security. IEEE Trans. Vis. Comput. Graph. 2012, 18, 1313–1329. [Google Scholar] [CrossRef]
- Dressler, J.; Bowen, C.; Moody, W.; Koepke, J. Operational data classes for establishing situational awareness in cyberspace. In Proceedings of the 6th International Conference On Cyber Conflict (CyCon 2014), Tallinn, Estonia, 3–6 June 2014; pp. 175–186. [Google Scholar]
- Katasonov, A.; Sakkinen, M. Requirements quality control: A unifying framework. Requir. Eng. 2006, 11, 42–57. [Google Scholar] [CrossRef]
- Zimek, A.; Vreeken, J. The blind men and the elephant: On meeting the problem of multiple truths in data from clustering and pattern mining perspectives. Mach. Learn. 2015, 98, 121–155. [Google Scholar] [CrossRef]
- Jackson, M. Software Requirements &amp; Specifications: A Lexicon of Practice, Principles and Prejudices; ACM Press/Addison-Wesley Publishing Co.: New York, NY, USA, 1995. [Google Scholar]
- Efatmaneshnik, M.; Shoval, S.; Joiner, K. System Test Architecture Evaluation: A Probabilistic Modeling Approach. IEEE Syst. J. 2019, 13, 3651–3662. [Google Scholar] [CrossRef]
- Felderer, M.; Buchler, M.; Johns, M.; Brucker, A.; Breu, R.; Pretschner, A. Chapter One—Security Testing: A Survey. Adv. Comput. 2016, 101, 1–51. [Google Scholar]
- OWASP. Open Web Application Security Project. 2020. Available online: https://owasp.org/ (accessed on 8 June 2022).
- ANSI/IEEE. Standard Glossary of Software Engineering Terminology; STD-729-1991; ANSI/IEEE: New York, NY, USA, 1991. [Google Scholar]
- Bhuyan, M.; Bhattacharyya, D.; Kalita, J. Network Anomaly Detection: Methods, Systems and Tools. IEEE Commun. Surv. Tutor. 2014, 16, 303–336. [Google Scholar] [CrossRef]
- van Schaik, P.; Renaud, K.; Wilson, C.; Jansen, J.; Onibokun, J. Risk as affect: The affect heuristic in cybersecurity. Comput. Secur. 2020, 90, 101651. [Google Scholar] [CrossRef]
- Maestre Vidal, J.; Castro, J.; Orozco, A.; Villalba, L. Evolutions of evasion techniques against network intrusion detection systems. In Proceedings of the 6th International Conference on Information Technology, Bangkok, Thailand, 12–13 December 2013. [Google Scholar]
- Maestre Vidal, J.; Sotelo Monge, M. Obfuscation of Malicious Behaviors for Thwarting Masquerade Detection Systems Based on Locality Features. Sensors 2020, 20, 2084. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Huancayo Ramos, K.; Sotelo Monge, M.; Maestre Vidal, J. Benchmark-Based Reference Model for Evaluating Botnet Detection Tools Driven by Traffic-Flow Analytics. Sensors 2020, 20, 4501. [Google Scholar] [CrossRef] [PubMed]
- Maestre Vidal, J.; Sotelo Monge, M.; Villalba, L. A novel pattern recognition system for detecting Android malware by analyzing suspicious boot sequences. Knowl.-Based Syst. 2018, 150, 198–217. [Google Scholar] [CrossRef]
- Jogalekar, P.; Woodside, M. Evaluating the scalability of distributed systems. IEEE Trans. Parallel Distrib. Syst. 2000, 11, 589–603. [Google Scholar] [CrossRef]
- Xiong, H.; Zeng, G.; Zeng, Y.; Wang, W.; Wu, C. A novel scalability metric about iso-area of performance for parallel computing. J. Supercomput. 2014, 68, 652–671. [Google Scholar] [CrossRef]
- Salmanian, Z.; Izadkhah, H.; Isazdeh, A. Optimizing web server RAM performance using birth–death process queuing system: Scalable memory issue. J. Supercomput. 2017, 73, 5221–5238. [Google Scholar] [CrossRef]
- Gross, D.; Shortle, J.; Thompson, J.; Harris, C. Fundamentals of Queueing Theory; Wiley: New York, NY, USA, 2008. [Google Scholar] [CrossRef]
- Maestre Vidal, J.; Sotelo Monge, M.; Martinez Monterrubio, S. EsPADA: Enhanced Payload Analyzer for malware Detection robust against Adversarial threats. Future Gener. Comput. Syst. 2020, 104, 159–173. [Google Scholar] [CrossRef]
- Tedesco, G.; Aickelin, U. Strategic Alert Throttling for Intrusion Detection Systems. In Proceedings of the 4th International Conference on Information Security (WSEAS), Tenerife, Spain, 16–18 December 2005; pp. 246–251. [Google Scholar]
- Corona, I.; Giacinto, G.; Roli, F. Adversarial attacks against intrusion detection systems: Taxonomy, solutions and open issues. Inf. Sci. 2013, 239, 201–225. [Google Scholar] [CrossRef]
- Maestre Vidal, J.; Orozco, A.; Villalba, L. Online masquerade detection resistant to mimicry. Expert Syst. Appl. 2016, 61, 162–180. [Google Scholar] [CrossRef]
- FIRST. Magerit v.3; FIRST: Madrid, Spain, 2020. [Google Scholar]
- FIRST. Common Vulnerability Scoring System (CVSS) v3.1; FIRST: Cary, NC, USA, 2020. [Google Scholar]
- ENISA. Methodology for Evaluating Usage and Comparison of Risk Assessment and Risk Management Items; ENISA: Attiki, Greece, 2020. [Google Scholar]
- Llopis, S.; Hingant, J.; Perez, I.; Esteve, M.; Carvajal, F.; Mees, W.; Debatty, T. A comparative analysis of visualisation techniques to achieve cyber situational awareness in the military. In Proceedings of the 2018 International Conference on Military Communications and Information Systems (ICMCIS), Warsaw, Poland, 22–23 May 2018; pp. 1–7. [Google Scholar]
- Shameli-Sendi, A.; Louafi, H.; Wenbo, H.; Cheriet, M. Dynamic Optimal Countermeasure Selection for Intrusion Response System. IEEE Trans. Dependable Secur. Comput. 2016, 15, 755–770. [Google Scholar] [CrossRef]
- Miehling, E.; Rasouli, M.; Teneketzis, D. A POMDP Approach to the Dynamic Defense of Large-Scale Cyber Networks. IEEE Trans. Inf. Forensics Secur. 2018, 13, 2490–2505. [Google Scholar] [CrossRef]
- Llansó, T.; McNeil, M.; Noteboom, C. Multi-Criteria Selection of Capability-Based Cybersecurity Solutions. In Proceedings of the 52nd Hawaii International Conference on System Sciences, Maui, HI, USA, 8–11 January 2019. [Google Scholar]
- Chin, J.; Diehl, V.; Norman, K. Development of an instrument measuring user satisfaction of the human-computer interface. In Proceedings of the SIGCHI conference on Human factors in computing systems, Washington, DC, USA, 15–19 May 1988; pp. 213–218. [Google Scholar]
Feature | Description | Classes
---|---|---
Analysis of CTI for update/feasibility | Accuracy of the CTI assessment when updating/upgrading, and of the deletion of CTI linked to false positives (FPs) | Binary: an action may be needed/non-needed. Multiclass: the kind of action.
Threat/alert grouping | Accuracy when grouping threats and/or alerts | Binary: a threat/alert may be grouped/non-grouped. Multiclass: the particular group/cluster to which they are associated.
CI threat/risk detection | Accuracy when identifying CI threats and/or risks | Binary: a CI situation may entail a threat/risk, or not. Multiclass: the particular group/cluster to which they are associated.
Mission threat/risk detection | Accuracy when identifying MI threats and/or risks | Binary: an MI situation may entail a threat/risk, or not. Multiclass: the particular group/cluster to which they are associated.
Threat/risk prediction | Accuracy when forecasting threats/risks in the requested time horizons | Binary: a sequence of observations may derive in a threat/risk at time horizon t [75]. Multiclass: the kind of threat/risk in which the sequence may derive.
CI to MI propagation | Accuracy when inferring MI threats/risks from CI threats/risks | Binary: a CI threat/risk may derive an MI threat/risk, or not. Multiclass: the kind of MI threat/risk derived from a CI threat/risk.
Anomaly recognition | Accuracy when discriminating discordant observations | Binary: an observation is tagged as discordant, or not. Multiclass: the kind of anomaly under which an observation may be tagged.
CKT recognition | Accuracy when recognizing cyber key terrains (CKT) | Binary: a cyber asset is tagged as CKT, or not. Multiclass: the kind of CKT under which a cyber asset is tagged.
Multi-step attack recognition | Accuracy when estimating the attacker’s next steps | Binary: further intrusion steps may be derived from a sequential attack, or not. Multiclass: the kind of attack steps inferred from the current status of a multi-step attack.
Attack path evaluation | Accuracy when inferring each attack path | Binary: an attack path is inferable from a certain attack, or not. Multiclass: the kind of attack path inferred from a certain attack.
Self-protection issues | Accuracy when detecting non-AuthZ/AuthN actions, user inactivity, etc. | Binary: an observation is tagged as a potential self-protection issue, or not. Multiclass: the kind of self-protection issue in which a certain situation derives.
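The binary/multiclass framing of each feature above maps directly onto standard confusion-matrix metrics. The following sketch illustrates how a verification harness could score a binary detection feature (accuracy, precision, recall, F1) and the per-class accuracy of its multiclass variant; the helper names and label strings are illustrative choices, not part of the proposed methodology.

```python
from collections import Counter

def binary_detection_metrics(y_true, y_pred, positive="threat"):
    """Confusion-matrix metrics for a binary feature, e.g.,
    'a CI situation may entail a threat/risk, or not'."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = len(y_true) - tp - fp - fn
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": (tp + tn) / len(y_true),
            "precision": precision, "recall": recall, "f1": f1}

def multiclass_accuracy(y_true, y_pred):
    """Overall and per-class accuracy for a multiclass feature,
    e.g., 'the kind of threat/risk'."""
    overall = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    per_class = {}
    for cls in Counter(y_true):
        idx = [i for i, t in enumerate(y_true) if t == cls]
        per_class[cls] = sum(y_true[i] == y_pred[i] for i in idx) / len(idx)
    return overall, per_class
```

For instance, scoring a detector that raised one false alarm over four observations yields an accuracy of 0.75 with full recall, exposing the precision loss that the evaluation should register.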
ID | Statement | 1 | 2 | 3 | 4 | 5
---|---|---|---|---|---|---
Usability | | | | | | |
U1 | Using the CSA tool would enable me to accomplish tasks more quickly and effectively | | | | | |
U2 | Using the CSA tool would improve my job performance | | | | | |
U3 | Using the CSA tool would make it easier to do my job | | | | | |
U4 | I would find the CSA tool useful in my job | | | | | |
Ease of use | | | | | | |
E1 | Learning to operate the CSA tool would be easy for me | | | | | |
E2 | It would be easy to get the CSA tool to assist in the fulfilment of tasks | | | | | |
E3 | My interaction with the CSA tool would be clear and understandable | | | | | |
E4 | I would find the CSA tool flexible to interact with | | | | | |
E5 | It would be easy for me to become skillful at using the CSA tool | | | | | |
E6 | I would find the CSA tool easy to use | | | | | |
Terminology and System Information | | | | | | |
T1 | I find the use of terms and the concept of operation throughout the system adequate | | | | | |
T2 | I find the terminology related to missions, tasks, incidents and courses of action adequate | | | | | |
T3 | The position of messages on screen is intuitive | | | | | |
T4 | The risk levels and their potential propagation are displayed comprehensively and according to their criticality | | | | | |
T5 | The CSA tool informs about its internal processes non-intrusively and as requested | | | | | |
T6 | Alerts are properly emphasized | | | | | |
Functionality | | | | | | |
F1 | I found the various functions in the CSA tool well integrated | | | | | |
F2 | The CSA tool brings a clear picture of the cyberspace | | | | | |
F3 | The CSA tool brings a clear picture of the relationship between cyberspace and planned/ongoing missions | | | | | |
F4 | The CSA tool brings comprehensive and effective support to course-of-action identification, selection and planning | | | | | |
F5 | The integration of the CSA tool with external data sources (SOC, NOC, CTI, mission planners) is properly operative | | | | | |
F6 | The integrated analytical capabilities (simulation, diagnosis, prediction, etc.) are effective | | | | | |
F7 | The CSA tool is self-protected and implements a consistent auditing system | | | | | |
Satisfaction | | | | | | |
S1 | I am satisfied with the CSA tool | | | | | |
S2 | I would like to use the CSA tool frequently | | | | | |
S3 | I would recommend the CSA tool to my team | | | | | |
S4 | The CSA tool works the way I want it to work | | | | | |
S5 | I would need the CSA tool for my daily tasks | | | | | |
S6 | I am very confident with the use of the CSA tool | | | | | |
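A minimal sketch of how the 1–5 ratings collected with the questionnaire template could be aggregated per evaluation dimension. Grouping items by their ID prefix and normalising means to a 0–100 scale are our illustrative choices for analysis, not requirements of the template.

```python
from statistics import mean

def score_questionnaire(responses):
    """Aggregate 1-5 Likert ratings per evaluation dimension.

    `responses` maps item IDs (U1..U4, E1..E6, T1..T6, F1..F7, S1..S6)
    to a list of ratings, one per evaluator.
    """
    dimensions = {"U": "Usability", "E": "Ease of use",
                  "T": "Terminology and System Information",
                  "F": "Functionality", "S": "Satisfaction"}
    scores = {}
    for prefix, name in dimensions.items():
        # Pool every rating given to any item of this dimension.
        ratings = [r for item, rs in responses.items()
                   if item.startswith(prefix) for r in rs]
        if ratings:
            avg = mean(ratings)
            # Map the 1-5 mean linearly onto a 0-100 scale.
            scores[name] = {"mean": avg, "score_0_100": (avg - 1) / 4 * 100}
    return scores
```

With two evaluators rating U1 as 5 and 4, and U2 as 3 and 4, the Usability mean is 4.0, i.e., a normalised score of 75; dimensions with no rated items are simply omitted.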
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Llopis Sanchez, S.; Sandoval Rodriguez-Bermejo, D.; Daton Medenou, R.; Pasqual de Riquelme, R.; Torelli, F.; Maestre Vidal, J. Tackling Verification and Validation Techniques to Evaluate Cyber Situational Awareness Capabilities. Mathematics 2022, 10, 2617. https://doi.org/10.3390/math10152617