The first step towards building a security monitoring platform tailored to the needs of a specific software company is to identify and document its actual needs. This can be achieved by consulting the employees of the targeted company in order to retrieve information about the security monitoring activities that they already employ during their SDLC, which security monitoring activities they consider more important, and which novel security monitoring mechanisms are most suitable to be added to their pipelines/workflows. This information can be gathered through dedicated surveys, focus groups, and active communication with the company on a consultation basis.
In the present work, we have selected an actual company, namely Onelity, as a case study for demonstrating our approach. Onelity is an IT Services Provider working actively for leading Automotive, Telecommunications, Financial and e-Commerce organizations in Europe. It collaborates with its customers to turn their digital visions into results through its regional offices in Germany, Greece, and Cyprus. It was founded in 2020 by a team of highly qualified professionals, having more than 20 years of international experience in the field of Information Technology. It supports and provides customized turnkey solutions in mid- and long-scale projects all around Europe by using the latest technologies, and a full toolbox of frameworks and systems. It also provides the most advanced and up-to-date training programs in the market. At the time of writing, Onelity (hereafter referred to as Company) employs more than 50 highly skilled professionals.
In order to gather the required information, a survey and a focus group were conducted with employees of the Company. In particular, 16 key employees of the Company were selected as subjects of the study, acting as the participants both in our survey and in the focus group. We ensured that all of the selected participants are involved in the SDLC of the core software products developed by the Company, in roles ranging from software developers to project managers. We also ensured that multiple people from each role would be involved in our study in order to further avoid potential bias.
Two-Step Survey
Initially, a two-step survey was conducted with the 16 participants of the Company in order to gather the required information. The division of the survey into two sequential steps was necessary: we first needed to understand the actual needs of the Company, i.e., which novel security monitoring mechanisms should be deployed into their pipelines, and then, based on the identified needs, to gather the dedicated information necessary for properly tuning/configuring the selected mechanisms so that they better fit the Company’s pipelines.
Step 1: The purpose of the first step of the survey was to (i) identify the current state of the Company with respect to security monitoring, (ii) identify their needs with respect to further security monitoring activities that should be applied during their SDLC, and (iii) choose the most suitable security monitoring mechanisms that should be added to their pipelines/workflows. To gather this information, as will be explained later in more detail, a dedicated questionnaire was constructed and shared with the participants, as presented in
Table 2. Initially, the first two parts of the questionnaire (i.e., Part 1 and Part 2 in
Table 2) were constructed and shared with the participants. In brief, the following broader questions (which were mapped into more concrete questions in the actual questionnaire) had to be answered by the participants:
What is your role in the company and how many years of experience do you have?
Does your company employ a secure SDLC in the projects in which you are involved?
What pro-active and re-active security testing and monitoring activities do you employ during the SDLC?
In which security activity does your company most need to invest in the future?
The responses to these questions were based on multiple-choice answers, allowing the participants to select among predefined options. Since we wanted to give the participants freedom and gather as much useful information as possible from their side, most of these questions also provided an “Other” option, allowing the participants to give an answer different from those provided, if they considered it necessary. This was decided because we did not want to risk limiting (or potentially directing) the participants’ responses to specific answers. In addition, the increased freedom in the responses was necessary for this step, as it allowed us to better design the rest of our survey and collect useful information for the requirements elicitation process in the next steps.
Step 2: The purpose of the second step of the survey was to gather information necessary for properly tuning/configuring the most suitable security monitoring mechanisms (as identified based on the answers of the first step), so that they are tailored to the pipelines/workflows and needs of the Company. In particular, the responses of the first step of the survey were analyzed and the main security monitoring mechanisms of interest were identified. As will be discussed later in the text, the participants showed great interest in security monitoring solutions that are based on static analysis, particularly in quantitative security assessment (QSA) and vulnerability prediction models (VPMs). Hence, the questionnaire (see
Table 2) was updated by adding two additional parts (i.e., Part 3 and Part 4) with questions necessary for tuning these models in the future. The broader questions that were asked in these two parts are presented below:
How important is the characteristic of ≪Security_Characteristic≫ for a software application (compared to the other characteristics)?
According to your expertise, which Security Characteristics are significantly affected by ≪Security_Issue≫?
The purpose of these questions is (i) to identify the main security aspects (i.e., characteristics) that are of high interest for the projects developed by the Company, as well as their relative importance, and (ii) to identify the main security issues that they face in these projects along with their impacts on important security aspects. In contrast to the answers to the questions of Step 1, which offered much freedom to the participants, the answers to the questions of Step 2 were provided either on a 5-point Likert scale or based on a list of predefined choices. The answers in this step had to be strictly defined since (i) they refer to standardized security terms, and (ii) the responses would be used for configuring the parameters of the models (see
Section 3.3). Therefore, the consistency and correctness of the responses need to be ensured. The selected security characteristics are retrieved from ISO/IEC 25010 [
42], whereas the Security Issues are retrieved from the Top 25 list of MITRE’s Common Weakness Enumeration (CWE) (
https://cwe.mitre.org/top25/). The values of the ≪Security_Characteristic≫ and ≪Security_Issue≫ that were selected are shown in
Table 2, along with the summary of the final questionnaire.
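To make the role of these responses more concrete, the following minimal sketch illustrates how 5-point Likert answers from Part 3 could, in principle, be aggregated into relative importance weights for the security characteristics. The data layout, function name, and normalization scheme shown here are illustrative assumptions and are not the actual VM4SEC implementation, which is described in Section 3.3.

```python
from collections import defaultdict

# Hypothetical Part 3 responses: one dict per participant, mapping each
# ISO/IEC 25010 security characteristic to a 1-5 Likert score.
responses = [
    {"Confidentiality": 5, "Integrity": 4, "Availability": 5, "Authenticity": 3},
    {"Confidentiality": 4, "Integrity": 5, "Availability": 4, "Authenticity": 2},
    # ... one entry per participant (16 in total in the actual survey)
]

def characteristic_weights(responses):
    """Average the Likert scores per characteristic and normalize them so
    that the resulting weights sum to 1 (illustrative scheme only)."""
    totals = defaultdict(float)
    for answer in responses:
        for characteristic, score in answer.items():
            totals[characteristic] += score
    averages = {c: total / len(responses) for c, total in totals.items()}
    norm = sum(averages.values())
    return {c: avg / norm for c, avg in averages.items()}

print(characteristic_weights(responses))
# e.g. {'Confidentiality': 0.281, 'Integrity': 0.281, 'Availability': 0.281, 'Authenticity': 0.156}
```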
Questionnaire: As already stated, as an instrument for gathering the required information from the 2-step survey, we opted for a questionnaire. The most important part of developing a questionnaire is the selection of questions. In our survey, this process was governed by the guidelines provided by Kitchenham and Pfleeger [
43]: (a) keep the number of questions low, (b) questions should be purposeful and concrete, (c) answer categories should be mutually exclusive, and (d) the number, the order, and the wording of questions should avoid biasing the respondent. To this end, we constructed a questionnaire with 25 questions, organized into four parts (see
Table 2).
As already stated, the first two parts were used in the first step of the survey, whereas the latter two were used in the second step. The questionnaire containing only the first two parts was initially distributed to the participants, who were asked to provide their answers. An initial analysis of their answers was conducted, and the main security monitoring mechanisms that should be deployed in the SDLC of the Company’s products were identified. Then, the questionnaire was updated by adding the remaining two parts (i.e., Part 3 and Part 4), including questions specifically crafted for collecting the information required for configuring/tuning the selected security monitoring mechanisms (see
Section 3.3). The updated questionnaire was distributed to the participants, who were asked to provide answers to the questions of the remaining two parts. It should be noted that, at the beginning of each part, the questionnaire introduced the participants to the relevant security terms, in order to ensure that they had a clear understanding of the terms and the questions.
Core Findings
In the present section, we provide the core findings of the previously described two-step survey and focus group. For reasons of brevity, only the main findings that were considered important for the design and development of the VM4SEC platform are presented and summarized.
In
Figure 2 and
Figure 3, we provide the demographics of the participants that were involved both in the two-step survey and in the focus group, as gathered by the questionnaire. As can be seen in
Figure 2, half of the participants (i.e., 50%) were developers, followed by Quality Assurance (QA) Engineers (19%) and project managers (13%), while the remaining 18% was equally distributed (i.e., 6% each) among software engineers, software architects, and DevOps engineers. Hence, we have representatives from the whole SDLC, whereas the main body of the responses stems from people who are actively involved in the actual development of the software projects. In addition to this, as can be seen in
Figure 3, 56.3% of the participants have less than 5 years of working experience, while 12.5% have more than 10 years of working experience.
SURVEY: Figure 4 presents the results of Q2.1 regarding the type of security testing (i.e., pro-active or re-active) the participants use during their SDLC. As can be seen, the vast majority of the participants stated that they use security testing approaches in their projects in order to enhance security. Among the re-active approaches (Q2.2), vulnerability patching and the installation of firewalls are the most widely used, followed by attack detection techniques and honeypots, as shown in
Figure 5. An interesting observation is that around 31% of the respondents said that they do not normally apply re-active approaches in their projects. This was an engaging finding that was marked as a point for discussion in the focus group, in order to better understand why no re-active approaches are adopted in some projects. As discussed later in the focus group, these participants stated that they did not use re-active security approaches because the software products that they were working on did not have security concerns, and therefore the installation of security countermeasures was not considered necessary.
In
Figure 6, the results of Q2.3 with respect to the kind of pro-active approaches used by the Company during the SDLC are presented. As can be seen, around 69% of the participants stated that they employ dynamic security testing, followed by static testing, which was selected by around 31% of the participants. Around 19% of the participants stated that they do not utilize any pro-active security testing approach, either static or dynamic, during the overall development process. Similarly to Q2.2, as revealed during the focus group, the reason for not using pro-active security testing approaches was that the software products in question were not security-critical.
In Q2.4, the participants were asked to declare which of the pro-active security approaches they consider most useful and, therefore, which should receive more attention from the development team. As can be seen in
Figure 7, around 75% of the participants stated that static analysis is considered the most promising security testing technique during the coding phase and, therefore, deserves more attention from the Company. This contrasts with the low utilization of static analysis for security purposes by the Company, which was observed in the responses to Q2.3. An interesting topic for discussion, which was left for the focus group, was to understand whether the participants selected static analysis because they find it genuinely useful and helpful for adding security to their systems, or because they recognize that it is not widely used in their pipelines.
In Q2.5, the participants were asked which novel security monitoring mechanisms (from a given set) they consider to be interesting and with practical value, in order to be included in their pipelines. A summary of their responses is illustrated in
Figure 8. As can be seen, the static-analysis-based security monitoring mechanisms, namely quantitative security assessment (QSA) and vulnerability prediction models (VPMs), were recognized as the most valuable ones, receiving around 88% and 94% of the votes, respectively.
By analyzing the responses presented above, we reached the conclusion that the participants are more interested in security monitoring mechanisms that are based on static code analysis, and particularly on the QSA and VPMs mechanisms. Based on this observation, as already discussed in “Two-Step Survey”, the questions of Part 3 and Part 4 of the questionnaire were properly defined (see
Table 2), in order to collect information necessary for configuring/tuning those mechanisms.
For reasons of brevity, we provide only the main findings of these questions. According to the participants’ responses, the security characteristics of Confidentiality, Availability, and Integrity were considered the most important security aspects of their software products. In addition to this, the security issues that greatly affect each of these security characteristics were identified. The most critical security issue that was identified was the Null Pointer Dereference.
More information on how these results were leveraged is provided in
Section 3.3, where the models are constructed. As can be seen also by inspecting
Table 2, the questions of Part 3 and Part 4 of the questionnaire are meant for gathering statistics, which we further processed in order to configure the core security monitoring mechanisms that were selected and described in
Section 3.3. Illustrating the “raw” charts would take up much space without providing any added value to the discussion.
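Analogously to the characteristic weights, the Part 4 responses, which record whether a given security issue significantly affects a given characteristic, can be summarized as an issue-by-characteristic impact matrix. The sketch below is an illustrative assumption of how such an aggregation might look; the concrete aggregation used for configuring the models of Section 3.3 may differ, and all identifiers and example values are hypothetical.

```python
# Hypothetical Part 4 responses: each participant marks, for every CWE entry
# in the questionnaire, which security characteristics it significantly affects.
part4_responses = [
    {"CWE-476": ["Availability"], "CWE-89": ["Confidentiality", "Integrity"]},
    {"CWE-476": ["Availability", "Integrity"], "CWE-89": ["Confidentiality"]},
    # ... one entry per participant
]

def impact_matrix(responses, characteristics):
    """Fraction of participants that flagged each (issue, characteristic) pair."""
    matrix = {}
    for answer in responses:
        for cwe, affected in answer.items():
            row = matrix.setdefault(cwe, {c: 0 for c in characteristics})
            for c in affected:
                row[c] += 1
    return {cwe: {c: count / len(responses) for c, count in row.items()}
            for cwe, row in matrix.items()}

characteristics = ["Confidentiality", "Integrity", "Availability"]
print(impact_matrix(part4_responses, characteristics))
# e.g. {'CWE-476': {'Confidentiality': 0.0, 'Integrity': 0.5, 'Availability': 1.0}, ...}
```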
FOCUS GROUP: As already stated, the survey was followed by a focus group. As the above description makes clear, many interesting observations were made during the analysis of the questionnaire responses, which led to additional questions that had to be discussed further in order to gain more insight. Based on the process described previously, the focus group was conducted. The main observations from each block of the focus group are presented in what follows.
Block 1: The vast majority of the participants stated that they prefer static analysis over dynamic analysis for security testing. This is in line with the results of the first step of the survey, in which static code analysis was recognized by the participants as one of the most interesting security testing activities that they should employ during the SDLC (see
Figure 7). When asked why they prefer static analysis, the most common reason was its ability to highlight a security issue along with its location in the source code, followed by the high degree of automation of the approach and its ability to be applied before the code can be executed or even compiled. This is in line with the results of other popular surveys on the usefulness of ASA [
3]. When asked about the shortcomings of static analysis, the main shortcoming that was reported was the large volume of alerts that it produces, which is often difficult to manage. Equally important was the lack of interpretability of the results. The participants, especially the project managers, expressed the difficulty that they face in understanding the security information that resides in these alerts, due to their raw format. This was considered the main reason for the limited adoption of such approaches in practice, despite their acknowledged benefits. All the participants agreed that post-processing tools able to extract useful information from the raw alerts produced by static analysis are highly useful and of practical importance.
It should also be noted that the results of the focus group were in line with the results of the survey with respect to the applied re-active and pro-active approaches. In fact, security patching and firewalls were the most widely used re-active approaches, whereas, with respect to the pro-active approaches, dynamic testing was more frequently used than static testing. The participants who said that they do not use security testing approaches were actually working on software applications with no security considerations.
Block 2: None of the participants were fully aware of the discussed trends in the field of software security monitoring. Hence, the discussion was then directed to the four highly popular novel security monitoring/testing techniques, two from dynamic and two from static analysis (see
Table 3). A brief description of each one of those four mechanisms was provided in order to ensure that the participants had a sufficient understanding of its purpose and functionality. The vast majority of the participants expressed their interest in the
quantitative security assessment and
vulnerability prediction models, since they do not have similar tools in their pipelines and believe that such mechanisms would provide useful insights during development. In addition to this, the majority of the participants did not consider the utilization of ML-based fuzzing and penetration testing useful for their pipelines, since they already apply common fuzzing and penetration testing tools and consider them already accurate and sufficient for their needs. These outcomes are in line with the answers that they provided during the survey through the questionnaires. However, the focus group allowed us to understand the reasoning behind this selection. In particular, it explains why the participants ranked dynamic security testing so low in Q2.4: they consider it a traditional approach that is already part of their process and therefore of limited additional interest.
Block 3: Apart from the core functionalities, the platform should also provide additional features that are considered important and useful by the software engineers. The questions of the third block of the focus group helped us identify these requirements. In brief, the participants (in fact, the software engineers involved in the development of the Company’s software) commonly expressed the need for the platform to access the source code directly from version control systems such as GitHub, Bitbucket, and GitLab, as these are the repositories in which their projects reside. In addition to this, there was a consensus on the need for a GUI able to visualize the results. Furthermore, several non-functional requirements were determined, such as acceptable downtime, analysis speed, etc. The information collected through the focus group was highly useful for the requirements elicitation and use case definition of the VM4SEC platform, a process described in the next section.
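As an illustration of how the Block 3 requirements could translate into concrete platform settings, the sketch below captures them as a simple configuration object. All field names, defaults, and thresholds are hypothetical placeholders rather than the actual VM4SEC configuration schema.

```python
from dataclasses import dataclass

@dataclass
class PlatformConfig:
    """Illustrative configuration derived from the Block 3 requirements
    (field names and defaults are hypothetical, not the VM4SEC schema)."""
    repository_url: str                      # GitHub / Bitbucket / GitLab project to analyze
    vcs_provider: str = "github"             # version control system hosting the code
    max_analysis_minutes: int = 30           # upper bound on analysis time per run
    max_monthly_downtime_hours: float = 4.0  # acceptable downtime for the platform
    enable_web_gui: bool = True              # GUI for visualizing the assessment results

config = PlatformConfig(repository_url="https://github.com/example-org/example-project")
print(config)
```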
Block 4: In the fourth part of the focus group, emphasis was given to the main security aspects/characteristics of the software applications that are actively developed by the Company, the security issues that they normally face, and how these issues affect the various security characteristics. All of the participants agreed that the characteristics of Confidentiality, Integrity, and Availability are the most critical security aspects that should be satisfied, at minimum, by any software project under development. This is in line with what was observed in the third part of the questionnaire. Then, for each of these three security aspects, the most critical security issues that may affect them were identified. The identified critical issues are in line with what was retrieved by the questionnaire, further enhancing our confidence in the reliability of the responses and, in turn, in the correctness of the final model parameters. Minor inconsistencies were discussed and appropriate updates were made to the ranked list when necessary.
Main Takeaways: In summary, through the discussion carried out during the focus group and the responses that were provided through the questionnaires, we observed that the employees of the Company consider static analysis as an important mechanism for identifying security issues during the development process. We also identified a need for novel mechanisms able to post-process the static analysis results in order to extract security-related information that is encapsulated in them. Finally, the participants believe that they sufficiently cover dynamic security testing through penetration testing and fuzzing, and that the utilization of ML to further improve them is not critical. Among the presented novel security mechanisms, static-analysis-based quantitative security assessment and vulnerability prediction were found to be the most interesting mechanisms that they would like to add to their pipelines.
To this end, we decided to build a static-analysis-based security monitoring platform, namely the VM4SEC platform, which is able to post-process the results of static analysis in order to conduct quantitative security assessment and vulnerability prediction.
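As a rough illustration of the kind of post-processing such a platform performs, the following sketch combines hypothetical static analysis alerts with the survey-derived characteristic weights and impact matrix to compute a single aggregate severity figure. The alert format, the scoring formula, and all identifiers are assumptions made for illustration only; the actual QSA and VPM models are defined in Section 3.3.

```python
def aggregate_security_score(alerts, weights, impacts):
    """Weight each static analysis alert by how strongly its CWE category affects
    the characteristics the Company cares about (illustrative formula only)."""
    score = 0.0
    for alert in alerts:  # e.g. {"cwe": "CWE-476", "file": "App.java", "line": 42}
        impact_row = impacts.get(alert["cwe"], {})
        score += sum(weights.get(c, 0.0) * impact_row.get(c, 0.0) for c in weights)
    return score

# Hypothetical inputs: alerts from a static analyzer plus survey-derived parameters.
alerts = [
    {"cwe": "CWE-476", "file": "App.java", "line": 42},
    {"cwe": "CWE-89", "file": "Dao.java", "line": 117},
]
weights = {"Confidentiality": 0.4, "Integrity": 0.35, "Availability": 0.25}
impacts = {
    "CWE-476": {"Availability": 1.0, "Integrity": 0.5},
    "CWE-89": {"Confidentiality": 1.0, "Integrity": 0.9},
}
print(aggregate_security_score(alerts, weights, impacts))  # higher = more security-critical
```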
At this point, a remark about the importance of the focus group is necessary. From the above analysis, it is clear that the focus group helped us better understand the reasoning behind the participants’ responses and gain better insights into the current state and needs of the Company. For instance, we were able to understand how important static analysis is considered by the employees of the Company, along with their willingness to utilize it more actively in their workflows via mechanisms that improve their experience with static analysis tools and help them gain deeper insight from the raw static analysis results. We also realized that the low interest in investing in dynamic security testing techniques that was reported in the questionnaires was not because the employees consider such techniques less important, but because they have already deployed such tools in their pipelines, consider them mature enough, and see no need for further investment in dynamic testing. Making these observations would not have been possible by simply analyzing the responses to the questionnaires, which could have led us to wrong conclusions. Hence, we consider a follow-up focus group a necessity for gaining correct feedback from the employees.