1. Introduction
The ubiquitous prevalence of Social Networks (SNs) in modern societies, which are among the most popular cloud computing services worldwide, has dynamically transformed not only the field of communication, but also many other socio-economic domains, such as relationship maintenance, entertainment, social interaction, self-representation, professional activities, and e-governance [
1]. This intensification of individual and social activities in several domains within SNs promotes complexity and blurs the boundaries between public and private life [
2]. This occurs because effective use of SNs requires users to provide a great amount of personal information, which is then analyzed and used by SN providers [
3]. Furthermore, since SNs also transcend spatiotemporal boundaries, they diffuse users’ personal and sensitive information more widely than other information systems [
4]. As [
5] maintain, this ubiquity of information analysis and distribution affects users’ social norms of privacy. In this respect, [
6] emphatically argues that a new battleground has emerged between individuals and service providers, “a new kind of information war” (p. 64), over the access, collection, storage, processing, and disclosure of users’ personal information. The structure of SNs challenges the concept of privacy [
7,
8,
9], which, over the past years, has been defined through different, but often overlapping, frameworks and notions [
10,
11]. Therefore, SNs challenge users’ social privacy norms and their respective practices. Under this “war” and these challenges to the notion of privacy, it has been recognized that users’ privacy is not adequately protected within SNs. Furthermore, users’ complex privacy concerns and needs, across the different contexts in which they use SNs, are not well considered [
12]. Additionally, although SN providers have introduced several privacy and security measures that give users the sense of controlling their information, in practice only the providers have this ability [
13].
Previous research has also highlighted that users differ significantly in their privacy management strategies within SNs, raising questions about how to support such broad privacy concerns and needs in a more user-centered way. In Europe in particular, this issue is even more pressing given the enforcement of the General Data Protection Regulation (GDPR), which promotes the safeguarding of users’ privacy not only at a socio-legal layer, but also at a technical one, supporting the implementation of the principle of Privacy by Design (PbD). PbD, first introduced by [
14], aims to be a holistic and human-oriented approach for implementing technical privacy measures, offering realistic solutions [
15]. According to the GDPR, data controllers and processors, obviously including SN providers, are obligated to deploy appropriate technical measures to ensure the protection of data subjects’ rights [
16]. In this regard, providing SN users with control over their information could be a realistic solution to several privacy issues that derive from users’ willingness to disclose information, while preventing unauthorized access by third parties. From a technical perspective, as [
13] suggests, hosting a user’s information on a constantly available paid server, providing a personal server for each user or a personal virtual machine in a paid cloud, or having personal mobile devices with Internet connectivity act as servers could all be effective solutions for protecting users’ privacy.
To that end, and in order to provide effective, more user-centered privacy protection schemes within SNs, several self-adaptive privacy approaches have been developed [
17]. Self-adaptive privacy aims to protect users’ privacy through the development of holistic user models that attend to their socio-contextual and technological frames of action [
18]. For instance, [
19] developed the user-tailored privacy-by-design framework, which addresses the types of privacy adaptations that should be implemented so that Facebook users with different privacy management strategies can be supported in a more personalized way. However, like other ambitious self-adaptive privacy schemes previously proposed for SNs [
20,
21], their work does not identify users’ social categories and attributes in depth. These attributes shape users’ privacy norms, knowledge of which is of great importance for developers and for the design of adequate privacy solutions [
2], for instance to support users’ authentication, authorization, and the confidentiality of personal information, which are considered among the most significant privacy challenges in Internet of Things environments [
22].
In our previous work [
23], we showed that, for self-adaptive privacy schemes in particular to be effectively designed and deployed, several criteria concerning users’ social and technical context must be satisfied. Self-adaptive privacy systems should be able to protect users’ privacy in changing contexts, either by providing users with recommendations or by carrying out automated actions based on users’ decisions on whether to disclose personal information within their context. In this regard, specific functions should be deployed, such as classified interaction strategies, which connect the system and the user by providing privacy awareness, justification for each privacy decision, and control capabilities. To this end, the monitoring, analysis, design, and implementation of self-adaptive privacy systems should be performed through framework and behavioral models that identify users’ environments and their interconnections with each system. Furthermore, the systems should adapt to the interoperability of the technologies used and to the structure of systems such as SNs, so as to balance automated privacy choices against users’ own choices. This reveals a number of criteria that should be considered in order to reason about privacy from a socio-technical view: (a) the identification of users’ social privacy needs in each context; (b) the identification of all stakeholders’ technical privacy needs; (c) the identification of privacy risks and threats; (d) the indication of users’ sensitive information in each context; (e) the system’s adaptation to the interoperability of technologies; (f) the assessment of the best options among users’ and systems’ privacy choices; and (g) an effective decision-making procedure that balances users’ social and privacy needs.
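To make criteria (a)–(g) more concrete, the following minimal sketch outlines one possible shape of such a decision loop. It is an illustration only, not an implementation from the cited works; all names (DisclosureRequest, assess_risk, the naive risk model itself) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DisclosureRequest:
    """A user's attempt to disclose an item of information in some context."""
    user_id: str
    context: str          # (a) the social context, e.g., "family_group"
    sensitivity: float    # (d) estimated sensitivity of the item, in [0, 1]
    audience_size: int    # proxy for exposure

@dataclass
class SelfAdaptivePrivacyLoop:
    """Illustrative monitor-analyze-decide cycle covering criteria (c), (f), (g)."""
    risk_threshold: float = 0.6   # (c) above this, the system intervenes
    automate: bool = False        # (g) automate protective actions vs. recommend

    def assess_risk(self, req: DisclosureRequest) -> float:
        # (c)/(d) naive risk model: sensitive items shown widely score high
        audience_factor = min(req.audience_size / 100.0, 1.0)
        return req.sensitivity * audience_factor

    def decide(self, req: DisclosureRequest) -> str:
        # (f)/(g) balance the user's disclosure choice against the assessed risk
        if self.assess_risk(req) < self.risk_threshold:
            return "allow"
        return "restrict_audience" if self.automate else "recommend_review"

loop = SelfAdaptivePrivacyLoop()
req = DisclosureRequest("u1", "political_group", sensitivity=0.9, audience_size=500)
print(loop.decide(req))   # -> recommend_review
```

The recommend_review branch corresponds to the classified interaction strategies mentioned above: the system explains its assessment and leaves the final choice to the user.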
Among these criteria, the identification of users’ social characteristics and privacy needs is crucial, since previous works on self-adaptive privacy schemes addressed these only partially or superficially. This identification will provide further privacy protection within SNs, enabling, for instance, the satisfaction of the privacy criteria that [
13] proposed, such as the addition/removal of users to/from an SN group, efficient user key revocation, encryption/decryption efficiency, encryption header overhead, the ability to encrypt for the conjunction/disjunction of SN groups, and the ability to encrypt for an SN user who is not a group member. Therefore, to support this aim, empirical data on users’ social characteristics, specifically within SNs, must be investigated and captured, since these characteristics affect privacy management. For example, safeguarding users’ privacy when they share information concerning multiple users is of major importance [
24]. The lack of empirically identified social parameters that influence individuals’ privacy behaviors within SNs highlights a major challenge in addressing self-adaptive privacy in a systematic, user-centric way. Since SNs, and cloud services in general, are becoming more customized and context-aware, it is increasingly necessary to understand which of users’ social attributes matter for SN use, as well as what their specific privacy behaviors are.
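As a rough illustration of the first two of these criteria, member addition/removal and key revocation, the toy sketch below rotates a shared group key whenever membership changes, so that removed members can no longer read newly encrypted content. This is our simplification, not the scheme of the cited work: a realistic SN scheme would rely on broadcast or attribute-based encryption to keep re-keying cost and header overhead low, and all names here are hypothetical.

```python
import os

class GroupKeyManager:
    """Toy group-key rotation illustrating member add/remove and key revocation."""

    def __init__(self) -> None:
        self.members: set[str] = set()
        self.key_version = 0
        self.group_key = os.urandom(32)   # symmetric key protecting group content

    def _rotate(self) -> None:
        self.group_key = os.urandom(32)
        self.key_version += 1

    def add_member(self, user_id: str) -> None:
        self.members.add(user_id)
        self._rotate()   # new members cannot decrypt content shared before joining

    def remove_member(self, user_id: str) -> None:
        self.members.discard(user_id)
        self._rotate()   # revoked members cannot decrypt content shared after leaving

mgr = GroupKeyManager()
mgr.add_member("alice")
mgr.add_member("bob")
mgr.remove_member("bob")
print(mgr.key_version, sorted(mgr.members))   # -> 3 ['alice']
```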
In this regard, this paper, as part of our study on identifying the socio-technical requirements for the design of self-adaptive privacy-aware cloud-based systems, presents the crucial issues of leveraging knowledge about users’ specific individual and social factors and of identifying relevant determinants of privacy practices within SNs. We argue that this identification will provide input for the design of usable, self-adaptive privacy within SNs. To achieve this and to capture these data, a survey was administered to the academic and administrative staff of the University of the Aegean in Greece. The survey was implemented by distributing an interdisciplinary measurement instrument that we introduced in a previous work [
25]. The instrument embodies validated metrics from both the privacy and the sociological literature and promotes the thorough identification of users’ social landscape and of their privacy perceptions and behaviors within SNs. The results of this examination deepen the understanding of users’ digital and social privacy interests and boundaries, enabling future research to match them with technical privacy affordances, so as to balance users’ need to preserve personal information against their need to disclose such information within SNs. The rest of the paper is organized as follows.
Section 2 presents our research methodology, data collection and sampling, as well as the instrument that was distributed and its measures.
Section 3 presents the results regarding users’ social attributes and privacy perceptions and management within SNs, while
Section 4 discusses the results. Finally,
Section 5 concludes our work and poses future research directions.
2. Materials and Methods
Privacy, as a multifaceted concept, presupposes different privacy perceptions and management strategies for different users in SNs, which differ even more because of their diverse socio-contextual backgrounds. Due to this complexity, privacy is often difficult to measure in a way that reflects users’ social privacy norms and thus leads to the development of proper technical privacy measures for managing privacy issues within social networks in a self-adaptive way [
26]. Despite the fact that users’ social landscape is of vital importance for understanding their privacy perceptions and behaviors, most previous works on users’ privacy management [
27,
28], include social factors in their measurement scales only fragmentarily, consequently addressing privacy as a one-layered construct. To meet that need, the measurement instrument we developed adopted constructs and their respective metrics from both the sociological and the privacy literature, aiming to capture rich information about users’ social attributes and privacy management within SNs. As far as the sociological literature is concerned, special emphasis was put on social identity theory and social capital theory metrics, since previous research has highlighted that both social identity and social capital influence users’ privacy management within SNs [
29,
30].
Social identity refers to the ways that individuals determine their attitudes and practices in each domain of activity, based on specific social attributes, which represent their personal social categories and characteristics [
Furthermore, previous literature suggests that individuals are more likely to gain resources when they belong to many groups [
32], and in this sense, users’ social identity affects their social capital. Social capital captures the gains and the advantages that individuals obtain by participating in networks and social institutions [
33]. Previous literature identifies two types of social capital that affect privacy management: bonding and bridging [
34]. Bonding social capital concerns the development of cohesive ties among individuals within tight networks, such as family or close friends, who experience similar situations and exchange support and trust. Bridging social capital refers to the development of connective ties among individuals within looser, heterogeneous, and diverse networks, who experience different situations without a common sense of belonging [
33]. Consequently, in contrast to previous research that has examined these social constructs separately, our work examines both, in order to provide a fuller understanding of users’ social landscape. The privacy literature was thoroughly investigated so that validated metrics from previous works on privacy perceptions and management could be adopted in our instrument. Since privacy, apart from the several definitions of its concept, has specific and often descriptive, measurable interactive functions within a society [
26], such as privacy concerns, privacy risks, and privacy behaviors, it was important to incorporate these measures in our instrument. Woo in [
35], for instance, describes individuals’ management strategy of remaining anonymous and untraceable within SNs, not only by withholding personal information altogether, but also by providing untrue information so as not to be visible while online [
36]. The specific privacy metrics are described in detail below. The questionnaire developed for data collection thus comprised three broad sections, concerning users’ social identity, social capital, and privacy management within social networks, along with their respective items. Furthermore, a set of six questions addressing participants’ socio-demographic characteristics, namely gender, age, family form, educational level, professional experience, and monthly income, was included in the last part of the instrument, so as to make good use of the time participants needed to complete it.
2.1. Social Identity
The five items of this section, examining users’ social identity, were adopted from the [
32] work on Online Social Identity Mapping (oSIM), a tool designed to assess the multifaceted aspects of individuals’ social identities. Participants were first asked to indicate the groups to which they belong within SNs, using a social identity taxonomy compatible with the [
31] work. This includes groups such as broad opinion-based groups, leisure groups, family or friendship groups, community groups, sporting or well-being groups, professional groups, health-related groups, or other groups indicated by users. The item high-contact groups concerns how frequently users communicate with the members of the groups they belong to within SNs; participants rated this frequency on a 5-point Likert scale ranging from “not often at all” to “every week”.
For the remaining three items, positive groups, representative groups, and supportive groups, participants were likewise asked to rate, on a 5-point Likert scale, how positively they perceive each of their indicated SN groups, how representative they feel of each of them, and how much support they receive from each one.
2.2. Social Capital
In this section, the two constructs of bonding and bridging social capital are examined. The five items that investigate users’ bonding social capital within SNs, using a 5-point Likert scale, are derived from Williams’ [
37] Bonding Social Capital Scale: (a) “If I needed 100 € urgently, someone in my social network could lend it to me”, (b) “People in my social network could provide good job references for me”, (c) “When I feel lonely, there are several people on SMs I could talk to”, (d) “There are several people on SMs I trust to help solve my problems”, and (e) “I do not know anyone well enough in my SMs network to get him/her to do anything important”.
The five bridging social capital items of our instrument were likewise incorporated from [
34], as the most widely used and validated metrics in previous privacy research. In particular, these are (a) “Interacting with people in my social network makes me want to try new things”, (b) “I am willing to spend time on supporting community activities”, (c) “I meet new people very often”, (d) “Interacting with people in my SMs network makes me feel like a part of a larger community”, and (e) “Interacting with people in my SMs network makes me realize that somehow we are all connected worldwide”.
2.3. Privacy Management
Considering that previous privacy metrics could benefit from expansion in many ways [
28], and that combining their advantages may improve the examination of self-adaptive privacy within SNs, this section consists of nine sub-scales adopted from previous literature. It aims to include as many privacy-related metrics as possible, in order to reflect users’ privacy context within SNs in depth. Participants were asked to rate their agreement on a 5-point Likert scale, ranging from “not at all” to “very much”, for the following sub-scales and their specific items (a scoring sketch follows the list):
Beliefs in Privacy Rights: (a) Users’ right to be left alone, (b) users’ right to use the Internet anonymously, (c) no gathering of disclosed personal information without users’ consent, and (d) users’ right to control their personal information [
38].
Privacy Concerns: (a) I am concerned about my online information being linked to my publicly available offline one; (b) I am concerned that the information I submit on SMs could be misused; (c) I’m concerned that too much personal information is collected by so many SMs; (d) It usually concerns me when I am asked to provide personal information on SMs; (e) I am concerned that others can find private information about me online; and (f) I am concerned about providing personal information on SMs, because it could be used in a way I did not foresee [
28,
39].
Information Collection: (a) It usually bothers me when SMs ask me for personal information, (b) When SMs ask me for personal information, I sometimes think twice before providing it, and (c) It bothers me to give personal information to so many SMs [
40,
41].
Self-disclosure: (a) I frequently talk about myself online, (b) I often discuss my feelings about myself online, (c) I usually write about myself extensively online, (d) I often express my personal beliefs and opinions online, (e) I disclose my close relationships online, and (f) I often disclose my concerns and fears online [
42].
Trusting Beliefs: (a) SMs would be trustworthy in information handling, (b) SMs fulfill promises related to the information provided by me, (c) I trust that SMs would keep my best interests in mind when dealing with my provided information [
41].
Privacy Control: (a) I have control over who can get access to my personal information online, (b) I always optimize my privacy settings when I create an online profile, (c) I consider the privacy policy of SMs where I give out personal information, (d) I would opt out of a service due to privacy issues, and (e) I only upload information online that is suitable for everyone who can see it [
27,
28,
39,
42].
Privacy Awareness: (a) Personal information is of value to SMs providers, (b) I am aware of the privacy issues and practices in SMs, (c) SMs providers do not have the right to sell users’ personal information, (d) I follow the developments about privacy issues and violations within the cloud, (e) I keep myself updated on privacy solutions that law and SMs employ, and (f) I am aware of how to protect my personal information from unauthorized access [
28,
39].
Collaborative privacy management: (a) Prior to disclosing content, my group members and I discuss the appropriate privacy settings, (b) I ask the group members involved for approval before disclosing content, and (c) My group asks for approval before uploading content concerning me [
43].
Self-disclosure/Cost–Benefit: (a) The risk posed to me if personal information is exposed outweighs the benefits of sharing it, (b) In general, my need to obtain SMs services is greater than my concern about privacy, and (c) I value the personalized SMs I receive from providing such personal information [
39].
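As forecast above, the following sketch shows how such multi-item sub-scales are typically reduced to per-respondent scores by averaging the 1–5 ratings of each sub-scale’s items. The item keys and responses are made up for illustration; they are not our survey data.

```python
import statistics

# Hypothetical single respondent: item key -> rating on the 1-5 Likert scale
# ("not at all" = 1 ... "very much" = 5).
responses = {
    "concerns_a": 5, "concerns_b": 4, "concerns_c": 5,
    "control_a": 2, "control_b": 1, "control_c": 3,
}

SUBSCALES = {
    "privacy_concerns": ["concerns_a", "concerns_b", "concerns_c"],
    "privacy_control": ["control_a", "control_b", "control_c"],
}

def subscale_scores(resp: dict[str, int]) -> dict[str, float]:
    """Mean rating per sub-scale for one respondent."""
    return {name: statistics.mean(resp[item] for item in items)
            for name, items in SUBSCALES.items()}

print(subscale_scores(responses))
# -> privacy_concerns ~ 4.67, privacy_control = 2
```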
In general, most previous works tend to focus on the concept of informational privacy, while their metrics usually spotlight specific privacy constructs, such as privacy concerns, risks, trust, and data collection [
28], neglecting users’ socio-contextual attributes. Therefore, they do not provide a socio-technical perspective, despite the fact that security and privacy are major issues for SNs. The authors of [
44] specifically maintain that a focus on privacy requires not only securing users’ personal information and content, but also securing SN communication channels against internal or external attacks, and preventing unauthorized access to users’ communications by third parties through access control mechanisms. In this respect, [
45] argue that the following security and privacy requirements are paramount: identity privacy, location privacy, resistance to node compromise attacks, resistance to layer-removing/adding attacks, forward and backward security, and security against semi-trusted and malicious clouds. Considering these, in order to expand previous works and identify adequate privacy-related requirements for a self-adaptive privacy protection scheme within SNs, the key issue is to examine users’ social and privacy needs. To the best of our knowledge, no measurement scale addressing these issues and focusing specifically on self-adaptive privacy management within SNs has been developed in previous literature. To address this, taking into consideration that existing privacy scales could benefit from expansion in many ways, and that combining the advantages of previous privacy metrics may improve the level of privacy within the cloud, we present our scale. In the following
Table 1, a comparison with these previous measurement scales is presented, indicating that social identity and social capital metrics are not included in other works.
2.4. Sample, Data Collection and Procedure
The academic and administrative staff of the University of the Aegean in Greece was invited to participate in this survey, considering that adults are more likely to participate in many social groups and, due to their age, to hold many social roles. The total research population consisted of 747 members. In total, 123 staff members returned the questionnaire, of which 10 cases were excluded. Therefore, a total sample of 113 participants was included in our survey, yielding a response rate of 15%. Before the distribution of the questionnaire to the research population, the instrument was tested for its form, language, clarity, difficulty, and responsiveness to respondents’ interests in a pilot study addressed to 20 members of the staff, in order to identify possible design problems and to revise items where necessary. The questionnaire was implemented through Google Forms, and its link was sent to the professional e-mail addresses of the staff of the University of the Aegean. The procedure and the purpose of the survey were explained clearly in the online questionnaire’s introductory note, and the ethics considerations were plainly described. The instrument was also tested for its validity and reliability (values of Cronbach’s Alpha index were >0.7 for each section).
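For reference, Cronbach’s alpha for a section is alpha = k/(k − 1) × (1 − sum of item variances / variance of totals), with k the number of items. A minimal sketch of the computation on made-up ratings, not our survey data:

```python
import statistics

def cronbach_alpha(rows: list[list[int]]) -> float:
    """rows: one list of item ratings per participant (same item order)."""
    k = len(rows[0])                           # number of items in the section
    columns = list(zip(*rows))                 # transpose to per-item columns
    item_vars = sum(statistics.variance(col) for col in columns)
    total_var = statistics.variance([sum(row) for row in rows])
    return k / (k - 1) * (1 - item_vars / total_var)

sample = [[4, 5, 4], [2, 2, 3], [5, 4, 4], [3, 3, 2]]   # illustrative data only
print(round(cronbach_alpha(sample), 2))   # -> 0.88; > 0.7 indicates acceptable reliability
```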
4. Discussion
Users’ privacy protection within SNs is an ongoing process, depending on social, legal, and technical aspects [
64]. However, the inadequate bridging of these aspects creates difficulties for an effective privacy protection solution [
65], one that is also compliant with GDPR principles and rules. Towards this, several self-adaptive privacy solutions have been introduced, under either the differential privacy scheme [
66] or the context-adaptive privacy scheme [
67,
68]. However, these approaches were subject to many limitations, since they did not identify users’ social landscape and outlets in depth, and therefore failed to correlate them with appropriate technical privacy solutions [
69]. Thus, as we have discussed in detail in a previous work, since self-adaptive privacy focuses on users’ integrated context, both social and technical, a group of criteria should be satisfied for its optimal design. Among these, the thorough investigation of users’ social norms is a focal one and the first step for eliciting self-adaptive privacy-related requirements. Determining users’ social attributes will provide developers with further knowledge, so that they can not only include social requirements in their designs, but also build the appropriate technical privacy affordances. After all, previous research on self-adaptive privacy has shown that not enough attention is paid to eliciting requirements from the users’ perspective, which would introduce flexibility into self-adaptive systems’ behavior at an early phase of requirements engineering [
70], a prerequisite for implementing the PbD principle. In this regard, our survey highlights the social attributes of a targeted research population within the academic community of a Greek university, namely the academic and administrative staff of the University of the Aegean. It should be noted that, although the utilization and impact of social networks are growing among the populations of academic educational settings [
71], as far as privacy issues in particular are concerned, most previous research focuses on students [
52] rather than on adults. Accordingly, the findings of our survey depict the social landscape of adult SN users, which is considerably different from that of young adults. The majority of our sample is over 30 years old, highly educated, and has many years of professional experience, while, keeping in mind Greece’s financial situation, it has a decent monthly income. All of these social attributes, examined separately by previous research, have been shown to influence users’ privacy perceptions and behaviors [
72,
73], increasing their privacy concerns and leading them to follow fairly strict privacy management strategies when using SNs.
Our findings, while supporting previous evidence regarding privacy concerns and self-disclosure practices, indicate that these attributes do not have the same effect on privacy control, for which a low level was reported. Despite participants’ high cultural and financial capital, privacy control remains an issue of great importance for users, and therefore an appropriate self-adaptive privacy scheme should provide users with control over the information they want to reveal, as argued by [
74]. Furthermore, participants in our survey declare that they belong to several social groups within SNs, the most dominant being the Friends and Family groups. This indicates that they belong to more than one social category, and these categories are significant parts of individuals’ self-concept. This categorization enables them to distinguish their personal boundaries from those of their memberships. Thus, participants’ several social categories create concentrations regarding their “self-interestedness, self-reliance, self-realization and self-determination”, as shown by their low or medium ratings of the positive impact, the importance, and the support that they receive from these groups in SNs. This leads to the development of a particular privatism, which can be associated with many personal perceptions and practices in institutional forms of expression, as [
75] supported, such as SN engagement. This development of privatism is also reflected in participants’ strong beliefs regarding the necessity of protecting users’ privacy rights, as well as in their annoyance at having to provide personal information in SNs when needed.
However, as in previous literature [
30,
76], privacy management issues and several privacy implications arise from participants’ multiple identities, as indicated by the fact that other members of their groups do not seek their approval before uploading the participants’ personal information. It has already been argued that family and close friends group memberships, because they involve the baring of emotional needs, increase risk at various levels. This therefore raises further questions regarding the disclosure of personal information within contexts with indistinguishable private/public boundaries, such as SNs. In this regard, since self-adaptive privacy protection schemes should have the ability to maintain users’ privacy in changing contexts [
77], they should provide users with recommendations about their own disclosure decisions, or specifically about those of their family and friends group members. The specification of these groups is of great importance, since [
78], in a previous work attempting to identify distinct social requirements for privacy issues within SNs by interviewing 15 adolescents (aged 14–18) about their online behaviour on Facebook, indicated the need to make more explicit who the “real generalized others” are. The dominance of the Family, Friends, and Companionship groups characterizes the frequency of participants’ communication within SNs, while it is quite interesting that a great amount of communication takes place among the members of political groups. Especially as far as belonging to political groups is concerned, participants also declare a high level of representativeness and importance. Therefore, the political social group is also among the social categories that should be taken into consideration when designing privacy adaptations. In general, frequent communication contributes to the formation of a social identity, even though some identities might be more primary than others [
79], as is observed in our case regarding political group membership. As [
80] argue, “mediated groups can develop a meaningful and strong sense of identity through interaction”. SNs provide multiple possibilities for interaction and communication. Thus, the technical features of SNs not only alter users’ constructs of their functioning and purpose, but also alter how users actually employ these features to manage privacy while interacting with other users [
2]. The participants, regardless of their group memberships, mostly do not find SNs trustworthy enough in handling such information. In this respect, self-adaptive privacy protection schemes should be adapted to the interoperability of SN technologies and the structure of their systems, considering users’ communication behavior, in order to determine through these communications, within a short operational time frame, the sensitive information that should not be revealed. The [
78] research, which indicates that context collisions deriving from SN segments should be avoided, also supports this. In addition to users’ social landscape and outlets in SNs, a thorough examination of users’ privacy perceptions, behaviors, and personal information flows is needed. Investigating these as well, since previous works have shown that they affect the identification of technical privacy requirements, is also a critical step towards designing an adequate self-adaptive privacy solution.
In our survey, as mentioned above, participants’ privacy concerns were highly rated. Privacy concerns, among other privacy factors, affect not only users’ intentions, but also their actual behaviors [
57]. This indicates why participants invest little in capturing social capital within SNs through self-disclosure, in contrast with previous findings [
81], which show that users willingly provide this information in order to acquire the perceived benefits resulting from their networks. As [
82] argue, the interrelation between disclosure attitudes and gained benefits is not straightforward. Thus, participants rate their privacy concerns higher than their need for SN services, indicating that their privacy calculus is not expected to change after a privacy violation, since they already respond with mechanisms such as the ones [
83] proposed, namely refusal or negativism. In addition, considering that the relationship between self-disclosure and benefits is mediated by the factor of control over information [
84], it was not surprising that participants indicated a low degree of control over their information.
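Schematically, this privacy calculus can be written as a simple trade-off. The inequality below is our illustrative rendering of the cited argument, not a formula stated in those works, with B the perceived benefits of disclosure, R the perceived risks, and C ∈ [0, 1] the perceived control that mediates the relationship:

```latex
\text{disclose} \iff B > R\,(1 - C)
```

Under this reading, the low perceived control reported by our participants (C close to 0) leaves risks essentially undiscounted, which is consistent with their privacy concerns outweighing their need for SN services.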
However, as the results show, they still upload information that they are not sure is appropriate for every audience to see, or they do not take SN privacy policies into consideration. After all, previous literature has already highlighted that, while users perceive their control over information as high, they usually ignore the control that SN policies exert over that information [
85]. In this respect, an effective self-adaptive privacy scheme within SNs should provide users with ample control capabilities, enhancing the cognitive processes through which they implement their privacy strategies. The authors of [
77] emphatically support that self-adaptive privacy systems should provide users with adequate opportunities to express preferences and give feedback on the privacy decisions they have to undertake. Nevertheless, ignoring privacy policies, a privacy-risky behaviour, points to another crucial aspect for the participants of our survey: the necessity of enhancing their privacy awareness. Participants estimate their privacy awareness at medium levels; consequently, it is of great importance that the designed self-adaptive privacy schemes increase privacy awareness. This becomes even more crucial since, according to [
30], SNs incorporate only basic technical privacy features, meeting their responsibility for users’ privacy protection only nominally and failing to address users’ complex privacy needs. Accordingly, self-adaptive privacy schemes should provide users with proper classified interaction strategies, facilitating the connection between the systems and the users by carrying out automated actions that enhance privacy awareness and justify privacy choices or decisions. This will strengthen users’ privacy and enhance not only their awareness, but also their knowledge regarding the accessibility and use of information deriving from the SNs’ structure.
What is more, our survey indicates that participants’ information within SNs is co-managed and disclosed by other members of the groups they belong to, leading to risky behaviors, since the other group members do not seek approval before disclosing information. As [
62] argues, it is often unclear who, and how many people, are included in the groups that disclose such information. Furthermore, it is also unclear who accesses and stores users’ data, which are often analyzed by unauthorized parties [
86]. A user’s disclosure of personal information about other users, regardless of those users’ consent, such as sharing the contact lists of friends, family, colleagues, and other connections, enables SNs and third-party companies to misuse such information and circumvent the privacy rights granted to users by the GDPR, such as the rights to know, to restrict processing, and not to be subject to automated decision-making, leading to untrustworthy web services. Therefore, capturing the collaborative process of privacy management within SNs should be a major concern in the design of self-adaptive privacy schemes. The monitoring process of self-adaptive privacy systems should focus not only on users themselves, but also on their social environment and interconnections, in order to provide the proper features, since privacy protection within SNs presupposes group-level coordination.
Drawing, therefore, on the results of our survey regarding participants’ social attributes, privacy perceptions, and privacy management, the following privacy-related requirements should, indicatively, be taken into consideration in the early design of self-adaptive privacy schemes within SNs, in order to effectively support their monitoring, analysis, and implementation operations (a rule-based sketch follows the list):
Adult SN users with high cultural and financial capital value privacy highly and report elevated privacy concerns regarding the ways their personal information is handled, ways which they cannot foresee, or the ways it can be misused. In this respect, self-adaptive privacy schemes should offer users proper justification and awareness about the privacy decisions that they undertake.
Adult SN users, despite belonging to several social groups within SNs, most often participate and communicate in Family and Friends groups. Since belonging to these groups involves emotional involvement and a deeper level of trust among members, self-adaptive privacy-preserving schemes should provide users with features that detect threats, before information is disclosed, based on this frequent communication and trust among group members within SNs.
Adult SN users perceive their belonging to political groups within SNs as of great importance. In this regard, self-adaptive privacy schemes should utilize framework models to identify these users’ SN environments and provide features that determine the value of such groups, so as to diagnose privacy-related threats.
Adult SN users, despite their generally low to medium level of social capital investment within SNs, place more emphasis on bonding social capital, and in particular on the resources provided for job references. Consequently, when users disclose information for such benefits, self-adaptive privacy schemes should provide them with the possibility of selective information disclosure.
Adult SN users find SNs untrustworthy and are highly aware of the value of their information to these providers. In this regard, self-adaptive protection schemes should be adapted to the interoperability of SN technologies and the structure of their systems, providing users with justifications regarding the SNs’ role in privacy risks, so that they can make the right privacy decisions.
Adult SN users perceive a low level of privacy control within SNs; therefore, self-adaptive privacy-preserving schemes should provide them with further control capabilities and choices, in order to enhance their privacy awareness as well.
Adult SN users co-manage personal information with other users within SNs due to belonging to multiple social groups, which often leads to privacy implications. Self-adaptive privacy schemes should be able to analyze this co-management and detect threats before information is disclosed, weighing users’ benefits against information disclosure costs.
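To illustrate how such requirements might feed a scheme’s analysis stage, the sketch below encodes three of them as simple rules over a disclosure event. It is a hypothetical sketch under our own naming (Disclosure, analyze, the rule functions), not a design from the cited literature.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Disclosure:
    actor: str           # who posts: the user herself or a co-managing group member
    subject: str         # whom the content concerns
    group: str           # e.g., "family", "friends", "political"
    job_seeking: bool    # contextual flag taken from the subject's situation

Rule = Callable[[Disclosure], Optional[str]]

def co_management_rule(d: Disclosure) -> Optional[str]:
    # Requirement: co-managed content needs the subject's approval first.
    return f"ask {d.subject} for approval before posting" if d.actor != d.subject else None

def sensitive_group_rule(d: Disclosure) -> Optional[str]:
    # Requirement: political-group content may expose a highly valued identity.
    return "political-group content may expose a sensitive identity" if d.group == "political" else None

def audience_rule(d: Disclosure) -> Optional[str]:
    # Requirement: warn when disclosure conflicts with a job-seeking context.
    return "profile may be reviewed by prospective employers" if d.job_seeking else None

RULES: list[Rule] = [co_management_rule, sensitive_group_rule, audience_rule]

def analyze(d: Disclosure) -> list[str]:
    """Run all rules; the returned warnings become recommendations/justifications."""
    return [w for rule in RULES if (w := rule(d)) is not None]
```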
In the following scenario, we may consider an adult female Facebook user who belongs to a Family group, a Friends group, and a political group, while seeking a job and having submitted many applications. The user is thus characterized by multiple social identities, such as Mother, Wife, Political party member, and Unemployed. Despite her privacy concerns, the user discloses personal information in her profile while carrying out everyday tasks under each of these identities, since she has a low degree of awareness of how this information is used. In some cases, other members of her groups disclose her personal information, for instance by uploading her photos. Since her profile is open to all, her privacy normativity within Facebook is exposed. Everyone is able to follow the user and learn about her social places and backgrounds, such as her house, her leisure time with friends, and her activities with the political party. Furthermore, everyone can observe how her social identities are adopted or dropped accordingly.
Therefore, her privacy normativity can include all the anticipated activities tied to being present at a specific place under a given identity, such as eating or housekeeping under the Mother identity within the family group, or political activities (e.g., participating in a demonstration) under her political identity. At the same time, since she is seeking a job, she has asked her groups’ members for job references. However, the direct or indirect disclosure of her information while online can lead to a number of subsequent privacy implications. Such implications may arise when, for instance, an employer to whom she has sent an application visits her profile and, through the various ways she communicates information on Facebook, such as posting photographs, hashtagging places, the time stamps of posts, or check-ins, evaluates her activities and decides not to hire her because he or she disagrees with her political action or because she has three children. Privacy implications may thus appear due to the user’s social need for sharing information and her low privacy awareness, which encourage unveiling her identity along with additional information (photographs or hashtags) that may lead to her detectability and observability by unwanted parties. If, instead, the user employed a self-adaptive privacy protection scheme that takes the identified privacy-related requirements into account, these implications could be prevented. In case the user, under her Mother identity, is about to disclose information about her home and children, or under her political identity is about to disclose information about her political action, the self-adaptive privacy scheme is able to map the user’s environment and to recognize that she has asked for job references. It therefore provides her with proper justification and awareness about the privacy decision she is about to undertake, since it can detect the threats and privacy implications before she discloses information. It can also provide the user with justifications regarding Facebook’s role in privacy risks, since her profile is open, in order to enable her to make the right privacy decision. Furthermore, it is able to analyze the co-management of information sharing by other group members and to detect threats before information disclosure, giving her specific notifications. In this regard, it is of major importance that, by identifying these privacy-related requirements, the user is given further control capabilities and privacy choices, which our survey indicates are lagging, while her privacy awareness is enhanced as well.
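Mapped onto the hypothetical rule engine sketched after the requirements list, the scenario’s riskiest event, a group member uploading the user’s photo while she is job seeking, would be flagged roughly as follows:

```python
# A group member posts content about the user while she is job seeking.
event = Disclosure(actor="friend_1", subject="the_user",
                   group="political", job_seeking=True)
for warning in analyze(event):
    print("recommendation:", warning)
# recommendation: ask the_user for approval before posting
# recommendation: political-group content may expose a sensitive identity
# recommendation: profile may be reviewed by prospective employers
```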
Furthermore, considering these privacy-related requirements at the early stage of self-adaptive privacy scheme design will also enhance the implementation of the technical perspectives of the PbD approach. This will support the satisfaction of several technical privacy requirements, such as Authentication, Authorization, Identification, Anonymity, Pseudonymity, Unlinkability, Data Protection, Unobservability, Undetectability, Isolation, Provenanceability, Traceability, Intervenability, and Accountability, which were introduced by the extended PriS framework for cloud computing services [
87]. It will also enable developers to support GDPR enforcement, e.g., by providing users with the ability to weigh their own privacy preferences against the systems’ choices, so that an effective decision-making procedure can be followed that respects data subjects’ rights and satisfies their needs.