Differentially Private Timestamps Publishing in Trajectory
Abstract
1. Introduction
- Weak privacy preservation: Perturbing trajectory data can protect an individual’s exact location, but a trajectory also carries temporal information. An adversary can still learn a person’s habits merely by observing the timestamps of the trajectory; for example, a person’s home location can be inferred from where they are in the early morning or late evening, even if the spatial coordinates are perturbed.
- Low-level utility: State-of-the-art methods add noise to the latitude and longitude of the trajectory to perturb the real values. However, a noisy trajectory loses spatial accuracy, which degrades trajectory mining tasks; this is undesirable in real-world applications.
- These challenges call for a novel mechanism for the differentially private release of trajectory data in social media. To address the first challenge, weak privacy preservation, we perturb the time pattern based on a k-anonymity algorithm: k-anonymity is used to generalize the time range of a trajectory, so the attacker cannot infer a person’s habits merely by observing the temporal pattern of the trajectory. For the second challenge, we add noise to the timestamps of the trajectory instead of the latitude and longitude; since the spatial values of the trajectory are left unchanged, high-level utility is preserved.
- To defend against inference attacks launched by adversaries who can observe the time pattern of a trajectory, we first formalize the notion of TDP, and then show that TDP guarantees can be achieved by combining a k-anonymity-based mechanism with a truncated-Laplace-based one. Our TDP solution protects an individual’s habits without perturbing the spatial values of the trajectory.
- A k-anonymity-based algorithm is proposed to hide an individual’s real time pattern. It differs from the original k-anonymity algorithm in that data originally confined to a single time period are anonymized over the whole day. In this way, an individual’s exact time at a specific position is hidden even if the adversary knows that position.
- A truncated-Laplace-based mechanism is proposed to add noise to the frequency of temporal data, and the added noise is removed during the recovery process. In this way, the noise added in the time domain has no effect on trajectory mining utility while the time privacy is preserved. We prove theoretically that this mechanism satisfies differential privacy. A minimal sketch of this end-to-end idea is given after this list.
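The following is a minimal, self-contained sketch of this end-to-end idea (our own illustration, not the paper’s implementation: the function names, the per-hour histogram, and the use of plain Laplace noise instead of the truncated variant introduced in Section 4.4 are all assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)

def publish(points, epsilon=1.0):
    """points: list of (lat, lon, hour). Returns a noisy per-hour histogram
    plus the noise record; the coordinates themselves are never modified."""
    hist = np.zeros(24)
    for _, _, hour in points:
        hist[hour] += 1
    noise = rng.laplace(scale=1.0 / epsilon, size=24)  # plain Laplace here for brevity
    return hist + noise, noise

def recover(noisy_hist, noise):
    """A recipient holding the noise record removes it, restoring exact counts."""
    return noisy_hist - noise

points = [(39.91, 116.40, 8), (39.92, 116.41, 8), (39.95, 116.38, 18)]
noisy, noise = publish(points)
print(np.round(recover(noisy, noise)))  # original per-hour counts
```

The point of the sketch is that the spatial values pass through unchanged, so mining tasks that only need coordinates are unaffected, while the published timestamps reveal only noisy frequencies.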
2. Related Work
3. Preliminaries
3.1. k-Anonymity
- Identifier attribute: an attribute that can uniquely identify an individual;
- Quasi-identifier attribute: an attribute that does not identify an individual on its own but can do so when linked with other data tables;
- Sensitive attribute: an attribute that must be kept confidential when the data are published;
- Non-sensitive attribute: an attribute that can be disclosed without affecting the user’s privacy, also known as an ordinary attribute (a toy example of these attribute roles is sketched after this list).
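For concreteness, the toy sketch below illustrates these attribute roles and how quasi-identifiers can be generalized so that every record shares its quasi-identifier values with at least k-1 others (our own illustration; the schema, records, and generalization rules are hypothetical and not taken from the paper).

```python
from collections import Counter

# Hypothetical records: name = identifier, age/zip = quasi-identifiers,
# disease = sensitive attribute. k-anonymity drops the identifier and
# generalizes the quasi-identifiers until each combination occurs >= k times.
records = [
    ("Alice", 23, "100081", "flu"),
    ("Bob",   25, "100083", "cold"),
    ("Carol", 27, "100085", "flu"),
]

def generalize(rec):
    _, age, zipcode, disease = rec
    age_range = f"{(age // 10) * 10}-{(age // 10) * 10 + 9}"  # 23 -> "20-29"
    return (age_range, zipcode[:4] + "**", disease)           # "100081" -> "1000**"

anonymized = [generalize(r) for r in records]
groups = Counter((age, z) for age, z, _ in anonymized)
k = min(groups.values())
print(anonymized, "k =", k)  # all three records fall into one equivalence class, k = 3
```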
3.2. Differential Privacy
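For reference, the standard ε-differential privacy definition and the Laplace mechanism used throughout this paper can be stated as follows (this restatement is ours and its notation may differ slightly from the paper’s).

```latex
% Neighboring datasets D and D' differ in a single record.
\Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[\mathcal{M}(D') \in S]
\quad \text{for every set of outputs } S.

% Global sensitivity of a query f, and the Laplace mechanism calibrated to it.
\Delta f = \max_{D \sim D'} \lVert f(D) - f(D') \rVert_{1}, \qquad
\mathcal{M}(D) = f(D) + \mathrm{Lap}\!\left(\frac{\Delta f}{\varepsilon}\right).
```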
3.3. Attack Model
4. Methodology
4.1. Problem Definition
4.2. Sketch of TDP
4.2.1. Time Anonymization
4.2.2. Truncated Laplace Time Perturbation
4.3. Time Anonymization
Algorithm 1: Time anonymization
Input: GPS data T, k, ε.
Output: confidence, perturbed GPS data.
1: while … do
2:   input k
3: end while
4: for … ∈ T do
5:   …
6: end for
7: for … ∈ T do
8:   …
9: end for
10: for … ∈ T do
11:   …
12: end for
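Because most of the symbols in the listing above were lost in extraction, the following sketch shows one way such a k-anonymity-style generalization of timestamps could look (our own illustration under stated assumptions: the hour-range widening rule, the confidence measure, and all names and data are ours, not the paper’s exact algorithm).

```python
def anonymize_times(points, k):
    """points: list of (lat, lon, hour). Coarsen the hour into ever-wider ranges
    until every published range covers at least k records, so an observer learns
    only the range, never the exact hour (illustrative only)."""
    for width in (1, 2, 4, 6, 12, 24):                 # candidate hour-range widths
        buckets = {}
        for lat, lon, h in points:
            buckets.setdefault(h // width, []).append((lat, lon, h))
        if all(len(group) >= k for group in buckets.values()):
            start = lambda h: (h // width) * width
            out = [(lat, lon, f"{start(h):02d}-{start(h) + width - 1:02d}h")
                   for lat, lon, h in points]
            confidence = 1.0 / width                   # chance of guessing the exact hour within its range
            return out, confidence
    return points, 1.0                                 # k cannot be met; no generalization applied

pts = [(39.91, 116.40, 8), (39.92, 116.41, 9), (39.95, 116.38, 18), (39.96, 116.37, 19)]
anon, conf = anonymize_times(pts, k=2)
print(conf, anon)  # 0.5, hours published as "08-09h" / "18-19h"
```

The widest fallback (a 24 h range) corresponds to hiding a record over the whole day; when the data are too sparse to form a group of size k, the function degrades gracefully, which echoes the limitation noted in the Conclusions.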
4.4. Truncated Laplace Time Perturbation
4.4.1. Definition of Truncated Laplace Time Perturbation
4.4.2. Truncated Laplace Time Perturbation Mechanism
Algorithm 2: Truncated Laplace time perturbation mechanism
Input: privacy budget ε, timestamps, query function.
Output: noise variables, perturbed query results.
for each round do
1: select a timestamp;
2: compute the noise scale parameter;
3: generate truncated Laplace noise according to the PDF in Definition 6 with the given bound;
4: compute the noisy response;
end for
return the perturbed query results.
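As a concrete reading of the steps above, the sketch below samples truncated Laplace noise on a bound [-B, B] by inverse-CDF sampling and adds it to a per-timestamp frequency query (our own reconstruction: the scale choice Δf/ε, the bound value, and keeping the noise record for later removal are assumptions, since Definition 6 and the exact parameters are not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(7)

def truncated_laplace(scale, bound, size=1):
    """Sample Laplace(0, scale) restricted to [-bound, bound] via inverse-CDF sampling."""
    lo = 0.5 * np.exp(-bound / scale)   # Laplace CDF evaluated at -bound
    hi = 1.0 - lo                       # Laplace CDF evaluated at +bound
    u = rng.uniform(lo, hi, size=size)
    return np.where(u < 0.5,
                    scale * np.log(2.0 * u),
                    -scale * np.log(2.0 * (1.0 - u)))

def perturb_counts(hour_counts, epsilon, bound):
    """Add truncated Laplace noise to each per-hour frequency (sensitivity 1 assumed)."""
    scale = 1.0 / epsilon               # Delta f / epsilon with Delta f = 1
    noise = truncated_laplace(scale, bound, size=len(hour_counts))
    return hour_counts + noise, noise   # the noise record allows exact recovery later

counts = np.array([3.0, 0.0, 5.0, 2.0])  # toy per-hour visit counts
noisy, noise = perturb_counts(counts, epsilon=1.0, bound=3.0)
print(np.round(noisy, 2), np.round(noisy - noise, 2))  # second array restores the exact counts
```

Truncating the noise keeps each published count within a known distance of its true value, which is the availability gain the paper attributes to truncation; the privacy accounting for the truncated distribution is what Definition 6 and the analysis in Section 5 make precise.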
5. Security and Utility Analysis
5.1. Security Analysis
5.2. Utility Analysis
6. Experimental Evaluation
6.1. Experiment Datasets and Setup
Experimental Results and Analysis
- (1) Intuitive Diagram after TDP Algorithm Encryption
- (2) Data Security Results and Analysis
- (3) Data Validity Results and Analysis
- (4) Optimization Effect of Truncated Laplace
6.2. Experimental Results Summary
7. Conclusions
- The widespread use of k-anonymity should be further considered, since, when the time data are widely distributed, the parameter k can only take the value of 1, which limits the privacy protection we can offer. We therefore intend to optimize the mechanism so that it remains effective in this widely distributed setting.
- When adding noise to the time–frequency histogram, although we adopted a truncation method in this study to improve data availability, there is still room for improvement. In future work, we will consider combining the characteristics of the data distribution to find more reasonable partitioned noising schemes.
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest