Table 8. Summary of state-of-the-art differential privacy techniques enhanced by AI.
Objective | AI method | Performance | Reference | Year
Privacy-enhanced multi-area task assignment strategy | RL (DQN) | Accuracy: 70%–75% across different numbers of areas | Wang et al. [164] | 2022
...
As with other technologies and techniques that allow for a range of instantiations, the devil is in the details: choices made in the implementation of differential privacy influence the degree of protection against privacy loss, much as the security parameter in a cryptosyst...
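To make the analogy concrete, the minimal sketch below (not drawn from any of the surveyed works; the sensitivity and epsilon values are illustrative assumptions) shows how the privacy parameter ε acts as a tunable knob: for the Laplace mechanism, the noise scale is sensitivity/ε, so a smaller ε buys stronger protection at the cost of noisier answers.

```python
# A minimal sketch (not from the surveyed works): the privacy parameter epsilon
# plays a role analogous to a cryptographic security parameter. For the Laplace
# mechanism the noise scale is sensitivity / epsilon, so a smaller epsilon gives
# stronger protection but noisier answers.
sensitivity = 1.0  # assumed L1 sensitivity of a counting query

for epsilon in (0.1, 1.0, 10.0):
    scale = sensitivity / epsilon
    print(f"epsilon={epsilon:>4}: Laplace noise scale b = {scale:.1f}")
```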
Application of differential privacy-based schemes in LBS

In this section, we first analyze the techniques surveyed in this paper along various parameters from the perspective of LBS usage, and then discuss their applicability to protecting users' location privacy in LBSs. ...
In this survey, we recall the definition of differential privacy and two basic techniques for achieving it. We then show some interesting applications of these techniques, presenting algorithms for three specific tasks and three general results on differentially private learning. ...
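As an illustration of one such basic technique, the sketch below implements the standard Laplace mechanism for a counting query in Python; the dataset, the query, and the parameter values are illustrative assumptions, not taken from the survey.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float,
                      rng: np.random.Generator) -> float:
    """Release true_value with Laplace noise of scale sensitivity / epsilon."""
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

rng = np.random.default_rng(0)
ages = np.array([23, 35, 41, 29, 52, 47])
# The counting query "how many records have age > 40" has L1 sensitivity 1:
# adding or removing one record changes the count by at most 1.
noisy_count = laplace_mechanism(float((ages > 40).sum()), sensitivity=1.0,
                                epsilon=0.5, rng=rng)
print(f"noisy count: {noisy_count:.2f}")
```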
With the increasing awareness of data privacy protection and increasingly stringent data security regulations, federated learning (FL), as a distributed machine learning approach, has garnered widespread attention. However, in practice, FL faces severe challenges in privacy protection. This paper propos...
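Since the paper's concrete proposal is truncated above, the following is only a generic sketch of one common pattern for combining differential privacy with FL: each client clips its model update and adds Laplace noise before uploading, and the server averages the privatized updates. All function names and parameter values are illustrative assumptions, and a rigorous privacy accounting (exact sensitivity under the chosen adjacency notion, composition across rounds) is beyond the scope of this sketch.

```python
import numpy as np

def privatize_update(update: np.ndarray, clip_norm: float, epsilon: float,
                     rng: np.random.Generator) -> np.ndarray:
    """Clip a client's model update in L1 norm, then add Laplace noise.

    Illustrative clip-and-noise step: clipping caps the L1 norm of the upload
    at clip_norm, and the Laplace scale is set to clip_norm / epsilon.
    """
    norm = np.abs(update).sum()                       # L1 norm of the raw update
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    noise = rng.laplace(0.0, clip_norm / epsilon, size=update.shape)
    return clipped + noise

rng = np.random.default_rng(1)
client_updates = [rng.normal(size=4) for _ in range(3)]   # toy client updates
noisy = [privatize_update(u, clip_norm=1.0, epsilon=1.0, rng=rng)
         for u in client_updates]
aggregated = np.mean(noisy, axis=0)                   # server-side averaging
print(aggregated)
```

The clipping bound is what limits how much any single client's data can influence the uploaded vector, which in turn is what allows the noise scale to be calibrated at all.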
Differential privacy describes a promise, made by a data curator to a data subject: you will not be affected, adversely or otherwise, by allowing your data to be used in any study, no matter what other studies, data sets, or information from other sources is av...
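This promise is usually formalized by the standard definition of ε-differential privacy, reproduced below for reference; it is the textbook definition rather than the formulation of any one surveyed scheme.

```latex
% A randomized mechanism M is \varepsilon-differentially private if, for all
% neighboring datasets D, D' differing in a single record and all output sets S,
\Pr[\,M(D) \in S\,] \;\le\; e^{\varepsilon}\,\Pr[\,M(D') \in S\,].
```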
Similarly, distributed learning techniques such as federated learning, often touted as being “privacy-preserving” because the data does not leave its owner, have been proven ineffective against attackers who participate in the training protocol and are able to capture updates submitted by other ...
We then put forward an attack model called the non-location sensitive information attack; to resist this attack, we add noise to both the location data and the non-location sensitive data using differential privacy techniques. Finally, the algorithm can consistently deal with the problem of data...
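The algorithm itself is truncated above, so the sketch below only illustrates the general idea of perturbing both the location coordinates and a non-location sensitive attribute with independently calibrated Laplace noise; all sensitivities, privacy budgets, and the example record are assumptions made for illustration.

```python
import numpy as np

def noise_record(lat: float, lon: float, sensitive_value: float,
                 eps_loc: float, eps_attr: float,
                 loc_sensitivity: float, attr_sensitivity: float,
                 rng: np.random.Generator):
    """Perturb a (lat, lon) location and a non-location sensitive attribute.

    loc_sensitivity is taken as the joint L1 sensitivity of the coordinate
    pair, so adding Laplace(loc_sensitivity / eps_loc) noise to each coordinate
    releases the location under budget eps_loc; the attribute uses its own
    budget eps_attr, for a total of eps_loc + eps_attr by sequential composition.
    """
    loc_scale = loc_sensitivity / eps_loc
    noisy_lat = lat + rng.laplace(0.0, loc_scale)
    noisy_lon = lon + rng.laplace(0.0, loc_scale)
    noisy_attr = sensitive_value + rng.laplace(0.0, attr_sensitivity / eps_attr)
    return noisy_lat, noisy_lon, noisy_attr

rng = np.random.default_rng(2)
# Hypothetical record: a coordinate pair plus one numeric sensitive attribute.
print(noise_record(39.9042, 116.4074, sensitive_value=180.0,
                   eps_loc=1.0, eps_attr=0.5,
                   loc_sensitivity=0.01, attr_sensitivity=5.0, rng=rng))
```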