====== Model Checking Differentially Private Properties ======
Differential privacy is a framework for designing and analyzing privacy measures, and has become a gold standard in data privacy[1]. In this tool, the branching-time temporal logic dpCTL is developed for specifying differential privacy. Several differentially private mechanisms are formalized as Markov chains or Markov decision processes, and subtle privacy conditions on these formal models are specified in dpCTL. To verify privacy properties automatically, the dpCTL model checking problem is studied and implemented in our tool[3].

===== Introduction to the tool =====
Our tool is built as a plugin of the model checker ePMC and is now publicly available on GitHub.

===== Future work =====
We have also studied the verification problem of pufferfish privacy[2], a more general definition which subsumes differential privacy. Our formalization models privacy mechanisms as hidden Markov models, and privacy is checked by computing and comparing the observation distributions induced by an initial pair of information states. We study the complexity of the problem and give a verification algorithm which applies an SMT solver. We have also implemented the algorithm in a tool, FAIER, the pufferFish privAcy verifIER. The tool is to be published soon; in the meantime, you can have a look at this paper[4].
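As an illustrative sketch only (not the FAIER implementation, which uses an SMT solver), the comparison of observation distributions can be pictured on a toy hidden Markov model: enumerate observation sequences up to a bounded horizon, compute each sequence's probability under the two initial information states with the forward algorithm, and check that every pair of probabilities stays within a multiplicative factor e^ε. All model parameters below are made-up assumptions for the example.

```python
# Toy bounded-horizon check of an epsilon-closeness condition on an HMM.
# Parameters are invented for illustration; real pufferfish verification
# must handle unbounded horizons, which is why FAIER uses an SMT solver.
import itertools
import math

# 2 hidden states, 2 observation symbols.
T_mat = [[0.9, 0.1], [0.2, 0.8]]   # T_mat[s][t]: transition s -> t
O_mat = [[0.7, 0.3], [0.4, 0.6]]   # O_mat[s][o]: emit o in state s

def seq_prob(init, obs_seq):
    """Probability of an observation sequence via the forward algorithm."""
    alpha = [init[s] * O_mat[s][obs_seq[0]] for s in range(2)]
    for o in obs_seq[1:]:
        alpha = [sum(alpha[s] * T_mat[s][t] for s in range(2)) * O_mat[t][o]
                 for t in range(2)]
    return sum(alpha)

def satisfies_eps(init1, init2, eps, horizon=3):
    """True iff, for every observation sequence up to the horizon, the two
    initial distributions give probabilities within a factor exp(eps)."""
    bound = math.exp(eps)
    for obs_seq in itertools.product(range(2), repeat=horizon):
        p, q = seq_prob(init1, obs_seq), seq_prob(init2, obs_seq)
        if p > bound * q or q > bound * p:
            return False
    return True

print(satisfies_eps([1.0, 0.0], [0.0, 1.0], eps=2.0))
print(satisfies_eps([1.0, 0.0], [0.0, 1.0], eps=1.0))
```

The brute-force enumeration grows exponentially in the horizon; it is only meant to make the "compute and compare observation distributions" step concrete.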
+ | |||
+ | |||
[1] Cynthia Dwork. 2006. Differential Privacy. In International Colloquium on Automata, Languages and Programming (ICALP) (LNCS, Vol. 4052), Michele Bugliesi, Bart Preneel, Vladimiro Sassone, and Ingo Wegener (Eds.). Springer, 1–12.

+ | |||
[2] Daniel Kifer and Ashwin Machanavajjhala. 2014. Pufferfish: A Framework for Mathematical Privacy Definitions. ACM Trans. Database Syst. 39, 1, Article 3 (2014), 3:1–3:36.

+ | |||
[3] Liu D., Wang BY., Zhang L. (2018) Model Checking Differentially Private Properties. In: Ryu S. (ed.) Programming Languages and Systems. APLAS 2018. Lecture Notes in Computer Science, vol 11275. Springer, Cham.

+ | |||
[4] Liu D., Wang BY., Zhang L. Verifying Pufferfish Privacy in Hidden Markov Models.