Computational Complexity & Differential Privacy

Explore the intersection of computational complexity and differential privacy, discussing challenges in synthesizing accurate data summaries and the inherent complexity. Discover negative results and tools like digital signature schemes.


Presentation Transcript


  1. Computational Complexity & Differential Privacy. Salil Vadhan, Harvard University. Joint works with Cynthia Dwork, Kunal Talwar, Andrew McGregor, Ilya Mironov, Moni Naor, Omkant Pandey, Toni Pitassi, Omer Reingold, Guy Rothblum, Jon Ullman

  2. Computational Complexity When do computational resource constraints change what is possible? Examples: • Computational Learning Theory [Valiant '84]: small VC dimension ⇏ learnable with efficient algorithms (bad news) • Cryptography [Diffie & Hellman '76]: don't need long shared secrets against a computationally bounded adversary (good news)

  3. Today: Computational Complexity in Differential Privacy I. Computationally bounded curator • Makes differential privacy harder • Differentially private & accurate synthetic data infeasible to construct • Open: release other types of summaries/models? II. Computationally bounded adversary • Makes differential privacy easier • Provable gain in accuracy for 2-party protocols (e.g. for estimating Hamming distance)

  4. Part I: Computationally Bounded Curators

  5. Cynthia's Dream: Noninteractive Data Release • A curator C takes the original database D and publishes a single sanitization C(D).

  6. Noninteractive Data Release: Desiderata • (ε,δ)-differential privacy: for every D1, D2 that differ in one row and every set T, Pr[C(D1) ∈ T] ≤ exp(ε)·Pr[C(D2) ∈ T] + δ, with δ negligible • Utility: C(D) allows answering many questions about D • Computational efficiency: C is polynomial-time computable.
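As a minimal sketch of how such a guarantee is typically achieved for a single statistic, here is the standard Laplace mechanism, which gives (ε,0)-differential privacy; the function name and interface are illustrative, not from the talk:

```python
import numpy as np

def laplace_count(db, predicate, eps, rng=None):
    """Release the fraction of rows satisfying `predicate` with
    (eps, 0)-differential privacy via the Laplace mechanism."""
    rng = rng or np.random.default_rng()
    n = len(db)
    true_answer = sum(predicate(x) for x in db) / n
    # Changing one row moves the true answer by at most 1/n, so
    # Laplace noise of scale 1/(eps*n) suffices.
    return true_answer + rng.laplace(scale=1.0 / (eps * n))
```

For example, `laplace_count(rows, lambda x: x["smokes"] and x["cancer"], eps=0.1)` releases a noisy version of the smoking-and-cancer fraction discussed on the next slide.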

  7. Utility: Counting Queries • D = (x1,…,xn) ∈ X^n • P = {π : X → {0,1}} • For any π ∈ P, want to estimate (from C(D)) the counting query π(D) := (Σi π(xi))/n within accuracy error α • Example: X = {0,1}^d, P = {conjunctions on ≤ k variables}, counting query = k-way marginal, e.g. what fraction of people in D smoke and have cancer?
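To make the query class concrete, a short sketch that evaluates every k-way marginal exactly (no privacy applied); it assumes the monotone case, i.e. conjunctions of un-negated attributes, whereas general marginals also allow negated literals:

```python
from itertools import combinations

def kway_marginals(D, d, k):
    """Evaluate every monotone k-way marginal
    pi_S(D) = (1/n) * sum_i [x_i[j] = 1 for all j in S]
    on a database D of rows in {0,1}^d.  There are C(d, k) such queries."""
    n = len(D)
    return {S: sum(all(x[j] == 1 for j in S) for x in D) / n
            for S in combinations(range(d), k)}
```

E.g. `kway_marginals([(1, 0, 1), (1, 1, 1)], d=3, k=2)` returns the answer to each of the three 2-way marginals on this toy database.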

  8. Form of Output • Ideal: C(D) is a synthetic dataset: • ∀π ∈ P, |π(C(D)) − π(D)| ≤ α • Values consistent • Can use existing software • Alternatives? • Explicit list of |P| answers (e.g. contingency table) • Median of several synthetic datasets [RR10] • Program M s.t. ∀π ∈ P, |M(π) − π(D)| ≤ α
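The "ideal" condition is just a uniform accuracy requirement over P; a small hypothetical checker makes it explicit:

```python
def alpha_accurate(D, D_synth, P, alpha):
    """Utility test for a synthetic dataset: for all pi in P,
    |pi(D_synth) - pi(D)| <= alpha, where pi(db) is the fraction
    of rows of db satisfying pi."""
    def answer(db, pi):
        return sum(pi(x) for x in db) / len(db)
    return all(abs(answer(D_synth, pi) - answer(D, pi)) <= alpha
               for pi in P)
```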

  9.–14. Positive Results • D = (x1,…,xn) ∈ ({0,1}^d)^n • P = {π : {0,1}^d → {0,1}} • π(D) := (1/n) Σi π(xi) • α = accuracy error • ε = privacy. Summary: Can construct synthetic databases accurate on huge families of counting queries, but the complexity may be exponential in the dimension of the data and of the query set P. Question: is this inherent?
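As a toy illustration of where the exponential cost comes from (this is not one of the constructions from the talk, and it omits the privacy mechanism entirely), a generic approach searches over all candidate synthetic databases of m rows drawn from the 2^d-element universe:

```python
from itertools import combinations_with_replacement, product

def brute_force_synthetic(D, queries, alpha, m):
    """Exhaustively search for an m-row synthetic database over {0,1}^d
    that alpha-approximates D on every query in `queries`.
    The candidate space has size roughly 2^(d*m): exponential in the
    dimension d, which is the cost the slide's question asks about.
    (Toy only: no noise is added, so this is not private.)"""
    d = len(D[0])
    def answer(db, pi):
        return sum(pi(x) for x in db) / len(db)
    universe = list(product((0, 1), repeat=d))
    for cand in combinations_with_replacement(universe, m):
        if all(abs(answer(cand, pi) - answer(D, pi)) <= alpha
               for pi in queries):
            return list(cand)
    return None
```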

  15. Negative Results for Synthetic Data Summary: • Producing accurate & differentially private synthetic data is as hard as breaking cryptography (e.g. factoring large integers). • The complexity is inherently exponential in the dimensionality of the data (and of the queries).

  16. Negative Results for Synthetic Data • Thm [DNRRV09]: Under standard crypto assumptions (OWF), there is no n = poly(d) and curator that: • produces synthetic databases, • is differentially private, • runs in time poly(n,d), and • achieves accuracy error α = .99 for P = {circuits of size d^2} (so |P| ~ 2^(d^2)) • Thm [UV10]: Under standard crypto assumptions (OWF), there is no n = poly(d) and curator that: • produces synthetic databases, • is differentially private, • runs in time poly(n,d), and • achieves accuracy error α = .01 for 2-way marginals.

  17. Tool 1: Digital Signature Schemes A digital signature scheme consists of algorithms (Gen, Sign, Ver): • On security parameter d, Gen(d) = (SK, PK) ∈ {0,1}^d × {0,1}^d • On m ∈ {0,1}^d, can compute σ = Sign_SK(m) ∈ {0,1}^d s.t. Ver_PK(m,σ) = 1 • Given many (m,σ) pairs, it is infeasible to generate a new (m′,σ′) satisfying Ver_PK • Gen, Sign, Ver are all computable by circuits of size d^2.
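For concreteness, here is the (Gen, Sign, Ver) interface instantiated with Ed25519 from the Python `cryptography` package; note that a real scheme has fixed key and signature lengths rather than exactly d bits, and the size-d^2 circuit bound is a theoretical normalization:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Gen: sample a key pair (SK, PK).
sk = Ed25519PrivateKey.generate()
pk = sk.public_key()

# Sign: sigma = Sign_SK(m).
m = b"some message playing the role of a d-bit string"
sigma = sk.sign(m)

# Ver: Ver_PK(m, sigma) = 1; verify() raises InvalidSignature otherwise.
try:
    pk.verify(sigma, m)
    print("valid")
except InvalidSignature:
    print("invalid")
```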

  18. Hard-to-Sanitize Databases • Generate random (PK,SK) ← Gen(d) and random messages m1, m2, …, mn ∈ {0,1}^d; let row i of D be (mi, Sign_SK(mi)) • Note Ver_PK ∈ {circuits of size d^2} = P, and Ver_PK(D) = 1 (every row of D verifies) • Accuracy of the curator ⇒ Ver_PK(C(D)) ≥ 1 − α > 0, so ∃ a row (m′i, σ′i) of C(D) with Ver_PK(m′i, σ′i) = 1 • Case 1: m′i ∉ D ⇒ forgery! • Case 2: m′i ∈ D ⇒ reidentification!
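A sketch of this construction and its case analysis, again using Ed25519 as a stand-in for the generic scheme (function names are mine, not the paper's):

```python
import secrets
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

def build_hard_database(n, msg_len=16):
    """Rows are (m_i, Sign_SK(m_i)) for random messages m_i, so the
    counting query Ver_PK evaluates to 1 on every row of D."""
    sk = Ed25519PrivateKey.generate()
    rows = [(m, sk.sign(m))
            for m in (secrets.token_bytes(msg_len) for _ in range(n))]
    return rows, sk.public_key()

def case_analysis(sanitized_rows, original_rows, pk):
    """Any verifying row of C(D) is either a new message (forgery,
    contradicting unforgeability) or a copied row (reidentification,
    contradicting privacy)."""
    originals = {m for m, _ in original_rows}
    for m, sigma in sanitized_rows:
        try:
            pk.verify(sigma, m)
        except InvalidSignature:
            continue
        return "forgery!" if m not in originals else "reidentification!"
    return "no verifying row: C(D) is inaccurate on Ver_PK"
```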

  19. Negative Results for Synthetic Data • Thm [DNRRV09]: Under standard crypto assumptions (OWF), there is no n = poly(d) and curator that: • produces synthetic databases, • is differentially private, • runs in time poly(n,d), and • achieves accuracy error α = .99 for P = {circuits of size d^2} (so |P| ~ 2^(d^2)) • Thm [UV10]: Under standard crypto assumptions (OWF), there is no n = poly(d) and curator that: • produces synthetic databases, • is differentially private, • runs in time poly(n,d), and • achieves accuracy error α = .01 for 3-way marginals.

  20. Tool 2: Probabilistically Checkable Proofs The PCP Theorem: ∃ efficient algorithms (Red, Enc, Dec) s.t.: • Red maps a circuit V of size d^2 to a set Red(V) of 3-clauses on d′ = poly(d) variables, e.g. {x1 ∨ ¬x5 ∨ x7, …} • Enc maps any w with V(w) = 1 to z ∈ {0,1}^d′ satisfying all clauses of Red(V) • Dec maps any z′ ∈ {0,1}^d′ satisfying a .99 fraction of the clauses of Red(V) to a w′ s.t. V(w′) = 1

  21. Hard-to-Sanitize Databases • Generate random (PK,SK) ← Gen(d) and m1, m2, …, mn ∈ {0,1}^d; let row i of D be zi = Enc(mi, Sign_SK(mi)) • Let Φ_PK = Red(Ver_PK); each clause in Φ_PK is a 3-way marginal and is satisfied by all zi • Accuracy of the curator ⇒ each clause in Φ_PK is satisfied by ≥ 1 − α of the rows z′i of C(D) • By averaging, ∃ i s.t. z′i satisfies ≥ 1 − α of the clauses, so Dec(z′i) = a valid (m′i, σ′i) • Case 1: m′i ∉ D ⇒ forgery! • Case 2: m′i ∈ D ⇒ reidentification!

  22. Part I Conclusions • Producing private synthetic databases that preserve even simple statistics requires computation exponential in the dimension of the data. How to bypass? • Average-case accuracy: heuristics that give good accuracy not on all databases, but only on those arising from some class of models. • Non-synthetic data: • Thm [DNRRV09]: For general P (e.g. P = {circuits of size d^2}), ∃ efficient curators "iff" ∃ efficient "traitor-tracing" schemes • But for structured P (e.g. P = {all marginals}), wide open!

  23. Part II: Computationally Bounded Adversaries

  24. Motivation • Differential privacy protects even against adversaries with unlimited computational power. • Can we gain by restricting to adversaries with bounded (but still huge) computational power? • Better accuracy/utility? • Enormous success in cryptography from considering computationally bounded adversaries.

  25. Definitions [MPRV09] • (ε, neg(k))-differential privacy: for all D1, D2 differing in one row, every set T, and security parameter k, Pr[Ck(D1) ∈ T] ≤ exp(ε)·Pr[Ck(D2) ∈ T] + neg(k) • Computational ε-differential privacy v1: for all D1, D2 differing in one row, every probabilistic poly(k)-time algorithm T, and security parameter k, Pr[T(Ck(D1)) = 1] ≤ exp(ε)·Pr[T(Ck(D2)) = 1] + neg(k) • Computational ε-differential privacy v2: ∃ an (ε, neg(k))-differentially private C′k such that for all D, Ck(D) and C′k(D) are computationally indistinguishable • v2 ⇒ v1 is immediate; v1 ⇒ v2 is open (seems to require a generalization of the Dense Model Theorem [GT04, RTTV08])
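The v1 condition can be probed empirically for any one fixed test T; a hedged sketch of such a check (the real definition quantifies over all poly-time T, so sampling can only ever give evidence against privacy, never prove it):

```python
import numpy as np

def advantage_gap(C, T, D1, D2, eps, trials=100_000, seed=0):
    """Monte Carlo estimate of
        Pr[T(C(D1)) = 1] - e^eps * Pr[T(C(D2)) = 1]
    for one fixed distinguisher T.  Computational eps-DP (v1) requires
    this to be at most negligible for every poly-time T.  Here C is
    assumed to take (database, rng) and T to return 0 or 1."""
    rng = np.random.default_rng(seed)
    p1 = np.mean([T(C(D1, rng)) for _ in range(trials)])
    p2 = np.mean([T(C(D2, rng)) for _ in range(trials)])
    return p1 - np.exp(eps) * p2  # significantly > 0 would violate v1
```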

  26. 2-Party Privacy • 2-party (& multiparty) privacy: each party has a sensitive dataset and they want to do a joint computation f(DA, DB), exchanging messages m1, m2, …, mk so that A ends with output ZA ≈ f(DA, DB) and B with ZB ≈ f(DA, DB) • A's view should be a (computationally) differentially private function of DB (even if A deviates from the protocol), and vice versa

  27. Benefit of Computational Differential Privacy Thm: Under standard cryptographic assumptions (OT), ∃ a 2-party computational ε-differentially private protocol for estimating the Hamming distance of bit vectors, with error O(1/ε). Proof: generic paradigm • Centralized solution: a trusted third party could compute a differentially private approximation to the Hamming distance with error O(1/ε) • Distribute via Secure Function Evaluation [Yao86, GMW87]: centralized solution → distributed protocol s.t. no computationally bounded party can learn anything other than its own output. Remark: more efficient or improved protocols via direct constructions [DKMMN06, BKO08, MPRV09]
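A sketch of the centralized step of this paradigm (the trusted-third-party computation that the actual protocol then distributes via secure function evaluation; naming and interface are mine):

```python
import numpy as np

def centralized_dp_hamming(a, b, eps, rng=None):
    """What the trusted third party would output: the Hamming distance
    plus Laplace(1/eps) noise.  Flipping one bit of either vector
    changes the distance by at most 1 (sensitivity 1), so the expected
    error is O(1/eps), matching the theorem's accuracy bound."""
    rng = rng or np.random.default_rng()
    dist = int(np.sum(np.asarray(a) != np.asarray(b)))
    return dist + rng.laplace(scale=1.0 / eps)
```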

  28. Benefit of Computational Differential Privacy Thm: Under standard cryptographic assumptions (OT), ∃ a 2-party computational ε-differentially private protocol for estimating the Hamming distance of bit vectors, with error O(1/ε). Thm [MPRV09, MMPRTV10]: The best 2-party differentially private protocol (vs. unbounded adversaries) for estimating the Hamming distance has error Ω̃(n^(1/2)). Computational privacy ⇒ significant gain in accuracy! And efficiency gains too [BNO06].

  29. Conclusions • Computational complexity is relevant to differential privacy. • Bad news: producing synthetic data is intractable • Good news: better protocols against bounded adversaries Interaction with differential privacy likely to benefit complexity theory too.
