
Information-Theoretic Secrecy

Topics: Probability theory (Bayes’ theorem); perfect secrecy (definition, some proofs, examples: one-time pad, simple secret sharing); entropy (definition, Huffman coding property, unicity distance). CSCI381, Fall 2005, GWU. Reference: Stinson.


Presentation Transcript


  1. Information-Theoretic Secrecy
  • Probability theory: Bayes’ theorem
  • Perfect secrecy: definition, some proofs, examples: one-time pad; simple secret sharing
  • Entropy: definition, Huffman coding property, unicity distance
  CSCI381 Fall 2005 GWU. Reference: Stinson.

  2. Bayes’ Theorem
  If Pr[y] > 0, then Pr[x|y] = Pr[x] Pr[y|x] / Σ_{x′∈X} Pr[x′] Pr[y|x′]
  • What is the probability that the first die shows 2, given that the sum of the two throws is 5?
  • What is the probability that the second die shows 3, given that the product of the two throws is 6? Given that it is 5?
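A quick way to sanity-check these conditional probabilities is to enumerate all 36 equally likely outcomes of two fair dice. The Python sketch below (the cond_prob helper is our own illustration, not part of the slides) does the counting that Bayes’ theorem formalizes.

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of two fair die rolls.
outcomes = list(product(range(1, 7), repeat=2))

def cond_prob(event, given):
    """Pr[event | given] by direct enumeration over equally likely outcomes."""
    conditioned = [o for o in outcomes if given(o)]
    favorable = [o for o in conditioned if event(o)]
    return Fraction(len(favorable), len(conditioned))

print(cond_prob(lambda o: o[0] == 2, lambda o: sum(o) == 5))        # Pr[first = 2 | sum = 5] = 1/4
print(cond_prob(lambda o: o[1] == 3, lambda o: o[0] * o[1] == 6))   # Pr[second = 3 | product = 6] = 1/4
print(cond_prob(lambda o: o[1] == 3, lambda o: o[0] * o[1] == 5))   # Pr[second = 3 | product = 5] = 0
```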

  3. (Im)Perfect Secrecy: Example
  P = {1, 2, 3}, K = {K1, K2, K3}, C = {2, 3, 4, 5, 6}
  Keys chosen equiprobably; Pr[1] = Pr[2] = Pr[3] = 1/3
  • Pr[c = 3] = ?
  • Pr[m | c = 3] = ?
  • Pr[k | c = 3] = ?
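The transcript does not reproduce the slide’s encryption table, so the sketch below assumes the rule e_{K_i}(x) = x + i, which is at least consistent with C = {2, …, 6}; treat it as an illustration of how the three probabilities are computed, not as the slide’s exact system.

```python
from fractions import Fraction
from collections import defaultdict

# Hypothetical encryption rule (an assumption): e_{K_i}(x) = x + i.
P = [1, 2, 3]
K = [1, 2, 3]                        # identify K_i with the shift i
pr_p = {x: Fraction(1, 3) for x in P}
pr_k = {k: Fraction(1, 3) for k in K}
enc = lambda k, x: x + k

pr_c = defaultdict(Fraction)         # Pr[c]
pr_mc = defaultdict(Fraction)        # Pr[m, c]
pr_kc = defaultdict(Fraction)        # Pr[k, c]
for x in P:
    for k in K:
        c = enc(k, x)
        w = pr_p[x] * pr_k[k]
        pr_c[c] += w
        pr_mc[(x, c)] += w
        pr_kc[(k, c)] += w

print(pr_c[3])                                    # Pr[c = 3] = 2/9
print({x: pr_mc[(x, 3)] / pr_c[3] for x in P})    # Pr[m | c = 3]: 1/2, 1/2, 0 (not the prior 1/3)
print({k: pr_kc[(k, 3)] / pr_c[3] for k in K})    # Pr[k | c = 3]: 1/2, 1/2, 0
```

Since Pr[m | c = 3] differs from the prior Pr[m] = 1/3, the ciphertext leaks information about the plaintext, which is what makes this example imperfect.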

  4. (Im)Perfect Secrecy: Example
  • How should the above ciphers be changed to improve the cryptosystem?
  • What defines a good cryptosystem?

  5. (Im)Perfect Secrecy: Example: Latin Square
  Assume all keys and messages are equiprobable. What is good about this arrangement? Consider Pr[k|c] and Pr[m|c].
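The Latin square itself is not reproduced in the transcript; the sketch below uses the 3 × 3 square given by e_K(x) = (x + K) mod 3 (our choice) and checks that every posterior Pr[m|c] and Pr[k|c] equals the prior, which is exactly what makes the arrangement good.

```python
from fractions import Fraction
from collections import defaultdict

# A 3x3 Latin square used as an encryption table (illustrative choice):
# rows are keys, columns are plaintexts, and each ciphertext appears
# exactly once in every row and every column.
P = K = C = [0, 1, 2]
enc = lambda k, x: (x + k) % 3

w = Fraction(1, 9)                   # keys and messages equiprobable
pr_c = defaultdict(Fraction)
pr_mc = defaultdict(Fraction)
pr_kc = defaultdict(Fraction)
for x in P:
    for k in K:
        c = enc(k, x)
        pr_c[c] += w
        pr_mc[(x, c)] += w
        pr_kc[(k, c)] += w

for c in C:
    print(c, {x: pr_mc[(x, c)] / pr_c[c] for x in P},   # every Pr[m|c] = 1/3 = Pr[m]
             {k: pr_kc[(k, c)] / pr_c[c] for k in K})   # every Pr[k|c] = 1/3 = Pr[k]
```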

  6. Perfect Secrecy: Definition
  A cryptosystem has perfect secrecy if Pr[x|y] = Pr[x] for all x ∈ P, y ∈ C
  a posteriori probability = a priori probability (posterior = prior)

  7. Example: One-Time Pad
  P = C = K = (Z_2)^n
  e_K(x_1, x_2, …, x_n) = (x_1 + K_1, x_2 + K_2, …, x_n + K_n) mod 2, and d_K = e_K
  Show that it provides perfect secrecy.
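The exercise can be checked by brute force for a small block length. The sketch below (our own, with n = 3 and an arbitrary non-uniform plaintext prior) confirms that a uniformly random key makes Pr[x|y] = Pr[x] for every pair, and that decryption is the same XOR operation.

```python
import secrets
from fractions import Fraction
from itertools import product
from collections import defaultdict

n = 3   # small block length so the whole joint distribution can be enumerated

def otp(x, k):
    """e_K(x) = d_K(x) = bitwise addition mod 2 (XOR)."""
    return tuple(xi ^ ki for xi, ki in zip(x, k))

msgs = list(product([0, 1], repeat=n))                        # P = C = K = {0,1}^n
pr_x = {m: Fraction(i + 1, 36) for i, m in enumerate(msgs)}   # arbitrary non-uniform prior
pr_xy = defaultdict(Fraction)
pr_y = defaultdict(Fraction)
for x in msgs:
    for k in msgs:                                            # key uniform over {0,1}^n
        y = otp(x, k)
        w = pr_x[x] * Fraction(1, len(msgs))
        pr_xy[(x, y)] += w
        pr_y[y] += w

# Perfect secrecy: the posterior equals the prior for every (x, y).
assert all(pr_xy[(x, y)] / pr_y[y] == pr_x[x] for x in msgs for y in msgs)

# Typical use: a fresh uniformly random key of the message length.
msg = (1, 0, 1)
key = tuple(secrets.randbits(1) for _ in range(n))
assert otp(otp(msg, key), key) == msg                         # d_K(e_K(msg)) == msg
print("one-time pad: perfect secrecy verified for n =", n)
```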

  8. Some Proofs: Theorem 2.4
  Thm 2.4: Suppose (P, C, K, E, D) is a cryptosystem where |K| = |P| = |C|. Then the cryptosystem provides perfect secrecy if and only if every key is used with equal probability 1/|K| and, for every x ∈ P and y ∈ C, there is a unique key K such that e_K(x) = y (e.g. a Latin square).
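A small helper (ours, not from Stinson) makes the two conditions concrete: it checks that the key distribution is uniform and that the encryption table is a Latin square, i.e. exactly one key maps each plaintext to each ciphertext.

```python
from itertools import product
from fractions import Fraction

def satisfies_thm_2_4(P, C, K, enc, key_prob):
    """Check the two conditions of Thm 2.4 for a finite cryptosystem:
    enc(k, x) -> y is the encryption rule, key_prob[k] the key distribution."""
    equiprobable = all(key_prob[k] == Fraction(1, len(K)) for k in K)
    unique_key = all(
        sum(1 for k in K if enc(k, x) == y) == 1
        for x, y in product(P, C)
    )
    return equiprobable and unique_key

# The shift-mod-3 Latin square from the earlier sketch passes both checks.
P = C = K = [0, 1, 2]
print(satisfies_thm_2_4(P, C, K, lambda k, x: (x + k) % 3,
                        {k: Fraction(1, 3) for k in K}))   # True
```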

  9. Entropy
  H(X) = -Σ_i p_i log2 p_i
  Example: p_i = 1/n for all i, giving H(X) = log2 n
  Examples: ciphertext and plaintext entropies for the earlier examples.
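The definition translates directly into a few lines of code; the values below use the uniform case and the probabilities that appear on the Huffman slide further on.

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i log2 p_i, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([1/4] * 4))                        # uniform over n = 4 outcomes: log2 4 = 2 bits
print(entropy([0.05, 0.10, 0.12, 0.13, 0.60]))   # distribution from the Huffman slide: ~1.74 bits
```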

  10. Huffman Encoding
  f: X* → {0, 1}*, mapping a string of random variables to a string of bits
  e.g. X = {a, b, c, d}; f(a) = 1, f(b) = 10, f(c) = 100, f(d) = 1000

  11. Huffman Encoding Algorithm
  X = {a, b, c, d, e}
  p(a) = 0.05, p(b) = 0.1, p(c) = 0.12, p(d) = 0.13, p(e) = 0.6
  a: 000, b: 001, c: 010, d: 011, e: 1
  Average length = ? Entropy = ?
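A minimal sketch of the algorithm, assuming the usual greedy construction (repeatedly merge the two least probable subtrees). For the probabilities above it reproduces the codeword lengths on the slide, with average length 1.8 bits against an entropy of about 1.74 bits, consistent with the theorem on the next slide.

```python
import heapq
from math import log2

def huffman_code(probs):
    """Huffman coding: repeatedly merge the two least probable subtrees."""
    # Heap entries: (probability, tiebreak index, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)       # two least probable subtrees
        p1, _, code1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in code0.items()}
        merged.update({s: "1" + c for s, c in code1.items()})
        heapq.heappush(heap, (p0 + p1, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

probs = {"a": 0.05, "b": 0.1, "c": 0.12, "d": 0.13, "e": 0.6}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
H = -sum(p * log2(p) for p in probs.values())
print(code)             # e.g. {'a': '000', 'b': '001', 'c': '010', 'd': '011', 'e': '1'}
print(avg_len, H)       # 1.8 and ~1.74, so H(X) <= average length <= H(X) + 1
```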

  12. Theorem
  H(X) ≤ average length of the Huffman encoding ≤ H(X) + 1
  (Stated without proof.)

  13. Properties of Entropy
  • H(X) ≤ log2 n
  • H(X, Y) ≤ H(X) + H(Y)
  • H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y), where H(X|Y) = -Σ_x Σ_y p(y) p(x|y) log2 p(x|y)
  • H(X|Y) ≤ H(X)
  With proofs and examples.
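These identities and inequalities are easy to sanity-check numerically; the sketch below uses an arbitrary small joint distribution of our choosing.

```python
from math import log2

# An arbitrary small joint distribution p(x, y), chosen only for illustration.
p_xy = {("x1", "y1"): 0.30, ("x1", "y2"): 0.10,
        ("x2", "y1"): 0.15, ("x2", "y2"): 0.45}

def H(dist):
    """Entropy of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0) + p
    p_y[y] = p_y.get(y, 0) + p

# H(X|Y) = -sum_{x,y} p(y) p(x|y) log2 p(x|y), computed from p(x|y) = p(x,y)/p(y).
H_x_given_y = -sum(p * log2(p / p_y[y]) for (x, y), p in p_xy.items() if p > 0)

assert abs(H(p_xy) - (H(p_y) + H_x_given_y)) < 1e-12   # H(X,Y) = H(Y) + H(X|Y)
assert H(p_xy) <= H(p_x) + H(p_y) + 1e-12              # H(X,Y) <= H(X) + H(Y)
assert H_x_given_y <= H(p_x) + 1e-12                   # H(X|Y) <= H(X)
assert H(p_x) <= log2(len(p_x)) + 1e-12                # H(X) <= log2 n
print(H(p_x), H(p_y), H(p_xy), H_x_given_y)
```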

  14. Theorem
  H(K|C) = H(K) + H(P) - H(C)
  Examples: the earlier imperfect squares
  Proof: H(K, P, C) = H(K, C) = H(K, P), since C is determined by (K, P) and P is determined by (K, C).
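Continuing with the hypothetical e_{K_i}(x) = x + i system used in the slide-3 sketch, the key equivocation can be computed both directly and from this identity; the two values agree.

```python
from math import log2

# Hypothetical rule from the slide-3 sketch (an assumption): e_{K_i}(x) = x + i.
P = K = [1, 2, 3]
p_p = {x: 1/3 for x in P}
p_k = {k: 1/3 for k in K}

p_c, p_kc = {}, {}
for x in P:
    for k in K:
        c = x + k
        w = p_p[x] * p_k[k]
        p_c[c] = p_c.get(c, 0) + w
        p_kc[(k, c)] = p_kc.get((k, c), 0) + w

def H(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Direct H(K|C) = -sum p(k, c) log2 p(k|c), versus H(K) + H(P) - H(C).
H_k_given_c = -sum(p * log2(p / p_c[c]) for (k, c), p in p_kc.items() if p > 0)
print(H_k_given_c, H(p_k) + H(p_p) - H(p_c))   # both ~0.97 bits
```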

  15. Language Entropy and Redundancy
  H_L = lim_{n→∞} H(P^n) / n (lies between 1.0 and 1.5 bits per letter for English)
  R_L = 1 - H_L / log2 |P| (the amount of "space" in a letter of English available for other information)
  Need, on average, about n ciphertext characters to break a substitution cipher, where n = H(K) / (R_L log2 |P|); this n is the "unicity distance" of the cryptosystem.
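For a concrete number, take the substitution cipher over the 26-letter alphabet and assume H_L ≈ 1.25 bits per letter (a commonly quoted estimate inside the 1.0 to 1.5 range on the slide); the unicity distance then comes out to about 26 ciphertext characters.

```python
from math import log2, factorial

# Unicity distance n0 = H(K) / (R_L * log2 |P|) for the substitution cipher
# over the 26-letter English alphabet, assuming H_L ~ 1.25 bits/letter.
alphabet_size = 26
H_L = 1.25
H_K = log2(factorial(alphabet_size))       # keys = all 26! permutations, equiprobable
R_L = 1 - H_L / log2(alphabet_size)        # redundancy of English
n0 = H_K / (R_L * log2(alphabet_size))
print(H_K, R_L, n0)                        # ~88.4 bits, ~0.73, ~25.6 characters
```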

  16. Proof
  H(K|C^n) = H(K) + H(P^n) - H(C^n)
