Lecture 2 Overview
Cryptography
• Secret writing
• Disguised data cannot be read, modified, or fabricated easily
• Complexity of the transformation must be feasible for the communicating parties
• Encryption: encoding (enciphering) plaintext into ciphertext, C = E(P) (E = encryption rule)
• Decryption: decoding (deciphering) ciphertext back into plaintext, P = D(C) (D = decryption rule)
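As a minimal sketch of this notation (not from the slides; the three-place shift is illustrative and anticipates the Caesar cipher below), E and D are inverse rules so that D(E(P)) = P:

```python
# Toy illustration of C = E(P) and P = D(C); the shift value is illustrative.

def E(plaintext: str, shift: int = 3) -> str:
    """Encryption rule: shift each letter forward in the alphabet."""
    return "".join(
        chr((ord(c) - ord("A") + shift) % 26 + ord("A")) if c.isalpha() else c
        for c in plaintext.upper()
    )

def D(ciphertext: str, shift: int = 3) -> str:
    """Decryption rule: shift each letter back, undoing E."""
    return E(ciphertext, -shift)

P = "ATTACK AT DAWN"
C = E(P)             # 'DWWDFN DW GDZQ'
assert D(C) == P     # D(E(P)) = P
```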
Encryption (diagram): plaintext → Encryption → ciphertext → Decryption → original plaintext, shown for keyless, symmetric-key, and asymmetric-key systems
Symmetric Encryption System
• Secret key
  • Both sender and receiver share one key
• Encryption and decryption algorithms are closely related
• N * (N-1) / 2 keys are needed for N users to communicate in pairs
• Key must be kept secret
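A quick check of the pairwise key count (a minimal sketch; the user counts are arbitrary):

```python
# One symmetric key per unordered pair of users: N * (N-1) / 2 keys in total.
def pairwise_keys(n: int) -> int:
    return n * (n - 1) // 2

print(pairwise_keys(10))    # 45
print(pairwise_keys(1000))  # 499500 -- key distribution quickly becomes the hard part
```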
Asymmetric Encryption System
• Public key
• One key must be kept secret, the other can be freely exposed: the private key and the public key
• Only the corresponding private key can decrypt what has been encrypted using the public key
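A short sketch of that relationship, assuming the third-party Python cryptography package (RSA with OAEP padding); the key size and message are illustrative, not part of the lecture:

```python
# pip install cryptography -- encrypt with the public key, decrypt with the private key.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # keep secret
public_key = private_key.public_key()                                         # freely exposed

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

ciphertext = public_key.encrypt(b"meet at dawn", oaep)
assert private_key.decrypt(ciphertext, oaep) == b"meet at dawn"
```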
Cryptanalysis
• How to break an encryption!
• Cryptanalyst
  • Deduce the original meaning of the ciphertext
  • Determine the decryption algorithm that matches the encryption one used
• Breakable encryption!
Substitution Ciphers
• Substitute a character or a symbol for each character of the original message
• Caesar cipher: Ci = pi + 3
• Permutation
  • Alphabet is scrambled; each plaintext letter maps to a unique ciphertext letter
  • A key can be used to control which permutation is used
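A minimal sketch of a keyed permutation (monoalphabetic substitution); deriving the scrambled alphabet by seeding a shuffle with the key is just one illustrative choice:

```python
import random
import string

def keyed_permutation(key: str) -> dict:
    """Scramble the alphabet deterministically from a key (illustrative scheme)."""
    shuffled = list(string.ascii_uppercase)
    random.Random(key).shuffle(shuffled)        # the key controls which permutation is used
    return dict(zip(string.ascii_uppercase, shuffled))

def substitute(text: str, table: dict) -> str:
    return "".join(table.get(c, c) for c in text.upper())

table = keyed_permutation("SECRET")
ciphertext = substitute("ATTACK AT DAWN", table)
inverse = {v: k for k, v in table.items()}      # decryption uses the inverse mapping
assert substitute(ciphertext, inverse) == "ATTACK AT DAWN"
```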
Cryptanalysis of Substitution Ciphers
• Clues
  • Short words
  • Words with repeated patterns
  • Common initial and final letters, ...
• Knowledge of the language may simplify it
  • In English, E, T, O, A occur far more often than J, Q, X, Z
  • Digrams, trigrams, and other patterns
  • Context
One-Time Pads
• One-time pad
  • Set of sheets of paper with keys, glued into a pad
  • Pre-arranged charts (Vigenère tableau)
• Vernam cipher
  • Random numbers
• Book ciphers
  • Access to identical objects
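A minimal sketch of a Vernam-style cipher over bytes, with Python's secrets module standing in for the pad of random numbers (the key handling here is purely illustrative):

```python
import secrets

def vernam_encrypt(plaintext: bytes):
    """XOR the message with a truly random, single-use key of the same length."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def vernam_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

c, k = vernam_encrypt(b"ATTACK AT DAWN")
assert vernam_decrypt(c, k) == b"ATTACK AT DAWN"
# The security argument depends on the key being random, as long as the message,
# kept secret, and never reused.
```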
Transposition Ciphers
• The order of letters is rearranged
• Columnar transposition
• Cryptanalysis using digrams
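A minimal columnar-transposition sketch (the five-column layout, padding character, and sample message are illustrative): write the message in rows, then read the ciphertext off column by column.

```python
def columnar_encrypt(plaintext: str, columns: int = 5, pad: str = "X") -> str:
    """Write the message row by row, then read the ciphertext off column by column."""
    text = plaintext.replace(" ", "").upper()
    if len(text) % columns:
        text += pad * (columns - len(text) % columns)
    rows = [text[i:i + columns] for i in range(0, len(text), columns)]
    return "".join(row[c] for c in range(columns) for row in rows)

print(columnar_encrypt("ATTACK POSTPONED UNTIL TWO"))  # AKPUTTPONWTONTOASEIXCTDLX
```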
Challenge
• Try breaking:
  fqjcb rwjwj vnjax bnkhj whxcq nawjv nfxdu mbvnu ujbbf nnc
• Frequency table of letters in the Concise Oxford Dictionary (9th ed., 1995)
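As a starting point, a minimal sketch that tallies letter frequencies in the challenge ciphertext so they can be compared against English letter frequencies (the most frequent ciphertext letters are candidate images of E, T, O, A):

```python
from collections import Counter

ciphertext = "fqjcb rwjwj vnjax bnkhj whxcq nawjv nfxdu mbvnu ujbbf nnc"
counts = Counter(c for c in ciphertext.upper() if c.isalpha())
print(counts.most_common())   # most frequent ciphertext letters first
```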
Lecture 3: Entropy
CS 450/650 Fundamentals of Integrated Computer Security
Slides adapted from David Madison
Ciphers
• The intent of cryptography is to provide secrecy for messages and data
• Substitution: 'hide' letters of the plaintext
• Transposition: scramble adjacent characters
Entropy
• Shannon demonstrated mathematical methods of treating communication channels, bandwidth, and the effects of random noise on signals
• Entropy: H = -Σ pi log2(pi), summed over i = 1 ... n
  • pi is the probability of a given message (or piece of information)
  • n is the number of possible messages (or pieces of information)
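A minimal sketch of that formula, checked against the three worked examples that follow (the function name is illustrative):

```python
import math

def entropy(probabilities) -> float:
    """H = -sum(p_i * log2(p_i)) over the n possible messages."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([1.0]))            # 0.0  -- Example 1: one certain message
print(entropy([0.5, 0.5]))       # 1.0  -- Example 2: two equally likely messages
print(entropy([2**-10] * 1024))  # 10.0 -- Example 3: 1024 equally likely messages
```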
Example 1
• Suppose there is only one possible signal
  • i.e., n = 1 and p1 = 1
• H = -(1 × log2 1) = 0
• There is only one possible message that has a probability of 1
• Since there is no uncertainty, the entropy in this case is zero
Example 2
• There are only two possible, equally probable, messages
• H = -(0.5 log2(0.5) + 0.5 log2(0.5)) = -(0.5 × (-1) + 0.5 × (-1)) = 1
• There are two possible equally probable messages, and the uncertainty (entropy) is 1 bit
  • One bit can specify two possible conditions, i.e., 0 or 1
Example 3
• There are 1024 (= 2^10) possible signals, all of equal probability (pi = 2^-10)
• H = -(2^10 × 2^-10 × log2(2^-10)) = 10
• There are 1024 equally probable possible messages, and the uncertainty (entropy) is 10 bits
Entropy
• Entropy gives an indication of the complexity, or randomness, of a message or a data set
• Generally, signals or data sets with high entropy:
  • Have a greater chance of a data transmission error
  • Require greater bandwidth to transmit
  • Have smaller capacity for compression
  • Appear to have a greater degree of "disorder"
Entropy
• The English language (and most other human languages) has a relatively low entropy due to the uneven frequency of certain characters
  • e.g., the letters 'e' and 't'
• Information can be compressed using algorithms that "squeeze out" the redundancies in a message
  • making the compressed version much smaller, and much more random
• Compressing a file twice doesn't reduce the size!
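A minimal sketch of the last point using Python's standard zlib module (the sample text is illustrative): the first compression removes the redundancy, so a second pass has nothing left to squeeze out.

```python
import zlib

data = b"to be or not to be, that is the question. " * 100   # highly redundant, low entropy
once = zlib.compress(data)
twice = zlib.compress(once)    # input now looks random, so this pass gains nothing

print(len(data), len(once), len(twice))   # the second result is no smaller than the first
```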
Entropy and Cryptography
• Through cryptography, we increase the uncertainty in the message for those who do not know the key
• Plaintext has an entropy of zero, as there is no uncertainty about it
  • This class is CS 450
• Encryption using one of x equally probable keys increases the entropy to lg x
  • KBXT LWER ACMF OSJU
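A quick check of the lg x figure (a minimal sketch; the key counts are illustrative):

```python
import math

# Choosing uniformly among x equally probable keys adds lg(x) bits of uncertainty.
for x in (2, 26, 2**56, 2**128):
    print(f"{x} keys -> {math.log2(x):.1f} bits")   # 1.0, 4.7, 56.0, 128.0
```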
Entropy and Cryptography
• With a perfect cipher, "all keys are essentially equivalent"
  • Having an encrypted sample won't help the cryptanalyst do his or her job
• An encrypted message is similar to a signal that is buried in noise
  • The higher the noise level, the more difficult it is to extract the message
• A good cipher will make a message look like noise
Entropy and Cryptography
• Encryption should "scramble" the original message to the maximum possible extent
• Algorithms should take a message through a sequence of substitutions and transpositions
• Shannon: "Encrypting a message will intentionally increase the message's entropy"
Shannon's Characteristics of 'Good' Ciphers
• "The amount of secrecy needed should determine the amount of labor appropriate for the encryption and decryption"
  • Hold off the interceptor for the required time duration
• "The set of keys and enciphering algorithm should be free from complexity"
  • There should be no restriction on the choice of keys or types of plaintext
• "The implementation of the process should be as simple as possible"
  • Hand implementation, software bugs
Shannon's Characteristics of 'Good' Ciphers
• "Errors in ciphering should not propagate and cause corruption of further information in the message"
  • An error early in the process should not throw off the entire remaining ciphertext
• "The size of the enciphered text should be no larger than the text of the original message"
  • A ciphertext that expands in size cannot possibly carry more information than the plaintext
Trustworthy Encryption Systems
• Commercial-grade encryption
  • Based on sound mathematics
  • Analyzed by competent experts
  • Has stood the test of time
• DES: Data Encryption Standard
• RSA: Rivest-Shamir-Adleman
• AES: Advanced Encryption Standard
Stream and Block Ciphers
• Stream cipher
  • Converts one symbol of plaintext into a symbol of ciphertext at a time
• Block cipher
  • Encrypts a group of plaintext symbols as one block
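A toy contrast of the two styles (a sketch only, not a secure construction; the XOR keystream and the 4-byte block scrambling are illustrative):

```python
def toy_stream_cipher(plaintext: bytes, keystream: bytes) -> bytes:
    """Stream style: each plaintext symbol becomes one ciphertext symbol."""
    return bytes(p ^ k for p, k in zip(plaintext, keystream))

def toy_block_cipher(plaintext: bytes, block_size: int = 4) -> bytes:
    """Block style: whole fixed-size blocks are transformed as a unit (here, reversed)."""
    padded = plaintext + b"X" * (-len(plaintext) % block_size)
    blocks = [padded[i:i + block_size] for i in range(0, len(padded), block_size)]
    return b"".join(block[::-1] for block in blocks)

print(toy_stream_cipher(b"HELLO", bytes([7, 11, 13, 17, 19])))
print(toy_block_cipher(b"HELLOWORLD"))   # b'LLEHROWOXXDL'
```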
Confusion and Diffusion
• Confusion
  • A complex relationship between plaintext, key, and ciphertext
  • The interceptor should not be able to predict what will happen to the ciphertext by changing one character in the plaintext
• Examples
  • Caesar cipher
  • One-time pad
Confusion and Diffusion
• Diffusion
  • The cipher should spread information from the plaintext over the entire ciphertext
  • The interceptor should need access to much of the ciphertext to infer the algorithm
• Examples
  • Caesar cipher
  • One-time pad
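To see this kind of behavior in a modern cipher, one illustrative experiment (AES and the third-party cryptography package are assumptions here, not part of the slide): flip a single plaintext bit and count how many ciphertext bits change within the block.

```python
# pip install cryptography -- illustrative avalanche check with AES on a single block.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def aes_encrypt_block(key: bytes, block: bytes) -> bytes:
    encryptor = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return encryptor.update(block) + encryptor.finalize()

key = os.urandom(16)
p1 = b"SIXTEEN BYTE MSG"              # exactly one 16-byte block
p2 = bytes([p1[0] ^ 0x01]) + p1[1:]   # flip a single plaintext bit

c1, c2 = aes_encrypt_block(key, p1), aes_encrypt_block(key, p2)
changed = sum(bin(a ^ b).count("1") for a, b in zip(c1, c2))
print(changed, "of 128 ciphertext bits changed")   # typically around half
```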