ECE 101: An Introduction to Information Technology – Information Theory
Information Path • Block diagram: Source of Information → Digital Sensor → Information Processor & Transmitter → Transmission Medium → Information Receiver and Processor → Information Display
Information Theory • A source generates information by producing data units called symbols • Measurement of the information present • measure randomness (the value of information) • do this mathematically using probability • the amount of information present is measured by the "entropy"
Probability • Study of random outcomes • The experiment • The outcome • P[Xi] = probability of a particular outcome (Xi) • 0 ≤ P[Xi] ≤ 1 • P[X1] + P[X2] + … + P[XN] = 1, where N = number of different outcomes
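As a concrete check of these rules, a minimal sketch in Python, assuming a fair six-sided die as the experiment (the die is just an example, not from the slides):

```python
# Minimal sketch: probabilities of a fair six-sided die (hypothetical example).
outcomes = [1, 2, 3, 4, 5, 6]                      # the N = 6 possible outcomes Xi
p = {x: 1.0 / len(outcomes) for x in outcomes}     # P[Xi] = 1/6 for each outcome

# Every probability lies between 0 and 1 ...
assert all(0.0 <= p[x] <= 1.0 for x in outcomes)
# ... and the N probabilities sum to 1.
assert abs(sum(p.values()) - 1.0) < 1e-12
print(p)
```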
Measuring Information • Symbol – data unit of information • Entropy H • average amount of information that a source produces, measured in bits/symbol • H = -( P[X1] log2 P[X1] + P[X2] log2 P[X2] + … + P[XN] log2 P[XN] )
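A small Python sketch of this definition (the symbol probabilities below are invented for the example): entropy is the average information per symbol, and it is largest when all symbols are equally likely.

```python
import math

def entropy(probabilities):
    """Entropy H = -sum(P[Xi] * log2 P[Xi]), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical source with four symbols: equally likely symbols give the
# maximum entropy, log2(4) = 2 bits/symbol.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
# A less random (more predictable) source carries less information per symbol.
print(entropy([0.7, 0.1, 0.1, 0.1]))       # about 1.357
```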
Logarithms – Base 2 • In information theory we need logs to base 2, not base 10 (log10 N = x or 10^x = N) (logs are exponents) • log2 N = x or 2^x = N • 2^0 = 1; log2 1 = 0 • 2^1 = 2; log2 2 = 1 • 2^2 = 4; log2 4 = 2 • 2^3 = 8; log2 8 = 3 • 2^4 = 16; log2 16 = 4 • 2^5 = 32; log2 32 = 5
Logarithms – Base "a" (then a = 2) • Conversion of bases in general: • loga N = x or a^x = N • So log2 N = x or 2^x = N • loga N = (log10 N) / (log10 a) • If a = 2, then use log10 2 = 0.301 • log2 N = 3.32 (log10 N) • loga (M N) = (loga M) + (loga N) • loga (M/N) = (loga M) - (loga N) • loga N^m = m (loga N)
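A short sketch of these identities in Python (math.log2 and math.log10 are standard library functions; the values of N and M below are arbitrary):

```python
import math

N = 1000.0
M = 8.0

# Change of base: log2(N) = log10(N) / log10(2), about 3.32 * log10(N)
print(math.log2(N))                     # 9.9658...
print(math.log10(N) / math.log10(2))    # same value
print(3.32 * math.log10(N))             # close approximation, since 1/0.301 is about 3.32

# Product, quotient, and power rules
print(math.log2(M * N),  math.log2(M) + math.log2(N))
print(math.log2(M / N),  math.log2(M) - math.log2(N))
print(math.log2(N ** 3), 3 * math.log2(N))
```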
Effective Probability and Entropy • Measurement of entropy when the probability is not known • estimate the probability from observed data • effective probability Pe[Xi] = NXi / N, where NXi = number of times outcome Xi occurs in N observed symbols
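A minimal sketch of this estimate (the symbol stream below is made up for illustration): count how often each symbol appears, divide by the total number of symbols N to get Pe[Xi], then plug the estimates into the entropy formula.

```python
import math
from collections import Counter

# Hypothetical observed symbol stream (N = 12 symbols).
data = list("AAABBBBCCDDD")

counts = Counter(data)
N = len(data)

# Effective probability Pe[Xi] = NXi / N
pe = {symbol: n_xi / N for symbol, n_xi in counts.items()}
print(pe)

# Estimated entropy in bits/symbol, using the effective probabilities.
H = -sum(p * math.log2(p) for p in pe.values())
print(H)   # about 1.96 bits/symbol for this stream
```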
Simulating Randomness by Computer • Information has an unexpected (random) quality • Model it as an experiment that produces random outcomes • Common method: pseudo-random number generator (PRNG) • A PRNG uses modular arithmetic
Modular Arithmetic • [B]mod(N) = modulo-N value of integer B • Divide B by N: B/N = I + R/N • where I is the integer quotient and R is the remainder • 0 ≤ R ≤ (N - 1) • [B]mod(N) = R = B - (I × N) • or R = (B/N - I) × N, where B/N = I.xxx
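A small sketch of this operation (for non-negative integers, Python's % operator computes the same remainder; the numbers below are arbitrary examples):

```python
def mod(B, N):
    """[B]mod(N): remainder R after dividing B by N, with 0 <= R <= N - 1."""
    I = B // N           # integer quotient
    R = B - I * N        # remainder, same as B % N for non-negative B
    return R

print(mod(17, 5), 17 % 5)   # both print 2  (17 = 3*5 + 2)
print(mod(20, 4), 20 % 4)   # both print 0
```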
Pseudo-Random Number Generator • Create random numbers from a sequence X1, X2, X3, …, Xn, … where Xn is the nth integer in the sequence • Find Xn = [A × Xn-1 + B]mod(N) where • A is an arbitrary multiplier of Xn-1 • N is the modulus • B prevents the sequence from degenerating into a sequence of zeroes • to get started we need an arbitrary X0, the seed
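A minimal sketch of this generator in Python (the constants A, B, N and the seed below are arbitrary example values, not recommended choices):

```python
def lcg(seed, A=21, B=7, N=100, count=10):
    """Linear congruential generator: Xn = [A * Xn-1 + B] mod N."""
    x = seed                      # arbitrary starting value X0 (the seed)
    sequence = []
    for _ in range(count):
        x = (A * x + B) % N       # next pseudo-random integer in 0 .. N-1
        sequence.append(x)
    return sequence

print(lcg(seed=13))   # [80, 87, 34, 21, 48, 15, 22, 69, 56, 83]
```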
Arbitrary Range for Pseudo-Random Numbers • Desire a range other than the integers 0 to N-1 • then divide Xn by N to get a fraction between 0 and 1, and scale and shift that fraction into the desired range
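A sketch of that scaling step, continuing the generator above (the target range 5.0 to 10.0 is just an example):

```python
def scaled_random(x_n, N, low, high):
    """Map a PRNG output Xn in 0 .. N-1 onto the range [low, high)."""
    fraction = x_n / N            # value between 0 and 1
    return low + (high - low) * fraction

# Example: map Xn = 83 from a modulus-100 generator onto the range 5.0 .. 10.0
print(scaled_random(83, N=100, low=5.0, high=10.0))   # 9.15
```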