Data Compression: Basic Concepts of Information Theory. Instructor: 陳建源. Email: cychen07@nuk.edu.tw. Office: 法 401. Website: http://www.csie.nuk.edu.tw/~cychen/
1. Self-information Let S be a system of events E_1, E_2, ..., E_n in which P(E_k) = p_k, 0 ≤ p_k ≤ 1 and p_1 + p_2 + ... + p_n = 1. Def: The self-information of the event E_k is written I(E_k): I(E_k) = log(1/p_k) = -log p_k. The base of the logarithm: 2 (log), e (ln). Units: bit (base 2), nat (base e).
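A minimal sketch of this definition in Python; the helper name self_information and the base argument are illustrative, not part of the original slides.

```python
import math

def self_information(p, base=2):
    """Self-information I(E) = -log p of an event with probability p.

    base=2 gives bits, base=math.e gives nats.
    """
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log(p, base)

# A certain event carries no information; rarer events carry more.
print(self_information(1.0))    # 0.0 bits
print(self_information(0.5))    # 1.0 bit
print(self_information(0.25))   # 2.0 bits
```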
1. Self-information When p_k = 1, then I(E_k) = 0; when p_k → 0, then I(E_k) → ∞; when p_k = 1/2, then I(E_k) = 1 bit; when p_j < p_k, then I(E_j) > I(E_k). The smaller p_k is, the larger I(E_k) becomes.
1. Self-information Ex1. A letter is chosen at random from the English alphabet: I = log 26 ≈ 4.70 bits. Ex2. A binary number of m digits is chosen at random: I = log 2^m = m bits.
1. Self-information Ex3. 64 points are arranged in an 8×8 square grid. Let E_j be the event that a point picked at random lies in the jth column, and E_k the event that it lies in the kth row. Then I(E_j) = log 8 = 3 bits, I(E_k) = 3 bits, and I(E_j ∩ E_k) = log 64 = 6 bits = I(E_j) + I(E_k). Why? Because the column and the row of the chosen point are statistically independent, so their self-informations add.
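A quick numerical check of Ex1 to Ex3, reusing the self_information helper sketched above (an illustrative name, not from the slides); the value m = 10 in Ex2 is also just an example.

```python
import math

def self_information(p, base=2):
    return -math.log(p, base)

print(self_information(1 / 26))     # Ex1: ~4.70 bits for a random English letter
m = 10
print(self_information(2 ** -m))    # Ex2: m = 10 bits for a random 10-digit binary number
col = self_information(1 / 8)       # Ex3: 3 bits for the column of a point in an 8x8 grid
row = self_information(1 / 8)       # 3 bits for the row
joint = self_information(1 / 64)    # 6 bits for the exact point
print(col, row, joint)              # independence makes the informations add: 3 + 3 = 6
```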
2. Entropy Let S be the system with events E_1, E_2, ..., E_n, the associated probabilities being p_1, p_2, ..., p_n. Given a function f: E_k → f_k, let E(f) be the expectation (average, mean) of f: E(f) = Σ_k p_k f_k.
2. Entropy Def: The entropy of S, called H(S), is the average of the self-information: H(S) = E(I) = -Σ_k p_k log p_k. The self-information of an event increases as its uncertainty grows. Observation: let p_1 = 1 and p_k = 0 for k ≠ 1 (certainty); then H(S) = 0. The minimum value is 0, meaning the outcome is already determined. But what is the maximum?
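A small sketch of the entropy of a discrete system in Python; entropy is an illustrative helper name, and the usual convention 0·log 0 = 0 is assumed.

```python
import math

def entropy(probs, base=2):
    """H(S) = -sum p_k log p_k, with the convention 0 * log 0 = 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([1.0, 0.0]))     # 0.0: a certain outcome carries no uncertainty
print(entropy([0.5, 0.5]))     # 1.0 bit: a fair coin
print(entropy([0.25] * 4))     # 2.0 bits: four equally likely events
```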
2. Entropy Thm: H(S) ≤ log n, with equality only when p_1 = p_2 = ... = p_n = 1/n. Proof: the proof rests on the elementary inequality in Thm 2.2 below.
2. Entropy Thm 2.2: For x > 0, ln x ≤ x - 1, with equality only when x = 1. To prove H(S) ≤ log n, assume that p_k ≠ 0 for every k and apply Thm 2.2 with x = 1/(n p_k), as sketched below.
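A worked sketch of the standard argument, written out here because the slide's derivation did not survive extraction; it assumes every p_k > 0 as stated above.

```latex
\[
H(S) - \log n \;=\; \sum_{k=1}^{n} p_k \log\frac{1}{n p_k}
  \;=\; \frac{1}{\ln 2}\sum_{k=1}^{n} p_k \ln\frac{1}{n p_k}
  \;\le\; \frac{1}{\ln 2}\sum_{k=1}^{n} p_k\!\left(\frac{1}{n p_k} - 1\right)
  \;=\; \frac{1}{\ln 2}\,(1 - 1) \;=\; 0 ,
\]
with equality only when \(1/(n p_k) = 1\) for every \(k\), i.e. \(p_k = 1/n\).
```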
2. Entropy Exercise: In the system S the probabilities p_1 and p_2, where p_2 > p_1, are replaced by p_1 + ε and p_2 - ε respectively, under the proviso 0 < 2ε < p_2 - p_1. Prove that H(S) is increased. We know that entropy H(S) can be viewed as a measure of _____ about S. Please list 3 items for this blank: information, uncertainty, randomness.
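A numeric sanity check of the exercise (not a proof): evening out two probabilities under the stated proviso raises H(S). The entropy helper and the specific numbers are illustrative.

```python
import math

def entropy(probs, base=2):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

p1, p2, rest = 0.1, 0.5, [0.4]        # p2 > p1, remaining probabilities untouched
for eps in (0.05, 0.10, 0.15):        # each satisfies 0 < 2*eps < p2 - p1
    before = entropy([p1, p2] + rest)
    after = entropy([p1 + eps, p2 - eps] + rest)
    print(eps, before, after, after > before)   # expect True in every row
```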
3. Mutual information Let S1 be the system with events E_1, E_2, ..., E_n, the associated probabilities being p_1, p_2, ..., p_n. Let S2 be the system with events F_1, F_2, ..., F_m, the associated probabilities being q_1, q_2, ..., q_m.
3. Mutual information Two systems S1 and S2 are described jointly by the probabilities p_{jk} = P(E_j ∩ F_k), satisfying the relations Σ_k p_{jk} = p_j and Σ_j p_{jk} = q_k, and hence Σ_j Σ_k p_{jk} = 1.
3. Mutual information Conditional probability: P(E_j | F_k) = p_{jk} / q_k. Conditional self-information: I(E_j | F_k) = -log P(E_j | F_k). Mutual information: I(E_j ; F_k) = log [P(E_j | F_k) / P(E_j)] = log [p_{jk} / (p_j q_k)]. NOTE: I(E_j ; F_k) = I(F_k ; E_j); the pairwise mutual information is symmetric.
3. Mutual information Conditional entropy: H(S1 | S2) = -Σ_j Σ_k p_{jk} log P(E_j | F_k). Mutual information of the two systems: I(S1 ; S2) = Σ_j Σ_k p_{jk} log [p_{jk} / (p_j q_k)], i.e. the average of I(E_j ; F_k) over the joint distribution.
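A compact sketch of these averages for a joint distribution given as a matrix; the names joint, entropy, and the example numbers are illustrative, not from the slides.

```python
import math

def entropy(probs, base=2):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Joint probabilities p_jk for a small example: rows are events E_j of S1,
# columns are events F_k of S2.
joint = [[0.30, 0.10],
         [0.20, 0.40]]

p = [sum(row) for row in joint]            # marginals p_j of S1
q = [sum(col) for col in zip(*joint)]      # marginals q_k of S2

# Conditional entropy H(S1|S2) = -sum p_jk log P(E_j|F_k), with P(E_j|F_k) = p_jk/q_k
h_s1_given_s2 = -sum(p_jk * math.log(p_jk / q[k], 2)
                     for j, row in enumerate(joint)
                     for k, p_jk in enumerate(row) if p_jk > 0)

# Mutual information I(S1;S2) = sum p_jk log(p_jk / (p_j q_k))
mutual = sum(p_jk * math.log(p_jk / (p[j] * q[k]), 2)
             for j, row in enumerate(joint)
             for k, p_jk in enumerate(row) if p_jk > 0)

print(h_s1_given_s2, mutual)
print(entropy(p) - h_s1_given_s2)   # equals the mutual information (see the identities below)
```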
3. Mutual information Mutual information and conditional self-information: I(E_j ; F_k) = I(E_j) - I(E_j | F_k). If E_j and F_k are statistically independent, then p_{jk} = p_j q_k and I(E_j ; F_k) = 0.
3. Mutual information Joint entropy: H(S1, S2) = -Σ_j Σ_k p_{jk} log p_{jk}. Joint entropy and conditional entropy: H(S1, S2) = H(S2) + H(S1 | S2) = H(S1) + H(S2 | S1).
3. Mutual information Mutual information and conditional entropy: I(S1 ; S2) = H(S1) - H(S1 | S2) = H(S2) - H(S2 | S1) = H(S1) + H(S2) - H(S1, S2).
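The identities above can be checked numerically; the sketch below reuses the illustrative joint matrix and helpers from the previous example.

```python
import math

def entropy(probs, base=2):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

joint = [[0.30, 0.10],
         [0.20, 0.40]]
p = [sum(row) for row in joint]
q = [sum(col) for col in zip(*joint)]
flat = [x for row in joint for x in row]

h1, h2 = entropy(p), entropy(q)
h12 = entropy(flat)                          # joint entropy H(S1,S2)
h_1_given_2 = h12 - h2                       # H(S1|S2) = H(S1,S2) - H(S2)
h_2_given_1 = h12 - h1                       # H(S2|S1) = H(S1,S2) - H(S1)
i12 = sum(joint[j][k] * math.log(joint[j][k] / (p[j] * q[k]), 2)
          for j in range(2) for k in range(2) if joint[j][k] > 0)

print(math.isclose(i12, h1 - h_1_given_2))   # I = H(S1) - H(S1|S2)
print(math.isclose(i12, h2 - h_2_given_1))   # I = H(S2) - H(S2|S1)
print(math.isclose(i12, h1 + h2 - h12))      # I = H(S1) + H(S2) - H(S1,S2)
```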
3. Mutual information Thm: The mutual information of two systems cannot exceed the sum of their separate entropies: I(S1 ; S2) ≤ H(S1) + H(S2).
3. Mutual information Statistical independence of systems: S1 and S2 are statistically independent when p_{jk} = p_j q_k for all j, k. If S1 and S2 are statistically independent, then I(S1 ; S2) = 0 and H(S1, S2) = H(S1) + H(S2): the joint entropy of two statistically independent systems is the sum of their separate entropies.
3. Mutual information Thm: I(S1 ; S2) ≥ 0, with equality only if S1 and S2 are statistically independent; equivalently, H(S1, S2) ≤ H(S1) + H(S2). Proof: Assume that p_{jk} ≠ 0 for every pair (j, k) and apply Thm 2.2 with x = p_j q_k / p_{jk}, as sketched below.
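A worked sketch of this proof, filling in the step that did not survive extraction; it assumes p_{jk} ≠ 0 as stated above.

```latex
\[
-I(S_1;S_2) \;=\; \sum_{j,k} p_{jk}\,\log\frac{p_j q_k}{p_{jk}}
  \;=\; \frac{1}{\ln 2}\sum_{j,k} p_{jk}\,\ln\frac{p_j q_k}{p_{jk}}
  \;\le\; \frac{1}{\ln 2}\sum_{j,k} p_{jk}\!\left(\frac{p_j q_k}{p_{jk}} - 1\right)
  \;=\; \frac{1}{\ln 2}\Bigl(\sum_{j,k} p_j q_k \;-\; \sum_{j,k} p_{jk}\Bigr) \;=\; 0 ,
\]
so \(I(S_1;S_2) \ge 0\), with equality only when \(p_j q_k / p_{jk} = 1\) for every pair,
i.e. when the systems are statistically independent.
```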
3. Mutual information Ex: A binary symmetric channel with crossover probability ε. [Channel diagram: input 0 → output 0 with probability 1-ε, input 0 → output 1 with probability ε, input 1 → output 0 with probability ε, input 1 → output 1 with probability 1-ε.] Let S1 be the input, E_0 = 0, E_1 = 1, and S2 be the output, F_0 = 0, F_1 = 1.
3. Mutual information Assume that P(E_0) = a and P(E_1) = 1 - a. Then the joint probabilities are p_{00} = a(1-ε), p_{01} = aε, p_{10} = (1-a)ε, p_{11} = (1-a)(1-ε).
3. Mutual information Compute the output probabilities: q_0 = P(F_0) = a(1-ε) + (1-a)ε and q_1 = P(F_1) = aε + (1-a)(1-ε). If a = 1/2, then q_0 = q_1 = 1/2: a symmetric channel with a uniform input produces a uniform output.
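A sketch of the joint and output probabilities for the binary symmetric channel; the numeric values of a and ε are illustrative.

```python
# Binary symmetric channel: joint distribution p_jk = P(input j, output k)
a, eps = 0.5, 0.1        # illustrative input probability P(E0) = a and crossover eps

joint = [[a * (1 - eps),        a * eps],              # input 0 -> output 0, 1
         [(1 - a) * eps,        (1 - a) * (1 - eps)]]  # input 1 -> output 0, 1

p = [sum(row) for row in joint]          # input marginals (a, 1-a)
q = [sum(col) for col in zip(*joint)]    # output marginals q0, q1
print(p, q)                              # with a = 1/2 the output is uniform too
```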
3. Mutual information Compute the mutual information: I(S1 ; S2) = H(S2) - H(S2 | S1). For the binary symmetric channel, H(S2 | S1) = -ε log ε - (1-ε) log(1-ε) = H(ε), regardless of the input distribution, so I(S1 ; S2) = H(S2) - H(ε). In particular, when a = 1/2 the output is uniform, H(S2) = 1 bit, and I(S1 ; S2) = 1 - H(ε); the mutual information vanishes at ε = 1/2 and is largest at ε = 0 or ε = 1.
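A numerical check that I(S1;S2) = H(S2) - H(ε) for the channel above, compared against the direct double-sum definition; helper names and the values of a and ε are illustrative.

```python
import math

def entropy(probs, base=2):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

a, eps = 0.5, 0.1
joint = [[a * (1 - eps), a * eps],
         [(1 - a) * eps, (1 - a) * (1 - eps)]]
p = [sum(row) for row in joint]
q = [sum(col) for col in zip(*joint)]

# Direct definition: I = sum p_jk log(p_jk / (p_j q_k))
i_direct = sum(joint[j][k] * math.log(joint[j][k] / (p[j] * q[k]), 2)
               for j in range(2) for k in range(2) if joint[j][k] > 0)

# Closed form for the binary symmetric channel: I = H(S2) - H(eps)
i_closed = entropy(q) - entropy([eps, 1 - eps])

print(i_direct, i_closed, math.isclose(i_direct, i_closed))
```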
4. Differential entropy Def: The differential entropy of a probability density f(x) is defined by h(f) = -∫ f(x) log f(x) dx, whenever the integral exists. NOTE: • The entropy of a continuous distribution need not exist. • Differential entropy may be negative. Compare with the discrete case, where the entropy H(S) is the average of the self-information and is always non-negative.
4. Differential entropy Example: Consider a random variable distributed uniformly from 0 to a, so that its density is 1/a on (0, a) and 0 elsewhere. Then its differential entropy is h = -∫_0^a (1/a) log(1/a) dx = log a. For a < 1 this is negative, and for a = 1 it is zero.
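A numeric sanity check of the uniform example, approximating the integral with a simple Riemann sum; the grid size and the values of a are illustrative.

```python
import math

def uniform_differential_entropy(a, n=100000):
    """Riemann-sum approximation of -integral over (0, a) of (1/a) * log2(1/a) dx."""
    dx = a / n
    return -sum((1 / a) * math.log(1 / a, 2) * dx for _ in range(n))

for a in (0.5, 1.0, 4.0):
    print(a, uniform_differential_entropy(a), math.log(a, 2))
# a = 0.5 gives -1 bit (negative!), a = 1 gives 0, a = 4 gives 2 bits.
```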