This review delves into key concepts of information quantity, average information, and coding methods in noiseless systems. Explore important properties of codes, key parameters, and how to efficiently match source to channel for optimal transmission.
A review of the important points of Part I • Quantity of Information (noiseless system): a) depends on the probability of the event; b) depends on the length of the message. An event of probability p conveys I = -log2(p) bits of information. • Average Information: Entropy. For a source producing many symbols with probabilities p1, p2, ..., pn, the average information per symbol is the entropy H = -Σ pi log2(pi) bits/symbol.
Maximum entropy: a source of n equally likely symbols has the largest possible entropy, Hmax = log2(n). For a binary source with symbol probabilities p and 1-p, H = -p log2(p) - (1-p) log2(1-p), which reaches its maximum of 1 bit/symbol at p = 0.5.
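As a quick check of these formulas, here is a minimal Python sketch (the function name is illustrative, not from the original slides) that evaluates the entropy of a binary source for a few values of p and confirms the maximum of 1 bit at p = 0.5:

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy H = -p*log2(p) - (1-p)*log2(1-p) of a binary source."""
    if p in (0.0, 1.0):
        return 0.0  # a certain event carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"p = {p:.1f}  ->  H = {binary_entropy(p):.3f} bits")
# The maximum, 1 bit/symbol, occurs at p = 0.5 (equiprobable symbols).
```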
Redundancy: R = 1 - H/Hmax, the fraction of the maximum possible entropy that the source does not use. • Conditional entropy H(j|i): if there is intersymbol influence, the average information is given by H(j|i) = -Σi Σj p(i, j) log2 p(j|i), where p(j|i) is the conditional probability (probability of j given i) and p(i, j) is the joint probability.
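A minimal sketch of the conditional-entropy and redundancy calculations; the joint-probability table below is invented purely for illustration and is not from the slides:

```python
import math

def conditional_entropy(joint):
    """H(j|i) = -sum over i,j of p(i,j) * log2(p(j|i)), with p(j|i) = p(i,j)/p(i)."""
    h = 0.0
    for row in joint:            # each row holds the joint probabilities for one symbol i
        p_i = sum(row)           # marginal probability p(i)
        for p_ij in row:
            if p_ij > 0:
                h -= p_ij * math.log2(p_ij / p_i)
    return h

# Hypothetical joint probabilities p(i, j) for a two-symbol source with
# intersymbol influence (row = previous symbol i, column = next symbol j).
joint = [[0.4, 0.1],
         [0.1, 0.4]]
H_cond = conditional_entropy(joint)
H_max = math.log2(2)             # maximum entropy for a two-symbol source
print(f"H(j|i) = {H_cond:.3f} bits, redundancy = {1 - H_cond / H_max:.3f}")
```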
Coding in a noiseless channel: source coding (speed of transmission is the main consideration). • Important properties of codes • uniquely decodable (all combinations of code words are distinct) • instantaneous (no code word is a prefix of another) • compact (shorter code words are given to the more probable symbols)
Important parameters: the average code-word length L = Σ pi li, where li is the length (in binary digits) of the code word for symbol i, and the coding efficiency E = H/L. • Coding methods • Fano-Shannon method • Huffman’s method
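A short sketch of how these parameters are computed from a set of symbol probabilities and code-word lengths; the numbers below are a made-up example, not one from the slides:

```python
import math

def code_parameters(probs, lengths):
    """Average length L = sum(p_i * l_i), efficiency E = H/L, redundancy = 1 - E."""
    L = sum(p * l for p, l in zip(probs, lengths))
    H = -sum(p * math.log2(p) for p in probs if p > 0)
    return L, H / L, 1 - H / L

probs = [0.4, 0.3, 0.2, 0.1]          # hypothetical source
lengths = [1, 2, 3, 3]                # e.g. code words 0, 10, 110, 111
L, E, R = code_parameters(probs, lengths)
print(f"L = {L:.2f} digits/symbol, E = {E:.2f}, redundancy = {R:.2f}")
```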
Coding methods • Fano-Shannon method (a sketch follows below): 1. Write the symbols in a table in descending order of probability; 2. Insert dividing lines to successively divide the probabilities into halves, quarters, etc. (or as near as possible); 3. Add a ‘0’ and a ‘1’ to the code at each division; 4. The final code for each symbol is obtained by reading from the left of the table towards each symbol.
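A rough Python sketch of the tabular procedure above (function and variable names are my own). For the five-symbol source used later in this review it produces code words that differ from the ones quoted there, but the word lengths (1, 2, 3, 4, 4), and hence the average length of 2.0 digits/symbol, agree:

```python
def fano_shannon(symbols):
    """Shannon-Fano coding. symbols is a list of (name, probability) pairs;
    returns a dict mapping each name to its code word."""
    codes = {name: "" for name, _ in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(p for _, p in group)
        # Find the dividing line that makes the two parts as nearly equal as possible.
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(2 * running - total)
            if diff < best_diff:
                best_i, best_diff = i, diff
        top, bottom = group[:best_i], group[best_i:]
        for name, _ in top:            # '0' for the upper part ...
            codes[name] += "0"
        for name, _ in bottom:         # ... '1' for the lower part
            codes[name] += "1"
        split(top)
        split(bottom)

    # Step 1: table in descending order of probability.
    split(sorted(symbols, key=lambda s: s[1], reverse=True))
    return codes

print(fano_shannon([("S1", 0.5), ("S2", 0.2), ("S3", 0.1), ("S4", 0.1), ("S5", 0.1)]))
```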
Coding methods • Huffman’s method (a sketch follows below): 1. Write the symbols in a table in descending order of probability; 2. Add the two lowest probabilities together and reorder the table; repeat until only two probabilities remain; 3. Place a ‘0’ or ‘1’ on each branch at every combination; 4. The final code for each symbol is obtained by reading back from the final column towards each symbol.
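Below is a compact heap-based sketch of Huffman’s method rather than the exact tabular layout of the slides. Huffman codes are not unique: tie-breaking between equal probabilities can give different word lengths than the table that follows (here 1, 3, 3, 3, 3 instead of 1, 2, 3, 4, 4), but the average length is the same 2.0 digits/symbol:

```python
import heapq
import itertools

def huffman(symbols):
    """Huffman coding. symbols is a list of (name, probability) pairs;
    the two least probable entries are merged repeatedly."""
    counter = itertools.count()               # tie-breaker so heap entries always compare
    heap = [(p, next(counter), {name: ""}) for name, p in symbols]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, group0 = heapq.heappop(heap)   # two smallest probabilities
        p1, _, group1 = heapq.heappop(heap)
        for name in group0:                   # one branch gets '0' ...
            group0[name] = "0" + group0[name]
        for name in group1:                   # ... the other gets '1'
            group1[name] = "1" + group1[name]
        heapq.heappush(heap, (p0 + p1, next(counter), {**group0, **group1}))
    return heap[0][2]

print(huffman([("S1", 0.5), ("S2", 0.2), ("S3", 0.1), ("S4", 0.1), ("S5", 0.1)]))
```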
Codes (for a five-symbol source with probabilities 0.5, 0.2, 0.1, 0.1, 0.1): S1: 0, S2: 11, S3: 101, S4: 1000, S5: 1001
L = 0.5×1 + 0.2×2 + 0.1×3 + 2×0.1×4 = 2.0 digits/symbol; H = 1.96 bits/symbol; E = H/L = 0.98
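These figures are easy to verify; the snippet below also confirms that the code is instantaneous (no code word is a prefix of another):

```python
import math
from itertools import combinations

codes = {"S1": "0", "S2": "11", "S3": "101", "S4": "1000", "S5": "1001"}
probs = {"S1": 0.5, "S2": 0.2, "S3": 0.1, "S4": 0.1, "S5": 0.1}

# Instantaneous check: no code word may be a prefix of another.
prefix_free = all(not a.startswith(b) and not b.startswith(a)
                  for a, b in combinations(codes.values(), 2))

L = sum(probs[s] * len(codes[s]) for s in codes)     # average length
H = -sum(p * math.log2(p) for p in probs.values())   # source entropy
print(prefix_free, round(L, 2), round(H, 2), round(H / L, 2))
# -> True 2.0 1.96 0.98
```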
Shannon’s first theorem: Shannon proved formally that if the source symbols are coded in groups of n, the average length per symbol tends to the source entropy H as n tends to infinity. Consequently, a further increase in efficiency can be obtained by grouping the source symbols (in pairs, threes, ...) and applying the coding procedure to the probabilities of the chosen groups; the sketch below illustrates the effect. • Matching source to channel: the coding process is sometimes known as ‘matching source to channel’, that is, making the output of the coder as suitable as possible for the channel.
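A sketch of the effect for a hypothetical binary source with p = 0.8 (the helper names and numbers here are illustrative, not from the slides): coded symbol by symbol the source cannot do better than 1 digit/symbol, while Huffman-coding pairs already brings the efficiency noticeably closer to 1:

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Code-word lengths of a Huffman code for the given probabilities."""
    counter = itertools.count()
    heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, a = heapq.heappop(heap)
        p1, _, b = heapq.heappop(heap)
        for i in a + b:
            lengths[i] += 1      # every merge adds one digit to the members' codes
        heapq.heappush(heap, (p0 + p1, next(counter), a + b))
    return lengths

p = 0.8                          # hypothetical binary source
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
# Coded singly, a binary source always needs 1 digit/symbol.
print(f"singly:   E = {H / 1:.3f}")
# Coded in pairs: four pair probabilities, Huffman-coded, halved to get digits/symbol.
pair_probs = [p * p, p * (1 - p), (1 - p) * p, (1 - p) ** 2]
L_pair = sum(q * l for q, l in zip(pair_probs, huffman_lengths(pair_probs))) / 2
print(f"in pairs: E = {H / L_pair:.3f}")   # closer to 1, as the theorem predicts
```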
Example: An information source produces a long sequence of three independent symbols A, B, C with probabilities 16/20, 3/20 and 1/20 respectively; 100 such symbols are produced per second. The information is to be transmitted via a noiseless binary channel which can transmit up to 100 binary digits per second. Design a suitable compact instantaneous code and find the probabilities of the binary digits produced. [Block diagram: source → coder → binary channel (digits 0, 1) → decoder, at 100 symbols/s] With P(A) = 16/20, P(B) = 3/20, P(C) = 1/20, coding the symbols singly by the Fano-Shannon method gives P(0) = 0.73, P(1) = 0.27.
Coding singly needs 1.2 digits/symbol on average, i.e. 120 digits/s, which exceeds the 100 digits/s the channel can carry, so the symbols are coded in pairs instead. Coding in pairs: L = 1.865 digits per pair, so R = 93.25 binary digits/s (within the channel limit); p(0) = 0.547, p(1) = 0.453. The entropy of the output digit stream is -(p(0) log2 p(0) + p(1) log2 p(1)) = 0.993 bits per digit, close to the maximum value of 1 bit (reached when p(0) = p(1)).
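The quoted output entropy can be checked from the stated digit probabilities (small rounding differences aside):

```python
import math

p0, p1 = 0.547, 0.453    # digit probabilities quoted above for the pair code
H_out = -(p0 * math.log2(p0) + p1 * math.log2(p1))
print(f"entropy of the output digit stream = {H_out:.2f} bits/digit")  # close to 1 bit
```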