Introduction to Data Communication: the discrete channel model
A.J. Han Vinck, University of Essen, April 2005
content
• communication model
• transmission model
• MAP-ML receiver
• burst error model
• interleaving: block, convolutional
• several models
The communication model
[Diagram: source → data reduction/compression (k message bits → k‘ bits) → data protection (k‘ bits → n code bits) → channel → decoder → message reconstruction → sink]
Point-to-point
[Diagram: message → transmitter (signal generator, modem) → physical channel → receiver (modem, signal processor) → message; bits enter and leave the modems]
transmission model (OSI)
[Diagram: the Data Link Control layers at both ends exchange reliable packets over a link; the Physical layers below them provide only unreliable transmission of bits]
transmission channel model
input xi → transition probabilities P(y|xi) → output y
• memoryless: the output depends only on the current input
• input and output alphabets are finite
binary symmetric channel model (BSC)
[Diagram: 0 → 0 and 1 → 1 with probability 1-p; 0 → 1 and 1 → 0 with probability p]
yi = xi ⊕ ei
E is the binary error sequence with P(1) = 1 - P(0) = p
Xi is the binary information sequence for message i
Y is the binary output sequence
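A minimal simulation sketch of the BSC (the function name bsc and the use of Python's random module are illustrative choices, not part of the slides):

```python
import random

def bsc(bits, p, rng=random.Random(0)):
    """Binary symmetric channel: flip each input bit independently
    with crossover probability p (y_i = x_i XOR e_i)."""
    errors = [1 if rng.random() < p else 0 for _ in bits]
    return [x ^ e for x, e in zip(bits, errors)]

y = bsc([0, 0, 0, 0, 0, 0, 0, 0], p=0.1)   # roughly one bit in ten flipped
```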
Error probability (MAP)
Suppose the decision is message i for a received vector Y.
Then the probability of a correct decision = P( Xi transmitted | Y received ).
Hence, decide the i that maximizes P( Xi transmitted | Y received )
(Maximum A Posteriori probability, MAP)
Maximum Likelihood (ML) receiver
find i that maximizes
P( Xi | Y ) = P( Xi , Y ) / P( Y ) = P( Y | Xi ) P( Xi ) / P( Y )
For equally likely Xi this is equivalent to finding the i that maximizes P( Y | Xi )
example
For p = 0.1 and
X1 = ( 0 0 ) with P( X1 ) = 1/3
X2 = ( 1 1 ) with P( X2 ) = 2/3
give your MAP and ML decision for Y = ( 0 1 ). (A worked sketch follows below.)
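A worked sketch of this example (the helper name likelihood is illustrative): on a BSC, P( Y | Xi ) = p^d (1-p)^(n-d), where d is the Hamming distance between Y and Xi.

```python
def likelihood(y, x, p):
    """P(Y | X) on a BSC: p^d * (1-p)^(n-d), d = Hamming distance."""
    d = sum(yi != xi for yi, xi in zip(y, x))
    return p ** d * (1 - p) ** (len(y) - d)

p, y = 0.1, (0, 1)
priors = {(0, 0): 1/3, (1, 1): 2/3}
ml_word = max(priors, key=lambda x: likelihood(y, x, p))
map_word = max(priors, key=lambda x: likelihood(y, x, p) * priors[x])
# Both candidates lie at Hamming distance 1 from Y, so the ML metric
# ties (0.09 each); the MAP metric prefers (1, 1) via its prior 2/3.
```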
Something to think about
[Diagram: message → compression (MPEG, JPEG, etc.) → protection of bits (error correction) → channel → correction of incorrect bits → decompression → message]
Compression reduces the bit rate; protection increases the bit rate.
Bit protection
• obtained by Error Control Codes (ECC)
• Forward Error Correction (FEC)
• Error Detection and feedback (ARQ)
• performance depends on the error statistics!
• error models are therefore very important
Error control code with rate k/n
[Diagram: message → encoder (code book) → code word of length n → channel → receive processing → decoder → message estimate]
The code book contains all 2^k code words of length n.
example
Transmit: 0 0 0 or 1 1 1.
How many errors can we correct? How many errors can we detect?
Transmit: A = 00000; B = 01011; C = 10101; D = 11110.
How many errors can we correct? How many errors can we detect? What is the difference?
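A small sketch for checking such questions numerically (the helper dmin is an illustrative name): a code with minimum Hamming distance d detects d-1 errors and corrects floor((d-1)/2).

```python
from itertools import combinations

def dmin(code):
    """Minimum Hamming distance over all pairs of code words."""
    return min(sum(a != b for a, b in zip(c1, c2))
               for c1, c2 in combinations(code, 2))

for code in [["000", "111"], ["00000", "01011", "10101", "11110"]]:
    d = dmin(code)
    print(code, "d =", d, "detects", d - 1, "corrects", (d - 1) // 2)
```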
A simple error detection method
Fill an array of L rows row-wise, append a parity bit to each row, and transmit column-wise.
[Diagram: data array with a row-parity column; the bits are sent column by column]
RESULT: any burst of length L can be detected, since it hits each row in at most one position and a single error per row always violates that row's parity.
What happens with bursts of length larger than L?
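A minimal sketch of this method, assuming even parity (the function names encode and detect are illustrative):

```python
def encode(bits, rows, cols):
    """Fill a rows x cols array row-wise, append an even-parity bit
    to each row, then read out (transmit) column-wise."""
    grid = [list(bits[r * cols:(r + 1) * cols]) for r in range(rows)]
    for row in grid:
        row.append(sum(row) % 2)
    # column-wise read-out: a burst of length <= rows hits each row at most once
    return [grid[r][c] for c in range(cols + 1) for r in range(rows)]

def detect(received, rows, cols):
    """De-interleave column-wise and check every row parity."""
    grid = [[received[c * rows + r] for c in range(cols + 1)]
            for r in range(rows)]
    return any(sum(row) % 2 for row in grid)   # True = error detected
```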
Modeling: binary transmission
[Diagram: a known test sequence is sent through the channel and the error sequence is recorded, e.g. 0 1 0 0 0 1 0 1 0 0 0 0 1 1 1 •••]
Problem: determination of burst and guard space (which stretches count as burst, which as guard?)
modeling
How do we model scratches on a CD?
The answer is important for the design of the ECC.
Density increases sensitivity
[Images: track density of CD, DVD and blue-laser discs; a higher recording density makes the medium more sensitive to the same physical scratch]
Modeling: networking
[Diagram: packet transmission with Ack/Nack feedback]
• a single error causes a retransmission
• long packets always have an error
• short packets with ECC give lower efficiency
Suppose that a packet arrives correctly with probability Q. What is then the throughput as a function of Q? (A sketch follows below.)
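A minimal sketch of one possible answer, assuming a stop-and-wait scheme with error-free, instantaneous Ack/Nack (assumptions not stated on the slide): the number of transmissions per packet is geometric with success probability Q.

```latex
\mathbb{E}[\text{transmissions per packet}]
  = \sum_{t=1}^{\infty} t\,(1-Q)^{t-1} Q = \frac{1}{Q},
\qquad
\text{throughput} = \frac{1}{\mathbb{E}[\text{transmissions}]} = Q
\ \text{(packets per packet slot)}.
```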
burst error model
Random error channel; outputs independent:
P(0) = 1 - P(1)
Burst error channel; outputs depend on a channel state (good or bad):
P(0 | state = bad) = P(1 | state = bad) = 1/2
P(0 | state = good) = 1 - P(1 | state = good) = 0.999
[State diagram: two states, good and bad, with transition probabilities Pgg, Pgb, Pbg, Pbb]
question
[Same two-state model with transition probabilities Pgg, Pgb, Pbg, Pbb]
P(0 | state = bad) = P(1 | state = bad) = 1/2
P(0 | state = good) = 1 - P(1 | state = good) = 0.99
What is the average P(0) for Pgg = 0.9, Pgb = 0.1; Pbg = 0.99, Pbb = 0.01?
Indicate how you could extend the model. (A worked sketch of the first part follows below.)
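A worked sketch (function name illustrative): the stationary state probabilities of the two-state chain are Pbg/(Pgb+Pbg) for good and Pgb/(Pgb+Pbg) for bad, and the average P(0) is the correspondingly weighted sum.

```python
def average_p0(pgb, pbg, p0_good, p0_bad):
    """Average P(0) of the two-state model: weight the per-state
    values by the stationary state probabilities."""
    pi_bad = pgb / (pgb + pbg)
    pi_good = pbg / (pgb + pbg)
    return pi_good * p0_good + pi_bad * p0_bad

# the slide's numbers: Pgb = 0.1, Pbg = 0.99
print(average_p0(0.1, 0.99, p0_good=0.99, p0_bad=0.5))   # ~0.945
```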
Interleaving: block
Channel models are difficult to derive:
• burst definition? (a burst starts and ends with a 1)
• how to separate random errors from burst errors?
For practical reasons: convert burst errors into random-like errors.
Read in row-wise, transmit column-wise.
[Diagram: bits written into an array row by row and read out column by column]
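A minimal sketch of a block interleaver and its inverse (function names illustrative):

```python
def block_interleave(bits, rows, cols):
    """Write a rows x cols block row-wise, read it out column-wise."""
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(bits, rows, cols):
    """Inverse operation: write column-wise, read out row-wise."""
    return [bits[c * rows + r] for r in range(rows) for c in range(cols)]

data = list(range(12))
assert block_deinterleave(block_interleave(data, 3, 4), 3, 4) == data
```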
Reception after a fading channel (example from Timo Korhonen, Helsinki)
[Plot: received power versus time under fading]
• In fading channels the received data can experience burst errors that destroy a large number of consecutive bits. This is harmful for channel coding.
• Interleaving distributes the burst errors along the data stream.
• A drawback of interleaving is the extra delay it introduces.
• The example below shows block interleaving:
Received interleaved data: 1 0 0 0 1 1 1 0 1 0 1 1 1 0 0 0 1 1 0 0 1
Block deinterleaving
Recovered data: 1 0 0 0 1 0 0 0 1 0 1 1 1 1 0 1 1 0 1 0 1
example
• Consider the code C = { 000, 111 }.
• A burst error of length 3 cannot be corrected directly: it can put 2 errors into one code word.
• With a 3×3 block interleaver the burst is spread over three code words, leaving 1 error in each, which the code can correct.
[Diagram: code words A, B, C written into a 3×3 interleaver; after de-interleaving, the burst appears as one error per code word]
De-interleaving: block
Read in column-wise, read out row-wise.
[Diagram: the received stream, containing a burst of erasures e e e e e e, is written into the array column by column; after row-wise read-out each row contains 1 error]
Interleaving: convolutional
input sequence 0: no delay
input sequence 1: delay of b elements
…
input sequence m-1: delay of (m-1)b elements
Example: b = 5, m = 3
[Diagram: m parallel branches between input and output, branch i holding a delay line of i·b elements; a sketch follows below]
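A minimal sketch with the slide's parameters b = 5, m = 3 (the class name ConvInterleaver is illustrative); branch i is a shift register of i·b cells, and the branches are used round-robin:

```python
from collections import deque

class ConvInterleaver:
    """m branches used round-robin; branch i delays each of its
    symbols by i*b uses of that branch (branch 0 has no delay)."""
    def __init__(self, b, m, fill=0):
        self.lines = [deque([fill] * (i * b)) for i in range(m)]
        self.i = 0

    def step(self, symbol):
        line = self.lines[self.i]
        self.i = (self.i + 1) % len(self.lines)
        if not line:                 # branch 0: pass straight through
            return symbol
        line.append(symbol)
        return line.popleft()

il = ConvInterleaver(b=5, m=3)
out = [il.step(x) for x in range(30)]
# the matching de-interleaver uses the delays in reverse order,
# (m-1-i)*b on branch i, so every symbol sees the same total delay
```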
Interleaving: destroys memory
[Diagram: message → encoder → interleaver → bursty channel → interleaver⁻¹ → decoder → message; the decoder sees "random errors"]
Note: interleaving brings encoding and decoding delay.
Homework: compare block and convolutional interleaving w.r.t. delay.
Middleton type of burst channel model
[Diagram: binary input 0/1; the output is produced by one of several binary channels]
Select channel k with probability Q(k); channel k has transition probability p(k).
Impulsive noise classification
(a) Single transient model
Parameters of a single transient:
• peak amplitude
• pseudo-frequency f0 = 1/T0
• damping factor
• duration
• interarrival time
Measurements carried out by France Telecom in a house during 40 h found 2 classes of pulses (on 1644 pulses): single transient and burst.
the Z-channel
Application in optical communications: 0 = light on, 1 = light off.
[Diagram: input 0 is always received as 0; input 1 is received as 1 with probability 1-p and as 0 with probability p]
P( x = 0 ) = 1 - P( x = 1 ) = P0
the erasure channel
Applications: CDMA detection, disk arrays (e.g. Disk 1 … Disk 5).
[Diagram: input 0 or 1 is received correctly with probability 1-e and erased to E with probability e]
The position of the error is known.
P( x = 0 ) = 1 - P( x = 1 ) = P0
From Gaussian to binary to erasure
[Diagram: input xi = +/-1, channel output yi = xi + e with Gaussian noise e; a hard decision maps yi to + or -, while outputs near zero may be declared erasures E]
A simple code
• For low packet loss rates (e.g. 5%), sending duplicates is expensive (wastes bandwidth).
• XOR code: XOR a group of data packets together to produce a repair packet.
• Transmit data + XOR: this can recover 1 lost packet.
Example: data packets 10101, 00111, 11100, 11000 give the repair packet 10110.
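A minimal sketch of the XOR repair idea with the slide's packets (the helper xor_packets is an illustrative name):

```python
def xor_packets(packets):
    """Bitwise XOR of equal-length packets (as bit lists)."""
    out = list(packets[0])
    for p in packets[1:]:
        out = [a ^ b for a, b in zip(out, p)]
    return out

data = [[1,0,1,0,1], [0,0,1,1,1], [1,1,1,0,0], [1,1,0,0,0]]
repair = xor_packets(data)                    # -> [1,0,1,1,0]
# if exactly one data packet is lost, XORing the survivors with
# the repair packet reproduces it:
survivors = [data[0], data[2], data[3]]
assert xor_packets(survivors + [repair]) == data[1]
```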
Channel with insertions and deletions
Bad synchronization or clock recovery at the receiver:
insertion: ••• 1 0 0 0 1 1 1 0 0 1 0 ••• → ••• 1 0 0 1 0 1 1 1 0 0 1 0 •••
deletion: ••• 1 0 0 0 1 1 1 0 0 1 0 ••• → ••• 1 0 0 1 1 1 0 0 1 0 •••
Problem: finding the start and end of messages.
Channel with insertions and deletions
Framing uses the flag = 1 1 1 1 1 0, so the pattern 1 1 1 1 1 must be avoided inside a frame (bit stuffing); bit errors can then cause insertions or deletions after destuffing:
insertion: data ••• 0 1 1 1 1 1 0 0 1 1 0 1 •••, stuffed ••• 0 1 1 1 1 0 1 0 0 1 1 0 1 •••; after an error, ••• 0 1 1 0 1 0 1 0 0 1 1 0 1 •••, the stuffed 0 is not removed.
deletion: data ••• 0 1 1 1 0 0 0 0 1 1 0 1 •••; after an error, ••• 0 1 1 1 1 0 0 0 1 1 0 1 •••, the receiver destuffs to ••• 0 1 1 1 1 0 0 1 1 0 1 •••.
Channels with interference
• Example (optical channel): the error probability depends on the symbols in neighboring slots.
Channels with memory (ex: recording)
• Example: yi = xi + xi-1, with xi ∈ { +1, -1 }, so yi ∈ { +2, 0, -2 }
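A minimal sketch of this channel with memory, assuming the initial state x0 = +1 (an assumption, since the slide does not fix it):

```python
def dicode(x, x_prev=+1):
    """Channel with memory: y_i = x_i + x_{i-1}, x_i in {+1, -1}."""
    y = []
    for xi in x:
        y.append(xi + x_prev)
        x_prev = xi
    return y

print(dicode([+1, -1, -1, +1, +1]))   # [2, 0, -2, 0, 2]
```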
tasks
• Construct a probability transformer from uniform to Gaussian (a sketch of one standard solution follows below).
• Give an overview of burst error models and the statistics of their important parameters.
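For the first task, one standard solution is the Box-Muller transform; a minimal sketch (not a method prescribed by the slides):

```python
import math
import random

def uniform_to_gaussian(rng=random.Random(0)):
    """Box-Muller: two independent U(0,1) samples give two
    independent N(0,1) samples."""
    u1, u2 = rng.random(), rng.random()
    r = math.sqrt(-2.0 * math.log(1.0 - u1))   # 1 - u1 avoids log(0)
    return (r * math.cos(2 * math.pi * u2),
            r * math.sin(2 * math.pi * u2))

samples = [uniform_to_gaussian()[0] for _ in range(10000)]
```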