ECED 4504 Digital Transmission Theory Jacek Ilow (based on the slides by Matthew Valenti) Performance of Block Codes
Error Probability: Hard Decision Decoding • Assumptions: • p is the probability that any single code symbol is in error. • p = Ps for the type of modulation used (e.g. BPSK); replace Eb/No with rEb/No. • Errors occur independently. • All combinations of t or fewer errors are correctable. • Most combinations of more than t errors are not correctable; "perfect" codes cannot correct any combination of more than t errors. • Then the code word error probability is:
P(E) ≤ Σ_{i=t+1}^{n} C(n,i) p^i (1−p)^(n−i)
with equality for "perfect" codes (Golay, Hamming).
Example: Performance of (7,3) Code • Compute Pc for the example code if p = 0.01 • Since dmin = 4, t = ⌊(dmin − 1)/2⌋ = 1 • Pc = Σ_{i=0}^{1} C(7,i) p^i (1−p)^(7−i) ≈ 0.998, so P(E) = 1 − Pc ≈ 2.0 × 10^-3
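The calculation above can be sketched in a few lines of Python. The function name is illustrative; it evaluates the hard-decision sum from the previous slide for the (7,3) example with p = 0.01 and t = 1.

```python
from math import comb

def codeword_error_prob(n, t, p):
    """P(E) = sum over i = t+1..n of C(n, i) p^i (1-p)^(n-i);
    an upper bound in general, exact for perfect codes."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(t + 1, n + 1))

# (7,3) example: dmin = 4 gives t = 1; channel symbol error prob p = 0.01
pe = codeword_error_prob(7, 1, 0.01)
pc = 1 - pe          # probability of correct decoding
print(f"P(E) = {pe:.3e}, Pc = {pc:.5f}")
```

Evaluating the sum directly avoids the numerical loss of computing 1 minus a number very close to 1 when p is small.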
Error Correction of Digitally Modulated Signals • Consider the following system ("hard decision" decoding):
k data bits → block encoder (rate r = k/n) → n code bits → BPSK modulator → s(t) → [AWGN channel: + n(t)] → r(t) → BPSK detector → estimates of code bits → block decoder → estimates of data bits
Error Correction with Digital Modulation • The bit error probability is found by using the appropriate equation for the modulation that is being used. • However, we want performance as a function of Eb/No. • Eb is energy per data bit (not code bit). • The energy per code symbol is Es = rEb • Therefore, we must replace Eb/No with rEb/No in all our error formulas for different modulation types.
Example: Performance of (7,3) Code with BPSK Modulation • For uncoded BPSK: Pb = Q(√(2Eb/No)) • For our coded system: p = Q(√(2rEb/No)) with r = 3/7 • Therefore the code word error probability is:
P(E) ≤ Σ_{i=2}^{7} C(7,i) p^i (1−p)^(7−i)
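A minimal sketch of the coded-BPSK calculation, using the standard Q(x) = 0.5·erfc(x/√2) identity. The 6 dB operating point is a hypothetical value chosen only to make the numbers concrete.

```python
from math import comb, erfc, sqrt

def Q(x):
    """Gaussian tail probability, Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / sqrt(2))

ebno_db = 6.0                        # hypothetical operating point
ebno = 10 ** (ebno_db / 10)

pb_uncoded = Q(sqrt(2 * ebno))       # uncoded BPSK bit error probability
p = Q(sqrt(2 * (3 / 7) * ebno))      # code-symbol error prob: Eb/No -> r*Eb/No
pe = sum(comb(7, i) * p**i * (1 - p)**(7 - i) for i in range(2, 8))
print(pb_uncoded, p, pe)
```

Note that p is much larger than the uncoded bit error probability, because the rate factor r = 3/7 reduces the energy per transmitted code symbol; the decoder has to win that energy back through error correction.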
A More Interesting Example • Find the error probability for a (63,45) BCH code, which has t = 3 (see table 8-1-6). • First, compute p = Q(√(2·(45/63)·Eb/No)). • Then compute the code word error probability:
P(E) ≤ Σ_{i=4}^{63} C(63,i) p^i (1−p)^(63−i)
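The same recipe applies to the (63,45) BCH example; only n, k, and t change. Again the Eb/No value is an assumed operating point, not from the original slides.

```python
from math import comb, erfc, sqrt

def Q(x):
    return 0.5 * erfc(x / sqrt(2))

n, k, t = 63, 45, 3                  # (63,45) BCH code, t = 3
ebno_db = 6.0                        # hypothetical operating point
p = Q(sqrt(2 * (k / n) * 10 ** (ebno_db / 10)))
pe = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(t + 1, n + 1))
print(f"p = {p:.3e}, P(E) <= {pe:.3e}")
```

With the higher rate r = 45/63, the energy penalty per code symbol is smaller than in the (7,3) example, and the longer block lets the code average over more channel errors.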
Performance of Golay Code: Hard-decision Decoding • (23,12), t = 3 Golay code • BPSK modulation • Hard decision decoding:
P(E) = Σ_{i=4}^{23} C(23,i) p^i (1−p)^(23−i)
• where p = Q(√(2·(12/23)·Eb/No)) • The expression holds with equality (not just as a bound) because the Golay code is perfect.
Codeword Error Probability and Bit Error Probability • If a code word is received correctly, then all data bits will be correctly decoded. • If a code word is received incorrectly, then between 1 and k data bits will be incorrect at the output of the decoder, so:
(1/k)·P(E) ≤ Pb ≤ P(E)
• To find the exact BER, we need to know how many bit errors there are at the output of the decoder whenever it makes an error: • Let β(i) be the average number of bit errors when i code bits are in error. • Then:
Pb = (1/k) Σ_{i=t+1}^{n} β(i) C(n,i) p^i (1−p)^(n−i)
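The sandwich bound (1/k)·P(E) ≤ Pb ≤ P(E) is easy to evaluate numerically. Here is a small sketch for the (23,12) Golay code at an assumed channel error probability p = 0.01 (an illustrative value, not from the slides):

```python
from math import comb

def codeword_error_prob(n, t, p):
    """Hard-decision codeword error probability (exact for perfect codes)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(t + 1, n + 1))

n, k, t = 23, 12, 3                  # (23,12) Golay code
pe = codeword_error_prob(n, t, p=0.01)
lower, upper = pe / k, pe            # (1/k) P(E) <= Pb <= P(E)
print(f"{lower:.2e} <= Pb <= {upper:.2e}")
```

The bounds are a factor of k apart, which is why the exact BER calculation via β(i) is worth the extra effort when the distance spectrum is known.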
BER of Golay code • To find the BER, we need to know β(i). • First find the distance spectrum; at high SNR, the most common error events correspond to the minimum-weight code words.
Coding Gain • The coding gain is the difference between the uncoded and coded Eb/No required to achieve a desired Pb. • Usually we use a reference of Pb = 10^-5. • The stronger the code, the higher the coding gain.
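The coding gain can be estimated numerically by bisecting each BER curve for the Eb/No that reaches the reference Pb = 10^-5. This sketch uses the pessimistic proxy Pb ≈ P(E) for the hard-decision Golay code, so the resulting gain slightly understates the true value; the function names are illustrative.

```python
from math import comb, erfc, sqrt

def Q(x):
    return 0.5 * erfc(x / sqrt(2))

def required_ebno_db(ber, target=1e-5, lo=0.0, hi=15.0):
    """Bisect for the Eb/No (dB) where a monotonically
    decreasing BER function crosses the target."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if ber(mid) > target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def uncoded_bpsk(db):
    return Q(sqrt(2 * 10 ** (db / 10)))

def golay_hard(db):
    # bound the BER by the codeword error probability P(E)
    p = Q(sqrt(2 * (12 / 23) * 10 ** (db / 10)))
    return sum(comb(23, i) * p**i * (1 - p)**(23 - i) for i in range(4, 24))

gain = required_ebno_db(uncoded_bpsk) - required_ebno_db(golay_hard)
print(f"coding gain = {gain:.2f} dB")
```

Uncoded BPSK needs roughly 9.6 dB at Pb = 10^-5; the gain computed this way lands in the same ballpark as the ~2 dB read off the performance curve.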
Performance Curve for (23,12) Code
[Figure: BER vs Eb/No (0-10 dB) for uncoded BPSK and the (23,12) Golay code. Annotations: the capacity curve is at Eb/No = (2^(2r)−1)/(2r) = 0.07 dB; the code achieves a 2.1 dB coding gain but is 7.4 dB away from capacity.]
Soft Decision Decoding • With hard-decision decoding, a hard decision is made on the bits before decoding takes place. • Input to the decoder is hard bit decisions {0,1}. • Whenever a hard decision is made, valuable information is lost. • We are interested not only in whether the receiver thinks the received code bit was a 0 or a 1, but also in how confident it was about that decision. • The decoder should rely more on strong signals, and less on weaker signals. • Any type of decoder that uses soft information about the confidence of the bit decision is called a soft-decision decoder.
Hard Decision Decoder for BPSK • Assume bits are equally likely. • r(t) → correlator (f1(t)) → r → threshold (sign) detector → block decoder • The threshold detector is where the hard decision is made: information is lost! It is essentially a 1-bit (2-level) quantizer.
Softer Decision Decoder for BPSK • Replace the 1-bit quantizer with a p-bit quantizer: • r(t) → correlator (f1(t)) → r → p-bit quantizer → rQ → block decoder • Note that this requires a more complicated decoder: it must be able to work with more finely quantized samples from the output of the correlator. • The benefit is that the decoder can place more confidence on strong signals and less confidence on weak signals. • Large (~2 dB) performance gain even for p = 3 bits.
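A p-bit quantizer of the kind described above can be sketched as a uniform quantizer over the correlator's output range; the range [−vmax, vmax] and the midrise level placement are illustrative assumptions.

```python
def quantize(r, p_bits, vmax=1.0):
    """Uniform midrise quantizer with 2**p_bits levels over [-vmax, vmax].
    p_bits = 1 reduces to a hard decision on the sign."""
    levels = 2 ** p_bits
    step = 2 * vmax / levels
    # clip to the quantizer range, then map to the nearest level midpoint
    r = max(-vmax, min(vmax - 1e-12, r))
    idx = int((r + vmax) / step)
    return -vmax + (idx + 0.5) * step

print(quantize(0.3, 1))   # 1 bit: only the sign survives -> 0.5
print(quantize(0.3, 3))   # 3 bits: a weak positive stays weak -> 0.375
```

With p = 1 both a marginal +0.3 and a confident +0.95 collapse to the same output; with p = 3 the decoder can tell them apart, which is exactly the confidence information soft decoding exploits.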
Soft Decision Decoder for BPSK • To achieve a fully soft decoder, simply pass the output of the correlator to the decoder: • r(t) → correlator (f1(t)) → r → block decoder • Equivalent to letting p → ∞. • The vector r contains more information than its quantized or hard-decision version, by the "data processing theorem." • Therefore we can obtain better performance with soft-decision decoding.
Comments on Soft Decision Decoding • Hard decision decoding chooses the code word with the smallest Hamming distance from the hard-decided received word. • Soft decision decoding chooses the code word with the smallest Euclidean distance from the received signal vector. • For block codes, soft-decision decoders are usually much more complex than hard-decision decoders. • Exception: errors-and-erasures decoding of RS codes. • However, soft-decision decoding is easy for convolutional codes.
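The Hamming-vs-Euclidean distinction can be made concrete with an exhaustive-search decoder over a toy codebook. A (3,1) repetition code is used here purely as an illustration; it is not a code from these slides.

```python
# Toy codebook: the (3,1) repetition code (illustrative only)
CODEBOOK = [(0, 0, 0), (1, 1, 1)]

def bpsk(bits):
    """Map 0 -> -1, 1 -> +1."""
    return [1.0 if b else -1.0 for b in bits]

def hard_decode(r):
    """Minimum Hamming distance to the hard bit decisions."""
    hard = [1 if x > 0 else 0 for x in r]
    return min(CODEBOOK, key=lambda c: sum(a != b for a, b in zip(c, hard)))

def soft_decode(r):
    """Minimum Euclidean distance to the modulated code words."""
    return min(CODEBOOK,
               key=lambda c: sum((x - s) ** 2 for x, s in zip(r, bpsk(c))))

# Two weak positives vs one confident negative:
r = [0.1, 0.2, -2.0]
print(hard_decode(r))   # majority vote of signs -> (1, 1, 1)
print(soft_decode(r))   # the confident -2.0 dominates -> (0, 0, 0)
```

The hard decoder only sees the signs and is outvoted 2-to-1; the soft decoder weighs the confident −2.0 sample more heavily and reverses the decision, which is exactly why soft decisions buy performance.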
Performance of Soft Decision Decoding • Calculate the pairwise error probability between all pairs of code words i ≠ j:
P(ci → cj) = Q(√(d_E²(i,j)/(2No)))
• where d_E(i,j) is the Euclidean distance between the modulated code words ci and cj. • Euclidean distance is related to Hamming distance; the relationship depends on the type of modulation. • For BPSK: d_E² = 4·Es·d_H = 4·r·Eb·d_H, so two code words at Hamming distance d have pairwise error probability Q(√(2·d·r·Eb/No)).
Performance of Soft Decision Decoding • Apply the union bound to compute the overall code word error probability: • Assume the 2^k code words are equally likely. • Assume a linear code: the conditional probability of error is the same for all possible transmitted code words (the "uniform error property"), so we can just assume that the all-zeros code word was sent.
P(E) ≤ Σ_{d=dmin}^{n} a_d Q(√(2·d·r·Eb/No))
• a_d is the number of code words of weight w = d. • For high SNR, performance is dominated by the code words of weight w = dmin (the "free distance asymptote"):
P(E) ≈ a_dmin Q(√(2·dmin·r·Eb/No))
Bit Error Rate • Now use the total information weight B_d (the total number of nonzero information bits, summed over all code words of weight d):
Pb ≤ (1/k) Σ_{d=dmin}^{n} B_d Q(√(2·d·r·Eb/No))
Weight Distribution • The weight distribution or distance spectrum is the number of code words a_d of each possible weight d. • Example: (23,12) Golay code (table 8-1-1):
d: 0, 7, 8, 11, 12, 15, 16, 23
a_d: 1, 253, 506, 1288, 1288, 506, 253, 1
(All 4096 code words fall at these eight weights.)
Performance of Golay Code: Soft Decision Decoding • Soft decision decoding, BPSK modulation:
P(E) ≤ Σ_d a_d Q(√(2·d·(12/23)·Eb/No)), summed over the weights d = 7, 8, 11, 12, 15, 16, 23 of the distance spectrum.
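The soft-decision union bound is straightforward to evaluate with the Golay code's weight distribution from the previous slides; the function name is illustrative.

```python
from math import erfc, sqrt

def Q(x):
    return 0.5 * erfc(x / sqrt(2))

# Weight distribution a_d of the (23,12) Golay code
GOLAY_SPECTRUM = {7: 253, 8: 506, 11: 1288, 12: 1288, 15: 506, 16: 253, 23: 1}

def golay_soft_union_bound(ebno_db):
    """P(E) <= sum over d of a_d * Q(sqrt(2 d r Eb/No)), r = 12/23."""
    ebno = 10 ** (ebno_db / 10)
    r = 12 / 23
    return sum(a * Q(sqrt(2 * d * r * ebno))
               for d, a in GOLAY_SPECTRUM.items())

for db in (2, 4, 6, 8):
    print(db, golay_soft_union_bound(db))
```

At moderate-to-high Eb/No the d = 7 term dominates the sum, which is the minimum distance asymptote plotted on the comparison curve.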
Soft-decision vs. Hard-decision Decoding of Golay Code
[Figure: BER vs Eb/No (0-10 dB), showing the uncoded BER, hard decision decoding, the soft decision decoding union bound, and the minimum distance asymptote. Soft-decision decoding gains about 2 dB over hard-decision decoding at high Eb/No.]