Soft Decision Decoding Algorithms of Reed-Solomon Codes Jing Jiang and Krishna R. Narayanan Department of Electrical Engineering Texas A&M University
Historical Review of Reed Solomon Codes Date of birth: 40 years ago (Reed and Solomon 1960) Related to non-binary BCH codes (Gorenstein and Zierler 1961) Efficient decoder: not until 6 years later (Berlekamp 1967) Linear feedback shift register (LFSR) interpretation (Massey 1969) • Other algebraic hard decision decoder: • Euclid’s Algorithm (Sugiyama et al. 1975) • Frequency-domain decoding (Gore 1973 and Blahut 1979)
Wide Range of Applications of Reed Solomon Codes • NASA Deep Space: CC + RS(255, 223, 32) • Multimedia Storage: • CD: RS(32, 28, 4), RS(28, 24, 4) with interleaving • DVD: RS(208, 192, 16), RS(182, 172, 10) product code • Digital Video Broadcasting: DVB-T CC + RS(204, 188) • Magnetic Recording: RS(255, 239) etc. (nested RS code)
Basic Properties of Reed Solomon Codes (cont'd)
• Properties of RS codes:
• Symbol-level cyclic (nonbinary BCH codes)
• Maximum distance separable (symbol level): d_min = n − k + 1
• Properties of the BM algorithm:
• Decoding region: corrects up to t = ⌊(n − k)/2⌋ symbol errors
• Decoding complexity: usually quadratic in the block length, O(n²)
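For concreteness, a minimal arithmetic sketch (in Python) relating the MDS property and the BM decoding radius to the code parameters listed on the previous slide; the code names are just labels.

```python
# Minimal sketch: minimum distance and BM error-correction radius for (n, k) RS codes.
codes = {
    "NASA RS(255, 223)": (255, 223),
    "CD RS(32, 28)": (32, 28),
    "DVB-T RS(204, 188)": (204, 188),
    "Magnetic recording RS(255, 239)": (255, 239),
}

for name, (n, k) in codes.items():
    d_min = n - k + 1          # MDS: d_min = n - k + 1
    t = (n - k) // 2           # BM corrects up to floor((n - k)/2) symbol errors
    print(f"{name}: d_min = {d_min}, corrects up to {t} symbol errors")
```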
[Block diagram: RS coded turbo equalization system. Source → RS encoder → interleaver → PR encoder → AWGN channel; at the receiver, the channel equalizer and the RS decoder exchange extrinsic / a priori information through the (de)interleaver, and a hard decision is delivered to the sink]
Motivation for RS Soft Decision Decoder
Hard decision decoding does not fully exploit the error-correction capability of the code
Efficient soft decision decoding of RS codes remains an open problem
A soft input soft output (SISO) algorithm is favorable
Presentation Outline Symbol-level algebraic soft decision decoding Binary expansion of RS codes and soft decoding algorithms Iterative decoding for RS codes Simulation results Applications and future work
Reliability Assisted Hard Decision Decoding
• Generalized Minimum Distance (GMD) Decoding (Forney 1966):
• New distance measure: generalized minimum distance
• Successively erase the least reliable symbols and run the hard decision decoder
• GMD is shown to be asymptotically optimal
• Chase Type-II decoding (Chase 1972):
• Exhaustively flip the least reliable symbols and run the hard decision decoder
• The Chase algorithm is also shown to be asymptotically optimal
• Related works:
• Fast GMD (Koetter 1996)
• Efficient Chase (Kamiya 2001)
• Combined Chase and GMD for RS codes (Tang et al. 2001)
• Performance analysis of these algorithms for RS codes appears to remain open
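A schematic sketch of the GMD loop, assuming a hypothetical errors-and-erasures decoder `ee_decode(r, erasures)` and per-symbol reliabilities `rel`; it is meant only to illustrate the successive-erasure idea, not any particular decoder interface.

```python
# Schematic GMD decoding loop (Forney 1966), illustrative only.
# `ee_decode(r, erasures)` is a hypothetical errors-and-erasures decoder that
# returns a candidate codeword or None; `rel[i]` is the reliability of symbol i.

def gmd_decode(r, rel, n, k, ee_decode):
    d_min = n - k + 1
    order = sorted(range(n), key=lambda i: rel[i])   # least reliable symbols first
    candidates = []
    # Erase 0, 2, 4, ... of the least reliable symbols; erasures are added in
    # pairs so that the errors-and-erasures condition 2e + s < d_min can be met.
    for num_erasures in range(0, d_min, 2):
        c = ee_decode(r, order[:num_erasures])
        if c is not None:
            candidates.append(c)
    # Return the candidate closest to the received word (plain Hamming distance
    # here for simplicity; GMD uses the generalized distance measure).
    def distance(c):
        return sum(ci != ri for ci, ri in zip(c, r))
    return min(candidates, key=distance) if candidates else None
```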
Algebraic Beyond Half-d_min List Decoding
Bounded distance + 1 decoding (Berlekamp 1996)
Beyond half-distance decoding for low rate RS codes (Sudan 1997)
Decoding up to roughly n − √(n(K − 1)) errors (Guruswami and Sudan 1999)
A good tutorial paper (JPL Report, McEliece 2003)
Outline of Algebraic Beyond Half Distance Decoding
Basic idea: find f(x) of degree less than K that fits as many of the points (x_i, y_i) as possible
• Interpolation step:
• Construct a bivariate polynomial Q(x, y) of minimum (1, K − 1)-weighted degree that has a zero of order m at each point (x_i, y_i)
• Factorization step:
• Generate a list of y-roots, i.e., L = { f(x) : (y − f(x)) divides Q(x, y), deg f(x) < K }
• Pick the most likely codeword from the list L
• Complexity:
• Interpolation (Koetter's fast algorithm): dominates the overall cost, growing roughly as O(n²m⁴) with the multiplicity m
• Factorization (Roth and Ruckenstein's algorithm): comparatively cheap
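A small arithmetic check, assuming the commonly quoted asymptotic (large-multiplicity) Guruswami-Sudan radius of about n − √(n(K − 1)), showing why the gain over BM shrinks at high rate:

```python
import math

# Compare the classical BM radius with the (large-multiplicity) Guruswami-Sudan
# list-decoding radius, t_GS just below n - sqrt(n*(K-1)), for a few RS codes.
for n, K in [(255, 239), (255, 223), (63, 31)]:
    t_bm = (n - K) // 2
    t_gs = math.ceil(n - math.sqrt(n * (K - 1))) - 1   # largest t strictly below the bound
    print(f"RS({n},{K}): BM radius {t_bm}, GS radius {t_gs}")
# High-rate codes such as RS(255, 239) gain little or nothing; low-rate codes gain more.
```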
Algebraic Soft Interpolation Based List Decoding • Koetter and Vardy algorithm (Koetter & Vardy 2003) • Based on Guruswami and Sudan's algebraic list decoding • Uses the reliability information to assign multiplicities • KV is asymptotically optimal in multiplicity assignment for long RS codes • Reduced complexity KV (Gross et al. submitted 2003) • Re-encoding technique: greatly reduces the cost for high rate codes • VLSI architecture (Ahmed et al. submitted 2003)
Soft Interpolation Based Decoding
Basic idea: interpolate more points, weighted according to the soft information
Definitions:
• Reliability matrix Π: π_{i,j} = P(c_j = α_i | channel output)
• Multiplicity matrix M: nonnegative integer multiplicities m_{i,j} assigned to the points (α_i, j)
• Score: S_M(c) = Σ_j m_{i(j),j}, the total multiplicity placed on the transmitted codeword
• Cost: C(M) = (1/2) Σ_{i,j} m_{i,j}(m_{i,j} + 1)
The interpolation and factorization steps are the same as in the GS algorithm
Sufficient condition for successful decoding (approximately): S_M(c) > √(2(K − 1) C(M))
The complexity increases with m_max, the maximum multiplicity
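A toy sketch of the score/cost bookkeeping using the definitions above; the √(2(K − 1)·cost) threshold is the commonly stated approximate form of the sufficient condition, and the array layout (rows = symbol values, columns = positions) is an assumption for illustration.

```python
import numpy as np

# Toy score/cost computation for soft interpolation (KV-style) decoding.
# M[i, j] = multiplicity assigned to symbol value alpha_i at position j.
# codeword[j] = index of the transmitted symbol value at position j.

def score(M, codeword):
    return sum(M[codeword[j], j] for j in range(M.shape[1]))

def cost(M):
    return int(np.sum(M * (M + 1)) // 2)

def likely_decodable(M, codeword, K):
    # Approximate sufficient condition: score > sqrt(2 * (K - 1) * cost)
    return score(M, codeword) > np.sqrt(2 * (K - 1) * cost(M))
```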
Recent Works and Remarks
• Recent works on performance analysis and multiplicity assignment:
• Gaussian approximation (Parvaresh and Vardy 2003)
• Exponential bound (Ratnakar and Koetter 2004)
• Chernoff bound (El-Khamy and McEliece 2004)
• Performance analysis over BEC and BSC (Jiang and Narayanan 2005)
The ultimate gain of algebraic soft decoding (ASD) over the AWGN channel is about 1 dB
Complexity is scalable but prohibitively large for high multiplicities
The failure patterns of the ASD algorithm and the optimal multiplicity assignment scheme remain of interest
Performance Analysis of ASD over Discrete Alphabet Channels • Performance Analysis over BEC and BSC (Jiang and Narayanan, accepted by ISIT2005) • The analysis gives some intuition about the decoding radius of ASD • We investigate the bit-level decoding radius for high rate codes • For BEC, bit-level radius is twice as large as that of the BM algorithm • For BSC, bit-level radius is slightly larger than that of the BM algorithm • In conclusion, ASD is limited by its algebraic engine
Binary Image Expansion of RS Codes and Soft Decision Decoding
Bit-level Weight Enumerator “The major drawback with RS codes (for satellite use) is that the present generation of decoders do not make full use of bit-based soft decision information” (Berlekamp) How does the binary expansion of RS codes perform under ML decoding? Performance analysis using its weight enumerator Averaged ensemble weight enumerator of RS codes (Retter 1991) It gives some idea about how RS codes perform under ML decoding
Remarks RS codes themselves are good codes However, ML decoding is NP-hard (Guruswami and Vardy 2004) Are there sub-optimal decoding algorithms using the binary expansions?
Trellis based Decoding using BCH Subcode Expansion • Maximum-likelihood decoding and variations: • Partition RS codes into BCH subcodes and glue vectors (Vardy and Be’ery 1991) • Reduced complexity version (Ponnampalam and Vucetic 2002) • Soft input soft output version (Ponnampalam and Grant 2003)
Subfield Subcode Decomposition
[Figure: decomposition of an RS code into BCH subcodes plus glue vectors]
• Remarks:
• Decomposition greatly reduces the trellis size for short codes
• Impractical for long codes, since the size of the glue vectors is very large
• Related work:
• Construct sparse representation for iterative decoding (Milenkovic and Vasic 2004)
• Subspace subcode of Reed Solomon codes (Hattori et al. 1998)
Reliability based Ordered Statistic Decoding • Reliability based decoding: • Ordered Statistic Decoding (OSD) (Fossorier and Lin 1995) • Box and Match Algorithm (BMA) (Valembois and Fossorier 2004) • Ordered Statistic Decoding using preprocessing (Wu et al. 2004) • Basic ideas: • Order the received bits according to their reliabilities • Perform hard decision reprocessing based on the most reliable basis (MRB) • Remarks: • The reliability based scheme is efficient for short to medium length codes • The complexity increases exponentially with the reprocessing order • The BMA algorithm trades memory for time complexity
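A compact order-i OSD sketch over GF(2), assuming a binary generator matrix G and bit LLRs; the most reliable basis is built by Gaussian elimination in reliability order. This is an illustrative reading of the technique under those assumptions, not the authors' implementation.

```python
import numpy as np
from itertools import combinations

# Order-i OSD sketch (cf. Fossorier & Lin 1995). G is a k x n binary generator
# matrix, llr the received bit LLRs, order the reprocessing order.

def osd_decode(G, llr, order=1):
    k, n = G.shape
    hard = (llr < 0).astype(int)               # hard decisions
    rel_order = np.argsort(-np.abs(llr))       # most reliable bits first

    # Build the most reliable basis: pick k independent columns in reliability order.
    Gp = G[:, rel_order] % 2
    basis, row = [], 0
    for col in range(n):
        if row == k:
            break
        pivots = np.nonzero(Gp[row:, col])[0]
        if pivots.size == 0:
            continue                           # dependent column, skip it
        p = pivots[0] + row
        Gp[[row, p]] = Gp[[p, row]]            # bring a pivot into place
        for r in range(k):
            if r != row and Gp[r, col]:
                Gp[r] ^= Gp[row]               # clear the column elsewhere
        basis.append(col)
        row += 1

    mrb = rel_order[basis]                     # original indices of the most reliable basis
    best, best_metric = None, np.inf
    for flips in range(order + 1):             # order-0, order-1, ... reprocessing
        for subset in combinations(range(k), flips):
            info = hard[mrb].copy()
            info[list(subset)] ^= 1            # flip a few MRB bits
            cand = (info @ Gp) % 2             # re-encode in the permuted domain
            codeword = np.empty(n, dtype=int)
            codeword[rel_order] = cand         # map back to the original bit order
            metric = np.sum(np.abs(llr)[codeword != hard])  # soft discrepancy
            if metric < best_metric:
                best, best_metric = codeword, metric
    return best
```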
A Quick Question How does the panacea of modern communications, the iterative decoding algorithm, work for RS codes? Note that all the codes in the literature for which such soft decoding algorithms work well are sparse graph codes with small constraint length.
[Figure: Tanner graph with bit nodes on top and check nodes below; some of the bit nodes are erased]
How does the standard message passing algorithm work?
If two or more of the incoming messages are erasures, the check is erased
For the AWGN channel, two or more unreliable messages invalidate the check
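A one-function illustration of the check-node rule above on the BEC: an outgoing message is useful only when all other incoming messages are known, which becomes very unlikely once a check touches many bits. The weights used are only examples (a sparse LDPC-like check versus a dense RS-like check).

```python
# Check-node rule on the BEC: the message to one bit is an erasure unless all
# (w - 1) other bits entering the weight-w check are already known.
def prob_check_resolves(w, p):
    # probability that all other bits entering the check are unerased
    return (1 - p) ** (w - 1)

for w, p in [(4, 0.05), (240, 0.05)]:   # sparse LDPC-like check vs. dense RS-like check
    print(f"weight {w}, erasure prob {p}: check useful with prob {prob_check_resolves(w, p):.3g}")
```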
A Few Unreliable Bits “Saturate” the Non-sparse Parity Check Matrix
Consider RS(7, 5) over GF(2³):
[Figure: binary image expansion of the parity check matrix of RS(7, 5) over GF(2³)]
Iterative decoding gets stuck because only a few unreliable bits “saturate” the whole non-sparse parity check matrix
Sparse Parity Check Matrices for RS Codes Can we find an equivalent binary parity check matrix that is sparse? For RS codes, this is not possible! The H matrix is the G matrix of the dual code The dual of an RS code is also an MDS Code Each row has weight at least (K+1) Typically, the row weight is much higher
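A short numeric illustration of this density argument for a few RS codes; the d⊥ = K + 1 lower bound on the row weight follows from the dual code being MDS, as stated above.

```python
# Density of any parity-check representation of a high-rate (N, K) RS code:
# the dual is MDS with minimum distance K + 1, so every parity check involves
# at least K + 1 of the N symbols.
for N, K in [(255, 239), (255, 223), (63, 55)]:
    min_row_weight = K + 1
    print(f"RS({N},{K}): each check touches >= {min_row_weight} of {N} symbols "
          f"({min_row_weight / N:.0%})")
```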
Iterative Decoding for RS Codes • Iterative decoding for general linear block codes: • Iterative decoding for general linear block codes (Hagenauer et al. 1996) • APP decoding using minimum weight parity checks (Lucas et al. 1998) • Generalized belief propagation (Yedidia et al. 2000) • Recent progress on RS codes: • Sub-trellis based iterative decoding (Ungerboeck 2003) • Stochastic shifting based iterative decoding (Jiang and Narayanan, 2004) • Sparse representation of RS codes using GFFT (Yedidia, 2004)
Recent Iterative Techniques
Sub-trellis based iterative decoding (Ungerboeck 2003)
Self concatenation using sub-trellises constructed from the parity check matrix:
[Figure: binary image expansion of the parity check matrix of RS(7, 5) over GF(2³)]
• Remarks:
• Performance deteriorates due to the large number of short cycles
• Works only for short codes with small minimum distance
Recent Iterative Techniques (cont'd)
Stochastic shifting based iterative decoding (Jiang and Narayanan 2004)
Due to the irregularity in the H matrix, iterative decoding favors some bits
Take advantage of the symbol-level cyclic structure of RS codes [Figure: cyclic shift by 2]
Stochastic shifts prevent the iterative procedure from getting stuck
Best result: RS(63, 55), about 0.5 dB gain over HDD
However, for long codes, the performance deteriorates
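A minimal sketch of the shifting idea, assuming the bit LLRs are grouped into n symbols of m bits each and that the decoder undoes the shift after decoding; the function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

# Sketch of stochastic shifting: before an outer iteration, cyclically shift the
# received sequence by a random number of symbols. Because the RS code is cyclic
# at the symbol level, decoding the shifted word and shifting back is equivalent,
# but the shift changes which bits the fixed, irregular binary H matrix favors.
def random_symbol_shift(llr_bits, n_symbols, m):
    s = np.random.randint(n_symbols)
    shifted = np.roll(llr_bits.reshape(n_symbols, m), s, axis=0).reshape(-1)
    return shifted, s      # the decoder output must later be rolled back by -s symbols
```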
Iterative Decoding Based on Adaptive Parity Check Matrix
Idea: use Gaussian elimination to reduce the sub-matrix corresponding to the unreliable bits to a sparse (identity-like) form
For example, consider the (7, 4) Hamming code:
[Figure: transmitted codeword, received vector, and adapted parity check matrix]
We can make the columns of the (n − k) least reliable positions sparse!
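A sketch of the matrix-adaptation step, assuming a binary H given as a numpy array and bit LLRs; GF(2) Gaussian elimination is run with pivot columns chosen in increasing reliability order. It follows the idea described above rather than any particular published pseudocode.

```python
import numpy as np

# Adapt a binary parity-check matrix so that the columns of the (n - k) least
# reliable bits form an identity-like (sparse) submatrix, via GF(2) Gaussian
# elimination. Sketch of the matrix-adaptation step only.

def adapt_parity_check(H, llr):
    H = H % 2                            # work on a fresh binary copy
    m, n = H.shape                       # m = n - k parity checks
    order = np.argsort(np.abs(llr))      # least reliable bits first
    row = 0
    for col in order:
        if row == m:
            break
        pivots = np.nonzero(H[row:, col])[0]
        if pivots.size == 0:
            continue                     # dependent column: move on to the next unreliable bit
        p = pivots[0] + row
        H[[row, p]] = H[[p, row]]        # bring a pivot into position
        for r in range(m):
            if r != row and H[r, col]:
                H[r] ^= H[row]           # clear the rest of the column
        row += 1
    return H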
Adaptive Decoding Procedure
[Figure: Tanner graph in which the unreliable bit nodes are separated after adaptation]
After the adaptive update, iterative decoding can proceed
Gradient Descent and Adaptive Potential Function
• Geometric interpretation (suggested by Ralf Koetter)
• Define the tanh domain transform as: T_j = tanh(L_j / 2), where L_j is the LLR of bit j
• The syndrome of a parity check can then be expressed as the sign of the product of the T_j participating in that check
• Define the soft syndrome as: s_i = ∏_{j ∈ N(i)} T_j
• Define the cost function as: J = Σ_i (1 − s_i), which penalizes checks whose soft syndrome is far from +1
The decoding problem is relaxed to minimizing J using gradient descent, with the initial value T observed from the channel
J is also a function of H; H is adapted so that the unreliable bits are separated, in order to avoid getting stuck at zero gradient points
Two Stage Optimization Procedure
The proposed algorithm is a generalization of the iterative decoding scheme proposed by Lucas et al. (1998) and runs as a two-stage optimization procedure: the parity check matrix is first adapted to the current bit reliabilities, and the reliabilities are then refined by a damped update of the form L ← L_channel + α · (extrinsic correction)
The damping coefficient α serves to control the convergence dynamics
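A schematic two-stage loop combining the matrix adaptation sketched earlier with a damped Lucas-style reliability update; the exact update rule and stopping test in the paper may differ. `adapt` stands for a matrix-adaptation routine such as the `adapt_parity_check` sketch above, and `alpha` is the damping coefficient.

```python
import numpy as np

# Schematic two-stage loop: (1) adapt H to the current bit reliabilities, then
# (2) apply a damped update L = L_channel + alpha * extrinsic in the tanh domain.

def adaptive_decode(H0, L_channel, adapt, alpha=0.1, iterations=20):
    L_channel = np.asarray(L_channel, dtype=float)
    L = L_channel.copy()
    for _ in range(iterations):
        H = adapt(H0, L)                              # stage 1: adapt the parity check matrix
        T = np.tanh(L / 2.0)
        extrinsic = np.zeros_like(L)
        for i in range(H.shape[0]):                   # stage 2: damped reliability update
            idx = np.nonzero(H[i])[0]
            for j in idx:
                prod = np.prod(T[idx[idx != j]])      # product over the other bits in check i
                extrinsic[j] += 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
        L = L_channel + alpha * extrinsic             # damping controls the convergence dynamics
        hard = (L < 0).astype(int)
        if not np.any((H0 @ hard) % 2):               # stop once all parity checks are satisfied
            return hard, L
    return (L < 0).astype(int), L
```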
Avoid Zero Gradient Points
The adaptive scheme changes the gradient and prevents the decoder from getting stuck at zero gradient points
[Figure: cost surface with a zero gradient point marked]
Variations of the Generic Algorithm
Connect unreliable bits as degree-2 nodes
Combine this algorithm with a hard decision decoder
Adapt the parity check matrix at the symbol level
Exchange bits between the reliable and unreliable parts and run the decoder multiple times
Reduced complexity partial updating scheme
AWGN Channels (cont'd)
[Simulation results plot]
The asymptotic performance is consistent with the ML upper bound.
Embed the Proposed Algorithm in the Turbo Equalization System
[Block diagram: RS coded turbo equalization system. Source → RS encoder → interleaver → PR encoder → AWGN channel; at the receiver, the BCJR equalizer and the RS decoder exchange extrinsic / a priori information through the (de)interleaver, and a hard decision is delivered to the sink]