Iterative Joint Source-Channel Soft-Decision Sequential Decoding Algorithms for Parallel Concatenated Variable Length Code and Convolutional Code Reporter:林煜星 Advisor:Prof. Y.M. Huang
Outline • Introduction • Related Research • Transmission Model for BCJR • Simulation for BCJR Algorithm • Proposed Methodology • Transmission Model for Sequential • Simulation for Soft-Decision Sequential Algorithm • Conclusion
Introduction
Transmission chain: Discrete source → Source Encoder (data compression) → Channel Encoder (error-correcting code) → Modulator → Channel → Demodulator → Channel Decoder → Source Decoder → User. The Joint Decoder combines the channel- and source-decoding steps.
Related Research • [1] L. Guivarch, J.C. Carlach and P. Siohan • [2] M. Jeanne, J.C. Carlach, P. Siohan and L. Guivarch • [3] M. Jeanne, J.C. Carlach, P. Siohan
Transmission Model for BCJR
Independent Source or first-order Markov Source → Huffman Coding (P symbols → K bits) → Turbo Coding (parallel concatenation) → Additive White Gaussian Noise Channel → Turbo Decoding using the SUBMAP (with a priori information) → Huffman Decoding (K bits → P symbols)
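The first block of the chain maps P source symbols to K bits with a Huffman code. A minimal sketch of Huffman code construction follows; the four-symbol alphabet and its probabilities are placeholders for illustration, not the thesis's actual source statistics:

```python
import heapq

def huffman_code(freqs):
    """Build a Huffman code table {symbol: bitstring} from symbol frequencies."""
    # Heap entries are (weight, tiebreak, tree); the unique tiebreak keeps
    # heapq from ever comparing two trees directly.
    heap = [(w, i, s) for i, (s, w) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)   # two least-probable subtrees
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (t1, t2)))
        count += 1
    code = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):       # internal node: 0 left, 1 right
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            code[tree] = prefix or "0"    # lone-symbol edge case
    walk(heap[0][2], "")
    return code

# Hypothetical source alphabet (not the thesis's actual statistics):
code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
bits = "".join(code[s] for s in "abacad")   # P symbols -> K bits
```

Because the code is a prefix code, the bit stream can later be parsed back into symbols without separators, which is what the Huffman Decoding block at the receiver does.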
Transmission Model for BCJR-Independent Source or first order Markov Source
Transmission Model for BCJR-Independent Source or first order Markov Source(1)
Transmission Model for BCJR-Independent Source or first order Markov Source(2)
Example: Transmission Model for BCJR-Independent Source or first order Markov Source(3)
Transmission Model for BCJR-Huffman Coding
Transmission Model for BCJR-Turbo Coding parallel concatenation
Transmission Model for BCJR-Turbo Coding parallel concatenation(1)
Non-systematic convolutional (NSC) code: input u = d = (11101), output codeword v.
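The NSC encoder is a shift register tapped by two generator polynomials, emitting two code bits per input bit. A minimal sketch, assuming the common (7,5) octal generator pair since the slides do not state the actual polynomials:

```python
def nsc_encode(bits, gens=((1, 1, 1), (1, 0, 1))):
    """Rate-1/2 non-systematic convolutional encoder.

    `gens` are the generator taps, here the common (7,5) octal pair --
    an assumption, the slides do not give the actual polynomials.
    The shift register starts in the all-zero state."""
    K = len(gens[0])            # constraint length
    reg = [0] * K
    out = []
    for b in bits:
        reg = [b] + reg[:-1]    # shift the new input bit in
        for g in gens:          # one output bit per generator
            out.append(sum(t * x for t, x in zip(g, reg)) % 2)
    return out

d = [1, 1, 1, 0, 1]             # the example input from the slide
v = nsc_encode(d)
```

Each input bit produces two output bits, so the codeword is twice the length of the input, matching the rate-1/2 label.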
Transmission Model for BCJR-Turbo Coding parallel concatenation(2)
[Trellis diagram for input d = (11101): nodes labeled (stage, state) from (0,0) through (5,3), with branch output labels 00, 01, 10, 11 along the transitions.]
Transmission Model for BCJR-Turbo Coding parallel concatenation(3)
Recursive Systematic Convolutional (RSC) code, rate = 1/2
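An RSC encoder feeds a function of the register state back into the input and transmits the information bit unchanged alongside one parity bit. A sketch, again assuming the usual (7,5) feedback/forward tap pair, which the slides do not specify:

```python
def rsc_encode(bits, fb=(1, 1, 1), fwd=(1, 0, 1)):
    """Rate-1/2 recursive systematic convolutional (RSC) encoder.

    `fb` are the feedback taps, `fwd` the forward taps (the usual (7,5)
    pair -- an assumption, the slides do not specify the polynomials)."""
    mem = [0, 0]                      # two memory cells
    out = []
    for b in bits:
        # feedback bit: input XORed with the tapped memory contents
        a = (b + fb[1] * mem[0] + fb[2] * mem[1]) % 2
        # parity bit from the forward taps
        c = (fwd[0] * a + fwd[1] * mem[0] + fwd[2] * mem[1]) % 2
        out.append(b)                 # systematic bit: the input itself
        out.append(c)                 # parity bit
        mem = [a, mem[0]]
    return out

d = [1, 1, 1, 0, 1]
out = rsc_encode(d)
```

The "systematic" property means the even-indexed output positions reproduce the input exactly, which is what lets a turbo code transmit the information sequence only once.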
Transmission Model for BCJR-Turbo Coding parallel concatenation(4) Rate=1/4
Transmission Model for BCJR-Turbo Coding parallel concatenation(5)
Interleaver
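The interleaver permutes the information bits before they enter the second constituent encoder, so the two encoders see differently ordered versions of the same data. A generic sketch; the actual permutation used in the thesis is not shown, so a toy one stands in:

```python
def interleave(bits, perm):
    """Apply a fixed permutation: output[i] = input[perm[i]]."""
    return [bits[p] for p in perm]

def deinterleave(bits, perm):
    """Invert the permutation, restoring the original order."""
    out = [0] * len(bits)
    for i, p in enumerate(perm):
        out[p] = bits[i]
    return out

perm = [3, 0, 4, 1, 2]   # toy permutation; real interleavers span a whole block
x = [1, 1, 0, 1, 0]
y = interleave(x, perm)
```

In the iterative decoder the same permutation and its inverse are applied to the extrinsic information exchanged between the two component decoders.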
Transmission Model for BCJR-Turbo Coding parallel concatenation(6)
Turbo code of rate 1/3, obtained from the rate-1/4 parallel concatenation.
Transmission Model for BCJR-Turbo Coding parallel concatenation(7) Turbo Code rate=1/2
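A standard way to reach rate 1/2 from the parallel concatenation is puncturing: keep every systematic bit and alternate the parity bits of the two constituent RSC encoders. A sketch of that common pattern; the slide's exact puncturing matrix is not preserved, so this is an assumption:

```python
def puncture_rate_half(sys, par1, par2):
    """Rate-1/2 turbo output: every systematic bit, plus parity bits
    taken alternately from the two constituent encoders (a standard
    puncturing pattern; assumed, as the slides do not show theirs)."""
    out = []
    for i, s in enumerate(sys):
        out.append(s)                              # systematic bit
        out.append(par1[i] if i % 2 == 0 else par2[i])  # alternating parity
    return out

sys = [1, 0, 1, 1]
p1 = [0, 0, 1, 0]
p2 = [1, 1, 0, 1]
tx = puncture_rate_half(sys, p1, p2)
```

Two output bits per information bit gives rate 1/2; the decoder treats the punctured (untransmitted) parity positions as erasures.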
Transmission Model for BCJR-AWGN
Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP
Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP(1)
MAP decoder: definitions.
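The slide's equations were lost in extraction. The quantities usually defined at this point of a MAP (BCJR) derivation are, in their standard form (an assumption about what the slide showed):

```latex
% Forward recursion, backward recursion, and branch metric of the BCJR algorithm
\alpha_k(s)     = \sum_{s'} \alpha_{k-1}(s')\,\gamma_k(s',s), \qquad
\beta_{k-1}(s') = \sum_{s}  \beta_k(s)\,\gamma_k(s',s), \qquad
\gamma_k(s',s)  = P(d_k)\, p(y_k \mid d_k)
```

Here $s'$ and $s$ are trellis states at times $k-1$ and $k$, $d_k$ is the information bit driving the transition, and $P(d_k)$ carries the a priori (source or extrinsic) probability.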
Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP(2)
Log-Likelihood Ratio (LLR); recall:
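The LLR the slide computes from the forward/backward quantities has the standard form (reconstructed, since the original equation image did not survive):

```latex
L(d_k) = \ln \frac{P(d_k = 1 \mid \mathbf{y})}{P(d_k = 0 \mid \mathbf{y})}
       = \ln \frac{\displaystyle\sum_{(s',s)\,:\,d_k = 1} \alpha_{k-1}(s')\,\gamma_k(s',s)\,\beta_k(s)}
                  {\displaystyle\sum_{(s',s)\,:\,d_k = 0} \alpha_{k-1}(s')\,\gamma_k(s',s)\,\beta_k(s)}
```

The sign of $L(d_k)$ gives the hard decision and its magnitude the reliability; the extrinsic part of it is what the two BCJR decoders exchange as a priori information.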
Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP(3)
[Trellis segment with nodes (4,1), (4,3), (5,1), (5,2), (5,3) and branch output labels 00, 01, 10, 11.]
Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP(4)
[Trellis illustration highlighting nodes (0,0), (2,3), (3,3), (5,0).]
Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP(5)
[Iterative decoder: BCJR1 and BCJR2 exchange a priori information.]
Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP(6)
Simulation for BCJR Algorithm • Transmission ends when either the maximum bit-error count, fixed at 1000, or the maximum of 10,000,000 transmitted bits is reached. • Input data is grouped into blocks of 4096 bits.
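The stopping rule above can be sketched as a Monte-Carlo loop. `decode_block` below is a placeholder standing in for the whole encode/channel/decode chain (an assumption for illustration; the thesis's chain is the turbo system of the earlier slides):

```python
import random

def run_simulation(decode_block, block_len=4096,
                   max_bit_errors=1000, max_bits=10_000_000):
    """Monte-Carlo loop with the slide's stopping rule: stop once 1000
    bit errors or 10,000,000 transmitted bits are reached.

    `decode_block` is a stand-in for encode -> channel -> decode and
    must return the decoded bits for a block of source bits."""
    errors = transmitted = 0
    while errors < max_bit_errors and transmitted < max_bits:
        tx = [random.randint(0, 1) for _ in range(block_len)]
        rx = decode_block(tx)
        errors += sum(a != b for a, b in zip(tx, rx))
        transmitted += block_len
    return errors / transmitted   # bit error rate estimate

# Toy "channel" that flips each bit with probability 0.01:
ber = run_simulation(lambda tx: [b ^ (random.random() < 0.01) for b in tx])
```

Capping both the error count and the total bit count bounds simulation time at high SNR (few errors) while keeping the BER estimate statistically meaningful at low SNR.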
Simulation for BCJR Algorithm(1)
1NP: 1 iteration, independent source, without a priori probability
1NP: 1 iteration, independent source, with a priori probability
Simulation for BCJR Algorithm(2)
1NP: 1 iteration, Markov source, without a priori probability
1MP: 1 iteration, Markov source, with Markov a priori probability
Simulation for BCJR Algorithm(3)
12D: 1 iteration, independent source, with a priori probability; bit time (level) and convolutional state
13D: 1 iteration, independent source, with a priori probability; bit time (level), tree state, and convolutional state
Proposed Methodology • [4] Catherine Lamy, Lisa Perros-Meilhac
Transmission Model for Sequential-Sequential Decoding
[Iterative decoder in which BCJR1 is replaced by a sequential decoder that exchanges a priori information with BCJR2.]
Transmission Model for Sequential-Sequential Decoding(1)
1 : if code word bits
0 : otherwise
Transmission Model for Sequential-Sequential Decoding(2)
Example: r = (-1, 3, 2, 1, -2, -1, -3, -1, 1, 2), hard decisions y = (1,0,0,0,1,1,1,1,0,0).
[Figure: stack search over the trellis starting from the origin node (0,0), with Open and Close node lists; each step's branches are annotated with the hard-decision pair y and the reliability pair |r| taken from the received values.]
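The Open/Close bookkeeping above can be sketched as a best-first (stack) search over the convolutional trellis. The branch metric below, which accumulates |r| over received values whose hard decision agrees with the hypothesized code bits, and the (7,5) code are illustrative assumptions, not necessarily the thesis's exact scheme:

```python
import heapq

def stack_decode(r, gens=((1, 1, 1), (1, 0, 1))):
    """Stack (sequential) decoding sketch for a rate-1/2 convolutional code.

    Branch metric: sum of |r_i| over positions whose hard decision agrees
    with the hypothesized code bits (larger is better), loosely matching
    the slide's |r| bookkeeping. BPSK mapping assumed: bit 0 -> +1, 1 -> -1."""
    n_steps = len(r) // 2
    K = len(gens[0])
    # Open list as a max-heap via negated metric: (-metric, counter, state, path)
    open_list = [(0.0, 0, (0,) * (K - 1), [])]
    counter = 1
    closed = set()                           # the Close list
    while open_list:
        neg_m, _, state, path = heapq.heappop(open_list)
        if len(path) == n_steps:
            return path                      # first full-length path popped
        if (len(path), state) in closed:
            continue
        closed.add((len(path), state))
        for b in (0, 1):                     # extend by both input bits
            reg = (b,) + state
            code_bits = [sum(t * x for t, x in zip(g, reg)) % 2 for g in gens]
            m = -neg_m
            for j, c in enumerate(code_bits):
                rv = r[2 * len(path) + j]
                hard = 1 if rv < 0 else 0
                if hard == c:                # reward agreeing positions
                    m += abs(rv)
            heapq.heappush(open_list, (-m, counter, reg[:-1], path + [b]))
            counter += 1
    return []

# Noiseless example: the (7,5) codeword of d = (1,1,1,0,1) mapped to BPSK.
r = [-1, -1, 1, -1, -1, 1, 1, -1, 1, 1]
d_hat = stack_decode(r)
```

Unlike BCJR, which visits every trellis branch, the stack search only expands the most promising nodes, which is the source of the computation-time savings the conclusion mentions.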
Simulation for Sequential Algorithm
2D1: 1 iteration, independent source, with a priori probability; bit time (level) and convolutional state
3D1: 1 iteration, independent source, with a priori probability; bit time (level), convolutional state, and tree state
Conclusion • A heuristic method is used to obtain the sequential decoder's soft-output values within the iterative decoding architecture. Although this reduces errors and saves computation time, the decoding performance cannot reach that of the turbo decoder. Future work will pursue better methods for computing the sequential decoder's soft-output values, so that the decoding performance comes closer to the turbo decoder's.