Likelihood Ratio

$$
\lambda(d_k)
= \frac{\sum_m \alpha_k(m)\,\exp\{x_k u_k^1 + y_k v_k^{1,m}\}\,\beta_{k+1}(f(1,m))}
       {\sum_m \alpha_k(m)\,\exp\{x_k u_k^0 + y_k v_k^{0,m}\}\,\beta_{k+1}(f(0,m))}
= \exp\{2x_k\}\,
  \frac{\sum_m \alpha_k(m)\,\exp\{y_k v_k^{1,m}\}\,\beta_{k+1}(f(1,m))}
       {\sum_m \alpha_k(m)\,\exp\{y_k v_k^{0,m}\}\,\beta_{k+1}(f(0,m))}
= \exp\{2x_k\}\,\lambda_k^e
$$

Log Likelihood Ratio

$$
L(d_k) = \log \lambda(d_k) = 2x_k + \log \lambda_k^e
$$

The systematic sample contributes the channel term $2x_k$; the remaining factor $\lambda_k^e$ is the extrinsic information.
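The factorization of the likelihood ratio into a systematic term and an extrinsic term can be sketched numerically. This is a minimal illustration on an invented 2-state trellis; the values of alpha, beta_next, the parity mapping v, and the next-state function f are hypothetical, not taken from a real encoder.

```python
import math

# Invented 2-state trellis quantities at time k (illustrative values only).
alpha = [0.6, 0.4]                     # forward metrics alpha_k(m)
beta_next = [0.5, 0.5]                 # backward metrics beta_{k+1}(m')
x_k, y_k = 0.9, -0.3                   # received systematic / parity samples
u = {0: -1.0, 1: +1.0}                 # systematic symbol u_k^i
v = {(0, 0): -1.0, (0, 1): +1.0,       # parity symbol v_k^{i,m}
     (1, 0): +1.0, (1, 1): -1.0}
f = {(0, 0): 0, (0, 1): 1,             # next state f(i, m)
     (1, 0): 1, (1, 1): 0}

def lam(i):
    """Likelihood-ratio sum over states for input bit i."""
    return sum(alpha[m] * math.exp(x_k * u[i] + y_k * v[(i, m)])
               * beta_next[f[(i, m)]] for m in range(2))

def lam_e(i):
    """Same sum with the systematic term x_k*u_k^i removed (extrinsic part)."""
    return sum(alpha[m] * math.exp(y_k * v[(i, m)])
               * beta_next[f[(i, m)]] for m in range(2))

likelihood_ratio = lam(1) / lam(0)
extrinsic = lam_e(1) / lam_e(0)        # lambda_k^e

# Factoring out the systematic sample reproduces the identity above.
assert math.isclose(likelihood_ratio, math.exp(2 * x_k) * extrinsic)

llr = 2 * x_k + math.log(extrinsic)    # L(d_k)
```

Because $u_k^1 = +1$ and $u_k^0 = -1$, the systematic factor $\exp\{x_k u_k^i\}$ pulls out of both sums as $\exp\{2x_k\}$, which the assertion confirms.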
Iterative decoding

Branch metric with the extrinsic information $\lambda_k^{e,i}$ from the companion decoder acting as a priori weight:

$$
\gamma_k^{i,m} = \lambda_k^{e,i}\,\exp\{x_k u_k^i + y_k v_k^{i,m}\}
$$

$$
L(d_k) = \log \frac{\sum_m \alpha_k(m)\,\gamma_k^{1,m}\,\beta_{k+1}(f(1,m))}
                   {\sum_m \alpha_k(m)\,\gamma_k^{0,m}\,\beta_{k+1}(f(0,m))}
$$

Iterative decoding steps
• For the second iteration, weight the branch metrics by the extrinsic information from the first pass.
• Calculate the LLR for all times k.
• Make a hard decision based on the LLR after multiple iterations.
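One pass of this update can be sketched as follows. Again the 2-state trellis, the extrinsic likelihoods lambda_e, and all numeric values are invented for illustration; a real turbo decoder would exchange lambda_e between two constituent decoders over several iterations.

```python
import math

# Invented trellis quantities (same illustrative 2-state layout as above).
alpha = [0.6, 0.4]
beta_next = [0.5, 0.5]
x_k, y_k = 0.9, -0.3
u = {0: -1.0, 1: +1.0}
v = {(0, 0): -1.0, (0, 1): +1.0, (1, 0): +1.0, (1, 1): -1.0}
f = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
lambda_e = {0: 1.0, 1: 2.5}   # hypothetical extrinsic likelihoods lambda_k^{e,i}

def branch_sum(i):
    """Sum alpha_k(m) * gamma_k^{i,m} * beta_{k+1}(f(i,m)) over states m."""
    def gamma(m):
        return lambda_e[i] * math.exp(x_k * u[i] + y_k * v[(i, m)])
    return sum(alpha[m] * gamma(m) * beta_next[f[(i, m)]] for m in range(2))

llr = math.log(branch_sum(1) / branch_sum(0))   # L(d_k) for this iteration
d_hat = 1 if llr > 0 else 0                     # hard decision after the last pass
```

The sign of the LLR gives the hard decision; its magnitude indicates reliability.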
Channel measurement based LLR • When no channel state information (CSI) is available at the decoder, the received statistic can be approximated by a Gaussian distribution with mean E[a]·x = 0.8862·x (the mean of a unit-energy Rayleigh fading amplitude) and variance σ². The variance σ² is determined by the additive noise. • If the decoder knows the fading amplitude a for each symbol, we can instead apply the Gaussian distribution with mean a·x and variance σ².
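The two cases can be sketched as the following channel-LLR computations. The noise variance sigma2 and the sample values are illustrative assumptions; the constant 0.8862 is E[a] = sqrt(pi)/2 for a Rayleigh amplitude normalized so that E[a²] = 1.

```python
import math

sigma2 = 0.5                          # assumed additive-noise variance
E_a = math.sqrt(math.pi) / 2          # E[a] = 0.8862 for unit-energy Rayleigh fading

def llr_no_csi(x_k):
    """No CSI: replace the unknown fading amplitude by its mean E[a]."""
    return 2 * E_a * x_k / sigma2

def llr_with_csi(x_k, a_k):
    """Known CSI: use the measured fading amplitude a_k for this symbol."""
    return 2 * a_k * x_k / sigma2
```

With CSI, each symbol's LLR is scaled by its own fading amplitude; without CSI, every symbol gets the same average scaling, which costs some performance in deep fades.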
Performance Comparison • Effect of block size • Effect of channel fading • Effect of channel correlation • Importance of interleaver