LT-AF Codes: LT Codes with Alternating Feedback
Ali Talari and Nazanin Rahnavard, Oklahoma State University
IEEE ISIT (International Symposium on Information Theory) 2013
Outline • Introduction • LT-AF codes • Simulation results • Conclusions
Introduction • Rateless codes require only a single feedback, issued by the decoder to inform the encoder of a successful LT decoding. • In LT codes, the available feedback channel therefore remains underutilized during the transmission. • As the data-block length decreases, the performance of LT codes significantly deteriorates. • Existing works have proposed employing feedback to inform the encoder of: • the number of successfully decoded input symbols, • a suitable input symbol to be sent for enhancing the decoding, • the indices of some recovered input symbols.
Introduction • We propose LT codes with alternating feedback (LT-AF codes), which considerably improve the performance of LT codes for short block lengths when the belief propagation decoder is in use. • The decoder alternately issues two types of feedback, based on the dependencies among the still-undecoded received output symbols and on the number of decoded input symbols. • In contrast to existing work, we design LT-AF codes under a realistic feedback-channel assumption, where the feedback channel can have an unknown or varying erasure rate εfb ∈ [0, 1).
LT-AF codes • γsucc: the coding overhead required for a successful decoding with high probability • γsucc × k coded symbols are enough to decode the k input symbols with high probability • γ: the coding overhead received so far (while the transmission is still in progress), 0 < γ < γsucc • γ × k: the number of output symbols received at the decoder • We exploit the feedback channel to obtain a much smaller γsucc for finite k in LT-AF coding.
LT-AF codes • Ωk,n(.): the degree distribution of LT-AF codes for a data block of length k when n input symbols have already been recovered at the decoder. • We adopt the idea of SLT codes [6] and propose to shift Ωk,n(.) based on n. • We allow the decoder to issue the first type of feedback, referred to as fb1, which keeps the encoder updated with the current value of n. • The encoder generates an output symbol of degree one (containing a randomly selected input symbol) as acknowledgment. [6] A. Hagedorn, S. Agarwal, D. Starobinski, and A. Trachtenberg, "Rateless coding with feedback," IEEE INFOCOM 2009, pp. 1791–1799, 2009. http://www.powercam.cc/slide/24647
Inefficiency of LT Codes for our Problem
[Diagram: an LT encoder streams encoded symbols to the decoding host, which decodes them back into the k input symbols; when n out of the k input symbols are already known a priori at the decoding host, many of the transmitted encoded symbols are redundant.]
http://www.powercam.cc/slide/24647
Inefficiency of LT Codes for our Problem • The number of these redundant encoded symbols grows with the ratio of input symbols known at the decoder (n) to the total number of input symbols (k). • If n input symbols are known a priori, an additional LT-encoded symbol provides no new information to the decoding host with a probability that quickly approaches 1 as n → k (a symbol is redundant whenever all of its neighbors are already known); see the sketch below. http://www.powercam.cc/slide/24647
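As an informal illustration (the slide's exact expression is not reproduced here), if an encoded symbol's d neighbors are chosen uniformly at random, the chance that all of them fall among the n already-known inputs is C(n,d)/C(k,d); averaging over the degree distribution gives the probability that a freshly received symbol is redundant. A minimal Python sketch, assuming a generic degree distribution passed as a dict:

```python
from math import comb

def redundancy_prob(k, n, omega):
    """Probability that a freshly received encoded symbol carries no new
    information, i.e., all of its neighbors are among the n known inputs.
    `omega` maps degree d -> probability of choosing degree d."""
    total = 0.0
    for d, p in omega.items():
        if d <= n:  # otherwise at least one neighbor must be unknown
            total += p * comb(n, d) / comb(k, d)
    return total

# Toy distribution: redundancy climbs quickly as n approaches k.
toy = {1: 0.1, 2: 0.5, 3: 0.4}
for n in (0, 500, 900, 990):
    print(n, round(redundancy_prob(1000, n, toy), 3))
```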
Intuitive Fix • The n known input symbols serve the function of degree-one encoded symbols, disproportionately skewing the degree distribution used for LT encoding. • We thus propose to shift the robust soliton distribution to the right in order to compensate for these additional, functionally degree-one symbols. • Questions: • 1) How? • 2) By how much? http://www.powercam.cc/slide/24647
Shifted Code Construction • Definition: the shifted robust soliton distribution assigns probability μk−n(i) to degree round(i · k/(k − n)), for i = 1, …, k − n, where μk−n(·) is the robust soliton distribution over k − n symbols. • k: the number of input symbols in the system • n: the number of input symbols already known at the decoder • round(·): rounds to the nearest integer • Intuition: the n known input symbols at the decoding host reduce the degree of each encoding symbol by an expected fraction of n/k. http://www.powercam.cc/slide/24647
Shifted Code Distribution • LT code distribution and proposed shifted code distribution, with parameters k = 1000, c = 0.01, δ = 0.5. The number of known input symbols at the decoding host is set to n = 900 for the shifted code distribution. • The probabilities of occurrence of encoded symbols of some degrees are 0 under the shifted code distribution. A sketch of the shift idea follows. http://www.powercam.cc/slide/24647
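The sketch below follows the stated intuition: sample a degree from the robust soliton distribution over the k − n still-unknown inputs, then stretch it by k/(k − n), so that after the expected n/k fraction of its neighbors turn out to be already known, its effective degree matches the sampled one. This is an illustration of the idea; the exact definition in [6] may differ in details such as rounding.

```python
import math
import random

def robust_soliton(k, c=0.01, delta=0.5):
    """Robust soliton distribution over degrees 1..k (standard definition)."""
    s = c * math.sqrt(k) * math.log(k / delta)
    rho = [0.0, 1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    tau = [0.0] * (k + 1)
    pivot = max(1, min(int(round(k / s)), k))
    for d in range(1, pivot):
        tau[d] = s / (k * d)
    tau[pivot] = s * math.log(s / delta) / k
    z = sum(rho) + sum(tau)
    return [(rho[d] + tau[d]) / z for d in range(k + 1)]  # index = degree

def sample_shifted_degree(k, n, c=0.01, delta=0.5):
    """Sample from the shifted distribution: draw a degree from the robust
    soliton over the k - n still-unknown inputs, then stretch it by k/(k-n)."""
    dist = robust_soliton(k - n, c, delta)
    j = random.choices(range(k - n + 1), weights=dist)[0]
    return min(k, int(round(j * k / (k - n))))

# Example: with k = 1000 and n = 900, sampled degrees land roughly at 10x
# the degrees an ordinary robust soliton over 100 symbols would produce,
# leaving many intermediate degrees with probability 0, as in the figure.
print(sorted(sample_shifted_degree(1000, 900) for _ in range(10)))
```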
LT-AF codes • We propose to employ the ideal soliton distribution in LT-AF coding and to exploit the feedback channel to request a suitable input symbol (which arrives as an output symbol of degree one). • We allow the decoder to request desired input symbols using the second type of feedback, referred to as fb2. • The encoder generates a degree-one output symbol if and only if it has received a fb1 or a fb2.
LT-AF codes • The absence of a degree-one output symbol at the decoder after issuing a fb1 or fb2 clearly indicates a feedback loss. • All feedback packet losses are therefore identified by the decoder, and a feedback retransmission is performed; a sketch of a possible detection rule follows. • Consequently, the decoding recovery rate of LT-AF codes does not considerably degrade even at high feedback-channel loss rates εfb ∈ [0, 1). • A degree-one output symbol generated by the encoder after a fb1 contains a randomly selected input symbol, whereas after a fb2 it contains the requested input symbol (selected by the decoder).
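The slides do not spell out how the decoder concludes that a feedback was lost. The sketch below assumes an in-order forward channel and exactly one degree-one reply per feedback, so a non-degree-one symbol arriving while a feedback is pending signals the loss; these assumptions are not stated in the slides.

```python
def handle_received_symbol(symbol_degree, feedback_pending, resend_feedback):
    """Decoder-side check after each received output symbol.
    Assumes the encoder answers every feedback with exactly one degree-one
    symbol and that forward-channel symbols arrive in order: if a feedback
    is pending but the next symbol is not degree one, the feedback was lost."""
    if feedback_pending and symbol_degree != 1:
        resend_feedback()   # retransmit the lost fb1/fb2
        return True         # feedback is still pending
    return False            # nothing pending, or the degree-one reply arrived
```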
LT-AF codes • Let Ωk,n(x) = Σd Ωk,n(d)·x^d, where Ωk,n(d) is the probability of selecting degree d to generate an LT-AF output symbol. • Since we do not allow the encoder to generate any degree-one symbol on its own, we set Ωk,n(1) = 0.
LT-AF codes • The average degree of a check node (output symbol) generated using the Ωk,n(.) distribution is Σd d·Ωk,n(d), i.e., Ω′k,n(1); see the sketch below.
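The slides do not reproduce the paper's exact Ωk,n(.). As a rough sketch, the code below assumes an ideal soliton over the k − n unknown inputs, stretched by k/(k − n) as in the previous slides, with the degree-one mass removed (degree-one symbols are only sent as replies to feedback), and then computes the average degree Σd d·Ωk,n(d).

```python
def shifted_ideal_soliton_no_deg1(k, n):
    """A possible Omega_{k,n}: ideal soliton over the k-n unknown inputs,
    degrees stretched by k/(k-n), with the degree-one mass dropped."""
    m = k - n
    omega = {}
    for j in range(2, m + 1):            # skip j = 1 (ideal soliton's 1/m mass)
        d = min(k, int(round(j * k / m)))
        omega[d] = omega.get(d, 0.0) + 1.0 / (j * (j - 1))
    z = sum(omega.values())
    return {d: p / z for d, p in omega.items()}

def average_degree(omega):
    """E[d] = sum over d of d * Omega(d)."""
    return sum(d * p for d, p in omega.items())

print(average_degree(shifted_ideal_soliton_no_deg1(1000, 0)))    # baseline, roughly ln k plus a constant
print(average_degree(shifted_ideal_soliton_no_deg1(1000, 400)))  # larger, since fewer unknowns raise the degree
```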
Generating fb1 • The encoder is not aware of the current n unless its knowledge of n is updated by a fb1. • Initially, the encoder assumes n = 0 and employs the degree distribution Ωk,0(.) to generate output symbols. • nr: the most recently reported value of n (via a fb1). • We propose that the decoder generates a fb1 when the average degree of Ωk,n(.) has increased by at least a fixed amount relative to Ωk,nr(.). • ni: the threshold such that for n ≥ ni the i-th fb1 is generated.
Generating fb1 • At k = 10^2, the first and second fb1's are issued at n ≥ 39 and n ≥ 58, respectively. • For k = 10^4, the first and second fb1's are issued at n ≥ 2740 and n ≥ 4346. • ni/k decreases as k increases.
Generating fb2 • Transmission of fb2's before γ = 1 does not considerably contribute to the decoding progress. • We therefore propose to generate fb2's only after γ surpasses 1. • To obtain uniformly spaced fb2's and to avoid feedback-channel congestion, an LT-AF decoder issues a fb2 on the reception of every (ln k)-th output symbol; a small sketch of this pacing rule follows.
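A decoder-side sketch of the pacing rule above: once more than k output symbols have arrived (γ > 1), a fb2 is issued on every ⌈ln k⌉-th subsequent reception. The function name and the exact counting offset are an interpretation, not taken from the paper.

```python
import math

def should_send_fb2(num_received, k):
    """Issue a fb2 only after gamma = num_received / k exceeds 1, and then
    only on every ceil(ln k)-th received output symbol."""
    if num_received <= k:
        return False
    spacing = max(1, math.ceil(math.log(k)))
    return (num_received - k) % spacing == 0
```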
Generating fb2 • 1) Generating fb2 based on the variable node with maximum degree (VMD) • The decoder issues a fb2 requesting the variable node with the highest degree in its current decoding graph G. • VMD greedily removes the largest possible number of edges from G and decreases the degree of many check nodes. • Vun: the set of remaining undecoded variable nodes • Cbuff: the set of buffered check nodes with degree higher than one • The ripple: the check nodes of degree one
Generating fb2 • In the graph of Fig. 2, VMD would request v5. • On the arrival of c8, containing only v5, the value of v5 becomes known. • This removes all edges emanating from v5 to the other check nodes and reduces some of them to degree one (they are added to the ripple). • c7 is added to the ripple, which recovers v7 in the next decoding iteration. Fig. 2. The bipartite graph representing the input and the output symbols of an LT-AF code in the buffer of a decoder. Graph G at the decoder at γ = 1 for k = 7.
• The ripple: {c7} • Cbuff = {c1, c2, …, c6} • Vun = {v1, v2, v3, v4, v6}
Generating fb2 • 2) Generating fb2 based on full variable node decoding (FVD) • A more complex method to generate a fb2 is to run a dummy decoding for each unrecovered input symbol. • The single input symbol whose delivery results in the highest number of decoded input symbols is requested via a fb2. • FVD has a much higher complexity than VMD; a sketch of both selection rules follows.
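The sketch below illustrates both fb2 selection rules, representing buffered check nodes as sets of still-unknown variable indices. VMD counts edges per variable and requests the most connected one; FVD assumes each candidate variable delivered in turn, runs a throwaway peeling pass, and requests the one that unlocks the most variables. Function names and data layout are illustrative, not from the paper.

```python
from collections import Counter

def vmd_request(c_buff):
    """VMD: request the undecoded variable with the largest degree in G,
    i.e., the one appearing in the most buffered check nodes."""
    counts = Counter(v for check in c_buff for v in check)
    return max(counts, key=counts.get)

def peel(c_buff, known):
    """Standard peeling on the erasure graph: repeatedly resolve degree-one
    check nodes until none remain; return the set of recovered variables."""
    known = set(known)
    checks = [set(c) - known for c in c_buff]
    progress = True
    while progress:
        progress = False
        for c in checks:
            if len(c) == 1:
                known |= c
                progress = True
        checks = [c - known for c in checks]
    return known

def fvd_request(c_buff, v_un):
    """FVD: dummy-decode with each candidate variable assumed delivered,
    then request the one that leads to the most recoveries."""
    return max(v_un, key=lambda v: len(peel(c_buff, {v})))
```

In the example of Fig. 2, VMD would return v5, matching the slide's choice.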
Simulation Results • Our results are obtained by the Monte Carlo method, averaging over at least 10^7 numerical simulations. • We report the decoding bit error rate (BER), i.e., the average ratio of unrecovered input symbols to the total number of input symbols, 1 − E[n/k]. • k = 1000 • We set c = 0.9 and δ = 0.1 for SLT and LT codes, as proposed in [6]. [6] A. Hagedorn, S. Agarwal, D. Starobinski, and A. Trachtenberg, "Rateless coding with feedback," IEEE INFOCOM 2009, pp. 1791–1799, 2009.
LT-AF Decoding Error Rate and Runtime
[Plot: decoding error rate versus received overhead; the successful-decoding overheads marked are γsucc = 1.09, γsucc = 1.14, and γsucc = 1.31.]
LT-AF Decoding Error Rate and Runtime • In LT-AF coding the average degree of output symbols is much higher than that of regular LT codes, which causes a higher encoding/decoding complexity.
Number of Feedbacks • Not only do LT-AF codes decrease the coding overhead γsucc required for a successful decoding, but they also need a slightly smaller number of feedbacks compared to SLT codes. • The total number of feedbacks is much smaller than γsucc·k.
Robustness to Erasure in the Feedback Channel Fig. 5. Effect of 90% feedback loss on the performance of SLT and LT-AF codes employing VMD. Note that the curves representing LT-AF codes' performance for εfb = 0 and εfb = 0.9 fully overlap for all γ. • Assume that the loss rate of the feedback channel is εfb = 0.9 (unknown to both encoder and decoder), so 90% of the feedbacks are lost in transmission. • To the best of our knowledge, robustness against feedback loss had not been considered in any existing work, which significantly distinguishes LT-AF codes.
Conclusions • We proposed LT-AF codes, LT codes with two types of feedback, which alleviate the poor performance of LT codes at short data-block lengths. • We showed that LT-AF codes require a lower coding overhead for successful decoding and a smaller number of feedbacks.
References • [6] A. Hagedorn, S. Agarwal, D. Starobinski, and A. Trachtenberg, "Rateless coding with feedback," IEEE INFOCOM 2009, pp. 1791–1799, 2009. • [7] A. Kamra, V. Misra, J. Feldman, and D. Rubenstein, "Growth codes: Maximizing sensor network data persistence," SIGCOMM Computer Communication Review, vol. 36, no. 4, pp. 255–266, 2006. • [8] A. Beimel, S. Dolev, and N. Singer, "RT oblivious erasure correcting," IEEE/ACM Transactions on Networking, vol. 15, pp. 1321–1332, Dec. 2007. • [9] S. Kokalj-Filipovic, P. Spasojevic, E. Soljanin, and R. Yates, "ARQ with doped fountain decoding," ISSSTA, pp. 780–784, Aug. 2008. • [10] J. Sørensen, P. Popovski, and J. Østergaard, "On the role of feedback in LT codes," arXiv preprint arXiv:1012.2673, 2010. • [11] J. H. Sorensen, T. Koike-Akino, and P. Orlik, "Rateless feedback codes," IEEE ISIT, pp. 1767–1771, 2012.