It’s Easier to Approximate

Presentation Transcript


  1. It’s Easier to Approximate. David Tse, Wireless Foundations, U.C. Berkeley. ISIT 2009, Seoul, South Korea.

  2. Holy Grail of Network Information Theory
      • Noisy channel coding
      • Lossy source coding
      • Joint source-channel coding
      What constitutes a solution?
      [Figure: a source sent from a sender node across a network to a destination node]

  3. Point-to-Point Communication
      Target distortion D is achievable iff R(D) ≤ C (Shannon 48).
      This exact solution is remarkable, but it also sets a high bar for us.
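      A quick numerical illustration of this condition (added for concreteness, not from the original slides): a unit-variance Gaussian source over an AWGN channel, using the standard formulas C = 0.5·log2(1 + SNR) and R(D) = max(0, 0.5·log2(var/D)).

      import math

      def awgn_capacity(snr):
          # AWGN channel capacity in bits per channel use.
          return 0.5 * math.log2(1 + snr)

      def gaussian_rate_distortion(var, D):
          # Rate-distortion function of a Gaussian source, bits per sample.
          return max(0.0, 0.5 * math.log2(var / D))

      # Unit-variance source, target distortion 0.1, channel SNR 15 (about 11.8 dB).
      R = gaussian_rate_distortion(1.0, 0.1)   # about 1.66 bits/sample
      C = awgn_capacity(15.0)                  # exactly 2.0 bits/channel use
      print(f"R(D) = {R:.2f}, C = {C:.2f}, achievable: {R <= C}")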

  4. First Success: Multiple Access Channel (Ahlswede 71, Liao 72)
      The MAC result holds for any number of users and for general channel models.

  5. What’s Next?
      Multiple access: solved. Broadcast (Cover 72): open for 40 years!
      [Figure: the space of all network IT problems; multiple access and broadcast are two points, the rest is marked "?"]

  6. Gaussian Problems
      • practically relevant
      • physically meaningful
      • provide worst-case bounds
      Are they easier?

  7. Gaussian Problems
      Lots of progress on these problems, yet the conclusion: we are still pretty stuck!
      [Figure: the space of Gaussian network IT problems: multiple access, broadcast, interference, relay, distributed lossy compression, multiple description]

  8. New Strategy: Approximate!
      • Look for approximate solutions with a guarantee on the gap to optimality.
      • Approximation results are not new.
      • We advocate a systematic approach that applies to many problems.

  9. Approximation Approach
      1) Simplify:
         • Channel coding: noisy to noiseless
         • Source coding: lossy to lossless
      2) Analyze the simplified problem.
      3) Use the insight to find new schemes and outer bounds for the Gaussian problem.
      4) Derive a worst-case gap.

  10. Noiseless and Lossless Problems
      Noiseless is easier than noisy: the noisy broadcast channel is open, but the noiseless broadcast channel is solved (Marton 77, Pinsker 78).
      Lossless is easier than lossy: lossy distributed compression is open, but lossless distributed compression is solved (Slepian & Wolf 73).
      [Figure: two encoders Enc1 and Enc2 feeding a single decoder Dec]
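      To make the lossless binning idea concrete, here is a toy Slepian-Wolf sketch (added illustration, not from the slides; it assumes the decoder's side information y differs from the 7-bit source x in at most one position). The encoder sends only the 3-bit syndrome of x under the (7,4) Hamming parity-check matrix, i.e. the index of x's bin; the decoder finds the unique member of that bin within Hamming distance 1 of y.

      import numpy as np

      # Parity-check matrix of the (7,4) Hamming code; column j is j in binary.
      H = np.array([[0, 0, 0, 1, 1, 1, 1],
                    [0, 1, 1, 0, 0, 1, 1],
                    [1, 0, 1, 0, 1, 0, 1]])

      def syndrome(x):
          return H @ x % 2

      def sw_decode(s, y):
          # Recover x from its 3-bit syndrome s and side info y, dist(x, y) <= 1.
          e = syndrome(y) ^ s                        # syndrome of the flip pattern x ^ y
          if not e.any():
              return y                               # x equals y
          pos = int("".join(map(str, e)), 2) - 1     # which column of H equals e
          x = y.copy()
          x[pos] ^= 1
          return x

      x = np.array([1, 0, 1, 1, 0, 0, 1])
      y = x.copy(); y[4] ^= 1                        # side info: one bit flipped
      assert (sw_decode(syndrome(x), y) == x).all()
      print("recovered the 7-bit source from a 3-bit message")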

  11. Goal of Talk Gaussian network IT problems interference relay broadcast distributed lossy compression multiple access multiple description

  12. 1. Interference

  13. 2-User Interference Channel
      [Figure: Enc1 → Dec1 and Enc2 → Dec2, with cross links creating interference]

  14. Special Case: Strong Interference
      Each user’s signal is heard louder at the other receiver, so both messages can be made public.
      Observation: in a working system, m1 is decodable at Rx 2.
      Theorem (Sato 81, Han & Kobayashi 81): the capacity region is that of the compound MAC, i.e. both receivers can decode both messages.
      [Figure: Enc1 → Dec1 and Enc2 → Dec2, with both messages public]
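      Concretely, the region is the intersection of the two MAC regions, since each receiver decodes both messages. A minimal numerical sketch (added illustration, assuming unit noise power, transmit powers P1 and P2, and cross gains a, b ≥ 1):

      import math

      def C(snr):
          # Gaussian capacity function in bits per channel use.
          return 0.5 * math.log2(1 + snr)

      def strong_interference_region(P1, P2, a, b):
          # Bounds of the 2-user Gaussian IC capacity region under strong
          # interference: the intersection of the MAC regions at the two receivers.
          assert a >= 1 and b >= 1, "strong interference needs cross gains >= 1"
          R1_max = C(P1)
          R2_max = C(P2)
          Rsum_max = min(C(P1 + a**2 * P2), C(b**2 * P1 + P2))
          return R1_max, R2_max, Rsum_max

      print(strong_interference_region(P1=10, P2=10, a=1.5, b=1.5))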

  15. Weak Interference
      Too inefficient to make m1 and m2 public. How should each link code to efficiently co-exist? Open problem for 30 years.
      [Figure: Enc1 → Dec1 and Enc2 → Dec2, with weak cross links]

  16. Deterministic Interference Channel (El Gamal & Costa 82)
      [Figure: Enc1 → Dec1 and Enc2 → Dec2; only a deterministic function V of each input reaches the other receiver]
      Observation: in a working system, the component of m1 carried on V1 is decodable by Rx 2, i.e. public.
      Theorem: the Han-Kobayashi scheme is optimal.
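      One concrete member of this deterministic class is the linear shift model (due to Avestimehr, Diggavi & Tse, and used by Bresler & Tse for the interference channel). A toy sketch (added illustration): inputs are q-bit vectors, each link delivers only its top n bits, and a receiver sees the XOR of its two incoming images, so V1 is literally the top n_cross bits of x1.

      import numpy as np

      q = 5  # bit levels per input

      def shift(x, n):
          # Deliver the top n bits of x, shifted down to the lowest received levels.
          y = np.zeros(q, dtype=int)
          if n > 0:
              y[q - n:] = x[:n]
          return y

      def channel(x1, x2, n_direct, n_cross):
          # Linear deterministic interference channel: a strong direct image
          # XORed with a weaker cross image of the other user's input.
          y1 = shift(x1, n_direct) ^ shift(x2, n_cross)
          y2 = shift(x2, n_direct) ^ shift(x1, n_cross)
          return y1, y2

      x1 = np.random.randint(0, 2, q)
      x2 = np.random.randint(0, 2, q)
      y1, y2 = channel(x1, x2, n_direct=5, n_cross=2)
      # Only the top 2 bits of each input (the public levels V1, V2) reach the
      # other receiver; the remaining 3 bits are private.
      print(y1, y2)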

  17. Connection to Gaussian Problem
      [Figure: the Gaussian input split into a public part V1 and a private part below it]
      Is there a natural V1?

  18. Gaussian: Optimal to Within 1 Bit
      Strategy: Gaussian Han-Kobayashi, with each private signal received at the noise level at the other Rx.
      Theorem (Etkin, T. & Wang 06): this achieves the capacity region of the interference channel to within 1 bit/s/Hz.
      [Figure: each user's power split into public and private parts, with the private part arriving at the noise level at the other receiver]
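      One payoff of the 1-bit result: it pins down the generalized degrees of freedom d(alpha) of the symmetric channel, alpha = log(INR)/log(SNR), which trace the well-known "W" curve of the Etkin-Tse-Wang paper. The sketch below is an added illustration:

      def gdof(alpha):
          # Symmetric generalized degrees of freedom of the 2-user Gaussian IC.
          if alpha <= 0.5:
              return 1 - alpha      # weak: treating interference as noise is optimal
          if alpha <= 2 / 3:
              return alpha          # Han-Kobayashi public/private splitting
          if alpha <= 1:
              return 1 - alpha / 2  # Han-Kobayashi public/private splitting
          if alpha <= 2:
              return alpha / 2      # strong: decode the interference
          return 1.0                # very strong: interference costs nothing

      for a in (0.0, 0.5, 2 / 3, 1.0, 2.0, 3.0):
          print(f"alpha = {a:.2f}  ->  d = {gdof(a):.3f}")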

  19. New Strategies?
      • Gaussian Han-Kobayashi turns out to be quite good for the 2-user interference channel.
      • How about for more users?

  20. Many-to-One Interference Channel
      [Figure: several interferers impinging on a single receiver; random codes vs. structured codes, with interference alignment]
      Theorem (Bresler, Parekh & T. 07): capacity can be achieved to within a constant gap, using structured codes and interference alignment.
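      Why structure helps here (a toy added illustration): if all interferers use the same lattice code, their aggregate is again a lattice point, so many interferers take up no more received "space" at the shared receiver than one; random codebooks enjoy no such collapse.

      import random

      step = 1.0        # one-dimensional lattice: multiples of step
      K = 5             # number of interferers
      trials = 10000

      lattice_sums, random_sums = set(), set()
      for _ in range(trials):
          lattice_pts = [step * random.randint(0, 10) for _ in range(K)]
          random_pts = [round(random.uniform(0, 10), 3) for _ in range(K)]
          lattice_sums.add(sum(lattice_pts))
          random_sums.add(round(sum(random_pts), 3))

      # The sum of K lattice codewords stays on the lattice (at most 51 values
      # here), while K random codewords spread over essentially every trial.
      print(len(lattice_sums), "distinct lattice sums vs", len(random_sums), "random sums")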

  21. Open Question
      Find a constant-gap approximation to the general K-user interference channel.

  22. 2. Relay

  23. Single-Sender Single-Destination Gaussian Networks
      • Capacity open for 30 years, even for a single relay.
      • Best known achievable strategy: Cover & El Gamal 79.
      [Figure: a source reaching a destination through a general network of relay nodes]

  24. Gaussian vs Wireline
      Wireline: max-flow min-cut (Ford & Fulkerson 56).
      Gaussian: cutset bound. What to forward? How to forward?
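      On the wireline side the answer is classical: the max flow equals the min cut and is easy to compute. A compact Edmonds-Karp sketch (the standard algorithm; the 4-node example network is an added illustration):

      from collections import deque

      def max_flow(cap, s, t):
          # Edmonds-Karp: push flow along shortest augmenting paths until none remain.
          # cap[u][v] holds the residual capacity of edge u -> v, updated in place.
          n, flow = len(cap), 0
          while True:
              parent = [-1] * n
              parent[s] = s
              q = deque([s])
              while q and parent[t] == -1:
                  u = q.popleft()
                  for v in range(n):
                      if cap[u][v] > 0 and parent[v] == -1:
                          parent[v] = u
                          q.append(v)
              if parent[t] == -1:        # no augmenting path left: flow equals min cut
                  return flow
              bottleneck, v = float("inf"), t
              while v != s:              # find the bottleneck capacity on the path
                  bottleneck = min(bottleneck, cap[parent[v]][v])
                  v = parent[v]
              v = t
              while v != s:              # push the bottleneck, update residuals
                  cap[parent[v]][v] -= bottleneck
                  cap[v][parent[v]] += bottleneck
                  v = parent[v]
              flow += bottleneck

      # Source 0, relays 1 and 2, destination 3; entries are link capacities in bits.
      cap = [[0, 3, 2, 0],
             [0, 0, 1, 2],
             [0, 0, 0, 3],
             [0, 0, 0, 0]]
      print(max_flow(cap, 0, 3))   # 5, matching the min cut around the source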

  25. From Wireline to Gaussian
      Wireline: each bit received is good; random network coding achieves the min-cut (Ahlswede, Cai, Li & Yeung 00).
      Gaussian: quantize at the noise level to extract the good bits, then randomly map the quantized signal to Gaussian codewords.

  26. Approximate Max-Flow Min-Cut
      Theorem (Avestimehr, Diggavi & T. 08): the quantize-and-random-map strategy achieves R ≥ C_cutset − κ, where the gap κ does not depend on the channel gains.
      Not true for classic schemes:
      • decode-and-forward
      • amplify-and-forward
      • Gaussian compress-and-forward

  27. Example: Single Relay
      [Figure: achievable rate vs. the source-relay and relay-destination gains; the gap to the cutset bound stays bounded]
      Theorem: the gap to the cutset bound is at most 1 bit/s/Hz.

  28. Scaling with Network Size
      A bit of bad news: the gap increases with the number of nodes.
      Open question: is the gap between the capacity and the cutset bound also independent of the number of nodes?

  29. Lossy Source Coding

  30. Source-Channel Duality
      Channel coding: noisy to noiseless. Source coding: lossy to lossless.
      [Figure: source encoder and source decoder]

  31. 3. Multiple Description

  32. Gaussian Multiple Description
      • Rate region solved for K = 2 descriptions (Ozarow 80).
      • What about K > 2 descriptions?
      • Symmetric case: want the same distortion Dm for any subset of m descriptions.
      [Figure: a Gaussian source encoded into two descriptions; Dec1A and Dec1B each receive one description, Dec2 receives both]

  33. From Lossy to Lossless: Multilevel Diversity Coding (Roche, Yeung & Hau 97; Yeung & Zhang 99)
      [Figure: the two-description network, with the MSBs recoverable from any single description and the LSBs only from both]
      In general, want to recover V1, V2, …, Vm from any subset of m descriptions.

  34. Symmetric Multilevel Diversity Coding
      Three descriptions, three source layers:
      • MSB layer: (3,1) repetition code, recoverable from any one description.
      • Middle layer: (3,2) MDS code, recoverable from any two descriptions.
      • LSB layer: (3,3) no coding, recoverable only from all three descriptions.
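      A minimal sketch of this layered construction (added illustration, using bytes and XOR as the (3,2) MDS code):

      def smdc_encode(v1, v2a, v2b, v3):
          # v1: most important layer, repeated in every description ((3,1) code).
          # v2a, v2b: middle layer, (3,2) MDS-coded as [v2a, v2b, v2a ^ v2b].
          # v3: least important layer, one uncoded piece per description ((3,3)).
          v3a, v3b, v3c = v3
          d1 = (v1, v2a, v3a)
          d2 = (v1, v2b, v3b)
          d3 = (v1, v2a ^ v2b, v3c)
          return d1, d2, d3

      d1, d2, d3 = smdc_encode(v1=0xA5, v2a=0x12, v2b=0x34, v3=(1, 2, 3))

      assert d2[0] == 0xA5                        # any ONE description gives layer 1
      v2a = d1[1]                                 # any TWO give layer 2, e.g. d1 and d3:
      v2b = d1[1] ^ d3[1]                         # v2a ^ (v2a ^ v2b) = v2b
      assert (v2a, v2b) == (0x12, 0x34)
      assert (d1[2], d2[2], d3[2]) == (1, 2, 3)   # all THREE give layer 3
      print("any 1 -> layer 1, any 2 -> layers 1-2, all 3 -> everything")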

  35. Back to the Gaussian Lossy Problem
      Scheme: a Gaussian successive refinement code layered via multilevel diversity coding.
      Theorem (Tian, Mohajer & Diggavi 08): this scheme achieves within 1.48 bits/sample of the symmetric rate point for any number of descriptions. A more sophisticated scheme (Puri, Pradhan & Ramchandran 05) has a gap of 0.92 bits.

  36. 4. Distributed Lossy Compression

  37. Gaussian Problem
      • Rate region solved recently for 2 sources (Wagner, Tavildar & Viswanath 08, building on Oohama 97).
      • Random-quantize-and-bin is optimal.
      • What about 3 or more sources?
      [Figure: two encoders Enc1 and Enc2 feeding a single decoder Dec]

  38. Example 1: Tree Sources
      [Figure: three encoders Enc1, Enc2, Enc3, each quantizing its source to distortion D1, D2, D3 and then binning, feeding a single decoder Dec]
      Theorem (Maddah-Ali & T. 09): quantize-and-bin can achieve the rate region to within 2.4 bits/sample.

  39. Example 2: Recovering Differences (Krithivasan & Pradhan 07)
      [Figure: two encoders quantizing with a fine lattice and binning with a coarse lattice; the decoder wants the difference of the two sources]
      Theorem (Wagner 09): structured-quantize-and-bin can achieve within 1 bit of the rate region.
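      A one-dimensional caricature of the nested-lattice idea (an added sketch, not the actual construction): each encoder quantizes to a fine lattice and transmits the result modulo a coarse lattice. The decoder recovers the difference up to quantization, provided the true difference is smaller than half the coarse spacing, and each message costs only log2(coarse/fine) bits no matter how large the sources themselves are.

      fine = 0.1     # fine lattice step: quantization resolution
      coarse = 8.0   # coarse lattice step: "bin" size, must exceed twice the difference

      def encode(x):
          # Quantize to the fine lattice, keep only the residue mod the coarse lattice.
          q = round(x / fine) * fine
          return q % coarse

      def decode_difference(m1, m2):
          # Recover the quantized difference from the two residues,
          # assuming |x1 - x2| < coarse / 2.
          d = (m1 - m2) % coarse
          return d if d < coarse / 2 else d - coarse

      x1, x2 = 103.27, 101.94      # large sources with a small difference
      diff = decode_difference(encode(x1), encode(x2))
      print(diff)                  # about 1.40: the true difference 1.33, up to fine steps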

  40. State of Affairs

  41. State of Affairs: Exact Solutions
      [Figure: the map of Gaussian network IT problems (interference, relay, broadcast, distributed lossy compression, multiple access, multiple description), with each problem marked solved or open]

  42. State of Affairs: Approximate Solutions
      [Figure: the same map, now with problems marked solved, partially solved, or open]

  43. Collaborators
      Hua Wang, Raul Etkin, Abhay Parekh, Salman Avestimehr, Randy Berry, Mohammad Maddah-Ali, Changho Suh, Guy Bresler, Suhas Diggavi, Roy Yates, Vinod Prabhakaran, Emre Telatar
