1. The Capacity of Interference Channels with Partial Transmitter Cooperation
The theme of today's talk is cooperation among nodes in wireless networks to improve network performance. This topic has been a focus of research for several years now and has been analyzed from many angles to understand how nodes should cooperate.
We use network information theory to determine the fundamental limits on performance and propose cooperative schemes that achieve, or come close to, these limits.
I am going to talk about a specific problem that analyzes a communication situation in which two senders wish to communicate with two receivers, that is, the interference channel.
At the same time, we want to allow cooperation between transmitters and investigate the gains from transmitter cooperation, i.e., channel models that allow the transmitters to partially cooperate. So we introduce limited transmitter cooperation into the interference channel.
For two different scenarios in the interference channel with limited cooperation, we will show conditions under which we can determine the capacity region.
This is joint work with Prof. Roy Yates at WINLAB and Gerhard Kramer at Bell Labs.
2. Interference Channel
More specifically, the interference channel is given by two input alphabets X1 and X2, two output alphabets Y1 and Y2, and a conditional probability distribution p(y1, y2 | x1, x2).
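Since the results that follow rely on the channel being memoryless, here is a minimal LaTeX rendering of that assumption in the notation just introduced (the blocklength n and the per-letter factorization are standard assumptions, not taken from the slides):

```latex
% Discrete memoryless interference channel: the n-letter law factors
% into per-letter laws p(y_1, y_2 | x_1, x_2).
\[
  p\bigl(y_1^n, y_2^n \,\big|\, x_1^n, x_2^n\bigr)
    = \prod_{i=1}^{n} p\bigl(y_{1i}, y_{2i} \,\big|\, x_{1i}, x_{2i}\bigr)
\]
```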
3. Strong Interference
The capacity region is known if there is strong interference. In a strong interference channel, the mutual information across the channel, from each sender to the non-intended receiver, is the larger one. In the Gaussian channel, the channel gain on the interference path is at least as strong as the channel gain on the desired path.
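For concreteness, a sketch of the standard strong-interference conditions, in the form usually attributed to Costa and El Gamal (the slide shows the conditions as an image, so the exact statement there may differ):

```latex
% Strong interference: each cross link is at least as informative as the
% corresponding direct link, for all product inputs p(x_1)p(x_2).
\[
  I(X_1; Y_1 \mid X_2) \le I(X_1; Y_2 \mid X_2), \qquad
  I(X_2; Y_2 \mid X_1) \le I(X_2; Y_1 \mid X_1)
\]
```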
4. Compound MAC
5. Cooperation in the Interference Channel
6. Full Cooperation: MIMO Broadcast Channel
Dirty paper coding (DPC) is optimal [Weingarten, Steinberg & Shamai, 04], [Caire & Shamai, 01], [Viswanath, Jindal & Goldsmith, 02]
Several cooperation strategies proposed [Host-Madsen, Jindal, Mitra & Goldsmith, 03], [Ng & Goldsmith, 04]
[Jindal, Mitra & Goldsmith], [Ng & Goldsmith]:
Gain from DPC when sources close together
When apart, relaying outperforms DPC
Assumptions:
Dedicated orthogonal cooperation channels
Total power constraint
Transmitter Cooperation for Gaussian Channels: how should nodes cooperate?
Transmitter cooperation in the Gaussian channel with two senders and two receivers has been considered by several authors, and different cooperative coding strategies have been proposed. Host-Madsen and Ng & Goldsmith demonstrated that the performance differs depending on the relative network geometry; the real difference is whether or not the sources are clustered together. Under full cooperation, this channel becomes a MIMO broadcast channel, for which DPC is optimal.
7. Transmitter Cooperation in the Discrete Memoryless Channel
A simple model for cooperation in the discrete memoryless channel (DMC) was proposed by Willems for the MAC: cooperation through links with finite capacities.
We want to model cooperation between sources for the discrete channel. Here is the simple model for cooperation proposed by Willems:
8. Cooperation through a Conference
h is a set of communicating functions.
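A sketch of how those communicating functions operate in Willems' conferencing model (the round structure and the symbols V are standard notation assumed here, not taken from the slides): over K rounds, each encoder sends a symbol that depends on its own message and on everything heard so far, and the total amount exchanged in each direction is limited by the conference capacities C12 and C21:

```latex
\[
  V_{1,k} = h_{1,k}\bigl(W_1, V_2^{\,k-1}\bigr), \qquad
  V_{2,k} = h_{2,k}\bigl(W_2, V_1^{\,k-1}\bigr), \qquad k = 1, \dots, K,
\]
\[
  \sum_{k=1}^{K} \log \lvert \mathcal{V}_{1,k} \rvert \le n\,C_{12}, \qquad
  \sum_{k=1}^{K} \log \lvert \mathcal{V}_{2,k} \rvert \le n\,C_{21}.
\]
```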
10. Transmitter Cooperation
11. Compound MAC with Common Information
12. The Compound MAC with Common Information
Encoding
Decoding
The error probability
(R0, R1, R2) is achievable if, for any ε > 0, there is an (M0, M1, M2, N, Pe) code such that Mt ≥ 2^(N Rt), t = 0, 1, 2, and Pe ≤ ε
The capacity region is the closure of the set of all achievable (R0, R1, R2).
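A minimal sketch of what such a code consists of, assuming the standard definitions (the function names f and g are illustrative, not from the slides): each encoder maps the common message and its private message to a codeword, and each receiver decodes all three message indices:

```latex
\[
  f_t : \mathcal{W}_0 \times \mathcal{W}_t \to \mathcal{X}_t^{N}, \qquad
  g_t : \mathcal{Y}_t^{N} \to \mathcal{W}_0 \times \mathcal{W}_1 \times \mathcal{W}_2,
  \qquad t = 1, 2,
\]
```
with |Wk| = Mk for k = 0, 1, 2, and Pe the larger of the two receivers' error probabilities.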
Easy to determine given the result by [Slepian & Wolf, 1973]. After the conference, there is a part of both W1 and W2 that is known to both encoders: we refer to it as a common message; it contains the cell indices exchanged during the conference.
13. The MAC with Common Information
14. The MAC with Common Information
15. The Compound MAC with Common Information
16. The Compound MAC with Common Information
The capacity region is a union over distributions of the form p(u) p(x1|u) p(x2|u) p(y1,y2|x1,x2). For each such p, the set of achievable (R0, R1, R2) is the intersection of the rate regions R_MACt achieved in the two MACs with common information.
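A sketch of the per-receiver rate region, using the standard MAC-with-common-information bounds of [Slepian & Wolf, 1973] (the slides give these constraints as an image, so the exact notation may differ):

```latex
% Region R_MACt(p) for receiver t and a fixed p = p(u)p(x_1|u)p(x_2|u):
\[
\begin{aligned}
  R_1 &\le I(X_1; Y_t \mid X_2, U), \\
  R_2 &\le I(X_2; Y_t \mid X_1, U), \\
  R_1 + R_2 &\le I(X_1, X_2; Y_t \mid U), \\
  R_0 + R_1 + R_2 &\le I(X_1, X_2; Y_t).
\end{aligned}
\]
```
The compound region for a given p is the intersection over t = 1, 2, and the capacity region is the union of these intersections over all p.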
17. Converse
Error probability in MACt: Pet
Error probability in the CMAC: Pe
A code for the compound channel must be reliable at both receivers, so driving Pe to zero forces Pe1 and Pe2 to zero.
⇒ Necessary condition for reliability: the rates must be achievable in each MACt.
⇒ Rates confined to R_MAC1(p) and R_MAC2(p) for every p.
18. Achievability
From the Slepian and Wolf result, choosing the rates (R0, R1, R2) in the intersection of R_MAC1(p) and R_MAC2(p) will guarantee that Pe1 and Pe2 can be made arbitrarily small.
⇒ Pe will be arbitrarily small, since by the union bound Pe ≤ Pe1 + Pe2.
19. Implications
We can use this result to determine the capacity region of several channels with partial transmitter cooperation:
The Strong Interference Channel with Common Information
The Strong Interference Channel with Unidirectional Cooperation
The Compound MAC with Conferencing Encoders
20. After the Conference
After the conference, there is a part of both W1 and W2 that is known to both encoders: we refer to it as a common message; it contains the cell indices exchanged during the conference.
21. Theorem
For the interference channel with common information satisfying the strong interference conditions for all product input distributions, the capacity region C is that of the compound MAC with common information.
22. Proof
Achievability:
Follows directly from the achievability for the compound MAC with common information.
The decoding constraint is relaxed: each receiver needs to decode only its own message.
Converse:
Uses Fano's inequality (see the sketch below).
In the interference channel with no cooperation, the outer bounds rely on the independence of X1 and X2.
Due to cooperation, the codewords are not independent.
The theorem conditions are obtained from the converse.
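As a reminder of the standard Fano step (a sketch in standard notation; εn and the blocklength n are assumptions, not from the slides):

```latex
% Reliability at receiver 1 forces H(W_1 | Y_1^n) <= n*eps_n, eps_n -> 0:
\[
  nR_1 = H(W_1)
       = I(W_1; Y_1^n) + H(W_1 \mid Y_1^n)
       \le I(W_1; Y_1^n) + n\epsilon_n .
\]
```
The work in the converse is then to single-letterize I(W1; Y1^n) without being able to assume that X1 and X2 are independent.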
23. Relationship to the Strong Interference Conditions
24. Interference Channel with Unidirectional Cooperation
Encoding functions
Decoding functions
The general theme of this work is cooperation in the interference channel; in this talk I focus on the case of unidirectional cooperation. It is assumed that the message W1 at sender 1 is also known to the other encoder, but not vice versa.
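A minimal sketch of the encoding and decoding functions under this assumption (the function names are illustrative, not from the slides): encoder 2 sees both messages, encoder 1 only its own, and each receiver decodes only its own message:

```latex
\[
  X_1^n = f_1(W_1), \qquad X_2^n = f_2(W_1, W_2), \qquad
  \hat{W}_t = g_t\bigl(Y_t^n\bigr), \quad t = 1, 2.
\]
```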
25. Cognitive Radio Settings
Cognitive Radio Channel [Devroye, Mitran & Tarokh, 2005]
An achievable rate region
Consider a simple two-transmitter, two-receiver network:
Assume that one transmitter is cognitive
It can overhear the transmission of the primary user
It partially obtains the primary user's message ⇒ it can cooperate
At this point, it is unclear what exactly the transmitter cooperation is.
26. Interference Channel with Unidirectional Cooperation
The Interference Channel with Unidirectional Cooperation [Maric, Yates & Kramer, 2005]
Capacity in very strong interference
The Interference Channel with Degraded Message Set [Wu, Vishwanath & Arapostathis, 2006]
Capacity for weak interference and for Gaussian channel in weak interference
Cognitive Radio Channel [Jovicic & Viswanath, 2006]
Capacity for Gaussian channel in weak interference
The Interference Channel with a Cognitive Transmitter [Maric, Goldsmith, Kramer & Shamai, 2006]
New outer bounds and an achievable region
27. Theorem
For the interference channel with unidirectional cooperation satisfying the strong interference conditions for all joint input distributions p(x1, x2), the capacity region C is:
Capacity region = capacity region of the compound MAC with common information.
28. Achievability: Compound MAC with Common Information
29. Converse
Uses Fano's inequality.
In the interference channel with no cooperation, the outer bounds rely on the independence of X1 and X2.
Due to cooperation, the codewords are not independent.
The theorem conditions are obtained from the converse.
30. Converse (Continued)
Lemma: If the per-letter conditions are satisfied for all p(x1, x2), then the corresponding n-letter conditions hold as well.
31. Converse (Continued)
Recall that the achievable rates are those given above.
By assumption, the conditions hold for all p(x1, x2).
The rates, taken as a union over all p(x1, x2), are thus achievable and also form an outer bound.
32. Gaussian Channel
Channel outputs:
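The slide gives the outputs as an image; a standard form of the two-sender, two-receiver Gaussian channel, with the gains and noise variances as assumed placeholders, is:

```latex
\[
  Y_1 = h_{11} X_1 + h_{21} X_2 + Z_1, \qquad
  Y_2 = h_{12} X_1 + h_{22} X_2 + Z_2,
\]
```
with Zt ~ N(0, Nt) and power constraints E[Xt^2] ≤ Pt, t = 1, 2.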
33. Capacity-Achieving Scheme in Strong Interference
Encoder 1: codebook x1(w1)
34. Gaussian Channel: Strong Interference Conditions
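Since the conditions themselves appear as an image, here is a small numeric sketch of how they would be checked and used for a normalized channel Y1 = X1 + a·X2 + Z1, Y2 = b·X1 + X2 + Z2 with unit noise power; the gains and powers are made-up illustration values, and the rate bounds are evaluated for independent inputs:

```python
import numpy as np

def c(snr: float) -> float:
    """Gaussian capacity function C(x) = 0.5*log2(1+x), in bits per channel use."""
    return 0.5 * np.log2(1 + snr)

# Hypothetical normalized channel (illustration only):
#   Y1 = X1 + a*X2 + Z1,  Y2 = b*X1 + X2 + Z2,  Z1, Z2 ~ N(0, 1)
a, b = 1.5, 2.0        # cross gains (assumed values)
P1, P2 = 10.0, 10.0    # transmit powers (assumed values)

# Strong interference in the Gaussian case: each cross gain is at least
# as strong as the (normalized) direct gain.
print("strong interference:", abs(a) >= 1.0 and abs(b) >= 1.0)

# In strong interference both receivers can decode both messages, so the
# region is the intersection of two MAC regions; the binding sum rate is:
sum_rate_rx1 = c(P1 + a**2 * P2)   # MAC seen at receiver 1
sum_rate_rx2 = c(b**2 * P1 + P2)   # MAC seen at receiver 2
print("sum-rate bound:", min(sum_rate_rx1, sum_rate_rx2), "bits/use")
```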
35. Gaussian Channel with Unidirectional Cooperation
36. Gaussian Channel with Conferencing
Channel outputs:
38. Sender t power, t = 1, 2:
Pc: the conference power
39. Full Cooperation in the Gaussian Channel
40. Interference Channel with Conferencing
Relax the constraint ⇒ each user decodes only one message:
Strong interference conditions?
After the conference
41. Interference Channel with Conferencing
42. Discussion
Introduced cooperation into the interference channel:
Capacity results for scenarios of strong interference
If the interference is strong: the decoders can decode the interference
No partial decoding at the receivers ⇒ an easier problem
Ongoing work:
Channel models that incorporate node cooperation
Capture characteristics of cognitive radio networks
Capacity results and cooperation gains for more general settings
43. Ongoing Work
Determining the fundamental limits of wireless networks requires understanding the optimal ways to cooperate:
Hierarchy of cooperation
Encoding schemes
Optimum resource allocation
How to cope with interference
46. Interference Channel
47. Interference Channel with Confidential Messages
Joint work with Ruoheng Liu, Predrag Spasojevic, and Roy D. Yates
Developed inner and outer bounds
48. Initially, the decoding constraint:
49. Theorem
The compound MAC capacity region C(C12, C21) is a union over distributions p(u) p(x1|u) p(x2|u) p(y1,y2|x1,x2), denoted p.
For each p: the intersection of the rate regions of two MACs with partially cooperating encoders.
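A sketch of the per-receiver constraints, following Willems' region for the MAC with conferencing encoders, adapted to two receivers (the slides give the region as an image, so this may differ in details):

```latex
% For receiver t and a fixed p, the conference adds C12 and C21 to the
% private-rate and sum-rate constraints:
\[
\begin{aligned}
  R_1 &\le I(X_1; Y_t \mid X_2, U) + C_{12}, \\
  R_2 &\le I(X_2; Y_t \mid X_1, U) + C_{21}, \\
  R_1 + R_2 &\le \min\bigl\{ I(X_1, X_2; Y_t \mid U) + C_{12} + C_{21},\;
                             I(X_1, X_2; Y_t) \bigr\}.
\end{aligned}
\]
```
C(C12, C21) is then the union over p of the intersection of these two per-receiver regions.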
50. Gaussian Channel with Conferencing
Union over all