2013 IEEE Information Theory Workshop (ITW) 2013
DOI: 10.1109/itw.2013.6691306

State-dependent Gaussian Z-channel with mismatched side-information and interference

Abstract: A state-dependent Gaussian Z-interference channel model is investigated in the regime of high state power, in which transmitters 1 and 2 communicate with receivers 1 and 2, and only receiver 2 is subject to interference from transmitter 1's signal and a random state sequence. The state sequence is known noncausally only to transmitter 1, not to the corresponding transmitter 2. A layered coding scheme is designed for transmitter 1 to help interference cancelation at receiver 2 (using a cognitive dirty paper coding) and to tra…

Cited by 16 publications (12 citation statements)
References 17 publications (21 reference statements)
“…Here, (22) comes from (18) and the facts that X i , i = 1, 2, is a function of (S i , M i ) and that {S 1 , S 2 , M 1 , M 2 } are mutually independent; (23) is due to the fact that conditioning reduces entropy; (24) follows from (18), the fact that conditioning reduces entropy, and X 2 = f 2 (S 2 , M 2 ); and finally (25) follows from the Cauchy-Schwarz inequality together with the fact that differential entropy is maximized by a Gaussian distribution for a fixed second moment. In addition, we can rewrite (22)-(25) as…”
Section: Outer Bound
confidence: 99%
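The two facts invoked in step (25) can be written out explicitly. As a sketch (the symbols are illustrative, since equations (22)–(25) are not reproduced in this excerpt):

```latex
% Cauchy--Schwarz inequality for second moments:
\mathbb{E}[X_1 X_2] \;\le\; \sqrt{\mathbb{E}[X_1^2]\,\mathbb{E}[X_2^2]}.

% Gaussian maximum-entropy bound: for any real random variable $X$
% with fixed second moment $\mathbb{E}[X^2]$,
h(X) \;\le\; \tfrac{1}{2}\log\!\bigl(2\pi e\,\mathbb{E}[X^2]\bigr),
\qquad \text{with equality iff } X \sim \mathcal{N}\bigl(0,\mathbb{E}[X^2]\bigr).
```

Together these let each differential-entropy term in the outer bound be upper-bounded by its Gaussian counterpart subject to the power constraints.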
“…Moreover, a state-dependent Gaussian ZIC with mismatched receiver-side state interference and transmitter-side state cognition has been studied in Ref. [22], where a state that corrupts only the received signal at receiver 2 is known non-causally only at transmitter 1, while the intended transmitter 2 is unaware of it; hence the problem investigated in the current work is completely different from the work done in Ref. [22].…”
confidence: 99%
“…Therefore, by using time sharing between the two corner points, given in (14) and (27), we can achieve the following rate region:…”
Section: Our Proposed Scheme
confidence: 99%
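The time-sharing step referenced here is the standard convexification argument. As a sketch (the corner points from (14) and (27) are not reproduced in this excerpt, so generic labels are used):

```latex
% If the corner points $(R_1^{(a)}, R_2^{(a)})$ and $(R_1^{(b)}, R_2^{(b)})$
% are both achievable, then time sharing achieves every convex combination:
\bigl(\lambda R_1^{(a)} + (1-\lambda) R_1^{(b)},\;
      \lambda R_2^{(a)} + (1-\lambda) R_2^{(b)}\bigr),
\qquad \lambda \in [0,1],
% obtained by using scheme $(a)$ for a fraction $\lambda$ of the block
% and scheme $(b)$ for the remaining fraction $1-\lambda$.
```

The achievable region is thus the convex hull of the two corner points together with the axes.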
“…For the very strong interference regime, as well as for the weak regime, the sum capacity is obtained under certain conditions on channel parameters [13]. In [14], a state-dependent Gaussian Z-interference channel model in the regime of high state power is investigated. By utilizing a layered coding scheme, inner and outer bounds on the capacity region are derived.…”
Section: Introduction
confidence: 99%