574 NETWORK INFORMATION THEORY
into $2^{nR_0}$ cells, with $S_i \cap S_j = \emptyset$, $i \neq j$, and $\bigcup_i S_i = \mathcal{W}$. The partition will
enable us to send side information to the receiver in the manner of Slepian
and Wolf [502].
Generation of random code: Fix $p(x_1)p(x|x_1)$.
First generate at random $2^{nR_0}$ i.i.d. $n$-sequences in $\mathcal{X}_1^n$, each
drawn according to $p(\mathbf{x}_1) = \prod_{i=1}^{n} p(x_{1i})$. Index them as $\mathbf{x}_1(s)$, $s \in
\{1, 2, \ldots, 2^{nR_0}\}$. For each $\mathbf{x}_1(s)$, generate $2^{nR}$ conditionally independent
$n$-sequences $\mathbf{x}(w|s)$, $w \in \{1, 2, \ldots, 2^{nR}\}$, drawn independently according
to $p(\mathbf{x}|\mathbf{x}_1(s)) = \prod_{i=1}^{n} p(x_i|x_{1i}(s))$. This defines the random codebook
$\mathcal{C} = \{\mathbf{x}(w|s), \mathbf{x}_1(s)\}$. The random partition $\mathcal{S} = \{S_1, S_2, \ldots, S_{2^{nR_0}}\}$ of
$\{1, 2, \ldots, 2^{nR}\}$ is defined as follows. Let each integer $w \in \{1, 2, \ldots, 2^{nR}\}$
be assigned independently, according to a uniform distribution over the
indices $s = 1, 2, \ldots, 2^{nR_0}$, to a cell $S_s$.
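The codebook and partition construction above can be sketched numerically. This is a minimal illustration only: the binary alphabet, the rates, and the distributions `p_x1` and `p_x_given_x1` are hypothetical toy choices, not from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy parameters (hypothetical, purely illustrative).
n = 8                   # block length
R, R0 = 0.5, 0.25       # rates R and R_0
M = 2 ** int(n * R)     # 2^{nR} message indices w
M0 = 2 ** int(n * R0)   # 2^{nR0} cells / relay codewords

p_x1 = np.array([0.5, 0.5])            # p(x1) over a binary alphabet
p_x_given_x1 = np.array([[0.8, 0.2],   # p(x | x1 = 0)
                         [0.3, 0.7]])  # p(x | x1 = 1)

# Relay codebook: 2^{nR0} i.i.d. sequences x1(s) ~ prod_i p(x_{1i}).
x1 = rng.choice(2, size=(M0, n), p=p_x1)

# For each x1(s), 2^{nR} sequences x(w|s) drawn ~ prod_i p(x_i | x_{1i}(s)).
x = np.empty((M0, M, n), dtype=int)
for s in range(M0):
    for i in range(n):
        x[s, :, i] = rng.choice(2, size=M, p=p_x_given_x1[x1[s, i]])

# Random partition: each w is assigned uniformly at random to a cell S_s.
cell_of = rng.integers(0, M0, size=M)
```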
Encoding: Let $w_i \in \{1, 2, \ldots, 2^{nR}\}$ be the new index to be sent in block
$i$, and let $s_i$ be defined as the index of the partition cell corresponding to $w_{i-1}$ (i.e., $w_{i-1} \in
S_{s_i}$). The encoder sends $\mathbf{x}(w_i|s_i)$. The relay has an estimate $\hat{\hat{w}}_{i-1}$ of the
previous index $w_{i-1}$. (This will be made precise in the decoding section.)
Assume that $\hat{\hat{w}}_{i-1} \in S_{\hat{\hat{s}}_i}$. The relay encoder sends $\mathbf{x}_1(\hat{\hat{s}}_i)$ in block $i$.
Decoding: We assume that at the end of block $i - 1$, the receiver
knows $(w_1, w_2, \ldots, w_{i-2})$ and $(s_1, s_2, \ldots, s_{i-1})$ and the relay knows
$(w_1, w_2, \ldots, w_{i-1})$ and, consequently, $(s_1, s_2, \ldots, s_i)$. The decoding procedures
at the end of block $i$ are as follows:
1. Knowing $s_i$ and upon receiving $\mathbf{y}_1(i)$, the relay receiver estimates
the message of the transmitter: $\hat{\hat{w}}_i = w$ if and only if there exists
a unique $w$ such that $(\mathbf{x}(w|s_i), \mathbf{x}_1(s_i), \mathbf{y}_1(i))$ are jointly $\epsilon$-typical.
Using Theorem 15.2.3, it can be shown that $\hat{\hat{w}}_i = w_i$ with an arbitrarily
small probability of error if
$$R < I(X; Y_1|X_1) \tag{15.253}$$
and $n$ is sufficiently large.
2. The receiver declares that $\hat{s}_i = s$ was sent iff there exists one and
only one $s$ such that $(\mathbf{x}_1(s), \mathbf{y}(i))$ are jointly $\epsilon$-typical. From Theorem
15.2.1 we know that $s_i$ can be decoded with arbitrarily small
probability of error if
$$R_0 < I(X_1; Y) \tag{15.254}$$
and $n$ is sufficiently large.
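Both decoding steps share the same "declare iff there is exactly one jointly typical candidate" rule, which can be sketched generically. The typicality predicates below are caller-supplied placeholders, not the $\epsilon$-typicality test of Theorems 15.2.1 and 15.2.3.

```python
def relay_decode(y1, s_i, x, x1, typical3):
    """Step 1 sketch: declare w-hat-hat_i = w iff exactly one w makes
    (x(w|s_i), x1(s_i), y1(i)) jointly typical; otherwise report failure
    (None). typical3 is a hypothetical 3-sequence typicality predicate."""
    candidates = [w for w in range(len(x[s_i]))
                  if typical3(x[s_i][w], x1[s_i], y1)]
    return candidates[0] if len(candidates) == 1 else None

def receiver_decode_s(y, x1, typical2):
    """Step 2 sketch: declare s-hat_i = s iff exactly one s makes
    (x1(s), y(i)) jointly typical; otherwise report failure (None)."""
    candidates = [s for s in range(len(x1))
                  if typical2(x1[s], y)]
    return candidates[0] if len(candidates) == 1 else None
```

Whenever zero or more than one candidate passes the typicality test, an error is declared; the rate conditions (15.253) and (15.254) are what drive the probability of these events to zero as $n \to \infty$.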