
92 ENTROPY RATES OF A STOCHASTIC PROCESS
4.7 Entropy rates of Markov chains
(a) Find the entropy rate of the two-state Markov chain with transition matrix
$$P = \begin{bmatrix} 1 - p_{01} & p_{01} \\ p_{10} & 1 - p_{10} \end{bmatrix}.$$
(b) What values of $p_{01}, p_{10}$ maximize the entropy rate?
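As a numeric sketch of parts (a) and (b): the entropy rate of the two-state chain is $H = \mu_0 H(p_{01}) + \mu_1 H(p_{10})$, where $\mu = (p_{10}, p_{01})/(p_{01} + p_{10})$ is the stationary distribution and $H(\cdot)$ is the binary entropy function. The function names below are my own, not from the text.

```python
from math import log2

def h(p):
    """Binary entropy in bits, with H(0) = H(1) = 0 by convention."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def entropy_rate(p01, p10):
    """Entropy rate of the chain with P = [[1-p01, p01], [p10, 1-p10]]:
    mu0 * H(p01) + mu1 * H(p10), with mu the stationary distribution."""
    mu0 = p10 / (p01 + p10)
    mu1 = p01 / (p01 + p10)
    return mu0 * h(p01) + mu1 * h(p10)

# A coarse grid search suggests the rate is maximized at p01 = p10 = 1/2,
# where the chain is i.i.d. Bernoulli(1/2) and the rate is 1 bit.
best = max((entropy_rate(a / 20, b / 20), a / 20, b / 20)
           for a in range(1, 20) for b in range(1, 20))
print(best)
```

The grid includes the point $(1/2, 1/2)$, so the search recovers the exact maximizer here.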
(c) Find the entropy rate of the two-state Markov chain with transition matrix
$$P = \begin{bmatrix} 1 - p & p \\ 1 & 0 \end{bmatrix}.$$
(d) Find the maximum value of the entropy rate of the Markov chain of part (c). We expect that the maximizing value of $p$ should be less than $\frac{1}{2}$, since the 0 state permits more information to be generated than the 1 state.
(e) Let $N(t)$ be the number of allowable state sequences of length $t$ for the Markov chain of part (c). Find $N(t)$ and calculate
$$H_0 = \lim_{t \to \infty} \frac{1}{t} \log N(t).$$
[Hint: Find a linear recurrence that expresses $N(t)$ in terms of $N(t-1)$ and $N(t-2)$. Why is $H_0$ an upper bound on the entropy rate of the Markov chain? Compare $H_0$ with the maximum entropy found in part (d).]
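A numeric check for parts (c)–(e), under the observation that the allowable sequences for the chain in (c) are exactly the binary strings with no two consecutive 1s (state 1 must be followed by state 0), so $N(t) = N(t-1) + N(t-2)$. The helper names are my own.

```python
from math import log2, sqrt

def n_seq(t):
    """N(t): binary strings of length t with no '11' substring.
    Satisfies N(t) = N(t-1) + N(t-2), with N(1) = 2 and N(2) = 3."""
    a, b = 2, 3
    if t == 1:
        return a
    for _ in range(t - 2):
        a, b = b, a + b
    return b

# (1/t) log2 N(t) converges to H0 = log2 of the golden ratio (~0.694 bits).
phi = (1 + sqrt(5)) / 2
h0 = log2(phi)
print(log2(n_seq(500)) / 500, h0)

# For comparison with part (d): the chain in (c) has stationary distribution
# (1/(1+p), p/(1+p)) and state 1 emits no information, so its entropy rate
# is H(p)/(1+p).  A grid search shows the maximum matches H0.
def rate(p):
    return (-p * log2(p) - (1 - p) * log2(1 - p)) / (1 + p)

best = max(rate(k / 10000) for k in range(1, 10000))
print(best)
```

The agreement of the maximum of $H(p)/(1+p)$ with $H_0$ is the comparison the hint asks for.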
4.8 Maximum entropy process. A discrete memoryless source has the alphabet $\{1, 2\}$, where the symbol 1 has duration 1 and the symbol 2 has duration 2. The probabilities of 1 and 2 are $p_1$ and $p_2$, respectively. Find the value of $p_1$ that maximizes the source entropy per unit time $H(\mathcal{X}) = H(X)/E[T]$, where $T$ is the symbol duration. What is the maximum value of $H(\mathcal{X})$?
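A numeric sketch for Problem 4.8: with $p_2 = 1 - p_1$, the expected duration is $E[T] = p_1 + 2(1 - p_1) = 2 - p_1$, so the quantity to maximize is $H(p_1)/(2 - p_1)$. The function name is my own.

```python
from math import log2, sqrt

def rate(p1):
    """Entropy per unit time H(p1) / E[T], with E[T] = 1*p1 + 2*(1 - p1)."""
    h = -p1 * log2(p1) - (1 - p1) * log2(1 - p1)
    return h / (2 - p1)

# Grid search: the maximum occurs near p1 = (sqrt(5) - 1)/2 ~ 0.618 (note
# that then p2 = p1**2), and the maximum rate is log2 of the golden ratio,
# ~0.694 bits per unit time -- the same value as H0 in Problem 4.7(e).
best_p, best_r = max(((k / 10000, rate(k / 10000)) for k in range(1, 10000)),
                     key=lambda t: t[1])
print(best_p, best_r)
```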
4.9 Initial conditions. Show, for a Markov chain, that
$$H(X_0 \mid X_n) \ge H(X_0 \mid X_{n-1}).$$
Thus, initial conditions $X_0$ become more difficult to recover as the future $X_n$ unfolds.
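The inequality can be checked numerically: the joint distribution of $(X_0, X_n)$ is $p(x_0)\,P^n(x_0, x_n)$, and $H(X_0 \mid X_n) = H(X_0, X_n) - H(X_n)$. The chain and initial distribution below are arbitrary choices for illustration, and the helper names are my own.

```python
from math import log2

def mat_mul(A, B):
    """Plain matrix product of two lists-of-lists."""
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def h_x0_given_xn(pi0, P, n):
    """H(X_0 | X_n) = H(X_0, X_n) - H(X_n) for the chain (pi0, P)."""
    m = len(P)
    Pn = [[float(i == j) for j in range(m)] for i in range(m)]  # identity
    for _ in range(n):
        Pn = mat_mul(Pn, P)
    joint = [[pi0[i] * Pn[i][j] for j in range(m)] for i in range(m)]
    p_xn = [sum(joint[i][j] for i in range(m)) for j in range(m)]
    h_joint = -sum(v * log2(v) for row in joint for v in row if v > 0)
    h_xn = -sum(v * log2(v) for v in p_xn if v > 0)
    return h_joint - h_xn

P = [[0.9, 0.1], [0.2, 0.8]]   # an arbitrary two-state chain
pi0 = [0.3, 0.7]               # an arbitrary initial distribution
vals = [h_x0_given_xn(pi0, P, n) for n in range(1, 12)]
# The sequence is nondecreasing: X_0 gets harder to recover over time.
print(all(vals[i] <= vals[i + 1] + 1e-12 for i in range(len(vals) - 1)))
```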
4.10 Pairwise independence. Let $X_1, X_2, \ldots, X_{n-1}$ be i.i.d. random variables taking values in $\{0, 1\}$, with $\Pr\{X_i = 1\} = \frac{1}{2}$. Let $X_n = 1$ if $\sum_{i=1}^{n-1} X_i$ is odd and $X_n = 0$ otherwise. Let $n \ge 3$.
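The pairwise independence in this setup can be verified by brute-force enumeration: over the $2^{n-1}$ equally likely outcomes, every pair $(X_i, X_j)$ should take each of its four possible values exactly a quarter of the time. A minimal sketch, using $n = 4$ for illustration:

```python
from itertools import product

n = 4  # any n >= 3 works; 4 keeps the enumeration small
count = {}  # counts of (i, j, value of X_i, value of X_j) over all outcomes
for xs in product((0, 1), repeat=n - 1):
    xn = sum(xs) % 2          # X_n = parity of X_1, ..., X_{n-1}
    seq = xs + (xn,)          # seq[i] holds X_{i+1}
    for i in range(n):
        for j in range(i + 1, n):
            key = (i, j, seq[i], seq[j])
            count[key] = count.get(key, 0) + 1

# Each pair of variables hits each of the four value combinations in exactly
# a quarter of the outcomes, so the variables are pairwise independent.
total = 2 ** (n - 1)
ok = all(count.get((i, j, a, b), 0) == total // 4
         for i in range(n) for j in range(i + 1, n)
         for a in (0, 1) for b in (0, 1))
print(ok)  # True
```

Of course the $n$ variables are not mutually independent, since $X_n$ is a deterministic function of the others.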