
theorem states that the first few elements are asymptotically independent
with common distribution $P^*$.
Example 11.6.2 As an example of the conditional limit theorem, let us
consider the case when $n$ fair dice are rolled. Suppose that the sum of the
outcomes exceeds $4n$. Then by the conditional limit theorem, the probability
that the first die shows a number $a \in \{1, 2, \ldots, 6\}$ is approximately
$P^*(a)$, where $P^*(a)$ is the distribution in $E$ that is closest to the
uniform distribution, where $E = \{P : \sum_a P(a)\, a \ge 4\}$. This is the
maximum entropy distribution given by
$$
P^*(x) = \frac{2^{\lambda x}}{\sum_{i=1}^{6} 2^{\lambda i}}, \qquad (11.173)
$$
with $\lambda$ chosen so that $\sum_i i\, P^*(i) = 4$ (see Chapter 12). Here
$P^*$ is the conditional distribution on the first (or any other) die.
Apparently, the first few dice inspected will behave as if they are drawn
independently according to an exponential distribution.
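The $\lambda$ in (11.173) has no closed form, but it can be found numerically: the tilted mean $\sum_i i\,2^{\lambda i}/\sum_i 2^{\lambda i}$ is monotonically increasing in $\lambda$, so a simple bisection suffices. The sketch below is an illustration, not part of the original text; the bisection bracket and tolerance are assumptions.

```python
def tilted_mean(lam):
    # Mean of the tilted distribution P*(i) proportional to 2**(lam * i), i = 1..6
    w = [2 ** (lam * i) for i in range(1, 7)]
    return sum(i * wi for i, wi in zip(range(1, 7), w)) / sum(w)

def solve_lambda(target=4.0, lo=0.0, hi=5.0):
    # Bisection on lam: tilted_mean(0) = 3.5 and tilted_mean increases with lam,
    # so the root of tilted_mean(lam) = target lies in [lo, hi].
    for _ in range(100):
        mid = (lo + hi) / 2
        if tilted_mean(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

lam = solve_lambda()
weights = [2 ** (lam * a) for a in range(1, 7)]
z = sum(weights)
p_star = [w / z for w in weights]  # P*(1), ..., P*(6) from (11.173)
```

Since $\lambda > 0$ here (the constrained mean 4 exceeds the unconstrained mean 3.5), the resulting $P^*$ puts more weight on the larger faces, consistent with conditioning on a large sum.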
11.7 HYPOTHESIS TESTING
One of the standard problems in statistics is to decide between two alter-
native explanations for the data observed. For example, in medical testing,
one may wish to test whether or not a new drug is effective. Similarly, a
sequence of coin tosses may reveal whether or not the coin is biased.
These problems are examples of the general hypothesis-testing problem.
In the simplest case, we have to decide between two i.i.d. distributions.
The general problem can be stated as follows:
Problem 11.7.1 Let $X_1, X_2, \ldots, X_n$ be i.i.d. $\sim Q(x)$. We consider
two hypotheses:

• $H_1$: $Q = P_1$.

• $H_2$: $Q = P_2$.
Consider the general decision function $g(x_1, x_2, \ldots, x_n)$, where
$g(x_1, x_2, \ldots, x_n) = 1$ means that $H_1$ is accepted and
$g(x_1, x_2, \ldots, x_n) = 2$ means that $H_2$ is accepted. Since the
function takes on only two values, the test can equivalently be specified by
the set $A$ over which $g(x_1, x_2, \ldots, x_n)$ is 1; on the complement of
this set, $g(x_1, x_2, \ldots, x_n)$ has the value 2. We define the two
probabilities of error:
$$
\alpha = \Pr\bigl(g(X_1, X_2, \ldots, X_n) = 2 \mid H_1 \text{ true}\bigr) = P_1^n(A^c) \qquad (11.174)
$$
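To make the error probability concrete, here is a small sketch (not from the text) that computes $\alpha = P_1^n(A^c)$ exactly for a coin-tossing instance of the problem: $P_1$ a fair coin, $P_2$ a coin with heads probability 0.7, and the acceptance region $A$ taken to be the sequences with fewer than 7 heads out of $n = 10$ tosses. All numeric choices are illustrative assumptions.

```python
from math import comb

def binom_pmf(k, n, p):
    # Probability of exactly k heads in n independent tosses with heads prob p
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

n = 10
p1, p2 = 0.5, 0.7    # H1: fair coin; H2: biased coin (illustrative)
threshold = 7        # accept H1 iff the sequence has fewer than 7 heads

# alpha = P1^n(A^c): probability, under H1, of 7 or more heads,
# i.e. of falling outside the acceptance region A
alpha = sum(binom_pmf(k, n, p1) for k in range(threshold, n + 1))
```

Since only the number of heads matters for this decision rule, summing the binomial tail under $P_1$ gives $\alpha$ without enumerating all $2^n$ sequences.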