
378 Chapter 9
When we looked at the expectation and variance of independent observations of discrete random variables, we found that

E(X₁ + X₂ + ... + Xₙ) = nE(X)

and

Var(X₁ + X₂ + ... + Xₙ) = nVar(X)

As you'd expect, these same calculations work for continuous random variables too. This means that if X ~ N(μ, σ²), then

X₁ + X₂ + ... + Xₙ ~ N(nμ, nσ²)
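We can check this result with a quick simulation. The sketch below uses made-up parameters (μ = 5, σ = 2, n = 10) and verifies that the sum of n independent observations has mean close to nμ and variance close to nσ²:

```python
import random
import statistics

# Hypothetical parameters for illustration: X ~ N(mu, sigma^2)
mu, sigma, n = 5.0, 2.0, 10
random.seed(42)

# Draw 100,000 samples, each the sum of n independent observations of X.
totals = [
    sum(random.gauss(mu, sigma) for _ in range(n))
    for _ in range(100_000)
]

print(statistics.mean(totals))       # should be close to n*mu = 50
print(statistics.pvariance(totals))  # should be close to n*sigma^2 = 40
```

The sample mean and variance won't match exactly, but with 100,000 samples they land very close to the theoretical values.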
Q: So what's the difference between linear transforms and independent observations?
A: Linear transforms affect the underlying values in your probability distribution. As an example, if you have a piece of rope of a particular length, then applying a linear transform changes the length of that rope. Independent observations have to do with the quantity of things you're dealing with. As an example, if you have n independent observations of a piece of rope, then you're talking about n pieces of rope.

In general, if the quantity changes, you're dealing with independent observations. If the underlying values change, then you're dealing with a transform.
Q: Do I really have to know which is which? What difference does it make?
A: You have to know which is which because it makes a difference in your probability calculations. You calculate the expectation for linear transforms and independent observations in the same way, but there's a big difference in the way the variance is calculated. If you have n independent observations, then the variance is n times the original. If you transform your probability distribution as aX + b, then your variance becomes a² times the original.
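That difference is easy to see in a simulation. The sketch below (with illustrative parameters, not from the text) sets a = n, so the transform nX and the sum X₁ + ... + Xₙ have the same expectation, but very different variances:

```python
import random
import statistics

# Sketch comparing a linear transform with a = n against n independent
# observations; the parameters are made up for illustration.
mu, sigma, n = 3.0, 1.5, 4
random.seed(0)
size = 100_000

transformed = [n * random.gauss(mu, sigma) for _ in range(size)]                # nX
summed = [sum(random.gauss(mu, sigma) for _ in range(n)) for _ in range(size)]  # X1 + ... + Xn

# Same expectation either way: n*mu = 12.
print(statistics.mean(transformed), statistics.mean(summed))
# Very different variances: n^2*sigma^2 = 36 versus n*sigma^2 = 9.
print(statistics.pvariance(transformed), statistics.pvariance(summed))
```

Multiplying one observation by 4 scales the variance by 4² = 16, while summing 4 independent observations only scales it by 4.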
Q: Can I have both independent observations and linear transforms in the same probability distribution?
A: Yes you can. To work out the probability
distribution, just follow the basic rules for
calculating expectation and variance. You
use the same rules for both discrete and
continuous probability distributions.
If X ~ N(μₓ, σₓ²) and Y ~ N(μᵧ, σᵧ²), and X and Y are independent, then

X + Y ~ N(μₓ + μᵧ, σₓ² + σᵧ²)

X - Y ~ N(μₓ - μᵧ, σₓ² + σᵧ²)
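Note that the variances add in both cases, even for X - Y. A quick simulation with made-up parameters (X ~ N(2, 3²), Y ~ N(1, 4²)) confirms this:

```python
import random
import statistics

# Sketch with illustrative parameters: X ~ N(2, 3^2), Y ~ N(1, 4^2), independent.
random.seed(1)
size = 100_000
xs = [random.gauss(2, 3) for _ in range(size)]
ys = [random.gauss(1, 4) for _ in range(size)]

sums = [x + y for x, y in zip(xs, ys)]
diffs = [x - y for x, y in zip(xs, ys)]

print(statistics.mean(sums), statistics.pvariance(sums))    # near 2+1 = 3 and 9+16 = 25
print(statistics.mean(diffs), statistics.pvariance(diffs))  # near 2-1 = 1 and 9+16 = 25
```

The means behave as you'd guess (they add or subtract), but the variance of the difference is just as large as the variance of the sum.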
If X ~ N(μ, σ²) and a and b are numbers, then

aX + b ~ N(aμ + b, a²σ²)
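Here's the transform rule in simulation form, with illustrative values a = 3, b = -5, and X ~ N(10, 2²). Notice that b shifts the mean but has no effect on the variance:

```python
import random
import statistics

# Sketch: X ~ N(10, 2^2) transformed as aX + b, with a = 3 and b = -5
# (illustrative values, not from the text).
a, b, mu, sigma = 3, -5, 10.0, 2.0
random.seed(7)
transformed = [a * random.gauss(mu, sigma) + b for _ in range(100_000)]

print(statistics.mean(transformed))       # near a*mu + b = 25
print(statistics.pvariance(transformed))  # near a^2 * sigma^2 = 36
```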
If X₁, X₂, ..., Xₙ are independent observations of X where X ~ N(μ, σ²), then

X₁ + X₂ + ... + Xₙ ~ N(nμ, nσ²)