Let us note in this example that, since x_2 = x_1^2, matrix C is constrained in that its elements in the third column are the squared values of their corresponding elements in the second column. It needs to be cautioned that, for high-order polynomial regression models, constraints of this type may render matrix C^T C ill-conditioned and lead to matrix-inversion difficulties.
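The ill-conditioning of C^T C can be observed numerically. The following sketch (a hypothetical illustration, not taken from the text) builds the design matrix with columns 1, x, x^2, ..., x^p for increasing polynomial order p and prints the condition number of C^T C; the rapid growth signals the matrix-inversion difficulties cautioned above.

```python
import numpy as np

# Hypothetical illustration: condition number of C^T C for polynomial
# regression of order p. Column j of C holds x_i^j, j = 0, ..., p.
x = np.linspace(1.0, 10.0, 20)
for p in (2, 5, 8):
    C = np.vander(x, p + 1, increasing=True)  # columns x^0, x^1, ..., x^p
    cond = np.linalg.cond(C.T @ C)
    print(f"order {p}: cond(C^T C) = {cond:.3e}")
```

Even for moderate orders the condition number grows by many orders of magnitude, which is why centering or orthogonal polynomials are often preferred in practice.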
REFERENCE
Rao, C.R., 1965, Linear Statistical Inference and Its Applications, John Wiley & Sons Inc., New York.
FURTHER READING
Some additional useful references on regression analysis are given below.
Anderson, R.L., and Bancroft, T.A., 1952, Statistical Theory in Research, McGraw-Hill, New York.
Bendat, J.S., and Piersol, A.G., 1966, Measurement and Analysis of Random Data, John Wiley & Sons Inc., New York.
Draper, N., and Smith, H., 1966, Applied Regression Analysis, John Wiley & Sons Inc., New York.
Graybill, F.A., 1961, An Introduction to Linear Statistical Models, Volume 1, McGraw-Hill, New York.
PROBLEMS
11.1 A special case of simple linear regression is given by

Y = βx + E.

Determine:
(a) The least-square estimator B̂ for β;
(b) The mean and variance of B̂;
(c) An unbiased estimator for σ², the variance of Y.

11.2 In simple linear regression, show that the maximum likelihood estimators for α and β are identical to their least-square estimators when Y is normally distributed.

11.3 Determine the maximum likelihood estimator for the variance σ² of Y in simple linear regression, assuming that Y is normally distributed. Is it a biased estimator?

11.4 Since data quality is generally not uniform among data points, it is sometimes desirable to estimate the regression coefficients by minimizing the sum of weighted squared residuals; that is, α̂ and β̂ in simple linear regression are found by minimizing

Σ_{i=1}^{n} w_i e_i².
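To illustrate the weighted criterion of Problem 11.4, the sketch below (function and data names are illustrative, not from the text) computes the weighted least-square estimates via the standard closed form (X^T W X)^{-1} X^T W y, where W is the diagonal matrix of weights.

```python
import numpy as np

def weighted_ls(x, y, w):
    """Weighted least squares for simple linear regression.

    Minimizes sum_i w_i * e_i^2 and returns (alpha_hat, beta_hat).
    Names and closed-form route are an illustrative sketch.
    """
    W = np.diag(w)
    X = np.column_stack([np.ones_like(x), x])  # intercept and slope columns
    # Normal equations with weights: (X^T W X) b = X^T W y
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Noise-free data on the line y = 2 + 3x: the weighted fit recovers
# the true coefficients regardless of the weights chosen.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 + 3.0 * x
w = np.array([1.0, 2.0, 1.0, 0.5])
print(weighted_ls(x, y, w))
```

Unequal weights change the estimates only when the data contain scatter about the line; they downweight the residuals of the less reliable points.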