$$
\hat{\beta}_1 = \left( \sum_{i=1}^{n} \hat{r}_{i1}\, y_i \right) \bigg/ \left( \sum_{i=1}^{n} \hat{r}_{i1}^{\,2} \right), \tag{3.22}
$$
where the $\hat{r}_{i1}$ are the OLS residuals from a simple regression of $x_1$ on $x_2$, using the sample at hand. We regress our first independent variable, $x_1$, on our second independent variable, $x_2$, and then obtain the residuals ($y$ plays no role here). Equation (3.22) shows that we can then do a simple regression of $y$ on $\hat{r}_1$ to obtain $\hat{\beta}_1$. (Note that the residuals $\hat{r}_{i1}$ have a zero sample average, and so $\hat{\beta}_1$ is the usual slope estimate from simple regression.)
The representation in equation (3.22) gives another demonstration of $\hat{\beta}_1$'s partial effect interpretation. The residuals $\hat{r}_{i1}$ are the part of $x_{i1}$ that is uncorrelated with $x_{i2}$. Another way of saying this is that $\hat{r}_{i1}$ is $x_{i1}$ after the effects of $x_{i2}$ have been partialled out, or netted out. Thus, $\hat{\beta}_1$ measures the sample relationship between $y$ and $x_1$ after $x_2$ has been partialled out.
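As a concrete illustration (not from the text), the following minimal Python sketch verifies equation (3.22) on simulated data: the slope on $x_1$ from the multiple regression matches the slope from regressing $y$ on the partialling-out residuals. All variable names, coefficients, and data here are illustrative assumptions.

```python
# Minimal numerical sketch of the partialling-out result in equation (3.22),
# using simulated data (all values illustrative).
import numpy as np

rng = np.random.default_rng(0)
n = 500
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)          # x1 is correlated with x2
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

# Multiple regression of y on x1 and x2 (with an intercept)
X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]

# Step 1: regress x1 on x2 (with an intercept) and keep the residuals r1_hat
Z = np.column_stack([np.ones(n), x2])
r1_hat = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]

# Step 2: equation (3.22) -- the slope from regressing y on r1_hat
beta1_partial = (r1_hat @ y) / (r1_hat @ r1_hat)

print(beta_hat[1], beta1_partial)           # the two estimates coincide
```

Up to floating point rounding, the two printed numbers agree, which is exactly the content of equation (3.22).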
In simple regression analysis, there is no partialling out of other variables because no other variables are included in the regression. Problem 3.17 steps you through the partialling out process using the wage data from Example 3.2. For practical purposes, the important thing is that $\hat{\beta}_1$ in the equation $\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x_1 + \hat{\beta}_2 x_2$ measures the change in $y$ given a one-unit increase in $x_1$, holding $x_2$ fixed.
In the general model with $k$ explanatory variables, $\hat{\beta}_1$ can still be written as in equation (3.22), but the residuals $\hat{r}_{i1}$ come from the regression of $x_1$ on $x_2, \ldots, x_k$. Thus, $\hat{\beta}_1$ measures the effect of $x_1$ on $y$ after $x_2, \ldots, x_k$ have been partialled or netted out.
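The same check extends directly to the general case. A sketch with two controls, again with simulated and purely illustrative data: $\hat{r}_{i1}$ now comes from regressing $x_1$ on all of the other regressors.

```python
# Partialling out with k explanatory variables: r1_hat is the residual from
# regressing x1 on every other regressor (here x2 and x3). Simulated data.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x2, x3 = rng.normal(size=n), rng.normal(size=n)
x1 = 0.4 * x2 - 0.3 * x3 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 + x2 - x3 + rng.normal(size=n)

# Full multiple regression of y on x1, x2, x3
X = np.column_stack([np.ones(n), x1, x2, x3])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]

# Partial out x2 and x3 from x1, then apply equation (3.22)
Z = np.column_stack([np.ones(n), x2, x3])   # all regressors except x1
r1_hat = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
beta1_partial = (r1_hat @ y) / (r1_hat @ r1_hat)

print(beta_hat[1], beta1_partial)           # identical up to rounding
```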
Comparison of Simple and Multiple Regression Estimates
Two special cases exist in which the simple regression of $y$ on $x_1$ will produce the same OLS estimate on $x_1$ as the regression of $y$ on $x_1$ and $x_2$. To be more precise, write the simple regression of $y$ on $x_1$ as $\tilde{y} = \tilde{\beta}_0 + \tilde{\beta}_1 x_1$, and write the multiple regression as $\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x_1 + \hat{\beta}_2 x_2$. We know that the simple regression coefficient $\tilde{\beta}_1$ does not usually equal the multiple regression coefficient $\hat{\beta}_1$. There are two distinct cases where $\tilde{\beta}_1$ and $\hat{\beta}_1$ are identical:
1. The partial effect of $x_2$ on $y$ is zero in the sample. That is, $\hat{\beta}_2 = 0$.
2. $x_1$ and $x_2$ are uncorrelated in the sample.
The first assertion can be proven by looking at two of the equations used to determine $\hat{\beta}_0$, $\hat{\beta}_1$, and $\hat{\beta}_2$: $\sum_{i=1}^{n} x_{i1}(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_{i1} - \hat{\beta}_2 x_{i2}) = 0$ and $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}_1 - \hat{\beta}_2 \bar{x}_2$. Setting $\hat{\beta}_2 = 0$ gives the same intercept and slope as does the regression of $y$ on $x_1$.
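To make that last step explicit, setting $\hat{\beta}_2 = 0$ in the two equations above leaves

$$
\sum_{i=1}^{n} x_{i1}\left(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_{i1}\right) = 0 \quad\text{and}\quad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}_1,
$$

which are exactly the first order conditions that define the OLS intercept and slope in the simple regression of $y$ on $x_1$. Hence $\hat{\beta}_0 = \tilde{\beta}_0$ and $\hat{\beta}_1 = \tilde{\beta}_1$ in this case.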
The second assertion follows from equation (3.22). If $x_1$ and $x_2$ are uncorrelated in the sample, then regressing $x_1$ on $x_2$ results in no partialling out, and so the simple regression of $y$ on $x_1$ and the multiple regression of $y$ on $x_1$ and $x_2$ produce identical estimates on $x_1$.
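A quick numerical check of the second case, again with simulated, purely illustrative data. Here $x_1$ is constructed so that its sample correlation with $x_2$ is exactly zero, which makes the two slope estimates agree exactly rather than approximately.

```python
# Case 2: when x1 and x2 are uncorrelated in the sample, the simple and
# multiple regression slopes on x1 coincide. Simulated data; x1 is built
# to be exactly orthogonal to the intercept and to x2.
import numpy as np

rng = np.random.default_rng(2)
n = 400
x2 = rng.normal(size=n)
z = rng.normal(size=n)
Z = np.column_stack([np.ones(n), x2])
x1 = z - Z @ np.linalg.lstsq(Z, z, rcond=None)[0]   # sample corr(x1, x2) = 0
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

# Simple regression slope of y on x1
b1_simple = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)

# Multiple regression slope on x1
X = np.column_stack([np.ones(n), x1, x2])
b1_multiple = np.linalg.lstsq(X, y, rcond=None)[0][1]

print(b1_simple, b1_multiple)   # equal, since x1 and x2 are uncorrelated here
```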
Even though simple and multiple regression estimates are almost never identical, we can use the previous characterizations to explain why they might be either very different or quite similar. For example, if $\hat{\beta}_2$ is small, we might expect the simple and multiple regression estimates of $\beta_1$ to be similar.