CH 03
1. Estimation
A "Partialling Out" Interpretation
In the two-regressor model $\hat y = \hat\beta_0 + \hat\beta_1 x_1 + \hat\beta_2 x_2$, the slope on $x_1$ can be written as
$\hat\beta_1 = \dfrac{\sum \hat r_{i1} y_i}{\sum \hat r_{i1}^2}$, where $\hat r_{i1}$ are
the residuals from the estimated regression $\hat x_1 = \hat\gamma_0 + \hat\gamma_2 x_2$
“Partialling Out” continued
The previous equation implies that regressing $y$ on
$x_1$ and $x_2$ gives the same estimated effect of $x_1$ as regressing $y$
on the residuals from a regression of $x_1$ on $x_2$.
This means that only the part of $x_{i1}$ that is
uncorrelated with $x_{i2}$ is being related to $y_i$, so
we're estimating the effect of $x_1$ on $y$ after $x_2$
has been "partialled out".
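As a numerical illustration (not part of the original slides; the simulated data, coefficient values, and variable names below are assumptions), this sketch checks that the multiple-regression coefficient on $x_1$ matches $\sum \hat r_{i1} y_i / \sum \hat r_{i1}^2$ computed from the residuals of $x_1$ on $x_2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)           # x1 is correlated with x2
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

# Coefficient on x1 from the multiple regression of y on (1, x1, x2)
X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]

# "Partialling out": residuals r1 from regressing x1 on (1, x2),
# then beta1_hat = sum(r1 * y) / sum(r1 ** 2)
Z = np.column_stack([np.ones(n), x2])
gamma_hat = np.linalg.lstsq(Z, x1, rcond=None)[0]
r1 = x1 - Z @ gamma_hat
beta1_partial = (r1 @ y) / (r1 @ r1)

print(beta_hat[1], beta1_partial)            # agree up to floating-point error
```

The two printed numbers coincide, which is the "partialling out" (Frisch-Waugh-Lovell) result described above.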
Compare the simple regression $\tilde y = \tilde\beta_0 + \tilde\beta_1 x_1$
with the multiple regression $\hat y = \hat\beta_0 + \hat\beta_1 x_1 + \hat\beta_2 x_2$.
Generally, $\tilde\beta_1 \neq \hat\beta_1$ unless:
$\hat\beta_2 = 0$ (i.e. no partial effect of $x_2$) OR
$x_1$ and $x_2$ are uncorrelated in the sample
R2 = SSE/SST = 1 – SSR/SST
$R^2$ is also equal to the squared correlation between the actual $y_i$ and the fitted $\hat y_i$:
$R^2 = \dfrac{\left[\sum (y_i - \bar y)(\hat y_i - \bar{\hat y})\right]^2}{\left[\sum (y_i - \bar y)^2\right]\left[\sum (\hat y_i - \bar{\hat y})^2\right]}$
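A quick check of the two $R^2$ expressions (illustrative simulated data, not from the slides): compute $1 - SSR/SST$ and the squared correlation between $y_i$ and $\hat y_i$ and confirm they agree.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 0.5 + 1.0 * x1 + 2.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
y_hat = X @ beta_hat

SST = np.sum((y - y.mean()) ** 2)            # total sum of squares
SSR = np.sum((y - y_hat) ** 2)               # residual sum of squares
r2_from_sums = 1 - SSR / SST

r2_from_corr = np.corrcoef(y, y_hat)[0, 1] ** 2

print(r2_from_sums, r2_from_corr)            # both give the same R-squared
```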
The simple regression slope is
$\tilde\beta_1 = \dfrac{\sum (x_{i1} - \bar x_1)\, y_i}{\sum (x_{i1} - \bar x_1)^2}$
Substituting the true model $y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + u_i$ into the numerator gives
$\tilde\beta_1 = \beta_1 + \beta_2 \dfrac{\sum (x_{i1} - \bar x_1)\, x_{i2}}{\sum (x_{i1} - \bar x_1)^2} + \dfrac{\sum (x_{i1} - \bar x_1)\, u_i}{\sum (x_{i1} - \bar x_1)^2}$
Since $E(u_i) = 0$, taking expectations yields
$E(\tilde\beta_1) = \beta_1 + \beta_2 \dfrac{\sum (x_{i1} - \bar x_1)\, x_{i2}}{\sum (x_{i1} - \bar x_1)^2}$
The ratio in the second term is $\tilde\delta_1$, the slope from regressing $x_2$ on $x_1$,
so $E(\tilde\beta_1) = \beta_1 + \beta_2 \tilde\delta_1$
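To make the bias formula concrete, the following sketch (an illustration under assumed parameter values, not part of the original notes) simulates the true two-regressor model many times with the regressors held fixed, omits $x_2$ from the estimated regression, and compares the average short-regression slope with $\beta_1 + \beta_2\tilde\delta_1$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
beta0, beta1, beta2 = 1.0, 2.0, -1.5         # assumed "true" parameters

x2 = rng.normal(size=n)
x1 = 0.8 * x2 + rng.normal(size=n)           # x1 and x2 are correlated
Z = np.column_stack([np.ones(n), x1])        # short-regression design (x2 omitted)

# delta1_tilde: slope from regressing x2 on x1
delta1 = np.linalg.lstsq(Z, x2, rcond=None)[0][1]

# Average the short-regression slope over many draws of u (x's held fixed)
slopes = []
for _ in range(2000):
    u = rng.normal(size=n)
    y = beta0 + beta1 * x1 + beta2 * x2 + u
    slopes.append(np.linalg.lstsq(Z, y, rcond=None)[0][1])

print(np.mean(slopes))                       # Monte Carlo average of beta1_tilde
print(beta1 + beta2 * delta1)                # omitted variable bias formula
```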
The sampling variance of the OLS slope estimators is
$Var(\hat\beta_j) = \dfrac{\sigma^2}{SST_j (1 - R_j^2)}$, where
$SST_j = \sum (x_{ij} - \bar x_j)^2$ and $R_j^2$ is the $R^2$ from regressing $x_j$ on all of the other $x$'s
Consider the misspecified model $\tilde y = \tilde\beta_0 + \tilde\beta_1 x_1$, so that $Var(\tilde\beta_1) = \dfrac{\sigma^2}{SST_1}$
Thus, $Var(\tilde\beta_1) < Var(\hat\beta_1)$ unless $x_1$ and
$x_2$ are uncorrelated in the sample, in which case the two variances are equal
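The sketch below (again with assumed simulated data, not from the slides) verifies the variance formula numerically: it computes $\hat\sigma^2 / [SST_1(1 - R_1^2)]$ directly and compares it with the corresponding diagonal element of $\hat\sigma^2 (X'X)^{-1}$, using $\hat\sigma^2 = SSR/(n - k - 1)$ in place of the unknown $\sigma^2$.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
k = X.shape[1] - 1                           # number of slope coefficients
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - k - 1)     # sigma^2 estimated with df = n - k - 1

# Var(beta1_hat) via sigma^2 / (SST_1 * (1 - R_1^2))
SST1 = np.sum((x1 - x1.mean()) ** 2)
Z = np.column_stack([np.ones(n), x2])
g = np.linalg.lstsq(Z, x1, rcond=None)[0]
R1_sq = 1 - np.sum((x1 - Z @ g) ** 2) / SST1  # R^2 from regressing x1 on x2
var_formula = sigma2_hat / (SST1 * (1 - R1_sq))

# Same quantity from the matrix form sigma^2 * (X'X)^{-1}
var_matrix = sigma2_hat * np.linalg.inv(X.T @ X)[1, 1]

print(var_formula, var_matrix)               # agree up to floating-point error
```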
df = n – (k + 1), or df = n – k – 1
df (i.e. degrees of freedom) is the (number
of observations) – (number of estimated
parameters)
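For example (illustrative numbers only): with $n = 100$ observations and $k = 3$ slope coefficients, the regression estimates $k + 1 = 4$ parameters, so $df = 100 - 3 - 1 = 96$.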
The Gauss-Markov Theorem
Given our 5 Gauss-Markov Assumptions, it
can be shown that OLS is "BLUE":
Best
Linear
Unbiased
Estimator
Thus, if the assumptions hold, use OLS