
Problem Set 5 – ECON 425

Chapter 8 Problems 1 and 3

ANSWERS
i) The OLS estimators remain unbiased and consistent when heteroscedasticity is present.
ii) When heteroscedasticity is present, the usual F test is no longer valid.
iii) The OLS estimators are no longer the best linear unbiased estimators (BLUE).
Hence, (ii) and (iii) are the two consequences of heteroscedasticity.
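A quick simulated illustration of (i) versus (iii), using made-up data that are not part of the assignment: even with heteroscedastic errors the OLS slope stays centered on the truth, although OLS is no longer the most efficient estimator.

# Toy simulation (illustration only): heteroscedasticity does not destroy unbiasedness
set.seed(1)
slopes <- replicate(1000, {
  x <- runif(100)
  y <- 1 + 2 * x + rnorm(100, sd = 2 * x)   # error variance depends on x
  coef(lm(y ~ x))[2]
})
mean(slopes)   # close to the true slope of 2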

- It is false because WLS is preferred to OLS when heteroscedasticity is present in the model.
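As a toy illustration of the point (made-up data and variable names, not part of the assignment): when the form of the heteroscedasticity is known, WLS simply reweights each observation by the inverse of its error variance.

# Toy example: Var(u|x) proportional to x, so weights = 1/x give WLS
set.seed(123)
x <- runif(500, 1, 5)
y <- 1 + 2 * x + rnorm(500, sd = sqrt(x))   # error variance grows with x
ols <- lm(y ~ x)                            # still unbiased, but not BLUE here
wls <- lm(y ~ x, weights = 1 / x)           # efficient under this variance form
summary(wls)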

Chapter 8 Computer Exercise 1 and 13 Part (i)

Answers:
i) The assumption in the model,
var(u | totwrk, educ, age, agesq, yngkid, male) = var(u | male),
says that the error variance depends only on gender. Writing var(u | male) = A0 + A1 male, the variance of the error term u is A0 for women (male = 0) and A0 + A1 for men (male = 1).
ii) Regressing the squared OLS residuals on male gives û² = 189359 - 28850 male, so the estimated error variance is higher for women than for men.
iii) The reported p-value on male is about 0.29, so the gender difference in the error variance is not statistically significant at the 5% level.

CODES
setwd("C:/users/sushi/onedrive/desktop/study materials/Rstudy")
rm(list=ls())
load('sleep75.rdata')
sqage=data$age^2
one.lm=lm("sleep~totwrk+educ+age+sqage+yngkid+male",data=data)
summary(one.lm)
sqreg=resid(one.lm)^2
two.lm=lm(sqreg~male, data)
summary(two.lm)
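The regression of the squared residuals on male (two.lm above) is the test used in the answer. An equivalent check, not part of the original submission, could use the Breusch-Pagan test from the lmtest package; its p-value may differ slightly from the one reported above because bptest uses a studentized version by default.

# Breusch-Pagan-type test of whether the error variance depends on male
library(lmtest)
bptest(one.lm, ~ male, data = data)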

Call:
lm(formula = " children~ age+ agesq + educ +electric+ urban", data = data)

Residuals:
Min 1Q Median 3Q Max
-5.9012 -0.7136 -0.0039 0.7119 7.4318

Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) -4.2225162 0.2401888 -17.580 < 2e-16 ***
age 0.3409255 0.0165082 20.652 < 2e-16 ***
agesq -0.0027412 0.0002718 -10.086 < 2e-16 ***
educ -0.0752323 0.0062966 -11.948 < 2e-16 ***
electric -0.3100404 0.0690045 -4.493 7.20e-06 ***
urban -0.2000339 0.0465062 -4.301 1.74e-05 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

coeftest(one.lm, vcov = vcovHC(one.lm, type="HC1"))

t test of coefficients:

Estimate Std. Error t value Pr(>|t|)
(Intercept) -4.22251623 0.24385099 -17.3160 < 2.2e-16 ***
age 0.34092552 0.01917466 17.7800 < 2.2e-16 ***
agesq -0.00274121 0.00035051 -7.8206 6.549e-15 ***
educ -0.07523232 0.00630771 -11.9270 < 2.2e-16 ***
electric -0.31004041 0.06394815 -4.8483 1.289e-06 ***
urban -0.20003386 0.04547093 -4.3992 1.113e-05 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

No, robust standard errors are not always larger than the non-robust ones. That can be seen in the output above: some of the robust standard errors are smaller than their conventional counterparts and some are larger.

CODES
setwd("C:/users/sushi/onedrive/desktop/study materials/Rstudy")
rm(list=ls())
load('fertil2.rdata')
library(sandwich)
library (lmtest)
one.lm=lm(" children~age+agesq+educ+electric+urban",data)
summary(one.lm)
coeftest(one.lm, vcov = vcovHC(one.lm, type="HC1"))
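To see the point about robust versus non-robust standard errors concretely, the two sets of standard errors can be placed side by side (a small sketch reusing one.lm from the code above):

# Conventional vs. HC1-robust standard errors, column by column
se.ols    <- sqrt(diag(vcov(one.lm)))
se.robust <- sqrt(diag(vcovHC(one.lm, type = "HC1")))
round(cbind(se.ols, se.robust), 5)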

Chapter 15 Computer Exercise 1

Answers:
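The setup code for this exercise is not shown in the write-up. A minimal sketch, assuming the WAGE2 data are saved as 'wage2.rdata' (following the pattern of the earlier exercises) and that ivreg() comes from the AER package:

setwd("C:/users/sushi/onedrive/desktop/study materials/Rstudy")
rm(list=ls())
load('wage2.rdata')    # assumed file name; loads a data frame called data
library(AER)           # assumed source of ivreg()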
i)

one.lm = lm(lwage ~ sibs, data)   # simple regression of log(wage) on number of siblings
summary(one.lm)
##
## Call:
## lm(formula = lwage ~ sibs, data = data)
##
## Residuals:
## Min 1Q Median 3Q Max
## -1.97662 -0.25857 0.02503 0.28572 1.22677
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 6.861076 0.022078 310.771 < 2e-16 ***
## sibs -0.027904 0.005908 -4.723 2.68e-06 ***
## ---

## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 0.4164 on 933 degrees of freedom
## Multiple R-squared: 0.02335, Adjusted R-squared: 0.0223
## F-statistic: 22.31 on 1 and 933 DF, p-value: 2.68e-06

We get a negative coefficient: each additional sibling is associated with roughly 2.8% lower wages. One channel for this is that having more siblings is correlated with fewer years of education.
ii)

two.lm = lm(educ ~ brthord, data, na.action=na.exclude)   # education on birth order
summary(two.lm)
##
## Call:
## lm(formula = educ ~ brthord, data = data, na.action = na.exclude)
##
## Residuals:
## Min 1Q Median 3Q Max
## -4.8668 -1.5842 -0.7362 2.1332 6.1117
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 14.14945 0.12868 109.962 < 2e-16 ***
## brthord -0.28264 0.04629 -6.106 1.55e-09 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 2.155 on 850 degrees of freedom
## (83 observations deleted due to missingness)
## Multiple R-squared: 0.04202, Adjusted R-squared: 0.04089
## F-statistic: 37.29 on 1 and 850 DF, p-value: 1.551e-09

There is a strong negative relationship: each one-unit increase in birth order is associated with about 0.28 fewer years of education, with a t statistic above 6 in absolute value.


iii)

three.lm = ivreg(lwage ~ educ | brthord, data = data, na.action=na.exclude)   # brthord instruments for educ
summary(three.lm)

##
## Call:
## ivreg(formula = lwage ~ educ | brthord, data = data, na.action = na.exclude)
##
## Residuals:
## Min 1Q Median 3Q Max
## -1.8532 -0.2557 0.0435 0.2970 1.3033
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 5.03040 0.43295 11.619 < 2e-16 ***
## educ 0.13064 0.03204 4.078 4.97e-05 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 0.4215 on 850 degrees of freedom
## Multiple R-Squared: -0.02862, Adjusted R-squared: -0.02983
## Wald test: 16.63 on 1 and 850 DF, p-value: 4.975e-05

We now estimate the return to a year of education to be about a 13% increase in wages. If we believe in our instrument, this is a consistent estimate. Because of missing data on brthord, we are using fewer observations than in the previous analyses.
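As a sanity check (not part of the original submission), in this just-identified case the IV slope can also be computed by hand as a ratio of sample covariances over the complete cases:

# Just-identified IV: slope on educ equals cov(brthord, lwage) / cov(brthord, educ)
cc <- complete.cases(data$lwage, data$educ, data$brthord)
with(data[cc, ], cov(brthord, lwage) / cov(brthord, educ))   # about 0.13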
iv)
That magical word “identification”!
In the reduced form equation

educ = π0 + π1 sibs + π2 brthord + v,

we need π2 ≠ 0 in order for β1 to be identified. We take the null to be H0: π2 = 0 and look to reject H0 at a small significance level.
four.lm = lm(educ ~ sibs + brthord, data, na.action=na.exclude)   # reduced form for educ
summary(four.lm)
##
## Call:
## lm(formula = educ ~ sibs + brthord, data = data, na.action = na.exclude)
##
## Residuals:

## Min 1Q Median 3Q Max
## -5.1438 -1.6854 -0.6852 2.0090 5.9950
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 14.29650 0.13329 107.260 < 2e-16 ***
## sibs -0.15287 0.03987 -3.834 0.000135 ***
## brthord -0.15267 0.05708 -2.675 0.007619 **
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 2.137 on 849 degrees of freedom
## (83 observations deleted due to missingness)
## Multiple R-squared: 0.05833, Adjusted R-squared: 0.05611
## F-statistic: 26.29 on 2 and 849 DF, p-value: 8.33e-12

The regression of educ on sibs and brthord (using 852 observations) yields π̂2 = -0.153 and se(π̂2) = 0.057. The t statistic is about -2.68, which rejects H0 fairly strongly. Therefore, the identification assumptions appear to hold.
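For part (v) below, the same IV estimate can be reproduced by hand as two-stage least squares, using the first-stage fitted values from four.lm (a sketch; the standard errors from this manual second stage are not the correct 2SLS standard errors):

# Manual 2SLS: replace educ with its first-stage fitted values in the wage equation
data$educ.hat <- predict(four.lm)                       # NA where brthord is missing
second.lm <- lm(lwage ~ educ.hat + sibs, data = data)   # coefficients match ivreg in part (v)
summary(second.lm)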
v)

five.lm = ivreg(lwage ~ educ + sibs | sibs + brthord, data = data, na.action=na.exclude)   # sibs exogenous; brthord instruments for educ
summary(five.lm)
##
## Call:
## ivreg(formula = lwage ~ educ + sibs | sibs + brthord, data = data,
## na.action = na.exclude)
##
## Residuals:
## Min 1Q Median 3Q Max
## -1.84808 -0.26227 0.03841 0.29901 1.30836
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 4.938527 1.055690 4.678 3.37e-06 ***

## educ 0.136994 0.074681 1.834 0.0669 .
## sibs 0.002111 0.017372 0.122 0.9033
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 0.427 on 849 degrees of freedom
## Multiple R-Squared: -0.05428, Adjusted R-squared: -0.05676
## Wald test: 10.9 on 2 and 849 DF, p-value: 2.124e-05

The standard error on β̂educ is much larger than the one we obtained in part (iii). The 95% CI for βeduc is roughly -0.010 to 0.284, which is very wide and includes the value zero. The standard error of β̂sibs is very large relative to the coefficient estimate, rendering sibs very insignificant.
vi)

data$pred.four <- predict(four.lm)                     # first-stage fitted values (NA where brthord is missing)
cor(data$pred.four, data$sibs, use = "complete.obs")
## [1] -0.9294818

Letting educ-hat_i denote the first-stage fitted values, the correlation between educ-hat_i and sibs_i is about -0.930, which is a very strong negative correlation. This means that, for the purposes of using IV, multicollinearity is a serious problem here and is not allowing us to estimate βeduc with much precision.
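The severity of the collinearity can also be read off directly: a correlation of about -0.93 means sibs explains roughly (-0.93)^2 ≈ 0.86 of the variation in the first-stage fitted values (a small sketch reusing pred.four from above):

# Share of variation in the first-stage fitted values explained by sibs
summary(lm(pred.four ~ sibs, data = data))$r.squared   # roughly 0.86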
