Questions For Top Students
(i) If you do not intend to take more advanced courses (for example, HE3021, HE4021, or MH3510), then you can ignore the questions below.
(iii) If you are strong in mathematics and statistics and want to prepare for advanced econometrics topics, then try to solve the questions.
(iv) Knowing how to solve the questions here is not a necessary condition for getting an A in HE2005.
1. (Difficulty Level: 4; this was a final exam question I made for MAEC students in 2011)
Consider the linear regression model $y = X\beta + u$, where $y$ is an $n \times 1$ vector of observations, $X$ is an $n \times K$ matrix of nonstochastic explanatory variables such that $\operatorname{rank}(X) = K < n$, $\beta$ is a $K \times 1$ vector of unknown parameters, and $u$ is an $n \times 1$ vector of unobserved disturbances with $E(u) = 0$ and $E(uu') = \sigma^2 I$.
(a) Show that the OLS estimator is $\hat{\beta} = (X'X)^{-1}X'y$. Check the second-order condition.
(b) Show that the estimator is unbiased and that $\operatorname{Var}(\hat{\beta}) = \sigma^2 (X'X)^{-1}$.
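If you want a numerical sanity check on (a) and (b), the sketch below (my own illustration, not part of the question; the sample size, design matrix, and parameter values are arbitrary choices) computes $\hat{\beta}$ via the normal equations and verifies unbiasedness and the variance formula by Monte Carlo, holding the nonstochastic design fixed across replications.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative fixed (nonstochastic) design and true parameters.
n, K = 50, 3
X = rng.normal(size=(n, K))
beta = np.array([1.0, -2.0, 0.5])
sigma = 1.0

def ols(X, y):
    """OLS estimator beta_hat = (X'X)^{-1} X'y via the normal equations."""
    return np.linalg.solve(X.T @ X, X.T @ y)

# Monte Carlo: redraw u many times with E(u) = 0 and E(uu') = sigma^2 I,
# keeping X fixed, and look at the distribution of beta_hat.
reps = 20000
estimates = np.empty((reps, K))
for r in range(reps):
    u = rng.normal(scale=sigma, size=n)
    estimates[r] = ols(X, X @ beta + u)

print(estimates.mean(axis=0))       # close to beta (unbiasedness)
print(estimates[:, 0].var())        # close to sigma^2 * (X'X)^{-1}[0, 0]
print(sigma**2 * np.linalg.inv(X.T @ X)[0, 0])
```

The average of the estimates recovers $\beta$, and the simulated variance of each coordinate matches the corresponding diagonal entry of $\sigma^2(X'X)^{-1}$.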
2. This is based on Baltagi (1995). For the simple regression model without a constant, $y_i = \beta_1 x_i + u_i$, $i = 1, \ldots, n$, where $u_i \sim iid(0, \sigma^2)$ independent of $x_i$, consider the following three unbiased estimators of $\beta_1$:
$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2}, \qquad \tilde{\beta}_1 = \frac{\bar{y}}{\bar{x}}, \qquad \bar{\bar{\beta}}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2},$$
where $\bar{x} = \sum_{i=1}^{n} x_i / n$ and $\bar{y} = \sum_{i=1}^{n} y_i / n$.
(a) Show that $\dfrac{\operatorname{Var}(\hat{\beta}_1)}{\operatorname{Var}(\tilde{\beta}_1)} = \rho^2$, where $\rho$ is the correlation coefficient of $\hat{\beta}_1$ and $\tilde{\beta}_1$.
(b) Show that the optimal combination of $\hat{\beta}_1$ and $\tilde{\beta}_1$, given by $\hat{\beta} = \alpha \hat{\beta}_1 + (1 - \alpha)\tilde{\beta}_1$ where $-\infty < \alpha < \infty$, occurs at $\alpha^* = 1$. Optimality here refers to minimizing the variance. [Hint: read the paper by Samuel-Cahn (1994).]
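Before proving these results, you can make them plausible by simulation. The sketch below (illustrative values I chose; the regressor is kept away from zero so $\bar{y}/\bar{x}$ behaves well) computes all three estimators over many error draws, confirms that OLS has the smallest variance, and traces the variance of $\alpha\hat{\beta}_1 + (1-\alpha)\tilde{\beta}_1$ over a grid of $\alpha$, which bottoms out near $\alpha^* = 1$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative fixed regressor, true slope, and error standard deviation.
n, beta1, sigma = 30, 2.0, 1.0
x = np.linspace(0.5, 5.0, n)

def estimators(y):
    b_hat = (x @ y) / (x @ x)              # beta_hat_1: OLS without a constant
    b_tilde = y.mean() / x.mean()          # beta_tilde_1: ratio of means
    xd, yd = x - x.mean(), y - y.mean()
    b_bbar = (xd @ yd) / (xd @ xd)         # beta_doublebar_1: deviations form
    return b_hat, b_tilde, b_bbar

reps = 40000
draws = np.array([estimators(beta1 * x + rng.normal(scale=sigma, size=n))
                  for _ in range(reps)])
v_hat, v_tilde, v_bbar = draws.var(axis=0)
print(v_hat, v_tilde, v_bbar)              # OLS variance is the smallest

# Part (b): variance of alpha*b_hat + (1 - alpha)*b_tilde over a grid of alpha.
alphas = np.linspace(-1.0, 3.0, 81)
combo_var = np.array([(a * draws[:, 0] + (1 - a) * draws[:, 1]).var()
                      for a in alphas])
print(alphas[combo_var.argmin()])          # close to alpha* = 1
```

The flat minimum at $\alpha^* = 1$ reflects the key algebraic fact behind the proof: $\operatorname{Cov}(\hat{\beta}_1, \tilde{\beta}_1) = \operatorname{Var}(\hat{\beta}_1)$, so no combination improves on OLS.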
3. Consider a simple regression with no constant: $y_i = \beta_1 x_i + u_i$, $i = 1, \ldots, n$, where $u_i \sim iid\,N(0, \sigma^2)$ independent of $x_i$. Theil (1971) showed that among all linear estimators in $y_i$, the minimum mean square error estimator of $\beta_1$, i.e., that which minimizes $E(\tilde{\beta}_1 - \beta_1)^2$, is given by
$$\tilde{\beta}_1 = \beta_1^2 \sum_{i=1}^{n} x_i y_i \Big/ \left( \beta_1^2 \sum_{i=1}^{n} x_i^2 + \sigma^2 \right).$$
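Note that Theil's estimator is infeasible in practice, since it depends on the unknown $\beta_1$ and $\sigma^2$. Treating them as known, the short simulation below (my illustration; all values are arbitrary choices) shows what the result says: the estimator is a shrunken version of OLS, trading a little bias for a smaller mean squared error.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative values; beta1 and sigma are treated as known because
# Theil's minimum-MSE estimator depends on them.
n, beta1, sigma = 20, 1.0, 2.0
x = np.linspace(0.2, 2.0, n)
sxx = x @ x

def theil(y):
    # beta_tilde_1 = beta1^2 * sum(x*y) / (beta1^2 * sum(x^2) + sigma^2)
    return beta1**2 * (x @ y) / (beta1**2 * sxx + sigma**2)

def ols(y):
    return (x @ y) / sxx

reps = 50000
y_draws = beta1 * x + rng.normal(scale=sigma, size=(reps, n))
mse_theil = np.mean([(theil(y) - beta1) ** 2 for y in y_draws])
mse_ols = np.mean([(ols(y) - beta1) ** 2 for y in y_draws])
print(mse_theil, mse_ols)   # Theil's shrinkage estimator has the smaller MSE
```

Notice that $\tilde{\beta}_1 = c\,\hat{\beta}_1$ with shrinkage factor $c = \beta_1^2 \sum x_i^2 / (\beta_1^2 \sum x_i^2 + \sigma^2) < 1$, which is why its MSE, $c^2 \sigma^2/\sum x_i^2 + (1-c)^2 \beta_1^2$, falls below the OLS variance $\sigma^2/\sum x_i^2$.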