
Multicollinearity in Econometrics (Based on Gujarati)

Key Concepts of Multicollinearity (Gujarati):


1. Definition: Multicollinearity refers to a situation in which two or more
explanatory variables in a regression model are highly linearly related.

2. Perfect vs. Imperfect Multicollinearity:


- Perfect Multicollinearity: Exact linear relationship (e.g., X2 = 2X3).
- Imperfect Multicollinearity: High, but not perfect, correlation among
variables (e.g., r > 0.8).

Examples from Gujarati:


1. Perfect Multicollinearity:
Suppose you have the regression model
Y = β0 + β1X2 + β2X3 + u
with X3 = 2X2. Here, X2 and X3 are perfectly collinear, making it
impossible to estimate β1 and β2 separately (see the sketch after these
examples).

2. Imperfect Multicollinearity:
In real-world data, variables like income (X2) and education level
(X3) are highly correlated but not perfectly. This causes large standard
errors for the estimated coefficients, making it hard to determine their
statistical significance.
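
A minimal numerical sketch of the perfect-collinearity case above (in
Python; the data are simulated for illustration): when X3 = 2X2, the
cross-product matrix X'X is rank deficient, so the OLS normal equations
have no unique solution.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 50
    x2 = rng.normal(size=n)
    x3 = 2 * x2                                 # perfect collinearity: X3 = 2*X2
    X = np.column_stack([np.ones(n), x2, x3])   # design matrix with intercept

    XtX = X.T @ X
    print(np.linalg.matrix_rank(XtX))           # prints 2, not 3: X'X is singular
    # np.linalg.solve(XtX, X.T @ y) would raise LinAlgError for any y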

Detection of Multicollinearity:
1. High Pairwise Correlation:
Compute the correlation coefficient between each pair of explanatory
variables. If |r| > 0.8, multicollinearity may be a concern.

2. Variance Inflation Factor (VIF):


- Formula: VIF_j = 1 / (1 - R_j^2), where R_j^2 is the R^2 of
regressing X_j on all other predictors.
- VIF > 10 is a common rule of thumb for serious multicollinearity
(see the sketch below).
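
A short sketch of the VIF computation using statsmodels (the simulated
variables and their names are illustrative, not from Gujarati):

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    rng = np.random.default_rng(1)
    x2 = rng.normal(size=100)
    x3 = x2 + rng.normal(scale=0.1, size=100)   # highly, but not perfectly, correlated
    df = pd.DataFrame({"X2": x2, "X3": x3})

    exog = sm.add_constant(df)                  # the auxiliary VIF regressions need a constant
    for i, name in enumerate(exog.columns[1:], start=1):  # skip the constant itself
        print(name, variance_inflation_factor(exog.values, i))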

Consequences of Multicollinearity:
1. Coefficients remain unbiased, but their standard errors are inflated,
making hypothesis testing unreliable.

2. Difficulty in interpreting regression coefficients due to unstable
estimates.

3. Reduced model precision.

Remedies for Multicollinearity:


1. Drop one of the correlated variables if it is not critical to the
analysis.

2. Transform the variables, for example with Principal Component Analysis
(PCA) or by creating composite indices.

3. Collect more data to reduce correlation.

4. Use Ridge Regression or Lasso Regression for regularization (a brief
sketch follows).
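
As a sketch of remedy 4, ridge regression with scikit-learn on simulated,
nearly collinear data (the penalty alpha=1.0 is an arbitrary placeholder;
in practice it would be chosen by cross-validation):

    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(2)
    n = 200
    x2 = rng.normal(size=n)
    x3 = x2 + rng.normal(scale=0.05, size=n)    # nearly collinear regressors
    y = 1.0 + 2.0 * x2 + 3.0 * x3 + rng.normal(size=n)

    X = np.column_stack([x2, x3])
    model = Ridge(alpha=1.0)                    # L2 penalty shrinks unstable coefficients
    model.fit(X, y)
    print(model.intercept_, model.coef_)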

Example (Practical):
Imagine a regression model to explain house prices (Y):
Y = β0 + β1LotSize + β2NumberOfRooms + β3AreaSize + u
Here, LotSize and AreaSize are likely to be highly correlated. High
multicollinearity might lead to:
- Large p-values for coefficients.
- Misleading inference about the impact of LotSize or AreaSize.
Solution: Drop one variable (e.g., AreaSize), or compute VIFs to confirm
the issue before deciding (a simulated version of this example follows).
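
A simulated sketch of this house-price example (all coefficients and
scales are made-up illustration values): with LotSize and AreaSize nearly
collinear, the OLS standard errors on those two coefficients come out much
larger than they would with uncorrelated regressors.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 100
    lot_size = rng.normal(8000, 1000, size=n)
    area_size = 0.5 * lot_size + rng.normal(scale=50, size=n)  # nearly collinear with LotSize
    rooms = rng.integers(2, 8, size=n)
    price = (50_000 + 10 * lot_size + 8_000 * rooms + 5 * area_size
             + rng.normal(scale=5_000, size=n))

    X = sm.add_constant(np.column_stack([lot_size, rooms, area_size]))
    res = sm.OLS(price, X).fit()
    print(res.summary())  # note the large standard errors on LotSize and AreaSize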
