Econometrics – lecture 3 – multiple regression

KTEE 310 FINANCIAL ECONOMETRICS
MULTIPLE REGRESSION ANALYSIS: ESTIMATION
Chap 6 – S & W

Dr TU Thuy Anh
Faculty of International Economics


MULTIPLE REGRESSION WITH TWO EXPLANATORY VARIABLES:
EXAMPLE
Q = 1 + 2 L+ 3 K + u
The model has three dimensions, one each for Q, L, and K. The starting point for
investigating the determination of Q is the intercept, 1.

1
Q
K
L


MULTIPLE REGRESSION WITH TWO EXPLANATORY VARIABLES:
EXAMPLE
Q = 1 + 2L+ 3K + u

pure L
effect

1

1 + 2L



Q
K
L


MULTIPLE REGRESSION WITH TWO EXPLANATORY VARIABLES:
EXAMPLE
Q = 1 + 2 L+ 3 K + u

1 + 3 K

pure K effect

1
Q
K
L


MULTIPLE REGRESSION WITH TWO EXPLANATORY VARIABLES:
EXAMPLE
Q = 1 + 2 L+ 3 K + u

1 + 2L + 3K
1 + 3 K

pure K effect
Pure L
effect


1

combined effect of L
and K

1 + 2L

Q
K
L


MULTIPLE REGRESSION WITH TWO EXPLANATORY VARIABLES:
EXAMPLE
Q = 1 + 2 L+ 3 K + u

1 + 2L + 3K + u
u

1 + 3K

pure K effect
pure L
effect

1

1 + 2L + 3K
combined effect of L

and K

1 + 2L

Q
K
L
The final element of the model is the disturbance term, u. This causes the
actual values of Q to deviate from the plane. In this observation, u
happens to have a positive value.


MULTIPLE REGRESSION WITH TWO EXPLANATORY VARIABLES:
EXAMPLE

Yᵢ = β₁ + β₂X₂ᵢ + β₃X₃ᵢ + uᵢ
Ŷᵢ = b₁ + b₂X₂ᵢ + b₃X₃ᵢ
eᵢ = Yᵢ − Ŷᵢ = Yᵢ − b₁ − b₂X₂ᵢ − b₃X₃ᵢ
RSS = Σeᵢ² = Σ(Yᵢ − b₁ − b₂X₂ᵢ − b₃X₃ᵢ)²

The regression coefficients are derived using the same least squares principle
as in simple regression analysis. The fitted value of Y in observation i
depends on our choice of b₁, b₂, and b₃.
The residual eᵢ in observation i is the difference between the actual and fitted
values of Y.
We define RSS, the sum of the squares of the residuals, and choose b₁, b₂, and
b₃ so as to minimize it, using the first-order conditions.
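
As an illustrative sketch (not part of the original lecture), the first-order conditions lead to the normal equations (X'X)b = X'Y, which can be solved directly; the data and true parameter values below are simulated assumptions:

import numpy as np

# Minimal sketch of the least-squares principle with two regressors.
# Assumed true parameters beta1 = 5, beta2 = 2, beta3 = 0.5 (illustration only).
rng = np.random.default_rng(0)
n = 100
X2 = rng.uniform(1, 10, n)
X3 = rng.uniform(1, 10, n)
u = rng.normal(0, 1, n)
Y = 5 + 2.0 * X2 + 0.5 * X3 + u

# Design matrix with a column of ones for the intercept b1
X = np.column_stack([np.ones(n), X2, X3])

# First-order conditions of min RSS give the normal equations (X'X) b = X'Y
b = np.linalg.solve(X.T @ X, X.T @ Y)

residuals = Y - X @ b
RSS = residuals @ residuals               # minimized sum of squared residuals
print("b1, b2, b3:", b)
print("RSS:", RSS)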


MULTIPLE REGRESSION WITH TWO EXPLANATORY VARIABLES:

EXAMPLE
Model 3: OLS, using observations 1899-1922 (T = 24)
Dependent variable: q

              coefficient    std. error    t-ratio    p-value
  ------------------------------------------------------------
  const       -4,85518       14,5403       -0,3339    0,7418
  l            0,916609       0,149560      6,129     4,42e-06   ***
  k            0,158596       0,0416823     3,805     0,0010     ***

Mean dependent var   165,9167     S.D. dependent var    43,75318
Sum squared resid    2534,226     S.E. of regression    10,98533
R-squared            0,942443     Adjusted R-squared    0,936961
F(2, 21)             171,9278     P-value(F)            9,57e-14
Log-likelihood      -89,96960     Akaike criterion      185,9392
Schwarz criterion    189,4734     Hannan-Quinn          186,8768
rho                  0,098491     Durbin-Watson         1,535082
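
A regression like Model 3 could be reproduced, for example, with Python's statsmodels; the file name cobb_douglas.csv and its column names q, l, k below are assumptions for illustration (the original output above was produced in gretl):

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data file holding the 1899-1922 series in columns q, l, k
df = pd.read_csv("cobb_douglas.csv")

# Regress q on a constant, l and k, as in Model 3
results = smf.ols("q ~ l + k", data=df).fit()
print(results.summary())   # coefficients, std. errors, t-ratios, R-squared, etc.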


PROPERTIES OF THE MULTIPLE REGRESSION COEFFICIENTS

A.1: The model is linear in parameters and correctly specified:

Y = β₁ + β₂X₂ + ... + βₖXₖ + u

A.2: There does not exist an exact linear relationship among the regressors in the sample (no multicollinearity).
A.3: The disturbance term has zero expectation.
A.4: The disturbance term is homoskedastic.
A.5: The values of the disturbance term have independent distributions.
A.6: The disturbance term has a normal distribution.
Provided that the regression model assumptions are valid, the OLS estimators in the multiple regression model are unbiased and efficient, as in the simple regression model.
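
As a small illustrative sketch of the unbiasedness claim (simulated data, assumed parameter values): when the disturbances satisfy A.3–A.5, the OLS estimates average out to the true parameters across repeated samples.

import numpy as np

rng = np.random.default_rng(1)
n, reps = 50, 2000
beta = np.array([1.0, 2.0, -0.5])              # assumed true parameters
# Regressors held fixed across replications
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n), rng.uniform(0, 10, n)])

estimates = np.empty((reps, len(beta)))
for r in range(reps):
    u = rng.normal(0, 1, n)                    # zero-mean, homoskedastic, independent
    Y = X @ beta + u
    estimates[r] = np.linalg.solve(X.T @ X, X.T @ Y)

print(estimates.mean(axis=0))                  # close to [1.0, 2.0, -0.5]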



MULTICOLLINEARITY
• Example: X₁ − 2X₂ = 0
• Definition: X₁ and X₂ are perfectly multicollinear if there exist b₁, b₂ such that:
  • b₁X₁ + b₂X₂ = a (a: constant)
  • at least one of the bᵢ is non-zero
• X₁ and X₂ are perfectly multicollinear iff r(X₁, X₂) = ±1
• X₁ and X₂ are highly multicollinear if |r(X₁, X₂)| is close to 1
• Equivalently, X₁ and X₂ are highly multicollinear if the R² of the regression of X₁ on a constant and X₂ is large (see the sketch below)
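
A quick numerical illustration of these pairwise definitions (simulated data, assumed coefficients):

import numpy as np

rng = np.random.default_rng(2)
X2 = rng.normal(size=200)
X1_perfect = 2 * X2                                   # X1 - 2*X2 = 0: perfect multicollinearity
X1_high = 2 * X2 + rng.normal(scale=0.1, size=200)    # high but not perfect

print(np.corrcoef(X1_perfect, X2)[0, 1])              # equals 1 (up to rounding)
print(np.corrcoef(X1_high, X2)[0, 1])                 # close to 1

# R-squared from regressing X1 on a constant and X2 (equals r**2 with one regressor)
Z = np.column_stack([np.ones(200), X2])
b = np.linalg.lstsq(Z, X1_high, rcond=None)[0]
resid = X1_high - Z @ b
r2 = 1 - resid @ resid / np.sum((X1_high - X1_high.mean()) ** 2)
print(r2)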



MULTICOLLINEARITY
• Definition: X₁, ..., Xₖ are perfectly multicollinear if there exist b₁, ..., bₖ such that:
  • b₁X₁ + ... + bₖXₖ = a (a: constant)
  • at least one of the bᵢ is non-zero
• X₁, ..., Xₖ are highly multicollinear if the R² of the regression of Xⱼ on the other regressors and a constant is large (a sketch follows below)
• Assumption A.2: no perfect multicollinearity among X₂, ..., Xₖ
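
A sketch of this diagnostic using statsmodels' variance inflation factor, where VIFⱼ = 1 / (1 − R²ⱼ) and R²ⱼ comes from regressing Xⱼ on the other regressors and a constant; the data and variable names are simulated assumptions:

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
n = 200
x2 = rng.normal(size=n)
x3 = 0.9 * x2 + rng.normal(scale=0.2, size=n)    # highly collinear with x2
x4 = rng.normal(size=n)
X = sm.add_constant(pd.DataFrame({"x2": x2, "x3": x3, "x4": x4}))

for j, name in enumerate(X.columns):
    if name == "const":
        continue
    vif = variance_inflation_factor(X.values, j)
    r2_aux = 1 - 1 / vif                         # R-squared of the auxiliary regression
    print(f"{name}: auxiliary R2 = {r2_aux:.3f}, VIF = {vif:.2f}")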





MULTICOLLINEARITY

What would happen if you tried to run a regression when there is an
exact linear relationship among the explanatory variables? The
coefficients are not defined.

Q = β₁ + β₂L + β₃K + β₄K² + u
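
A numerical sketch (simulated data, separate from the model above) of why the coefficients are not defined under an exact linear relationship: here X₃ = 2·X₂ exactly, so X'X is rank deficient and infinitely many coefficient vectors produce identical fitted values.

import numpy as np

rng = np.random.default_rng(4)
n = 30
X2 = rng.uniform(1, 10, n)
X3 = 2 * X2                                   # exact linear relationship with X2
Y = 1 + 0.5 * X2 + rng.normal(size=n)
X = np.column_stack([np.ones(n), X2, X3])

print(np.linalg.matrix_rank(X.T @ X))         # 2, not 3: the normal equations are singular

b, *_ = np.linalg.lstsq(X, Y, rcond=None)     # one of infinitely many "solutions"
b_alt = b + np.array([0.0, 2.0, -1.0])        # shift along the null space (2*X2 - X3 = 0)
print(np.allclose(X @ b, X @ b_alt))          # True: same fitted values, different coefficients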


MULTICOLLINEARITY
Model 4: OLS, using observations 1899-1922 (T = 24)
Dependent variable: q

              coefficient     std. error     t-ratio    p-value
  --------------------------------------------------------------
  const       -10,7774        16,4164        -0,6565    0,5190
  l             0,822744       0,190860       4,311     0,0003    ***
  k             0,312205       0,195927       1,593     0,1267
  sq_k         -0,000249224    0,000310481   -0,8027    0,4316

Mean dependent var   165,9167     S.D. dependent var    43,75318
Sum squared resid    2455,130     S.E. of regression    11,07955
R-squared            0,944239     Adjusted R-squared    0,935875
F(3, 20)             112,8920     P-value(F)            1,05e-12
Log-likelihood      -89,58910     Akaike criterion      187,1782
Schwarz criterion    191,8904     Hannan-Quinn          188,4284
rho                 -0,083426     Durbin-Watson         1,737618

Here k and sq_k are highly, though not perfectly, collinear: compared with Model 3 the standard error of the coefficient on k rises from 0,0417 to 0,196, and k is no longer individually significant (p-value 0,1267).



