Research Method
Lecture 11-3 (Ch15)
Instrumental Variables Estimation and Two Stage Least Squares
1
IV solution to errors-in-variables problems: Example 1

Consider the following model
Y=β0+β1x1*+β2x2+u
where x1* is the correctly measured variable. Suppose, however, that you only have the error-ridden variable x1=x1*+e1. Thus, the actual estimation model becomes
Y=β0+β1x1+β2x2+(u−β1e1)
Because x1 contains e1, it is correlated with the error term, so the OLS estimate of β1 is biased. This is the errors-in-variables bias.
2
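The attenuation can be seen in a small simulation. This is a hypothetical sketch, not the lecture's data: the variable names and parameter values are made up. OLS on the mismeasured regressor shrinks the slope toward zero:

```python
import random

def cov(a, b):
    # sample covariance
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

random.seed(1)
n = 50000
x_star = [random.gauss(0, 1) for _ in range(n)]        # correctly measured x1*
e1 = [random.gauss(0, 1) for _ in range(n)]            # measurement error
u = [random.gauss(0, 1) for _ in range(n)]
y = [1 + 0.5 * xs + ui for xs, ui in zip(x_star, u)]   # true beta1 = 0.5
x1 = [xs + e for xs, e in zip(x_star, e1)]             # error-ridden regressor

b_true = cov(x_star, y) / cov(x_star, x_star)   # near 0.5
b_attn = cov(x1, y) / cov(x1, x1)               # attenuated toward 0
```

With these parameter values the attenuated slope is roughly 0.5·var(x*)/(var(x*)+var(e1)) = 0.25, half the true effect.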
The errors-in-variables bias cannot be corrected with the panel data method, but the IV method can solve the problem. Suppose that you have another measure of x1*; call it z1. For example, suppose x1* is the husband's true annual salary and x1 is the annual salary reported by the husband, which is reported with errors. Sometimes the data also ask the wife to report her husband's annual salary; then z1 is the husband's annual salary as reported by the wife.
3
In this case, z1=x1*+a1, where a1 is the measurement error. Although z1 is measured with error, it can serve as the instrument for x1. Why? First, x1 and z1 should be correlated, since both measure x1*. Second, since e1 and a1 are just measurement errors, they are unlikely to be correlated, which means that z1 is uncorrelated with the error term (u−β1e1) in
Y=β0+β1x1+β2x2+(u−β1e1)
So 2SLS with z1 as an instrument can eliminate this bias.
4
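The salary example can be sketched numerically. Again a hypothetical simulation with made-up numbers: both reports are noisy, yet using the wife's report as an instrument recovers the true coefficient:

```python
import random

def cov(a, b):
    # sample covariance
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

random.seed(2)
n = 50000
x_star = [random.gauss(50, 10) for _ in range(n)]        # true salary x1*
u = [random.gauss(0, 5) for _ in range(n)]
y = [2 + 0.8 * xs + ui for xs, ui in zip(x_star, u)]     # true beta1 = 0.8
x1 = [xs + random.gauss(0, 10) for xs in x_star]         # husband's report (error e1)
z1 = [xs + random.gauss(0, 10) for xs in x_star]         # wife's report (error a1)

b_ols = cov(x1, y) / cov(x1, x1)   # attenuated by the measurement error
b_iv = cov(z1, y) / cov(z1, x1)    # simple IV slope: consistent for 0.8
```

The IV slope works because z1 is correlated with x1 (both track x*) but uncorrelated with e1 and u.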
IV solution to errors-in-variables problems: Example 2

This is a more complicated example. Consider the following model
log(wage)=β0+β1educ+(abil+u)
where we have the unobserved-ability problem. Suppose that you have two test scores that are indicators of the ability:
test1 = γ·abil+e1
test2 = δ·abil+e2
5
If you use test1 as the proxy variable for ability, you have the following model
log(wage)=α0+β1educ+α1test1+(u−α1e1)
where α1=1/γ. Thus, test1 is correlated with the error term: it has the errors-in-variables problem. In this case, a simple plug-in solution does not work.
However, since you have test2, another measure of abil, you can use test2 as an instrument for test1 in the 2SLS procedure to eliminate the bias.
6
Exercise

Using WAGE2.dta, consider a log-wage regression with explanatory variables educ, exper, tenure, married, south, urban, and black. Using IQ and KWW (knowledge of the world of work) as two measures of the unobserved ability, estimate the model in a way that corrects for the bias in educ.
7
OLS

. reg lwage educ exper tenure married south urban black

Source | SS | df | MS
Model | 41.8377619 | 7 | 5.97682312
Residual | 123.818521 | 927 | .133569063
Total | 165.656283 | 934 | .177362188

Number of obs = 935;  F(7, 927) = 44.75;  Prob > F = 0.0000
R-squared = 0.2526;  Adj R-squared = 0.2469;  Root MSE = .36547

lwage | Coef. | Std. Err. | t | P>|t| | [95% Conf. Interval]
educ | .0654307 | .0062504 | 10.47 | 0.000 | .0531642, .0776973
exper | .014043 | .0031852 | 4.41 | 0.000 | .007792, .020294
tenure | .0117473 | .002453 | 4.79 | 0.000 | .0069333, .0165613
married | .1994171 | .0390502 | 5.11 | 0.000 | .1227801, .276054
south | -.0909036 | .0262485 | -3.46 | 0.001 | -.142417, -.0393903
urban | .1839121 | .0269583 | 6.82 | 0.000 | .1310056, .2368185
black | -.1883499 | .0376666 | -5.00 | 0.000 | -.2622717, -.1144281
_cons | 5.395497 | .113225 | 47.65 | 0.000 | 5.17329, 5.617704

8
. reg lwage IQ educ exper tenure married south urban black   (simple plug-in solution)

Source | SS | df | MS
Model | 43.5360162 | 8 | 5.44200202
Residual | 122.120267 | 926 | .131879338
Total | 165.656283 | 934 | .177362188

Number of obs = 935;  F(8, 926) = 41.27;  Prob > F = 0.0000
R-squared = 0.2628;  Adj R-squared = 0.2564;  Root MSE = .36315

lwage | Coef. | Std. Err. | t | P>|t| | [95% Conf. Interval]
IQ | .0035591 | .0009918 | 3.59 | 0.000 | .0016127, .0055056
educ | .0544106 | .0069285 | 7.85 | 0.000 | .0408133, .068008
exper | .0141458 | .0031651 | 4.47 | 0.000 | .0079342, .0203575
tenure | .0113951 | .0024394 | 4.67 | 0.000 | .0066077, .0161825
married | .1997644 | .0388025 | 5.15 | 0.000 | .1236134, .2759154
south | -.0801695 | .0262529 | -3.05 | 0.002 | -.1316916, -.0286473
urban | .1819463 | .0267929 | 6.79 | 0.000 | .1293645, .2345281
black | -.1431253 | .0394925 | -3.62 | 0.000 | -.2206304, -.0656202
_cons | 5.176439 | .1280006 | 40.44 | 0.000 | 4.925234, 5.427644
. ivregress 2sls lwage educ exper tenure married south urban black (IQ=KWW)
(plug-in + IV, using KWW as the instrument for IQ)

Instrumental variables (2SLS) regression

Number of obs = 935;  Wald chi2(8) = 298.58;  Prob > chi2 = 0.0000
R-squared = 0.1900;  Root MSE = .37884

lwage | Coef. | Std. Err. | z | P>|z| | [95% Conf. Interval]
IQ | .0130473 | .0049103 | 2.66 | 0.008 | .0034234, .0226712
educ | .0250321 | .0165266 | 1.51 | 0.130 | -.0073595, .0574238
exper | .01442 | .0033047 | 4.36 | 0.000 | .0079429, .0208972
tenure | .0104562 | .0025887 | 4.04 | 0.000 | .0053824, .01553
married | .2006903 | .0404813 | 4.96 | 0.000 | .1213485, .2800322
south | -.0515532 | .0309777 | -1.66 | 0.096 | -.1122685, .009162
urban | .1767058 | .0280756 | 6.29 | 0.000 | .1216785, .231733
black | -.0225612 | .0736029 | -0.31 | 0.759 | -.1668202, .1216979
_cons | 4.592453 | .324209 | 14.17 | 0.000 | 3.957015, 5.227891

Instrumented: IQ
Instruments: educ exper tenure married south urban black KWW

9
2SLS with heteroskedasticity

When heteroskedasticity is present, we have to modify the standard error formula. The derivation of the formula is beyond the scope of this class; however, Stata computes it automatically. Just use the robust option.
10
Testing overidentifying
restrictions
Usually, the instrument exogeneity
cannot be tested.
However, when you have extra
instruments, you can effectively test this.
This is the test of overidentifying
restrictions.
11
The basic idea behind the test of
overidentifying restrictions
Before presenting the procedure, I will
provide you with the basic idea of the test.
Consider the following model.
y1=β0+β1y2+β2z1+β3z2+u1
Suppose you have two instruments for y2: z3 and z4. If both are valid instruments, using either z3 or z4 as the instrument will produce consistent estimates.
Let β̂1 be the IV estimate of β1 when z3 is used as an instrument. Let β̃1 be the IV estimate when z4 is used as an instrument.
12
The idea is to check if β̂1 and β̃1 are similar. That is, you test H0: β̂1 − β̃1 = 0.
If you reject this null, it means that either
z3 or z4, or both of them are not
exogenous. We do not know which one is
not exogenous. So the rejection of the null
typically means that your choice of
instruments is invalid.
13
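The idea can be illustrated with a hypothetical simulation (made-up data and coefficients, not from the lecture's datasets): when z3 and z4 are both valid, the two IV estimates land near each other and near the true β1, while OLS is biased:

```python
import random

def cov(a, b):
    # sample covariance
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def iv_slope(z, x, y):
    # simple IV estimator of the slope: cov(z, y) / cov(z, x)
    return cov(z, y) / cov(z, x)

random.seed(3)
n = 50000
z3 = [random.gauss(0, 1) for _ in range(n)]
z4 = [random.gauss(0, 1) for _ in range(n)]
u1 = [random.gauss(0, 1) for _ in range(n)]
# y2 is endogenous: it depends on u1 as well as on both instruments
y2 = [a + b + ui + random.gauss(0, 1) for a, b, ui in zip(z3, z4, u1)]
y1 = [1 + 2 * x + ui for x, ui in zip(y2, u1)]   # true beta1 = 2

b_hat = iv_slope(z3, y2, y1)        # IV estimate using z3
b_til = iv_slope(z4, y2, y1)        # IV estimate using z4: close to b_hat
b_ols = cov(y2, y1) / cov(y2, y2)   # biased upward by the y2-u1 correlation
```

The overidentification test formalizes the comparison of b_hat and b_til.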
On the other hand, if you fail to reject the
null hypothesis, we can have some
confidence in the overall set of
instruments used.
However, caution is necessary. Even if you fail to reject the null, this does not always mean that the set of instruments is valid.
For example, consider a wage regression with education as the endogenous variable, where you have mother's and father's education as instruments.
14
Even if mother's and father's education do not satisfy the instrument exogeneity, β̂1 − β̃1 may be very close to zero, since the directions of the biases are the same. In this case, even if they are invalid instruments, we may fail to reject the null (i.e., erroneously judge that they satisfy the instrument exogeneity).
15
The procedure of the test of
overidentifying restrictions
The procedure:
(i) Estimate the structural equation by 2SLS and obtain the 2SLS residuals, û1.
(ii) Regress û1 on all exogenous variables. Obtain the R-squared; call it R1².
(iii) Under the null that all IVs are uncorrelated with the structural error u1, nR1² ~ χ²(q), where q is the number of extra instruments.
If you fail to reject the null (i.e., if nR1² is small), then you have some confidence about the instrument exogeneity. If you reject it, at least some of the instruments are not exogenous.
16
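The three steps can be sketched in pure Python for a simplified model with one endogenous regressor, two instruments, and no other exogenous variables. This is a hypothetical simulation: the data and all names are made up, and valid instruments are built in by construction:

```python
import random

def cov(a, b):
    # sample covariance
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def fitted2(z1, z2, y):
    # fitted values from OLS of y on an intercept, z1 and z2 (2x2 normal equations)
    s11, s22, s12 = cov(z1, z1), cov(z2, z2), cov(z1, z2)
    s1y, s2y = cov(z1, y), cov(z2, y)
    det = s11 * s22 - s12 * s12
    b1 = (s22 * s1y - s12 * s2y) / det
    b2 = (s11 * s2y - s12 * s1y) / det
    m1, m2, my = sum(z1) / len(z1), sum(z2) / len(z2), sum(y) / len(y)
    return [my + b1 * (a - m1) + b2 * (b - m2) for a, b in zip(z1, z2)]

random.seed(4)
n = 50000
z3 = [random.gauss(0, 1) for _ in range(n)]
z4 = [random.gauss(0, 1) for _ in range(n)]
u1 = [random.gauss(0, 1) for _ in range(n)]
y2 = [a + b + ui + random.gauss(0, 1) for a, b, ui in zip(z3, z4, u1)]
y1 = [1 + 2 * x + ui for x, ui in zip(y2, u1)]   # true beta1 = 2

# (i) 2SLS: first stage, slope from the fitted values, then residuals
y2_hat = fitted2(z3, z4, y2)
b1 = cov(y2_hat, y1) / cov(y2_hat, y2_hat)
b0 = sum(y1) / n - b1 * sum(y2) / n
uhat = [a - b0 - b1 * x for a, x in zip(y1, y2)]
# (ii) regress uhat on the instruments; R-squared of that regression
fit_u = fitted2(z3, z4, uhat)
r2 = cov(fit_u, fit_u) / cov(uhat, uhat)
# (iii) nR^2 ~ chi2(1) under the null (one extra instrument);
# with valid instruments it is typically below the 5% cutoff 3.84
nr2 = n * r2
```

In the lecture's exercise the same statistic is computed from Stata's e(r2).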
The nR1² statistic is valid when the homoskedasticity assumption holds. The nR1² statistic is also called the Sargan statistic. Under heteroskedasticity, we have to use another statistic, called Hansen's J statistic. Both tests can be done automatically in Stata.
17
Exercise
Consider the following model.
Log(wage)=β0+β1educ+β2exper+β3exper²+u
1. Using Mroz.dta, estimate the above equation using motheduc and fatheduc as instruments for educ.
2. Test the overidentifying restrictions.
18
Answers 1

. reg lwage educ exper expersq   (OLS)

Source | SS | df | MS
Model | 35.0222967 | 3 | 11.6740989
Residual | 188.305144 | 424 | .444115906
Total | 223.327441 | 427 | .523015084

Number of obs = 428;  F(3, 424) = 26.29;  Prob > F = 0.0000
R-squared = 0.1568;  Adj R-squared = 0.1509;  Root MSE = .66642

lwage | Coef. | Std. Err. | t | P>|t| | [95% Conf. Interval]
educ | .1074896 | .0141465 | 7.60 | 0.000 | .0796837, .1352956
exper | .0415665 | .0131752 | 3.15 | 0.002 | .0156697, .0674633
expersq | -.0008112 | .0003932 | -2.06 | 0.040 | -.0015841, -.0000382
_cons | -.5220406 | .1986321 | -2.63 | 0.009 | -.9124667, -.1316144

. ivregress 2sls lwage exper expersq (educ=motheduc fatheduc)   (2SLS)

Instrumental variables (2SLS) regression

Number of obs = 428;  Wald chi2(3) = 24.65;  Prob > chi2 = 0.0000
R-squared = 0.1357;  Root MSE = .67155

lwage | Coef. | Std. Err. | z | P>|z| | [95% Conf. Interval]
educ | .0613966 | .0312895 | 1.96 | 0.050 | .0000704, .1227228
exper | .0441704 | .0133696 | 3.30 | 0.001 | .0179665, .0703742
expersq | -.000899 | .0003998 | -2.25 | 0.025 | -.0016826, -.0001154
_cons | .0481003 | .398453 | 0.12 | 0.904 | -.7328532, .8290538

19
Answer: 2

First, conduct the test "manually".

1. First, estimate the 2SLS model.

. ivregress 2sls lwage exper expersq (educ=motheduc fatheduc)

Instrumental variables (2SLS) regression

Number of obs = 428;  Wald chi2(3) = 24.65;  Prob > chi2 = 0.0000
R-squared = 0.1357;  Root MSE = .67155

lwage | Coef. | Std. Err. | z | P>|z| | [95% Conf. Interval]
educ | .0613966 | .0312895 | 1.96 | 0.050 | .0000704, .1227228
exper | .0441704 | .0133696 | 3.30 | 0.001 | .0179665, .0703742
expersq | -.000899 | .0003998 | -2.25 | 0.025 | -.0016826, -.0001154
_cons | .0481003 | .398453 | 0.12 | 0.904 | -.7328532, .8290538

Instrumented: educ
Instruments: exper expersq motheduc fatheduc

2. Second, generate the 2SLS residual. Call this uhat.

. predict uhat, resid
(325 missing values generated)

20
3. Third, regress uhat on all the exogenous variables. Don't forget to include the exogenous variables in the structural equation: exper and expersq.

. reg uhat exper expersq motheduc fatheduc

Source | SS | df | MS
Model | .170503136 | 4 | .042625784
Residual | 192.84951 | 423 | .455909007
Total | 193.020013 | 427 | .452037502

Number of obs = 428;  F(4, 423) = 0.09;  Prob > F = 0.9845
R-squared = 0.0009;  Adj R-squared = -0.0086;  Root MSE = .67521

uhat | Coef. | Std. Err. | t | P>|t| | [95% Conf. Interval]
exper | -.0000183 | .0133291 | -0.00 | 0.999 | -.0262179, .0261813
expersq | 7.34e-07 | .0003985 | 0.00 | 0.999 | -.0007825, .000784
motheduc | -.0066065 | .0118864 | -0.56 | 0.579 | -.0299704, .0167573
fatheduc | .0057823 | .0111786 | 0.52 | 0.605 | -.0161902, .0277547
_cons | .0109641 | .1412571 | 0.08 | 0.938 | -.2666892, .2886173

4. Fourth, get the R-squared from this regression. You can use the displayed value, but it is rounded. To compute it more precisely, type:

. gen rsq=e(r2)
. su rsq

Variable | Obs | Mean | Std. Dev. | Min | Max
rsq | 753 | .0008833 | 0 | .0008833 | .0008833

5. Finally, compute NR².

. gen n_rsq=428*rsq
. su n_rsq

Variable | Obs | Mean | Std. Dev. | Min | Max
n_rsq | 753 | .3780714 | 0 | .3780714 | .3780714

This is the NR² statistic, also called the Sargan statistic.

21
The NR² statistic follows χ²(1). The degrees of freedom equal the number of extra instruments; in our case it is 1. (In our model, there is only one endogenous variable, so you need only one instrument. But we have two instruments, so the number of extra instruments is 1.)
Since the 5% cutoff point for χ²(1) is 3.84, we fail to reject the null hypothesis that the instruments are uncorrelated with the structural error.
Thus, we have some confidence in the choice of instruments. In other words, our instruments have 'passed' the test of overidentifying restrictions.
22
Now, let us conduct the test of overidentifying restrictions automatically.

. ivregress 2sls lwage exper expersq (educ=motheduc fatheduc)

Instrumental variables (2SLS) regression

Number of obs = 428;  Wald chi2(3) = 24.65;  Prob > chi2 = 0.0000
R-squared = 0.1357;  Root MSE = .67155

lwage | Coef. | Std. Err. | z | P>|z| | [95% Conf. Interval]
educ | .0613966 | .0312895 | 1.96 | 0.050 | .0000704, .1227228
exper | .0441704 | .0133696 | 3.30 | 0.001 | .0179665, .0703742
expersq | -.000899 | .0003998 | -2.25 | 0.025 | -.0016826, -.0001154
_cons | .0481003 | .398453 | 0.12 | 0.904 | -.7328532, .8290538

Instrumented: educ
Instruments: exper expersq motheduc fatheduc

. estat overid

Tests of overidentifying restrictions:
Sargan (score) chi2(1) = .378071  (p = 0.5386)
Basmann chi2(1)        = .373985  (p = 0.5408)

The Sargan statistic is the NR² statistic computed manually above.

23
The heteroskedasticity-robust version can also be done automatically. Use the robust option when estimating 2SLS, then type the same command.

. ivregress 2sls lwage exper expersq (educ=motheduc fatheduc), robust

Instrumental variables (2SLS) regression

Number of obs = 428;  Wald chi2(3) = 18.61;  Prob > chi2 = 0.0003
R-squared = 0.1357;  Root MSE = .67155

lwage | Coef. | Robust Std. Err. | z | P>|z| | [95% Conf. Interval]
educ | .0613966 | .0331824 | 1.85 | 0.064 | -.0036397, .126433
exper | .0441704 | .0154736 | 2.85 | 0.004 | .0138428, .074498
expersq | -.000899 | .0004281 | -2.10 | 0.036 | -.001738, -.00006
_cons | .0481003 | .4277846 | 0.11 | 0.910 | -.7903421, .8865427

Instrumented: educ
Instruments: exper expersq motheduc fatheduc

. estat overid

Test of overidentifying restrictions:
Score chi2(1) = .443461  (p = 0.5055)

The heteroskedasticity-robust version is called Hansen's J statistic.

24
Note
Even if you fail to reject the null
hypothesis in the test, there is a possibility
that your instruments are still invalid.
Thus, even if your instruments “pass the
test”, in general, you should try to
provide a plausible “story” why your
instruments satisfy the instrument
exogeneity. (Quarter of birth is a good
example).
25
Testing the endogeneity
Consider again the following model,
y1=β0+β1y2+β2z1+β3z2+u1
where y2 is the suspected endogenous variable and you have instruments z3 and z4. If y2 is actually exogenous, OLS is better. If you have valid instruments, you can test whether y2 is exogenous or not.
26
Before laying out the procedure, let us
understand the basic idea behind the test.
Structural eq:y1=β0+β1y2+β2z1+β3z2+u1
Reduced eq :y2=π0+π1z1+ π2z2+ π3z3+ π4z4+v2
You can check that y2 is correlated with u1 only if v2 is correlated with u1. Further, write u1=δv2+e1, where e1 is uncorrelated with v2. Then u1 and v2 are correlated only if δ ≠ 0. Thus, consider
y1=β0+β1y2+β2z1+β3z2+δv2+e1
and test whether δ is zero or not.
27
The test of endogeneity: procedure
(i) Estimate the reduced form equation
using OLS.
y2=π0+π1z1+ π2z2+ π3z3+ π4z4+v2
Then obtain the residual v̂2 .
(ii) Add v̂2 to the structural equation and estimate by OLS:
y1=β0+β1y2+β2z1+β3z2+δv̂2+e1
Then test H0: δ=0. If we reject H0, we conclude that y2 is endogenous, because u1 and v2 are correlated.
28
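The two steps can be sketched with a hypothetical simulation (made-up data, one instrument, no other exogenous variables). The residual's coefficient is clearly nonzero when y2 is endogenous, and the y2 coefficient in the augmented regression matches the IV estimate:

```python
import random

def cov(a, b):
    # sample covariance
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def ols2(x1, x2, y):
    # slope coefficients from OLS of y on an intercept, x1 and x2
    s11, s22, s12 = cov(x1, x1), cov(x2, x2), cov(x1, x2)
    s1y, s2y = cov(x1, y), cov(x2, y)
    det = s11 * s22 - s12 * s12
    return (s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det

random.seed(5)
n = 50000
z3 = [random.gauss(0, 1) for _ in range(n)]
u1 = [random.gauss(0, 1) for _ in range(n)]
y2 = [z + ui + random.gauss(0, 1) for z, ui in zip(z3, u1)]   # endogenous: depends on u1
y1 = [1 + 2 * x + ui for x, ui in zip(y2, u1)]                # true beta1 = 2

# (i) reduced form: y2 on z3; residual vhat (constant absorbed by the next OLS)
pi1 = cov(z3, y2) / cov(z3, z3)
vhat = [x - pi1 * z for x, z in zip(y2, z3)]
# (ii) add vhat to the structural equation; a clearly nonzero vhat coefficient
# means we would reject H0 that y2 is exogenous
b_y2, b_vhat = ols2(y2, vhat, y1)
```

In this design the population coefficient on vhat is 0.5, far from zero, so the test correctly flags y2 as endogenous.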
Exercise
Consider the following model.
Log(wage)=β0+β1educ+β2exper+β3exper²+u
Suppose that mother's and father's education satisfy the instrument exogeneity. Conduct the Hausman test of endogeneity to check whether educ is exogenous or not.
29
Answer

First, conduct the test "manually". To use the same observations as 2SLS, run 2SLS once and generate the e(sample) variable.

. ivregress 2sls lwage exper expersq (educ=motheduc fatheduc)

Instrumental variables (2SLS) regression

Number of obs = 428;  Wald chi2(3) = 24.65;  Prob > chi2 = 0.0000
R-squared = 0.1357;  Root MSE = .67155

lwage | Coef. | Std. Err. | z | P>|z| | [95% Conf. Interval]
educ | .0613966 | .0312895 | 1.96 | 0.050 | .0000704, .1227228
exper | .0441704 | .0133696 | 3.30 | 0.001 | .0179665, .0703742
expersq | -.000899 | .0003998 | -2.25 | 0.025 | -.0016826, -.0001154
_cons | .0481003 | .398453 | 0.12 | 0.904 | -.7328532, .8290538

Instrumented: educ
Instruments: exper expersq motheduc fatheduc

. gen fullsample=e(sample)

30
Now run the reduced form regression, then get the residual.

. reg educ exper expersq motheduc fatheduc if fullsample==1

Source | SS | df | MS
Model | 471.620998 | 4 | 117.90525
Residual | 1758.57526 | 423 | 4.15738833
Total | 2230.19626 | 427 | 5.22294206

Number of obs = 428;  F(4, 423) = 28.36;  Prob > F = 0.0000
R-squared = 0.2115;  Adj R-squared = 0.2040;  Root MSE = 2.039

educ | Coef. | Std. Err. | t | P>|t| | [95% Conf. Interval]
exper | .0452254 | .0402507 | 1.12 | 0.262 | -.0338909, .1243417
expersq | -.0010091 | .0012033 | -0.84 | 0.402 | -.0033744, .0013562
motheduc | .157597 | .0358941 | 4.39 | 0.000 | .087044, .2281501
fatheduc | .1895484 | .0337565 | 5.62 | 0.000 | .1231971, .2558997
_cons | 9.10264 | .4265614 | 21.34 | 0.000 | 8.264196, 9.941084

. predict uhat_reduced, resid

Then add the residual to the structural equation and check whether its coefficient is different from zero.

. reg lwage educ exper expersq uhat_reduced

Source | SS | df | MS
Model | 36.2573098 | 4 | 9.06432745
Residual | 187.070131 | 423 | .442246173
Total | 223.327441 | 427 | .523015084

Number of obs = 428;  F(4, 423) = 20.50;  Prob > F = 0.0000
R-squared = 0.1624;  Adj R-squared = 0.1544;  Root MSE = .66502

lwage | Coef. | Std. Err. | t | P>|t| | [95% Conf. Interval]
educ | .0613966 | .0309849 | 1.98 | 0.048 | .000493, .1223003
exper | .0441704 | .0132394 | 3.34 | 0.001 | .0181471, .0701937
expersq | -.000899 | .0003959 | -2.27 | 0.024 | -.0016772, -.0001208
uhat_reduced | .0581666 | .0348073 | 1.67 | 0.095 | -.0102501, .1265834
_cons | .0481003 | .3945753 | 0.12 | 0.903 | -.7274721, .8236727

31
The coefficient on uhat_reduced is significant at the 10% level. Thus, you reject the null hypothesis that educ is exogenous (not correlated with the structural error) at the 10% level. This is moderate evidence that educ is endogenous, and thus 2SLS should be reported (along with OLS).
32
Stata conducts the test of endogeneity automatically, using a different version of the test.

. ivregress 2sls lwage exper expersq (educ=motheduc fatheduc)

Instrumental variables (2SLS) regression

Number of obs = 428;  Wald chi2(3) = 24.65;  Prob > chi2 = 0.0000
R-squared = 0.1357;  Root MSE = .67155

lwage | Coef. | Std. Err. | z | P>|z| | [95% Conf. Interval]
educ | .0613966 | .0312895 | 1.96 | 0.050 | .0000704, .1227228
exper | .0441704 | .0133696 | 3.30 | 0.001 | .0179665, .0703742
expersq | -.000899 | .0003998 | -2.25 | 0.025 | -.0016826, -.0001154
_cons | .0481003 | .398453 | 0.12 | 0.904 | -.7328532, .8290538

Instrumented: educ
Instruments: exper expersq motheduc fatheduc

. estat endog

Tests of endogeneity
Ho: variables are exogenous
Durbin (score) chi2(1) = 2.80707  (p = 0.0938)
Wu-Hausman F(1,423)    = 2.79259  (p = 0.0954)

33
Note that the test of endogeneity is valid only if the instruments satisfy the instrument exogeneity. Thus, test the overidentifying restrictions first to check that the instruments satisfy the instrument exogeneity. If the instruments "pass" the overidentifying test, then conduct the test of endogeneity.
34
Applying 2SLS to pooled
cross sections
When you apply 2SLS to pooled cross-section data, there is no new difficulty: you can just apply 2SLS as before.
35
Combining panel data
method and IV method
Suppose you have two-period panel data for 1987 and 1988. Consider the following model
Log(scrap)it=β0+δ0d88t+β1(hrsemp)it+ai+uit
where (scrap) is the scrap rate and (hrsemp) is the hours of employee training.
36
Correlation between ai and (hrsemp)it causes a bias in β1. In the first-differenced model, we difference to remove ai; that is, we estimate
∆Log(scrap)it=δ0+β1∆(hrsemp)it+∆uit …(1)
In some cases, ∆(hrsemp)it and ∆uit can still be correlated. For example, when a firm hires more skilled workers, it may reduce the job training.
37
In this case, the quality of the workers is time-varying, so it is not contained in ai but in uit. More skilled hires mean both less training and a lower scrap rate, so ∆(hrsemp)it and ∆uit tend to move in the same direction. This causes the OLS estimate of β1 to be biased upward (biased toward not finding the productivity-enhancing effect of training). To eliminate the bias, we can apply the IV method to equation (1).
38
One possible instrument for ∆(hrsemp)it is ∆(Grant)it, where (Grant)it is a variable indicating whether the company received a job training grant. Since the grant designation is given at the beginning of 1988, ∆(Grant)it is plausibly uncorrelated with ∆uit. At the same time, it should be correlated with ∆(hrsemp)it. Thus, we can use ∆(Grant)it as an IV for ∆(hrsemp)it.
39
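The logic of the differenced-equation IV can be sketched with a hypothetical simulation (made-up data and coefficients, not JTRAIN.dta): an unobserved change in worker skill moves both training and the scrap residual, biasing OLS toward zero, while the grant-change instrument recovers the true effect:

```python
import random

def cov(a, b):
    # sample covariance
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

random.seed(6)
n = 50000
dz = [random.choice([0, 1]) for _ in range(n)]   # change in grant status
s = [random.gauss(0, 1) for _ in range(n)]       # change in (unobserved) worker skill
# a grant raises training; more skilled hires reduce it
dx = [5 * z - si + random.gauss(0, 1) for z, si in zip(dz, s)]
du = [random.gauss(0, 1) - si for si in s]       # more skill also lowers the scrap residual
dy = [-0.5 * x + e for x, e in zip(dx, du)]      # true beta1 = -0.5

b_ols = cov(dx, dy) / cov(dx, dx)   # biased toward zero
b_iv = cov(dz, dy) / cov(dz, dx)    # consistent, near -0.5
```

The real exercise does the same thing with dlscrap, dhrsemp, and dgrant in Stata.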
Exercise
Using JTRAIN.dta, estimate the following model:
∆Log(scrap)it=δ0+β1∆(hrsemp)it+∆uit …(1)
Use ∆(Grant)it as an instrument for ∆(hrsemp)it. Use the data for 1987 and 1988 only.
40
Answer

First, estimate it manually. This is the simple first-differenced model.

. tsset fcode year
panel variable: fcode (strongly balanced)
time variable: year, 1987 to 1989
delta: 1 unit

. gen dlscrap=lscrap-L.lscrap
(363 missing values generated)
. gen dhrsemp=hrsemp-L.hrsemp
(220 missing values generated)
. gen dgrant=grant-L.grant
(157 missing values generated)

. reg dlscrap dhrsemp if year<=1988

Source | SS | df | MS
Model | 1.07071245 | 1 | 1.07071245
Residual | 16.2191273 | 43 | .377189007
Total | 17.2898397 | 44 | .392950903

Number of obs = 45;  F(1, 43) = 2.84;  Prob > F = 0.0993
R-squared = 0.0619;  Adj R-squared = 0.0401;  Root MSE = .61416

dlscrap | Coef. | Std. Err. | t | P>|t| | [95% Conf. Interval]
dhrsemp | -.0076007 | .0045112 | -1.68 | 0.099 | -.0166984, .0014971
_cons | -.1035161 | .103736 | -1.00 | 0.324 | -.3127197, .1056875

41
This is the first-differenced model + IV method.

. ivregress 2sls dlscrap (dhrsemp=dgrant) if year<=1988

Instrumental variables (2SLS) regression

Number of obs = 45;  Wald chi2(1) = 3.35;  Prob > chi2 = 0.0674
R-squared = 0.0159;  Root MSE = .61491

dlscrap | Coef. | Std. Err. | z | P>|z| | [95% Conf. Interval]
dhrsemp | -.0141532 | .0077369 | -1.83 | 0.067 | -.0293171, .0010108
_cons | -.0326684 | .124098 | -0.26 | 0.792 | -.275896, .2105592

Instrumented: dhrsemp
Instruments: dgrant

42
Now, estimate the model automatically (first-differenced model + IV method).

. tsset fcode year
panel variable: fcode (strongly balanced)
time variable: year, 1987 to 1989
delta: 1 unit

. xtivreg lscrap (hrsemp=grant) if year<=1988, fd

First-differenced IV regression

Group variable: fcode;  Time variable (t): year
Number of obs = 45;  Number of groups = 45
Obs per group: min = 1, avg = 1.0, max = 1
R-sq: within = . ;  between = 0.0016;  overall = 0.0016
corr(u_i, Xb) = -0.2070
Wald chi2(1) = 3.20;  Prob > chi2 = 0.0737

D.lscrap | Coef. | Std. Err. | z | P>|z| | [95% Conf. Interval]
hrsemp D1. | -.0141532 | .0079147 | -1.79 | 0.074 | -.0296658, .0013594
_cons | -.0326684 | .1269512 | -0.26 | 0.797 | -.2814881, .2161513

sigma_u | 1.3728352
sigma_e | .62904299
rho | .8264778  (fraction of variance due to u_i)

Instrumented: hrsemp
Instruments: grant

43