Transcript Using SAS

Using SAS for Time Series Data
LSU Economics Department
March 16, 2012
Next Workshop (March 30): Instrumental Variables Estimation
Chapter 12: Regression with Time-Series Data: Nonstationary Variables
(Principles of Econometrics, 4th Edition)
Chapter Contents
12.1 Stationary and Nonstationary Variables
12.2 Spurious Regressions
12.3 Unit Root Tests for Stationarity
12.4 Cointegration
The aim is to describe how to estimate regression
models involving nonstationary variables
– The first step is to examine the time-series
concepts of stationarity (and nonstationarity)
and how we distinguish between them.
12.1 Stationary and Nonstationary Variables
The change in a variable is an important concept
– The change in a variable yt, also known as its
first difference, is given by Δyt = yt – yt-1.
• Δyt is the change in the value of the variable
y from period t - 1 to period t
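In SAS, the first difference can be created in a DATA step with the DIF function (a minimal sketch; the data set name tsdata and variable y are placeholders):

/* First difference: dy(t) = y(t) - y(t-1) */
data tsdiff;
   set tsdata;
   dy = dif(y);   /* DIF(y) returns y - LAG(y) */
run;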
FIGURE 12.1 U.S. economic time series
FIGURE 12.1 (Continued) U.S. economic time series
Formally, a time series yt is stationary if its mean
and variance are constant over time, and if the
covariance between two values from the series
depends only on the length of time separating the
two values, and not on the actual times at which
the variables are observed
That is, the time series yt is stationary if for all
values, and every time period, it is true that:
Eq. 12.1a: E(yt) = μ   (constant mean)
Eq. 12.1b: var(yt) = σ²   (constant variance)
Eq. 12.1c: cov(yt, yt+s) = cov(yt, yt-s) = γs   (covariance depends on s, not t)
12.1.1 The First-Order Autoregressive Model
FIGURE 12.2 Time-series models
FIGURE 12.2 (Continued) Time-series models
12.2 Spurious Regressions
FIGURE 12.3 Time series and scatter plot of two random walk variables
A simple regression of series one (rw1) on series
two (rw2) yields:
rw1t = 17.818 + 0.842 rw2t,    R² = 0.70
(t)                  (40.837)
– These results are completely meaningless, or
spurious
• The apparent significance of the relationship
is false
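A spurious regression of this kind is easy to reproduce in SAS by generating two independent random walks and regressing one on the other (a sketch; the seed and sample size are arbitrary choices):

/* Generate two independent random walks */
data rw;
   call streaminit(12345);      /* set the random-number seed */
   rw1 = 0; rw2 = 0;
   do t = 1 to 700;
      rw1 + rand('normal');     /* rw1(t) = rw1(t-1) + v1(t) */
      rw2 + rand('normal');     /* rw2(t) = rw2(t-1) + v2(t), independent of v1 */
      output;
   end;
run;

/* Regress one random walk on the other: the t-statistic will often look
   "significant" even though the two series are unrelated */
proc reg data=rw;
   model rw1 = rw2;
run;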
When nonstationary time series are used in a
regression model, the results may spuriously
indicate a significant relationship when there is
none
– In these cases the least squares estimator and
least squares predictor do not have their usual
properties, and t-statistics are not reliable
– Since many macroeconomic time series are
nonstationary, it is particularly important to
take care when estimating regressions with
macroeconomic variables
12.3 Unit Root Tests for Stationarity
There are many tests for determining whether a
series is stationary or nonstationary
– The most popular is the Dickey–Fuller test
12.3.1 Dickey-Fuller Test 1 (No Constant and No Trend)
Consider the AR(1) model:
Eq. 12.4: yt = ρyt-1 + vt
– We can test for nonstationarity by testing the
null hypothesis that ρ = 1 against the alternative
that |ρ| < 1
• Or simply ρ < 1
A more convenient form is:
yt - yt-1 = ρyt-1 - yt-1 + vt
Eq. 12.5a: Δyt = (ρ - 1)yt-1 + vt = γyt-1 + vt
– The hypotheses are:
H0: ρ = 1  ⇔  H0: γ = 0
H1: ρ < 1  ⇔  H1: γ < 0
12.3.2 Dickey-Fuller Test 2 (With Constant but No Trend)
The second Dickey–Fuller test includes a constant
term in the test equation:
Eq. 12.5b: Δyt = α + γyt-1 + vt
– The null and alternative hypotheses are the
same as before
12.3.3 Dickey-Fuller Test 3 (With Constant and With Trend)
The third Dickey–Fuller test includes a constant
and a trend in the test equation:
Eq. 12.5c: Δyt = α + γyt-1 + λt + vt
– The null and alternative hypotheses are
H0: γ = 0 and H1:γ < 0
12.3.4 The Dickey-Fuller Critical Values
To test the hypothesis in all three cases, we simply
estimate the test equation by least squares and
examine the t-statistic for the hypothesis that
γ=0
– Unfortunately this t-statistic no longer has the
t-distribution
– Instead, we use a statistic often called the τ (tau) statistic
Table 12.2 Critical Values for the Dickey–Fuller Test
To carry out a one-tail test of significance, if τc is
the critical value obtained from Table 12.2, we
reject the null hypothesis of nonstationarity if
τ ≤ τc
– If τ > τc then we do not reject the null
hypothesis that the series is nonstationary
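In SAS, the Dickey–Fuller tau statistics for the three test equations can be obtained from PROC ARIMA (a sketch; the data set usa and variable f are assumed names for the federal funds rate series, and the ADF= list gives the numbers of augmenting lags):

/* Augmented Dickey-Fuller tests with 0 and 1 lagged differences */
proc arima data=usa;
   identify var=f stationarity=(adf=(0,1));
run;

The output reports tau statistics for the Zero Mean, Single Mean, and Trend cases, corresponding to test equations 12.5a–12.5c; these are compared with the Dickey–Fuller critical values in Table 12.2, not with standard t critical values.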
An important extension of the Dickey–Fuller test
allows for the possibility that the error term is
autocorrelated
– Consider the model:
Eq. 12.6: Δyt = α + γyt-1 + Σ(s=1 to m) as Δyt-s + vt
where Δyt-1 = (yt-1 - yt-2), Δyt-2 = (yt-2 - yt-3), …
12.3.6 The Dickey-Fuller Tests: An Example
As an example, consider the two interest rate
series:
– The federal funds rate (Ft)
– The three-year bond rate (Bt)
Following procedures described in Sections 9.3
and 9.4, we find that the inclusion of one lagged
difference term is sufficient to eliminate
autocorrelation in the residuals in both cases
The results from estimating the resulting equations
are:
ΔFt = 0.173 - 0.045 Ft-1 + 0.561 ΔFt-1     (tau = -2.505)
ΔBt = 0.237 - 0.056 Bt-1 + 0.237 ΔBt-1     (tau = -2.703)
– The 5% critical value for tau (τc) is -2.86
– Since -2.505 > -2.86 (and -2.703 > -2.86), we do not reject the null hypothesis that the series are nonstationary
12.3.7 Order of Integration
Recall that if yt follows a random walk, then γ = 0
and the first difference of yt becomes:
Δyt = yt - yt-1 = vt
– Series like yt, which can be made stationary by
taking the first difference, are said to be
integrated of order one, and denoted as I(1)
• Stationary series are said to be integrated of
order zero, I(0)
– In general, the order of integration of a series is
the minimum number of times it must be
differenced to make it stationary
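In PROC ARIMA the same test can be applied to a first difference by attaching the differencing operator to the variable name (a sketch; usa, f, and b are assumed names):

/* Dickey-Fuller tests on the first differences of F and B */
proc arima data=usa;
   identify var=f(1) stationarity=(adf=(0));
   identify var=b(1) stationarity=(adf=(0));
run;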
The results of the Dickey–Fuller test for a random
walk applied to the first differences are:
Δ(ΔFt) = -0.447 ΔFt-1     (tau = -5.487)
Δ(ΔBt) = -0.701 ΔBt-1     (tau = -7.662)
Based on the large negative value of the tau
statistic (-5.487 < -1.94), we reject the null
hypothesis that ΔFt is nonstationary and accept the
alternative that it is stationary
– We similarly conclude that ΔBt is stationary (-7.662 < -1.94)
12.4 Cointegration
As a general rule, nonstationary time-series
variables should not be used in regression models
to avoid the problem of spurious regression
– There is an exception to this rule
There is an important case when
et = yt - β1 - β2xt is a stationary I(0) process
– In this case yt and xt are said to be cointegrated
• Cointegration implies that yt and xt share
similar stochastic trends, and, since the
difference et is stationary, they never diverge
too far from each other
The test for stationarity of the residuals is based on the test equation:
Eq. 12.7: Δêt = γêt-1 + vt
– The regression has no constant term because
the mean of the regression residuals is zero.
– We are basing this test upon estimated values of
the residuals
Table 12.4 Critical Values for the Cointegration Test
There are three sets of critical values
– Which set we use depends on whether the
residuals are derived from:
Eq. 12.8a – Equation 1: êt = yt - b xt
Eq. 12.8b – Equation 2: êt = yt - b2 xt - b1
Eq. 12.8c – Equation 3: êt = yt - b2 xt - b1 - δ̂ t
12.4.1 An Example of a Cointegration Test
Consider the estimated model (Eq. 12.9):
B̂t = 1.140 + 0.914 Ft,    R² = 0.881
(t)    (6.548)   (29.421)
– The unit root test for stationarity in the estimated residuals is:
Δêt = -0.225 êt-1 + 0.254 Δêt-1
(tau)   (-4.196)
The null and alternative hypotheses in the test for
cointegration are:
H0: the series are not cointegrated  ⇔  residuals are nonstationary
H1: the series are cointegrated  ⇔  residuals are stationary
– Similar to the one-tail unit root tests, we reject
the null hypothesis of no cointegration if τ ≤ τc,
and we do not reject the null hypothesis that the
series are not cointegrated if τ > τc
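One way to carry out the test in SAS is the two-step Engle–Granger procedure: estimate the cointegrating regression, save the residuals, and run the Dickey–Fuller regression on them, comparing tau with the critical values in Table 12.4 (a sketch; usa, b, and f are assumed names):

/* Step 1: cointegrating regression B(t) = b1 + b2*F(t) + e(t); keep residuals */
proc reg data=usa;
   model b = f;
   output out=coint r=ehat;
run;

/* Step 2: Dickey-Fuller regression on the residuals (no intercept) */
data coint2;
   set coint;
   dehat  = dif(ehat);    /* delta ehat(t)   */
   ehat1  = lag(ehat);    /* ehat(t-1)       */
   dehat1 = lag(dehat);   /* delta ehat(t-1) */
run;

proc reg data=coint2;
   model dehat = ehat1 dehat1 / noint;   /* tau = t-value on ehat1 */
run;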
Chapter 9: Regression with Time Series Data: Stationary Variables
Walter R. Paczkowski, Rutgers University
Chapter Contents
9.1 Introduction
9.2 Finite Distributed Lags
9.3 Serial Correlation
9.4 Other Tests for Serially Correlated Errors
9.5 Estimation with Serially Correlated Errors
9.6 Autoregressive Distributed Lag Models
9.7 Forecasting
9.8 Multiplier Analysis
9.1 Introduction
When modeling relationships between variables,
the nature of the data that have been collected has
an important bearing on the appropriate choice of
an econometric model
– Two features of time-series data to consider:
1. Time-series observations on a given
economic unit, observed over a number of
time periods, are likely to be correlated
2. Time-series data have a natural ordering
according to time
There is also the possible existence of dynamic
relationships between variables
– A dynamic relationship is one in which the
change in a variable now has an impact on that
same variable, or other variables, in one or
more future time periods
– These effects do not occur instantaneously but
are spread, or distributed, over future time
periods
FIGURE 9.1 The distributed lag effect
9.1.1 Dynamic Nature of Relationships
Ways to model the dynamic relationship:
1. Specify that a dependent variable y is a
function of current and past values of an
explanatory variable x
Eq. 9.1: yt = f(xt, xt-1, xt-2, …)
• Because of the existence of these lagged
effects, Eq. 9.1 is called a distributed lag
model
Ways to model the dynamic relationship (Continued):
2. Capture the dynamic characteristics of the time series by specifying a model with a lagged dependent variable as one of the explanatory variables:
Eq. 9.2: yt = f(yt-1, xt)
• Or have:
Eq. 9.3: yt = f(yt-1, xt, xt-1, xt-2)
– Such models are called autoregressive
distributed lag (ARDL) models, with
‘‘autoregressive’’ meaning a regression of yt
on its own lag or lags
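Once the lags are constructed in a DATA step, an ARDL model such as Eq. 9.2 or 9.3 can be estimated by least squares (a sketch; mydata, y, and x are placeholder names):

/* ARDL(1,2): y(t) = d + theta1*y(t-1) + d0*x(t) + d1*x(t-1) + d2*x(t-2) + v(t) */
data ardl;
   set mydata;
   y1 = lag(y);
   x1 = lag(x);
   x2 = lag2(x);
run;

proc reg data=ardl;
   model y = y1 x x1 x2;
run;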
Ways to model the dynamic relationship (Continued):
3. Model the continuing impact of change over
several periods via the error term
Eq. 9.4: yt = f(xt) + et
et = f(et-1)
• In this case et is correlated with et - 1
• We say the errors are serially correlated or
autocorrelated
9.1.2 Least Squares Assumptions
The primary assumption is Assumption MR4:
cov(yi, yj) = cov(ei, ej) = 0 for i ≠ j
• For time series, this is written as:
cov(yt, ys) = cov(et, es) = 0 for t ≠ s
– The dynamic models in Eqs. 9.2, 9.3 and 9.4
imply correlation between yt and yt - 1 or et and
et - 1 or both, so they clearly violate assumption
MR4
9.2 Finite Distributed Lags
Consider a linear model in which, after q time
periods, changes in x no longer have an impact on
y
Eq. 9.5: yt = α + β0 xt + β1 xt-1 + β2 xt-2 + … + βq xt-q + et
– Note the notation change: βs is used to denote
the coefficient of xt-s and α is introduced to
denote the intercept
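A finite distributed lag model is estimated the same way: create the lagged x's in a DATA step and apply least squares (a sketch with q = 2; mydata, y, and x are placeholder names):

/* y(t) = alpha + b0*x(t) + b1*x(t-1) + b2*x(t-2) + e(t) */
data fdl;
   set mydata;
   x1 = lag(x);
   x2 = lag2(x);
run;

proc reg data=fdl;
   model y = x x1 x2;
run;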
Model 9.5 has two uses:
– Forecasting
Eq. 9.6: yT+1 = α + β0 xT+1 + β1 xT + β2 xT-1 + … + βq xT-q+1 + eT+1
– Policy analysis
• What is the effect of a change in x on y?
Eq. 9.7: ∂E(yt)/∂xt-s = ∂E(yt+s)/∂xt = βs
9.3 Serial Correlation
When is assumption TSMR5, cov(et, es) = 0 for
t ≠ s likely to be violated, and how do we assess
its validity?
– When a variable exhibits correlation over time,
we say it is autocorrelated or serially
correlated
• These terms are used interchangeably
9.3.1a Computing Autocorrelation
More generally, the k-th order sample autocorrelation for a series y, which gives the correlation between observations that are k periods apart, is:
Eq. 9.14: rk = Σ(t=k+1 to T) (yt - ȳ)(yt-k - ȳ) / Σ(t=1 to T) (yt - ȳ)²
How do we test whether an autocorrelation is
significantly different from zero?
– The null hypothesis is H0: ρk = 0
– A suitable test statistic is:
Eq. 9.17: Z = (rk - 0) / √(1/T) = √T · rk  ~  N(0, 1) approximately
9.3.1b The Correlogram
The correlogram, also called the sample
autocorrelation function, is the sequence of
autocorrelations r1, r2, r3, …
– It shows the correlation between observations
that are one period apart, two periods apart,
three periods apart, and so on
FIGURE 9.6 Correlogram for G
9.3.2a A Phillips Curve
To determine if the errors are serially correlated,
we compute the least squares residuals:
Eq. 9.20: êt = INFt - b1 - b2 DUt
The k-th order autocorrelation for the residuals can
be written as:
Eq. 9.21: rk = Σ(t=k+1 to T) êt êt-k / Σ(t=1 to T) êt²
– The least squares equation is:
Eq. 9.22: INF = 0.7776 - 0.5279 DU
(se)             (0.0658)   (0.2294)
The values at the first five lags are:
r1 = 0.549,  r2 = 0.456,  r3 = 0.433,  r4 = 0.420,  r5 = 0.339
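These residual autocorrelations can be reproduced by saving the least squares residuals and passing them to PROC ARIMA (a sketch; phillips, inf, and du are assumed names for the data set and variables):

/* Estimate the Phillips curve and save the residuals ehat */
proc reg data=phillips;
   model inf = du;
   output out=pcres r=ehat;
run;

/* Correlogram of the residuals: compare with r1,...,r5 above */
proc arima data=pcres;
   identify var=ehat nlag=5;
run;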
9.4 Other Tests for Serially Correlated Errors
9.4.1 A Lagrange Multiplier Test
If et and et-1 are correlated, then one way to model
the relationship between them is to write:
Eq. 9.23: et = ρet-1 + vt
– We can substitute this into a simple regression
equation:
Eq. 9.24: yt = β1 + β2 xt + ρet-1 + vt
To derive the relevant auxiliary regression for the
autocorrelation LM test, we write the test equation
as:
Eq. 9.25: yt = β1 + β2 xt + ρêt-1 + vt
– But since we know that yt = b1 + b2 xt + êt, we get:
b1 + b2 xt + êt = β1 + β2 xt + ρêt-1 + vt
Rearranging, we get:
Eq. 9.26: êt = (β1 - b1) + (β2 - b2) xt + ρêt-1 + vt = γ1 + γ2 xt + ρêt-1 + vt
– If H0: ρ = 0 is true, then LM = T × R² has an approximate χ²(1) distribution
• T and R² are the sample size and goodness-of-fit statistic, respectively, from least squares estimation of Eq. 9.26
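In SAS the LM test can be requested directly with the GODFREY option in PROC AUTOREG, or built by hand from the auxiliary regression in Eq. 9.26 (a sketch; phillips, inf, and du are assumed names, and pcres is the residual data set created above):

/* Breusch-Godfrey LM test for first-order autocorrelation */
proc autoreg data=phillips;
   model inf = du / godfrey=1;
run;

/* Hand-built version: regress ehat on x and ehat(t-1); LM = T * R-square */
data aux;
   set pcres;
   ehat1 = lag(ehat);
run;

proc reg data=aux;
   model ehat = du ehat1;   /* use the R-square from this regression */
run;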
9.5 Estimation with Serially Correlated Errors
Three estimation procedures are considered:
1. Least squares estimation
2. An estimation procedure that is relevant when
the errors are assumed to follow what is
known as a first-order autoregressive model
et = ρet-1 + vt
3. A general estimation strategy for estimating
models with serially correlated errors
We will encounter models with a lagged
dependent variable, such as:
yt = δ + θ1 yt-1 + δ0 xt + δ1 xt-1 + vt
ASSUMPTION FOR MODELS WITH A LAGGED DEPENDENT VARIABLE
TSMR2A: In the multiple regression model yt = β1 + β2 xt2 + … + βK xtK + vt, where some of the xtk may be lagged values of y, vt is uncorrelated with all xtk and their past values.
9.5.1 Least Squares Estimation
Suppose we proceed with least squares estimation
without recognizing the existence of serially
correlated errors. What are the consequences?
1. The least squares estimator is still a linear
unbiased estimator, but it is no longer best
2. The formulas for the standard errors usually
computed for the least squares estimator are
no longer correct
• Confidence intervals and hypothesis tests
that use these standard errors may be
misleading
It is possible to compute correct standard errors
for the least squares estimator:
– HAC (heteroskedasticity and autocorrelation
consistent) standard errors, or Newey-West
standard errors
• These are analogous to the heteroskedasticity
consistent standard errors
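One way to obtain HAC standard errors in SAS is through PROC MODEL, whose GMM estimator can use a kernel-based (Newey–West type) covariance matrix (a sketch under that assumption; mydata, y, and x are placeholder names, and the Bartlett kernel with the bandwidth parameters shown is an arbitrary choice):

/* Least squares coefficients with a HAC covariance matrix */
proc model data=mydata;
   y = b1 + b2*x;
   instruments x;                     /* exogenous regressor as instrument, so estimates match OLS */
   fit y / gmm kernel=(bart, 4, 0);   /* Bartlett kernel */
run;
quit;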
Consider the model yt = β1 + β2xt + et
– The variance of b2 is:
Eq. 9.27:
var(b2) = Σt wt² var(et) + ΣΣ(t≠s) wt ws cov(et, es)
        = [Σt wt² var(et)] × [1 + ΣΣ(t≠s) wt ws cov(et, es) / Σt wt² var(et)]
where wt = (xt - x̄) / Σt (xt - x̄)²
When the errors are not correlated, cov(et, es) = 0,
and the term in square brackets is equal to one.
– The resulting expression
var(b2) = Σt wt² var(et)
is the one used to find heteroskedasticity-consistent (HC) standard errors
– When the errors are correlated, the term in
square brackets is estimated to obtain HAC
standard errors
If we call the quantity in square brackets g and its estimate ĝ, then the relationship between the two estimated variances is:
Eq. 9.28: varHAC(b2) = varHC(b2) × ĝ
9.5.2b Nonlinear Least Squares Estimation
Substituting, we get:
Eq. 9.43: yt = β1(1 - ρ) + β2 xt + ρyt-1 - ρβ2 xt-1 + vt
The coefficient of xt-1 equals -ρβ2
– Although Eq. 9.43 is a linear function of the
variables xt , yt-1 and xt-1, it is not a linear
function of the parameters (β1, β2, ρ)
– The usual linear least squares formulas cannot
be obtained by using calculus to find the values
of (β1, β2, ρ) that minimize Sv
• These are nonlinear least squares estimates
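Eq. 9.43 can be estimated by nonlinear least squares in PROC MODEL, writing the equation directly in terms of β1, β2, and ρ (a sketch; mydata, y, and x are placeholder names):

/* y(t) = b1*(1-rho) + b2*x(t) + rho*y(t-1) - rho*b2*x(t-1) + v(t) */
proc model data=mydata;
   parms b1 b2 rho;
   y = b1*(1 - rho) + b2*x + rho*lag(y) - rho*b2*lag(x);
   fit y;      /* nonlinear least squares */
run;
quit;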