Autocorrelation I


13.1 Autocorrelation: Nature and Detection

13.2 Aims and Learning Objectives

By the end of this session students should be able to:
• Explain the nature of autocorrelation
• Understand the causes and consequences of autocorrelation
• Perform tests to determine whether a regression model has autocorrelated disturbances

13.3 Nature of Autocorrelation

Autocorrelation is a systematic pattern in the errors, which can be either attracting (positive autocorrelation) or repelling (negative autocorrelation).

For efficiency (accurate estimation/prediction), all systematic information needs to be incorporated into the regression model.

13.4 Regression Model

$$Y_t = \beta_1 + \beta_2 X_{2t} + \beta_3 X_{3t} + U_t$$

No autocorrelation: $\mathrm{Cov}(U_i, U_j) = E(U_i U_j) = 0$ for $i \neq j$

Autocorrelation: $\mathrm{Cov}(U_i, U_j) = E(U_i U_j) \neq 0$ for $i \neq j$

In general: $E(U_t U_{t-s}) \neq 0$ for $s \neq 0$

13.5 [Figure: plots of the disturbance $U_t$ against time $t$ for three cases: positive autocorrelation (attracting pattern), no autocorrelation (random scatter), and negative autocorrelation (repelling pattern).]

13.6 Order of Autocorrelation

$$Y_t = \beta_1 + \beta_2 X_{2t} + \beta_3 X_{3t} + U_t$$

1st order: $U_t = \rho U_{t-1} + \varepsilon_t$
2nd order: $U_t = \rho_1 U_{t-1} + \rho_2 U_{t-2} + \varepsilon_t$
3rd order: $U_t = \rho_1 U_{t-1} + \rho_2 U_{t-2} + \rho_3 U_{t-3} + \varepsilon_t$

where $-1 < \rho < +1$.

We will assume first-order autocorrelation, AR(1): $U_t = \rho U_{t-1} + \varepsilon_t$
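The AR(1) recursion is easy to simulate, which helps build intuition for the attracting and repelling patterns in the earlier figure. A minimal Python sketch, assuming numpy; the values of rho and n are arbitrary illustrations:

```python
import numpy as np

def simulate_ar1(n, rho, sigma=1.0, seed=0):
    """Simulate U_t = rho * U_{t-1} + eps_t, with eps_t ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, sigma, n)
    u = np.zeros(n)
    for t in range(1, n):
        u[t] = rho * u[t - 1] + eps[t]
    return u

u_pos = simulate_ar1(200, rho=0.8)    # attracting: neighbours share the same sign
u_neg = simulate_ar1(200, rho=-0.8)   # repelling: neighbours tend to alternate sign
```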

13.7 Causes of Autocorrelation

Direct:
• Inertia or persistence

Indirect:
• Omitted variables
• Spatial correlation
• Functional form
• Cyclical influences
• Seasonality

13.8 Consequences of Autocorrelation

1. Ordinary least squares is still linear and unbiased.
2. Ordinary least squares is not efficient.
3. The usual formulas give incorrect standard errors for least squares.
4. Confidence intervals and hypothesis tests based on the usual standard errors are wrong.

13.9

Consider the fitted model:

$$Y_t = \hat\beta_1 + \hat\beta_2 X_t + e_t$$

Autocorrelated disturbances: $E(e_t e_{t-s}) \neq 0$

Formula for the ordinary least squares variance (no autocorrelation in disturbances):

$$\mathrm{Var}(\hat\beta_2) = \frac{\sigma^2}{\sum x_t^2}$$

Formula for the ordinary least squares variance (autocorrelated disturbances):

$$\mathrm{Var}(\hat\beta_2) = \frac{\sigma^2}{\sum x_t^2}\left[\,1 + \frac{2}{\sum x_t^2}\sum_{i<j}\rho^{k}\,x_i x_j\,\right], \qquad k = j - i$$

Therefore, when errors are autocorrelated, ordinary least squares estimators are inefficient (i.e. not "best").
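A small Monte Carlo experiment illustrates consequences 2 to 4: with positively autocorrelated disturbances and a slowly-moving regressor, the usual OLS formula typically understates the true sampling variability of $\hat\beta_2$. This is a sketch under assumed parameter values (rho = 0.8, n = 100), not a general proof:

```python
import numpy as np

rng = np.random.default_rng(1)
n, rho, reps = 100, 0.8, 2000
x = rng.normal(size=n).cumsum()          # a slowly-moving regressor, fixed across replications
xc = x - x.mean()
sxx = np.sum(xc ** 2)

slopes, usual_se = [], []
for _ in range(reps):
    u = np.zeros(n)
    for t in range(1, n):                # AR(1) disturbances
        u[t] = rho * u[t - 1] + rng.normal()
    y = 1.0 + 2.0 * x + u
    b2 = np.sum(xc * (y - y.mean())) / sxx          # OLS slope estimate
    resid = (y - y.mean()) - b2 * xc
    s2 = np.sum(resid ** 2) / (n - 2)               # usual sigma^2 estimate
    slopes.append(b2)
    usual_se.append(np.sqrt(s2 / sxx))              # no-autocorrelation variance formula

print("empirical sd of slope:", np.std(slopes))     # the 'true' sampling variability
print("mean usual OLS se    :", np.mean(usual_se))  # typically too small in this setting
```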

13.10 Detecting Autocorrelation

$$\hat{Y}_t = \hat\beta_1 + \hat\beta_2 X_{2t} + \hat\beta_3 X_{3t}$$

The residuals $e_t = Y_t - \hat{Y}_t$ provide proxies for the disturbances $U_t$.

Preliminary Analysis (Informal Tests)
• Data: autocorrelation often occurs in time-series data (exceptions: spatial correlation, panel data)
• Graphical examination of residuals: plot $e_t$ against time, or against $e_{t-1}$, to see if there is a relation (see the sketch below)
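A minimal plotting sketch of both informal checks (Python with matplotlib assumed; the residual series here is simulated purely for illustration):

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy residuals with positive autocorrelation, for illustration only.
rng = np.random.default_rng(2)
resid = np.zeros(100)
for t in range(1, 100):
    resid[t] = 0.7 * resid[t - 1] + rng.normal()

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(resid, marker="o")                      # e_t against time
ax1.axhline(0.0, color="grey", linewidth=1)
ax1.set(xlabel="t", ylabel="e_t", title="Residuals against time")
ax2.scatter(resid[:-1], resid[1:])               # e_t against e_{t-1}
ax2.set(xlabel="e_{t-1}", ylabel="e_t", title="e_t against e_{t-1}")
plt.tight_layout()
plt.show()
```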

13.11 Formal Tests for Autocorrelation

Runs test: analyse the uninterrupted sequences (runs) of residuals with the same sign
Durbin-Watson (DW) d test: ratio of the sum of squared differences in successive residuals to the residual sum of squares
Breusch-Godfrey LM test: a more general test which does not assume the disturbances are AR(1)

13.12 Durbin-Watson d Test

$$H_0: \rho = 0 \quad \text{vs.} \quad H_1: \rho \neq 0, \; \rho > 0, \; \text{or} \; \rho < 0$$

The Durbin-Watson test statistic, d, is:

$$d = \frac{\sum_{t=2}^{n} (e_t - e_{t-1})^2}{\sum_{t=1}^{n} e_t^2}$$

i.e. the ratio of the sum of squared differences in successive residuals to the residual sum of squares.
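The statistic is one line of numpy. A minimal sketch, where the residual series `e` is assumed to come from a previously fitted model:

```python
import numpy as np

def durbin_watson_d(e):
    """d = sum_{t=2..n} (e_t - e_{t-1})^2 / sum_{t=1..n} e_t^2."""
    e = np.asarray(e, dtype=float)
    return float(np.sum(np.diff(e) ** 2) / np.sum(e ** 2))

# statsmodels ships the same computation:
# from statsmodels.stats.stattools import durbin_watson
# durbin_watson(e)
```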

13.13

The test statistic d is approximately related to $\hat\rho$ as:

$$d \approx 2(1 - \hat\rho)$$

When $\hat\rho = 0$, the Durbin-Watson statistic is $d \approx 2$.
When $\hat\rho = 1$, the Durbin-Watson statistic is $d \approx 0$.
When $\hat\rho = -1$, the Durbin-Watson statistic is $d \approx 4$.

13.14 DW d Test: 4 Steps

Step 1: Estimate
$$\hat{Y}_i = \hat\beta_1 + \hat\beta_2 X_{2i} + \hat\beta_3 X_{3i}$$
and obtain the residuals.

Step 2: Compute the DW d test statistic.

Step 3: Obtain $d_L$ and $d_U$, the lower and upper critical points, from the Durbin-Watson tables.

13.15 Step 4: Implement the following decision rule:

Value of d relative to d_L and d_U     Decision
d < d_L                                Reject null of no positive autocorrelation
d_L ≤ d ≤ d_U                          No decision
d_U < d < 4 - d_U                      Do not reject null of no positive or negative autocorrelation
4 - d_U ≤ d ≤ 4 - d_L                  No decision
d > 4 - d_L                            Reject null of no negative autocorrelation
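The decision rule translates directly into code. A minimal sketch; the critical values d_L and d_U must still be looked up in the Durbin-Watson tables for the relevant sample size and number of regressors, and the numbers in the usage comment are hypothetical:

```python
def dw_decision(d, d_L, d_U):
    """Step 4 of the DW test, given table values d_L < d_U."""
    if d < d_L:
        return "Reject null of no positive autocorrelation"
    if d <= d_U:
        return "No decision"
    if d < 4 - d_U:
        return "Do not reject null of no positive or negative autocorrelation"
    if d <= 4 - d_L:
        return "No decision"
    return "Reject null of no negative autocorrelation"

# Hypothetical usage with illustrative table values:
# dw_decision(0.9, d_L=1.51, d_U=1.65)   # -> reject: positive autocorrelation
```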

13.16 Restrictive Assumptions:
• There is an intercept in the model
• X values are non-stochastic
• Disturbances are AR(1)
• Model does not include a lagged dependent variable as an explanatory variable, e.g.

$$Y_t = \beta_1 + \beta_2 X_{2t} + \beta_3 X_{3t} + \beta_4 Y_{t-1} + U_t$$

13.17 Breusch-Godfrey LM Test

This test is valid with lagged dependent variables and can be used to test for higher-order autocorrelation.

Suppose, for example, that we estimate:

$$Y_t = \beta_1 + \beta_2 X_{2t} + \beta_3 X_{3t} + \beta_4 Y_{t-1} + U_t$$

and wish to test for autocorrelation of the form:

$$U_t = \rho_1 U_{t-1} + \rho_2 U_{t-2} + \rho_3 U_{t-3} + v_t$$

13.18 Breusch-Godfrey LM Test: 4 Steps

Step 1: Estimate $Y_t = \beta_1 + \beta_2 X_{2t} + \beta_3 X_{3t} + \beta_4 Y_{t-1} + U_t$ and obtain the residuals $e_t$.

Step 2: Estimate the following auxiliary regression model:

$$e_t = b_1 + b_2 X_{2t} + b_3 X_{3t} + b_4 Y_{t-1} + c_1 e_{t-1} + c_2 e_{t-2} + c_3 e_{t-3} + w_t$$

13.19 Breusch-Godfrey LM Test (continued)

Step 3: For large sample sizes, the test statistic is:

$$(n - p)\,R^2 \sim \chi^2_p$$

where $R^2$ comes from the auxiliary regression and $p$ is the number of lagged residual terms (here $p = 3$).

Step 4: If the test statistic exceeds the critical chi-square value, we can reject the null hypothesis of no serial correlation in any of the $\rho$ terms.
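statsmodels implements the whole procedure; its LM statistic is a standard variant of the $(n-p)R^2$ statistic above, so the conclusions are asymptotically equivalent. A minimal sketch on simulated data (all parameter values are illustrative, and the simulated model omits the lagged dependent variable for brevity; the call is identical when one is included):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

# Simulated data with AR(1) disturbances, for illustration only.
rng = np.random.default_rng(3)
n = 200
x2, x3 = rng.normal(size=n), rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.6 * u[t - 1] + rng.normal()
y = 1.0 + 0.5 * x2 - 0.3 * x3 + u

# Step 1: estimate the model and keep the fitted results.
X = sm.add_constant(np.column_stack([x2, x3]))
results = sm.OLS(y, X).fit()

# Steps 2-4: auxiliary regression on 3 lagged residuals, LM statistic,
# and chi-square p-value, all in one call.
lm_stat, lm_pval, f_stat, f_pval = acorr_breusch_godfrey(results, nlags=3)
print(f"LM = {lm_stat:.2f}, p-value = {lm_pval:.4f}")  # small p-value: reject no serial correlation
```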

13.20 Summary

In this lecture we have:
1. Analysed the theoretical causes and consequences of autocorrelation
2. Described a number of methods for detecting the presence of autocorrelation