Forecasting - Professor Jui-Sheng Chou (周瑞生)


Forecasting Methods

- Qualitative
- Quantitative
  - Causal
  - Time Series
    - Smoothing
    - Trend Projection
    - Trend Projection Adjusted for Seasonal Influence

General Linear Model

Models in which the parameters (β0, β1, . . . , βp) all have exponents of one are called linear models. This does not imply that the relationship between y and the xi's is linear.

A general linear model involving p independent variables is:

y = β0 + β1z1 + β2z2 + . . . + βpzp + ε

Each of the independent variables zj is a function of x1, x2, . . . , xk (the variables for which data have been collected).

© 2006 by Thomson Learning, a division of Thomson Asia Pte Ltd.

General Linear Model

The simplest case is when we have collected data for just one variable, x1, and want to estimate y by using a straight-line relationship. In this case z1 = x1, and the model is y = β0 + β1x1 + ε.

This model is called a simple first-order model with one predictor variable.


Estimated Multiple Regression Equation

A simple random sample is used to compute the sample statistics b0, b1, b2, . . . , bp that are used as the point estimators of the parameters β0, β1, β2, . . . , βp.

The estimated multiple regression equation is:

ŷ = b0 + b1x1 + b2x2 + . . . + bpxp

Estimation Process

Multiple Regression Model:
y = β0 + β1x1 + β2x2 + . . . + βpxp + ε

Multiple Regression Equation:
E(y) = β0 + β1x1 + β2x2 + . . . + βpxp

The unknown parameters are β0, β1, β2, . . . , βp.

Sample Data:
x1  x2  . . .  xp  y
.   .          .   .
.   .          .   .

Estimated Multiple Regression Equation:
ŷ = b0 + b1x1 + b2x2 + . . . + bpxp

The sample statistics are b0, b1, b2, . . . , bp; they provide estimates of β0, β1, β2, . . . , βp.

Least Squares Method

Least Squares Criterion:
min Σ(yi − ŷi)²

Computation of Coefficient Values: The formulas for the regression coefficients b0, b1, b2, . . . , bp involve the use of matrix algebra. We will rely on computer software packages to perform the calculations.
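Those matrix-algebra formulas amount to solving min Σ(yi − ŷi)² for the coefficient vector; a minimal sketch with NumPy's least-squares solver (the data here are invented for illustration, not from the slides):

```python
import numpy as np

# Hypothetical data: 5 observations on two independent variables.
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0], [5.0, 5.0]])
y = np.array([4.0, 4.0, 8.0, 8.0, 11.0])  # constructed as y = 1 + x1 + x2

# Prepend a column of ones so the first coefficient plays the role of b0.
X1 = np.column_stack([np.ones(len(y)), X])

# b minimizes the sum of squared residuals, min sum (y_i - yhat_i)^2.
b, *_ = np.linalg.lstsq(X1, y, rcond=None)
sse = float(np.sum((y - X1 @ b) ** 2))
print(b, sse)  # b ≈ [1, 1, 1], sse ≈ 0
```

Because y was built exactly from the model, the fit recovers the generating coefficients with zero residual.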

Multiple Regression Equation

Example: Butler Trucking Company. To develop better work schedules, the managers want to estimate the total daily travel time for their drivers.

Data (table not shown)

Multiple Regression Equation

MINITAB Output


Multiple Regression Model

Example: Programmer Salary Survey A software firm collected data for a sample of 20 computer programmers. A suggestion was made that regression analysis could be used to determine if salary was related to the years of experience and the score on the firm’s programmer aptitude test.

The years of experience, score on the aptitude test, and corresponding annual salary ($1000s) for a sample of 20 programmers is shown on the next slide.


Multiple Regression Model

Exper.  Score  Salary      Exper.  Score  Salary
4       78     24.0        9       88     38.0
7       100    43.0        2       73     26.6
1       86     23.7        10      75     36.2
5       82     34.3        5       81     31.6
8       86     35.8        6       74     29.0
10      84     38.0        8       87     34.0
0       75     22.2        4       79     30.1
1       80     23.1        6       94     33.9
6       83     30.0        3       70     28.2
6       91     33.0        3       89     30.0

Multiple Regression Model

Suppose we believe that salary (y) is related to the years of experience (x1) and the score on the programmer aptitude test (x2) by the following regression model:

y = β0 + β1x1 + β2x2 + ε

where:
y  = annual salary ($1000s)
x1 = years of experience
x2 = score on programmer aptitude test

Solving for the Estimates of β0, β1, β2

Input Data:
x1   x2    y
4    78    24
7    100   43
.    .     .
.    .     .
3    89    30

Computer Package for Solving Multiple Regression Problems

Least Squares Output:
b0 = …   b1 = …   b2 = …   R² = …   etc.

Solving for the Estimates of β0, β1, β2

Excel Worksheet (showing partial data entered):

    A           B                 C           D
1   Programmer  Experience (yrs)  Test Score  Salary ($K)
2   1           4                 78          24.0
3   2           7                 100         43.0
4   3           1                 86          23.7
5   4           5                 82          34.3
6   5           8                 86          35.8
7   6           10                84          38.0
8   7           0                 75          22.2
9   8           1                 80          23.1

Note: Rows 10-21 are not shown.

Solving for the Estimates of β0, β1, β2

Excel's Regression Dialog Box (not shown)

Solving for the Estimates of β0, β1, β2

Excel's Regression Equation Output:

            Coeffic.  Std. Err.  t Stat  P-value
Intercept   3.17394   6.15607    0.5156  0.61279
Experience  1.4039    0.19857    7.0702  1.9E-06
Test Score  0.25089   0.07735    3.2433  0.00478

Note: Columns F-I are not shown.

Estimated Regression Equation

SALARY = 3.174 + 1.404(EXPER) + 0.251(SCORE)

Note: Predicted salary will be in thousands of dollars.

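These coefficients can be reproduced with any least-squares routine; a sketch using NumPy on the 20 observations from the data table above (a check of the arithmetic, not the textbook's Excel procedure):

```python
import numpy as np

# Data from the programmer salary survey table.
exper = np.array([4, 7, 1, 5, 8, 10, 0, 1, 6, 6,
                  9, 2, 10, 5, 6, 8, 4, 6, 3, 3], dtype=float)
score = np.array([78, 100, 86, 82, 86, 84, 75, 80, 83, 91,
                  88, 73, 75, 81, 74, 87, 79, 94, 70, 89], dtype=float)
salary = np.array([24, 43, 23.7, 34.3, 35.8, 38, 22.2, 23.1, 30, 33,
                   38, 26.6, 36.2, 31.6, 29, 34, 30.1, 33.9, 28.2, 30])

# Design matrix with an intercept column.
X = np.column_stack([np.ones(20), exper, score])
(b0, b1, b2), *_ = np.linalg.lstsq(X, salary, rcond=None)
print(round(b0, 3), round(b1, 3), round(b2, 3))  # ≈ 3.174, 1.404, 0.251
```

The values agree with the Excel output shown above.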

Interpreting the Coefficients

In multiple regression analysis, we interpret each regression coefficient as follows: bi represents an estimate of the change in y corresponding to a 1-unit increase in xi when all other independent variables are held constant.

Interpreting the Coefficients

b1 = 1.404

Salary is expected to increase by $1,404 for each additional year of experience (when the score on the programmer aptitude test is held constant).

Interpreting the Coefficients

b2 = 0.251

Salary is expected to increase by $251 for each additional point scored on the programmer aptitude test (when the years of experience is held constant).

Multiple Coefficient of Determination

Relationship Among SST, SSR, SSE

SST = SSR + SSE

Σ(yi − ȳ)² = Σ(ŷi − ȳ)² + Σ(yi − ŷi)²

where:
SST = total sum of squares
SSR = sum of squares due to regression
SSE = sum of squares due to error
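The identity holds for any least-squares fit that includes an intercept and can be checked numerically; a small sketch with invented numbers:

```python
import numpy as np

# Hypothetical observations and a least-squares straight-line fit.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 2.5, 4.0, 4.5, 6.0])
b, a = np.polyfit(x, y, 1)            # slope, intercept
yhat = a + b * x

sst = np.sum((y - y.mean()) ** 2)     # total sum of squares
ssr = np.sum((yhat - y.mean()) ** 2)  # sum of squares due to regression
sse = np.sum((y - yhat) ** 2)         # sum of squares due to error
print(sst, ssr + sse)                 # the two totals agree
```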

Multiple Coefficient of Determination

Excel's ANOVA Output:

ANOVA        df   SS        MS        F         Significance F
Regression   2    500.3285  250.1643  42.76013  2.32774E-07
Residual     17   99.45697  5.85041
Total        19   599.7855

(SSR = 500.3285, SST = 599.7855)

Multiple Coefficient of Determination

R² = SSR/SST

R² = 500.3285/599.7855 = .83418

In general, R² always increases as independent variables are added to the model. Adjusting R² for the number of independent variables avoids overestimating the impact of adding an independent variable.

Adjusted Multiple Coefficient of Determination

Ra² = 1 − (1 − R²)(n − 1)/(n − p − 1)

with n denoting the number of observations and p denoting the number of independent variables.

Ra² = 1 − (1 − .834179)(20 − 1)/(20 − 2 − 1) ≈ .814671
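The adjustment is a one-line formula; a quick arithmetic check using the values from this example (n = 20 observations, p = 2 independent variables):

```python
# Adjusted R^2 from R^2, n observations, and p independent variables.
def adjusted_r2(r2, n, p):
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

ra2 = adjusted_r2(0.834179, n=20, p=2)
print(round(ra2, 6))  # 0.814671
```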

Adjusted Multiple Coefficient of Determination

Excel's Regression Statistics:

SUMMARY OUTPUT
Regression Statistics
Multiple R          0.913334059
R Square            0.834179103
Adjusted R Square   0.814670762
Standard Error      2.418762076
Observations        20

Assumptions About the Error Term

The error ε is a random variable with mean of zero.

The variance of ε, denoted by σ², is the same for all values of the independent variables.

The values of ε are independent.

The error ε is a normally distributed random variable reflecting the deviation between the y value and the expected value of y given by β0 + β1x1 + β2x2 + . . . + βpxp.

Multiple Regression Analysis with Two Independent Variables

Graph


Testing for Significance

In simple linear regression, the F and t tests provide the same conclusion.

In multiple regression, the F and t tests have different purposes.


Testing for Significance: F Test

The F test is used to determine whether a significant relationship exists between the dependent variable and the set of all the independent variables.

The F test is referred to as the test for overall significance.


Testing for Significance: t Test

If the F test shows an overall significance, the t test is used to determine whether each of the individual independent variables is significant.

A separate t test is conducted for each of the independent variables in the model.

We refer to each of these t tests as a test for individual significance.


Testing for Significance: F Test

Hypotheses:
H0: β1 = β2 = . . . = βp = 0
Ha: One or more of the parameters is not equal to zero.

Test Statistic:
F = MSR/MSE

Rejection Rule:
Reject H0 if p-value < α or if F > Fα, where Fα is based on an F distribution with p d.f. in the numerator and n − p − 1 d.f. in the denominator.

Testing for Significance: F Test

ANOVA Table for a Multiple Regression Model with p Independent Variables


F Test for Overall Significance

Hypotheses:
H0: β1 = β2 = 0
Ha: One or both of the parameters is not equal to zero.

Rejection Rule:
For α = .05 and d.f. = 2, 17: F.05 = 3.59
Reject H0 if p-value < .05 or F > 3.59

F Test for Overall Significance

Excel's ANOVA Output:

ANOVA        df   SS        MS        F         Significance F
Regression   2    500.3285  250.1643  42.76013  2.32774E-07
Residual     17   99.45697  5.85041
Total        19   599.7855

The Significance F value is the p-value used to test for overall significance.

F Test for Overall Significance

Test Statistic:
F = MSR/MSE = 250.16/5.85 = 42.76

Conclusion:
p-value < .05, so we can reject H0. (Also, F = 42.76 > 3.59.)
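The overall test is a single comparison; a sketch using the ANOVA figures above (the critical value 3.59 is F.05 with 2 and 17 d.f., as stated earlier):

```python
# Overall F test: compare F = MSR/MSE against the critical value.
msr, mse = 250.1643, 5.85041
f_stat = msr / mse
f_crit = 3.59  # F.05 with (2, 17) degrees of freedom

reject_h0 = f_stat > f_crit
print(round(f_stat, 2), reject_h0)  # 42.76 True
```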

Testing for Significance: t Test

Hypotheses:
H0: βi = 0
Ha: βi ≠ 0

Test Statistic:
t = bi / s_bi

Rejection Rule:
Reject H0 if p-value < α, or if t < −tα/2 or t > tα/2, where tα/2 is based on a t distribution with n − p − 1 degrees of freedom.

t Test for Significance of Individual Parameters

Hypotheses:
H0: βi = 0
Ha: βi ≠ 0

Rejection Rule:
For α = .05 and d.f. = 17: t.025 = 2.11
Reject H0 if p-value < .05 or if |t| > 2.11

t Test for Significance of Individual Parameters

Excel's Regression Equation Output:

            Coeffic.  Std. Err.  t Stat  P-value
Intercept   3.17394   6.15607    0.5156  0.61279
Experience  1.4039    0.19857    7.0702  1.9E-06
Test Score  0.25089   0.07735    3.2433  0.00478

Note: Columns F-I are not shown.

The t statistic and p-value in the Experience row are used to test for the individual significance of "Experience".

t Test for Significance of Individual Parameters

Excel's Regression Equation Output:

            Coeffic.  Std. Err.  t Stat  P-value
Intercept   3.17394   6.15607    0.5156  0.61279
Experience  1.4039    0.19857    7.0702  1.9E-06
Test Score  0.25089   0.07735    3.2433  0.00478

Note: Columns F-I are not shown.

The t statistic and p-value in the Test Score row are used to test for the individual significance of "Test Score".

t Test for Significance of Individual Parameters

Test Statistics:
t = b1/s_b1 = 1.4039/.19857 = 7.07
t = b2/s_b2 = .25089/.07735 = 3.24

Conclusions:
Reject both H0: β1 = 0 and H0: β2 = 0. Both independent variables are significant.
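The same ratios can be computed directly from the coefficient output above; a sketch (critical value 2.11 for 17 d.f., as stated earlier):

```python
# Individual t tests: t = coefficient / standard error.
coef = {"Experience": (1.4039, 0.19857), "Test Score": (0.25089, 0.07735)}
t_crit = 2.11  # t.025 with 17 degrees of freedom

t_stats = {name: b / sb for name, (b, sb) in coef.items()}
significant = {name: abs(t) > t_crit for name, t in t_stats.items()}
print({k: round(v, 2) for k, v in t_stats.items()}, significant)
```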

Testing for Significance: Multicollinearity

The term multicollinearity refers to the correlation among the independent variables.

When the independent variables are highly correlated (say, |r | > .7), it is not possible to determine the separate effect of any particular independent variable on the dependent variable.


Testing for Significance: Multicollinearity

If the estimated regression equation is to be used only for predictive purposes, multicollinearity is usually not a serious problem.

Every attempt should be made to avoid including independent variables that are highly correlated.

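A common first screen for multicollinearity is the pairwise correlation between predictors; a sketch with invented data in which x2 is nearly a linear function of x1:

```python
import numpy as np

# Hypothetical predictors: x2 closely tracks 2*x1.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])

r = np.corrcoef(x1, x2)[0, 1]
flag = abs(r) > 0.7  # rule-of-thumb threshold from the slide
print(round(r, 3), flag)
```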

Modeling Curvilinear Relationships

To account for a curvilinear relationship, we might set z1 = x1 and z2 = x1²:

y = β0 + β1x1 + β2x1² + ε

This model is called a second-order model with one predictor variable.
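A second-order model is still linear in the parameters, so ordinary least squares applies once the squared column is added; a sketch with invented data that follow an exact quadratic:

```python
import numpy as np

# Hypothetical observations that bend rather than follow a straight line.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([6.0, 11.0, 18.0, 27.0, 38.0, 51.0])  # exactly 3 + 2x + x^2

# z1 = x, z2 = x^2: the model stays linear in the parameters.
X = np.column_stack([np.ones_like(x), x, x ** 2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(b, 6))  # recovers [3, 2, 1]
```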

Modeling Curvilinear Relationships

Example: Reynolds, Inc. Managers at Reynolds want to investigate the relationship between length of employment of their salespeople and the number of electronic laboratory scales sold.

Data (table not shown)

Modeling Curvilinear Relationships

Scatter Diagram for the Reynolds Example


Modeling Curvilinear Relationships

Let us consider a simple first-order model and the estimated regression is Sales = 111 + 2.38 Months, where : Sales = number of electronic laboratory scales sold, Months = the number of months the salesperson has been employed

Slide ‹#›

Modeling Curvilinear Relationships

MINITAB output – first-order model


Modeling Curvilinear Relationships

Standardized Residual Plot - first-order model (not shown). The standardized residual plot suggests that a curvilinear relationship is needed.

Modeling Curvilinear Relationships

Reynolds Example: the second-order model. The estimated regression equation is:

Sales = 45.3 + 6.34 Months + .0345 MonthsSq

where Sales = number of electronic laboratory scales sold and MonthsSq = the square of the number of months the salesperson has been employed.

Modeling Curvilinear Relationships

MINITAB output – second-order model


Modeling Curvilinear Relationships

Standardized Residual plot – second-order model


Variable Selection Procedures

- Stepwise Regression
- Forward Selection
- Backward Elimination

These are iterative procedures; one independent variable at a time is added or deleted based on the F statistic.

Variable Selection: Stepwise Regression

1. Start with no independent variables in the model.
2. Compute the F statistic and p-value for each independent variable in the model. If any p-value is greater than alpha-to-remove, the independent variable with the largest p-value is removed from the model; begin the next iteration.
3. Otherwise, compute the F statistic and p-value for each independent variable not in the model. If any p-value is less than alpha-to-enter, the independent variable with the smallest p-value is entered into the model; begin the next iteration.
4. If no variable can be removed and no variable can be entered, stop.

Variable Selection: Forward Selection

1. Start with no independent variables in the model.
2. Compute the F statistic and p-value for each independent variable not in the model.
3. If any p-value is less than alpha-to-enter, the independent variable with the smallest p-value is entered into the model; return to step 2.
4. Otherwise, stop.

Variable Selection: Backward Elimination

1. Start with all independent variables in the model.
2. Compute the F statistic and p-value for each independent variable in the model.
3. If any p-value is greater than alpha-to-remove, the independent variable with the largest p-value is removed from the model; return to step 2.
4. Otherwise, stop.
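The forward selection loop can be sketched in a few lines. The partial F statistic below follows the usual (drop in SSE)/(MSE of the larger model) form, and f_enter stands in for the alpha-to-enter cutoff (a critical value rather than a p-value, to keep the sketch dependency-free); all names and data are invented:

```python
import numpy as np

def sse(X, y):
    """Sum of squared errors of a least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X]) if X.size else np.ones((len(y), 1))
    b, *_ = np.linalg.lstsq(X1, y, rcond=None)
    r = y - X1 @ b
    return float(r @ r)

def forward_select(X, y, f_enter):
    """Greedy forward selection: enter the variable with the largest
    partial F statistic as long as it exceeds f_enter."""
    n, k = X.shape
    chosen = []
    while len(chosen) < k:
        sse_cur = sse(X[:, chosen], y)
        best = None
        for j in range(k):
            if j in chosen:
                continue
            sse_new = sse(X[:, chosen + [j]], y)
            p = len(chosen) + 1  # variables in the candidate model
            f = (sse_cur - sse_new) / (sse_new / (n - p - 1))
            if best is None or f > best[1]:
                best = (j, f)
        if best is None or best[1] < f_enter:
            break
        chosen.append(best[0])
    return chosen

# Hypothetical data: y depends on column 0; column 1 is pure noise.
rng = np.random.default_rng(0)
x1 = np.arange(10.0)
x2 = rng.normal(size=10)
y = 2.0 + 3.0 * x1 + rng.normal(scale=0.1, size=10)
sel = forward_select(np.column_stack([x1, x2]), y, f_enter=4.0)
print(sel)
```

The informative variable (column 0) is entered first because it produces by far the largest drop in SSE.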

Qualitative Independent Variables

In many situations we must work with qualitative independent variables such as gender (male, female), method of payment (cash, check, credit card), etc.

For example, x2 might represent gender, where x2 = 0 indicates male and x2 = 1 indicates female. In this case, x2 is called a dummy or indicator variable.

Qualitative Independent Variables

Example: Programmer Salary Survey. As an extension of the problem involving the computer programmer salary survey, suppose that management also believes that the annual salary is related to whether the individual has a graduate degree in computer science or information systems.

The years of experience, the score on the programmer aptitude test, whether the individual has a relevant graduate degree, and the annual salary ($1000) for each of the sampled 20 programmers are shown on the next slide.


Qualitative Independent Variables

Exper. Score 4 7 1 5 8 10 0 1 6 6 78 100 86 82 86 84 75 80 83 91 Degr.

Salary No Yes No Yes Yes Yes No No No Yes 24 43 23.7

34.3

35.8

38 22.2

23.1

30 33 Exper.

9 2 10 5 6 8 4 6 3 3 Score Degr.

Salary 88 73 75 81 74 87 79 94 70 89 Yes No Yes No No Yes No Yes No No 38 26.6

36.2

31.6

29 34 30.1

33.9

28.2

30 Slide ‹#›

Estimated Regression Equation

ŷ = b0 + b1x1 + b2x2 + b3x3

where:
x1 = years of experience
x2 = score on programmer aptitude test
x3 = 0 if individual does not have a graduate degree
     1 if individual does have a graduate degree

x3 is a dummy variable.

Qualitative Independent Variables

Excel's Regression Statistics:

SUMMARY OUTPUT
Regression Statistics
Multiple R          0.920215239
R Square            0.846796085
Adjusted R Square   0.818070351
Standard Error      2.396475101
Observations        20

Qualitative Independent Variables

Excel's ANOVA Output:

ANOVA        df   SS        MS        F         Significance F
Regression   3    507.896   169.2987  29.47866  9.41675E-07
Residual     16   91.88949  5.743093
Total        19   599.7855

Qualitative Independent Variables

Excel's Regression Equation Output:

             Coeffic.  Std. Err.  t Stat  P-value
Intercept    7.94485   7.3808     1.0764  0.2977
Experience   1.14758   0.2976     3.8561  0.0014
Test Score   0.19694   0.0899     2.1905  0.04364
Grad. Degr.  2.28042   1.98661    1.1479  0.26789

Note: Columns F-I are not shown.

The Grad. Degr. coefficient is not significant (p-value = .26789 > .05).

More Complex Qualitative Variables

If a qualitative variable has k levels, k - 1 dummy variables are required, with each dummy variable being coded as 0 or 1.

For example, a variable with levels A, B, and C could be represented by x 1 and x 2 values of (0, 0) for A, (1, 0) for B, and (0,1) for C.

Care must be taken in defining and interpreting the dummy variables.


More Complex Qualitative Variables

For example, a variable indicating level of education could be represented by x1 and x2 values as follows:

Highest Degree   x1   x2
Bachelor's       0    0
Master's         1    0
Ph.D.            0    1
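The k − 1 coding rule can be written mechanically; a sketch (the function name is my own, and the baseline level is taken to be the first one listed):

```python
def dummy_code(levels, value):
    """Code a k-level qualitative variable as k-1 dummy variables,
    using the first level as the baseline (all zeros)."""
    baseline, *rest = levels
    return [1 if value == lev else 0 for lev in rest]

levels = ["Bachelor's", "Master's", "Ph.D."]
print(dummy_code(levels, "Bachelor's"))  # [0, 0]
print(dummy_code(levels, "Master's"))    # [1, 0]
print(dummy_code(levels, "Ph.D."))       # [0, 1]
```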

Interaction

If the original data set consists of observations for y and two independent variables x1 and x2, we might develop a second-order model with two predictor variables:

y = β0 + β1x1 + β2x2 + β3x1² + β4x2² + β5x1x2 + ε

In this model, the variable z5 = x1x2 is added to account for the potential effects of the two variables acting together. This type of effect is called interaction.

Interaction

Example: Tyler Personal Care. For new shampoo products, the two factors believed to have the most influence on sales are unit selling price and advertising expenditure.

Data (table not shown)

Interaction

Mean Unit Sales (1000s) for the Tyler Personal Care Example

At higher selling prices, the effect of increased advertising expenditure diminishes. These observations provide evidence of interaction between the price and advertising expenditure variables.


Interaction

Mean Sales as a Function of Selling Price and Advertising Expenditure (figure not shown)

Interaction

To account for the effect of interaction, use the following regression model:

y = β0 + β1x1 + β2x2 + β3x1x2 + ε

where:
y  = unit sales (1000s)
x1 = price ($)
x2 = advertising expenditure ($1000s)
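Fitting the interaction model again reduces to linear least squares once the product column x1x2 is added to the design matrix; a sketch with invented data generated from known coefficients:

```python
import numpy as np

# Hypothetical price and advertising levels.
x1 = np.array([2.0, 2.0, 2.5, 2.5, 3.0, 3.0, 2.0, 3.0])
x2 = np.array([50.0, 100.0, 50.0, 100.0, 50.0, 100.0, 75.0, 75.0])
# Generate y exactly from y = 10 - 2*x1 + 0.5*x2 - 0.1*x1*x2 (no noise).
y = 10 - 2 * x1 + 0.5 * x2 - 0.1 * x1 * x2

# Design matrix with the interaction column x1*x2.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(b, 6))  # recovers [10, -2, 0.5, -0.1]
```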

Interaction

General Linear Model involving three independent variables (z1, z2, and z3):

y = β0 + β1z1 + β2z2 + β3z3 + ε

where:
y  = Sales = unit sales (1000s)
z1 = x1 (Price) = price of the product ($)
z2 = x2 (AdvExp) = advertising expenditure ($1000s)
z3 = x1x2 (PriceAdv) = interaction term (Price times AdvExp)

Interaction

MINITAB Output for the Tyler Personal Care Example
