Forecasting - California State University, Fullerton


CHAPTER 7
Forecasting Models
1
7.1 Introduction to Time Series Forecasting
• Forecasting is the process of predicting the future.
• Forecasting is an integral part of almost all business
enterprises.
• Examples
– Manufacturing firms forecast demand for their product, to
schedule manpower and raw material allocation.
– Service organizations forecast customer arrival patterns to
maintain adequate customer service.
2
Introduction
• More examples
– Security analysts forecast revenues, profits, and
debt ratios to make investment recommendations.
– Firms consider economic forecasts of indicators
(housing starts, changes in gross national product)
before deciding on capital investments.
3
Introduction
• Good forecasts can lead to
– Reduced inventory costs.
– Lower overall personnel costs.
– Increased customer satisfaction.
• The forecasting process can be based on:
– Educated guess.
– Expert opinions.
– Past history of data values, known as a time series.
4
Components of a Time Series
– Long-term trend
• A time series may be stationary or exhibit trend over time.
• Long-term trend is typically modeled as a linear, quadratic
or exponential function.
– Seasonal variation
• When a repetitive pattern is observed over some time
horizon, the series is said to have seasonal behavior.
• Seasonal effects are usually associated with calendar or
climatic changes.
• Seasonal variation is frequently tied to yearly cycles.
5
Components of a Time Series
– Cyclical variation
• An upturn or downturn not tied to seasonal variation.
• Usually results from changes in economic conditions.
– Random effects
6
(Figure: three example plots of time series value vs. time — a linear trend with seasonality time series, a linear trend time series, and a stationary time series, each extended into the future.)
7
Steps in the Time Series Forecasting Process
• The goal of a time series forecast is to identify factors
that can be predicted.
• This is a systematic approach involving the following
steps.
– Step 1: Hypothesize a form for the time series model.
– Step 2: Select a forecasting technique.
– Step 3: Prepare a forecast.
8
Steps in the Time Series Forecasting Process
Step 1: Identify components included in the time series
– Collect historical data.
– Graph the data vs. time.
– Hypothesize a form for the time series model.
– Verify this hypothesis statistically.
9
Steps in the Time Series Forecasting Process
• Step 2: Select a Forecasting Technique
– Select a forecasting technique from among several techniques
available. The selection includes
• Determination of input parameter values
• Performance evaluation on past data of each technique
• Step 3: Prepare a Forecast using the selected technique
10
7.2 Stationary Forecasting Models
• In a stationary model the mean value of the time series is
assumed to be constant.
• The general form of such a model is

  yt = b0 + et

where:
  yt = the value of the time series at time period t.
  b0 = the unchanged mean value of the time series.
  et = a random error term at time period t.
• The values of et are assumed to be independent, with a mean of 0.
11
Stationary Forecasting Models
• Checking for trend
• Use Linear Regression if et is normally distributed.
• Use a nonparametric test if et is not normally distributed.
• Checking for seasonality component
• Autocorrelation measures the relationship between the values of the time
series in different periods.
• Lag k autocorrelation measures the correlation between time series
values which are k periods apart.
– Autocorrelation between successive periods indicates a possible trend.
– Lag 7 autocorrelation indicates one week seasonality (daily data).
– Lag 12 autocorrelation indicates 12-month seasonality (monthly data).
• Checking for Cyclical Components
12
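The lag-k autocorrelation check described above can be computed directly. A minimal Python sketch (not part of the slides; the estimator divides the lag-k cross-products by the total sum of squares):

```python
def lag_autocorrelation(series, k):
    """Lag-k autocorrelation: correlation between values k periods apart."""
    n = len(series)
    mean = sum(series) / n
    denom = sum((y - mean) ** 2 for y in series)
    num = sum((series[t] - mean) * (series[t - k] - mean) for t in range(k, n))
    return num / denom

# A series repeating every 4 periods (e.g. quarterly seasonality) shows a
# strong lag-4 autocorrelation and a weak/negative lag-1 autocorrelation.
quarterly = [1, 2, 3, 4] * 8
print(lag_autocorrelation(quarterly, 4))  # close to 1
print(lag_autocorrelation(quarterly, 1))  # negative
```

With daily data, the same function at k = 7 would flag one-week seasonality, and at k = 12 with monthly data, 12-month seasonality.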
Moving Average Methods
• The last period technique
The forecast for the next period is
the last observed value.
Ft 1  y t
t
tt+1 tt+1 t+1
t
t+1
13
Moving Average Methods
• The moving average method
The forecast is the average of
the last n observations of the
time series.
  Ft+1 = (yt + yt−1 + … + yt−n+1) / n
14
Moving Average Methods
• The weighted moving average method
– More recent values of the time series
get larger weights than past values when performing
the forecast.
Ft 1 = w1yt + w2yt-1 +w3yt-2 + …+ wnyt-n+1
w1  w2 …  wn
Swi = 1
15
Moving Average Methods
• Forecasts for Future Time Periods
The forecast for time period t+1 is also the forecast for
all future time periods:
  Ft+k = Ft+1 for k = 1, 2, 3, …
This forecast is revised only when new data
becomes available.
16
YOHO BRAND YO-YOs
Moving Average Methods
• Galaxy Industries is interested in forecasting weekly
demand for its YoHo brand yo-yos.
• The yo-yo is a mature product. This year's demand
pattern is expected to repeat next year.
• To forecast next year's demand, the past 52 weeks'
demand records were collected.
17
YOHO BRAND YO-YOs
Moving Average Methods
• Three forecasting methods were suggested:
– Last period technique, suggested by Amy Chang.
– Four-period moving average, suggested by Bob Gunther.
– Four-period weighted moving average, suggested by Carlos Gonzalez.
• Management wants to determine:
– If a stationary model can be used.
– What forecast will be obtained using each method?
18
YOHO BRAND YO-YOs - Solution
• Construct the time series plot
Week-by-week demand (boxes) over the past 52 weeks:

Week Demand | Week Demand | Week Demand | Week Demand
  1   415  |  14   365  |  27   351  |  40   282
  2   236  |  15   471  |  28   388  |  41   399
  3   348  |  16   402  |  29   336  |  42   309
  4   272  |  17   429  |  30   414  |  43   435
  5   280  |  18   376  |  31   346  |  44   299
  6   395  |  19   363  |  32   252  |  45   522
  7   438  |  20   513  |  33   256  |  46   376
  8   431  |  21   197  |  34   378  |  47   483
  9   446  |  22   438  |  35   391  |  48   416
 10   354  |  23   557  |  36   217  |  49   245
 11   529  |  24   625  |  37   427  |  50   393
 12   241  |  25   266  |  38   293  |  51   482
 13   262  |  26   551  |  39   288  |  52   484

(Time series plot of demand vs. week: neither seasonality nor cyclical effects can be observed.)
19
Is trend present?
• Run linear regression to test b1 in the model yt=b0+b1t+et
• Excel results:

            Coeff.   Stand. Err   t-Stat    P-value   Lower 95%   Upper 95%
Intercept   369.27   27.79436     13.2857   5E-18     313.44      425.094
Weeks       0.3339   0.912641     0.36586   0.71601   -1.49919    2.16699

The large P-value for Weeks indicates
that there is little evidence that trend exists.
• Conclusion: A stationary model is appropriate.
20
Forecast for Week 53
• Last period technique (Amy's forecast):
  F53 = y52 = 484 boxes.
• Four-period moving average (Bob's forecast):
  F53 = (y52 + y51 + y50 + y49) / 4 =
  (484 + 482 + 393 + 245) / 4 = 401 boxes.
• Four-period weighted moving average (Carlos's forecast):
  F53 = 0.4y52 + 0.3y51 + 0.2y50 + 0.1y49 =
  0.4(484) + 0.3(482) + 0.2(393) + 0.1(245) = 441.3 boxes.
21
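The three suggested methods reduce to a few lines of arithmetic. A Python sketch (not from the slides) reproducing the week-53 forecasts from the last four observed demands:

```python
# Last four weekly demands from the YoHo series (weeks 49-52).
recent = [245, 393, 482, 484]  # y49, y50, y51, y52

# Last period technique: the forecast is the last observation.
last_period = recent[-1]

# Four-period moving average.
moving_avg = sum(recent) / 4

# Four-period weighted moving average, heaviest weight on the newest value.
weights = [0.4, 0.3, 0.2, 0.1]  # w1..w4, newest observation first
weighted_avg = sum(w * y for w, y in zip(weights, reversed(recent)))

print(last_period)              # 484
print(moving_avg)               # 401.0
print(round(weighted_avg, 1))   # 441.3
```

Since the series is stationary, each of these values also serves as the forecast for weeks 54 and 55 until new data arrive.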
Forecast for Weeks 54 and 55
• Since the time series is stationary, the forecasts
for weeks 54 and 55 remain as the forecast for
week 53.
• These forecasts will be revised pending
observation of the actual demand in week 53.
22
The Exponential Smoothing Technique
• This technique is used to forecast stationary time
series.
• All the previous values of historical data affect
the forecast.
23
The Exponential Smoothing Technique
• For each period create a smoothed value Lt of the time
series, that represents all the information known by t.
• The smoothed value Lt is the weighted average of
– The current period’s actual value (with weight of a).
– The forecast value for the current period (with weight of 1-a).
• The smoothed value Lt becomes the forecast for period t+1.
24
The Exponential Smoothing Technique
Define:
Ft+1 = the forecast value for time t+1
yt = the value of the time series at time t
a = smoothing constant
Ft 1  L t  ay t  (1  a)Ft
An initial “forecast” is needed to start the process.
25
The Exponential Smoothing Technique –
Generating an initial forecast
– Approach 1:
  Set F2 = L1 = y1, and continue from t = 3 with the recursive formula.
– Approach 2:
  • Average the initial n values of the time series.
  • Use this average as the forecast for period n + 1: Fn+1 = ȳn.
  • Begin using exponential smoothing from that time period
    onward: Fn+2 = ayn+1 + (1 − a)Fn+1 = ayn+1 + (1 − a)ȳn,
    and so on.
26
The Exponential Smoothing Technique –
Future Forecasts
• Since this technique deals with stationary time series,
the forecasts for future periods do not change.
• Assume N is the number of periods for which data are
available. Then
FN+1 = ayN + (1 – a)FN,
FN+k = FN+1, for k = 2, 3, …
27
YOHO BRAND YO - YOs
The Exponential Smoothing Technique
• An exponential smoothing forecast is suggested, with a = 0.1.
• An initial forecast is created at t = 2 by F2 = y1 = 415.
• The recursive formula is used from period 3 onward:
  F3 = .1y2 + .9F2 = .1(236) + .9(415) = 397.10
  F4 = .1y3 + .9F3 = .1(348) + .9(397.10) = 392.19
  and so on, until period 53 is reached (N + 1 = 52 + 1 = 53).
  F53 = .1y52 + .9F52 = .1(484) + .9(382.32) = 392.49
  F54 = F55 = 392.49 (= F53)
28
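The recursive formula is easy to run end to end. A Python sketch (not from the slides) using initialization approach 1 (F2 = y1), checked against the first YoHo forecasts:

```python
def exp_smoothing_forecasts(series, a):
    """Return the forecasts F2, F3, ..., F(n+1) for a series y1..yn,
    initialized with F2 = y1 (approach 1)."""
    forecasts = [series[0]]               # F2 = y1
    for y in series[1:]:
        # F(t+1) = a*yt + (1 - a)*Ft
        forecasts.append(a * y + (1 - a) * forecasts[-1])
    return forecasts                      # forecasts[k] is F(k+2)

# First five weeks of YoHo demand, a = 0.1 as on the slides:
print(exp_smoothing_forecasts([415, 236, 348, 272, 280], 0.1))
```

Running the same recursion over all 52 weeks reproduces F53 = 392.49, which then holds for weeks 54 and 55 as well.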
YOHO BRAND YO - YOs
The Exponential Smoothing Technique
(Excel)
Period   Series   Forecast
  1      415      #N/A
  2      236      415
  3      348      397.10
  4      272      392.19
  5      280      380.17
  …       …        …
 49      245      382.59
 50      393      368.83
 51      482      371.25
 52      484      382.32
 53               392.49
 54               392.49
 55               392.49
29
YOHO BRAND YO - YOs
The Exponential Smoothing Technique
(Excel)
(Chart: the actual demand series plotted with the smoothed forecast series over the 52 weeks. Notice the amount of smoothing included in the smoothed series.)
30
The Exponential Smoothing Technique
Average age of information
• Relationship between exponential smoothing and simple moving
average
– The two techniques will generate forecasts having the same average age of
information if
  k = (2 − a) / a
– This formula is a useful guide to the appropriate value for a.
  • An exponential smoothing forecast "based on a large number of periods"
    should have a small a.
  • A small a provides a lot of smoothing.
  • A large a provides a fast response to recent changes in the
    time series and a smaller amount of smoothing.
31
7.3 Evaluating the performance
of Forecasting Techniques
• Several forecasting methods have been
presented.
• Which one of these forecasting methods gives
the “best” forecast?
32
Performance Measures
• Generally, to evaluate each forecasting method:
– Select an evaluation measure.
– Calculate the value of the evaluation measure using the forecast
error equation
  et = yt − Ft
– Select the forecast with the smallest value of the evaluation
measure.
33
Performance Measures –
Sample Example
• Find the forecasts and the errors for each forecasting
technique applied to the following stationary time series.
Time                          1    2    3    4      5      6
Time series                  100  110   90   80    105    115
3-period MA forecast                        100    93.33  91.6
Error, 3-period MA                          -20    11.67  23.4
3-period WMA (.5, .3, .2)                    98    89     85.5
Error, 3-period WMA                         -18    16     29.5
34
Performance Measures
  MSE = S(et)² / n
  MAD = S|et| / n
  MAPE = S(|et| / yt) / n
  LAD = max |et|
35
Performance Measures –
MSE for the Sample Example
MSE for the moving average technique:
  MSE = S(et)²/n = [(−20)² + (11.67)² + (23.4)²] / 3 = 361.24

MSE for the weighted moving average technique:
  MSE = S(et)²/n = [(−18)² + (16)² + (29.5)²] / 3 = 483.4
36
Performance Measures –
MAD for the Sample Example
MAD for the moving average technique:
  MAD = S|et|/n = (|−20| + |11.67| + |23.4|) / 3 = 18.35

MAD for the weighted moving average technique:
  MAD = S|et|/n = (|−18| + |16| + |29.5|) / 3 = 21.17
37
Performance Measures –
MAPE for the Sample Example
MAPE for the moving average technique:
  MAPE = S(|et|/yt)/n = (|−20|/80 + |11.67|/105 + |23.4|/115) / 3 = .188

MAPE for the weighted moving average technique:
  MAPE = S(|et|/yt)/n = (|−18|/80 + |16|/105 + |29.5|/115) / 3 = .211
38
Performance Measures –
LAD for the Sample Example
LAD for the moving average technique:
  LAD = max|et| = max {|−20|, |11.67|, |23.4|} = 23.4
LAD for the weighted moving average technique:
  LAD = max|et| = max {|−18|, |16|, |29.5|} = 29.5
39
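The four measures can be computed together. A Python sketch (not from the slides) applied to the moving-average errors of the sample example; small differences from the slides (e.g. 361.25 vs. 361.24) come from intermediate rounding on the slides:

```python
def mse(errors):
    """Mean squared error."""
    return sum(e ** 2 for e in errors) / len(errors)

def mad(errors):
    """Mean absolute deviation."""
    return sum(abs(e) for e in errors) / len(errors)

def mape(errors, actuals):
    """Mean absolute percentage error (as a fraction)."""
    return sum(abs(e) / y for e, y in zip(errors, actuals)) / len(errors)

def lad(errors):
    """Largest absolute deviation."""
    return max(abs(e) for e in errors)

# Errors of the 3-period moving average in the sample example (periods 4-6):
errors = [-20, 11.67, 23.4]
actuals = [80, 105, 115]
print(round(mse(errors), 2), round(mad(errors), 2),
      round(mape(errors, actuals), 3), lad(errors))
```

Re-running with the weighted-moving-average errors [-18, 16, 29.5] reproduces the comparison on the slides; the method with the smaller value of the chosen measure wins.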
Performance Measures –
YOHO BRAND YO - YOs
Spreadsheet formulas (last period technique):
– Forecast: =B4; drag to cell C56.
– Error: =B5-C5.
– Absolute error: =ABS(D5).
– Squared error: =D5^2.
– Absolute % error: =E5/B5.
– Highlight cells D5:G5 and drag to cells D55:G55.
40
Performance Measures –
YOHO BRAND YO - YOs
Spreadsheet formulas (four-period moving average):
– Forecast begins at period 5.
– Forecast: =AVERAGE(B4:B7); drag to cell C56.
– Error: =B8-C8.
– Absolute error: =ABS(D8).
– Squared error: =D8^2.
– Absolute % error: =E8/B8.
– Highlight cells D8:G8 and drag to cells D55:G55.
– Forecasts for the following periods: =C56; drag to C58.
41
Performance Measures –
Selecting Model Parameters
• Use the performance measures to select a good set of
values for each model parameter.
– For the moving average:
• the number of periods (n).
– For the weighted moving average:
• The number of periods (n),
• The weights (Wi).
– For the exponential smoothing:
• The exponential smoothing factor (a).
• Excel Solver can be used to determine the values of the
model parameters.
42
Weights for the Weighted Moving Average
Minimize MSE using Solver:
– Target cell: the cell containing MSE (minimize).
– Changing cells: the weights.
– Constraints: the weights are nonincreasing, and the weights sum to 1.
43
Selecting Forecasting Technique
• Key issues considered when determining the
technique to be used in forecasting stationary time series:
– The degree of autocorrelation.
– The possibility of future shifts in time series values.
– The desired responsiveness of the forecasting technique.
– The amount of data required to calculate the forecast.
44
7.4 Time Series with Linear Trend
• If we suspect trend, we should assess whether the
trend is linear or nonlinear.
• Here we assume only linear trend.
yt = b0 + b1t + et
• Forecasting methods
– Linear regression forecast.
– Holt’s Linear Exponential Smoothing Technique.
45
The Linear Regression Approach
• Construct the regression equation based on the
historical data available.
• The independent variable is “time”.
The dependent variable is the “time-series value”.
• Forecasts should not be extended to periods far into the
future.
46
Holt’s Technique –
A qualitative demonstration
(Diagram: the smoothed levels L2, L3 and trends T2, T3 plotted against the observations y2, y3; the forecast F4 = L3 + T3 extends the updated trend line beyond period 3.)

  Lt = ayt + (1 − a)Ft
  Tt = γ(Lt − Lt−1) + (1 − γ)Tt−1
47
The Holt’s Technique
• Holt's Linear Exponential Smoothing Technique.
– Adjust the level Lt and the trend Tt in each period:
  Level:  Lt = ayt + (1 − a)Ft
  Trend:  Tt = γ(Lt − Lt−1) + (1 − γ)Tt−1
  Initial values: L2 = y2 and T2 = y2 − y1
where:
  a = smoothing constant for the time series level.
  γ = smoothing constant for the time series trend.
  Lt = estimate of the time series level for time t as of time t.
  Tt = estimate of the time series trend for time t as of time t.
  yt = value of the time series at time t.
  Ft = forecast of the value of the time series for time t, calculated at a
       period prior to time t.
48
Future Forecasts
• Forecasting k periods into the future
– By linear regression:
  Ft+k = b0 + b1(t + k)
– By Holt's linear exponential smoothing technique:
  Ft+k = Lt + kTt
49
American Family Products Corp.
• Standard and Poor’s (S&P) is a bond rating firm.
• It is conducting an analysis of American Family
Products Corp. (AFP).
• The forecast of year-end current assets is
required for years 11 and 12, based on data over
the previous 10 years.
50
American Family Products Corp.
• The company's assets have been increasing
at a relatively constant rate over time.
• Data: year-end current assets ($ millions)

Year             1     2     3     4     5     6     7     8     9    10
Current assets  1990  2280  2328  2635  3249  3310  3256  3533  3826  4119
51
AFP -SOLUTION
Forecasting with the Linear Regression Model
(Chart: current assets vs. year, years 1-10; the series rises roughly linearly from about 2000 to about 4100.)
52
AFP -SOLUTION
Forecasting with the Linear Regression Model
Regression Statistics
  Multiple R          0.980225251
  R Square            0.960841543
  Adjusted R Square   0.955946736
  Standard Error      149.0358255
  Observations        10

The regression equation: yt = 1788.2 + 229.89t

ANOVA
             df   SS            MS            F           Significance F
Regression    1   4360110.982   4360110.982   196.298     6.53249E-07
Residual      8   177693.4182   22211.67727
Total         9   4537804.4

           Coefficients   Standard Error   t Stat      P-value       Lower 95%    Upper 95%
Intercept  1788.2         101.8108511      17.5639     1.12773E-07   1553.42      2022.98
Year       229.8909091    16.40830435      14.0106     6.53249E-07   192.05       267.73
53
AFP -SOLUTION
Forecasting with the Linear Regression Model
Forecast formula: =$B$31+$B$32*A2; drag to cells C3:C13 to obtain the
forecasts for years 11 and 12.
54
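The fitted coefficients can be reproduced from the raw data with ordinary least squares. A Python sketch (pure stdlib, not from the slides); forecasts for years 11 and 12 are then just b0 + b1t:

```python
# AFP year-end current assets, years 1-10, in $ millions.
years = list(range(1, 11))
assets = [1990, 2280, 2328, 2635, 3249, 3310, 3256, 3533, 3826, 4119]

n = len(years)
t_bar = sum(years) / n
y_bar = sum(assets) / n

# Least-squares slope and intercept.
b1 = (sum((t - t_bar) * (y - y_bar) for t, y in zip(years, assets))
      / sum((t - t_bar) ** 2 for t in years))
b0 = y_bar - b1 * t_bar

print(round(b0, 1), round(b1, 2))   # matches the slide: 1788.2 and 229.89
print(round(b0 + b1 * 11, 1))       # forecast for year 11
print(round(b0 + b1 * 12, 1))       # forecast for year 12
```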
AFP -SOLUTION
Forecasting using the Holt’s technique
Demonstration of the calculation procedure, with a = 0.1 and γ = 0.2.

Year             1     2     3     4     5   …
Current assets  1990  2280  2328  2635  3249 …

Year 2: y2 = 2280
  L2 = y2                        → L2 = 2280.00
  T2 = y2 − y1                   → T2 = 2280 − 1990 = 290
  F3 = L2 + T2                   → F3 = 2280 + 290 = 2570.00
Year 3: y3 = 2328
  L3 = ay3 + (1 − a)F3           → L3 = (.1)(2328) + (1 − 0.1)(2570.00) = 2545.80
  T3 = γ(L3 − L2) + (1 − γ)T2    → T3 = (.2)(2545.80 − 2280) + (1 − 0.2)(290.00) = 285.16
  F4 = L3 + T3                   → F4 = 2545.80 + 285.16 = 2830.96
55
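The year-2 and year-3 updates above can be replayed step by step. A Python sketch (not from the slides) of one Holt iteration:

```python
# One step of Holt's updates with a = 0.1, gamma = 0.2, reproducing the
# year-3 calculation for AFP (y1 = 1990, y2 = 2280, y3 = 2328).
a, g = 0.1, 0.2
y1, y2, y3 = 1990, 2280, 2328

L2 = y2                 # initial level
T2 = y2 - y1            # initial trend = 290
F3 = L2 + T2            # forecast for year 3 = 2570

L3 = a * y3 + (1 - a) * F3           # 0.1(2328) + 0.9(2570) = 2545.80
T3 = g * (L3 - L2) + (1 - g) * T2    # 0.2(265.8) + 0.8(290) = 285.16
F4 = L3 + T3                         # forecast for year 4 = 2830.96

print(L3, T3, F4)
```

Iterating the same two updates through year 10 yields F11 = L10 + T10, and F11+k = L10 + (1 + k)T10 for later years.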
AFP -SOLUTION
Forecasting using the Holt’s technique (Excel)
INPUTS
  Number of periods of data collected = 10
  Smoothing constant (alpha) = 0.1
  Smoothing constant (gamma) = 0.2
  Initial forecast value (level) = 2280
  Initial forecast value (trend) = 290

OUTPUTS (period by period)
Period  Value  Level     Trend    Forecast   Error
  1     1990
  2     2280   2280      290
  3     2328   2545.80   285.16   2570       −242
  4     2635   2811.36   281.24   2830.96    −195.96
  5     3249   3108.24   284.37   3092.61    156.40
  6     3310   3384.35   282.72   3392.61    −82.61
  7     3256   3625.96   274.50   3667.07    −411.07
  8     3533   3863.71   267.15   3900.46    −367.46
  9     3826   4100.37   261.05   4130.86    −304.86
 10     4119   4337.18   256.20   4361.42    −242.42

Forecasts for future periods:
Period    11        12        13        14       15       16       17        18        19
Forecast  4593.38   4849.58   5105.78   5361.98  5618.18  5874.38  6130.58   6386.78   6642.98

Performance measures: MSE = 72994.37, MAD = 250.35, MAPE = 7.75%, LAD = 411.07.
56
7.5 Time Series with Trend, Seasonality,
and Cyclical Variation
• Many time series exhibit seasonal and cyclical variation
along with trend.
• Seasonality and cyclical variation arise due to calendar,
climate, or economic factors.
• Two models are considered:
– Additive model:
  yt = Tt + Ct + St + et
– Multiplicative model:
  yt = TtCtStet
where Tt is the trend component, Ct the cyclical component,
St the seasonal component, and et the random error.
57
The Classical Decomposition
• This technique can be used to develop an additive or
multiplicative model.
• The time series is first decomposed to its components
(trend, seasonality, cyclical variation).
• After these components have been determined, the
series is re-composed by
– adding the components - in the additive model
– multiplying the components - in the multiplicative model.
58
The Classical Decomposition- Procedure (1)
• Smooth the time series to remove random effects and seasonality:
  calculate moving averages.
• Determine "period factors" to isolate the (seasonal)(error) factor:
  calculate the ratio yt/MAt.
• Determine the "unadjusted seasonal factors" to eliminate the random
  component from the period factors:
  average all the yt/MAt that correspond to the same season.
59
The Classical Decomposition- Procedure (2)
• Determine the "adjusted seasonal factors":
  calculate [Unadjusted seasonal factor] / [Average seasonal factor].
• Determine "deseasonalized data values":
  calculate yt / [Adjusted seasonal factor]t.
• Determine a deseasonalized trend forecast:
  use linear regression on the deseasonalized time series.
• Determine an "adjusted seasonal forecast":
  calculate (Deseasonalized trend forecast)(Adjusted seasonal factor).
60
CANADIAN FACULTY ASSOCIATION (CFA)
• The CFA is the exclusive bargaining agent for the
state-supported Canadian college faculty.
• Membership in the organization has grown over the
years, but in the summer months there was always
a decline.
• To prepare the budget for the 2001 fiscal year, a
forecast of the average quarterly membership
covering the year 2001 is required.
61
CFA - Solution
• Membership records from 1997 through 2000 were
collected and graphed.
62
CFA - Solution
YEAR   PERIOD   QUARTER   AVERAGE MEMBERSHIP
1997      1        1           7130
          2        2           6940
          3        3           7354
          4        4           7556
1998      5        1           7673
          6        2           7332
          7        3           7662
          8        4           7809
1999      9        1           7872
         10        2           7551
         11        3           7989
         12        4           8143
2000     13        1           8167
         14        2           7902
         15        3           8268
         16        4           8436

The graph exhibits a long-term trend and a seasonality pattern.
63
Classical Decomposition – step 1:
Isolating Trend and Cyclical Components
• Smooth the time series to remove random effects and seasonality:
  calculate moving averages.

Average membership for the first 4 periods
  = [7130 + 6940 + 7354 + 7556] / 4 = 7245.00
The first moving average is centered at quarter (1+4)/2 = 2.5.

Average membership for periods [2, 5]
  = [6940 + 7354 + 7556 + 7673] / 4 = 7380.75
The second moving average is centered at quarter (2+5)/2 = 3.5.

The centered moving average of the first two moving averages is
  [7245.00 + 7380.75] / 2 = 7312.875, centered at t = 3.
64
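Step 1 can be scripted directly. A Python sketch (not from the slides) computing the first centered moving average from the CFA data:

```python
# First six quarterly membership values from the CFA series.
membership = [7130, 6940, 7354, 7556, 7673, 7332]

# 4-period moving averages; ma[0] is centered at quarter 2.5,
# ma[1] at quarter 3.5, and so on.
ma = [sum(membership[i:i + 4]) / 4 for i in range(len(membership) - 3)]

# Averaging adjacent moving averages re-centers them on whole periods.
cma = [(ma[i] + ma[i + 1]) / 2 for i in range(len(ma) - 1)]
print(cma[0])  # centered moving average at t = 3: 7312.875
```

Running the comprehension over all 16 quarters produces the centered moving averages MAt used in step 2 to form the period factors yt/MAt.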
Classical Decomposition – step 2
[Seasonal × Random Error] Factors
• Determine "period factors" to isolate the (seasonal)(random error)
  factor: calculate the ratio yt/MAt.

The centered moving average represents only TtCt, so the
seasonal-random error factors are
  Stet = yt / TtCt
Example: in period 7 (3rd quarter of 1998):
  S7e7 = 7662 / 7643.875 = 1.00237
65
Classical Decomposition – step 3
Unadjusted Seasonal Factors
• Determine the "unadjusted seasonal factors" to eliminate the random
  component from the period factors:
  average all the yt/MAt that correspond to the same season.

Averaging the seasonal-random error factors eliminates the random factor
from Stet, leaving only the seasonality component for each season.
Example: unadjusted seasonal factor for the third quarter:
  S3 = {S3,97e3,97 + S3,98e3,98 + S3,99e3,99}/3 = {1.0056 + 1.0024 + 1.0079}/3 = 1.0053
66
Classical Decomposition – step 4
Adjusted Seasonal Factors
• Determine the "adjusted seasonal factors":
  calculate [Unadjusted seasonal factor] / [Average seasonal factor].

Without seasonality, the seasonal factor for each season would equal 1,
so the four seasonal factors would sum to 4. The adjustment of the
unadjusted seasonal factors maintains this sum of 4.
Example: the average seasonal factor is
  (1.0149 + .9658 + 1.00533 + 1.01624)/4 = 1.00057
The adjusted seasonal factor for the 3rd quarter:
  S3 / Average seasonal factor = 1.00533/1.00057 = 1.00476
(Excel's exact value = 1.004759.)

Adjusted seasonal factors:
  Quarter 1: 1.014325   Quarter 2: 0.965252   Quarter 3: 1.004759   Quarter 4: 1.015663
67
Classical Decomposition – step 5
The Deseasonalized Time Series
• Determine "deseasonalized data values":
  calculate yt / [Adjusted seasonal factor]t.

Deseasonalizing the time series gives yt/(Adjusted S)t = TtCtet.
Example: deseasonalized series value for the 2nd quarter, 1998:
  y6 / [Adjusted S2] = 7332 / 0.9652 = 7595.94
68
Classical Decomposition – step 5
The Time series trend
(Chart: deseasonalized average membership (payroll deduction) vs. period, periods 1-16. Seasonality has been substantially reduced; this graph represents TtCtet.)
69
Classical Decomposition – step 6
The Time series trend Component
• Determine a deseasonalized trend forecast:
  use linear regression on the deseasonalized time series.

Trend factor: Tt = 7069.6677 + 78.4046t
70
Classical Decomposition – step 7
The forecast
• Assuming no cyclic effects, the forecast becomes:
  F(quarter i, time N+k) = TN+k (Adjusted Si)
Trend factor: T17 = 7069.6677 + 78.4046(17) = 8402
Forecast(Q1, t = 17) = (8402)(1.01433) = 8523
71
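The step-7 forecast is the product of the trend value and the adjusted seasonal factor. A Python sketch (coefficients taken from the slides; note the slide truncates T17 ≈ 8402.55 to 8402):

```python
def trend(t):
    """Deseasonalized trend line fitted in step 6 (slide coefficients)."""
    return 7069.6677 + 78.4046 * t

adjusted_s1 = 1.014325            # adjusted seasonal factor, quarter 1
forecast = trend(17) * adjusted_s1

print(round(trend(17), 1))        # 8402.5 (the slide rounds this to 8402)
print(round(forecast))            # 8523
```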
Classical Decomposition – step 7
The forecast
• Assuming cyclic effects, create a series of the cyclic component
  as follows:
  (Deseasonalized time series) / (Trend component from the regression)
  = TtCtet / Tt = Ctet
• Smooth out the error component using moving averages to isolate Ct.
• Perform the forecast:
  F(quarter i, time N+k) = TN+k CN (Adjusted Si)
Classical Decomposition Template
73
The additive model –
The Multiple Regression Approach
For a time series with trend and seasonality:
  yt = Tt + St + Rt, which translates to
  yt = b0 + b1t + b2S1 + … + bkSk + et
where b1t is the trend element and the dummy variables S1, …, Sk
capture the seasonality element.
74
Troy’s Mobil Station
• Troy owns a gas station that experiences seasonal
variation in sales.
• In addition, due to a steady increase in population,
Troy feels that average sales are increasing generally.
75
Troy’s Mobil Station
• Data: gasoline sales over a five-year period.
(Chart: average daily gasoline sales (gallons) vs. period, periods 1-20; each year runs Fall, Winter, Spring, Summer.)
76
Troy’s Mobil Station –
Multiple Regression input data
Quarterly input data: the trend variable t and three seasonal dummy
variables (X1 = 1 in Fall, X2 = 1 in Winter, X3 = 1 in Spring; Summer is
the baseline with X1 = X2 = X3 = 0).

t   Sales   X1   X2   X3
1   3497     1    0    0   (Fall, Year 1)
2   3484     0    1    0   (Winter)
3   3553     0    0    1   (Spring)
4   3837     0    0    0   (Summer)
5   3726     1    0    0   (Fall, Year 2)
6   3589     0    1    0   (Winter)
…
77
Troy’s Mobil Station Template
78
Troy’s Mobil Station –
Multiple regression (Excel output)
(Excel output shows an extremely good fit: all the variables are linearly
related to sales, and the model is extremely useful.)
Troy’s Mobil Station –
Multiple regression (Graphical interpretation)
Seasonal offsets relative to Summer (the baseline):
  Fall: −155.00   Winter: −322.93   Spring: −248.27
80
Troy’s Mobil Station –
Performing the forecast
• The forecasting additive model is:
  Ft = 3610.625 + 58.33t − 155F − 323W − 248.27S
• Forecasts for year 5 are produced as follows:
  F(Year 5, Fall)   = 3610.625 + 58.33(21) − 155(1) − 323(0) − 248.27(0)
  F(Year 5, Winter) = 3610.625 + 58.33(22) − 155(0) − 323(1) − 248.27(0)
  F(Year 5, Spring) = 3610.625 + 58.33(23) − 155(0) − 323(0) − 248.27(1)
  F(Year 5, Summer) = 3610.625 + 58.33(24) − 155(0) − 323(0) − 248.27(0)
81
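The additive model's year-5 forecasts can be evaluated mechanically. A Python sketch (not from the slides), with dummy coding as in the input-data slide and Summer as the baseline:

```python
def forecast(t, season):
    """Troy's fitted additive model: trend plus seasonal dummy offsets
    (coefficients from the slide)."""
    dummies = {"Fall": (1, 0, 0), "Winter": (0, 1, 0),
               "Spring": (0, 0, 1), "Summer": (0, 0, 0)}
    f, w, s = dummies[season]
    return 3610.625 + 58.33 * t - 155 * f - 323 * w - 248.27 * s

# Year-5 forecasts, periods 21-24:
for t, season in zip(range(21, 25), ["Fall", "Winter", "Spring", "Summer"]):
    print(season, round(forecast(t, season), 2))
```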
Copyright 2002 John Wiley & Sons, Inc. All rights
reserved. Reproduction or translation of this work beyond
that named in Section 117 of the United States Copyright
Act without the express written consent of the copyright
owner is unlawful. Requests for further information
should be addressed to the Permissions Department, John
Wiley & Sons, Inc. Adopters of the textbook are granted
permission to make back-up copies for their own use
only, to make copies for distribution to students of the
course the textbook is used in, and to modify this material
to best suit their instructional needs. Under no
circumstances can copies be made for resale. The
Publisher assumes no responsibility for errors, omissions,
or damages, caused by the use of these programs or from
the use of the information contained herein.
82