Transcript 8114-08

Session 8
Overview
Forecasting Methods
• Exponential Smoothing
– Simple
– Trend (Holt’s Method)
– Seasonality (Winters’ Method)
• Regression
– Trend
– Seasonality
– Lagged Variables
Applied Regression -- Prof. Juran
Forecasting
1. Analysis of Historical Data
• Time Series (Extrapolation)
• Regression (Causal)
2. Projecting Historical Patterns into the Future
3. Measurement of Forecast Quality
Measuring Forecasting Errors
• Mean Absolute Error
• Mean Absolute Percent Error
• Root Mean Squared Error
• R-square
Mean Absolute Error

MAE = (1/n) Σ_{i=1}^{n} | Y_i − Ŷ_i |
Mean Absolute Percent Error

MAPE = 100% × (1/n) Σ_{i=1}^{n} | Y_i − Ŷ_i | / Y_i

Or, alternatively: 100% × (1/n) Σ_{i=1}^{n} | Y_i − Ŷ_i | / Ŷ_i
Root Mean Squared Error

RMSE = √( (1/n) Σ_{i=1}^{n} ( Y_i − Ŷ_i )² ) = √( SSE / n )
R-Square

R² = 1 − SSE/TSS = SSR/TSS
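The four error measures above can be computed directly from paired actual and forecast values. A minimal sketch in Python; the function name and the sample series are illustrative, not from the slides:

```python
import math

def forecast_errors(actual, forecast):
    """Compute MAE, MAPE, RMSE, and R-square for paired actual/forecast values."""
    n = len(actual)
    errors = [a - f for a, f in zip(actual, forecast)]
    mae = sum(abs(e) for e in errors) / n
    mape = 100.0 * sum(abs(e) / a for e, a in zip(errors, actual)) / n
    sse = sum(e * e for e in errors)          # sum of squared errors
    rmse = math.sqrt(sse / n)
    mean_a = sum(actual) / n
    tss = sum((a - mean_a) ** 2 for a in actual)
    r_square = 1.0 - sse / tss
    return mae, mape, rmse, r_square

# Illustrative data, not the GM series
mae, mape, rmse, r2 = forecast_errors([100, 110, 120], [98, 113, 118])
```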
Trend Analysis
• Part of the variation in Y is believed to be “explained” by the passage of time
• Several convenient models are available in an Excel chart
Example: Revenues at GM
[Chart: GM Revenue by quarter, 1-91 through 1-02]
You can use “Add Chart Element – Trendline” to superimpose a trend line on the graph.
GM Revenue - Linear Trend
[Chart: trend line y = 340.23x + 31862, R² = 0.6618]
GM Revenue - Logarithmic Trend
[Chart: trend line y = 5162.3·Ln(x) + 24937, R² = 0.6601]
GM Revenue - Polynomial Trend
[Chart: trend line y = -5.6121x² + 604x + 29752, R² = 0.6872]
GM Revenue - Power Trend
[Chart: trend line y = 26532x^0.1372, R² = 0.6783]
GM Revenue - Exponential Trend
[Chart: trend line y = 32044e^(0.0088x), R² = 0.6505]
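Each of these trendline choices is an ordinary least-squares fit after a suitable transform. A sketch of the five fits using numpy; the series here is just the first eight GM quarterly revenues in thousands, for illustration:

```python
import numpy as np

# Quarterly index 1..n and the first eight GM revenues (in thousands)
x = np.arange(1, 9, dtype=float)
y = np.array([29.2, 31.3, 28.9, 33.6, 32.0, 35.2, 29.4, 35.8])

# Linear trend: y = b*x + a
b_lin, a_lin = np.polyfit(x, y, 1)

# Logarithmic trend: y = b*ln(x) + a  (regress y on ln x)
b_log, a_log = np.polyfit(np.log(x), y, 1)

# Quadratic (polynomial) trend: y = c*x^2 + b*x + a
c_q, b_q, a_q = np.polyfit(x, y, 2)

# Power trend: y = a*x^b  (linear in logs: ln y = ln a + b*ln x)
b_pow, ln_a_pow = np.polyfit(np.log(x), np.log(y), 1)

# Exponential trend: y = a*e^(b*x)  (ln y = ln a + b*x)
b_exp, ln_a_exp = np.polyfit(x, np.log(y), 1)
```

Note that the power and exponential fits minimize squared error in log space, which is what Excel's trendline does as well.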
You can also show moving-average trend lines, although showing the equation and R-square is no longer an option:

GM Revenue - 4-Period Moving Average
[Chart: quarterly revenue with 4-period moving average, 1-91 through 1-02]
GM Revenue - 3-Period Moving Average
[Chart: quarterly revenue with 3-period moving average, 1-91 through 1-02]
GM Revenue - 2-Period Moving Average
[Chart: quarterly revenue with 2-period moving average, 1-91 through 1-02]
Simple Exponential Smoothing
Basically, this method uses a forecast formula of the form:

F_{t+k} = L_t

Forecast “k” periods in the future = Current “Level”

L_t = αY_t + (1 − α)L_{t−1}

Current Level = Weighted Current Observed Value + Weighted Previous Level

Note that the weights must add up to 1.0.
Why is it called “exponential”?

L_t = L_{t−1} + α(Y_t − L_{t−1})
    = αY_t + α(1 − α)Y_{t−1} + α(1 − α)²Y_{t−2} + α(1 − α)³Y_{t−3} + …

The weights on past observations decline exponentially, hence the name. See p. 918 in W&A for more details.
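The level recursion above can be sketched in a few lines of Python. This reproduces the first spreadsheet values shown below (29410.0, 29359.0, 29783.1) with alpha = 0.10; the function name is mine:

```python
def simple_smoothing(y, alpha):
    """Simple exponential smoothing: L_t = alpha*Y_t + (1-alpha)*L_{t-1}.
    The one-step-ahead forecast for period t+1 is the current level L_t."""
    level = y[0]              # initialize the level at the first observation
    forecasts = [None]        # no forecast for the first period
    for obs in y[1:]:
        forecasts.append(level)               # forecast = previous level
        level = alpha * obs + (1 - alpha) * level
    return forecasts, level

# First four GM revenue observations, alpha = 0.10
f, final_level = simple_smoothing([29200, 31300, 28900, 33600], 0.10)
```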
Example: GM Revenue
[Chart: GM Revenue by quarter, 1-91 through 1-02]
In this spreadsheet model, the forecasts appear in column G.
Alpha = 0.100

MAE  = 4014.376    =AVERAGE(I3:I47)
RMSE = 4690.974    =SQRT(AVERAGE(K3:K47))
MAPE = 11.148%     =AVERAGE(J3:J47)

Quarter  GM_Rev  SmLevel  Forecast  Error    abs(error)  abs(%error)  error^2
1-91     29200   29200.0
2-91     31300   29410.0  29200.0    2100.0    2100.0       7.2%      4410000.0
3-91     28900   29359.0  29410.0    -510.0     510.0       1.7%       260100.0
4-91     33600   29783.1  29359.0    4241.0    4241.0      14.4%     17986081.0
1-92     32000   30004.8  29783.1    2216.9    2216.9       7.4%      4914645.6
2-92     35200   30524.3  30004.8    5195.2    5195.2      17.3%     26990206.8
… (and so on through 4-94)

Key cell formulas: SmLevel =$B$1*E7+(1-$B$1)*F6; Forecast =F8; Error =E11-G11; abs(error) =ABS(H13); abs(%error) =ABS(H15/G15); error^2 =H17^2

Note that our model assumes that there is no trend. We use a default alpha of 0.10.
GM Revenue - Simple Smoothing (alpha 0.10)
[Chart: actual revenue vs. smoothed forecast, 1-91 through 1-02]
We use Solver to minimize RMSE by manipulating alpha.
Alpha = 0.350

MAE  = 3275.989
RMSE = 3653.272
MAPE = 8.584%

Quarter  GM_Rev  SmLevel  Forecast  Error     abs(error)  abs(%error)  error^2
1-91     29200   29200.0
2-91     31300   29935.9  29200.0    2100.0     2100.0       7.2%      4410000.0
3-91     28900   29572.9  29935.9   -1035.9     1035.9       3.5%      1073072.4
4-91     33600   30984.1  29572.9    4027.1     4027.1      13.6%     16217616.5
1-92     32000   31340.1  30984.1    1015.9     1015.9       3.3%      1032075.0
2-92     35200   32692.7  31340.1    3859.9     3859.9      12.3%     14898909.3
… (and so on through 4-94)

After optimizing, we see that alpha is 0.350 (instead of 0.10). This improves RMSE from 4691 to 3653.
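Solver's job here is to pick the alpha that minimizes RMSE. The same search can be approximated in plain Python with a grid over (0, 1]; the function names and the test series are mine, not from the slides:

```python
import math

def rmse_for_alpha(y, alpha):
    """RMSE of one-step-ahead simple-smoothing forecasts for a given alpha."""
    level = y[0]
    sse, count = 0.0, 0
    for obs in y[1:]:
        sse += (obs - level) ** 2       # forecast for this period = prior level
        count += 1
        level = alpha * obs + (1 - alpha) * level
    return math.sqrt(sse / count)

def best_alpha(y, step=0.001):
    """Grid-search stand-in for Solver: the alpha in (0, 1] minimizing RMSE."""
    candidates = [i * step for i in range(1, int(1 / step) + 1)]
    return min(candidates, key=lambda a: rmse_for_alpha(y, a))
```

A grid search is cruder than Solver's gradient-based search, but for a one-dimensional, bounded problem it is more than adequate.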
GM Revenue - Simple Smoothing (alpha 0.35)
[Chart: actual revenue vs. smoothed forecast, 1-91 through 1-02]
Exponential Smoothing with Trend:
Holt’s Method

Forecast: F_{t+k} = L_t + kT_t

Level: L_t = αY_t + (1 − α)(L_{t−1} + T_{t−1})
(Weighted Current Observation + Weighted Current Level)

Trend: T_t = β(L_t − L_{t−1}) + (1 − β)T_{t−1}
(Weighted Current Trend + Weighted Previous Trend)
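Holt's two recursions can be sketched as follows; the trend is initialized to 0, as in the spreadsheet below, and the function name is mine. With alpha = beta = 1 the method locks onto an exactly linear series after one step, which is a handy sanity check:

```python
def holt(y, alpha, beta):
    """Holt's method:
       L_t = alpha*Y_t + (1-alpha)*(L_{t-1} + T_{t-1})
       T_t = beta*(L_t - L_{t-1}) + (1-beta)*T_{t-1}
    One-step-ahead forecast: F_{t+1} = L_t + T_t."""
    level, trend = y[0], 0.0      # start the trend at 0
    forecasts = [None]            # no forecast for the first period
    for obs in y[1:]:
        forecasts.append(level + trend)
        new_level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
    return forecasts
```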
Smoothing constant(s): Level (alpha) = 0.266, Trend (beta) = 0.048

MAE  = 3094.683
RMSE = 3568.391
MAPE = 8.01%

Quarter  GM_Rev  SmLevel    SmTrend  Forecast    Error    abs(error)  abs(%error)  error^2
1-91     29200   29200.000    0.000
2-91     31300   29757.957   26.659  29200.000   2100.0    2100.0       7.2%      4410000.0
3-91     28900   29549.579   15.429  29784.616   -884.6     884.6       3.0%       782545.7
4-91     33600   30637.081   66.652  29565.008   4035.0    4035.0      13.6%     16281159.9
1-92     32000   31048.143   83.108  30703.733   1296.3    1296.3       4.2%      1680308.2
2-92     35200   32212.293  134.760  31131.251   4068.7    4068.7      13.1%     16554716.2
… (and so on through 4-94)

Key cell formulas: SmLevel =$B$2*E4+(1-$B$2)*(F3+G3); SmTrend =$B$3*(F6-F5)+(1-$B$3)*G5; Forecast =F7+G7; Error =E10-H10; abs(error) =ABS(I12); abs(%error) =ABS(I14/H14); error^2 =I16^2
GM Revenue - Holt’s Method (Smoothing with Trend)
[Chart: actual revenue vs. Holt forecast, 1-91 through 1-02]

Holt’s model with optimized smoothing constants. This model is slightly better than the simple model (RMSE drops from 3653 to 3568).
Exponential Smoothing with Seasonality:
Winters’ Method

This method includes an explicit term for seasonality, where M is the number of periods in a season. We will use M = 4 because we have quarterly data.

Level:       L_t = α(Y_t / S_{t−M}) + (1 − α)(L_{t−1} + T_{t−1})

Trend:       T_t = β(L_t − L_{t−1}) + (1 − β)T_{t−1}

Seasonality: S_t = γ(Y_t / L_t) + (1 − γ)S_{t−M}

Now, for any time k periods in the future, the forecast is given by:

F_{t+k} = (L_t + kT_t) · S_{t+k−M}

Note that the trend term is additive, and the seasonality term is multiplicative.
S_t = γ(Y_t / L_t) + (1 − γ)S_{t−M}

Weighted Current Seasonal Factor + Weighted Seasonal Factor from Last Year
Now, for any time k periods in the future, the forecast is given by:

F_{t+k} = (L_t + kT_t) · S_{t+k−M}

Note that the trend term is additive, and the seasonality term is multiplicative.

This is a little tricky at first, because we need a few periods of data to get the model started. The first forecast has no trend information (so we use 0 as the default), and the first four have no seasonality (so we use 1.0 as the default).
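The three Winters recursions, with the startup defaults just described (trend 0, seasonal factors 1.0), can be sketched as follows; the function name is mine. Setting gamma = 0 freezes all seasonal factors at 1.0, so the method then reduces to Holt's:

```python
def winters(y, alpha, beta, gamma, m=4):
    """Winters' method with multiplicative seasonality (m periods per season):
       L_t = alpha*(Y_t / S_{t-m}) + (1-alpha)*(L_{t-1} + T_{t-1})
       T_t = beta*(L_t - L_{t-1}) + (1-beta)*T_{t-1}
       S_t = gamma*(Y_t / L_t) + (1-gamma)*S_{t-m}
    One-step-ahead forecast: F_{t+1} = (L_t + T_t) * S_{t+1-m}."""
    level, trend = y[0], 0.0
    seasonal = {}                 # S_t by period index; defaults to 1.0 at startup
    forecasts = [None]            # no forecast for the first period
    for t in range(1, len(y)):
        s_old = seasonal.get(t - m, 1.0)       # seasonal factor from last year
        forecasts.append((level + trend) * s_old)
        new_level = alpha * (y[t] / s_old) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
        seasonal[t] = gamma * (y[t] / level) + (1 - gamma) * s_old
    return forecasts
```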
Smoothing constant(s): Level (alpha) = 0.312, Trend (beta) = 0.037, Seasonality (gamma) = 0.202

MAE  = 2670.440
RMSE = 3233.995
MAPE = 6.82%

Quarter  GM_Rev  SmLevel    SmTrend  SmSeason  Forecast    Error    abs(error)  abs(%error)  error^2
1-91     29200   29200.000    0.000   1.000
2-91     31300   29855.671   24.178   1.000
3-91     28900   29573.917   12.897   1.000
4-91     33600   30839.828   59.102   1.000
1-92     32000   31242.711   71.779   1.005    30898.931   1101.1    1101.1       3.6%      1212353.7
2-92     35200   32527.642  116.515   1.017    31314.491   3885.5    3885.5      12.4%     15097181.1
3-92     29400   31631.252   79.164   0.986    32644.157  -3244.2    3244.2       9.9%     10524553.4
… (and so on through 3-94)

Key cell formulas: SmLevel =$B$2*(E8/H4)+(1-$B$2)*(F7+G7); SmTrend =$B$3*(F9-F8)+(1-$B$3)*G8; SmSeason =$B$4*(E10/F10)+(1-$B$4)*(H6); Forecast =(F10+G10)*H7

Winters’ model with optimized smoothing constants. This model is better than the simple model and Holt’s model (as measured by RMSE).
GM Revenue - Winters Method (Smoothing with Trend and Seasonality)
[Chart: actual revenue vs. Winters forecast, 1-91 through 1-02]
Forecasting with Regression
Quarter  GM_Rev  GM_EPS  Trend  1Q  2Q  3Q
1-91     29200   -1.28     1    1   0   0
2-91     31300   -1.44     2    0   1   0
3-91     28900   -1.88     3    0   0   1
4-91     33600   -4.25     4    0   0   0
1-92     32000   -0.53     5    1   0   0
2-92     35200   -1.18     6    0   1   0
3-92     29400   -1.86     7    0   0   1
4-92     35800   -1.25     8    0   0   0
1-93     35000    0.42     9    1   0   0
2-93     36658    0.92    10    0   1   0
3-93     30138   -0.49    11    0   0   1
4-93     37268    1.28    12    0   0   0
1-94     37495    1.86    13    1   0   0
2-94     40392    2.23    14    0   1   0
3-94     34510    0.40    15    0   0   1
4-94     42553    1.74    16    0   0   0
1-95     43285    2.51    17    1   0   0
2-95     42204    2.39    18    0   1   0
… (and so on through the full 46-quarter sample)
Regression Statistics
Multiple R          0.8852
R Square            0.7835
Adjusted R Square   0.7624
Standard Error      2736.1392
Observations        46

ANOVA
             df    SS                MS               F        Significance F
Regression    4    1111067275.3109   277766818.8277   37.1026  0.0000
Residual     41    306944777.4065    7486457.9855
Total        45    1418012052.7174

            Coefficients   Standard Error   t Stat    P-value
Intercept    33286.7628    1101.4629        30.2205   0.0000
Trend          335.8508      30.4091        11.0444   0.0000
1Q           -1289.9144    1142.5337        -1.1290   0.2655
2Q             423.4015    1142.1290         0.3707   0.7128
3Q           -4582.6038    1167.0899        -3.9265   0.0003
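The design above (an intercept, a linear trend, and three quarterly dummies with Q4 as the baseline) can be reproduced with ordinary least squares. A sketch using numpy; the function name is mine:

```python
import numpy as np

def trend_seasonal_fit(y, m=4):
    """OLS of y on an intercept, a linear trend, and m-1 seasonal dummies
    (the last season is the baseline), via least squares."""
    n = len(y)
    trend = np.arange(1, n + 1)
    X = [np.ones(n), trend]
    for q in range(1, m):             # dummies for seasons 1..m-1
        X.append((trend % m == q).astype(float))
    X = np.column_stack(X)
    coefs, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    return coefs   # [intercept, trend, dummy_1, ..., dummy_{m-1}]
```

Including all four dummies plus an intercept would make the design matrix singular, which is why one season must be left out as the baseline.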
Which Method is Better?
The most reasonable statistic for comparison is probably RMSE for smoothing models vs. standard error for regression models, as is reported here:
Revenue      Mattel   McD      Lilly    GM         MSFT     ATT         Nike     GE         Coke     Ford
Regression   $99.37   $112.23  $109.22  $2,736.14  $154.25  $12,836.41  $279.45  $1,164.02  $164.20  $969.14
Winters'     $76.44   $84.92   $135.33  $3,234.00  $103.91  $14,622.26  $191.94  $1,184.06  $258.02  $1,648.61

EPS          Mattel   McD      Lilly    GM       MSFT     ATT      Nike     GE       Coke     Ford
Regression   $0.0874  $0.0205  $0.3727  $1.3882  $0.0687  $0.6879  $0.2080  $0.6591  $0.0164  $0.4988
Winters'     $0.1327  $0.0295  $0.5285  $1.2603  $0.0635  $0.5719  $0.2687  $0.7587  $0.0228  $1.3475

The regression models are superior most of the time (6 out of 10 revenue models and 7 out of 10 EPS models).
For GM, a regression model seems best for forecasting revenue, but a Winters model seems best for earnings:

[Charts: GM Revenue - Regression; GM Revenue - Winters Method (Smoothing with Trend and Seasonality); GM EPS - Regression; GM EPS - Winters Method (Smoothing with Trend and Seasonality); quarterly, 1-91 through 1-02]
For Nike, the Winters model is better for revenue, and the regression model is best for earnings.

[Charts: Nike Revenue (Regression); Nike Revenue (Winters); Nike EPS (Regression); Nike EPS (Winters); quarterly, 1-92 through 1-03]
Time series characterized by relatively consistent trends and seasonality favor the regression model. If the trend and seasonality are not stable over time, then Winters’ method does a better job of responding to their changing patterns.
Lagged Variables
• Only applicable in a causal model
• Effects of independent variables might not be felt immediately
• Used for advertising’s effect on sales
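A regression with a lagged predictor mirrors the motel-chain design below: sales regressed on the quarter index, current advertising, advertising lagged one period, and quarterly dummies. A sketch using numpy; the function name is mine, and the first observation is dropped because its lagged value is unknown:

```python
import numpy as np

def fit_with_lag(sales, adv, m=4):
    """Regress sales on quarter index, current advertising, advertising lagged
    one period, and m-1 quarterly dummies. The first observation is dropped
    because its lagged advertising value is unknown (the '*' in the data)."""
    n = len(sales)
    t = np.arange(2, n + 1)                    # periods 2..n survive the lag
    y = np.asarray(sales[1:], float)
    cols = [np.ones(n - 1), t,
            np.asarray(adv[1:], float),        # current advertising
            np.asarray(adv[:-1], float)]       # advertising lagged one quarter
    for q in range(1, m):                      # Qtr_1..Qtr_{m-1} dummies
        cols.append((t % m == q).astype(float))
    X = np.column_stack(cols)
    coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coefs   # [constant, quarter, adv, adv_lag1, dummies...]
```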
Example: Motel Chain
Quarter  Sales  Adv  Adv-Lag1  Qtr_1  Qtr_2  Qtr_3
   1     1200   30      *        1      0      0
   2      880   20     30        0      1      0
   3     1800   15     20        0      0      1
   4     1050   40     15        0      0      0
   5     1700   10     40        1      0      0
   6      350   50     10        0      1      0
   7     2500    5     50        0      0      1
   8      760   40      5        0      0      0
   9     2300   20     40        1      0      0
  10     1000   10     20        0      1      0
  11     1570   60     10        0      0      1
  12     2430    5     60        0      0      0
  13     1320   35      5        1      0      0
  14     1400   15     35        0      1      0
  15     1890   70     15        0      0      1
  16     3200   25     70        0      0      0
  17     2200   30     25        1      0      0
  18     1440   60     30        0      1      0
  19     4000   80     60        0      0      1
  20     4100   50     80        0      0      0
Summary measures
Multiple R     0.9856
R-Square       0.9714
Adj R-Square   0.9571
StErr of Est   213.2

ANOVA Table
             df   SS            MS           F       p-value
Regression    6   18515047.32   3085841.22   67.88   0.0000
Residual     12   545531.63     45460.97

Regression coefficients
                   Coefficient   Std Err   t-value   p-value   Lower limit   Upper limit
Constant             98.36       174.96     0.5622   0.5843      -282.9         479.6
Quarter              41.58        13.56     3.0672   0.0098        12.0          71.1
Advertising           4.53         3.25     1.3959   0.1880        -2.5          11.6
Advertising_Lag1     34.03         3.13    10.8759   0.0000        27.2          40.9
Qtr_1               280.62       157.66     1.7799   0.1004       -62.9         624.1
Qtr_2              -491.59       145.37    -3.3817   0.0055      -808.3        -174.9
Qtr_3               532.60       143.04     3.7235   0.0029       221.0         844.2
Qtr  Sales  Quarter  Adv  Adv-Lag1  Qtr_1  Qtr_2  Qtr_3  Forecast
 1   1200      1     30      *        1      0      0
 2    880      2     20     30        0      1      0       802
 3   1800      3     15     20        0      0      1      1504
 4   1050      4     40     15        0      0      0       957
 1   1700      5     10     40        1      0      0      1994
 2    350      6     50     10        0      1      0       423
 3   2500      7      5     50        0      0      1      2646
 4    760      8     40      5        0      0      0       783
 1   2300      9     20     40        1      0      0      2205
 2   1000     10     10     20        0      1      0       749
 3   1570     11     60     10        0      0      1      1701
 4   2430     12      5     60        0      0      0      2662
 1   1320     13     35      5        1      0      0      1248
 2   1400     14     15     35        0      1      0      1448
 3   1890     15     70     15        0      0      1      2083
 4   3200     16     25     70        0      0      0      3259
 1   2200     17     30     25        1      0      0      2073
 2   1440     18     60     30        0      1      0      1648
 3   4000     19     80     60        0      0      1      3826
 4   4100     20     50     80        0      0      0      3879
 1            21     50     50        1      0      0      3181
 2            22     50     50        0      1      0      2450
 3            23     50     50        0      0      1      3516
 4            24     50     50        0      0      0      3025

Coefficients: Constant 98.36, Quarter 41.58, Adv 4.53, Adv-Lag1 34.03, Qtr_1 280.62, Qtr_2 -491.59, Qtr_3 532.60

Forecast formula: =$M$2+SUMPRODUCT($N$2:$S$2,D5:I5)
Simple Exponential Smoothing Forecast
[Chart: Observation vs. Forecast, quarters 1-24]
Holt's Forecast
[Chart: Observation vs. Forecast, quarters 1-24]
Winters' Forecast
[Chart: Observation vs. Forecast, quarters 1-24]
Multiple Regression Forecast (with Lagged Advertising)
[Chart: Observation vs. Forecast, quarters 1-24]
Here are measures of model fit for the non-regression models:

           MAE     RMSE    MAPE
Simple     769.6   939.9   50.5%
Holt's     766.8   866.6   36.7%
Winters'   708.0   845.6   47.3%

The regression model has a standard error of only 213, which is much better than any of the other models.
Forecasting with Minitab
Pay Rate/Job Grade
[Chart: Pay Rate vs. Job Grade (0-9), Men vs. Women]
Distribution of Pay Rate / Job Grade / Sex
[Chart: Pay Rate ($0-$1,800) by Job Grade (1-9), Women vs. Men]
Regression Statistics
Multiple R          0.9072
R Square            0.8231
Adjusted R Square   0.8210
Standard Error      97.0601
Observations        256

ANOVA
             df    SS              MS             F          Significance F
Regression     3   11045358.5367   3681786.1789   390.8200   0.0000
Residual     252   2374008.5129    9420.6687
Total        255   13419367.0496

           Coefficients   Standard Error   t Stat    P-value
Intercept   526.7312       14.1422         37.2454   0.0000
GRADE        75.0323        3.3261         22.5586   0.0000
SEX          59.6220       15.9887          3.7290   0.0002
TinGRADE     30.8215        4.5689          6.7460   0.0000
Artsy: Analysis of Pay Rates by Grade

Grade -->               1        2        3        4        5        6        7        8      Total
Female   Mean        664.71   725.50   830.02   833.61   886.92  1006.15  1093.36  1273.86   832.76
         Std Dev      81.44    56.27    57.12    87.64    67.58    99.77   122.87   128.89   158.53
         Count          22       51       22       18       24       15       17        2      171
Male     Mean        804.00      —     835.33   824.05   918.52  1130.76  1212.91  1375.92  1128.17
         Std Dev        —        —      36.77    87.85   161.55   114.01   133.45   103.52   223.40
         Count           1        0        9        5       11       10       33       16       85
Total    Mean        670.77   725.50   831.56   831.53   896.85  1055.99  1172.26  1364.58   930.85
         Std Dev      84.71    56.27    51.48    85.76   104.82   120.68   140.83   107.34   229.40
         Count          23       51       31       23       35       25       50       18      256

Difference (M − F)   139.29      —       5.31    -9.55    31.60   124.61   119.54   102.06   295.40
s Pooled              81.44      —      52.30    87.68   105.32   105.57   130.02   105.28   182.55
Confidence  Lower    -28.65      —     -36.23   -99.21   -45.19    37.36    42.02   -63.83   247.78
Limits      Upper    307.23      —      46.86    80.10   108.40   211.86   197.07   267.95   343.02
t Stat                1.673      —      0.257   -0.216    0.824    2.891    3.080    1.293   12.193
p Value               0.055      —      0.400    0.584    0.208    0.004    0.002    0.107    0.000
Tabled t              2.017      —      2.008    2.023    2.002    2.024    1.997    2.101    1.966

Tabled t for the total difference: 1.984

Weighted Average Female Shortfall: $65.61
Summary
Forecasting Methods
• Exponential Smoothing
– Simple
– Trend (Holt’s Method)
– Seasonality (Winters’ Method)
• Regression
– Trend
– Seasonality
– Lagged Variables
For Session 9 and 10
• Cars (B)
• Steam