Practical Guidelines for Forecasting


Session 7: Evaluating forecasts
Demand Forecasting and
Planning in Crisis
30-31 July, Shanghai
Joseph Ogrodowczyk, Ph.D.
Session agenda

- Background
- Measures of accuracy
- Cost of forecast error
- Activity: Produce forecast error calculations for the forecasts made on Day 1
Background

- How do we measure the accuracy of our forecasts? How do we know which forecasts were good and which need improvement?
- Error can be calculated across products within a given time period, or across time periods for a given product. The following examples are for one product over multiple time periods.
- Two topics of forecast evaluation:
  1. How accurate was the forecast?
  2. What was the cost of being wrong?
Definitions for evaluation:

- Forecast period: The time increment for which the forecast is produced (month, week, quarter)
- Forecast bucket: The time increment being forecasted (period, month, quarter)
- Forecast horizon: The time increment including all forecast buckets being forecasted (12 months, 8 quarters)
- Forecast lag: The time between when the forecast is produced and the bucket that is forecasted
- Forecast snapshot: The specific combination of period, horizon, bucket, and lag associated with a forecast
Sources of error:

- Data: Missing or omitted data, mislabeled data
- Assumptions: Seasonality is not constant, trend changes are unanticipated, experts have insufficient information
- Model: Wrong choice of model type (judgment, statistical); correct model type but misspecified model (missing variables or too many variables); failure to account for outliers

Measures of accuracy:

- Point error
- Average error
- Trend of error
Measures of accuracy

Point error

- Error: The difference between the forecasted quantity and the actual demand quantity
- Squared error: The square of the error
- Percent error: The error relative to the actual demand quantity
  - A denominator of actuals answers the question: How well did we predict actual demand?
  - A denominator of forecast answers the question: How much were we wrong relative to what we said we would do?
- Absolute error: The absolute value of the error
- Absolute percent error: The absolute value of the error relative to the actual demand quantity
Point error example: data from Session 4, naïve one-step model; one product over multiple time periods.

Month   Forecast   Actual    Error   Squared error   Percent error   Absolute error   Absolute percent error
Jan     88.9       88.2       0.66   0.44             0.74%          0.66             0.74%
Feb     88.2       88.5      -0.31   0.10            -0.35%          0.31             0.35%
Mar     88.5       90.1      -1.58   2.48            -1.78%          1.58             1.78%
Apr     90.1       91.6      -1.47   2.17            -1.64%          1.47             1.64%
May     91.6       90.4       1.22   1.49             1.33%          1.22             1.33%
Jun     90.4       93.0      -2.59   6.73            -2.87%          2.59             2.87%
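As a minimal sketch, the point-error columns above can be reproduced in Python. The inputs below are the rounded forecast and actual values from the table, so the second decimal can differ slightly from the slide, which was computed from unrounded data:

```python
# Point-error measures for one product over multiple time periods.
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
forecast = [88.9, 88.2, 88.5, 90.1, 91.6, 90.4]
actual = [88.2, 88.5, 90.1, 91.6, 90.4, 93.0]

for m, f, a in zip(months, forecast, actual):
    error = f - a               # forecasted quantity minus actual demand
    squared = error ** 2        # squared error
    pct = error / a * 100       # percent error (denominator of actuals)
    abs_err = abs(error)        # absolute error
    ape = abs_err / a * 100     # absolute percent error
    print(f"{m}: error={error:+.2f}  squared={squared:.2f}  "
          f"pct={pct:+.2f}%  abs={abs_err:.2f}  APE={ape:.2f}%")
```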
Average error

- Mean square error (MSE): Average of the squared errors
- Root mean square error (RMSE): Square root of the MSE
- Mean percent error (MPE): Average of the percent errors
- Mean absolute error (MAE): Average of the absolute errors
- Mean absolute percent error (MAPE): Average of the absolute percent errors (APE)
- Weighted mean absolute percent error (WMAPE): Weighted average of the APE
Average error example: one product over multiple time periods.

Month   Error   Squared error   Percent error   Absolute error   Absolute percent error
Jan      0.66   0.44             0.74%          0.66             0.74%
Feb     -0.31   0.10            -0.35%          0.31             0.35%
Mar     -1.58   2.48            -1.78%          1.58             1.78%
Apr     -1.47   2.17            -1.64%          1.47             1.64%
May      1.22   1.49             1.33%          1.22             1.33%
Jun     -2.59   6.73            -2.87%          2.59             2.87%
Sum             13.41           -4.56%          7.84             8.71%
Mean             2.235          -0.76%          1.306            1.45%
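A short sketch of the average-error measures, computed here from the rounded point errors in the table (the slide's sums were taken over unrounded values, so the third decimal can differ):

```python
import math

# Average-error measures computed from the monthly errors above.
errors = [0.66, -0.31, -1.58, -1.47, 1.22, -2.59]        # forecast - actual
pct_errors = [0.74, -0.35, -1.78, -1.64, 1.33, -2.87]    # percent errors
n = len(errors)

mse = sum(e ** 2 for e in errors) / n        # mean square error
rmse = math.sqrt(mse)                        # root mean square error
mpe = sum(pct_errors) / n                    # mean percent error
mae = sum(abs(e) for e in errors) / n        # mean absolute error
mape = sum(abs(p) for p in pct_errors) / n   # mean absolute percent error

print(f"MSE={mse:.3f}  RMSE={rmse:.3f}  MPE={mpe:.2f}%  "
      f"MAE={mae:.3f}  MAPE={mape:.2f}%")
```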
Weighted mean absolute percent error (WMAPE)

- Introduced as a method for overcoming inconsistencies in the MAPE
  - All time periods, regardless of the quantity of sales, have equal ability to affect MAPE
  - A 12% APE for a period in which 10 units were sold has no more importance than a 12% APE for a period in which 100K units were sold
- Weight each APE calculation by the respective quantity:

  WMAPE = Σ( |F - A| / A * 100 * A ) / Σ A
Worked WMAPE example:

- In Session 4, we used a naïve one-step model and forecasted January 2008 using December 2007 data
- Forecast was 88.9 units and actual demand was 88.2
- Absolute percent error (APE) = |F - A| / A = |88.9 - 88.2| / 88.2 = .74%
- Multiply .74% by 88.2 (actual demand) = .66
- .66 is the weighted error value for the January forecast
WMAPE example:

Month   Forecast   Actual   Percent error   Percent error * Actual
Jan     88.9       88.2     0.75%           0.66
Feb     88.2       88.5     0.35%           0.31
Mar     88.5       90.1     1.75%           1.58
Apr     90.1       91.6     1.61%           1.47
May     91.6       90.4     1.35%           1.22
Jun     90.4       93.0     2.79%           2.59
Sum                541.85                   7.84

WMAPE = 7.84 / 541.85 = 1.45%
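A sketch of the same WMAPE calculation. Because each APE is weighted by the actual demand that is also its denominator, the weights cancel and WMAPE reduces to the sum of absolute errors divided by total actual demand; with the rounded inputs below it prints 1.46% rather than the table's 1.45%:

```python
# WMAPE = sum(|F - A| / A * 100 * A) / sum(A), which simplifies to
# 100 * sum(|F - A|) / sum(A).
forecast = [88.9, 88.2, 88.5, 90.1, 91.6, 90.4]
actual = [88.2, 88.5, 90.1, 91.6, 90.4, 93.0]

wmape = 100 * sum(abs(f - a) for f, a in zip(forecast, actual)) / sum(actual)
print(f"WMAPE = {wmape:.2f}%")
```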
Trend of error

- Point error calculations and average error calculations are static: they are calculated for a set time interval
- Additional information can be obtained by tracking these calculations over time
  - How does the error change over time?
  - Also called the forecast bias
- Statistical analysis can be performed on the trending data (mean, standard deviation, coefficient of variation)
Two suggested methods:

- Track a statistic through time (3-month MAPE)
- Compare time intervals (Q1 against Q2)

Example: the 2008 naïve one-step forecast.

Month   Forecast   Actual   Absolute percent error   3-month MAPE
Jan     88.9       88.2      0.75%
Feb     88.2       88.5      0.35%
Mar     88.5       90.1      1.75%                   0.95%
Apr     90.1       91.6      1.61%                   1.24%
May     91.6       90.4      1.35%                   1.57%
Jun     90.4       93.0      2.79%                   1.92%
Jul     93.0       88.6      4.88%                   3.01%
Aug     88.6       89.8      1.30%                   2.99%
Sep     89.8       85.9      4.53%                   3.57%
Oct     85.9       81.4      5.52%                   3.78%
Nov     81.4       75.6      7.74%                   5.93%
Dec     75.6       67.4     12.17%                   8.48%

Quarter   Mean MAPE 3   Std dev MAPE 3
Q1        0.95%         0.007
Q2        1.92%         0.008
Q3        3.57%         0.020
Q4        8.48%         0.034
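A sketch of the rolling statistic, assuming each 3-month MAPE is the average of the current and two prior monthly APEs (which reproduces the column above):

```python
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
apes = [0.75, 0.35, 1.75, 1.61, 1.35, 2.79,
        4.88, 1.30, 4.53, 5.52, 7.74, 12.17]   # monthly APE, in percent

# Three-month rolling MAPE: mean of the current and two prior APEs.
for i in range(2, len(apes)):
    mape3 = sum(apes[i - 2:i + 1]) / 3
    print(f"{months[i]}: 3-month MAPE = {mape3:.2f}%")
```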
Cost of forecast error

- Accuracy measures do not contain the costs associated with forecast error
- Two methods for incorporating costs:
  - Calculate costs based on percent error, differentiating between over- and under-forecasting
  - Calculate costs based on a loss function dependent on safety stock levels, lost sales, and service levels
Incorporating costs: error differentiation

- Costs are calculated according to the mathematical sign of the percent error (+ or -)
- Costs of under-forecasting can be reflected in loss of sales, loss of sales of related goods, increased production costs, increased shipment costs, etc.
  - Shipment and production costs are associated with producing and expediting additional units to meet demand
- Costs of over-forecasting can be reflected in excess inventory, increased obsolescence, increased firesale items, etc.
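A minimal sketch of error differentiation, applying one cost rate to under-forecast units and another to over-forecast units. The two rates here are hypothetical placeholders, since the real figures are company specific:

```python
COST_UNDER = 12.0   # hypothetical cost per unit of unmet demand (lost sales, expediting)
COST_OVER = 4.0     # hypothetical cost per unit of excess stock (inventory, obsolescence)

forecast = [88.9, 88.2, 88.5, 90.1, 91.6, 90.4]
actual = [88.2, 88.5, 90.1, 91.6, 90.4, 93.0]

total = 0.0
for f, a in zip(forecast, actual):
    if f < a:   # under-forecast: demand exceeded the forecast
        total += COST_UNDER * (a - f)
    else:       # over-forecast: the forecast exceeded demand
        total += COST_OVER * (f - a)
print(f"Total cost of forecast error: {total:.2f}")
```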
Incorporating costs: loss function

- A cost of forecast error (CFE) metric can be used to quantify the loss associated with both under- and over-forecasting
- The loss function is based on the mean absolute error (MAE)
- The first part of the CFE calculates the unit requirements necessary to maintain a specified service level
- This is balanced against the volume of lost sales and the associated cost of stock-outs
- Plotting the cost of error against different service levels shows which service level corresponds to the lowest cost of forecast error
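A simplified illustration of that trade-off, not the exact CFE formulation in Catt (2007): it assumes normally distributed forecast errors (so the error standard deviation is roughly 1.25 × MAE), hypothetical holding and stock-out cost rates, and the standard normal loss function for expected units short:

```python
from statistics import NormalDist

MAE = 1.31           # mean absolute error from the example above
sigma = 1.25 * MAE   # approximate error std dev under normality
HOLD = 2.0           # hypothetical holding cost per unit per period
STOCKOUT = 15.0      # hypothetical cost per unit of lost sales

nd = NormalDist()
for service_level in (0.80, 0.90, 0.95, 0.99):
    z = nd.inv_cdf(service_level)   # safety factor for the service level
    safety_stock = z * sigma        # units held to buffer forecast error
    # Expected units short per period (standard normal loss function).
    expected_short = sigma * (nd.pdf(z) - z * (1 - nd.cdf(z)))
    cost = HOLD * safety_stock + STOCKOUT * expected_short
    print(f"service level {service_level:.0%}: cost = {cost:.2f}")
```

Scanning (or plotting) this curve over a finer grid of service levels points to the service level with the lowest cost of forecast error.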
Final notes

- Cost of error helps to guide the forecast improvement process
  - These costs can be company specific and can be explored by understanding the implications of shortages and surpluses of products
  - The specific mathematical calculations are beyond the scope of this workshop
- Applying costs to forecast errors will always require assumptions within the models
  - Recommendation: write the assumptions down explicitly
  - Changing assumptions will lead to changes in the costs of the errors and can produce a range of estimated costs
References

Jain, Chaman L., and Jack Malehorn. 2005. Practical Guide to Business Forecasting (2nd ed.). Flushing, New York: Graceway Publishing Inc.

Catt, Peter Maurice. 2007. Assessing the cost of forecast error: A practical example. Foresight, Summer: 5-10.