Transcript Slide 1

How To
Measure Anything:
Finding the Value of
‘Intangibles’ in Business
Copyright HDR 2007
[email protected]
1
How to Measure Anything
• It started 12 years ago…
• I have conducted over 60 major risk/return
analysis projects so far that included a
variety of “impossible” measurements
• I found such a high need for
measuring difficult things that I
decided I had to write a book
• The book was released in July 2007
with the publisher John Wiley & Sons
• This is a “sneak preview” of many of
the methods in the book
How To Measure Anything
• “I love this book. Douglas Hubbard helps us create a path to know the
answer to almost any question, in business, in science or in life.” Peter
Tippett, Ph.D., M.D. Chief Technology Officer at CyberTrust and inventor
of the first antivirus software
• “Doug Hubbard has provided an easy-to-read, demystifying explanation of
how managers can inform themselves to make less risky, more profitable
business decisions.” Peter Schay, EVP and COO of The Advisory Council
• “As a reader you soon realize that actually everything can be measured
while learning how to measure only what matters. This book cuts through
conventional clichés and business rhetoric and it offers practical steps to
using measurements as a tool for better decision making.” Ray Gilbert,
EVP Lucent
• “This book is remarkable in its range of measurement applications and its
clarity of style. A must read for every professional who has ever exclaimed
‘Sure, that concept is important but can we measure it?’” Dr. Jack Stenner,
CEO and co-founder of MetaMetrics, Inc.
CFO Measurement Problem
1. The Risk Paradox: The largest and riskiest decisions
often get the least quantitative risk analysis
2. The Measurement Inversion: According to an
economic valuation of the benefits of a measurement,
most measurement priorities are the opposite of the
optimal solution.
3. Better alternatives than:
– Traditional business cases that don’t quantify
uncertainty, risks, and intangibles
– “Scores” that quantify nothing
Style vs. Substance
• If you are adding and multiplying subjective “scores” on a scale of
1 to 5 for things like risk, alignment, etc., chances are your method
doesn’t improve on your intuition
• Also, don’t be fooled by the terms “structured” or “formal” (astrology
is both structured and formal; it just doesn’t work)
• The Following Charts Mean Nothing:
[Radar chart with axes labeled Innovation, Strategy, Alignment,
Efficiency, Process, Customer Value, Effectiveness, and Relationship,
scored 0–8]
Assessing Assessment Methods
• “Proven” should mean more than some previous
users feel good about it (the “testimonial proof”)
• Only empirical evidence that forecasts and
decisions are actually improved can separate
real benefits from a “placebo effect”
• Effective methods for evaluating IT investments
should have a lot in common with well-known
methods in other fields (actuarial science,
portfolio optimization, etc.)
My Three Measurement “Heroes”
• Eratosthenes – measured the Earth’s
circumference to within 1% accuracy
• Enrico Fermi – the physicist who used “Fermi
Questions” to break down any uncertain quantity
(and was the first to estimate the yield of the first
atom bomb)
• Emily Rosa – the 11-year-old who was published in
JAMA (youngest author ever) for her experiment
that debunked “therapeutic touch”
Three Illusions of Intangibles
(The “howtomeasureanything.com” approach)
• The perceived impossibility of
measurement is an illusion caused by not
understanding:
– the Concept of measurement
– the Object of measurement
– the Methods of measurement
• See my “Everything is Measurable”
article in CIO Magazine (go to the “articles”
link on www.hubbardresearch.com)
An Approach
• Model what you know now
• Compute the value of additional information
• Where economically justified, conduct observations that reduce uncertainty
• Update the model and optimize the decision
Uncertainty, Risk & Measurement
• Measuring uncertainty, risk, and the value of information are closely
related concepts, important measurements themselves, and
precursors to most other measurements
• The “Measurement Theory” definition of measurement: “A
measurement is an observation that results in information
(reduction of uncertainty) about a quantity.”
• We model uncertainty statistically – with Monte Carlo simulations
A Few of My Examples
• Risk of IT
• The value of better information
• The value of better security
• Forecasting the demand for space tourism
• Forecasting fuel for Marines in the battlefield
• Measuring the effectiveness of combat training to reduce
roadside bomb/IED casualties
• The risk of obsolescence
• The value of a human life
• The value of saving an endangered species
• The value of public health
• The value of IQ points lost by children exposed to methylmercury
Calibrated Estimates
• Decades of studies show that most managers are
statistically “overconfident” when assessing their
own uncertainty
• Studies also show that measuring your own
uncertainty about a quantity is a general skill that
can be taught with a measurable improvement
• Training can “calibrate” people so that of all the
times they say they are 90% confident, they will be
right 90% of the time
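A calibration check like this takes only a few lines of code. This is a minimal sketch; the prediction log below is hypothetical:

```python
# Sketch: checking whether an estimator is calibrated, assuming we have
# logged each true/false prediction with the confidence the estimator stated.
from collections import defaultdict

def calibration_table(predictions):
    """predictions: list of (stated_confidence, was_correct) pairs.
    Returns {stated_confidence: actual_hit_rate}. A calibrated estimator's
    hit rate matches the stated confidence (90% -> right 90% of the time)."""
    buckets = defaultdict(list)
    for confidence, correct in predictions:
        buckets[confidence].append(correct)
    return {c: sum(hits) / len(hits) for c, hits in buckets.items()}

# Hypothetical log of ten predictions, all stated at 90% confidence:
log = [(0.9, True)] * 8 + [(0.9, False)] * 2
print(calibration_table(log))  # {0.9: 0.8} -> overconfident at the 90% level
```

Run over a real prediction log, a table like this shows at which confidence levels an estimator is over- or under-confident.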
1997 Calibration Experiment
• 16 IT industry analysts and 16 CIOs; the analysts were calibrated
• In January 1997, they were asked to predict 20 IT industry events
• Example: “Steve Jobs will be CEO of Apple again, by Aug 8, 1997” – True or False?
[Chart: assessed chance of being correct (50%–100%) vs. percent correct,
against an “ideal” confidence line with statistical error bands, comparing
Giga analysts and Giga clients; point labels show the number of responses.
Source: Hubbard Decision Research]
The Value of Information
EVI = \sum_{i=1}^{k} p(r_i)\,\max\left\{\sum_{j=1}^{z} V_{1,j}\,p(\theta_j \mid r_i),\ \sum_{j=1}^{z} V_{2,j}\,p(\theta_j \mid r_i),\ \ldots,\ \sum_{j=1}^{z} V_{l,j}\,p(\theta_j \mid r_i)\right\} - EV^{*}

where r_i is the i-th of k possible measurement results, \theta_j is the j-th of z possible states, V_{l,j} is the value of the l-th action in state j, and EV^{*} is the expected value of the best action given only prior knowledge.
I use a macro in Excel for this formula. In the book, I provide a
simple table that estimates it with some simple multiplication
What it means:
1. Information reduces
uncertainty
2. Reduced uncertainty
improves decisions
3. Improved decisions have
observable consequences
with measurable value
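The EVI computation on this slide is short in code. This is a minimal sketch, assuming a hypothetical two-state, two-action decision; the payoffs, prior, and signal likelihoods are made-up illustration numbers:

```python
# Expected value of information (EVI): compare the best decision made with
# only prior knowledge (EV*) against the expected best decision after
# observing a measurement result r_i.
def evi(payoffs, prior, likelihood):
    """payoffs[l][j]: value of action l in state j
    prior[j]: p(state j); likelihood[i][j]: p(result i | state j)"""
    n_states = len(prior)
    # EV*: best expected value acting on prior knowledge alone
    ev_star = max(sum(row[j] * prior[j] for j in range(n_states))
                  for row in payoffs)
    total = 0.0
    for like in likelihood:
        p_r = sum(like[j] * prior[j] for j in range(n_states))  # p(r_i)
        post = [like[j] * prior[j] / p_r for j in range(n_states)]  # p(state|r_i)
        total += p_r * max(sum(row[j] * post[j] for j in range(n_states))
                           for row in payoffs)
    return total - ev_star

payoffs = [[100, -50],       # invest: payoff in good state / bad state
           [0, 0]]           # don't invest
prior = [0.5, 0.5]
likelihood = [[0.8, 0.3],    # measurement result "favorable"
              [0.2, 0.7]]    # measurement result "unfavorable"
print(evi(payoffs, prior, likelihood))  # 7.5: the most this measurement is worth
```

A perfectly uninformative measurement (identical likelihoods in every state) gives EVI = 0, which is a quick sanity check on any implementation.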
Next Step: Observations
• Now that we know what to measure, we
can think of observations that would
reduce uncertainty
• The value of the information limits what
methods we should use, but we have a
variety of methods available
• Take the “Nike Method”: Just Do It – don’t
let imagined difficulties get in the way of
starting observations
Some Useful Suggestions
• It has been done before
• You have more data than you think
• You need less data than you think
• It is more economical than you think
The “Math-less” Statistics Table
• Measurement is based on
observation and most
observations are just
samples
• Reducing your uncertainty
with random samples is not
made intuitive in most
statistics texts
• This table makes computing
a 90% confidence interval
easy
Measuring to the Threshold
• Measurements usually have value because there is some point – a
threshold – where the quantity makes a difference
• It’s often much harder to answer “How much is X?” than “Is X
enough?”
[Chart: chance the median is below the threshold (50% down to 0.1%)
vs. number sampled (2–20), with one curve per number of samples
below the threshold (0–10)]
Statistics Goes to War
• Several clever sampling methods exist that can
measure more with less data than you might
think
• Examples: estimating the population of fish in
the ocean, estimating the number of tanks
created by the Germans in WWII, extremely
small samples, etc.
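The WWII tank example has a famously simple estimator. A sketch, assuming sequentially serial-numbered items sampled at random; the serial numbers below are hypothetical:

```python
# The serial-number ("German tank") estimator: infer the production total
# from the largest serial number seen in a small random sample.
def estimate_total(serials):
    """Minimum-variance unbiased estimate: max + max/k - 1."""
    m, k = max(serials), len(serials)
    return m + m / k - 1

# Five hypothetical captured serial numbers:
print(estimate_total([61, 19, 56, 24, 102]))
```

Five observations are enough for a usable estimate – a concrete case of "you need less data than you think."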
Reducing Inconsistency
• The “Lens Model” is another method used to improve on expert intuition
• The chart shows the reduction in error from this method applied to intuitive estimates
• In every case, this method equaled or bettered the judgment of experts
[Chart: reduction in errors (0%–40%) by study. My studies: IT portfolio
priorities, battlefield fuel forecasts. Other studies: student ratings of
teaching effectiveness, cancer patient life expectancy, psychology course
grades, graduate student grades, changes in stock prices, IQ scores using
Rorschach tests, mental illness using personality tests, business failures
using financial ratios, life-insurance sales-rep performance.
Source: Hubbard Decision Research]
The Simplest Method
• Bayesian methods in statistics use new
information to update prior knowledge
• Bayesian methods can be even more
elaborate than other statistical methods
BUT…
• It turns out that calibrated people are
already mostly “instinctively Bayesian”
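The update rule itself is one line. A sketch with hypothetical numbers: a calibrated 40% prior that a project overruns, revised after a pilot result that is twice as likely under an overrun as under a non-overrun:

```python
# Bayes' rule: revise a prior probability in light of new evidence.
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of the hypothesis given the evidence."""
    numer = p_evidence_if_true * prior
    return numer / (numer + p_evidence_if_false * (1 - prior))

print(bayes_update(0.4, 0.6, 0.3))  # -> ~0.571
```

This is the computation that calibrated estimators approximate instinctively: strong evidence moves the estimate a lot, weak evidence only a little.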
Comparison of Methods
[Chart: estimator types plotted on two axes. Vertical: overconfident
(stated uncertainty is lower than rational) to under-confident (stated
uncertainty is higher than rational). Horizontal: ignores prior
knowledge / emphasizes new data (vacillating, indecisive) to ignores
new data / emphasizes prior knowledge (stubborn). Labels plotted:
gullible, cautious, non-Bayesian statistics, typical un-calibrated
estimator, calibrated skeptic, with the Bayesian calibrated estimator
at the center]
Risk/ROI w/ “Monte Carlo”
• A Monte Carlo simulation generates thousands of random scenarios
using the defined probabilities and ranges
• The result is a range of ROI, not a point ROI
[Diagram: 90% confidence intervals feeding the simulation –
administrative cost reduction 5%–15%, customer retention increase
10%–30%, total project cost $2 million–$6 million – producing an ROI
distribution spanning roughly -50% to 100%]
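A simulation along these lines fits in a page of code. This is a minimal sketch, not the model behind the slide: the three 90% confidence intervals shown are treated as normal distributions, and the benefit model (each point of combined improvement worth a hypothetical $20M) is made up for illustration:

```python
# Monte Carlo risk/ROI sketch: thousands of random scenarios drawn from
# calibrated ranges, yielding a distribution of ROI rather than a point.
import random

random.seed(7)  # reproducibility for this sketch

def normal_from_ci90(lo, hi):
    """Treat a 90% confidence interval as a normal distribution
    (a 90% CI spans about 3.29 standard deviations)."""
    mean, sd = (lo + hi) / 2, (hi - lo) / 3.29
    return lambda: random.gauss(mean, sd)

admin_savings = normal_from_ci90(0.05, 0.15)   # administrative cost reduction
retention_gain = normal_from_ci90(0.10, 0.30)  # customer retention increase
project_cost = normal_from_ci90(2e6, 6e6)      # total project cost, $

def one_scenario():
    # Hypothetical benefit model: each point of combined improvement ~ $20M
    benefit = (admin_savings() + retention_gain()) * 2e7
    cost = max(project_cost(), 5e5)  # floor the cost to keep the ratio stable
    return (benefit - cost) / cost   # ROI for this scenario

rois = sorted(one_scenario() for _ in range(10_000))
print(f"90% CI for ROI: {rois[500]:.0%} to {rois[9500]:.0%}")
```

Reading off the 5th and 95th percentiles of the sorted scenarios gives the range ROI the slide describes.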
Quantifying Risk Aversion
The simplest element of the Nobel Prize-winning method “Modern Portfolio
Theory” is documenting how much risk an investor accepts for a given return
[Chart: return vs. risk, showing an acceptable risk/return boundary
separating the investment region, with an investment plotted against it]
Approach Summary
[Flowchart: Define Decision Model → Calibrate Estimators → Populate
Model with Calibrated Estimates & Measurements → Conduct Value of
Information Analysis (VIA) → Measure according to VIA results and
update model → Analyze Remaining Risk → Optimize Decision]
Final Tips
• Learn how to think about uncertainty, risk and
information value in a quantitative way
• Assume it’s been measured before
• You have more data than you think and you
need less data than you think
• Methods that reduce your uncertainty are more
economical than many managers assume
• Don’t let “exception anxiety” cause you to avoid
any observations at all
• Just do it
Questions?
Doug Hubbard
Hubbard Decision Research
[email protected]
www.hubbardresearch.com
630 858 2788