Transcript Slide 1

2012 Australian Cost Conference Keynote
Cost Estimation: A Key Component of
Affordable Program Success
Dan Galorath
[email protected]
Copyright 2012 Galorath Incorporated
Key Points
• Cost is a key project performance parameter
• Cost estimation with a repeatable process is best practice
• Viable affordability decisions yield project achievements
Using Best Practices Is Itself a Best Practice
• A best practice is a "method or technique that has consistently shown results superior to those achieved with other means, and that is used as a benchmark"
• A "best" practice can evolve to become better as improvements are discovered
• Best practices span processes, tools and people
Delusions of Success: How Optimism Undermines Executives' Decisions (Source: Dan Lovallo & Daniel Kahneman, HBR)
• Problem: Humans seem hardwired to be optimists
  • Optimism stems from cognitive biases & organizational pressures
  • We exaggerate our talents & degree of control
  • We attribute negative consequences to external factors
• Anchoring (relying too heavily on one piece of information) magnifies optimism; the effect is most pronounced for new initiatives
• Best practice: Temper with an "outside view"
  • Supplement traditional forecasting with statistical parametrics
  • Don't remove optimism, but balance optimism & realism
While Optimism Needs Tempering, So Does Short-Sightedness (Source: Northrop)
Key Points
• Cost is a key project performance parameter
Cost Overruns Are Everywhere
• "GAO: Staggering cost overruns dwarf modest improvements in Defense acquisition"
  • R&D costs of weapons programs increased 42% over original estimates
  • Average delay of 22 months in delivering initial capabilities
  • Evolving technical requirements
  • Shortage of qualified government staff to manage
  • "Every dollar of cost growth on a DoD weapon system represents a lost opportunity to pay for another national priority"
• "The Collins Class submarine program: Murphy was an optimist"
  http://epress.anu.edu.au/apps/bookworm/view/Agenda,+Volume+19,+Number+1,+2012/9601/Ergas.htm
• "BG Group shares hit by $5.4B cost overruns on Australian Liquefied Natural Gas project"
Software Is a Key Risk Item in Weapons Systems
• Delays to the Joint Tactical Radio System, a set of software-defined radios, leave the advanced capabilities of the Navy's Mobile User Objective System (MUOS) satellite communication system drastically underused (GAO)
• GAO identified 42 programs at risk for cost & schedule, driven by:
  1. Military requirements changes
  2. Software development challenges
  3. Workforce issues
• National Institute of Standards and Technology (NIST):
  • Software defects cost nearly $60 billion annually
  • 80% of development costs involve identifying and correcting defects
Software, not hardware or technology readiness levels, was called out.
Example: Project Cost Alone Is Not the Cost of IT Failure (Source: HBR)
• Case Study: Levi Strauss
  • $5M ERP deployment contracted
  • Risks seemed small
  • Difficulty interfacing with a customer's systems
  • Had to shut down production
  • Unable to fill orders for 3 weeks
  • $192.5M charge against earnings on a $5M IT project failure
"IT projects touch so many aspects of an organization that they pose a new singular risk"
http://hbr.org/2011/09/why-your-it-project-may-be-riskier-than-you-think/ar/1
An ROI Analysis of a New System: Should We Fund This?
• Can we do better?
• Will stakeholders tolerate a loss for 3 years?
• What is the risk?
Software & IT Systems Are About Business Value
Total Ownership Cost vs. Value
• "Software economics should provide methods for analyzing the choices software projects must make." – Leon Levy
• Business economics should provide methods for analyzing choices in which projects to fund
Potentially 80% of Projects Don't Return Adequate Value
Most projects cost more than they return (Mercer Consulting):
• "When the true costs are added up, as many as 80% of technology projects actually cost more than they return. It is not done intentionally but the costs are always underestimated and the benefits are always overestimated." – Dosani, 2001
IT Has Similar Failures
• Cutter Consortium Software Project Survey:
  • 62% overran original schedule by more than 50%
  • 64% were more than 50% over budget
  • 70% had critical product quality defects after release
• Standish Group CHAOS Report:
  • 46% challenged
  • 19% failed
  • 35% successful
• "Fully one in six of the projects we studied was a black swan, with a cost overrun of 200% on average, and a schedule overrun of almost 70%"
~$875 billion spent on IT; ~$300 billion spent on IT projects; ~$57 billion wasted annually
Using Gates and Refining Estimates Is a Best Practice (Adapted from K. Aguanno)
• Gate 1 – Great Idea → Concept: describe idea & possible benefits
• Gate 2 – Opportunity Analysis → Marketing Study: determine customer acceptance; interview focus groups, etc.
• Gate 3 – Preliminary Business Case → Feasibility Analysis: design solution; estimate cost / schedule; analyze risk; determine feasibility / ROI
• Gate 4 – Committed Business Case → Pilot or Proof of Concept: validate & commit to design & approach; build solution; revise estimates & schedule; reduce risk; baseline the plan
• Achieve Business Case → Full Execution or Deployment: achieve business case; deploy; capture lessons learned, including estimating lessons
Causes of Project Failure (Source: POST Report on UK Government IT Projects)
1. Lack of a clear link between the project and the organisation's key strategic priorities, including agreed measures of success
2. Lack of clear senior management and ministerial ownership and leadership
3. Lack of effective engagement with stakeholders
4. Lack of skills and proven approach to project management and risk management
5. Lack of understanding of and contact with the supply industry at senior levels within the organisation
6. Evaluation of proposals driven by initial price rather than long-term value for money (especially securing the delivery of business benefits)
7. Too little attention to breaking development and implementation into manageable steps
8. Inadequate resources and skill to deliver the total delivery portfolio
US Better Buying Power Initiatives
• June 28, 2010: Mandate
• September 14, 2010: Guidance
• November 3, 2010: Implementation
• Five specific areas of concern:
  • Target affordability and control cost growth
  • Reduce non-productive processes and bureaucracy
  • Incentivize productivity and innovation in industry
  • Promote real competition
  • Improve tradecraft in services acquisition
Affordability Initiatives With "Should Cost" and "Will Cost"
Will Cost Performance − Cost Initiatives (applied practices & improvements) = Should Cost Performance
Characteristics of a Successful
Should Cost Review (Source: AT Kearney)
Key Points
• Cost estimation with a repeatable process is best practice
An Estimate Defined
• An estimate is the most knowledgeable statement you can make at a particular point in time regarding:
  • Effort / cost
  • Schedule
  • Staffing
  • Risk
  • Reliability
• Estimates become more precise as the project progresses
• A WELL-FORMED ESTIMATE IS A DISTRIBUTION (see the sketch below)
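The sketch below illustrates the point that a well-formed estimate is a distribution rather than a single number: it is a minimal Monte Carlo example, assuming triangular low / most likely / high effort ranges per task. The task names and values are illustrative assumptions, not data from the presentation.

```python
# Minimal sketch: an estimate expressed as a distribution rather than a point.
# Each task is assumed to have a (low, most likely, high) effort range in
# person-months; names and values below are purely illustrative.
import random

tasks = {
    "requirements": (2, 3, 6),
    "design":       (4, 6, 12),
    "build":        (10, 14, 30),
    "test":         (5, 8, 20),
}

def simulate_total_effort(n_trials=10_000):
    """Sample each task from a triangular distribution and sum the draws."""
    totals = []
    for _ in range(n_trials):
        totals.append(sum(random.triangular(lo, hi, mode)
                          for lo, mode, hi in tasks.values()))
    return sorted(totals)

totals = simulate_total_effort()
p50 = totals[len(totals) // 2]
p80 = totals[int(len(totals) * 0.8)]
print(f"P50 effort: {p50:.1f} person-months, P80 effort: {p80:.1f} person-months")
```

Reporting the P50 and P80 (or the full curve) rather than one number keeps the uncertainty visible to decision makers.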
Viable Estimation Is Critical
• Estimating is critical for all kinds of systems
  • Yet many treat it as a second-rate process
• Everyone estimates… most just get it wrong and don't have a process
• Having a repeatable estimation process is critical both to estimating AND to successful projects
• Estimation and measurement go hand in hand
Estimation Methods (1 of 2)

Guessing
  Description: Off-the-cuff estimates
  Advantages: Quick; can obtain any answer desired
  Limitations: No basis or substantiation; no process; usually wrong

Analogy
  Description: Compare project with past similar projects
  Advantages: Estimates are based on actual experience
  Limitations: Truly similar projects must exist

Expert Judgment
  Description: Consult with one or more experts
  Advantages: Little or no historical data is needed; good for new or unique projects
  Limitations: Experts tend to be biased; knowledge level is sometimes questionable; may not be consistent

Top-Down Estimation
  Description: A hierarchical decomposition of the system into progressively smaller components is used to estimate the size of a software component
  Advantages: Provides an estimate linked to requirements and allows common libraries to size lower-level components
  Limitations: Needs valid requirements; difficult to track architecture; engineering bias may lead to underestimation
Estimation Methods (2 of 2)

Bottoms-Up Estimation
  Description: Divide the problem into the lowest-level items; estimate each item… sum the parts
  Advantages: Complete WBS can be verified
  Limitations: The whole is generally bigger than the sum of the parts; costs occur in items that are not considered in the WBS

Design to Cost
  Description: Uses expert judgment to determine how much functionality can be provided for a given budget
  Advantages: Easy to get under the stakeholder number
  Limitations: Little or no engineering basis

Simple CERs
  Description: Equation with one or more unknowns that provides a cost / schedule estimate (see the sketch after this table)
  Advantages: Some basis in data
  Limitations: Simple relationships may not tell the whole story; historical data may not tell the whole story

Comprehensive Parametric Models
  Description: Perform an overall estimate using design parameters and mathematical algorithms
  Advantages: Models are usually fast and easy to use, and useful early in a program; they are also objective and repeatable
  Limitations: Models can be inaccurate if not properly calibrated and validated; historical data may not be relevant to new programs; optimism in parameters may lead to underestimation
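As a concrete illustration of the "Simple CERs" row above, here is a minimal sketch of a power-law cost estimating relationship of the form effort = a × size^b. The coefficients are illustrative placeholders, not calibrated values from SEER or any other model.

```python
# Minimal sketch of a simple cost estimating relationship (CER):
# effort = a * size^b.  The coefficients are illustrative assumptions only;
# a real CER would be fit to an organization's historical data.
def cer_effort(ksloc: float, a: float = 2.9, b: float = 1.1) -> float:
    """Estimated effort in person-months for a given size in KSLOC."""
    return a * ksloc ** b

for size in (10, 50, 100):
    print(f"{size:4d} KSLOC -> {cer_effort(size):7.1f} person-months")
```

The exponent b > 1 captures the diseconomy of scale usually seen in software data: doubling size more than doubles effort.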
"Best Value" Data Needs (Adapted from Boeing Value Front)
[Diagram: customer need priorities (decision analysis) and customer desirability feed attribute value models & methods; a total ownership cost (TOC) cost-risk uncertainty simulation across development, production and O&S feeds the attribute value distribution]
Functional Focus Example: Ladies' Purse
• Function: hold stuff
• Cost: $400 at Nordstrom
• What else will perform the function?
  • Paper bag – cost = $0.05
  • Go to a plastic bag for more durability – cost = $0.10
  • Add color – cost = $0.15
  • Add strap – cost = $0.25
• But this misses one component of customer satisfaction
"Far Out" Higher TRL Level Estimation
Goal: better cost estimates for highly advanced space missions (15–20 years in the future)

Example: a proposed hyperspectral imaging satellite with predicted fielding in 2016. This capability would be of interest to:
• Military space asset planners
• Government agencies
• Commercial satellite producers
• Advanced concept designers

Critical items are at less than TRL 4. It is like asking Edison in 1876, "How much longer for the light bulb?" – "Hard to say." In 1879, once he had found a workable carbon filament, "How much will a production version of the light bulb cost to develop and produce, Tom?" becomes a TRL 4 question.

[Figure: Technology Readiness Level ladder]
TRL 9: Actual system "flight proven" through successful mission operations
TRL 8: Actual system completed and "flight qualified" through test and demonstration
TRL 7: System prototype demonstration in a space environment
TRL 6: System/subsystem model or prototype demonstration in a relevant environment
TRL 5: Component and/or breadboard validation in relevant environment
TRL 4: Component and/or breadboard validation in laboratory environment
TRL 3: Analytical and experimental critical function and/or characteristic proof-of-concept
TRL 2: Technology concept and/or application formulated
TRL 1: Basic principles observed and reported

Early changes have a much greater impact on the final system; the limits of potential impacts narrow as the desired capability matures toward higher TRLs.
Black Swans (Unknowns)
• The match is unlikely to ever be perfect because some projects are affected by "unknown unknowns," also called Black Swans: events that are essentially unpredictable (e.g., a severe worldwide credit squeeze)
[Figure: overrun % plotted against score]
Dealing With the "Problem of Assumptions"
• Assumptions are essential, but…
• Incorrect assumptions can drive an estimate to uselessness
• Use an assumption verification process:
  1. Identify assumptions
  2. Rank-order assumptions based on estimate impact
  3. Identify high-ranking assumptions that are risky
  4. Clarify high-ranking, high-risk assumptions & quantify what happens if those assumptions change
  5. Adjust the range of SEER inputs to describe the uncertainty in assumptions
Estimates must have assumptions defined, but bad assumptions should not be justification for bad estimates.
System Description (Parametrics Can Estimate More, Earlier) – Adapted from CEBOK
• "If you can't tell me what it is, I can't tell you what it costs." – Mike Jeffers
• "If you can tell me the range of what it might be, I can tell you the range of cost, schedule & probability." – Dan Galorath
Uncertainty in the Cost Depends on the Uncertainty of the Project Itself
• SEER includes uncertainty in its estimates (within 10%)
• Even though the entire project may be highly uncertain, tasks to the next gate should be estimable within 10%
Statistician Drowns in River
with Average Depth of 3 Feet!
Range vs. Point Estimates (Source: US Army)
• A point estimate is the most likely value within a range estimate, with higher potential for cost increase
• A range estimate conveys the degree of risk and uncertainty
• Estimating accuracy improves with technical and program maturity (see the sketch below):
  • ROM: −30% to +75%
  • Analogy: −15% to +30%
  • Parametric: −10% to +20%
  • Engineering: −5% to +15%
  • Actual: −3% to +10%
[Figure: "trumpet" of risk & uncertainty narrowing toward the target cost across the acquisition phases – Materiel Solution Analysis (A), Technology Development (B), Engineering and Manufacturing Development (C), Production & Deployment, Operations & Support / Sustainment]
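The following minimal sketch turns a point estimate into a range using the accuracy bands quoted on this slide. The $10M example value is an illustrative assumption.

```python
# Minimal sketch: derive a cost range from a point estimate using the slide's
# accuracy bands (ROM -30%/+75%, analogy -15%/+30%, parametric -10%/+20%,
# engineering -5%/+15%, actuals -3%/+10%).
ACCURACY_BANDS = {
    "ROM":         (-0.30, 0.75),
    "analogy":     (-0.15, 0.30),
    "parametric":  (-0.10, 0.20),
    "engineering": (-0.05, 0.15),
    "actual":      (-0.03, 0.10),
}

def range_estimate(point_cost: float, method: str) -> tuple[float, float]:
    """Return the (low, high) cost range implied by the method's accuracy band."""
    low_pct, high_pct = ACCURACY_BANDS[method]
    return point_cost * (1 + low_pct), point_cost * (1 + high_pct)

low, high = range_estimate(10_000_000, "parametric")
print(f"Point $10.0M (parametric) -> range ${low/1e6:.1f}M to ${high/1e6:.1f}M")
```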
Firm Fixed Price? Feel Lucky?
• What is likely to happen?
• Understand the risk before you commit!
Dealing With Early Estimates
• Give a range: but stakeholders will hear the bottom of the range
• Give a high-probability number:
  • It will still be low in some cases and may be high in many, but consider it a probable "not to exceed"
  • Sticker shock may be a problem
• Give a category rather than a number: e.g., over $10M is Cat 1; over $5M is Cat 2; over $1M is Cat 3; etc. (see the sketch below)
Stakeholders always remember the first number, even when told it is preliminary.
Developers will be optimistic by nature unless the process is tempered.
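A minimal sketch of the category approach mentioned above: report a cost category rather than a number. The thresholds follow the slide's example; the "Category 4" fallback for smaller efforts is an added assumption.

```python
# Minimal sketch of category-based early estimates (over $10M = Cat 1,
# over $5M = Cat 2, over $1M = Cat 3, per the slide).
def cost_category(estimated_cost: float) -> str:
    if estimated_cost > 10_000_000:
        return "Category 1 (over $10M)"
    if estimated_cost > 5_000_000:
        return "Category 2 (over $5M)"
    if estimated_cost > 1_000_000:
        return "Category 3 (over $1M)"
    return "Category 4 (under $1M)"  # assumption: slide lists only the top three

print(cost_category(7_200_000))  # -> Category 2 (over $5M)
```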
Potential Accuracy at Various Gates (Adapted from Canada Treasury)
• Concept / Approach: ±10% to the next gate; ±50% with analogies & knowledge bases
• Business Case: ±40%; analogies, knowledge bases & some specifics
• Project Charter & Plan: ±25%; detailed plan & functional specs
• Construction / Deployment: ±15% (approaching ±0% at completion); SEER ±10% if good inputs
• Post-Deployment: ±0%; note that you should understand whether full functionality was delivered or whether schedule/cost relief was based on deferred functionality
• While Galorath states SEER is within 10%, many organizations report much closer results
Estimate accuracy is a function of input information quality. Estimates can be much closer than shown if data is available.
http://www.tbs-sct.gc.ca/itp-pti/pog-spg/irh-mei/irh-mei03-eng.asp
Converting Uncertainty to Risk Is a Best Practice
• Many treat the two as synonymous, but…
  • In risk situations, probabilities are assigned based on data
  • In uncertainty situations, we may assign probabilities, but there is no data to back them up
• If the probability of rain tomorrow is 50%, that's a risk: we have historical data & scientific analyses about rain that make it reasonable to estimate the probability
• If we say the probability of success in harnessing nuclear fusion for routine energy production within the next ten years is 50%, that's just a guess about uncertainty; evidence for the assignment is lacking or very weak
• If you toss a thumbtack, what is the probability it will land point-up instead of on its back?
GAO Publication: Characteristics of Credible Cost Estimates and a Reliable Process for Creating Them
• The chapter discusses a 1972 GAO report on cost estimating
  • GAO reported that cost estimates were understated and were causing unexpected cost growth
  • Many of the factors causing this problem are still relevant today
• It also discusses a 12-step process for producing high-quality cost estimates
Generalized 10-Step System Estimation Process (2011)
1. Establish estimate scope
2. Establish technical baseline, ground rules, assumptions
3. Collect data / estimation inputs
4. Refine technical baseline into estimable components
5. Estimate baseline cost, schedule, affordability value
6. Quantify risks and perform risk analysis
7. Validate business case costs & benefits (go / no-go)
8. Generate a project plan
9. Document estimates and lessons learned
10. Track the project throughout development
Contractors at Least Level 3 Would Be Acceptable
• Level 0: Informal or no estimating; manual effort estimating without a process
• Level 1: Direct task estimation; spreadsheets; ad hoc process
• Level 2: Formal sizing (e.g., function points); direct task estimation; simple model (size × productivity) or informal SEER use; some measurement & analysis; informal process
• Level 3: Formal sizing; robust parametric estimation (SEER); estimate vs. actual capture; formalized multiple-estimate process; rigorous measurement & analysis; parametric planning & control; risk management; repeatable process
• Level 4: Formal sizing; repeatable process; robust parametric estimating (SEER); rigorous measurement & analysis; parametric estimation with tracking & control; risk management; process improvement via lessons learned
• Level 5: Formal sizing; repeatable process; robust parametric estimating (SEER); rigorous measurement & analysis; parametric estimation with tracking & control; risk management; continuous process improvement
Why should we care? Maturity is related to estimate viability: a better estimation process is more likely to be successful in execution.
Key Points
• Viable affordability decisions yield project achievements
Affordability Process
Step 1. Procure key performance parameters that are inviolate
Step 2. Identify affordability goals & figures of merit (development, life cycle, payback, ROI, NPV, kill ratio, budget constraints, etc.; see the sketch after this list)
Step 3. Gather requirements, features, performance
Step 4. Define baseline alternatives
Step 5. Perform technical design analysis for each alternative
Step 6. Perform cost / schedule analysis of each alternative
Step 7. Assess benefits based on figures of merit
Step 8. Perform probabilistic risk analysis
Step 9. Assess alternatives & select the optimal alternative
Step 10. Document analysis and lessons learned
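A minimal sketch of two of the figures of merit named in Step 2 (NPV and ROI). The discount rate and cash-flow values are illustrative assumptions, not data from the presentation.

```python
# Minimal sketch of affordability figures of merit: net present value and ROI.
def npv(rate: float, cash_flows: list[float]) -> float:
    """Net present value of cash flows, where cash_flows[0] occurs at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def roi(total_benefit: float, total_cost: float) -> float:
    """Simple return on investment as a fraction of cost."""
    return (total_benefit - total_cost) / total_cost

flows = [-5_000_000, 1_200_000, 1_800_000, 2_400_000, 2_600_000]  # years 0..4
print(f"NPV at 8%: ${npv(0.08, flows):,.0f}")
print(f"ROI: {roi(total_benefit=8_000_000, total_cost=5_000_000):.0%}")
```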
Should Cost: Trade Study Flow
1. Requirements and features analysis
2. Preliminary design process (Alternative 1, Alternative 2, Alternative 3, Alternative 4…)
3. SEER assessment of each alternative: performance, schedule, risk assessment, life cycle cost
4. Selection process, weighing other factors: human factors, security, reliability, availability, survivability, supportability, testability, producibility, reuse, transportability
5. Iterate until the selected alternative is optimized on cost, performance, schedule & risk, with bottoms-up estimation as required
Manual Estimates: Human Reasons for Error (Metrics Can Help)
• Manual task estimates yield SIGNIFICANT error without ranges
• The desire for "credibility" motivates overestimating behavior (80% probability?)
  • All the allotted time then gets spent in order to look "reliable"
  • A best-practice approach forces 50%-probability estimates & holds a "buffer" for overruns
• Technical pride sometimes causes underestimates
Balancing Resources & Schedule Is a Best Practice
For a given size, complexity and technology:
• Minimum time to complete: effort increases to reduce schedule
• Optimal effort: lower effort for a longer schedule
• Work expands to fill time: effort increases again on even longer schedules due to lack of pressure
[Figure: effort (person-months) vs. calendar time, rising steeply toward the minimum time and rising again as the schedule stretches past the optimum; see the sketch below]
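A minimal illustrative sketch of the effort/schedule trade-off described above. The curve shape and all coefficients are assumptions chosen only to show the behavior (compression penalty, then "work expands to fill time"); they are not SEER's actual equations.

```python
# Minimal sketch of the effort/schedule trade-off: relative effort as a
# function of schedule, where 1.0 is the optimal-effort schedule.
# All coefficients are illustrative assumptions.
def relative_effort(schedule_factor: float) -> float:
    """Relative effort vs. schedule; factors below ~0.75 are treated as infeasible."""
    if schedule_factor < 0.75:
        return float("inf")                                  # minimum time to complete
    if schedule_factor < 1.0:
        return 1.0 + 4.0 * (1.0 - schedule_factor) ** 2      # compression penalty
    return 1.0 + 0.3 * (schedule_factor - 1.0)               # work expands to fill time

for f in (0.8, 0.9, 1.0, 1.2, 1.5):
    print(f"schedule x{f:.1f} -> relative effort x{relative_effort(f):.2f}")
```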
Understand Project Risks and Include Them in Planning Decisions (Example SEER-SEM Outputs)
[Figures: probability distributions for Example Application 1 – schedule probability (calendar months), effort probability (person-hours), and defects probability (defect count), each plotted from 1% to 99% probability]
Understanding & Tracking Defects, Growth and Other Metrics
• A health and status indicator shows status and trends from the previous snapshot, including size growth and defect discovery/removal rate
• User-defined control limits govern the transition between red, yellow and green
• Track defect discovery and removal rates against expected rates; an increased defect reporting rate shows a worsening trend
• Track software size growth
Goal-Question-Metric Approach Is a Best Practice
Goal → Question → Metric, from organizational goals through development contractors and organizations
• Combines goal orientation with bottoms-up, decision-support & other operational management techniques
• Checking www.weather.com to decide whether to bring an umbrella is decision support
Reasons Many Don't Want to Provide Data
• They could be proven wrong
• It could be used against them
• The data often doesn't exist
  • Even if processes dictate data requirements
• If it exists, it may not be clean
• It may give away corporate productivity & bid strategy
Data Must Be Used With Caution
• Run sanity checks on data (see the sketch below)
  • A million lines of code can't be developed in 3 months
• This is an ongoing issue between our statisticians and engineers
• Some statisticians claim, "That is what the data says, so it must be right"
  • Sometimes even if it is obviously wrong
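A minimal sketch of the kind of sanity check described above: flag records whose implied productivity is implausible. The 50 KSLOC-per-month ceiling is an illustrative assumption, not a published benchmark.

```python
# Minimal data sanity check: reject records like "1,000 KSLOC in 3 months".
MAX_KSLOC_PER_MONTH = 50.0  # illustrative ceiling for a whole-team rate

def is_plausible(ksloc: float, duration_months: float) -> bool:
    """Return False for records with impossible durations or absurd productivity."""
    if duration_months <= 0:
        return False
    return (ksloc / duration_months) <= MAX_KSLOC_PER_MONTH

print(is_plausible(ksloc=1000, duration_months=3))   # False: fails the sanity check
print(is_plausible(ksloc=90, duration_months=12))    # True
```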
Data Doesn't Have to Be Perfect to Be Useful: But It Has to Be Viable
• 80 calories per serving
• 2.5 servings per can
• 4 ounces condensed, 8 ounces with water
You Have An Estimate …
Now What?
The Error of Causal Analysis: Creating a False Association
• Correlation does not imply causation
  • Just because two data points may sit side by side doesn't mean they are the same or will have the same outcome
• Faulty causal analysis is a recognized error in medicine
  • A tumor can cause a headache, perhaps, but a headache doesn't mean a tumor
Use Historical Measurement to
Evaluate Your Estimate!
It’s easy to dig deeper and deeper to justify an estimate!
Estimation Best Practices
• Decide Why You Want An Estimate
• Have A Documented, Repeatable Estimation Process
• Be Proactive: The Process Is Important; The Tools Go Along With The Process
• Get Buy-in From Program Managers
• Tie The Estimate To The Plan
• Map Estimation Goals To Estimate Process Maturity & Develop A Plan To Achieve That Maturity
• Make The Estimating Process As Simple As Possible, But No Simpler
• Hold People Accountable: A Center Of Excellence Can Prepare Estimates But Program Managers Must Own Them
Estimation Best Practices 2
• Evaluate Total Ownership Cost, Not Just Development
• Keep A History: Start An Enterprise Database NOW
• Estimate A Range And Pick A Point For The Plan
• Re-estimate The Program When It Changes
• Avoid Death Marches: Programs With Unachievable Schedules Are Likely To Fail And Drain Morale
• Business Case: Evaluate ROI In Addition To Costs
• Convert Expert Spreadsheets Into A Common Language
Estimation Best Practices 3
• Track Progress Vs. Estimate Throughout The Life Cycle
• Tie The Business Case Into The Estimating Process
• Estimate Schedule As Well As Effort (Cost) For A Complete Picture
• Attack Non-productive Rework As Part Of The Process
Estimation Best Practices 4
• Have clear definitions: What does "complete" mean? Which activities are included and excluded (e.g., development only or total ownership; help desk included or excluded)? Which labor categories are included and excluded in the estimate (e.g., are managers included? Help desk?)
• Measure what you care about
• Estimating & tracking rework can help control costs
• Don't ignore IT infrastructure and IT services costs
• Tracking defect sources can go along with the process
Conclusions
• Cost estimation and analysis are VITAL core processes
• Best practices ferret out what the cost is REALLY anticipated to be
• Risk and uncertainty must be taken into account
• Best-practice project management understands the difference between the two and acts to reduce uncertainty, or convert it to risk
• Applying affordability analysis to the business case yields the best value
Additional Information
• www.galorath.com
• Dan on Estimating blog: www.galorath.com/wp
• Email: [email protected]
• Phone: +1 310 414-3222 x614