Transcript Slide 1

Industrial Committee on Program Management (ICPM)
Predictive Measures of Program Performance
June 9, 2008
Copyright © 2008 Industrial Committee on Program Management. All rights reserved.
Members

BAE Systems          Susan Dong
Battelle             Bill Altman
Bell Helicopter      Bob Kenney
Boeing               Steve Goo
Boeing               Randy Steeno
Honeywell            Keith Munson
Honeywell            Tracie Thompson
Lockheed Martin      Peter Wynne
Lockheed Martin      Steve Stern
MCR                  Neil Albert
Northrop Grumman     Marilyn McAlice
Northrop Grumman     Kevin Carpenter
Raytheon             Skip Burns
Rockwell Collins     Ron Hornish
SI International     Bill Chadick
Army                 Craig Tallman
NAVAIR               Dave Burgess
NAVAIR               Ted Rogers
USAF                 LtCol Fred Gregory
Agenda
• Recap from last NDIA Meeting
• Common Definitions
• Plan and Next Steps
• Help Needed
Recap from NDIA Meeting
METRIC DEFINITION
Metric Name: Cost Performance Index (CPI)
Creation Date: 05/16/08
Revision Date: 06/05/08
Metric Backup

Purpose of Chart: Compare the value of work performed to the actual cost of that work to determine how efficiently work was performed – “bang for the buck” – as one element to assess program progress
Deployment Criteria / Linkage: NDIA Industry Standard Metric. Links to SPI, CPI-TCPI
Source of Data: Program Management, Program Control, Finance
Benchmark/Comparative Data: None at this time

[Chart: monthly trend of CUM CPI, 3- or 6-month Rolling Average CPI, and TCPI from 2006 through 2008, with Unsatisfactory / Marginal / Acceptable / Excellent rating bands]
Metric Definition: This metric is the ratio of budgeted cost of work performed (BCWP) divided by actual cost of work performed (ACWP).
CPI = BCWP / ACWP = (% Comp * total BCWS) / ACWP [= (% Comp * total budget hours) / actual hours]
CPI = 1.00: on cost; CPI < 1.00: negative (unfavorable) cost performance; CPI > 1.00: positive (favorable) cost performance
Blue: CPI > 1.00; Green: 0.95 < CPI < 1.00; Yellow: 0.90 < CPI < 0.95; Red: CPI < 0.90
(A worked sketch of this calculation appears after this card.)
Applicable Life Cycle Phase: All (Development, Production, Sustainment)
Predictive / Leading Qualities: When trended, indicates patterns that can be used to project an outcome based on time-phased actual performance
Warning Signs & Actions to Take: When the rolling average falls below 0.95, identify the root cause and take corrective action
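Illustrative sketch (not part of the original slide): a minimal Python example of the CPI calculation, the Blue/Green/Yellow/Red bands, and the rolling-average warning check described above. All hour figures are hypothetical.

    # Minimal illustrative sketch, not from the ICPM slide. Computes CPI = BCWP / ACWP,
    # maps it to the rating bands defined above, and applies the 0.95 warning check.
    # All figures are hypothetical.

    def cpi(bcwp, acwp):
        """Cost Performance Index: budgeted cost of work performed / actual cost."""
        return bcwp / acwp

    def cpi_band(value):
        """Map a CPI value to the rating bands from the metric definition."""
        if value > 1.00:
            return "Blue"    # positive (favorable) cost performance
        if value > 0.95:
            return "Green"
        if value > 0.90:
            return "Yellow"
        return "Red"

    bcwp_by_month = [1000, 1050, 980]   # hypothetical earned hours, last three months
    acwp_by_month = [1020, 1150, 1100]  # hypothetical actual hours, last three months

    rolling_cpi = cpi(sum(bcwp_by_month), sum(acwp_by_month))
    print(f"3-month rolling CPI = {rolling_cpi:.2f} ({cpi_band(rolling_cpi)})")
    if rolling_cpi < 0.95:
        print("Warning sign: identify root cause and take corrective action")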
METRIC DEFINITION
Metric Name: Schedule Performance Index (SPI)
Creation Date: 05/16/08
Revision Date: 06/05/08
Metric Backup

Purpose of Chart: Compare the value of work performed to the value of work scheduled
Deployment Criteria / Linkage: NDIA Industry Standard Metric. Links to CPI
Source of Data: Program Management, Program Control, Finance
Benchmark/Comparative Data: None at this time
Applicable Life Cycle Phase: All (Development, Production, Sustainment)

[Chart: monthly trend of CUM SPI and 3- or 6-month Rolling Average SPI from 2006 through 2008, with Unsatisfactory / Marginal / Acceptable / Excellent rating bands]
Metric Definition: This metric is the ratio of budgeted cost of work performed (BCWP) divided by budgeted cost of work scheduled (BCWS).
SPI = BCWP / BCWS = (% Comp * total BCWS) / BCWS [= (% Comp * total budget hours) / budget hours]
SPI = 1.00: on schedule; SPI < 1.00: behind schedule; SPI > 1.00: ahead of schedule
Blue: SPI > 1.00; Green: 0.95 < SPI < 1.00; Yellow: 0.90 < SPI < 0.95; Red: SPI < 0.90
Note: SPI does not guarantee that the work completed is the same work that was planned, and it should be used with detailed milestone metrics. (A worked sketch of this calculation appears after this card.)
Usage Assumption: SPI should be used in conjunction with detailed schedule milestone metrics (late start, late finish, etc.)
Predictive / Leading Qualities: When trended, indicates patterns that can be used to project an outcome based on time-phased actual performance
Warning Signs & Actions to Take: When the rolling average falls below 0.95, identify the root cause and take corrective action
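Illustrative sketch (not part of the original slide): a minimal Python example of the SPI calculation and the 0.95 warning check, using hypothetical cumulative values.

    # Minimal illustrative sketch, not from the ICPM slide. Computes SPI = BCWP / BCWS
    # and applies the warning threshold described above. Figures are hypothetical.

    def spi(bcwp, bcws):
        """Schedule Performance Index: work performed / work scheduled (budgeted cost)."""
        return bcwp / bcws

    cum_bcwp = 4800.0   # cumulative budgeted cost of work performed (earned value)
    cum_bcws = 5200.0   # cumulative budgeted cost of work scheduled (planned value)

    cum_spi = spi(cum_bcwp, cum_bcws)
    print(f"CUM SPI = {cum_spi:.2f}")   # < 1.00: behind schedule; > 1.00: ahead of schedule
    if cum_spi < 0.95:
        # Per the usage assumption, confirm with detailed milestone metrics (late starts/finishes)
        print("Warning sign: identify root cause and take corrective action")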
METRIC DEFINITION
Metric Name: Estimate at Completion (EAC)
Creation Date: 05/16/08
Revision Date: 06/05/08
Metric Backup

Purpose of Chart: Measure the estimate of total cost for authorized work (EAC), including actual costs plus estimated costs to complete
Deployment Criteria / Linkage: NDIA Industry Standard Metric. Links to SPI, CPI, Average Cost, Burn Rate
Source of Data: Finance
Benchmark/Comparative Data: None at this time
Applicable Life Cycle Phase: All (Development, Production, Sustainment)

[Chart: “Stat EAC vs. EAC” (Northrop Grumman example, marked Private/Proprietary – Level 1) showing IPT EAC, Cum CPI EAC, and 6 Mos EAC trend lines, $ in millions (roughly $300M–$400M), Jul-05 through Dec-05]
Metric Definition: Management's assessment of probable outcomes based on current relevant information (i.e., performance to date), realistic plans and assumptions, cost quantification of risks and opportunities (e.g., cost to mitigate risks and opportunities with cost reductions), comparison/usage of available management reserve, sound management judgment, and the proper inclusion of financial assessments and assumed liabilities.
Compares the program “grass roots” EAC with a forecasted EAC calculated based upon past performance.
Key Attributes/Thresholds:
• IPT EAC
• CUM CPI EAC
• 6-month EAC
(A worked sketch of the statistical EACs appears after this card.)
Usage Assumption: EAC should be used in conjunction with other metrics: 1) schedule completion, 2) performance to date, 3) remaining work and its anticipated performance, 4) rates, 5) outstanding commitments, 6) approved and/or pending scope changes, 7) funding constraints, 8) subcontractor EACs, and 9) program risks and opportunities
Predictive / Leading Qualities: When trended, provides a comparative analysis (objective indicator) of projected outcomes based upon actual performance
Warning Signs & Actions to Take: If the program EAC is less than the CPI-derived EAC, understand the differences; if not rationalized, take management reserve against EAC to align with the statistical EAC.
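Illustrative sketch (not part of the original slide): the standard CPI-based statistical EAC, EAC = ACWP + (BAC - BCWP) / CPI, applied with the cumulative and 6-month CPI and compared against a hypothetical IPT (“grass roots”) EAC, following the warning sign above. All dollar figures are hypothetical.

    # Minimal illustrative sketch, not from the ICPM slide. Uses the standard CPI-based
    # statistical EAC: actuals to date plus remaining work divided by a CPI factor.
    # All dollar figures are hypothetical.

    def statistical_eac(acwp, bac, bcwp, cpi_factor):
        """EAC = ACWP + (BAC - BCWP) / CPI."""
        return acwp + (bac - bcwp) / cpi_factor

    bac = 400.0            # budget at completion, $M
    acwp = 210.0           # actual cost of work performed to date, $M
    bcwp = 190.0           # budgeted cost of work performed to date, $M
    cum_cpi = bcwp / acwp  # cumulative CPI
    six_month_cpi = 0.88   # hypothetical 6-month rolling CPI

    ipt_eac = 355.0        # program "grass roots" (IPT) EAC, $M
    cum_cpi_eac = statistical_eac(acwp, bac, bcwp, cum_cpi)
    six_mo_eac = statistical_eac(acwp, bac, bcwp, six_month_cpi)

    print(f"IPT EAC = {ipt_eac:.0f}  CUM CPI EAC = {cum_cpi_eac:.0f}  6-month EAC = {six_mo_eac:.0f}")
    if ipt_eac < max(cum_cpi_eac, six_mo_eac):
        print("Warning sign: program EAC is below the statistical EAC; rationalize or apply management reserve")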
Plan & Next Steps
• Estimate at Completion (EAC)
• Cost Performance Index (CPI)
• Schedule Performance Index (SPI)
[Process flow, with dates from the original chart:]
Define Relevant Predictive Metrics (3/17) → Determine Metrics (4/30) → Determine Alarms, Warning Signs & Actions to Take (5/16) → Coordinate Metrics with Working Group (5/23) → Finalize Metric Definitions (6/6) → Propose Metric Definitions to NDIA (6/9) → Approved? Yes: Analyze Aggregate Relationship(s), then Select Next Metrics; No: loop back to refine the metric definitions.
Help Needed
• None
BACKUP
Charter & Objective – Program Measures & EVM
Charter
• Facilitate the use of predictive measures, including earned value, to ensure program success
• Meeting frequency
• Period of performance
Objective
• Communicate a set of predictive measures that will help contractors and their government counterparts predict program performance (early enough to make corrective action effective) and understand root causes of performance
  – Predictive measures that cover the program’s lifecycle from pre-award through contract close-out
  – Predictive measures that can be tailored to the contract characteristics, contract type, and phase of the program
• Recommend joint next steps
Commonly Measured Predictive Metrics (Metric Category: Examples)

EVM Cost Performance: CPI vs TCPI (see the sketch after this table)
Estimate at Completion (EAC): IEAC vs Program EAC
Staffing (Critical Skills): Open Requisition/Aging, Fill Rate
Risks and Opportunities: Risk Burndown & Opportunity Capture integrated with MR; MR Burndown, %ETC, Risk Register
Schedule (IMS) Quality & Performance: SPI, BEI, CPLI, Float, Late Starts/Finishes
Requirements Definition & Stability: Added, Changed, Deleted Requirements, Volatility, Validation, TBD Burndown
Technical Performance Measures: SLAs, TPP, KPP
Contract Health: UCAs, Volume, Traffic, Requests, Funding Profile
Supply Chain Performance: Cost, Schedule, Quality, Delivery Performance, Process Compliance
30/60/90 Day Look Ahead: Significant Upcoming Milestones/Events Comparison
Top Issues & Action Items: Corrective / Preventive, Latency
Product Quality: Defect Rate (Actual vs Predicted), Timeliness of Delivery
Program Financial Summary: Billings, Expenditure Profile, Funding
Resources: Infrastructure, Availability of PP&E, GFE Availability & Quality
Customer Satisfaction: CPAR, BPAR, Award Fee, Survey
Productivity Variance: Plan vs Actual Component per Unit
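Illustrative sketch (not part of the original slide) for the first row’s CPI vs TCPI comparison. TCPI here is taken as the standard to-complete performance index against the EAC, (BAC - BCWP) / (EAC - ACWP); all figures are hypothetical.

    # Minimal illustrative sketch, not from the ICPM slide. Compares demonstrated cost
    # efficiency (CPI) with the efficiency required to achieve the EAC (TCPI).
    # All dollar figures are hypothetical.

    bac = 400.0    # budget at completion, $M
    eac = 420.0    # estimate at completion, $M
    bcwp = 190.0   # budgeted cost of work performed to date, $M
    acwp = 210.0   # actual cost of work performed to date, $M

    cpi = bcwp / acwp                    # efficiency achieved so far
    tcpi = (bac - bcwp) / (eac - acwp)   # efficiency required on remaining work to hit the EAC

    print(f"CPI = {cpi:.2f}  TCPI = {tcpi:.2f}")
    # A TCPI well above the demonstrated CPI suggests the EAC may be optimistic.
    if tcpi - cpi > 0.05:
        print("Required efficiency exceeds demonstrated efficiency; revisit the EAC")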