TTC Cost & Schedule Control Training


Earned Value Management as a measure of
“Quality Technical Accomplishment”
Joseph Houser
Dan Milano
Paul Solomon
January 2009
Most descriptions of EVM include a measure of technical progress

Objective: Review the current expectations of EVM to provide management a measure of quality technical accomplishments and progress

• Current environment
• Review OMB and DoD policy and guides
• Review the ANSI/EIA 748 EVM Standard
• Identify gaps
• Next steps

2
Earned Value Management has matured over the past 40+ years, with several success stories
• 1967 – DoD - Cost/Schedule Control Systems Criteria
• 1996 – OMB Circular A-11, Part 3
• 1997 – DoD - Earned Value Management Systems Criteria
• 1998 – ANSI/EIA-748 EVM Standard (Comm’l)
• 2002 – OMB Circular A-11, Part 7 (requires 748 compliance – all Agencies)
• 2002 – ANSI/EIA-748-A
• 2004 – NDIA EVMS Intent Guide
• 2005 – PMI EVMS Practice Standard
• 2006 – ANSI/EIA-748 (update to recognize Program Level EVM)
EVM has matured over the years and the Government accepts
and endorses ANSI/EIA 748 EVM Standard
3
Mature PM processes and practices using EVM improve business measures

[Charts: four panels plotting business measures against program management capability (Minimal Capability, Marginal Performer, Qualified Participant, Best in Class, Composite World Class): company return on sales (about -15% to +15% around the mid-point), program CPARS rating, company win rate (about -8% to +8%), and program award fee capture (about -30% to +30%).]

Source: 00-Mar 21 DCMC Conference

4
Improved cost and schedule control processes and
practices do not have to increase PM costs
[Chart: Program Office FTEs as a percentage of total program FTEs (about -30% to +30% around the mid-point) plotted against program management capability, Minimal Capability through Composite World Class.]

FTE = Full Time Equivalent:
Program Manager(s), Deputy Program Manager(s), Financial Manager(s)/Financial Analyst(s), Scheduler(s)/Planner(s),
Configuration and Data Manager(s), Chief Engineer(s)/Chief Technical Specialists, IPT or Functional Team Leads, Risk
Focal Point(s), Subcontract Management, Administrative Support, Other Program Office functions
5
FAA “Cost of EVM” study indicated programs with mature EVM incur lower PM costs

[Chart: FAA Cost of Program Management Using EVM (FY06): PM % of total cost (about 6.5% to 34.5%) plotted against quality of EVM implementation (0.7 to 3.2 scale, flagged RED / YELLOW / GREEN). The weighted-average PM % drops from 22.2% (RED) to 16.6% (YELLOW) to 13.0% (GREEN).]

- Quality of EVM implementation based on EVM assessments (FAA EVM Flag)
- PM % of total cost based on the FY06 Resource Planning Document (RPD)

6
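A weighted-average PM % like the averages flagged in the chart above can be sketched as follows. The per-program records, flag values, and the choice to weight by total program cost are assumptions for illustration, not FAA data:

```python
# Hypothetical sketch of a weighted-average PM% statistic: each program's
# PM cost share is weighted by program size (total cost), grouped by the
# EVM quality flag. All figures below are invented for illustration.

programs = [
    {"flag": "GREEN", "total_cost": 120.0, "pm_cost": 14.4},
    {"flag": "GREEN", "total_cost": 80.0,  "pm_cost": 11.2},
    {"flag": "RED",   "total_cost": 50.0,  "pm_cost": 12.0},
]

def weighted_avg_pm_pct(programs, flag):
    """PM cost as a percent of total cost, pooled across one flag group."""
    group = [p for p in programs if p["flag"] == flag]
    total = sum(p["total_cost"] for p in group)
    pm = sum(p["pm_cost"] for p in group)
    return 100.0 * pm / total

print(round(weighted_avg_pm_pct(programs, "GREEN"), 1))  # prints 12.8
```

Pooling costs before dividing (rather than averaging the per-program ratios) keeps large programs from being under-represented, which matches the “weighted average” label in the chart.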
The use of EVM has produced several success stories, and Government and industry are striving to increase them
[Chart repeated from the previous slide: FAA Cost of Program Management Using EVM (FY06).]
7
Most EVM training includes integration of technical / schedule / cost

“All programs have an element of risk requiring effective and integrated cost / schedule management processes.”

[Diagram: risk management weighed against cost, targeting low cost.]

8
OMB requires Quality measurement during
Acquisition of Capital Assets
Circular No. A-11, Section 300,
Planning, Budgeting, Acquisition and Management of
Capital Assets, Section 300-5
• Performance-based acquisition management
• Based on EVMS standard
• Measure progress towards milestones
• Cost
• Capability to meet specified requirements
• Timeliness
• Quality
9
PMI PMBOK® Guide recognizes Product Scope and
quality/technical parameters
10.5.1.1 Project Management Plan
• PMB:
- Typically integrates scope, schedule, and cost
parameters of a project
- May also include technical and quality parameters
5. Project Scope Management, 2 elements
• Product scope. The features and functions that
characterize a product, service, or result
• Project scope. The work that needs to be accomplished
to deliver a product, service, or result with the specified
features and functions.
It can be argued that project management plans should
always include technical and quality parameters
10
GAO Expects EVM to measure technical progress
GAO Cost Guide:
“Reliable EVM data usually indicate monthly how well a program is
performing in terms of cost, schedule, and technical matters.”
“A WBS is an essential part of EVM cost, schedule, and technical
monitoring, because it provides a consistent framework from
which to measure actual progress.”
“The benefits of using EVM are singularly dependent on the data
from the EVM system. Organizations must be able to evaluate the
quality of an EVM system in order to determine the extent to which
the cost, schedule, and technical performance data can be relied
on for program management purposes.”
11
Management expectation for EVM to include measures of
quality technical progress is reasonable
Sources: GAO Cost Guide; PMI PMBOK (10.5.1.1 Project Management Plan; 5. Project Scope Management)

12
OSD “the left bar chart illustrates the fact that roughly half of our
key Earned Value data fields are empty, for a variety of reasons”
•Funding Data Quality: Acceptable for Critical Measurements and Decision-making
•EVM Data Quality: Unacceptable for Critical Measurements and Decision-making
13
14
GAO recent report included poor quality data
finding on a major procurement
March 2008:
• DCMA determined the data to be of poor quality and issued a report stating that it is deficient to the point where the government is not obtaining useful program performance data to manage risks.
15
The EVM community needs to conduct a root cause analysis with corrective action to regain our customers’ confidence

GAO March 2008:
• DCMA determined the data to be of poor quality and issued a report stating that it is deficient to the point where the government is not obtaining useful program performance data to manage risks.
DCMA has significantly increased oversight with the
intent to improve the usefulness of EVM to management
16
Let’s summarize:
• EVM works, with numerous success stories
• Some implementations go beyond the 32 ANSI guidelines and have integrated quality and technical parameters
• Integration of scope, schedule, cost, quality, and
technical measures is “Desired by our stakeholders
using EVM data”
• EVM data integrity is a major issue with OSD
17
DoD Policy and Guides Specify Technical Performance
• DoDI 5000.02, Operation of the Defense Acquisition System (POL)
• Defense Acquisition Guidebook (DAG)
• Systems Engineering Plan (SEP) Preparation Guide
• WBS Handbook, Mil-HDBK-881A (WBS)
• Integrated Master Plan & Integrated Master Schedule Preparation
& Use Guide (IMP/IMS)
• Guide for Integrating SE into DOD Acquisition Contracts (Integ
SE)
18
DoD Policy and Guides: Common Elements
• DAG, WBS, IMP/IMS, SEP
  • Integrated plans: WBS, SEP, IMP/IMS
  • Technical Performance Measures (TPM)
  • EVM
• Technical reviews
  • Event-driven timing
  • Success criteria
  • Assess technical maturity
• Integ SE Guide
  • Include technical baselines in IMP/IMS
  • During IBR, review:
    • Correlation of TPMs, IMP, IMS, EVM
    • Success criteria

19
ANSI does not require technical or quality parameter
measurement, only the “quantity of work accomplished”
ANSI/EIA 748 EVM Standard
Paragraph 3.8 – Performance Measurement
“Earned value is a direct measurement of the quantity of work
accomplished. The quality and technical content of work performed is
controlled by other processes. Earned value is a value added metric that is
computed on the basis of the resources assigned to the completed work
scope as budget.”
EXAMPLES:
1. If a test is complete (design meets the requirements); then it is acceptable to
claim 100% earned value of the planned scope for “test”
2. If software design, code, and test are complete, then it is acceptable to claim
100% earned value of the planned scope for “SW Development”
ANSI does not require links or interfaces to quality or
technical parameter measurement processes
20
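As a concrete sketch of paragraph 3.8’s definition, earned value can be computed purely from the budget assigned to completed work scope and the claimed percent complete, with no quality or technical gate. The task list, budgets, and actual cost below are invented for illustration; only the EV/CPI/SPI formulas themselves are standard:

```python
# Illustrative sketch (not from the standard): per ANSI/EIA-748 para. 3.8,
# earned value (BCWP) is the budget of completed work scope -- it says
# nothing about the quality or technical content of that work.

def earned_value(tasks):
    """Sum each task's budget times its claimed percent complete."""
    return sum(t["budget"] * t["pct_complete"] for t in tasks)

tasks = [
    {"name": "test",   "budget": 100.0, "pct_complete": 1.0},  # test done -> claim 100%
    {"name": "SW dev", "budget": 250.0, "pct_complete": 0.6},  # design+code done, test pending
]

ev = earned_value(tasks)              # BCWP
pv = sum(t["budget"] for t in tasks)  # BCWS: assume all this work was planned by now
ac = 280.0                            # ACWP: actual cost to date (hypothetical)

cpi = ev / ac  # cost performance index
spi = ev / pv  # schedule performance index
print(ev, round(cpi, 3), round(spi, 3))
```

Note that `ev` is identical whether or not the completed “test” actually demonstrated the design meets requirements; that gap is exactly what the rest of this briefing addresses.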
ANSI recognizes technical performance goals (but does not require them), and there are no references to quality parameters

ANSI/EIA 748 EVM Standard
Paragraph 2.2: Planning, Scheduling, and Budgeting
• a) Schedule the authorized work in a manner which describes the sequence of work and identifies significant task interdependencies required to meet the requirements of the program.
• b) Identify physical products, milestones, technical performance goals, or other indicators that will be used to measure progress.

Note: Technical performance goals are acceptable, but not required

21
Let’s summarize OMB, OSD policies and ANSI/EIA 748 as related to EVM, quality, and technical performance:
• OMB Guide
  • Measure capability to meet requirements
  • Measure quality
• OSD Policies and Guides
  • Integrate WBS, IMP/IMS, EVM with TPMs
  • Include success criteria of technical reviews in IMS
  • Assess technical maturity
• ANSI/EIA 748 EVM Standard
  • EVM limited to “quantity of work accomplished”
  • Technical performance goals recognized, but not required
  • Quality performance is not referenced
  • Quality and technical are specifically referenced as being “controlled by other processes”

22
Is it possible to include quality and technical parameters
with EVM?
• FAA has incorporated standard program
milestones with exit criteria that represent:
• Quality and technical parameters
• Decision authority to verify acceptable quality
and technical performance progress
23
The FAA technical milestones (required by AMS policy) are defined with clear exit criteria and decision authority

STANDARD AMS SYSTEM MILESTONES: WBS MAPPING AND DECISION AUTHORITY
Applicability legend: F = Full Scale Development, C = Commercial Off The Shelf, N = Non-Development Items

S18: Final investment decision (AMS phase: Final investment analysis; applicability: F,C,N)
  The JRC approves and baselines an investment program at the final investment decision. The decision document is the XPB and its planning attachments.
  WBS 2.2.3 (Level 1): Final Investment Decision Documentation. All activities associated with completing the final investment decision, which includes the following documents: JRC briefing package, revalidated MNS, final requirements document, final XPB, final acquisition strategy, final integrated program plan (with risk management plan), and the final investment analysis report.
  Decision authority: Joint Resources Council

S19: Integrated baseline review completed (AMS phase: Solution implementation; applicability: F,C,N)
  The IBR conduct is complete and the service team and leader agree on action items.
  WBS 3.1.1 (Level 2): Solution Implementation - Program Planning, Authorization, Management and Control. All activities associated with developing the strategy for implementing and executing the overall program, and with planning, authorizing, and managing all actions and activities that must be accomplished for successful program development, which includes preparation of the acquisition strategy paper and the integrated program plan, and the project-specific input to agency-level planning documents, such as the call for estimates and the NAS architecture. It also includes all activities required to ensure that all cost, schedule, performance, and benefit objectives are met.
  Decision authority: Service team leader

S24: Preliminary design review (PDR) completed (AMS phase: Solution implementation; applicability: F,N)
  PDR is conducted by the service team to determine conformity of functional characteristics of the design to baseline requirements. The PDR represents approval to begin detailed design. The PDR is complete when the service team determines that action items resulting from the review are sufficiently completed and the contracting officer authorizes the contractor to proceed.
  WBS 3.2.3 (Level 2): Solution Development - Analysis, Design, and Integration. All activities associated with the overall analysis, design, test, and integration of the solution (e.g., hardware system, software, facility, and telecommunications). This includes design, integrity, test and analysis, intra- and inter-system compatibility assurance (interface identification, analysis, and design), and the integration and balancing of reliability, maintainability, producibility, safety, and survivability. Design includes allocating functions to appropriate elements (e.g., hardware, software, telecommunications, user functions, services, facilities, etc.) and presenting prepared design information at identified design reviews.
  Decision authority: The service team leader

S31: Operational test & evaluation (OT&E) completed (AMS phase: Solution implementation; applicability: F,C,N)
  OT&E is conducted in an environment as operationally realistic as possible. This milestone is completed when government integration and shakedown testing has been performed at the test and evaluation site. It occurs when all test procedures have been successfully completed per the test plan.
  WBS 3.5.2 (Level 2): System Operational Test and Evaluation. All activities associated with tests and evaluations conducted to assess the prospective system’s utility, operational effectiveness, operational suitability, and logistics supportability (including compatibility, interoperability, reliability, maintainability, logistics requirements, security administration, etc.). It includes all support activities (e.g., technical assistance, maintenance, labor, material, support elements and testing spares, etc.) required during this phase of testing, and all activities associated with development and construction of those special test facilities, test simulators, test beds, and models required for performance of the operational tests.
  Decision authority: OT&E team leader (WJHTC)

24
The FAA AMS technical milestones are used by multiple processes to align common performance measures

[The standard AMS system milestone table from the previous slide, annotated to show the processes that consume the common milestones: FAA Product Oriented WBS, FAA Functional WBS Mapping, Systems Engineering, FAA Annual Performance Goals, OMB Exhibit 300 Report, Cost Baseline, Schedule Baseline, and Earned Value Management (FID).]

25
FAA EVM summarizes the work and activities required to achieve the FAA AMS Standard Program Milestones

x300 program baseline: planned completion 4/30/2015, total cost $111.400M

Description of Milestone | Planned Date | Actual/Est. Date | Planned $M | Actual $M | Schedule (days) | Cost ($M) | % Complete
Program Total | 3/4/2012 | 3/4/2012 | 133.000 | 48.503 | 0 | 64.547 | 85.0%
Planning
(S9) Initial Investment Decision | 6/7/2006 | 6/7/2006 | 2.500 | 1.430 | 0 | 1.070 | 100.0%
(S18) Final Investment Decision (FID) | 2/23/2007 | 2/23/2007 | 2.500 | 1.470 | 0 | 1.030 | 100.0%
Total Planning | 2/23/2007 | 2/23/2007 | 5.000 | 2.900 | 0 | 2.100 | 100.0%
Solution Implementation (Acquisition)
Design Phase (JRC Approved)
Program Management | 8/13/2008 | 9/15/2008 | 5.000 | 2.340 | 0 | 2.660 | 100.0%
(S19) IBR | 4/5/2007 | 4/7/2007 | 0.500 | 0.453 | -2 | 0.047 | 100.0%
(S20) Contract Award | 4/30/2007 | 5/15/2007 | 2.500 | 2.870 | -15 | -0.370 | 100.0%
(S24) PDR | 6/17/2007 | 7/21/2007 | 3.500 | 3.522 | -34 | -0.022 | 100.0%
(S25) CDR | 11/30/2007 | 12/12/2007 | 2.000 | 1.544 | -12 | 0.456 | 100.0%
(S26) Prod Demo Decision | 8/13/2008 | 9/15/2008 | 3.000 | 2.713 | -33 | 0.287 | 100.0%
Other | 8/13/2008 | 9/15/2008 | 2.000 | 1.970 | -33 | 0.030 | 100.0%
Subtotal Design | 8/13/2008 | 8/13/2008 | 18.500 | 15.412 | -129 | 3.088 | 100.0%
Product Demonstration Phase (JRC Approved)
Program Management | 11/11/2009 | 11/11/2009 | 5.000 | 2.367 | 0 | 2.633 | 100.0%
(S28) System Delivery | 8/30/2008 | 9/15/2008 | 2.400 | 1.250 | -15 | 1.150 | 100.0%
(S30) Development T&E (DT&E) | 12/17/2008 | 12/17/2008 | 3.200 | 2.598 | 0 | 0.602 | 100.0%
(S31) Operational T&E (OT&E) | 3/11/2009 | 3/11/2009 | 2.000 | 2.175 | 0 | -0.175 | 100.0%
(S34) Prod Readiness Review (PRR) | 7/23/2009 | 7/23/2009 | 4.000 | 2.301 | 0 | 1.699 | 100.0%
(S35) Production Decision | 11/11/2009 | | 3.205 | 2.800 | | 0.052 | 89.0%
Other | 11/11/2009 | | 4.600 | 4.200 | | 0.170 | 95.0%
Subtotal Product Demonstration | 11/11/2009 | 11/11/2009 | 24.405 | 17.691 | -15 | 6.131 |
Production and Deployment Phase (Planned) | | | 1.000 | | | |

26
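The schedule (days) and cost ($M) variance columns in the milestone table can be recomputed from the date and budget columns. A minimal sketch using two rows from the table, assuming the sign convention that negative means behind schedule or over cost:

```python
# Recompute milestone variances (illustrative; dates and budgets taken
# from the FAA milestone table, sign convention assumed):
#   schedule variance (days) = planned date - actual date
#   cost variance ($M)       = planned cost - actual cost
from datetime import date

milestones = [
    # (name, planned date, actual date, planned $M, actual $M)
    ("(S20) Contract Award", date(2007, 4, 30), date(2007, 5, 15), 2.500, 2.870),
    ("(S24) PDR",            date(2007, 6, 17), date(2007, 7, 21), 3.500, 3.522),
]

results = []
for name, planned, actual, plan_cost, act_cost in milestones:
    sched_days = (planned - actual).days        # negative = completed late
    cost_var = round(plan_cost - act_cost, 3)   # negative = overran budget
    results.append((name, sched_days, cost_var))

print(results)  # Contract Award: -15 days, -$0.370M; PDR: -34 days, -$0.022M
```

Both results match the table’s schedule and cost columns for those rows, which is a quick integrity check to run on milestone data of this kind.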
The FAA implementation of ANSI/EIA 748 includes clear and well-understood quality and technical parameters
[Milestone table repeated from the previous slide.]

27
Examples of integrating technical performance with EVM
• Milestone success criteria met to claim 100% EV
•
CDR: Design solution meets
– Allocated performance requirements
– Functional performance requirements
– Interface requirements
• Interim milestones with planned values for TPMs
• Weight does not exceed 300 lb. at (date)
• 90% of software functional “shalls” met by (date)
• Base EV on 2 measures:
• Completion of enabling work products (drawings, code)
• Meeting product requirements (as documented in technical
baseline)
28
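The “base EV on 2 measures” idea above can be sketched in a few lines. The function name, the 50/50 weighting, and the CDR figures below are hypothetical choices for illustration; the technique is simply to credit earned value only in proportion to both completed work products and verified requirements:

```python
# Hypothetical sketch of two-measure earned value: budget is split between
# enabling work products (drawings, code) and meeting product requirements
# as documented in the technical baseline.

def gated_earned_value(budget, work_products_done, requirements_met, weight=0.5):
    """EV credited from two fractions in [0, 1]; weight = work-product share."""
    return budget * (weight * work_products_done + (1 - weight) * requirements_met)

# CDR example: all drawings released, but only 80% of allocated, functional,
# and interface requirements verified -> less than 100% EV is claimable.
ev = gated_earned_value(budget=3.5, work_products_done=1.0, requirements_met=0.8)
print(round(ev, 3))  # prints 3.15 (of a $3.5M budget)
```

Under plain quantity-of-work EV, releasing all drawings could justify claiming the full $3.5M; the gate holds back EV until the technical success criteria are met.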
Some of the possible ANSI revisions are:

Section: 1. Introduction
Is: “The principles of an EVMS are: Plan all work scope for the program to completion”
Recommended clarification: Clarify that work scope includes technical and quality requirements

Section: Paragraph 3.8 – Performance Measurement
Is: “Earned value is a direct measurement of the quantity of work accomplished. The quality and technical content of work performed is controlled by other processes. Earned value is a value added metric that is computed on the basis of the resources assigned to the completed work scope as budget.”
Recommended clarification: Clarify that earned value is a direct measurement of the quantity of work accomplished and technical/quality performance

29