
USC
C S E
University of Southern California
Center for Software Engineering
COSYSMO-IP
COnstructive SYStems Engineering Cost
Model – Information Processing
PSM User’s Group Conference
Keystone, Colorado
July 24 & 25, 2002
Dr. Barry Boehm
Ricardo Valerdi
University of Southern California
Center for Software Engineering
July 2002
Version 3
Outline – Day 1
• USC Center for Software Engineering
• Background & Update on COSYSMO-IP
• Ops Con & EIA632
• Delphi Round 1 Results
• Updated Drivers
• Lessons Learned/Improvements
• LMCO & INCOSE Comments
• Q&A
Outline – Day 2
• Review of yesterday’s modified slides to clarify terminology
• A few new slides to emphasize points
• Review of current driver definitions
• Definitions for two new cost drivers
  – Technology Maturity
  – Physical system/information system tradeoff analysis complexity
Objectives of the Workshop
• Agree on a Concept of Operation
• Converge on scope of COSYSMO-IP model
• Address definitions of model parameters
• Discuss data collection process
USC Center for Software Engineering
• 8 faculty/research staff, 18 PhD students
• Corporate Affiliates program (TRW, Aero Galorath, Raytheon, Lockheed, Motorola, et al)
• 17th International Forum on COCOMO and Software Cost Modeling, October 22-25, 2002, Los Angeles, CA
  – Theme: Software Cost Estimation and Risk Management
• Annual research review in March 2003
COSYSMO-IP: What is it?
The purpose of the COSYSMO-IP project
is to develop an initial increment of a
parametric model to estimate the cost of
system engineering activities during system
development.
The focus of the initial increment is on the
cost of systems engineering for information
processing systems or subsystems.
What Does COSYSMO-IP Cover?
• Includes:
  – System engineering in the inception, elaboration, and construction phases, including test planning
  – Requirements development and specification activities
  – Physical system/information system tradeoff analysis
  – Operations analysis and design activities
  – System architecture tasks, including allocations to hardware/software and consideration of COTS, NDI and legacy impacts
  – Algorithm development and validation tasks
• Defers:
  – Physical system/information system operation test & evaluation, deployment
  – Special-purpose hardware design and development
  – Structure, power and/or specialty engineering
  – Manufacturing and/or production analysis
Candidate COSYSMO Evolution Path
(life-cycle phases covered: Inception, Elaboration, Construction, Oper Test & Eval, Transition)
1. COSYSMO-IP – IP (Sub)system
2. COSYSMO-C4ISR – C4ISR System
3. COSYSMO-Machine – Physical Machine System
4. COSYSMO-SoS – System of Systems (SoS)
Current COSYSMO-IP Operational Concept
• Size Drivers (inputs): # Requirements, # Interfaces, # Scenarios, # Algorithms, Volatility Factor, …
• Effort Multipliers (inputs): Application factors, Team factors, Schedule driver
• COSYSMO-IP outputs: Effort, Duration
• Calibration
• WBS guided by EIA 632
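The operational concept above follows the usual COCOMO-family parametric shape: weighted size drivers feed an effort equation that effort multipliers then scale. A minimal sketch of that structure in Python; all weights, constants, and driver names here are illustrative placeholders, not calibrated COSYSMO-IP values:

```python
# Sketch of the COCOMO-style parametric structure described above.
# All weights, exponents, and constants are illustrative placeholders,
# NOT calibrated COSYSMO-IP values.

def cosysmo_style_effort(size_counts, size_weights, effort_multipliers,
                         A=1.0, E=1.0):
    """Effort = A * (weighted size)^E * product of effort multipliers."""
    size = sum(size_weights[d] * n for d, n in size_counts.items())
    em_product = 1.0
    for em in effort_multipliers.values():
        em_product *= em
    return A * size ** E * em_product

# Hypothetical project: counts for the four size drivers, with
# # Requirements as the unit of comparison (weight 1.0).
counts = {"requirements": 100, "interfaces": 6, "scenarios": 5, "algorithms": 2}
weights = {"requirements": 1.0, "interfaces": 5.57, "scenarios": 2.5, "algorithms": 6.48}
ems = {"requirements_understanding": 0.9, "personnel_capability": 0.8}

effort = cosysmo_style_effort(counts, weights, ems)
```

The calibration step in the diagram corresponds to fitting the constants A and E (and the multiplier rating scales) against historical project data.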
EIA632/COSYSMO-IP Mapping
COSYSMO-IP Category               EIA632 Requirement
Supplier Performance              3
Technical Management              4-12
Requirements Definition           14-16
Solution Definition               17-19
Systems Analysis                  22-24
Requirements Validation           25-29
Design Solution Verification      30
End Products Validation - COTS    33a
EIA632 Reqs. not included in COSYSMO-IP are: 1,2,13,20,21,31,32,33b
Activity Elements Covered by EIA632, COCOMOII, and COSYSMO-IP
(EIA Development stage; effort percentages per phase. Values for the EIA Deployment stage are TBD.)

Activity          Inception  Elaboration  Construction  Transition
Management            14         12           10            14
Environment/CM        10          8            5             5
Requirements          38         18            8             4
Design                19         36           16             4
Implementation         8         13           34            19
Assessment             8         10           24            24
Deployment             3          3            3            30

When doing COSYSMO-IP and COCOMOII, subtract the grey areas to prevent double counting (grey shading in the original figure marks the overlap; legend: = COCOMOII, = COSYSMO-IP).
Past, Present, and Future
Timeline, 2001–2003:
• Initial set of parameters compiled by Affiliates
• Meeting at CCII Conference
• Performed first Delphi round
• PSM Workshop
• Working Group meeting at ARR
Future Parameter Refinement Opportunities (2003–2005)
• Driver definitions
• Data collection (Delphi)
• First iteration of model
• Model calibration
Delphi Survey
• Survey was conducted to:
  – Determine the distribution of effort across effort categories
  – Determine the range for size driver and effort multiplier ratings
  – Identify the cost drivers to which effort is most sensitive
  – Reach consensus from a sample of systems engineering experts
• Distributed Delphi surveys to Affiliates and received 28 responses
• 3 sections: Scope, Size, Cost
• Also helped us refine the scope of the model elements
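The round statistics reported on the following slides (means and standard deviations over the 28 responses) are simple to compute per question; a small sketch, assuming each question’s expert ratings arrive as a plain list of numbers:

```python
import statistics

def summarize_delphi(responses):
    """Mean and sample standard deviation of one question's expert ratings."""
    return {
        "n": len(responses),
        "mean": statistics.mean(responses),
        "std_dev": statistics.stdev(responses),  # sample std dev (n-1 divisor)
    }

# Hypothetical ratings for one effort category, in percent of SE effort.
ratings = [15, 10, 20, 12, 18, 14, 16]
summary = summarize_delphi(ratings)
```

In a Delphi round these summaries are fed back to the respondents, who then revise their answers toward consensus.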
Delphi Round 1 Results
System Engineering Effort Distribution

Category (EIA Requirement)            Suggested   Delphi   Std. Dev.
Supplier Performance (3)                 5%         5.2%      3.05
Technical Management (4-12)             15%        13.1%      4.25
Requirements Definition (14-16)         15%        16.6%      4.54
Solution Definition (17-19)             20%        18.1%      4.28
Systems Analysis (22-24)                20%        19.2%      5.97
Requirements Validation (25-29)         15%        11.3%      4.58
Design Solution Verification (30)        5%        10.5%      6.07
End Products Validation (33a)            5%         6.6%      3.58
Delphi Round 1 Highlights (cont.)
Range of sensitivity for Size Drivers
[Bar chart: relative effort per size driver, with # Requirements as the baseline (1). # Algorithms (6.48) and # Interfaces (5.57) rank highest; # Scenarios, # Platforms, # TPM’s, and # Modes fall between 2.10 and 2.54.]
Two Most Sensitive Size Drivers
                 Suggested       Delphi Respondents
                 Rel. Effort     Rel. Effort   Std. Dev.
# Interfaces         4              5.57          1.80
# Algorithms         6              6.48          2.09
Delphi Round 1 Highlights (cont.)
Range of sensitivity for Cost Drivers (Application Factors)
[Bar chart: EMR per application factor. Level of service reqs. 2.81; Requirements und. 2.43; Architecture und. 2.24; Bus. process reeng. 2.13; Legacy transition 1.93; Platform difficulty 1.74; COTS 1.13.]
Delphi Round 1 Highlights (cont.)
Range of sensitivity for Cost Drivers (Team Factors)
[Bar chart: EMR per team factor. The eight factors (process maturity, personnel capability, personnel experience, formality of deliverables, tool support, stakeholder cohesion, stakeholder communities, multisite coordination) have EMRs ranging from 1.25 to 2.46, with personnel capability highest at 2.46.]
Four Most Sensitive Cost Drivers
                Suggested    Delphi Respondents
                EMR          EMR Mean    Std. Dev.
Arch. Under.      1.66         2.24        0.83
Reqs. Under.      1.73         2.43        0.70
Pers. Cap.        2.15         2.46        0.66
Serv. Req.        2.5          2.81        0.67
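EMR in these tables reads as the Effort Multiplier Ratio: the ratio of a driver’s highest to lowest multiplier rating, i.e. how far that one driver can swing estimated effort. A small sketch under that reading, with a hypothetical rating scale:

```python
def effort_multiplier_ratio(rating_scale):
    """EMR: ratio of the highest to the lowest multiplier on a driver's
    rating scale -- a measure of how sensitive effort is to that driver."""
    return max(rating_scale) / min(rating_scale)

# Hypothetical rating scale (Very Low .. Very High) for one cost driver;
# these multiplier values are illustrative, not COSYSMO-IP calibrations.
scale = [1.30, 1.15, 1.00, 0.87, 0.77]
emr = effort_multiplier_ratio(scale)
```

Drivers with larger EMRs (like Level of service requirements at 2.81) deserve the most care when rating a project, since a one-step rating error moves the estimate the most.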
4 Size Drivers
1. Number of System Requirements
2. Number of Major Interfaces
3. Number of Operational Scenarios
4. Number of Unique Algorithms

Marked “COST Driver” on the slide:
• Number of Technical Performance Measures
• Number of Modes of Operation
• Number of Different Platforms
Size Driver Definitions (1 of 4)
Number of System Requirements
The number of requirements taken from the system
specification. A requirement is a statement of capability or
attribute containing a normative verb such as shall or will. It
may be functional or system service-oriented in nature
depending on the methodology used for specification. System
requirements can typically be quantified by counting the
number of applicable shall’s or will’s in the system or
marketing specification.
Note: Use this driver as the basis of
comparison for the rest of the drivers.
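The counting rule above (count the applicable shall’s or will’s) can be approximated mechanically; a rough sketch, which a real count would refine by filtering out non-normative uses of the verbs:

```python
import re

def count_requirements(spec_text):
    """Rough requirement count: occurrences of the normative verbs
    'shall' or 'will' as whole words (case-insensitive). A real count
    would review the matches and discard non-normative uses."""
    return len(re.findall(r"\b(?:shall|will)\b", spec_text, re.IGNORECASE))

# Hypothetical specification fragment.
spec = """The system shall log all operator actions.
The display will refresh at 10 Hz.
The system shall support 50 concurrent users."""
n_reqs = count_requirements(spec)  # 3
```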
Size Driver Definitions (2 of 4)
Number of Major Interfaces
The number of shared major physical and logical boundaries between system components or functions (internal interfaces) and those external to the system (external interfaces). These interfaces can typically be quantified by counting the interfaces identified in the system’s context diagram and/or the significant interfaces in applicable Interface Control Documents.
Size Driver Definitions (3 of 4)
Number of Operational Scenarios*
The number of operational scenarios** that a system is specified to
satisfy. Such threads typically result in end-to-end test scenarios
that are developed to validate the system satisfies its requirements.
The number of scenarios can typically be quantified by counting
the number of end-to-end tests used to validate the system
functionality and performance. They can also be calculated by
counting the number of high-level use cases developed as part of
the operational architecture.
Number of Modes of Operation (to be merged with Op Scen)
The number of defined modes of operation for a system. For example, in a radar
system, the operational modes could be air-to-air, air-to-ground, weather,
targeting, etc. The number of modes is quantified by counting the number of
operational modes specified in the Operational Requirements Document.
*counting rules need to be refined
**Op Scen can be derived from system modes
Size Driver Definitions (4 of 4)
Number of Unique Algorithms
The number of newly defined or significantly altered functions that
require unique mathematical algorithms to be derived in order to
achieve the system performance requirements.
Note: Examples could include a complex aircraft tracking algorithm, such as a
Kalman filter, derived using existing experience as the basis for the all-aspect
search function. Another example could be a brand new discrimination algorithm
derived for the friend-or-foe identification function in space-based
applications. The number can be quantified by counting the number of unique
algorithms needed to support each of the mathematical functions specified in the
system specification or mode description document (for sensor-based systems).
12 Cost Drivers
Application Factors
1. Requirements understanding
2. Architecture complexity
3. Level of service requirements
4. Migration complexity
5. COTS assessment complexity
6. Platform difficulty
7. Required business process reengineering
8. Technology Maturity
9. Physical system/information subsystem tradeoff analysis complexity
Cost Driver Definitions (1,2 of 5)
Requirements understanding
The level of understanding of the system requirements by all stakeholders, including the systems, software, and hardware engineers, customers, team members, users, etc…
Architecture complexity
The relative difficulty of determining and managing the
system architecture in terms of IP platforms, standards,
components (COTS/GOTS/NDI/new), connectors
(protocols), and constraints. This includes systems analysis,
tradeoff analysis, modeling, simulation, case studies, etc…
Cost Driver Definitions (3,4,5 of 5)
Level of service requirements
The difficulty and criticality of satisfying the Key Performance
Parameters (KPP). For example: security, safety, response time,
the “illities”, etc…
Migration complexity (formerly Legacy transition
complexity)
The complexity of migrating the system from previous system
components, databases, workflows, etc., due to new technology
introductions, planned upgrades, increased performance, business
process reengineering, etc…
Technology Maturity
The relative readiness for operational use of the key
technologies.
12 Cost Drivers (cont.)
Team Factors
1. Number and diversity of stakeholder communities
2. Stakeholder team cohesion
3. Personnel capability
4. Personnel experience/continuity
5. Process maturity
6. Multisite coordination
7. Formality of deliverables
8. Tool support
Cost Driver Definitions (1,2,3 of 7)
Stakeholder team cohesion
Leadership, frequency of meetings, shared vision, approval cycles,
group dynamics (self-directed teams, project engineers/managers),
IPT framework, and effective team dynamics.
Personnel capability
The systems engineering staff’s ability to perform their duties and the
quality of the human capital.
Personnel experience/continuity
The applicability and consistency of the staff over the life of the
project with respect to the customer, user, technology, domain,
etc…
Cost Driver Definitions (4,5,6,7 of 7)
Process maturity
Maturity per EIA/IS 731, SE CMM or CMMI.
Multisite coordination
Location of stakeholders, team members, resources (travel).
Formality of deliverables
The breadth and depth of documentation required to be formally
delivered.
Tool support
Use of tools in the System Engineering environment.
Lessons
Learned/Improvements
Lesson 1 – Need to better define the scope and future
of COSYSMO-IP via Con Ops
Lesson 2 – Drivers can be interpreted in different
ways depending on the type of program
Lesson 3 – COSYSMO is too software-oriented
Lesson 4 – Delphi needs to take less time to fill out
Lesson 5 – Need to develop examples, rating scales
LMCO Comments
The current COSYSMO focus is too software
oriented. This is a good point. We propose to change the
scope from "software-intensive systems or subsystems" to
"information processing (IP) systems or subsystems." These
include not just the software but also the associated IP hardware: processors,
memory, networking, and display or other human-computer interaction devices.
System engineering of these IP systems or subsystems includes considerations of
IP hardware device acquisition lead times, producibility, and logistics.
Considerations of non-IP hardware acquisition, producibility, and logistics are
treated as IP systems engineering cost and schedule drivers for the IOC version
of COSYSMO. Perhaps we should call it COSYSMO-IP.
LMCO Comments (cont.)
The COSYSMO project should begin by working out the
general framework and WBS for the full life cycle of a
general system. We agree that such a general framework
and WBS will eventually be needed. However, we feel that
progress toward it can be most expeditiously advanced by
working on definitions of and data for a key element of the
general problem first. If another group would like to
concurrently work out the counterpart definitions and data
considerations for the general system engineering
framework, WBS, and estimation model, we will be happy to
collaborate with them.
Points of Contact
Dr. Barry Boehm [[email protected]]
(213) 740-8163
Ricardo Valerdi [[email protected]]
(213) 440-4378
Donald Reifer [[email protected]]
(310) 530-4493
Websites
http://valerdi.com/cosysmo
http://sunset.usc.edu
Backup slides
COCOMOII Suite
COPROMO
COQUALMO
COPSEMO
COCOMOII
CORADMO
COCOTS
COSYSMO-IP
For more information visit http://sunset.usc.edu