USC C S E
University of Southern California
Center for Software Engineering

Value-Based Software Engineering:
Motivation and Key Practices

Barry Boehm, USC
CS 510 Lecture, Fall 2011
August 26, 2011
©USC-CSSE
Outline
• Value-based software engineering (VBSE) motivation, examples, and definitions
• VBSE key practices
  – Benefits realization analysis
  – Stakeholder Win-Win negotiation
  – Business case analysis
  – Continuous risk and opportunity management
  – Concurrent system and software engineering
  – Value-based monitoring and control
  – Change as opportunity
• Conclusions and references
Software Testing Business Case
• Vendor proposition
  – Our test data generator will cut your test costs in half
  – We'll provide it to you for 30% of your test costs
  – After you run all your tests for 50% of your original cost, you are 20% ahead
• Any concerns with vendor proposition?
– 34 reasons in the 2004 ABB experience paper
  • Unrepresentative test coverage, too much output data, lack of test validity criteria, poor test design, instability due to rapid feature changes, lack of preparation and experience (automated chaos yields faster chaos), …
  • C. Persson and N. Yilmazturk, Proceedings, ASE 2004
– But one more significant reason
– The test data generator is value-neutral*
– It assumes every test case and every defect are equally important
– Usually, 20% of the test cases cover 80% of the business case
* As are most current software engineering techniques
20% of Features Provide 80% of Value:
Focus Testing on These (Bullock, 2000)
[Figure: % of value for correct customer billing (0 to 100) vs. customer type (5, 10, 15): a steep Pareto curve in which a few customer types carry most of the value, contrasted with an automated test generation tool's assumption that all tests have equal value.]
Value-Based Testing Provides More Net Value
[Figure: net value NV vs. percent of tests run. The value-based testing curve rises steeply and peaks at (30, 58); the test data generator curve starts negative and climbs roughly linearly to (100, 20).]
% Tests   Test Data Generator          Value-Based Testing
          Cost    Value   NV           Cost    Value   NV
0         30      0       -30          0       0       0
10        35      10      -25          10      50      40
20        40      20      -20          20      75      55
30        45      30      -15          30      88      58
40        50      40      -10          40      94      54
…         …       …       …            …       …       …
100       80      100     +20          100     100     0
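The business case in the table can be recomputed with a short sketch. The figures are taken from the table above: the test data generator (TDG) charges 30% of test costs up front and halves the per-test cost, while value-based testing (VBT) runs the highest-value tests first along the slide's Pareto curve.

```python
# Net-value comparison from the slide's table. TDG cost model: 30% tool fee
# plus half-price test runs; every test assumed equally valuable.
def tdg_net_value(pct_tests):
    cost = 30 + 0.5 * pct_tests   # tool fee + discounted test runs
    value = 1.0 * pct_tests       # linear value: all tests equal
    return value - cost

# Pareto value curve sampled from the slide (% of tests run -> % of value)
VBT_VALUE = {0: 0, 10: 50, 20: 75, 30: 88, 40: 94, 100: 100}

def vbt_net_value(pct_tests):
    cost = 1.0 * pct_tests        # full-price test runs, no tool fee
    return VBT_VALUE[pct_tests] - cost

for pct in (0, 10, 20, 30, 40, 100):
    print(pct, tdg_net_value(pct), vbt_net_value(pct))
```

With these figures, value-based testing peaks at 30% of tests run (net value 58), while the generator does not break even until 60% of the tests have been run.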
Value-Based Reading (VBR) Experiment
— Keun Lee, ISESE 2005
By Number                       P-value   % Gr A higher    By Impact                                 P-value   % Gr A higher
Average of Concerns             0.202     34               Average Impact of Concerns                0.049     65
Average of Problems             0.056     51               Average Impact of Problems                0.012     89
Average of Concerns per hour    0.026     55               Average Cost Effectiveness of Concerns    0.004     105
Average of Problems per hour    0.023     61               Average Cost Effectiveness of Problems    0.007     108
• Group A: 15 IV&V personnel using VBR procedures and checklists
• Group B: 13 IV&V personnel using previous value-neutral checklists
  – Found significantly higher numbers of trivial typo and grammar faults
Motivation for Value-Based SE
• Current SE methods are basically value-neutral
  – Every requirement, use case, object, test case, and defect is equally important
  – Object-oriented development is a logic exercise
  – "Earned Value" systems don't track business value
  – Separation of concerns: the SE's job is to turn requirements into verified code
  – Ethical concerns are separated from daily practice
• Value-neutral SE methods are increasingly risky
  – Software decisions increasingly drive system value
  – Corporate adaptability to change is achieved via software decisions
  – System value-domain problems are the chief sources of software project failures
Why Software Projects Fail
The “Separation of Concerns” Legacy
• "The notion of 'user' cannot be precisely defined, and therefore has no place in CS or SE."
  – Edsger Dijkstra, ICSE 4, 1979
• "Analysis and allocation of the system requirements is not the responsibility of the SE group but is a prerequisite for their work."
  – Mark Paulk et al., SEI Software CMM* v1.1, 1993
* Capability Maturity Model
Resulting Project Social Structure
[Figure: project social-structure cartoon. The SOFTWARE group ("I wonder when they'll give us our requirements?") sits apart from the AERO., ELEC., MGMT., MFG., COMM, G&C, and PAYLOAD groups.]
20% of Fires Cause 80% of Property Loss:
Focus Fire Dispatching on These?
[Figure: % of property loss vs. % of fires: a Pareto curve in which 20% of fires account for 80% of property loss.]
Missing Stakeholder Concerns:
Fire Dispatching System
• Dispatch to minimize value of property loss
  – Neglects safety and least-advantaged property owners
• English-only dispatcher service
  – Neglects least-advantaged immigrants
• Minimal recordkeeping
  – Reduces accountability
• Tight budget; design for the nominal case
  – Neglects reliability, safety, and crisis performance
Key Definitions
• Value (from Latin "valere", to be worth):
  1. A fair return or equivalent in goods, services, or money
  2. The monetary worth of something
  3. Relative worth, utility, or importance
• Software validation (also from Latin "valere")
  – Validation: Are we building the right product?
  – Verification: Are we building the product right?
Conclusions So Far
• Value considerations are software success-critical
• "Success" is a function of key stakeholder values
  – Risky to exclude key stakeholders
• Values vary by stakeholder role
• Non-monetary values are important
  – Fairness, customer satisfaction, trust
• A value-based approach integrates ethics into daily software engineering practice
DMR/BRA* Results Chain
[Figure: Results chain. The INITIATIVE "Implement a new order entry system" contributes to the intermediate OUTCOME "Reduced order processing cycle," which contributes to the OUTCOME "Increased sales," under the ASSUMPTION "Order to delivery time is an important buying criterion." Supporting contributions: reduce time to process order; reduce time to deliver product.]
*DMR Consulting Group's Benefits Realization Approach
Expanded Order Processing System Benefits Chain
[Figure: Expanded benefits chain. A new order-entry system (built by developers, with safety, fairness, and interoperability inputs) plus new order-entry processes, outreach, and training yield a faster, better order-entry system with less time and fewer errors per order entry. Together with a new order-fulfillment system and processes, improved supplier coordination, and on-time assembly, these produce less time and fewer errors in order processing, increased customer satisfaction, decreased operations costs, and increased sales and profitability, leading to increased profits and growth. Stakeholders: developers, sales personnel, distributors, retailers, customers, suppliers. Assumptions: increasing market size; continuing consumer satisfaction with the product; relatively stable e-commerce infrastructure; continued high staff performance.]
The Model-Clash Spider Web: Master Net
- Stakeholder value propositions (win conditions)
Projecting Yourself Into Others’ Win Situations
Counterexample: The Golden Rule
• Do unto others … as you would have others do unto you
• Do unto others as you would have others do unto you
• Build computer systems to serve users and operators … assuming users and operators like to write programs and know computer science
• Computer science world (compilers, OS, etc.)
  – Users are programmers
• Applications world
  – Users are pilots, doctors, tellers
EasyWinWin OnLine Negotiation Steps
Red cells indicate lack of consensus. Oral discussion of the cell graph reveals unshared information, unnoticed assumptions, hidden issues, constraints, etc.
Example of Business Case Analysis
ROI = Present Value [(Benefits - Costs) / Costs]
[Figure: ROI vs. time for two alternatives, Option A and Option B; the ROI axis runs from -1 to 3.]
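The formula can be made concrete with a minimal present-value ROI sketch. The yearly cash flows and the 10% discount rate below are hypothetical, invented purely for illustration; PV is applied to benefits and costs separately, one common reading of the slide's formula.

```python
# Present-value ROI sketch: ROI = (PV(benefits) - PV(costs)) / PV(costs).
def pv(flows, rate):
    """Discount a list of yearly amounts (year 0 first) to present value."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

costs    = [100, 20, 20, 20]   # hypothetical: up-front build, then upkeep
benefits = [0, 60, 90, 120]    # hypothetical: benefits ramp up after delivery

rate = 0.10                    # assumed 10% discount rate
roi = (pv(benefits, rate) - pv(costs, rate)) / pv(costs, rate)
print(f"ROI = {roi:.2f}")
```

Plotting such an ROI for each candidate over successive delivery dates yields ROI-vs.-time curves like those in the figure.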
[Figure: the same ROI-vs.-time chart with a third alternative, Option BRapid, which reaches high ROI sooner than Option B.]
Examples of Utility Functions: Response Time
[Figure: four value-vs.-time utility functions: Mission Planning / Competitive Time-to-Market; Real-Time Control / Event Support (with a critical region); Event Prediction (weather, software size); Data Archiving / Priced Quality of Service.]
How Much Testing is Enough?
- Early Startup: risk due to low dependability
- Commercial: risk due to low dependability
- High Finance: risk due to low dependability
- All sectors: risk due to market share erosion
[Figure: combined risk exposure RE = P(L) * S(L), plotted from 0 to 1 against RELY level (VL to VH). The Market Share Erosion curve rises with RELY; the Early Startup, Commercial, and High Finance dependability-risk curves fall; each sector's combined curve has a Sweet Spot.]

RELY                            VL      L       N       H       VH
Added % test time (COCOMO II)   0       12      22      34      54
P(L) (COQUALMO)                 1.0     .475    .24     .125    .06
S(L), Early Startup             .33     .19     .11     .06     .03
S(L), Commercial                1.0     .56     .32     .18     .10
S(L), High Finance              3.0     1.68    .96     .54     .30
REm, Market Risk                .008    .027    .09     .30     1.0
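The sweet-spot reasoning in the figure can be reproduced directly from the table's numbers: a sketch in which combined RE at each RELY level is simply the dependability risk P(L) * S(L) plus the market-erosion REm.

```python
# Combined risk exposure per sector, from the COCOMO II / COQUALMO table above.
RELY   = ["VL", "L", "N", "H", "VH"]
P_L    = [1.0, 0.475, 0.24, 0.125, 0.06]        # defect-risk P(L), COQUALMO
S_L    = {"Early Startup": [0.33, 0.19, 0.11, 0.06, 0.03],
          "Commercial":    [1.0, 0.56, 0.32, 0.18, 0.10],
          "High Finance":  [3.0, 1.68, 0.96, 0.54, 0.30]}
RE_MKT = [0.008, 0.027, 0.09, 0.30, 1.0]        # market share erosion REm

results = {}
for sector, s in S_L.items():
    combined = [p * sl + m for p, sl, m in zip(P_L, s, RE_MKT)]
    # Sweet spot: the RELY level minimizing combined risk exposure
    results[sector] = (combined, RELY[combined.index(min(combined))])

for sector, (combined, sweet) in results.items():
    print(sector, [round(c, 3) for c in combined], "sweet spot:", sweet)
```

Raising RELY buys down dependability risk at the price of added test time and market-erosion risk; the sweet spot is where the sum bottoms out for a given sector.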
Value-Based Defect Reduction Example:
Goal-Question-Metric (GQM) Approach
Goal: Our supply chain software packages have too many defects. We need to get their defect rates down.
Question: ?
Value-Based GQM Approach – I
Q: How do software defects affect system value goals?
– Ask why the initiative is needed
– Order processing:
  • Too much downtime on the operations critical path
  • Too many defects in operational plans
  • Too many new-release operational problems
G: New system-level goal: decrease software-defect-related losses in operational effectiveness
– With the high-leverage problem areas above as specific subgoals
New Q: ?
Value-Based GQM Approach – II
New Q: Perform system problem-area root cause analysis: ask why problems are happening, via models.
Example: downtime on the critical path
[Figure: order-processing flow: Order -> Validate order items -> Validate items in stock -> Schedule packaging, delivery -> Produce status reports -> Prepare delivery packages -> Deliver order]
• Where are the primary software-defect-related delays?
• Where are the biggest improvement-leverage areas?
  – Reducing software defects in the Scheduling module
  – Reducing non-software order-validation delays
  – Taking Status Reporting off the critical path
  – Downstream, getting a new Web-based order entry system
• Ask "why not?" as well as "why?"
Value-Based GQM Results
• Defect tracking weighted by system-value priority
  – Focuses defect removal on the highest-value effort
• Significantly higher effect on bottom-line business value
  – And on customer satisfaction levels
• Engages software engineers in system issues
  – Fits the increasing system-criticality of software
• Strategies often helped by quantitative models
  – COQUALMO, iDAVE
Is This A Risk?
• We just started integrating the software
  – and we found out that COTS* products A and B just can't talk to each other
• We've got too much tied into A and B to change
• Our best solution is to build wrappers around A and B to get them to talk via CORBA**
• This will take 3 months and $300K
• It will also delay integration and delivery by at least 3 months
*COTS: Commercial off-the-shelf
**CORBA: Common Object Request Broker Architecture
• No, it is a problem
  – being dealt with reactively
• Risks involve uncertainties
  – and can be dealt with proactively
  – earlier, this problem was a risk
Earlier, This Problem Was A Risk
• A and B are our strongest COTS choices
  – But there is some chance that they can't talk to each other
  – Probability of loss P(L)
• If we commit to using A and B, and we find out in integration that they can't talk to each other, we'll add more cost and delay delivery by at least 3 months
  – Size of loss S(L)
• We have a risk exposure of RE = P(L) * S(L)
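A minimal sketch of the calculation. The probability P(L) and the monthly delay cost below are hypothetical; only the $300K wrapper cost and the 3-month delay come from the earlier slide.

```python
# Risk exposure sketch: RE = P(L) * S(L).
p_loss = 0.3                      # assumed chance A and B can't interoperate
s_loss = 300_000 + 3 * 150_000    # wrapper cost + 3 months of (assumed) delay cost
re_commit = p_loss * s_loss       # RE of committing to A and B without checking

# Buying information (next slide): a $30K prototype is worthwhile whenever it
# is much cheaper than the risk exposure it can retire.
prototype_cost = 30_000
print(f"RE = ${re_commit:,.0f}; prototype pays off: {re_commit > prototype_cost}")
```

The point of quantifying RE is exactly this comparison: it prices the uncertainty so that mitigation options can be traded off against it.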
How Can Risk Management Help
You Deal With Risks?
• Buying information
• Risk avoidance
• Risk transfer
• Risk reduction
• Risk acceptance
Risk Management Strategies:
- Buying Information
• Let's spend $30K and 2 weeks prototyping the integration of A and B's HCIs
• This will buy information on the magnitude of P(L) and S(L)
• If RE = P(L) * S(L) is small, we'll accept and monitor the risk
• If RE is large, we'll use one or more of the other strategies
Other Risk Management Strategies
• Risk Avoidance
  – COTS product C is almost as good as B, and its HCI is compatible with A
  – Delivering on time is worth more to the customer than the small performance loss
• Risk Transfer
  – If the customer insists on using A and B, have them establish a risk reserve
  – To be used to the extent that A and B have incompatible HCIs to reconcile
• Risk Reduction
  – If we build the wrapper right now, we add cost but minimize the schedule delay
• Risk Acceptance
  – If we can solve the A and B HCI compatibility problem, we'll have a big competitive edge on future procurements
  – Let's do this on our own money, and patent the solution
Is Risk Management Fundamentally Negative?
• It usually is, but it shouldn't be
• As illustrated in the Risk Acceptance strategy, it is equivalent to Opportunity Management:
  Opportunity Exposure OE = P(Gain) * S(Gain) = Expected Value
• Buying information and the other risk strategies have their opportunity counterparts
  – P(Gain): Are we likely to get there before the competition?
  – S(Gain): How big is the market for the solution?
What Else Can Risk Management Help You Do?
• Determine "How much is enough?" for your products and processes
  – Functionality, documentation, prototyping, COTS evaluation, architecting, testing, formal methods, agility, discipline, …
  – What's the risk exposure of doing too much?
  – What's the risk exposure of doing too little?
• Tailor and adapt your life-cycle processes
  – Determine what to do next (specify, prototype, COTS evaluation, business case analysis)
  – Determine how much of it is enough
  – Examples: risk-driven spiral model and extensions (win-win, anchor points, RUP, MBASE, CeBASE Method)
• Get help from higher management
  – Organize management reviews around the top-10 risks
Example Large-System Risk Analysis:
How Much Architecting is Enough?
• A large system involves subcontracting to over a dozen software/hardware specialty suppliers
• An early procurement package means an early start
  – But later delays due to inadequate architecture
  – And resulting integration rework delays
• Developing thorough architecture specs reduces rework delays
  – But increases subcontractor startup delays
How Soon to Define Subcontractor Interfaces?
Risk exposure RE = Prob(Loss) * Size(Loss)
- Loss due to rework delays
[Figure: RE = P(L) * S(L) falls as time spent defining and validating architecture grows, from "many interface defects: high P(L); critical IF defects: high S(L)" down to "few IF defects: low P(L); minor IF defects: low S(L)".]
How Soon to Define Subcontractor Interfaces?
- Loss due to rework delays
- Loss due to late subcontract startups
[Figure: the falling rework-delay RE curve is joined by a rising startup-delay RE curve, from "few delays: low P(L); short delays: low S(L)" up to "many delays: high P(L); long delays: high S(L)", vs. time spent defining and validating architecture.]
How Soon to Define Subcontractor Interfaces?
- Sum of risk exposures
[Figure: the sum of the falling rework-RE curve and the rising startup-delay RE curve has a minimum, the Sweet Spot.]
How Soon to Define Subcontractor Interfaces?
-Very Many Subcontractors
[Figure: with very many subcontractors there are many more interfaces, so P(L) and S(L) are higher; the Many-Subs Sweet Spot sits to the right of the Mainstream Sweet Spot, at more time spent defining and validating architecture.]
How Much Architecting Is Enough?
-A COCOMO II Analysis
[Figure: COCOMO II analysis. Percent of time added to the overall schedule vs. percent of time added for architecture and risk resolution (0 to 60%), for 10-KSLOC, 100-KSLOC, and 10,000-KSLOC projects. Total added schedule is the sum of the schedule devoted to initial architecture and risk resolution and the added schedule devoted to rework (the COCOMO II RESL factor); each curve has a Sweet Spot, which moves rightward with project size. Sweet Spot drivers: rapid change moves the spot leftward; high assurance moves it rightward.]
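The sweet-spot behavior in the chart can be sketched numerically. This is an illustrative stand-in, not the calibrated COCOMO II model: rework is modeled as an exponential decay in architecting investment, with made-up constants in place of the RESL factor.

```python
import math

def total_added_schedule(arch_pct, rework_at_zero, decay=0.07):
    """Percent of schedule added overall: up-front architecting time plus
    rework that decays as architecting investment grows (hypothetical model)."""
    return arch_pct + rework_at_zero * math.exp(-decay * arch_pct)

def sweet_spot(rework_at_zero):
    """Architecting percentage (0-60, as in the chart) minimizing total added schedule."""
    return min(range(61), key=lambda a: total_added_schedule(a, rework_at_zero))

# Larger projects start with more rework, pushing the sweet spot rightward:
for rework0 in (20, 50, 90):   # stand-ins for small, medium, very large projects
    print(rework0, sweet_spot(rework0))
```

Whatever the exact rework model, the structure is the same: a rising linear cost plus a falling rework cost yields a U-shaped total with an interior minimum that shifts right as the zero-architecting rework penalty grows.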
Sequential Engineering Neglects Risk
[Figure: development cost vs. response time (sec). Arch. A (custom, many cache processors) costs about $100M; Arch. B (modified client-server) costs about $50M. The original spec sat at a 1-second response time; after prototyping, the acceptable response time moved to about 4 seconds.]
Change As Opportunity: Agile Methods
• Continuous customer interaction
• Short value-adding increments
• Tacit interpersonal knowledge
  – Stories, planning game, pair programming
  – Explicit documented knowledge is expensive to change
• Simple design and refactoring
  – vs. Big Design Up Front
Five Critical Decision Factors
• Represents five dimensions: Size, Criticality, Dynamism, Personnel, Culture
[Figure: polar chart with five axes: Personnel (% Level 1B from 40 down to 0; % Level 2&3 from 15 up to 35), Criticality (loss due to impact of defects: comfort, discretionary funds, essential funds, single life, many lives), Dynamism (% requirements change per month: 1 to 50), Size (number of personnel: 3 to 300), and Culture (% thriving on chaos vs. order: 90 to 10).]
Conclusions
• Marketplace trends favor transition to the VBSE paradigm
  – Software is a/the major source of product value
  – Software is the primary enabler of adaptability
• VBSE involves 7 key elements:
  1. Benefits Realization Analysis
  2. Stakeholders' Value Proposition Elicitation and Reconciliation
  3. Business Case Analysis
  4. Continuous Risk and Opportunity Management
  5. Concurrent System and Software Engineering
  6. Value-Based Monitoring and Control
  7. Change as Opportunity
• Processes for implementing VBSE are emerging
  – ICM, Lean Development, DMR/BRA, Balanced Scorecard, Quality Economics, Agile Methods
References
C. Baldwin and K. Clark, Design Rules: The Power of Modularity, MIT Press, 1999.
B. Boehm, "Value-Based Software Engineering," ACM Software Engineering Notes, March 2003.
B. Boehm, C. Abts, A.W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
B. Boehm and L. Huang, "Value-Based Software Engineering: A Case Study," Computer, March 2003, pp. 33-41.
B. Boehm and K. Sullivan, "Software Economics: A Roadmap," The Future of Software Engineering, A. Finkelstein (ed.), ACM Press, 2000.
B. Boehm and R. Turner, Balancing Agility and Discipline: A Guide for the Perplexed, Addison Wesley, 2003.
J. Bullock, "Calculating the Value of Testing," Software Testing and Quality Engineering, May/June 2000, pp. 56-62.
S. Faulk, D. Harmon, and D. Raffo, "Value-Based Software Engineering (VBSE): A Value-Driven Approach to Product-Line Engineering," Proceedings, First Intl. Conf. on SW Product Line Engineering, August 2000.
R. Kaplan and D. Norton, The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, 1996.
C. Persson and N. Yilmazturk, "Establishment of Automated Regression Testing at ABB," Proceedings, ASE 2004, August 2004, pp. 112-121.
D. Reifer, Making the Software Business Case, Addison Wesley, 2002.
K. Sullivan, Y. Cai, B. Hallen, and W. Griswold, "The Structure and Value of Modularity in Software Design," Proceedings, ESEC/FSE 2001, ACM Press, pp. 99-108.
J. Thorp and DMR, The Information Paradox, McGraw Hill, 1998.
Economics-Driven Software Engineering Research (EDSER) web site: www.edser.org
MBASE web site: sunset.usc.edu/research/MBASE