
University of Southern California
Center for Systems and Software Engineering
Value-Based Software Inspection & Testing Process
Qi Li
[email protected]
University of Southern California
November 9, 2011
Outline
• Software Testing In General
• Value-based Software Inspection & Testing
• Value-based Testing Guideline in 577ab
Methods of testing
• Test to specification:
– Black box
– Data driven
– Functional testing
– Code is ignored: only the specification document is used to develop test cases
• Test to code:
– Glass box/White box
– Logic-driven testing
– Ignore the specification and examine only the code.
Black-box testing
[Figure: Black-box testing. Input test data, including inputs causing anomalous behaviour, is fed to the system; the output test results which reveal the presence of defects are examined.]
Paring down test cases
• Use methods that take advantage of symmetries,
data equivalencies, and independencies to reduce
the number of necessary test cases.
– Equivalence Testing
– Boundary Value Analysis
• Determine the ranges of working system
• Develop equivalence classes of test cases
• Examine the boundaries of these classes carefully
Equivalence partitioning
[Figure: Equivalence partitioning. Valid and invalid input partitions feed the system, which maps them to output partitions.]
Boundary value testing
• Partition system inputs and outputs into
‘equivalence sets’
– If the input is an integer between 4 and 10, the equivalence partitions are < 4, 4-10, and > 10
• Choose test cases at the boundaries of these sets
– 3, 4, 7, 10, 11
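A minimal sketch of those boundary-value cases; the validator function is an illustrative assumption, with the test values taken from the partitions above:

```python
# Boundary value analysis for an input that must be an integer
# between 4 and 10 (inclusive). The validator is an illustrative
# assumption; the test values sit on and around the boundaries.

def is_valid(n: int) -> bool:
    """Accept integers in the inclusive range 4..10."""
    return 4 <= n <= 10

# Just below, on, inside, and just above the partition boundaries.
cases = {3: False, 4: True, 7: True, 10: True, 11: False}

for value, expected in cases.items():
    assert is_valid(value) == expected
print("all boundary cases pass")
```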
Common Mistakes for ATPC
• Unspecific input and output test data
• Unspecific privileges
• Incomplete dependency set
• No full coverage of requirements
• No extension of test cases, especially test cases dealing with invalid or boundary-value inputs
Testing Levels
• Unit testing
• Integration testing
• System testing
• Acceptance testing
• Performance testing
Outline
• General Software Testing Knowledge
• Value-based Software Inspection & Testing Process
• Value-based Testing Guideline in 577ab
Motivation
• Value-neutral SE methods are increasingly risky [Boehm, 2003]
– Every requirement, use case, object, test case, and defect is equally important
– “Earned Value” Systems don’t track business value
– System value-domain problems are the chief sources of software project failures
• Testing & inspection resources are expensive and scarce
– 30%-50% of development effort, even higher for high-reliability projects [Ramler, 2005]
– Time-to-market pressure [Boehm, Huang, 2005]
• Empirical findings [Bullock 2000; Boehm & Basili 2001]
– About 20 percent of the features provide 80 percent of the business value
– About 80 percent of the defects come from 20 percent of the modules
– About 80 percent of avoidable rework comes from 20 percent of the defects
– About 90 percent of the downtime comes from, at most, 10 percent of the defects
– …
Fundamental Theory
• Value-based Software Engineering 4+1 theorem [Boehm, 2005]
[Figure: VBSE 4+1 theory. Theory W (SCS Win-Win), answering "how is success assured?", sits at the center, linked to Dependency Theory (how do dependencies affect value realization?), Utility Theory (what values are important, and how important are they?), Decision Theory (how do values determine decision choices?), and Control Theory (how to adapt to change and control value realization?).]
Motivation
• 20% of features provide 80% of value: focus testing on these [Bullock, 2000]

[Chart: % of value for correct customer billing vs. customer type. The value-based curve rises steeply, earning most of the value from the first few customer types, while a value-neutral automated test generation tool, where all tests have equal value, earns value only linearly.]
Value-based Review Prioritization
• Most relevant review process
– Keun Lee, 2005, Value-Based Review Process

[Diagram: users, developers, customers, and other stakeholders negotiate the priorities of system capabilities in a win-win meeting; a domain expert rates the criticalities of issues; reviewers apply a general value-based checklist and artifacts-oriented checklists to the artifacts under review.]

Review ordering (the number indicates the usual order of review*):

Criticality \ Priority | High | Medium   | Low
High                   | 1    | 4        | 6
Medium                 | 2    | 5        | optional
Low                    | 3    | optional | optional

* May be more cost-effective to review highly-coupled mixed-priority artifacts.
V&V Course Schedule
Learn to Use Bugzilla System for Your Project Team

V&Ver Review Package Assignment | 2009 V&V Method | 2010 V&V Method
Eval of VC Package: OCD, FED, LCP | FV&V | FV&V
Eval of Initial Prototype: PRO | FV&V | FV&V
Eval of Core FC Package: OCD, PRO, SSRD**, SSAD, LCP, FED, SID | FV&V | VbV&V (growing)
Eval of Draft FC Package: OCD, PRO, SSRD**, SSAD, LCP, FED, SID | FV&V | VbV&V
Eval of FC/DC Package: OCD, PRO, SSRD**, SSAD, LCP, FED, SID, QMP, ATPC^, IP^ | FV&V | VbV&V
Eval of Draft DC/TRR Package: OCD, PRO, SSRD**, SSAD, LCP, FED, SID, QMP, ATPC^, IP^, TP^ | VbV&V | VbV&V
Eval of DC/TRR Package: OCD, PRO, SSRD**, SSAD, LCP, FED, SID, QMP, ATPC, IP, TP, IAR^, UM^, TM^, TPR^ | VbV&V | VbV&V
Content to be reviewed
Doc/Sec | CoreFCP (1&2 sem) | DraftFCP (1&2 sem) | FC/DCP (2 sem) | FC/DCP (1 sem)
OCD | 100% | 100% | 100% | 100%
FED | AA: Sections 1, 5; NDI: Sections 1, 3, 4.1, 4.2.1, 4.2.2 | Sections 1-5 | Sections 1-5 | 100%
LCP | Sections 1, 3.3 | 100% | 100% | 100%
SSRD | AA: 100%; NDI: N/A | AA: 100%; NDI: N/A | AA: 100%; NDI: N/A | AA: 100%; NDI: N/A
SSAD | Sections 1, 2.1.1-2.1.3 | Sections 1, 2 | Sections 1, 2 | 100%
PRO | Most critical/important use cases | 100% | 100% | 100%
SID | 100% | 100% | 100% | 100%
QMP | N/A | N/A | Sections 1, 2 | 100%
ATPC | N/A | N/A | N/A | 100%
IP | N/A | N/A | N/A | 100%

The content to be reviewed grows across the package milestones.
Value-Based V&V Process
[Diagram: Artifacts prioritization, driven by Importance, Quality Risk, Dependency, and Review Cost, feeds Review and Report, which lists the top 10 defects/issues and covers concerns, a management overview, technical details, major errors/omissions, and critical concerns.]
Artifact Prioritization
Priority Factor: Importance (5: most important, 3: normal, 1: least important)

Rating guideline:
 Without this document, the project can't move forward or could even fail; it should be rated with high importance
 Some documents serve a supporting function. Without them, the project could still move on; this kind of document should be rated with lower importance
Artifact Prioritization
Priority Factor: Quality Risk (5: highly risky, 3: normal, 1: least risky)

Rating guideline:
 Based on previous reviews, documents with intensive defects might still be fault-prone, which indicates a high quality risk
 Personnel factors, e.g. the author of the document is not proficient or motivated enough, indicate a high quality risk
 A more complex document might have a higher quality risk
 A new document, or an old document with a large portion of newly added sections, might have a high quality risk
Artifact Prioritization
Priority Factor
Rating Guideline
 Sometimes some lower-priority artifacts are
required to be reviewed at least for reference
before reviewing a higher-priority one. For
Dependency
example, in order to review SSAD or TPC,
5: highly dependent
SSRD is required for reference
3:normal
 Basically, the more documents this document
1: not dependent
depends on, the higher the Dependency
rating is, and the lower the reviewing priority
will be
2015/7/17
20
Artifact Prioritization
Priority Factor: Review Cost (5: needs intensive effort, 3: needs moderate effort, 1: needs little effort)

Rating guideline:
 A new document, or an old document with a large portion of newly added sections, usually takes more time to review, and vice versa
 A more complex document usually takes more time to review, and vice versa
Artifact Prioritization
Determine weights:
 Weights for each factor (Importance, Quality Risk, Review Cost, and Dependency) can be set according to the project context. The default value is 1.0 for each factor

Priority calculation:
 E.g. for a document with Importance = 5, Quality Risk = 3, Review Cost = 2, Dependency = 1, and default weights: Priority = (5*3)/(2*1) = 7.5
 A spreadsheet calculates the priority automatically. The 5-level ratings for each factor are VH, H, M, L, VL with values from 5 down to 1; intermediate values 2 and 4 are also allowed
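A minimal sketch of that spreadsheet calculation; treating each weight as a multiplier on its factor is an assumption, since the slide only fixes the default weights at 1.0:

```python
# Artifact priority as described above:
# Priority = (Importance * Quality Risk) / (Review Cost * Dependency).
# Weight handling (each weight multiplies its factor) is an assumption.

RATING = {"VH": 5, "H": 4, "M": 3, "L": 2, "VL": 1}

def artifact_priority(importance, quality_risk, review_cost, dependency,
                      w_imp=1.0, w_qr=1.0, w_rc=1.0, w_dep=1.0):
    to_num = lambda r: RATING[r] if isinstance(r, str) else r
    numerator = (w_imp * to_num(importance)) * (w_qr * to_num(quality_risk))
    denominator = (w_rc * to_num(review_cost)) * (w_dep * to_num(dependency))
    return numerator / denominator

# Worked example from the slide: (5*3)/(2*1) = 7.5
print(artifact_priority(5, 3, 2, 1))  # 7.5
```

The VH-VL letter ratings can be passed directly, e.g. `artifact_priority("VH", "M", "L", "VL")` gives the same 7.5.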
Worked example: artifact prioritization (weights: 1.0 for each factor)

[Spreadsheet: each artifact is rated VH-VL on Importance, Quality Risk, Dependency, and Review Cost, with rationale notes; the computed priorities range from 0.40 to 2.50.]

Sample rationales:
 SSAD: contains the architecture of the system; without it, the project can't move forward or could even fail. A complex, new document whose author did not know it was due until the morning of the due date. Depends on SSRD and OCD.
 SSRD: contains the requirements of the system; without it, the project can't move forward or could even fail. A complex document that needs to be consistent with the win-conditions negotiation, which might not be complete at this point; a lot of rework was required based on comments from the TA.
 OCD: gives the overall operational concept of the system; important, but not critical to the success of the system. A complex document in which many sections needed to be redone based on the comments received from the TA.
 PRO: without it, the project can probably move forward, but the system might not be what the customer is expecting; it allows the customer to have a glimpse of the system.
 LCP: describes the life cycle plan of the project; a supporting document, so the project could still move on without it, though less smoothly.
 FED: rated high because it provides the feasibility evidence for the project; without it, we don't know whether the project is feasible.
 SID: an old document with no technical content; depends on OCD, SSRD, FED, LCP, SSAD, and PRO.
Top 10 Issues
1. SSRD - Missing important requirements. Rationale: A lot of important requirements are missing; without them, the system will not succeed.
2. SSRD - Requirement supporting information too generic. Rationale: The output, destination, precondition, and postcondition should be defined better; these descriptions will allow the development team and the client to better understand the requirements. This is important for system success.
3. SSAD - Wrong cardinality in the system context diagram. Rationale: The cardinality of this diagram needs to be accurate since it describes the top level of the system context. This is important for system success.
4. OCD - The client and client advisor stakeholders should be concentrating on the deployment benefits. Rationale: It is important that the benefits chain diagram accurately shows the benefits of the system during deployment, so the client can show it to potential investors to gather funds to support the continuation of system development.
5. OCD - The system boundary and environment are missing support infrastructure. Rationale: The system boundary and environment diagram must capture all necessary support infrastructure so the team can consider all risks and requirements related to it.
6. FED - Missing use case references in the FED. Rationale: The capability feasibility table proves the feasibility of all system capabilities to date; references to the use cases are important for the key stakeholders to understand the capabilities and their feasibility.
7. FED - Incorrect mitigation plan. Rationale: Mitigation plans for project risks are important to overcome the risks. This is important for system success.
8. LCP - Missing skills and roles. Rationale: The LCP did not identify the skills and roles required for next semester. This information is important for project success because next semester's team can use it to recruit new team members with the identified skills.
9. FED - CR# in the FED doesn't match the CR# in the SSRD. Rationale: The CR numbers need to match in both the FED and the SSRD for correct requirement references.
10. LCP - COCOMO drivers rework. Rationale: COCOMO driver values need to be accurate to give the client a better estimate.
Value-based Testing Prioritization
• Risk Exposure (RE)
– RE = Prob(Loss) * Size(Loss)
– where Size(Loss) is the risk impact, the size of loss if the outcome is unsatisfactory, and Prob(Loss) is the probability of an unsatisfactory outcome
• Risk Reduction Leverage (RRL)
– RRL = (RE_before - RE_after) / Risk Reduction Cost
– where RE_before is the RE before initiating the risk reduction effort and RE_after is the RE afterwards
– RRL is a measure of the cost-benefit ratio of performing a candidate risk reduction or defect removal activity
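The two measures can be sketched directly; all the numbers below are illustrative:

```python
# Risk Exposure and Risk Reduction Leverage as defined above.

def risk_exposure(prob_loss: float, size_loss: float) -> float:
    """RE = Prob(Loss) * Size(Loss)."""
    return prob_loss * size_loss

def rrl(re_before: float, re_after: float, cost: float) -> float:
    """RRL = (RE_before - RE_after) / cost of the reduction activity."""
    return (re_before - re_after) / cost

# Illustrative: a defect-removal activity cuts the failure probability
# from 0.5 to 0.1 for a defect whose loss would be 100 units, at a
# cost of 20 units.
re_before = risk_exposure(0.5, 100.0)  # 50.0
re_after = risk_exposure(0.1, 100.0)   # 10.0
print(rrl(re_before, re_after, 20.0))  # 2.0
```

An RRL above 1 means the activity buys back more risk exposure than it costs.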
Value-Based Testing Prioritization
• Value-based prioritization drivers:
– Business case analysis -> business value
– Stakeholder prioritization -> impact of defect -> size of loss
– Experience base (defect-prone components, performers) -> probability of loss
– Size of loss and probability of loss together -> defect criticality / risk exposure
Value-Based Testing Prioritization
• Objects are to be ranked by how well they can reduce risk exposure, combined with their relative option costs
• => Priority trigger: candidates are prioritized in terms of Risk Reduction Leverage (RRL), an ROI measure
• This is intended to improve the lifecycle cost-effectiveness of defect removal techniques
Value-based Testing Process
[Diagram: the value-based testing process.]
Dependency Aware
[Figure: value of software product to the organization vs. cost of software product (Boehm, 1981). Returns grow through an investment region (operating system, data management system, basic application functions), a high-payoff region (main application functions), and then diminishing returns (secondary application functions, user amenities, animated displays, tertiary application functions, natural speech input).]
Dependency Aware
• Dependency:
– Example: dependencies among the test cases to be executed
– Solution: a greedy prioritization algorithm
• Select the test case with the highest RRL
• Check its dependencies

Example execution order: 9->3->9->5->9->4->7
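A sketch of that greedy, dependency-aware loop; the test-case names, RRL values, and dependencies are illustrative:

```python
# Repeatedly pick the untested case with the highest RRL; if it has
# untested dependencies, test those first (recursively, by RRL).

def prioritize(rrl, deps):
    """Return an execution order: highest RRL first, dependencies resolved first."""
    order, done = [], set()

    def run(tc):
        if tc in done:
            return
        # Test unmet dependencies first, highest RRL first.
        for d in sorted(deps.get(tc, []), key=lambda t: -rrl[t]):
            run(d)
        done.add(tc)
        order.append(tc)

    for tc in sorted(rrl, key=lambda t: -rrl[t]):
        run(tc)
    return order

rrl = {"TC-9": 5, "TC-3": 4, "TC-5": 3, "TC-4": 2, "TC-7": 1}
deps = {"TC-9": ["TC-3"], "TC-5": ["TC-9"], "TC-7": ["TC-4"]}
print(prioritize(rrl, deps))  # ['TC-3', 'TC-9', 'TC-5', 'TC-4', 'TC-7']
```

Unlike the attempt trace on the slide (9->3->9->…), this sketch records each case once, in final execution order.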
Research Method: Metrics
• Testing cost-effectiveness
– Average Percentage of Business Importance Earned (APBIE): after each executed test case, PBIE is the cumulative business importance earned as a percentage of the total; APBIE averages PBIE over all test cases
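Under that definition, APBIE can be sketched as follows; the importance values are illustrative:

```python
# PBIE after the i-th test case = cumulative business importance
# earned so far / total business importance; APBIE is the mean of
# PBIE over all execution positions.

def apbie(importances):
    """importances: business-importance values in execution order."""
    total = sum(importances)
    earned, pbie = 0.0, []
    for bi in importances:
        earned += bi
        pbie.append(earned / total)
    return sum(pbie) / len(pbie)

# Running high-importance test cases first raises APBIE.
print(round(apbie([5, 3, 1]), 3))  # 0.815
print(round(apbie([1, 3, 5]), 3))  # 0.519
```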
Case Studies
Study 1: Prioritize testing scenarios to be walked through
[Li TR1 2011]
• Case Study:
– Galorath Inc. (2011 Summer)
– Project: Installation Process Automation
– Challenge: testing all 69 scenarios is impossible under limited testing resources
Case Studies
Study 1: Prioritize testing scenarios to be walked through
• Case study results:

[Chart: PBIE vs. number of test scenarios executed (stop-testing points from 8 to 30) for three orderings: value-based, value-neutral, and value-inverse (worst case). APBIE-1 = 70.99%, APBIE-2 = 10.08%, APBIE-3 = 32.10%.]

– Value-based prioritization can improve the cost-effectiveness of testing
Case Studies
Study 2: Prioritize software features to be tested
[Li ICSP 2009]
• Case Study:
– Institute of Software, Chinese Academy of Sciences (2008 Spring)
– System testing of SoftPM Version 3.1, which covers 9 features
Case Studies
Study 2: Prioritize software features to be tested
• Case study results

[Chart: PBIE vs. number of features tested (1-9) for value-based and value-inverse (worst case) orderings. APBIE-1 = 76.9%, APBIE-2 = 34.1%.]

– Value-based prioritization can improve the cost-effectiveness of testing
Business Importance (BI)
• Use Karl Wiegers' method to get the relative BI for each feature
• BI_i = W_Benefit * Benefit_i + W_Penalty * Penalty_i

[Bar chart: relative business importance per feature F1-F9; the two most important features account for 30.86% and 28.40%, with the rest ranging from 9.88% down to 3.70%.]
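A minimal sketch of the Wiegers-style calculation; the weights and benefit/penalty scores are illustrative assumptions:

```python
# Each feature gets benefit and penalty scores; BI_i is the weighted
# sum, normalized so the BI values of all features sum to 1 (100%).

def business_importance(features, w_benefit=1.0, w_penalty=0.5):
    """features: {name: (benefit, penalty)} -> {name: relative BI}."""
    raw = {f: w_benefit * b + w_penalty * p for f, (b, p) in features.items()}
    total = sum(raw.values())
    return {f: v / total for f, v in raw.items()}

bi = business_importance({"F1": (9, 8), "F2": (9, 6), "F3": (3, 2)})
print({f: round(v, 3) for f, v in bi.items()})
```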
Quality Risks & Weights Allocation
Analytical Hierarchy Process (AHP)
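One common way AHP derives such weights: fill a pairwise-comparison matrix, then approximate its principal eigenvector by averaging the normalized columns. The factors and comparison values below are illustrative:

```python
def ahp_weights(pairwise):
    """Approximate AHP weights by averaging the normalized columns."""
    n = len(pairwise)
    col_sums = [sum(pairwise[i][j] for i in range(n)) for j in range(n)]
    return [
        sum(pairwise[i][j] / col_sums[j] for j in range(n)) / n
        for i in range(n)
    ]

# pairwise[i][j]: how much more important factor i is than factor j
# (reciprocals below the diagonal), e.g. for three quality-risk factors.
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])  # weights sum to 1; first factor dominates
```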
Quality Risks Probability
P_i = Σ (j = 1..n) R_i,j * W_j
Testing Cost
[Bar chart: relative testing cost per feature F1-F9, ranging from 21.43% down to 4.76% (21.43%, 14.29%, 11.90% x 3, 9.52%, 7.14% x 2, 4.76%).]
Put the three together: Value Priority
[Bar chart: resulting value priority per feature F1-F9; two features dominate (0.81 and 0.63), with the rest between 0.14 and 0.02.]
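One way to read "put the three together", consistent with the RRL-style ranking described earlier: testing value = (business importance x fail probability) / testing cost. Both this exact combination and all the numbers below are assumptions for illustration:

```python
def value_priority(bi, fail_prob, cost):
    # Risk exposure earned per unit of testing cost (assumed form).
    return (bi * fail_prob) / cost

# Illustrative (business importance, fail probability, relative cost):
features = {
    "F1": (0.309, 0.6, 0.214),
    "F2": (0.284, 0.5, 0.143),
    "F3": (0.099, 0.3, 0.119),
}
ranked = sorted(features, key=lambda f: -value_priority(*features[f]))
print(ranked)  # highest-value feature is tested first
```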
Outline
• General Software Testing Knowledge
• Value-based Software Inspection & Testing Process
• Value-based Testing Guideline in 577ab
Step 1: Dependency Analysis
[Dependency graph: the start node fans out to TC-01-01 (3, VH), TC-03-01 (3, VH), TC-04-01 (5, VH), TC-05-01 (4, N), and TC-11-01 (4, VL); downstream cases include TC-03-02 (1, VL), TC-03-03 (1, VL), TC-03-04 (1, VL), TC-04-02 (4, VL), TC-04-03 (4, VL), the TC-05-02/03/05/07/08/10 chain (4, VL), TC-13-01 (4, VL), TC-16-03 (2, VL), TC-01-02 (2, VL), TC-14-01 (3, VL), TC-19-01 (4, VL), TC-02-01 (3, VL), TC-02-02 (2, VL), TC-15-01 (2, VL), TC-16-01 (2, VL), TC-16-02 (2, VL), TC-12-01 (4, VL), and TC-18-01 (4, VL). Edges show which test cases must pass before others can be tested; each case is annotated with its two ratings.]
Step 2: Rate Business Importance

VH:5 - Tests functionality that will bring Very High benefit for the client; without passing it, the functionality won't run
H:4 - Tests functionality that will bring Very High benefit but can still run without passing, or High benefit that won't run without passing
N:3 - Tests functionality that will bring High benefit but can still run, or Normal benefit that won't run
L:2 - Tests functionality that will bring Normal benefit but can still run, or Low benefit that won't run
VL:1 - Tests functionality that will bring Low benefit but can still run, or Very Low benefit that won't run
Step 3: Rate Criticality
VH:5 - Blocks most (70%-100%) of the test cases, AND most of those blocked test cases have High Business Importance or above
H:4 - Blocks most (70%-100%) of the test cases, OR most of those blocked test cases have High Business Importance or above
N:3 - Blocks some (40%-70%) of the test cases, AND most of those blocked test cases have Normal Business Importance
L:2 - Blocks a few (0%-40%) of the test cases, OR most of those blocked test cases have Normal Business Importance or below
VL:1 - Won't block any other test cases
Step 5: Rate Fail Probability
Experience - Did the test case fail before? People tend to repeat previous mistakes, and so does software: from previous observations (e.g. unit tests, performance at CCD, or informal random testing), a test case that failed before tends to fail again. Is the test case new? A test case that hasn't been run before has a higher probability to fail
Change Impact - Does any recent code change (delete/modify/add) affect some features? If so, the test cases for those features have a higher probability to fail
Personnel - Are the people responsible for this feature qualified? If not, the test cases for this feature tend to fail
Complexity - Does the feature involve complex algorithms or I/O functions? If so, the test cases for this feature have a higher probability to fail
Dependencies - Does this test case have many connections (either depending on or being depended on by other test cases)? If so, it has a higher probability to fail
Step 6: Prioritization
• Prioritization Algorithm
– Value First:
• Test the one with the highest Testing Value. If several test cases’
Test Values are the same, test the one with the highest Criticality
– Dependency Second:
• If the test case selected from the first step is not “Ready-to-Test”,
which means at least one of the test cases in its Dependencies Set
is “Not-Tested-Yet”. In such situation, prioritize the “Not-TestedYet” test cases according to “Value First” in this Dependencies Set
and start to test until all test cases in the Dependencies Set are
“Passed”. Then the test case with the highest value is “Ready-toTest”
– Update the prioritization:
7/17/2015
• After one round, update the Fail Probability based on updated
observation from previous testing rounds
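The "Value First" rule can be sketched as follows; the test-case names, values, and criticalities are illustrative:

```python
# Pick the not-yet-tested case with the highest testing value,
# breaking ties by criticality.

def pick_next(cases, tested):
    """cases: {tc_id: (testing_value, criticality)}."""
    candidates = [tc for tc in cases if tc not in tested]
    # Tuples compare element-wise: value first, then criticality.
    return max(candidates, key=lambda tc: cases[tc])

cases = {"TC-A": (7.5, 3), "TC-B": (7.5, 5), "TC-C": (4.0, 5)}
print(pick_next(cases, tested=set()))  # TC-B: ties TC-A on value, higher criticality
```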
Research Proposal: Dependency Aware
[Flowchart: from the whole test-case set, pick the case with the highest test value (ties broken by higher criticality). If it has dependencies that have not all passed, prioritize within its dependencies set first; once it is "Ready-to-Test", start to test. A "Passed" case is excluded from further prioritization; a "Failed" case is reported for resolution, and if unresolved, it and the cases that depend on it are excluded ("NA") from prioritization.]
[Example: with test cases 3, 4, 5, 7, and 9, the dependency-aware order of attempts is 9->3->9->5->9->4->7.]
Case Studies
Study 3: Prioritize software test cases to be executed
[Li TR2 2011]
• Experiment
– USC-CSCI 577b (2011 Spring)
– 5 Real Client Course Projects
– Acceptance testing phase
Case Studies
Study 3: Prioritize software test cases to be executed
• Experiment Results (Quantitative)
[Charts: PBIE vs. test-case execution order (1-28), Project 1 as an example, for value-based and value-neutral orderings. APBIE-1 = 81.9%, APBIE-2 = 60.00%, APBIE-3 = 50.38%.]

– Value-based prioritization can improve the cost-effectiveness of testing
Case Studies
Study 3: Prioritize software test cases to be executed
• Experiment results (observational)

Defect ID in Bugzilla | Severity | Priority | Test Case ID | BI | FP
#4444 | Critical | Resolve Immediately | TC-04-01 | VH | 0.7
#4445 | Major | Normal Queue | TC-04-03 | H | 0.7
#4460 | Major | Normal Queue | TC-05-10 | H | 0.7
#4461 | Major | Resolve Immediately | TC-18-01 | H | 0.7

– Defects with higher Priority and Severity are reported earlier and resolved earlier
Case Studies
Study 3: Prioritize software test cases to be executed
• Experiment Results (Qualitative)
“Before doing the prioritization, I had a vague idea of which test cases are
important to clients. But after going through the Value-Based testing, I had a
better picture as to which ones are of critical importance to the client.”
“I prioritized test cases mainly based on the sequence of the system work
flow, which is performing test cases with lower dependencies at first before
using value-based testing. I like the value-based process because it can
save time by letting me focus on more valuable test cases or risky ones.
Therefore, it improves testing efficiency.”
“Value-based testing is very useful in complex systems with hundreds or
thousands of test-cases. However in 577 it should not be difficult to run
every test-case in every test iteration, making the prioritization less useful.
The impact of value-based testing and automated test management on
software quality is entirely dependent on the complexity of the project. If
complete test coverage is possible in the time given, the benefit of VBST to
software quality is minimal.”
Other information
• CSCI 599: Program Analysis & Software Testing, Professor William Halfond
• Testing tools:
http://www.opensourcetesting.org/functional.php
http://www.softwareqatest.com/qatweb1.html