Transcript Chapter 1

Software Testing and Quality Assurance:
Planning for Testing
• Reading Assignment:
– John McGregor and David A. Sykes, A
Practical Guide to Testing Object-Oriented
Software, Addison-Wesley, 2001, ISBN: 0-201-32564-0.
• Chapter 3: Planning for Testing
Objectives
To be able to plan a test process that complements
the development process.
• To learn how to analyze the risks associated with
verifying the required functionality.
• To be able to develop test plans for different
levels and types of testing required for the
comprehensive test process.
Outline
• Introduction
• A development process overview
• A testing process overview
• Risk analysis
• A testing process
• Roles in the testing process
• A detailed set of test activities
• Planning activities
• Document templates
• Test metrics
Introduction
• Testing requires considerable resources.
Effective utilization of those resources
requires good planning and good
management.
• Planning at the technical level is guided by
templates that are “instantiated” as needed
by developers.
• Basic testing process:
– Test early, test often, test enough.
A Development process overview
• A process is a continuous series of activities that
leads to an end.
• Four main activities of software development
– Analysis: (domain and application analysis) focuses on understanding
the problem and defining the requirements for the software portions of
the system
– Design: (architectural design, subsystem and package design, class
design, and algorithm design) focuses on solving the problem in
software
– Implementation: (class implementation and integration) focuses on
translating the design into executable code
– Testing: (basic unit testing, integrated units testing, subsystem testing,
system testing) focuses on ensuring that inputs produce the desired
results as specified by the requirements.
• Maintenance focuses on bug repairs and enhancements.
A Development process overview (cont...)
• Development models: Evolutionary, incremental (our
focus), spiral, concurrent.
• Under an incremental development process, a system is
developed as a sequence of increments.
• An increment is a deliverable, including models,
documentation, and code, which provides some of the
functionality required by the system.
• The products developed in one increment feed into the
development of the next increment.
• The final increment delivers a deployable system that
meets all requirements.
• Increments can be developed in sequence or one or more
can be developed concurrently.
A Development process overview
(cont...)
• In each increment developers: analyze,
design, code, and test as needed.
• Developers have to perform these activities
repeatedly in building an increment because
they find errors in previous work
(incremental, iterative development
process).
A Development process overview
(cont...)
(Figure: Increment 1, Increment 2, ..., Increment N, each passing through Analysis, Design, Implementation, and Testing.)
A Development process overview
(cont...)
• Object-oriented development is particularly
well suited to evolutionary development
because OO analysis, design and
implementation entail the successive
refinement of a single model.
• In OO analysis, we understand a problem
by modeling it in terms of objects and
classes of objects, their relationships and
responsibilities.
A Development process overview
(cont...)
• In OO design, we solve the problem by
manipulating those same objects and
relationships identified in analysis and
introducing solution-specific classes,
objects, relationships, and responsibilities.
• Implementation is straightforward from a
well-specified set of design products.
• Testing should also be done in every
increment. Regression tests must be run
between increments and within iterations.
Example: Incremental, Iterative Development Plan for Brickles
Increment 1: Present user interface showing puck bouncing in window
Iterations:
1.a Domain analysis: Construct class diagrams
1.b Application analysis: Construct class diagrams and state diagrams
1.c Design: Study MFC and animation.
1.d Implement: Code Hello World using MFC.
2.a Design: Complete class diagram for puck bouncing in a window
2.b Implementation: Code puck bouncing in window
2.c Testing: Test the code for puck bouncing in window
Example: Incremental, Iterative Development Plan for Brickles (cont...)
Increment 2: Move paddle in window and detect collisions
Iterations:
1.a Application analysis: Add details of Paddle control and collisions to class diagrams and other diagrams
1.b Design: Design Paddle and collision classes.
1.c Implementation: Code Paddle classes incrementally from MovableSprite and collision class from Exception.
Example: Incremental, Iterative Development Plan for Brickles (cont...)
Increment 3: Display brick pile and detect collisions
Iterations:
1.a Application analysis: Add collections of sprites to class diagram
1.b Design: Design collision detection algorithm.
1.c Implementation: Code Brickpile class by aggregating collection class.
Example: Incremental, Iterative Development Plan for Brickles (cont...)
Increment 4: Add supply of pucks and detect end of match
Iterations:
1.a Design: End of match algorithm to use exceptions to detect endOfMatch
1.b Implementation: PuckSupply class.
A Testing process overview
• Development and testing are two distinct but
intimately related processes. Their activities
overlap when test cases have to be designed,
coded, and executed.
• The development and testing roles are usually
assigned to different people.
(Figure: the development process delivers a development product to the testing process, which feeds test results back to the development process.)
A Testing process overview:
testability
• Testability is related to how easily you can
evaluate the results of the tests.
A Testing process overview: test
cases and test suites
• The basic component of testing is a test
case.
• A test case is a pair (input, expected result):
– Input: is a description of an input to the
software under test.
– Expected result: is a description of the output
that the software should exhibit for the
associated input.
• Input and expected result need not be simple
data values; they can be arbitrarily complex.
A Testing process overview: test
cases and test suites (cont...)
• A test case execution is a run of the software in
which the input specified by the test case is supplied,
the results are observed, and those results are compared
with the expected result specified by the test case.
• Test suites have some sort of organization based
on the kinds of test cases they contain, e.g. a system
capacity test suite, a typical-uses-of-the-system test suite, etc.
– Main issues for a test suite: correctness, observability of
results, and adequacy (a minimal sketch of these definitions follows).
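To make these definitions concrete, here is a minimal sketch (in Python, purely for brevity; the course's Brickles example is C++/MFC) of a test case as an (input, expected result) pair, a named test suite, and a test case execution step. The names TestCase and run_suite, and the trivial square function under test, are illustrative assumptions, not example code from the book.

```python
# Minimal sketch: a test case as an (input, expected result) pair,
# a test suite as a named collection of cases, and an execution step
# that runs the software under test and compares actual vs. expected.
from dataclasses import dataclass
from typing import Any, Callable, Sequence


@dataclass
class TestCase:
    input: Any        # description of an input to the software under test
    expected: Any     # output the software should exhibit for that input


def run_suite(name: str, software: Callable[[Any], Any],
              suite: Sequence[TestCase]) -> bool:
    """Execute each test case and report pass/fail for the whole suite."""
    passed = True
    for case in suite:
        actual = software(case.input)
        if actual != case.expected:
            print(f"{name}: FAIL input={case.input!r} "
                  f"expected={case.expected!r} actual={actual!r}")
            passed = False
    print(f"{name}: {'PASSED' if passed else 'FAILED'}")
    return passed


# Example usage against a trivial function under test.
def square(x: int) -> int:
    return x * x


typical_uses_suite = [TestCase(2, 4), TestCase(-3, 9), TestCase(0, 0)]
run_suite("typical uses of the system", square, typical_uses_suite)
```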
A Testing process overview: STEP
testing technique
• Analysis: The product to be tested is examined to
identify any special features that must receive
particular attention and to determine the test cases
that should be constructed.
• Construction: the artifacts that are needed for
testing are created. Test cases identified during
analysis are translated into programming-language
and scripting-language code.
• Execution and evaluation: the results are
examined to determine whether the software
passed the test suite or failed it.
Risk analysis ―A tool for testing
• Risk analysis is part of planning and development
effort.
• A risk is anything that threatens the successful
achievement of a project’s goals.
• A risk is an event that has some probability of
happening and, if it occurs, there will be some
loss (down time, financial).
• Fundamental principle for risk-based testing
– Test most heavily those portions of the system that
pose the highest risks to the project to ensure that the
most harmful faults are identified.
Risk analysis ―A tool for testing: risk types
• Project risks: include managerial and
environmental risks (e.g. insufficient supply
of qualified personnel).
• Business risks: are associated with domain-related concepts.
This type of risk is related to the functionality of the
program (e.g. changes in health insurance policy).
• Technical risks: include some
implementation concepts (e.g. the quality of
the code).
Risk analysis ―A tool for testing: risk analysis
• Risk analysis is a procedure for identifying risks and
for identifying ways to prevent potential problems from
becoming real.
• The output of risk analysis is a list of identified risks,
ordered by risk level, that can be used to allocate
limited resources and to prioritize decisions.
• Risks in OO software projects are tied to architectural
features unique to OO:
– Complex interactions among objects
– Complex behavior associated with a class specification
– Changing or evolving project requirements
– Complexity of a class, measured by:
• The size of its specification
• The number of relationships the class has with other classes
Risk analysis ―A tool for testing:
risk analysis (cont...)
• Sources of risk:
– For system testing, the various uses of the
system are prioritized based on the importance
to the user and proper operation of the system.
– Risks are also associated with the programming
language and development tools that are being
used to implement the software (strongly typed
vs. weakly typed languages).
Risk analysis ―A tool for testing:
risk analysis (cont...)
• Conducting the analysis:
– The risk analysis technique includes three tasks:
• Identify the risks that each use case poses to the
development effort (classified as low, medium, or
high; possibly also very high)
• Quantify the risk
• Produce a ranked list of use cases
Risk analysis example
• Brickles example
Use     Risk Level   Frequency   Criticality   Scenario
Wins    Medium       Low         High          Player wins game
Loses   Medium       High        Low           Player loses game
• You may combine the frequency and the
criticality to determine which should be tested
more heavily.
• Risk level:
– Conservative strategy: select the higher value.
– Averaging strategy: select a value between the two
(as sketched below).
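A hedged sketch of the two combination strategies follows; the numeric ordering of Low/Medium/High and the rounding choice in the averaging strategy are assumptions made only so the calculation can be shown.

```python
# Sketch: combining frequency and criticality into a risk level.
# Levels are ordered Low < Medium < High; mapping them to list indices
# is an assumption made purely so the two strategies can be computed.
LEVELS = ["Low", "Medium", "High"]


def conservative(frequency: str, criticality: str) -> str:
    """Conservative strategy: take the higher of the two values."""
    return LEVELS[max(LEVELS.index(frequency), LEVELS.index(criticality))]


def averaging(frequency: str, criticality: str) -> str:
    """Averaging strategy: take a value between the two (rounded up here)."""
    total = LEVELS.index(frequency) + LEVELS.index(criticality)
    return LEVELS[-(-total // 2)]   # ceiling of the average index


# Brickles example: "Wins" is Low frequency / High criticality,
# "Loses" is High frequency / Low criticality.
for use, freq, crit in [("Wins", "Low", "High"), ("Loses", "High", "Low")]:
    print(use, "conservative:", conservative(freq, crit),
          "averaging:", averaging(freq, crit))
```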
A testing process: planning issues
• Issues that must be addressed to give a basic
shape to the test process:
– Planning issues:
• Testing the models.
• Class testing, interaction testing, system testing,
regression testing.
A testing process: dimensions of
software testing
• Who performs the testing?
– Developers, independent testers, or a combination of the
two.
– Developers may exchange code and test each other’s
code.
• Which pieces will be tested?
– Test nothing, test a sample, test everything, or just the
ones associated with high risks.
• When will testing be performed?
– Test every day, test components as they are
developed, test all components together at the end, at
special milestones.
A testing process: dimensions of
software testing (cont...)
• How will testing be performed?
– Knowledge of specification only, knowledge of
specification and implementation.
• How much testing is adequate?
– No testing, exhaustive testing; consider the expected
lifetime of the software and whether the software is life-critical.
– Coverage: a measure of how completely a test suite
exercises the capabilities of a piece of software (other
measures: whether every line of code is executed at least
once, the number of requirements checked by the test
suite).
A testing process: adequacy of test
cases
• Adequacy of test cases: test the software enough
to be reasonably sure that the software works as
it is supposed to.
• Adequacy can be measured based on coverage:
– Coverage based on the requirements: based on what
the software is supposed to do ―how many of the
requirements called out in the specification are tested
– Coverage based on the code: based on how the
software actually works ―how much of the software
was executed as a result of running test suites
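The two adequacy measures reduce to simple ratios; the sketch below shows one way they could be computed. The requirement IDs, line numbers, and coverage figures are hypothetical, used only for illustration.

```python
# Sketch: adequacy measured as coverage.
# Requirements coverage: fraction of specified requirements exercised
# by at least one test case. Code coverage: fraction of lines executed.
def requirements_coverage(all_requirements: set, tested_requirements: set) -> float:
    return len(tested_requirements & all_requirements) / len(all_requirements)


def code_coverage(all_lines: set, executed_lines: set) -> float:
    return len(executed_lines & all_lines) / len(all_lines)


# Hypothetical numbers for illustration only.
spec = {"REQ-1", "REQ-2", "REQ-3", "REQ-4"}
covered = {"REQ-1", "REQ-3"}
print(f"requirements coverage: {requirements_coverage(spec, covered):.0%}")
print(f"code coverage: {code_coverage(set(range(1, 101)), set(range(1, 81))):.0%}")
```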
A testing process: adequacy of test
cases (cont...)
• Functional testing (specification-based or black
box testing): test cases are constructed based
solely on the software’s specification and not on
how the software is implemented.
• Structural testing (implementation-based or
white box testing): test cases are constructed
based on the code that implements the software.
• Use some combination of both approaches.
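The contrast can be shown on even a trivial unit. In the sketch below, the classify function and its test values are hypothetical: the functional cases are chosen from the specification alone, while the structural cases are chosen so that every branch of the implementation is executed.

```python
# Sketch: functional (black box) vs. structural (white box) test selection
# for a trivial unit under test.
def classify(balance: float) -> str:
    """Spec: return 'overdrawn' for negative balances, 'ok' otherwise."""
    if balance < 0:
        return "overdrawn"
    return "ok"


# Functional tests: derived from the specification alone
# (negative, zero, positive inputs), without looking at the code.
functional_cases = [(-10.0, "overdrawn"), (0.0, "ok"), (250.0, "ok")]

# Structural tests: derived from the implementation, chosen so that
# both branches of the `if` statement are executed at least once.
structural_cases = [(-0.01, "overdrawn"), (0.01, "ok")]

for inp, expected in functional_cases + structural_cases:
    assert classify(inp) == expected
print("all functional and structural cases passed")
```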
A testing process: adequacy of test
cases― risk analysis (cont...)
• Risk analysis in the testing process is applied to
determine the level of detail and the amount of time
to dedicate to testing a component.
• A reasonable scale of increasing risk for components is
(risk increases down the list; a sketch of using this scale follows):
– Prototype components
– Production components
– Library components
– Framework components
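One way a plan might turn this scale into effort allocations is sketched below; the multiplier values and base hours are hypothetical and would in practice come from an organization's own baseline data.

```python
# Sketch: using the component risk scale to scale the testing effort
# allocated to each kind of component. The multipliers are hypothetical.
RISK_MULTIPLIER = {
    "prototype": 0.5,   # lowest risk: light, exploratory testing
    "production": 1.0,
    "library": 2.0,
    "framework": 3.0,   # highest risk: reused everywhere, test most heavily
}

base_hours_per_component = 4.0

for kind, multiplier in RISK_MULTIPLIER.items():
    hours = base_hours_per_component * multiplier
    print(f"{kind:>10}: {hours:.1f} testing hours per component")
```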
Roles in the testing process
• One or more people can assume each role, and one
person can assume multiple roles, but the roles
themselves remain distinct:
– Class tester: test individual classes as they are
programmed.
– Integration tester: test a set of objects that are being
brought together from different development
resources (requires testing and development skills).
– System tester: responsible for independently
verifying that the completed application satisfies the
system requirements.
– Test manager: managing the test process.
Planning activities: scheduling testing
activities
• Class tests:
– Scheduled based on the developer’s judgment as they
become useful or necessary.
– Useful during coding
– Necessary when adding a component to the code base.
• Integration tests:
– Scheduled at specific intervals, usually at the end of major
iterations that signal the completion of an increment.
• System tests:
– Performed on major deliverables at specified intervals
throughout the project, usually specified in the project
plan.
Planning activities: estimation
• Estimation: estimating resources ―cost, time and
personnel
• Factors that should be considered for estimation:
– Level of coverage: estimate the amount of effort for one use
case and then scale up to construct the estimate for the whole system.
– Domain type: logic-intensive vs. data-intensive (data-intensive
systems may have simple logic but still be difficult to test).
– Equipment required: an environment as close as possible to the
deployment environment. Costs of equipment for simulation
must also be considered.
– Organizational model: the buddy approach (two developers swap
code with each other and test) or a different approach.
– Testing effort estimate: the effort for testing all classes is the
total number of classes times the effort per class (see the sketch below).
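As a worked instance of the last factor, this sketch multiplies hypothetical planning inputs; none of the figures come from the book.

```python
# Sketch: testing effort estimate = total number of classes * effort per class.
# The figures below are hypothetical planning inputs, not data from the book.
number_of_classes = 45
hours_per_class = 3.5          # estimated effort to test one class

class_testing_effort = number_of_classes * hours_per_class
print(f"estimated class-testing effort: {class_testing_effort:.0f} developer-hours")

# The same idea scales a per-use-case estimate to the whole system.
use_cases = 12
hours_per_use_case = 16        # estimated effort for one use case test plan
print(f"estimated system-test effort: {use_cases * hours_per_use_case} developer-hours")
```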
Planning activities: document
templates
• Project test plan
• Component test plan (one per important
component/class).
• Use case test plan (one per use case)
• Integration test plan (one per increment)
• System test plan (only one)
Planning activities: document templates
— The IEEE 829 Standard Test Plan outline
1.0 Introduction
2.0 Test Items
3.0 Tested Features
4.0 Features Not Tested (per cycle)
5.0 Testing Strategy and Approach
    5.1 Syntax
    5.2 Description of Functionality
    5.3 Arguments for Tests
    5.4 Expected Output
    5.5 Specific Exclusions
    5.6 Dependencies
    5.7 Test Case Success/Failure Criteria
Planning activities: document templates —
The IEEE 829 Standard Test Plan outline (Cont.)
6.0 Pass/Fail Criteria for Complete Test Cycle
7.0 Entrance Criteria / Exit Criteria
8.0 Test Suspension Criteria and Resumption Requirements
9.0 Test Deliverables / Status Communication Vehicles
10.0 Testing Tasks
11.0 Hardware and Software Requirements
12.0 Problem Determination and Correction Responsibilities
13.0 Staffing and Training Needs / Assignments
14.0 Test Schedules
15.0 Risks and Contingencies
16.0 Approvals
Planning activities: document
templates— Project test plan
• The purpose of the project test plan is to summarize the
testing strategy that is to be employed for the project:
– The steps in the development process at which testing will
occur
– The frequency with which testing should occur
– Who is responsible for the activity
• The project test plan may be an independent
document or it may be included in either the overall
project plan or the project’s quality assurance plan.
Planning activities: document
templates— Component test plan
• The purpose of the component test plan is to
define the overall strategy and specific test cases
that will be used to test a certain component.
• Two types of guiding information:
– Project criteria: standards that have been agreed upon
as to how thoroughly each component will be tested (e.g.
100% of the critical components will be tested)
– Project procedure: identify techniques that have been
agreed upon as the best way to handle a particular task
(e.g. constructing a PACT class for each component that
will be tested)
Planning activities: document
templates— Use case test plan
• The purpose of the use case test plan is to describe the
system-level tests to be derived from a single use case
• Three levels of use cases:
– High level: abstract use cases that are the basis for being
extended into end-to-end system-level use cases
– End-to-end system-level use cases
– Functional sub-use cases: aggregated into end-to-end system-level
use cases.
• Types of use cases:
– Functionality use cases: modify the data maintained by the
system in some way
– Report use cases: present information maintained by the system to the user.
Planning activities: document
templates— Integration test plan
• Integration test plan is important in an iterative
development environment.
• Specific sets of functionality will be delivered before
others.
• Out of these increments the full system slowly emerges.
• Since small, localized behavior should already have been
tested, integration tests should be more complex and
more comprehensive than the typical component tests.
• Individual test plans are combined to form the integration
test plan for a specific increment.
• Because the integration test plan depends on other test plans
and test cases, no separate template is provided.
Planning activities: document
templates— System test plan
• The system test plan is a document that
summarizes the individual use case test
plans and provides information on
additional types of testing that will be
conducted at the system level.
Planning activities: document
templates—Iteration in planning
• Iteration in the development process affects how
planning is carried out for testing.
• Changes in product or increment requirements at
least require that test plans be reviewed.
• A requirements-to-use-case mapping matrix is a
spreadsheet with requirement IDs on the vertical
axis and use case IDs on the horizontal axis (a minimal
representation is sketched after this list).
• Use-case-to-package(s) mapping matrix: each use
case is related to a set of packages or classes.
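The matrix can be kept in something as simple as a dictionary of sets, as in the sketch below; the requirement and use case IDs are hypothetical and the layout is only one possible rendering of the spreadsheet described above.

```python
# Sketch: a requirements-to-use-case mapping matrix kept as a dictionary.
# Rows are requirement IDs, columns are use case IDs; a mark means the
# use case exercises the requirement. All IDs here are hypothetical.
matrix = {
    "REQ-1": {"UC-1", "UC-3"},
    "REQ-2": {"UC-2"},
    "REQ-3": set(),            # not yet covered by any use case
}

use_cases = ["UC-1", "UC-2", "UC-3"]

# Print the matrix and flag requirements with no covering use case,
# which signals that the use case test plans need to be revisited.
print("Req.   " + "  ".join(use_cases))
for req, covered in matrix.items():
    row = "  ".join("X   " if uc in covered else ".   " for uc in use_cases)
    print(f"{req}  {row}")

uncovered = [req for req, covered in matrix.items() if not covered]
print("requirements without a use case:", uncovered or "none")
```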
Planning activities: document
templates—planning effort
• The effort expended on planning depends
on:
– The amount of reuse that exists among the
templates
– The effort required to complete each plan from
the template
– The effort to modify an existing plan
Planning activities: test metrics
• Test metrics include measures that provide
information for evaluating the effectiveness
of individual testing techniques and of the
complete testing process.
• Metrics are also used to provide planning
information such as estimates of effort
required for testing.
Planning activities: test metrics
(cont...)
• Coverage indicates which items have been touched by the
test cases.
• Examples of coverage measures (product measures):
– Code coverage: which lines of code have been executed
– Postcondition coverage: which method postconditions have
been reached
– Model-element coverage: which classes and relationships in a
model have been used in test cases
• Complexity measures
– Number and complexity of methods in the class
– Number of lines of code
– Amount of dynamic binding
Planning activities: test metrics
(cont...)
• Historical data used for projections and for
planning a new project
• The (developer-hours) / (number of defects)
metric provides a measure of the cost of the
testing process (a worked instance is sketched below).
• Each company will need to baseline its process
and collect actual performance data before using
these numbers for planning purposes.
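A worked instance of that cost metric, using hypothetical baseline figures:

```python
# Sketch: cost of the testing process as developer-hours per defect found.
# The figures are hypothetical; each organization must baseline its own.
developer_hours_spent_testing = 320
defects_found = 40

cost_per_defect = developer_hours_spent_testing / defects_found
print(f"cost of testing: {cost_per_defect:.1f} developer-hours per defect")
```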
Planning activities: test metrics
(cont...)
• The effectiveness of the testing process is
evaluated by collecting data over the complete
development life cycle.
• The efficiency of the testing process is measured
by considering, for every defect, the interval between
the development phase in which the defect was injected
and the phase in which it was detected (a sketch of this
computation follows):
– A perfectly effective testing process finds every
defect in the same development phase in which it was
injected.
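The interval measure can be computed from per-defect records as in the sketch below; the phase names and defect data are hypothetical.

```python
# Sketch: testing-process efficiency as the interval (in development phases)
# between where each defect was injected and where it was detected.
# Phase names and defect records are hypothetical.
PHASES = ["analysis", "design", "implementation", "integration", "system test"]

defects = [
    {"injected": "design", "detected": "design"},              # interval 0 (ideal)
    {"injected": "analysis", "detected": "implementation"},    # interval 2
    {"injected": "implementation", "detected": "system test"}, # interval 2
]

intervals = [PHASES.index(d["detected"]) - PHASES.index(d["injected"])
             for d in defects]
print("average phase interval:", sum(intervals) / len(intervals))
print("defects found in the phase they were injected:",
      sum(1 for i in intervals if i == 0))
```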
Planning activities: fundamental
testing metrics
• Time:
– The time required to run a test (sample units:
generally estimated in minutes or hours per test).
– The time available for the test effort (sample units:
generally estimated in weeks and measured in
minutes)
• The cost of testing (sample units: currency, such
as dollars; can also be measured in units of
time)
Key points
• Basic testing process: Test early, test often, test enough.
• Four main activities of software development: analysis,
design, implementation and testing.
• Development and testing are two distinct but intimately
related processes.
• The basic component of testing is a test case (pair of
input and expected result)
• STEP testing technique: analysis, construction, execution
and evaluation.
• A risk is anything that threatens the successful
achievement of a project’s goals.
Key points (cont...)
• Risk analysis technique includes three tasks:
identify, quantify the risk and produce a ranked
list of use cases.
• Adequacy of test cases: test the software enough
to be reasonably sure that the software works as it
is supposed to.
• Roles in the testing process: class tester,
integration tester, system tester, and test manager.
• Factors that should be considered for estimation: level
of coverage, domain type, equipment required,
organizational model, and testing effort estimate.
Key points (cont...)
• Document templates:
– The purpose of project test plan is to summarize the testing
strategy that is to be employed for the project
– The purpose of the component test plan is to define the overall
strategy and specific test cases that will be used to test a certain
component.
– The purpose of the use case test plan is to describe the system-level tests to be derived from a single use case
– The system test plan is a document that summarizes the
individual use case test plans and provides information on
additional types of testing that will be conducted at the system
level
• Test metrics include measures that provide information
for evaluating the effectiveness of individual testing
techniques and of the complete testing process.