Transcript Integration

Integration: Process, Testing, Issues
SLIDES FROM:
- CH8 – SOFTWARE TESTING, SOMMERVILLE, SOFTWARE ENGINEERING 9
- UNIT TESTING SWENET MODULE
Testing: Implementation
 Once one has determined the testing strategy, and
the units to be tested, and completed the unit test plans,
the next concern is how to carry out the tests.
 If you are testing a single, simple unit that does not
interact with other units, then you can write a program
that runs the test cases in the test plan.
 However, if you are testing a unit that must interact with
other units, then it can be difficult to test it in isolation.
Test Implementation Terms
 Test Driver
 a class or utility program that applies test cases to a
component being tested.
 Test Stub
 a temporary, minimal implementation of a component
to increase controllability and observability in testing.
 When testing a unit that references another unit, the unit
referenced must either be complete (and tested), or stubs
must be created for use when executing test cases that
reference it.
 Test Harness
 A system of test drivers, stubs and other tools to support
test execution
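To make these terms concrete, a minimal sketch in Java follows. The OrderProcessor, TaxService, and TaxServiceStub names are invented for illustration: the stub stands in for a dependency that is not yet complete, and the driver applies a test case from the test plan and reports the result.

// Hypothetical unit under test: OrderProcessor depends on a TaxService that
// is not yet implemented, so a stub stands in for it during testing.
interface TaxService {
    double taxFor(double amount);
}

// Test stub: a temporary, minimal implementation with fixed, predictable behaviour.
class TaxServiceStub implements TaxService {
    @Override
    public double taxFor(double amount) {
        return 0.10 * amount;
    }
}

// The unit being tested.
class OrderProcessor {
    private final TaxService taxService;
    OrderProcessor(TaxService taxService) { this.taxService = taxService; }
    double total(double amount) { return amount + taxService.taxFor(amount); }
}

// Test driver: applies a test case to the component being tested and logs the outcome.
public class OrderProcessorDriver {
    public static void main(String[] args) {
        OrderProcessor processor = new OrderProcessor(new TaxServiceStub());
        double result = processor.total(100.0);
        boolean pass = Math.abs(result - 110.0) < 1e-9;
        System.out.println("total(100.0) = " + result + (pass ? "  [PASS]" : "  [FAIL, expected 110.0]"));
    }
}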
Testing sequence
 Once the design for the unit is complete, carry out a static
test of the unit. This might be a single desk check of the
unit or it may involve a more extensive symbolic execution
or mathematical analysis.
 Complete a test plan for a unit.
 If the unit references other units, not yet complete, create
stubs for these units.
 Create a driver (or set of drivers) for the unit, which includes
the following:
 construction of test case data (from the test plan)
 execution of the unit, using the test case data
 provision for the results of the test case execution to be
printed or logged as appropriate
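Such a driver might take the shape sketched below, assuming a hypothetical unit Discount.apply(price, percent): the driver builds its test-case table from the test plan, executes the unit for each case, and logs the results.

// Stand-alone test driver for a hypothetical unit, Discount.apply(price, percent).
public class DiscountDriver {

    // 1. Test case data constructed from the test plan: {price, percent, expected}.
    private static final double[][] TEST_CASES = {
        {100.0,  0.0, 100.0},
        {100.0, 25.0,  75.0},
        {  0.0, 50.0,   0.0},
    };

    public static void main(String[] args) {
        for (double[] tc : TEST_CASES) {
            // 2. Execute the unit using the test case data.
            double actual = Discount.apply(tc[0], tc[1]);
            // 3. Log the result of each test case execution.
            boolean pass = Math.abs(actual - tc[2]) < 1e-9;
            System.out.printf("apply(%.2f, %.2f) = %.2f, expected %.2f : %s%n",
                    tc[0], tc[1], actual, tc[2], pass ? "PASS" : "FAIL");
        }
    }
}

// The (illustrative) unit under test.
class Discount {
    static double apply(double price, double percent) {
        return price * (1.0 - percent / 100.0);
    }
}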
Integration: Beyond the unit
 Classes
 Components/Modules
 System
What is integration?
The term "integration" refers to a software development
activity in which separate software components are
combined into a whole. Integration is done at several
levels and stages of the implementation:
•Integrating the work of a team working in the same
implementation subsystem before releasing the
subsystem to system integrators.
•Integrating subsystems into a complete system.
Big Bang Integration
 Often the choice of rushed/inexperienced
programmers
 Create modules in parallel
 Assemble them in one operation
 Benefits:
 No need for test drivers and stubs!
 Issues:
 Complexity and cost of fault isolation
Incremental Integration
 Software modules are developed and
assembled into progressively larger parts of the system
 Complexity increases incrementally
 Easier to isolate integration problems
 Continues until complete system (at which point system
testing commences)
 Software builds created – integration tests executed
against the build
Bottom up integration
 Need to understand the dependencies
between modules
 Modules that are most depended on are
developed and integrated first
 Then the modules that depend on them are
integrated next, etc.
 Reduces the need for stubs
Top down integration
 Modules at the top of the dependency tree
are developed first, with integration
proceeding down the tree
 Requires a lot of stubs (the lower-level modules that are
depended on are not yet ready)
 Advantage: top level modules are typically
higher level functionality (e.g., user
interfaces) and are tested early in the cycle – this gives an
early feel for the application and allows time for important
modifications
Sandwich Integration
 Employ both top down and bottom up
 Create stubs for the intervening classes
Continuous Integration
 Small amounts of code added to the baseline
on a very frequent basis (perhaps daily)
 Newly integrated code may not even be at
level of a completed method/class
 As long as code is unit-tested (introduces no
errors), it can be integrated into the baseline
 One of the 12 practices of Extreme
Programming
INTEGRATION IN PRACTICE
The role of the Integrator
Subsystem integration planning
 Define the builds
 What use cases and/or scenarios
 Identify the classes and components that will
be integrated
 Identify subsystem imports
 The subsystems (and versions) needed to make
this integration work
Use-case testing
 System level example
 The use-cases developed to identify system
interactions can be used as a basis for system
testing.
 Each use case usually involves several system
components so testing the use case forces
these interactions to occur.
 The sequence diagrams associated with the
use case document the components and
interactions that are being tested.
Collect weather data sequence chart (diagram not reproduced)
A usage scenario for MHC-PMS
Kate is a nurse who specializes in mental health care. One of her responsibilities
is to visit patients at home to check that their treatment is effective and that they
are not suffering from medication side-effects.
On a day for home visits, Kate logs into the MHC-PMS and uses it to print her
schedule of home visits for that day, along with summary information about the
patients to be visited. She requests that the records for these patients be
downloaded to her laptop. She is prompted for her key phrase to encrypt the
records on the laptop.
One of the patients that she visits is Jim, who is being treated with medication for
depression. Jim feels that the medication is helping him but believes that it has the
side-effect of keeping him awake at night. Kate looks up Jim’s record and is
prompted for her key phrase to decrypt the record. She checks the drug
prescribed and queries its side effects. Sleeplessness is a known side effect so
she notes the problem in Jim’s record and suggests that he visits the clinic to have
his medication changed. He agrees so Kate enters a prompt to call him when she
gets back to the clinic to make an appointment with a physician. She ends the
consultation and the system re-encrypts Jim’s record.
After finishing her consultations, Kate returns to the clinic and uploads the records
of patients visited to the database. The system generates a call list for Kate of
those patients who she has to contact for follow-up information and make clinic
appointments.
Features tested by scenario
 Authentication by logging on to the system.
 Downloading and uploading of specified patient
records to a laptop.
 Home visit scheduling.
 Encryption and decryption of patient records on
a mobile device.
 Record retrieval and modification.
 Links with the drugs database that maintains
side-effect information.
 The system for call prompting.
Defining the build
System integration
 Identify subsystems
 Define build sets
 Things that are typically integrated together
 Define a series of builds
 To incrementally integrate the work
 Evaluate the integration plan
Regular builds important
 Build the software and regression test it at
regular intervals
 Make sure the new code does not compromise
pre-existing functionality
 Build frequency depends on phase of project
 Weekly builds early on
 Daily builds at tail end to deal with last-minute
changes/additions (bug fixes!)
Code freezes
 Deadlines set after which no new code is
accepted for the day
 Can be late (6pm)
 Can be early (noon) to make sure that if there are build
issues, developers are around to help resolve them
 Code then built
 If there are issues, it is assumed the defect lies in new code
checked in since the last build
 Can confirm new baseline or revert to previous
baseline
Beyond unit testing
TESTING TIPS & TECHNIQUES
Component testing
 Software components are often composite
components that are made up of several interacting
objects.
 For example, in a weather station system, a reconfiguration
component includes objects that deal with each aspect of
the reconfiguration.
 You access the functionality of these objects through
the defined component interface.
 Testing composite components should therefore
focus on showing that the component interface
behaves according to its specification.
 You can assume that unit tests on the individual objects
within the component have been completed.
Interface testing
 Objective is to detect faults due to interface
errors or invalid assumptions about interfaces.
 Interface types
 Parameter interfaces: data passed from one method or
procedure to another.
 Shared memory interfaces: a block of memory is shared
between procedures or functions.
 Procedural interfaces: a sub-system encapsulates a set
of procedures to be called by other sub-systems.
 Message passing interfaces: sub-systems request
services from other sub-systems.
Interface errors
 Interface misuse
 A calling component calls another component and makes an
error in its use of its interface, e.g. parameters in the wrong order.
 Interface misunderstanding
 A calling component embeds assumptions about the behaviour
of the called component which are incorrect.
 Timing errors
 The called and the calling component operate at different speeds
and out-of-date information is accessed.
Interface testing guidelines
 Design tests so that parameters to a called procedure are
at the extreme ends of their ranges.
 Always test pointer parameters with null pointers.
 Design tests which cause the component to fail.
 Use stress testing in message passing systems.
 In shared memory systems, vary the order in which
components are activated.
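For concreteness, the JUnit 4 sketch below applies some of these guidelines to a hypothetical parameter interface, Statistics.mean(int[]): a null pointer, an input designed to make the component fail, and parameter values at the extreme end of their range.

import org.junit.Test;
import static org.junit.Assert.*;

// Invented component with a simple parameter interface.
class Statistics {
    static double mean(int[] values) {
        if (values == null) throw new NullPointerException("values is null");
        if (values.length == 0) throw new IllegalArgumentException("empty input");
        long sum = 0;
        for (int v : values) sum += v;
        return (double) sum / values.length;
    }
}

public class StatisticsInterfaceTest {

    @Test(expected = NullPointerException.class)
    public void nullParameterIsRejected() {
        Statistics.mean(null);                       // always test with null pointers
    }

    @Test(expected = IllegalArgumentException.class)
    public void emptyArrayCausesFailure() {
        Statistics.mean(new int[0]);                 // a test designed to make the component fail
    }

    @Test
    public void extremeValuesDoNotOverflow() {
        int[] extremes = {Integer.MAX_VALUE, Integer.MAX_VALUE};  // extreme ends of the range
        assertEquals((double) Integer.MAX_VALUE, Statistics.mean(extremes), 0.0001);
    }
}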
Module integration
 Once the interface tests are complete, you can feel
confident about the interfaces between modules
 Then ready to test the functionality more
completely with integration tests
 Integration tests focus on the additional
functionality gained by assembling modules
Integration testing
 Ensure that the components in the
implementation model operate properly
when combined to execute a use case
 Common point of software failure
 Usually black-box
Integration Test Plan
 An Integration Test checks and verifies that the
various components in a class cluster communicate
properly with each other.
 With this plan, the emphasis is not on whether
system functionality is correctly implemented, but
rather on whether all the cluster classes “fit” together
properly.
 A single test plan, for each cluster, which tests the
communication between all cluster components is
typically sufficient.
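As a sketch, the JUnit 4 test below checks the communication between two invented cluster classes, Order and Inventory; the assertion is about their collaboration rather than about either class in isolation.

import org.junit.Test;
import static org.junit.Assert.*;

// Invented cluster: placing an Order must communicate with the Inventory.
class Inventory {
    private int stock = 5;
    int stock() { return stock; }
    void reserve(int quantity) {
        if (quantity > stock) throw new IllegalStateException("insufficient stock");
        stock -= quantity;
    }
}

class Order {
    private final Inventory inventory;
    Order(Inventory inventory) { this.inventory = inventory; }
    void place(int quantity) { inventory.reserve(quantity); }   // cross-class communication
}

public class OrderClusterIntegrationTest {

    @Test
    public void placingAnOrderUpdatesTheInventory() {
        Inventory inventory = new Inventory();
        new Order(inventory).place(2);
        // The emphasis is on the collaboration between the cluster classes.
        assertEquals(3, inventory.stock());
    }
}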
Incremental Testing
 Incremental testing indicates that we test each
unit in isolation, and then integrate each unit,
one at a time, into the system, testing the overall
system as we go.
 Classes that are dependent on each other, called class
clusters, are good candidates for incremental integration.
 Candidate class clusters:
 Classes in a package
 Classes in a class hierarchy.
 Classes associated with the interaction diagram for a
use case.
Increment Test Planning
 Determine the classes to be tested in an
increment.
 Determine the appropriate “class clusters”.
 Develop a unit test plan and test driver for each class.
 Develop an integration test plan and test driver for each
class cluster.
 Develop a test script that details the components to be
tested and the order in which the plans will be executed.
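One way to express such a test script is as a JUnit 4 suite, sketched below; the nested test classes are trivial placeholders for the real unit and cluster test classes, which would normally live in their own files.

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Suite;
import static org.junit.Assert.*;

// A "test script" expressed as a JUnit 4 suite: unit test plans for the
// classes in the increment run first, then the cluster integration plan.
@RunWith(Suite.class)
@Suite.SuiteClasses({
        IncrementTestScript.InventoryUnitTest.class,          // unit test plan, class 1
        IncrementTestScript.OrderUnitTest.class,              // unit test plan, class 2
        IncrementTestScript.OrderClusterIntegrationTest.class // cluster integration plan
})
public class IncrementTestScript {
    // No body needed: the annotation lists the components to be tested
    // and the order in which the plans will be executed.

    public static class InventoryUnitTest {
        @Test public void inventoryPlaceholder() { assertTrue(true); }
    }

    public static class OrderUnitTest {
        @Test public void orderPlaceholder() { assertTrue(true); }
    }

    public static class OrderClusterIntegrationTest {
        @Test public void clusterPlaceholder() { assertTrue(true); }
    }
}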
Object-Oriented Testing Issues
 The object class is the basic testing unit for
systems developed with OO techniques.
 This is especially appropriate if strong cohesion and
loose coupling are applied effectively in the class
design.
 An incremental testing approach is dictated by
 The package/class design decomposition.
 The incremental nature of the development process.
 Information hiding restricts access to internal state, which
constrains white-box testing.
 Encapsulation motivates testing based on the
class interface.
Class Testing (1)
 Class test cases are created by examining the specification of the
class.
 This is more difficult if the class is a subclass and inherits data and
behavior from a super class. A complicated class hierarchy
can pose significant testing problems.
 If you have a state model for the class, test each transition - devise a
driver that sets an object of the class in the source state of the
transition and generates the transition event.
 Class Constraints/Invariants should be incorporated into the class
test.
 All class methods should be tested, but testing a method in isolation
from the rest of the class is usually meaningless.
 In a given increment, you may not implement all methods; if so,
create stubs for such methods.
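A transition test of this kind might look like the sketch below, which uses an invented Session class with a three-state model; each test puts an object into the source state of a transition and then generates the transition event.

import org.junit.Test;
import static org.junit.Assert.*;

// Invented class with a simple state model: CREATED -> ACTIVE -> CLOSED.
class Session {
    enum State { CREATED, ACTIVE, CLOSED }
    private State state = State.CREATED;
    State state() { return state; }
    void start() {
        if (state != State.CREATED) throw new IllegalStateException("start from " + state);
        state = State.ACTIVE;
    }
    void stop() {
        if (state != State.ACTIVE) throw new IllegalStateException("stop from " + state);
        state = State.CLOSED;
    }
}

public class SessionTransitionTest {

    @Test
    public void startMovesCreatedSessionToActive() {
        Session session = new Session();          // object placed in the source state (CREATED)
        session.start();                          // generate the transition event
        assertEquals(Session.State.ACTIVE, session.state());
    }

    @Test(expected = IllegalStateException.class)
    public void stopIsRejectedInTheCreatedState() {
        new Session().stop();                     // event not permitted in this source state
    }
}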
Class Testing (2)
 To test a class, you may need instances of
other classes.
 Interaction diagrams provide useful guidance
in constructing class test cases:
 They provide more specific knowledge about how
objects of one class interact with instances of
other classes.
 Messages sent to a class object provide guidance
on which test cases are most critical for that class.
Method Testing (1)
 A public method in a class is typically tested
using a black-box approach.
 Start from the specification, no need to look at the
method body.
 Consider each parameter in the method signature,
and identify its equivalence classes.
 Incorporate pre-conditions and post-conditions in your
test of a method.
 Test exceptions.
 For complex logic, also use white-box testing or
static testing.
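A black-box sketch for a hypothetical public method, Account.withdraw(amount), follows: one test per equivalence class of the parameter, a post-condition check on the balance, and tests of the exceptions.

import org.junit.Test;
import static org.junit.Assert.*;

// Invented class whose public method is tested from its specification.
class Account {
    private double balance;
    Account(double openingBalance) { this.balance = openingBalance; }
    double balance() { return balance; }
    void withdraw(double amount) {
        if (amount <= 0) throw new IllegalArgumentException("amount must be positive");
        if (amount > balance) throw new IllegalStateException("insufficient funds");
        balance -= amount;
    }
}

public class AccountWithdrawTest {

    @Test
    public void validAmountReducesBalance() {               // equivalence class: 0 < amount <= balance
        Account account = new Account(100.0);
        account.withdraw(40.0);
        assertEquals(60.0, account.balance(), 0.0001);      // post-condition on the balance
    }

    @Test(expected = IllegalArgumentException.class)
    public void nonPositiveAmountIsRejected() {             // equivalence class: amount <= 0
        new Account(100.0).withdraw(0.0);
    }

    @Test(expected = IllegalStateException.class)
    public void overdraftRaisesAnException() {              // equivalence class: amount > balance
        new Account(100.0).withdraw(150.0);
    }
}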
Method Testing (2)
 For private methods, either
 modify the class (temporarily) so that it can be
tested externally:
 change the access to public
 or incorporate a test driver within the class
 or use static test methods, such as program
tracing or symbolic execution
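The second option, incorporating a test driver within the class, might look like the sketch below; the PasswordPolicy class and its private hasDigit() helper are invented for illustration.

// Incorporating a small test driver inside the class so a private method can
// be exercised without widening its access. Names are illustrative only.
public class PasswordPolicy {

    public boolean accept(String candidate) {
        return candidate != null && candidate.length() >= 8 && hasDigit(candidate);
    }

    // Private helper that cannot be reached directly from an external test class.
    private boolean hasDigit(String s) {
        for (char c : s.toCharArray()) {
            if (Character.isDigit(c)) return true;
        }
        return false;
    }

    // Temporary in-class test driver; remove (or exclude from the build) after testing.
    public static void main(String[] args) {
        PasswordPolicy policy = new PasswordPolicy();
        System.out.println("hasDigit(\"abc1\")  expected true  -> " + policy.hasDigit("abc1"));
        System.out.println("hasDigit(\"abcd\")  expected false -> " + policy.hasDigit("abcd"));
    }
}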
Testing Tools
 There are a number of tools that have been
developed to support the testing of a unit or
system.
 googling “Software Testing Tools” will yield thousands of
results.
 JUnit testing (http://www.junit.org/index.htm) is a
popular tool/technique that can be integrated into
the development process for a unit coded in Java.
 Test Infected: Programmers Love Writing Tests
http://members.pingnet.ch/gamma/junit.htm
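A minimal JUnit 4 test class has the shape sketched below; the Counter class under test is invented here, but the @Before fixture and the one-assertion-per-test pattern are typical.

import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.*;

public class CounterTest {

    private Counter counter;

    @Before
    public void setUp() {
        counter = new Counter();   // fresh fixture before every test method
    }

    @Test
    public void startsAtZero() {
        assertEquals(0, counter.value());
    }

    @Test
    public void incrementAddsOne() {
        counter.increment();
        assertEquals(1, counter.value());
    }
}

// Illustrative class under test.
class Counter {
    private int value;
    void increment() { value++; }
    int value() { return value; }
}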
System testing
 System testing during development involves
integrating components to create a version of
the system and then testing the integrated
system.
 The focus in system testing is testing the
interactions between components.
 System testing checks that components are
compatible, interact correctly and transfer the
right data at the right time across their
interfaces.
 System testing tests the emergent behaviour of
a system.
System and component testing
 During system testing, reusable components
that have been separately developed and off-the-shelf systems may be integrated with newly
developed components. The complete system is
then tested.
 Components developed by different team
members or sub-teams may be integrated at this
stage. System testing is a collective rather than
an individual process.
 In some companies, system testing may involve a
separate testing team with no involvement from
designers and programmers.
Testing policies
 Exhaustive system testing is impossible so
testing policies which define the required
system test coverage may be developed.
 Examples of testing policies:
 All system functions that are accessed through
menus should be tested.
 Combinations of functions (e.g. text formatting)
that are accessed through the same menu must
be tested.
 Where user input is provided, all functions must
be tested with both correct and incorrect input.
Release testing
 Release testing is the process of testing a particular
release of a system that is intended for use outside
of the development team.
 The primary goal of the release testing process is to
convince the supplier of the system that it is good
enough for use.
 Release testing, therefore, has to show that the system
delivers its specified functionality, performance and
dependability, and that it does not fail during normal use.
 Release testing is usually a black-box testing process
where tests are only derived from the system
specification.
Release testing and system testing
 Release testing is a form of system testing.
 Important differences:
 A separate team that has not been involved in the
system development, should be responsible for
release testing.
 System testing by the development team should
focus on discovering bugs in the system (defect
testing). The objective of release testing is to
check that the system meets its requirements and
is good enough for external use (validation
testing).
Requirements based testing
 Requirements-based testing involves examining
each requirement and developing a test or tests
for it.
 Example: mental health care prescription
management system (MHC-PMS) requirements:
 If a patient is known to be allergic to any particular
medication, then prescription of that medication shall
result in a warning message being issued to the
system user.
 If a prescriber chooses to ignore an allergy warning,
they shall provide a reason why this has been ignored.
Requirements tests
 Set up a patient record with no known allergies. Prescribe medication
for allergies that are known to exist. Check that a warning message is
not issued by the system.
 Set up a patient record with a known allergy. Prescribe the medication
that the patient is allergic to, and check that the warning is issued by
the system.
 Set up a patient record in which allergies to two or more drugs are
recorded. Prescribe both of these drugs separately and check that the
correct warning for each drug is issued.
 Prescribe two drugs that the patient is allergic to. Check that two
warnings are correctly issued.
 Prescribe a drug that issues a warning and overrule that warning.
Check that the system requires the user to provide information
explaining why the warning was overruled.
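The first two of these tests might be automated along the lines of the sketch below; the PatientRecord and PrescriptionService classes are invented stand-ins, since the slides do not give the actual MHC-PMS interfaces.

import java.util.HashSet;
import java.util.Set;
import org.junit.Test;
import static org.junit.Assert.*;

// Invented stand-ins for the MHC-PMS record and prescribing components.
class PatientRecord {
    private final Set<String> allergies = new HashSet<>();
    void addAllergy(String drug) { allergies.add(drug); }
    boolean isAllergicTo(String drug) { return allergies.contains(drug); }
}

class PrescriptionService {
    // Returns true if a warning was issued for this prescription.
    boolean prescribe(PatientRecord record, String drug) {
        return record.isAllergicTo(drug);   // warn when a known allergy matches the drug
    }
}

public class AllergyWarningTest {

    @Test
    public void noWarningWhenPatientHasNoKnownAllergy() {
        PatientRecord record = new PatientRecord();                        // no allergies recorded
        assertFalse(new PrescriptionService().prescribe(record, "penicillin"));
    }

    @Test
    public void warningIssuedWhenPatientIsAllergicToTheDrug() {
        PatientRecord record = new PatientRecord();
        record.addAllergy("penicillin");                                   // known allergy
        assertTrue(new PrescriptionService().prescribe(record, "penicillin"));
    }
}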