System Integration Verification and Validation

Remember the V-cycle for all increments?

[Figure: V-cycle diagram. The left (development) leg descends from System Planning and System Requirements through System Architecture & Design into the discipline activities (ME, EE, SW); within SW: Discipline Planning, SW Requirements, SW Architecture, SW Design and SW Coding (with Code Review), down to Realization. The right (test) leg ascends through SW Module Test (V1), SW Integration and SW Verification (V2), System Integration Verification & Validation (V3), and System Verification with SE/PVV activities (V4).]
Test Overview
SW Module Test
  Who:          the SW developer
  Tests what:   a SW package (or single modules)
  Against what: the SW Design
  Where:        simulation on PC, target

SW Integration Test
  Who:          the SW test engineer
  Tests what:   a SW component (possibly within a SW system)
  Against what: the SW Architecture
  Where:        partial or whole system

SW Verification Test
  Who:          the SW test engineer
  Tests what:   SW components (within a SW system)
  Against what: the SW Requirements
  Where:        usually the whole system
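The first row of the overview above, a module test checking a single module against its design, can be sketched in a few lines. This is a minimal illustration only: the module under test (a volume controller) and its "design rules" are invented here, not taken from the slides.

```python
import unittest

# Hypothetical module under test: a volume controller standing in for a
# single SW module. The clamping rules play the role of the SW Design.
class VolumeControl:
    MAX = 30

    def __init__(self):
        self.level = 0

    def up(self):
        # Design rule: the level is clamped at MAX
        self.level = min(self.level + 1, self.MAX)

    def down(self):
        # Design rule: the level never drops below 0
        self.level = max(self.level - 1, 0)


class VolumeControlModuleTest(unittest.TestCase):
    """Module test: checks the module's behavior against its SW design."""

    def test_up_clamps_at_max(self):
        vc = VolumeControl()
        for _ in range(100):
            vc.up()
        self.assertEqual(vc.level, VolumeControl.MAX)

    def test_down_never_negative(self):
        vc = VolumeControl()
        vc.down()
        self.assertEqual(vc.level, 0)


if __name__ == "__main__":
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(VolumeControlModuleTest)
    unittest.TextTestRunner(verbosity=2).run(suite)
```

Note that the test cases reference only the design rules of the module itself, not other components; interaction with other components is deferred to the integration test.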
Why SI?
Why do we need System Integration?
The purpose of System Integration is
- to assemble a system from its defined components
- to ensure that the interfaces between the components of the integrated system function properly
Focus is on the system level, i.e.
- releases of sub-projects are handled as one part (system modules / discipline components)
- interfaces between these parts are the object of tests
Remark: even if a project has no official System Integration team, the activities are performed anyway
(by the highest / last integration step in the project).
Integration testing (interface testing)
Examines the interaction of software elements (components) after system integration
Integration is the activity of combining individual software components into larger subsystems
Further integration of subsystems is also part of the system integration process
Each component has already been tested for its internal functionality (component test); integration tests
examine the external functions
Scope:
- Integration tests examine the interaction of software components (subsystems) with each other:
  - interfaces with other components
  - interfaces among GUIs / MMIs
- Integration tests examine the interfaces with the system environment
Test cases may be derived from interface specifications, architectural design or data models
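As a hedged sketch of the idea: an integration test combines two already unit-tested components and exercises only the interface between them. The component names mirror the MP3-player/HMI example used later in these slides, but the API shown here is invented for illustration.

```python
# Two components, each assumed to have passed its own component test.
class Mp3Player:
    def __init__(self):
        self.tracks = ["intro.mp3", "song.mp3"]
        self.current = 0

    def next_track(self):
        # Advance to the next track and report its name
        self.current = (self.current + 1) % len(self.tracks)
        return self.tracks[self.current]


class Hmi:
    """Display layer; delegates playback control to the player."""

    def __init__(self, player):
        self.player = player
        self.display = ""

    def press_next(self):
        # Interface under test: HMI -> player call -> display update
        self.display = self.player.next_track()


# Integration test: does the HMI show what the player actually plays?
player = Mp3Player()
hmi = Hmi(player)
hmi.press_next()
assert hmi.display == player.tracks[player.current]
print("interface HMI <-> Mp3Player OK")
```

The internal correctness of each component is taken for granted here; the only thing checked is that data crossing the interface is consistent on both sides.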
Verification vs. Validation
Verification: proof of compliance with the stated requirements (definition after ISO 9000)
"Did we proceed correctly when building the system?"
Validation: proof of fitness for the expected use
"Did we build the right software system?"
Verification within the general V-Model
Each development level is verified against the contents of the level above it
- to verify: to give proof of evidence
- to verify: to check whether the requirements and definitions of the previous level were implemented correctly

[Figure: general V-model. Development levels on the left are verified against the test levels on the right:
  requirements definition   <-> acceptance test
  functional system design  <-> system test
  technical system design   <-> integration test
  component specification   <-> component test
with programming at the bottom; the legs denote development and integration, the horizontal links denote verification.]
System test (verification)
Testing the integrated software system to prove compliance with the specified requirements
- software quality is looked at from the user's point of view
System tests refer to:
- functional requirements: suitability, accuracy, interoperability, compliance, security
  ("what the system does")
- nonfunctional requirements: reliability, usability, efficiency, portability, maintainability
  ("how the system works")
Terminology
RELIABILITY: the ability of the software product to perform its required functions under stated conditions for a specified period of time or number of operations
USABILITY: testing to determine the extent to which the software product is understood, easy to learn and easy to operate
EFFICIENCY: the process of testing to determine the efficiency of a software product
MAINTAINABILITY: the process of testing to determine the maintainability of a software product
PORTABILITY: the process of testing to determine how easily the software product can be transferred from one hardware (or software) environment to another
SUITABILITY: the capability of a software product to provide an appropriate set of functions for specified tasks and user objectives
ACCURACY: the capability of a software product to provide the right or agreed results or effects with the needed degree of precision
COMPLIANCE: the ability of a software product to adhere to standards and conventions
INTEROPERABILITY: the ability of a software product to interact with one or more specified components or systems
SECURITY: attributes of a software product that bear on its ability to prevent unauthorized access to programs and data
System test
Test cases may be derived from:
- functional specifications
- use cases
- business processes
- risk assessments
Scope:
- test of the integrated system from the user's point of view
- the test environment should match the true environment: all external interfaces are tested under true conditions
- no tests in the real-life (productive) environment!
Validation within the general V-Model
Validation refers to the correctness of each development level
- to validate: to give proof of having value
- to validate: to check the appropriateness of the results of one development level

[Figure: the same general V-model as above (requirements definition / acceptance test, functional system design / system test, technical system design / integration test, component specification / component test, programming), now labeled with the validation relation in addition to verification.]
Testing process @ Continental
[Figure: test process flow. A discipline (Entertainment, Navi etc.) delivers to System Integration, which delivers to PVV (Product Verification and Validation). A passed test result leads to PRODUCTION; a failed one is fed back to Requirements / Architecture.]
Test design techniques
Deriving test cases from requirements
Designing test cases must be a controlled process:

[Figure: from the test object and the requirements on the test object, test requirements and test criteria are defined; the individual test cases are then derived from these.]

Test cases can be created in a formal or in an informal way, depending on the project constraints and on the maturity of the process in use.
Test design techniques
Traceability
Tests should be traceable: which test case was included in the test portfolio, based on which requirement?

[Figure: the same derivation flow as above, extended with test scenarios: from the test object and its requirements, test requirements and test criteria are defined; the derived test cases (test case 1, test case 2, ...) are grouped into test scenarios.]
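The traceability question above ("which test case, based on which requirement?") reduces to keeping a requirement ID on every test case. A minimal sketch, with invented requirement and test-case IDs:

```python
# Assumed data: each test case records the requirement it was derived from.
requirements = {"REQ-1": "Play MP3 files", "REQ-2": "Show track on HMI"}

test_cases = [
    {"id": "TC-1", "covers": "REQ-1"},
    {"id": "TC-2", "covers": "REQ-1"},
    {"id": "TC-3", "covers": "REQ-2"},
]

# Forward trace: requirement -> list of test cases derived from it
trace = {req: [] for req in requirements}
for tc in test_cases:
    trace[tc["covers"]].append(tc["id"])

# Coverage check: every requirement must map to at least one test case
uncovered = [req for req, tcs in trace.items() if not tcs]
assert not uncovered, f"untested requirements: {uncovered}"
print(trace)
```

The same mapping answers the reverse question (which requirement justified a given test case) by reading each test case's `covers` field directly.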
Test design techniques
Definitions
Test object: the subject to be examined (a document or a piece of software in the software development process)
Test condition: an item or an event: a function, a transaction, a quality criterion or an element in the system
Test criteria: the test object has to conform to the test criteria in order to pass the test
Test design techniques
Test case description according to IEEE 829:
- Input values: description of the input data for the test object
- Preconditions: situation prior to test execution, or characteristics of the test object before conducting the test case
- Expected results: output data that the test object is expected to produce
- Postconditions: characteristics of the test object after test execution; description of its situation after the test
- Dependencies: order of execution of test cases, and the reason for the dependencies
- Distinct identification: ID or key in order to link, for example, an error report to the test cases where it appeared
- Requirements: characteristics of the test object that the test case will examine
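The IEEE 829 fields listed above map naturally onto a record type. A hedged sketch: the field names follow the slide, but the example values (IDs, the volume-clamping requirement) are invented.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    identification: str      # distinct ID, e.g. to link error reports
    requirements: str        # characteristic of the test object under examination
    preconditions: str       # situation before test execution
    input_values: dict       # input data for the test object
    expected_results: dict   # output the test object must produce
    postconditions: str      # situation after test execution
    dependencies: list = field(default_factory=list)  # execution-order constraints

# Illustrative instance (all values hypothetical):
tc = TestCase(
    identification="TC-042",
    requirements="REQ-7: volume is clamped at its maximum",
    preconditions="system booted, volume at maximum",
    input_values={"action": "volume_up"},
    expected_results={"volume": 30},
    postconditions="volume unchanged at maximum",
    dependencies=["TC-041"],
)
print(tc.identification, tc.dependencies)
```

Keeping the identification and dependency fields machine-readable is what makes the traceability and execution-ordering described earlier practical.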
Test level
PVV (Product Verification and Validation) - Test Level Definition
To optimize test execution, the following test levels were defined:
- XXS (extreme small): Screening test -> handover test between PVV and SI/SW
- XS (extra small): Quick test -> to get a fast overview of all main functions/features
- S (small): Pre-release test -> check all functions and find major errors (feedback for SW development)
- M (medium): Release test (part 1) -> to find all serious errors
- L (large): Release test (part 2) -> to find all errors
Remarks: the M- and L-levels together form 100% of all test cases; all other test levels are subgroups of the complete set (e.g. S is a subgroup of M).
The complete release test can be divided into two parts (M- and L-level). This gives the opportunity to deliver the release to the customer with a finalized M-test first.
Note: it is not allowed to change the SW version throughout the whole release test run!
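The subgroup relation between the levels can be sketched as follows, assuming (as the remark suggests) that the levels nest: each test case is tagged with the smallest level that includes it, and a level then selects every case tagged at that level or smaller. The case names are invented.

```python
# Level order, smallest to largest; M+L covers the complete set.
ORDER = ["XXS", "XS", "S", "M", "L"]

# Assumed test portfolio: (name, smallest level that includes the case)
test_cases = [
    ("boot screen appears", "XXS"),
    ("all main features reachable", "XS"),
    ("every function responds", "S"),
    ("no serious errors", "M"),
    ("no errors at all", "L"),
]

def select(level):
    """All test cases that belong to `level` (tag at or below it)."""
    cutoff = ORDER.index(level)
    return [name for name, tag in test_cases if ORDER.index(tag) <= cutoff]

# S is a subgroup of M, and L selects the complete release test
assert set(select("S")) <= set(select("M"))
assert len(select("L")) == len(test_cases)
print(select("S"))
```

With this structure, delivering "a finalized M-test first" simply means executing `select("M")` before the remaining L-only cases.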
Types of tests
- Module tests (MP3 player, GPS, HMI) - smoke tests
- Integration tests (e.g. MP3 player vs. HMI)
- Verification tests
- Validation tests
- System requirements document?
- Improvements for the applications from the user's point of view?
Test suites
A test suite is a collection of test cases grouped based on one or more criteria
A system can have an unlimited number of test suites based on different criteria:
- a test suite for HMI tests
- a test suite for performance tests
- a test suite for Navigation tests
- a test suite for Entertainment tests
- etc.
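The grouping described above maps directly onto `unittest`'s suite mechanism. A minimal sketch: the HMI and Navigation areas come from the list above, but the test classes and their (placeholder) checks are invented.

```python
import unittest

class HmiTests(unittest.TestCase):
    def test_menu_opens(self):
        self.assertTrue(True)  # placeholder for a real HMI check

class NavigationTests(unittest.TestCase):
    def test_route_found(self):
        self.assertTrue(True)  # placeholder for a real routing check

loader = unittest.TestLoader()

# One suite per criterion; a test case may appear in several suites
hmi_suite = loader.loadTestsFromTestCase(HmiTests)
navi_suite = loader.loadTestsFromTestCase(NavigationTests)

# Suites compose: the full run is itself just a suite of suites
full_suite = unittest.TestSuite([hmi_suite, navi_suite])

result = unittest.TextTestRunner(verbosity=0).run(full_suite)
print("tests run:", result.testsRun)
```

Because suites nest, the same mechanism expresses both a single-area run (e.g. only `hmi_suite`) and the complete portfolio.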
Test plan - Test suite - Test package
Verification test - Examples
Validation test - Example
[Worked examples were shown as slide images; their content is not part of this transcript.]
Evaluation
- Create a system requirements document (1 p)
- Create a test plan (1 p)
- Improvements for the HMI, GPS, MP3 player and speech applications (1 p)
- Create 2 verification tests (1 p)
- Create 2 validation tests for each application (1 p)
- Free tests (EBTs - Experience Based Testing) on the HMI, GPS, MP3 player and speech applications (1 p)
- Find bugs