Verification and Validation


Verification and Validation
Lecture 25
Lecture 26
Software Engineering, COMP201
Slide 1
Verification and Validation
Assuring that a software system meets a user's needs
Objectives
• To introduce software verification and validation and to discuss the distinction between them
• To describe the program inspection process and its role in V & V
• To explain static analysis as a verification technique
• To describe the Cleanroom software development process
Verification vs validation
• Verification: "Are we building the product right?"
  The software should conform to its specification
• Validation: "Are we building the right product?"
  The software should do what the user really requires
The V & V process
• As a whole life-cycle process, V & V must be applied at each stage in the software process
• Has two principal objectives:
  - The discovery of defects in a system
  - The assessment of whether or not the system is usable in an operational situation
Static and dynamic verification

Software inspections Concerned with analysis
of the static system representation to discover
problems (static verification)
•

May be supplement by tool-based document and code analysis
Software testing Concerned with exercising
and observing product behaviour
(dynamic verification)
•
The system is executed with test data and its operational
behaviour is observed
Static and dynamic V&V
[Diagram: static verification applies to the requirements specification, high-level design, formal specification, detailed design and program; dynamic validation (testing) applies to the program and to prototypes.]
Program testing
• Can reveal the presence of errors, NOT their absence!
• A successful test is a test which discovers one or more errors
• The only validation technique for non-functional requirements
• Should be used in conjunction with static verification to provide full V&V coverage
Types of testing
• Defect testing
  - Tests designed to discover system defects
  - A successful defect test is one which reveals the presence of defects in a system
• Statistical testing
  - Tests designed to reflect the frequency of user inputs; used for reliability estimation
V & V goals
Verification and validation should establish confidence that the software is fit for purpose
• This does NOT mean completely free of defects
• Rather, it must be good enough for its intended use, and the type of use will determine the degree of confidence that is needed
V & V confidence
Depends on system’s purpose, user
expectations and marketing environment
•
Software function
» The level of confidence depends on how critical the software is to
an organisation
•
User expectations
» Users may have low expectations of certain kinds of software.
» Now it is less acceptable to deliver unreliable systems, so software
companies must devote more effort to V&V!
•
Marketing environment
» Getting a product to market early may be more important than
finding defects in the program
Testing and debugging
Defect testing and debugging are distinct processes:
• Verification and validation is concerned with establishing the existence of defects in a program
• Debugging is concerned with locating and repairing these errors
• Debugging involves:
  - formulating a hypothesis about program behaviour
  - testing this hypothesis to find the system error
The debugging process
[Diagram: test results → locate error → design error repair → repair error → re-test program, drawing on the specification and the test cases.]
V & V planning
• Careful planning is required to get the most out of testing and inspection processes
• Planning should start early in the development process
• The plan should identify the balance between static verification and testing
• Test planning is about defining standards for the testing process rather than describing product tests
The V-model of development
[Diagram: the V-model — the requirements specification, system specification, system design and detailed design each yield a test plan (acceptance test plan, system integration test plan, sub-system integration test plan); these plans drive module and unit code and test, sub-system integration test, system integration test and acceptance test, leading into service.]
This diagram shows how test plans should be derived from the system specification and design.
The structure of a software test plan
• The testing process (a description of the major phases)
• Requirements traceability (testing should be planned so that all requirements are individually tested)
• Tested items
• Testing schedule
• Test recording procedures (it is not enough simply to run tests; the results must be systematically recorded)
• Hardware and software requirements
• Constraints
Software inspections
• Involve people examining the source representation with the aim of discovering anomalies and defects
• Do not require execution of a system, so may be used before implementation
• May be applied to any representation of the system (requirements, design, test data, etc.)
• A very effective technique for discovering errors
Inspection success
• Many different defects may be discovered in a single inspection. In testing, one defect may mask another, so several executions are required
• Inspections reuse domain and programming knowledge, so reviewers are likely to have seen the types of error that commonly arise
Inspections and testing
• Inspections and testing are complementary, not opposing, verification techniques
• Both should be used during the V & V process
• Inspections can check conformance with a specification, but not conformance with the customer's real requirements
• Inspections also cannot check non-functional characteristics such as performance, usability, etc.
Program inspections
Reviews whose objective is program defect detection
• A formalised approach to document reviews
• Intended explicitly for defect DETECTION (not correction)
• Defects may be logical errors, anomalies in the code that might indicate an erroneous condition (e.g. an uninitialised variable), or non-compliance with standards
Inspection pre-conditions
• A precise specification must be available
• Team members must be familiar with the organisation's standards
• Syntactically correct code must be available
• An error checklist should be prepared
• Management must accept that inspection will increase costs early in the software process
• Management must not use inspections for staff appraisal
The inspection process
[Diagram: planning → overview → individual preparation → inspection meeting → rework → follow-up.]
Inspection procedure
• System overview is presented to the inspection team
• Code and associated documents are distributed to the inspection team in advance
• Inspection takes place and discovered errors are noted
• Modifications are made to repair discovered errors
• Re-inspection may or may not be required
Inspection teams
• Made up of at least 4 members:
  - Author of the code being inspected
  - Inspector who finds errors, omissions and inconsistencies
  - Reader who reads the code to the team
  - Moderator who chairs the meeting and notes discovered errors
• Other roles are Scribe and Chief moderator
Inspection checklists
• A checklist of common errors should be used to drive the inspection
• The error checklist is programming-language dependent
• The 'weaker' the type checking, the larger the checklist
• Examples: initialisation, constant naming, loop termination, array bounds, etc.
Inspection checks

Data faults:
• Are all program variables initialised before their values are used?
• Have all constants been named?
• Should the lower bound of arrays be 0, 1, or something else?
• Should the upper bound of arrays be equal to the size of the array or Size - 1?
• If character strings are used, is a delimiter explicitly assigned?

Control faults:
• For each conditional statement, is the condition correct?
• Is each loop certain to terminate?
• Are compound statements correctly bracketed?
• In case statements, are all possible cases accounted for?

Input/output faults:
• Are all input variables used?
• Are all output variables assigned a value before they are output?

Interface faults:
• Do all function and procedure calls have the correct number of parameters?
• Do formal and actual parameter types match?
• Are the parameters in the right order?
• If components access shared memory, do they have the same model of the shared memory structure?

Storage management faults:
• If a linked structure is modified, have all links been correctly reassigned?
• If dynamic storage is used, has space been allocated correctly?
• Is space explicitly de-allocated after it is no longer required?

Exception management faults:
• Have all possible error conditions been taken into account?
Inspection rate
• 500 statements/hour during overview
• 125 source statements/hour during individual preparation
• 90-125 statements/hour can be inspected during the meeting
• Inspection is therefore an expensive process
• Inspecting 500 lines costs about 40 person-hours of effort = £2800
Automated static analysis
• Static analysers are software tools for source text processing
• They parse the program text and try to discover potentially erroneous conditions, bringing these to the attention of the V & V team
• Very effective as an aid to inspections: a supplement to, but not a replacement for, inspections
Static analysis checks

Data faults:
• Variables used before initialisation
• Variables declared but never used
• Variables assigned twice but never used between assignments
• Possible array bound violations
• Undeclared variables

Control faults:
• Unreachable code
• Unconditional branches into loops

Input/output faults:
• Variables output twice with no intervening assignment

Interface faults:
• Parameter type mismatches
• Parameter number mismatches
• Non-usage of the results of functions
• Uncalled functions and procedures

Storage management faults:
• Unassigned pointers
• Pointer arithmetic
Stages of static analysis
• Control flow analysis: checks for loops with multiple exit or entry points, finds unreachable code, etc.
• Data use analysis: detects uninitialised variables, variables written twice without an intervening assignment, variables which are declared but never used, etc.
• Interface analysis: checks the consistency of routine and procedure declarations and their use
Stages of static analysis (continued)
• Information flow analysis: identifies the dependencies of output variables. Does not detect anomalies itself, but highlights information for code inspection or review
• Path analysis: identifies paths through the program and sets out the statements executed in each path. Again, potentially useful in the review process
• Both these stages generate vast amounts of information and must be used with care
LINT static analysis

138% more lint_ex.c
#include <stdio.h>
printarray (Anarray)
int Anarray;
{
  printf("%d", Anarray);
}

main ()
{
  int Anarray[5]; int i; char c;
  printarray (Anarray, i, c);
  printarray (Anarray);
}

139% cc lint_ex.c
140% lint lint_ex.c
lint_ex.c(10): warning: c may be used before set
lint_ex.c(10): warning: i may be used before set
printarray: variable # of args. lint_ex.c(4) :: lint_ex.c(10)
printarray, arg. 1 used inconsistently lint_ex.c(4) :: lint_ex.c(10)
printarray, arg. 1 used inconsistently lint_ex.c(4) :: lint_ex.c(11)
printf returns value which is always ignored

(The faults in lint_ex.c are deliberate: the old-style C compiler accepts the program without complaint, but lint reports the inconsistent calls and the use of unset variables.)
Use of static analysis
• Particularly valuable when a language such as C is used, which has weak typing and hence many errors go undetected by the compiler
• Less cost-effective for languages like Java that have strong type checking and can therefore detect many errors during compilation
Cleanroom software development
• The name is derived from the 'Cleanroom' process in semiconductor fabrication; the philosophy is defect avoidance rather than defect removal
• A software development process based on:
  - Incremental development
  - Formal specification
  - Static verification using correctness arguments
  - Statistical testing to determine program reliability
The Cleanroom process
[Diagram: formally specify system → define software increments; each increment is constructed as a structured program and formally verified (with error rework), while an operational profile is developed and statistical tests are designed; finally the increment is integrated and the integrated system is tested.]
Cleanroom process characteristics
• Formal specification using a state transition model
• Incremental development
• Structured programming: only limited control and abstraction constructs are used
• Static verification using rigorous inspections
• Statistical testing of the system
Incremental development
[Diagram: establish requirements → formal specification (frozen for each increment) → develop software increment → deliver software; requirements change requests feed back into establishing requirements.]
Formal specification and inspections
• The state-based model is a system specification, and the inspection process checks the program against this model
• The programming approach is defined so that the correspondence between the model and the system is clear
• Mathematical arguments (not proofs) are used to increase confidence in the inspection process
Cleanroom process teams
• Specification team: responsible for developing and maintaining the system specification
• Development team: responsible for developing and verifying the software. The software is NOT executed or even compiled during this process
• Certification team: responsible for developing a set of statistical tests to exercise the software after development. Reliability growth models are used to determine when reliability is acceptable
Cleanroom process evaluation
• Results at IBM have been very impressive, with few discovered faults in delivered systems
• Independent assessment shows that the process is no more expensive than other approaches
• Fewer errors arise than in a 'traditional' development process
• It is not clear how this approach can be transferred to an environment with less skilled or less highly motivated engineers
Key points
• Verification and validation are not the same thing. Verification shows conformance with the specification; validation shows that the program meets the customer's needs
• Test plans should be drawn up to guide the testing process
• Static verification techniques involve examination and analysis of the program for error detection
Key points (continued)
• Program inspections are very effective in discovering errors
• In inspections, program code is checked by a small team to locate software faults
• Static analysis tools can discover program anomalies which may be an indication of faults in the code
• The Cleanroom development process depends on incremental development, static verification and statistical testing