Analysis of Defect (and other) Data


OXFORD SOFTWARE ENGINEERING
Software Engineering Services & Consultancy
Analysis of Defect (and other) Data
SPIN London, February 2006
C. C. Shelley
OXFORD SOFTWARE ENGINEERING Ltd
9 Spinners Court, 53 West End,
Witney,
Oxfordshire
OX28 1NH
www.osel.co.uk
[email protected]
Tel. +44 (0) 1993 700878
© OSEL 2005
Slide 1 of 30
Contents:
•	Uses of Defect Data
•	Statistical Process Control
•	Software Development…
•	…Compared to Production
•	Analytical techniques for software development
•	Additional data from inspections?
Slide 2 of 30
Uses of Defect Data:
•	Defect counts derived from QC activity and combined with other data are an irresistible subject for analysis
•	Radice (2000) identified a minimal set of data types:
	– LOC (or other size measure)
	– Defects
	– Engineer months
	– Calendar months
	– Cost
	– Test progress
•	Can investigate: rework, schedule, quality, predicted quality, working practices…
•	Defect data are software engineering’s ‘lab rat’
Slide 3 of 30
Statistical Process Control1:…
•	SPC is a set of techniques that has been promoted for use on software processes
•	It is attractive to software process engineers
	– It’s an engineer’s analytical tool – legitimises software development as ‘engineering’
	– The data appear to be appropriate
•	Control chart types
	– Two principal families of control charts – for variables and for attributes
	– Neither is appropriate ‘as is’ for software – which can lead to misuse and ad hoc modifications
•	Raises numerous questions that must be answered to avoid ‘jumping to conclusions’…
1 – Many techniques, but centred on run and control charts.
Slide 4 of 30
…Statistical Process Control:
•	For example:
	– What type of control chart should be used?
	– What types or classes of QC should be included?
	– What types or classes of artefacts can be included?
	– What are the allowable relationships between artefacts?
	– What determines the sequence of the plots?
	– What evidence is required to demonstrate a parametric distribution?
	– What is required to demonstrate a linear relationship for normalized data?
	– What significance have the +/- 3 sigma control limits for software?
	– What do deviations from the ‘norm’ mean?
	– etc…
	– What is the control chart for?
•	Difficult to answer, because software is the product of a design process that only approximates to a manufacturing/replication process – the data does not behave.
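The +/- 3 sigma question above can be made concrete. For a c chart (defect counts per equal-sized inspection unit, assumed Poisson-distributed), the limits are c-bar +/- 3*sqrt(c-bar). A minimal sketch in Python, with illustrative counts that are not from the talk:

```python
import math

def c_chart_limits(defect_counts):
    """Centre line and +/-3-sigma limits for a c chart.

    Assumes each count comes from an equal-sized inspection unit and
    that counts are Poisson distributed - exactly the assumptions that
    are hard to justify for software inspection data.
    """
    c_bar = sum(defect_counts) / len(defect_counts)
    sigma = math.sqrt(c_bar)           # Poisson: variance equals mean
    ucl = c_bar + 3 * sigma
    lcl = max(0.0, c_bar - 3 * sigma)  # a count cannot fall below zero
    return lcl, c_bar, ucl

# Hypothetical defect counts from eight inspections
counts = [12, 9, 15, 11, 8, 14, 10, 9]
lcl, c_bar, ucl = c_chart_limits(counts)
out_of_control = [c for c in counts if c < lcl or c > ucl]
```

Note how wide the limits come out: every one of these counts falls inside them, so the chart would declare the process "in control" while saying nothing about the differences between inspections – which is the point of the questions above.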
Slide 5 of 30
Software Development:
•	Development (and inspection) of a software artefact is a unique event
	– Different people, skills, and priorities
•	The artefact is unique too
	– Variation in size and complexity can affect relative defect levels
•	There are similarities of course – but it is fundamentally different to hardware replication
	– E.g. ‘norms’ and conformance are not the essence of software development
•	To reiterate – what does a deviation from the ‘norm’ mean?
	– Interesting
	– Prompts investigation
	– Exploratory data – not control data
Slide 6 of 30
Production Data and Development Data:
•	Software development data (including defect counts) can be subjected to SPC – but should it be?
•	Process control has particular needs
	– But control data is specialized, and conceals more than it reveals
•	Process control data has particular characteristics and constraints
	– It must be well characterized
	– This needs to be demonstrated – no jumping to conclusions
	– (and it usually cannot be)
•	So what analyses should be used?
Slide 7 of 30
Analytical Techniques for Software and Software
Development:
•	Well known
•	Simple
•	Robust
•	Look to Production Engineering and TQC
Slide 8 of 30
TQC:
•	Seven Tools:
	– Process chart
	– Pareto analysis
	– Ishikawa diagram
	– Histogram
	– Scatter diagram
	– Control chart
	– Check sheets
•	In particular – the Histogram and the Scatter diagram
	– Familiar, and often overlooked
	– Tukey’s Exploratory Data Analysis
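Of the seven tools, Pareto analysis is perhaps the simplest to apply to defect data: rank defect categories by count and find the few categories that account for most of the defects. A sketch with hypothetical category names and counts:

```python
from itertools import accumulate

def pareto(defects_by_category):
    """Return categories sorted by defect count (descending) with
    cumulative percentages - the basis of a Pareto chart."""
    ranked = sorted(defects_by_category.items(),
                    key=lambda kv: kv[1], reverse=True)
    total = sum(defects_by_category.values())
    cum = accumulate(count for _, count in ranked)
    return [(name, count, round(100 * c / total, 1))
            for (name, count), c in zip(ranked, cum)]

# Hypothetical inspection defect categories
counts = {"interface": 44, "logic": 27, "data handling": 15,
          "documentation": 9, "standards": 5}
for name, count, cum_pct in pareto(counts):
    print(f"{name:14s} {count:3d}  {cum_pct:5.1f}%")
```

Here the top two categories account for over 70% of the defects – a direct pointer to where investigation effort should go first.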
Slide 9 of 30
Compare the use of control charts and histograms:
•	Control chart:
	– Establish criteria for inclusion of data
	– Normalized, parametric data
	– Select chart type (standardized c chart?)
	– Calculate sample size to set control limits (+/- 3 sigma?)
	– Revise control limits?
	– Plot data
	– Then…?
•	Histogram and Scattergram:
	– Plot the number of defects found for each artefact inspected
	– Plot a scattergram of artefact size against number of defects
•	The histogram and scattergram are simpler, less error-prone and more useful
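The histogram route needs almost no machinery at all. A minimal text sketch (hypothetical artefact data; real use would draw the same counts with any charting tool):

```python
def text_histogram(defects_per_artefact):
    """Frequency of each defect count across inspected artefacts,
    drawn as a crude text histogram."""
    freq = {}
    for d in defects_per_artefact:
        freq[d] = freq.get(d, 0) + 1
    return {d: "#" * n for d, n in sorted(freq.items())}

# Hypothetical: defects found in each of ten inspected artefacts
defects = [2, 3, 3, 4, 4, 4, 5, 5, 7, 12]
for count, bar in text_histogram(defects).items():
    print(f"{count:3d} | {bar}")
```

The artefact with 12 defects stands out immediately – as an invitation to investigate, not as a control violation.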
Slide 10 of 30
Further graphical analyses:
•	Box plots
	– For groups of data
•	Other possibilities – e.g. Jeremy Dick’s fault grid
	– Defect ‘containment’
•	Integrate symbolic diagrams and metrics to make ‘pictures’
	– Draw planned as well as display actual
	– And animate them?
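A box plot is simply the five-number summary, drawn. It can be computed from the standard library alone; the quartile convention below is an assumption, since conventions differ, and the densities are illustrative:

```python
import statistics

def five_number_summary(data):
    """Min, lower quartile, median, upper quartile, max - the values a
    box plot draws. Uses statistics.quantiles' 'exclusive' convention."""
    q1, q2, q3 = statistics.quantiles(data, n=4)
    return min(data), q1, q2, q3, max(data)

# Hypothetical defect densities (defects/KLOC) for one group of items
densities = [1.5, 1.7, 2.5, 2.8, 3.0, 3.2, 3.6, 3.8, 4.1]
print(five_number_summary(densities))
```

Drawing one box per group (per team, per phase, per release) shows spread and outliers at a glance, without any distributional assumptions.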
Slide 11 of 30
Additional Data from Inspections?:
•	Inspections are the primary software quality control
•	Derived from production engineering
•	But software development is not replication – it is design
•	And inspection is an aware process performed by intelligent people
•	Identify, record and analyse design excellence too?
•	Potential is not bounded by a lower limit
Slide 12 of 30
Envoi:
•	Software ‘Engineering’ is (or will become) an engineering discipline2
•	It has a debt to other engineering disciplines
•	This will be repaid when it is recognized that it is a design discipline, not a manufacturing (replication) discipline – and develops its analytical tools accordingly
2 – When we stop reinventing, acknowledge prior work, and build on it.
Slide 13 of 30
[Figure: measurement model, from Pfleeger 1998 – an empirical relational system in the real world is mapped by measurement to a formal relational system in the mathematical world; mathematics and statistics produce results, whose interpretation yields relevant empirical information for decisions and actions.]
Slide 17 of 30
        I              II             III            IV
    x      y       x      y       x      y       x      y
  10.00   8.04   10.00   9.14   10.00   7.46    8.00   6.58
   8.00   6.95    8.00   8.14    8.00   6.77    8.00   5.76
  13.00   7.58   13.00   8.74   13.00  12.74    8.00   7.71
   9.00   8.81    9.00   8.77    9.00   7.11    8.00   8.84
  11.00   8.33   11.00   9.26   11.00   7.81    8.00   8.47
  14.00   9.96   14.00   8.10   14.00   8.84    8.00   7.04
   6.00   7.24    6.00   6.13    6.00   6.08    8.00   5.25
   4.00   4.26    4.00   3.10    4.00   5.39   19.00  12.50
  12.00  10.84   12.00   9.13   12.00   8.15    8.00   5.56
   7.00   4.82    7.00   7.26    7.00   6.42    8.00   7.91
   5.00   5.68    5.00   4.74    5.00   5.73    8.00   6.89
‘Anscombe’s quartet’
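The quartet's point is easy to verify: all four sets share (to two decimal places) the same means, variances and x–y correlation, yet plot completely differently. Checking with the tabulated data:

```python
from statistics import mean, variance

# Anscombe's four (x, y) data sets, as tabulated above;
# sets I-III share the same x values.
x123 = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
x4 = [8, 8, 8, 8, 8, 8, 8, 19, 8, 8, 8]
y1 = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]
y2 = [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]
y3 = [7.46, 6.77, 12.74, 7.11, 7.81, 8.84, 6.08, 5.39, 8.15, 6.42, 5.73]
y4 = [6.58, 5.76, 7.71, 8.84, 8.47, 7.04, 5.25, 12.50, 5.56, 7.91, 6.89]

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed directly."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# Each set: mean x = 9, mean y ~ 7.50, sample variance of x = 11, r ~ 0.82
for xs, ys in [(x123, y1), (x123, y2), (x123, y3), (x4, y4)]:
    print(mean(xs), round(mean(ys), 2), variance(xs),
          round(pearson_r(xs, ys), 2))
```

Which is why the summary statistics alone, like a control chart's limits, conceal more than they reveal: only the plots show what is actually going on.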
Slide 19 of 30
[Figure: the four Anscombe data sets plotted as scatter diagrams, panels I–IV – identical summary statistics, radically different shapes.]
Slide 20 of 30
[Figure: worked example of the simple plots – a histogram of defect counts per item, a scatter diagram of item size against defects, and a plot of defect density per item.]
Slide 22 of 30
[Figure: defect ‘containment’ fault grid – rows for Requirements, Architectural Design, Detailed Design, Code, Unit Test, Integration Test, System Test, and Acceptance & Warranty, showing counts of defects introduced in each phase against the phase in which they were found.]
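A fault grid like this can be reduced to one number per phase: containment effectiveness – of the defects introduced in a phase, what fraction was found before escaping to later phases. A sketch with hypothetical counts (not the grid's actual values):

```python
def containment_effectiveness(found_in_phase, escaped_from_phase):
    """Phase containment effectiveness: defects found in the phase that
    introduced them, as a fraction of all defects introduced there."""
    introduced = found_in_phase + escaped_from_phase
    return found_in_phase / introduced if introduced else 0.0

# Hypothetical: per phase, (found in that phase, escaped to later phases)
phases = {
    "Requirements": (7, 21),
    "Design": (16, 24),
    "Code": (40, 10),
}
for phase, (found, escaped) in phases.items():
    print(phase, round(containment_effectiveness(found, escaped), 2))
```

Low containment in early phases is the classic signal that requirements and design QC need attention, since those escapes are the most expensive to find later.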
Slide 24 of 30
[Figure: a ‘picture’ of a design as a diagram of named functions – validate transaction, initialize, read structure, set flag, re-estimate value, reallocate, import history, convert history, assign owner, reformat history, respond, check user, align with client, recalculate structure, clear errors, validate history, set privilege, clear field, display summary, flag errors.]
Slide 26 of 30
[Figure: the same function diagram, overlaid with fragments of the implementing source code. The repeated Java fragment, cleaned up:]

	import java.util.*;

	class ClockTalk {
	    public static void main(String[] arguments) {
	        // get current time and date
	        Calendar now = Calendar.getInstance();
	        int hour = now.get(Calendar.HOUR_OF_DAY);
	        int minute = now.get(Calendar.MINUTE);
	        int month = now.get(Calendar.MONTH) + 1;
	        int day = now.get(Calendar.DAY_OF_MONTH);
	        int year = now.get(Calendar.YEAR);
	        // display greeting
	        System.out.println("…    // string truncated on the slide
	        if (hour < 12) …         // remainder truncated on the slide

Slide 27 of 30
•	Validating Measurements
	– do the data reveal the truth?
	– is the representation accurate?
	– are the data carefully documented?
	– do the methods of display avoid spurious reading of the data?
	– are appropriate contexts and comparisons shown?
	– From Tufte
Slide 29 of 30