
INFO 636
Software Engineering Process I
Prof. Glenn Booker
Week 8 – Reviews
Reviews
• Conducting reviews of requirements,
design, and code is one of the best ways
to improve your work’s quality and your
productivity
• Here we’ll look at various types of reviews
and how to document them
Reviews
• Review types in descending order of
formality include
– Inspections
– Walk-throughs
– Personal reviews
Inspections
• Inspections follow a structured procedure
for evaluating a work product
– Fagan inspections are among the best-known forms of inspection
• Inspections start with preparation, where
each participant reviews the work
separately, and makes note of defects
found
Inspections
• Then there’s an inspection meeting
to discuss the findings of each participant,
and put together a cumulative list of
defects
• Then the work product owner fixes the
defects, and puts together a report to say
so, in the repair and report phase
Walk-throughs
• Walk-throughs require little preparation,
except by the work product owner
• A presentation is given, and participants
provide feedback during it
• Follow-up is informal, with the work
product owner responding to the
comments received
Personal reviews
• Personal review is the work product owner
reviewing their own work
• As compiling code has gotten trivially
easy, many programmers have
dropped reviewing their own work in the
hopes that the computer will find their
mistakes
– Not a good strategy!
Target of Reviews
• Any work product can be the subject of
reviews
– Any document
• Requirements specification
• Design models
• Test plans
• Internal project processes & procedures
– Source code
• Scripts too!
Commentary
• For those taking INFO 637, the Team
Software Process uses formal reviews
extensively, so pay extra attention!
• N track people - while the text obviously
focuses on reviews related to code, keep
in mind that these methods and tools for
reviews can be used to plan and conduct
reviews for anything
Why Review Software?
• The history of the PSP has shown that
most people
– Initially spend much of their time (30-50%) in
compiling and testing
– By the end of this course, spend only about 10% of their time in testing
• Good reviews are a key to reducing testing
time
Review Efficiency
• Finding and fixing defects is much faster to
do in review than in testing
– Humphrey found 8x faster fix time in review
than testing
• Code reviews are 3-5 times as efficient at finding defects as testing
– Part of the reason is that testing finds only the symptoms of a defect, which then have to be traced back to their cause by debugging
Severity of Review
• We don’t mean to imply that every piece of
code needs exhaustive review
• Different approaches can be used,
depending on the complexity, risk, and
importance of the code
– Hence you might use inspections for critical
code, walk-throughs for typical code, and just
personal review for low risk code
Review Principles
• Any kind of review process typically
follows three principles
– Establish defined review goals
– Follow a defined process for conducting
a review (here, we’ll use scripts)
– Measure and improve your review process
Separate Design and Code Reviews
• Design and code should be reviewed
separately
– Forces making a design before coding
– It’s hard to decipher design from the code
– Helps spot logic errors in design, and identify
design improvements
– Helps focus review scope
Design Reviews
• Make your design reviewable
– Follow a standard notation for design, such as
UML, DFD, ERD, etc.
– Make sure design addresses both functional
and non-functional requirements
– Follow personal design standards, hopefully
in concert with organizational standards
Design Reviews
• Follow a design review strategy
– Look at various elements of design
systematically – don’t try to assess it all at
once
• Design review strategy stages might
include
– Check for required program elements
Design Reviews
– Examine overall program structure and flow
– Check for logical completeness
– Check for robustness - handling errors, etc.
– Check parameters and types for methods and
procedure calls
– Check special variables, data types, and files,
including aliases
Design Reviews
• Check design against the requirements
• More elaborate inspections might use
– A traceability matrix to prove completeness (sketched below), or
– Formal methods (Z, Larch) to show correctness mathematically
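
As a rough illustration of the traceability-matrix idea (not from the text; the requirement IDs and design element names below are invented), a minimal Python sketch might look like this:

    # Hypothetical requirement-to-design traceability check.
    requirements = ["REQ-1", "REQ-2", "REQ-3"]

    # Traceability matrix: which design elements claim to satisfy each requirement
    trace = {
        "REQ-1": ["LoginForm", "AuthService"],
        "REQ-2": ["ReportGenerator"],
        "REQ-3": [],  # nothing traces here yet
    }

    # Completeness check: every requirement should map to at least one design element
    uncovered = [req for req in requirements if not trace.get(req)]
    if uncovered:
        print("Requirements with no design coverage:", uncovered)
    else:
        print("All requirements are covered by the design.")

A real inspection would keep this in a spreadsheet or requirements tool; the point is simply that every requirement must trace to at least one design element.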
Measuring Reviews
• Key basic measures for reviews are
– Size of product being reviewed (in pages or
LOC)
– The review time, in minutes
– The number of defects found
– And based on later work, the defects that
weren’t found by the review
Measuring Reviews
• Derived metrics for reviews are
– Review yield, the percent of defects found by
review
• Yield = 100*(defects found) /
(defects found + defects not found)
– Number of defects found per kLOC or page
– Number of defects found per hour of review
time
Measuring Reviews
– The number of LOC or pages reviewed
per hour
– Defect Removal Leverage (DRL)
• The ratio of defects removed per hour for any two
phases or activities
– DRL(coding) = Defects/hour(coding) / Defects/hour(design)
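
To make the arithmetic on the last two slides concrete, here is a small Python sketch of the derived metrics, using made-up review numbers (the function names and example values are mine, not from the text):

    def review_yield(found, escaped):
        """Percent of defects caught by the review (escaped = found later in test or use)."""
        return 100.0 * found / (found + escaped)

    def defects_per_kloc(found, loc_reviewed):
        return found / (loc_reviewed / 1000.0)

    def defects_per_hour(found, review_minutes):
        return found / (review_minutes / 60.0)

    def drl(defects_per_hour_a, defects_per_hour_b):
        """Defect Removal Leverage: defects removed per hour in phase A vs. phase B."""
        return defects_per_hour_a / defects_per_hour_b

    # Example: a 250 LOC code review takes 45 minutes and finds 6 defects;
    # 2 more defects in that code escape the review and show up in testing.
    print(review_yield(6, 2))          # 75.0   percent yield
    print(defects_per_kloc(6, 250))    # 24.0   defects per kLOC
    print(defects_per_hour(6, 45))     # 8.0    defects per review hour
    print(round(250 / (45 / 60.0)))    # ~333   LOC reviewed per hour

    # DRL example: review removes 8 defects/hour, unit testing removes 2 defects/hour,
    # so the review's leverage relative to testing is 4.
    print(drl(8.0, 2.0))               # 4.0

The same functions work for document reviews if you substitute pages for LOC.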
Checklists
• Checklists are used to help make sure
a process or procedure is followed
consistently each time
• A sample code review checklist for
C++ is on page 242; variations can
be developed for other languages
– It has several blank columns so each module
can be checked off separately
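
As a rough sketch of that idea (these items and module names are invented, not the page 242 checklist), one way to track checklist completion per module in Python:

    # Hypothetical code review checklist with one check-off column per module.
    checklist_items = [
        "Every variable is declared and initialized before use",
        "All loops have correct begin/end conditions",
        "Error returns are checked on every call",
    ]
    modules = ["ParseInput", "ComputeStats", "WriteReport"]

    # completed[item][module] is True once that item is checked for that module
    completed = {item: {m: False for m in modules} for item in checklist_items}

    # Mark one item done for one module, then report what is still open
    completed["All loops have correct begin/end conditions"]["ParseInput"] = True
    for item, per_module in completed.items():
        open_modules = [m for m, done in per_module.items() if not done]
        if open_modules:
            print(f"Still open for {open_modules}: {item}")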
Designing Checklists
• Checklists should be designed so that you
have to focus on only one topic at a time
– Similar to reviewing a book for grammar
versus plot development – it’s hard to look for
both at once
• To use a checklist most effectively, completely review one module at a time
Using Checklists
• Different strategies should be considered
for different types of reviews
– Design review for a large application often works best from the top down
– Code review often works better from the
bottom up for your code, but top down for
someone else’s
Building Checklists
• Don’t take the example on p. 242 as the
ultimate final perfect most-wonderful-of-all
checklist that ever was *breathe*
• Study the kinds of problems you encounter
(in your defect log) to see what you need
to emphasize in your checklist
– In other words, tailor the checklist for yourself
Building Checklists
• The types of defects are given on page
260 – again, consider adapting this to your
needs and other languages
• One way to look for your most common
types of defects is to lump all your defect
logs together, and generate a Pareto chart
by defect type
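
A minimal Python sketch of that idea, assuming the merged defect logs are already loaded as a list of defect-type names (the type names below are placeholders):

    from collections import Counter

    # Hypothetical merged defect logs: one defect-type entry per logged defect
    all_defects = (
        ["syntax", "interface", "assignment", "syntax", "function"]
        + ["syntax", "assignment", "checking", "syntax"]
    )

    # Pareto ordering: defect types sorted from most to least frequent
    counts = Counter(all_defects)
    total = sum(counts.values())
    cumulative = 0
    for defect_type, n in counts.most_common():
        cumulative += n
        print(f"{defect_type:12s} {n:3d}  cumulative {100*cumulative/total:5.1f}%")

Plotting the counts and the cumulative percentage as a bar-plus-line chart gives the usual Pareto diagram; the few types at the left are the ones to emphasize in your checklist.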
Building Checklists
• A refined defect type list is shown on page
262; you can use a Pareto diagram to
figure out which kinds of defects you need
to expand upon
• This also connects to the coding standard
developed ages ago – you can use
lessons learned from defect analysis to
help refine the coding standard
Review Before or After Compile?
• A contentious issue in PSP is whether
to review code before compile or after
– A non-issue in some languages, which aren’t
compiled!
• In Humphrey’s experience, about 9%
of all syntax errors aren’t caught by
a compiler, so don’t expect it to catch
everything
Reviews vs. Inspections
• As a matter of courtesy, make sure a
program or document is in pretty good
shape before submitting it for review
or inspection
– Very formal inspections might require code to
pass unit testing, and show test results as part
of the inspection
– Humphrey doesn’t like testing before
inspection, however
(P track) Report R4
• Report R4 (p. 771) analyzes the defects
from all the previous assignments
• Tasks are:
– Develop a process and scripts to create your
report
– Follow that process and show the completed
report
(P track) Report R4
• Sample contents of the report should
include, at a minimum
– An analysis of estimating accuracy for size
and time for the programs to date
– Analysis of defects injected and removed,
using table D23 as an example
– Analysis of defects found by the compiler (if any), à la table C24
(P track) Report R4
– Analysis of defect fix times, using table D22
again
– A design checklist developed for use during design review
– A code checklist developed for use during code review
– A discussion of the report's results, with improvement goals you set for yourself
(P track) Report R4
• Use graphs where possible, but don’t
forget to discuss the trends observed on
them
– A graph with no discussion is lonely 
• This report is the culmination of the PSP
1.x level of process, leading us to PSP 2