SE 532 Software Quality Management


Team Software Project (TSP)
June 26, 2007
System Test
Outline
Remaining Session Plan & Discussion
System Test Plan Discussion
Mythical Man Month
System Test Plan Recap
Metrics Presentations
More on Measurement
Next Phases
Cycle 1 Test
Cycle 1 Post-Mortem & Presentations
Cycle 2 Plan & Strategy
Due Today
Key Metrics Presentation (10-15 minutes)
All Implementation Quality Records (LOGD, CCRs, etc.)
Final code (source & executable)
Updated Products (code components, SRS, HLD, User Documentation)
Intermediate Products (e.g. Unit Test Plans)
Configuration Management Plan
Release CD:
Application
User Guide
Release Letter
No class on July 3
Project Performance Discussion
Remaining Lectures Plan/Discussion
July 10 – Cycle 1 Test Complete & Post-Mortem
Cycle 1 Results Presentation & Discussion
Cycle 1 Reports & Post-Mortem
Measurement
Team audit
July 17 – Cycle 2 Launch
Cycle 2 Launch, Project & Measurement Planning
Peopleware Topics: Management, Teams, Open Kimono, Quality, Hiring/Morale, …
July 24 – Cycle 2 Requirements Complete
Cycle 2 Requirements
Death March Projects:
July 31 – Cycle 2 Implementation Complete
System Test Plan Baselined
Cycle 2 Design & Implementation
Process topics – CMMI, TL-9000, ISO
August 7 – Cycle 2 Test Complete
Cycle 2 Test Complete
Cycle 2 Post-Mortem Complete
August 14 - Course Review
Course Review
Class exercise
Final
Remaining Course Topics Discussion
System Test Schedule
Note: Assumes system has already passed Integration Test
Full feature set delivered to system test and instructor by COB June 25, including:
Test environment
Executable
User documentation (note: CCRs can be filed against user documentation)
Source code
Tester generates CCRs for all finds & fills out LOGTEST
Email to instructor when generated (see below)
Development team updates LOGD referencing CCRs
Required turn-around times for fixes (see the sketch after this list)
80% within 24 hours
99% within 48 hours
Required test coverage (barring blocking issues)
80% First Pass Test Complete by June 28
100% First Pass Test Complete by July 1
Regression Test Complete by July 3
Daily test reports to instructor detailing test cases executed, results & CCRs
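A minimal sketch of how a team might check the fix turn-around targets above (Python; function and variable names are illustrative, and fix durations are assumed to be tracked in hours):

```python
# Illustrative sketch only; fix durations assumed to be tracked in hours.
def turnaround_compliance(fix_hours: list[float]) -> dict[str, float]:
    """Fraction of CCR fixes closed within 24 and 48 hours."""
    total = len(fix_hours)
    if total == 0:
        return {"within_24h": 1.0, "within_48h": 1.0}
    return {
        "within_24h": sum(h <= 24 for h in fix_hours) / total,
        "within_48h": sum(h <= 48 for h in fix_hours) / total,
    }

def meets_targets(fix_hours: list[float]) -> bool:
    """Targets above: 80% of fixes within 24 hours, 99% within 48 hours."""
    c = turnaround_compliance(fix_hours)
    return c["within_24h"] >= 0.80 and c["within_48h"] >= 0.99

print(meets_targets([4, 12, 30, 20, 8]))  # True: 4 of 5 within 24h, all within 48h
```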
System Test Plan Recap
Areas to cover:
Installation
Start-up
All required functions available & working as specified
Diabolical cases (e.g. power failures, corner cases, handling of incorrect input)
Performance
Usability
Includes:
Test cases you plan to run (numbered / named)
Expected results
Ordering of testing & dependencies
Supporting materials needed
Traceability to requirements
Release “Letters”
Purpose
What’s in it?
– Version Information
– Release contents
Examples:
• All functionality defined in Change Counter Requirements v0.6 except GUI
• Phase 1 features as defined in project plan x.y
• Feature 1, Feature 2, Feature 3 as defined by …
– Known Problems
• Change Request IDs w/ brief customer oriented description
– Fixed Problems
– Upgrade Information
– Other?
Implementation Status
Implementation experience
Unit/Integration experience
Problems / Rework?
PIP forms
Team Presentation
Project Measurement
Source: Practical Software Measurement
John McGarry et al.
Measurement
“If you can’t measure it, you can’t manage it”
Tom DeMarco
Fundamentals
Don’t try to measure everything
Align measures with:
Project goals & risks (basic survival mode)
Process improvement areas (continual improvement mode)
Define measurement program up front
Monitor continuously & take action where needed
Applications
Improve accuracy of size & cost estimates
Improve quality
Understand project status
Produce more predictable schedules
Improve organizational communication
Faster, better informed management decisions
Improve software processes
Basic In-Process Measurement Examples
Schedule
Earned Value vs. Planned Value
Schedule Variance
Development
Task completion
Actual code completed vs. planned
Project End Game
Defect Creation vs. Closure
Variations: severity
System Test
% Testing Complete
Variations: passed, failed, blocked
Test Time / Defect
Test Coverage (vs. requirements, white box code coverage)
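As an illustration of two of the in-process measures above, a small sketch (Python; names and data are illustrative, not course data):

```python
# Illustrative sketch; EV and PV are assumed to be in the same units (e.g. task-hours).
def schedule_variance(earned_value: float, planned_value: float) -> float:
    """SV = EV - PV; negative means behind schedule."""
    return earned_value - planned_value

def test_status(passed: int, failed: int, blocked: int, total_cases: int) -> dict[str, float]:
    """% of planned system-test cases passed, failed, and blocked."""
    return {
        "passed": 100.0 * passed / total_cases,
        "failed": 100.0 * failed / total_cases,
        "blocked": 100.0 * blocked / total_cases,
    }

print(schedule_variance(earned_value=42.0, planned_value=50.0))     # -8.0 (behind plan)
print(test_status(passed=30, failed=5, blocked=2, total_cases=50))  # 60 / 10 / 4 percent
```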
Process Improvement Measurement Examples
Quality
Defect density
Post Deployment defect density
Inspection Effectiveness
Defects / inspection hour
Estimation Accuracy
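A hedged sketch of how these process-improvement measures are typically computed (function names and sample values are illustrative, not from the course project):

```python
# Illustrative formulas only.
def defect_density(defects: int, size_kloc: float) -> float:
    """Defects per KLOC."""
    return defects / size_kloc

def inspection_effectiveness(defects_found: int, inspection_hours: float) -> float:
    """Defects found per inspection hour."""
    return defects_found / inspection_hours

def estimation_error(actual: float, estimated: float) -> float:
    """Relative estimation error: 0.0 is perfect, 0.25 means 25% over the estimate."""
    return (actual - estimated) / estimated

print(defect_density(18, 4.5))            # 4.0 defects/KLOC
print(inspection_effectiveness(9, 6.0))   # 1.5 defects/hour
print(estimation_error(125.0, 100.0))     # 0.25
```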
Why Measure?
Support short & long term decision making
A mature software organization (what CMMI level?) uses measurement to:
Plan & evaluate proposed projects
Objectively track actual performance against plan
Guide process improvement decisions
Assess business & technical performance
Organizations need the right kind of information, at the right time, to make the right decisions
Measurement in Software Lifecycle
Plan – identify & plan the change
Do – carry out change
Check – observe effects of change
Act – decide on additional areas for improvement
Repeat
Considerations: Cost, schedule, capability, quality
Measurement Psychological Effects
Using measurement as a measure of individual performance
Hawthorne Effect
Measurement Errors
Conscious: rounding, pencil whipping (i.e. false data entry)
Unintentional: inadvertent, technique (i.e. consistent)
Use of Measures
Process Measures – time oriented, includes defect levels, events & cost elements
Used to improve software development & maintenance process
Product Measures – deliverables & artifacts such as documents
includes size, complexity, design features, performance & quality levels
Project Measures – project characteristics and execution
includes # of developers, cost, schedule, productivity
Resource Measures – resource utilization
includes training, costs, speed & ergonomic data
Glossary
Entity - object or event (e.g. personnel, materials, tools & methods)
Attribute - feature of an entity (e.g. # LOC inspected, # defects found, inspection time)
Measurement - numbers and symbols assigned to attributes to describe them
Measure – quantitative assessment of a product/process attribute (e.g. defect density, test pass rate, cyclomatic complexity)
Measurement Reliability – consistency of measurements assuming no change to method/subject
Software validity – proof that the software is trouble-free & functions correctly (i.e. high quality)
Predictive validity – accuracy of model estimates
Measurement errors – systematic (associated with validity) & random (associated w/ reliability)
Software Metrics – approach to measuring some attribute
Defect – product anomaly
Failure – termination of product’s ability to perform a required function
PSM Measurement Process
Measurement Plan
Information need – e.g.:
What is the quality of the product?
Are we on schedule?
Are we within budget?
How productive is the team?
Measurable Concept
Measured entities to satisfy need (abstract level: e.g. productivity)
Measurement Construct
What will be measured? How will data be combined? (e.g. size, effort)
Measurement Procedure
Defines mechanics for collecting and organizing data
Perform Measurement
Evaluate Measurement
Measurement Construct
[Measurement construct diagram: each Attribute is quantified by a Measurement Method into a Base Measure; Base Measures are combined by a Measurement Function into Derived Measures; Derived Measures are evaluated by an Analysis Model against Decision Criteria to produce an Indicator]
Attribute
[Measurement construct diagram, Attribute level highlighted]
Distinguishable property or characteristic of a software entity
(Entities: processes, products, projects and resources)
Qualitative or Quantitative measure
Base Measure
[Measurement construct diagram, Base Measure level highlighted]
Measure of an attribute (one to one relationship)
Measurement method
Attribute quantification with respect to a scale
Method type
Subjective (e.g. high, medium, low), Objective (e.g. KLOC)
Scale
Ratio
Interval
Ordinal
Nominal
Unit of measurement
e.g. hours, pages, KLOC
Derived Measure
[Measurement construct diagram, Derived Measure and Indicator levels highlighted]
Function of 2 or more base measures
Measurement Function
Algorithm for deriving data (e.g. productivity = KLOC/developer hours)
Indicator
Estimate or Evaluation
Analysis Model
Algorithm or calculation combining 2 or more base and/or derived measures, together with
Decision Criteria
Numerical thresholds, targets, limits, etc. used to determine the need for action or further investigation
Measurement Construct Examples
[Measurement construct diagram]
Productivity
Attributes: Hours, KLOC
Base Measures: Effort (count total hrs), Size (KLOC counter)
Derived Measure: Size / Effort = Productivity
Analysis Model: Compute mean, compute standard deviation
Indicator: Productivity mean w/ 2σ confidence limits
Quality
Attributes: Defects, KLOC
Base Measures: # Defects (count defects), Size (KLOC counter)
Derived Measures: # Defects / Size = Defect Rate
Indicator: Defect rate control chart: baseline mean, control limits & measured defect rate
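A minimal sketch of the Productivity construct above, assuming one size/effort sample is collected per completed component (Python; names and data are illustrative):

```python
# Base measures -> derived measure -> analysis model -> indicator.
from statistics import mean, stdev

def productivity(size_kloc: float, effort_hours: float) -> float:
    """Derived measure: Size / Effort (KLOC per hour)."""
    return size_kloc / effort_hours

def productivity_indicator(samples: list[tuple[float, float]]) -> tuple[float, float, float]:
    """Analysis model: mean with 2-sigma limits over per-component productivity."""
    rates = [productivity(kloc, hours) for kloc, hours in samples]
    m, s = mean(rates), stdev(rates)
    return m, m - 2 * s, m + 2 * s   # indicator: mean, lower limit, upper limit

samples = [(1.2, 40.0), (0.8, 30.0), (2.0, 55.0), (1.5, 50.0)]  # (KLOC, hours) per component
print(productivity_indicator(samples))
```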
More Measurement Construct Examples
[Measurement construct diagram]
Coding
Base Measure: Schedule (w.r.t. coded units)
Derived Measure: Planned units, actual units
Analysis Model: Subtract units completed from planned units
Indicator: Planned versus actual units complete + variance
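A small sketch of the coding-progress indicator above, assuming cumulative planned and actual unit counts are recorded per reporting period (names and data are illustrative):

```python
# Planned vs. actual units complete plus variance, per period.
def coding_progress(planned: list[int], actual: list[int]) -> list[dict[str, int]]:
    """For each reporting period: planned units, actual units, and the variance."""
    return [{"planned": p, "actual": a, "variance": a - p}
            for p, a in zip(planned, actual)]

planned_units = [5, 10, 15, 20]   # cumulative units planned per week
actual_units = [4, 9, 13, 18]     # cumulative units actually completed
for row in coding_progress(planned_units, actual_units):
    print(row)                    # negative variance = behind plan
```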
Class Measurement Construct Examples
[Measurement construct diagram]
Coding
Base Measure:
Derived Measure:
Analysis Model:
Indicator:
Measurement Planning
Identify Candidate Information Needs
Project Objectives
Cost, schedule, quality, capability
Risks
Prioritize
One approach: probability of occurrence x project impact = project exposure (see the sketch below)
e.g.
Schedule
Budget
Reliability
Dependencies
Product Volatility
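A sketch of the prioritization approach above, with hypothetical risks and scores (none of these values come from the course project):

```python
# Risk exposure sketch: exposure = probability of occurrence x project impact.
risks = [
    {"name": "Schedule slip",       "probability": 0.6, "impact": 8},
    {"name": "Budget overrun",      "probability": 0.3, "impact": 6},
    {"name": "Reliability issues",  "probability": 0.4, "impact": 9},
    {"name": "External dependency", "probability": 0.5, "impact": 4},
]
for r in risks:
    r["exposure"] = r["probability"] * r["impact"]

# Highest-exposure risks drive the candidate information needs and measures.
for r in sorted(risks, key=lambda r: r["exposure"], reverse=True):
    print(f"{r['name']}: exposure = {r['exposure']:.1f}")
```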
PSM Common Information Categories
Schedule & Progress
Resources & Cost
Product Size & Stability
Product Quality
Process Performance
Technology Effectiveness
Customer Satisfaction
PSM Common Information Categories
Measurement Concepts
Schedule & Progress
- milestone dates/completion, EV/PV
Resources & Cost
- staff level, effort, budget, expenditures
Product Size & Stability
- KLOC/FP, # requirements, # interfaces
Product Quality
- defects, defect age, MTBF, complexity
Process Performance
- productivity, rework effort, yield
Technology Effectiveness
- requirements coverage
Customer Satisfaction
- customer feedback, satisfaction ratings, support requests, support time, willingness to repurchase
Select & Specify Measures
Considerations
Utilize existing data collection mechanisms
As invisible as possible
Limit categories & choices
Use automated methods over manual
Beware of accuracy issues (e.g. timecards)
Frequency needs to be sufficient to support ongoing decision making (alternative: gate processes)
Measurement Construct
Information Need
Measurable Concept
Relevant Entities
Attributes
Base Measures
Measurement Method
Type of Method
Scale
Type of Scale
Unit of Measurement
Derived Measures
Measurement Function
Indicator
Analysis Model
Decision Criteria
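One possible way (not prescribed by PSM) to record a construct specification from this template as a data structure; the field names mirror the slide, and the example values echo the earlier Productivity construct:

```python
from dataclasses import dataclass

@dataclass
class BaseMeasure:
    attribute: str
    measurement_method: str
    type_of_method: str        # subjective / objective
    scale: str
    type_of_scale: str         # ratio / interval / ordinal / nominal
    unit_of_measurement: str

@dataclass
class MeasurementConstruct:
    information_need: str
    measurable_concept: str
    relevant_entities: list[str]
    base_measures: list[BaseMeasure]
    derived_measures: dict[str, str]   # name -> measurement function
    indicator: str
    analysis_model: str
    decision_criteria: str

productivity_spec = MeasurementConstruct(
    information_need="How productive is the team?",
    measurable_concept="Productivity",
    relevant_entities=["code components", "developer effort"],
    base_measures=[
        BaseMeasure("Size", "KLOC counter", "objective", "non-negative reals", "ratio", "KLOC"),
        BaseMeasure("Effort", "count total hours", "objective", "non-negative reals", "ratio", "hours"),
    ],
    derived_measures={"Productivity": "Size / Effort"},
    indicator="Productivity mean with 2-sigma confidence limits",
    analysis_model="Compute mean and standard deviation over components",
    decision_criteria="Investigate components outside the confidence limits",
)
print(productivity_spec.measurable_concept)
```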
Project Measurement Plan Template
(from PSM figure 3-10, p 56)
Introduction
Project Description
Measurement Roles, Responsibilities & Communications
Description of Project Information Needs
Measurement Specifications (i.e. constructs)
Project Aggregation Structures
Reporting Mechanisms & Periodicity
Team Project Postmortem
Why
Insanity
Continuous improvement
Mechanism to learn & improve
Improve by changing processes or better following current processes
Tracking process improvements during project
Process Improvement Proposals (PIP)
Post-Mortem
Areas to consider
Better personal practices
Improved tools
Process changes
Cycle 2 Measurement Plan
Identify cycle 2 risks & information needs
Review & revise measures & create measurement constructs
Document in a measurement plan
Postmortem process
Team discussion of project data
Review & critique of roles
Postmortem process
Review Process Data
Review of cycle data including SUMP & SUMQ forms
Examine data on team & team member activities & accomplishments
Identify where process worked & where it didn’t
Quality Review
Analysis of team’s defect data
Actual performance vs. plan
Lessons learned
Opportunities for improvement
Problems to be corrected in future
PIP forms for all improvement suggestions
Role Evaluations
What worked?
Problems?
Improvement areas?
Improvement goals for next cycle / project?
Cycle Report
Table of contents
Summary
Role Reports
Leadership – leadership perspective
Motivational & commitment issues, meeting facilitation, req’d instructor support
Development
Effectiveness of development strategy, design & implementation issues
Planning
Team’s performance vs. plan, improvements to planning process
Quality / Process
Process discipline, adherence, documentation, PIPs & analysis, inspections
Cross-team system testing planning & execution
Support
Facilities, CM & Change Control, change activity data & change handling, ITL
Engineer Reports – individual assessments
Role Evaluations & Peer Forms
Consider & fill out PEER forms
Ratings (1-5) on work, team & project performance, roles & team members
Additional role evaluations suggestions
Constructive feedback
Discuss behaviors or the product, not the person
Team leaders fill out TEAM EVALUATION form
Cycle 1 Project Notebook Update
Updated Requirements & Design documents
Conceptual Design, SRS, SDS, System Test Plan, User Documentation*
Updated Process descriptions
Baseline processes, continuous process improvement, CM
Tracking forms
ITL, LOGD, Inspection forms, LOGTEST
Planning & actual performance
Team Task, Schedule, SUMP, SUMQ, SUMS, SUMTASK, CCR*
Due July 10 Class
Cycle 1 Reports / Post-Mortem
Cycle 1 Results Presentation
Cycle 2 Project Plan
Cycle 2 Measurement Plan
Cycle 1 Audit