Software Verification - University of Denver


Advanced Software Engineering:
Software Testing, 2007
COMP 3705 (Lecture 1)
Sada Narayanappa
TA: Seif Azgandhi
Anneliese Andrews (Chair DU)
Thomas Thelin
Carina Andersson
Facts about testing
System development:
• 1/3 planning
• 1/6 coding
• 1/4 component test
• 1/4 system test
[Brooks75]
TMM Test Process Evaluation
• Guided by the Capability Maturity Model (CMM)
• Stages or levels to evolve from and to
• Each level, except Level 1, has structure
• Levels 2 and above have:
  • Structure
  • Goals
  • Organizational structure
Internal structure of TMM
[Diagram] Each level indicates a testing capability and contains maturity goals.
Maturity goals are supported by maturity subgoals, which are achieved by
activities, tasks, and responsibilities (ATRs). ATRs address implementation
and organizational adaptation, and are organized by 3 critical views:
• Manager
• Developer
• User/Client
Test Maturity Model
TMM Levels
• Level 1 – No defined process; debugging and testing are treated as the same activity
• Level 2 – Test and debug tools / test plans / a basic test process
• Level 3 – Test organization / technical training / lifecycle integration / control and monitoring; supports the V model
• Level 4 – Review process / test measurement / software quality evaluation; test logging with severity
• Level 5 – Defect prevention / quality control / process optimization
Good enough quality
To claim that any given thing is good enough is to
agree with all of the following propositions:
• It has sufficient benefits
• It has no critical problems
• The benefits sufficiently outweigh the problems
• In the present situation, and all things considered,
further improvement would be more harmful than
helpful
James Bach, IEEE Computer, 30(8):96-98, 1997.
Quality attributes – ISO 9126
Why use testing?
• Risk mitigation
• Faults are found early
• Faults can be prevented
• Reduced lead time
• Deliverables can be reused
• …
Why do faults occur in software?
• Software is written by humans
  • who know something, but not everything
  • who have skills, but aren't perfect
  • who don't usually use rigorous methods
  • who do make mistakes (errors)
• Under increasing pressure to deliver to strict deadlines
  • No time to check; assumptions may be wrong
  • Systems may be incomplete
• Software is complex, abstract, and invisible
  • Hard to understand
  • Hard to see if it is complete or working correctly
  • No one person can fully understand large systems
  • Numerous external interfaces and dependencies
Origins of defects
[Diagram: fault model]
• Defect sources: lack of education, poor communication, oversight, transcription, immature process
• Impact on software artifacts: errors → faults/defects → failures
• Impact from the user's view: poor-quality software, user dissatisfaction
Whoops, that’s my calculator
Testing, Verification & Validation
Testing – the process of evaluating a program or a system
Definition 1
• Verification – is the product right?
• Validation – is it the right product?
Definition 2
• Verification – the product satisfies the conditions set at the start of the phase
• Validation – the product satisfies the requirements
Definitions
• Error – a human mistake
• Fault / Defect – an anomaly in the software; a state of the software caused by an error
• Failure – an event: the inability of the software to perform its required functions
• Debugging / Fault localization – localizing, repairing, and retesting
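To make these terms concrete, here is a small Python sketch; the average function and its inputs are invented for illustration:

```python
def average(values):
    # Error (human mistake): the author assumed values is never empty.
    # The mistake leaves a fault (defect) in the code: no empty-list guard.
    return sum(values) / len(values)

print(average([2, 4, 6]))   # 4.0 -- the fault is present but dormant

try:
    average([])             # exercising the faulty state...
except ZeroDivisionError as exc:
    print("failure:", exc)  # ...produces a failure, an observable event
```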
Definitions
• A TEST CASE (IEEE definition) consists of:
  • A set of inputs
  • Execution conditions
  • Expected outputs
  An organization may define additional attributes.
• A TEST is:
  • A group of related test cases
  • The procedures needed to carry out the test cases
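A minimal sketch of these three attributes using Python's unittest, reusing the hypothetical average function from the earlier sketch:

```python
import unittest

def average(values):              # hypothetical unit under test
    return sum(values) / len(values)

class TestAverage(unittest.TestCase):
    def setUp(self):
        # Execution conditions: fixtures/state the test needs before it runs.
        self.values = [2, 4, 6]   # the set of inputs

    def test_average(self):
        # The expected output is recorded in the test case itself.
        self.assertEqual(average(self.values), 4.0)

if __name__ == "__main__":
    unittest.main()
```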
Scripted and non-scripted testing
•In scripted testing test cases are pre-documented in
detailed, step-by-step descriptions
• Different levels of scripting possible
• Scripts can be manual or automated
•Non-scripted testing is usually manual testing
without detailed test case descriptions
• Can be disciplined, planned, and well documented
exploratory testing
• or ad-hoc testing
18
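As a sketch of a fully automated script, each pre-documented step becomes code; the Cart class here is a made-up stand-in for the system under test:

```python
class Cart:
    # Made-up stand-in for the system under test.
    def __init__(self):
        self.items = []
    def add(self, name, price):
        self.items.append((name, price))
    def total(self):
        return sum(price for _, price in self.items)

def test_add_item_updates_total():
    cart = Cart()                  # Step 1: precondition -- an empty cart
    cart.add("book", price=10.0)   # Step 2: action -- add one priced item
    assert cart.total() == 10.0    # Step 3: verify against the expected total

test_add_item_updates_total()
```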
Test oracle
• An oracle is the principle or mechanism by which you recognize a problem
• A test oracle provides the expected result for a test, for example:
  • A specification document
  • A formula
  • A computer program
  • A person
• In many cases it is very hard to find an oracle
  • Even the customer and end user might not be able to tell which is the correct behaviour
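For instance, an existing trusted program can serve as the oracle. The sketch below checks a hypothetical hand-written sort against Python's built-in sorted, which supplies the expected result for every generated input:

```python
import random

def my_sort(xs):
    # Hypothetical implementation under test (insertion sort).
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] < x:
            i += 1
        out.insert(i, x)
    return out

for _ in range(100):
    data = [random.randint(-50, 50) for _ in range(random.randint(0, 20))]
    # The built-in sorted acts as the test oracle here.
    assert my_sort(data) == sorted(data), f"mismatch on {data}"
```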
Test Bed
• The environment contains all the hardware and software needed to test a software component or system
• Examples:
  • Simulators
  • Emulators
  • Memory checkers
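As a tiny illustration of a simulator in a test bed, the sketch below replaces a hardware sensor with a scripted fake so the unit under test can run without real hardware; all names are invented:

```python
class SimulatedSensor:
    # Simulator: stands in for real hardware inside the test bed.
    def __init__(self, readings):
        self._readings = iter(readings)
    def read(self):
        return next(self._readings)

def overheated(sensor, limit=100):
    # Unit under test: depends only on the read() interface, so it runs
    # unchanged against either real hardware or the simulator.
    return sensor.read() > limit

assert overheated(SimulatedSensor([120])) is True
assert overheated(SimulatedSensor([80])) is False
```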
Other Definitions
• It is important to understand the following definitions:
  • Quality – degree of meeting specified requirements
  • Metric – a quantitative measure (see the example below)
  • Quality metric, for example:
    • Correctness – performs the required functions
    • Reliability – performs under stated conditions
    • Usability – effort needed to use the system
    • Integrity – withstands attacks
    • Portability / maintainability / interoperability …
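One concrete quality metric is defect density; the figures below are invented for illustration:

```python
defects_found = 42   # defects logged against the component (made-up figure)
size_kloc = 12.5     # component size in thousands of lines of code (made-up)

defect_density = defects_found / size_kloc
print(f"{defect_density:.2f} defects/KLOC")   # 3.36 defects/KLOC
```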
Principle 1 – purpose of testing
Testing is the process of exercising a software component using a selected set of test cases, with the intent of:
1. Revealing defects
2. Evaluating quality
Principles
2: A good test case – When the test objective is to detect defects, a good test case is one that has a high probability of revealing a yet-undetected defect
3: Test results – Results should be inspected meticulously
4: Expected output – A test case must contain the expected output
Principles
5: Input – Test cases should be developed for both valid and invalid input conditions (see the sketch after this list)
6: Fault content estimation – The probability that additional defects exist in a software component is proportional to the number of defects already detected in that component
7: Test organization – Testing should be carried out by a group that is independent of the development group
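A minimal sketch of Principle 5, exercising both valid and invalid inputs of a hypothetical withdraw function:

```python
import unittest

def withdraw(balance, amount):
    # Hypothetical unit under test.
    if amount <= 0 or amount > balance:
        raise ValueError("invalid amount")
    return balance - amount

class TestWithdraw(unittest.TestCase):
    def test_valid_input(self):
        self.assertEqual(withdraw(100, 30), 70)

    def test_invalid_inputs(self):
        with self.assertRaises(ValueError):
            withdraw(100, -5)    # invalid: negative amount
        with self.assertRaises(ValueError):
            withdraw(100, 200)   # invalid: overdraw

if __name__ == "__main__":
    unittest.main()
```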
Principles
8: Repeatable – Tests must be repeatable and reusable
9: Planned – Testing should be planned
10: Life cycle – Testing activities should be integrated into the software life cycle
11: Creative – Testing is a creative and challenging task
Goals of the course
A test specialist is a trained engineer who has knowledge of test-related:
• principles, processes, measurements, standards, plans, tools, and methods, and
• has learned how to apply them to the testing tasks to be performed
This calls for:
• Knowledge
• Skills
• Attitudes
www.swebok.org
Defect classes and Defect repository
• Requirement/Specification defect classes:
  • Functional description
  • Feature
  • Feature interaction
  • Interface description
• Design defect classes:
  • Algorithmic and processing
  • Control, logic, and sequence
  • Data
  • Module interface description
  • External interface description
• Coding defect classes:
  • Algorithmic and processing
  • Control, logic, and sequence
  • Typographical, data flow, data
  • Module interface
  • Code documentation
  • External hardware, software
• Testing defect classes:
  • Test harness
  • Test design
  • Test procedure
• Defect reports/analysis feed a defect repository, which records each defect's class, severity, and occurrences
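One way to picture the defect repository is as a collection of records carrying the class, severity, and occurrence data above; a sketch with invented field names and figures:

```python
from dataclasses import dataclass

@dataclass
class DefectRecord:
    defect_class: str   # e.g. "Control, logic, and sequence"
    phase: str          # requirements / design / coding / testing
    severity: int       # 1 (critical) .. 5 (cosmetic)
    occurrences: int    # times this defect type has been logged

repo = [
    DefectRecord("Module interface", "coding", severity=2, occurrences=7),
    DefectRecord("Feature interaction", "requirements", severity=1, occurrences=3),
]

# Defect reports/analysis, e.g. total occurrences per lifecycle phase:
by_phase = {}
for d in repo:
    by_phase[d.phase] = by_phase.get(d.phase, 0) + d.occurrences
print(by_phase)   # {'coding': 7, 'requirements': 3}
```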
Lectures
• Theory + discussions
• Cover the basic parts of software testing:
  1. Introduction
  2. Black-box, reliability, usability
  3. Inspections, white-box testing
  4. Lifecycle, documentation
  5. Organization, tools
  6. Metrics, TMM
  7. Research presentation
(Topics range from overview through technical and managerial to economic perspectives.)
Lab sessions
Preparation, Execution, Report
1. Black-box testing
2. Usage-based testing and reliability
3. White-box testing
4. Inspection and estimation
5. Software process simulation
Project: Option 1
• Learn a specific area of software testing
• Collect and summarize research information
• Critical thinking beyond the written information
• Present information in a structured way
• Peer review
Examination
• Written exam based on the book and lab sessions
• Lab sessions (must be approved)
• Project/presentations are graded
• See the class web site for the assignment policy
Schedule
• Read:
  • Course program
  • Projects in Software Testing
• Check the homepage
• Not yet decided:
  • Extra lab dates
This week
• Read the course program
• Project:
  • Read Projects in Software Testing
  • Exercise on Thursday
  • Decide on a research subject
  • Discuss papers with me – describe why each is interesting
• Lab:
  • Prepare lab 1
• Read Burnstein 1-3
  • Prepare Burnstein 4, 12
Project: Option 1
• Research: solve a research problem; survey the state of the art and identify the research problems in some area; develop and justify an extension to an existing technique; etc.
• Evaluation: apply and evaluate a technique, or evaluate a commercial testing or analysis tool.
• Practical: use an existing technique to test a system, or design and implement a prototype for a system.
Project: Option 1
• Read Projects in Software Testing
• Divide into groups (2-3 persons)
• Discuss with me
• http://www.cs.du.edu/~snarayan/sada/teaching/COMP3705/FilesFromCD/Project/Project_SwTest.pdf
Project: Option 2
• Research paper presentation
• Find an interesting paper
• We will come across many research papers during class – pick one for presentation
• There will be four paper presentations – choose your team and prepare
• Reading a paper takes time – start early