Verification and Validation Processes
Introduction
COMP8130 4130
Adrian Marshall
Agenda
• Introduction
• Definitions
• V&V
• Test objectives
• Testing challenges
• Testing and the V model
• Testing Approaches
• Testing levels
• Classes of test
• Risk based testing
• Requirements based testing
• Test Methods
• Common testing types
• Testing tools
Verification
IEEE 1012: the process of determining whether or not
the products of a given phase of the software
development lifecycle fulfil the requirements
established during the previous phase.
ISO 12207: confirmation by examination and
provision of objective evidence that specified
requirements have been fulfilled
Validation
IEEE 1012: the process of evaluating software at the
end of the software development process to ensure
compliance with software requirements
ISO 12207: confirmation by examination and
provision of objective evidence that particular
requirements for a specific intended use have been
fulfilled
Put more simply:
We Verify that the output of each software phase meets
its requirements, and
We Validate that the software, at the end of the
development effort, meets the overall intended use
Types of V & V Activities
Requirements analysis and Traceability analysis
Design analysis
Interface analysis
Implementation evaluation
• static - reviews, inspections, structure analysis
• dynamic - simulation, prototyping, execution time analysis
• formal - mathematical analysis of algorithms
Testing
Project & Management analysis
Sources of guidance on V & V
V & V Standards
• IEEE 1012 - Software V & V Plans
• IEEE 1059 - Guide for Software V & V Plans
• IEEE 1028 - Software Reviews & Audits
• IEEE 829 - Software Test Documentation
Related Standards
• ISO 12207 - Software Lifecycle Processes
• ISO 9126 - Software Quality Characteristics
Text
V & V of Modern Software-Intensive Systems - Schulmeyer & Mackenzie, Prentice Hall, 2000
Pros and Cons of V & V
Positive
• early error detection
• better product quality
• better project planning
• better adherence to standards, methods and practices
• better decision support information
• Cost of detection & prevention < cost of corrective action
Negative
• additional time and effort required for V&V activities
• additional cost (visible)
• Independence of V&V can be hard for small organisations
Defect Introduction by Phase
What is known about the quality of software systems?
Defect Source     Military   Industry
Requirements      20.0%      12.5%
Design            20.0%      24.2%
Code              35.0%      38.3%
Documentation     15.0%      13.3%
Bad Fixes         10.0%      11.7%
Applied Software Measurement 2nd Edition, by Capers Jones. McGraw-Hill, 1997. ISBN: 007-032826-9
Cost of Removing Defects
What is known about the quality of software systems?
Removal Phase        Cost Ratio   Example Cost
S/W Analysis         1 : 1        (say) $200
S/W Design           2.5 : 1      $500
Code & Unit Test     5 : 1        $1,000
Integration & Test   36 : 1       $7,200
Deployment           150 : 1      $30,000
Applied Software Measurement 2nd Edition, by Capers Jones. McGraw-Hill, 1997. ISBN: 007-032826-9
Testing Definitions (1)
• Testing is the process of executing a program with the
intent of finding errors
• Testing is an activity aimed at evaluating an attribute or
capability of a program or system and determining that it
meets its required results
• Testing is the process by which we understand the status
of the benefits and the risk associated with release of a
software system
• Testing includes all activities associated with the planning,
preparation, execution, and reporting of tests
Testing Definitions (2)
“Testing cannot guarantee that errors are not present; rather, it can only demonstrate that errors are present”….
Test Objectives
• Verify the implementation of any or all products
• Validate requirements and the solution
• Detect defects
• Assess deployment risks
• Provide performance & threshold benchmark data
• Establish testing processes, assets, data and skills for ongoing testing activities
Testing Challenges
• Complete testing is not possible
• Testing work is creative and difficult
• Testing is costly
• Testing is often not seen as a core activity
• Testers aim to find and report problems
Testing Challenges
• Technical personnel often do not want to become
testers, leaving testing to non-technical system
users
• Testing requires independence
• Testing is often a critical path activity
• Testing is often trimmed to solve schedule or
budget problems.…
Testing and the V Model
[V-model diagram] Design activities (left arm of the V) map to testing activities (right arm) at matching levels, with build activities at the base:
• Business requirements level / Acceptance level: determine business requirements; review requirements and analyse test requirements; accept the system by testing against business requirements
• Systems requirements level / System level: determine system requirements; review the solution and develop the Master Test Plan; install the system and test against system requirements
• Design level / Integration level: design the solution; develop Detailed Test Plan/s; integrate components and test integration of components
• Component level / Component level: design the component solution; buy / build components; test components
Testing Approaches
Bottom up
• tests smallest components / sub functions first
• test drivers are required
Top down
• tests major functional areas from the top down
• stubs are used where lower levels are incomplete
Functional thread
• process / path-oriented approach which crosses unit
boundaries
Combined….
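To make the driver and stub distinction concrete, here is a minimal sketch in Python; the functions and the 10% tax rule are purely illustrative, not taken from the course material.

```python
# Minimal sketch of bottom-up vs top-down testing; all names and the 10% tax
# rule are hypothetical examples.

# Bottom-up: a test driver calls the lowest-level unit directly.
def calculate_tax(amount):
    """Low-level unit under test."""
    return round(amount * 0.10, 2)

def test_calculate_tax():
    # The driver supplies inputs and checks outputs for the unit.
    assert calculate_tax(100.0) == 10.0

# Top-down: a stub stands in for a lower-level unit that is not yet built.
def calculate_tax_stub(amount):
    return 10.0  # canned response in place of real logic

def generate_invoice(amount, tax_fn=calculate_tax_stub):
    """Higher-level function exercised before its dependencies exist."""
    tax = tax_fn(amount)
    return {"net": amount, "tax": tax, "total": amount + tax}

def test_generate_invoice():
    assert generate_invoice(100.0)["total"] == 110.0
```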
Testing Levels
Unit / Component
testing conducted to verify the implementation of the design for one software
element (for example, unit, module, function, class instance, method) or a
collection of software elements
Integration
an orderly progression of testing in which software elements, hardware
elements, or both are incrementally combined and tested until the entire
system has been integrated
System
the process of testing an integrated hardware and software system to verify
that the system meets its specified requirements….
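As a minimal sketch of how the unit and integration levels differ in practice (the parsing functions below are hypothetical, using the standard-library unittest framework):

```python
import unittest

def parse_amount(text):
    """A single software element (unit)."""
    return float(text.strip("$"))

def invoice_total(lines):
    """Combines parse_amount with aggregation (integration of elements)."""
    return sum(parse_amount(line) for line in lines)

class UnitLevelTests(unittest.TestCase):
    def test_parse_amount(self):
        self.assertEqual(parse_amount("$10.50"), 10.5)

class IntegrationLevelTests(unittest.TestCase):
    def test_total_combines_units(self):
        self.assertEqual(invoice_total(["$1.00", "$2.50"]), 3.5)

if __name__ == "__main__":
    unittest.main()
```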
Classes of Test
White (or glass) box testing
• designed with knowledge of how the system is constructed
• aims to exercise the internal logical structure
• statements, decisions, paths & exception handling evaluated
Black box testing
• designed without knowledge of how the system is constructed
• verifies that functional & performance requirements have been
satisfied
• focuses on the external behaviour of the system….
Grey Box
• designed with some knowledge of how the system is constructed
White Box Testing
White box testing techniques
• Control flow based testing (e.g. decision & statement
coverage testing)
• Statement coverage – each statement is executed at least
once
• Decision coverage – each conditional statement is
executed at least once each way
• Complexity based testing (e.g. McCabe cyclomatic complexity measure) – higher concentration of tests for more complex software
• Boundary case and exception handling
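A minimal sketch of the difference between statement and decision coverage, using a hypothetical discount function (not part of the course material):

```python
def apply_discount(price, is_member):
    if is_member:                       # the decision point
        price = round(price * 0.9, 2)   # statement on the True branch
    return price

# Statement coverage: this single test executes every statement
# (the True branch covers the assignment and the return).
def test_member_gets_discount():
    assert apply_discount(100, True) == 90.0

# Decision coverage additionally requires the False outcome of the decision,
# so a second test is needed.
def test_non_member_pays_full_price():
    assert apply_discount(100, False) == 100
```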
Black Box Testing
Black box testing techniques
• Equivalence partitioning
• Boundary value analysis
• Decision table
• Testing from formal specifications
• Error guessing
• Exploratory testing
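A minimal sketch of equivalence partitioning and boundary value analysis, assuming a hypothetical rule that valid ages are 18 to 65 inclusive:

```python
def is_eligible(age):
    return 18 <= age <= 65

# Equivalence partitioning: one representative value from each partition.
def test_equivalence_partitions():
    assert is_eligible(10) is False   # invalid partition: below the range
    assert is_eligible(40) is True    # valid partition: inside the range
    assert is_eligible(70) is False   # invalid partition: above the range

# Boundary value analysis: values on and immediately either side of each boundary.
def test_boundary_values():
    assert is_eligible(17) is False
    assert is_eligible(18) is True
    assert is_eligible(65) is True
    assert is_eligible(66) is False
```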
Risk-Based Testing
Test areas of risk with more rigor (greater coverage of
functionality, and/or code)
Product risks may include:
• Performance (capacity, throughput, accuracy, etc)
• Safety
• Security (authentication…)
• Complexity
Test areas of higher risk first.
Focus on consequences and likelihood.
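One way to put consequences and likelihood into practice is a simple risk-exposure score per product area; the sketch below uses made-up feature names and scores.

```python
# Risk exposure = likelihood x consequence; higher scores are tested first
# and with more coverage. Names and scores are illustrative only.
features = [
    {"name": "payment processing",  "likelihood": 4, "consequence": 5},
    {"name": "report formatting",   "likelihood": 3, "consequence": 1},
    {"name": "user authentication", "likelihood": 2, "consequence": 5},
]

for feature in features:
    feature["risk"] = feature["likelihood"] * feature["consequence"]

# Order the test effort by descending risk exposure.
for feature in sorted(features, key=lambda f: f["risk"], reverse=True):
    print(f"{feature['name']}: risk exposure {feature['risk']}")
```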
Requirements-Based Testing
Systematic requirements-based testing ensures the complete testing scope is analysed.
Focus Areas (examples)
Functionality: Security, Accuracy, Regulatory Compliance, Technical Compliance
Reliability: Data Integrity, Error Handling, Fault Tolerance, Recoverability
Usability: User Friendliness, User Guidance, Adaptability, Clarity of Control, Error Handling, Conciseness, Ease of Learning, Documentation Quality, Ease of Installation
Performance: Throughput, Acceptable Response Time, Data Storage Requirements, Acceptable Memory Capacity, Acceptable Processing Speed
Portability: Portability to Different Hardware Platforms, Compatibility With Different Operating Systems, Conformance, Replaceability, Languages Supported….
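A requirements traceability check is a common way to confirm that the complete scope has been analysed; the sketch below, with invented requirement IDs and test names, flags requirements that have no linked tests.

```python
requirements = {
    "REQ-001": "Login requires a valid password",
    "REQ-002": "Reports export to CSV",
    "REQ-003": "Response time under 2 seconds",
}

# Traceability: which test cases cover which requirement.
test_coverage = {
    "REQ-001": ["test_login_valid", "test_login_invalid"],
    "REQ-003": ["test_response_time"],
}

# Requirements with no linked test cases are gaps in the testing scope.
untested = [req_id for req_id in requirements if not test_coverage.get(req_id)]
print("Requirements with no linked tests:", untested)   # ['REQ-002']
```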
Test Methods
Demonstration: The actual operation of an item to provide evidence that it accomplishes the required functions under specific scenarios. This requires human observation against definable objectives.
Inspection: Used to verify conformance with specified standards, by examination of, and comparison with, drawings, code, unit test results or other documentation.
Analysis: The use of established technical or mathematical models or simulations, algorithms, or other scientific principles and procedures to provide evidence that the item meets its stated requirements.
Test: The application of scientific principles and procedures to determine the properties or functional capabilities of items. The process will generate data that is recorded by calibrated, precision measurement equipment for subsequent evaluation.
Inspections and Reviews
Inspections and reviews require visual examination. They can be conducted in the early definition phases and hence provide efficient defect rectification.
Results can be influenced by the ability of the inspector/reviewer (use checklists to standardise).
Common Testing Types (1)
Acceptance testing / User Acceptance Testing (UAT)
Testing a system’s behaviour against the customer’s requirements
Alpha & beta testing
Testing by a representative sample of users (internal = alpha, external =
beta)
Installation testing
Testing a system after installation in the target environment
Performance testing
Testing against specified performance requirements (e.g. response time)
Reliability testing
Testing of stability, endurance, robustness, and recoverability….
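As a minimal sketch of testing against a specified response-time requirement (the 2-second threshold and the operation being timed are illustrative only):

```python
import time

def operation_under_test():
    time.sleep(0.1)   # stand-in for the real operation

start = time.perf_counter()
operation_under_test()
elapsed = time.perf_counter() - start

# Check the measured result against the specified performance requirement.
assert elapsed < 2.0, f"Response took {elapsed:.2f}s, exceeding the 2s requirement"
print(f"Response time: {elapsed:.3f}s")
```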
Common Testing Types (2)
Regression testing
Re-running previously executed tests to evaluate the impact that a software change may have on unaltered software components
Security testing
Testing a system’s ability to prevent unauthorised use or misuse,
authentication
Compatibility / Interoperability testing
Testing the ability of software to operate and coexist with other (application
and system) software and hardware
Stress testing
Exercising a system at the maximum design load and beyond
Usability testing
Testing a system’s user friendliness, ease of learning, and ease of use….
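A minimal sketch of regression testing: re-run cases that previously passed and compare against recorded expected results (the discount function and baseline are invented for illustration).

```python
def discount(price, code):
    # Behaviour under test; round to avoid spurious floating-point mismatches.
    return round(price * 0.8, 2) if code == "SAVE20" else price

# Baseline of (inputs, expected output) captured before the software change.
baseline = [
    ((100.0, "SAVE20"), 80.0),
    ((100.0, "NONE"), 100.0),
]

for args, expected in baseline:
    actual = discount(*args)
    assert actual == expected, f"Regression for inputs {args}: {actual} != {expected}"
print("No regressions detected")
```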
Testing Tools
Test management tools
• information repositories
• document generators
• defect management tools
• requirements traceability and test coverage tools
Test execution tools
• compilers, debuggers, link loaders
• source code analysers (coverage & complexity)
• GUI testers
• functional record and replay tools / robots
• performance / load and stress testing tools
• security vulnerability analysis tools….
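As one concrete illustration of a source code coverage analyser driven from Python, here is a sketch assuming the third-party coverage.py package is installed; the module under test (my_module) and its entry point are hypothetical.

```python
import coverage

cov = coverage.Coverage()
cov.start()

# Run the code under test while coverage data is being recorded.
import my_module           # hypothetical module under test
my_module.run_tests()      # hypothetical entry point

cov.stop()
cov.save()
cov.report()               # prints per-file statement coverage
```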
Requirements
Requirements definition through design
Software Specifications
Requirement/specification reviews
Software Specification
"Twelve Requirements Basics for Project Success", Dr. Ralph R. Young, Northrop Grumman Information Technology Defense Group
Criteria for good requirements
"Twelve Requirements Basics for Project Success", Dr. Ralph R. Young, Northrop Grumman Information Technology Defense Group
Review for testability
Review criteria:
• Concise
• Complete
• Unambiguous
• Consistent
• Verifiable
• Traceable
Review
• V & V can provide continuous information about the quality of the system and the development effort
• Cost of detection & prevention < cost of corrective action
• Testing is a process by which we understand the status of the benefits and the risk associated with release of a software system
• There are many testing techniques available for developers and testers
• Risk-based testing is used to focus scarce testing resources
• Systematic requirements-based testing ensures the complete testing scope is analysed
• Automated testing tools may be used to assist test management and execution