Starcom Leo Burnett


Don’t Just “Test”…
Validate!!
Agenda

• Overview of Testing versus Validation
• Common Activities and Responsibilities
• Justification for Validation
• Discussion
Project Life Cycle: Traditional View of Testing

[Diagram: Design and Coding → Testing → Install (set for year end) → Production Environment]
Project Life Cycle: Time & Resource Constraints

[Diagram: Design and Coding → Testing, under time & resource constraints → Install (set for year end) → Production Environment]
Risk Factors
Examples of Project Risk Factors
SEI identifies at least 10 project risk factors:
1. Over-ambitious Schedule
2. Under-ambitious Budget
3. Over-ambitious or unrealistic expectations
4. Undefined or misunderstood obligations
5. Inadequate software sizing estimate
6. Unsuitable or lack of development process model
7. Continuous requirements change
8. Unsuitable organizational structure
9. Inadequate software development plan
10. Lack of political support
Assess these factors in relationship to the testing effort
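One way to assess these factors against the testing effort is a simple likelihood × impact score per factor: the highest-scoring risks are addressed first and most thoroughly. A minimal sketch in Python (the 1–5 scores shown are illustrative examples, not SEI data):

```python
# Rank project risk factors for test planning by likelihood x impact.
# The 1-5 scores below are illustrative examples, not SEI figures.

def rank_risks(factors):
    """Return factors sorted by descending risk score (likelihood * impact)."""
    return sorted(factors, key=lambda f: f["likelihood"] * f["impact"], reverse=True)

factors = [
    {"name": "Over-ambitious schedule", "likelihood": 4, "impact": 5},
    {"name": "Continuous requirements change", "likelihood": 5, "impact": 5},
    {"name": "Lack of political support", "likelihood": 2, "impact": 3},
]

for f in rank_risks(factors):
    print(f"{f['name']}: {f['likelihood'] * f['impact']}")
```

The ranked list then drives the test plan: top-ranked risks get the earliest and deepest test coverage.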
Factors Affecting Testing
Short Discussion:
What have you seen?
Testing Objectives
Did we build the system right?
Maximize System Quality; Minimize Testing Effort
What Is Validation?
Providing documented evidence, with a high degree of assurance, that the system will consistently produce a product or perform a process which meets the predetermined specifications and quality attributes.
Factors Driving Validation

Relationship of Risk to Validation: risk tells you
• What to address first
• What to address most thoroughly
[Chart: Degree of Validation plotted against Degree of Risk]
Validation Objectives
Did we build the right system?
Maximize System Quality; Meet Business Needs
Goals of Validation

• Control of system development:
— Requirements meaningful, specified, and approved
— Effective process capability of developer
— Existence of and conformance to SOPs
— Effective, appropriate, documented testing
— Configuration management & change control

• Control of system use:
— Current accuracy and reliability
— Continued accuracy and reliability
— Management awareness and control
— Auditability
— Data integrity
— Reviewer independence and credentials
What Makes Up A System?

[Diagram: The Computer System (Software, Hardware, Peripherals, Infrastructure, Documentation, Procedures, Trained Personnel) within The Business Environment]
Testing (traditionally) Encompasses…

The Computer System: Software, Hardware, Infrastructure, Documentation, Procedures, Trained Personnel
Validation Encompasses…

The Computer System (Software, Hardware, Peripherals, Infrastructure, Documentation, Procedures, Trained Personnel) and The Business Environment
CMM Quality Assurance Program

Program Owner Accountabilities: QA accountabilities occur at every touch point.
[Diagram: the Project Management Process (Project Orientation, Initiation, Planning, Executing, Transition, Closure) aligned with the Product Process (Customer Requirement, Idea Generation, Design, Marketing, General Release) and the SDLC (Inception, Elaboration, Construction, Transition, Implementation, Operations Support, Retirement), with the QC Process spanning Test Scope, Test Strategy, Test Planning, Test Construction, Test Execution, and Test Evaluation]
QA & QC Coming Together

[Diagram: Quality Assurance (Process Audits, Process Consultation, Defect Analysis, Continuous Improvement, Standards Development, Risk Management, Measurement Analysis) and Test & Quality Control (Test Planning, Defect Management, Static Testing, Dynamic Testing, Measurement & Analysis, Test Automation) across the project phases: Orientation, Initiation, Planning, Executing, Transition, Closure]
Common Activities

Each level is composed of these activities: Strategy → Design → Execute → Evaluate
Strategy Development

• Determine Validation Strategy
• Perform Risk Assessment
• Identify Critical Success Factors
• Set Validation Objectives
• Define Validation Activities
• Estimate Time & Resources
• Document Validation Strategy
Identify Critical Success Factors

Factor: extent to which …
— Correctness: … system must satisfy stated requirements
— Authorization: … processing requires management authorization
— Data Integrity: … data stored by the system must be accurate
— Service Levels: … schedules must be met
— Access Control: … access must be restricted
— Methodology: … development plan must be followed
— Reliability: … system cannot fail during operation
— Ease of use: … effort is required to learn and use system
— Maintainability: … effort is required to fix errors in the system
— Portability: … system can operate in multiple environments
— Performance: … functions must perform within a specified time
Design

• Develop/obtain tools for conducting Validation Activities:
— Deliverables & Phase Checklists
— Procedures describing reviews & inspections
— Validation Matrices
— Test Cases
— Automated Testing Tools
— Audit plans
Examples of Validation Objectives

1. Customer orders processed correctly (Priority: High). Completion criteria: enter customer order information as described in the validation test cases and verify the results are correct.
2. System is ready to use (Priority: High). Completion criteria: no open defects of type SEVERE or HIGH; performance criteria are met; users are trained, documentation is ready.
3. Hold is placed on inventory when QC test fails (Priority: Med). Completion criteria: enter FAIL test results for product; verify MRP II indicates status as HOLD; verify product is segregated in warehouse.
4. Increased through-put (Priority: Low). Completion criteria: measure increase by time study.
5. Labels printed as req’d (Priority: Low). Completion criteria: print labels, verify against approved label copy specifications.
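Objectives like these are easier to audit when held as structured data, so the open items can be queried by priority at any point in the project. A minimal sketch (the completion flags below are hypothetical):

```python
# Track validation objectives against their completion criteria.
# Objective texts come from the table above; completion flags are hypothetical.

objectives = [
    {"id": 1, "objective": "Customer orders processed correctly", "priority": "High", "complete": True},
    {"id": 2, "objective": "System is ready to use", "priority": "High", "complete": False},
    {"id": 3, "objective": "Hold placed on inventory when QC test fails", "priority": "Med", "complete": True},
    {"id": 4, "objective": "Increased through-put", "priority": "Low", "complete": False},
]

def open_objectives(objs, priority=None):
    """Objectives not yet meeting their completion criteria, optionally filtered by priority."""
    return [o for o in objs
            if not o["complete"] and (priority is None or o["priority"] == priority)]

print([o["id"] for o in open_objectives(objectives, priority="High")])
```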
Examples of Validation Activities

• Requirements Reviews (performed by Business Analysts, Developers, Users): study & discuss system requirements to ensure they meet business needs. Deliverable: reviewed statement of requirements, ready to be translated into design.
• Design Reviews (performed by Developers): study & discuss system design to ensure it will support requirements. Deliverable: design, ready to be translated into software, hardware, documents, etc.
• Code Walkthroughs (performed by Developers): informal analysis of code to find defects. Deliverable: software ready to be inspected or tested.
• Code Inspections (performed by Developers): formal analysis of code to find defects, problem areas, coding techniques, design flaws. Deliverable: software ready to be tested.
Examples of Validation Activities

• Unit Testing (performed by Developers): test single units of code; validates that the software performs as designed. Deliverable: software unit ready for testing with other components.
• Integration Testing (performed by Independent Test Team): test related units, programs, modules; validates that multiple parts of the system interact according to design. Deliverable: portions of the system ready for testing with other portions.
• System Testing (performed by Independent Test Team): conducted at end of Integration Testing; test entire system; validates system requirements. Deliverable: tested system, based on specification.
• Acceptance Testing (performed by Users): test system to make sure it works in the environment; validates business needs. Deliverable: tested system, based on user needs.
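To make the unit-testing level concrete, here is a hedged sketch in Python: a single hypothetical unit (an order-total function, not from the deck) is validated in isolation against its design before integration:

```python
# Unit test for a single unit of code, run before integration.
# The order_total function is a hypothetical example unit.
import unittest

def order_total(prices, discount=0.0):
    """Sum line-item prices and apply a fractional discount."""
    if not 0.0 <= discount <= 1.0:
        raise ValueError("discount must be between 0 and 1")
    return round(sum(prices) * (1.0 - discount), 2)

class OrderTotalTest(unittest.TestCase):
    def test_no_discount(self):
        self.assertEqual(order_total([10.0, 5.0]), 15.0)

    def test_discount_applied(self):
        self.assertEqual(order_total([100.0], discount=0.1), 90.0)

    def test_invalid_discount_rejected(self):
        with self.assertRaises(ValueError):
            order_total([10.0], discount=2.0)

# Run the suite without exiting the interpreter.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(OrderTotalTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Only when the unit passes is its deliverable (the unit itself) ready for testing with other components.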
Detailed Validation Matrices

Requirements (rows) are mapped against Test Cases 1–6 (columns), with an x marking each test case that exercises a requirement:
• Validate sign-on
• Invalid term ident
• Validate password
• Password/ Signon error
• Validate Main Menu
• Re-enter O, X, M, R
• Validate Customer #
• Customer # Required
• Field entry too long
• Customer ___ not found
[Matrix markings omitted]
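A matrix like this can be checked mechanically: every requirement must map to at least one test case, or it is untested. A minimal sketch (the mapping below is illustrative; it does not reproduce the original slide's markings):

```python
# Requirements-to-test-case traceability check.
# The mapping below is illustrative; it does not reproduce the original matrix.

matrix = {
    "Validate sign-on": [1],
    "Validate password": [2],
    "Validate Main Menu": [3],
    "Validate Customer #": [4, 5],
    "Field entry too long": [],
}

def uncovered(matrix):
    """Requirements with no test case exercising them."""
    return [req for req, cases in matrix.items() if not cases]

print(uncovered(matrix))  # requirements that still need a test case
```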
Execution

• Organize Validation Tools
• Train Team Members
• Execute Validation Plan
• Track Validation Progress
• Perform Regression Testing
• Document Results
Evaluation

• Evaluate Compliance of System Development
• Evaluate Defects
• Develop Findings and Recommendations
• Formalize Report
Who Is Responsible for Validation?

• Developer
• Business Unit (User)
• Independent Tester
• Quality Assurance
Business Unit Responsibility

• Define system requirements
• Define data used in the system
• Enforce Change Control
• Plan & Perform Acceptance Testing
• Ensure adequate backup & recovery of data is available
• Work with developers to coordinate schedules
• Investigate Defects
• Develop User Documentation
• Develop User Procedures
• Enforce security
• Provide audit trail
• Train personnel
Developer Responsibility

• Follow development process
• Work with QA to improve the process
• Manage changes to system components
• Provide adequate backup & recovery
• Maintain configuration management
• Plan & Perform Unit Testing
• Provide Technical Documentation
• Maintain communication with QA, Testers & Business
QA Responsibility

• Define Development & Validation Process
• Verify personnel have adequate training & resources for their job
• Review documentation for adherence to standards
• Periodically review process activities
• Evaluate vendor capabilities & processes
• Verify enhancements & maintenance are documented
• Conduct Risk Assessments
• Verify business unit has established procedures
• Ensure system is adequately secure
Independent Tester Responsibility

• Follow Validation Process
• Working with Users &/or BA’s, perform Requirements Validation
• Work with QA to improve the process
• Plan & Perform Integration & System Testing
• Conduct Risk Assessments
• If required, Plan & Perform Stress & Performance Testing
Testing Levels & Validation

Each development deliverable is validated by a corresponding testing level on the way to Production:
— Business Needs ↔ Acceptance Testing
— Requirements ↔ System Testing
— Design ↔ Integration Testing
— Code ↔ Unit Testing
Plan for Maintenance

• Prepare for Maintenance during Development
• Keep Deliverables and Documentation up-to-date
— Requirements
— Design
— Test Cases, Test Data, Test Results
• Maintain all related deliverables during error correction
— Requirement specifications
— Program specifications
— Design specifications
— Defect reports
Maintenance Deliverables

• Risk Assessment
• Software updates
• Hardware upgrades
• System documentation
• Manuals, procedures
• Revised/New Test Cases
• Updated Validation Plan
Validation Issues

• Responsibility lies with the user
— If outsourced, the responsibility is shared between the outsourcing organization and the user
• Develop deliverables as system is maintained
• Group maintenance changes together
• Schedule maintenance in releases
• Follow defined methodology
Purchased System Responsibilities

Validation activities are divided between the Vendor and the Customer (System Staff and Users):
• Feasibility Review
• Requirements Review
• Design Review
• Code Review
• Unit Testing
• Integration Testing
• System Testing
• Acceptance Testing
• Conversion Testing
• Maintenance Testing
[Responsibility marks per activity omitted]
Cost of Quality

The Cost of Correction
[Chart: cost ($) to correct a defect rises sharply across Reqmts, Design, Code, Test, and Prod'n]
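The chart's message is often quoted as a rule of thumb: each phase a defect survives multiplies its correction cost, from roughly 1× at requirements up to 100× in production. A quick sketch of the arithmetic (the multipliers are the common rule of thumb, not measured data from this deck):

```python
# Illustrative cost-of-correction escalation by phase.
# Relative multipliers are a common rule of thumb, not measured figures.

RELATIVE_COST = {"Requirements": 1, "Design": 3, "Code": 10, "Test": 30, "Production": 100}

def correction_cost(phase, base_cost=100.0):
    """Estimated cost to fix a defect found in the given phase."""
    return base_cost * RELATIVE_COST[phase]

print(correction_cost("Requirements"))  # 100.0
print(correction_cost("Production"))    # 10000.0
```

Under these assumptions, a defect caught in a requirements review is two orders of magnitude cheaper than the same defect found in production.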
Cost of Quality: Real Benefits
[Chart: without a quality process, cost is dominated by Failure and Appraisal; with a quality process, Prevention grows while Failure and Appraisal shrink, producing net Savings]
Validation Costs and Benefits
[Chart: over time, incremental validation costs incurred during Development are repaid by incremental benefits during Maintenance; net costs early become net benefits later]
Self-Assessment (Y / N / ?)

• Are risk assessments being performed on projects?
• Is there a methodology in place for system development and validation?
• Are verification techniques, such as walk-throughs, being used?
• Have all personnel been trained in validation techniques?
• Are standards in place for system development and validation?
• Are the users active participants in the validation process?
• Are the results of the validation used to improve the development and validation processes?
• Is the culture ready to accept new processes for validation?
• Is there someone responsible for validation of systems?
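Answers to a checklist like this can be tallied to gauge readiness, with "?" responses flagged for follow-up. A minimal sketch (the sample answers are hypothetical, and the questions are abbreviated from the checklist above):

```python
# Tally a Y/N/? self-assessment; sample answers are hypothetical.

answers = {
    "Risk assessments performed on projects": "Y",
    "Methodology in place for development and validation": "N",
    "Verification techniques (walk-throughs) used": "Y",
    "Someone responsible for validation of systems": "?",
}

def tally(answers):
    """Count Y, N, and ? responses."""
    counts = {"Y": 0, "N": 0, "?": 0}
    for a in answers.values():
        counts[a] += 1
    return counts

print(tally(answers))  # {'Y': 2, 'N': 1, '?': 1}
```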
Validation:
Did we build the right system?
Testing:
Did we build the system right?
System Validation

• Validates BOTH Business & Technical Requirements
• Reduces Development Costs by identifying Errors early in the Development Cycle
• Reduces Risk of Operating Failures after Implementation
• Reduces Post-Implementation Development Costs by Fixing Defects during Development
• Improves Overall Quality of System
• Greatly improves User Satisfaction
Summary

[Diagram: Projects Delivered Without Validation versus Projects Delivered With Validation. The Road to Success!!]
Questions / Discussion