Systems Reliability


SysTrust Introduction
SYSTRUST COURSE
February 2001
SysTrust History
Agenda
- Vision
- Task Force Membership
- SysTrust Roll-out Activities
- Task Force’s Due Diligence
- Support Tools
- Successes to Date
- Feedback to Date
- Future Enhancements

Vision
Systems Reliability Task Force focus:
- Today: report on internal control
- Tomorrow: systems reliability assurance
- Ultimately: real-time assurance on on-line databases
Task Force Membership
- Thomas E. Wallace, Chair
- J. Efrim Boritz
- Robert Parker
- Robert J. Reimer
- George H. Tucker III
- Miklos A. Vasarhelyi
- Sander Wexler
- Dan White
- CICA Staff
  – Bryan Walker, Principal, Research Studies
- AICPA Staff
  – Erin P. Mackler, Technical Manager, Assurance Services
  – Judith M. Sherinsky, Technical Manager, Audit and Attest Standards
SysTrust Roll-out Activities 1
Timeline: Development → Exposure (7/99) → Issued (9/99) → Supporting Tools (11/99)
SysTrust Roll-out Activities 2
SCAS/TFAS 1996 - 1997
 Version 1 - Jan/88 - Nov/89

–
–
–
–
–
Development - Jan/88 - April/99
Review - April/99 - June/99
Exposure Draft - July/99 - September/99
Final issuance - Fall 1999
Training courses - Fall 1999
Version 2 - Jan - July 2000
 Version 3 - Jan - ? 2001

Task Force’s Due Diligence
- Review of draft conducted by:
  – Associates: practitioners, academics
  – Institutes’ technical committees
  – Ev Johnson, Chair of eComm Committee
  – Selected members of Institutes’ ASB
  – Industry: Internal Audit, CFO, CIO
- Considered:
  – market and need, completeness and relevance of principles & criteria, and other comments
Support Tools 1
- Competency Model: what skills are needed for SysTrust
- Training Courses:
  – SysTrust Overview
  – How to Perform a SysTrust Engagement
  – In-Depth Training in SysTrust Principles & Criteria
  – Information Systems Audit & Control Association (ISACA) courses
Support Tools 2
- Practitioners’ Aids:
  – Workplans
  – Engagement letters
  – Representation letters
  – Checklists
  – Practice guides
  – Marketing ideas
Support Tools 3
- Marketing
  – Conceptual Marketing Plan by AICPA
  – articles/ads, e.g. Journal of Accountancy, CA Magazine, ISACA
  – AICPA and CICA websites
  – pilot project testimonials by practitioners
  – conferences and training (UWCISA/JIS)
  – related organizations, e.g. ISACA
- Alliances
Successes to Date
- Approx. 40 engagements
- Typically in the $100,000 - $200,000 range
- Many pre-implementation/readiness reviews
- Industries:
  – Government, Banks, Utilities
  – .Coms: Loudcloud.com, Agillion.com
- Adoption by Internal Audit departments
Feedback to Date
- Liked the framework
- Need flexibility in use:
  – ability to report on less than all principles
  – ability to issue a point-in-time report
- Clarify privacy’s impact on reliability:
  – in: confidentiality of private information
  – out: accuracy of data, consent, individuals’ right to view, remediation, etc.
Future Enhancements
- Versions 3.0 & 4.0?
  – enhancements to principles & criteria
  – enhancements to reporting: point-in-time, “seal” program, holistic
  – continuous auditing & reporting
- Buy-in by industry
  – management, internal audit, developers
- Buy-in by practitioners
SysTrust!
SysTrust Overview
Agenda
- Systems Reliability in Business
- What is SysTrust?
- Positioning SysTrust
- SysTrust Framework
  – System
  – Reliability
  – Criteria
  – Controls
Systems Reliability in Business
Drivers of need:
- IT running the business
- IT differentiates in the marketplace (speed, cost & quality)
- IT demanding more capital
- IT permeating all areas of a company
- More reliance on IT of partners
Business goals at stake: growth, profitability, market share.
Like a weak link in a chain, an unreliable system can fail the entire business.
Recent Headlines
Reliability & the Market
[Chart: E*Trade (EGRP) stock price, roughly October 1998 through March 1999, annotated with publicized network failures and the resulting market-cap decreases of approximately $767m, $737m and $2.5b.]
Factors of Unreliability
- Denial of Service
  – system failures, crashes, capacity issues
- Unauthorized Access
  – viruses, hackers, loss of confidentiality
- Loss of Data Integrity
  – corrupted, incomplete, fictitious data
- Maintenance problems
  – unintended impact of system changes
- Failure to fulfill commitments
Need for SysTrust
What we found:
- No common definition of reliability
  – e.g. is security in or out?
- No basis for comparison
  – at what point is reliability achieved?
- Differing levels of objectivity & rigor
  – how much and how good is the assessment?
What is “SysTrust”?
- SysTrust: a CA/CPA’s assurance report on a system’s reliability
  – US: SSAE #1
  – Canada: section 5025
- Opinion on controls, using a framework of 4 principles & 58 criteria on reliability
- To earn a SysTrust opinion, a system must meet all criteria for the principles reported on

A “SysTrust” Opinion...
“We have audited the assertion by mgmt that... ABC company maintained effective controls... over system availability, security, processing integrity and maintainability... based on SysTrust principles & criteria…”
“In our opinion mgmt’s assertion… is fairly stated in all material respects...”
Components of “SysTrust”
- SysTrust Criteria
- System Description
- Mgmt’s Assertions
- Auditor’s Report
Positioning “SysTrust”
[Diagram 1: SysTrust positioned along a continuum from consulting services to periodic assurance (SysTrust) to continuous auditing, across the Design → Implement → Operate lifecycle.]
Positioning “SysTrust”
[Diagram 2: SysTrust positioned against WebTrust, Section 5900 and SAS 70 along two axes: non-financial vs. financial subject matter, and internal vs. external users.]
Definitions
- “SYSTEM”
- “RELIABILITY”
- “CRITERIA”
- “CONTROLS” (vs. internal control)
“SYSTEM”
A SYSTEM is an organized collection of software, infrastructure, people, procedures and data that, together within a business context, produces information.
“SYSTEM”
– infrastructure (facilities, equipment and networks)
– software (systems, applications, utilities)
– people (developers, operators, users and managers)
– procedures (automated and manual)
– data (transaction streams, databases and tables)
“RELIABILITY”
Reliable System defined as:
“A system that operates without material
error, fault or failure during a specified
time in a specified environment.”
Four Principles:
- Availability
- Integrity
- Security
- Maintainability
“Reliability” Framework
[Diagram: Reliability rests on four principles (Availability, Security, Integrity and Maintainability), each supported by its own set of criteria.]
“CRITERIA”
- Each Principle has a series of Criteria
- Criteria categories:
  – policies exist and are appropriate
  – policies are implemented and operate
  – adherence to policy is monitored
- Definition of Criteria: measurable, objective, complete, relevant
Structure of Criteria
Number of criteria by principle and category:

Principle         Policies   Procedures   Monitoring   Total
Availability          5          4            3          12
Security              5         11            3          19
Integrity             5          6            3          14
Maintainability       5          5            3          13
Totals               20         26           12          58
Example: Availability
- Principle: The system is available for operation and use at times set forth in service level statements or agreements.
- Criteria categories:
  – The entity has defined and communicated performance objectives, policies, and standards for system availability.
  – The entity utilizes processes, people, software, data, and infrastructure to achieve system availability objectives in accordance with established policies and standards.
  – The entity monitors the system and takes action to achieve compliance with system availability objectives, policies, and standards.
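The third category, monitoring, lends itself to automation. As a minimal sketch only (not part of the SysTrust materials), assuming a hypothetical outage log and a documented availability target, an entity's monitoring procedure might compare measured availability against its service level objective roughly as follows:

```python
from datetime import datetime

# Hypothetical inputs: a documented availability objective and an outage log.
SLA_TARGET = 0.995  # e.g. 99.5% availability per the service level agreement
PERIOD_START = datetime(2001, 1, 1)
PERIOD_END = datetime(2001, 2, 1)

# Each outage is a (start, end) pair recorded by the entity's monitoring process.
outages = [
    (datetime(2001, 1, 7, 2, 0), datetime(2001, 1, 7, 3, 30)),
    (datetime(2001, 1, 19, 14, 0), datetime(2001, 1, 19, 14, 45)),
]

def measured_availability(outages, period_start, period_end):
    """Fraction of the period during which the system was available."""
    period = (period_end - period_start).total_seconds()
    downtime = sum((end - start).total_seconds() for start, end in outages)
    return (period - downtime) / period

availability = measured_availability(outages, PERIOD_START, PERIOD_END)
print(f"Measured availability: {availability:.4%}")

# "Takes action to achieve compliance": flag any shortfall for follow-up.
if availability < SLA_TARGET:
    print("Below the documented availability objective - escalate per policy")
```

In a SysTrust engagement the practitioner would assess whether monitoring of this kind exists and operates effectively, rather than perform it for the entity.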
Example: Availability (cont’d)
Availability: The system is available for operation and use at times set forth in service level statements or agreements.

Criteria for A1 (The entity has defined and communicated performance objectives, policies, and standards for system availability):
- A1.1 The system availability requirements of authorized users, and system availability objectives, policies, and standards are identified and documented.
- A1.2 The documented system availability objectives, policies, and standards have been communicated to authorized users.
- A1.3 The documented system availability objectives, policies, and standards are consistent with the system availability requirements specified in contractual, legal, and other service level agreements and applicable laws and regulations.
- A1.4 Responsibility and accountability for system availability have been assigned.
- A1.5 Documented system availability objectives, policies, and standards are communicated to entity personnel responsible for implementing them.
“CONTROLS”
- primary evidential basis for evaluating whether criteria, and hence reliability principles, are satisfied
- assurance provider assesses controls deemed relevant to concluding whether Criteria are met
- may supplement with direct tests of Criteria
- require judgment to determine nature and extent of evidence required to verify existence, effectiveness and continuity of controls
Illustrative Controls 1
- CICA’s ITCG
  – comprehensive coverage: risk management & control, IT planning, IS acquisition, development & maintenance, operations & support, security, business continuity & recovery, etc.
Illustrative Controls 2
- ISACF’s COBIT
  – also comprehensive: planning & organization, acquisition & implementation, delivery & support, monitoring, etc.
Example: Availability (cont’d)
Availability: The system is available for operation and use at times set forth in service level statements or agreements.

A1: The entity has defined and communicated performance objectives, policies, and standards for system availability.

A1.1 Criterion: The system availability requirements of authorized users, and system availability objectives, policies, and standards are identified and documented.
Illustrative controls:
– Procedures exist to identify and document authorized users of the system and their availability requirements.
– User requirements are documented in service level agreements or other documents.

A1.2 Criterion: The documented system availability objectives, policies, and standards have been communicated to authorized users.
Illustrative controls:
– There is formal communication of system availability objectives, policies, and standards to authorized users through means such as memos, meetings, and manuals.
– Procedures exist to log and review requests from authorized users for changes and additions to system availability objectives, policies, and standards.

A1.3 Criterion: The documented system availability objectives, policies, and standards are consistent with the system availability requirements specified in contractual, legal, and other service level agreements and applicable laws and regulations.
Illustrative controls:
– A formal process exists to identify and review contractual, legal, and other service level agreements and applicable laws and regulations that could impact system availability objectives, policies, and standards.
– Procedures exist to review any new or changing contractual, legal, or other service level agreements and applicable laws and regulations for their impact on current system availability objectives, policies, and standards.

A1.4 Criterion: Responsibility and accountability for system availability have been assigned.
Illustrative control:
– A position(s) exists that has formal responsibility and accountability for system availability, as indicated by a documented job description and organization chart.

A1.5 Criterion: Documented system availability objectives, policies, and standards are communicated to entity personnel responsible for implementing them.
Illustrative controls:
– Documented system availability objectives, policies, and standards are communicated to personnel responsible for implementing them through such means as memos, meetings, and manuals.
– Additions and changes to system availability objectives, policies, and standards are communicated on a timely basis to entity personnel responsible for implementing and monitoring them.
Principles & Criteria
SysTrust Principles
- Availability: The system is available for operation and use at times set forth in service level statements or agreements.
- Security: The system is protected against unauthorized physical and logical access.
- Integrity: System processing is complete, accurate, timely and authorized.
- Maintainability: The system can be updated when required in a manner that continues to provide for system availability, security, and integrity.
Security Principle

Category S1:
– The entity has defined and
communicated performance objectives,
policies, and standards for system
security.
Security Principle

S1.1: The system security requirements of authorized users,
and the system security objectives, policies and standards
are identified and documented.

S1.2: The documented system security objectives, policies,
and standards have been communicated to authorized users.

S1.3: Documented system security objectives, policies, and
standards are consistent with system security requirements
defined in contractual, legal, and other service level
agreements and applicable laws and regulations.

S1.4: Responsibility and accountability for system security
have been assigned.

S1.5: Documented system security objectives, policies, and
standards are communicated to entity personnel responsible
for implementing them.
Security Principle

Category S2:
– The entity utilizes processes, people,
software, data, and infrastructure to
achieve system security objectives in
accordance with established policies
and standards.
Security Principle

S2.1: Acquisition, implementation, configuration and
management of system components related to system
security are consistent with documented system security
objectives, policies, and standards.

S2.2: There are procedures to identify and authenticate all
users accessing the system.

S2.3: There are procedures to grant system access
privileges to users in accordance with the policies and
standards for granting such privileges.
Security Principle (cont.)

S2.4: There are procedures to restrict access to computer
processing output to authorized users.

S2.5: There are procedures to restrict access to files on offline storage media to authorized users.

S2.6: There are procedures to protect external access
points against unauthorized electronic access.

S2.7: There are procedures to protect the system against
infection by computer viruses, malicious codes, and
unauthorized software.

S2.8: Threats of sabotage, terrorism, vandalism and other
physical attacks have been considered when locating the
system.
Security Principle (cont.)

S2.9: There are procedures to segregate incompatible
functions within the system through security
authorizations.

S2.10: There are procedures to protect the system against
unauthorized physical access.

S2.11: There are procedures to ensure that personnel
responsible for the design, development, implementation
and operation of system security are qualified to fulfil their
responsibilities.
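Criteria S2.2 and S2.3 call for procedures to identify and authenticate users and to grant access privileges in line with documented policy. As an illustration only, with hypothetical users, roles and a hard-coded policy table (nothing here is prescribed by the criteria), such procedures might be automated along these lines:

```python
import hashlib
import hmac
import os

# Hypothetical access policy: which roles may receive which privileges (S2.3).
ACCESS_POLICY = {
    "operator": {"view_jobs", "restart_jobs"},
    "developer": {"view_jobs", "read_source"},
    "auditor": {"view_jobs", "read_logs"},
}

def _hash(password: str, salt: bytes) -> bytes:
    """Salted password hash used to authenticate users."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# Hypothetical user store: identifier, salt, password hash, and assigned role (S2.2).
_salt = os.urandom(16)
USERS = {
    "jsmith": {"role": "operator", "salt": _salt, "pw_hash": _hash("s3cret", _salt)},
}

def authenticate(user_id: str, password: str) -> bool:
    """Identify and authenticate a user against the user store (S2.2)."""
    record = USERS.get(user_id)
    if record is None:
        return False
    return hmac.compare_digest(record["pw_hash"], _hash(password, record["salt"]))

def may_grant(user_id: str, privilege: str) -> bool:
    """Grant a privilege only if the user's role permits it under policy (S2.3)."""
    role = USERS[user_id]["role"]
    return privilege in ACCESS_POLICY.get(role, set())

if authenticate("jsmith", "s3cret"):
    print(may_grant("jsmith", "restart_jobs"))  # True: permitted for operators
    print(may_grant("jsmith", "read_source"))   # False: outside the operator role
```

The practitioner would look for controls of this nature and test that they operate as documented, not design them.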
Security Principle

Category S3:
– The entity monitors the system and
takes action to achieve compliance with
system security objectives, policies,
and standards.
Security Principle

S3.1: System security performance is periodically reviewed
and compared with documented system security
requirements of authorized users and contractual, legal,
and other service level agreements.

S3.2: There is a process to identify potential impairments
to the system’s ongoing ability to address the documented
security objectives, policies, and standards, and to take
appropriate action.

S3.3: Environmental and technological changes are
monitored and their impact on system security is
periodically assessed on a timely basis.
Principle: Integrity

System processing is complete,
accurate, timely and authorized.
Integrity Principle

Category I1:
– The entity has defined and
communicated performance objectives,
policies, and standards for system
processing integrity.
Integrity Principle

I1.1: The system processing integrity requirements of
authorized users and the system processing integrity
objectives, policies, and standards are identified and
documented.

I1.2: Documented system processing integrity objectives,
policies, and standards have been communicated to
authorized users.

I1.3: Documented system processing integrity objectives,
policies, and standards are consistent with system
processing integrity requirements defined in contractual,
legal, and other service level agreements and applicable laws
and regulations.
Integrity Principle (cont.)

I1.4: There is assignment of responsibility and accountability
for system processing integrity.

I1.5: Documented system processing integrity objectives,
policies, and standards are communicated to entity personnel
responsible for implementing them.
Integrity Principle

Category I2:
– The entity utilizes processes, people,
software, data, and infrastructure to
achieve system processing integrity
objectives in accordance with
established policies and standards.
Integrity Principle

I2.1: Acquisition, implementation, configuration and
management of system components related to system
processing integrity are consistent with documented
system processing integrity objectives, policies, and
standards.

I2.2: The information processing integrity procedures
related to information inputs are consistent with the
documented system processing integrity requirements.

I2.3: There are procedures to ensure that system
processing is complete, accurate, timely, and authorized.
Integrity Principle (cont.)

I2.4: The information processing integrity procedures
related to information outputs are consistent with the
documented system processing integrity requirements.

I2.5: There are procedures to ensure that personnel
responsible for the design, development, implementation
and operation of the system are qualified to fulfil their
responsibilities.

I2.6: There are procedures to enable tracing of information
inputs from their source to their final disposition and vice
versa.
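Criterion I2.6 asks for procedures that allow information inputs to be traced to their final disposition and back. As a minimal sketch with hypothetical record identifiers and processing steps (not drawn from the SysTrust materials), a processing log supporting such tracing could look like this:

```python
# Hypothetical processing log: each entry links an input record to its disposition.
# In practice the processing system itself would write these entries.
processing_log = [
    {"input_id": "INV-1001", "step": "validated", "output_id": "GL-2001"},
    {"input_id": "INV-1002", "step": "validated", "output_id": "GL-2001"},
    {"input_id": "INV-1003", "step": "rejected",  "output_id": None},
]

def trace_forward(input_id):
    """Trace an input record to its final disposition (source -> disposition)."""
    return [entry for entry in processing_log if entry["input_id"] == input_id]

def trace_backward(output_id):
    """Trace an output back to the inputs that produced it (the 'vice versa')."""
    return [entry["input_id"] for entry in processing_log
            if entry["output_id"] == output_id]

print(trace_forward("INV-1003"))   # shows the record was rejected
print(trace_backward("GL-2001"))   # ['INV-1001', 'INV-1002']
```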
Integrity Principle

Category I3:
– The entity monitors the system and
takes action to achieve compliance with
system integrity objectives, policies,
and standards.
Integrity Principle

I3.1: System processing integrity performance is
periodically reviewed and compared to the documented
system processing integrity requirements of authorized
users and contractual, legal and other service level
agreements.

I3.2: There is a process to identify potential impairments to
the system’s ongoing ability to address the documented
processing integrity objectives, policies, and standards and
take appropriate action.

I3.3: Environmental and technological changes are
monitored and their impact on system processing integrity
is periodically assessed on a timely basis.
Principle: Maintainability

The system can be updated when
required in a manner that continues
to provide for system availability,
security, and integrity.
Maintainability Principle

Category M1:
– The entity has defined and
communicated performance objectives,
policies, and standards for system
maintainability.
Maintainability Principle

Category M2:
– The entity utilizes processes, people,
software, data, and infrastructure to
achieve system maintainability
objectives in accordance with
established policies and standards.
Maintainability Principle

Category M3:
– The entity monitors the system and
takes action to achieve compliance with
maintainability objectives, policies, and
standards.
SysTrust!