Practical Experiences of Safety-Critical Ada
Technologies
Roderick Chapman and Peter Amey
Praxis Critical Systems
Copyright © 2001 Praxis Critical Systems Limited

Programme
• Introduction
• What is High Integrity Software?
• Reliable Programming in Standard Languages
– Coffee
• Standards Overview
• Def Stan 00-55 and SHOLIS
– Lunch
• DO178B and the Lockheed C130J
• ITSEC, Common Criteria and Mondex
– Tea
• Compiler and Run-time Issues
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Safety-critical Software
• Safety is a system property
– A system is safe if it will:
• not endanger life
• not cause a major environmental problem
• etc.
• Where the system relies on the operation of
software to achieve safe behaviour the software
is safety-critical
• Not all software in safety-related systems is
safety-critical
Copyright © 2001 Praxis Critical Systems Limited

Why Have Safety-critical Software?
• Advantages of software
– Complexity
• Disadvantages of software
– Complexity!
Copyright © 2001 Praxis Critical Systems Limited

Programme
• Introduction
• What is High Integrity Software?
• Reliable Programming in Standard Languages
– Coffee
• Standards Overview
• Def Stan 00-55 and SHOLIS
– Lunch
• DO178B and the Lockheed C130J
• ITSEC, Common Criteria and Mondex
– Tea
• Compiler and Run-time Issues
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

High-integrity Software
• Code where reliability is more important
than
– Efficiency
– Cost
– Time to market
– Functionality
• Clearly safety-critical software should be high-integrity (but the reverse is not always true)
Copyright © 2001 Praxis Critical Systems Limited

Examples of High-integrity (but not S-C)
• Security systems
• Systems where direct financial loss could be
large
• Avoidance of “down-time”
• Avoiding product recall or loss of customer
confidence
• Loss of costly one-off mission capability (e.g.
Mars lander)
Copyright © 2001 Praxis Critical Systems Limited

Unique Characteristic of High-integrity Software
• The need to be able to show, before there is any
service experience, that a system will meet
requirements
– Qualitatively different problem from “normal” software
– Standards required may be very high
• 10^9 hours > 114,000 years
• Meeting this challenge is not the same as
certification
• Not enough just to be “more careful”
Copyright © 2001 Praxis Critical Systems Limited

Some Definitions
• Verification is the process of determining that a
system (or component) meets its specification
• Validation is the process of determining that a
system is appropriate for its purpose
• Certification is persuading an external regulatory
body that a set of specific requirements have been
met or processes followed
Copyright © 2001 Praxis Critical Systems Limited

A Single-sentence Standard
The fitness for purpose of a
software program shall be
established by logical reasoning
Copyright © 2001 Praxis Critical Systems Limited

V&V Techniques
• Inspections and reviews
– e.g.
• Requirements reviews
• Code inspections
• Test
– e.g.
• Requirements-based
• White box
• Analysis
– e.g.
• Flow analysis
• Model checking
• Proof
Copyright © 2001 Praxis Critical Systems Limited

Inspection and Reviews
• Advantages
– Very flexible, we can review an entity for internal consistency,
compare different artefacts for equivalence etc.
– No special tools needed
– Can be done early
• Disadvantages
– Informal process; may spot errors but cannot guarantee absence
– What is feasible limited by:
• Semantic precision of artefact being inspected
• Semantic gap between artefacts being compared
Copyright © 2001 Praxis Critical Systems Limited

Advantages of Dynamic Testing
• Spans the entire development process
– Potentially able to identify:
• Requirements errors
• Specification errors
• Coding errors
• Compiler bugs
• Hardware problems
Copyright © 2001 Praxis Critical Systems Limited

Disadvantages of Dynamic Testing
• Theoretical limitations
– Exhaustive testing almost invariably impossible
– High levels of confidence require infeasible amounts of
testing
• Practical disadvantages
– Can only take place towards the end of development
– Can be hard to diagnose unexpected behaviour
– Frequently a bottleneck (e.g. shared use of test rig)
– Very expensive
– Significant source of project risk
Copyright © 2001 Praxis Critical Systems Limited

A Note on Theoretical Limitations of
Testing
• Ultra-high reliability region (< 10^-7 failures per hour)
• Bayesian mathematics clearly limits what we can
claim from testing alone
• Reliability growth models cannot provide necessary
assurance
• See:
– The infeasibility of quantifying the reliability of life-critical real-time software. Butler & Finelli. NASA Langley Research Center
– Validation of Ultrahigh Dependability for Software-based Systems.
Littlewood & Strigini. CACM Nov 1993
Copyright © 2001 Praxis Critical Systems Limited
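
A rough sketch of the Bayesian argument (a simplified model assuming a constant failure rate and an uninformative prior):

   P(no failure in a further t' hours | t failure-free test hours) ≈ t / (t + t')

so reasonable confidence in, say, 10^9 failure-free operating hours demands a test duration of the same order of magnitude, i.e. on the order of 10^9 hours of testing.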

Advantages and Disadvantages of Analysis
• Advantages
– Can be used early in the development process
– Can establish properties that cannot be demonstrated in any
other way. e.g.
• Proof of absence of run-time errors
• Freedom from timing deadlocks
• Disadvantages
– Can only compare artefacts (e.g. code against specification)
– What can be achieved is limited by precision of descriptions
and notations used
Copyright © 2001 Praxis Critical Systems Limited

A Balanced Approach
• Showing “correctness” is harder than building
correct systems
• Use a variety of complementary verification
techniques
• Bug detection and correction is expensive so:
– Focus on bug prevention
– Use techniques that find bugs early
– Regard final testing as demonstration of correct behaviour
rather than method of finding bugs
Copyright © 2001 Praxis Critical Systems Limited

Resources
• Littlewood, Bev; and Strigini, Lorenzo: Validation of Ultrahigh
Dependability for Software-based Systems. CACM 36(11): 69-80
(1993)
• Butler, Ricky W.; and Finelli, George B.: The Infeasibility of
Quantifying the Reliability of Life-critical Real-time Software.
IEEE Transactions on Software Engineering, vol. 19, no. 1,
Jan. 1993, pp 3-12
• Littlewood, B: Limits to Evaluation of Software Dependability.
In Software Reliability and Metrics (Proceedings of Seventh
Annual CSR Conference, Garmisch-Partenkirchen). N. Fenton
and B. Littlewood. Eds. Elsevier, London, pp. 81-110
Copyright © 2001 Praxis Critical Systems Limited

Programme
• Introduction
• What is High Integrity Software?
• Reliable Programming in Standard Languages
– Coffee
• Standards Overview
• Def Stan 00-55 and SHOLIS
– Lunch
• DO178B and the Lockheed C130J
• ITSEC, Common Criteria and Mondex
– Tea
• Compiler and Run-time Issues
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Outline
• Language issues
• Special purpose languages
• Dialects
• Subsets
– MISRA-C
– Ada HRG report
– SPARK
– Other Ada subsets
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Languages Affect the Way We Think
• Less likely to think of abstractions in machine
code or assembler
• Think more in machine terms in C
• Unlikely to think in object-terms in FORTRAN or
in functional terms in Smalltalk
• For larger systems, language support for
abstraction and encapsulation is vital
Copyright © 2001 Praxis Critical Systems Limited

When to Get Formal?
• Object (machine) code is formal
– The execution machine provides operational semantics for it
• Typically we cannot reason about machine code
we can only observe its behaviour
• Reliance on dynamic testing loads the lifecycle
towards the back (expensive) end
• Early reasoning saves money and reduces risk
but is hampered by lack of formality of
programming languages
Copyright © 2001 Praxis Critical Systems Limited

Causes of Uncertainty
• Deficiencies in language definitions
– Semantics of C integer division
– Pascal named vs. structural type equivalence
• Implementation freedoms
– Execution order
– Parameter passing mechanisms
Copyright © 2001 Praxis Critical Systems Limited

Consequences of Uncertainty
• Leads to either:
– Ambiguity; programs whose behaviour cannot be predicted
from the source code; or
– Insecurity; violations of language rules that cannot be
detected
Copyright © 2001 Praxis Critical Systems Limited

Example of an Ambiguity
y = f(x) + g(x);
Suppose function f modifies x as a side-effect of its operation.
In this case the meaning of the expression depends on whether
f or g is evaluated first.
A rule stating “functions are not permitted to have side
effects” simply turns the ambiguity into an insecurity
Copyright © 2001 Praxis Critical Systems Limited
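
As a sketch of the same ambiguity in Ada (hypothetical names, illustration only):

procedure Ambiguity_Demo (X : in Integer; Y : out Integer) is
   Count : Integer := 0;

   function F (I : Integer) return Integer is
   begin
      Count := Count + 1;   -- side effect on an enclosing variable
      return I + Count;
   end F;

   function G (I : Integer) return Integer is
   begin
      return I * Count;     -- reads the variable that F updates
   end G;

begin
   Y := F (X) + G (X);      -- result depends on whether F or G is
                            -- evaluated first; Ada leaves the order
                            -- unspecified, and SPARK forbids functions
                            -- with side effects altogether
end Ambiguity_Demo;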

A Simple C Ambiguity
i = v[i++];
Page 46 of the C++ Annotated Reference Manual states that
this leads to the value of i being undefined.
Incidentally, this is a violation of MISRA-C Rule 46
Copyright © 2001 Praxis Critical Systems Limited

A Tiny Ada Ambiguity
procedure Init2(X, Y : out integer)
is
begin
X := 1;
Y := 2;
end Init2;
What is the meaning of:
Init2(A, A);
Copyright © 2001 Praxis Critical Systems Limited
Incidentally, this call is
not legal SPARK
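
A sketch of why (hypothetical wrapper, illustration only): scalars are passed by copy, and Ada does not define the order in which the two out parameters are copied back.

procedure Aliasing_Demo is
   A : Integer := 0;

   procedure Init2 (X, Y : out Integer) is
   begin
      X := 1;
      Y := 2;
   end Init2;
begin
   Init2 (A, A);
   --  A may now hold 1 or 2, depending on which copy-back happens last;
   --  the SPARK Examiner rejects the call because the actual parameters
   --  are aliased.
end Aliasing_Demo;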

Resolution
• Ambiguities are resolved .... by compiler authors
• Insecurities are left for the user to discover
• Possible solutions
– Invent new languages without these problems
– Work with dialects associated with compilers
– Use logically coherent language subsets to overcome ambiguities
and insecurities
Copyright © 2001 Praxis Critical Systems Limited

Outline
• Language issues
• Special purpose languages
• Dialects
• Subsets
– MISRA-C
– Ada HRG report
– SPARK
– Other Ada subsets
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Special-purpose Languages
• Inventing new languages is fun! e.g.
– Gypsy,
– Newspeak
• But very small user base leads to:
– Poor or non-existent tools
– Staff shortages
– Lack of training support
Copyright © 2001 Praxis Critical Systems Limited

The VIPER Analogy
• VIPER was a “formally verified” microprocessor
– Commercial failure
– Users preferred “defective” processors with wide support
• Abandoning the wider user-base is a high-risk
strategy
Copyright © 2001 Praxis Critical Systems Limited

Outline
• Language issues
• Special purpose languages
• Dialects
• Subsets
– MISRA-C
– Ada HRG report
– SPARK
– Other Ada subsets
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Using Language Dialects
• Find out what your compiler does
• Document it
• Include it in:
– Coding standards
– Review checklists
Copyright © 2001 Praxis Critical Systems Limited

Problems With Dialects
• What is the compiler’s behaviour?
– Is it consistent?
– Does it change with new releases?
– Is it the same on host and target?
• Knowing when it matters—to know the dialect
meaning of Init(A, A); we must both:
– recognise that implementation dependent behaviour is
present; and
– know what the compiler’s behaviour is in this case
Copyright © 2001 Praxis Critical Systems Limited

Dialect Observations
• Having a working knowledge of the compiler is
always useful, but
• You cannot expect to avoid all anomalous
behaviour by this method
• The approach may be valuable in special cases.
e.g. small, specialized processors where one
company provides processor, compiler and
analysis tools
Copyright © 2001 Praxis Critical Systems Limited

Outline
• Language issues
• Special purpose languages
• Dialects
• Subsets
– MISRA-C
– Ada HRG report
– SPARK
– Other Ada subsets
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Language Properties Required
(Produced by independent specialists on behalf of UK MoD.)
• Simple
• Application oriented
• Predictable
• Formally verifiable
• Sound supporting tools
Copyright © 2001 Praxis Critical Systems Limited

• "No currently standardised language could be recommended without reservation for the most critical applications without subsetting"
Requirements for Programming Languages, Computer Standards and Interfaces, 1992. B. A. Wichmann, National Physical Laboratory
Copyright © 2001 Praxis Critical Systems Limited

Safe Subsets
• Potentially give us the best of both worlds:
– Logical soundness and predictability
– Access to standard compilers, tools, training, staff
But
• Level of integrity achievable depends on
foundation language chosen
• Subsetting alone may not be enough for the
highest integrity levels
Copyright © 2001 Praxis Critical Systems Limited

Constructing a Safe Subset
• Selection of base language
• Removal of the most troublesome language
features
• Limitations on the way remaining features may
be used
• Introduction of annotations to provide extra
information
Copyright © 2001 Praxis Critical Systems Limited

The Subset Spectrum
• We can construct subsets that vary on 4 axes:
– Precision (security and lack of ambiguity)
– Expressive power
– Depth of analysis possible
– Efficiency of analysis process
• Trade-offs quite complex; we are trying to avoid
surprise: unexpected behaviour which we don’t
find until test
– Removing problematic features may reduce this risk
– Increased precision may require reduction in expressive
power but improves depth of analysis
– We may be able to combine expressiveness with depth of
analysis but at the cost of efficiency of analysis
Copyright © 2001 Praxis Critical Systems Limited

The Subset Spectrum (Contd.)
• Fundamental trade-off is between discipline we
accept to reduce bug insertion and the effort we
are prepared to make in bug detection
• For example:
– Unrestricted C provides little or no protection from bug insertion
– Ada requires extra discipline (e.g. strong typing) which
reduces bug insertion rate
• A qualitative shift in what is possible only
occurs when precision becomes exact
Copyright © 2001 Praxis Critical Systems Limited

Outline
• Language issues
• Special purpose languages
• Dialects
• Subsets
– MISRA-C
– Ada HRG report
– SPARK
– Other Ada subsets
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

MISRA-C
• C subset defined by the UK Motor Industry Research Association (MIRA) and the associated Motor Industry Software Reliability Association (MISRA) (www.misra.org.uk)
• Comprises 127 "rules" presented in the form of a coding standard
• The subset's authors regard it as suitable for "SIL 3" systems; they recommend Ada or Modula-2 if available
Copyright © 2001 Praxis Critical Systems Limited

Constructing a Safe Subset - MISRA-C
• Selection of base language
–
• Removal of the most troublesome language
features
–
• Limitations on the way remaining features may
be used
–
• Introduction of annotations to provide extra
information
–
Copyright © 2001 Praxis Critical Systems Limited

Constructing a Safe Subset - MISRA-C
• Selection of base language
– ISO/IEC 9899:1990 + technical corrigendum 1995
• Removal of the most troublesome language
features
– e.g. pointer arithmetic
• Limitations on the way remaining features may
be used
– e.g. all switch statements should have a final default clause
• Introduction of annotations to provide extra
information
– not provided
Copyright © 2001 Praxis Critical Systems Limited

Language Issues Revisited
• ...
• Possible solutions
– Invent new languages without these problems
– Work with dialects associated with compilers
– Use logically coherent language subsets to overcome ambiguities
and insecurities
Copyright © 2001 Praxis Critical Systems Limited

Language Issues Revisited
• ...
• Possible solutions
– Invent new languages without these problems
– Work with dialects associated with compilers
– Use logically coherent language subsets to overcome ambiguities and insecurities.
Rule 41: The implementation of integer division in the
chosen compiler should be determined, documented and
taken into account.
Copyright © 2001 Praxis Critical Systems Limited

MISRA-C Overview
• MISRA-C concentrates on the “remove most
troublesome language features” approach
• There is no attempt to make the subset logically
coherent and free from ambiguity and insecurity
• Machine checkability well short of 100%
– Some rules cannot be machine-checked at all
• Rule 4. Provision should be made for appropriate runtime checking
– Some rules are unclear
• Nevertheless, the rules should prevent many
common C errors
Copyright © 2001 Praxis Critical Systems Limited

Some Examples
• Taken from “A Software Fault Prevention
Approach in Coding and Root Cause Analysis”
by Weider D. Yu
• Obtained from analysis of several million lines
of code in the Lucent 5ESS switching system
• Nearly 50% of the errors found were this kind of
low-level coding error
Copyright © 2001 Praxis Critical Systems Limited

Operator Precedence
if (blkptr->rpthead.fltdesc & HMFLTCLAS == HWMATEFLT)
This is a violation of MISRA-C Rule 47: "No dependence should be placed on C's operator precedence rules in expressions"
Should be corrected to:
if ((blkptr->rpthead.fltdesc & HMFLTCLAS) == HWMATEFLT)
Copyright © 2001 Praxis Critical Systems Limited

For Loop Control
for (idx = 0; idx < 40;
     dispstring[idx] = COTsuccess[idx++]);
This is a violation of MISRA-C Rule 66: "Only expressions concerned with loop control should appear within a for statement"
Should be corrected to:
for (idx = 0; idx < 40; idx++) {
    dispstring[idx] = COTsuccess[idx];
}
Copyright © 2001 Praxis Critical Systems Limited

Flow Control Statements
if (condition == TRUE)
    flag = DBYES;
This is a violation of MISRA-C Rule 59: "The statements forming the body of an if, else if, else, while, do…while or for statement shall always be enclosed in braces"
Should be corrected to:
if (condition == TRUE) {
    flag = DBYES;
}
Copyright © 2001 Praxis Critical Systems Limited

Flow Control Statements
if (condition == TRUE) {
    flag = DBYES;
    ASfunction();
}
Copyright © 2001 Praxis Critical Systems Limited

Logic Faults on Lucent 5ESS Project
• Use of uninitialized variables
• Misuse of break and continue statements
• Operator precedence
• Loop boundaries
• Indexing outside arrays
• Truncation of values
• Misuse of pointers
• Incorrect AND and OR tests
• Assignment/equality
• Bit fields not unsigned or enum
• Incorrect logical AND and mask operators
• Preprocessor conditional errors
• Comment delimiters
• Unsigned variables and comparisons
• Misuse of type casting
Copyright © 2001 Praxis Critical Systems Limited

Outline
• Language issues
• Special purpose languages
• Dialects
• Subsets
– MISRA-C
– Ada HRG report
– SPARK
– Other Ada subsets
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

The Ada HRG
• ISO committee under WG9 to investigate high-integrity Ada issues:
– Interpretation and development of Annex H of the LRM
– Focus for developing Ada in the high-integrity field
• Members from:
– Tool vendors
– Users
– Government (e.g. UK MoD)
– Academia
Copyright © 2001 Praxis Critical Systems Limited

HRG Report
ISO/IEC JTC1/SC22/WG9: Programming Languages - Guide for the Use of the Ada Programming Language in High Integrity Systems
• Does not define a subset but identifies the basis
on which a subset might be selected
Copyright © 2001 Praxis Critical Systems Limited

General Approach
• Identifies verification techniques
• Compares each technique with Ada language
features
– Included - the feature and technique are fully compatible
– Excluded - use of the feature prevents cost-effective use of
the technique
– Allowed
• Use of the feature makes the technique difficult
• The difficulty can be circumvented
• Tables have explanatory notes for excluded and
allowed items
Copyright © 2001 Praxis Critical Systems Limited

Techniques (Methods) and Groups
Analysis:
  Flow Analysis (FA)              - Control Flow; Data Flow; Information Flow
  Symbolic Analysis (SA)          - Symbolic Execution; Formal Code Verification
  Range Checking (RC)             - Range Checking
  Stack Usage (SU)                - Stack Usage
  Timing Analysis (TA)            - Timing
  Other Memory Usage (OMU)        - Other Memory Usage
  Object Code Analysis (OCA)      - Object Code Analysis
Testing:
  Requirements-based Testing (RT) - Equivalence Class; Boundary Value
  Structure-based Testing (ST)    - Modified Condition/Decision Coverage; Branch Coverage; Statement Coverage
Copyright © 2001 Praxis Critical Systems Limited

Example Table: Subprograms
(Table rating each subprogram-related feature against the methods FA, SA, RC, SU, TA, OMU, OCA, RT and ST as Included (Inc), Allowed (Alld) or Excluded (Exc), with explanatory footnotes for the Allowed and Excluded entries.)
Features covered: Procedures, Functions, Default Expressions, Indefinite Formal Parameters, Indefinite Return Types, Inline Expansion, Return in Procedures, Aliasing, Access Parameters.
Copyright © 2001 Praxis Critical Systems Limited

Points of Interest
• Exceptions
– HRG clearly had to reconcile two views of exceptions:
• They are an essential aspect of the language for high-integrity applications
• They must be avoided because of the difficulties they
cause for flow and symbolic analysis
– Propagation is clearly a key issue
• Tasking
– HRG report defines the “Ravenscar Profile”
– (More on this later)
Copyright © 2001 Praxis Critical Systems Limited

Exclusions Summary
(Table marking each feature, for the methods FA, SA, RC, SU, TA, OMU, OCA, RT and ST, as Excluded (Exc) or Allowed (Alld); most entries are Excluded.)
Features: Discriminated records, Class-wide operations, Aliased objects (complex), Renaming (complex/dynamic), Goto, Complex return types, Access parameters, Unchecked access, Streams, Full access types, Controlled types, Indefinite objects, Non-Ravenscar tasking, Remote-call interface, Partition communications.
Copyright © 2001 Praxis Critical Systems Limited

General Language Issues
• There are four justifiable reasons for restricting the use of language features:
– To avoid features that are unpredictable
– To avoid features that cannot be modelled
– To avoid features that cannot be tested
– To avoid features that make verification ineffective
• Any set of restrictions on Ada creates another
“Ada-like” language - that language should be
coherent and consistent
Copyright © 2001 Praxis Critical Systems Limited

Using the Guide
• “Determine the verification techniques required
from ... standards or guidelines”
• “Identify and understand the objectives to be
satisfied by each of the ... techniques”
• “Using the tables in section 5, determine the
actual rating of the language features”
• “Confirm that the choice of subset ... can satisfy
the requirements. This step should take into
account available tools”
Copyright © 2001 Praxis Critical Systems Limited

Outline
• Language issues
• Special purpose languages
• Dialects
• Subsets
– MISRA-C
– Ada HRG report
– SPARK
– Other Ada subsets
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

What Is SPARK?
• A sub-language of Ada with particular properties
that make it ideally suited to the most critical of
applications:
– Completely unambiguous
– Free from implementation dependencies
– All rule violations are detectable
– Formally defined
– Tool supported
• SPARK subsets of both Ada 83 and Ada 95 are
defined
• Language definition is open and widely available
Copyright © 2001 Praxis Critical Systems Limited

SPARK Design Considerations
• Logical soundness
• Simplicity of formal language definition
• Expressive power
• Security
• Verifiability
• Bounded space and time requirements
• Correspondence with ... Ada?
• Verifiability of compiled code
• Minimal run-time system requirements
Copyright © 2001 Praxis Critical Systems Limited

Constructing a Safe Subset - SPARK
• Selection of base language
–
• Removal of the most troublesome language
features
–
• Limitations on the way remaining features may
be used
–
• Introduction of annotations to provide extra
information
–
Copyright © 2001 Praxis Critical Systems Limited

Constructing a Safe Subset - SPARK
• Selection of base language
– ANSI/MIL-STD-1815A-1983 & ISO-8652:1995
• Removal of the most troublesome language
features
– e.g. tasking
• Limitations on the way remaining features may
be used
– e.g. limitations on placement of exit and return
• Introduction of annotations to provide extra
information
– core (e.g. Global) and proof (e.g. Post) annotations
Copyright © 2001 Praxis Critical Systems Limited
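
For illustration (a minimal sketch; the package and names are hypothetical), a declaration carrying both kinds of annotation:

package Counter
is
   procedure Increment (X : in out Integer);
   --# derives X from X;          -- core annotation: information flow
   --# pre  X < Integer'Last;     -- proof annotation: precondition
   --# post X = X~ + 1;           -- proof annotation: X~ denotes the
                                  -- initial value of X
end Counter;

The Examiner checks the derives clause statically; the pre and post conditions give rise to verification conditions for the proof tools.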

Core Annotations
• Global definitions declaring use of global variables by
subprograms
• Dependency relations of procedures, specifying
information flow between their imports and exports
• Inherit clauses to restrict penetration of packages
• Own variable clauses to control access to package
variables, and to define refinement
• Initialisation specifications to indicate initialisations by packages of their "own" variables
These annotations are related to executable code by static-semantic rules, which are checked mechanically (a sketch follows below)
Copyright © 2001 Praxis Critical Systems Limited
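
A minimal sketch of how the core annotations fit together (hypothetical packages, not SHOLIS code):

package Sensor
--# own State;              -- own variable: the package's abstract state
--# initializes State;      -- initialisation specification
is
   procedure Read (Value : out Integer);
   --# global  in out State;               -- global definition
   --# derives Value, State from State;    -- dependency relation
end Sensor;

--# inherit Sensor;         -- inherit clause: Monitor may refer to Sensor
package Monitor
is
   procedure Poll (Latest : out Integer);
   --# global  in out Sensor.State;
   --# derives Latest, Sensor.State from Sensor.State;
end Monitor;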

Static Analysis - the SPARK Examiner
• Language subset compliance
• System-wide data flow analysis
• Information flow analysis
(the three checks above come "free")
• Demonstration, prior to execution, that a program is "exception free"
• Formal verification including proof of safety properties
Copyright © 2001 Praxis Critical Systems Limited
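
As an example of what the flow analysis catches (a sketch with hypothetical names, not SHOLIS code):

procedure Swap (X, Y : in out Integer)
--# derives X from Y &
--#         Y from X;
is
   T : Integer;
begin
   T := X;
   X := Y;
   --  If the final assignment below were omitted, the Examiner would
   --  report, before any test is run, that Y is not derived from X as
   --  the annotation promises and that T is computed but never used.
   Y := T;
end Swap;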

SPARK
• Amplifies the strengths of Ada
• Gives program source text a precise meaning
• Guarantees freedom from certain classes of error
• Simplifies early detection of other errors
• Captures important design information in the code
• Is compatible with HRG guidance and compiler-defined high-integrity subsets
Copyright © 2001 Praxis Critical Systems Limited

Outline
• Language issues
• Special purpose languages
• Dialects
• Subsets
– MISRA-C
– Ada HRG report
– SPARK
– Other Ada subsets
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Other Ada Subsets
• Tend to fall into two groups:
– Informal coding standards involving fairly arbitrary exclusion of language features
• SysTeam “Safe Ada”
– Subsets generated by compiler back-end and run-time
library considerations
• GNAT Pro High-Integrity
more on these later
• C-Smart
• Raven
• Can be evaluated against HRG criteria
• Tool support for the first group is an important issue
Copyright © 2001 Praxis Critical Systems Limited

Outline
• Language issues
• Special purpose languages
• Dialects
• Subsets
– MISRA-C
– Ada HRG report
– SPARK
– Other Ada subsets
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Conclusions
• Language subsets are useful ways of helping
reduce risk and cost
• Ada subsets are the best defined and best
supported
• There is a variety of subsets each with a distinct
mix of: precision, expressive power, depth of
analysis possible, efficiency of analysis
• A qualitative shift in what is possible requires a subset with 100% precise semantics
• Tool support is vital
Copyright © 2001 Praxis Critical Systems Limited

“… one could communicate with these machines in any
language provided it was an exact language …”
“… the system should resemble normal mathematical
procedure closely, but at the same time should be as
unambiguous as possible.”
Alan Turing, 1948
Copyright © 2001 Praxis Critical Systems Limited

Tool Issues
• Without tools all rules are insecurities
• Tool capability depends on the precision of the
language chosen
– Simple lexical and syntactic checking
– Heuristic approaches
– Precision (only possible with “exact” subsets)
• Inability to check rules leads to insecurity, but excessive false-alarm rates can overwhelm the user with detail
Copyright © 2001 Praxis Critical Systems Limited

Resources
• Cook, David: Evolution of Programming Languages and Why a Language Is Not Enough to Solve Our Problems. Crosstalk, Dec 1999, pp 7-12
• Carré, Bernard: Reliable Programming in Standard Languages. In High-Integrity Software, RSRE Malvern, Chris Sennett (ed). ISBN 0-273-03158-9, 1989
• Finnie, Gavin et al: SPARK - The SPADE Ada Kernel. Edition 3.3, 1997, Praxis Critical Systems
• Finnie, Gavin et al: SPARK 95 - The SPADE Ada 95 Kernel. 1999, Praxis Critical Systems
• Barnes, John: High Integrity Ada - the SPARK Approach. Addison Wesley Longman, ISBN 0-201-17517-7
• Currie, I. F.: Newspeak - a Reliable Programming Language. In High-Integrity Software, RSRE Malvern, Chris Sennett (ed). ISBN 0-273-03158-9, 1989
• Motor Industry Research Association: Guidance for the Use of the C Language in Vehicle Based Software. April 1998. www.misra.org.uk
• Yu, Weider D: A Software Fault Prevention Approach in Coding and Root Cause Analysis. Bell Labs Technical Journal, April-June 1998
• ISO/IEC JTC1/SC22/WG9: Programming Languages - Guide for the Use of the Ada Programming Language in High Integrity Systems. www.dkuug.dk/jtc1/sc22/wg9/n359.pdf
Copyright © 2001 Praxis Critical Systems Limited

Programme
• Introduction
• What is High Integrity Software?
• Reliable Programming in Standard Languages
– Coffee
• Standards Overview
• Def Stan 00-55 and SHOLIS
– Lunch
• DO178B and the Lockheed C130J
• ITSEC, Common Criteria and Mondex
– Tea
• Compiler and Run-time Issues
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Standards
(Diagram: the aerospace, rail, defence, nuclear and automotive domains each have their own standards.)
Copyright © 2001 Praxis Critical Systems Limited

Examples of (Safety-related) Standards
• RTCA DO-178B (civil avionics)
more on these later
• Def Stan 00-55/56 (UK military)
• IEC 61508 (generic ‘programmable systems’)
– CENELEC 50128 (railway industry)
– IEC 601 (medical equipment)
• IEC 880 (nuclear power control)
• MISRA (automotive industry)
• Vary in their power to mandate or recommend
Copyright © 2001 Praxis Critical Systems Limited

Security Standards
• ITSEC
• “Orange book”
• Common Criteria
– More on these later
Copyright © 2001 Praxis Critical Systems Limited

Defence Standards
(Table indicating, for each standard, whether it requires a safety case and whether it is software specific.)
UK Def Stan 00-55
UK Def Stan 00-56
US MIL STD 882D
NATO STANAG 4404
NATO STANAG 4452
Def (Aust) 5679
Copyright © 2001 Praxis Critical Systems Limited

Rail Standards
(Table indicating, for each standard, whether it requires a safety case and whether it is software specific.)
CENELEC EN 50126
CENELEC EN 50128
CENELEC ENV 50129
Copyright © 2001 Praxis Critical Systems Limited

Other Standards
(Table indicating, for each standard, whether it requires a safety case and whether it is software specific.)
IEC 61508
IEC 880 (Nuclear)
DO 178B
MISRA Guidelines (Automotive)
Copyright © 2001 Praxis Critical Systems Limited

Safety-related Software Techniques
• Common aspects:
– Hazard/risk analysis
– Safety integrity levels
– Lifecycle definition
– Formal methods for requirements and design
– Modelling, for various purposes
– Choice of language/RTOS etc.
– Validation, verification and testing (VVT)
– Accumulation of evidence – safety case
Copyright © 2001 Praxis Critical Systems Limited

Hazard/risk Analysis
• Identification of hazards and their causes
• Hazards have :
– An estimated probability of occurrence
– An associated severity
• ‘Risk’ is combined probability and severity
• ‘Risk regions’ can be identified:
– Intolerable
– ALARP – as low as reasonably practicable
– Broadly acceptable
• ‘Safety’ is freedom from unacceptable risk
Copyright © 2001 Praxis Critical Systems Limited

Hazard/risk Analysis: Techniques
• Various formalised techniques exist:
– HAZOP (hazard and operability)
– Failure modes and effects analysis
– Fault tree analysis
• A hazard log will be used to detail identified
hazards and state steps being taken to mitigate
them.
• Different HAs:
– Preliminary HA (PHA)
– Design HA (DHA)
Copyright © 2001 Praxis Critical Systems Limited

Hazard Identification: HAZOP
• A systematic, creative examination of a design
• Performed by a multi-disciplinary team
• Inspect each component in turn
– Consider the design intention
– Apply a list of guide words
– Derive plausible deviations from the design intention
Copyright © 2001 Praxis Critical Systems Limited

Consequence Analysis
• Purpose
– To determine the intermediate conditions and final
consequences arising from the identified hazards
• Approach
– A diagrammatic representation is recommended
– Ideally quantitative estimate of probabilities
• Techniques
– Cause consequence diagrams
– Event trees
Copyright © 2001 Praxis Critical Systems Limited

Event Trees
(Example event tree for a fire scenario. Initiating event: ignition. If the extinguisher works, the result is no incident. If the extinguisher fails and the alarm sounds, the result is a minor fire; if the alarm also fails, a major fire.)
Copyright © 2001 Praxis Critical Systems Limited
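
Quantifying that tree with purely illustrative figures (say an ignition frequency of 0.1 per year, extinguisher failure probability 0.05 and alarm failure probability 0.02) gives:

   P(major fire) = 0.1 x 0.05 x 0.02 = 1 x 10^-4 per year
   P(minor fire) = 0.1 x 0.05 x 0.98 = 4.9 x 10^-3 per year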

Causal Analysis
• Purpose
– To determine credible combinations or sequences of causal
factors which can lead to the hazards
• Approach
– A diagrammatic representation is recommended
– Ideally quantitative estimate of probabilities
• Techniques
– Failure Modes and Effects Analysis (FMEA)
– Fault Tree Analysis (FTA)
Copyright © 2001 Praxis Critical Systems Limited

Accident Severity Categories
Catastrophic - multiple deaths
Critical     - a single death; and/or multiple severe injuries or severe occupational illnesses
Marginal     - a single severe injury or occupational illness; and/or multiple minor injuries or minor occupational illnesses
Negligible   - at most a single minor injury or minor occupational illness
Copyright © 2001 Praxis Critical Systems Limited

Probability Ranges
Frequent   - likely to be continually experienced
Probable   - likely to occur often
Occasional - likely to occur several times
Remote     - likely to occur some time
Improbable - unlikely, but may exceptionally occur
Incredible - extremely unlikely that the event will occur at all
Copyright © 2001 Praxis Critical Systems Limited

What Is Risk?
(Diagram: risk matrix with severity on one axis, from Minor to Catastrophic, and probability on the other, from Incredible to Frequent; risk rises from Negligible through Tolerable to Intolerable towards the Catastrophic/Frequent corner.)
Copyright © 2001 Praxis Critical Systems Limited

How Is Risk Regulated?
ALARP
ALARP = As Low As Reasonably Practicable
Copyright © 2001 Praxis Critical Systems Limited

Safety-related Software Techniques
• Common aspects:
– Hazard/risk analysis
– Safety integrity levels
– Lifecycle definition
– Formal methods for requirements and design
– Modelling, for various purposes
– Choice of language/RTOS etc.
– Validation, verification and testing (VVT)
– Accumulation of evidence – safety case
Copyright © 2001 Praxis Critical Systems Limited

Safety Integrity Levels
• System will be assigned a safety integrity level,
typically:
– 4 – 1 : ‘high’ – ‘low’
• Can be based on probabilistic or MTBF
concepts (e.g. 00-55)
• DO-178B talks about levels A-E.
• Techniques (and costs) will then be determined
by the assigned SIL
• Partitioning by SIL within system is normal
Copyright © 2001 Praxis Critical Systems Limited

What Is a SIL?
• A general indicator of quality for design/build
processes
• Applicable to software and hardware
• Probability that the system meets its
requirements
• Probability of systematic failure
• Often misunderstood and misused
Copyright © 2001 Praxis Critical Systems Limited

Safety Integrity Levels (Probabilistic, IEC 61508)
SIL   Low demand mode of operation:        High demand mode of operation:
      probability of failure on demand     dangerous failure rate (per year)
 4    >= 10^-5 to < 10^-4                  >= 10^-5 to < 10^-4
 3    >= 10^-4 to < 10^-3                  >= 10^-4 to < 10^-3
 2    >= 10^-3 to < 10^-2                  >= 10^-3 to < 10^-2
 1    >= 10^-2 to < 10^-1                  >= 10^-2 to < 10^-1
Copyright © 2001 Praxis Critical Systems Limited

Development for Different SILs (IEC 61508)
                      SIL1      SIL2      SIL3                        SIL4
Specification         Informal  Informal  Semi-Formal                 Formal
Prototyping           R         R         R                           R
Coding                HLL       HLL       Safe-Subset HLL preferred   Safe-Subset HLL
Defensive Code        R         HR        HR                          HR
Static Analysis       R         HR        HR                          HR
Formal Proof          -         R         R                           HR
Dynamic Testing       R         HR        HR                          HR
Performance Testing   R         R         HR                          HR
Partial summary of IEC 61508 recommendations for SILs (R = recommended, HR = highly recommended).
Copyright © 2001 Praxis Critical Systems Limited

IEC 61508: Software Detailed Design
• 0 - none
• 1 - structured methodology (CORE, JSD, MASCOT, Yourdon)
• 2 - + semi-formal methods (function block diagrams, data-flow diagrams, pseudo code)
• 3 - + formal methods (VDM, Z, CCS, CSP, HOL, OBJ, LOTOS, Petri nets, state transition diagrams)
• 4 - + formal proof to establish conformance to software requirements specification.
Copyright © 2001 Praxis Critical Systems Limited

EN 50128: Software Development and
Implementation Requirements
TECHNIQUE/MEASURE                                   SIL 2   SIL 4
Formal Method (eg VDM, Z)                           R       HR
Design and Coding Standards                         HR      M
Language Subset (eg SPARK Ada)                      -       HR
Functional/Black Box Testing                        HR      M
Data Recording and Analysis (eg DRACAS, FRACAS)     HR      M
Copyright © 2001 Praxis Critical Systems Limited

Common SIL Myths
• “What are the safety requirements for the
system?”
• “It needs to be SIL4”
Copyright © 2001 Praxis Critical Systems Limited

Common SIL Myths
• “We’ve identified a new hazard, so there are
new safety requirements”
• “That’s ok, the software’s SIL4 so it won’t do
that”
Copyright © 2001 Praxis Critical Systems Limited

Common SIL Myths
• “This software is SIL0 so it’s unsafe”
• “The operating system is SIL3 so it must be
safe”
Copyright © 2001 Praxis Critical Systems Limited

What Is a SIL?
• Very similar to stars for hotel ratings!
Copyright © 2001 Praxis Critical Systems Limited

Safety Requirements and SILs
• Safety requirements define what we have to
demonstrate
• SILs define how thorough the demonstration
needs to be
• SILs apply to specific requirements
• Applying a consistent process to a software
package is a sensible management approach,
but may have little direct relation to safety
Copyright © 2001 Praxis Critical Systems Limited

Safety-related Software Techniques
• Common aspects:
– Hazard/risk analysis
– Safety integrity levels
– Lifecycle definition
– Formal methods for requirements and design
– Modelling, for various purposes
– Choice of language/RTOS etc.
– Validation, verification and testing (VVT)
– Accumulation of evidence – safety case
Copyright © 2001 Praxis Critical Systems Limited

Lifecycle Definition
The V Model
(Diagram: the left-hand leg runs Req'ts Spec -> HL Design -> Detailed Design -> Code; each level on the left has a corresponding test activity on the right-hand leg.)
• Some standards are lifecycle independent.
• Others are explicit that the V-Model is assumed.
• QMS will define deliverables:
– ISO9000-3
• There are Version and Configuration Management issues.
Copyright © 2001 Praxis Critical Systems Limited

Safety-related Software Techniques
• Common aspects:
– Hazard/risk analysis
– Safety integrity levels
– Lifecycle definition
– Formal methods for requirements and design
– Modelling, for various purposes
– Choice of language/RTOS etc.
– Validation, verification and testing (VVT)
– Accumulation of evidence – safety case
Copyright © 2001 Praxis Critical Systems Limited
more later

Safety-related Software Techniques
• Common aspects:
– Hazard/risk analysis
– Safety integrity levels
– Lifecycle definition
– Formal methods for requirements and design
– Modelling, for various purposes
– Choice of language/RTOS etc.
– Validation, verification and testing (VVT)
– Accumulation of evidence – safety case
Copyright © 2001 Praxis Critical Systems Limited

Evidence and the Safety Case
• A successful safety assessment depends on the inspection of evidence. Examples:
– Config. management plan (including environment)
– Software verification plan
– Software quality assurance plan
– Standards for design/coding
– Specifications for requirements/design/tests
– Software verification results
– Software configuration index
– Traceability matrix
Copyright © 2001 Praxis Critical Systems Limited

Resources
• Ainsworth, Mike; Eastaughffe, Katherine; Simpson, Alan: Safety Cases for Software Intensive Systems. In Aspects of Safety Management, Redmill & Anderson (Eds). Springer 2001. ISBN 1-85233-411-8
Copyright © 2001 Praxis Critical Systems Limited

Programme
• Introduction
• What is High Integrity Software?
• Reliable Programming in Standard Languages
– Coffee
• Standards Overview
• Def Stan 00-55 and SHOLIS
– Lunch
• DO178B and the Lockheed C130J
• ITSEC, Common Criteria and Mondex
– Tea
• Compiler and Run-time Issues
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Outline
• UK Defence Standards 00-55 and 00-56
– What are they?
– Who’s using them?
• Main Principles & Requirements
• Example Project - the SHOLIS
Copyright © 2001 Praxis Critical Systems Limited

Defence Standards 00-55 and 00-56
• "Requirements for Safety related Software in Defence Equipment", UK Defence Standard 00-55
– Interim issue 1991
– Final issue 1997
• “Safety Management Requirements for Defence
Systems” UK Defence Standard 00-56, 1996
• Always used together
Copyright © 2001 Praxis Critical Systems Limited

Who’s using 00-55 and 00-56
• Primarily UK-related defence projects
• SHOLIS (more of which later…)
• Tornado ADV PS8 MMS
• Harrier II SMS
• Ultra SAWCS
• Lockheed C130J (via a mapping from DO-178B)
• Has (sadly) little influence in the USA
Copyright © 2001 Praxis Critical Systems Limited

UK MOD 00-56
• System-wide standard
• Defines the safety analysis process to be used
to determine safety requirements, including
SILs
• SILs are applied to components
• Promotes an active, managed approach to
safety and risk management
Copyright © 2001 Praxis Critical Systems Limited

Def. Stan. 00-56 Main Activities
(Diagram of the main activities: Safety Programme Plan; Hazard Identification and Refinement; Risk Estimation; Safety Requirements; Corrective Action; Safety Verification; Safety Compliance Assessment; Hazard Log; Safety Case Construction.)
Copyright © 2001 Praxis Critical Systems Limited

UK Defence Standard 00-55
• Applied to software components of various SILs
• Defines process and techniques to be followed
to achieve level of rigour appropriate to SIL
• A Software Safety Case must be produced:
– to justify the suitability of the software development
process, tools and methods
– to present a well-organized and reasoned justification, based on objective evidence, that the software does or will satisfy the safety aspects of the Software Requirement
Copyright © 2001 Praxis Critical Systems Limited

00-55: Software Development Methods
Technique/Measure      SIL2   SIL4
Formal Methods         J1     M
Structured Design      J1     M
Static Analysis        J1     M
Dynamic Testing        M      M
Formal Verification    J1     M
M = “Must be applied”
J1 = “Justification to be provided if not followed.”
Copyright © 2001 Praxis Critical Systems Limited

Comparison With DO-178B
• RTCA DO-178B
– MC/DC testing
– Comparatively little about process
– Level A to Level E
– Focus on software correctness
• Def Stan 00-55
– Heavy emphasis on formal methods & proof
– Still stringent testing requirements
– S1 to S4
– Focus on software safety
Copyright © 2001 Praxis Critical Systems Limited

00-55 and Independent Safety Audit
• 00-55 requires a project to have an Independent
Safety Auditor (ISA)
• At SIL4, this engineer must come from a company independent of the developers.
• The ISA carries out:
– Audit - of actual process against plans
– Assessment - of product against safety requirements
• ISA has sign-off on the safety-case.
• Compare with role of DO-178B DER.
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS
• The Ship/Helicopter Operating Limits
Instrumentation System.
• Assesses and advises on the safety of
helicopter flying operations on board RN and
RFA vessels.
• Ship/Helicopter Operating Limits (SHOLs) for
each ship/helicopter/scenario in a database.
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS - The Problem
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS background
• Power Magnetics and Electronic Systems Ltd.
(now part of Ultra Electronics Group) - Prime
Contractor.
– System architecture and Hardware. (NES 620)
• Praxis Critical Systems Ltd. - all operational
software to Interim Def-Stan. 00-55 (1991)
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS block diagram
(Block diagram: DPU1 and DPU2 data processing units, CLM change-over logic module, ABCB auxiliary bridge control box, BDU bridge display unit, FDDU flight-deck display unit, plus the ship's data bus and UPSs.)
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS LRUs
• DPU1, DPU2 - Data Processing Units.
• BDU - Bridge display unit
• FDDU - Flight-deck display unit
• CLM - Change-over logic module
• ABCB - Auxiliary Bridge Control Box
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS FDDU on test...
• © Ultra Electronics 1999
Copyright © 2001 Praxis Critical Systems Limited

Fault tolerance
• Fault tolerance by hot-standby.
• Only one DPU is ever actually controlling the
displays.
• CLM is high-integrity H/W and selects which DPU is "selected", but the ABCB switch can override the choice.
• DPUs run identical software and receive same
inputs.
Copyright © 2001 Praxis Critical Systems Limited

Fault tolerance(2)
• Unselected DPU outputs to display, but display
simply ignores the data.
• DPUs are not synchronised in any way - both
run free on their own clock source.
• DPUs at either end of ship in case one becomes
damaged.
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS history
• Phase 1
– Prototype demonstrating concept, U/I and major S/C
functions. Single DPU and display. Completed 1995
• Phase 2
– Full system. Software to Interim Def-Stan. 00-55.
Commenced 1995.
• Sea Trials
– Autumn 1998 onwards. Zero reported faults or anomalies.
• SHOLIS 2
– Summer 2001 - cosmetic upgrades, then full deployment.
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS software - challenges
• First system to Interim DS 00-55
– Many risks:
• Scale of Z specification
• Program refinement and proof effort
• Complex User-Interface
• Real-time, fault-tolerance and concurrency
• Independent V&V and Safety Authority.
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS - Exceptions from Interim 00-55
• A very small number of exceptions from 00-55:
• No formal object code verification (beyond state of art?)
• No diverse proof checker (no such tool!)
• No “back-to-back” test against an “executable prototype”
(what does this mean?)
Copyright © 2001 Praxis Critical Systems Limited

Software process (simplified)
Requirements - English
SRS          - Z & English
SDS          - SPARK, Z, English
Code         - SPARK
Copyright © 2001 Praxis Critical Systems Limited

Documents
• Requirements - over 4000 statements, all
numbered.
• SRS - c. 300 pages of Z, English, and Maths.
• SDS - adds implementation detail: refined Z
where needed, scheduling, resource usage,
plus usual “internal documentation.”
Copyright © 2001 Praxis Critical Systems Limited

The code...
• Approx. 133,000 lines:
• 13,000 declarations
• 14,000 statements
• 54,000 SPARK flow annotations
• 20,000 SPARK proof annotations
• 32,000 comment or blank
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS Z specification - example
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS code - example - specification
--------------------------------------------------------------------
-- Up10
--
-- Purpose:
--   Incrementally cycles the tens digit of the given integer.
--
-- Traceunit: ADS.InjP.Up10.SC
-- Traceto  : SDS.InjP.Up10.SC
--------------------------------------------------------------------
function Up10 ( I : in InjPTypes.HeadingValTypeT )
  return BasicTypes.Natural32;
--# return Result =>
--#   (((I/10) mod 10) /= 9 -> Result = I+10) and
--#   (((I/10) mod 10) =  9 -> Result = I-90) and
--#   Result in BasicTypes.Natural32;
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS - code example - body
-- Worst case serial chars : 0;
-- Worst case time (est)   : 4*LINES;
--
-- Traceunit: ADB.InjP.Up10.SC
-- Traceto  : SDS.InjP.Up10.SC
--------------------------------------------------------------------
function Up10 ( I : in InjPTypes.HeadingValTypeT )
  return BasicTypes.Natural32
is
   R : BasicTypes.Natural32;
begin
   if ((I/10) mod 10) /= 9 then
      R := I+10;
   else
      R := I-90;
   end if;
   return R;
end Up10;
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS code example - proof
• Up10 gives rise to 5 verification conditions - 3
for exception freedom (can you spot them all?!)
and 2 for partial correctness.
• The SPADE Simplifier proves 3 of them
automatically.
• The remaining 2 depend on knowing:
– Integer32’Base’Range
– ((I >= 0) and (((I / 10) mod 10) = 9)) -> (I >= 90)
Copyright © 2001 Praxis Critical Systems Limited
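
A sketch of the interesting obligation on the else branch (a reconstruction, not actual Examiner output):

   hypotheses : I in InjPTypes.HeadingValTypeT
                ((I/10) mod 10) = 9
   conclusion : R = I - 90 meets the --# return annotation
                I - 90 in BasicTypes.Natural32

The lower bound (I - 90 >= 0) follows from the second fact above; the absence of overflow follows from Integer32'Base'Range.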

Verification activities(1)
• SRS - review plus Z proof
• Consistency of global variables and constants, existence
of initial state, derivation of preconditions, safety
properties: (CADiZ and “formal partial proof”)
• Traceability
• SDS - review, more Z proof, traceability.
Copyright © 2001 Praxis Critical Systems Limited

Verification activities(2)
• Code
– Development team:
• SPARK Static analysis. Worst-case timing, stack use,
and I/O. Informal tests. Program proof.
– IV&V Team:
• Module and Integration Tests based on Z specifications and requirements. 100% statement and MC/DC coverage. (AdaTest plus custom tools)
Copyright © 2001 Praxis Critical Systems Limited

Verification activities(3)
• IV&V Team (cont…)
• System Validation Tests based on requirements.
• Acceptance test.
• Endurance test.
• Performance test.
• Review of everything…including code refinement, proof
obligations, proof scripts, proof rules, documents...
Copyright © 2001 Praxis Critical Systems Limited

Traceability
• Every requirement, SRS paragraph, SDS
paragraph, Z schema, SPARK fragment etc. is
identified using a consistent naming scheme.
• “Trace-units” map between levels.
• Automated tools check completeness and
coverage.
Copyright © 2001 Praxis Critical Systems Limited

Static analysis
• Timing - via comments giving WCET in terms of
statements executed and unit calls.
• PERL Script collects and evaluates.
• Execution time of “1 statement” obtained by experiment
+ safety margin.
• Actually quite useful! Not as good as a “real” WCET
tool, but the best we could do.
Copyright © 2001 Praxis Critical Systems Limited

Static analysis(2)
• Display output
• Output to display is via a 38400 baud serial line. Limited
bandwidth.
• A problem is either the display not being updated rapidly enough, or buffer overflow.
• Again, expressions are given in comments giving the worst-case
number of characters that can be queued by every procedure.
PERL script evaluates.
• Some units very carefully programmed to minimize output.
Copyright © 2001 Praxis Critical Systems Limited

Static analysis(3)
• Stack usage
– SPARK is non-recursive.
– Careful coding to avoid large temporary objects and heap
allocation.
– Static analysis of generated code yields call tree and worst-case stack usage.
Copyright © 2001 Praxis Critical Systems Limited

Project status: January 2001
• System has passed all System Validation,
Endurance, and Performance Tests.
• Acceptance Test with customer completed with
no problems.
• Sea trial during Autumn 1998.
• Some cosmetic changes requested, but no
reported faults, crashes, or unexpected
behaviour.
Copyright © 2001 Praxis Critical Systems Limited

Some metrics
• Faults
• Any deliverable, once handed to IV&V, goes under
“change control” - no change unless under a formal
fault-report or change-request.
• All fault reports classified - who found it, what project
phase, what impact etc. etc.
• Faults range from simple clerical errors to observable
system failures.
Copyright © 2001 Praxis Critical Systems Limited

Fault metrics
Project Phase                   Faults Found (%)   Effort (%)
Specification                          3               5
Z Proof                               12               3
Design, Code & Informal Test          28              19
Unit & Integration Test               21              26
System Validation Tests               32               9.5
Acceptance Test                        0               1.5
Code Proof                             4               4
Copyright © 2001 Praxis Critical Systems Limited

(Chart: fault metrics from SHOLIS, showing the number of faults found and the detection efficiency for each project phase: Specification, Z Proof, High-level Design, Code, Unit Test, Integration Test, Code Proof, System Validation, Acceptance, Other.)
Copyright © 2001 Praxis Critical Systems Limited

Faults - analysis
• Z proof and System Validation Tests were most
cost-effective.
• Traditional “module testing” was arduous and
found few faults, except in fixed-point numerical
code.
Copyright © 2001 Praxis Critical Systems Limited

Proof metrics
• Probably the largest program proof effort
attempted…
– c. 9000 VCs - 3100 Functional & Safety properties, 5900 from
RTC generator.
– 6800 discharged by simplifier (hint: buy a bigger
workstation!)
– 2200 discharged by SPARK proof checker or “rigorous
argument.”
Copyright © 2001 Praxis Critical Systems Limited

Proof metrics - comments
• Simplification of VCs is computationally intensive, so
buy the most powerful server available.
• (1998 comment) A big computer is far cheaper than the time of
the engineers using it!
• (Feb. 2001 comment) Times have changed - significant proofs
can now be attempted on a £1000 PC!
• Proof of exception-freedom is extremely useful, and
gives real confidence in the code.
• Proof is still far less effort than module testing.
Copyright © 2001 Praxis Critical Systems Limited

Difficult bits...
• User-interface.
• Tool support.
• Introduced state.
Copyright © 2001 Praxis Critical Systems Limited

User Interface
• Sequential code & serial interface to displays.
• Driving an essentially parallel user-interface is
difficult.
• e.g. Updating background pages, run-indicator, button
tell-backs etc.
• Some of the non-SIL4 displays were complex,
output-intensive and under-specified in SRS.
Copyright © 2001 Praxis Critical Systems Limited

Tool support
• SPARK tools are now much better than they were five years ago! Over 50 improvements identified as a result of SHOLIS.
• SPARK 95 would have helped.
• Compiler has been reliable, and generates good code.
• Weak support in SPARK proof system for fixed and floating point.
• Many in-house static analysis tools developed: WCET analysis, stack analysis, requirements traceability tools all new and successful.
Copyright © 2001 Praxis Critical Systems Limited

Introduced state
• Some faults were due to introduced state:
– Optimisation of graphics output.
– Device driver complexity.
– Co-routine mechanisms.
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS - Successes
• One of the largest Z/SPARK developments ever.
– Z proof work proved very effective.
• One of the largest program proof efforts ever
attempted.
• Successful proof of exception-freedom on
whole system.
• Proof of system-level safety-properties at both Z
and code level.
Copyright © 2001 Praxis Critical Systems Limited

SHOLIS - Successes(2)
• Strong static analysis removes many common faults
before they even get a chance to arrive.
– Software Integration was trivial.
• Successful use of static analysis of WCET and stack use.
• Successful mixing of SIL4 and non-SIL4 code in one
program using SPARK static analysis.
• The first large-scale project to meet 00-55 SIL4. SHOLIS
influenced the revision of 00-55 between 1991 and 1997.
Copyright © 2001 Praxis Critical Systems Limited

00-55/56 Resources
• http://www.dstan.mod.uk/
• “Is Proof More Cost-Effective Than Testing?”
King, Chapman, Hammond, and Pryor. IEEE
Transactions on Software Engineering, Volume
26, Number 8. August 2000.
Copyright © 2001 Praxis Critical Systems Limited

Programme
• Introduction
• What is High Integrity Software?
• Reliable Programming in Standard Languages
– Coffee
• Standards Overview
• Def Stan 00-55 and SHOLIS
– Lunch
• DO178B and the Lockheed C130J
• ITSEC, Common Criteria and Mondex
– Tea
• Compiler and Run-time Issues
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Lockheed C130J
Copyright © 2001 Praxis Critical Systems Limited

C130J Key Features
• A new propulsion system with 29% more thrust and 15% better fuel
efficiency.
• An all composite six-blade propeller system which is lighter in
weight and has fewer moving parts than previous Hercules propellers.
• Advanced avionics technology includes Liquid Crystal Display
(LCD) instrument readouts for aircraft flight control, operating
systems, and navigation.
• Two mission computers and two backup bus interface units provide
dual redundancy for the Hercules' systems. These computers also
provide for an integrated diagnostics system to advise the crew of the
status of the aircraft's various systems.
Copyright © 2001 Praxis Critical Systems Limited

Copyright © 2001 Praxis Critical Systems Limited

C130J Software Certification
[Diagram: the C130J software is certified on the civil side against DO-178B with the FAA, and accepted on the military side with the RAF as lead customer; a UK MoD IV&V programme provides comparisons between the two routes]
Copyright © 2001 Praxis Critical Systems Limited

Copyright © 2001 Praxis Critical Systems Limited

RTCA DO-178B:
Software Considerations in Airborne Systems
and Equipment Certification
• Industry-produced guidelines
– Consensus document for avionics software good practice
– Not a process document
– Not a standard
• Theoretically provides guidelines
• In practice:
– Endorsed by the FAA and JAA
– The “Bible” for civil avionics software certification
• Compliance established by DERs (Designated Engineering Representatives)
– Compliance here means process compliance
Copyright © 2001 Praxis Critical Systems Limited

Criticality Levels
• Levels assigned according to consequence of
failure
– Level A: catastrophic
– Level B: hazardous/severe-major
– Level C: major
– Level D: minor
– Level E: no effect
• DO-178B processes vary according to level
• Only A & B likely to be regarded as safety-critical
Copyright © 2001 Praxis Critical Systems Limited

Verification Activities
• For each development output, the standard requires
either:
– Tool that produced that output to be qualified; or
– The output itself to be verified
• Since most tools in the object code generation
path are not qualified
– The object code itself must be verified
– Achieved by specified test coverage criteria
– Coverage can be of object code itself or of source code if
source-to-object traceability can be achieved
Copyright © 2001 Praxis Critical Systems Limited

Tool Qualification
• Software development tools
– Tools whose output forms part of the airborne software (e.g.
Compiler)
• Software verification tools
– Tools which cannot introduce errors but may fail to detect
them (e.g. Static analyser)
• Qualification of tools essentially requires tool to
be developed to same standard as system on
which it will be used
Copyright © 2001 Praxis Critical Systems Limited

Test Coverage Requirements
• Level A: MC/DC (Modified Condition/Decision Coverage)
• Level B: branch (decision) coverage
• Level C: statement coverage
• Level D/E: no specific requirements
Copyright © 2001 Praxis Critical Systems Limited

What Is MC/DC?
• Decision coverage - each branch/path is executed.
• Condition coverage - each value affecting the decision is executed.
• Modified Condition/Decision coverage - each condition is shown to
independently affect the outcome of the decision.

   if A and B then
      X := X + 1;
   elsif C or D then
      Y := Y + 1;
   end if;

[The original slide also compares the number of test cases each
criterion requires for the decisions (A and B) and (C or D) above.]
Copyright © 2001 Praxis Critical Systems Limited
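As an aside (an illustration, not part of the original slide): MC/DC of
the single decision (A and B) needs three test cases, each condition
being toggled while the other is held fixed so that the decision
outcome changes:

   A = True,  B = True    =>  (A and B) = True    -- baseline
   A = False, B = True    =>  (A and B) = False   -- A alone flips the decision
   A = True,  B = False   =>  (A and B) = False   -- B alone flips the decision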

Establishing Coverage
Copyright © 2001 Praxis Critical Systems Limited

Coverage Analysis
• Show that coverage required for level has been
achieved
• Confirm the data- and control-flow couplings
between components
• Coverage analysis can take place at source
level unless:
– The code is level A; and
– The object code is not directly traceable from the source
code
• e.g. An Ada run-time check
Copyright © 2001 Praxis Critical Systems Limited

Inadequate Coverage
• Shortcomings in requirements-based test cases
– Add more tests
• Inadequate requirements
– Modify requirements, add more tests as necessary
• Dead code (always semantically infeasible)
– Remove, reverify if necessary
• Deactivated code
– Show that code cannot be called in delivered configuration; or
– force test cases to execute it
Copyright © 2001 Praxis Critical Systems Limited

Coverage “Cheating”
Properties of a Tomato
function IsRed return Boolean is
begin
   return True;
end IsRed;

function IsBig return Boolean is
begin
   return False;
end IsBig;

function IsEdible return Boolean is
begin
   return True;
end IsEdible;

3 tests required
Copyright © 2001 Praxis Critical Systems Limited

Coverage “Cheating”
Properties of a Tomato
type Properties is (Red, Big, Edible);

Tomatoes : constant array (Properties) of Boolean :=
   (Red    => True,
    Big    => False,
    Edible => True);

function GetProperty (P : Properties) return Boolean is
begin
   return Tomatoes (P);
end GetProperty;

1 test required
Copyright © 2001 Praxis Critical Systems Limited

DO-178B Critique
• Software correctness standard not a system safety
standard
– Cannot construct arguments to show how hazards are mitigated
– Can only state “whole box done to level A”
• Undue emphasis on test
– Little or no credit for worthwhile activity such as analysis and reviews
– Can create an environment where getting to test quickly is seen as the
prime aim
– MC/DC widely misinterpreted as “exhaustive” testing
– Deactivated code issues create problems for “good” languages like Ada
• DER role is establishing process compliance not system
safety
Copyright © 2001 Praxis Critical Systems Limited

Economic Arguments
• DO-178B gives little credit for anything except
testing
• The required testing is harder for Ada than C
• Therefore the best solution is to “hack in C”?
No!
• MC/DC testing is very expensive
– some estimates say x5 cost
– needs test rig: back-end, high-risk process
• Therefore processes that reduce errors going
into formal test are very worthwhile
Copyright © 2001 Praxis Critical Systems Limited

C130J Development Process - Principles
• Emphasis on correctness by construction (CbC)
– Focus on “doing it right first time”
– Verification-driven development
– Requirements of DO-178B kept in mind but not allowed to dominate
• “Formality” introduced early
– In specification
– In programming language
• Application domain rather than solution domain focus
– e.g. Identification of common paradigms
• Risk management
– e.g. “Thin-slice” prototyping
Copyright © 2001 Praxis Critical Systems Limited

Requirements and Specification
• CoRE (Consortium Requirements Engineering)
– Parnas tables to specify input/output mappings
• Close mapping between CoRE requirements
and code for easy tracing
• Functional test cases obtained from CoRE tables
Copyright © 2001 Praxis Critical Systems Limited

CoRE Elements
[Diagram: the CoRE four-variable model - Monitored Variables in the
environment are sensed by Input Devices (IN relation) to give Input Data
Items; the Software (SOFT relation) computes Output Data Items, which
drive Output Devices (OUT relation) to set Controlled Variables back in
the environment]
Copyright © 2001 Praxis Critical Systems Limited

REQ Template
   MON                MON                CON
   <value or range>   <value or range>   <f(MONs, TERMs) or value>
Copyright © 2001 Praxis Critical Systems Limited

CoRE REQ Example
[Table: an example REQ relation for the <engine> fuel system - rows map
the monitored auto-transfer method, transfer class and crossfeed class
(manual/auto, from/to, off), together with the monitored tank fuel
level, onto the controlled fuel control valve class (open or shut)]
Copyright © 2001 Praxis Critical Systems Limited

Implementation
• Close mapping from requirements: “one
procedure per table”
• Templates for common structures
– e.g. Obtaining raw data from the bus and providing an abstract
view of it to the rest of the system (“devices”)
• Coding in SPARK (150K SLOC)
• Static analysis as part of the coding process
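• A hedged sketch of the “one procedure per table” idea (the table,
types and names below are invented, not taken from the C130J code);
each branch corresponds to one row of a CoRE REQ table:

   type Mode    is (Manual, Auto);
   type Command is (Open, Shut);

   procedure Set_Valve (M : in Mode; Level_OK : in Boolean;
                        Cmd : out Command)
   --# derives Cmd from M, Level_OK;
   is
   begin
      if M = Manual then
         Cmd := Open;    -- row 1 of the (invented) table
      elsif Level_OK then
         Cmd := Open;    -- row 2: auto transfer while the level permits
      else
         Cmd := Shut;    -- row 3
      end if;
   end Set_Valve;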
Copyright © 2001 Praxis Critical Systems Limited

An Unexpected Benefit of SPARK
• SPARK requires all data interactions to be
described in annotations
• Some interactions were not clearly specified in
the CoRE tables
– e.g. Validity flags
• Coders could not ignore the problem
• Requirements/specification document became a
“live”, dynamic document as ambiguities were
resolved during coding
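• For example (a hedged sketch with invented names, not C130J code),
the flow annotations make both the data value and its validity flag
explicit, so the Examiner rejects code that ignores either:

   subtype Knots is Integer range 0 .. 600;

   procedure Read_Airspeed (Raw   : in  Integer;
                            Speed : out Knots;
                            Valid : out Boolean)
   --# derives Speed, Valid from Raw;
   is
   begin
      Valid := Raw in Knots'First .. Knots'Last;
      if Valid then
         Speed := Raw;
      else
         Speed := 0;   -- defined value even when the raw input is invalid
      end if;
   end Read_Airspeed;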
Copyright © 2001 Praxis Critical Systems Limited

Integrated Formality
[Diagram: the CoRE Specification yields both Templates and Test Cases;
the Templates lead to SPARK Code, which is verified by Static Analysis
and by Testing against the test cases]
Copyright © 2001 Praxis Critical Systems Limited

The Value of Early Static Analysis
Example - an Aircrew Warning System
A specification for this function might be:

   mon_Event    con_Bell
   Warning      True
   Caution      True
   Advisory     False

An (incorrect) implementation might be:

   type Alert is (Warning, Caution, Advisory);

   function RingBell (Event : Alert) return Boolean
   is
      Result : Boolean;
   begin
      if Event = Warning then
         Result := True;
      elsif Event = Advisory then
         Result := False;
      end if;
      return Result;
   end RingBell;
Copyright © 2001 Praxis Critical Systems Limited

Example Contd. - Test Results
• There is a very high probability that this code would pass MC/DC
testing
• The code returns True correctly in the case of Event=Warning
and False correctly in the case of Event=Advisory
• In the case of Caution it returns a random value picked up from
memory; however, there is a very high probability that this random
value will be non-zero and will be interpreted as True, which is the
expected test result
• The test result may depend on the order the tests are run (testing
Advisory before Caution may suddenly cause False to be
returned for example)
• The results obtained during integration testing may differ from those
obtained during unit testing
Copyright © 2001 Praxis Critical Systems Limited

Example Contd. - Examiner Output
  13  function RingBell(Event : Alert) return Boolean
  14  is
  15     Result : Boolean;
  16  begin
  17     if Event = Warning then
  18        Result := True;
  19     elsif Event = Advisory then
  20        Result := False;
  21     end if;
  22     return Result;
                ^1
??? ( 1)  Warning : Expression contains reference(s) to variable Result,
          which may be undefined.
  23  end RingBell;
??? ( 2)  Warning : The undefined initial value of Result may be used in
          the derivation of the function value.
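A corrected implementation (one possible fix, matching the specification
table and kept in a SPARK-friendly single-return style) might be:

   function RingBell (Event : Alert) return Boolean is
      Result : Boolean;
   begin
      case Event is
         when Warning | Caution => Result := True;
         when Advisory          => Result := False;
      end case;
      -- The case statement must cover every alert class, so Result is
      -- always defined before it is returned.
      return Result;
   end RingBell;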
Copyright © 2001 Praxis Critical Systems Limited

The Lockheed C130J and C27J Experience
• The testing required by DO-178B was greatly
simplified:
– “Very few errors have been found in the software during
even the most rigorous levels of FAA testing, which is
being successfully conducted for less than a fifth of the
normal cost in industry”
• Savings in verification are especially valuable
because it is a large part of the overall cost
– “This level A system was developed at half of typical cost
of non-critical systems”
• Productivity gains:
– X4 on C130J compared to previous safety-critical projects
– X16 on C27J with re-use and increased process maturity
Copyright © 2001 Praxis Critical Systems Limited

C130J Software Certification
[Diagram, repeated: civil certification against DO-178B with the FAA,
military acceptance with the RAF as lead customer, and the UK MoD IV&V
programme providing comparisons between the two]
Copyright © 2001 Praxis Critical Systems Limited

UK MoD IV&V Programme
• The UK MoD commissioned Aerosystems
International to perform retrospective static
analysis of all the C130J critical software
• A variety of tools used: e.g.
– SPARK Examiner for “proof” of SPARK code against CoRE
– MALPAS for Ada
• All “anomalies” investigated and “sentenced”
by system safety experts
• Interesting comparisons obtained
Copyright © 2001 Praxis Critical Systems Limited

Aerosystems’ IV&V Conclusions
• Significant, safety-critical errors were found by
static analysis in code developed to DO-178B
Level A
• Proof of SPARK code was shown to be cheaper
than other forms of semantic analysis
performed
• SPARK code was found to have only 10% of the
residual errors of full Ada and Ada was found to
have only 10% of the residual errors of C
• No statistically significant difference in residual
error rate could be found between DO-178B
Level A, Level B and Level C code
Copyright © 2001 Praxis Critical Systems Limited

Language Metrics
[Chart: SLOC per anomaly for each C130J subsystem (BAU II, FMCS, GCAS,
BAECS, NIU, FOD, DADS, ECBU, IDP, HUD), grouped by implementation
language (SPARK Ada, Ada, LUCOL, C, Assembler, PLM); SPARK Ada shows the
most SLOC per anomaly, i.e. the lowest anomaly density. Roughly 1% of
the anomalies have safety implications.]
Copyright © 2001 Praxis Critical Systems Limited

Resources
• RTCA-EUROCAE: Software Considerations in Airborne Systems and Equipment Certification. DO-178B/ED-12B. 1992.
• Croxford, Martin and Sutton, James: Breaking through the V&V Bottleneck. Lecture Notes in Computer Science Volume 1031, 1996, Springer-Verlag.
• Software Productivity Consortium. www.software.org
• Parnas, David L: Inspection of Safety-Critical Software Using Program-Function Tables. IFIP Congress, Vol. 3, 1994: 270-277.
• Sutton, James: Cost-Effective Approaches to Satisfy Safety-critical Regulatory Requirements. Workshop Session, SIGAda 2000.
• German, Andy and Mooney, Gavin: Air Vehicle Static Code Analysis - Lessons Learnt. In Aspects of Safety Management, Redmill & Anderson (Eds). Springer 2001. ISBN 1-85233-411-8.
Copyright © 2001 Praxis Critical Systems Limited

Programme
• Introduction
• What is High Integrity Software?
• Reliable Programming in Standard Languages
– Coffee
• Standards Overview
• Def Stan 00-55 and SHOLIS
– Lunch
• DO178B and the Lockheed C130J
• ITSEC, Common Criteria and Mondex
– Tea
• Compiler and Run-time Issues
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Outline
• UK ITSEC and Common Criteria schemes
– What are they?
– Who’s using them?
• Main Principles
• Main Requirements
• Practical Consequences
• Example Project - the MULTOS CA
Copyright © 2001 Praxis Critical Systems Limited

The UK ITSEC Scheme
• The “I.T. Security Evaluation Criteria”
• A set of guidelines for the development of
secure IT systems.
• Formed from an effort to merge the applicable
standards from Germany, UK, France and the
US (the “Orange Book”).
Copyright © 2001 Praxis Critical Systems Limited

ITSEC - Basic Concepts
• The “Target of Evaluation” (TOE) is an IT
System (possibly many components).
• The TOE provides security (e.g. confidentiality,
integrity, availability)
Copyright © 2001 Praxis Critical Systems Limited

ITSEC - Basic Concepts (2)
• The TOE has:
– Security Objectives (Why security is wanted.)
– Security Enforcing Functions (SEFs) (What functionality is
actually provided.)
– Security Mechanisms (How that functionality is provided.)
• The TOE has a Security Target
– Specifies the SEFs against which the TOE will be evaluated.
– Describes the TOE in relation to its environment.
Copyright © 2001 Praxis Critical Systems Limited

ITSEC - Basic Concepts (3)
• The Security Target contains:
– Either a System Security Policy or a Product Rationale.
– A specification of the required SEFs.
– A definition of required security mechanisms.
– A claimed rating of the minimum strength of the mechanisms.
(“Basic, Medium, or High”, based on threat analysis)
– The target evaluation level.
Copyright © 2001 Praxis Critical Systems Limited

ITSEC Evaluation Levels
• ITSEC defines 7 levels of evaluation criteria,
called E0 through E6, with E6 being the most
rigorous.
• E0 is “inadequate assurance.”
• E6 is the toughest! Largely comparable with the
most stringent standards in the safety-critical
industries.
Copyright © 2001 Praxis Critical Systems Limited

ITSEC Evaluation Levels
and Required Information
for Vulnerability Analysis
Copyright © 2001 Praxis Critical Systems Limited

Evaluation
• To claim compliance with a particular ITSEC
level, a system or product must be evaluated
against that level by a Commercial Licensed
Evaluation Facility (CLEF).
• The evaluation report answers “Does the TOE
satisfy its security target at the level of
confidence indicated by the stated evaluation
level?”
• A list of evaluated products and systems is
maintained.
Copyright © 2001 Praxis Critical Systems Limited

ITSEC Correctness Criteria for each Level
• Requirements for each level are organized
under the following headings:
• Construction - The Development Process
– Requirements, Architectural Design, Detailed Design,
Implementation
• Construction - Development Environment
– Configuration Control, Programming Languages and
Compilers, Developer Security
• Operation - Documentation
– User documentation, Administrative documentation
• Operation - Environment
– Delivery and Configuration, Start-up and Operation
Copyright © 2001 Praxis Critical Systems Limited

ITSEC Correctness Criteria - Examples
• Development Environment - Programming
languages and compilers
– E1 - No Requirement
– E3 - Well defined language - e.g. ISO standard.
Implementation dependent options shall be documented.
The definition of the programming languages shall define
unambiguously the meaning of all statements used in the
source code.
– E6 - As E3 + documentation of compiler options + source
code of any runtime libraries.
Copyright © 2001 Praxis Critical Systems Limited

The Common Criteria
• The US “Orange Book” and ITSEC are now
being replaced by the “Common Criteria for IT
Security Evaluation.”
• Aims to set a “level playing field” for developers
in all participating states.
– UK, USA, France, Spain, Netherlands, Germany, Korea,
Japan, Australia, Canada, Israel...
• Aims for international mutual recognition of
evaluated products.
Copyright © 2001 Praxis Critical Systems Limited

CC - Key Concepts
• Defines 2 types of IT Security Requirement:
• Functional Requirements
– Defines behaviour of system or product.
– What a product or system does.
• Assurance Requirements
– For establishing confidence in the implemented security
functions.
– Is the product built well? Does it meet its requirements?
Copyright © 2001 Praxis Critical Systems Limited

CC - Key Concepts (2)
• A Protection Profile (PP) - A set of security
objectives and requirements for a particular
class of system or product.
– e.g. Firewall PP, Electronic Cash PP etc.
• A Security Target (ST) - A set of security
requirements and specifications for a particular
product (the TOE), against which its evaluation
will be carried out.
– e.g. The ST for the DodgyTech6000 Router
Copyright © 2001 Praxis Critical Systems Limited

CC Requirements Hierarchy
• Functional and assurance requirements are
categorized into a hierarchy of:
• Classes
– e.g. FDP - User Data Protection
• Families
– e.g. FDP_ACC - Access Control Policy
• Components
– e.g. FDP_ACC.1 - Subset access control
– These are named in PPs and STs.
Copyright © 2001 Praxis Critical Systems Limited

Evaluation Assurance Levels (EALs)
• The CC defines 7 EALs - EAL1 through EAL7
• An EAL defines a set of assurance
components which must be met.
• For example, EAL4 requires ALC_TAT.1, while
EAL6 and EAL7 require ALC_TAT.3
• EAL7 “roughly” corresponds with ITSEC E6 and
Orange Book A1.
Copyright © 2001 Praxis Critical Systems Limited

The MULTOS CA
• MULTOS is a multi-application operating system
for smart cards.
• Applications can be loaded and deleted
dynamically once a card is “in the field.”
• To prevent forging, applications and card-enablement data are signed by the MULTOS
Certification Authority (CA).
• At the heart of the CA is a high-security
computer system that issues these certificates.
Copyright © 2001 Praxis Critical Systems Limited

The MULTOS CA (2)
• The CA has some unusual requirements:
– Availability - aimed for c. 6 months between reboots, and
has warm-standby fault-tolerance.
– Throughput - system is distributed and has custom
cryptographic hardware.
– Lifetime - of decades, and must be supported for that long.
– Security - most of system is tamper-proof, and is subject to
the most stringent physical and procedural security.
– Was designed to meet the requirements of U.K. ITSEC E6.
• All requirements, design, implementation, and
(on-going) support by Praxis Critical Systems.
Copyright © 2001 Praxis Critical Systems Limited

The MULTOS CA - Development Approach
• Overall process conformed to E6
• Conformed in detail where retro-fitting
impossible:
– development environment security
– language and specification standards
– CM and audit information
• Reliance on COTS for E6 minimized or
eliminated.
– Assumed arbitrary but non-byzantine behaviour
Copyright © 2001 Praxis Critical Systems Limited

Development approach limitations
• COTS not certified (Windows NT, Backup tool,
SQL Server…)
• We were not responsible for operational
documentation and environment
• No formal proof
• No systematic effectiveness analysis
Copyright © 2001 Praxis Critical Systems Limited

System Lifecycle
• User requirements definition with REVEAL™
• User interface prototype
• Formalisation of security policy and top level
specification in Z.
• System architecture definition
• Detailed design including formal process
structure
• Implementation in SPARK, Ada95 and VC++
• Top-down testing with coverage measurement
Copyright © 2001 Praxis Critical Systems Limited

Some difficulties...
• Security Target - What exactly is an SEF?
– No one seems to have a common understanding…
• “Formal description of the architecture of the
TOE…”
– What does this mean?
• Source code or hardware drawings for all
security relevant components…
– Not for COTS hardware or software.
Copyright © 2001 Praxis Critical Systems Limited

The CA Test System
Copyright © 2001 Praxis Critical Systems Limited

Use of languages in the CA
• Mixed language development - the right tools
for the right job!
– SPARK  30%  - “Security kernel” of tamper-proof software
– Ada95  30%  - Infrastructure (concurrency, inter-task and inter-process communications, database interfaces etc.), bindings to ODBC and Win32
– C++    30%  - GUI (Microsoft Foundation Classes)
– C       5%  - Device drivers, cryptographic algorithms
– SQL     5%  - Database stored procedures
Copyright © 2001 Praxis Critical Systems Limited

Use of SPARK in the MULTOS CA
• SPARK is almost certainly the only industrial-strength language that meets the requirements
of ITSEC E6.
• Complete implementation in SPARK was simply
impractical.
• Use of Ada95 is “Ravenscar-like” - simple, static
allocation of memory and tasks.
• Dangerous, or new language features avoided
such as controlled types, requeue, user-defined
storage pools etc.
Copyright © 2001 Praxis Critical Systems Limited

Conclusions - Process Successes
• Use of Z for formal security policy and system
spec. helped produce an indisputable
specification of functionality
• Use of Z, CSP and SPARK “extended” formality
into design and implementation
• Top-down, incremental approach to integration
and test was effective and economic
Copyright © 2001 Praxis Critical Systems Limited

Conclusions - E6 Benefits and Issues
• E6 support of formality is in-tune with our
“Correctness by Construction” approach
– encourages sound requirements and specification
– we are more rigorous in later phases
• High-security using COTS both possible and
necessary
– cf safety world
• E6 approach sound, but clarifications useful
– and could gain even higher levels of assurance...
• CAVEAT
– We have not actually attempted evaluation
– but benefits from developing to this standard
Copyright © 2001 Praxis Critical Systems Limited

ITSEC and CC Resources
• ITSEC
– www.cesg.gov.uk
• Training, ITSEC Documents, UK Infosec Policy, “KeyMat”, “Non
Secret Encryption”
– www.itsec.gov.uk
• Documents, Certified products list, Background information.
• Common Criteria
– csrc.nist.gov/cc
– www.commoncriteria.org
• Mondex
– Ives, Blake and Earl, Michael: Mondex International: Reengineering
Money. London Business School Case Study 97/2. See
http://isds.bus.lsu.edu/cases/mondex/mondex.html
Copyright © 2001 Praxis Critical Systems Limited

Programme
• Introduction
• What is High Integrity Software?
• Reliable Programming in Standard Languages
– Coffee
• Standards Overview
• Def Stan 00-55 and SHOLIS
– Lunch
• DO178B and the Lockheed C130J
• ITSEC, Common Criteria and Mondex
– Tea
• Compiler and Run-time Issues
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Outline
• Choosing a compiler
• Desirable properties of High-Integrity Compilers
• The “No Surprises” Rule
Copyright © 2001 Praxis Critical Systems Limited

Choosing a compiler
• In a high-integrity system, the choice of
compiler should be documented and justified.
• In a perfect world, we would have time and
money to:
– Search for all candidate compilers,
– Conduct an extensive practical evaluation of each,
– Choose one, based on fitness for purpose, technical
features and so on...
Copyright © 2001 Praxis Critical Systems Limited

Choosing a compiler (2)
• But in the real-world…
• Candidate set of compilers may only have 1
member!
• Your client’s favourite compiler is already
bought and paid for…
• Bias and/or familiarity with a particular product
may override technical issues.
Copyright © 2001 Praxis Critical Systems Limited

Desirable Properties of an HI compiler
• Much more than just “Validation”
• Annex H support
• Qualification
• Optimization and other “switches”
• Competence and availability of support
• Runtime support for HI systems
• Support for Object-Code Verification
Copyright © 2001 Praxis Critical Systems Limited

What does the HRG Report have to say?
• Recommends validation of appropriate annexes
- almost certainly A, B, C, D, and H. Annex G
(Numerics) may also be applicable for some
systems.
• Does not recommend use of a subset compiler,
although recognizes that a compiler may have a
mode in which a particular subset is enforced.
– Main compiler algorithms should be unchanged in such a
mode.
Copyright © 2001 Praxis Critical Systems Limited

HRG Report (2)
• Evidence required from compiler vendor:
– Quality Management System (e.g. ISO 9001)
– Fault tracking and reporting system
– History of faults reported, found, fixed etc.
– Availability of test evidence
– Access to known faults database
• A full audit of a compiler vendor may be called
for.
Copyright © 2001 Praxis Critical Systems Limited

Annex H Support
• Pragma Normalize_Scalars
– Useful! Compilers should support this, but remember that
many scalar types do not have an invalid representation.
• Documentation of Implementation Decisions
– Yes. Demand this from compiler vendor. If they can’t or
won’t supply such information, then find out why not!
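• For example (a sketch), the pragma only helps where an out-of-range
bit pattern exists:

   pragma Normalize_Scalars;   -- configuration pragma for the partition

   subtype Percent is Integer range 0 .. 100;
   type    Octet   is mod 256;

   P : Percent;  -- an invalid initial value outside 0 .. 100 can be
                 -- chosen, so use before assignment is likely to be
                 -- caught by a run-time check
   B : Octet;    -- every bit pattern is a valid Octet, so no invalid
                 -- initial value exists and such errors may go undetected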
Copyright © 2001 Praxis Critical Systems Limited

Annex H Support (2)
• Pragma Reviewable.
– Useful in theory. Does anyone implement this other than to
“turn on debugging”?
• Pragma Inspection_Point
– Yes please. Is particularly useful in combination with
hardware-level debugging tools such as in-circuit emulation,
processor probes, and logic analysis.
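• A usage sketch (invented procedure): the named objects are guaranteed
to be inspectable in the object code at the point of the pragma:

   procedure Control_Step (Demand, Measured : in Integer;
                           Output : out Integer) is
      Gain  : constant Integer := 4;
      Error : Integer;
   begin
      Error  := Demand - Measured;
      Output := Error * Gain;
      pragma Inspection_Point (Error, Output);
      -- Error and Output can be examined here, e.g. with an in-circuit
      -- emulator or processor probe.
   end Control_Step;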
Copyright © 2001 Praxis Critical Systems Limited

Annex H Support (3)
• Pragma Restrictions
– Useful.
– Some runtime options (e.g. Ravenscar) imply a predefined
set of Restrictions defined by the compiler vendor.
– Better to use a coherent predefined set than to “roll your
own”
– Understand effect of each restriction on code-gen and
runtime strategies.
– Even in SPARK, some restrictions are still useful - e.g.
No_Implicit_Heap_Allocations, No_Floating_Point,
No_Fixed_Point etc.
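– A sketch of such a set as configuration pragmas (these identifiers
are standard Ada 95 restrictions from Annexes D and H, but the right
set is compiler- and project-specific):

   pragma Restrictions (No_Implicit_Heap_Allocations);
   pragma Restrictions (No_Floating_Point);
   pragma Restrictions (No_Fixed_Point);
   pragma Restrictions (No_Recursion);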
Copyright © 2001 Praxis Critical Systems Limited

Compiler Qualification
• “Qualification” (whatever that means) of a full
compiler is beyond reach.
• Pragmatic approaches:
– Avoidance of “difficult to compile” language features in HI
subset.
– In service history
– Choice of “most commonly used” options
– Access to faults history and database
– Verification and Validation
– Object code verification (last resort!)
Copyright © 2001 Praxis Critical Systems Limited

Optimization and other “switches”
• Was the compiler validated using the set of
“switches” that will be used in practice?
• Is the compiler fully supported using those
switches?
• Your choice must be (as with everything else…)
documented and justified.
Copyright © 2001 Praxis Critical Systems Limited

Optimization and High-Integrity
• Optimization remains a difficult issue
• In the past (i.e. 5-10 years ago…):
– Optimizers were seen as “buggy” or unreliable.
– Performance improvement was not that great.
– “High Integrity implies No Optimization” became accepted
practice.
– This position is no longer appropriate in many cases!
Copyright © 2001 Praxis Critical Systems Limited

Optimization and High-Integrity (2)
• Today…
– Optimizers are far more reliable.
– Some optimization is very important in obtaining acceptable
performance from modern architectures.
– Appropriate optimizations: local improvements such as
CSE, register tracking, elimination of redundant loads and
stores etc.
– Inappropriate: Global restructuring, such as loop unrolling,
elimination of partial redundancies etc.
• Seek evidence from compiler vendor as to most
appropriate, widely used setting.
Copyright © 2001 Praxis Critical Systems Limited

Support...
• Support is a crucial issue.
• High Integrity projects tend to “stretch”
compilers, so finding and resolving problems is
common.
• Try to foster a close relationship with compiler
support staff. Become friends with the
technical staff as well.
• You get what you pay for!
Copyright © 2001 Praxis Critical Systems Limited

Runtime Systems and High Integrity
• In delivering a high-integrity system, a run-time
library (RTL) must be verified to the same level
of assurance as the rest of the application.
• Most Ada compiler vendors have responded to
this need with various products:
– Ada83 - SMART, C-SMART (Alsys, TSP); VADSsc (Verdix,
Rational)
– Ada95 - ObjectAda Real-Time/Raven (Aonix); APEX MARK
(Rational); GMART, GSTART (Green Hills); GNAT Pro High-Integrity Edition (ACT)
Copyright © 2001 Praxis Critical Systems Limited

Certifiable runtime systems
• 3 Main approaches:
– “Small, Certifiable” runtime systems
– Ravenscar
– No Runtime
Copyright © 2001 Praxis Critical Systems Limited

“Small, Certifiable” runtime systems
• Marketing goal: largely aimed at meeting the
requirements of DO-178B up to and including
level A systems.
• Technical goal: Elimination of language features
with an unacceptably large runtime impact. e.g.
tasking, heap allocation, exceptions, predefined
I/O etc.
• Examples: Aonix C-SMART and Rational
VADSsc for Ada83.
Copyright © 2001 Praxis Critical Systems Limited

Ravenscar Profile
• A “profile” of tasking and other features in
Ada95 that is appropriate for high-integrity and
hard real-time systems.
• Technical goals:
– Particularly efficient, simple runtime system implementation
on a single processor target.
– Amenable to static timing and schedulability analysis.
• Examples: Aonix ObjectAda™ Real-Time/Raven.
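• A minimal sketch of a Ravenscar-style cyclic task: one library-level
task, no select statements or entries, and time kept with delay until:

   with Ada.Real_Time; use Ada.Real_Time;

   package Cyclic is
      task Controller;            -- created statically, never terminates
   end Cyclic;

   package body Cyclic is
      Period : constant Time_Span := Milliseconds (25);

      task body Controller is
         Next : Time := Clock;
      begin
         loop
            -- one frame of application work would go here
            Next := Next + Period;
            delay until Next;     -- the only form of delay permitted
         end loop;
      end Controller;
   end Cyclic;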
Copyright © 2001 Praxis Critical Systems Limited

No Run Time
• A radical idea - eliminate all language features
which require any runtime library support.
• Advantages: no runtime implies no certification
of any COTS component.
• Example: ACT GNAT Pro High Integrity Edition.
Copyright © 2001 Praxis Critical Systems Limited

Runtime options and development
• In all these runtime options, the Board Support
Package (BSP) remains.
• The BSP provides support for:
– Download and startup
– “Cold Boot” (i.e. from ROM)
– Hardware-specific initialisation
– Debugging (i.e. breakpoint, single step)
– Coverage analysis
– Some predefined, “simple” I/O
Copyright © 2001 Praxis Critical Systems Limited

Runtime options and development
• Another problem: traditional testing techniques
such as unit-test and coverage analysis may not
work on the BSP.
• In some projects, we have fielded 2 BSPs:
• A “Debug” BSP
– Supports all of the above functionality for any application.
• A “Strip” BSP
– Supports only the functionality required for the delivery of a
specific application (e.g. no debug, no I/O
etc.)
• Advantages of Strip BSP: easier (manual)
coverage analysis and testing, no “dead” code.
Copyright © 2001 Praxis Critical Systems Limited

Runtime Systems and High Integrity Subsets
• Design goals and choices in developing an HI
runtime system largely intersect with those of
an HI language.
• SPARK was designed to require little or no
runtime library.
• It is no surprise, then, that SPARK is compatible
with all the products mentioned so far.
• Some expansion of SPARK (towards Ravenscar,
for example), is now possible.
Copyright © 2001 Praxis Critical Systems Limited

Object Code Verification (OCV)
• In the absence of sufficient trust in a compiler,
manual verification of object code may be
required.
• Avoid this if at all possible!
• Requires detailed knowledge: of language, of
compiler, and of target processor.
• Very hard work!
Copyright © 2001 Praxis Critical Systems Limited

OCV (2)
• A central problem is Traceability of source to
object code.
– Source code statements to object code instructions
– Source code declarations to memory layout
– See LRM Annex H.3.1
• Existing traditional support for debugging offers
a partial solution.
– Support could be better, but there is little call for
improvement from customers.
• Complex interaction with language subset and
optimization issues.
Copyright © 2001 Praxis Critical Systems Limited

OCV Techniques - 1 step
• Compare source code with disassembled object
code directly.
• Disadvantages:
– The “semantic gap” between the 2 is very wide (especially
for a rich language like Ada)
– Requires detailed knowledge of Ada compilation, code
generation, language issues, target architecture and so on.
Copyright © 2001 Praxis Critical Systems Limited

OCV Techniques - 2 step
• Step 1 - Review source against compiler-generated intermediate language (IL)
• Step 2 - Review IL against disassembled object
code.
• Advantages
– semantic gaps are narrower.
– Splits skills required.
• Disadvantages
– Not many compilers allow users access to IL.
Copyright © 2001 Praxis Critical Systems Limited

OCV Example
• One OCV exercise found a single nonsensical
code sequence in a program.
• Analysis by 3 different people could not
explain what was going on - was it a
compiler bug?
• No! Turned out to be a bug in the disassembler
(listing the wrong instruction for a certain opcode)
• No other problems found.
Copyright © 2001 Praxis Critical Systems Limited

The “No Surprises” rule
• My own personal rule-of-thumb for HI
programming.
• When programming, try to predict the generated
code, including timing and memory usage.
– Means you have to “know” the compiler and target pretty
well!
• The compiler should never surprise you.
Copyright © 2001 Praxis Critical Systems Limited

An example surprise...
• In the SHOLIS application software, there are
several large constant arrays:
My_Constant : constant Big_Type :=
Big_Type’( ... );
• Compiler is Alsys AdaWorld Ada83 targeting
68040. What code is generated?
Copyright © 2001 Praxis Critical Systems Limited

An example surprise (2)
• The aggregate is not static in Ada83, so is
evaluated at run-time into a temporary variable.
• This temporary variable is larger than 1024
bytes, so compiler puts it on the heap!
declare
type Temp_Ptr is access Big_Type;
Temp : Temp_Ptr;
begin
Temp := RTS.Allocate_Memory(Big_Type’Size);
Temp.all := Big_Type’( ... );
My_Constant := Temp.all;
RTS.Free(Temp);
end;
Copyright © 2001 Praxis Critical Systems Limited

An example surprise (3)
• Unfortunately, SMART runtime does not
implement Heap allocation…oh dear…a big
surprise!
• N.B. Things are much better in Ada95.
• Another rule of thumb: always have someone
on a project team who is capable of reading and
reviewing the object code.
Copyright © 2001 Praxis Critical Systems Limited

Compiler and Runtime Resources
• See the compiler vendors!
• “Re-engineering a Safety-Critical Application
Using SPARK95 and GNORT” R. Chapman and
R. Dewar, in Reliable Software Technologies Ada-Europe 1999, Springer LNCS Vol. 1622.
Copyright © 2001 Praxis Critical Systems Limited

Programme
• Introduction
• What is High Integrity Software?
• Reliable Programming in Standard Languages
– Coffee
• Standards Overview
• Def Stan 00-55 and SHOLIS
– Lunch
• DO178B and the Lockheed C130J
• ITSEC, Common Criteria and Mondex
– Tea
• Compiler and Run-time Issues
• Conclusions
Copyright © 2001 Praxis Critical Systems Limited

Some properties of critical systems
• Building safe systems is not the same as
certification
– the goal is to build a demonstrably safe system
– a demonstrably safe system can be certified to any standard
• Showing a system is safe is invariably harder
than building a safe system
• Quality and safety cannot be retro-fitted
Copyright © 2001 Praxis Critical Systems Limited

A cost-effective approach:
• Requires a “correctness by construction”
approach
– build in safety from the start
– build in quality from the start
– bug prevention not bug detection
• The approach should be “verification driven”
– evidence of suitability should be produced as a side-effect
of the development process
Copyright © 2001 Praxis Critical Systems Limited

No magic
• There is no easy way, no magic wands and no
silver bullets
• Therefore we must deploy range of techniques
that attack all sources of risk and error
– hazard analysis
– requirements capture
– specification
– programming languages
– code
– analysis
– test
Copyright © 2001 Praxis Critical Systems Limited

The Role of Ada
• Superficially Ada is not a good match to DO-178B
– source code to object mapping can be harder (e.g. run-time checks)
– run-time library overheads increase certification cost
• However, in practice
– the savings from better abstraction, better front-end checking etc.
greatly outweigh this extra cost
– the disadvantages can be eliminated by subsetting
– only Ada allows the construction of rigorous subsets
Copyright © 2001 Praxis Critical Systems Limited

A Model Process
• Really understand the requirements
• Really understand the hazards
• Design a system to mitigate the hazards
• Design software to preserve the mitigations
– specify it formally
– prove the specification has the required properties
• Code in an unambiguous language
– analyse the code before compilation
– ideally don’t give coders access to the compiler!
• Test, with emphasis on requirements based testing
– obtain test cases from requirements, hazard analysis and formal spec
– don’t waste time on blanket unit test
Copyright © 2001 Praxis Critical Systems Limited

A Model Team (An Opinion!)
• Small and not too hierarchical
• Integrated systems and software expertise
• Engineering expertise is more important than tool- or
language-specific knowledge
• But team should include:
– a domain specialist (someone who really understands what is being built)
– a language/compiler guru
– a mathematician
– a quality/configuration management and documentation pedant
Copyright © 2001 Praxis Critical Systems Limited

A final word...
Better really can be cheaper!
Copyright © 2001 Praxis Critical Systems Limited
