Formal Model-Based Development in Aerospace Systems


Formal Model-Based Development in Aerospace Systems:
Challenges to Adoption and a Plea for Help
Mats P. E. Heimdahl
University of Minnesota Software Engineering Center
Critical Systems Research Group
Department of Computer Science and Engineering, University of Minnesota
Domain of Concern
How We Develop Software
[Diagram: the traditional V-model: Concept Formation, Requirements, Specification, Design, and Implementation (object code) down the left side, with analysis throughout; Unit Test, Integration Test, and System Test up the right side.]
Model-Based Development
[Diagram: a central specification model supports visualization, prototyping, testing, and analysis against properties, and code is generated from it.]
Model-Based Development Tools
• Commercial Products
– Esterel Studio and SCADE Studio from Esterel Technologies
– Rhapsody from I-Logix
– Simulink and Stateflow from The MathWorks, Inc.
– Rose Real-Time from Rational
– Etc., etc.
How We Will Develop Software
[Diagram: Concept Formation and Requirements feed a System Specification/Model that is analyzed against properties; Implementation follows, with Specification Test, Integration Test, and System Test on the verification side.]
What Does Industry Want?
Cheaper, Better/Safer, Faster
Model-Based Development

Model-Based Development Examples
(Company / Product / Tools / Specified & Autocoded / Benefits Claimed)
• Airbus: A340
  Tools: SCADE with Code Generator
  Specified & Autocoded: 70% Fly-by-wire Controls; 70% Automatic Flight Controls; 50% Display Computer; 40% Warning & Maint Computer
  Benefits Claimed: 20X Reduction in Errors; Reduced Time to Market
• Eurocopter: EC-155/135 Autopilot
  Tools: SCADE with Code Generator
  Specified & Autocoded: 90% of Autopilot
  Benefits Claimed: Reduction in Errors; 50% Reduction in Cycle Time; Decreased Cost
• GE & Lockheed Martin: FADEC Engine Controls
  Tools: ADI Beacon
  Specified & Autocoded: Not Stated
  Benefits Claimed: 8X Reduction in Errors while Complexity Increased 4X
• Schneider Electric: Nuclear Power Plant Safety Control
  Tools: SCADE with Code Generator
  Specified & Autocoded: 200,000 SLOC Auto Generated from 1,200 Design Views
  Benefits Claimed: 50-75% Reduction in Cost
• US Spaceware: DCX Rocket
  Tools: MATRIXx
  Specified & Autocoded: 50% SLOC Auto Generated
  Benefits Claimed: Reduced Schedule & Risk
• PSA: Electrical Management System
  Tools: SCADE with Code Generator
  Specified & Autocoded: Not Stated
  Benefits Claimed: 60% Reduction in Cycle Time
• CSEE Transport: Subway Signaling System
  Tools: SCADE with Code Generator
  Specified & Autocoded: 80,000 C SLOC Auto Generated; Improved Productivity from 20 to 300 SLOC/day
  Benefits Claimed: 5X Reduction in Errors
• Honeywell Commercial Aviation Systems: Primus Epic Flight Control System
  Tools: MATLAB Simulink
  Specified & Autocoded: 60% Automatic Flight Controls
  Benefits Claimed: 5X Increase in Productivity; No Coding Errors; Received FAA Certification
Problem 1
Believing Testing Can be Eliminated
Testing will always be a crucial
(and costly) component
How We Develop Software
[Diagram: the traditional V-model again: Concept Formation, Requirements, Specification, Design, and Implementation (object code), with analysis throughout; Unit Test, Integration Test, and System Test on the right.]
Testing Does Not Go Away
[Diagram: in the model-based process, extensive testing (MC/DC) still applies, from the system specification/model through implementation and integration.]
It Simply Moves
[Diagram: the same process; the extensive (MC/DC) testing effort shifts from the code to the system specification/model.]
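The MC/DC coverage these slides refer to can be made concrete with a small example. A minimal sketch, assuming the illustrative decision `A and (B or C)`; the decision, the test vectors, and the `independently_affects` helper are invented for illustration and are not from the talk:

```python
# MC/DC (Modified Condition/Decision Coverage) requires showing that
# each condition in a decision independently affects the decision's
# outcome. Illustrative decision: A and (B or C).

def decision(a, b, c):
    return a and (b or c)

# A candidate MC/DC-adequate test set: for each condition there is a
# pair of tests differing only in that condition, with different outcomes.
tests = {
    "t1": (True,  True,  False),  # decision True
    "t2": (False, True,  False),  # flip A from t1 -> False (A matters)
    "t3": (True,  False, False),  # flip B from t1 -> False (B matters)
    "t4": (True,  False, True),   # flip C from t3 -> True  (C matters)
}

def independently_affects(pair, index):
    """True if the two tests differ only at `index` and give different outcomes."""
    x, y = pair
    differs_only_here = all((x[i] == y[i]) == (i != index) for i in range(3))
    return differs_only_here and decision(*x) != decision(*y)

assert independently_affects((tests["t1"], tests["t2"]), 0)  # condition A
assert independently_affects((tests["t1"], tests["t3"]), 1)  # condition B
assert independently_affects((tests["t3"], tests["t4"]), 2)  # condition C
print(len(tests), "tests achieve MC/DC for this decision")
```

For n conditions, MC/DC typically needs on the order of n + 1 tests per decision, which is why it dominates verification cost whether it is applied to code or to the model.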
Do It the Right Way
[Diagram: Concept Formation → Requirements → System Specification/Model, analyzed against properties, then Implementation; Specification Test, Unit Test, Integration Test, and System Test remain on the verification side.]
Example: ADGS-2100 Adaptive Display & Guidance System
883 Subsystems
9,772 Simulink Blocks
2.9 × 10^52 Reachable States
Requirement: Drive the Maximum Number of Display Units Given the Available Graphics Processors
Counterexample Found in 5 Seconds!
Checking 573 Properties Found 98 Errors
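The state-space figure above is worth a back-of-envelope check: exhaustive exploration by simulation is hopeless at this scale, which is what makes the symbolic model checker's 5-second counterexample remarkable. A rough sketch, assuming a (generous, hypothetical) simulator that visits 10^9 states per second:

```python
# Back-of-envelope: why 2.9e52 reachable states rules out exhaustive
# simulation. The 1e9 states/second rate is an assumed, generous figure.
states = 2.9e52
rate = 1e9                       # states per second (assumption)
seconds_per_year = 3600 * 24 * 365
years = states / rate / seconds_per_year
print(f"{years:.1e} years")      # roughly 9e35 years
```

Even with that optimistic rate, enumeration would take about 10^36 years, so only symbolic techniques can examine such a state space.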
Remedy
• Be honest about the capabilities of modelbased development and formal methods
– Done right, provides outstanding requirements,
models, analysis, etc., etc.
– May greatly reduce the effort spent in testing
Problem 2
Believing the Model is Everything
The model is never enough
Modeling Frenzy
[Diagram: modeling is so much fun that the process stops at the system specification/model and jumps straight to implementation and integration. But how do we know the model is "right"?]
Do It the Right Way
[Diagram: the full process again: Concept Formation → Requirements → System Specification/Model, analyzed against properties, then Implementation; Specification Test, Unit Test, Integration Test, and System Test on the verification side.]
Remedies
• Recognize the Role of Software Requirements
– The model is not everything
• Development Methods for Model-Based Development Badly Needed
– Model-Based Software Development Process
• Develop Tools and Techniques for Model, Properties, and Requirements Management
• Develop Inspection Checklists and Style Guidelines for Models
Problem 3
Trusting Verification
To really mess things up,
you need formal verification
Model Checking Process
[Diagram: the engineer's model and properties are automatically translated into an SMV specification and SMV properties; SMV answers the question "Does the system have property X?" with Yes!]
Model Checking Process
[Diagram: the same translation, but SMV answers No! and produces a counterexample.]
Property or Model: Who is Right?
The Mode Annunciations shall be turned on when the Flight Director is turned on
AG(Onside_FD_On -> Mode_Annunciations_On)
If this side is active, the Mode Annunciations shall be turned on when the Flight Director is turned on
AG((Is_This_Side_Active & Onside_FD_On) -> Mode_Annunciations_On)
If this side is active and the Mode Annunciations are off, the Mode Annunciations shall be turned on when the Flight Director is turned on
AG(!Mode_Annunciations_On -> AX((Is_This_Side_Active & Onside_FD_On) -> Mode_Annunciations_On))
Translated All the "Shalls" into SMV Properties
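The property-versus-model question on this slide can be illustrated with explicit-state reachability on a toy machine. A minimal sketch: the two-sided flight-director model below is invented for illustration (it is not the actual flight guidance system), but it shows why the unguarded property AG(Onside_FD_On -> Mode_Annunciations_On) fails and why the Is_This_Side_Active guard is needed:

```python
# Toy illustration (invented model, NOT the actual flight guidance
# system): explicit-state check of AG(fd_on -> annunc_on) on a tiny
# state machine. State: (active, fd_on, annunc_on). Annunciations
# respond to the FD only while this side is active, so the unguarded
# property fails on the inactive side.

def step(state, turn_fd_on):
    active, fd_on, annunc_on = state
    fd_next = turn_fd_on
    # Annunciations follow the FD only on the active side (assumption).
    annunc_next = annunc_on or (active and fd_next)
    return (active, fd_next, annunc_next)

# Breadth-first reachability from both sides, FD initially off.
init = {(True, False, False), (False, False, False)}
reachable, frontier = set(init), set(init)
while frontier:
    frontier = {step(s, cmd) for s in frontier for cmd in (False, True)} - reachable
    reachable |= frontier

# AG(fd_on -> annunc_on): collect reachable states violating it.
violations = [s for s in reachable if s[1] and not s[2]]
print("counterexample states:", violations)
# Every violating state has active == False: the property needed the
# Is_This_Side_Active guard, exactly the refinement shown on the slide.
assert violations and all(not s[0] for s in violations)
```

A model checker plays the same game symbolically: the counterexample it returns is what tells the engineer whether the property or the model is wrong.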
Analysis Process Steps
• All properties verified (!), or…
• Counterexamples found for some properties
• Simulate counterexample in MBD environment and make corrections to:
– model
– properties
– requirements
– assumptions (invariants)
[Diagram: Shall Statements → Formalize Properties (manual) → CTL Properties; Create Model → MBD Model → Translate (automated); Merge (automated) → Formal Analysis Model → Formal Verification → Simulation / Corrections (manual), with corrections fed back to the model, properties, requirements, and assumptions.]
Remedies
• Develop techniques to determine adequacy of model and property set
– How do we know they are any "good"?
• Techniques for management of invariants
– How do we validate the assumptions we make?
• Methodology and guidance badly needed
– Tools with training wheels
– "Verification for Dummies"
All we need is one high-profile verified system
to fail spectacularly to set us back
a decade or more
Why?
[Diagram: the model-checking process again, but the verdict is a question mark; deciding whether the property or the model is at fault takes a guru, not just an engineer.]
Problem 4
Believing One Tool Will Be Enough
To be effective, we need a suite of
notations and analysis tools
(and the ability to continually integrate new ones)
Original Tool Chain
[Diagram: RSML-e feeds an RSML-e-to-NuSMV translator (to the NuSMV model checker) and an RSML-e-to-PVS translator (to the PVS theorem prover), built by Rockwell Collins/U of Minnesota and SRI International. A conversion to SCADE, the SPY tool, and the Simulink Gateway connect Simulink/Stateflow (MathWorks) and SCADE/Safe State Machines/Lustre (Esterel Technologies) to NuSMV, PVS, and Design Verifier. University of Minnesota/Rockwell Collins work funded by NASA LaRC; University of Minnesota work funded by NASA IV&V.]
Current(?) Tool Status
[Diagram: the chain has grown: Simulink/Stateflow (MathWorks), SCADE/Safe State Machines/Lustre (Esterel Technologies), SPY, the Simulink Gateway, and Reactis (Reactive Systems) now connect to NuSMV, PVS, Design Verifier, ICS, and SRI International's SAL suite of symbolic, bounded, and infinite model checkers. University of Minnesota/Rockwell Collins (NASA LaRC); University of Minnesota (NASA IV&V).]
Three Conjectures
• No one modeling language will be universally accepted, nor universally applicable
• No one verification/validation tool will satisfy the analysis needs of a user
• Languages and tools must be tested on real-world problems by practicing engineers
– Preferably in commercial tools
Translation – with no IL
Effort = m × n
[Diagram: m modeling languages (RSML-e, SCADE, Lustre++, tables, poly, poly') each translated directly to n target languages (PVS, SMV, C): high-quality translations, but m × n translators to build.]
Translation – with IL
Effort = m + n
[Diagram: the same m modeling languages translate into a shared Lustre intermediate language, which translates out to the n target languages: only m + n translators, but low-quality translations.]
A Proposed Framework (Van Wyk)
• Based on techniques from extensible programming languages, specifically attribute grammars extended with forwarding.
• Hypothesis:
– An extensible language may serve as a host language for domain-specific extensions (to construct new modeling languages),
– while forwarding enables the feasible construction of high-quality translations from source specification languages to target analysis languages.
• Provided to spur discussion only! There may be better solutions.
Translation – with lang. exts.
Effort = m + n + Σ t_i
[Diagram: a Lustre host language with forwarding serves as the hub; each modeling language (RSML-e, SCADE, Lustre++, tables, poly, poly') is built as a language extension, and per-extension translation fragments (pvs_trans(t1), pvs_trans(t2), smv_trans, c_trans(t3), …) yield high-quality translations to the n target languages PVS, SMV, and C.]
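The effort formulas on the three translation slides reduce to simple translator counting. A sketch with illustrative numbers, taking m = 5 modeling languages and n = 3 target languages roughly as drawn; the per-extension costs t_i are invented for illustration:

```python
# Translator-building effort for the three architectures on the slides,
# taking m = 5 modeling languages and n = 3 target languages.
m, n = 5, 3

direct = m * n        # no intermediate language: one translator per pair
via_il = m + n        # shared IL: translate into and out of the IL only
print(direct, via_il) # 15 vs 8 translators

# With an extensible host language, each domain extension t_i adds some
# translation work, but total effort stays additive: m + n + sum(t_i).
# These t_i values are purely illustrative.
t = [1, 1, 2]
extensible = m + n + sum(t)
assert via_il <= extensible < direct
```

The point of the third architecture is that it keeps the additive cost of the IL approach while forwarding recovers the translation quality of the direct approach.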
Remedies
• Next-generation tools must allow easy extension and modification of notations to meet domain-specific needs
• They must allow easy construction of high-quality translations from modeling notations to analysis tools
• They also must enable controlled reuse of tool infrastructure to make tool extensions cost effective
Problem Summary
• Believing Testing Can be Eliminated
• Believing the Model is Everything
• Trusting Verification
• Believing One Tool Will Be Enough
Thank You
• Rockwell Collins
– Steven Miller
– Michael Whalen
– Alan Tribble
– Michael Peterson
• NASA Langley
– Ricky Butler
– Kelly Hayhurst
– Celeste Belcastro
• NASA Ames
– Michael Lowry
• NASA IV&V Facility
– Kurt Woodham (L3-Titan)
• My Students at Minnesota
– Anjali Joshi
– Ajitha Rajan
– Yunja Choi
– Sanjai Rayadurgam
– Devaraj George
– Dan O'Brien
Opinions in talk are mine.
Do not blame the innocent.
Discussion
For More Information
• Michael W. Whalen et al., Formal Validation of Avionics Software in a Model-Based Development Process, Formal Methods for Industrial Critical Systems (FMICS 2007), July 2007.
• Steven P. Miller, Alan C. Tribble, Michael W. Whalen, and Mats P. E. Heimdahl, Proving the Shalls, International Journal on Software Tools for Technology Transfer (STTT), Feb. 2006.
• Michael W. Whalen, John D. Innis, Steven P. Miller, and Lucas G. Wagner, ADGS-2100 Adaptive Display & Guidance System, NASA Contractor Report NASA-2006-CR213952, Feb. 2006. Available at http://hdl.handle.net/2002/16162.
• A lot of good reading at http://shemesh.larc.nasa.gov/fm/fm-collins-intro.html
• Eric Van Wyk and Mats Heimdahl, Flexibility in Modeling Languages and Tools: A Call to Arms, to appear in Software Tools for Technology Transfer.