System of Systems and Live, Virtual, Constructive Test Global

Transcript

The Use of M&S in T&E
Presented to the Industrial Committee on Test and Evaluation (ICOTE)
Kevin Knudsen
Boeing Test & Evaluation, Systems/System of Systems Test Capability
September 25, 2012
Copyright © 2012 Boeing. All Rights Reserved.
Unpublished Work.
Objectives
 Purpose
 Provide an overview of where Boeing Test & Evaluation (BT&E) is leveraging the use of M&S in T&E
 Review two recent activities where BT&E has used M&S in support of T&E
 Review findings from the August NDIA SE M&S/DTE Joint Committee Meeting
 Expectations
 Ask clarifying questions
Program Eating Habits
There are things that are easy for programs to swallow that stay with them a long time
Poor Requirements Impacting Execution
Relative Cost To Fix Problem* by Stage of Life Cycle

Stage of Life Cycle      Relative Cost to Fix
Requirements Capture     1
Design                   6
Build                    10
Development Testing      40
Acceptance Testing       70
Fielded                  500

For illustration purposes, if it costs $1,000 to fix a problem found during requirements capture, it will cost:
$6,000 if found during design
$10,000 if found during the build phase
$40,000 if found during DT&E
$70,000 if found during OT
$500,000 if found after fielding
How can we capture the right requirements earlier?
*Joseph J. Carr, "Requirements engineering and management: the key to designing quality complex systems," The TQM Magazine, Vol. 12, No. 6, 2000, pp. 400–407.
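To make the multipliers concrete in code, here is a short, illustrative Python sketch (not from the briefing; the $1,000 baseline and phase names are taken from the example text above, and the function and dictionary names are ours). It scales a requirements-phase fix cost by the relative-cost factor for the phase in which the problem is actually found.

# Illustrative sketch of the relative cost-to-fix multipliers charted above (after Carr, 2000).
RELATIVE_COST = {
    "requirements capture": 1,
    "design": 6,
    "build": 10,
    "development testing (DT&E)": 40,
    "acceptance testing (OT)": 70,
    "fielded": 500,
}

def cost_to_fix(baseline_usd, phase):
    """Scale a requirements-phase fix cost by the multiplier for the phase where the problem is found."""
    return baseline_usd * RELATIVE_COST[phase]

for phase, factor in RELATIVE_COST.items():
    print(f"{phase:28s} {factor:>4d}x  ${cost_to_fix(1_000, phase):>9,.0f}")

Run as-is, it reproduces the $1,000-to-$500,000 progression in the example.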
Late Discovery of Issues Impacting Execution
[Chart: issues plotted by the program gate (1 through 11) where each was actually discovered versus the earliest gate at which it could potentially have been discovered. The nominal cost for fault removal grows from 1x to 5x, 16x, 40x, and 110x as discovery slips later in the gate sequence.]
How can we discover and mitigate risks earlier?
Life Cycle Cross Functional Engagement
Identify, Examine, Mitigate Risks Early and Throughout the Process

[Diagram: the program life cycle laid out along a time line with program reviews at gates 1 through 11. Phases run from Market Shaping and Requirements Definition through Functional Analysis & Allocation and Design, Synthesize, Integrate to Implementation, Fabrication & Assembly; Integration & Test; Verification; Validation; Delivery and Upgrades; and Retirement/Disposal. The Validation Plan, Verification Plan, and Test Plans tie the early definition and design phases to the later Validation, Verification, and Integration & Test phases.]

Hypothesis:
• Cross functionally ensure the concepts, requirements, architectures, designs, and operations across the life cycle are:
  • Affordable
  • Feasible
  • Suitable
  • Valid
  • Producible
  • Testable
• Perform verification and validation as soon as possible to identify, mitigate, and retire program risks early

How will we ensure:
• Our customers get the systems and products they want?
• Subsystem and system verification and validation is done right the first time?
Early Engagement
 Product design strategies that reduce testing requirements
 Use M&S and T&E for development and refinement of:
 Concepts/CONOPs
 Requirements
 Architecture
 V&V planning
 Disciplined review process with “pause” points
 True System Engineering, Integration, and Test (SEIT) team/process
 End-to-end collaboration with SE, T&E, and M&S across supply chain
Ensure right level of participation and influence
Drive Out Risk to Prevent Late Discovery
Plan / Performance to Plan / Proposed Profile (chart legend)
 M&S
 M&S and Test (Live, Virtual, Constructive)
 Prototype systems
 Interfaces
 Integration
Budget
 Fund risk reduction at program start
Schedule
 Early test of assumptions, higher-risk designs, and interfaces
 Off nominal
 Fault injection
 Focus on end-to-end mission threads as early as possible
 Continually look across the "V" for V&V planning
Relative Cost vs. Test Environment (chart axes)
 Disciplined use of the right environments for the right tasks (across supply chain)
Earliest M&S and tests possible, with integrated M&S and test environments
Targeted LVC Test Identifies and Mitigates
Program Risks Throughout the Life Cycle
[Network diagram: a distributed LVC test configuration connecting sites at Palmdale (Plant 42), CA; Huntington Beach, CA; Seattle; and a St. Louis F-15 simulation over LabNet. Numbered nodes 1 through 9 (including 8A through 8D) span a live operational environment and an emulated/simulated lab environment; the primary command post is at node 6, with alternates at nodes 7 and 8. A color key distinguishes SRW, WNW, DNW, and LabNet links. Note: buildings are ignored for path loss calculations.]
Leveraging opportunities for injecting downstream issues of a prototype system into a controlled, complex SoS test environment earlier in the life cycle
FAA NextGen Test Event Overview
 Purpose:
 Demonstrate a distributed test capability required to verify and validate complex systems of systems, such as NextGen
 Participants:
 Six facilities
 Four separate government and industry organizations
 Five aircraft simulators of varying fidelity
 One simulated air traffic control tower
 Result:
 Test merged four distinct research networks into one integrated national government and industry research network.
FAA Test Lessons Learned
1. Voice Comm Integration
 The ASTi system was the standard Boeing M&S voice system
 Integrating it with the FAA Plexis system took a significant amount of time
• Although the systems were shown to be interoperable, there was still quite a bit of configuration and troubleshooting that needed to be accomplished
2. TENA Compatibility
 The FAA used TENA 6.03
 Boeing used TENA 6.0
 Minor integration issues resulted
3. Funding
 Lack of persistent funding for connectivity causes start-up delays
4. Networks
 Lack of a persistent networking agreement for:
  Connectivity
  Data sharing
 Lack of a Network Connectivity Checklist/process
  Resulted in confusion in connecting networks between test event partners (an illustrative connectivity check is sketched below)
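One way to turn the missing Network Connectivity Checklist into something executable is a scripted pre-event reachability check. The Python sketch below is illustrative only: the partner names, addresses, and ports are hypothetical placeholders, not details from the FAA event.

import socket

# Hypothetical pre-event checklist: (partner, host, TCP port) taken from the agreed network plan.
CHECKLIST = [
    ("Partner lab A", "192.0.2.10", 5000),
    ("Partner lab B", "192.0.2.20", 5000),
    ("Voice comm bridge", "192.0.2.30", 2500),
]

def endpoint_reachable(host, port, timeout_s=3.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            return True
    except OSError:
        return False

for name, host, port in CHECKLIST:
    status = "OK" if endpoint_reachable(host, port) else "UNREACHABLE"
    print(f"{name:20s} {host}:{port}  {status}")

Running a checklist like this before the distributed event starts would surface firewall, routing, and agreement gaps while there is still time to fix them.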
NDIA SE M&S/DTE Joint Committee Meeting
 August 2012
 Crystal City
 Attendees
 Catherine Parker,
 Jerry Feinberg, Alion
 Paul Huang, Army Research Lab
 Brandy Greenberg, Alion
 Louisa Guise, Raytheon
 Beth Wilson, Raytheon
 Kevin Knudsen, Boeing
 Thomas Holland, NAVC
 Michael Bell, ATEC
 Discussion focus:
 Using M&S as an enabler for contractor test, DT, and OT to be a continuum where we have information discovery and incremental verification among distributed partners.
 Benefits
 Barriers
 Lessons Learned
 Next Steps
Benefits Include
 Find integration issues earlier
 Test to learn in 'safe' environment
 Protect proprietary information
 Facilitate DT to OT transition
 Increase performance testing range in operating environments
 Support end to end studies throughout the program
1. Proprietary – could make a business case for sharing in a distributed test
2. Reuse and repurpose through the product model
3. Finding and injecting problems early
4. Collaboration with distributed and industry partners
5. Helps to integrate the components
6. Supports end to end studies throughout the program
7. Discover interface ambiguity and issues
8. Inject product technology earlier
9. Preflight analysis
10. Test to learn – can’t afford to fail in a test – we can fail in a simulation
11. Simplifies the transition from DT to OT
12. Use M&S to do virtual testing to reduce the physical prototype builds and test (reduce overall cost of acquisition)
13. Cost effective way to do Systems and System of Systems test
14. Full range of performance in the intended operating environments – tests the edges that are too dangerous for OT
15. Early user feedback (OT)
NDIA Meeting Results
Barriers Include
 Security
 Lack of persistent network
 Early consideration of technical issues
 Perceived value
 Disconnect between the communities (M&S and T&E)
1. Security. Connecting to distributed labs/networks takes too much time – an approvals/bureaucratic issue, not a technical one.
2. Lack of a persistent network. How do you get it funded? Program funding is temporary; contractor funding needs ROI. Need planning and sustainment.
3. Education. Need awareness, a framework, and knowledge of how to use it, from contractor test to OT&E. Communicate and understand the value.
4. What is the incentive to use M&S? Need the perceived value.
5. Reuse – proprietary concerns, suitability, not knowing how to use it ("here's a model, go download it"), knowing that it is there, understanding its interoperability and fidelity, knowing the design intent.
6. Need to understand the latency of the M&S in the lab/distributed test – it may be different than reality – don't want to induce latency – need to know what the latencies are – it depends on the data element of interest – it may not apply – need to have the discussion – design to accommodate. (A latency-measurement sketch follows below.)
7. Can it operate in real time? Time management is more than synchronizing the start.
8. Decrees don't work without ROI, the enablers, and the funding.
9. M&S developers are not connected to the T&E stakeholders.
10. Different views/understandings/perspectives of the models/simulations between M&S and T&E.
11. Unsubstantiated assumptions:
  1. If they use the same ICD they must be able to integrate
  2. I can pull a model out of a repository and just use it, and it is available whenever I need it
  3. I can get the hardware whenever I need it
  4. Use of different boundary conditions and parameters
12. Interaction of models (federations)
13. Adequate time to develop M&S for design and test
NDIA Meeting Results
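Barrier 6 can at least be bounded with a simple measurement before deciding whether induced latency matters for the data element of interest. The Python sketch below is illustrative only: the echo host and port are hypothetical, and it assumes the remote lab exposes a UDP echo service for this purpose.

import socket
import statistics
import time

ECHO_HOST, ECHO_PORT = "192.0.2.50", 7   # hypothetical echo endpoint at the remote lab

def round_trip_ms(sock, payload=b"ping"):
    """Send one datagram to the echo endpoint and return the round-trip time in milliseconds."""
    t0 = time.perf_counter()
    sock.sendto(payload, (ECHO_HOST, ECHO_PORT))
    sock.recvfrom(1024)                              # block until the echoed datagram returns
    return (time.perf_counter() - t0) * 1000.0

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
    s.settimeout(2.0)
    samples = []
    for _ in range(20):
        try:
            samples.append(round_trip_ms(s))
        except socket.timeout:
            print("no echo reply; endpoint unreachable or echo service not enabled")
            break
    if samples:
        print(f"median {statistics.median(samples):.1f} ms, worst {max(samples):.1f} ms "
              f"over {len(samples)} samples")

Comparing those numbers against the timing sensitivity of the data being collected is what tells the team whether lab-induced latency applies to a given test or not.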
Lessons Learned
 M&S and T&E need to collaborate early
 Real hardware adds complexity – plan for it early
 Complex interfaces require close collaboration
 Need to establish relationships among distributed team
 Establish common definition of intended and expected use of MBDI&T and associated models
1. First employ a development platform network, then move to a near-deployment version at a Government location to work out security issues, then move to the final persistent network.
2. Real hardware adds complexity – plan for it early
3. Need to automate the data collection
4. Use the same data analysis tools
5. Need rigorous CM
6. Need quick-look analysis capability
7. Need to synchronize time and manage that synchronized time – more than just starting at the same time
8. Complex interfaces require close collaboration
9. Need to have a face-to-face to build relationships, and then you can go virtual to plan and execute
10. Working groups established to understand the interfaces
11. Fidelity is in the eye of the beholder – higher fidelity is not necessarily better
12. Establish a common definition of intended use and expected use
13. M&S and Test need to collaborate early
14. Simulations need to be re-runnable with predictable startup (a minimal sketch follows below)
NDIA Meeting Results
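Items 7 and 14 lend themselves to a small prototype. The Python sketch below is illustrative only (the seed, agreed start epoch, and toy state updates are assumptions, not anything from the meeting): each participant seeds its random draws so a re-run is predictable and blocks until a shared wall-clock start time. A real distributed event would take that epoch from a common time source and keep managing time throughout the run, not just at the start.

import random
import time

def run_replica(run_seed, start_epoch, steps=5):
    """Seed the replica for a repeatable re-run, wait for the agreed start time, then step a toy model."""
    random.seed(run_seed)                 # predictable startup: same seed -> same draws on every re-run
    delay = start_epoch - time.time()
    if delay > 0:
        time.sleep(delay)                 # every replica releases at the same agreed wall-clock time
    return [random.random() for _ in range(steps)]   # stand-in for simulation state updates

# Example: all sites agree to start two seconds from now; re-running with the same seed
# reproduces the same sequence of draws.
agreed_start = time.time() + 2.0
print(run_replica(run_seed=42, start_epoch=agreed_start))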
Recommendations
 Harmonize the standards for M&S and Test for the life cycle perspective (HLA, TENA, Metadata)
 Create a framework for reusing and repurposing M&S through the product model
 Establish M&S as part of statistical test design
– Determine what tests are conducted to acquire data for model validation
– Fewer test events with better models
 Recommend the use of M&S to do I&T
 Recommend establishment of JMETC as a persistent node for industry to engage in MBDI&T

1. Harmonize the standards for M&S and Test – the life cycle perspective (HLA, TENA, Metadata)
2. Create a framework for reusing and repurposing M&S through the product model – how does a model evolve from a concept to a design to a product?
3. Emphasize reuse and repurpose through the product model
4. Understand what is out there (standards) – get a baseline
5. Map fidelity to intended use – identify intended use early
6. Use M&S as part of DOE (a minimal sketch follows below)
7. Determine what tests are conducted to acquire data for model validation
8. Fewer test events with better models
9. Investigate verification by simulation as a verification method
10. Investigate the use of M&S to do I&T (integrate early)
11. Evaluate the barriers and determine root cause
12. Identify successes, learn from them, pass them on to the community
NDIA Meeting Results
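Recommendation 6 can be illustrated with a minimal full-factorial sketch. The Python below is not from the meeting: the factors, levels, and the stand-in response function are hypothetical placeholders, and a real study would swap in the validated simulation and the program's actual parameters.

import itertools
import random

# Hypothetical screening factors and levels for a simulation-based design of experiments.
FACTORS = {
    "altitude_ft": [10_000, 30_000],
    "speed_kts": [250, 450],
    "link_latency_ms": [20, 200],
}

def run_simulation(design_point):
    """Stand-in for an M&S run; returns a notional response for one design point."""
    random.seed(str(sorted(design_point.items())))   # repeatable result per design point
    return random.gauss(0.0, 1.0)

names = list(FACTORS)
for levels in itertools.product(*(FACTORS[n] for n in names)):
    point = dict(zip(names, levels))
    print(point, "->", round(run_simulation(point), 3))

Sweeping the full factor space in M&S is cheap; the live test events can then concentrate on the corners the model flags as risky and on the data needed to validate the model itself (recommendation 7).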