
Meeting the Challenges of Unmanned and Autonomous System Test and Evaluation
Thomas Tenorio, Subject Matter Expert for UAST Executing Agent
10 March 2010, USC
Activities
• Working Group
– Roadmapping, Surveys, Networking, Tech Eval
• BAA Supply Space Surveys
• UAST Roadmap
– FY2009 Unmanned Integrated Roadmap Findings
– FY2010 UAST Roadmap
• Community / Professional Organization Engagement
– Test Communities
– Standards Communities
– ITEA: tutorials and workshops
– AUVSI
– INCOSE (International Council on Systems Engineering)
The Test Resource Management Center
• TRMC under AT&L
• UAST part of T&E S&T
[Organization chart: T&E S&T (6.3 funding) with focus areas including UAST, DET, MST, NII, SET, and NST; CTEIP (6.4 funding); JMETC (6.5 funding)]
TRMC Mission and Vision
• Mission
– "Plan for and assess the adequacy of the…MRTFB…[and] to provide adequate testing in support of
development, acquisition, fielding, and sustainment of defense systems; and, maintain awareness of
other T&E facilities and resources, within and outside the Department, and their impacts on DOD
requirements."
• Vision
– The DoD T&E ranges and facilities will be fully capable of supporting the Department with quality
products and services in a responsive and affordable manner.
Major Range and Test Facility Base: 22 ranges, tagged as a national asset
UAS Test Bed and Environment
S&T for physical test capabilities associated with the Test Bed and Environment
– Stimulus selection or generation
– Sensors
– Data acquisition and management
– Data compression and characterization (signatures)
– Data reduction/analysis/interpretation
– Power technologies: providing increased mission time and capability without increasing the logistics footprint
– Test conduct, including situation awareness
– Test operations safety
– FAA and other civil coordination
– C4ISR interoperability
TRMC Research and Interaction
• Test and Evaluation, Science and Technology
• Three groups, organized by TRL level
– T&E S&T: TRL 3-6 (6.3 funding), $95M / year
– CTEIP: TRL 6-9 (6.4 funding), $140M / year
– JMETC: 6.5 funding, $10M / year
TRL: Technology Readiness Levels
• Criteria used to assess project eligibility and status
– Measures system maturity
– Non-linear in time, money, and effort
– Not well understood by the community
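As a side note on how TRL ranges map to the TRMC funding lines on the previous slide (T&E S&T: TRL 3-6; CTEIP: TRL 6-9), here is a minimal, illustrative Python lookup; the structure and names are assumptions, not an official TRMC tool.

# Illustrative mapping of Technology Readiness Levels to the TRMC funding
# lines described above. A toy eligibility lookup, not an official tool.
PROGRAMS = [
    # (name, budget activity, lowest TRL, highest TRL)
    ("T&E S&T", "6.3", 3, 6),
    ("CTEIP", "6.4", 6, 9),
]

def eligible_programs(trl: int) -> list[str]:
    """Return the funding lines whose TRL range covers a project's current TRL."""
    return [name for name, _ba, lo, hi in PROGRAMS if lo <= trl <= hi]

if __name__ == "__main__":
    for trl in (3, 6, 8):
        print(f"TRL {trl}: {eligible_programs(trl)}")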
T&E S&T
• Test and Evaluation, Science and Technology
• 7 Focus Areas
– UAST
– SET
– NST
– NII
– MST
– HSHT
– DET
Unmanned Systems Integrated Roadmap
• The UAS Challenge
– Capabilities (311 JCA named targets)
– Systems (138 systems)
– Performance Envelope Aspects (41)
– Technologies (17)
UAST Augmented Study Methodology
[Flow diagram: documented (PoR) UAS missions and technologies per the USIR feed draft drivers, use cases, test concepts, and test plans, which are refined and extended beyond PoR with the Working Group; these yield the T&E needed to test specific UAS technologies and facility-specific descriptions and requirements (hard numbers), and in turn test resource requirements; a test resource survey establishes a tri-service baseline of T&E capabilities (deliberately not an exhaustive survey of all T&E capabilities that could support UAS T&E); a needs (gap) analysis, sensitive to new approaches and new paradigms, produces draft and then refined test resource need statements, which are validated; solutions are then developed and prioritized, feeding BAAs, RFIs, white papers and proposals, and the roadmap.]
Relevant Commentary on Test
[Diagram: urgent needs for UAS and outcome timelines; means, ways, and ends to UAST as a value proposition; UAS safety, suitability, survivability, and effectiveness; test and evaluation of UAS as highly complex systems (V&V, "what if?", integrity assessment); domains of space, air, ground, and maritime (surface/undersea); test modes spanning surrogate & simulate, emulate, live contrived, and live in situ, on a spectrum from cross-domain commonality to specificity; the thrusts of predicting behavior, emulating mission & environmental complexity with assured safety, and assessing effects and capabilities; UAST tools & techniques, reference data sets (ground truth, decision & behavior), protocols & design, and test bed and environment; UAST technologies, risks, and constraints over near, mid, and far terms; use-case-driven investment.]
The Target for UAST
BAA Technology Investment Categories
Integrating Findings of the FY2009 Unmanned Integrated Roadmap
Ensure test capabilities support the fielding of unmanned systems that are effective, suitable, and survivable

UAS for Operational Necessity (rapid tempo)
• Evolutionary acquisition with JUTLS & JUONS
• Booming capability development
• Capability challenge of 311 named systems
• Majority of systems are non-PoRs
• Fielding tech in months: 4-6 months for joint operational necessity

Test of UAS (increasing tempo for Integrated T&E)
• Accelerate incrementally improving T&E
• Augment legacy and improvised capability
• Emerging arguments for the UAST value proposition
• AS-IS: OT&E emphasis with emergent Joint T&E
• Fielding UAST in years: 3 years in S&T and 4 years in T&E

[Diagram labels: UAST for knowledge and risk reduction; UAS for warfighter capability]
The Interacting Communities of UAS Advantage
• UAS: Warfighter Capability
• UAST S&T: Next-Gen Tech for UAST
• UAST: Knowledge / Risk Reduction

Value Proposition of UAST
• UAST to further ensure safe, suitable, effective, and survivable UAS
• Pace and tempo to secure advantage for UAS
• Advancing knowledge-generation capabilities for risk reduction in the production of UAS
The Tester Evolution Loop
1) Must get inside the Capability Evolution Loop
2) Must endure throughout the capability's operational life cycle
[Diagram: the capability development loop (concept, design, test, production, deployment, warfighter operations, effects) paired with the tester's knowledge loop (concept, capability development, test, warfighter knowledge).]
UAST Driver Overview
• UAST has primary drivers
– Autonomous behavior of unmanned and autonomous systems that sense, understand, and act upon the environment in which they operate
– Safety of autonomous systems in mission and environment
• Secondary drivers (a minimal loop sketch follows this slide)
– Sensory Capacity & Perception Loops (Observe)
– Knowledge Models of Ground Truth & Behavior (Orientation)
– Decision Making (Decide)
– Supervised Autonomy Behavior (Action)
• Context
– Systems testing of human-independent behavior
– Emulating UAS in mission and environmental complexity with assured safety
– Assessing UAS effects and capabilities in Joint Capability Areas
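To make the OODA mapping of the secondary drivers concrete, here is a minimal, hypothetical Python loop skeleton for a system under test; class and method names are illustrative assumptions, not part of any UAST tool.

# Minimal OODA-loop skeleton mirroring the secondary drivers above: sensory
# capacity (Observe), knowledge models of ground truth and behavior (Orient),
# decision making (Decide), and supervised autonomy (Act). Purely illustrative.
from dataclasses import dataclass, field

@dataclass
class WorldModel:
    """Knowledge model of ground truth and expected behavior (Orient)."""
    beliefs: dict = field(default_factory=dict)

    def update(self, observation: dict) -> None:
        self.beliefs.update(observation)

class AutonomousSystemUnderTest:
    def __init__(self) -> None:
        self.world = WorldModel()

    def observe(self) -> dict:
        # Sensory capacity & perception loops: stubbed sensor read.
        return {"obstacle_range_m": 120.0}

    def decide(self) -> str:
        # Decision making: trivially avoid if an obstacle is close.
        return "avoid" if self.world.beliefs.get("obstacle_range_m", 1e9) < 150 else "continue"

    def act(self, decision: str) -> None:
        # Supervised autonomy behavior: actuation stub, logged only.
        print(f"executing: {decision}")

    def step(self) -> None:
        self.world.update(self.observe())  # Observe feeds Orient
        self.act(self.decide())            # Decide feeds Act

if __name__ == "__main__":
    AutonomousSystemUnderTest().step()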
UAST Exemplar Overview
Leveraging S&T for T&E Capability Development
[Diagram: Predict, Emulate, Assess. Standard systems T&E (1. system-level T&E; 2. mission & environment T&E; 3. Joint Capability Areas T&E) is mapped to autonomous system T&E (1. predicting autonomous behavior; 2. emulating mission and environmental complexity with assured safety; 3. assessing UAS effects and capabilities), supported by the test arena, composable tools, world models, and protocols for unmanned and autonomous systems test.]
UAS Test & Evaluation Focus Areas
• Non-Intrusive Instrumentation
• Spectrum Efficiencies
• Multi-Spectral Sensors
• Netcentric Systems
UAST Systems Engineering Capabilities Reference Framework
1. Predicting Unmanned and Autonomous System Behaviors (T&E/E&A)
2. Emulating Mission and Environmental Complexity with Assured Safety (T&E/E&A)
3. Assessing UAS Effects and Capabilities (E&A)
4. Autonomous Test Protocols and Design (T&E)
5. Test Bed and Environments for UAST (T&E)
6. UAST World Models (Ground Truth, Decision, & Behavior) (T&E)
7. Tools and Techniques for Systemic UAST (T&E/E&A)

Autonomous Capabilities Model
[Diagram: a mission context surrounds the autonomous system/systems/SoS OODA loop (Observe, Orient, Decide, Act), evaluated for safety, effectiveness, agility, suitability, and survivability; T&E-wide knowledge management spans test design, test preparation, system readiness assessment, safety guard, protocol management, sensor management, data quality management, data/presentation/session management, interoperability infrastructure, and communications and networking.]
T&E Capabilities Reference Framework
1. Predicting Unmanned and Autonomous System Behaviors
2. Emulating Mission and Environmental Complexity for Assured Safety
3. Assessing UAS Effects and Capabilities
4. Autonomous Test Protocols and Design
5. Test Bed and Environments for UAST
6. UAST Reference Data Sets (Ground Truth, Decision, & Behavior)
7. UAST Tools and Techniques
Essence of UAST Challenge
• Determine
– Effectiveness
– Suitability
– Survivability
– Safety
• Regarding Mission (across UAS(s) OODA)
– False Positives and False Negatives
– Dynamic limits of behavior
– Integrity limits of intended functions
• At a >10-fold reduction in
– Cycle time
– Cost
• And with exemplary ROI of S&T
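As a worked illustration of the false-positive / false-negative framing above, a short Python sketch that tallies a stream of test verdicts against ground truth; the sample data and function name are hypothetical.

# Compare test verdicts ("unsafe behavior flagged?") against ground truth
# for a set of trials. Sample data and names are hypothetical.
def fp_fn_rates(truth: list[bool], flagged: list[bool]) -> tuple[float, float]:
    """Return (false-positive rate, false-negative rate)."""
    fp = sum(1 for t, f in zip(truth, flagged) if f and not t)
    fn = sum(1 for t, f in zip(truth, flagged) if t and not f)
    negatives = sum(1 for t in truth if not t)
    positives = sum(1 for t in truth if t)
    return fp / max(negatives, 1), fn / max(positives, 1)

if __name__ == "__main__":
    truth   = [True, False, False, True, False]   # behavior actually unsafe?
    flagged = [True, True,  False, False, False]  # did the test flag it?
    fpr, fnr = fp_fn_rates(truth, flagged)
    print(f"false-positive rate {fpr:.2f}, false-negative rate {fnr:.2f}")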
S&T Opportunities for T&E of Autonomy with Assured Safety
1) Tools for Design of Experiments in system, multi-system, and system-of-systems scenarios, considering also the implications of mission scenarios, opposition capability, and physical context (a minimal sketch follows this list).
2) Ability to incorporate UAS design models into warfighter-scope models/simulations in order to anticipate mission suitability, safety, effectiveness, and survivability (including countermeasures).
3) Determining how to manipulate live physical scenarios, including Red Forces, and acquiring ground-truth data during actual test operations.
4) Bayesian Belief Networks and similar tools for conflating test data to mission-level expectations.
5) Ensuring that net-centric systems and relevant test assets are sufficiently agile to enable the span and dynamics of UAS test scenarios.
6) Systems architecting and engineering of a family of composable UASTs.
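As a minimal sketch of the Design-of-Experiments tooling named in item 1, the snippet below enumerates a full-factorial test matrix over mission scenario, opposition capability, and physical context; the factor names and levels are hypothetical placeholders, not real test parameters.

# Full-factorial Design-of-Experiments sketch for item 1 above (illustrative).
from itertools import product

FACTORS = {
    "mission_scenario": ["reconnaissance", "convoy escort"],
    "opposition_capability": ["benign", "contested"],
    "physical_context": ["open terrain", "urban"],
}

def full_factorial(factors: dict[str, list[str]]) -> list[dict[str, str]]:
    """Enumerate every combination of factor levels as one test point each."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

if __name__ == "__main__":
    design = full_factorial(FACTORS)
    print(f"{len(design)} test points, e.g. {design[0]}")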
[The UAST Systems Engineering Capabilities Reference Framework slide is repeated here, highlighting the Readiness Assessment function within T&E-wide knowledge management.]
Readiness Assessment Capability
What: Discover internal bugs and vulnerabilities, and external incompatibilities, in UASs, across UASs, and in T&E systems.
Where: In executable code, source code, databases, system models, mission simulations, and SoS configurations; at development, integration, warfighter, and depot locations.
Why: Generate warfighter-confident knowledge. Cut test cycle time and cost in half. User-controllable degree of false positives and false negatives.
How: Code inspection; test beds not required. Mathematically rigorous assessment method and tools, enabled by next-generation pattern-recognition semiconductor chips with throughput ≈ 1 Gb/sec.
When: TRL 3 @ 2010, TRL 5 @ 2011, TRL 6 @ 2012, TRL 9 @ 2014.
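As a toy illustration of the code-inspection idea only (not the hardware-accelerated pattern-recognition capability described above), here is a hypothetical signature scan over source text in Python; the signatures and sample input are made up.

# Signature-based source scan, standing in for the code-inspection idea in
# the "How" above. Signatures and the sample source are hypothetical.
import re

SIGNATURES = {
    "hard-coded credential": re.compile(r"password\s*=\s*['\"].+['\"]", re.I),
    "unchecked system call": re.compile(r"os\.system\("),
}

def inspect_source(source: str) -> list[tuple[int, str]]:
    """Return (line number, finding) pairs for each matched signature."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for label, pattern in SIGNATURES.items():
            if pattern.search(line):
                findings.append((lineno, label))
    return findings

if __name__ == "__main__":
    sample = 'password = "hunter2"\nimport os\nos.system("rm -rf /tmp/scratch")\n'
    for lineno, label in inspect_source(sample):
        print(f"line {lineno}: {label}")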
[The UAST Systems Engineering Capabilities Reference Framework slide is repeated again, highlighting the Safety Guard / Safety Assurance function within T&E-wide knowledge management.]
Safety Assurance
What: Discern and referee the contest between autonomy and safety, both (a) test safety, including, e.g., the FAA, and (b) operational safety, e.g., fratricide and innocent civilians. Enable both static and evolving limits. Assess the efficacy of UAS self-test capability, resilience to cyber threats, and probable error in M&S evaluations of UAS(s).
Where: Throughout the 5000.02 phases and warfighter stages; across UAS, UASs, and SoS; spans both on-board and administrator functions.
Why: Avoid unintended consequences of UAS operations. Generate warfighter-confident knowledge. Cut test cycle time and cost in half. User-controllable degree of false positives and false negatives.
How: A 'Do No Harm' OODA loop inside the autonomy loop of both the UAS and the UAST, with a method for preempting UAS behaviors, separate from planner capability, preferably non-destructive (a minimal guard-loop sketch follows below).
When: Autonomy Level 1 @ 2010, 2 @ 2011, 3 @ 2012, 4 @ 2013, 5 @ 2014.
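A minimal, hypothetical Python sketch of the 'Do No Harm' idea in the "How" above: a guard that inspects each intended action before execution and preempts it non-destructively when a safety limit would be violated; action names, limits, and structure are illustrative assumptions.

# 'Do No Harm' guard inside the autonomy loop: each intended action is checked
# against safety limits before it executes, and is preempted otherwise.
# Illustrative only; limits and action names are assumptions.
from dataclasses import dataclass

@dataclass
class IntendedAction:
    name: str
    target_range_m: float          # distance to the intended effect

@dataclass
class SafetyLimits:
    min_standoff_m: float = 500.0  # an evolving limit; can be updated mid-test

class DoNoHarmGuard:
    def __init__(self, limits: SafetyLimits) -> None:
        self.limits = limits

    def referee(self, action: IntendedAction) -> bool:
        """Return True if the action may proceed, False to preempt it."""
        return action.target_range_m >= self.limits.min_standoff_m

def autonomy_step(guard: DoNoHarmGuard, action: IntendedAction) -> None:
    if guard.referee(action):
        print(f"executing {action.name}")
    else:
        print(f"preempted {action.name} (inside {guard.limits.min_standoff_m} m standoff)")

if __name__ == "__main__":
    guard = DoNoHarmGuard(SafetyLimits())
    autonomy_step(guard, IntendedAction("engage target", target_range_m=300.0))
    autonomy_step(guard, IntendedAction("engage target", target_range_m=900.0))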
Contact Information
• [email protected]
• (575) 678-4671