Systems Engineering - AIAA Info - American Institute of Aeronautics and Astronautics


Track 2 – Session 1
Tuesday, 13 August 2013
(13:30 – 15:30)
Developing Testable Capability-Level Requirements
Session Co-Chair: Bryan Herdlick, Ph.D.
Session Co-Chair: Eileen Bjorkman, Ph.D.
Track 2, Session 1
Testable Capability-Level Requirements
• Tuesday, 13 August 2013 (13:30 – 15:30)
• Objectives:
– Establish the challenge / issue and the associated “body-of-knowledge” gap
• Identify documentation that substantiates the research challenge
• Establish a common foundation for discourse
– Discuss SoS-level requirements and testing
• Goal: Identify prospective paradigm shifts → potential solutions
– Establish a context for subsequent collaboration and/or future research
• Foundation: Relevant contemporary references
• Foundation: Panelist experience
• Flow:
– Introduce Panel (10 min)
– Establish topic and ‘boundaries’ for this session (10 min)
– Discuss key questions (15-20 minutes per question, nominal) (80 min)
– Consider prospective solution-space (15 min)
– “Take-aways” and “next steps” (5 min)
Panelist Introduction
• Suzanne Beers, Ph.D.
– Panelist
– MITRE
• Systems Engineer
– Supporting: Office of the
Secretary of Defense
• Space & Missile Defense
– Retired Military
• Engineering & Test positions in
Space Systems and Space
Warfare (numerous)
• NRO: Deputy Director, Systems
Engineering
• USAF Operational T&E Center,
Detachment 4: Commander
• Eileen Bjorkman, Ph.D.
– Panelist, and Session Co-Chair
– USAF Test Center
• Senior Executive & Technical
Advisor
– Retired Military
• Flight test engineer
• Instructor
• Test squadron commander
• Director and staff positions associated with MS&A and Joint T&E (numerous)
• Chief of M&S Policy Division,
Warfighter Integration &
Deployment
– [email protected]
– [email protected]
Panelist Introduction
• David Gill
– Panelist
– Boeing
• System Test Engineer
– Activities
• Integration & Simulation
• Electromagnetic Test
• Test Ranges
– Experience
• F/A-18 Hornet
• F-15 Eagle
• CH-47 Chinook
• Space Shuttle & Space Station
• Various Proprietary Systems
• Bryan Herdlick, Ph.D.
– Session Co-Chair & Facilitator
– JHU Applied Physics Lab
• Systems Engineer
– Supporting: NAVAIR
• SoS Systems Engineering
• Integration of M&S and T&E
– Retired Military
• F-14 Tomcat RIO and WTI
• Flight Test NFO (F-14, F/A-18)
• Navy T&E Policy / Oversight
• Management (AIM-9X)
– [email protected][email protected]
Panelist Introduction
• Katherine Morse, Ph.D.
– Panelist
– JHU Applied Physics Lab
• Principal Professional Staff
• Computer Scientist
– Activity: Technical research
that enhances distributed
simulation
– Experience & Specialization
• Simulation Interoperability
Standards
• Live / Virtual / Constructive
(LVC) Federation Engineering
– [email protected]
• Steve Scukanec
– Panelist
– Northrop Grumman Corp.
• Aerospace Engineer
• HALE Enterprise T&E Manager
– Activity: High Altitude Long
Endurance Family of Systems
• Test Planning & Requirements
• Test definition (Lab and Flight)
– Experience / Programs
• B-2 Spirit
• F-35 Joint Strike Fighter
• NASA, DARPA, MDA
• Co-Chair NDIA DT&E Committee
– [email protected]
Panelist Introduction
• Frank Serna
– Panelist
– Draper Laboratory
• Director: Systems Engineering
Directorate
– Experience:
• Draper SE activities, including:
– Trident II guidance
– Guided munitions
– NASA manned space
– UAS autonomy…
• Member Missile Defense
National Team, Systems
• Co-chair NDIA SE Division
• Chapter President, AUVSI New
England
• George Wauer
– Panelist
– Developmental Evaluations,
LLC; Touchstone POCs, LLC
• President & CEO
– Activity: OSD Consultant
– Experience / Specialties
• OSD SES (20 years)
– OT&E of interoperability and
cyber aspects
– Command and Control T&E
• NRO / Space
– [email protected][email protected]
Topic Landscape
Terms of Reference
Literature Review
“Challenges” = Body-of-Knowledge Gaps
Discussion boundaries / guidelines
Terms of Reference
Capabilities & Requirements
• For purposes of this discussion…
– Capability (CJCSI 3170.01C)
• “The ability to achieve a desired effect under specified
standards and conditions through combinations of means
and ways to perform a set of tasks. It is defined by an
operational user and expressed in broad operational terms
in the format of a joint or initial capabilities document…”
– Requirement (Requirements Process; CJCSI 3170.01C)
• “The requirements process supports the acquisition process
by providing validated capabilities and associated
performance criteria to be used as a basis for acquiring the
right weapon systems.”
Terms of Reference
Threshold vs. Objective
• For purposes of this discussion
– Threshold (CJCSI 3170.01C)
• A minimum acceptable operational value below which the
utility of the system becomes questionable.
– Objective (CJCSI 3170.01C)
• The desired operational goal associated with a performance
attribute beyond which any gain in utility does not warrant
additional expenditure. The objective value is an
operationally significant increment above the threshold. An
objective value may be the same as the threshold when an
operationally significant increment above the threshold is
not significant or useful.
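A hypothetical illustration (attribute and values invented for this discussion, not drawn from any program): for a surveillance capability with a detection-range attribute, the threshold might be set at 50 nmi and the objective at 75 nmi. Performance below 50 nmi would call the system’s operational utility into question; performance beyond 75 nmi would add utility that does not warrant the additional expenditure; and if no operationally significant increment above 50 nmi existed, the objective could simply equal the threshold.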
Terms of Reference
• For purposes of this discussion…
– “Characterization”
• Progress toward a Capability Objective in an
Operational Context
– vs. (or through?) “Verification / Validation of Requirements”
• Even more dependent on applying a full spectrum of:
– Test, Analyze, Inspect, Demonstrate
– Live / Virtual / Constructive
Literature Review Highlights
• SoS-level ‘requirements’
– Absent
• SoS-based capability not documented / funded
• Conflicting priorities across constituent systems
– May require a new approach and/or lexicon?
• “Capability Objective” (SoS-level)
– Human / Operator Performance not considered
• Human as a constituent system
• Unique metrics → Unique testing requirements
Supporting references and excerpts are detailed in back-up slides
Capability Objectives & Metrics
National Defense Industrial Association (2011)
“Addressing SoS Acquisition Questions”
(November 2011 DRAFT Out-Brief)
– Capability Objectives
• Measured in terms of utility to the user
• Best evaluated in field / exercise / operational environment(s)
– External factors and complexity become more apparent
– Metrics
• Must be traceable to Capability Objectives
• Remain in-force as the SoS-based capability matures
• Supports utility assessment
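A hypothetical sketch of that traceability (capability, measure, and metrics invented for illustration): a SoS-level Capability Objective such as “provide persistent maritime surveillance of a designated area” might trace to a measure of effectiveness such as “fraction of the area revisited within a specified interval,” which in turn traces to constituent-system metrics such as sensor coverage rate and platform availability. The same chain would remain in force, and be re-evaluated, as constituent systems are upgraded or replaced.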
Do “Threshold” and “Objective” mean something different
in application to SoS-based capabilities?
Literature Review Highlights
• SoS-level ‘testing’
– May require a new approach
• “Capability Characterization”
• Operational context is paramount
– CONOPS
• Full-scale SoS required
– Asynchronous development & fielding are problematic
– Requires a broader view, flexibility and innovation
• Verify / Validate = Test + Analyze + Inspect + Demonstrate
– Balanced use of Live / Virtual / Constructive venues
Supporting references and excerpts are detailed in back-up slides
Topic Landscape & Session Focus
[Diagram: topic landscape, spanning “Requirements” (Measures / Metrics); “T&E” (Characterization?); Operators as Constituent Systems; Terminology & Definitions; Prototyping; M&S; Management Challenges; Trade Space Exploration; Systems Theory / Thinking; “Emergence”; “Architecture”; DoD Acquisition; Security]
Keeping the Discussion Focused
[Diagram: the topic landscape again, highlighting the session focus on “Requirements” (Measures / Metrics) and “T&E” (Characterization?); the other topics (Operators as Constituent Systems, Terminology & Definitions, Prototyping, M&S, Management Challenges, Trade Space Exploration, Systems Theory / Thinking, “Emergence”, “Architecture”, DoD Acquisition, Security) remain visible for context]
A question / premise to consider…
If our “requirements” and “test strategy” were viable at the
SoS / capability level, the rest of the Complex System
“problem space” might become more tractable…(?)
Questions & Discussion
Target Window: 13:50 – 15:10
Candidate Questions for Discussion
• Should performance at the capability-level be
articulated differently from “traditional”
platform- / system- / program-centric
requirements?
• Should “requirements language” be modified
to better support mission-centric or capability-centric approaches to verification / test?
SYSTEM Requirements vs. SoS-LEVEL “Requirements”
13:50 – 14:20 (30 min)
Candidate Questions for Discussion
• Are there “preferred” venues or methods for
verifying complex systems performance, and
should ‘requirements’ be tailored accordingly?
– T&E approach / context
• “Traditional” system-centric T&E
– Performance verification against threshold values and/or
‘specifications’
• Capability Centric / Mission Focused T&E
– Characterization of capability
WHAT we test vs. HOW we test
14:20 – 14:40
Candidate Questions for Discussion
• Given that human operators constitute
complex systems within the larger SoS, should
human performance be reflected in capability-level requirements?
– What aspects of human performance should be
considered at the SoS / capability level?
– What does this imply for T&E methods and
resource dependencies?
WHAT we test vs. HOW we test
14:40 – 14:55
SoS Utility ≈ f(Human Performance)
Example: Measures of Effectiveness for Command & Control
Measures / Metrics (?):
• Cognitive Performance
• Decision Quality
• Decision Timeliness
• Planning Quality
• Planning Timeliness
Cooley and McKneely (2012) JHU/APL Technical Digest 31(1)
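A purely illustrative example, not taken from the cited reference: “Decision Timeliness” might be operationalized as the elapsed time from initial threat cue to a weapons-assignment decision, measured across operator crews in a representative command-and-control vignette. Comparable operational definitions would be needed for each measure above before it could anchor a capability-level requirement or test criterion.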
Candidate Questions for Discussion
• Does the role of an operational concept
document (e.g., concept of operations,
concept of employment, etc.) need to change
to adequately support the development of
SoS-based capabilities?
– If requirements at the capability level are
different, must the format of CONOPS / CONEMPS
/ OCDs change to accommodate SoS designs?
Establishing CONTEXT for WHAT & HOW we test
14:55 – 15:10
Candidate Questions for Discussion
• Is there additional complexity or concern
associated with SoS-level safety and
assurance?
– How does it differ from system-specific safety and
assurance?
• Are new approaches to requirements and/or test
methods required?
WHAT we test vs. HOW we test
→ transition at 15:10 to “solution space”
Candidate Questions for Discussion
• In what way is the incorporation of human
operators in the design process different
when developing a SoS-based capability?
– Is it necessary to insert human operators from the
outset, or only when the hardware and software
integration is complete and the SoS is available to
“play with?”
This is a SE process question – Consider skipping to stay on schedule
Human Operator as a System within the SoS
→ transition at 15:10 to “solution space”
Solution Space (?)
Take-Aways / Wrap-Up
Target Window: 15:10 – 15:30
Solution Space (?)
National Defense Industrial Association (2011)
“Addressing SoS Acquisition Questions”
(November 2011 DRAFT Out-Brief)
– Capability Objectives
• Measured in terms of utility to the user
• Best evaluated in field / exercise / operational
environment(s)
– External factors and complexity become more apparent
– Metrics
• Must be traceable to Capability Objectives
• Remain in-force as the SoS-based capability matures
• Supports utility assessment (capability characterization?)
“Capability Need Statement”
One possible format…
• Focus: User Perspective & Utility
– The user needs to be able to ____________
• (insert: SoS-based capability objective)
– …in order to _____________
• (insert: utility statement; gap being addressed)
– …such that ____________.
• (insert: measure of effectiveness / metric)
• (address: what constitutes the first increment of
utility)
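A hypothetical, filled-in instance of this format (capability, gap, and metric invented for illustration only):
– The user needs to be able to detect and track small surface craft entering a designated littoral area…
– …in order to close a surveillance gap against fast inshore threats…
– …such that at least 90% of craft above a defined size are detected and tracked within a specified time of entry, with the first increment of utility being daylight, fair-weather coverage of a single area.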
SoS Utility ≈ f(composite system performance)
• Candidate utility statements and categories of metrics / measures:
– Increase service capacity
• New customers / markets
• New targets / nodes
– Improve service in adverse environments
• Weather
• Terrain
– “Over-the-horizon”
– Aquatic / Subterranean
• Electromagnetic
• Day / Night
– Increase extent of service
• Range (miles)
• Altitude
– Exo-atmospheric
– Improve persistence of service
• Availability
• Reliability
– Improve survivability / safety
• Reduce exposure
– Time
– Severity / level
• Reduce vulnerability
– Others?????
• Discussion…
HOW would such performance be tested?
SoS Utility ≈ f(human system performance)
• Candidate utility statements and categories of metrics / measures:
– Enhance decisionmaker effectiveness
• Accuracy / Error Rate
• Speed
• Consistency
– Improve employment efficiency
• Relative to desired speed / timeline
– Within single role
– Across roles / positions
HOW would such performance be tested?
“Litmus-Tests” for SoS-based Capabilities
• Uniquely delivered by a SoS-based design?
– Will (can) the capability ONLY be successfully achieved and
verified when constituent systems (or acceptable prototypes)
are integrated and operational?
– What justification exists for a SoS-based design?
• Do any of the constituent systems deliver the capability
independently? If so, why is a SoS-based design required?
• Are multiple designs embedded within the SoS, and if so, why?
– i.e., Can the capability be achieved in more than one way within the SoS?
• Can the SoS design be simplified / reduced, and are there
disadvantages (or advantages) to doing so?
Relevant Excerpts from the
Literature Review
Systems Engineering & Requirements
• “…it is the responsibility of systems engineering to thoroughly
analyze all requirements…vis-à-vis the basic needs that the
system is intended to satisfy, and then to correct any
ambiguities or inconsistencies in the definition of capabilities
for the system…”
– Systems Engineering: Principles and Practice, Kossiakoff & Sweet
• “The output of Requirements Analysis is a technical
description of characteristics the future system must have in
order to meet Stakeholder Requirements – not a specific
solution – which will be evolved in subsequent development
processes.”
– INCOSE Systems Engineering Handbook (V 3.1)
Requirements ↔ T&E
• “…it is a systems engineering responsibility to plan, lead and
interpret the tests and their analysis in terms of what system design
changes might best make the user most effective.”
– Systems Engineering: Principles and Practice, Kossiakoff & Sweet
• “The purpose of the Validation Process is to confirm that the
realized system complies with the stakeholder requirements.”
– INCOSE Systems Engineering Handbook (V 3.1)
• The Validation Process “…is invoked during the Stakeholders
Requirements Definition Process to confirm that the requirements
properly reflect the stakeholder needs and to establish validation
criteria…[and again] during the Transition Process to handle the
acceptance activities.”
– INCOSE Systems Engineering Handbook (V 3.1)
Challenge / Gap (2013)
Trans-Atlantic Research & Education Agenda in SoS
• SoSE Strategic Research Agenda
– Theme #5: Measurement and Metrics
• “…knowing what to measure and when to measure in
order to determine SoS performance or likely future
behaviours is inadequately understood.”
– Theme #6: Evaluation of SoS
• “…further work is needed to be able to evaluate SoS
against their expected behaviour, desired outcomes,
and in comparison with other possible configurations of
the SoS…”
SoS-level Requirements and Testing are Problematic
Challenge / Gap
INCOSE Insight (July 2009)
• INCOSE Research Plan: 2008-2020 (Ferris)
– Item 7: System-of-Systems & “Legacy” systems
• “Means to ensure assurance that a system of systems
will deliver the capability intended.”
– i.e., How do we test at the SoS level?
The SoS topic development landscape is complex too…
Challenge / Gap (2013)
INCOSE International Symposium
• System-of-Systems “Pain Points”
– Capabilities & Requirements (dedicated section)
• “The definition of SoS capabilities and translation to
systems requirements is core to the application of
[systems engineering] to SoS.”
• “…we often think about requirements differently when
working on an SoS.”
– “In an SoS context, many people prefer to focus on
capabilities and less on requirements…”
The relationship between Capabilities and Requirements
Challenge / Gap
INCOSE Insight (July 2009)
• INCOSE Research Plan: 2008-2020 (Ferris)
– Item 7: System-of-Systems & “Legacy” systems
• “Methods to address issues arising from legacy
systems in the design of new or updated systems.”
– e.g., Asynchronous development & fielding?
– e.g., Conflicting performance / capability requirements?
– Item 8: “Sundry other matters”
• “Means to predict and to design human-intensive
systems.”
The SoS development process is complex too…
Challenge / Gap (2013)
Trans-Atlantic Research & Education Agenda in SoS
• SoSE Strategic Research Agenda
– Theme #11: Human Aspects
• “For the technical community, human aspects are often
the elephant in the room, recognised as a key aspect of
the SoS, but either not included in the design or
included retrospectively.”
• “…situational awareness [may be] compromised by
the complexity of the SoS…”
• “…human error or unexpected behaviour features
significantly in normal accidents…”
Human Operator = Constituent System within SoS
Challenge / Gap (2013)
Trans-Atlantic Research & Education Agenda in SoS
• SoSE Strategic Research Agenda
– Theme #8: Prototyping SoS
• “…the evolutionary nature of SoS make(s) prototyping a
troublesome proposition.”
• “…emergent behaviour may not be manifest until the
system is fully deployed…”
• “…prototyping may need to take place within the
operational SoS, rather than a test bed…”
Asynchronous Fielding → Prototyping & Testing Challenges
Challenge / Gap (2006)
USAF Scientific Advisory Board
• Report on System-Level Experimentation
– Experiments: “The only way to explore the
complexities of a system is through campaigns of
experiments, based on the proper venue, people
and ideas. Combining these into a rigorous
program of technology and CONOPS will create a
deep understanding of what the future may be
and how best to meet it.”
Experimentation + CONOPS + Technology + SoS → Capability
Bibliography & References
Bibliography
• The System of Systems Engineering Strategic
Research Agenda (2013)
– Trans-Atlantic Research and Education Agenda in System of
Systems (T-AREA-SoS) Project (www.tareasos.eu)
– Prof. Michael Henshaw, Loughborough University, UK
([email protected])
• INCOSE Research Agenda: 2008-2020
- Ferris, INCOSE Insight (July 2009)
• System-of-Systems “Pain Points”
- INCOSE International Symposium, 2013
Bibliography
• Report on System-Level Experimentation
- USAF Scientific Advisory Board, 2006
• Systems Engineering: Principles and Practice
- Kossiakoff & Sweet
• INCOSE Systems Engineering Handbook (V 3.1)
• Addressing SoS Acquisition Questions
- NDIA (November 2011 DRAFT Out-Brief)
Additional References
TESTABLE REQUIREMENTS: System-of-Systems / Complex Systems
• Simulation in Test & Evaluation, Allen, C.L., ITEA Journal, 2012
• Rapid Prototyping and Human Factors Engineering, Beevis & Denis, Applied Ergonomics, 23(3), 1992
• Combining Attributes for Systems of Systems, Chattopadhyay (et al.), INCOSE INSIGHT, 2010
• System of Systems Requirements, Keating, Engineering Management Journal, 2008
• Test as We Fight, O’Donoghue, ITEA Journal, 2011
• Systems Engineering and Test for Unprecedented Systems, Weiss (et al.), ITEA Journal, 2009
• Implications of Systems of Systems on System Design and Engineering, Proceedings of the 2011 6th International Conference on System of Systems Engineering, Albuquerque, New Mexico, June 27-30, 2011
• Requirements Engineering for Systems of Systems, Lewis (et al.), IEEE SysCon, 2009
Additional References
OPERATIONAL CONCEPT DOCUMENTS: CONOPS, CONEMPS, DRMs & OCDs
• American National Standards Institute / American Institute of Aeronautics and Astronautics [1992] Guide for the Preparation of Operational Concept Documents, ANSI/AIAA G-043-1992
• Chairman of the Joint Chiefs of Staff [2006] Joint Operations Concepts Development Process (JOpsC-DP), Instruction CJCSI 3010.02B
• Department of the Air Force, Secretary of the Air Force [2005] Air Force Concept of Operations Development, Instruction 10-2801, Pentagon, Washington, D.C. www.e-publishing.af.mil
• Department of the Navy, Fleet Forces Command (FFC) [2006] CONOPS TO DOCTRINE: Shaping the Force From Idea Through Implementation, Fleet CONOPS Guidance Brief
• Department of the Navy, Fleet Forces Command (FFC) [2009] Fleet CONOPS Writer’s Guide, Version 1
• Department of the Navy, Office of the Assistant Secretary of the Navy (Research, Development & Acquisition) [2002] Technical Brief: Design Reference Mission Profile Development Guidelines, TB # ABM 1002-03, Pentagon, Washington, D.C. www.abm.rda.hq.navy.mil
• Department of the Navy, Office of the Chief of Naval Operations (OPNAV) [2010] Navy Concept Generation and Concept Development Program, Instruction 5401.9, Pentagon, Washington, D.C.
• Department of the Navy, Office of the Chief of Naval Operations (OPNAV) [2010] Developmental System Concept of Operations, Instruction 5401.xx (DRAFT), Pentagon, Washington, D.C.
• Department of the Navy, Naval Warfare Development Command (NWDC) [2010] Guide for Navy Concept Generation and Concept Development Program, Version 1.0
• Herdlick, B.E. [2011] Establishing an Operational Context for Early System-of-Systems Engineering Activities, Systems Research Forum, 5(2)
• Department of the Navy, Space Warfare Systems Command (SPAWAR) [2000] Data Item Description for Operational Concept Document, DI-IPSC-81430A
Additional References
HUMAN PERFORMANCE in a SoS / CxS context
• Command and Control Systems Engineering: Integrating Rapid Prototyping and Cognitive Engineering, Cooley & McKneely, JHU/APL Technical Digest, 31(1), 2012
• The Effects of Automation on Battle Manager Workload and Performance, Soller & Morrison, IDA Report (D3523), January 2008
• Successfully Changing Conceptual System Design Using Human Performance Modeling, Mitchell, Naval Engineers Journal (peer review DRAFT; 2009)
• MORS Workshop of January 2012: “A Joint Framework for Measuring Command and Control Effectiveness” - see Group 4 Outbrief, in particular
• Optimizing Performance for Mine Warfare: A Case of Mission-Centered Design, Osga (et al.), Naval Engineers Journal (ASNE) (peer review DRAFT; 2009)
• An Obstacle Course Evaluation of AWACS 40/45 HMI Options, Donnelly & Vidulich, AFRL presentation, 28JUN2002
Additional References
INCOSE Compilation of 2013 (45 SoS ‘experts’ responded; 100 refs total)
References recommended by the most respondents were:
• Maier, Mark W. 1998. “Architecting Principles for System-of-systems.” Systems Engineering 1 (4): 267–284. [9 recommendations]
• Office of the Under Secretary of Defense (Acquisition, Technology and Logistics), Systems Engineering Guide for Systems of Systems. [9 recommendations]
• Jamshidi, M. (Ed.), System of Systems Engineering: Principles for the 21st Century, Wiley, 2009. [7 recommendations]
Beyond this, there were eight references which were recommended by more than one respondent:
• Jamshidi, M. (ed). 2009. Systems of Systems Engineering - Principles and Applications. Boca Raton, FL, USA: CRC Press. [4 recommendations]
• Dahmann, J.; Rebovich, G.; Lane, J.; Lowry, R. & Baldwin, K. An Implementer's View of Systems Engineering for Systems of Systems, Proceedings of the 2011 IEEE International Systems Conference (SysCon), 2011, 212–217. [3 recommendations]
• DeLaurentis, D.A., "Understanding Transportation as a System-of-Systems Design Problem," 43rd AIAA Aerospace Sciences Meeting & Exhibit, Reno, NV, 10-13 Jan. 2005. AIAA Paper No. 2005-123. [3 recommendations]
• BKCASE, Part 4, Systems of Systems. [3 recommendations]
• Barot, V., Henson, S., Henshaw, M., Siemieniuch, C., Sinclair, M., Lim, S. L., Jamshidi, M., DeLaurentis, D., 2012. State of the Art Report, Trans-Atlantic Research and Education Agenda in Systems of Systems (T-AREA-SoS), Loughborough University. Available on-line. A comprehensive introduction to SoS, issues and research towards solutions. [2 recommendations]
• Boardman, J. and B. Sauser. 2006. System of Systems – the meaning of Of. IEEE International Conference on System of Systems Engineering, April 24-26, Los Angeles, CA. [2 recommendations]
• Boardman, John, and Brian Sauser. 2008. Systems Thinking: Coping with 21st Century Problems. Boca Raton, FL: CRC Press. [2 recommendations]
• INCOSE Systems Engineering Handbook v. 3.2.2, INCOSE-TP-2003-002-03.2.2, October 2011. [2 recommendations]
All the recommended references from the recent INCOSE SoS working group survey / compilation are on the SoS WG area of INCOSE Connect
at https://connect.incose.org/tb/soswg/
Additional References
TEXTS on COMPLEXITY & EMERGENT BEHAVIOR
• Emergence: From Chaos to Order (Holland, John H.; 1998)
• Hidden Order: How Adaptation Builds Complexity (Holland, John H.; 1995)
• Complexity: The Emerging Science at the Edge of Order and Chaos (Waldrop, M. Mitchell; 1992)
• Chaos: Making a New Science (Gleick, James; 1987)
• Complexity: Life at the Edge of Chaos (Lewin, Roger; 1992; 2nd Ed.)
• Exploring Complexity: An Introduction (Nicolis and Prigogine; 1989)
• The Black Swan: The Impact of the Highly Improbable (Taleb, Nassim N.; 2007)
• Black Hawk Down (Bowden, Mark)
• Coping With the Bounds: Speculation on Nonlinearity in Military Affairs (Czerwinski, Tom)
• Engineering Systems: Meeting Human Needs in a Complex Technological World (Roos & de Weck)
• Causality (Pearl, Judea)