
SYCAMORE CREEK CONSULTANTS
Defense • Science • Education
Implications of Complexity Research
for Command and Control
M. I. Bell
FACT, 29 July 2009
Disclaimers
• Most of these ideas are not original; I will not
acknowledge my sources
• I am responsible for any errors; feel free to
point them out
• Complexity can be complicated, even
complex
• I get nothing from the admission charge; no
refunds will be given
2
Beware of Humpty Dumpty
“When I use a word,” Humpty
Dumpty said, in rather a scornful
tone, “it means just what I choose
it to mean – neither more nor
less.”
“The question is,” said Alice,
“whether you can make words
mean so many different things.”
“The question is,” said Humpty Dumpty, “which is to be master – that's all.”
• Care is required when using everyday words for specialized purposes
• The community of interest needs clear, common definitions
• The general public needs warnings to avoid confusion
3
Outline
• Motivation
• Some trivial questions (not answers)
• Intuitive complexity
• Quantifying complexity
• Formal complexity: dynamic and architectural
• Design and control of complex systems
• Complexity and C2
4
Motivation
• Complexity as a buzzword
– “Six degrees of separation,” “butterfly effect,” etc. have entered
popular culture
– Dozens of university groups, programs, seminars, and projects
– Pioneers (e.g., Santa Fe Institute) considering moving on
• Complexity as a metaphor
– 98 of 144 papers in the 14th ICCRTS contain the word “complexity”
• Complexity as a mindset
– Awareness of chaos, “fat tails,” “tipping points,” self-organization
• Complexity as a toolbox
– Fractal geometry, nonlinear dynamics, agent-based simulation
• Complexity as a paradigm
“accepted examples of actual scientific practice… [that] provide
models from which spring particular coherent traditions of scientific
research”
– T. S. Kuhn, The Structure of Scientific Revolutions, 1962
5
What is Complexity?
• Many entities, many interactions, collective behavior
• Quality or quantity?
• Definition or characteristics?
– Emergence, self-organization, self-similarity, chaos, etc.
• Computational complexity (of a problem)
– Resources (typically time) required to obtain a solution
• Algorithmic information content (of a string)
– Length of the shortest program that will output the string
• Structural complexity
– Self-similarity, fractal geometry
• Dynamic complexity
– Chaos, sensitivity to initial conditions, phase transformations
6
Why are Things Complex?
• By selection or by design
• Selection
– Natural or artificial (often not “survival of
the fittest” but “the survivors are the fittest”)
– Preferential growth (“the rich get richer”)
• Design
– Nonlinearity
– Feedback control
– Optimization
7
Why Do We Care?
• Emergent behavior (self-organization)
• Requisite variety (control)
• Causality (prediction)
• Stability/instability (cascading failure)
• Unintended consequences
8
Intuitive Complexity
• Disorganized complexity
“a problem in which the number of variables is
very large, and one in which each of the many
variables has a behavior which is individually
erratic, or perhaps totally unknown. However,
…the system as a whole possesses certain
orderly and analyzable average properties”
• Organized complexity
“problems which involve dealing simultaneously
with a sizable number of factors which are
interrelated into an organic whole”
– W. Weaver, American Scientist (1948)
9
Complexity vs. Order
[Diagram: a spectrum from ORDER to COMPLEXITY. Simple entities admit statistical analysis and ordered aggregate descriptions (PHYSICS: pressure, temperature, phase; ECONOMICS: GDP, growth rate); organized/differentiated entities require systems analysis and lie toward the complexity end.]
10
Butterfly Effect
‘‘Long range detailed weather prediction is therefore impossible, …the accuracy of this prediction
is subject to the condition that the flight of a grasshopper in Montana may turn a storm aside from
Philadelphia to New York!’’
– W. S. Franklin (1898)
11
Argument for Quantification
“When you can measure what you are speaking about, and
express it in numbers, you know something about it; but when
you cannot measure it, when you cannot express it in numbers,
your knowledge is of a meager and unsatisfactory kind…”
– William Thomson (Lord Kelvin), 1824-1907
If we can quantify complexity, we can
– Determine whether one system is more or less complex than
another
– Determine whether a control (or C2) system is of the appropriate
complexity for a given situation
– Take appropriate steps to control complexity; e.g.,
• Reduce the complexity of our environment
• Increase the complexity of an adversary’s environment
12
Algorithmic Information Content
• Length of the shortest possible description of a system (made
formal using Turing machine concept)
• Pros:
– Consistent with the idea that a good theory simplifies the
description of phenomena
• Cons:
– Complexity may seem to be a property of our understanding of a
system, not of the system itself
– The length of description may depend on the vocabulary available
– Relative complexity of two systems depends on the details of the
Turing machine used
– It is impossible to show that a description is the shortest possible
– Random systems are maximally complex (counter-intuitive)
13
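The last two cons can be made concrete with a crude, computable proxy: true algorithmic information content is uncomputable, but the length of a losslessly compressed string gives an upper bound on it. A minimal sketch (the function name and the choice of zlib are illustrative assumptions, not from the slides):

```python
import random
import zlib

def description_length(s: bytes) -> int:
    # Upper bound on algorithmic information content: the length of the
    # compressed string (ignoring the fixed length of the decompressor).
    return len(zlib.compress(s, 9))

regular = b"ab" * 500   # highly patterned: a short description exists
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(1000))  # essentially incompressible

# The patterned string compresses far better than a random one of equal
# length, illustrating why random strings come out "maximally complex".
print(description_length(regular), description_length(noisy))
```

Note that this proxy inherits the cons listed above: the result depends on the compressor (the "vocabulary"), and it never proves that a shorter description does not exist.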
Computational Complexity
• The number of operations (typically multiplications)
needed to solve a problem
• Pros:
– A complex problem takes longer (or more resources) to
solve than a simple one
– The difficulty of a complex problem grows rapidly with its
size n:
• Problems that can be solved in time proportional to n^k are
“polynomial time” problems
• Problems that can be solved in time proportional to e^n or n! are
“exponential time” problems
• Cons:
– There is no algorithm for determining how hard a problem is!
14
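The polynomial/exponential contrast is easy to tabulate. A small sketch (the function names and exponents are illustrative placeholders, standing in for, say, an n^3 matrix algorithm versus a 2^n brute-force subset search):

```python
def poly_time(n: int, k: int = 3) -> int:
    # Operation count for a polynomial-time algorithm, e.g. ~n^3.
    return n ** k

def exp_time(n: int) -> int:
    # Operation count for an exponential-time algorithm, e.g. ~2^n
    # (brute-force enumeration of all subsets of n items).
    return 2 ** n

# For small n the two are comparable; the exponential cost then explodes.
for n in (10, 20, 40, 80):
    print(n, poly_time(n), exp_time(n))
```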
Formal Complexity
• Dynamic (process)
– Corresponds roughly to computational
complexity
– Originated in non-linear dynamics
• Architectural (structural)
– Corresponds roughly to algorithmic
information content
– Originated in communication theory
15
Mandelbrot Set
B. Mandelbrot, ca. 1978
A complex number c is a member of the set if,
starting with z0 = 0, the iteration z(n+1) = z(n)^2 + c remains bounded
16
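The membership rule above translates directly into code. A minimal sketch (the iteration cap is an arbitrary choice; the escape radius of 2 is the standard one, since |z| > 2 guarantees the orbit escapes to infinity):

```python
def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    # Iterate z -> z**2 + c from z = 0. If |z| ever exceeds 2 the orbit is
    # guaranteed to escape, so c is not in the set; if it stays bounded
    # for max_iter steps we (approximately) accept membership.
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True

# c = 0 and c = -1 stay bounded; c = 1 escapes after a few iterations.
print(in_mandelbrot(0j), in_mandelbrot(-1 + 0j), in_mandelbrot(1 + 0j))
```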
Escape Problems
• Mandelbrot set
– A complex number c is a member of the set if, starting with z0 = 0,
the iteration z(n+1) = z(n)^2 + c remains bounded
– In other words, c is not a member if z(n) escapes
• Sinai billiard
– Y. Sinai, ca. 1963
– Made into an escape problem by Bleher et al. (1988)
17
Sinai Billiard
0
x0
x 105
x 106
18
Prediction Horizon
• Discontinuity in
boundary
conditions (as
well as nonlinearity) can
cause divergent
trajectories
• Similar initial
conditions
produce similar
trajectories for a
limited time
19
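The prediction horizon can be demonstrated with any chaotic iteration; the logistic map below is a standard textbook example, not one taken from the slides. Two trajectories whose initial conditions differ by one part in a billion track each other for a few dozen steps and then diverge completely:

```python
def logistic(x: float, r: float = 4.0) -> float:
    # The logistic map at r = 4 is fully chaotic.
    return r * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-9          # two almost-identical initial conditions
horizon = None
for step in range(1, 101):
    a, b = logistic(a), logistic(b)
    if horizon is None and abs(a - b) > 0.1:
        horizon = step          # first step at which trajectories visibly diverge
print(horizon)
```

The error grows roughly exponentially (the Lyapunov exponent of this map is ln 2), so shrinking the initial error by another factor of a thousand buys only about ten more reliable steps.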
Differential Games
• Modeling conflict in a dynamical system (e.g., pursuit-evasion)
– Each player (two or more) has a state-dependent utility function
that he seeks to maximize
– Each player has a set of control variables that influence the state of
the system
– What are the best strategies?
– What are the possible outcomes?
• Example: homicidal chauffeur problem (R. Isaacs, 1951)
– The “pedestrian” is slow but highly maneuverable
– The “vehicle” is much faster but far less maneuverable
– Under what initial conditions (if any) can the pedestrian avoid being
run over indefinitely?
• Some games (complex ones?) generate state-space structures
with fractal geometry
20
Control Systems
Open Loop
Model
Controller
System
Goal
Closed Loop
+
Controller
System
–
Sensor
21
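The closed-loop diagram can be sketched in a few lines: a sensor measures the output, the error relative to the goal is fed back, and the controller applies a correction. This toy proportional controller is illustrative only (the gain and step count are arbitrary choices, not from the slides):

```python
def closed_loop(goal: float, x0: float, gain: float = 0.5, steps: int = 30) -> float:
    # Each cycle: the sensor measures the state, the error (goal - x) is
    # fed back, and the controller applies a proportional correction.
    x = x0
    for _ in range(steps):
        error = goal - x
        x = x + gain * error
    return x

# The state converges to the goal regardless of x0 -- something an
# open-loop controller (no sensor feedback) cannot guarantee.
print(round(closed_loop(goal=10.0, x0=0.0), 6))
```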
Control Theory
• Degrees of freedom (six for an aircraft)
– (x, y, z) = coordinates of center of mass
– (ψ, θ, φ) = yaw, pitch, roll
• Holonomicity
– N = degrees of freedom; Nc = controllable degrees of freedom
– The system is
• Holonomic if Nc = N
• Non-holonomic if Nc < N
• Redundant if Nc > N
• Aircraft (N = 6, Nc = 3 or 4) and automobiles (N = 3, Nc = 2) are non-holonomic
• No stable control settings are possible; not every path can be followed
• Every possible path can be approximated
22
Requisite Variety and Stability
• Requisite variety (Ashby, 1958)
– To control a system with Nc controllable degrees of freedom
the control system itself must have at least Nc degrees of
freedom
• Given requisite variety in the control system for a
holonomic system, stability is possible
– Lyapunov stability: paths that start near an equilibrium point
xe stay near xe forever
– Asymptotic stability: paths that start near xe converge to xe
– Exponential stability: the convergence is as fast as possible
(Lyapunov exponent)
23
Internet 2001
24
Scale-Free Network
• World-Wide Web
– k = degree (number of connections)
– Power law (exponent = -1.94)
– Preferential growth and attachment
• Failure and Attack Tolerance
– Diameter (max. distance between nodes) vs. fraction of nodes deleted
– Failure = random node deleted; Attack = high-degree node deleted
– E = random (exponential), SF = scale-free
– A. Barabási, et al. (2000)
25
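The "preferential growth and attachment" mechanism is simple to simulate. A minimal sketch in the spirit of the Barabási–Albert model (one edge per new node; the details are an assumption for illustration, not taken from the paper):

```python
import random

def preferential_attachment(n_nodes: int, seed: int = 1) -> list:
    # Grow a network one node at a time; each new node links to an existing
    # node chosen with probability proportional to its degree
    # ("the rich get richer"). Returns the degree of every node.
    random.seed(seed)
    degree = [1, 1]      # start from two nodes joined by one edge
    stubs = [0, 1]       # node indices listed with multiplicity = degree
    for new in range(2, n_nodes):
        target = random.choice(stubs)   # degree-proportional choice
        degree[target] += 1
        degree.append(1)
        stubs.extend([target, new])
    return degree

deg = preferential_attachment(5000)
# A few hubs acquire very high degree while most nodes keep degree 1 --
# the signature of a heavy-tailed, scale-free degree distribution.
print(max(deg), sorted(deg)[len(deg) // 2])
```

The hubs are what make such networks robust to random failure but fragile to targeted attack: deleting a random node almost always removes a degree-1 leaf, while deleting the highest-degree node disconnects much of the network.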
Fat Tails
[Plots: Gaussian density (1/√(2π)) exp[-(x-1)²/2] vs. Cauchy density 1/(π[1+(x-1)²]), on linear and logarithmic scales. Near the peak the two curves are similar; far from it the Cauchy tail exceeds the Gaussian by tens of orders of magnitude (roughly 10⁻² vs. 10⁻⁴² at the edge of the plotted range).]
26
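The difference between the two tails shows up immediately in sampling. A short sketch (a standard Cauchy variate is the tangent of a uniform angle; the sample size and threshold are arbitrary choices):

```python
import math
import random

random.seed(0)
N = 100_000
gauss = [random.gauss(0.0, 1.0) for _ in range(N)]
# Standard Cauchy samples via the inverse-CDF: tan of a uniform angle.
cauchy = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(N)]

def tail_fraction(xs, threshold: float) -> float:
    # Fraction of samples with |x| beyond the threshold.
    return sum(1 for x in xs if abs(x) > threshold) / len(xs)

# The Gaussian has essentially no mass beyond |x| > 5 (~6e-7 in theory);
# the fat-tailed Cauchy still has on the order of 10% of its samples there.
print(tail_fraction(gauss, 5.0), tail_fraction(cauchy, 5.0))
```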
Cellular Automata
Game of Life
Number of live neighbors → Action
• <2 → Die
• 2 → Do nothing
• 3 → Become alive
• >3 → Die
– J. Conway (1970)
27
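The rule table above is the complete specification of the Game of Life; a direct implementation on a sparse set of live cells:

```python
from collections import Counter

def life_step(live: set) -> set:
    # One Game of Life step over a sparse set of live (x, y) cells.
    # Count the live neighbors of every cell adjacent to a live cell.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Table rules: 3 neighbors -> alive (born or surviving);
    # 2 neighbors -> survive only if already alive; otherwise die.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" (three cells in a row) oscillates with period 2 -- an
# emergent object, nowhere mentioned in the rules themselves.
blinker = {(0, 0), (1, 0), (2, 0)}
print(life_step(life_step(blinker)) == blinker)
```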
Emergence
• Emergent objects belong to a higher level of representation than individual cells
or their behavior rules
• Levels (Game of Life):
– Cells and rules
– Objects (blinkers, gliders, blocks, beehives, etc.)
– Interactions of objects (attraction/repulsion, annihilation, etc.)
– Architectures of objects (guns, puffers, rakes, etc.)
• Multiscale Representation (Y. Bar-Yam): each level of representation has its own:
– Scale: number of entities or components
– Variety: number of possible actions or states
• Fundamental questions
– How is behavior at each level determined?
– Can constraints or behaviors at higher levels influence lower ones?
– Is there “downward causation”?
– Can we design for desired behaviors?
28
Gosper’s “Glider Gun”
29
Design and Control
• Systems can become complex either
because or in spite of design rules
• Simplicity is generally a goal, but it
competes with other goals: efficiency,
robustness, versatility, etc.
• Systems generally evolve toward
greater complexity, not less
30
Functional Decomposition
System of Systems
[Diagram: a system of systems (SOS) decomposed into System1 (S1), System2 (S2), and System3 (S3); System1 is further decomposed into Module1–Module3 (S1M1–S1M3).]
• Traditional engineering
practice
• Hierarchical structure
• Independent modules
• System/subsystem or
system (family) of
systems
31
Commonality
[Diagram: two instances of System1 (S1), each decomposed into Module1–Module3 (S1M1–S1M3); the modules are assembled from a shared pool of common units (Unit1–Unit3) that recur across modules.]
32
Reuse
[Diagram: System1 (S1) decomposed into Module1–Module4 (S1M1–S1M4); existing units (Unit1–Unit3) are reused across the modules, including the new Module4.]
33
Big Ball of Mud
“A BIG BALL OF MUD is haphazardly structured,
sprawling, sloppy, duct-tape and bailing wire,
spaghetti code jungle… These systems show
unmistakable signs of unregulated growth, and
repeated, expedient repair.”
“…a complex system may be an accurate reflection
of our immature understanding of a complex problem.
The class of systems that we can build at all may be
larger than the class of systems we can build
elegantly, at least at first.”
– B. Foote and J. Yoder, in Pattern Languages of Program Design 4 (2000)
34
Highly Optimized Tolerance (HOT)
“Our focus is on systems which are optimized, either through
natural selection or engineering design, to provide robust
performance despite uncertain environments. We suggest that
power laws in these systems are due to tradeoffs between yield,
cost of resources, and tolerance to risks. These tradeoffs lead to
highly optimized designs that allow for occasional large events.”
“The characteristic features of HOT systems include: (1) high
efficiency, performance, and robustness to designed-for
uncertainties; (2) hypersensitivity to design flaws and
unanticipated perturbations; (3) nongeneric, specialized,
structured configurations; and (4) power laws.”
– J. M. Carlson and J. Doyle, Physical Review (1999)
35
Complexity and C2
• Complex systems analysis is not (yet) a revolutionary
new paradigm
• We can use the complexity mindset and toolbox to
re-visit and re-assess C2 problems
–
–
–
–
–
–
–
–
Speed of command and the OODA loop
Complex endeavors
The DIME/PMESII construct
Wicked problems
The C2 Approach Space
Optimization
Rare events
Emergence and causality
36
Speed of Command/Control
[Diagrams: flight paths from A to B, and a diversion to C]
• Control: “Correct for cross winds”
• Command: “Fly from A to B”
• Command: “Divert to C”
37
OODA Loop vs. Control Loop
• Observe – Choose goal
• Orient – Sense error
• Decide – Find correction
• Act – Correct
• Traditionally: command is human, control technological
• Modern control theory describes highly complex behaviors
• Potential for application to command problems
38
Complex Endeavors
•
Complex endeavors have one or more of the following characteristics:
– The number and diversity of participants is such that:
• There are multiple interdependent “chains of command”
• The objective functions of the participants conflict with one another or their
components have significantly different weights
• The participants’ perceptions of the situation differ in important ways
– The effects space spans multiple domains and there is
• A lack of understanding of networked cause and effect relationships
• An inability to predict effects that are likely to arise from alternative courses of
action
– D. Alberts and R. Hayes, Planning: Complex Endeavors (2007)
•
Interpretation as differential games
– Utility functions of coalitions (Uc = utility function of the coalition, Ui = utility
function of member i )
– Tight coalition: Uc is a fixed function of the individual Ui
– Loose coalition: Uc is a function of the individual Ui that depends on the
state of the system, allowing gain/loss of commitment, subversion,
defection, etc.
39
DIME/PMESII Formalism
• State variables: Political, Military, Economic,
Social, Information, Infrastructure
• Control variables (interventions): Diplomatic,
Information, Military, Economic
• Questions:
– Does DIME have requisite variety to control
PMESII?
– What happens when the game is two-sided?
many-sided?
40
Competition
[Diagram: two opposing actors, each described by PMESII state variables (Political, Military, Economic, Social, Information, Infrastructure) and each applying DIME interventions (Diplomatic, Information, Military, Economic) against the other.]
Recent study (AT&L/N81)
indicates that available
models do not capture
essential features
– The process by which
PMESII state generates
DIME interventions
– The adversary response
and resulting feedback
loops
41
The “Invisible Hand”
• Adam Smith: market forces provide closed-loop
control of the economy
• Modern economists: are you kidding?
• No reason to assume:
– Requisite variety in control variables
– Stable solutions or attractors in state space
• Application of game theory:
– “Rational actor” assumption limits choices of utility functions
– Limited ability to deal with coalitions
• Similar issues in other PMESII variables
42
Wicked Problems
1. There is no definitive formulation of a wicked problem
2. Wicked problems have no stopping rule
3. Solutions to wicked problems are not true-or-false, but good-or-bad
4. There is no immediate and no ultimate test of a solution to a wicked problem
5. Every solution to a wicked problem is a "one-shot operation"; because there is
no opportunity to learn by trial-and-error, every attempt counts significantly
6. Wicked problems do not have an enumerable (or an exhaustively describable)
set of potential solutions, nor is there a well-described set of permissible
operations that may be incorporated into the plan
7. Every wicked problem is essentially unique
8. Every wicked problem can be considered to be a symptom of another problem
9. The existence of a discrepancy representing a wicked problem can be
explained in numerous ways. The choice of explanation determines the nature
of the problem's resolution
10. The planner has no right to be wrong
– H. Rittel and M. Webber, Policy Sciences (1973)
43
No Evolution
44
No Design
45
Complexity
46
Wicked, Complex, or Ill-Posed
“In reality the problems are not so much ‘wicked’ as complex.”
– E. Smith and M. Clemente, 14th ICCRTS (2009)
• “Wicked” problems are best described as differential games
– Multiple participants compete to maximize their individual utility
functions
– Most social policy problems (when described as games) probably
are complex, but formal analysis is just starting in biology and
economics
– The Rittel-Webber description reflects a misguided attempt by the
“planner” to define a single utility function (i.e., create a single, tight
coalition)
– “Wickedness” is not a property of the system but of how we have
defined the problem
47
C2 Approach Space
• Three dimensions (D. Alberts and R. Hayes, 2007):
– Patterns of interaction
– Distribution of information
– Distribution of decision rights
• Incident response model (M. Bell, 14th ICCRTS)
– Assumptions (decentralized C2)
• Decision rights: widely distributed
• Information: widely distributed
• Interaction: highly limited
– Results (agent-based simulation)
• Effective “edge” organizations do not have to be near the high end of all
three dimensions
• Self-organization can occur with very simple behavior rules
• Self-organization can be counter-productive
• Iterative refinement of the rule set needed to exclude bad cases
48
Optimization
• Optimization of large, non-linear systems is almost
always computationally hard (exponential time)
• Heuristic approaches will sometimes give good
approximate solutions
• Robustness is an issue
– Demonstrating stability (to small perturbations) may be
computationally hard
– Complex systems often have “brittle” optima
– The probability of large perturbations may be greatly
increased by non-linear dynamics
– Extreme optimization (HOT) alters the distribution of
properties or behaviors (fat tails)
49
Rare Events
• Not as rare as we might expect
– Scale-free (self-similar) structures yield power-law
distributions
– Probabilities can be many orders of magnitude greater than
predicted by the normal distribution
• Distributions may not be stable (linear combinations
of independent events do not have the same
distribution as the events)
• Joint probabilities may not be products of individual
event probabilities
• Increased probability of rare event sequences
(cascading failures)
50
Causality
• Complexity research deals with causal
(deterministic) systems
• Opposite of causal is random (not
complex)
• Complexity can:
– Make it difficult to discover causal
relationships
– Limit prediction
51
Unintended Consequences
• When we say that an outcome (or a side-effect) is “unintended,”
do we merely mean that it is unanticipated?
• If we could anticipate (predict) such an
outcome or effect, would it necessarily
become intended?
• Does ethical or legal responsibility follow?
• Can blame be assigned without evidence of
predictability?
52
Conclusions
• Complexity research has deep roots in several traditional
scientific disciplines
• It has advanced the state of the art in these fields and promoted
cross-pollination among them
• It has been a major enabler in the development of new sub-disciplines
(e.g., social network analysis, non-linear dynamics)
• It has not (yet) yielded a revolutionary new paradigm for
scientific research
• It offers significant potential benefits in C2 research
– The mindset and toolbox can be exploited to advance OR and C2
research methodology
– Discoveries in other disciplines can be translated into useful
insights or partial solutions to C2 problems
• It does not invalidate any previous work or challenge the goals
of C2 research
53
Questions or Comments?
54