Transcript Document

Notes from the talk “Ecology, Complexity, and Metaphor”
• Simon Levin’s talk
• Theoretical ecologist (the following is taken directly from his web page)
– understanding how macroscopic patterns and
processes are maintained at the level of
ecosystems and the biosphere,
– in terms of ecological and evolutionary
mechanisms that operate primarily at the level of
organisms.
– Examples:
» evolution of diversification,
» the mechanisms sustaining biological diversity
in natural systems, and the
» implications for ecosystem structure and
functioning.
» The work integrates empirical studies and
mathematical modeling
Some integrating concepts
• Both speakers were unsure about the idea of a “new science” or the concept of “complexity theory”
• Multi-cellularity and human tribes
(anthropology)
• Evolution and Game-theory (from
economics)
• Spatial Stochastic Processes
Spatial Stochastic Processes
– Perhaps combine simulation with ESDA one day… test for pattern similarity
• Certain CA rule sets manifest in patterns approximated by, say, a SAR, MA, or CAR process
– {DGP is y = (I − ρW)⁻¹ε}
– Regression estimation is y = ρWy + ε
• Directly estimating the process is intractable…
– ESDA tests (e.g., Moran’s I) as a way of estimating a scalar parameter describing spatial autocorrelation within a cross-section of data (see the sketch after this list)
• perhaps an artifact of a complex adaptive system… so analysis beyond, say, OLS should be performed
• Reveals that the units of study should not be thought of as independent observations (as assumed in most econometrics and traditional statistics)
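A minimal sketch of the idea above (assumptions mine: Python with numpy, a row-standardized rook-contiguity W on a 20 x 20 grid, ρ = 0.7): simulate the SAR DGP y = (I − ρW)⁻¹ε and summarize its spatial autocorrelation with Moran’s I, the standard scalar ESDA statistic.

```python
import numpy as np

def rook_weights(n):
    """Row-standardized rook-contiguity weights for an n x n grid."""
    N = n * n
    W = np.zeros((N, N))
    for i in range(n):
        for j in range(n):
            k = i * n + j
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < n and 0 <= jj < n:
                    W[k, ii * n + jj] = 1.0
    return W / W.sum(axis=1, keepdims=True)

def morans_i(y, W):
    """Moran's I: scalar summary of spatial autocorrelation."""
    z = y - y.mean()
    return (len(y) / W.sum()) * (z @ W @ z) / (z @ z)

rng = np.random.default_rng(0)
n, rho = 20, 0.7
W = rook_weights(n)
e = rng.standard_normal(n * n)

# SAR DGP: y = (I - rho*W)^(-1) e
y_sar = np.linalg.solve(np.eye(n * n) - rho * W, e)
print("Moran's I, SAR field :", morans_i(y_sar, W))  # clearly positive
print("Moran's I, iid noise :", morans_i(e, W))      # near zero: no spatial structure
```

A clearly positive Moran’s I on a cross-section is exactly the signal that the observations are not independent, which is why analysis beyond plain OLS is called for.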
Do communities/ecosystems exist?
[Image: Clements]
[Image: Gleason]
[Image: Whittaker (R. H. Whittaker, source: oz.plymouth.edu)]
Modeling Complexity: the limits to
prediction (Batty and Torrens 2001)
• Critical of Complex Systems Modeling
My summary of their critique: unless complex systems models can be validated with data, they cannot be considered useful. Perhaps the paper is a reaction to the recent popularity of complexity modeling, best illustrated by Wolfram’s recent book?
– From Forrester to CA simulation models
– Lack of testing, or of the ability to test, model validity empirically: with data not used for calibration, and with formal statistical tests (a sketch follows this list)
– I assume that by ‘traditional modeling’ they mean mainly models specified by micro-economic theory and then estimated and tested with a regression-based econometrics approach
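A minimal sketch of the independent validation they call for (my construction, assuming Python/numpy and a toy linear model standing in for a ‘traditional’ model): calibrate on one part of the data, then score the model formally on observations never used in calibration.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)
y = 2.0 * x + rng.standard_normal(200)         # synthetic observations

calib, valid = slice(0, 150), slice(150, 200)  # hold out 50 observations
X = np.column_stack([np.ones(150), x[calib]])
beta, *_ = np.linalg.lstsq(X, y[calib], rcond=None)  # OLS calibration

pred = beta[0] + beta[1] * x[valid]
rmse = np.sqrt(np.mean((pred - y[valid]) ** 2))
print("out-of-sample RMSE:", rmse)             # validity judged on unseen data
```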
New complexity modeling doesn’t
tell us anything new?
• “Much of complexity theory so far has, in fact, been
concerned with demonstrating models of systems
that were initially deemed inexplicable because they
demonstrated surprising behavior. Once understood,
this behavior is no longer surprising, but invariably it
can only be explained by processes that exist at a
micro level giving rise to phenomena at a macro level
which, in turn, cannot be explained in traditional
macro terms. In short, much of complexity theory and
its modeling is rooted in explaining behaviors that
have already been observed and in some sense, can
thus be said to be no longer complex.”
Critical judgements towards
Complexity Modeling
• “Perhaps the most obvious use of complex
systems models which generate unexpected
change is for learning, education, and in the
broadest sense for entertainment.”
• 1) Parsimony
– (? What about large-scale structural econometric equations ?)
• 2) Independence in validation
– “ Models which cannot be validated are thus no different
from qualitative reasoning, from intuition, or even dictat
which were the usual schemes used to develop policy prior
to the computer era.”
– “…traditional norms of theory development and hypothesis
testing have been relegated to the background.”
– The idea that all they do is show many different possible outcomes originating from different combinations of rules, and that no assumed process OR resulting prediction can ever be validated.
Parameterization:
“Forrester strategy”
• “The difference between complex systems models and those that appeal to the principles of strict parsimony (those that we have been referring to here as traditional models) is one that revolves around the explicitness of assumptions. In essence, traditional models are those in which all relations defining the model are testable (?) while complex systems models have chains of relations that are explicit but untestable in principle and/or untestable because data and observations of their processes are not available.”
– In econometrics not everything is testable… sometimes one is not really sure what is being measured (measure = quantity x quality)… it is more difficult in economics to find measures of theoretical concepts
– Proxy variables, mismeasurements, “steady-state” concepts, regime or dummy variables, human capital, utility, etc… (for example, in Barro growth regressions)
Schelling Model EXAMPLE
• “There are clearly examples of models of complex systems, such as the Schelling (1969)
models of spatial segregation, which articulate local action that leads to global pattern in the
simplest terms. However, even in that case, although the model is simple in its rules,
observations of how individuals exercise their preferences to segregate are rarely available and
the data to test such models is never complete.”
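To make the Schelling example concrete, a minimal sketch of a Schelling (1969)-style segregation model (the parameters are my assumptions: a 30 x 30 torus, two groups, 10% vacancies, agents content when at least 30% of their occupied neighbors share their group). Simple local rules; a global segregation pattern emerges.

```python
import numpy as np

rng = np.random.default_rng(2)
n, threshold, steps = 30, 0.30, 60
# 0 = vacant cell, 1 and 2 = the two groups
grid = rng.choice([0, 1, 2], size=(n, n), p=[0.1, 0.45, 0.45])

def neighbors(grid, i, j):
    """Occupied Moore neighbors of cell (i, j) on a torus."""
    occ = [grid[(i + di) % n, (j + dj) % n]
           for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    return [v for v in occ if v != 0]

def unhappy(grid, i, j):
    """True if too few occupied neighbors share agent (i, j)'s group."""
    occ = neighbors(grid, i, j)
    return bool(occ) and sum(v == grid[i, j] for v in occ) / len(occ) < threshold

for _ in range(steps):
    movers = [(i, j) for i in range(n) for j in range(n)
              if grid[i, j] != 0 and unhappy(grid, i, j)]
    vacancies = list(zip(*np.where(grid == 0)))
    rng.shuffle(movers)
    for (i, j) in movers:          # each unhappy agent moves to a random vacancy
        vi, vj = vacancies.pop(rng.integers(len(vacancies)))
        grid[vi, vj], grid[i, j] = grid[i, j], 0
        vacancies.append((i, j))

# Crude segregation index: mean share of same-group occupied neighbors.
shares = [sum(v == grid[i, j] for v in neighbors(grid, i, j)) / len(neighbors(grid, i, j))
          for i in range(n) for j in range(n)
          if grid[i, j] != 0 and neighbors(grid, i, j)]
print("mean same-group neighbor share:", np.mean(shares))
```

Even with a mild 30% preference, the final same-group neighbor share ends well above the roughly 50% expected under random mixing, which is the local-action-to-global-pattern point of the quote.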
“...to see is to believe…”
– (Mandelbrot 1983, p. 21, in the context of fractal geometry)
– ... The critical issue in complex systems models is
that this is not the only strategy. There are many
qualitative tests that are possible with respect to how
plausible structures are which generate believable
predictions, and these should be mapped out. In fact,
there has been hardly any work whatsoever on
strategies for validating models which deal with
intrinsically complex systems, and one purpose of
this paper is to raise awareness and encourage
debate in this domain.
Demonstrations: NOT of no practical value
Their own experiments with a CA model of development in Chicago
• “…it is already clear that very different outputs can
be obtained with quite minor changes in the rules
themselves.”
• “…such models are really demonstrations of
systems principles rather than vehicles for
operational analysis and policy-making. Models
which might be built along these lines to demonstrate
emergence and which are capable of being calibrated
to real data are likely to be much more specific than
those that we have illustrated here.”
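Their point about “quite minor changes in the rules” can be shown with far simpler machinery than the Chicago model (my construction, not theirs): two elementary cellular automata whose Wolfram rule numbers differ in a single bit of the 8-bit rule table can evolve very differently.

```python
import numpy as np

def run_eca(rule, width=101, steps=100, seed=0):
    """Evolve a 1-D elementary CA from a random row; return the space-time grid."""
    table = [(rule >> k) & 1 for k in range(8)]   # Wolfram rule number -> lookup table
    rng = np.random.default_rng(seed)
    row = rng.integers(0, 2, width)
    grid = [row]
    for _ in range(steps):
        left, right = np.roll(row, 1), np.roll(row, -1)
        row = np.array([table[4 * l + 2 * c + r]
                        for l, c, r in zip(left, row, right)])
        grid.append(row)
    return np.array(grid)

# Rules 110 and 46 differ in exactly one bit of the rule table
# (the entry for neighborhood 110), yet rule 110 is famously complex
# while rule 46's pattern is far simpler.
for rule in (110, 46):
    g = run_eca(rule)
    print(f"rule {rule}: final-row density = {g[-1].mean():.2f}")
```

The printed densities are only a crude summary; plotting the two space-time grids makes the qualitative difference between the one-bit-apart rules obvious.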
a philosophical debate…?
• different objectives, viewpoints, and personalities regarding the value of information and knowledge?
• “The problem is that short of statistical or numerical criterion, good rules for
choosing models based on a combination of discursive and reflective analysis
as well as standard quantitative evidence are not well-developed. In the case of
the CA-like models which we reviewed in the last section, there are so many
assumptions about the representation of space and the nature of the transition
rules that are used to determine development that it is not possible to definitively
use such a model to make predictions that we can act upon.”
Note (?): Perhaps we should never “act upon” predictions made by a few people, with a model that the people who will be most affected can’t understand… perhaps such predictions should just be used as one piece of information in a larger debate… and the most accurate predictions are based on consensus and social choice, which include the risk and loss functions of all.