History, Theory, and
Philosophy of Science
(In SMAC + RT)
7th semester - Fall 2005
Institute of Media Technology
and Engineering Science
Aalborg University Copenhagen
4th Module
The paradigms of complexity
Luis E. Bruni
What is the paradigm of Modernity?
Modernity: from ~1450 to ?
Scientific Rationalism (~1600)
Mechanicism (~1600)
Materialism (~1700)
Positivism (~1800)
Reductionism
Asymptotic knowledge growth towards "Total Knowledge"
Paradigms of complexity
In the 1900s, alternatives to the reductionist-positivistic epistemologies emerged.
Technological evolution produces a perception of increasing
complexity and interactive synergies.
Frontier disciplines: cognitive sciences, evolutionary sciences, systems thinking, philosophy of science, experimental epistemology, cybernetics.
The cybernetic conferences
(1946-1953)
Cybernetics
Cognitive sciences
Information Theory
Systems Theory
The common denominator: the notion of "complexity".
The first conference: "The Feedback Mechanism and Circular Causal Systems in Biology and the Social Sciences".
Coevolutionary paradigms
Coevolution of Cybernetics, Information Theory, Cognitive Sciences and Systems Theory gave rise to multiple research programs:
Game Theory
Artificial Intelligence
Communication Sciences
Neurosciences
Complex Systems
Artificial Life
Chaos theory
etc, etc, etc…
Applications in mechanical, electronic, biological and social systems.
Transdisciplinarity was born
The advances in the new disciplines were marked by many travels back and
forth between machine, organisms, man, and society.
From the machine to the living organism and back: ideas and concepts (e.g. feedback and finality) were transferred from one to the other, opening the way for automation and computers.
The vocabularies of engineering and physiology started to be used interchangeably; the basics of a common language and shared concepts for cybernetics, systems theory, cognitive science, etc. were created, e.g. learning, regulation, adaptation, self-organization, perception, memory, emergence, feedback, attractors, agency.
The need to make machines imitate certain functions of living organisms
contributed to the speeding up of progress in the understanding of cerebral
mechanisms.
The second industrial revolution
A good deal of the exponential growth in technological developments and the mass production of devices and systems for communication and computation was directly influenced by the Macy Cybernetics Conferences.
Norbert Wiener (1940’s) predicted a second industrial revolution
centered on communication, control, computation, information and
organization.
The future: more interest in concepts than in hardware, leading to Second-Order Cybernetics.
Definitions of Cybernetics
Greek kubernetes: pilot, rudder, to steer.
The word was first used by Plato: cybernetics as "the art of steering" or "the art of government".
Ampère: "the study of ways of governing."
One of the very first cybernetic mechanisms was invented by James Watt and Matthew Boulton (1788): a governor, or ball regulator, to control the speed of the steam engine.
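To make the idea concrete, here is a minimal sketch, in Python, of governor-style negative feedback: the valve opening is proportional to the gap between the desired and the actual speed. All values are invented for illustration; this is not a model of Watt's actual mechanism.

# Minimal sketch of proportional negative feedback, in the spirit of a governor.
# All parameter values are arbitrary and purely illustrative.
def simulate_governor(setpoint=100.0, gain=0.5, load=20.0, steps=50):
    speed = 0.0                            # current engine speed (arbitrary units)
    for _ in range(steps):
        error = setpoint - speed           # deviation from the desired speed
        throttle = gain * error            # the governor opens the valve in proportion to the error
        speed += throttle - 0.1 * load     # speed rises with the throttle, falls with the load
    return speed

print(simulate_governor())                 # settles near the set point despite the constant load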
The modern definition
The word cybernetics was re-coined by Norbert Wiener in
1948.
A philosophical definition by Louis Couffignal (1958): "the art of assuring efficiency of action."
Cybernetics has the same root as government: the art of managing and directing highly complex systems.
Cybernetics
Cybernetics: from the Greek word kybernetes (steersman), "the art of steering".
Norbert Wiener, mathematician and founder of cybernetics: the science of communication and control in the animal and the machine, in individual human beings and in social systems.
Claude Shannon: information theory, designed to optimize the transmission of information through communication channels (see the sketch after this slide).
Cybernetics proposes a revolution with respect to the linear, mechanistic
models of traditional Newtonian science.
In classical science every process is determined solely by its cause, a factor residing in the past; however, the behavior of living organisms is typically teleonomic, oriented towards a future state which does not yet exist.
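As a concrete aside on Shannon's measure mentioned above: the entropy of a source sets a lower bound on how compactly its messages can be encoded for transmission. A minimal sketch in Python; the probability values are invented for illustration.

import math

def entropy(probabilities):
    # Shannon entropy in bits: the average information carried per symbol.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit per symbol: a fair coin
print(entropy([0.9, 0.1]))   # ~0.47 bits: a biased, more predictable source carries less information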
The new concepts
The simplest example of such a circular mechanism is feedback, as used in engineering control systems.
The simplest application of negative feedback for self-maintenance is homeostasis (Walter Cannon, 1932, American physiologist); homeostasis comes from two Greek words meaning "to remain the same": resistance to change.
A homeostatic system is an open system that maintains its structure and functions by means of a multiplicity of dynamic equilibria, rigorously controlled by interdependent regulation mechanisms.
The non-linear interaction between the homeostatic or goal-directed system and
its environment results in a relation of control of the system over the
perturbations coming from the environment.
Feedback
The first cybernetic conference (1946): "The Feedback Mechanism and Circular Causal Systems in Biology and the Social Sciences".
Positive and negative feedback
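A minimal numerical sketch of the contrast (the coefficients are arbitrary, chosen only for illustration): negative feedback counteracts deviations from a reference value, homeostasis-style, while positive feedback amplifies them.

def feedback_step(x, target, coefficient):
    # One update: the deviation from the target is fed back into the state.
    return x + coefficient * (target - x)

x_negative, x_positive = 10.0, 10.0
for _ in range(10):
    x_negative = feedback_step(x_negative, target=37.0, coefficient=0.3)    # deviation counteracted
    x_positive = feedback_step(x_positive, target=37.0, coefficient=-0.3)   # deviation reinforced

print(round(x_negative, 2))   # approaches 37: stabilization around a reference value
print(round(x_positive, 2))   # runs away from 37: self-reinforcing growth of the deviation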
Systems Theory
"General System Theory” biologist Ludwig von Bertalanffy.
Systems Theory the transdisciplinary study of the abstract
organization of phenomena, independent of their substance, type, or
spatial or temporal scale of existence.
It investigates both the principles common to all complex entities, and
the (usually mathematical) models which can be used to describe them.
Applications as diverse as engineering, computing, biology,
ecology, management, social sciences etc.
The analytical vs. the synthetic approach
The systems approach distinguishes itself from the more traditional analytic approach (reductionism) by emphasizing the interactions and connectedness of the different components of a system.
However, the systems approach integrates the analytic and the synthetic methods, encompassing both holism and reductionism.
The systems approach in principle considers all types of systems; in practice it focuses on the more complex, adaptive, self-regulating systems, what we might call "cybernetic" systems.
Open vs. closed systems
von Bertalanffy noted that all systems studied by physicists are closed: they do not interact with the outside world, so it is possible to calculate their future states with perfect accuracy.
Organisms are open systems: they interact with other systems outside of themselves and cannot survive without continuously exchanging matter, energy and information with their environment.
System boundaries
The inside and the outside of a system: system and environment, separated by boundaries.
The environment consists of other systems interacting
with their environments.
Systems are structured hierarchically: systems and subsystems with sometimes fuzzy, sometimes sharp boundaries: interfaces, borders, frontiers.
Black and white boxes
If we can see the system's internal processes, we might call it a "white box".
The black box view is not restricted to situations where we don't know what happens inside the system.
In many cases we can easily see what happens in the system, yet we prefer to ignore these internal details.
This has to do with our selection of boundaries and
hierarchical levels.
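A small hypothetical sketch of the two views (the thermostat and its details are invented for illustration): the white-box view inspects internal state, while the black-box view characterizes the same system only by the input-output pairs observed at its boundary.

class Thermostat:
    # A toy system whose internals a white-box view is allowed to inspect.
    def __init__(self, setpoint):
        self.setpoint = setpoint
        self.heater_on = False            # internal state, invisible in the black-box view

    def step(self, room_temperature):
        self.heater_on = room_temperature < self.setpoint
        return "heat" if self.heater_on else "idle"

# Black-box view: ignore the internals, record only input -> output behaviour.
box = Thermostat(setpoint=20.0)
observations = {t: box.step(t) for t in (15.0, 20.0, 25.0)}
print(observations)                       # {15.0: 'heat', 20.0: 'idle', 25.0: 'idle'}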
Hierarchical and emergent systems
According to the analytic approach (reductionism), the low-level view is all you need.
Example in medicine: the body is a whole; the state of your mind affects the state of your stomach, which in turn affects the state of your mind.
These interactions are not simple, linear cause-and-effect relations but complex networks of interdependencies maintaining the organism in good health.
These interactions function at the level of the whole; they are meaningless at the level of an individual organ or cell.
Emergence
Emergence: the arising of patterns, structures, or properties that do not seem adequately explained by referring only to the system's pre-existing components and their interactions.
The new global patterns or properties are radically novel with respect to the pre-existing components.
The emergent patterns seem to be:
unpredictable
non-deducible from the components
irreducible to those components.
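Cellular automata are a standard toy illustration of this, not taken from the slides: each cell follows a purely local rule, yet a global pattern appears that is hard to anticipate from the rule alone. A minimal sketch of Wolfram's Rule 30:

def rule30_step(cells):
    # Each new cell depends only on its immediate neighbourhood (edges wrap around).
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

row = [0] * 31
row[15] = 1                                   # a single "on" cell as the seed
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = rule30_step(row)
# A complex, irregular global pattern unfolds from a trivial local rule and a trivial seed.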
Emergence and reductionism
Each ‘level’ of complexity of nature involves new
interactions and relationships between the component parts
which cannot be inferred simply by taking the system to
pieces.
Yet ontological reductionism implies that even if higher-order properties are emergent, they remain secondary to lower-order ones.
The lower the order, the greater the primacy: it seems as if only lower-order explanations can be "truly" scientific.
Downward causality
Reductionism: the laws governing the parts determine or cause the behavior of the whole; "upward causality" from the lowest level to the higher ones.
In emergent systems the laws governing the whole also
constrain or "cause" the behavior of the parts.
When we say that the whole is more than the sum of its
parts, the "more" refers to the higher level laws, which
make the parts function in a way that does not follow from
the lower level laws.
Hierarchical organisation
Hierarchy: a partially ordered structure of entities in which every entity but one is successor to at least one other entity, and every entity except the basic entities is a predecessor to at least one other entity (a toy sketch of this definition follows this slide).
Although each level in a hierarchy has its own laws, these laws are often similar.
The same type of organization can be found in systems belonging to different levels: structural and functional organization.
The mechanistic paradigm seeks universality by reducing everything to its
material constituents.
The systemic paradigm seeks universality by ignoring the concrete material
out of which systems are made so that their abstract organization comes into
focus.
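A toy sketch of the hierarchy definition above, with invented level names: each entity records the single higher-level entity it is a successor of; the top entity has no predecessor, and the basic entities have no successors.

# Toy hierarchy: each entity names the entity it is a successor of (its predecessor).
# The names are invented purely to illustrate the definition.
hierarchy = {
    "cell": "organ",          # a basic entity: nothing succeeds it
    "organ": "organism",
    "organism": "ecosystem",
    "ecosystem": None,        # the one entity that is not a successor of any other
}

def level(entity):
    # Number of predecessor steps up to the top of the hierarchy.
    predecessor = hierarchy[entity]
    return 0 if predecessor is None else 1 + level(predecessor)

for name in hierarchy:
    print(name, "is at level", level(name))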
Structure and function
Structure: the complex of concurrent relations among a set of objects, with the number of objects more numerous than the ordinality of the relations connecting them; the actual relations which hold between the components that integrate a concrete entity in a given space.
Function: the normal or characteristic action of a system of entities, generally in time; a notion that arises in the description made by the observer of the components of a machine or system in reference to an encompassing entity, which may be the whole system or part of it, and whose states constitute the goal that the changes in the components are to bring about.
Thus we find similar structures and functions for different systems, independent of the particular domain in which the system exists: physical, chemical, biological, cognitive (mental), social or cultural.
Towards the mental sphere
New concepts, useful not only in biology and engineering but also for a transdisciplinary synthesis in the human sciences.
Cognitivism: ideas from cybernetics, computerized models of cognitive processes, and information theory; a challenge to academic psychology.
Kenneth Craik (1944): what kind of machine is the human being situated between an information output and a gun? An analogy between the human mind and a servomechanism.
Welford (1947): the first model of mental function in terms of information flow; the beginning of cognitivism.
Cognitivism
Behaviorism: the observable behavior of organisms (humans, animals) resulting from exposure to different stimuli.
Cognitivism: mental processes are the primary object of study; the aim is to model these mental processes.
Knowledge: symbolic, mental constructions in the minds of individuals.
The development of computers with a strict "input - processing - output" architecture, from the 1960s up to today, has certainly inspired these "information-processing" views of mental processes.
Critiques of cognitive science
1. The emotion challenge: cognitive science neglects the important role of emotions in human thinking.
2. The consciousness challenge: cognitive science ignores the importance of consciousness in human thinking.
3. The world challenge: cognitive science disregards the significant role of physical environments (the context) in human thinking.
4. The body challenge: cognitive science neglects the contribution of the body to human thought and action.
5. The social challenge: human thought is inherently social in ways that cognitive science ignores.
6. The dynamical systems challenge: the mind is a dynamical system, not a computational system.
7. The mathematics challenge: mathematical results show that human thinking cannot be computational in the standard sense, so the brain must operate differently, perhaps as a quantum computer.
Cognitive science and philosophy of science
Interesting methodological questions:
What is the nature of representation?
What role do computational models play in the development of cognitive
theories?
What is the relation among apparently competing accounts of mind involving
symbolic processing, neural networks, and dynamical systems?
What is the relation among the various fields of cognitive science such as
psychology, linguistics, and neuroscience?
Are psychological phenomena subject to reductionist explanations via
neuroscience?
"Sciences of complexity"
"Sciences of complexity" studying self-organization and heterogeneous
networks of interacting actors or entities.
Some recent fashionable approaches have their roots in ideas proposed by
cyberneticians many decades ago e.g. artificial intelligence, neural
networks, complex systems, human-machine interfaces, self-organization
theories, systems therapy, etc.
Most of the fundamental concepts of these approaches were already
formulated by cyberneticians Norbert Wiener, W. Ross Ashby, Ludwig von
Bertalanffy, Heinz von Foerster, John von Neumann, Warren McCulloch,
Gregory Bateson among many others in the 1940's through 1960's.
Few practitioners in these disciplines seem to be aware that many of their concepts and methods were proposed or used by cyberneticians many years ago.
Difficulties of defining complexity
Many specific definitions are only applicable to a very restricted domain (e.g. computer algorithms or genomes); they are very technical and meaningless in other domains.
Latin complexus: "entwined", "twisted together".
In order to have a complex system you need two or more components, joined in such a way that it is difficult to separate them.
Oxford Dictionary: something is "complex" if it is "made of (usually several) closely connected parts".
A basic duality: parts which are at the same time distinct and connected.
Unpredictability, non-linearity, chaos
Intuitively, a system would be more complex if more parts could be distinguished, and if more connections between them existed.
Complexity can be characterized by lack of symmetry, or "symmetry breaking": the fact that no part or aspect of a complex entity can provide sufficient information to actually or statistically predict the properties of the other parts.
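A standard textbook illustration (not from the slides) is the logistic map: a one-line non-linear rule for which two almost identical initial states quickly become unrelated, so no early part of the trajectory statistically predicts the later one.

def logistic(x, r=4.0):
    # One step of the logistic map; for r = 4 the dynamics are chaotic.
    return r * x * (1.0 - x)

a, b = 0.2, 0.2000001          # two nearly identical initial conditions
for step in range(1, 41):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step}: |difference| = {abs(a - b):.6f}")
# The tiny initial difference grows until the two trajectories are unrelated:
# the rule is fully deterministic, yet long-term prediction fails.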
The analytical (reductionistic) stream
Focuses on investigating parts.
It emerges from traditions of experimental science where a
narrow enough focus is chosen in order to pose
hypotheses, collect data, and design critical tests to reject
invalid hypotheses.
Because of its experimental base, the chosen scale typically has to be small in space and short in time: how can we design experiments to test the safety of a GMO in a given ecosystem?
The integrative stream
Knowledge of the system is always incomplete.
Surprise is inevitable.
There will rarely be unanimity of agreement among peers, only an increasingly credible line of tested argument.
Not only is the science incomplete; the system itself is a moving target, evolving because of the impacts of management and the progressive expansion of the scale of human influences on the Biosphere.
We can distinguish between:
Complex systems
Complicated systems
Simple systems.
A simple system
A system is "simple" if it can be adequately captured using a single perspective or description and by a standard (e.g. analytical) model providing a satisfactory description or general solution through routine operations.
Examples: ideal gases, mechanical motion.
A ”complicated” system
A system is "complicated" when it cannot be satisfactorily captured through the application of a standard model, although it is possible to improve the description or the solution through approximations, computations, or simulations.
However, a complicated system can still be characterized by using a single perspective.
Examples: a system of many billiard balls in motion, cellular automata, the pattern of communications in a large switchboard.
Complex vs. complicated
Gallopín et al.: the basic criterion separating "complex" from "complicated" is the need to use two or more irreducible perspectives or descriptions in order to characterize the system.
Complex systems share with complicated ones the property of not being capturable through the application of a generic model via routine operations.
There are different conceptions of complex systems; complexity is not an automatic outcome of increasing the number of elements and/or relations in a system.
Some attributes that distinguish complex systems
1. Multiplicity of legitimate perspectives
2. Non-linearity
3. Emergence
4. Self-organization
5. Multiplicity of scales
6. Irreducible uncertainty
Attributes of complex systems (I)
1. Multiplicity of legitimate perspectives: it is difficult to understand an adaptive system without also considering its context.
2. Non-linearity: many relations between the constitutive elements are non-linear; the magnitude of the effects is not proportional to the magnitude of the causes, giving a very rich repertoire of behaviour (e.g. chaotic behaviour, multi-stability because of the existence of alternative steady states, runaway processes, etc.).
Attributes of complex systems (II)
3. Emergence: "the whole is more than the sum of its parts"; a systemic property, implying that the properties of the parts can be understood only within the context of the larger whole and that the whole cannot be analysed (only) in terms of its parts; true novelty can emerge from the interactions between the elements of the system.
4. Self-organization: the phenomenon by which interacting components cooperate to produce large-scale coordinated structures and behaviour.
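A minimal sketch of the self-organization idea, invented for illustration: agents arranged on a ring repeatedly nudge their value towards their neighbours' values; with no central controller, a globally coordinated state emerges from purely local interactions.

import random

values = [random.random() for _ in range(20)]     # 20 agents with random initial values

for _ in range(500):
    # Each agent keeps half of its value and takes a quarter from each neighbour.
    values = [
        0.5 * values[i] + 0.25 * (values[i - 1] + values[(i + 1) % len(values)])
        for i in range(len(values))
    ]

print(max(values) - min(values))   # close to zero: global coordination without central control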
Attributes of complex systems (III)
5. Multiplicity of scales: many complex systems are hierarchic; each element of the system is a subsystem of a smaller-order system, and the system itself is a subsystem of a larger-order "supra-system"; there is strong coupling between the different levels, so the system must be analysed or managed at more than one scale simultaneously.
But systems at different scale levels have different sorts of interactions, and also different characteristic rates of change; it is impossible to have a unique, correct, all-encompassing perspective on a system at even one system level.
Attributes of complex systems (IV)
6. Irreducible uncertainty: many sources of uncertainty arise in complex systems:
Epistemological: some of them are reducible with more data and additional research, such as the uncertainty due to random processes (amenable to statistical or probabilistic analysis) or that due to ignorance (because of lack of data or inappropriate data sets, incompleteness in the definition of the system and its boundaries, incomplete or inadequate understanding of the system).
Fundamental (ontological): irreducible uncertainty may arise from non-linear processes (e.g. chaotic behaviour), from processes of self-organization, and from the existence of purposeful behaviour involving different actors or agents, each with their own goals; this entails a different conception of causality.
From simple to complex
While some of the above attributes exhibited by complex systems can be displayed by some complicated, and even simple, systems (such as non-linearity or uncertainty), the point is that any complex system is likely to have all of them.
Implications of systemic complexity
for scientific research
1. Fluctuations can drive averages.
Micro-fluctuations (external or internal to the system) can, in certain circumstances, lead to drastic restructurings at the macro level, as demonstrated for a number of physical, chemical, and biological cases by Ilya Prigogine.
Example: drug testing, usually considered statistically low-risk, with an average of less than one person in a thousand dying or suffering irreversible damage.
If the system is "Prigoginean", a perturbation can amplify itself so as to change the average values (for example, through synergetic factors); in that case attempts to deal statistically with those situations are unsatisfactory not only socially but also scientifically: the "side-effects" can be unpredictable and more important than the intended effects or benefits of the drug.
Implications of systemic complexity
for scientific research
Scientific research about complex systems may have to deal with a compounding of complexity at different levels: the interplay between the factors across the different levels and layers adds to the complexity intrinsic to each of the layers.
Implications for scientific research
Attention to complex-systems properties presents difficulties for the established conventions of scientific practice and expert advice within the scientific community.
Knowledge, in the sense of insight and understanding, is by no means synonymous with the capacity for prediction.
Equally, awareness of risks is not synonymous with the capacity to intervene to reduce or control the risks.
Increased control?
Many will argue that this is not new, that ignorance and incompleteness of knowledge have always been admitted within the scientific project.
Gallopín et al.: there has been in the past (and is still widespread today) an important ideological process that has protected scientific practice from having to deeply address this feature of inherent uncontrollability.
Any uncontrolled change effects are interpreted as symptoms of imperfection in the current knowledge and/or its application, with the presumption that more knowledge will reduce uncertainties, increase the capacity for control, and permit the remedying of past mistakes.