Complexity explained? Part 1
Copyright 2014-17 Graham Berrisford. One of more than 100 papers on the System Theory page at http://avancier.website.
Last updated 31/05/2019 16:24
To master complexity implies knowing how to define, describe and design it.
Part 1 discusses some views of complexity, including state variety, chaos, unpredictability and self-organisation.
Part 2 discusses more views: complication, complex adaptive systems and relative complexity.
On first thinking about it, one is bound to think the complexity of a thing has something to do with:
· differentiation - variety in the components the thing is made of
· integration – variety in the relationships between those components.
However, that is only to scratch the surface of what complexity can mean.
And it turns out that different people have very different ideas about it; e.g. some equate chaos and complexity.
Further confusions and ambiguities arise when the thing of interest is called a “complex system”.
Another paper called Complexity explained starts by defining “complexity science” thus:
“Complexity science, also called complex systems science, studies how
· a large collection of components
· locally interacting with each other at small scales
· can spontaneously self-organize to exhibit non-trivial global structures and behaviors at larger scales,
· often without external intervention, central authorities or leaders.”
All these points are questionable.
“A large collection of components”
Does complexity require a large variety - many types - of component and interaction?
Or does it require a large quantity - many instances - of components in space and/or interactions in time?
A system may have a wide variety of component and interaction types, but few instances of them.
A system may have only a few component and interaction types, but many instances of them.
Studying large populations of interacting components (often only a few types) is the domain of agent-based modelling and System Dynamics.
Other definitions of “complexity science” embrace other branches of general system theory, such as cybernetics.
“locally interacting with each other at small scales”
Does complexity require components to interact only locally?
Where components exchange flows of matter or energy, components often interact locally.
But where components exchange flows of information (or gravity), components often interact remotely.
Tele-communication stacks enable remotely distributed actors to be unaware of how messages are conveyed.
In a social system, the actors may exchange information directly (point to point), via one intermediate actor (a hub) or several (in a grid).
“can spontaneously self-organize”
Are we to equate complexity with self-organisation?
The term “spontaneous” is questionable, since every change to a system is a response to some stimulus or precondition.
The term “self-organisation” has been interpreted in many ways – to be discussed below.
In some views, the system that organises itself may be simple. E.g. the growth by accretion of a crystal in a liquid.
In a social system, the actors may step outside that system to re-organise it.
“to exhibit non-trivial global structures and behaviors at larger scales”
In agent-based modelling and System Dynamics, this means exhibiting a complicated state or state change trajectory.
However, chaos theory taught us that very simple systems can produce these effects.
And complexity is not simply a function of scale. E.g.
An elephant’s brain has up to three times the number of neurons in a human brain; yet the latter recognises and produces more complexity.
A simple system may be scaled up (e.g. adding more fish to a shoal) with little or no increase in complexity.
The complexity of a scale-free network lies in the orderly patterns it imposes on the chaos of a random network with the same number of nodes.
A small system, as in the finest clockwork pocket watch, can be complex.
“often without external intervention, central authorities or leaders.”
The most complex systems known to science are probably the organic machines that have evolved in biology, especially ones with a central nervous system.
The term “authorities or leaders” implies the domain is sociology, rather than biology or more general system theory.
In a social system, an actor may exhibit behaviors contrary to any direction given to them by another actor – be it a colleague or a leader.
The complexity this introduces into system design is called “exception handling”.
Another paper on complexity theory, Complexity is not systems thinking, asserts that complex systems have the four characteristics below.
· Identity: a set of core characteristics identify the system, and remain when it evolves or changes.
· Homeostasis: the system maintains its relative internal stability.
· Permeability: the system interacts via inputs and outputs with its environment.
· Self-organization: new behaviors emerge in response to changes in the environment.
However, identity, homeostasis and permeability can be found in the simplest of mechanical control systems.
This leaves us with self-organisation as the defining feature of a complex system.
Before discussing self-organisation, there are many other ideas to explore.
The paper Complexity explained suggests features definitive of “complex systems” include:
· A network of interacting components
· Holistic view of how components interact
· Emergent properties (not found in individual components)
· Dynamic state change
· State/history-dependent processing
· Chaos (as in unpredictability of outcomes arising from variations in initial conditions)
· Chaos (as in a non-linear state change trajectory)
· Unpredictability
· Self-organisation
· Adaptation
· Evolution
This paper explores ambiguities in the use of these terms.
A principle of system theory is that all the elements of a system are related, directly or indirectly.
Else there would be two or more distinct systems.
Defining the network that connects components
Each component may interact with one, a few or many other components.
Wherever two components interact, a structural relationship may be drawn between them.
Connecting all the components by these relationships reveals a communication structure, sometimes a hierarchy, but more commonly a network.
To assess the complexity of this network structure, you must know its nodes and their inter-connecting relationships.
Two ideas about network structures
Real world networks have been studied extensively.
E.g. the world-wide-web, the internet, energy landscapes, biological (cell-to-cell and protein-to-protein) networks and social networks.
Some have proposed that such diverse networks “self-organize”.
Small world network
One idea is that as a network grows, it tends to minimise the number of steps from one node to another.
Mathematically speaking, in such a “small-world” network, the number of steps grows in proportion to the logarithm of the number of nodes.
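To illustrate that logarithmic growth, here is a minimal Python sketch (not from the paper; the node counts, neighbour count and shortcut proportion are arbitrary illustrative choices). It builds a ring lattice with a few random shortcut edges and measures the average number of steps between nodes by breadth-first search.

```python
import random
from collections import deque

def small_world(n, k=4, shortcuts=0.1):
    """Ring lattice of n nodes, each linked to its k nearest neighbours,
    plus a proportion of random shortcut edges (illustrative model only)."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k // 2 + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    for _ in range(int(shortcuts * n)):
        a, b = random.sample(range(n), 2)
        adj[a].add(b)
        adj[b].add(a)
    return adj

def average_path_length(adj):
    """Mean shortest-path length over all node pairs, via breadth-first search."""
    total, pairs = 0, 0
    for source in adj:
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

random.seed(1)
for n in (100, 200, 400, 800):
    print(n, round(average_path_length(small_world(n)), 2))
# Doubling the number of nodes adds only a roughly constant number of steps;
# i.e. the average path grows roughly with log(n), not with n.
```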
Scale-free network
Another idea is that as a network grows, the number of major “hubs” with many connections increases, along with minor hubs between the major hubs.
The complexity of a scale-free network lies in the orderly patterns it imposes on the chaos of a random network with the same number of nodes.
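As a rough illustration of how hubs can arise, here is a hedged Python sketch (again not from the paper; the sizes and attachment rule are invented for illustration). It grows one network by preferential attachment, in which new nodes tend to link to already well-connected nodes, and one by purely random attachment, then compares the largest node degree in each.

```python
import random

def grow(n, m=2, preferential=True):
    """Grow a network one node at a time, each new node adding m links."""
    # Seed: m+1 nodes, fully connected, each with degree m.
    degree = [m] * (m + 1)
    # 'targets' holds each node once per unit of degree, so a random choice
    # from it is proportional to degree (crude preferential attachment).
    targets = [node for node in range(m + 1) for _ in range(m)]
    for new in range(m + 1, n):
        if preferential:
            chosen = set()
            while len(chosen) < m:
                chosen.add(random.choice(targets))
        else:
            chosen = set(random.sample(range(new), m))
        for t in chosen:
            degree[t] += 1
            targets.extend([t, new])
        degree.append(m)
    return degree

random.seed(2)
n = 2000
print("max degree, preferential:", max(grow(n, preferential=True)))
print("max degree, random:      ", max(grow(n, preferential=False)))
# The preferential network grows a few heavily connected hubs;
# in the random network, node degrees stay close to the average.
```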
These ideas may be useful to designers of large systems in which actors or subsystems must communicate.
However, it turns out that scale-free networks are rare in nature and society.
“The universality of scale-free networks remains controversial. Across [nearly 1,000] networks, we find robust evidence that strongly scale-free structure is empirically rare. Furthermore, social networks are at best weakly scale free, while a handful of technological and biological networks appear strongly scale free.” (“Scale-free networks are rare”, Nature, 2019)
The systems of interest here are not passive structures, they are activity systems.
They contain components that interact with each other and/or with entities in their environment.
The system of interest is not simply a network structure (discussed above), it is an activity system.
In accord with the “primacy of behavior” principle, it is defined by its behaviors or dynamics.
The system is composed of interactions between its constituent components.
Abstracting system interactions from network relationships
Any two components, connected by a relationship in a network structure, may interact in several different processes.
So, the complexity of a system in which components interact is different from the complexity of the network structure that connects them.
A complex network of connections (or acquaintances) may be used for simple interactions.
Conversely, a simple network of connections may be used for complex interactions.
Note, moreover, that actors in one communications network may interact in several different (complementary, cooperating, competing or contrary) systems.
E.g. the people in one city may vote in both local and national government elections - and vote for different parties in each.
Abstracting system interactions from the communication stack
Every communication act between human or computer actors depends on the use of a communication stack between them.
When co-located humans converse, they use neurons > vocal cords > sound waves > ear drums > neurons.
When remotely distributed humans communicate via a telecommunications network, the lower levels of the communication stack are hidden from them.
The depth and complexity of the telecommunication stack is irrelevant to most systems of interest that actors cooperate in.
Coupling is the relating of systems (as subsystems) in a wider system.
The actors in a system can interact by forces, energy, matter or information, the last of which is the main interest in sociology.
Read System coupling patterns (archetypes) for discussion of coupling patterns.
A principle of system theory is that the whole is more than the sum of its parts.
“that a whole machine should be built of parts of given behavior is not sufficient to determine its behavior as a whole: only when the details of coupling are added does the whole's behavior become determinate.” Ashby 1956
Emergence primarily means the appearance of effects (state changes or outputs) that arise from coupling subsystems in a wider system.
Even in the simplest system, its emergent properties are not deducible from the properties of its components.
E.g.
Emergent property | cannot be deduced from studying
The V shape of a flight of geese | one goose
The forward motion of a bicycle + rider system | a rider or a bicycle
The famous collapse of the Tacoma Narrows bridge | the structure of the bridge or the wind
In each case, the property emerged from the interaction between the system’s components.
“You don't need something more to get something more. That's what emergence means.” – Murray Gell-Mann
This is misleading, since you do need something more than the components of a system: you need the interactions between them.
Emergent does not mean unwanted or unexpected.
The requirements for any designed system include its emergent properties.
E.g. the forward motion of bicycle and rider was wanted and expected by the bicycle designer.
Much else said about emergence is questionable.
“The concept has been used to justify all sorts of nonsense.” Gerald Marsh.
Some say order emerges from disorder, as though this is spontaneous.
In fact, there is always a stimulus or precondition (not to mention the input of energy and an increase in entropy outside the system).
Some say abstract concepts emerge from concrete matter and energy.
Conversely however, emergence can be seen in the opposite direction. E.g.
Information/meaning emerges from reading physical signals or symbols.
Conversely, physical signals and symbols emerge when senders write them to convey information/meaning.
Conscious thought emerges from electrical activity in the brain.
Conversely, when adding two numbers, electrical activity in the brain emerges from conscious thought.
Read Holism and emergent properties for more discussion.
Holism means looking at something in terms of how its parts join up (e.g. bicycle and rider) rather than dissecting each part.
Holistic view: a description of how parts relate, interact or cooperate in a whole.
Reductionist view: identifying the parts of a whole, naming or describing parts without considering how the parts are related in the whole.
A principle of system theory is to take a holistic view of a system.
To understand even a simple system, you must see beyond its components and understand their interactions.
And in doing that, you may disregard the internal complexity of the components.
E.g. to understand a bicycle + rider system you must understand how they interact to produce the forward motion of both.
And in doing that, you can disregard the internal biological/psychological complexity of the rider.
So, having a holistic view of a thing does not mean you know the “whole”.
You do not know all there is to know about the thing or any part of it.
Read Holism and emergent properties for more discussion.
Principles of system theory include “hierarchy”.
In biology and other sciences, an organ or component is a subsystem, which may be described as a system in its own right.
After a first (top level) division into subsystems, each subsystem may be decomposed - recursively - several times.
With his background in biology, von Bertalanffy wrote of a concept he called organicism.
Organicism: the idea that systems are describable at multiple hierarchical levels.
A system may be decomposable into subsystems, and/or composable (with others) into larger systems.
A body | is composed from | organs | that interact in processes to sustain the body.
An organ | is composed from | cells | that interact in processes to sustain the organ.
A cell | is composed from | organelles | that interact in processes to sustain the cell.
At every level, a biological entity has cross-boundary input/output flows.
So, an event that is external to a smaller system is internal to a larger system.
And the emergent properties of a small system are ordinary properties of any larger system it is a part of.
The hierarchy above is a simple physical containment hierarchy.
A system can be divided in different ways, into parallel systems.
E.g. biologists see the body in terms of circulatory, respiratory, digestive, excretory, nervous, endocrine, immune, muscle and reproductive systems.
Ultimately, every physical entity is reducible to sub-atomic particles; and every emotion is reducible to biochemical reactions.
At the bottom level of description (of interest to physicists and bio-chemists) everything we see and feel is well-nigh infinitely complex.
Does that mean every physical entity is a complex system?
E.g. a bicycle + rider system can be described as a simple system composed of two coarse-grained components.
That same system is infinitely complex if you decompose those two components down to the level of subatomic particles.
The lesson here is that complexity is a matter of perspective.
It depends on a) what your interest in a thing is and b) what you regard as the atomic components.
So, to compare the complexity of two systems, they must be described a) in the same style and b) at the same level of decomposition.
Atomic element: an element that is not further divided in a description.
System describers choose how far to subdivide a system of interest.
E.g.
System | Atomic actors (active structures) | Atomic activities (behaviors)
Astronomy / The solar system | sun and planets | orbits
Biology / organism | organs | organic processes
Biology / cell | cell | a chemical or signal exchange
Biology / beehive | honey bee | deliver pollen, perform and observe wiggle dances
Biology / predator-prey system | wolves and sheep | eat sheep, eat grass
Economics / economy | trader (customer and/or supplier) | a trade or transaction
Sociology / society | person | a communication act
Sociology / symphony performance | orchestra player | musical notes
Sociology / business | employee | a one-person-one-place-one-time activity
Software / application | a module or object | an operation
In each case above, an atomic actor may be a complex entity, and may play roles in other systems.
Obviously, we cannot understand or explain society or business at the bottommost level of description (of interest to physicists and bio-chemists).
We explain these systems at a much higher level of abstraction - with reference to actors and activities we regard as atomic, irreducible, elements.
Similarly, the complexity of communications, between social or software entities, depends on how far you unravel the levels of the communication stack.
The atomic activities in a communication stack are units of micro-scale physical matter/energy.
E.g. vibration patterns in sound waves, and electron movements in electrical circuits.
In cybernetics: “a system is any set of variables which he [the observer] selects” Ashby 1956.
A principle of system theory is that a system has a current state.
A system can be analyzed in terms of how its state variables change over time, often in response to inputs from outside the system.
Even the simplest system, say a pendulum or a thermostat, changes state dynamically.
System state: the current structure or variables of a system, which may change over time.
A concrete system’s property values realise property types or variables in its abstract system description.
E.g.
 | Properties | Examples
Abstract description of system state | Variable types | Air temperature. Displayed colour.
Concrete realization of system state | Variable values | Air temperature = 80 degrees. Displayed colour = red.
System state change: a change to the state of a system.
Chaos theory taught us that a simple system can change state in an unpredictable way.
When you turn on a tap, the stream of water may start by running in a smooth way.
As you continue to turn the tap, its state may switch from smooth to confused, and back again.
Any system may change state at different rates at different times.
It may stay in a stable state for a while, then be triggered by an input to move to an unstable state, or to a different stable state.
Stable states are sometimes called “attractors”.
“The principal heuristic innovation of the systems approach is what may be called ‘reduction to dynamics’ as contrasted with ‘reduction to components’ ” Laszlo and Krippner.
A system is characterised by what it does more than what it is made of.
It is defined by some behavior(s) that are modelled with some particular interest in mind.
Behaviors are processes performed by parts, often called actors or agents, that play roles in a system.
Behaviors change the state of the system or something in its environment.
In cybernetics, the behaviors are processes that change state variable values.
In Forrester’s System Dynamics, the behaviors are inter-stock flows that change stock populations.
However, some use the term behavior differently - to mean the trajectory that a state variable’s values take over time.
That trajectory can be either regular (linear or cyclical) or irregular/chaotic.
Even if the state change trajectory is irregular, it is still an inexorable side effect of regular behaviors.
A principle of system theory is that processes – when changing the state of a system - are influenced by its current state.
The current state is a result of all past processes.
E.g.
The current state of | Is a result of
The moon’s surface | past asteroid strikes, over millennia.
A human actor’s memory | past perceptions and thought processes, over a lifetime.
A computer actor’s memory | past input messages and computations.
Every decision and action that depends on the current state of a system is – indirectly - dependent on its past history.
An information system that does not maintain current state information can derive it from a log of past events (this is called hysteresis).
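As a minimal illustration of deriving current state from a log of past events, here is a hedged Python sketch (the bank-account example and event names are invented, not drawn from the paper):

```python
# Rebuild the current state of a simple account system by replaying its
# event log, rather than storing the balance directly.
events = [
    ("deposit", 100),
    ("withdraw", 30),
    ("deposit", 55),
]

def current_state(event_log):
    """Current state = the cumulative effect of all past events."""
    balance = 0
    for kind, amount in event_log:
        if kind == "deposit":
            balance += amount
        elif kind == "withdraw":
            balance -= amount
    return balance

print(current_state(events))  # 125 - the state is derived from the history
```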
Chaotic generally means disorderly, random, with no regular or repeated pattern.
Confusingly, some people equate chaos with complexity.
Counter-intuitive to some, complexity appears when order emerges from chaos.
E.g.
The complexity of a | Composed of | Emerges from orderly patterns imposed on
Scale-free network | Nodes | a chaotic random network with the same nodes.
Biological cell | Molecules | a chaotic soup of the same chemicals/molecules.
Social network | Actors | chaotic ad hoc interactions of the same actors.
Chaos has two more specific meanings, related to unpredictability and non-linearity.
Chaos 2 - unpredictable outcomes arising from variations in initial conditions
“Chaos: When the present determines the future, but the approximate present does not approximately determine the future.” –– Edward Lorenz
Chaos theory applies to deterministic systems in which the future is predictable from the initial conditions of the system.
In practice, prediction (for example of the weather) is often frustrated by two obstacles:
· small differences between initial conditions can lead to dramatically different futures (illustrated in the sketch below)
· the observations and computations needed are beyond what can practically be made.
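The first obstacle can be shown with the logistic map, a textbook example of deterministic chaos (the parameter and starting values below are arbitrary illustrative choices, not taken from the paper):

```python
# Two runs of the same deterministic rule, x -> r*x*(1-x), starting from
# initial states that differ by one part in a million.
r = 3.9
x, y = 0.500000, 0.500001
for step in range(1, 41):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(step, round(x, 4), round(y, 4))
# After a few dozen steps the two trajectories bear no resemblance to each
# other: the approximate present no longer approximates the future.
```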
Chaos 3 - non-linear state change trajectory
Linear system state change has a straight line trajectory - directly proportional to time, or to an input event stream.
Non-linear state change has a curved, jagged or chaotic trajectory.
Non-linear state change can be the outcome of a simple system with simple rules.
E.g. a simple predator-prey system can have a chaotically non-linear state change trajectory.
A system composed of wolves and sheep interacting according to simple rules can change state chaotically.
The behavior of an individual actor (e.g. a wolf) in response to an event may be deterministic and predictable from its current state.
Yet at a macro level, the volumes of populations (wolf packs and sheep flocks) can fluctuate in what seems a random or chaotic manner.
Populations may remain stable for a while, then boom or bust.
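A hedged sketch of this macro-level effect, using a crude Lotka-Volterra-style stock-and-flow model rather than agent-level wolves and sheep; all rates and starting populations are invented for illustration:

```python
# Sheep multiply, wolves eat sheep, wolves starve without sheep.
# Each rule is simple and deterministic, yet at the macro level the two
# populations repeatedly boom and bust.
sheep, wolves = 40.0, 9.0
dt = 0.001                                   # small time step (Euler integration)
for step in range(50001):
    eaten = 0.025 * sheep * wolves           # predation rate
    d_sheep = 0.6 * sheep - eaten            # sheep born minus sheep eaten
    d_wolves = 0.2 * eaten - 0.4 * wolves    # wolves fed minus wolves dying
    sheep += d_sheep * dt
    wolves += d_wolves * dt
    if step % 5000 == 0:
        print(f"t={step * dt:5.1f}  sheep={sheep:6.1f}  wolves={wolves:5.1f}")
```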
The Plexus Institute glossary says:
"complexity is found in systems when there are unpredictable interactions of multiple participants and components across many levels of the system."
(The glossary contains no definition of "system", or what the "levels" of a system are.)
The definition equates unpredictability with complexity.
However, as indicated above, the next action or state of a simple system may be unpredictable because
· its current state is unknown
· its rules include a random or probabilistic choice between actions
· interactions between actors at a micro-level lead to unpredictable state change effects at the macro-level.
And in the simplest human social system, how an actor responds to information received is unpredictable, because humans, having free will, can choose their response – either within the bounds of a system, or contrary to the rules of that system.
Adaptation is ambiguous, since it can mean either system state change (as in homeostasis) or system mutation.
A simple system can adapt by responding to change in its environment.
A simple homeostatic system can withstand perturbations and restore its original state after a large perturbation.
E.g. a thermostat-controlled heating system.
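A minimal sketch of such homeostatic state restoration (the temperatures, rates and perturbation below are invented for illustration):

```python
# A thermostat-controlled room: the heater switches on below the set point
# and off above it, so the temperature is restored after a perturbation
# (e.g. a window left open at step 30).
target, temperature = 20.0, 20.0
for step in range(90):
    if step == 30:
        temperature -= 6.0                # perturbation: cold air rushes in
    heater_on = temperature < target      # the simple control rule
    temperature += 0.4 if heater_on else -0.1
    if step % 10 == 0:
        print(step, round(temperature, 1), "heater on" if heater_on else "heater off")
# Within a few dozen steps the temperature is back near the set point:
# homeostasis in a very simple system.
```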
The term adaptation is used with a wide variety of meanings, including
· Homeostatic adaptation through state restoration.
· Psychological adaptation through biological development and learning from experience.
· Sociological adaptation through social communication and learning from education.
· Species evolution through genetic variation and natural selection.
The last of these is discussed separately below.
Like every other entity, a system has a discrete life time, which can be short or long.
Evolution: inter-generational system mutation, changing the nature of a system.
Evolution can be organic (a virus) or designed (a software upgrade).
It is generally helpful to distinguish system state change from system mutation.
However, beware that long-lived coarse-grained systems often depend on shorter-lived subsystems.
And to maintain the state of a long-lived coarse-grained system may require mutations at the level of its atomic actors.
E.g. consider a termite colony as a system that must restore the state of its mound after damage.
No mutation is needed, since termites are innately pre-programmed to make this homeostatic state change.
E.g. consider a football team as a system of players that aims to win football matches.
Player mutations (when the manager replaces one player by another) are necessary to maintain the state of the whole team.
E.g. consider your immune system as a colony of cells that must recognise and dispose of pathogens.
The immune system doesn’t know what invaders it might meet, so it makes millions of different cells, each to recognise a different pattern.
Cell mutations are necessary to maintain the healthy state of the whole organism.
E.g. consider the biosphere as a system of organisms that recovers from mass extinctions over millions or billions of years.
Organism mutations are necessary to maintain the state of the whole biosphere.
The evolution of biological organisms depends on chance mutations proving beneficial to survival.
The evolution of business organisations depends primarily on designed mutations proving beneficial to customers.
Evolution favours whatever helps organisms and organisations perform better than rivals in competition for limited resources.
Typically increasing efficiency can involve simplification, and increasing effectiveness can involve complexification.
Some system theorists (notably Ashby and Maturana) have said the concept of a self-organising system makes no sense.
However, the term is widely used, and has been interpreted in an extraordinarily wide variety of ways.
Self-organisation = absence of a design or pattern?
This means there is no law, rule or definition of how a system forms or changes.
However, one may say there is a blueprint for much so-called “self-organisation”.
The blueprint for self-organisation in a solar system is found in the laws of physics.
The blueprint for self-organisation in a molecule is found in the chemists’ periodic table.
The blueprint for self-organisation in a biological organism is found in its DNA.
The blueprint for self-organisation in a social system is found in the minds or documents of its actors.
In the first three examples, the self-organisation is predetermined and predictable in theory if not in practice.
Self-organisation = decentralised control?
Decentralisation means there is no central control - no overarching controller of system processes.
Rather, the processes of the system are distributed between atomic components or agents.
In sociology: this may be called anarchy or a participatory democracy (apparently the vision of many social systems thinkers).
In computing: this corresponds to the “choreography” design pattern rather than the “orchestration” pattern.
Decentralised choreography is found in even very simple software systems, so it does not imply “self-organisation”.
Self-organisation = accretion?
Accretion means growth or increase by the gradual accumulation of additional layers or matter.
The accretion of a crystal growing in a super-saturated liquid is a simple process that has been called self-organisation.
The accretion of a city growing as it attracts more people and money has also been called self-organisation.
But neither is what people usually think of by “self-organisation”.
Self-organisation = flocking?
Flocking means to move or go together in a crowd.
The shoaling behavior of fish has been called “self-organisation”.
The behavior of a flock of starlings wheeling in the sky has also been called “self-organisation”.
In both examples, many simple interactions between many adjacent actors produce a complicated moving shape.
However, the complexity of these state change (visual) effects is more in the eye of the observer than in the system itself.
Self-organisation = morphogenesis?
By any measure, the morphogenesis of an organism is a complex process.
The process, predetermined by DNA, inexorably builds an adult organism from an egg.
As the process proceeds, new kinds of component and interaction emerge, increasing the complexity of the organism.
Self-organisation = business reorganisation?
All the processes above are very different from the “self-organisation” of a social or business organisation.
The morphogenesis of a business organisation is a process that leads to outcomes that are not inexorable or pre-determined.
Organisation changes are stimulated by a variety of internal and external forces - that might either complexify or simplify the organisation.
Ultimately, external forces (political, economic, social, legislative and environmental) dominate the internal ones.
A classification of system change varieties
Further analysis resulted in this (tentative) classification.
· State change: changing the values of given state variables (typically triggered by inputs).
· Update: changing the values of variables in response to inputs.
· Accretion: as in the expansion of a city, or the inexorable growth of a crystal in a super-saturated liquid.
· Flocking: as in the flocking of starlings, or the shoaling behavior of fish.
· Self-regulation: as in the maintenance of homeostasis during the life of an entity.
· Self-sustaining: in which autopoietic processes make and maintain the structures that perform the processes.
· System change: changing the variables or the rules that update their values.
· Reconfiguration: changing behaviour in a pre-ordained way.
· Leverage: switching a system from one given configuration to another.
· Morphogenesis: as in the inexorable growth of an embryo into an adult.
· Evolution: changing behaviour in a random or creative way.
· Discrete mutation: replacement of one system generation by the next.
· Mutation with random change: as in biological evolution.
· Mutation with designed change: redesign by external observers, or by self-aware actors who observe and change the system they play roles in.
· Continuous mutation: n/a. Impossible here, since it is contrary to the notion of a system.
Self-organisation appears above in several guises.
The guises that look like they fit much social systems thinking discussion are:
· Continuous mutation (which must be rejected, because it undermines the concept of a system).
· Mutation with designed change by self-aware actors who observe and change the system they play roles in.
Ashby and Maturana have said the concept of a self-organising system makes no sense.
They rejected the idea that a system can change itself by creating new variables or rules.
They said a system can only be re-organised from outside the system, by a higher process or meta system.
Since c1950, the basic idea of general system theory has been that some ideas and principles are common to systems in all scientific and professional domains.
However, socio-cultural systems thinking discussions often confuse generality with analogy.
E.g. To say an open system consumes inputs and produces outputs is a generalisation - common to different sciences.
To say the structures of a human’s nervous system and a business management system are both divisible into five parts is an analogy.
Drawing an analogy is different from saying the same science applies to both.
It is common to draw analogies between biological organisms and business organisations.
This and other analogies can help to explain some things but can also (as Ackoff and Bausch have observed) be misleading.
The biology-sociology analogy
“Every object that biology studies is a system of systems.” Francois Jacob
The same is true in some other sciences; however, it is misleading to compare sociology with biology.
“Every living organism is essentially an open system. It maintains itself in a continuous inflow and outflow…” Bertalanffy.
Clearly, the notion of the system as a processor of inputs is a generalisation that fits sociology and biology.
This system | depends on | that consume | and manufacture
A biological cell | bio-chemical structures | higher forms of energy and order (e.g. sunlight and complex chemicals) | bio-chemical structures and outputs
A business organisation | employees and machines | materials and/or information | materials and/or information that customers value enough to purchase.
But consider the dramatic differences between a biological organism and a business organisation.
A biological organism is a system of subsystems that cooperate.
The morphogenesis of an organism is a complex process (predetermined by DNA) that inexorably builds from an egg to an adult organism.
As the organism grows, new kinds of component and interaction emerge, increasing the complexity of the system.
However, the process of life terminates in decay and death that destroys the complexity of that system.
The process is cyclical, it repeats in each generation of the system.
The atomic components (cells) have no free will, they cannot disobey the rules of biochemistry.
The only choice an organism can make in “self-organisation” is the choice of a mate.
This is very different from the “self-organisation” of a social or business organisation.
A business organisation is a legal entity that owns/contains many systems.
Some systems cooperate, some duplicate each other, some fail to cooperate as desired, some undermine or compete with each other.
The morphogenesis of a business is a process that leads to outcomes that are not inexorable or pre-determined.
The process is ever-unfolding, it never repeats.
Organisation changes are stimulated by a variety of internal and external forces - which might either complexify or simplify the organisation.
Ultimately, external forces (political, economic, social, legislative and environmental) dominate the internal forces.
Moreover, the atomic components (human actors) have free will, and may disobey the rules of any system they play a role in.
To say "the enterprise is a system", is somewhat misleading.
A business can be described as one coherent system, but only at the vacuously abstract level of income and profit/loss, of little use for most purposes.
It is better described as a social network in which actors play roles in regular processes that create and use business data.
To this end, a business employs countless separately described and tested systems.
The business actors both act in those business systems (where they behave in regular ways) and act outside of those systems (in ad hoc ways).
The ad hoc behaviors of actors in a social network are, by definition, not systematic, and cannot be included in any holistic or systemic description of that network.
In short, a business is a social network that employs countless discretely describable and testable systems.
But it is infinitely more than any system it can be described as realising.
We have many methods for analysing and designing systems of interacting components.
There are methods and design patterns for the design of software systems and computer networks.
There are agent-based modelling techniques for modelling epidemics, the flocking of birds and other natural phenomena.
The mathematical modelling techniques of System Dynamics are applied to systems of all kinds – ranging from simple to complex.
There are climate forecasting models, and computer models of pedestrian dynamics.
It is important not to confuse such mathematical models with non-mathematical socio-cultural models of the kind promoted in social systems thinking discussion.
To master complexity implies knowing how to define, describe and design it.
Generally, it seems a system grows more complex by increasing the orderliness and variety of its components and interactions more than by increasing their quantity.
Ashby defined the complexity of a system as a measure of its state variety.
“A system's variety V measures the number of possible states it can exhibit …”
(Ergodic means that the system visits all its possible states; non-ergodic means it does not.)
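On that definition, variety grows multiplicatively with each state variable. A minimal Python sketch (the variables and value counts below are invented for illustration, not taken from Ashby):

```python
# Variety as a count of possible states: the product of the number of values
# each state variable can take (assuming the variables are independent).
variables = {
    "heater": 2,          # on / off
    "display colour": 3,  # red / amber / green
    "temperature": 101,   # 0..100 degrees, whole numbers only
}

variety = 1
for name, values in variables.items():
    variety *= values
print(variety)  # 2 * 3 * 101 = 606 possible states for three small variables
# Add a few more realistic variables and the count becomes astronomically
# large - one reason the measure is hard to use in practice.
```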
There are several difficulties with Ashby’s definition of complexity.
First, the measure is incalculably large for any system with a non-trivial state.
Second, scores of other complexity measures have been proposed: e.g. see the measures below.
Third, does the measure apply to the control system, or the target system of which only selected variables are controlled?
“every real machine embodies no less than an infinite number of variables, all but a few of which must of necessity be ignored.” Ashby 1952
What to look at in measuring the complexity of a tennis match?
The structures of the rackets, balls, net, court surface etc.?
The movements of the players, balls, etc.?
A moment’s thought is enough to conclude you can never measure the complexity of a real world entity or behavior per se.
You can only measure it with respect to your chosen description of its structures and behaviors.
And then, only measure it at the level of abstraction at which you choose to describe them.
What is to be counted in a measure of complexity?
Do we count types or instances?
The number of component and interaction instances is a measure of scale or size.
The number of component and interaction types is a measure of variety.
The suggestion here is that complexity is more a function of orderliness and variety than of quantity.
Do we count the variety to be found in
· internal states the system can take (Ashby’s measure)
· internal components and connections between them (a structural measure)
· internal actions or interactions (a behavioral measure)
· inputs processed and outputs produced (the basis of function point measurement)?
A system that is complex in one of these ways may be simple in another.
Moreover, note that there are two kinds of behavioral complexity: process complexity and effect complexity.
Adding more fish to a shoal may make its effects more complex, but doesn’t make its processes (fish-to-fish interactions) more complex.
An economy has famously been defined as a simple system of monetary flows in the processes of spending, saving, investing and taxing.
Adding more buyers and sellers to an economic system may make its effects more complex, but doesn’t necessarily make its processes more complex.
Whatever you count, you cannot do it without reference to a description of a system’s components.
The trouble is, there are infinite possible descriptions of one thing in terms of its components.
And there are scores of complexity measures.
Here are a few complexity measures I have picked up over the years.
· Structural complexity = variety = number of states (Ashby).
· Structural complexity = inter-component dependencies (Sessions).
· Maximum structural complexity = components * (components - 1) / 2 (Brooks).
· Procedural (cyclomatic) complexity = number of decisions + 1 (McCabe).
· System complexity = number of variables and the number and the nature of their interactions required to explain the properties and behavior of that system (Ackoff, 2003).
· Complexification formula: For every 25% increase in the complexity of the problem space, there is a 100% increase in the complexity of the solution space (Glass’s Law).
I have proposed two complexity measures of my own:
· Structural complexity = relationships between component types / component types.
· Behavioral complexity = the number of event/state combinations * the average procedural complexity of the rules applied to them.
There are many more (at least 40) possible measures of complexity.
"The diversity of measures that have been proposed indicates that the notions of complexity that we're trying to get at have many different interacting dimensions and probably can’t be captured by a single measurement scale” ("Complexity: A Guided Tour" Chapter 7 "Defining and Measuring Complexity" Melanie Mitchell.)
Much discussion of “complex adaptive systems” confuses what are better distinguished.
Students of system theory should be taught to distinguish:
· abstract systems from the entities that realise them
· component interactions from network structures
· emergence of effects from system mutation
· chaos from complexity
· unpredictability from non-linearity from complexity
· system state change from system mutation
· five or six interpretations of self-organisation
· science from postulation.
Modern society depends on systems of many kinds, simple and complex.
Our dependency on systems, and the complexity of those systems, has increased and may continue to increase.
However, some of today’s complex systems will become the irreducible atomic components of tomorrow’s systems.
So, at the level you need to analyse or design future systems, complexity may be contained to what you can understand.
A commonly-referred-to chart of “complexity science” can be found here http://www.art-sciencefactory.com/complexity-map_feb09.html
The chart is misleading chronologically and conceptually; it mixes up science and pseudo science.
It mixes up scientists with people who make assertions and classify things with no empirical validation.
It includes people (e.g. Parsons and Luhmann) whose ideas are metaphysical – cannot be verified or disproved.
Features discussed as definitive of “complex systems” were listed in the preface and have been discussed in the body of this paper.
The trouble is that simple systems have the same features.
So to say a system is complex requires more.
It seems increasing the order and variety in components and interactions is more important than increasing their quantity.
If complexity science is to be called a science, it should be based on one or more measures of complexity.
Currently, there is no widespread agreement about complexity measurement.
Which is more complex, a hurricane or a water molecule? IBM or a hen’s egg?
It depends on your point of view, on what your interest in them is, and how you describe them.
A thing is simple or complex only with reference to a particular perception or description of it.
To compare the complexity of systems, they must be described a) in the same style and b) at the same level of decomposition.
Often, an entity is called a complex system where one or more of the following are true.
· No measure of complexity has been agreed.
· No level of abstraction has been agreed.
· No quantifiable properties are described, which makes any measure of complexity impossible.
· No description of the entity as a system has been agreed, or even made.
· No description is possible, because the entity changes continually, rather than generationally.
Perhaps the bigger problem is that so many draw the equation social network = social system.
Systems are abstractions from realities.
No system describes all that is knowable about a reality; and all systems may fail to describe what an observer is interested in.
The actors in one social network may play roles in countless different social systems.
The actors make choices and behave in ad hoc ways – meaning that the network enables more than all possibly describable systems.
It is misleading to use the term "system" for any but one perspective of what a business or social network does.
Other ways of looking at the social network may be more helpful – for example in motivating actors.
All said above is explored on the "System Theory" page at http://avancier.website
For discussion of the same ideas in the wider context of general system theory and enterprise architecture, read “Introduction to system ideas”.
For more discussion of complexity, read Complexity explained part 2.
For discussion of particular points made, read “System coupling concepts”, “Second order cybernetics”, “Third order cybernetics” and “Complex adaptive systems”.
All free-to-read materials at http://avancier.website are paid for out of income from Avancier’s training courses and methods licences.
If you find the web site helpful, please spread the word and link to avancier.website in whichever social media you use.