A systems thinking vocabulary – with ambiguities
Copyright 2019 Graham Berrisford. One of about 300 papers at http://avancier.website. Last updated 29/09/2019 19:44
We spend most of our time thinking in ways not well-called systems thinking.
They might be called “entity thinking”, “situation thinking”, “analytical thinking”, “creative thinking” or just “thinking”.
To call all of those “systems thinking” would be to overload the term until it means everything and nothing in particular.
So, can we pin down what systems thinking is about?
Contents
On abstract and concrete systems - recap.
A complex adaptive system (CAS)?
System: an entity describable as actors interacting in activities to advance the system’s state and/or transform inputs into outputs
· The activities are behaviors (over time) that change the state of the system or something in its environment - governed by rules that are describable and testable.
· The state is describable as a set of state variables - each with a range of values.
· An open system is connected to its wider environment - by inputs and outputs that are describable and testable.
These concepts can be seen in writings of Ashby, Forrester and Checkland.
In Ashby’s cybernetics, a system is modelled as processes that maintain a set of state variables.
In Forrester’s system dynamics, a system is modelled as inter-stock flows that maintain a set of stocks (variable populations).
In Checkland’s soft systems method, a system is modelled as actors who perform processes that transform inputs into outputs for customers.
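The shared idea in all three schools can be sketched in code. The following is a minimal, illustrative toy (the thermostat, its variables and its heating rule are invented here, not taken from Ashby, Forrester or Checkland): actors interact in rule-governed activities that advance a state variable and transform an input into an output.

```python
class Thermostat:
    """An Ashby-style system: a process maintains a state variable."""

    HEAT_GAIN = 2.0   # degrees added per step when the heater is on

    def __init__(self, target=20.0):
        self.target = target        # a rule parameter
        self.room_temp = 15.0       # a state variable with a range of values

    def step(self, outside_temp):
        """One activity: read an input, update state, emit an output."""
        heating_on = self.room_temp < self.target        # the rule
        drift = 0.1 * (outside_temp - self.room_temp)    # environmental input
        self.room_temp += (self.HEAT_GAIN if heating_on else 0.0) + drift
        return heating_on                                # observable output

t = Thermostat()
outputs = [t.step(outside_temp=5.0) for _ in range(10)]
```

Run for long enough and the state variable settles into oscillation around the target: the homeostatic behaviour Ashby studied, produced by a describable and testable rule.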
Terms are often ambiguous or undefined in systems thinking discussion.
They can mean different things in different schools, as this table indicates.
Term      | School                | Meaning
Complex   | Classical cybernetics | The measurable complication of an abstract system description
Complex   | Sociological thinking | The un-measurable disorder or unpredictability of a real world situation
Adaptive  | Classical cybernetics | System state change – updating the values of system variables
Adaptive  | Sociological thinking | System mutation – changing the roles and rules of the system
System    | Classical cybernetics | Actors playing roles and acting according to rules
System    | Sociological thinking | A group of self-aware actors who inter-communicate and act as they choose, or a problematic situation
Emergence | Classical cybernetics | A property arising from coupling subsystems into a large system
Emergence | Sociological thinking | Not seen before, new, or surprising
For more, read on.
Ackoff, Ashby, Checkland and others have emphasised that a system is a perspective of a reality.
“Different observers of the same [concrete] phenomena may conceptualise them into different [abstract] systems.” (Ackoff 1971)
Ashby urged us not to confuse a concrete entity with an abstract system that the entity realises.
“At this point we must be clear about how a "system" is to be defined. Our first impulse is to point at [some real entity or machine] and to say "the system is that thing there". This method, however, has a fundamental disadvantage: every material object contains no less than an infinity of variables and therefore of possible systems. Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made. What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” (Ashby 1956)
A concrete entity is a system only when and in so far as it realises a testable system description.
We do commonly abuse the term “system”.
We point to an entity (e.g. a business organisation or a biological organism) and casually call it "a system".
But with no explicit or implicit reference to a particular system description, this is to say nothing.
Idly calling an entity (or process, problem or situation) a system is meaningless, because one entity can realise countless systems.
Abstract system: a description or model of how some part of the world behaves, or should behave.
E.g. the standard heart beat is a theory of, or insight into, how some part of the world works.
Concrete system: a realisation by a real-world entity that conforms well enough to an abstract system.
E.g. your own heart beating is a real-world application, or empirical example, of such a theory.
Science requires us to show the latter conforms (well enough) to the former.
The basis of system theory:
System thinkers <create and use> abstract systems (descriptions).
System thinkers <observe and envisage> concrete systems (realities).
Abstract systems <represent> concrete systems.
These papers take this triangular, scientific, view of system theory as axiomatic.
Note that the relationship between physical entities and abstract systems is many-to-many.
One abstract system (e.g. the game of poker) may be realised by countless physical entities (countless card schools).
One physical entity (e.g. a card school) may realise countless abstract systems (e.g. poker, whist etc).
An abstract system does not have to be a perfect model of what it describes; it only has to be accurate enough to be useful.
And note that systems thinking hides the full complexity of real-world entities that realise systems.
In discussion and testing of the stickleback mating ritual, no attention is paid to any individual stickleback, or its complex internal biochemistry.
Abstract system | Conceptual | A set of roles and rules (the logic or laws actors follow)  | The stickleback mating ritual
Concrete system | Physical   | Actors playing the roles and acting according to the rules | Countless pairs of stickleback
“Since different systems may be abstracted from the same real thing, a statement that is true of one may be false of another.
… there can be no such thing as the unique behavior of a very large system, apart from a given observer.
… there can be as many systems as observers… some so different as to be incompatible.
… studying only carefully selected aspects of things is simply what is always done in practice.” (Ashby 1956, 6/14)
Unfortunately, most of us use the term system for both abstract systems (types), and concrete/real world entities or social networks that instantiate them.
And so, in systems thinking discussions, we often confuse them.
But remember, an abstract system may be realised by several real-world entities - each of which might also do other things.
And one entity might realise other abstract systems – each defined by taking a different perspective of the entity.
Reconciling different abstract or soft systems is a theme of Checkland’s "soft systems methodology”.
System: an entity describable as actors interacting in activities to advance the system’s state and/or transform inputs into outputs.
· Abstract system: (e.g. the normal regular heart beat) a description or model of how some part of the world behaves, or should behave.
· Concrete system: (e.g. your own heart beating) a realisation by a real-world entity that conforms well enough to an abstract system.
· Natural system: a system that runs before it is envisaged as a system.
· Designed system: a system that is envisaged before it runs.
· Closed system: a system defined (as in a System Dynamics model) without reference to its wider environment.
· Open system: a system defined as consuming/delivering inputs/outputs from/to its wider environment.
Atomic element: a system element that is not further
divided in a description.
Behavior: a service or process that changes the state of a system or something in its environment.
In most systems thinking schools, a system is dynamic, meaning it behaves in ways that involve and change the structural state of the system.
This is true in Ashby’s cybernetics, Forrester’s system dynamics and Checkland’s soft systems.
Ashby put it that cybernetics deals with “all forms of behaviour in so far as they are regular, or determinate, or reproducible.”
In other words, the behaviours of a system are orderly processes that can be described by an observer or designer.
When performed, the processes can be tested because they produce observable outputs and/or internal effects or state changes.
After Forrester, Meadows defined a system thus:
“A set of elements or parts that is coherently organized and interconnected in a pattern or structure that produces a characteristic set of behaviors."
In short, a system is characterised by what it does more than by what it is made of.
This principle is occasionally called the primacy of behaviour.
Alternatively, the term behavior can refer to the external appearance of a system.
That is, to how a system is observed to change state over time - its state change trajectory.
Chaotic: 1) disorderly, random, with no regular or repeated pattern. 2) unpredictable outcomes arising from variations in initial conditions. 3) non-linear state change trajectory.
The system of interest can be orderly, yet produce chaotic results.
The system’s behavior in response to one event may be deterministic and predictable.
Yet the long-term behavior - the trajectory of state changes over time - may be chaotically unpredictable.
Linear: in a straight line or a sequence.
Some use the term non-linear as a synonym for chaotic; properly speaking it has a different meaning.
Linear state change: progress or change over time that is represented in a graph as a straight line.
Non-linear state change: progress or change over time that is represented in a graph as oscillating, a curve or jagged.
Complex: a term with scores of different definitions.
A system that is complex in one way may be simple in another.
It may be complex internally and/or appear complex externally.
Internally, there may be a wide variety in its variables, its roles or its rules.
Externally, there may be complex convolutions in the observable trajectories of state variable changes.
See below for more discussion of complexity.
Coupling: the relating of subsystems in a wider system by flows.
For discussion of coupling varieties, or design patterns, read this paper on system coupling varieties.
Deterministic system: a system that processes an input, with respect to its memory/state, to produce a result that is predictable.
The response may be to choose one of many actions and/or to complete a complex process.
The choice between options may be made using random or probabilistic rules – which makes the response relatively unpredictable.
Some equate unpredictability with complexity – which is misleading - since even a very simple system can be unpredictable.
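The contrast between a deterministic rule and a probabilistic one can be sketched in a few lines. This is an invented toy (a door latch), not an example from the paper:

```python
import random

def deterministic_rule(state):
    """Same state always yields the same next state."""
    return "open" if state == "closed" else "closed"

def stochastic_rule(state):
    """A probabilistic rule: the result is predictable only with a probability."""
    return "open" if random.random() < 0.5 else "closed"

# Repeating the same input shows the difference: the deterministic system
# is repeatable; the stochastic one is not.
deterministic_results = {deterministic_rule("closed") for _ in range(100)}
stochastic_results = {stochastic_rule("closed") for _ in range(100)}
```

Both systems are equally simple; only the second is unpredictable, which is the point made above.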
Emergence: the appearance of properties in a wider or higher-level system, from the coupling of lower-level subsystems.
For example, the forward motion of a bicycle and its rider emerges from their interaction; neither can achieve it on their own.
However, the term emergence has been used to mean many other things.
E.g. the emergence of order from disorder, the emergence of a system from what seems chaos, or the emergence of system mutations by evolution or design.
And “The concept has been used to justify all sorts of nonsense.” (Gerald Marsh)
For more on emergence, read “Emergence”.
Event: a discrete input that triggers a process.
Evolution: a progression of inter-generational system mutations that change the nature of a system.
Exception: what happens when actors do not complete actions expected of their roles.
This is common in business systems composed of processes in which human actors play roles.
The need to design systems with exception handling is a common source of complexity.
Flow: the conveyance of a force, matter, energy or information.
Hierarchy: the successive decomposition of a system into smaller subsystems or a behaviour into shorter behaviors.
Holism: looking at a thing in terms of how its parts join up rather than dissecting each part.
Having a holistic view of a thing does not mean you know the “whole” - all there is to know about the thing and/or its parts - since how you identify the parts and join them up is only one perspective out of countless possible ones.
Holistic view: a description of how parts relate, interact or cooperate in a whole.
Information flow: the conveyance of information in a message from a sender to a receiver.
Information quality: an attribute of a flow or a state, such as speed, throughput, availability, security, monetary value.
Information state: the information retained in a memory or store.
Information: any meaning created or found in a structure by an actor.
Learning: the processes by which an organism or AI machine remembers past events and responds differently to new events (implies pattern recognition and fuzzy logic).
Linear: in a straight line or a sequence.
Meta system: a system that defines a system or changes it from one generation to the next.
Organicism: the idea that systems are describable at multiple hierarchical levels.
Process: a sequence of activities that changes or reports a system’s state, or the logic that controls the sequence.
Reductionist view: identifying the parts of a whole, naming or describing parts without considering how the parts are related in the whole.
Stochastic system: a system that processes an input, with respect to its memory/state, to produce a result that (due to some randomness) is predictable only with a degree of probability.
System environment: the world outside the system of interest.
System boundary: a line (physical or logical) that separates a system from its environment, and encapsulates a system as an input-process-output black box.
System interface: a description of inputs and outputs that cross the system boundary.
Two kinds of social group may be distinguished.
Social network: a group of actors who communicate with each other directly or indirectly.
Social system: a system that is realised by a social network.
Social cell: a social network in which actors find that realising the roles and rules of a particular social system is so attractive they resist any change to it.
Although any human society or business might be describable as a mix of networks and systems, we surely need to be clear which we are talking about.
Suppose you observe a group of people playing a game of cards.
You identify the actors in that social network, then work out the roles they play and the rules they follow.
You can distinguish three things.
· The soft or abstract system - the rules of the game - a perspective of the group’s behaviour.
· The real machine or concrete system – the actors distributing and playing their cards.
· The social network - composed of actors who meet to play roles in the card game.
The actors in the social network are much more than the roles they play in the game of cards.
If your aim is to understand or motivate the actors, you need tools other than "systems thinking".
Suppose you observe a second group of people.
You identify the actors in a social network who talk to each other, and perhaps some goals they have in common.
You see they communicate and do stuff, guided by their personal and to-some-extent shared goals.
But you cannot identify any roles they play or rules they follow.
You see what may be called a social network, but to call it a system adds no useful meaning.
It seems some in the latest generation of systems thinkers are not clear what a system is.
To say every named group of people (or social network) is a system is to use the term with no useful meaning.
To say every entity composed of inter-related parts is a system is to say nothing of practical use.
To say every problem and situation we encounter is a system is to denude the term of its value.
Some use the term complex adaptive entity to describe entities and situations that are unstable and disorderly.
That is, to describe things which are not systems in either a natural language or a general system theory sense.
If that thing is a business, problem or situation that requires an "intervention”, you may need some social network thinking as well as systems thinking.
Social systems thinkers often speak of a Complex Adaptive System (CAS).
The trouble is, it isn't clear they agree:
· why they call the thing they are talking about a system
· why they call it complex, or how they could measure that
· in what ways they expect it to adapt and
· when an adaptation changes one system into a different one.
Complex?
In cybernetics, a system is complex if the system description is complex; the roles and rules are complex.
To social systems thinkers, a system is complex if the reality is complex, the actors are complex; their roles and rules may be lightly prescribed, if at all.
Adaptive?
In cybernetics, a system adapts to feedback from its environment by changing state – which may be called self-regulating.
To social systems thinkers, a system mutates as actors change its roles, rules or aims - which may be called self-organising.
System?
In cybernetics and system dynamics, a system is a collection of repeated or repeatable activities.
In social systems thinking, a system is a collection of actors, who interact as they choose.
What is a complex adaptive system?
How about a simple system that adapts by changing state: a simple adaptive system (SAS)?
E.g. a bicycle + rider system, or a motor car + driver system.
Social systems thinkers tend to dismiss such systems as "linear" or "mechanistic" or "deterministic".
How about an entity that mutates continually: a continuously evolving entity (CEE)?
E.g. IBM, or any informal human society.
A continuously evolving entity is not a system in the ordinary sense of the term.
It is rather an ever-unfolding process, whose behavior is not describable and testable.
The following sections expand on these ambiguities.
Complex: a term with scores of different definitions.
Complex systems were introduced by Ashby thus:
“Not until… the 1920s… did it become clearly recognised that there are complex systems… they are so dynamic and interconnected that the alteration of one factor immediately acts as cause to evoke alterations in others, perhaps in a great many others.”
Ashby defined complexity as a measure of state variety – meaning the number of different states a system can take.
Others define it differently.
E.g. Snowden defines complexity in a way that is particular to his classification of problematic situations.
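Ashby's measure can be made concrete. In the following sketch the system and its variable names are invented for illustration; for independent state variables, the variety of the whole system is the product of the number of values each variable can take:

```python
from math import prod, log2

# Each state variable has a range of values; variety is the number of
# distinct states the whole system can take.
state_variables = {
    "heater": 2,        # on / off
    "door": 2,          # open / closed
    "temperature": 50,  # 50 discernible temperature levels
}

variety = prod(state_variables.values())  # 2 * 2 * 50 = 200 distinct states
variety_in_bits = log2(variety)           # variety is often measured in bits
```

Note how the count depends entirely on which variables the observer chooses to describe - which is why, on this definition, complexity is a property of the system description rather than of the entity itself.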
Others speak of complex systems with reference to at least five different situations:
1 A complicated orderly system
Consider an organism or software system whose processes are variegated, complicated or convoluted.
It seems intuitively reasonable to call that complex, or relatively complex.
2 An unpredictable situation
Surely unpredictable cannot mean complex.
A situation in which actors must respond to unforeseen inputs in ad hoc ways is not a system at all.
And unpredictable state change patterns can be produced by very simple systems (see next point).
3 Non-linear system state change
A non-linear or convoluted state change pattern does not imply a complex system.
Chaos theory showed us it can result from repeating simple processes.
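The logistic map is the standard illustration of this point from chaos theory (it is not an example given in this paper): one trivially simple deterministic rule, applied repeatedly, produces a jagged trajectory in which almost identical starting states soon diverge.

```python
def logistic_map(x, r=3.9):
    """One deterministic state change: compute the next value from the current one."""
    return r * x * (1.0 - x)

# Two almost identical initial states...
a, b = 0.2, 0.2000001
trajectory_a, trajectory_b = [], []
for _ in range(50):
    a, b = logistic_map(a), logistic_map(b)
    trajectory_a.append(a)
    trajectory_b.append(b)

# ...soon follow wildly different trajectories, though the rule is trivial.
divergence = max(abs(x - y) for x, y in zip(trajectory_a, trajectory_b))
```

The rule itself could hardly be simpler or more orderly; only the long-term state change trajectory is chaotic.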
4 A disorderly situation
Consider for example a war zone, or an uncoordinated, decoupled, set of silo systems.
Surely disorder is chaotic rather than complex? (Disorder is simpler than order.)
5 A system composed by coupling other systems
It might seem obvious that coupling two subsystems makes a more complex system.
But only if you are obliged to describe or manage the internals of those subsystems.
Else, you can take the holistic view and ignore the internals of the subsystems.
E.g. a card game can be described as a simple system, regardless of whether it is played by people or software.
A system is an abstraction from reality, and most human system designers take human abilities for granted; they are axiomatic.
Often, an entity is called a complex system where one or more of the following are true.
· No measure of complexity has been agreed.
· No level of abstraction has been agreed.
· No quantifiable properties are described, which makes any measure of complexity impossible.
· No description of the entity as a system has been agreed, or even made.
· No description is possible, because the entity changes continually, rather than generationally.
Which is more complex out of communism or capitalism? Too difficult to answer?
OK, which is more complex out of IBM, Microsoft, a chicken and a hen’s egg?
A description of IBM as a receiver of money from customers and sender of money to suppliers is simple.
A description of IBM that included every activity of every employee would be complex beyond imagination.
But then, a description of a hen’s egg that included every sub-atomic particle would be even more complex.
Read our paper on complexity for more.
Adaptation: 1) system state change. 2) system mutation.
Adaptive can mean either system state change (as in homeostasis) or inter-generational system mutation.
System state: the current structure or variable values of a system, which change over time.
System state change: a change to the state of a system.
System mutation: changing the roles, rules or variables of a system.
Self-organisation: can mean various things, including growth and self-regulation or homeostasis.
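The two meanings of "adaptive" distinguished above can be sketched in code. This is a hypothetical example (the class and its rules are invented): a state change updates a variable's value under a fixed rule, while a mutation changes the rule itself.

```python
class Heater:
    """A trivial system: one state variable and one rule."""

    def __init__(self):
        self.on = False                       # a state variable
        self.rule = lambda temp: temp < 20.0  # the rule the actor follows

    def react(self, temp):
        self.on = self.rule(temp)             # state change: same rule, new value

h = Heater()
h.react(22.0)                      # state change within one system generation
after_state_change = h.on          # False, since 22 is not below 20

h.rule = lambda temp: temp < 25.0  # mutation: the rule itself is changed
h.react(22.0)
after_mutation = h.on              # True under the new, mutated rule
```

The same input produces a different response before and after the mutation, because the mutated heater realises a different abstract system.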
Sooner or later, the environment of a system changes in a way that threatens its survival.
The two kinds of change below are often confused in systems thinking discussion.
And many social systems thinkers apply the terminology of system state changes to system mutations.
System state change
System state change (within a system generation) is a change to the values of a system’s state variables (e.g. body temperature).
To change Ashby’s system state is to update its variable values in response to an event or condition.
To change Forrester’s system state is to change the quantity of a stock in response to an inter-stock flow.
State change examples:
· A bicycle + rider system - changes state by accelerating, or steering to the left.
· A crystal system - changes state by growing in a liquid.
· A heater system - changes state in response to messages from thermostat.
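A Forrester-style state change can be sketched in a few lines. The stock and flow rates below are invented for illustration: the stock is a state variable, and the flows update its quantity at each step.

```python
def simulate(steps=10, dt=1.0):
    """A single stock (a population) updated by two flows (births, deaths)."""
    population = 100.0            # the stock: a state variable
    history = [population]
    for _ in range(steps):
        births = 0.05 * population            # inflow, proportional to the stock
        deaths = 0.02 * population            # outflow
        population += (births - deaths) * dt  # the flows update the stock
        history.append(population)
    return history

history = simulate()   # the stock grows 3% per step
```

Note that only variable values change here; the stocks and flow rules themselves stay fixed, which is what marks this as state change rather than mutation.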
Tools for System Dynamics show the trajectory of quantitative variable value changes (up and down) in a graph.
A variable value may:
· steadily increase or decrease over time
· dwell for a while in a stable state (an attractor), then move to another stable state
· change in a non-linear, jagged or “chaotic” way.
It turns out that the simplest of systems can change state in a non-linear or chaotic way.
Whatever the state change trajectory looks like, it is an inexorable result of actors behaving according to given rules.
While the state of a weather system may change in a non-linear way, the laws of physics do not change.
Some describe a system with a non-linear state or chaotic state change trajectory as a complex system.
But while that trajectory may look complex, the system itself may be very simple and orderly.
System mutation
System mutation (between system generations) is a change to the nature of a system.
To change Ashby’s system is to change its variable types or processes.
To change Forrester’s system is to add/remove stocks or change inter-stock flows.
System mutation examples:
· A bicycle is converted into a monocycle.
· A motor car is converted into a boat.
· An entity is replaced by another (as a parent is replaced by a child).
To change the roles or rules of a concrete system is to change the abstract system it realises.
This change may be observed in differences in how the system works or what it produces.
The simplest of systems can mutate, or be replaced by a new generation.
Mutation is creative in the sense that it changes the very nature of a system, from one generation to the next.
Mutation can occur in at least three ways:
· redesign by actors outside the system - as a machine or software system may evolve
· redesign by actors who also play roles inside the system - as a card game may evolve.
· self-replication with changes - as a virus evolves.
Three distinctions related to system change
First, we must distinguish between system state changes (be they linear or chaotic) and system mutations (be they small or large).
Within a generation, by homeostasis or reconfiguration, a system can adapt to changes; let us call that an agile system.
Between generations, by evolution or redesign, a system can be adapted to changes by a higher-level process or entity.
Second, we must distinguish between continuous and discrete mutation.
A game of cards is a system in which there are regular, determinate and repeatable processes.
People can’t play a game of cards unless the players agree the rules, at least for the duration of one round.
Provided actors change the system incrementally, and all together, the classical concept of a system is upheld.
Continuous mutation undermines the very concept of a system, since it is disorganising rather than organising.
Instead of seeing an island of stability or order, we see an ever-mutating entity that is never describable and testable.
Third, we must distinguish a system from whatever entity or process causes it to mutate.
Self-organisation can mean various things, including growth and self-regulation or homeostasis.
In sociology, it often means something very different – the process by which actors who play roles in a system define or redefine the roles and rules of that system.
Some social systems thinkers consider it their mission to promote this idea, and the notion of a “participatory democracy”.
Ashby and Maturana said a machine cannot change itself; the change agent must sit outside the system of interest.
For them, “re-organisation” requires the intervention of a higher-level process or actor.
E.g. the process of biological evolution runs over and above any individual organism.
And to modify the car + driver system, you play a role in a different system, which may be called car design or psycho-therapy.
This paper defines some terms central to systems thinking and to discussion in other papers.
For further discussion of system change and change control, read “System change”.
The paper on Complex Adaptive Systems explores ambiguities in two academic definitions of that term.
The paper on System thinking ideas used in Agile explores the relevance of systems thinking ideas to agile software development and enterprise architecture.