Systems in general
Copyright 2016 Graham Berrisford. One of about 100 papers on the System Theory page at http://avancier.website. Last updated 05/10/2019 17:01
System thinking is not easy.
Suppose intelligence is defined as the ability to abstract useful types and patterns from the ever-unfolding process that is the universe.
And wisdom is defined as the ability to recognize types that matter (to a particular goal) among the infinite variables that might be abstracted from the infinite complexity of reality.
Then effective systems thinking requires a considerable amount of wisdom and intelligence.
Moreover, understanding systems thinking requires a paradigm shift because, as many systems thinkers have told us, a system is a very particular way of looking at the world.
A system is defined by abstracting a transient island of order or regularity from the infinite complexity and ever-unfolding process of the universe.
A concrete entity is a system only when and in so far as it realises a testable system description.
If an entity cannot be described as having the properties of a system, then it is not a system, and it cannot be designed or tested as one.
Contents
· A few points to get us started
· On abstract and concrete systems
· Classifying systems in general
· Open systems – with inputs and outputs
· On generalising from different kinds of system
· A dictionary of system concepts
· Footnote: a couple of half-baked comparisons
Some define a system as "parts related in an organised whole", which may be true but is too vacuous to be of much use.
That definition includes passive structures and taxonomies, like the Linnaean system for classifying organisms.
Here, the term “system” has the more interesting and useful meaning that emerged in the 20th century.
In most modern systems thinking, the “parts” of a system are actors or components that interact in activities.
System: an entity describable as actors interacting in activities to advance the system’s state and/or transform inputs into outputs.
· The actors are structures (in space) that perform activities - in roles and processes that are describable and testable.
· The activities are behaviors (over time) that change the state of the system or something in its environment - governed by rules that are describable and testable.
· The state is describable as a set of state variables - each with a range of values.
· An open system is connected to its wider environment - by inputs and outputs that are describable and testable.
These concepts can be seen in writings of Ashby, Forrester and Checkland.
In Ashby’s cybernetics, a system is modelled as processes that advance a set of state variables.
In Forrester’s system dynamics, a system is modelled as inter-stock flows that advance a set of stocks (variable populations).
In Checkland’s soft systems method, a system is modelled as actors who perform processes that transform inputs into outputs for customers.
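To make that generic concept concrete, here is a minimal Python sketch of a system modelled (Ashby-style) as a process that advances a state variable and transforms an input into an output. All names are assumptions made for this example, not code from any of these authors.

```python
# A minimal sketch of a system: an actor performs an activity that
# advances a state variable and transforms an input into an output.

class StockSystem:
    def __init__(self):
        self.state = {"stock": 0}            # state variables and their values

    def receive(self, quantity: int) -> str:
        """A process: consumes an input, advances state, delivers an output."""
        self.state["stock"] += quantity      # advance a state variable
        return f"stock is now {self.state['stock']}"

s = StockSystem()
print(s.receive(5))   # stock is now 5
print(s.receive(3))   # stock is now 8
```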
A system can evolve naturally or be designed purposefully
Solar systems and hurricanes evolve naturally.
A natural system emerges without intent, with no aims in mind.
The outcomes of its repeated behaviors are unintended consequences.
Bicycles and card games and symphonies are designed.
A designed system is created by intent, with aims in mind - though its outcomes may diverge from those aims.
Consider a cuckoo clock, a motor car, an accounting system, a choir, a tennis match.
The outcomes of their repeated behaviors are intended.
A system is defined by its actions more than by its actors
Any actor can be replaced by another actor performing the same activities.
What makes the solar system a system is not the substance of the planets, it is the regularity of the orbits they repeatedly perform.
What makes a game of poker a system is not the personalities of the players, it is the rules they follow in their roles as poker players.
A system can be closed or open
A closed system (as in a System Dynamics model) is defined without reference to its wider environment.
An open system is defined as consuming/delivering inputs/outputs from/to its wider environment.
A system can be predictable in the short term but not the long term
A deterministic system responds to external events and internal condition changes in an orderly way.
The rules mean you can predict what will happen when the next event or condition occurs.
Or, given stochastic rules, you can predict what will most likely happen.
However, this does not mean you can predict the longer-term trajectory of system state changes.
Chaos theory and Forrester’s System Dynamics taught us that even simple systems can be unpredictable in that way.
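A minimal illustration of that point, using the logistic map (a standard chaos-theory example, chosen here by assumption rather than taken from this paper): each step is fully deterministic and predictable, yet two near-identical initial states soon diverge completely.

```python
# A deterministic rule that is predictable one step ahead,
# but not over a long trajectory.

def step(x: float, r: float = 4.0) -> float:
    return r * x * (1 - x)   # the logistic map: simple and fully deterministic

a, b = 0.2, 0.2000001        # two almost identical initial states
for _ in range(40):
    a, b = step(a), step(b)
print(abs(a - b))            # the two trajectories have diverged completely
```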
A system can be homeostatic or not
In “Design for a Brain” (1952), Ashby discussed biological organisms as homeostatic systems.
He presented the brain as a regulator that maintains each of a body’s state variables in the range suited to life.
The table below distils the general idea.
Generic system: Actors interact in orderly activities to maintain system state and/or consume/deliver inputs/outputs from/to the wider environment.
Ashby’s design for a brain: Brain cells interact in processes to maintain body state variables by receiving/sending information from/to bodily organs/sensors/motors.
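As a toy illustration of the idea (an illustrative sketch, not Ashby's actual homeostat), a regulator can be coded as a process that pushes a state variable back into the range suited to life.

```python
# A minimal sketch of homeostatic regulation: the regulator acts
# only when a state variable leaves its viable range.

LOW, HIGH = 36.0, 38.0           # the viable range of the variable

def regulate(temperature: float) -> float:
    if temperature > HIGH:
        return temperature - 0.5   # e.g. sweat: push the value back down
    if temperature < LOW:
        return temperature + 0.5   # e.g. shiver: push the value back up
    return temperature             # within range: no action needed

t = 39.0
while not (LOW <= t <= HIGH):
    t = regulate(t)
print(t)   # 38.0: back inside the viable range
```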
However, homeostatic entities and processes are only a subset of systems in general.
In his more general work, “Introduction to Cybernetics” (1956), Ashby defined a system as a set of regular or repeatable behaviors that advance a set of state variables.
The Good Regulator theorem
Conant and Ashby (1970) proved that “every good regulator of a system must be a model of that system”.
The theorem was exemplified in biology and sociology long before it was articulated.
Both animals and businesses are connected to their wider environment by input-output feedback loops.
They remember the state of actors and activities that they monitor, inform and direct.
· The brain of an animal maintains mental models of things (food, friends, enemies etc.) it monitors and directs.
· The information systems of a business maintain documented models of things (customers, suppliers, etc.) it monitors and directs.
These memories must model or represent reality well enough, if animals and businesses are to act effectively and survive.
So, they must update their memories in response to input messages that reveal state changes in those actors and activities.
A concrete entity is a system only when and in so far as it realises a testable system description.
We commonly abuse the term “system”.
We point to an entity (e.g. a business organisation or a biological organism) and casually call it "a system".
But with no explicit or implicit reference to a particular system description, this is to say nothing.
Idly calling an entity (or process, problem or situation) a system is meaningless, because one entity can realise countless systems.
“Different observers of the same [concrete] phenomena may conceptualise them into different [abstract] systems.” (Ackoff 1971)
Ashby urged us not to confuse a concrete entity with an abstract system that the entity realises.
“At this point we must be clear about how a "system" is to be defined. Our first impulse is to point at [some real entity or machine] and to say "the system is that thing there". This method, however, has a fundamental disadvantage: every material object contains no less than an infinity of variables and therefore of possible systems. Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made. What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” (Ashby 1956)
System theorists distinguish what some call “soft systems” and Ackoff called “abstract systems” from their realizations.
An abstract system (e.g. the rules of Poker) is a theory of, or insight into, how some part of the world works.
A concrete system (e.g. a game of Poker) is a real-world application, or empirical example, of such a theory.
Science requires us to show the latter conforms (well enough) to the former.
The basis of system theory:
· System thinkers <create and use> abstract systems (descriptions).
· Abstract systems (descriptions) <represent> concrete systems (realities).
· System thinkers <observe and envisage> concrete systems (realities).
These papers take this triangular, scientific view of system theory as axiomatic.
· An abstract system (e.g. the normal regular heart beat) is a description or model of how some part of the world behaves, or should behave.
· A concrete system (e.g. your own heart) is a realisation by a real-world entity that conforms well enough to an abstract system.
An abstract system does not have to be a perfect model of what is described.
It only has to be accurate enough to be useful in understanding and predicting the behaviour of a concrete system.
It is a type that hides the infinite complexity of real-world actors and activities that instantiate (or realise) the type.
Abstract system description: a theoretical system; a type; a set of roles and rules (the logic or laws actors follow).
Concrete system realisation: an empirical system; an instance; actors playing the roles and acting according to the rules.
The pairs below are some examples.
· Abstract system description: “Solar system”. Concrete system realisation: planets in orbits.
· Abstract system description: the laws of tennis. Concrete system realisation: a tennis match.
· Abstract system description: the score of a symphony. Concrete system realisation: a performance of that symphony.
· Abstract system description: the US constitution. Concrete system realisation: US governments.
Note that systems thinking hides the full complexity of real-world entities that realise systems.
In discussion and testing of the stickleback mating ritual, no attention is paid to any individual stickleback, or its complex internal biochemistry.
Abstract system (conceptual): a set of roles and rules (the logic or laws actors follow), e.g. the stickleback mating ritual.
Concrete system (physical): actors playing the roles and acting according to the rules, e.g. countless pairs of sticklebacks.
The relationship between physical entities and abstract systems is many-to-many.
· One abstract system (e.g. the game of poker) may be realised by countless physical entities (countless card schools).
· One physical entity (e.g. a card school) may realise countless abstract systems (e.g. poker, whist etc.).
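In programming terms (an illustrative analogy, not drawn from the paper), an abstract system is like a type and a concrete system like an instance: one type can have many instances, and one object can instantiate several types.

```python
# An illustrative analogy: abstract systems as types, concrete systems
# as instances. Class names are assumptions made for this sketch.

class Poker: ...      # one abstract system (a set of roles and rules)
class Whist: ...      # another abstract system

class CardSchool(Poker, Whist):
    """One physical entity that realises both abstract systems."""

school_1 = CardSchool()              # one abstract system (Poker)...
school_2 = CardSchool()              # ...realised by many entities
print(isinstance(school_1, Poker))   # True: the entity realises poker
print(isinstance(school_1, Whist))   # True: the same entity realises whist
```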
The trouble with much modern systems thinking is this: the confusion of a real-world entity (usually a named human organisation or institution) with a system.
The entity is not a system; rather, it is countless systems.
It is as many systems as observers can a) define and b) demonstrate that the entity conforms to - well enough.
That is what Ashby, Ackoff, Checkland and other systems thinkers taught us.
Moreover, those many systems can be incompatible.
“Since different systems may be abstracted from the same real thing, a statement that is true of one may be false of another.
… there can be no such thing as the unique behavior of a very large system, apart from a given observer.
… there can be as many systems as observers… some so different as to be incompatible.
… studying only carefully selected aspects of things is simply what is always done in practice.” (Ashby 1956, 6/14)
Unfortunately, most of us use the term system for both abstract systems (types), and concrete/real world entities or social networks that instantiate them.
And so, in systems thinking discussions, we often confuse them.
But remember, an abstract system may be realised by several real-world entities - each of which might also do other things.
And one entity might realise other abstract systems – each defined by taking a different perspective of the entity.
Reconciling different abstract or soft systems is a theme of Checkland’s "soft systems methodology”.
This work discusses many kinds of system, including physical and biological systems.
It discusses natural systems (like the solar system) which evolve so as to behave in a regular or orderly fashion, with no given aim.
However, the systems of most interest to us are one or more of the following:
· A dynamic system, noting that it can maintain a passive structure, such as a record of system state data.
· A designed system in which actors perform activities to meet given aims. E.g. a cyclist pressing pedals to move a bicycle forward.
· An open system, which consumes inputs and produces outputs, noting that the boundary is always a matter of choice.
· A social system in which actors exchange meaningful messages. E.g. a business activity system.
· A scripted system in which actors perform prescribed activities. E.g. some violinists following the score of a symphony.
The taxonomy below is an attempt to arrange system kinds in a hierarchy.
It is flawed, since (for example) social systems can be designed and open systems can be natural.
· Discrete entity
  · Disorganised, disorderly entity: chaotic (and so not describable)
  · System: organised, orderly, stable (in described ways)
    · Passive structure: does not act
    · Dynamic system: acts in an orderly or rule-bound way
      · Natural system: evolved
        · Inorganic, e.g. the solar system
        · Organism, e.g. a tree, a cat
        · Social system, e.g. a bee hive, a hunting party
      · Designed system, e.g. a symphony or software system
        · Closed system, e.g. a System Dynamics model
        · Open system: I/O exchange across a boundary
It is important to distinguish two very different ways a system can change.
The two kinds of change are often confused in systems thinking discussion, and many social systems thinkers apply the terminology of system state changes to system mutations.
System state change
System state change (within a system generation) is a change to the values of a system’s state variables (e.g. body temperature).
To change Ashby’s system state is to update its variable values in response to an event or condition.
To change Forrester’s system state is to change the quantity of a stock in response to an inter-stock flow.
Changes to the concrete state of a system do not change the abstract system it realises.
Tools for System Dynamics show the trajectory of quantitative variable value changes (up and down) in a graph.
A variable’s value may:
· steadily increase or decrease over time
· dwell for a while in a stable state (an attractor), then move to another stable state
· change in a non-linear, jagged or “chaotic” way.
Complexity does not imply chaos.
Some describe a system with a non-linear or chaotic state change trajectory as a complex system.
But while that trajectory may look complex, the system itself may be very simple and orderly.
System mutation
System mutation (between system generations) is a change to the nature of a system.
To change Ashby’s system is to change its variable types or processes.
To change Forrester’s system is to add/remove stocks or change inter-stock flows.
To change the roles or rules of a concrete system is to change the abstract system it realises.
This change may be observed in differences in how the system works or what it produces.
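The distinction can be caricatured in code (a hedged sketch with made-up names): a state change alters variable values; a mutation alters the variable types and rules themselves.

```python
# State change: variable VALUES move; the abstract system is unchanged.
# Mutation: the variable TYPES or rules change; a different system results.

system = {
    "variables": {"sheep": 100, "wolves": 10},   # state (values)
    "rules": {"sheep_growth": 0.05},             # the system's nature
}

# 1. System state change: update a value in response to an event.
system["variables"]["sheep"] += 5                # same system, new state

# 2. System mutation: add a variable type, or change a rule.
system["variables"]["grass"] = 1000              # a new state variable type
system["rules"]["sheep_growth"] = 0.10           # a changed law
# After step 2, the entity realises a *different* abstract system.
```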
For further discussion of system change and change control, read “System change”.
Millions of years ago, animals evolved to conceptualise things they perceived in the world.
To remember a thing, they encoded a representation of it in a neural memory.
To recall a thing, they decoded that memory.
Later, animals in the same species evolved to communicate concepts to each other.
To communicate is to send a message that conveys information, such as a description, direction or decision.
The list below distinguishes some concepts related to communication.
WKID: meaning
Wisdom: the ability to respond effectively to knowledge in new situations.
Knowledge: information that is accurate enough to be useful.
Information: any meaning created or found in a structure or behavior by an actor.
Data: a structure of matter/energy in which meaning has been created or found (by a sender or receiver).
For an act of communication to succeed in conveying information, two roles must be played.
· One actor (a sender) must encode some information or meaning in a data structure or message.
· Another actor (a receiver) must decode the same information or meaning from that data structure or message.
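A minimal sketch of those two roles, assuming JSON as the shared code (the example is illustrative, not taken from the paper):

```python
# A sender encodes meaning in a data structure; a receiver decodes it.
import json

meaning = {"warning": "train coming"}    # information the sender holds

message = json.dumps(meaning)            # sender: encode meaning into data
received = json.loads(message)           # receiver: decode the same data

print(received["warning"])               # "train coming" - knowledge shared
# Communication succeeds only because both actors share the code (JSON).
```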
Worry not how brains, languages/codes and communication protocols work.
Worry not how weakly a model represents a reality, or how different the models of a sender and receiver may be.
Consider only this - the evidence is that we can and do share knowledge.
E.g. You tell me a train is coming and I then step off the railway track.
That evidence indicates we share a considerable amount of knowledge about the world.
In systems, in business, in all science, it is necessary for communicating actors to use the same code or language.
Messages must be expressed using the symbols and grammar of a language shared by senders and receivers.
Scripts for regular behaviors must be expressed using a language known to the actors.
A system must be described using a domain-specific language, else the system description cannot be agreed or tested.
For further discussion of data, information and communication, read here about “Second order cybernetics”.
Describing a dynamic system by what it does has been expressed as a general principle called “the primacy of behavior”.
This principle can be seen not only in Ashby’s Cybernetics and in Forrester’s System Dynamics but also in Checkland’s Soft Systems Methodology.
In discussion of “soft systems”, Churchman said “a thing is what it does”.
And Checkland defined a system in the form of a “Business Activity Model”.
In discussion of “system dynamics”, Forrester said every system can be defined in terms of how flows (behaviors) update stocks (quantities).
And Meadows said a system is “A set of elements or parts [stocks] that is coherently organized and interconnected in a pattern or structure that produces a characteristic set of behaviors [flows].”
In discussion of “cybernetics”, Wiener and Ashby discussed systems as machines of any kind - mechanical, biological or social.
“Cybernetics deals with all forms of behavior in so far as they are regular, or determinate, or reproducible… [It] treats, not things but ways of behaving. It does not ask "what is this thing?" but "what does it do?" It is thus essentially functional and behavioristic.” (Ashby 1956)
So, the question is not so much "what is this thing?" as "what does it do?"
Typifying the three As
The operations of a dynamic system are described by typifying:
· Actors who play roles in activities
· Activities that follow rules
· Attributes (state variables) of the system or its parts.
Below are a few simplistic examples.
A solar system
· Actors: planets.
· Activities: orbit the sun.
· State: the current condition and position of the planets.
A windmill
· Actors: sails, shafts, cogs, millstones
· Activities: conversion of wind energy into the rotation of parts.
· State: the current condition and position of the actors.
A termite nest
· Actors: termites
· Activities: deposit materials, disperse a pheromone.
· State: the structure of the nest, which grows as termites deposit material at peaks in the pheromone gradient and disperse the pheromone.
A prey-predator system
· Actors: wolves and sheep
· Activities: births and deaths of wolves and sheep.
· Material state: the condition of the wolves and sheep at a moment in time.
· Information state: wolf and sheep population numbers (grow and shrink in response to each other, and may settle in a stable cyclical pattern).
A tennis match
· Actors: tennis players.
· Activities: the motions of the ball and the players.
· Material state: the condition of the court, the balls and the players at a moment in time.
· Information state: the game, set and match scores (a structural side effect of players acting according to laws of the game.)
A circle property calculator (sketched in code after these examples)
· Actors: a software component
· Activities: calculate perimeter, calculate area.
· Information state: an invariant, the value of pi.
An information system
· Actors: humans and/or computers
· Activities: messages and data processing
· Information state: memories.
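The circle property calculator listed above is simple enough to sketch in full. This is an illustrative implementation; the class and method names are assumptions, not code from the paper.

```python
# Actor: a software component. Activities: calculate perimeter and area.
# Information state: an invariant, the value of pi.
import math

class CircleCalculator:
    PI = math.pi                                  # invariant state

    def perimeter(self, radius: float) -> float:  # activity 1
        return 2 * self.PI * radius

    def area(self, radius: float) -> float:       # activity 2
        return self.PI * radius ** 2

calc = CircleCalculator()
print(calc.perimeter(1.0))   # 6.283...
print(calc.area(1.0))        # 3.141...
```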
Principia Cybernetica says this of a system:
“real systems are open to, and interact with, their environments….”
“Systems concepts include: system-environment boundary, input, output, process, state, hierarchy, goal-directedness, and information.” Principia Cybernetica (Web)
As Gibbs suggested, to describe a system is to separate it from the rest of the universe.
A closed system has no interaction with anything in its wider environment.
An open system consumes inputs (from “suppliers”) and produces outputs (consumed by “customers”).
An open system together with its environment might be called an ecology.
The table below shows system features found in three dictionaries (A, B and C) and two popular internet sources.
The features include inter-related components that exhibit orderly or rule-bound behaviors.
Also, encapsulation of a system by a boundary, across which inputs and outputs may be exchanged with entities in the wider environment.
Feature | A | B | C | Google | Wikipedia | Meaning
Wholeness (or holism) | yes | yes | yes | yes | yes | parts cooperate in processes to act as a whole (rather than act in isolation)
Inter-related components | yes | yes | yes | yes | yes | all parts are related directly or indirectly
Orderly or rule-bound behavior | yes | yes | - | yes | yes | system processes are constrained, bound by the rules of physics, chemistry or man
System boundary (or encapsulation) | - | - | yes | yes | yes | things inside the system are separable from things outside the system
Input/output exchange across boundary | - | - | yes | yes | yes | the system is open to and interacts with its environment
Consideration of system inputs and outputs leads to three more concepts.
System environment: the world outside the system of interest.
System boundary: a line (physical or logical) that separates a system from its environment.
System interface: a description of inputs and outputs that cross the system boundary.
The describer has to decide where to draw the boundary.
The boundary may be physical – think of a solid structure in a liquid or gas, or a factory.
Or logical – where the actors in a system act in different locations, and communicate remotely.
The boundary of a system may be expanded or contracted at the whim of the describer.
Having identified external entities that interact with a system, the describer can expand the boundary to include them.
In this way, the ecology containing the first system may be described as a wider system.
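For example (an illustrative sketch, not from the paper), a system plus an external supplier it interacts with can be redescribed as one wider system simply by redrawing the boundary.

```python
# Redrawing a system boundary: an external entity that interacts with
# a system can be folded into a wider system. Names are illustrative.

def factory(raw_material: str) -> str:     # the system of interest
    return f"product made from {raw_material}"

def supplier() -> str:                     # an external entity
    return "steel"

def wider_system() -> str:
    """Boundary expanded: supplier + factory described as one system."""
    return factory(supplier())

print(wider_system())   # product made from steel
```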
The list below shows the similarities between systems as they are defined in five different sources.
Reading across the generic structure (actors, activities, state, I/O boundary, environment):
· Principia Cybernetica: interrelated parts interact in processes to meet goals by maintaining system state and sending/receiving information to/from each other and external actors.
· Boulding’s social system: individuals perform roles in repeatable processes according to remembered mental images, and exchange messages.
· Checkland’s Soft System: components interact coherently in activities to meet a purpose by maintaining their integrity and sending/receiving inputs and outputs to/from each other and external actors.
· Maturana’s biological entity: interacting components are transformed and destroyed by a network of processes that form a machine, a concrete unity in space.
· Wikipedia’s “system” entry (2016): related components perform processes that transform inputs into outputs (material, energy or data).
“The same concepts and principles of organization underlie the different disciplines (physics, biology, technology, sociology, etc.), providing a basis for their unification.” Principia Cybernetica
In different sciences, you can find system elements of these general types
· Structure: a thing locatable in space, like an actor.
· Process: a thing that happens over time, like an activity.
o Procedure: a process that ends in a result, like a tennis match.
o Loop: a process that cycles forever, like a heart beat.
And you can find loops in many different sciences, such as:
· A condition-less loop - in the procedural logic of software.
· A neural loop - in mathematics and Artificial Intelligence.
· A physiological loop - the Krebs cycle in biology.
· An ecological loop – the plant-oxygen-animal-CO2 loop.
· A cognitive loop - the cortico-striato-thalamo-cortical loop associated with Obsessive Compulsive Disorder.
· A social behaviour loop - the communication-decision loop in Luhmann’s social-psychic systems.
A system theorist may not directly extrapolate/extend from one kind of loop to another.
But may instead draw an analogy between them both with respect to the more generic loop concept.
E.g. Luhmann did not extrapolate from systems in biology to systems in sociology.
He generalised from the former to the abstract concept of an autopoietic system in which "elements produce elements".
He then extrapolated from this generalisation to define his peculiarly metaphysical "social system".
Circularity and recursion
Circular feedback loops are a feature of Ashby’s cybernetics and Forrester’s system dynamics.
Feedback loops can relate the value of one system state variable to the value of another.
E.g. Consider how the quantity of sheep and the quantity of wolves are related in a causal loop.
A growth in the stock of sheep will increase the stock of wolves, which will deplete the stock of sheep.
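This causal loop can be sketched as a simple stock-and-flow simulation. The rates below are arbitrary assumptions, chosen only to exhibit the loop, not values from any System Dynamics model in the paper.

```python
# Two stocks (sheep, wolves) advanced by flows that couple them.

sheep, wolves = 100.0, 10.0
for year in range(5):
    sheep_births = 0.5 * sheep              # more sheep...
    wolf_growth = 0.005 * sheep * wolves    # ...feed more wolves...
    predation = 0.02 * sheep * wolves       # ...which deplete the sheep
    wolf_deaths = 0.4 * wolves
    sheep += sheep_births - predation
    wolves += wolf_growth - wolf_deaths
    print(f"year {year}: sheep={sheep:.0f}, wolves={wolves:.0f}")
```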
Other kinds of systems thinking often involve some kind of circularity.
Think of systems layered on top of each other, at different levels of abstraction.
There can be a hierarchy of process control: a control system at level N throws an exception up to a control system at level N+1, and awaits direction.
There can be a hierarchy of system definition: the rules of a system at level N are state variables that can be manipulated by a meta system at level N+1.
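A minimal sketch of such a control hierarchy (illustrative names; the clamping rule is an assumption): the level-N controller raises an exception that the level-N+1 controller resolves.

```python
# The level-N controller escalates what it cannot handle;
# the level-N+1 controller gives direction.

class OutOfRange(Exception):
    pass

def level_n_controller(value: float) -> float:
    if not 0 <= value <= 100:
        raise OutOfRange(value)          # escalate: level N cannot handle this
    return value                         # normal regulation at level N

def level_n_plus_1_controller(value: float) -> float:
    try:
        return level_n_controller(value)
    except OutOfRange:                   # the higher level gives direction
        return min(max(value, 0), 100)   # e.g. clamp into the legal range

print(level_n_plus_1_controller(150))    # 100: resolved one level up
```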
“Second-order cybernetics” was developed around 1970.
It was developed and pursued by thinkers including Heinz von Foerster, Gregory Bateson and Margaret Mead.
It is said to be the circular or recursive application of cybernetics to itself.
It shifts attention from observed systems to the observers of systems.
It is often applied to human societies or businesses.
In those contexts, a system’s actors can also be system thinkers, who study and reorganise the system they play roles in.
Unfortunately, second order cybernetics tends to lead people away from classical cybernetics.
A common issue in discussion of systems is the one Ashby warned us of – the confusion of real-world entities with systems.
In much systems thinking discussion there is little or no recognition that:
· one entity can realise several (possibly conflicting) systems at the same time
· there is a need to verify an entity behaves in accord with a system description
Seeing the world as a duality of systems and observers is naive.
Classical cybernetics gives us a more sophisticated triangular view.
A concrete entity is a system only when and in so far as it realises a testable system description.
The observer(s) of a real word entity may abstract countless different (possibly conflicting) systems from its behaviour.
Referring to every entity, every social network, as a system, is naïve.
Observers may well discuss an entity (its properties, problems and possibilities) without reference to any system.
Moreover, discussion of systems often confuses two kinds of change or adaptation.
In classical cybernetics, a system responds in a regular way to changes in its environment or another (coupled) system.
The term adaptation usually means system state change - changing state variable values in response to events.
The trajectory of a system’s state change (be it oscillating, linear, curved or jagged) is an inexorable result of the system’s rules.
Second-order cybernetics is often applied to thinking about social organisations.
Here, the term adaptation often means system mutation or evolution – changing the system’s state variables, roles and rules.
This changes the very nature of the system; it changes its laws.
The trouble is that continual adaptation or reorganisation of a system undermines the general concept of a system – which is regularity.
Consequently, second-order cybernetics tends to undermine more general system theory.
If we don't distinguish an ever-evolving social network from the various modellable systems it may realise, the concept of a system evaporates.
For more, read second order cybernetics.
For a discussion of the terms and concepts below with respect to Enterprise Architecture, read this paper.
· Concrete system: (e.g. your own heart beat) a realisation by a real-world entity that conforms well enough to an abstract system.
· Natural system: a system that runs before it is envisaged as a system.
· Designed system: a system that is envisaged before it runs.
· Closed system: a system defined (as in a System Dynamics model) without reference to its wider environment.
· Open system: a system defined as consuming/delivering inputs/outputs from/to its wider environment.
Adaptation: 1) system state change. 2) system mutation.
Atomic element: a system element that is not further divided in a description.
Behavior: a service or process that changes the state of a system or something in its environment.
Chaotic: 1) disorderly, random, with no regular or repeated pattern. 2) unpredictable outcomes arising from variations in initial conditions. 3) non-linear state change trajectory.
Complex: a term with scores of different definitions.
Coupling: the relating of subsystems in a wider system by flows.
Deterministic system: a system that processes an input, with respect to its memory/state, to produce a result that is predictable.
Emergence: the appearance of properties in a wider or higher-level system, from the coupling of lower-level subsystems.
Event: a discrete input that triggers a process.
Evolution: a progression of inter-generational system mutations that change the nature of a system.
Exception: what happens when actors do not complete actions expected of their roles.
Flow: the conveyance of a force, matter, energy or information.
Hierarchy: the successive decomposition of a system into smaller subsystems or a behaviour into shorter behaviors.
Holism: looking at a thing in terms of how its parts join up rather than dissecting each part.
Holistic view: a description of how parts relate, interact or cooperate in a whole.
Information flow: the conveyance of information in a message from a sender to a receiver.
Information quality: an attribute of a flow or a state, such as speed, throughput, availability, security, monetary value.
Information state: the information retained in a memory or store.
Information: any meaning created or found in a structure by an actor.
Learning: the processes by which an organism or AI machine remembers past events and responds differently to new events (implies pattern recognition and fuzzy logic).
Linear: in a straight line or a sequence.
Meta system: a system that defines a system or changes it from one generation to the next.
Organicism: the idea that systems are describable at multiple hierarchical levels.
Process: a sequence of activities that changes or reports a system’s state, or the logic that controls the sequence.
Reductionist view: identifying the parts of a whole, naming or describing parts without considering how the parts are related in the whole.
Social cell: a social network in which actors find that realising the roles and rules of a particular social system is so attractive they resist any change to it.
Social network: a group of actors who communicate with each other directly or indirectly.
Social system: a system that is realised by a social network.
System environment: the world outside the system of interest.
System boundary: a line (physical or logical) that separates a system from its environment, and encapsulates a system as an input-process-output black box.
System interface: a description of inputs and outputs that cross the system boundary.
System state: the current structure or variable values of a system, which change over time.
System state change: a change to the state of a system.
System mutation: changing the roles, rules or variables of a system.
Self-organisation: can mean various things, including growth, and self-regulation or homeostasis.
Read Introducing system ideas for a discussion of the system terms and concepts above, and some ambiguities.
Beware that many terms used by systems thinkers (e.g. emergence, complexity and self-organisation) are open to several interpretations.
Systems thinking terms and concepts are further explored in the following papers.
Classical cybernetics (Wiener, Ashby, Turing)
The system theory page hosts papers that discuss concepts associated with cybernetics:
· Encapsulation of structure and behavior
General system theory (Bertalanffy, Boulding, Rapoport)
The system theory page hosts papers that discuss concepts associated with general system theory:
· Introducing general system theory
· Holism and emergent properties
System Dynamics (Forrester, Meadows)
The system theory page hosts papers that discuss concepts associated with system dynamics:
· System state change by circular causal loops
· Chaos and non-linear behavior
Soft systems (Churchman, Checkland, Ackoff)
The system theory page hosts papers on soft systems methodology, and on applying general system theory to management science.
The tendency of “systems thinkers” to draw analogies between “systems” in different sciences is not itself science.
Consider how very different are the systems in this (possibly questionable) table.
A (designed) software system | A (natural) free market economic system
Is a designed entity. | Is a natural entity.
Must be described fully, in excruciating detail. | Can be described lightly, as a collection of actors making buy and sell decisions.
Must be tested to ensure its actions add up to a coherent, consistent and correct whole. | Needs no testing before it runs.
Proceeds by actions that are predetermined and coordinated. | Proceeds by actions that are neither predetermined nor coordinated.
Proceeds only when intentionally deployed and set in motion. | Proceeds regardless of any overarching intention.
Changes only by design of actors outside the system. | Changes as a result of actors inside the system.
Changes only incrementally and predictably. | Evolves continually and unpredictably.
Is relatively complex. | Is relatively simple (however complex the real-world actors and activities are).
And consider how very different are the systems in this table.
The actors in an agile software development system | The actors in a free market economic system
The actors are a small team of developers (cf. a hunter-gatherer group). | The actors are millions of individuals who act in their own self-interest.
Actors act to build a coherent, consistent and correct software system. | Actors act to sell or buy something (anything).
Actors must make coordinated decisions. | Actors make ad hoc decisions.
Actors make decisions with the purposes of the system’s owners and users in mind. | Actors make decisions with no wider purposes in mind.
Actors’ decisions are highly constrained by requirements, technologies, standards etc. | Actors’ decisions are constrained by the money they possess.
All free-to-read materials at http://avancier.website are paid for out of income from Avancier’s training courses and methods licences.
If you find the web site helpful, please spread the word and link to avancier.website in whichever social media you use.