System change - intro

Copyright 2019 Graham Berrisford. One of about 100 papers on the System Theory page. Last updated 09/09/2019 10:13



Two kinds of system change

On system state change in “system dynamics”

On system mutation and “self-organising systems”

On “complex adaptive systems”

Conclusions and remarks


Two kinds of system change

It is important to distinguish two very different ways a system can change.

The two kinds of change are often confused in systems thinking discussion.


System state change

System state change (within a system generation) is a change to the values of a system’s state variables (e.g. body temperature).

To change Ashby’s system state is to update its variable values in response to an event or condition.

To change Forrester’s system state is to change the quantity of a stock in response to an inter-stock flow.
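The two flavours of state change can be sketched in code. This is a minimal illustration only; the variable names, stock names and rates are invented for the example:

```python
# A sketch of system state change: variable values change,
# the system's nature does not. All names are illustrative.

# Ashby-style: an event updates the value of a state variable.
state = {"body_temperature": 37.0}

def handle_event(state, event):
    """Respond to an event by updating a variable's value."""
    if event == "fever":
        state["body_temperature"] += 1.5
    elif event == "sweat":
        state["body_temperature"] -= 0.5
    return state

# Forrester-style: an inter-stock flow moves quantity between stocks.
stocks = {"source": 100.0, "sink": 0.0}

def apply_flow(stocks, rate, dt=1.0):
    """A flow changes the quantities of two stocks."""
    moved = rate * dt
    stocks["source"] -= moved
    stocks["sink"] += moved
    return stocks

handle_event(state, "fever")   # body_temperature becomes 38.5
apply_flow(stocks, rate=10.0)  # source 90.0, sink 10.0
```

In both cases the definitions of the variables, stocks and flows stay fixed; only their values move.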


Changes to the concrete state of a system do not change the abstract system it realises.


A homeostatic system maintains its state variable values (e.g. body temperature) within a desirable range.

Homeostasis was a basis for much social systems thinking discussion in the 19th and early 20th century.

However, many systems are not homeostatic.


Tools for System Dynamics show the trajectory of quantitative variable value changes (up and down) in a graph.

The graph may show the values of a system’s quantitative state variables

·       hover around a homeostatic norm

·       dwell for a while in one stable state (an attractor), then move to another stable state

·       steadily increase or decrease over time

·       change in a non-linear, jagged or “chaotic” way.


Some describe a system with a non-linear or chaotic state change trajectory as a complex system.

But chaotic state change does not imply system complexity.

The system that produces that trajectory may itself behave in a very simple and orderly way.
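The logistic map is a standard illustration of this point: an operationally trivial rule whose state change trajectory (at r = 4) is chaotic. A minimal sketch:

```python
# The logistic map: one variable, one fixed rule - operationally simple.
# Yet at r = 4.0 its state change trajectory is chaotic: tiny
# differences in the initial value diverge as the system runs.

def logistic_step(x, r=4.0):
    return r * x * (1.0 - x)

def trajectory(x0, steps, r=4.0):
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic_step(xs[-1], r))
    return xs

a = trajectory(0.200000, 20)
b = trajectory(0.200001, 20)  # an almost identical start...
# ...yet the two trajectories part company, even though the
# generating system is simple, deterministic and orderly.
```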


System mutation

System mutation (between system generations) is a change to the nature of a system.

To change Ashby’s system is to change its variable types or processes.

To change Forrester’s system is to add/remove stocks or change inter-stock flows.


To change the roles or rules of a concrete system is to change the abstract system it realises.

This change may be observed in differences in how the system works or what it produces.
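The distinction can be sketched in code (all names are illustrative): a state change updates values within one generation; a mutation produces a new generation with different variables or rules.

```python
# A sketch of system mutation: a new system generation with
# different variable types or rules. Names are illustrative.

generation_1 = {
    "variables": {"temperature": float},
    "rules": ["if temperature > 38 then sweat"],
}

# State change: a value changes; the definition above does not.
running_state = {"temperature": 37.0}
running_state["temperature"] = 38.5  # still generation 1

# Mutation: the definition itself changes - a new generation.
generation_2 = {
    "variables": {"temperature": float, "heart_rate": int},  # new variable
    "rules": ["if temperature > 38 then sweat",
              "if heart_rate > 100 then rest"],              # new rule
}
```

The two generations are different abstract systems, even if the same concrete actors realise them in succession.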

On system state change in “system dynamics”

Ashby’s discrete event-driven system is transformable into one of the kind modelled in Forrester’s System Dynamics.

The trick is to abstract from singular entities to populations of them (stocks), and from singular events to batches of them (flows).

In System Dynamics, the stocks can be resources of any kind – materials, energy, organisms, happiness, whatever.

The flows represent interactions between stocks.


System Dynamics

·       Stock and flow models <idealise> interdependent quantities.

·       System modellers <create and use> stock and flow models.

·       System modellers <observe & envisage> interdependent quantities.


Consider two stocks: the quantity of sheep and the quantity of wolves.

A System Dynamics model can help to explain how these stocks interact, using the idea of causal loop.


A growth in the stock of sheep will increase the stock of wolves.

A growth in the stock of wolves will deplete the stock of sheep.

Stocks and flows can be represented in a causal loop diagram (CLD).

Here, for example, is a CLD relating resources to human population.

Note that it omits economic growth (or GDP) and the decrease in birth rate that it causes.


Animating the model reveals the trajectory of changes over time to the size of a group, stock, population or resource quantity.

It reveals how feedback loops and time delays affect the stock levels over the long term.

So it helps to answer questions like:

·       "Will the system settle down to a steady state?"

·       "What steady states are possible?"

·       "Will the system crash or halt?"

·       "Does the long-term behavior of the system depend on its initial condition?"
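A sketch of how the first question might be tested mechanically: iterate the model and check whether successive states stop changing. The one-stock "tank" model below is invented for illustration:

```python
# A sketch: detect whether an animated stock model settles to a
# steady state (the function names and model are illustrative).

def settles(step, state, steps=10000, tol=1e-9):
    """Iterate a one-stock model; report whether a fixed point is reached."""
    for _ in range(steps):
        new = step(state)
        if abs(new - state) < tol:
            return True, new    # settled to a steady state
        state = new
    return False, state         # still moving after `steps` iterations

# Example: a stock with a constant inflow of 10 per time unit
# and a proportional outflow of 0.5 * level.
def tank_step(level, dt=0.1):
    return level + dt * (10.0 - 0.5 * level)

settled, level = settles(tank_step, 0.0)
# The level converges towards the steady state 10 / 0.5 = 20.
```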


A System Dynamics model is closed: a self-contained model of stock populations that increase and decrease in response to flows.

However, each stock may be seen as a system or subsystem in its own right, and the whole model may be regarded as an ecology.


In discussing System Dynamics, it is important to distinguish theory, animated theory and reality.


Abstract System Dynamics model: a theory – a model of stocks and flows that interact according to fixed laws.

Concrete System Dynamics model: an animated theory – a performance of actions precisely according to the laws above.

An ecology in the real world: actors that interact to a greater or lesser extent according to the laws above.


When a model of a human social system is animated, the actors robotically obey the roles and rules (modelled as stocks and flows).

In the real-world equivalent, human actors may ignore or disobey those roles and rules.

They may spend most of their time acting outside of the modelled system and/or playing roles in other systems, even ones with conflicting rules.

Which is to say, there may be a mismatch between theory and reality.


For more, read System state change and regulation by causal loops

And here are links to a couple of Gene Bellinger’s videos.


Some social systems thinkers questionably apply the terminology of system state changes to system mutations.

On system mutation and “self-organising systems”

You may look at a chess game as the realisation by a social network (a pair of actors) of the roles and rules of an abstract system.

The first pair of people to play something like a game of chess must have agreed the rules that choreograph their roles in the game. 

The rules of chess varied somewhat from place to place, and were extended during the Middle Ages.

The rules continued to be modified until the early 19th century, when they reached essentially their current form.


Changing the rules is not part of the game itself.

An actor cannot simultaneously move a chess piece and change the rules by which that piece is moved.

To change the rules, the actor must

1.     stop playing their role in the game.

2.     step up into the higher or meta system in which their role is to suggest, discuss and agree rule changes.

3.     ensure the rule change is agreed with their opponent.

4.     step back down into their role as game player.
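The two levels can be sketched in code (class and rule names are hypothetical): playing uses the current rules; changing the rules is a separate, meta-level operation that yields a new generation of the game.

```python
# A sketch of the two levels (names are illustrative):
# playing a game uses its rules; changing the rules happens
# outside the game, producing a new generation of it.

class Game:
    def __init__(self, rules, generation=1):
        self.rules = dict(rules)    # fixed for this generation
        self.generation = generation

    def legal(self, move):
        """Playing a role: apply the rules, never change them."""
        return self.rules.get(move, False)

def agree_rule_change(game, rule, value):
    """Meta-system: the players step out of the game, agree a
    change, and step back into a new generation of the game."""
    new_rules = dict(game.rules)
    new_rules[rule] = value
    return Game(new_rules, game.generation + 1)

chess_v1 = Game({"pawn_two_step": False})
chess_v2 = agree_rule_change(chess_v1, "pawn_two_step", True)

chess_v1.legal("pawn_two_step")   # False in the old generation
chess_v2.legal("pawn_two_step")   # True in the new one
```

No call inside `Game` can alter `rules`; only the meta-level function produces a changed game, and it produces a new generation rather than changing the old one mid-play.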


As Ashby and Maturana agreed, the notion of a "self-organising system” undermines the concept of a system.

The notion requires division of a system into higher and lower parts – as in the example above.


On the need for change control

As a member of a social network, one actor can change how other actors act in response to events, without changing the network. 

Because the network is defined by its actors, not its actions.

But as a role player in a social system, one actor cannot change roles and rules without changing the system itself.

Because that system is defined by its actions. 


An entity that is disorderly, or that mutates continually rather than incrementally, cannot reasonably be called a system.

If there is no change control on system activities, then there is no describable and testable system.


The issue for the system theorist is a logical one.

It is to do with the meanings we give to words in the domain-specific language of system theory. 

The meaning of “system” is logically undermined by any theory that allows “continuous adaptation”, where adaptation means mutation rather than state change.

On “complex adaptive systems”

Read social systems thinking discussion and you will come across “complexity science”, “complexity theory” and “complex adaptive systems”.

Authors use these terms glibly, without being clear what they mean.

What is a complex adaptive system? How are complexity and adaptivity defined and measured?

What kinds of adaptation or change are possible? How is a change measured? How is the ease of change measured?

As in so much systems thinking discussion, there are ambiguities.



In cybernetics, a system is complex if the system description is complex: the roles and rules are complex.

To social systems thinkers, a system is complex if the reality is complex: the actors are complex (their roles and rules may be lightly prescribed, if at all).



In cybernetics, a system adapts to feedback from its environment by changing state – which may be called self-regulating.

To social systems thinkers, a system mutates as actors change its roles, rules or aims - which may be called self-defining.



In cybernetics and system dynamics, a system is a collection of repeated or repeatable activities.

In social systems thinking, a system is a collection of actors, who interact as they choose (a social network).


In his 2003 book, Michael Jackson proposed as follows.

"Social systems are not just ‘complex adaptive systems’ bound by the fixed rules of interaction of their parts.

Rather, they are ‘complex evolving systems’ that can change the rules of their development as they evolve over time."


He could more clearly have written:

“Social networks are not systems, in which actors are bound by roles and rules.

Rather, they are ‘complex evolving entities’ that can change the roles or rules of any system(s) they realise.”


A complex evolving entity (or ever-unfolding process) – in which no behavior is regular, determinate, or reproducible – is not a system in the ordinary sense of the term.

This table distills some ambiguities.

Complexity

Classical cybernetics: the measurable complication of an abstract system description.

Sociological thinking: the un-measurable disorder or unpredictability of a real-world situation.

Adaptation

Classical cybernetics: system state change – updating the values of system variables.

Sociological thinking: system mutation – changing the roles and rules of the system (evolving).

System

Classical cybernetics: actors playing roles and acting according to rules.

Sociological thinking: a group of self-aware actors who inter-communicate and act as they choose, or a problematic situation (an entity).

Emergence

Classical cybernetics: a property arising from coupling subsystems into a larger system.

Sociological thinking: not seen before, new, or surprising.


Read “Complex adaptive systems” for discussion of definitions from the Santa Fe Institute and MIT.

Conclusions and remarks


Two kinds of change

Some/much systems thinking discussion does not draw a distinction between the following two kinds of change.

A system state change (within a system generation) - a change to the values of a system’s variables (e.g. body temperature).

A system mutation (between system generations) - a change to the nature of a system, to the variables it has or how they are updated.        


The proposal here is that consideration of this distinction reveals a schism in systems thinking.

What social thinkers call a “self-organising system” would better be called a social network.

It is an ever-unfolding process - which may realise successive generations of a social system, or several social systems at once.


Two kinds of behavior

The term behavior can refer to internal processing or external appearance.

It can mean a system’s operations - its variables, roles and rules.

Or else how a system is observed to change state over time - its state change trajectory.


Two kinds of complexity

The term complexity can also refer to internal processing or external appearance.

It can mean complexity in a system’s operations – variety in its variables, roles and rules.

Or else convolutions in a system’s observable state change trajectory.


This table shows that a system that is complex in one way can be simple in the other.

System A is operationally complex, yet has a state change trajectory that is simple (flat or linear).

System B is operationally simple, yet has a state change trajectory that is complex (non-linear).


Two meanings of chaos

Some use the term chaos to mean there is no order or pattern, so there is no describable system.

Others use the term chaos to describe the disorderly state change of an orderly system.

Surely, systems are complex, and chaos is simple?



Two kinds of adaptation

Sooner or later, the environment of a system changes in a way that threatens its survival.

Again, the system can adapt or be adapted by state change or by mutation.

Whatever adaptivity means, there is no agreed measure of it.


Two kinds of agility

Agile system design has different principles from agile development.

Agile development is based on the idea that system mutations should be small and frequent.

Principles include YAGNI (You Aren’t Gonna Need It) and KISS (Keep It Simple).


Agile system design means designing a system so it does not need to change when the environment changes.

This implies designing the system - in anticipation of those changes – so it can respond to change by homeostatic change or by reconfiguration.

Principles include WAGNI (We Are Gonna Need it) and CFC (Complexify for Configurability).
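The idea can be sketched as follows (a hypothetical example): the anticipated variation is held in configuration values, so an environment change is absorbed by reconfiguration – a state change – rather than by a new version of the code.

```python
# A sketch of "complexify for configurability": anticipated
# environmental variation is handled by changing configuration
# values (state change), not the code (mutation). Names invented.

class TaxCalculator:
    def __init__(self, config):
        self.config = dict(config)    # the configurable part

    def reconfigure(self, **changes):
        """Absorb an environment change without a new code version."""
        self.config.update(changes)

    def tax(self, amount):
        return amount * self.config["rate"]

calc = TaxCalculator({"rate": 0.25})
calc.tax(100.0)               # 25.0 under the old rate

calc.reconfigure(rate=0.5)    # the law changed; the code did not
calc.tax(100.0)               # 50.0 under the new rate
```

Under agile development, by contrast, the rate change would arrive as a new version of the code: a small, frequent mutation.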



Agile development: when the environment changes, the system must be changed to a new version or generation; the system change may be called evolution or incremental development.

Agile system design: when the environment changes, the system need not be changed (it is flexible, configurable); the system change may be called homeostasis, reconfiguration or growth.