Change?

Copyright 2017 Graham Berrisford. A chapter in “the book” at https://bit.ly/2yXGImr. Last updated 31/05/2021 17:35

 

Reading online? If your screen is wide, shrink its width for readability.

 

This chapter classifies and discusses different kinds of change. It unscrambles terms and concepts used in discussing the adaptation, evolution or mutation of entities and systems.

 

Contents

Two kinds of system

Two kinds of system change

On state change in “system dynamics”

On system mutation and “self-organization”

A classification of system change types

On “complex adaptive systems”

Natural activity system mutation

Views of self-organization

Meta systems thinking

Applying meta systems thinking

Conclusions and remarks

 

Two kinds of system

[To speak of] “the system” [is] ambiguous. “The system” may refer to… the thing itself; or to the variables with which some given observer is concerned. (Ashby 1956, 6/14)

 

This book starts from the position that it is misguided to refer to a government or a university (for example) as a system. Rather, it is a social entity that employs and participates in several more or less clearly-defined human activity systems.

 

A social entity is a community of actors who may both realize one or more activity systems and act outside of any defined activity system. The social entity (in which actors may act in ad hoc and innovative ways) is distinct from any particular activity system in which those actors are bound by roles and rules.

 

At one extreme (think of an ant colony) the range of performable actions is limited. Expanding the range of possible actions gives actors a higher degree of freedom, and increases the system’s complexity, but it remains describable as an activity system. At the other extreme, where every action is self-determined, and there is no observable regularity, repetition, or pattern, then there is no recognizable activity system in the social entity.

 

A real-world business sits between those extremes. It is a social entity that gives its employees some degree of freedom, but also expects them to play roles in regular business activity systems.

Two kinds of system change

“'adaptation' is commonly used in two senses which refer to different processes.” (Ashby 1956, 5/7)

 

Ashby urged us to distinguish two kinds of change, which are often confused in systems thinking discussion. To distil the distinction, an activity system can respond to events by:

 

·       rule-bound state change, whether to maintain homeostasis or to advance its state progressively.

·       rule-changing mutation, from one system generation to the next, or to a new system altogether.

 

“The distinction is fundamental and must on no account be slighted.” (Ashby 1956, 4/1)

 

"Change the rules from those of football to those of basketball, and you’ve got, as they say, a whole new ball game.” Meadows

 

"Social systems are not just ‘complex adaptive systems’ bound by the fixed rules of interaction of their parts. Rather, they are ‘complex evolving systems’ that can change the rules of their development as they evolve over time." This book Jackson 2003

 

In discussing social entities, Jackson preferred evolving to adaptive. To draw the distinctions needed to move systems thinking forward, we might do better to distinguish evolving social entities from adaptive activity systems.

 

Rule-bound system state change

A state change (within a system generation) changes the values of a system’s state variables (e.g. body temperature). In cybernetics and system dynamics it means variable values change in response to some event or condition. For example, the quantity of a stock changes in response to an inter-stock flow.

 

A homeostatic system maintains its state variable values within a desirable range. Homeostasis was a basis for much social systems thinking discussion in the 19th and early 20th century. However, many systems are not homeostatic.

 

The trajectory of changes to a quantitative variable’s value can be shown on a graph as a line of behavior. It may show that the value of a variable

·       hovers around a homeostatic norm

·       dwells for a while in one stable state (an attractor), then moves to another

·       steadily increases or decreases over time

·       changes in a non-linear, jagged or “chaotic” way.

 

Some describe a system with a non-linear or chaotic state change trajectory as a complex system. But chaotic state change does not imply system complexity. The system that produces that trajectory may itself behave in a very simple and orderly way.

 

Rule-changing system mutation

System mutation (between system generations) is a change to the nature of a system, to its way of behaving. To change Ashby’s system is to change its variable types or processes. To change Forrester’s system is to add/remove stocks or change inter-stock flows. To change the roles or rules of a concrete system is to change the abstract system it realises. This change may be observed in differences in how the system works or what it produces.

On state change in “system dynamics”

A discrete event-driven system is transformable into one of the kind modelled in Forrester’s system dynamics. The trick is to abstract from singular entities to populations of them (stocks), and from singular events to batches of them (flows). In system dynamics, the stocks can be resources of any kind – materials, energy, organisms, happiness, whatever. The flows represent interactions between stocks.

  

Consider two stocks: the quantity of sheep and the quantity of wolves. A system dynamics model can help to explain how these stocks interact, using the idea of causal loop.

 

A growth in the stock of sheep will increase the stock of wolves, which will deplete the stock of sheep.
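To make the stock-and-flow idea concrete, here is a minimal sketch in Python of the sheep/wolves loop above. The rate parameters, initial stock levels and time step are invented for illustration; it is a toy predator-prey model, not a model taken from any published source.

```python
# A minimal stock-and-flow sketch of the sheep/wolves causal loop.
# The stocks are population quantities; the flows are births, deaths and predation.
# All parameter values are illustrative assumptions, not data.

def simulate(steps=200, dt=0.1):
    sheep, wolves = 100.0, 10.0           # initial stock levels (assumed)
    birth, predation = 0.6, 0.02          # sheep birth rate, predation rate
    conversion, death = 0.01, 0.5         # wolf gain per sheep eaten, wolf death rate
    history = []
    for _ in range(steps):
        # Flows for this time step (rule-bound state change, fixed rules)
        sheep_births = birth * sheep * dt
        sheep_eaten  = predation * sheep * wolves * dt
        wolf_births  = conversion * sheep * wolves * dt
        wolf_deaths  = death * wolves * dt
        # Update the stocks
        sheep  = max(sheep + sheep_births - sheep_eaten, 0.0)
        wolves = max(wolves + wolf_births - wolf_deaths, 0.0)
        history.append((round(sheep, 1), round(wolves, 1)))
    return history  # the "line of behavior" of each stock over time

if __name__ == "__main__":
    for t, (s, w) in enumerate(simulate()):
        if t % 20 == 0:
            print(t, s, w)
```

Animating the model amounts to running this loop and plotting the history. Note that the rules never change while it runs, so everything it shows is rule-bound state change.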

 

Stocks and flows can be represented in a causal loop diagram (CLD). Here for example is a CLD relating resources to human population https://www.edrawsoft.com/causal-loop-diagram-software.php

 

Note that it omits economic growth (or GDP) and the decrease in birth rate that it causes.

 

Animating the model reveals the trajectory of changes over time to the size of a group, stock, population or resource quantity. It reveals how feedback loops and time delays affect the stock levels over the long term. So it helps to answer questions like:

·       "Will the system settle down to a steady state?

·       “What steady states are possible?"

·       “Will the system crash or halt?

·       "Does the long-term behavior of the system depend on its initial condition?"

 

A system dynamics model is closed: a self-contained model of stock populations that increase and decrease in response to flows. However, each stock may be seen as a system or subsystem in its own right, and the whole model may be regarded as an ecology.

 

In discussing system dynamics, it is important to distinguish theory, animated theory and reality.

 

Abstract system dynamics model: a theory – a model of stocks and flows that interact according to fixed laws.

Concrete system dynamics model: an animated theory – a performance of actions precisely according to the laws above.

An ecology in the real world: actors that interact, to a greater or lesser extent, according to the laws above.

 

When a model of a human social system is animated, the actors robotically obey the roles and rules (modelled as stocks and flows). In the real-world equivalent, human actors may ignore or disobey those roles and rules. They may spend most of their time acting outside of the modelled system and/or playing roles in other systems, even ones with conflicting rules. Which is to say, there may be a mismatch between theory and reality.

 

For more on systems represented by causal loop diagrams, here are links to a couple of Gene Bellinger’s videos.

http://www.youtube.com/watch?v=wiYLx5a1VBk.

https://www.youtube.com/watch?v=uH5zM7J-eHU

On system mutation and “self-organization”

You may look at a chess game as the realisation by a social entity (a pair of actors) of the roles and rules of an abstract system. The first pair of people to play something like a game of chess must have agreed the rules that choreograph their roles in the game.  The rules of chess varied somewhat from place to place, and were extended during the Middle Ages. The rules continued to be modified until the early 19th century, when they reached essentially their current form.

 

Changing the rules is not part of the game itself. An actor cannot simultaneously move a chess piece and change the rules by which that piece is moved. To change the rules, the actor must (see the sketch after this list):

1.     stop playing their role in the game.

2.     step up into the higher or meta system in which their role is to suggest, discuss and agree rule changes.

3.     ensure the rule change is agreed with their opponent

4.     step back down into their role as game player.
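As a sketch of that separation (the rules and moves below are invented placeholders, not real chess logic), the game loop applies only the current rules, while a rule change is a separate step agreed outside any game:

```python
# Sketch: the rules of a game held as data, distinct from the playing of it.
# The rule content is an invented placeholder, not real chess.

rules_v1 = {"legal_moves": {"advance", "capture"}}

def play(rules, moves):
    """Playing the game: actors act only within the current rules."""
    played = []
    for move in moves:
        if move not in rules["legal_moves"]:
            raise ValueError(f"illegal move under these rules: {move}")
        played.append(move)
    return played

def agree_rule_change(rules, new_move, both_agree):
    """The meta-system step, performed outside any game, by both actors."""
    if not both_agree:
        return rules                                        # no agreement: same game
    return {"legal_moves": rules["legal_moves"] | {new_move}}   # next generation

game1 = play(rules_v1, ["advance", "capture"])              # played under version 1
rules_v2 = agree_rule_change(rules_v1, "castle", both_agree=True)
game2 = play(rules_v2, ["advance", "castle"])               # played under version 2
```

The point of the sketch is that play() can never modify the rules; a new generation of the game exists only once agree_rule_change has completed outside it.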

 

As Ashby and Maturana agreed, the notion of a "self-organising system” undermines the concept of a system. The notion requires division of a system into higher and lower parts – as in the example above.

 

On the need for change control

As a member of a social entity, one actor can change how other actors act in response to events, without changing the entity, because the entity is defined by its actors, not its actions. But as a role player in a social system, one actor cannot change roles and rules without changing the system itself, because that system is defined by its actions.

 

An entity that is disorderly, and mutates continually rather than incrementally, cannot reasonably be called a system. If there is no change control over system activities, then there is no describable and testable system.

 

The issue for the system theorist is a logical one. It is to do with the meanings we give to words in the domain-specific language of system theory. The meaning of “system” is logically undermined by any theory that allows “continuous adaptation”, where adaptation means mutation rather than state change.

A classification of system change types

“A system generally goes on being itself… even with complete substitutions of its actors as long as its interactions and purposes remain intact.” Meadows

 

Natural systems evolve before they are described. Consider solar systems, weather systems, plants, animals, subsystems of those such as the organs of a body, and regular and repeated behaviors observable in a social entity. Designed systems are described (in mind or documentation) before they are made. Consider a steam engine, marriage ceremony, game of poker, word processor or billing system.

 

State changes occur when state variable values change, as when a temperature goes up or down, or a stock grows or shrinks. System mutations occur when the roles, rules or variables of a system change. If we add or remove a stock or flow in a causal loop diagram, or we change a rule in a game of poker, then we define a new system generation or a different system altogether.

 

Continuous changes occur when the values of state variables, or the generations of a system, move along a continuous sliding scale, as when a planet moves through space, or a child matures into an adult. Discrete changes occur when there is a step change in the value of a state variable, or in the generation/version of a system.

 

Given the dichotomies above, system change can be classified as continuous or discrete, state change or mutation, and natural/accidental or designed/planned. Representing change as a three-dimensional phenomenon helps us to think about what it means to model change and design for it.

 

 

State change

·       Continuous – Natural: the growth of a crystal. Designed: an analogue light dimmer.

·       Discrete – Natural: asleep to awake, or day to night. Designed: light on to light off.

Mutation

·       Continuous – Natural: child to adult. Designed: X?

·       Discrete – Natural: parent to child. Designed: version 1 to version 2.

 

X? How to design an activity system that mutates continually? If there is no stable pattern, no regularity or repetition, then there is no describable system.

 

However, continuous change can be simulated by dividing changes into discrete steps frequent and small enough to appear continuous. And it is possible to design a higher “meta system” responsible for the design of systems, of which EA might be seen as an example, as discussed later in the section on self-organization.
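As a minimal sketch of the first point (the dimmer, its one-second fade and the step count are invented for illustration), a continuous change can be approximated by many small discrete steps:

```python
# Sketch: approximating a continuous change by small discrete steps.
# The dimmer and its fade are invented for illustration.

def fade(start=1.0, end=0.0, steps=100):
    """Return the brightness at each discrete step of a 'continuous' fade."""
    return [start + (end - start) * i / steps for i in range(steps + 1)]

levels = fade()
# With 100 steps the change looks smooth; with 2 steps it is plainly discrete.
print(levels[0], levels[50], levels[100])   # 1.0 0.5 0.0
```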

On “complex adaptive systems”

The term complex adaptive system (CAS) is prominent in much social entity thinking discussion. Since the three words are variously defined, separately and together, the discussion is fraught with ambiguity.

 

Some attempts to define CAS apply to a rider on a bicycle, or a flight of geese, which adapts to changing conditions by changing state. Often, what the writer really has in mind is a human social group that might better be called an evolving social entity (ESE), which adapts to changing conditions by changing its way of behaving.

 

This table indicates some ways you might find words used differently in the two schools of thought – activity systems thinking (AST) and social entity thinking (SET).

Complex

·       AST: the complexity of actors and activity structures in an abstract system model, OR the convoluted nature of a system’s observable line of behavior (state change trajectory).

·       SET: the un-measurable and infinite complexity of a physical entity.

Adapting

·       AST: changing state – updating state variable values according to rules.

·       SET: mutating – changing rules, to act in a different way.

System

·       AST: a set of inter-related activities performed by actors playing roles according to rules.

·       SET: a set of inter-related actors who determine their own actions.

Emergence

·       AST: properties arising from subsystems interacting in a larger system.

·       SET: properties not seen before, new or surprising.

 

A system that is complex in one way can be simple in another.

 

 

·       System A: its system model is complex; its line of behavior is simple (flat or linear).

·       System B: its system model is simple; its line of behavior is complex (non-linear).

 

Some equate complexity with chaos. That raises the question of what chaos means; arguably, chaos is simple.

 

See the chapter on complexity science for more on CAS.

Natural activity system mutation

Every weather system, every plant and animal, every business activity system (as described in soft systems methodology) and every causal loop diagram (as drawn in system dynamics) is a machine in the broadest sense. Although business activity systems allow human actors the freedom to choose between defined activities, still, the range of actions is limited to those available in the machine.

 

How do machines - natural and designed - evolve in discrete steps? This chapter focuses on natural systems.

 

A species mutates, not continually, but via discrete birth and death events. Biology does not design for the future. Nature does not, as one systems thinker opined, "design processes that foster adaptability and robustness for a range of scenarios that could come to pass." Inter-generational genetic mutations are accidental; the process of natural selection produces “adaptive” changes.

 

·       "Genetic mutations arise by chance. They may or may not equip the organism with better means for surviving in its environment. But if a gene variant improves adaptation to the environment (for example, by allowing an organism to make better use of an available nutrient, or to escape predators more effectively—such as through stronger legs or disguising coloration), the organisms carrying that gene are more likely to survive and reproduce than those without it.

·       Over time, their descendants will tend to increase, changing the average characteristics of the population. Although the genetic variation on which natural selection works is based on random or chance elements, natural selection itself produces "adaptive" change—the very opposite of chance." https://www.ncbi.nlm.nih.gov/books/NBK230201/

 

In biology, the adaptability of a species does not lie in the adaptability of an organism. It lies in the “higher” process of evolution by natural selection. This kind of evolution is wasteful in the sense that it discards almost every new feature it creates. Nature kills off "designs" that don’t work well enough to be reproduced. It replaces them by whatever new "designs" turn out to better fit today's environment.

 

Given an environment with limited resources, each generation must die to make space for the next. Thus, old entities are replaced by new ones.

 

Not only individuals are replaced, but whole species too. Most (99.9%) of the species that ever evolved are now extinct. The analogy in the business world would be the most brutal of capitalist systems: one in which every business fails, to be replaced by start-ups, and only a small number of those start-ups' random innovations survive for long in a changing world.

Views of self-organization

“The use of the phrase [self-organization] tends to perpetuate a fundamentally confused and inconsistent way of looking at the subject”… “No machine can be self-organizing in this sense.” Ashby 1962

 

The term self-organization has been used with several meanings, including

·       the emergence of order from chaos (as paper clips connect in a chain when shaken up)

·       homeostatic adaptation (as a body adapts to changes in temperature)

·       self-assembly (as of a crystal in a liquid)

·       self-improvement (as by insightful learning from experience)

·       mutation to survive changes in a wider environment.

 

Ashby and Maturana, separately, suggested the use of the term “self-organization” tends to undermine the concept of a system. However, both allowed that a system can evolve or mutate in coordination with its environment. Ashby spoke of system mutation as

 

“a change of its way of behaving… occurs at the whim of the experimenter or some other outside factor.”

 

The idea of systems being described and re-organized from the outside appears in many domains of knowledge.

 

In mathematics

Gödel's two incompleteness theorems (1931) demonstrate the limitations of every formal axiomatic system capable of modelling basic arithmetic. The first states that no consistent formal system of that kind can prove all truths about the arithmetic of natural numbers. The second shows that such a system cannot demonstrate its own consistency. However, an observer, standing outside the system, can assess its truth or consistency.


In biology

Darwin’s evolution of species can be seen as overcoming the limits to the adaptability of an organism. No organism is capable of changing its own DNA. However, the higher system of evolution and the lower system of an organism are coupled whenever two organisms succeed in mating, replacing one generation of those organisms by the next.

 

In sociology

The notion of the cooperative can be seen as overcoming the limitations of social entities that compete for resources (as in “the tragedy of the commons”). Typically, a committee or governing body determines the rules of the wider collective. The members of the committee may be drawn from the collective’s members. It designs changes to the roles of actors, and directs actors in the social entities to follow them. It also has some power to ensure compliance to the rules.

 

To a greater or lesser extent, members may be permitted to negotiate how they interact on a bilateral basis, as in the Morning Star example later.

 

In cybernetics

Ashby’s treatise on self-organization can be seen as overcoming the limitations of basic cybernetics. He stated that no machine is capable of changing itself. However, another (let us call it “higher”) machine can do that.

 

Ashby’s idea is that a higher machine can monitor the state and rules of a lower machine, and when some condition is recognized, can change those variables or rules. The two machines are coupled by a feedback loop in one system.

 

Note that neither machine re-organizes itself; the lower system does not self-organize, and the higher system does not self-organize. The whole composite (of higher and lower) is a larger and more complex system, but it does not self-organize either.

 

What happens might better be called “rule-changing mutation”. It turns out that this approach to self-organization sits well alongside the notion of a sociological cooperative. But first, a brief aside on second order cybernetics.

 

In second order cybernetics

Krippendorff wrote:

“Although second-order cybernetics (Foerster et al. 1974) was not known at this time, Ashby included himself as experimenter or designer of systems he was investigating.”

 

Second order cybernetics is not a later version of classical cybernetics; it is different. Authors who used the term in the 1970s include Heinz von Foerster, Gregory Bateson and Margaret Mead. However, they didn’t use the term with exactly the same meaning.

 

Second order cybernetics is usually said to be about systems that include the system describer in the system, which enables the system to be self-organizing and creatively self-improving. Remember Boulding’s distinction between roles and actors? Second order cybernetics conflates them, and so undermines the concept of an activity system in which actors are defined by their roles.

 

In EA

EA is a higher system that is coupled to the primary activity systems of a business, and plans changes to them. See the later chapter.

Meta systems thinking

 

Rule-changing mutation

Ashby proposed that a system (call it S) cannot reorganize itself, but can be reorganized by a higher system or entity (call it M). M can observe and change the behavior of S, and when circumstances demand it, M can change the roles and rules of S, thus creating a new generation of S.

 

To change an entity that realizes S, M must (see the sketch after this list):

1.     take as input the abstract system (S version N) realized by the entity

2.     transform that input into a new abstract system (S version N+1)

3.     trigger the entity to realize the new abstract system.
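A minimal sketch of those three steps, assuming (purely for illustration) that the abstract system S is reduced to a dictionary of roles, rules and a version number:

```python
# Sketch: M changes S between generations, under change control.
# The roles, rules and version field are invented placeholders.

s_version_n = {"version": 1,
               "roles": {"player": ["serve", "return"]},
               "rules": {"sets_to_win": 2}}

def meta_system_change(abstract_system, changes):
    """M: take S version N as input and produce S version N+1 as output."""
    new_system = {**abstract_system, **changes}
    new_system["version"] = abstract_system["version"] + 1
    return new_system

def realize(entity, abstract_system):
    """Trigger the entity (the actors) to start realizing the new description."""
    entity["realizes"] = abstract_system
    return entity

club = {"actors": ["Ann", "Bob"], "realizes": s_version_n}
s_version_n1 = meta_system_change(s_version_n, {"rules": {"sets_to_win": 3}})
club = realize(club, s_version_n1)            # the entity now runs generation N+1
```

Change control here amounts to the version number: the entity only ever realizes a complete, numbered generation of S, never a half-changed one.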

 

Meta systems thinking

“Meta systems thinking” adds one more idea to Ashby-style rule-changing mutation. The idea is that one actor may play two different roles: a role in the operation of a regular activity system (S), and another role in a higher system or entity (M) that observes and changes S.

 

E.g. One person can play a role as a tennis player in tennis matches, and another role as a law maker in the Lawn Tennis Association.

 

(Diagram) The LTA creates and uses the rules of tennis; the rules of tennis represent tennis matches; the LTA observes and envisages tennis matches; tennis matches are realized by tennis players.

 

The idea that one human actor can play a role in systems at different levels helps us to reconcile activity system theory with “self-organization”, and gives us an alternative to second order cybernetics.

 

It does however presume that a system evolves by inter-generational steps. And in the case of human activity systems, this implies some kind of change control.

Applying meta systems thinking

Meta systems thinking reconciles activity system theory with self-organization. And it goes some way to reconcile second order and classical cybernetics. A few points to bear in mind:

 

·       Every activity system (ecological, socio-technological, whatever) we define is a "machine" in the Ashby sense.

·       It is logically impossible to define a fully self-defining system, because the moment it starts running it may depart from whatever was defined (e.g. peers reorganize themselves into a hierarchy).

·       We can however define a meta system that enables actors who play roles in a system to negotiate changes to those roles (as in the “Morning Star” example).

·       The system must be changed in discrete, generational steps, since without change control, the system’s activities would become incoherent.

·       Changes to a meta system are distinct from changes to a system that it defines; the former organizes the latter; neither organizes itself.

 

It turns out that meta system thinking can be applied in a variety of domains, to homeostatic machines, to biology and to sociology.

 

Applying meta system thinking to a homeostat

Ashby built a homeostat to illustrate inter-generational reorganization. A “higher” machine detects when environment variables move outside the range within which the lower machine can function safely. The higher machine:

1.     takes as input the rules applied by the homeostat to its variables

2.     changes those rules at random

3.     triggers the homeostat to realise the new rules.

 

What if the higher-level system detects this doesn’t improve matters? Then it can change the rules again, until the lower system either dies or works better.
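Here is a minimal sketch of that loop. The variable (a temperature), its safe range and the candidate rules are all invented for illustration; it is not Ashby's actual homeostat design.

```python
import random

# Sketch of Ashby-style rule-changing mutation: a "higher" machine replaces the
# rules of a "lower" machine at random until the lower machine copes, or "dies".

SAFE_RANGE = (18.0, 26.0)                      # the lower machine's viable range

def run_generation(temperature, rule, steps=50):
    """The lower machine: rule-bound state change under one fixed rule."""
    for _ in range(steps):
        temperature = temperature + rule(temperature)
    return temperature

def higher_machine(start=35.0, max_generations=20):
    """The higher machine: take the current rule as input, test it, and when it
    fails, replace it at random and trigger the lower machine to run again."""
    rule = lambda t: 1.0                       # generation 0: a hopeless rule
    final = run_generation(start, rule)
    for generation in range(max_generations):
        if SAFE_RANGE[0] <= final <= SAFE_RANGE[1]:
            return generation, final           # this generation of rules works
        gain = random.uniform(-0.5, 0.5)       # change the rule at random
        rule = lambda t, g=gain: g * (22.0 - t)
        final = run_generation(start, rule)    # a new generation of the system
    return None, final                         # no workable rule found: it "dies"

print(higher_machine())
```

Neither machine reorganizes itself: the lower machine only applies whatever rule it has been given, and the higher machine only changes the lower machine's rules.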

 

What if the lower-level system is not an individual, but a species or population of actors or agents? If the higher-level system is like biological evolution, it will change the rules of each individual differently. And then leave it to the environment to favor individuals that work better. Or else, if the higher-level system is like a human government, then it can change the rules for all. And then monitor the effectiveness of those rule changes.

 

Applying meta system thinking in biology

“[Consider] the process of evolution through natural selection. The function-rule (survival of the fittest) is fixed”. (Ashby’s Design for a Brain 1/9)

 

Darwin’s evolution of species can be seen as overcoming the limits to the adaptability of an organism. No organism is capable of changing its own DNA. However, the higher system of evolution can do that. The two systems are coupled in that evolution is triggered whenever two organisms succeed in mating, and so replace one generation of those organisms by the next.

 

The rules of organic living are encoded in an organism's DNA. The rules of the organisms in a species are changed from one generation to the next by the fertilisation of an egg. The process of sexual reproduction embodies the “survival of the fittest” rule (see the sketch after this list):

1.     male and female individuals mate

2.     their DNA mixes to form new DNA (think of it as an abstract system description)

3.     the new abstract system is realized by a new individual

4.     the environment favors individuals that make best use of the available resources.
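The four steps above can be caricatured in code. This is a toy selection loop over invented "genomes" and an invented fitness measure, not a claim about real genetics; it only illustrates that variation is random while selection is not.

```python
import random

# Toy sketch of the four steps above: mate, mix DNA, realize a new individual,
# let the environment favor the better-adapted. The "genome" (a list of numbers)
# and the fitness measure are invented for illustration.

TARGET = [0.8, 0.2, 0.5]                       # what the environment "rewards"

def fitness(genome):
    """Higher is better: how well this individual fits its environment."""
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def mate(parent_a, parent_b):
    """Steps 1-3: mix the parents' DNA (with chance mutation) into a child."""
    child = [random.choice(pair) for pair in zip(parent_a, parent_b)]
    if random.random() < 0.1:                  # accidental mutation
        child[random.randrange(len(child))] += random.uniform(-0.2, 0.2)
    return child

def next_generation(population, size=30):
    """Step 4: the better-adapted are more likely to reproduce; the rest die."""
    survivors = sorted(population, key=fitness, reverse=True)[: size // 2]
    return [mate(random.choice(survivors), random.choice(survivors))
            for _ in range(size)]

population = [[random.random() for _ in range(3)] for _ in range(30)]
for _ in range(50):                            # generations replace one another
    population = next_generation(population)
print(max(fitness(g) for g in population))     # fitness drifts upward over time
```

The "design" that results emerges from discarding what does not work: the wasteful process described above.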

 

Applying meta system thinking in sociology

Can we stretch ideas about mechanical, biological and psychological machines to the level of sociology? In the theory of evolution by natural selection, can a social entity be treated as an organism? Might natural selection favor cooperation and oppose competition?

 

Thinkers who addressed this include:

·        Lynn Margulis – the evolution of cells, organisms and societies

·        Boehm – the evolution of hunter-gatherer groups

·        Elinor Ostrom – the formation of cooperatives.

 

The notion of the cooperative can be seen as overcoming the limitations of social entities that compete for resources (as in “the tragedy of the commons”). Typically, a committee or governing body determines the rules of the wider collective. It designs changes to the roles of actors, and directs actors in the social entities to follow them. It also has some power to ensure compliance to the rules.

 

Of course, people are not automatons who inexorably and helplessly play their roles. They are intelligent and creative; they can change the rules of an activity system they play a role in. As free agents, they can

·       follow the rules of the system.

·       ignore or break the rules – which may be recognised in the system as an “exception”.

·       propose changing the rules of the system.

 

Generally speaking, an actor can act in the lower system (S) as an actor in its regular operations, and the same actor can act in a higher system (M) as observer of the lower system (S).

 

How can M change a system from “bad” to “good”? Whereas a robotic M may iteratively make random changes to a system, favouring ones that lead an entity to behave better (in some pre-defined way), a human M can observe the system, understand it and invent changes that are likely to make it better.

 

Applying meta system thinking to EA

The idea of meta system thinking applies readily to business operations. EA is the higher system or entity that monitors and changes business activity systems in discrete steps, under change control. The two levels are coupled in a feedback loop. See the later chapter.

Conclusions and remarks

For the use of these ideas in EA, the later chapter addresses what it takes for an enterprise to be adaptable, including several interpretations of self-organization, and a way of approaching “wicked problems”.

 

Aside: comments on a paper

This paper on robustness makes some questionable assertions. It defines robustness thus. “A core property of robust systems is given by the invariance of their function against the removal of some of their structural components.” Contrary to what some interpret Ackoff as having said, we can indeed remove parts from a whole without affecting its ability to meet its main aim. We can remove the spell checker from a word processor. We can remove the glove compartment, airbag, safety belt, carpet, arm rests and radio from a motor car, and still drive from A to B. We can remove the spleen, gall bladder and appendix from a human body with no significant effect on the functioning of the body.

 

The paper doesn’t distinguish actors from roles. Since it assumes one actor plays one role, and vice-versa, removing one removes both. In practice, it isn’t that simple. Given one role played by many actors, we can remove actors without qualitatively changing the system behavior. E.g. remove one of our two kidneys.  And given one role is responsible for several activities, removing the role may disable all of them. However, some processes may be completable without some ideally-expected activities. And in practice, some processes are more central to business success than others. This “centrality” may not be evident from a model of system structure and behavior.