Ashby’s ideas – an introduction to cybernetics

https://bit.ly/2TRAqlA

Copyright 2017 Graham Berrisford. One of more than 100 papers on the “System Theory” page at http://avancier.website. Last updated 07/05/2019 11:15

 

This is one of many companion papers that analyse some systems thinkers’ ideas.

·         Read Ashby’s ideas (this paper) for an introduction to his cybernetics, much of which is integral to general system theory.

·         Read Ackoff’s ideas on the application of general system theory (after von Bertalanffy and others) to management science.

·         Read Ashby’s ideas about variety on his measure of complexity and law of requisite variety.

·         Read Beer’s ideas on the application of cybernetics (after Ashby and others) to management science.

·         Read Von Foerster’s ideas on second order cybernetics and other ideas attributed to Heinz von Foerster.

 

Further reading on the “System Theory” page at http://avancier.website includes:

Boulding’s ideas, Checkland’s ideas, Luhmann’s ideas, Marx and Engels’ ideas, Maturana’s ideas and Snowden’s ideas.

 

This paper includes several quotes from Ashby’s Introduction to Cybernetics (1956).

And a few quotes from Ashby’s later treatise on self-organisation (1962).

Contents

The beginnings of cybernetics

Systems are abstractions

Systems have regular behaviors or dynamics

All system dynamics can be modelled as discrete

Cybernetics depends on the communication of information

Communication depends on encoding and decoding

Machines are determinate (rule-bound)

Machines can process inputs

Systems can be coupled in wider systems

Systems can change in different ways

“Self-organisation” requires a wider or higher system

Control systems need the requisite variety

On evolution and learning

Conclusions and remarks

Footnotes

 

The beginnings of cybernetics

First, a few general system theory ideas.

A system is defined by some orderly behavior(s) that we model with some particular interest in mind.

 

A closed system is one modelled without reference to its wider environment.

E.g. a causal loop diagram (often used in connection with Forrester’s System Dynamics) shows some stocks connected by some flows.

The model may be incomplete, because it omits stocks in the wider environment that have a significant effect on stocks in the model.

 

An open system is one we model as consuming inputs from its environment.

Those inputs stimulate some processes in the system that change its internal state and/or produce outputs.

E.g. a model of a computer program.

The model of the system can be complete in the sense that it models all possible inputs and outputs.

It cannot model everything that happens when the program runs, down to the level of electron movements in the computer.
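
To make the input-process-output idea concrete, here is a minimal sketch in Python (the event names and the rule are illustrative, not from any source): inputs stimulate a process that changes internal state and produces outputs.

    # A minimal open-system sketch: each input event stimulates a process
    # that updates the system's internal state and produces an output.
    def counter_system(state, event):
        """Process one input event; return (new state, output)."""
        if event == "increment":
            state = state + 1
        elif event == "reset":
            state = 0
        return state, f"count is {state}"

    state = 0
    for event in ["increment", "increment", "reset"]:
        state, output = counter_system(state, event)
        print(output)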

 

Many general system theory concepts are taken for granted in today’s enterprise and software architecture methods. E.g.

·         Environment: the world outside the system of interest.

·         Boundary: a line (physical or logical) that separates a system from its environment.

·         Encapsulation: enclosure of a system as an input-process-output “black box”.

·         Hierarchy: since a system is composed of interacting subsystems, systems are recursively composable and decomposable.

·         State: the current structure or variables of a system, which changes over time.

 

These and related ideas were investigated and disseminated in the middle of the 20th century.

Notably in cybernetics, which emerged in the 1940s out of studying the role of information in system control.

Wiener introduced cybernetics as the science of biological and mechanical machines.

He discussed how a controller directs a target system to maintain some state variable(s) in a desired range.

E.g. A thermostat directs the actions of a heating system.
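
A minimal sketch of that feedback loop, assuming an invented room model and thresholds: the thermostat (controller) reads the temperature (feedback) and directs the heater (target system) to keep the state variable in a desired range.

    # Sketch of a controller directing a target system via feedback.
    def thermostat(temperature, low=19.0, high=21.0):
        """Return the control action for the heating system."""
        if temperature < low:
            return "on"
        if temperature > high:
            return "off"
        return "hold"                       # leave the heater as it is

    temperature, heater = 17.0, "off"
    for _ in range(20):
        action = thermostat(temperature)    # feedback: read the state variable
        heater = action if action != "hold" else heater
        temperature += 0.5 if heater == "on" else -0.3   # crude room model
    print(round(temperature, 1))            # held within the desired range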

 

Cybernetics ideas were soon adopted by psychologists and sociologists.

W. Ross Ashby (1903-1972) was a psychologist and systems theorist.

 

“Despite being widely influential within cybernetics, systems theory…

Ashby is not as well known as many of the notable scientists his work influenced.” Wikipedia 2017

 

Understanding Ashby’s ideas helps you to understand much else in the field of systems thinking.

In “Design for a Brain” (1952), Ashby presented the brain as a regulator that maintains a body’s state variables in the ranges suited to life.

This table distils the general idea.

 

Generic system description: A collection of actors that interact in regular behaviors that maintain system state and/or consume/deliver inputs/outputs from/to the wider environment.

Ashby’s design for a brain: A collection of brain cells that interact in processes to maintain body state variables by receiving/sending information from/to bodily organs/sensors/motors.

Systems are abstractions

For some, to understand systems thinking requires making a paradigm shift as radical as is needed to understand Darwin’s evolution theory.

First, it is necessary to distinguish the concrete from the abstract.

·         One physical individual can be many logical instances.

·         One physical entity (occupying some space and time) can instantiate many logical types.

·         One person can play many roles.

·         One business (owner of physical resources and employer of physical people) can realise many systems.

 

In discussion, people often refer to an individual thing as a system.

They point at a machine or a business (like IBM) and say "the system is that thing there".

But with no further description, that is vacuous to the point of being meaningless.

Ashby, Ackoff, Checkland and other systems thinkers emphasised that a system is a perspective of a reality.

Ashby put it thus:

 

“Important matter of principle in the study of the very large system.

“The observer must be cautious in referring to “the system”, for the term will probably be ambiguous, perhaps highly so.

“The system” may refer to:

·         the whole system quite apart from any observer to study it— the thing as it is in itself.

·         the set of variables (or states) with which some given observer is concerned.

 

Though the former sounds more imposing… the practical worker inevitably finds the second more important.

The second meaning can be ambiguous if the particular observer is not specified…

[Since different systems] may be abstracted from the same real “thing”, a statement that is true of one may be false of another.

 

It follows that there can be no such thing as the (unique) behavior of a very large system, apart from a given observer.

For there can legitimately be as many [systems] as observers, which may... be so different as to be incompatible.

 

Science is not immediately concerned with discovering what the system “really” is.

But [instead] with coordinating the various observers’ discoveries, each of which is only a portion, or an aspect, of the whole truth.

It will be seen therefore that the method of studying very large systems by studying only carefully selected aspects of them is simply what is always done in practice.” (1956, 6/14)

 

Any substantial entity (like IBM) can be described as manifesting (instantiating, realising) countless different systems.

So, we must distinguish abstract systems from concrete systems.

 

“At this point we must be clear about how a "system" is to be defined.

Our first impulse is to point at [some real entity or machine] and to say "the system is that thing there".

This method, however, has a fundamental disadvantage: every material object contains no less than an infinity of variables and therefore of possible systems.

Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” (1956)

 

We model system features that are "relevant to some main interest already given".

To be confident the model is accurate or practically useful, we need some social, logical or empirical verification of the model.

 

Every model of a reality is a partial representation of it.

Some models are wrong because they misrepresent some aspect of reality.

E.g. a causal loop diagram that suggests an increase in GDP leads to an increase in population growth.

 

Some models are right because they do represent reality.

·         A data model representing a database structure.

·         A flowchart representing the logic of a program.

·         This flowchart representing cell respiration.

·         This flowchart representing the Krebs cycle.

 

Some models are right but not useful because they omit some feature that matters to their purpose.

E.g. imagine if this NOAA model of wave height along the USA west coast showed where, but not when.

 

Any model (right or wrong) might be irrelevant "to some main interest that is already given" (Ashby).

 

Finally, a model may be right and relevant but wrongly instantiated to a greater or lesser degree.

Consider the rules of a card game, poker, which define the roles of the players.

Actors, playing their roles in a particular game of poker, may disobey the rules.

Still, any instance of a card game that conforms well enough to those rules may be called a game of poker.

 

Suppose the main interest is the perspective taken by an umpire of a tennis match.

What facts are relevant to the umpire?

·         The state – the current values of the variables – as displayed on the scoreboard.

·         The inputs - the movements of the players and the ball (describable as discrete events) that change the variable values.

·         The rules - the laws of tennis by which the inputs advance the state variables.

 

The laws of tennis are an abstract description of how concrete tennis matches proceed, or should proceed.

Any particular game that conforms well enough to the laws of tennis may be called a tennis match.

 

Tennis match as a system

The laws of tennis

<maintain>                 <represent>

The LTA <observe and envisage> Tennis matches

 

The systems thinker Russell Ackoff spoke of abstract systems and concrete systems.

An abstract system (e.g. the laws of tennis) describes how some part of the world behaves, or should behave.

A concrete system (e.g. a tennis match) is some part of the world that conforms - well enough - to an abstract system.

 

Systems as abstractions

Abstract systems (descriptions)

<create and use>                   <represent>

Systems thinkers <observe & envisage> Concrete systems (realisations)

 

You can think of a concrete system as what Ashby called “the real machine” – a concrete structure in motion.

But bear in mind that the same structure may realise other systems at the same time.

 

An abstract system is a type that hides the infinite complexity of real-world actors and activities that instantiate (or realise) the type.

 

Abstract system description: a theoretical system, a type – a set of roles and rules (the logic or laws actors follow).

Concrete system realisation: an empirical system, an instance – actors playing the roles and acting according to the rules.

 

This table contains some examples.

 

Abstract system description:  “Solar system”      | Laws of tennis  | The score of a symphony        | The US constitution

Concrete system realisation:  Planets in orbits   | A tennis match  | A performance of that symphony | US governments

 

A process in an abstract system description may not repeat exactly in the concrete world.

E.g. The process defined in a musical score is performed somewhat differently each time it is played.

Even a computer program is different each time it runs at the level of memory spaces and electrical phenomena.

 

An abstract system description may be extended with directions on how it can be changed.

E.g. The US constitution defines the roles and rules of the essential actors in the US federal government system.

It defines the roles of the Congress (the legislative branch), the President, the court system (the judicial branch) and the States.

It defines relations between actors playing those roles.

It does not define the roles or rules of lower-level institutions created by federal governments.

It does define a higher-level meta system to be used (by Congress or Constitutional Convention) to amend the constitution.

This kind of “self-organisation” is discussed briefly at the end of this paper.

Systems have regular behaviors or dynamics

Some define a system simply as a structure in which things are related to each other.

But (outside of quantum physics) that means every describable thing is a system, and the term system adds no value.

Also, things may be related only by their relationship to something else.

Are the employees of IBM to be called a system purely because they are all paid by one employer?

 

“The primacy of behavior”

The first question in systems thinking was expressed by Seidl thus:

 

“The first decision… is what to treat as the basic elements of the social system.

The sociological tradition suggests two alternatives: either persons or actions.” Seidl 2001.

 

Do you see a system as a set of actors performing actions? Or a set of actions performed by actors?

Ashby and other system theorists focused attention on a system’s actions before its actors.

A system is defined by some orderly behavior(s) that we model with some particular interest in mind.

 

“Cybernetics deals with all forms of behavior in so far as they are regular, or determinate, or reproducible.”

“[It] treats, not things but ways of behaving. It does not ask "what is this thing?" but ''what does it do?"

It is thus essentially functional and behavioristic.” (Ashby 1956)

 

Donella Meadows, a champion of Forrester’s System Dynamics, also characterised a system by its behaviors.

 

“A set of elements or parts that is coherently organized and interconnected in a pattern or structure

that produces a characteristic set of behaviors, often classified as its function or purpose."

 

In short, a system is a set of actions or behaviors performed by actors or structures.

The behaviors change the values of state variables.

 

 

            | Solar system       | Ashby’s cybernetics     | Meadows’ System Dynamics

Structures  | Planets            | Roles played by actors  | Stocks or populations

Behaviors   | Orbits             | Rules for activities    | Flows between stocks

State       | Positions in space | State variables         | Stock volumes

 

The primacy of behavior is a principle.

It means that structures need be considered only in so far as they perform required behaviors.

 

System dynamics

Roles, rules and variable types

<define>                          <represent>

Systems thinkers   <observe and envisage>   Regular behaviors

 

Beware however that the term “behavior” is used with two meanings in systems thinking discussions.

 

The meaning of “behavior”

In cybernetics, the behaviors are processes that change state variable values.

In Forrester’s System Dynamics, the behaviors are inter-stock flows that change stock populations.

 

Some use the term behavior differently - to mean the trajectory that a state variable’s values take over time.

That trajectory can be regular (linear or cyclical) or irregular/chaotic.

Even if the state change trajectory is irregular, it is still an inexorable side effect of regular behaviors.

All system dynamics can be modelled as discrete

In a real machine, the inputs, outputs and state changes may be all continuous, all discrete, or a mixture.

For examples of continuous change, think of the orbits of the planets, or the hands of a clock.

However, to describe such a machine in a mathematical model, Ashby divided continuous inputs into discrete events.

 

“Often a change occurs continuously, that is, by infinitesimal steps,

as when the earth moves through space, or a sunbather's skin darkens under exposure.

The consideration of steps that are infinitesimal, however, raises a number of purely mathematical difficulties,

so we shall avoid their consideration entirely.

Instead, we shall assume in all cases that the changes occur by finite steps in time and that any difference is also finite.

We shall assume that the change occurs by a measurable jump, as the money in a bank account changes by at least a penny…

Consideration of the case in which all differences are finite loses nothing.

It gives a clear and simple foundation; and it can always be converted to the continuous form if that is desired.” (1956)

 

In short, to model a physical system, Ashby converted any continuous dynamics to discrete dynamics.

Elsewhere, I have demonstrated that a System Dynamics model (a la Forrester) can be abstracted from a discrete event-driven model (a la Ashby).
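
A worked sketch of that move, assuming a simple compound-growth model (rate, horizon and step count are illustrative): the continuous change is replaced by finite, measurable jumps, and converges to the continuous form as the steps shrink.

    # Continuous change (x' = k*x) replaced by finite steps, as Ashby
    # suggests - "as the money in a bank account changes by at least a penny".
    import math

    k, x0, t = 0.05, 100.0, 10.0
    continuous = x0 * math.exp(k * t)   # the continuous-change model

    steps = 1000                        # finite, measurable jumps
    dt, x = t / steps, x0
    for _ in range(steps):
        x += k * x * dt                 # one finite step
    print(continuous, x)                # the discrete model converges on it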

Cybernetics depends on the communication of information

“Cybernetics was defined by Wiener as ‘the science of control and communication, in the animal and the machine’.” (1956)

 

Wiener discussed how a controller directs a target system to maintain some state variable(s) in a desired range, and how information feedback loops connect control and target systems.

 

A system is a transient island of order carved out of the universe.

The universe is an ever-unfolding process that will end in chaos.

To hold off that chaos, the system must draw energy from its environment.

However, cybernetics is focused on flows of information rather than energy.

 

“Cybernetics started by being closely associated in many ways with physics,

but it depends in no essential way on the laws of physics or on the properties of matter.”

“In this discussion, questions of energy play almost no part; the energy is simply taken for granted.” (1956)

 

Cybernetics is about how behavior is affected by information state and flow (rather than physical state or flow).

Communication depends on encoding and decoding

Cybernetics is the science of how systems communicate by, and are controlled by, flows of information.

First, for any act of communication to occur, there must be some of what Ashby called variety.

To have some variety, a structure must have more than one possible state.

 

Information can be created or found in any structure that has more than one possible state.

E.g. If your office door is always open, you cannot use it to convey a message.

If it can be left open or closed, you can use that structural variety to convey the message that you are open to visitors or not.

  

For an act of communication to succeed, two roles must be played.

·         One actor must encode some information or meaning in a data structure or message.

·         Another must decode the same information or meaning from that data structure or message – using the same code.
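
A minimal sketch of those two roles, using the office-door example above (the code table is invented for illustration): the sender encodes a meaning in the structure’s state, and the receiver decodes it using the same code.

    # Communication needs variety (more than one possible state) and a
    # code shared by sender and receiver. The "structure" here is a door.
    CODE = {"open to visitors": "door open",
            "do not disturb":   "door closed"}
    DECODE = {signal: meaning for meaning, signal in CODE.items()}

    def encode(meaning):                # the sender's role
        return CODE[meaning]

    def decode(signal):                 # the receiver's role, same code
        return DECODE[signal]

    assert decode(encode("do not disturb")) == "do not disturb"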

 

Ashby wrote of “the ubiquity of coding” in the communication of information.

“Let us consider, in some detail, the comparatively simple sequence of events that occurs when a “Gale warning” is broadcast.

It starts as some patterned process in the nerve cells of the meteorologist, and then becomes

·         a pattern of muscle-movements as she writes or types it, thereby making it

·         a pattern of ink marks on paper. From here it becomes

·         a pattern of light and dark on the announcer’s retina, then

·         a pattern of retinal excitation, then

·         a pattern of nerve impulses in the optic nerve, and so on through her nervous system. It emerges, while she is reading the warning, as

·         a pattern of lip and tongue movements, and then travels as

·         a pattern of waves in the air. Reaching the microphone it becomes

·         a pattern of variations of electrical potential, and then goes through further changes as it is amplified, modulated, and broadcast. Now it is

·         a pattern of waves in the ether, and next

·         a pattern in the receiving set. Back again to

·         a pattern of waves in the air, it then becomes

·         a pattern of vibrations traversing the listener’s ear-drums, ossicles, cochlea, and then becomes

·         a pattern of nerve-impulses moving up the auditory nerve.

 

… this very brief account mentions no less than sixteen major transformations

through all of which something has been preserved,

though the superficial appearances have changed almost out of recognition.”

 

The thing preserved in every physical form is the information/meaning that the meteorologist intended to broadcast.

All that remains is for the receivers to decode the message - using the same code the sender used to encode it.

Machines are determinate (rule-bound)

“Cybernetics is a ‘theory of machines’.”

“Our starting point is the idea, much more than a century old, that a machine, in given conditions and at a given internal state, always goes to a particular state.”

·         “A variable is a measurable quantity that has a value.”

·         “The state of the system is the set of values that the variables have.”

·         “A system is any set of variables which he [observer] selects from those available on the real machine.” (Ashby 1956)

 

Ashby illuminated systems thinking by viewing the world in terms of machines describable as systems that maintain state variables.

In chapter 3, he discussed how a determinate machine may be defined.

“Every real determinate machine or dynamic system corresponds to a closed, single-valued transformation.” (1956)

 

Ashby’s ideas here include:

·         A transformation is a state change.

·         Single-valued means the system cannot be in two states at once - the set of state variables will have only one set of values at a time.

·         Determinate means each state-to-state transformation is determined by rules.

·         Closed is a constraint we need not discuss here.
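
A minimal sketch of a closed, single-valued transformation (the states and mapping are illustrative): each state has exactly one next state (single-valued), and every next state is itself in the set of states (closed).

    # A closed, single-valued transformation as a mapping over states.
    transformation = {"a": "b", "b": "c", "c": "a"}

    # Single-valued: a dict gives each state exactly one next state.
    # Closed: every resulting state is itself in the set of states.
    assert set(transformation.values()) <= set(transformation)

    state = "a"
    for _ in range(6):                  # the machine's determinate behavior
        state = transformation[state]   # b, c, a, b, c, a
    print(state)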

 

To show the relevance of his ideas to the natural world, Ashby offered the following example (after Tinbergen).

The rows of the table below show the successive states of the stickleback mating system.

The columns show stickleback roles (the columns could be swim lanes in a process flow diagram).

The pair of sticklebacks communicate by sending information to each other in the form of visual signals.

 

Stickleback mating system

The female’s role is to                          | The male’s role is to

present a swollen abdomen and special movements  | present a red colour and a zigzag dance

swim towards the male                            | turn around and swim rapidly to the nest

follow the male to the nest                      | point its head into the nest entrance

enter the nest                                   | quiver in reaction to the female being in the nest

spawn fresh eggs in the nest                     | fertilise the eggs

 

The table above is an abstract system, a description of roles that typify actors.

Reading left to right, top to bottom, each visual signal (information flow) triggers the partner to act in response.

A concrete system would be any pair of sticklebacks that realises the two abstract roles.

Notice the system includes both active structures (sticklebacks) and passive structures (nest and eggs).

Machines can process inputs

Ashby extended his initial notion of a determinate system to include inputs as well as state variables.

 

The machine with input or finite state machine

Ashby equated the organisation of a system to the transformation rules of a machine with finite states.

 

“What matters is the regularity of the behavior….

A machine behaves [such] that its internal state, and the state of its surroundings, defines, uniquely, the next state it will go to.

When the variables are continuous it corresponds to the definition of a dynamic system by giving a set of ordinary differential equations with time as the independent variable.

The fundamental nature of such a representation has been recognised by many.

So arises the modern definition…

The machine with input (Ashby 1958) or the finite state automaton (Jeffrey 1959) is today defined by

·         a set of S internal states,

·         a set of I input or surrounding states and

·         a mapping (say f) of the product set I*S into S.

 

Here, we have the very essence of the “machine”; all known types of machine are to be found here.

Change f and the organisation changes.” (1962)

 

The number of possible states could be practically, if not theoretically, infinite.

What matters is the regularity of the behavior - the determinate nature of state-to-state transformations.

Change a transformation, a function, and you have a different machine/system.
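
A minimal sketch of the machine with input, using invented states and inputs: a set S of internal states, a set I of input states, and a mapping f of the product set I*S into S.

    # Ashby's "machine with input" / finite state machine.
    S = {"idle", "running"}             # internal states
    I = {"start", "stop"}               # input (surrounding) states

    f = {("start", "idle"):    "running",   # the mapping f: I*S -> S
         ("start", "running"): "running",
         ("stop",  "idle"):    "idle",
         ("stop",  "running"): "idle"}

    # Check f really maps the product set I*S into S.
    assert all(inp in I and s in S and nxt in S
               for (inp, s), nxt in f.items())

    state = "idle"
    for inp in ["start", "start", "stop"]:
        state = f[(inp, state)]         # next state uniquely determined
    print(state)                        # "idle"
    # Change f and the organisation changes: it is a different machine.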

 

The transducer or determinate system

Ashby wrote that his machine with input is identical with the “transducer” of Shannon.

 

“A system whose next state is determined by its present state and the present values of its parameters.

Since we can predict its next state, we say it is ‘determinate’.” (1956)

 

Change the determinate rules and you change the machine/system.

 

The Markovian Machine

In chapter 12, Ashby slightly relaxed his definition of a machine as a system whose behavior is law-abiding or repetitive.

He said it is sufficiently law-abiding or repetitive for us to be able to make some prediction about what the system will do.

He allowed there can be a less constrained, but still rule-bound, system.

We cannot predict its next state, but can predict that, in repeated conditions, the frequencies of the various states will be constant.

 

“We can therefore consider a new class of absolute system.

It is one whose states change with time not by a single-valued transformation but by a matrix of transition probabilities.

For it to remain the same absolute system the values of the probabilities must be unchanging.” (1956)
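
A minimal sketch of such a Markovian machine, with an invented two-state weather system: the next state is drawn from a constant matrix of transition probabilities, so individual steps are unpredictable but the long-run state frequencies are not.

    # A Markovian machine: constant transition probabilities.
    import random

    P = {"sunny": {"sunny": 0.8, "rainy": 0.2},
         "rainy": {"sunny": 0.4, "rainy": 0.6}}

    def step(state):
        r, total = random.random(), 0.0
        for nxt, p in P[state].items():
            total += p
            if r < total:
                return nxt
        return nxt                      # guard against float rounding

    state, counts = "sunny", {"sunny": 0, "rainy": 0}
    for _ in range(10000):
        state = step(state)
        counts[state] += 1
    print(counts)                       # frequencies near 2:1, every run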

 

In short, Ashby’s machines include both determinate machines and Markovian machines.

All his systems are predictable to some extent – using determinate and/or probability rules.

Remember, if you add a new variable or function/rule/probability, you make a new machine/system/organisation.

Systems can be coupled in wider systems

“… coupling is of profound importance in science” (1956)

 

In a coupling, one system’s output is input to another system.

 

“Two or more whole machines can be coupled to form one machine;

and any one machine can be regarded as formed by the coupling of its parts,

which can themselves be thought of as small, sub-machines.” (1956)

 

Via coupling, one (control) system can direct the behavior of another (target) system, and change its state change trajectory.

You may see the relationship from control system to target system as being from a “higher” system to a “lower” system.

The terms “higher” and “lower” are subjective, reflecting how we think of the brain and the body, or a hierarchical human organisation.

 

Coupling two or more systems makes a wider system (or ecology).

Given a mesh of coupled systems, many possible system boundaries may be drawn.

 

Properties of a wider system are said to “emerge” from the coupling of the parts.

 

“That a whole machine should be built of parts of given behavior is not sufficient to determine its behavior as a whole.

Only when the details of coupling are added does the whole's behavior become determinate.” (1956)
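
A minimal sketch of coupling (both machines and the coupling are invented for illustration): machine A’s output becomes machine B’s input, so each part keeps its own inner working, and the whole’s behavior becomes determinate only once the coupling is specified.

    # Two machines coupled into one wider machine via output -> input.
    def machine_a(state_a, inp):
        state_a = (state_a + inp) % 4   # A's own determinate rule
        return state_a, state_a         # A's output is its new state

    def machine_b(state_b, inp):
        return max(state_b, inp)        # B's own determinate rule

    state_a, state_b = 0, 0
    for inp in [1, 1, 1, 1]:
        state_a, out_a = machine_a(state_a, inp)
        state_b = machine_b(state_b, out_a)   # the coupling itself
    print(state_a, state_b)             # 0 3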

 

An experimenter can couple to a system in two ways:

·         as a provider of inputs (which change the values of the system’s state variables)

·         as an observer of its outputs.

 

“when the experimenter runs an experiment he is coupling himself temporarily to the system that he is studying.” (1956)

 

Coupling systems does not change those systems.

 

“… coupling does no violence to each machine's inner working,

so that after the coupling each machine is still the same machine that it was before.

For this to be so, the coupling must be arranged so that, in principle, each machine affects the other only by affecting its conditions, i.e. by affecting its input.” (1956)

 

But what about reengineering - changing the machine’s inner workings - changing its state variable types or rules?

This kind of change is of particular interest in social and business systems thinking.

Systems can change in different ways

Ashby scoped a system as a set of variables (think, a set of stocks in System Dynamics).

This implies two very different ways to change the system.

 

·         Change the variables’ values (think, stock volumes) and you change the system’s state.

·         Change the variables (think, stock types) and you change the system itself.

 

Ashby’s system includes not only variables but also the rules applied to those variables.

Rules are applied when an input/event is detected, or a particular state is reached (or a flow passes between stocks).

So, changing the rules is another way to change the system itself – to change the transformations it can make.

 

E.g. Consider system state changes and mutations in playing a game of cards.

There are state changes: In playing their cards, according to defined rules, players win “tricks” which advance the state of the game.

There can be mutations: The players can stop the game, agree a change to the rules, then play on; this is a creative act that changes the game itself.

 

Ashby urged us to distinguish state change and transformation change in this observation.

 

“The word "change" if applied to [an entity repeating a behavior] can refer to two very different things.

·         change from state to state, which is the machine's behavior, and which occurs under its own internal drive, and

·         change from transformation to transformation, which is a change of its way of behaving, and occurs at the whim of the experimenter or some other outside factor.

The distinction is fundamental and must on no account be slighted.” (1956)

 

A reader has suggested he intended to distinguish state change in a machine-without-input from state change triggered by input.

However, the phrase “at the whim of the experimenter” suggests a more fundamental distinction.

System change may be divided into two broad types:

·         State change: changing the values of given state variables (typically triggered by inputs).

·         Behavior change: changing the variables or the rules that update their values.

 

Behavior change may be further subdivided into reconfiguration and mutation.

 

Reconfiguration: changing behavior in a pre-ordained way.

Ashby gave this example:

 

“Many a machine has a switch or lever on it that can be set at any one of three positions,

and the setting determines which of three ways of behaving will occur.” (1956)
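
A minimal sketch of that lever (the three ways of behaving are invented): a parameter selects one of three pre-ordained rules, so reconfiguration changes which rule runs, not the machine’s repertoire.

    # Ashby's three-position lever: a parameter selects the behavior.
    WAYS_OF_BEHAVING = {1: lambda x: x + 1,   # position 1: increment
                        2: lambda x: x * 2,   # position 2: double
                        3: lambda x: 0}       # position 3: reset

    def machine(state, lever_position):
        return WAYS_OF_BEHAVING[lever_position](state)

    state = 5
    state = machine(state, 1)           # 6  - reconfiguration, not mutation
    state = machine(state, 2)           # 12 - repertoire unchanged throughout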

 

For sure, a more flexible machine is more complex than a rigid one (this is a universal trade-off).

But still, the ways of behaving are constrained, and we know what pulling the lever will do.

So reconfiguring the machine, switching it from one mode to another, does not change what it has the potential to do.

 

For a natural example: consider the re-configuration of a caterpillar, via a pupa, into a butterfly.

This change is pre-ordained in the DNA of the caterpillar.

 

Mutation: changing behavior in a random or creative way.

A mutation can happen at random (as in biological evolution) or by intention (as in re-engineering a machine).

 

System mutation is of much interest in social and business systems thinking.

Ashby argued a machine cannot change itself in this way.

This “re-organisation” requires the intervention of a higher level process or machine.

It is creative in that it changes the very nature of the machine, from one generation to the next.

“Self-organisation” requires a wider or higher system

Ashby (and Maturana) rejected the idea of a “self-organising system” as undermining the concept of a system.

We need to distinguish inter-generational evolution from chaotic ad hoc change.

For sure, the actors who play roles in a system may agree to change the variables or rules of that system.

Whenever actors discuss and agree changes, they act in a higher or meta system.

And once the change is made, the actors (still members of the same social network) now act in a new system or system generation.

If actors continually change the properties and functions of the organisation they work in, then the concept of a system is lost.

 

You can find Ashby’s 1962 treatise on self-organisation here.

He pointed out that to say a system is self-organising can have two different meanings.

 

Self-organisation = changing from unorganised to organised

“The system that starts with its parts separate, and when those parts act they act so that they change towards forming connections of some type.” (1962)

For example, a rider and a bicycle are parts that become organised when coupled in a whole.

Properties of the wider system are said to “emerge” from the coupling of the parts.

 

Self-organisation = changing from a bad organisation to a good one

What “good” means is a question Ashby explored at length; more important here is what he said next.

 

“Before the question is answered we must notice… that no machine can be self-organising in this [second] sense.

If the system is to be self-organising, the self must be enlarged to include… some outside agent." (1962)

 

Ashby wrote that to appear self-organizing, a machine S must be coupled to another machine α.

And that the part S can only be ‘self-organizing’ within the whole S + α.

 

And as the preface to Ashby’s treatise says:

 

“One of Ashby’s goals was to repudiate that interpretation of self-organization, commonly held, that a machine or organism can change its own organization.”

Instead he postulated “A new higher level of the machine was activated to reset the lower level's internal connections or organisation.” (2019)

 

Here, higher means something different than the relationship from control system to target system.

It means reshaping the organisation of a system – changing the state variables or the rules for changing their values.

The higher level machine (α) plays the role of a meta system to the lower level machine (S).

Yes, higher and lower level machines may be seen as “coupled” in a wider or aggregate machine.

But the aggregate machine is only ever partially self-organising – since the higher part (α) is needed to change the lower part (S).
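
A minimal sketch of that arrangement (the machines and the criterion of “good” organisation are invented): the lower machine S runs a fixed mapping f that it cannot change; only the higher machine α can replace f, so only the whole S + α appears self-organising.

    # Ashby's S + alpha: a higher machine resets the lower machine's f.
    class LowerMachine:                        # S: runs f, cannot change f
        def __init__(self, f):
            self.f, self.state = f, 0
        def step(self, inp):
            self.state = self.f(self.state, inp)

    def alpha(s, performance):                 # the higher, meta machine
        if performance < 0:                    # judged badly organised
            s.f = lambda state, inp: state - inp   # a new organisation

    s = LowerMachine(lambda state, inp: state + inp)
    s.step(3)                                  # old organisation: state = 3
    alpha(s, performance=-1)                   # meta-level change to f
    s.step(3)                                  # new organisation: state = 0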

 

Note that in a social system an actor can play two roles - one role "inside" the system and the other role as an "outside” agent.

In other words, one (physical) actor can play a (logical) role in both system and meta system.

But one action can take place only in one or other system.

 

Moreover, note that a lower system (S) might be the target of another higher level machine (b), which may conflict with or override the first (α).

Conflicts between required changes seem to me the very stuff of human social existence.

Control systems need the requisite variety

Ashby’s ideas about regulatory systems include the following.

 

Variety as a measure of complexity

Ashby said the complexity of a system = its variety = the number of possible states it can exhibit.

“A system's variety V measures the number of possible states it can exhibit.”

 

The law of requisite variety: “only variety can absorb variety”

"The larger the variety of actions available to a control system, the larger the variety of perturbations it is able to compensate".

Perturbations are changes in the values of a target system’s variables that need to be regulated by the controller.

 

Managing complexity (variety)

Where a controller has insufficient variety, design options include

·         Amplifying (increasing) the variety in the control or management system

·         Attenuating (reducing) the variety in the target or operational system.
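
A numeric sketch of the law (the figures are illustrative): with D possible disturbances and R possible responses, a regulator cannot reduce the variety of outcomes below D/R (rounded up), so holding the outcome to a single value needs at least as many responses as disturbances.

    # "Only variety can absorb variety", in raw counts.
    import math

    def best_outcome_variety(d, r):
        return math.ceil(d / r)         # the floor set by Ashby's law

    print(best_outcome_variety(d=10, r=2))    # 5 outcomes: under-regulated
    print(best_outcome_variety(d=10, r=10))   # 1 outcome: requisite variety
    # Design options: amplify r (controller variety) or attenuate d
    # (target variety) until r covers d.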

 

Read Ashby’s law of requisite variety for a detailed exploration of these ideas.

On evolution and learning

Ashby’s work is dense, expressed in mathematical terms, and a tough read.

His references to Darwinian evolution and learning are slight, and appear largely speculative.

 

Regarding evolution, Ashby discussed properties that diminish or flourish because entities who possess them decrease or increase in number.

He did not discuss the higher level meta systems by which a system with new properties may be generated.

·         Mutation by random changes to variables and rules when a system is reproduced (as in Darwinian evolution).

·         Mutation by intentional reengineering of a system’s variables and rules (as in enterprise architecture and software engineering).

 

It is unclear whether Ashby envisaged cybernetics could explain all kinds of learning.

·         Learning from simple physical sensation (e.g. that your lips may stick to an ice cube)

·         Learning facts (e.g. the colours of the rainbow)

·         Learning a physical process (e.g. to swim or play music)

·         Learning a logical process (e.g. multiplication, algebra)

·         Learning a cultural norm (e.g. to say please and thank you).

·         Machine learning (finding patterns in data)

 

However, machine learning algorithms are state machines executable by computers.

And every state machine is a machine of the kind addressed by cybernetics.

Conclusions and remarks

To describe the data/information created and needed by business processes does not imply business automation or digitisation.

It does imply the business is a "system" in the general system theory sense.

The persistent data/information - the memory or state of the system – may be described using an entity-relationship diagram.

The processes of the system - triggered by input messages and internal state changes – may be described using flowcharts.

In the 1960s and 70s, the first wave of digitisation led to refinements in diagram notations.

But the concepts of general system theory did not change then and have not changed since.

 

Understanding Ashby’s ideas helps us understand the above, and much else.

Cybernetics shares many concepts with cross-science general system theory.

The same concepts appear in enterprise and software architecture methods.

And in sociology, they lead us to distinguish an abstract social system from the concrete actors in a social network.

The same actors may realise not only the system of interest, but countless other, even conflicting, systems.

 

On self-organisation

Ashby (and Maturana) rejected the idea of a “self-organising system” as undermining the concept of a system.

First, we need to distinguish self-regulation from self-organisation.

Then, for a coherent discussion, we need to distinguish several possible meanings of “self-organising”.

Read Foerster’s ideas about second order cybernetics for further explanation and exploration of the topic.

Some speak of complex adaptive systems, where the meaning of all three terms is debatable.

Read Complex adaptive systems for more on that.

 

On system modelling as science

A model is a description of a reality, and is infinitely simpler than that reality.

It may be built to develop understanding, or to disseminate understanding, or to build a system, or to predict the future.

 

You can model a situation in many ways; not every model is a system description.

But if you model a situation as a system, then it is a system - by definition.

 

What is relevant to your model is whatever relates to "some main interest already given" (Ashby 1956).

If simply understanding the situation is your interest, then that determines what is relevant.

But how do you know your model is improving your understanding?

(Rather than merely presenting some existing prejudice in the form of a model.)

 

At the end of the day, to test what you believe to be a new understanding, you need some social, logical or empirical verification.

E.g. one person presented a Causal Loop Diagram to a discussion group.

Another person pointed out that it omitted that increasing GDP tends to decrease population growth, and can therefore reduce pressure on resources.

 

The critique of science by Marxists, post-modernists and some second order cyberneticians is much to be deprecated.

To apply Ashby’s system theory is to apply the scientific method.

You observe or envisage a system in the real world – an empirical system – an instance.

You model it in an abstract system description - a theoretical system – a type.

You model the state of the system as variables whose values can be measured (e.g. the positions of the planets).

You model the behavior of the system as processes (e.g. the orbits of the planets) that maintain or advance variable values.

You test how far the model is supported by empirical evidence.

 

Was second order cybernetics needed?

Ashby’s cybernetics is sometimes called first order cybernetics.

His observer, or experimenter, sits outside the system that is described.

Second-order cybernetics was developed around 1970 by Heinz von Foerster and others.

It includes the observer in the system.

Was that change of perspective necessary?

Read this paper on second order cybernetics for further discussion.

Footnotes

 

On levels of thinking

This table identifies three levels of thinking about the realisation of a system.

 

Abstract system descriptions

mental, spoken, documented, mechanical, whatever

“Solar system”

Laws of tennis