Ashby’s ideas – an introduction to cybernetics

https://bit.ly/2TRAqlA

Copyright 2017-9 Graham Berrisford. One of a hundred papers on the System Theory page at http://avancier.website. Last updated 25/04/2019 22:16

 

This is the first of three companion papers that contain background research on systems thinkers.

·         Read Ashby’s ideas for an introduction to his cybernetics, much of which is integral to general system theory.

·         Read Ackoff’s ideas on the application of general system theory (after von Bertalanffy, Boulding and Rapoport) to management science.

·         Read Beer’s ideas on the application of cybernetics (after Wiener, Ashby and Turing) to management science.

 

This paper includes several quotes from Ashby’s Introduction to Cybernetics (1956).

And a few quotes from Ashby’s later treatise on self-organisation (1962).

Contents

The beginnings of cybernetics

Systems are abstractions

Systems have regular behaviours or dynamics

Continuous dynamics can be modelled as discrete

Cybernetics is about the communication of information

Communication depends on encoding and decoding

Machines are determinate (rule-bound)

Machines can process inputs

Systems can be coupled in wider systems

Systems can change in different ways

“Self-organisation” requires a wider or higher system

Control systems need the requisite variety

Conclusions and remarks

Footnotes

 

The beginnings of cybernetics

Cybernetics emerged out of efforts in the 1940s to understand the role of information in system control.

Wiener introduced cybernetics as the science of biological and mechanical control systems.

He discussed how a controller directs a target system to maintain some state variable(s) in a desired range.

E.g. A thermostat directs the actions of a heating system.

 

W. Ross Ashby (1903-1972) was a psychologist and systems theorist.

“Despite being widely influential within cybernetics, systems theory… Ashby is not as well known as many of the notable scientists his work influenced.” Wikipedia 2017

Understanding Ashby’s ideas helps you to understand much else in the field of systems thinking.

 

In “Design for a Brain” (1952), Ashby addressed biological (rather than mechanical or electronic) homeostatic functions.

He presented the brain as a regulator that maintains a body’s state variables in the ranges suited to life.

This table distils the general idea.

 

Generic system description: A collection of actors that interact in regular behaviors that maintain system state and/or consume/deliver inputs/outputs from/to the wider environment.

Ashby’s design for a brain: A collection of brain cells that interact in processes to maintain body state variables by receiving/sending information from/to bodily organs/sensors/motors.

 

Cybernetics shares many concepts with wider cross-science general system theory.

The following concepts appear also in enterprise and software architecture methods.

·         Environment: the world outside the system of interest.

·         Boundary: a line (physical or logical) that separates a system from its environment, and encapsulates a system as an input-process-output “black box”.

·         Interface: a description of inputs and outputs that cross the system boundary.

·         Hierarchy: a system is composed from interacting subsystems; systems are recursively composable and decomposable.

·         Emergence of properties, at a higher level of composition, from coupling of lower-level subsystems.

·         Coupling of systems by input/output information.

·         State: the current structure or variables of a system, which changes over time.

·         Deterministic processing of a system’s inputs with respect to its memory/state data.

Systems are abstractions

For some, understanding systems thinking requires a paradigm shift as radical as the one needed to understand Darwin’s theory of evolution.

In discussion, people often refer to a named entity as a system.

They point at a machine or a business (like IBM) and say "the system is that thing there".

But with no further description, that is vacuous to the point of being meaningless.

 

Ashby, Ackoff, Checkland and other systems thinkers emphasised that a system is a perspective of a reality.

Any substantial entity (like IBM) can be described as manifesting (instantiating, realising) countless different systems.

So, we must distinguish abstract systems from concrete systems.

 

“At this point we must be clear about how a "system" is to be defined.

Our first impulse is to point at [some real entity or machine] and to say "the system is that thing there".

This method, however, has a fundamental disadvantage: every material object contains no less than an infinity of variables and therefore of possible systems.

Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” (1956)

 

Suppose the main interest is the perspective taken by an umpire of a tennis match.

What facts are relevant?

·         The state – the current values of the variables displayed on the score board.

·         The inputs - the movements of the ball and the players that change the variables, which are describable as discrete events.

·         The rules - the laws of tennis by which the inputs advance the state.
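The umpire’s perspective above - state variables, discrete input events, and rules that advance the state - can be sketched in code. The sketch below covers the scoring of a single game only, and is an illustration of mine, not from the source.

```python
def score_point(state, winner):
    """Apply the rules (of one tennis game) to one discrete event.

    `state` holds the umpire's state variables; `winner` is "A" or "B".
    Returns the new state. A minimal sketch, not the full laws of tennis.
    """
    s = dict(state)
    s[winner] += 1
    a, b = s["A"], s["B"]
    if (a >= 4 or b >= 4) and abs(a - b) >= 2:
        s["game"] = winner          # the game is won
    return s

NAMES = {0: "love", 1: "15", 2: "30", 3: "40"}

def scoreboard(state):
    """Render the state variables as a score board would display them."""
    if state.get("game"):
        return "game " + state["game"]
    a, b = state["A"], state["B"]
    if a >= 3 and b >= 3:
        return "deuce" if a == b else "advantage " + ("A" if a > b else "B")
    return NAMES[a] + "-" + NAMES[b]

state = {"A": 0, "B": 0}
for event in ["A", "A", "B", "A", "A"]:   # a stream of discrete input events
    state = score_point(state, event)
print(scoreboard(state))  # → game A
```

The state advances only via the defined rules; the infinite physical detail of a real match (ball spin, footwork) is hidden behind the discrete events.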

 

The laws of tennis are an abstract description of how concrete tennis matches proceed, or should proceed.

Any game that conforms (well enough) to the laws of tennis may be called a tennis match.

 

Tennis match as a system

The laws of tennis

<form>                 <represent>

The LTA <observe and envisage> Tennis matches

 

The systems thinker Russell Ackoff spoke of abstract systems and concrete systems.

An abstract system (e.g. the laws of tennis) describes how some part of the world behaves, or should behave.

A concrete system (e.g. a tennis match) is some part of the world that conforms - well enough - to an abstract system.

 

Systems as abstractions

Abstract systems (descriptions)

<create and use>                   <represent>

Systems thinkers <observe & envisage> Concrete systems (realisations)

 

A concrete system can be viewed in two ways.

You can think of it as what Ashby called “the real machine” – a concrete structure in motion.

Or think instead of how far and how well that structure’s behaviour realises a particular abstract system.

(Bear in mind that the same structure may realise other systems at the same time.)

 

An abstract system is a type that hides the infinite complexity of real-world actors and activities that instantiate (or realise) the type.

 

Abstract system description | Theoretical system | Type     | A set of roles and rules (the logic or laws actors follow)

Concrete system realisation | Empirical system   | Instance | Actors playing the roles and acting according to the rules

 

This table contains some examples.

 

Abstract system description | Concrete system realisation

“Solar system”              | Planets in orbits

Laws of tennis              | A tennis match

The score of a symphony     | A performance of that symphony

The US constitution         | US governments

 

The US constitution defines the roles and rules of the essential actors in the US federal government system.

The roles include the Congress (the legislative branch), the President, the court system (the judicial branch) and the States.

The constitution also defines relations between actors playing those roles.

It does not define the roles or rules of subordinate institutions created by federal governments.

It does however define a higher/meta system to be used (by Congress or Constitutional Convention) to change the system (amend the constitution) itself.

Systems have regular behaviours or dynamics

Some define a system as a structure in which things are related to each other.

But that fits every substantial entity in the universe; and if the entity-system relationship is 1-1, the term system adds no value.

Also, things may be related only tenuously, by their relationship to something else.

Are the employees of IBM to be called a system purely because they are all related to one employer?

 

“The first decision… is what to treat as the basic elements of the social system.

The sociological tradition suggests two alternatives: either persons or actions.” Seidl 2001.

Is a system a set of actors performing actions? Or a set of actions performed by actors?

 

Ashby and other general system theorists focused attention on a system’s actions rather than its actors.

“Cybernetics deals with all forms of behaviour in so far as they are regular, or determinate, or reproducible.”

“[It] treats, not things but ways of behaving. It does not ask "what is this thing?" but ''what does it do?"

It is thus essentially functional and behaviouristic.” (Ashby 1956)

 

Donella Meadows, a champion of Forrester’s System Dynamics, characterised a system by its behaviours.

“A set of elements or parts that is coherently organized and interconnected in a pattern or structure

that produces a characteristic set of behaviours, often classified as its function or purpose."

 

In short, a system is a set of actions or behaviours performed by actors or structures.

The behaviors change the values of state variables.

 

 

           | Solar system       | Ashby’s cybernetics    | Meadows’ System Dynamics

Structures | Planets            | Roles played by actors | Stocks or populations

Behaviors  | Orbits             | Rules for activities   | Flows between stocks

State      | Positions in space | State variables        | Stock volumes

 

A general principle in system design is the primacy of behaviour:

structures exist only to perform the required behaviours.

 

System dynamics

Roles, rules and variable types

<define>                         <represent>

Systems thinkers   <observe and envisage>   Regular behaviours

 

Beware however that the term “behaviour” is used with two meanings in systems thinking discussions.

 

Behaviours as regular processes

In cybernetics, the behaviors are processes that change state variable values.

In Forrester’s System Dynamics, the behaviors are inter-stock flows that change stock populations.

Note that a process in an abstract system description may not repeat exactly in the concrete world.

The process defined in a musical score is performed somewhat differently each time it is played.

Even a computer program differs each time it runs, at the level of memory addresses and electrical phenomena.

 

Behaviours as state change trajectories

The term behaviour can instead mean the trajectory that a state variable’s values take over time.

That trajectory can be regular (linear or cyclical) or irregular/chaotic.

Whether the state change trajectory is regular or irregular, it is an inexorable side effect of regular behaviours.

Continuous dynamics can be modelled as discrete

Ashby set out to view physical (biological and technological) machines as systems.

To that end, he converted all physically continuous dynamics to logically discrete dynamics.

In a real machine, the inputs, outputs and state changes may be all continuous, all discrete, or a mixture.

However, to describe that machine in a mathematical model, Ashby divided continuous inputs into discrete events.

 

“Often a change occurs continuously, that is, by infinitesimal steps, as when the earth moves through space, or a sunbather's skin darkens under exposure.

The consideration of steps that are infinitesimal, however, raises a number of purely mathematical difficulties, so we shall avoid their consideration entirely.

Instead, we shall assume in all cases that the changes occur by finite steps in time and that any difference is also finite.

We shall assume that the change occurs by a measurable jump, as the money in a bank account changes by at least a penny…

Consideration of the case in which all differences are finite loses nothing.

It gives a clear and simple foundation; and it can always be converted to the continuous form if that is desired.” (1956)

 

In short, to model a physical system, Ashby converted any continuous dynamics to discrete dynamics.

Elsewhere, I have demonstrated that a System Dynamics model (a la Forrester) can be abstracted from a discrete event-driven model (a la Ashby).
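Ashby’s finite-steps assumption can be illustrated by modelling a physically continuous process with discrete, measurable jumps. The sketch below (mine, not Ashby’s) approximates cooling toward an ambient temperature; the step size and rate constant are illustrative.

```python
def step(temp, ambient=20.0, k=0.1, dt=1.0):
    """One finite state transition: the temperature moves a measurable
    jump toward ambient, instead of changing by infinitesimal steps."""
    return temp + k * (ambient - temp) * dt

temp = 90.0
for _ in range(50):       # fifty finite steps in time
    temp = step(temp)
print(round(temp, 1))     # has converged close to the ambient 20.0
```

Nothing is lost by the discretisation: shrink `dt` and the discrete model can be converted back toward the continuous form, as Ashby notes.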

Cybernetics is about the communication of information

“Cybernetics was defined by Wiener as "the science of control and communication, in the animal and the machine"

in a word, as the art of steersmanship, and it is to this aspect that the book will be addressed.” (1956)

 

Wiener’s cybernetics was focused on the control of biological and technological systems.

Especially on how coupling a control system to a target system can act to regulate their states.

 

More generally, the universe is an ever-unfolding process that will end in chaos.

A system is a transient island of order carved out of the universe.

To hold off chaos, the system must draw energy from its environment.

But cybernetics is focused on flows of information rather than energy.

 

“Cybernetics started by being closely associated in many ways with physics,

but it depends in no essential way on the laws of physics or on the properties of matter.”

“In this discussion, questions of energy play almost no part; the energy is simply taken for granted.” (1956)

Cybernetics is about how behaviour is affected by information state and flow (rather than physical state or flow).

Communication depends on encoding and decoding

Cybernetics is the science of how systems communicate, and are controlled by, information flows.

First, for any act of communication to occur, there must be some of what Ashby called variety.

If my office door is always open, I cannot use it to convey a message.

If it can be left open or closed, I can use it to convey a message (I am open to visitors or not).

 

Ashby emphasised that the meaning of a message depends on what the receiver/decoder knows of the sender/encoder.

In his example, two soldiers are taken prisoner by countries A and B, and their wives receive the same brief message: “I am well”.

Though each wife has received the same message (or signal), they have received different information (different meanings).

This is because country A allows the prisoner a choice of three messages (I am well; I am slightly ill; I am seriously ill),

whereas country B allows only one message: I am well (meaning no more than “I am alive”).
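The prisoner example can be quantified. The information a message carries depends on the variety of the set it was chosen from - log2 of the number of possible messages. This small calculation is mine, not Ashby’s notation.

```python
import math

def variety_bits(messages):
    """Information (in bits) one message can carry, given the set of
    messages the sender could have chosen: log2 of the variety."""
    return math.log2(len(messages))

country_a = ["I am well", "I am slightly ill", "I am seriously ill"]
country_b = ["I am well"]

print(variety_bits(country_a))  # log2(3) ≈ 1.585 bits
print(variety_bits(country_b))  # log2(1) = 0 bits: no information conveyed
```

The wife in country B learns nothing from the words themselves (beyond the fact of receipt), because the sender had no variety to draw on.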

 

Ashby wrote as follows of “the ubiquity of coding” in the communication of information.

Let us consider, in some detail, the comparatively simple sequence of events that occurs when a “Gale warning” is broadcast.

It starts as some

·         patterned process in the nerve cells of the meteorologist, and then becomes

·         a pattern of muscle-movements as she writes or types it, thereby making it

·         a pattern of ink marks on paper. From here it becomes

·         a pattern of light and dark on the announcer’s retina, then

·         a pattern of retinal excitation, then

·         a pattern of nerve impulses in the optic nerve, and so on through his nervous system. It emerges, while he is reading the warning, as

·         a pattern of lip and tongue movements, and then travels as

·         a pattern of waves in the air. Reaching the microphone it becomes

·         a pattern of variations of electrical potential, and then goes through further changes as it is amplified, modulated, and broadcast. Now it is

·         a pattern of waves in the ether, and next

·         a pattern in the receiving set. Back again to

·         a pattern of waves in the air, it then becomes

·         a pattern of vibrations traversing the listener’s ear-drums, ossicles, cochlea, and then becomes

·         a pattern of nerve-impulses moving up the auditory nerve.

Here we can leave it, merely noticing that this very brief account mentions no less than sixteen major transformations through all of which something has been preserved, though the superficial appearances have changed almost out of recognition.

Machines are determinate (rule-bound)

Ashby illuminated systems thinking by viewing the world in terms of machines describable as systems that maintain state variables.

“Cybernetics is a "theory of machines".

“Our starting point is the idea, much more than a century old, that a machine, in given conditions and at a given internal state, always goes to a particular state.”

·         “A variable is a measurable quantity that has a value.”

·         “The state of the system is the set of values that the variables have.”

·         “A system is any set of variables which he [observer] selects from those available on the real machine.” (Ashby 1956)

 

In chapter 3, Ashby discussed how a determinate machine may be defined.

“Every real determinate machine or dynamic system corresponds to a closed, single-valued transformation.” (1956)

A transformation is a state change.

Single-valued means the system cannot be in two states at once - the set of state variables will have only one set of values at a time.

Determinate means each state-to-state transformation is determined by rules.

Closed is a constraint we need not discuss here.
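A closed, single-valued transformation can be sketched as a mapping in which each state goes to exactly one next state (single-valued), and every next state is itself in the state set (closed). The states below are illustrative.

```python
# A closed, single-valued transformation as a Python dict.
T = {"a": "b", "b": "c", "c": "d", "d": "a"}

# Closure: every transform lands back inside the set of states.
assert set(T.values()) <= set(T)

def run(state, steps):
    """Let the machine's own 'internal drive' apply the transformation
    repeatedly, moving it from state to state."""
    for _ in range(steps):
        state = T[state]
    return state

print(run("a", 5))  # → b  (a → b → c → d → a → b)
```

Because the mapping is single-valued, the machine’s entire future is determined by its present state - which is what “determinate” means here.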

 

To show the relevance of his mathematical ideas to the natural world, Ashby offered the following example.

“The male and female three-spined stickleback form… a determinate dynamic system.

Tinbergen (in his Study of Instinct) describes the system's successive states as follows” (1956)

The stickleback roles are shown in the columns of the table below (they could be shown as swim lanes in a process flow diagram).

The actors play those roles by communicating, by sending visual signals (information flows) to each other.

 

Stickleback mating system

The female’s role is to                         | The male’s role is to

present a swollen abdomen and special movements | present a red colour and a zigzag dance

swim towards the male                           | turn around and swim rapidly to the nest

follow the male to the nest                     | point its head into the nest entrance

enter the nest                                  | quiver in reaction to the female being in the nest

spawn fresh eggs in the nest                    | fertilise the eggs

 

The table above is an abstract system, a description of roles that typify actors.

Reading left to right, top to bottom, each visual signal (information flow) triggers the partner to act in response.

A concrete system would be any pair of sticklebacks that realises the two abstract roles.

Notice the system includes both active structures (sticklebacks) and passive structures (nest and eggs).
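The mating sequence above can be read as a strictly alternating exchange of signals, where each signal is the input that triggers the partner’s next action. The sketch below simply encodes the table as data and checks that alternation.

```python
# The stickleback mating system as a turn-based exchange of signals.
# Each (role, action) pair is a visual signal that triggers the
# partner's next state transition.
sequence = [
    ("female", "present swollen abdomen and special movements"),
    ("male",   "present red colour and zigzag dance"),
    ("female", "swim towards the male"),
    ("male",   "turn around and swim rapidly to the nest"),
    ("female", "follow the male to the nest"),
    ("male",   "point head into the nest entrance"),
    ("female", "enter the nest"),
    ("male",   "quiver in reaction to the female being in the nest"),
    ("female", "spawn fresh eggs in the nest"),
    ("male",   "fertilise the eggs"),
]

# In this determinate system the roles strictly alternate:
for i in range(1, len(sequence)):
    assert sequence[i][0] != sequence[i - 1][0]
print(len(sequence))  # → 10 steps
```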

Machines can process inputs

Ashby extended his initial notion of a determinate system to include inputs as well as state variables.

 

The machine with input or finite state machine

“What matters is the regularity of the behaviour….

A machine behaves [such] that its internal state, and the state of its surroundings, defines, uniquely, the next state it will go to.

When the variables are continuous it corresponds to the definition of a dynamic system by giving a set of ordinary differential equations with time as the independent variable.

The fundamental nature of such a representation has been recognised by many.

So arises the modern definition…

The machine with input (Ashby 1958) or the finite state automaton (Jeffrey 1959) is today defined by

·         a set of S internal states,

·         a set of I input or surrounding states and

·         a mapping (say f) of the product set I*S into S.

 

Here, we have the very essence of the “machine”; all known types of machine are to be found here.

Change f and the organisation changes.” (1962)

 

Above, Ashby equated the organisation of a system to the transformation rules of a machine with finite states.

The number of possible states could be practically if not theoretically infinite.

What matters is the regularity of the behaviour - the determinate nature of state-to-state transformations.

Change a transformation, a function, and you have a different machine/system.
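Ashby’s definition - a set S of internal states, a set I of input states, and a mapping f of the product set I*S into S - is directly expressible in code. The turnstile below is a standard textbook illustration (not Ashby’s own example).

```python
# Ashby's "machine with input" / finite state automaton:
# internal states S, input states I, and a mapping f: I*S -> S.
S = {"locked", "unlocked"}
I = {"coin", "push"}

f = {  # (input, state) -> next state
    ("coin", "locked"):   "unlocked",
    ("push", "locked"):   "locked",
    ("coin", "unlocked"): "unlocked",
    ("push", "unlocked"): "locked",
}

def run(state, inputs):
    """The internal state and the input state uniquely determine
    the next state - the essence of the 'machine'."""
    for i in inputs:
        state = f[(i, state)]
    return state

print(run("locked", ["coin", "push", "push"]))  # → locked
```

Change one entry of `f` and, in Ashby’s terms, the organisation changes: you have a different machine.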

 

The transducer or determinate system

Ashby wrote that his machine with input is identical with the “transducer” of Shannon.

“A system whose next state is determined by its present state and the present values of its parameters.

Since we can predict its next state—we say it is “determinate”. (1956)

Change the determinate rules and you change the machine/system.

 

The Markovian Machine

Later, Ashby slightly relaxed his definition of a machine as a system whose behaviour is law-abiding or repetitive.

He said it is sufficiently law-abiding or repetitive for us to be able to make some prediction about what the system will do.

In chapter 12, he allowed there can be a less constrained, but still rule-bound, system.

We cannot predict its next state, but can predict that, in repeated conditions, the frequencies of the various states will be constant.

“We can therefore consider a new class of absolute system.

It is one whose states change with time not by a single-valued transformation but by a matrix of transition probabilities.

For it to remain the same absolute system the values of the probabilities must be unchanging.” (1956)

 

In short, Ashby’s machines include both determinate machines and Markovian machines.

All his systems are predictable to some extent – using determinate and/or probability rules.

And if you add a new variable or function/rule/probability, you make a new machine/system/organisation.
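A Markovian machine can be sketched as a matrix of transition probabilities: we cannot predict the next state, but the probabilities themselves are fixed, so the system remains the same system. The two states and their probabilities below are illustrative.

```python
import random

# A Markovian machine: state changes by a matrix of transition
# probabilities rather than a single-valued transformation.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

# For it to remain the same system, each row of probabilities is fixed
# (and each row must sum to 1).
for row in P.values():
    assert abs(sum(row.values()) - 1.0) < 1e-9

def next_state(state, rng):
    """Choose the next state using the fixed transition probabilities."""
    states = list(P[state])
    weights = [P[state][s] for s in states]
    return rng.choices(states, weights=weights)[0]

rng = random.Random(0)   # seeded, so the run is reproducible
state = "sunny"
history = []
for _ in range(10):
    state = next_state(state, rng)
    history.append(state)
print(history)
```

Individual runs differ, but in repeated runs the frequencies of the states converge to constants determined by the matrix - which is the prediction such a machine supports.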

Systems can be coupled in wider systems

“Two or more whole machines can be coupled to form one machine;

and any one machine can be regarded as formed by the coupling of its parts,

which can themselves be thought of as small, sub-machines.” (1956)

 

In a coupling, one system’s output is input to another system.

Via coupling, one (control) system can direct the behaviour of another (target) system, and change its state change trajectory.

The relationship from a control system to a target system may be described as from a “higher” system to a “lower” system.

The terms “higher” and “lower” are subjective, reflecting how we think of the brain and the body, or a hierarchical human organisation.

 

Any coupled systems can be regarded as parts of a wider system (or ecology).

But note that the boundary of the wider system is arbitrary; it lies wherever the observer chooses.

And in a mesh of coupled systems, there are infinite possible wider systems.

 

Properties of a wider system “emerge” from the coupling of the parts.

“That a whole machine should be built of parts of given behaviour is not sufficient to determine its behaviour as a whole.

Only when the details of coupling are added does the whole's behaviour become determinate.” (1956)
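Coupling can be sketched with Wiener’s thermostat example from earlier: the controller’s output is the room’s input, and the room’s output (its temperature) is the controller’s input. Neither machine’s inner working is altered by the coupling. All numbers here are illustrative.

```python
def thermostat(temp, target=20.0):
    """Control system: reads the target system's temperature,
    outputs heater on/off."""
    return temp < target

def room(temp, heating, drift=-0.5, heat=1.5):
    """Target system: temperature drifts downward, and rises while heated."""
    return temp + drift + (heat if heating else 0.0)

temp = 14.0
for _ in range(30):
    # The details of the coupling: output of one machine is input
    # to the other. The whole's behaviour emerges from this.
    temp = room(temp, thermostat(temp))
print(round(temp, 1))  # held close to the 20.0 target
```

Neither function alone regulates anything; regulation is a property of the whole that emerges only once the coupling is specified.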

 

An experimenter can couple to a system by providing inputs (and so changing its state variable values) or observing outputs.

“… coupling is of profound importance in science, for when the experimenter runs an experiment he is coupling himself temporarily to the system that he is studying.”

“… coupling does no violence to each machine's inner working, so that after the coupling each machine is still the same machine that it was before.

For this to be so, the coupling must be arranged so that, in principle, each machine affects the other only by affecting its conditions, i.e. by affecting its input.” (1956)

 

But what about reengineering - changing the machine’s inner workings - changing its state variable types or rules?

This kind of change is of particular interest in social and business systems thinking.

Systems can change in different ways

Ashby scoped a system as a set of variables (cf. a set of stocks in System Dynamics).

Change the variables’ values (cf. stock volumes) and you change the system’s state.

Change the variables (cf. the stocks) and you change the system itself - change the transformations it can make.

 

However, Ashby’s system is more than a set of variables.

It also includes the rules applied to those variables in any state-to-state transformation.

Rules are applied when a particular state is reached or an input/event is detected (or a flow passes between stocks).

So, another way to change the system is to change the rules that govern state changes.

 

Consider system state changes and mutations in playing a game of cards.

There are state changes: In playing their cards, according to defined rules, players win “tricks” which advance the state of the game.

There can be mutations: The players can stop the game, agree a change to the rules, then play on; this is a creative act that changes the game itself.

 

Ashby urged us to distinguish state change and transformation change in this observation.

“The word "change" if applied to [an entity repeating a behaviour] can refer to two very different things.

·         change from state to state, which is the machine's behaviour, and which occurs under its own internal drive, and

·         change from transformation to transformation, which is a change of its way of behaving, and occurs at the whim of the experimenter or some other outside factor.

The distinction is fundamental and must on no account be slighted.” (1956)

 

Ashby may have intended to distinguish state change in a machine without input from state change triggered by input.

However, his phrase at “at the whim of the experimenter” suggests a more fundamental distinction.

System change may be divided into two broad types:

·         system state change: changing the values of given state variables (whether triggered by input or not)

·         system behaviour change: changing the variable types or the rules that update their values.

 

The second kind of change may be further subdivided into reconfiguration and mutation.

 

System reconfiguration: changing behaviour in a pre-ordained way.

Consider as a system, a machine that is coupled to a lever.

“Many a machine has a switch or lever on it that can be set at any one of three positions, and the setting determines which of three ways of behaving will occur.” (1956)

Notice that a more flexible machine is more complex than a rigid one (this is a universal trade-off).

But still, the ways of behaving are constrained, and we know what pulling the lever will do.

So reconfiguring the machine, switching it from one mode to another, does not change what it has the potential to do.
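Ashby’s lever can be sketched as a machine parameterised by a switch that selects one of three pre-ordained ways of behaving. The three behaviours below are illustrative.

```python
# A machine with a three-position lever: the setting determines which
# of three pre-ordained ways of behaving will occur.
BEHAVIOURS = {
    1: lambda x: x + 1,   # position 1: count up
    2: lambda x: x - 1,   # position 2: count down
    3: lambda x: x,       # position 3: hold
}

def machine(state, lever):
    """Apply the behaviour selected by the current lever position."""
    return BEHAVIOURS[lever](state)

state = 10
state = machine(state, 1)   # reconfigure to position 1 → 11
state = machine(state, 2)   # reconfigure to position 2 → 10
state = machine(state, 3)   # reconfigure to position 3 → 10
print(state)  # → 10
```

Moving the lever changes which behaviour runs, but the set of possible behaviours is fixed in advance: reconfiguration does not change what the machine has the potential to do.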

 

Consider the re-configuration of a caterpillar, via a pupa, into a butterfly.

This change is pre-ordained in the DNA of the caterpillar.

 

System mutation: changing behaviour in a random or creative way.

A mutation can happen at random (as in biological evolution) or by intention (as in re-engineering a machine).

Ashby argued a machine cannot change itself in this way.

This “re-organisation” requires the intervention of a higher level process or machine.

It is creative in that it changes the very nature of the machine, from one generation to the next.

And it is the kind of change of most interest in social and business systems thinking.

 

For more on different kinds of system change, read the papers on that topic at the top of the System Theory page.

“Self-organisation” requires a wider or higher system

Ashby (and Maturana) rejected the idea of a “self-organising system” as undermining the concept of a system.

For a coherent discussion we need to distinguish:

·         self-organisation (as in changing the variables or rules of a system)

·         self-assembly (as in the pre-determined growth of a crystal in a liquid, or an embryo into an adult)

·         self-regulation (as in the maintenance of homeostasis during the life of an entity)

·         self-determination (as in self-aware actors choosing what to do, and whether to follow some rules or not).

 

In discussion of self-organisation, we ought to distinguish inter-generational evolution from chaotic ad hoc change.

For sure, the actors who play roles in a system may agree to change the variables or rules of that system.

Whenever actors discuss and agree changes, they act in a higher or meta system.

And once the change is made, the actors (still members of the same social network) now act in a new system or system generation.

 

You can find Ashby’s 1962 treatise on self-organisation here.

He pointed out that to say a system is self-organising can have two different meanings.

 

Self-organisation = changing from unorganised to organised

“The system that starts with its parts separate, and when those parts act they act so that they change towards forming connections of some type.” (1962)

For example, a rider and a bicycle are parts that become organised when coupled in a whole.

Properties of the wider system are said to “emerge” from the coupling of the parts.

 

Self-organisation = changing from a bad organisation to a good one

What “good” means is a question Ashby explored at length; more importantly here, he also said this.

“Before the question is answered we must notice… that no machine can be self-organising in this [second] sense.

If the system is to be self-organising, the self must be enlarged to include… some outside agent." (1962)

 

Ashby wrote that to appear self-organizing, a machine S must be coupled to another machine α.

And that the part S can only be ‘self-organizing’ within the whole S + α.

 

And as the preface to Ashby’s treatise says:

“One of Ashby’s goals was to repudiate that interpretation of self-organization, commonly held, that a machine or organism can change its own organization.”

Instead he postulated “A new higher level of the machine was activated to reset the lower level's internal connections or organisation.” (2019)

 

Here, higher means something different than the relationship from control system to target system.

It means reshaping the organisation of a system – changing the state variables or the rules for changing their values.

The higher level machine (α) plays the role of a meta system to the lower level machine (S).

Yes, higher and lower level machines may be seen as “coupled” in a wider or aggregate machine.

But the aggregate machine is only ever partially self organising – since the higher part (α) is needed to change the lower part (S).
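Ashby’s S + α arrangement can be sketched as a lower machine S that runs a transformation table, plus a higher machine α that resets S’s internal connections. S never changes its own organisation; only α does. The states and the new organisation chosen by α are illustrative.

```python
class S:
    """The lower machine: runs its transformation table, but has no
    operation for changing that table itself."""
    def __init__(self, table):
        self.table = table          # S's organisation
    def step(self, state):
        return self.table[state]    # state change under its own drive

def alpha(machine):
    """The higher machine (the 'outside agent'): resets the lower
    machine's internal connections - a change of organisation."""
    machine.table = {"a": "a", "b": "a"}

s = S({"a": "b", "b": "b"})
print(s.step("a"))    # old organisation: a → b
alpha(s)              # the whole S + α re-organises the part S
print(s.step("a"))    # new organisation: a → a
```

Only the whole S + α appears “self-organising”; the part S is re-organised from outside.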

 

Note that in a social system an actor can play two roles - one role "inside" the system and the other role as an "outside” agent.

In other words, one (physical) actor can play a (logical) role in both system and meta system.

But one action can take place only in one or other system.

 

Moreover, note that a lower system (S) might be the target of another higher level machine (β) which may conflict with or override the first (α).

Conflicts between required changes seem to me the very stuff of human social existence.

 

Some speak of “complex adaptive systems”, where the meaning of all three terms is debatable.

Read Complex adaptive systems for more on that.

Control systems need the requisite variety

Ashby’s ideas about regulatory systems include:

·         Variety: the number of possible states a system can exhibit.

·         Attenuator: a device that reduces variety.

·         Amplifier: a device that increases variety.

 

Ashby's law of requisite variety says a control system requires sufficient variety to control selected variables of a target system.

Read Ashby’s law of requisite variety for a detailed exploration of the idea.
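The law admits a rough arithmetic sketch: the variety of outcomes that survive regulation is bounded below by the disturbance variety divided by the regulator’s variety, so full control demands at least as many responses as disturbances. The counts below are illustrative, not from the source.

```python
import math

# A sketch of the law of requisite variety. With fewer responses than
# disturbances, some disturbances must share a response, so distinct
# outcomes survive regulation.
disturbances = 8    # variety of disturbances hitting the target system
responses = 4       # variety of the controller's responses

# Best case: each response neutralises a distinct block of disturbances,
# leaving at least this many distinct outcomes:
min_outcomes = math.ceil(disturbances / responses)
print(min_outcomes)  # → 2; full control (one outcome) needs >= 8 responses
```

In Ashby’s phrase, only variety in the regulator can destroy variety in the outcomes.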

Conclusions and remarks

Understanding Ashby’s ideas helps us understand much else in the field of systems thinking.

His ideas lead us to distinguish an abstract social system of interest from the concrete actors in a social network.

The same actors may realise not only the system of interest, but countless other, even conflicting, systems.

Moreover, the same actors may act in countless ways that are not usefully considered systematic.

And it would be a mistake to reduce management science – especially the motivation of people to do things – to system theory.

 

Ashby’s cybernetics is sometimes called first order cybernetics.

His observer, or experimenter, sits outside the system that is described.

 

Is second order cybernetics needed?

Second-order cybernetics was developed around 1970 by Margaret Mead, Heinz von Foerster and others.

It is the recursive application of cybernetics to itself.

It allows systems actors to be system thinkers, who re-organise the system they play a role in.

It is a reference point for many “systems thinkers” who discuss “social systems” that mutate over time.

 

However, second-order cybernetics undermines Ashby’s concept of a system, and any more general system theory.

Is this change of perspective necessary to study behaviour change?

 

It is not needed to explain changes in the values of given variables, or a trajectory or pattern in that change.

But what if behaviour change means system mutation - replacing one set of variables or rules with another?

Ashby rejected the idea that a system can reorganise itself in this way.

“If the system is to be self-organising, the self must be enlarged to include… some outside agent." (1962)

 

Ashby’s first order cybernetics accommodates self-organisation in that way.

It allows one (higher or meta) system to change the description of another (lower) system.

Read the claims of second order cybernetics for further discussion.

 

The value of applying scientific method

To apply system theory is to apply the scientific method.

You observe or envisage a system in the real world – a concrete or empirical system.

You model it in an abstract system description – a theoretical system you can use to predict and test real world behaviours.

This method is widely applied today in enterprise, business and software architecture.

 

Second order cyberneticists claim that the “objectivity” of classical cybernetics, and of science as a whole, is a false perspective; that it does not exist.

The critique of science by Marxists, post-modernists and some second order cyberneticians is much to be deprecated.

Ashby, like the hardest of hard scientists, saw all descriptions of reality as subjective.

The important thing is to test how far the model is supported by empirical evidence.

 

To apply Ashby’s system theory is to apply the scientific method.

You observe or envisage a system in the real world – an empirical system – an instance.

You model it in an abstract system description - a theoretical system – a type.

·         You model the state as variables whose values can be measured (e.g. the positions of the planets).

·         You model the behaviour as processes (e.g. the orbits of the planets) that maintain or advance those variable values.

You test how far the model is supported by empirical evidence.
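The steps above can be sketched in code. This is a minimal illustration of my own: the rate, the observation and the tolerance are invented for the sketch, not real astronomical data.

```python
# Sketch of the method: model the state as a variable (a planet's
# longitude), the behaviour as a process that advances it, then test the
# model's prediction against an observation.

def advance(position_deg, daily_motion_deg, days):
    """Theoretical system: longitude advances at a fixed daily rate."""
    return (position_deg + daily_motion_deg * days) % 360

# "Empirical" observation (invented): after 100 days, a body moving at
# ~0.986 deg/day from longitude 0 is seen near 98.6 degrees.
observed = 98.6
predicted = advance(0.0, 0.986, 100)

# The test: how far is the model supported by the evidence?
assert abs(predicted - observed) < 0.1
```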

Footnotes

 

On levels of thinking

This table identifies three levels of thinking about the realisation of a system.

 

| Abstract system descriptions (mental, spoken, documented, mechanical, whatever) | “Solar system” | Laws of tennis | The score of a symphony | The US constitution |

| Concrete system realisations (behaviour that matches the above well enough) | Orbits | A tennis match | A performance of that symphony | Government of the US |

| Concrete entities (which perform the behaviour above) | Planets in space | Tennis players on a tennis court | Orchestra members in a concert hall | Politicians and civil servants in offices |

 

It is important to realise that one concrete entity can realise many systems.

For example a manufacturing company may realise the several systems in the table below, at the same time.

 

|  | System 1 | System 2 | System 3 | System 4 |

| Describer / perspective | Shareholder | Government | Customer | Supplier |

| Abstract system description (mental, spoken, documented, mechanical, whatever) | Profit generator | Tax revenue generator | Manufacturer | Customer |

| Concrete system realisation (behaviour that matches the above well enough) | Selling goods | Paying employees | Making goods | Purchasing goods |

| Concrete entity (which performs the behaviour above) | The social network of a manufacturer’s employees (one entity realising all four systems) |

 

As Ashby said: our first impulse is to point at a concrete entity repeating a behaviour and to say "the system is that thing there".

However, an entity is rightly called a system only when and where it conforms to a system description.

 

In practice, it is common to regard the concrete entity as part of the system realisation, because the remainder of that entity, and whatever it does outside the system of interest, is out of scope.

However, as Ashby urged us to recognise, one social network can realise countless different (even conflicting) systems.

 

Notions of evolution and learning

Ashby’s work is dense, expressed in mathematical terms, and a tough read.

In section 4/23 Ashby discusses properties/variables in relation to Darwinian evolution.

However, his references to evolution and learning are slight, and tend to be speculative.

He discusses properties that diminish or flourish because the entities that possess them decrease or increase in number.

Ashby does not discuss the meta system (sexual reproduction) by which entities with properties can be changed or new properties generated.

Nor does he address a central concern of sociology: the mutation of a system by the intentional (not random) restructuring of its variables and rules.

 

 

All free-to-read materials at http://avancier.website are paid for out of income from Avancier’s training courses and methods licences.

If you find the web site helpful, please spread the word and link to avancier.website in whichever social media you use.