Ashby’s ideas

Copyright 2017 Graham Berrisford. One of about 300 papers at http://avancier.website. Last updated 14/10/2017 14:37

 

Understanding Ashby’s ideas helps you to understand much else in the field of systems thinking.

Unless otherwise stated, quotes below are from Ashby’s Introduction to Cybernetics (1956).

 

This paper serves as a preface to other papers, including: Introducing general system theory, Introducing systems thinkers, Systems thinking approaches, and Beer’s ideas.

Contents

A biosocial system
Abstraction of description from reality
The philosophy of system theory
System change
Cybernetics (control systems and target systems)
Variety (a candidate measure of complexity)
The law of requisite variety (“variety absorbs variety”)
Ashby’s law of requisite variety - revisited

 

A biosocial system

The universe is an ever-unfolding process that will end in chaos.

A system is a transient island of order carved out of the universe.

To hold off chaos, a system must draw energy from its environment; however, Ashby wrote:

“Cybernetics depends in no essential way on the laws of physics.”

“In this discussion, questions of energy play almost no part; the energy is simply taken for granted.” Ashby, 1956

 

There were describable systems in biology before humankind evolved.

W. Ross Ashby (1903-1972) was a psychologist and systems theorist.

“Despite being widely influential within cybernetics, systems theory… Ashby is not as well known as many of the notable scientists his work influenced.

W Ross Ashby was one of the original members of the Ratio Club, who met to discuss [system theory] issues from 1949 to 1958.” Wikipedia, 2017

 

Biology is abundant with examples of dynamic systems, both within one organism and in wider bio-social contexts.

Ashby reported this description of a bio-social system.

“The male and female three-spined stickleback form… a determinate dynamic system.

Tinbergen (in his Study of Instinct) describes the system's successive states as follows” Ashby, 1956

 

The narrative quoted by Ashby is easily converted into a process flow chart.

The male and female stickleback roles can be shown in parallel swim lanes (columns in the table below).

The actors playing those roles are in communication; each act of a male or female sends information, in a visual form, to its partner.

Reading the table left to right, top to bottom you see how each visual information flow triggers the partner to act in response.

 

Stickleback mating system

The female’s role is to | The male’s role is to
present a swollen abdomen and special movements | present a red colour and a zigzag dance
swim towards the male | turn around and swim rapidly to the nest
follow the male to the nest | point its head into the nest entrance
enter the nest | quiver in reaction to the female being in the nest
spawn fresh eggs in the nest | fertilise the eggs

 

The table above is an abstract entity that describes a concrete entity – a pair of sticklebacks acting in the real world.

The description is a generalised type – the typical process realised by many different pairs of sticklebacks.

This simple example has features that are important in general system theory.

 

Behaviour: a regular process, a sequence of event-triggered activities that advance the state of the system.

Structural role: a description listing the activities an actor is expected to perform.

Actor: a concrete individual that cooperates with others by playing definable roles in processes.

Interaction: a cooperation, usually enabled by communication, by the sending and receiving of information flows.

Passive structure: an object that is acted in or on (nest, eggs).
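
For readers who think in code, the stickleback process above is easily rendered as a determinate state machine. The Python sketch below is my own; the state names and transition table are illustrative paraphrases of the table, not Ashby’s or Tinbergen’s words.

    # A minimal sketch of the stickleback process as a determinate state machine.
    # The (actor, act) names are illustrative, not from Ashby or Tinbergen.
    TRANSITIONS = {
        ("female", "present swollen abdomen"): ("male", "zigzag dance"),
        ("male", "zigzag dance"): ("female", "swim towards male"),
        ("female", "swim towards male"): ("male", "swim to nest"),
        ("male", "swim to nest"): ("female", "follow male to nest"),
        ("female", "follow male to nest"): ("male", "point head into nest"),
        ("male", "point head into nest"): ("female", "enter nest"),
        ("female", "enter nest"): ("male", "quiver"),
        ("male", "quiver"): ("female", "spawn eggs in nest"),
        ("female", "spawn eggs in nest"): ("male", "fertilise eggs"),
    }

    def run(state=("female", "present swollen abdomen")):
        # Each act is a visual signal that triggers the partner's next act.
        while state in TRANSITIONS:
            print("%s: %s" % state)
            state = TRANSITIONS[state]
        print("%s: %s" % state)  # the final act ends the process

    run()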

Abstraction of description from reality

To understand system theory is to make a paradigm shift as radical as is needed to understand Charles Darwin’s evolution theory.

(I say that because most people find it difficult to understand.)

The paradigm shift was expressed in 1956 by Ross Ashby.

 

“At this point we must be clear about how a "system" is to be defined.

Our first impulse is to point at [a concrete entity repeating a behavior] and to say "the system is that thing there".

This method, however, has a fundamental disadvantage: every material object contains no less than an infinity of variables and therefore of possible systems.

Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” Ashby 1956

 

In short: every system description abstracts facts important to the describer.

The describer must abstract from the infinite describable facts that could be found in the reality of the system that actually runs.

 

"Cybernetics does not ask "what is this thing?" but ''what does it do?" It is thus essentially functional and behavioristic.”

“[It] deals with all forms of behavior in so far as they are regular, or determinate, or reproducible.” Ashby 1956

 

In other words, our concern is activity systems rather than passive structures.

This table identifies three levels of thinking about the realisation of an activity system.

 

 

Concept | System 1 | System 2
A: Abstract system description (mental, spoken, documented, mechanical, whatever) | A symphony score | The design-time code of a computer program
C1: Concrete system realisation (a behavior that matches the above well enough) | A performance of the above | A run-time execution of the above
C2: Concrete entity (which performs the above) | The orchestra members in a concert hall | A computer in a data centre
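
A software analogy of my own may make the three levels concrete: a program text is an abstract description (A), each execution is a concrete realisation (C1), and the machine that runs it is the concrete entity (C2). A minimal Python sketch:

    # The function definition is an abstract system description (A).
    # Each call is a distinct concrete realisation (C1).
    # The machine executing the script is the concrete entity (C2).
    def greet(name):
        return "Hello, " + name

    print(greet("orchestra"))  # one realisation
    print(greet("audience"))   # another realisation of the same description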

 

It is very important to realise that one entity can realise many systems.

For example, you may realise the three systems in the table below, at least two of them at the same time!

 

 

Concept | System 3 | System 4 | System 5
A: Abstract system description | Convert oxygen into carbon dioxide | Reproduce genes in next generation | Model a system
C1: Concrete system realisation | Person breathing | Person making love | Drawing ArchiMate diagrams
C2: Concrete entity | A person – you for example

 

As Ashby said: our first impulse is to point at a concrete entity repeating a behavior and to say "the system is that thing there".

However, an entity is rightly called a system only when and where it demonstrably conforms to a specific system description or model.

 

In practice, it is normal to regard the concrete entity (C2) as part of the system realisation (C1), because the remainder of that entity, and whatever it does outside the system of interest, is out of scope.

So, the next table wraps up C1 and C2 into C, and adds the describer (D) into the picture.

 

D: Describer | Astronomers | LTA | Composer | Playwright
A: Abstract system description | “Solar system” | Laws of tennis | The score of a symphony | The roles in a radio play
C: Concrete system realisation | Planets in orbits | A tennis match | A performance of that symphony | Actors playing those roles

 

The premise is that D, A and C are all material in form, but A is a passive structure rather than an activity system.

The philosophy of system theory

These papers represent describers, descriptions and realities in a triangular relationship.

Description theory

Descriptions

<form>                 <idealise>

Describers <observe and envisage> Realities

 

Abstraction of processes to describe the behaviors of a system

Ashby emphasised that the real world systems of interest (to general system theory) are ones that experience or perform processes.

System theory

Processes

<define>                        <idealise>

Systems thinkers <observe and envisage> Regular behaviors

 

No process (no symphony or play) repeats exactly in the material world; any process we describe in a system is an abstraction from reality.

Even a computer program is different each time it runs at the level of matter and energy, memory spaces and electrical phenomena.

 

Abstraction of variables to describe the state of a system

Ashby emphasised that real world entities are systems only to the extent that selected variables are describable and observable.

System theory

Variable types

<define>                          <idealise>

Systems thinkers <observe and envisage> Variable values

 

In short

A system is a view of a reality, or a role you can observe or envisage real world actors playing.

To apply system theory is to form an abstract system description that hides the infinite complexity of real-world entities you observe or envisage.

You describe the state of a real-world system in terms of variables whose values can be measured (e.g. the positions of the planets).

You model whatever regular processes can change the values of those variables (e.g. the orbits of the planets).

You describe a process in a way that enables real world behaviors to be tested as matching your description.

System change

“The word "change" if applied to [a real world entity repeating a behavior] can refer to two very different things.

·         change from state to state, which is the machine's behaviour, and which occurs under its own internal drive, and

·         change from transformation to transformation, which is a change of its way of behaving, and occurs at the whim of the experimenter or some other outside factor.

The distinction is fundamental and must on no account be slighted.”

 

In other words, as the sketch after this list illustrates, one should on no account confuse:

·         System state change: a change to the state of a system, which changes the value of at least one variable.

·         System generation change: a change to the nature of a system, which changes the type of at least one variable or behavior.
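
The difference can be shown in a few lines of code. In this minimal Python sketch (my own, with illustrative names), the first assignment is a state change made by the machine’s own transformation; the second replaces the transformation itself, something only an outside agent can do.

    # One machine, two "ways of behaving" (transformations). Names illustrative.
    toggle = {"on": "off", "off": "on"}  # the machine's given transformation
    stick = {"on": "on", "off": "off"}   # an alternative way of behaving

    state = "on"
    state = toggle[state]          # system state change: on -> off, the machine's own behaviour
    transformation = stick         # system generation change: an outsider swaps the transformation
    state = transformation[state]  # the machine now behaves in a new way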

 

For more on different kinds of system change, read Stability and change in entities and systems.

Cybernetics (control systems and target systems)

“Ashby popularised the usage of the term 'cybernetics' to refer to self-regulating systems.

The book dealt primarily with homeostatic processes within living organisms, rather than in an engineering or electronic context.” Wikipedia 2017

 

Ashby furthered the ideas of general system theory through discussion of control systems.

Just as we abstract a system description from a reality, so a mechanical control system models some variable(s) detectable in a target system.

Control system | Model of target system variables | Thermostat
Target system | Target system variables and behavior | Heating system on/off
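
The thermostat’s control logic can be written in a few lines. This Python sketch is illustrative (the set point and readings are made up); the point is that the control system models exactly one target system variable and drives exactly one behavior.

    # A thermostat sketch: the control system models one target system variable
    # (temperature) and controls one behavior (heating on/off). Values illustrative.
    SET_POINT = 20.0  # degrees Celsius

    def directive(measured_temperature):
        # The whole model of the target system: colder or hotter than the set point.
        return "on" if measured_temperature < SET_POINT else "off"

    print(directive(18.5))  # -> "on"  (stands in for a real sensor reading)
    print(directive(21.0))  # -> "off"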

 

In “Design for a Brain” (1952), Ashby viewed the human body as a self-regulating system.

The graphic below separates a control system (the brain) from its target system (the remainder of the body).

Control system

Target system

“Design for a brain”

<wrote>                  <realised in>

Ashby                 <envisaged>               Brains

 

 

<monitor> Body state variables

<control>  Muscles and organs

 

Information flows are central to cybernetics.

Ashby saw the brain as a regulator that maintains a body’s state variables in the ranges suited to life.

He presented the brain-body relationship as an information feedback loop.

A brain holds (or has access to) an abstract model of the body’s current state.

The brain receives information from sensors, and sends instructions to motors and organs.

The aim is homeostasis – to maintain the state of the body - and so help it achieve other desired outcomes.

 

Ashby eschewed discussion of consciousness; his basic idea might be distilled into one sentence thus.

A brain is a collection of brain cells that interact to maintain body state variables by sending/receiving information to/from bodily sensors and motors.

 

Ashby picked out variables relevant to his interest in the brain as a control system, putting aside other things you might consider important to being human.

His Design for a Brain book holds an abstract description of a brain, which in turn holds an abstract description of a body’s physical variables.

Much as an engineer’s specification holds an abstract description of a control system, which in turn holds an abstract description of the physical variables it controls in a target system.

 

Brains and businesses

Cybernetics has influenced systems thinking in general, and business system thinking in particular.

Businesses can be seen as successfully applying the principles of general system theory and cybernetics.

A brain maintains mental models of things (food, friends, enemies, etc.) it perceives to be out there.

A business (in its information systems) maintains documented models of actors and activities it monitors and directs through information feedback loops.

 

A business system is connected to its wider environment by feedback loops.

It receives information about the state of entities and activities in its environment, and records that information in memory.

The state information it receives and stores must model reality well enough, else the system will fail.

It outputs information to inform and direct entities and activities as need be.

 

This paper goes on to introduce two more of Ashby’s ideas, ones that were later used by Stafford Beer.

Variety (a candidate measure of complexity)

Ashby said that “a system is any set of variables which he [the observer] selects”.

“A system's variety V measures the number of possible states it can exhibit, and corresponds to the number of independent binary variables.

But in general, the variables used to describe a system are neither binary nor independent.”

 

In short, complexity = variety = the number of possible states a system can exhibit.
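
For a state described by independent variables, variety is the product of the number of values each variable can take; n independent binary variables give 2^n possible states, so the logarithm of variety counts equivalent binary variables. A Python sketch with illustrative variables:

    import math

    # Variety of a state described by independent variables is the product of
    # each variable's number of possible values. The variables are illustrative.
    variables = {"heating": 2, "window": 2, "occupancy": 3}

    variety = math.prod(variables.values())  # 2 * 2 * 3 = 12 possible states
    bits = math.log2(variety)                # about 3.6 equivalent binary variables
    print(variety, round(bits, 1))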

There are several difficulties with this definition of complexity.

 

First, the measure is incalculably large for any system with a non-trivial state.

 

Second, many other measures have been proposed: e.g. McCabe’s cyclomatic complexity.

Here, we might propose: complexity = the number of event/state combinations * the average procedural complexity of the rules applied to them.
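
A worked example of that proposed measure, with made-up numbers:

    # Made-up numbers for the proposed measure.
    event_state_combinations = 4 * 5  # 4 event types handled in each of 5 states
    average_rule_complexity = 3       # e.g. an average McCabe measure per rule
    print(event_state_combinations * average_rule_complexity)  # 60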

There is no agreement as to which complexity measure is right or best.

 

Third, does the measure apply to the control system, the target system, or the real world entity of which only selected variables are controlled?

“every real machine embodies no less than an infinite number of variables, all but a few of which must of necessity be ignored.” (Ashby in “Design for a Brain”)

Consider the complexity of a tennis match in real world.

Consider the thought processes of its players, and the molecular structures and movements in the players, court surface and tennis balls.

Obviously, you can never measure the complexity of a real world entity or behavior per se.

You can only measure it with respect to your chosen description of its elements (roles, actors, processes, variables, whatever).

And then, only measure it at the level of abstraction at which you choose to describe those elements and their inter-relationships.

 

From the viewpoint of a describer, a system is only as complex as its description.

From the viewpoint of a control system, a target system is only as complex as those variables the control system monitors and controls.

Read “Complexity” for more on that topic.

The law of requisite variety (“variety absorbs variety”)

“Ashby formulated his Law of Requisite Variety stating that "variety absorbs variety, defines the minimum number of states necessary for a controller to control a system of a given number of states."

This law can be applied for example to the number of bits necessary in a digital computer to produce a required description or model.

In response, Conant (1970) produced his so-called "Good Regulator theorem" stating that "every Good Regulator of a System Must be a Model of that System".” Wikipedia, 2017

 

The Good Regulator theorem was proved by biological evolution long before Conant articulated it.

Animal brains maintain mental models of things (food, friends, enemies etc.) they care about in their environment.

These mental models must be accurate enough to enable the animals to monitor and manipulate those things. (Read Knowledge and Truth for more on this theme).

 

Ashby's law applies to how a control system controls selected or essential variables of a target system.

"The larger the variety of actions available to a control system, the larger the variety of perturbations [in values of target system variables] it is able to compensate"..

 

The law defines the minimum number of states necessary for a control system to control a target system with a given number of states.

It is interpreted here as meaning:

·         A control system’s information state models only those variables in the target system’s concrete state that are monitored and controlled.

·         For a homeostatic system to be stable, the number of states of the control system must be at least equal to the number of states in the target system.

·         The more ways that a homeostatic system can deviate from its ideal state, the more control actions a control system will need.

 

The law says that if a controller does not have enough variety to control the variables that define the target system, then it cannot control them.

It does not say that having enough variety is sufficient to ensure the controller can control those variables.
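
Both points can be shown in a toy Python sketch of my own, in which a response compensates a disturbance only when the two match exactly:

    # Toy model: a response r compensates a disturbance d only when r == d.
    disturbances = {1, 2, 3, 4, 5, 6}  # six ways the target can be perturbed

    def uncompensated(responses):
        return sorted(disturbances - responses)

    print(uncompensated({1, 2, 3, 4, 5, 6}))     # requisite variety: [] slip through
    print(uncompensated({1, 2, 3}))              # too little variety: [4, 5, 6] slip through
    print(uncompensated({7, 8, 9, 10, 11, 12}))  # enough variety, wrong model: all slip through

The third case echoes the Good Regulator theorem: variety alone is not enough; the responses must also model the disturbances.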

 

Ashby’s law does not mean (as some seem to think) that the control system must be as complex as the concrete reality of the entity controlled.

A control system is usually much simpler than any real world entity it controls.

For example, a thermostat models only as much variety (colder or hotter than a given temperature) as the behavior (heating system on or off) it controls.

To the control system, the target system is no more complex than the variables it controls.

Control system

Target system

Control system designs

<create and use>                  <realised by>

Engineers      <observe and envisage> Control systems

 

 

<monitor and control>   Heating systems

 

For example, a score board models only a few essential variables of a tennis match.

To umpires, what matters are the rules that determine the score, not the length or complexity of the rallies.

Control system

Target system

Laws of tennis

<create and use>             <realised by>

LTA      <observe and envisage> Umpires

 

 

<monitor and control> Real tennis matches

 

A corollary to Ashby’s law of requisite variety

 “Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.”

 

The implication is that the system describer must be experienced, expert and trained enough to “pick out and study the facts that are relevant”.

System architects must know what is architecturally significant to their and stakeholders’ interests in the target system.

 

Maximising internal variety

"Since the variety of perturbations a [control] system can potentially be confronted with is unlimited, we should always try maximize its internal variety (or diversity),

so as to be optimally prepared for any foreseeable or unforeseeable contingency." Principia Cybernetica.

This might be interpreted by a business today as filling up a “data warehouse” or collecting lots of “big data”.

There is considerable experience of this leading to data quality issues, as discussed in the paper Beer’s ideas.

Ashby’s law of requisite variety - revisited

It turns out that, in nature and in business, control systems may be distributed, rather than centralised as Ashby and Beer presumed.

I gather the human body does not maintain homeostasis purely by top-down command and control from the higher brain.

Instead, its many variables are maintained by several control systems, operating in parallel, which are distributed and not in direct communication with each other.

An oyster manages to maintain homeostasis without any central brain or nervous system.

In large businesses, there is rarely if ever an overarching control body that monitors and directs all business processes.

There are in practice several (hierarchical and/or parallel) bodies that may compete for resources, and even have conflicting goals.

 

To different controllers, a real world entity is at once several target systems, a different one to each controller.

But suppose two control systems either compete or complement each other in their efforts to control the same target system state?

This feels to me like the very stuff of relationships between people in a social system!

 

Ashby’s law might be interpreted for a business thus:

·         A business system’s information state models only those variables of actors and activities that the business monitors and directs.

·         For directions to be effective, the information state must model those variables accurately enough.

·         The more variegated the actors and activities to be monitored and directed, the more complex a business system must be.

 

To direct an actor or activity in its environment, a brain or a business must be able to:

·         maintain or obtain a model of the state of that actor or activity (the model must be accurate enough).

·         gather inputs: detect events that reveal a significant state change (in an acceptably reliable and timely fashion).

·         produce outputs: respond to events or state changes by sending directives to actors to perform activities (in an acceptably reliable and timely fashion).

·         adapt if the actor does not respond to a directive as expected, in an acceptably reliable and timely fashion.

 

To generalise, a control system (as sketched below):

·         must know just enough about the state of the target system it monitors and directs

·         must detect events that reveal significant state changes in the target system - in an acceptably reliable and timely fashion.

·         must respond to those events by sending appropriate directives to the target system - in an acceptably reliable and timely fashion.
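
A minimal control-loop skeleton in Python makes the list concrete; every function here is a hypothetical stub named for illustration only.

    # A minimal control-loop skeleton. All functions are hypothetical stubs.
    def read_target_state():        # know just enough about the target state
        return {"temperature": 18.5}

    def is_significant(state):      # detect events that reveal significant change
        return state["temperature"] < 20.0

    def choose_directive(state):    # respond with an appropriate directive
        return "heating on"

    def control_loop(cycles=1):
        for _ in range(cycles):
            state = read_target_state()
            if is_significant(state):
                print(choose_directive(state))  # stands in for sending to the target

    control_loop()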

 

A control system can expect a target system to respond appropriately provided that:

·         monitor and control signals cannot go missing or be corrupted

·         time/speed, capacity/throughput, availability and any other non-functional requirements are met

·         the target system has no other (competing/interfering) control system

·         the target system is not capable of self-determination, of choosing an inappropriate response to a control message.

 

Read Introducing General System Theory for other system theory terms and concepts in the context of early sources.

 

All free-to-read materials at http://avancier.website are paid for out of income from Avancier’s training courses and methods licences.

If you find the web site helpful, please spread the word and link to avancier.website in whichever social media you use.