Ashby’s ideas

Copyright 2017 Graham Berrisford. One of several hundred papers at http://avancier.website. Last updated 19/09/2018 21:05

 

W. Ross Ashby (1903-1972) was a psychologist and systems theorist.

“Despite being widely influential within cybernetics, systems theory… Ashby is not as well known as many of the notable scientists his work influenced.” (Wikipedia)

Understanding Ashby’s ideas helps you to understand much else in the field of systems thinking.

Unless otherwise stated, quotes below are from “Introduction to Cybernetics” (1956) by W. Ross Ashby.

Contents

Ashby

Systems are abstractions

Systems are deterministic

Cybernetics is behavioristic

System generation change differs from system state change

Cybernetics is about information rather than energy flow

The brain can be modelled as a control system

Variety is a measure of complexity

Variety absorbs variety - the law of requisite variety

Ashby’s law reviewed and revisited

Ashby’s law elaborated

Footnote: Three levels of thinking

 

Ashby


Ashby’s ideas include:

 

·         Systems are abstractions

·         Systems are deterministic

·         Cybernetics is behavioristic

·         System generation change differs from system state change

·         Cybernetics is about information rather than energy flow

·         The brain can be modelled as a control system

·         Variety is a measure of complexity

·         Variety absorbs variety - the law of requisite variety

 


There follow some notes on the ideas above.

Systems are abstractions

“A system is any set of variables which he [the observer] selects”.  Ashby 1956

For some, understanding cybernetics requires making a paradigm shift as radical as is needed to understand Darwin’s evolution theory.

People point at a machine or a business in the real world (like IBM) and say "the system is that thing there".

But IBM can realise (manifest, instantiate) countless different systems.

 

Ashby stressed the importance of distinguishing a system from a machine (natural or designed) that realizes it.

“At this point we must be clear about how a "system" is to be defined.

Our first impulse is to point at [a machine] and to say "the system is that thing there".

This method, however, has a fundamental disadvantage: every [machine] has no less than an infinity of variables and therefore of possible systems.

Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” Ashby 1956

 

These papers represent a philosophy of description and reality as a triangular relationship.

 

Description theory

Descriptions

<form>                 <idealise>

Describers <observe and envisage> Realities

 

To apply Ashby’s system theory is to combine this philosophy with the scientific method.

You observe or envisage an entity in the real world which behaves as an empirical system – an instance.

You model it in an abstract system description - a theoretical system – a type.

The abstract system hides the infinite complexity of the real-world actors and activities that instantiate (or realise) the model.
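The type/instance distinction above can be sketched in code. This is a minimal illustration only; the class and the variables it selects are invented for this example, not taken from Ashby.

```python
from dataclasses import dataclass

# Abstract system description: a type naming only the variables the
# describer has selected. (Illustrative example, not from Ashby.)
@dataclass
class TennisMatch:
    server: str
    points: tuple  # (server_points, receiver_points)

# Concrete system realisation: an instance that conforms to the type.
# The type says nothing about the players' heights or the weather;
# the infinite detail of the real entity is hidden by the abstraction.
match = TennisMatch(server="A", points=(0, 0))
```

The design point is that the type is the theoretical system; any number of real-world matches could instantiate it.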

 

Ashby’s view

Abstract system descriptions

<create and use>                   <idealise>

Systems thinkers <observe & envisage> Concrete system realisations

 

If and when the concrete system runs in reality, you can test its behavior against what the abstract system predicts.

 

Abstract system description | A theoretical system | A system type, perceived or conceived
Concrete system realisation | An empirical system | A system instance, in action

 

This table contains some examples.

 

Abstract system description | “Solar system” | Laws of tennis | The score of a symphony | The US constitution
Concrete system realisation | Planets in orbits | A tennis match | A performance of that symphony | US governments

 

The US constitution defines the roles and rules of the essential actors in the US federal government system.

The roles include the Congress (the legislative branch), the President, the court system (the judicial branch) and the States.

The constitution also defines relations between actors playing those roles.

It does not define the roles or rules of subordinate institutions created by federal governments.

It does, however, define the meta system to be used (by Congress or Constitutional Convention) to amend the constitution (change the system) itself.

Systems are deterministic

Ashby wrote that the notion of a deterministic system was already more than a century old.

“Cybernetics deals with all forms of behaviour in so far as they are regular, or determinate, or reproducible.” Ashby 1956

In simple terms, a system is orderly rather than disorderly.

 

Deterministic: the quality of a system that means its next state is predictable from its current state and input event.

A deterministic system, in a given state, will respond to a specific stimulus or event by acting in a predictable way.

Paradoxically, as a result of processing a series of events, a system may change in an unpredictable, chaotic or non-linear fashion.

Read Determinism and hysteresis for more.
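A deterministic system in this sense can be sketched as a transition table; the states and events below are invented for illustration.

```python
# Next state is fully determined by (current state, input event).
TRANSITIONS = {
    ("idle", "start"): "running",
    ("running", "pause"): "paused",
    ("paused", "start"): "running",
    ("running", "stop"): "idle",
}

def step(state, event):
    return TRANSITIONS[(state, event)]

# The same state and event always yield the same next state, yet a long
# series of events can still trace a trajectory that is hard to foresee.
state = "idle"
for event in ["start", "pause", "start", "stop"]:
    state = step(state, event)
```

Each step is predictable, which is all the definition requires; predictability of one step does not make the whole trajectory obvious in advance.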

 

A biosocial system example

Ashby offered the following example of an orderly deterministic system.

“The male and female three-spined stickleback form… a determinate dynamic system.

Tinbergen (in his Study of Instinct) describes the system's successive states as follows” Ashby, 1956

The stickleback roles are shown in the columns of the table below (they could be shown as swim lanes in a process flow diagram).

The actors play those roles by communicating, by sending visual signals (information flows) to each other.

 

Stickleback mating system

The female’s role is to | The male’s role is to
present a swollen abdomen and special movements | present a red colour and a zigzag dance
swim towards the male | turn around and swim rapidly to the nest
follow the male to the nest | point its head into the nest entrance
enter the nest | quiver in reaction to the female being in the nest
spawn fresh eggs in the nest | fertilise the eggs

 

The table above is an abstract system, a description of roles that typify actors.

Reading left to right, top to bottom, each visual signal (information flow) triggers the partner to act in response.

A concrete system would be any pair of sticklebacks that realises the two abstract roles.
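The courtship can be sketched as a determinate sequence in which the roles strictly alternate; the step names below paraphrase Tinbergen's description quoted above, and the code is an illustration rather than a biological model.

```python
# Each visual signal by one role triggers the partner's next act.
PROTOCOL = [
    ("female", "present swollen abdomen and special movements"),
    ("male", "present red colour and zigzag dance"),
    ("female", "swim towards the male"),
    ("male", "turn around and swim rapidly to the nest"),
    ("female", "follow the male to the nest"),
    ("male", "point head into the nest entrance"),
    ("female", "enter the nest"),
    ("male", "quiver in reaction to the female being in the nest"),
    ("female", "spawn fresh eggs in the nest"),
    ("male", "fertilise the eggs"),
]

def run(protocol):
    # Any pair of actors that performs this sequence realises the
    # abstract system; the check enforces the alternation of roles.
    for (role, _), (next_role, _) in zip(protocol, protocol[1:]):
        assert role != next_role, "signals must alternate between roles"
    return [f"{role}: {act}" for role, act in protocol]
```

Here the abstract system is the PROTOCOL (a description of roles); a concrete system is any pair of sticklebacks whose behavior matches a run of it.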

 

Several other system theory concepts appear in this example.

 

System concept | Meaning
Interaction | Communication by the sending and receiving of information flows (here, visual signals)
Behavior | Activities that advance the state of the system
Logical structure (role) | A list of activities an actor is expected to perform when playing a role
Physical structure (actor) | A concrete individual that cooperates with others by playing definable roles
Passive structure | An object that is acted in or on (here, nest and eggs)

 

Cybernetics is behavioristic

Ashby emphasised that the real world systems of interest (to general system theory) are ones that experience or perform processes.

“Cybernetics does not ask ‘what is this thing?’ but ‘what does it do?’ It is thus essentially functional and behavioristic.”

“[It] deals with all forms of behavior in so far as they are regular, or determinate, or reproducible.” Ashby 1956

 

Ashby was interested in how selected behaviours change selected variables.

To model a real world entity as a system is to model its state and its behavior.

You model the state as variables whose values can be measured (e.g. the positions of the planets).

 

Abstraction of system structures

Variable types

<define>                         <idealise>

Systems thinkers   <observe and envisage>   Variable values

 

You model the behavior as processes (e.g. the orbits of the planets) that maintain or advance variable values.

 

Abstraction of system behaviors

Process types

<define>                          <idealise>

Systems thinkers <observe and envisage> Regular behaviors

 

You describe a process in a way that enables real world behaviors to be tested as matching your description.

No process repeats exactly in the material world; any process we describe in a system is an abstraction from reality.

Even a computer program is different each time it runs at the level of matter and energy, memory spaces and electrical phenomena.

System generation change differs from system state change

Ashby insisted we distinguish two kinds of system change.

“The word "change" if applied to [a concrete entity repeating a behavior] can refer to two very different things.

·         change from state to state, which is the machine's behaviour, and which occurs under its own internal drive, and

·         change from transformation to transformation, which is a change of its way of behaving, and occurs at the whim of the experimenter or some other outside factor.

The distinction is fundamental and must on no account be slighted.”

 

In other words, one should on no account confuse:

·         State system change: a change to the state of a system, which changes the value of at least one variable.

·         System generation change: a change to the nature of a system, which changes the type of at least one variable or behavior.

 

The two kinds of change may be given different names here.

·         Regulation: regulating the values of defined state variables, usually to stay within a desired range.

·         Re-organization: changing the state variables themselves, or the rules that update variable values.
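The distinction can be sketched in code; the class, variable and rules below are illustrative, not Ashby's. Regulation changes the values of variables; re-organization changes the rules that update them.

```python
class System:
    """A toy system with one state variable and one transformation rule."""
    def __init__(self):
        self.temperature = 20          # a state variable
        self.rule = lambda t: t + 1    # the current way of behaving

    def regulate(self):
        # State change: a variable's value moves under the system's own rule.
        self.temperature = self.rule(self.temperature)

    def reorganize(self, new_rule):
        # System change: the transformation itself is replaced,
        # "at the whim of the experimenter or some other outside factor".
        self.rule = new_rule

s = System()
s.regulate()                   # state change: 20 -> 21
s.reorganize(lambda t: t - 1)  # system change: a new way of behaving
s.regulate()                   # state change under the new rule: 21 -> 20
```

Note that `regulate` occurs "under the system's own internal drive", while `reorganize` is invoked from outside the system, which is exactly the distinction Ashby insists on.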

 

For more on different kinds of system change, read System stability and change.

Cybernetics is about information rather than energy flow

Ashby wrote:

“Cybernetics depends in no essential way on the laws of physics.”

“In this discussion, questions of energy play almost no part; the energy is simply taken for granted.” Ashby, 1956

 

The universe is an ever-unfolding process that will end in chaos.

A system is a transient island of order carved out of the universe.

To hold off chaos, a system must draw energy from its environment.

But cybernetics is focused on flows of information rather than energy, especially information flows between control systems and target systems.

The brain can be modelled as a control system

In “Design for a Brain” (1952), Ashby presented the brain as a control system.

“The book dealt primarily with homeostatic processes within living organisms, rather than in an engineering or electronic context.” Wikipedia 2017

 

In cybernetic regulation, a control system directs a target system to maintain its state variables in a desired range.

A thermostat (control system) directs the actions of a heating system (target system).

In Ashby’s view, the brain (control system) directs the actions of a body (target system).

He saw the brain as a regulator that maintains a body’s state variables in the ranges suited to life.

The aim is homeostasis: to maintain the state of the body, and so help it achieve other desired outcomes.

 

He selected variables relevant to his interest in the brain as a control system, setting aside other things you might consider important to being human, such as consciousness.

The basic idea might be distilled into one sentence thus.

 

Generic system description | Ashby’s design for a brain
A collection of active structures that interact in regular behaviors that maintain system state and/or consume/deliver inputs/outputs from/to the wider environment. | A collection of brain cells that interact in processes to maintain body state variables by receiving/sending information from/to bodily sensors/motors.

 

Ashby’s book holds an abstract description of a brain, which in turn holds an abstract description of a body’s physical variables.

Much as an engineer’s specification holds an abstract description of a control system, which in turn holds an abstract description of the physical variables it controls in a target system.

 


The graphic below separates a control system (the brain) from its target system (the remainder of the body).

 

Control system

Target system

“Design for a brain”

<wrote>                  <realised in>

Ashby                 <envisaged>               Brains

 

 

<monitor> Body state variables

<control>  Muscles and organs

 

Information flows are central to cybernetics.

 

Brains and businesses

Cybernetics has influenced systems thinking in general, and business system thinking in particular.

Businesses can be seen as successfully applying the principles of general system theory and cybernetics.

A brain maintains mental models of things (food, friends, enemies, etc.) it perceives to be out there.

A business (in its information systems) maintains documented models of actors and activities it monitors and directs through information feedback loops.

 

A business system is connected to its wider environment by feedback loops.

It receives information about the state of entities and activities in its environment, and records that information in memory.

The state information it receives and stores must model reality well enough, else the system will fail.

It outputs information to inform and direct entities and activities as need be.

Variety is a measure of complexity

Ashby wrote:

“a system is any set of variables which he [the observer] selects”. 

“A system's variety V measures the number of possible states it can exhibit, and corresponds to the number of independent binary variables.

But in general, the variables used to describe a system are neither binary nor independent.”

 

In short, complexity = variety = the number of possible states a system can exhibit.
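For independent variables, variety can be computed as the product of each variable's number of possible values; the figures below are illustrative.

```python
import math

def variety(cardinalities):
    # Number of possible states of a system whose independent variables
    # have the given numbers of possible values; for n binary variables
    # this reduces to 2**n.
    return math.prod(cardinalities)

variety([2, 2, 2])  # three independent binary variables: 8 states
variety([10, 6])    # a 10-valued and a 6-valued variable: 60 states
```

The product grows exponentially with the number of variables, which is why the measure quickly becomes incalculably large for any non-trivial state.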

There are many difficulties with this definition of complexity.

 

First, the measure is incalculably large for any system with a non-trivial state.

 

Second, there is no widely agreed measure of complexity.

There is complexity in system state/structure - as in Ashby’s measure of “variety”.

Also complexity in behavior - as in McCabe’s measure of cyclomatic complexity.

And complexity in the trajectory of system state change – an interest in System Dynamics.

(Here, I propose: complexity = the number of event/state combinations * the average procedural complexity of the rules applied to them.)

 

Third, what is measured - the control system, the target system, or the real world entity?

Consider the complexity of a tennis match in the real world.

Consider the thought processes of its players, and the molecular structures and movements in the players, court surface and tennis balls.

Obviously, you can never measure the complexity of a real-world entity or behavior per se.

You can only measure it with respect to your chosen description of its elements (roles, actors, processes, variables, whatever).

And then, only measure it at the level of abstraction at which you choose to describe those elements and their inter-relationships.

 

From the viewpoint of a describer, a system is only as complex as its description.

From the viewpoint of a control system, a target system is only as complex as those variables the control system monitors and controls.

Read “Complexity” for more on that topic.

Variety absorbs variety - the law of requisite variety

Ashby’s ideas about regulatory systems include:

·         Variety: the number of possible states a system can exhibit

·         Attenuator: a device that reduces variety.

·         Amplifier: a device that increases variety.

 

Ashby's law of requisite variety applies to how a control system controls selected or essential variables of a target system.

"The larger the variety of actions available to a control system, the larger the variety of perturbations [in values of target system variables] it is able to compensate".

 

The law defines the minimum number of states necessary for a control system to control a target system with a given number of states.

It is interpreted here as meaning:

·         A control system’s information state models only those variables in the target system’s concrete state that are monitored and controlled.

·         For a homeostatic system to be stable, the number of states of the control system must be at least equal to the number of states in the target system.

·         The more ways that a homeostatic system can deviate from its ideal state, the more control actions a control system will need.
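The arithmetic of the law can be sketched thus, with illustrative numbers: if each control action can compensate one class of disturbance, the residual variety in outcomes cannot fall below the ratio of disturbance variety to action variety.

```python
def residual_outcome_variety(n_disturbances, n_actions):
    # With the best possible pairing of actions to disturbances, the
    # variety of outcomes is at least n_disturbances / n_actions
    # (ceiling division): only variety in the regulator can absorb
    # variety in the disturbances.
    return -(-n_disturbances // n_actions)

residual_outcome_variety(8, 8)  # 1: full regulation is possible
residual_outcome_variety(8, 2)  # 4: outcome variety of at least 4 remains
```

So to hold an essential variable to a single value against 8 distinct disturbances, the regulator needs at least 8 distinct responses.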

 

In response, Conant and Ashby (1970) produced the so-called "Good Regulator theorem", which states that

"every good regulator of a system must be a model of that system".

 

Observation:

Biological evolution demonstrated the Good Regulator theorem long before Conant articulated it.

Animal brains maintain mental models of things (food, friends, enemies etc.) they care about in their environment.

These mental models must be accurate enough to enable the animals to monitor and manipulate those things.

Read Knowledge and Truth for more on the fuzziness of truth.

 

Ashby’s law reviewed and revisited

The law says having enough variety is a necessary precondition to control selected variables in a target system.

 

The law does not say variety is all you need

The law does not say having enough variety is a sufficient precondition to control selected variables.

 

The law does not mean maximising variety

The law does not say having more than enough variety is a good idea.

Ashby emphasised the need to be selective.

“we should pick out and study the facts that are relevant to some main interest that is already given.”

 

This is contrary to the following “maximize internal variety” principle:

"Since the variety of perturbations a [control] system can potentially be confronted with is unlimited, we should always try maximize its internal variety (or diversity),

so as to be optimally prepared for any foreseeable or unforeseeable contingency." Principia Cybernetica.

This invites redundant design effort and inefficient system operation, and may lead to data quality issues.

 

One real world entity may be subject to many control systems

Ashby’s view of a human being had something of Cartesian dualism about it.

He treated the body as the target system and the brain/central nervous system as the control system.

 

Today, psychobiology tends to the view that mental states and activities are bodily states.

This view, that the mind is inseparable from the body, is called “cognitive embodiment”.

And it appears we do not maintain homeostasis purely by control from the higher brain.

Rather, our many state variables are maintained by different control systems, which operate in parallel.

These control systems are distributed through the body and not in direct communication with each other.

 

The law does not mean matching the variety of the whole target system

The law does not mean the control system must be as complex as the concrete reality of the entity controlled.

A control system is usually much simpler than any real world entity it controls.

 

E.g. a thermostat models only as much variety (colder or hotter than a given temperature) as the behavior (heating system on or off) it controls.

To the control system, the target system is no more complex than the variables it controls.
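A minimal thermostat sketch makes the point: the control system models just one binary distinction and commands one binary behavior. The setpoint and return values are invented for illustration.

```python
def thermostat(temperature, setpoint=20.0):
    # The control system's whole model of the target system is one
    # comparison; drafts, sunlight and occupants are invisible to it.
    return "heating_on" if temperature < setpoint else "heating_off"

thermostat(18.5)  # "heating_on"
thermostat(21.0)  # "heating_off"
```

The room has effectively unlimited variety; the thermostat needs only as much variety (two states) as the behavior it controls (two states).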

 

Control system

Target system

Control system designs

<create and use>                  <realised by>

Engineers      <observe and envisage> Control systems

 

 

<monitor and control>   Heating systems

 

E.g. a tennis score board models only a few essential variables of a tennis match.

To umpires, what matters are the rules that determine the score, not the length or complexity of the rallies.

 

Control system

Target system

Laws of tennis

<create and use>             <realised by>

LTA      <observe and envisage > Umpires

 

 

<monitor and control> Real tennis matches

 

An implication is that the system describer must be experienced, expert and trained enough to “pick out and study the facts that are relevant”.

System architects must know what is architecturally significant to their own and their stakeholders’ interests in the target system.

 

The universe is a mess of more or less related systems

In nature and in business, control systems may be distributed, and act independently of each other.

An oyster manages to maintain homeostasis this way, without a central brain and nervous system.

In a large business, parallel divisions may compete for customers, or resources, or have conflicting goals.

 

From the perspective of two different control systems, one real world entity can be two different target systems.

Two control systems may simultaneously compete with and complement each other in controlling the state of one real world entity.

Surely, this is the very stuff of relationships between people in a social group?

 

The wider universe is divisible into countless systems: separate, nested and overlapping.

Two systems may be described as related as:

·         Control (regulatory) and target systems

·         Cooperating (symbiotic) systems

·         Competing systems

But which labels you use may depend on your perspective.

Surely, a heating system controls the behavior of its thermostat?

Ashby’s law elaborated

Ashby’s law might be interpreted for a business thus:

·         A business system’s information state models only those variables of actors and activities that the business monitors and directs.

·         For directions to be effective, the information state must model those variables accurately enough.

·         The more variegated the actors and activities to be monitored and directed, the more complex a business system must be.

 

To direct an actor or activity in its environment, a brain or a business must be able to:

·         maintain or obtain a model of the state of that actor or activity (the model must be accurate enough).

·         gather inputs: detect events that reveal a significant state change (in an acceptably reliable and timely fashion).

·         produce outputs: respond to events or state changes by sending directives to actors to perform activities (in an acceptably reliable and timely fashion).

·         adapt if the actor does not respond to a directive as expected, in an acceptably reliable and timely fashion.

 

To generalise, a control system

·         must know just enough about the state of target system it monitors and directs

·         must detect events that reveal significant state changes in the target system - in an acceptably reliable and timely fashion.

·         must respond to those events by sending appropriate directives to the target system - in an acceptably reliable and timely fashion.

 

A control system can expect a target system to respond appropriately provided that:

·         monitor and control signals cannot go missing or be corrupted

·         time/speed, capacity/throughput, availability and any other non-functional requirements are met

·         the target system has no other (competing/interfering) control system

·         the target system is not capable of self-determination, of choosing an inappropriate response to a control message.

Footnote: Three levels of thinking

This table identifies three levels of thinking about the realisation of an activity system.

 

 

Concept | System 1 | System 2
A: Abstract system description (mental, spoken, documented, mechanical, whatever) | A symphony score | The design-time code of a computer program
C1: Concrete system realisation (a behavior that matches the above well enough) | A performance of the above | A run-time execution of the above
C2: Concrete entity (which performs the above) | The orchestra members in a concert hall | A computer in a data centre

 

It is very important to realise that one entity can realise many systems.

For example, you may realise the three systems in the table below, at least two of them at the same time!

 

 

Concept | System 3 | System 4 | System 5
A: Abstract system description | Convert oxygen into carbon dioxide | Reproduce genes in next generation | Model a system
C1: Concrete system realisation | Person breathing | Person making love | Drawing ArchiMate diagrams
C2: Concrete entity | A person (you, for example), realising all three systems

 

As Ashby said: our first impulse is to point at a concrete entity repeating a behavior and to say "the system is that thing there".

However, an entity is rightly called a system only when and where it conforms to a system description.

No entity is rightly called a system without reference to a specific system description or model – one to which the entity demonstrably conforms.

 

In practice, it is normal to regard the concrete entity (C2) as part of the system realisation (C1), because the remainder of that entity, and whatever it does outside the system of interest, is out of scope.

So, the next table wraps up C1 and C2 into C, and adds the describer (D) into the picture.

 

D: Describer | Astronomers | LTA | Composer | Playwright
A: Abstract system description | “Solar system” | Laws of tennis | The score of a symphony | The roles in a radio play
C: Concrete system realisation | Planets in orbits | A tennis match | A performance of that symphony | Actors playing those roles

 

The premise is that D, A and C are all material in form, but A is a passive structure rather than an activity system.

 

 

All free-to-read materials at http://avancier.website are paid for out of income from Avancier’s training courses and methods licences.

If you find the web site helpful, please spread the word and link to avancier.website in whichever social media you use.