Ashby’s ideas

Copyright 2017 Graham Berrisford. One of about 300 papers at Last updated 25/02/2018 15:32


Understanding Ashby’s ideas helps you to understand much else in the field of systems thinking.

Unless otherwise stated, quotes below are from “Introduction to Cybernetics” (1956) by W. Ross Ashby.



A biosocial system example

Abstraction of description from reality

System change

Disregarding thermodynamics

Cybernetics

Variety

The law of requisite variety

False corollaries to Ashby’s law

Ashby’s law of requisite variety – revisited and elaborated

Footnote: Three levels of thinking


Abstract and concrete systems

W. Ross Ashby (1903-1972) was a psychologist and systems theorist.

“Despite being widely influential within cybernetics, systems theory… Ashby is not as well known as many of the notable scientists his work influenced.

W Ross Ashby was one of the original members of the Ratio Club, who met to discuss issues from 1949 to 1958.”


For some, understanding classical system theory requires a paradigm shift as radical as the one needed to understand Charles Darwin’s theory of evolution.

Many find it difficult to understand the implications of what Ashby said.


To apply system theory is to form an abstract system description (a type).

This hides the infinite complexity of actors and activities you observe or envisage as acting to realise (instantiate) that description.

You describe the state of a real-world system in terms of variables whose values can be measured (e.g. the positions of the planets).

You describe whatever regular processes can change the values of those variables (e.g. the orbits of the planets).

You describe a process in a way that enables real world behaviors to be tested as matching your description.


“At this point we must be clear about how a "system" is to be defined.

Our first impulse is to point at [a concrete entity] and to say "the system is that thing there".

This method, however, has a fundamental disadvantage: every [concrete entity has] no less than an infinity of variables and therefore of possible systems.

Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” Ashby 1956


People point at a machine or a business (like IBM) and say "the system is that thing there".

But to Ashby, a real-world entity is only a system in so far as it performs the behaviors in an abstract system description.

IBM (like the universe) is an ever changing entity in which stuff happens.

If there is no description of the entity as a system, there is no agreed or testable system, just stuff happening.

Different people will define the aims, inputs and outputs of IBM differently, each expressing their own perspective of “the whole.”

In other words, IBM can be (can manifest, instantiate, realise) many different systems.


Abstract system description: a theoretical system; a system type, perceived or conceived.

Concrete system realisation: an empirical system; a system instance, in action.


This table illustrates how a concrete entity realises (or instantiates) an abstract system description (or type).


Abstract system description      Concrete system realisation
“Solar system”                   Planets in orbits
Laws of tennis                   A tennis match
The score of a symphony          A performance of that symphony
The US constitution              US governments


The US constitution defines the roles and rules of the essential actors in the US federal government system.

The roles include the Congress (the legislative branch), the President, the court system (the judicial branch) and the States.

The constitution also defines relations between actors playing those roles.

It does not define the roles or rules of subordinate institutions created by federal governments.

It does, however, define the meta system to be used (by Congress or a Constitutional Convention) to amend the constitution (change the system) itself.


These papers represent describers, descriptions and realities in a triangular relationship.

Description theory


<form>                 <idealise>

Describers <observe and envisage> Realities


Read System theory terms and concepts for more.

A biosocial system example

Biology is abundant with examples of dynamic systems, both within one organism and in wider bio-social contexts.

Ashby reported this description of a bio-social system.

“The male and female three-spined stickleback form… a determinate dynamic system.

Tinbergen (in his Study of Instinct) describes the system's successive states as follows” Ashby, 1956


The narrative quoted by Ashby is easily converted into a process flow chart or table.

The male and female stickleback roles can be shown in parallel swim lanes (columns in the table below).

The actors playing those roles are in communication; each act of a male or female sends information, in a visual form, to its partner.

Reading the table left to right, top to bottom you see how each visual information flow triggers the partner to act in response.


Stickleback mating system

The female’s role is to                      The male’s role is to

present a swollen abdomen and
special movements
                                             present a red colour and a zigzag dance
swim towards the male
                                             turn around and swim rapidly to the nest
follow the male to the nest
                                             point its head into the nest entrance
enter the nest
                                             quiver in reaction to the female
                                             being in the nest
spawn fresh eggs in the nest
                                             fertilise the eggs


The table above is an abstract entity that describes a concrete entity – a pair of sticklebacks acting in the real world.

The description is a generalised type – the typical process realised by many different pairs of sticklebacks.

This simple example has features that are important in general system theory.



Behavior: a regular process, a sequence of event-triggered activities that advance the state of the system.

Structural role: a description listing the activities an actor is expected to perform.

Actor: a concrete individual that cooperates with others by playing definable roles in processes.

Interaction: a cooperation, usually enabled by communication, by the sending and receiving of information flows.

Passive structure: an object that is acted in or on (nest, eggs).

Abstraction of behaviours and structures

Ashby emphasised that the real world systems of interest (to general system theory) are ones that experience or perform processes.

“Cybernetics does not ask ‘what is this thing?’ but ‘what does it do?’ It is thus essentially functional and behavioristic.”

“[It] deals with all forms of behavior in so far as they are regular, or determinate, or reproducible.” Ashby 1956


Abstraction of system behaviors

Process types

<define>                          <idealise>

Systems thinkers <observe and envisage> Regular behaviors


No process repeats exactly in the material world; any process we describe in a system is an abstraction from reality.

Even a computer program is different each time it runs at the level of matter and energy, memory spaces and electrical phenomena.


Ashby was interested in how selected behaviours change selected variables.

Abstraction of system structures

Variable types

<define>                         <idealise>

Systems thinkers   <observe and envisage>   Variable values


Ashby saw a system as a view of a reality, or as one or more roles you can observe or envisage real world entities playing.

You describe the state of a real-world system in terms of variables whose values can be measured (e.g. the positions of the planets).

You model whatever regular processes can change the values of those variables (e.g. the orbits of the planets).

You describe a process in a way that enables real world behaviors to be tested as matching your description.

System change types

“The word "change" if applied to [a concrete entity repeating a behavior] can refer to two very different things.

·         change from state to state, which is the machine's behaviour, and which occurs under its own internal drive, and

·         change from transformation to transformation, which is a change of its way of behaving, and occurs at the whim of the experimenter or some other outside factor.

The distinction is fundamental and must on no account be slighted.”


In other words, one should on no account confuse:

·         State system change: a change to the state of a system, which changes the value of at least one variable.

·         System generation change: a change to the nature of a system, which changes the type of at least one variable or behavior.
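Ashby’s distinction can be sketched in a few lines of code. This is an illustration, not Ashby’s own notation: the state is a variable, and the transformation is a function that an outside factor can replace.

```python
# Illustrative sketch, not Ashby's notation: a machine is a state plus a
# transformation (its current way of behaving).
state = 0

def double(s):
    """The machine's current way of behaving."""
    return s * 2

# State change: the machine moves from state to state under its own drive.
state = double(state + 1)   # state is now 2

# System generation change: an outside factor replaces the transformation
# itself, changing the machine's way of behaving.
def triple(s):
    """A new way of behaving, imposed from outside."""
    return s * 3

state = triple(state)       # state is now 6, under a new rule
```

The first change alters a variable’s value within the machine’s fixed rules; the second changes the rules themselves.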


For more on different kinds of system change, read System stability and change.

Disregarding thermodynamics

The universe is an ever-unfolding process that will end in chaos.

A system is a transient island of order carved out of the universe.

To hold off chaos, a system must draw energy from its environment.

However Ashby wrote:

“Cybernetics depends in no essential way on the laws of physics.”

“In this discussion, questions of energy play almost no part; the energy is simply taken for granted.” Ashby, 1956


Ashby focused on flows of information rather than energy, especially on information feedback loops between control systems and target systems – as in cybernetics.


“Ashby popularised the usage of the term 'cybernetics' to refer to self-regulating systems.

The book dealt primarily with homeostatic processes within living organisms, rather than in an engineering or electronic context.” Wikipedia 2017


Ashby furthered the ideas of general system theory through discussion of control systems.

Just as we abstract a system description from a reality, so, a mechanical control system models some variable(s) detectable in a target system.


Control system       Model of target system variables
Target system        Target system variables and behavior (e.g. heating system on/off)


“Design for a Brain” (1952)

In this book, Ashby presented the brain-body relationship as an information feedback loop.

He saw the brain as a regulator that maintains a body’s state variables in the ranges suited to life.

He picked out variables relevant to his interest in the brain as a control system.

He set aside other things you might consider important to being human; he eschewed discussion of consciousness.

His basic idea might be distilled into one sentence thus.


Generic system description: a collection of active structures that interact in regular behaviors that maintain system state and/or consume/deliver inputs/outputs from/to the wider environment.

Ashby’s design for a brain: a collection of brain cells that interact in processes to maintain body state variables by receiving/sending information from/to bodily sensors/motors.


Ashby’s book holds an abstract description of a brain, which in turn holds an abstract description of a body’s physical variables.

Much as an engineer’s specification holds an abstract description of a control system, which in turn holds an abstract description of the physical variables it controls in a target system.

The graphic below separates a control system (the brain) from its target system (the remainder of the body).


Control system

Target system

“Design for a brain”

<wrote>                  <realised in>

Ashby                 <envisaged>               Brains



<monitor> Body state variables

<control>  Muscles and organs


Information flows are central to cybernetics.


Brains and businesses

Cybernetics has influenced systems thinking in general, and business system thinking in particular.

Businesses can be seen as successfully applying the principles of general system theory and cybernetics.

A brain maintains mental models of things (food, friends, enemies, etc.) it perceives to be out there.

A business (in its information systems) maintains documented models of actors and activities it monitors and directs through information feedback loops.


A business system is connected to its wider environment by feedback loops.

It receives information about the state of entities and activities in its environment, and records that information in memory.

The state information it receives and stores must model reality well enough, else the system will fail.

It outputs information to inform and direct entities and activities as need be.


This paper goes on to introduce two of Ashby’s ideas that were later used by Stafford Beer.


Variety

Ashby said that: “a system is any set of variables which he [the observer] selects”.

“A system's variety V measures the number of possible states it can exhibit, and corresponds to the number of independent binary variables.

But in general, the variables used to describe a system are neither binary nor independent.”
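The state-count idea can be made concrete with a toy sketch (the variables and their values below are invented for illustration): variety is the product of each variable’s number of possible values, and the log2 variant expresses it as a count of independent binary variables.

```python
import math
from itertools import product

# Invented toy example: three discrete variables and their possible values.
variables = {
    "heater": ["on", "off"],
    "temperature_band": ["cold", "ok", "hot"],
    "fan": ["on", "off"],
}

# Variety = the number of distinct states the system can exhibit.
states = list(product(*variables.values()))
variety = len(states)               # 2 * 3 * 2 = 12

# Expressed in bits, as if counting independent binary variables.
variety_bits = math.log2(variety)   # about 3.58 bits

print(variety, round(variety_bits, 2))
```

Note how quickly the count grows: each extra variable multiplies, rather than adds to, the number of possible states.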


In short, complexity = variety = the number of possible states a system can exhibit.

This measure ignores the complexity of the behaviours that change those variables!

There are several other difficulties with this definition of complexity.


First, the measure is incalculably large for any system with a non-trivial state.


Second, many other measures have been proposed: e.g. McCabe’s cyclomatic complexity.

Here, we might propose: complexity = the number of event/state combinations * the average procedural complexity of the rules applied to them.

There is no agreement as to which complexity measure is right or best.


Third, does the measure apply to the control system, the target system, or the real world entity of which only selected variables are controlled?

“every real machine embodies no less than an infinite number of variables, all but a few of which must of necessity be ignored.” (Ashby in “Design for a Brain”)

Consider the complexity of a tennis match in the real world.

Consider the thought processes of its players, and the molecular structures and movements in the players, court surface and tennis balls.

Obviously, you can never measure the complexity of a real world entity or behavior per se.

You can only measure it with respect to your chosen description of its elements (roles, actors, processes, variables, whatever).

And then, only measure it at the level of abstraction at which you choose to describe those elements and their inter-relationships.


From the viewpoint of a describer, a system is only as complex as its description.

From the viewpoint of a control system, a target system is only as complex as those variables the control system monitors and controls.

Read “Complexity” for more on that topic.

The law of requisite variety (“variety absorbs variety”)

Ashby's law of requisite variety applies to how a control system controls selected or essential variables of a target system.

"The larger the variety of actions available to a control system, the larger the variety of perturbations [in values of target system variables] it is able to compensate".


Ashby’s law defines the minimum number of states necessary for a control system to control a target system with a given number of states.

It is interpreted here as meaning:

·         A control system’s information state models only those variables in the target system’s concrete state that are monitored and controlled.

·         For a homeostatic system to be stable, the number of states of the control system must be at least equal to the number of states in the target system.

·         The more ways that a homeostatic system can deviate from its ideal state, the more control actions a control system will need.
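The law can be illustrated with a toy sketch (all names and numbers invented): a regulator holds an essential variable steady only if it has a distinct counter-action for every kind of disturbance.

```python
# Invented illustration of "variety absorbs variety": each disturbance d
# needs a counter-action -d, or the essential variable drifts.
def regulate(disturbances, actions):
    """Return 'stable' where the regulator can cancel the disturbance."""
    return ["stable" if -d in actions else "unstable" for d in disturbances]

disturbances = [-2, -1, 1, 2]     # four ways the state can be perturbed
rich_regulator = {-2, -1, 1, 2}   # variety 4: matches the disturbances
poor_regulator = {-1, 1}          # variety 2: too little variety

print(regulate(disturbances, rich_regulator))  # all stable
print(regulate(disturbances, poor_regulator))  # unstable at the extremes
```

The regulator with variety 4 absorbs every perturbation; the regulator with variety 2 cannot counter the larger disturbances.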


In response, Conant (1970) produced his so-called “Good Regulator theorem”, stating that “every Good Regulator of a System Must be a Model of that System” (Wikipedia 2017).

The Good Regulator theorem was proved by biological evolution long before Conant articulated it.

Animal brains maintain mental models of things (food, friends, enemies etc.) they care about in their environment.

These mental models must be accurate enough to enable the animals to monitor and manipulate those things.

Read Knowledge and Truth for more on the fuzziness of truth.

False corollaries to Ashby’s law

The law says having enough variety is a necessary precondition to control selected variables in a target system.


The law does not say variety is all you need

The law does not say having enough variety is a sufficient precondition to control selected variables.


The law does not mean maximising variety

The law does not say having more than enough variety is a good idea.

Ashby emphasised the need to be selective.

“we should pick out and study the facts that are relevant to some main interest that is already given.”


This is contrary to the following “maximize internal variety” principle:

"Since the variety of perturbations a [control] system can potentially be confronted with is unlimited, we should always try maximize its internal variety (or diversity),

so as to be optimally prepared for any foreseeable or unforeseeable contingency." Principia Cybernetica.

This suggests redundant design effort and inefficient system operation, and may result in data quality issues.


The law does not mean matching the variety of the whole target system

The law does not mean the control system must be as complex as the concrete reality of the entity controlled

A control system is usually much simpler than any real world entity it controls.


E.g. a thermostat models only as much variety (colder or hotter than a given temperature) as the behavior (heating system on or off) it controls.

To the control system, the target system is no more complex than the variables it controls.
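A thermostat sketch (the set point and readings are invented values) makes the point: the controller’s model of the room is a single comparison, whatever the room’s real physical complexity.

```python
SET_POINT = 20.0  # degrees Celsius; an invented set point

def control(temperature):
    """The thermostat models one variable (colder or hotter than the set
    point) and controls one behavior (heating on or off)."""
    return "heating on" if temperature < SET_POINT else "heating off"

print(control(18.5))  # heating on
print(control(21.0))  # heating off
```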


Control system

Target system

Control system designs

<create and use>                  <realised by>

Engineers      <observe and envisage> Control systems



<monitor and control>   Heating systems


E.g. a tennis score board models only a few essential variables of a tennis match.

To umpires, what matters are the rules that determine the score, not the length or complexity of the rallies.


Control system

Target system

Laws of tennis

<create and use>             <realised by>

LTA      <observe and envisage> Umpires



<monitor and control> Real tennis matches


An implication is that the system describer must be experienced, expert and trained enough to “pick out and study the facts that are relevant”.

System architects must know what is architecturally significant to their and stakeholders’ interests in the target system.

Ashby’s law of requisite variety – revisited and elaborated

In nature and in business, control systems may be distributed rather than centralised (as in a brain).


One real world entity may be subject to many control systems

Our body does not maintain homeostasis purely by top-down command and control from the higher brain.

Instead, its many variables are maintained by several control systems, operating in parallel, which are distributed and not in direct communication with each other.

An oyster manages to maintain homeostasis without having any central brain and nervous system.

In large businesses, there is rarely if ever an overarching control body that monitors and directs all business processes.

There are in practice several (hierarchical and/or parallel) bodies that may compete for resources, and even have conflicting goals.


To different controllers, a real world entity is at once several target systems, a different one to each controller.

What happens if two control systems compete or complement each other in seeking to control the same target system state?

Surely, this is the very stuff of relationships between people in a social system.


Ashby’s law elaborated

Ashby’s law might be interpreted for a business thus:

·         A business system’s information state models only those variables of actors and activities that the business monitors and directs.

·         For directions to be effective, the information state must model those variables accurately enough.

·         The more variegated the actors and activities to be monitored and directed, the more complex a business system must be.


To direct an actor or activity in its environment, a brain or a business must be able to:

·         maintain or obtain a model of the state of that actor or activity (the model must be accurate enough).

·         gather inputs: detect events that reveal a significant state change (in an acceptably reliable and timely fashion).

·         produce outputs: respond to events or state changes by sending directives to actors to perform activities (in an acceptably reliable and timely fashion).

·         adapt if the actor does not respond to a directive as expected, in an acceptably reliable and timely fashion.


To generalise, a control system

·         must know just enough about the state of the target system it monitors and directs

·         must detect events that reveal significant state changes in the target system - in an acceptably reliable and timely fashion.

·         must respond to those events by sending appropriate directives to the target system - in an acceptably reliable and timely fashion.
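The generalised monitor-detect-respond loop above might be sketched as follows (all names, events and rules are invented for illustration):

```python
# Invented sketch of a control loop: maintain a model of target system
# state, detect significant changes, and respond with directives.
def control_loop(model, events, rules):
    """Update the model from incoming events; emit directives per the rules."""
    directives = []
    for event in events:
        model[event["variable"]] = event["value"]        # maintain the model
        action = rules.get((event["variable"], event["value"]))
        if action is not None:                           # significant change
            directives.append(action)                    # send a directive
    return directives

model = {"stock_level": "ok"}
events = [{"variable": "stock_level", "value": "low"}]
rules = {("stock_level", "low"): "reorder"}

print(control_loop(model, events, rules))  # ['reorder']
```

Note that the model holds only the variables the rules refer to; everything else about the target system is ignored, as Ashby’s law requires.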


A control system can expect a target system to respond appropriately provided that:

·         monitor and control signals cannot go missing or be corrupted

·         time/speed, capacity/throughput, availability and any other non-functional requirements are met

·         the target system has no other (competing/interfering) control system

·         the target system is not capable of self-determination, of choosing an inappropriate response to a control message.


Footnote: Three levels of thinking

This table identifies three levels of thinking about the realisation of an activity system.




                                              System 1                        System 2

Abstract system description                   A symphony score                The design-time code of a
(mental, spoken, documented,                                                  computer program
mechanical, whatever)

Concrete system realisation                   A performance of the above      A run-time execution of the above
(a behavior that matches the
above well enough)

Concrete entity                               The orchestra members           A computer in a data centre
(which performs the above)                    in a concert hall


It is very important to realise that one entity can realise many systems.

For example, you may realise the three systems in the table below, at least two of them at the same time!




                                    System 3                  System 4                      System 5

Abstract system description         Convert oxygen into       Reproduce genes in            Model a system
                                    carbon dioxide            the next generation

Concrete system realisation         Person breathing          Person making love            Drawing ArchiMate diagrams

Concrete entity                     A person – you for example (the same entity in all three columns)


As Ashby said: our first impulse is to point at a concrete entity repeating a behavior and to say "the system is that thing there".

However, an entity is rightly called a system only when and where it conforms to a system description.

No entity is rightly called a system without reference to a specific system description or model – one to which the entity demonstrably conforms.


In practice, it is normal to regard the concrete entity (C2) as part of the system realisation (C1), because the remainder of that entity, and whatever it does outside the system of interest, is out of scope.

So, the next table wraps up C1 and C2 into C, and adds the describer (D) into the picture.









Abstract system description      Concrete system realisation
“Solar system”                   Planets in orbits
Laws of tennis                   A tennis match
The score of a symphony          A performance of that symphony
The roles in a radio play        Actors playing those roles


The premise is that D, A and C are all material in form, but A is a passive structure rather than an activity system.



All free-to-read materials at are paid for out of income from Avancier’s training courses and methods licences.

If you find the web site helpful, please spread the word and link to in whichever social media you use.