Ashby’s law of requisite variety – and Beer’s use of it

This page is published under the terms of the licence summarized in the footnote.



This paper challenges one of Ashby's ideas about complexity measurement and one of Beer's ideas about management options.

It is commonly downloaded, but readers should be aware that (though other papers do lean on some of Ashby's other ideas) this paper is only an aside.

There is a much wider and deeper exploration of system theory and systems thinkers on the “Sense and nonsense in system theory” page.


Ashby: control systems and real machines

Beer: the Viable System Model (VSM)

Ashby’s law of requisite variety

In Ashby's words

In my words

On complexity as a subjective judgement

On management control options


Ashby: control systems and real machines

In “Design for a Brain” (1952), W Ross Ashby approached general system theory from a psychologist's perspective.

In An Introduction to Cybernetics (1956), Ashby furthered the ideas of general system theory about control systems.


“At this point we must be clear about how a "system" is to be defined.

Our first impulse is to point at [some real machine] and to say "the system is that thing there".

This method, however, has a fundamental disadvantage: every material object contains no less than an infinity of variables and therefore of possible systems.

Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” (Ashby in “Introduction to Cybernetics”).


Every real football match is unique in the immensely rich and complex detail of individual footballers’ actions.

Every football match is the same in so far as it conforms to the abstract description of a football match’s qualities in the laws of football.

System theory

Theoretical system      System description      Laws of football

Empirical system        Operational system      Football match

Suppose a football player scratches his nose during a football match; that is an act in reality but not in the described system.

Strictly speaking, that action is irrelevant to the football match system that is controlled by a referee.

The player’s breathing and biochemistry (though essential) are also outside the scope of the system.


Not only do humans abstract system descriptions from realities, but so do automated control systems.

As Conant and Ashby’s "Good Regulator theorem" (1970) declared: "every good regulator of a system must be a model of that system".

A control system models a selection of variable facts (such as temperature) observable in the controlled system.



Control system          Model of controlled system variables

Controlled system       Controlled real machine behaviour       e.g. heating system on/off

Beer: the Viable System Model (VSM)

For general system theorists, organisation is about how the parts of a body or machine cooperate in processes.

For some more sociological systems thinkers, organisation is about the management (command and control) structure of a business or other society.

Stafford Beer’s book title “Brain of the Firm” (1972) may well be a deliberate echo of Ashby’s “Design for a Brain” 20 years earlier.

Later, in “Diagnosing the system for organisations”, Beer detailed his “Viable System Model”.


The VSM provides a reference model or design pattern for the organisation of a business.

“Few organizations have adopted the VSM as their formal organizational structure.

But many consultants have used it to diagnose the way an organization is operating and where improvements are needed.” (Stuart Umpleby).


Beer’s writing is hard to follow and the VSM is a very complex set of ideas.

Beer wrote: “There is no 'correct' interpretation of the VSM. We have spoken instead of more or less useful interpretations.”

Interpreting the VSM is a job in itself, which appeals to some management consultants.


Beer said the VSM was inspired by the structure of the human nervous system.

At a meeting I attended, a consultant claimed the VSM has a sound theoretical basis.

The fact is: the VSM doesn’t resemble the known structure or workings of the human brain.

Nor can the VSM be necessary for viability, since a tree, a bee hive and an oyster (none of which has a central nervous system) are viable systems.


Beer knew Ashby, regarded him as a grandfather of cybernetics (and, in Beer’s case, a godfather), and a world authority on the brain.

Beer based his thinking on Ashby’s, and set out these four core concepts.

·        Variety: a measure of complexity: the number of possible states of a system.

·        The law of requisite variety: “only variety can absorb variety” (see explanations below).

·        Attenuator: a device that reduces variety.

·        Amplifier: a device that increases variety.


This paper challenges both Ashby’s simplistic measure of complexity above, and Beer’s use of the law of requisite variety.

Ashby’s law of requisite variety

In Ashby's words

Ashby's law of requisite variety applies to what is necessary to control the variety in the essential variables of a controlled system.


"The larger the variety of actions available to a control system, the larger the variety of perturbations it is able to compensate"...

"Since the variety of perturbations a [control] system can potentially be confronted with is unlimited, we should always try maximize its internal variety (or diversity), so as to be optimally prepared for any foreseeable or unforeseeable contingency." Principia Cybernetica.


The law has been expressed alternatively as: “For a system to be stable, the number of states of the control system must be at least equal to the number of states in the target system.”

Ashby said "variety absorbs variety”; his law defines the minimum number of states necessary for a controller to control a system of a given number of states.


“A system's variety V measures the number of possible states it can exhibit, and corresponds to the number of independent binary variables.

But in general, the variables used to describe a system are neither binary nor independent.”
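The quoted definition can be illustrated with a short computation. This is a sketch only; the variable domains below are illustrative, not drawn from any real system.

```python
from itertools import product

# Variety, as Ashby defines it: the number of distinct states a system
# can exhibit. For n independent binary variables that is 2**n.
def variety(domains):
    """Count the states reachable by combining each variable's domain."""
    return len(list(product(*domains)))

binary = [(0, 1)] * 3          # three independent binary variables
# variety(binary) == 8, i.e. 2**3 states

# When variables are neither binary nor independent, we must count
# only the states that can actually occur under the constraints.
def variety_constrained(domains, allowed):
    return sum(1 for state in product(*domains) if allowed(state))

# A constraint coupling two binary variables halves the state count:
coupled = variety_constrained([(0, 1), (0, 1)], lambda s: s[0] == s[1])
# coupled == 2
```

The second function shows why, as the quotation notes, real systems rarely have variety equal to the simple product of their variables' ranges.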

In my words

Ashby’s concepts might be phrased thus:

·        The more different ways a system can deviate from its normal state, the more different control actions a (homeostatic) control system will need.

·        The information variables known to a control system must be at least as complex as the variability of the real machine behaviour to be controlled.

For example, a thermostat must model or record as much variety (colder or hotter than a given setting) as the behaviour (heating system on or off) it controls.
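The thermostat example can be sketched as a table lookup; the state names are illustrative. The point is only that the controller needs at least one distinct response per deviation it must correct.

```python
# A sketch of requisite variety in a thermostat: two ways the essential
# variable can deviate, so at least two distinct control actions.
PERTURBATIONS = ["colder than setting", "hotter than setting"]
ACTIONS = {
    "colder than setting": "heating on",
    "hotter than setting": "heating off",
}

def regulate(perturbation):
    """One distinct action per deviation: variety absorbs variety."""
    return ACTIONS[perturbation]

# The controller's variety (distinct actions) matches the variety of
# the perturbations, so every deviation can be compensated.
assert len(set(ACTIONS.values())) >= len(PERTURBATIONS)
```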


The law does not mean (as some seem to think) that the control system must be as complex as the target system or real machine.

A control system is usually very much simpler than the real machine controlled.

For example, a thermostat is very much simpler than a heating system.

[Diagram: Engineers <create and use> a control system design, which <abstracts concepts from> the real machine. Engineers <observe and envisage> control systems, which <monitor and control> heating systems.]

And the rules of football are infinitely simpler than any football match; the rules do not direct footballers in how to kick, head, dribble or pass the ball.

[Diagram: FIFA <create and use> the rules of football, which <abstract concepts from> real matches. FIFA <observe and envisage> match referees, who <monitor and control> real football matches.]


Ashby’s law of requisite variety has a corollary not mentioned in the sources I have read.

The system describer must be experienced, expert and trained enough to “pick out and study the facts that are relevant”.

System architects (be they heating engineers or FIFA) must know what is architecturally significant to their own and their stakeholders’ interests in the real machine.

On complexity as a subjective judgement

Ashby said complexity = variety = the number of possible states of a system.

This measure of complexity or variety is incalculably large for any significant system.

And it is only one of many possible measures, none of which appear practical on a large scale.


And there is a bigger problem. Ashby said: “a system is any set of variables which he [the observer] selects”.

This means the complexity of the real machine is as subjective as the observer’s selection of those variables.


There is a gulf between a system description and an operational system.

The full complexity of an operational business system includes the thought processes of any human participants and extends down to the structures and movements of atomic particles.

You cannot measure the complexity of an operational system per se.

You can only measure the complexity of a system at the level of abstraction at which you describe the components and process steps of the system.

And that level of abstraction in a system description is a matter of choice.

In other words, a system is as simple or as complex as the describer chooses to make it in their description.


Stafford Beer set out to apply Ashby’s ideas to business systems.

Beer said thinking about the four Ms (men, materials, machines and money) is inadequate; we need to think about managing complexity.

After Ashby, Beer viewed the complexity of a business as the number of its possible states.

But knowing that the possible states of a substantial system are uncountable, Beer said that relative statements of complexity are still valid.


How can an observer compare the relative complexity of two real entities, machines or societies?

1.      Choose the system elements and relationships to be described (components, processes, variables etc.)

2.      Describe the two real-world entities using that form, demonstrably at the same level of abstraction.

3.      Test that the two entities behave according to their descriptions.

4.      Then choose one of the 40 or so possible complexity measures and apply it to the system descriptions.

The process is both difficult and subjective. (Read Complexity for more.)
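Step 4 of the procedure might, for the simplest measure (variety as a state count), look like the sketch below. The two “shop” descriptions are hypothetical and exist only to show how the verdict depends on the describer’s selection of variables.

```python
from math import prod

# One of the many possible complexity measures: variety as the product
# of each described variable's number of possible values.
def state_count(description):
    return prod(len(values) for values in description.values())

# Two hypothetical system descriptions at a chosen level of abstraction:
shop_a = {"till": ["open", "closed"], "stock": ["low", "ok", "high"]}
shop_b = {"till": ["open", "closed"], "queue": ["empty", "busy"]}

# At this level of description, shop A looks more complex (6 states vs 4)...
assert state_count(shop_a) == 6
assert state_count(shop_b) == 4

# ...but add one more observed variable to shop B and the verdict reverses.
# The measure is as subjective as the selection of variables.
shop_b["stock"] = ["low", "ok", "high"]
assert state_count(shop_b) == 12
```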


On management control options

Above, I phrased Ashby’s concepts thus:

·        The more different ways a system can deviate from its normal state, the more different control actions a (homeostatic) control system will need.

·        The information variables known to a control system must be at least as complex as the variability of the real machine behaviour to be controlled.


What if, in the operation of a business system, key state variables do not stay within required and defined parameters?

Beer proposed that where a control system needs to exercise tighter control, we must either

·        amplify (increase) variety in the control system, or

·        attenuate (reduce) variety in the target system.


So Beer divided management techniques into those two types.

He spent decades predicting the imminent collapse of government institutions, since they lacked the variety of the operational systems they were intended to control. 

But governments and their institutions are still here; so his thought experiment failed.

A conclusion could be that real-world systems do not behave in ways that conform to the law of requisite variety.


What is going on?

Often, a control system knows very little of what is actually going on in the system it strives to control.

A thermostat knows nothing of a heating system bar the temperature of the environment.

If I understand correctly, in the human body, adaptation to achieve homeostasis is not maintained by top-down command and control from the higher brain.

It is maintained by various distributed control systems, which are not in direct communication with each other.


In business, it is rare to find a single control body that regulates everything in the operational system.

The workers in a business can and often do largely organise themselves.

Managers ask us to reduce their variety by reporting only “traffic light” status information to them.

Management by exception is common.

Agile development methods propose a manager serves the workers.
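The “traffic light” reporting mentioned above can be read as variety attenuation: many detailed operational states are mapped onto three summary states. A minimal sketch, with illustrative thresholds:

```python
# Variety attenuation by "traffic light" reporting: managers see three
# summary states instead of the operation's many detailed states.
# The thresholds here are illustrative, not from any real method.
def traffic_light(percent_complete, days_late):
    if days_late == 0:
        return "green"
    if days_late <= 5 and percent_complete >= 50:
        return "amber"
    return "red"

# Many detailed (progress, lateness) states collapse onto just three;
# management by exception then reacts only to "amber" and "red".
statuses = {traffic_light(p, d) for p in range(0, 101, 10)
            for d in range(0, 11)}
assert statuses == {"green", "amber", "red"}
```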


We might address a given control problem by adding more feedback loops and designing a more complex control system.

But a system engineer’s aim is usually to define a control system that is “lean”; it has the minimum variety needed to control whatever entities and activities it is supposed to control.


It seems to me that the cybernetic principles should be:

·        A control system must know just enough about the state of the entity or activity to be directed.

·        A control system must detect events that reveal a significant state change - in an acceptably reliable and timely fashion.

·        A control system must respond to those events by sending directives to the entity or activity - in an acceptably reliable and timely fashion.

·        A control system must be designed assuming the external entity or activity will respond to a directive in a reasonably predictable and deterministic way.
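A minimal sketch of these four principles, assuming a thermostat-like target that responds predictably to directives (all names and values are illustrative):

```python
# A "lean" controller: it models just one state variable of the target
# (principle 1), detects significant state changes (principle 2),
# responds with a directive (principle 3), and assumes the target will
# obey that directive predictably (principle 4).
class LeanController:
    def __init__(self, setting, tolerance):
        self.setting = setting        # the only target state it knows
        self.tolerance = tolerance    # what counts as "significant"

    def on_reading(self, temperature):
        """Turn a significant state-change event into a directive."""
        if temperature < self.setting - self.tolerance:
            return "heating on"
        if temperature > self.setting + self.tolerance:
            return "heating off"
        return None  # no significant change; stay lean, send nothing

thermostat = LeanController(setting=20, tolerance=1)
assert thermostat.on_reading(17) == "heating on"
assert thermostat.on_reading(23) == "heating off"
assert thermostat.on_reading(20) is None
```

Note how little variety the controller holds: one setting, one tolerance and three responses, far less than the heating system it directs.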


Where a control system has insufficient variety to direct a target system, then options include Beer's two.

But since Beer's thought experiment failed, something else must be happening in the real world, and to Beer's two options I suggest adding three more:

1.      attenuate (reduce) variety in the target system

2.      amplify (increase) variety in the control system.

3.      provide the target system with a variety of independent and distributed control systems

4.      empower actors in the target system to control themselves or the subsystems they work in

5.      empower actors in the target system to find their own control systems.





A list of Beer’s works, extracted from “Diagnosing the system for organisations”

·        S. Beer (1959) Cybernetics and Management. English Universities Press.

·        S. Beer (1960) Towards the cybernetic factory. In Principles of Self Organization. Symposium of 1960, Pergamon Press, Oxford.

·        S. Beer (1965) The world, the flesh and the metal. Nature 205 (No. 4968), 223-231.

·        S. Beer (1966) Decision and Control. Wiley, Chichester.

·        S. Beer (1972) Brain of the Firm. Allen Lane, Penguin, Harmondsworth.

·        S. Beer (1975) Platform for Change. Wiley, Chichester.

·        S. Beer (1979) The Heart of Enterprise. Wiley, Chichester.

·        S. Beer (1981) Brain of the Firm, 2nd edn. Wiley, Chichester.

·        S. Beer (1983) A reply to Ulrich's "Critique of pure cybernetic reason: the Chilean experience with cybernetics". J. Appl. Systems Analysis 10.


Footnote: Creative Commons Attribution-No Derivative Works Licence 2.0     18/01/2016 23:57

Attribution: You may copy, distribute and display this copyrighted work only if you clearly credit “Avancier Limited:” before the start and include this footnote at the end.

No Derivative Works: You may copy, distribute, display only complete and verbatim copies of this page, not derivative works based upon it.

For more information about the licence, see



Beer’s VSM: Figure 37 in “Diagnosing the system for organisations”