Ashby’s ideas about variety

Copyright 2017 Graham Berrisford. One of more than 100 papers on the “System Theory” page at . Last updated 06/12/2019 09:00


In discussion, social systems thinkers often use the terms “variety” and “complexity”, and refer to Ashby’s “law of requisite variety”.

Some presume Ashby's law is applicable to the management of human institutions, if not whole nations.

If you don't measure the variety of a target system, and demonstrably control it using data of the same or greater variety,

then your use of the law is more art than science, more inspiration than direction.

This paper questions the degree to which hard science ideas can be applied in sociology and management science.


Cybernetics - recap

Ashby’s law of requisite variety

Three things the law does not say

On the complexity of the real world

Beer’s application of Ashby’s ideas

Conclusions and remarks

References and reading


Cybernetics - recap

Many systems of interest to us can be seen as a black box that consumes inputs from its environment and produces outputs.

Inputs stimulate internal processes or behaviors that change the system’s internal state and/or produce outputs.

Inside the system can be many subsystems or actors that interact to complete behaviors.

Most systems of interest to us are describable in the same general way.

That is, in terms of actors that interact in behaviors to advance the system’s state and/or consume/deliver inputs/outputs from/to the wider environment.
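This black-box view can be sketched in code. The class and event names below are invented for illustration only; the point is that a system consumes input events, updates internal state, and/or delivers outputs to its environment.

```python
# A toy sketch (names invented, not from the paper) of a system as a black box:
# inputs stimulate behaviors that change internal state and/or produce outputs.
class BlackBoxSystem:
    def __init__(self):
        self.state = {"count": 0}  # internal state advanced by behaviors

    def stimulate(self, event):
        """Consume an input event; update state and/or produce an output."""
        if event == "increment":
            self.state["count"] += 1    # state change only, no output
            return None
        if event == "report":
            return self.state["count"]  # output delivered to the environment
        return None                     # other inputs are ignored

box = BlackBoxSystem()
box.stimulate("increment")
box.stimulate("increment")
print(box.stimulate("report"))  # 2
```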


Cybernetics emerged out of efforts in the 1940s to understand the role of information in system control.

Wiener introduced cybernetics as the science of biological and mechanical control systems.

He discussed how a controller directs a target system to maintain some state variable(s) in a desired range.

E.g. A thermostat directs the actions of a heating system.


W. Ross Ashby (1903-1972) was a psychologist and systems theorist.

He popularised the usage of the term 'cybernetics' to refer to self-regulating (rather than self-organising) systems.

“Cybernetics is a ‘theory of machines’.”

“Our starting point is the idea, much more than a century old, that a machine, in given conditions and at a given internal state, always goes to a particular state.”

·       “A variable is a measurable quantity that has a value.”

·       “The state of the system is the set of values that the variables have.”

·       “A system is any set of variables which he [the observer] selects from those available on the real machine.” (“Introduction to Cybernetics”, Ashby, 1956)


Ashby and other general system theorists focused attention on a system’s behaviors rather than its actors.

“Cybernetics deals with all forms of behaviour in so far as they are regular, or determinate, or reproducible.”

“[It] treats, not things but ways of behaving. It does not ask "what is this thing?" but ''what does it do?"

It is thus essentially functional and behaviouristic.” (Ashby 1956)


In “Design for a Brain” (1952), Ashby addressed biological (rather than mechanical or electronic) homeostatic machines.

He presented the brain as a regulator that maintains a body’s state variables in the ranges suited to life.

This table distils the general idea.


Generic system description           Ashby’s design for a brain
Actors                               Brain cells
interact in regular behaviors        interact in processes to
that maintain system state and/or    maintain body state variables by
consume/deliver inputs/outputs       receiving/sending information
from/to the wider environment.       from/to bodily organs/sensors/motors.


The Good Regulator theorem

Conant and Ashby (1970) argued that "every good regulator of a system must be a model of that system".

This principle was exemplified in biology and sociology long before it was articulated.

Both animals and businesses are connected to their wider environment by input-output feedback loops.

They remember the state of actors and activities that they monitor, inform and direct.

·       The brain of an animal maintains mental models of things (food, friends, enemies etc.) it monitors and directs.

·       The information systems of a business maintain documented models of things (customers, suppliers, etc.) it monitors and directs.

These memories must model or represent reality well enough, if animals and businesses are to act effectively and survive.

So, they must update their memories in response to input messages that reveal state changes in those actors and activities.

Ashby’s law of requisite variety

“A system's variety V measures the number of possible states it can exhibit.” (Ashby 1956)

Beware (as discussed below) that variety is not an absolute measure of a thing; it relates to a controller’s interest in that thing as a system.

Different controllers - with different interests - perceive a target as having different varieties.


The law of requisite variety: “only variety can absorb variety”

"The larger the variety of actions available to a control system, the larger the variety of perturbations it is able to compensate".

Perturbations are changes in the values of a target system’s variables that need to be regulated by the controller.

In the context of a homeostatic system this means the following.

·       The purpose of a controller is to monitor and regulate one or more essential state variables of a target system.

·       To succeed, the controller must be able to recognise when a target state variable value strays outside a desired range.

·       The more ways a target system can deviate from its desired state, the more control actions its controller(s) will need.
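The bullet points above can be illustrated with a toy model (all names invented for illustration): regulation succeeds only when the controller has a distinct corrective action for every way the target can deviate from its desired state.

```python
def can_regulate(disturbances, actions):
    """A toy reading of the law: regulation succeeds only if the controller
    has a corrective action for every disturbance the target can exhibit,
    i.e. the controller's variety covers the target's variety."""
    return all(d in actions for d in disturbances)

# The target can deviate from its desired state in three ways.
disturbances = {"too_hot", "too_cold", "too_humid"}

# A controller with a matching action per disturbance has requisite variety.
full_controller = {"too_hot": "cool", "too_cold": "heat", "too_humid": "dry"}
# A controller lacking one action does not.
short_controller = {"too_hot": "cool", "too_cold": "heat"}

print(can_regulate(disturbances, full_controller))   # True
print(can_regulate(disturbances, short_controller))  # False
```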


For sure, the variety of states to be regulated is a measure of significance to the designer of a controller.

And Ashby’s law of requisite variety does tell us something useful.

But it can be misinterpreted and there is more to know.


Variety = Complexity?

Ashby equated the complexity of a system with its variety.

His variety is a measure of observable state complexity, rather than of internal behavior complexity.

E.g. Behavior complexity might be measured as: the number of input event/state combinations * the average procedural complexity of the rules applied to them.

There are scores of other ways to measure complexity.

Today, there is no single or agreed idea of complexity.

Still, when you hear systems thinkers speak of “managing complexity”, they may be referring to Ashby’s idea of variety.

And to Ashby’s principles for managing complexity/variety, which are discussed below.


How to calculate the variety of a large and complex system?

A binary variable has two possible values.

Think of a “bit” in computer science, or a “truth value” in mathematical logic.

E.g. A citizen might be described by two binary variables:

·       Financial status = rich or poor

·       Political status = communist or capitalist.


A system with N binary variables can have 2 to the Nth possible states.

So, a citizen with 2 binary variables can have 4 possible states.


Combining the citizen’s financial status with the citizen’s political status gives four states:

·       Rich Communist

·       Poor Communist

·       Rich Capitalist

·       Poor Capitalist


Suppose the government of a state maintains a database of its citizens.

Each citizen’s record contains the two variables above, and 20 additional attributes, each with many possible values.

Imagine the combinatorial explosion of citizen varieties in a state with millions of citizens.

It would appear that the value for the system’s variety is huge, practically incalculable.
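The combinatorial explosion is easy to demonstrate. A naive variety count multiplies together the number of possible values of each variable; the attribute counts below are illustrative, not from any real database.

```python
from math import prod

def naive_variety(cardinalities):
    """Variety counted naively: the product of the number of possible values
    of each variable. For N binary variables this is 2**N."""
    return prod(cardinalities)

# A citizen described by 2 binary variables has 4 possible states.
print(naive_variety([2, 2]))  # 4

# Add 20 more attributes with (say) 10 possible values each, and the count
# explodes to 4 * 10**20 states per citizen record.
print(naive_variety([2, 2] + [10] * 20))  # 400000000000000000000
```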


However, to calculate variety in this way is misleading, because the variety that matters in Ashby’s law of requisite variety is different.

Ashby’s variety is not an absolute measure of a thing; it is relative to a controller’s interest in that thing.

His law applies to the variety of states that a controller seeks to control.

“Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” (Ashby 1956)


The target system’s state is relative to the interest the controller has

E.g. Consider a simple bi-metal strip thermostat as a controller, and a heating system as its target.

The thermostat moves a pointer along a temperature scale drawn in Celsius from 0 to 100.

Does the system have 100 states? If the scale were redrawn in Fahrenheit, would the number of states increase?

No, because the thermostat has only two control actions – switch the heater on or off.

And it is interested in only two states of the target system: (1) above a set temperature, (2) below a set temperature.
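A toy model of the thermostat (the set point and names are invented) shows that however finely the scale is drawn, the variety that matters to this controller is 2.

```python
SET_POINT = 20.0  # an invented set temperature, in Celsius

def thermostat_state(temperature):
    """The thermostat distinguishes only two target states, however finely
    the temperature scale is drawn."""
    return "above" if temperature > SET_POINT else "below"

def control_action(temperature):
    # Two states of interest, so two control actions suffice.
    return "off" if thermostat_state(temperature) == "above" else "on"

# 101 gradations on the scale, but the variety that matters is 2.
print(sorted({thermostat_state(t) for t in range(0, 101)}))  # ['above', 'below']
print(control_action(15.0))  # on
```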


Different controllers - with different interests - perceive a target as having different varieties.

E.g. Consider two observers who work to control the state of a lake.

The first is interested in maintaining the lake’s water level between upper and lower bounds.

The variable values of interest to them are the upper and lower water levels, and the flow rates of water in and out.

The second is interested in maintaining the populations of several interacting biological species in a balanced ecology.

The variable values of interest to them are upper and lower numbers for each population.


So, variety is not an absolute measure of a thing.

An observer must identify which variables of the thing must be regulated to meet some goal(s) “already given”.

And identify which changes to variable values will trigger controlling actions.

Only then can the system’s variety be measured.


Managing complexity (meaning variety)

Remember Ashby’s law of requisite variety is that “only variety can absorb variety”.

When a controller has insufficient variety to control a target system, then design options include:

·       Amplifying (increasing) the variety in the control/management system

·       Attenuating (reducing) the variety in the target/operational system.
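The two design options can be shown with a simple counting sketch (the numbers are invented): when the controller has fewer actions than the target has perturbations, either extend the controller’s repertoire or constrain the target.

```python
def has_requisite_variety(n_disturbances, n_actions):
    """A counting sketch of the law: at least one distinct corrective action
    is needed per distinct disturbance the target can present."""
    return n_actions >= n_disturbances

# A controller with 3 actions facing 5 kinds of perturbation falls short.
print(has_requisite_variety(5, 3))  # False

# Option 1 - amplify the controller: extend its repertoire to 5 actions.
print(has_requisite_variety(5, 5))  # True

# Option 2 - attenuate the target: constrain it so only 3 perturbations occur.
print(has_requisite_variety(3, 3))  # True
```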


Amplifying and attenuating variety were major themes in Beer's work in management (the profession of control, as he called it).

Note, however, that there are other design options for managing complexity, including:

·       Improve information flow qualities (speed, throughput, integrity etc.)

·       Tighten the control-target coupling

·       Remove any rival controllers that compete with the one we wish to succeed

·       Decentralise control: divide regulation responsibilities between maximally autonomous controllers (E.g. kidney and liver)

·       Decentralise what is managed: divide the target system into maximally autonomous subsystems (E.g. “microservices”).

Three things the law does not say

Remember Ashby’s law of requisite variety is that “only variety can absorb variety”.


The law does not say variety is all a controller needs

The law says having enough variety is a necessary precondition to control selected variables in a target system.

It does not say having enough variety is a sufficient precondition.

A controller must also:

·       be coupled to its target system tightly enough.

·       not be overridden by any competing controller.

·       not be overridden by a target in which actors are self-aware and self-determining.


The law does not mean a controller knows all the state variables of the thing controlled

What may be called a “lean controller” knows a minimal amount of what is going on in an entity it controls.

E.g. A thermostat knows nothing but the temperature of the target environment.

It ignores countless other variables that might be measured in “the real machine” it controls.

E.g. The scoreboard of a tennis match models only those variables that measure success in the sport.

The umpire ignores other variables in “the real machine”, like the heights of the players and lengths of their rallies.


The law does not imply a controller should maximise its variety

A controller is interested in state variations that influence its control actions.

E.g. The Bank of England may appear to be interested in only two variables: the interest rate and the money supply.

But it gathers a lot of data with a view to deciding how to manipulate those two variables.


On the other hand, Ashby emphasised the need to be selective: to “pick out and study the facts that are relevant to some main interest.”

E.g. Remember the database in which each citizen is in one of four states (rich communist, rich capitalist, poor communist and poor capitalist)?

The government monitors every citizen state change with a very particular aim.

When any citizen enters the “rich communist” state, the government sends a message: “You must either give away your money or become a capitalist”.

So, although the government records four citizen states, its real interest is limited to two states: rich communist or not.

They record more variety than they need.

It is inefficient to capture data that is currently irrelevant and unused.

And when redundant data is collected, it often turns out later that the unused data is of low quality.
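The government’s limited interest can be sketched as a controller that records four states but acts on only one condition (the data structure below is invented; the directive text follows the example above).

```python
def directive_for(citizen):
    """Four states are recorded, but only one triggers a control action."""
    if citizen["financial"] == "rich" and citizen["political"] == "communist":
        return "You must either give away your money or become a capitalist"
    return None  # the other three states need no action

citizens = [
    {"financial": "rich", "political": "communist"},
    {"financial": "poor", "political": "communist"},
    {"financial": "rich", "political": "capitalist"},
    {"financial": "poor", "political": "capitalist"},
]
directives = [directive_for(c) for c in citizens if directive_for(c)]
print(len(directives))  # 1 - only the rich communist matters to this controller
```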

On the complexity of the real world

Ashby illuminated thinking about the world in terms of systems that communicate by input/output data flows.

“Two or more whole machines can be coupled to form one machine” (Ashby 1956)


Ashby’s law of requisite variety best fits cases where

·       two systems are designed to be coupled to each other

·       one target system is regulated by one control system.

The coupling of systems in the real world is often more variegated and complex.


Not all inter-system relationships are control or regulation

Systems may be coupled, loosely or tightly, in various kinds of relationship:

·       Cooperation or symbiosis

·       Competition or conflict

·       Control or regulation


One real world entity may be regulated by several controllers

In the real world, a thing may be monitored and directed by several controllers.

E.g. A child may be regulated by its teachers and its parents.

E.g. Homeostasis in an oyster is maintained by distributed ganglia; there is no central nervous system.


Controllers may cooperate to regulate one entity

The state variables of the human body are maintained by relatively independent controllers.

E.g. The human blood system is regulated by two controllers, the kidney and the liver, which do not share information.

The kidneys regulate the amount of water in the body and balance the concentration of mineral ions in the blood.

The liver regulates the amount of glucose in the blood, and is in turn regulated by hormones, such as insulin and glucagon.


Controllers may compete to regulate one entity

E.g. A child might be monitored and directed differently by its teachers and parents, who work in competition to different ends.

This seems to be an example of dysfunctionality, but consider competition between controllers in a different example.

E.g. Two businesses (or business divisions) may compete to monitor and direct the purchasing behavior of one customer.

In this second example, competition might work to the benefit of the customer.


One entity may play roles in many systems

“Any one machine can be regarded as formed by the coupling of its parts, which can themselves be thought of as small, sub-machines.” (Ashby 1956)

A subsystem can be contained tightly in a wider system.

E.g. the kidney and the liver are firmly contained within a body.

And a cell contained in the kidney cannot also act in the liver.

By contrast, a subsystem may participate only loosely in a wider system.

E.g. a human actor can act in many different social networks.

Each person must trade-off between goals of their own and the goals of social networks they belong to.


The real world is a mess of more and less tightly coupled systems

"Managers do not solve problems, they manage messes." ~ Russell L. Ackoff

The business managed by a manager is not a system.

The business is countless systems; it is as many systems as observers can (a) define and (b) demonstrate that the business conforms to.

Moreover, much of what happens in a business is not systematic – and need not be systematic.


"There are no separate systems. The world is a continuum.

Where to draw a boundary around a system depends on the purpose of the discussion." ~ Donella Meadows

We can divide the world into distinct systems, nested systems, overlapping systems and competing systems.

The fuzziness of social network and system boundaries is a challenge for any attempt by sociologists to treat a social group as a system.


Several points above can make it difficult to apply cybernetics to sociology and management science.

See the conclusions below for further discussion.

Beer’s application of Ashby’s ideas

This section is the briefest of summaries.

For a longer version of this section read Beer’s ideas on the application of cybernetics (after Ashby and others) to management science.


In 1972 (the year Ashby died) Beer’s “Brain of the Firm” was published.

The book title echoes Ashby’s “Design for a Brain” 20 years earlier, and it is easy to see why.

In Ashby’s “Design for a Brain”, the human is a homeostatic machine that maintains essential state variables; the brain-to-body relationship is seen as a regulatory system with feedback loops.

In Beer’s “Brain of the Firm”, a business is a homeostatic machine that maintains essential state variables; the management-to-worker relationship is seen as a regulatory system with feedback loops.


Project Cybersyn

On taking office in 1970, the new Chilean president, Allende, faced a problem.

“How was he to nationalize hundreds of companies, reorient their production toward social needs, and replace the price system with central planning,

all while fostering the worker participation that he had promised?” Ref 3.

Beer was hired to help, and named his project Cybersyn, short for “cybernetics synergy”.

Six months after its launch, the project ended in 1973, when Allende was overthrown and Chilean politics swung away from central planning.


The Nobel prize-winning economist Hayek knew Beer, but they never agreed about planning.

In 1974, Hayek gave a famous prize acceptance speech called “The Pretence of Knowledge”.

He coined the term “scientistic” meaning “a mechanical and uncritical application of habits of thought to fields different from those in which they have been formed.”

This distillation of Hayek’s “Pretence of Knowledge” speech includes this expression of Hayek’s view.

“Fooled into believing that the measurable variables are the most critical, economists propose “solutions” that actually worsen the problem.”

Did Hayek have in mind the application of cybernetics to the requirement for worker participation in planning?

Beer learned from the project.


Beer’s biology-sociology analogy

Over the following years, Beer polished his ideas about applying cybernetics to business management, and gathering feedback from workers.

He refreshed and detailed his “Viable System Model”.

Find “Diagnosing the system for organisations” (1985) on the internet and look at Figure 37, or else the exemplar here.

Still, in essence, the VSM is a structural view of a business (and/or unit of a business) that divides it into 5 subsystems, each with its own functions.


Beer followed in the long tradition of social systems thinkers who have drawn a biology-sociology analogy.

He looked for inspiration in the structure of the human central nervous system (CNS).

Given what I understand of both, the CNS-VSM mapping is a loose analogy rather than an isomorphism, and misleading in some places.


Of course, the business functions that Beer positioned in the structure of his VSM can be found in businesses.

And information does flow into, up, down, around and out of a business.

However, the biology-sociology analogy seems to me a device for teaching and promotion.

Like many analogies, the more you think about it the less convincing it is.

Conclusions and remarks

Beer, Churchman, Checkland and others have given us models for thinking about a business or a system.

They are useful tools, at least when interpreted by a skilled management consultant or systems analyst.

This paper is neither for nor against the use of any such model.

It is about the applicability of hard science ideas in sociology and management science.

It makes some general points about those ideas and their limits.


Ashby’s law of requisite variety best fits cases where

·       two systems are designed to be coupled to each other

·       one target system is regulated by one control system.

The coupling of systems in the real world is often more variegated and complex.


On the application of general system theory and cybernetics to enterprise architecture

Most systems of interest to us are describable in the same general way.

That is, in terms of actors that interact in behaviors to advance the system’s state and/or consume/deliver inputs/outputs from/to the wider environment.

General system theory, enterprise and software architecture all embrace some cybernetics ideas.

Enterprise architects strive to ensure:

·       a business is optimally decomposed into separately describable and manageable systems

·       those systems are coupled where necessary, more or less tightly, using the best design pattern for the case.


Often, a system is required to monitor and direct the state of a target actor or activity.

The more variegated the actions to be directed, the more complex the system needs to be.

And enterprise architects should seek to ensure:

·       the system knows enough about the state of any target it monitors and directs

·       the system detects events that reveal significant state changes in a target - in an acceptably reliable and timely fashion

·       the system responds to events by sending appropriate directives - in an acceptably reliable and timely fashion

·       the system is coupled to its target tightly enough

·       inter-system messaging is efficient and effective enough

·       other (competing or complementary) controllers directing the target are recognised

·       the system recognises exceptions, when actors do not respond as desired to a directive

·       the system manages exceptions - in an acceptably reliable and timely fashion.
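A minimal sketch of the monitor/direct pattern behind these bullet points (all names invented, not a prescription): the controller keeps a model of its target’s state, responds to state-change events with directives, and records an exception when the target does not comply.

```python
class Controller:
    """Keeps a model of the target's state, issues directives when the target
    deviates, and records an exception when a directive is not obeyed."""

    def __init__(self, desired):
        self.desired = desired
        self.model = None      # the controller's model of the target's state
        self.exceptions = []

    def on_event(self, observed_state):
        """Update the model from a state-change event; direct if needed."""
        self.model = observed_state
        if observed_state != self.desired:
            return f"move to {self.desired}"  # directive sent to the target
        return None

    def on_response(self, observed_state):
        """Recognise an exception: the target did not respond as directed."""
        if observed_state != self.desired:
            self.exceptions.append(observed_state)

c = Controller(desired="running")
print(c.on_event("stopped"))  # move to running
c.on_response("stopped")      # the target ignored the directive
print(len(c.exceptions))      # 1
```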


On the biology-sociology analogy

For sure, the behaviors of animals and businesses can be described in terms of three functions.

·       Sensing: collecting input from the external environment.

·       Integrating: processing and interpreting the input – sometimes with reference to the system’s state/memory.

·       Responding: acting appropriately to the input, changing state and/or producing output.
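The three functions can be sketched as a toy pipeline (the running-average interpretation and the threshold are invented for illustration, not drawn from the paper).

```python
def sense(environment):
    """Sensing: collect an input from the external environment."""
    return environment["signal"]

def integrate(signal, memory):
    """Integrating: interpret the input with reference to state/memory,
    here as a running average of the signals seen so far."""
    memory.append(signal)
    return sum(memory) / len(memory)

def respond(interpretation, threshold=0.5):
    """Responding: produce an output appropriate to the interpretation."""
    return "act" if interpretation > threshold else "wait"

memory = []
print(respond(integrate(sense({"signal": 0.9}), memory)))  # act
print(respond(integrate(sense({"signal": 0.0}), memory)))  # wait
```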


Beyond that, many biology-sociology analogies are questionable

Some systems thinkers, notably Ackoff and Bausch, have deprecated drawing a biology-sociology analogy.

One reason: human societies are not homeostats; rather than stay stable, they continually evolve.

Another reason: human actors are free to act as they choose, contrary to any system they might be expected to play a role in.


On what is beyond the scope of cybernetics

Cybernetics applies to systems that sense/consume inputs and respond by changing state and/or producing outputs.

Ashby urged us to recognise every “real machine” is infinitely more complex than any system abstracted from it.

Likewise, every real-world business (or other social network) is infinitely more complex than any system abstracted from it.

A lot of information, some essential to business survival, flows up, down and around a business.

Much of that information is ad hoc or informal, and stimulates some irregular, unpredictable or unrepeatable behavior.

Those information flows and behaviors are beyond the scope of cybernetics as Ashby defined it.


On the subjectivity of complexity

Some say thinking about a system’s structures is inadequate because what we need to understand, monitor and control is behaviors.

Beer said thinking about the four Ms (Men, Machines, Materials and Money) is inadequate because managers need to think about managing complexity.

How to objectively assess the relative complexity of two business systems?

You might try to do this systematically, as follows.

1.     Choose your measure of complexity (Ashby’s or other)

2.     Identify the system elements to be described (roles, actors, processes, variables, whatever)

3.     Describe the two real world businesses in terms of those elements

4.     Demonstrate your two descriptions have been made to the same level of abstraction.

5.     Test that the two real world business systems behave according to your descriptions.

6.     Then apply the complexity measure to each business system and compare them.


The process is impractical for all but trivial systems and complexity measures.

So Beer said a subjective assessment of relative complexity is valid.

OK, but that is to shift away from the objective science Ashby sought to establish.


On decentralisation of what is managed

A supply chain or manufacturing business is very different from an information processing or knowledge worker business.

Every human organisation has to find its own balance between hierarchy and anarchy.


E.g. A current fashion (in “agile architecture”) is to dismember a system and decouple its subsystems (“microservices”) as far as possible.

The team responsible for one subsystem is encouraged to develop it independently.

Teams are encouraged to be self-organising; actors are encouraged to be self-determining.

They are allowed to choose their own actions to meet given aims; perhaps even choose their own aims.

They may even make and deploy their own control or coordination system, perhaps in cooperation with neighbours.


On social networks v. social systems

There is a big difference between:

·       A social system in which regular activities are performed by a group of actors

·       A social network in which a group of actors perform whatever activities they choose.


A real-world business may be seen as one social network that realises many socio-technical systems.

Designing a system using cybernetics is one thing, managing the social network another.

Directing activities based on analysis of collected data is certainly one part of managing a business.

Motivating and helping people to reach aims is another part (along with rules that discourage aberrant behaviors).


In conclusion

There is some scientism and pseudo-science in social systems thinking discussion.

“There are limits to what science and the scientific method can achieve.

In particular, studying society generally shows how difficult it is to control.

This truth may disappoint those who want to build a science to shape society.

But the scientific method is not a recipe that can be mechanically applied to all situations.” From this distillation of Hayek’s “Pretence of Knowledge” speech

References and reading

This is one of many companion papers that analyse some systems thinkers’ ideas.

·       Read Ashby’s ideas for an introduction to his ideas about cybernetics, mostly integral to general system theory

·       Read Ackoff’s ideas on the application of general system theory (after von Bertalanffy and others) to management science.

·       Read Ashby’s ideas about variety on his measure of complexity and law of requisite variety.

·       Read Beer’s ideas on the application of cybernetics (after Ashby and others) to management science.

·       Read Von Foerster’s ideas on ideas attributed to Heinz von Foerster and his second order cybernetics.


Beer’s ideas serve to illustrate some points made in this paper.

Further reading on the “System Theory” page at includes:

Boulding’s ideas, Checkland’s ideas, Luhmann’s ideas, Marx and Engels’ ideas, Maturana’s ideas and Snowden’s ideas.


A variety of sources were referred to in the course of writing this paper; they include Ashby’s “Design for a Brain” (1952) and “Introduction to Cybernetics” (1956).

Unless otherwise stated, quotes are from “Introduction to Cybernetics” (1956) by W. Ross Ashby.

Some speak of “complex adaptive systems”, where the meaning of all three terms is debatable.

Read “Complex adaptive systems” for more on that.



All free-to-read materials at are paid for out of income from Avancier’s training courses and methods licences.

If you find the web site helpful, please spread the word and link to in whichever social media you use.