Ashby’s law of requisite variety

Copyright 2019 Graham Berrisford. A chapter in the book https://bit.ly/2yXGImr. Last updated 22/07/2021 11:26

 

Social systems thinkers often use the terms “variety” and “complexity”, and refer to Ashby’s “law of requisite variety”. Some apply Ashby's law to the management of human institutions, if not whole nations. Arguably, these applications of the law are more art than science, more inspired than directed by cybernetic principles.

Contents

Systems

Variety

Regulation (repeat)

Managing complexity/variety

On applying cybernetics to society

Conclusions and remarks

 

Systems

Ashby defined a system in two ways, and defined its complexity as its variety.

 

System as a set of state variables

The physical or material state of a system is represented by the values of state variables. Ashby defined a system as a set of variables chosen for attention, and relationships between those variables, established by observation, experimentation, or design. For example, the state of a wolf-sheep (predator-prey) system is represented by wolf and sheep population variables, and the state of a tennis match is represented by game, set and match scores.

 

The focus of cybernetics is mostly on quantitative variables, like the levels of a stock, population or resource. Given that a quantitative variable’s value changes over time, its line of behavior may be shown on a graph of quantity against time.
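
To make this concrete, here is a minimal sketch (an illustrative toy, not Ashby's own model) of a wolf-sheep system as a set of two state variables, each tracing a line of behavior over time:

# A toy predator-prey system: the system state is the pair (wolves, sheep).
# The update rule is an invented discrete approximation, chosen only to
# show state variables changing over time.

def step(wolves, sheep):
    births = 0.1 * sheep                   # sheep reproduce
    predation = 0.002 * wolves * sheep     # sheep eaten by wolves
    wolf_growth = 0.0005 * wolves * sheep  # wolves fed by predation
    wolf_deaths = 0.05 * wolves            # wolves die off
    return wolves + wolf_growth - wolf_deaths, sheep + births - predation

wolves, sheep = 50.0, 1000.0
for t in range(10):
    print(f"t={t}  wolves={wolves:.0f}  sheep={sheep:.0f}")
    wolves, sheep = step(wolves, sheep)

Plotting either printed column against t would give that variable's line of behavior.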

 

System as a set of rule-bound activities

In “Introduction to Cybernetics”, Ashby defined systems by what they do over time, as a set of regular or repeatable state changes, rather than what they are made of. This shift in perspective, from the physical structure of a system to its logical behavior, is central to the cybernetic view of the world. When a cybernetician calls a system complex, the reference is to its lines of behaviour rather than its material structure.

Variety

“A system's variety V measures the number of possible states it can exhibit.” Ashby 1956

 

Ashby equated the complexity of a system with its variety.

 

Identifying variables

An observer must begin by identifying which variables of a thing must be monitored and/or regulated to meet some goal(s) already given, then identify the variable values that will trigger controlling actions.

 

A target system’s state is relative to the controller’s interest in it. E.g. Consider a simple bi-metal strip thermostat as a controller, and a heating system as its target. The thermostat moves a pointer along a temperature scale drawn in Celsius from 0 to 100. Does the system have 100 states? If the scale were in Fahrenheit, would there be more states? No, because the thermostat has only two control actions – switch the heater on or off. So it is interested in only two states of the target system: a) above a set temperature, and b) below a set temperature.
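
That two-state view can be sketched as follows, assuming a hypothetical set point of 20°C: however finely the temperature is measured, the controller collapses it into two states and two actions.

# The thermostat reads a (potentially very precise) temperature, but it
# distinguishes only two states of the target system: below or not below
# the set point. Its variety of actions is likewise two: ON or OFF.

SET_POINT_C = 20.0  # hypothetical set temperature

def thermostat(temperature_c: float) -> str:
    return "HEATER ON" if temperature_c < SET_POINT_C else "HEATER OFF"

for reading in (3.17, 19.999, 20.0, 57.3):
    print(f"{reading:>7}°C -> {thermostat(reading)}")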

 

Different controllers may perceive a target as having different varieties. E.g. Consider two observers who work to control the state of a lake. The first is interested in maintaining the lake’s water level, and is therefore interested in the upper and lower water levels that are acceptable, and the flow rates of water in and out. The second is interested in maintaining the populations of several biological species interacting in a balanced ecology, so is interested in the upper and lower numbers for each species’ population.

 

Calculating variety

An entity with N binary variables can have 2^N possible states. If a citizen has two binary variables (financial status and political status), then each citizen has four possible states.

 

                  Rich                Poor

Communist         Rich Communist      Poor Communist

Capitalist        Rich Capitalist     Poor Capitalist

 

The variety/complexity of a nation with one citizen is 4. Ashby’s measure does not account for there being a population of entities of the same type. What is the variety of a nation with a million citizens? And in practice, a large and complex business may maintain thousands of different types of state variable. So, the value of its variety is incalculable, and well-nigh infinite.
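
A back-of-envelope calculation (a sketch, not anything from Ashby) shows how fast the joint variety of a population explodes:

import math

# One citizen with two binary variables has 2**2 = 4 possible states.
n_binary_variables = 2
citizen_variety = 2 ** n_binary_variables
print(citizen_variety)          # 4

# A nation of a million such citizens, taken jointly, has 4**1000000
# possible states - a number with over 600,000 decimal digits,
# computable in principle but useless as a practical measure.
million = 1_000_000
digits = math.floor(million * math.log10(4)) + 1
print(digits)                   # 602060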

 

Counting possible states is only one of scores of ways to measure complexity. The complexity of activities might be measured as the number of input event/state combinations multiplied by the average procedural complexity of the rules applied to each. The complexity of a social entity might be measured in terms of the structure(s) in which its actors are connected. There is no single agreed measure of complexity.

 

Having said that, one may make subjective judgements about whether one entity or system is more or less complex than another, in terms of its possible states. Still, Ashby’s measure of variety is relevant to our ability to monitor and regulate the state of an entity.

Regulation (repeat)

A missile guidance system must sense the direction and speed of a missile, in order to keep it on course. An animal’s brain must recognize and remember the state of the things (food, friends, enemies etc.) it monitors and directs. Likewise, a business must recognize the state of the things (customers, suppliers, etc.) it monitors and directs. In each case, sensations and memories must model or represent reality well enough for the animal or business to act effectively and survive.

 

Two ideas from the earlier chapter are relevant here.

 

Ashby 6: A regulator needs a model of its environment

The Conant-Ashby theorem, or “good regulator” theorem, was conceived by Roger C. Conant and W. Ross Ashby and is central to cybernetics. In short, it states that every good regulator of a system must be (or have) a model of that system.

 

Abstract: "The design of a complex regulator often includes the making of a model of the system to be regulated. The making of such a model has hitherto been regarded as optional, as merely one of many possible ways. In this paper a theorem is presented which shows, under very broad conditions, that any regulator that is maximally both successful and simple must be isomorphic with the system being regulated. (The exact assumptions are given.) Making a model is thus necessary.

 

The theorem has the interesting corollary that the living brain, so far as it is to be successful and efficient as a regulator for survival, must proceed, in learning, by the formation of a model (or models) of its environment."

https://www.tandfonline.com/doi/abs/10.1080/00207727008920220

 

This principle is a foundation for much in this book. Read this triangle from left to right: regulators <have and use> models, which <represent> targets.

 

The good regulator

                          Models
        <have and use>            <represent>
Regulators    <monitor and regulate>    Targets

 

Evidently, to function and respond to changes, an animal must “know” what is going on in its world. It needs a model of the entities and events in its environment if it is to find food and mates, and avoid enemies.

 

The regulator can be an animal, machine or business that has a model, or has access to a model, of what it needs to monitor and direct. Organic, mechanical, business and software entities may all be connected to variables they monitor and direct by feedback loops. A brain holds a model of things in its environment, which an organism uses to manipulate those things. A missile guidance system senses spatial information, and sends messages to direct the missile. A business database holds a model of business entities and events, which people use to monitor and direct those entities and events.

 

The richer the model, the more adaptive the animal, machine or business can be to changes in its environment.

 

Note that a regulator models only those variables of the target system it seeks to control, not necessarily all the variables that the target maintains. Note also that in a discrete event-driven system, a stateless regulator can import its model before it processes an input, then put the model away again.
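
That last point can be sketched as follows, assuming a hypothetical order-handling regulator, with a dict standing in for the store that holds the model:

# A stateless event-driven regulator: its model of the target lives in a
# store (here a dict standing in for a database), and is imported only for
# the duration of one event.

store = {"order-42": {"status": "placed"}}   # hypothetical model of a target

def handle_event(target_id, event):
    model = store[target_id]            # import the model
    if event == "paid" and model["status"] == "placed":
        model["status"] = "paid"
        directive = "dispatch goods"
    else:
        directive = "no action"
    store[target_id] = model            # put the model away again
    return directive

print(handle_event("order-42", "paid"))  # dispatch goods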

 

Ashby 7: Regulators must recognize the variety they control

“A system's variety V measures the number of possible states it can exhibit.” Ashby 1956

 

You cannot turn a light on unless it is off, or turn it off unless it is on. Regulators must recognize the variety in the states they seek to control. The wider the variety of states to be controlled, the wider the variety of states the regulator must recognize, and the wider the variety of actions it must be able to perform. This is not a physical law like Newton’s laws of motion, which may be supported or disproven by experiment. It is an information law, an expression of what is mathematically inevitable.

 

Ashby’s law of requisite variety is commonly expressed as: “only variety can absorb variety”. He expressed it rather more clearly as follows.

"The larger the variety of actions available to a control system, the larger the variety of perturbations it is able to compensate".  Ashby 1956

 

Perturbations are changes to a target system’s variables that move their values outside of a normal or desirable range. The more ways a target can deviate from its desired state, the more control actions its controller(s) will need.
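
The arithmetic behind the law can be sketched as a lookup from perturbations to compensating actions: a regulator with fewer actions than kinds of perturbation must leave some disturbances uncorrected (the names below are invented for illustration).

# Each kind of perturbation needs a compensating action. A regulator whose
# repertoire is smaller than the set of perturbations cannot hold the
# target in its desired state against all of them.

perturbations = {"too hot", "too cold", "too humid"}

repertoire = {                # hypothetical regulator with only two actions
    "too hot": "cooling on",
    "too cold": "heating on",
}

for p in sorted(perturbations):
    action = repertoire.get(p)
    print(p, "->", action or "UNREGULATED: no compensating action")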

 

This law has some application to business architecture. However, there are limits to how far a target can be controlled by a regulator.

 

A regulator can control only selected variables. A government institution or a business (like IBM) is a social entity definable in terms of countless variables. Different regulators – with different interests – may see it as different systems with different varieties. There is no prospect of typifying the whole of IBM, all describable variables, in one abstract system.

 

A regulator can’t fully control variables also controlled by other forces. A regulator designed to monitor and direct some selected variables of IBM may find they are buffeted by other factors – including other regulators and market forces.

Managing complexity/variety

What if a controller has insufficient variety to control a target system? Then design options include:

·           Amplifying (increasing) the variety in the control/management system

·           Attenuating (reducing) the variety in the target/operational system (both options are sketched below).
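
A sketch of both moves, with invented numbers: attenuation groups the target’s many raw states into the few classes the controller cares about; amplification widens the controller’s repertoire of actions.

# Attenuation: map many raw target states onto a few classes of interest,
# so the controller needs less variety to recognize them.
def attenuate(temperature_c):
    if temperature_c < 18: return "too cold"
    if temperature_c > 22: return "too hot"
    return "ok"

# Amplification: widen the controller's repertoire of actions so it can
# compensate more kinds of perturbation.
actions = {"too cold": "heating on", "ok": "do nothing"}
actions["too hot"] = "cooling on"   # amplified: one more control action

for raw in (12.4, 19.7, 30.1):
    state = attenuate(raw)
    print(raw, "->", state, "->", actions[state])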

 

Amplifying and attenuating variety were major themes in Beer’s work on management cybernetics (“the profession of control”, as he called it). However, he ran into the problem that you cannot control a target that is in reality controlled by other levers, outside your control.

 

And there are other design options for managing complexity, including:

·           Improve information flow qualities (speed, throughput, integrity etc.)

·           Tighten the control-target coupling

·           Remove any rival controllers that compete with the one we wish to succeed

·           Decentralize control: divide regulation responsibilities between more or less autonomous controllers (E.g. kidney and liver)

·           Decentralize what is managed: divide the target system into more or less autonomous subsystems (E.g. microservices).

 

In enterprise architecture (EA) and business architecture (BA), systems must remember enough about entities and events of interest (both inside and outside the system of interest) to direct them as need be.

On applying cybernetics to society

Ashby hoped his principles for the regulation of a target by a controller could be applied to large and complex entities like a human society.

 

“Preface: [his book] introduces the principles that must be followed when the system is so large and complex (e.g. brain or society) that it can be treated only statistically.”

 

“1/6 cybernetics is likely to reveal a great number of interesting and suggestive parallelisms between machine and brain and society. And it can provide the common language by which discoveries in one branch can readily be made use of in the others.”

 

“4/16. Cybernetics, however, looks forward to being able to handle systems of vastly greater complexity—computing machines, nervous systems, societies.”

 

“12/23. this chapter has treated only of systems that were sufficiently small and manageable to be understood. What happens, he may ask, when regulation and control are attempted in systems of biological size and complexity? What happens, for instance, when regulation and control are attempted in the brain or in a human society? Discussion of this question will occupy the remaining chapters.”

 

“13/10. Here we shall be thinking not so much of the engineer at his bench as of the brain that, if it is to achieve regulation in its learned reactions, must somehow cause the development of regulatory machinery within the nervous material available; or of the sociologist who wants a regulatory organisation to bring harmony into society.”

 

“14/6. is there not a possibility that we can use our present powers of regulation to form a more highly developed regulator, of much more than human capacity, that can regulate the various ills that occur in society, which, in relation to us, is a very large system?”

 

There are several difficulties with the notion of controlling or regulating a large and complex society.

 

The need to scope the system

"There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion." ~ Donella Meadows

 

The arbitrariness of system boundaries is a challenge for any attempt to treat a business as one system. Above, Ashby drew a one-to-one correspondence between a large and complex social entity and a system. Elsewhere, he said that infinite systems might be abstracted from one material entity. And for sure, a business is a social entity in which countless systems (that it employs and participates in) might be identified – each defined by a different set of state variables and rules for how they change over time.

 

The need to select the right variables

“Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made. What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” (Ashby 1956)

 

Ashby’s law implies that to regulate a social entity, one must identify all the variables critical to the regulation. For a controller, Ashby’s variety is not an absolute measure of a thing; it applies only to those variables the controller seeks to control.

 

So, the law does not mean a controller knows all the state variables of the thing controlled. E.g. A thermostat knows only the temperature of its target environment. It ignores countless other environment variables that might be measured. Other controllers - with different interests – may monitor different variables and see different measures of variety.

 

The law does not imply a controller should maximize its variety. Imagine a database in which every citizen is recorded as being in one of the four states above (rich communist, rich capitalist, poor communist and poor capitalist). The government monitors every citizen state change with the very particular aim that when any citizen enters the “rich communist” state, the government sends a message: “You must either give away your money or become a capitalist”. The government recognizes more variety than it needs to, since its interest is limited to two states – rich communist or not. On a larger scale, it is inefficient to capture data that is currently irrelevant and unused. And when redundant data is collected, it often turns out later that the unused data is of low quality.
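
A sketch of that rule, using the four-state citizen model above (the message text is from the example; the citizen names are invented):

# The database records four citizen states, but the control rule needs to
# distinguish only two: "rich communist" or anything else.

citizens = {
    "anna": ("rich", "communist"),
    "bert": ("poor", "communist"),
    "carl": ("rich", "capitalist"),
}

def on_state_change(name, wealth, politics):
    if (wealth, politics) == ("rich", "communist"):
        return f"To {name}: give away your money or become a capitalist."
    return None   # the other three states need no action

for name, (wealth, politics) in citizens.items():
    message = on_state_change(name, wealth, politics)
    if message:
        print(message)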

 

The ideal is a “lean controller” that knows a minimal amount about what is going on in the entity it controls. Although the Bank of England may appear to be interested in only two variables – the interest rate and the money supply – it surely gathers a lot of data with a view to measuring those variables and deciding what to do.

 

The Nobel prize-winning economist Hayek famously gave an acceptance speech titled “The Pretence of Knowledge”. He coined the term “scientistic”, meaning “a mechanical and uncritical application of habits of thought to fields different from those in which they have been formed”, and said:

“Fooled into believing that the measurable variables are the most critical, economists propose “solutions” that actually worsen the problem.”

 

The need for tight and exclusive coupling

The law says having enough variety is a necessary precondition to control selected variables in a target system. It does not say having enough variety is a sufficient precondition. To be effective, a controller must also be coupled to its target system tightly enough, and have exclusive control over its variables.

 

These two conditions may be presumed in the design of a machine in which all the components serve the purposes of the machine, but not in a social entity, in which the actors are self-aware and self-determining, and other entities may be coupled to the target, affecting the same critical variables.

 

A challenge here is the many-to-many relationship between systems. Ashby wrote of cases where just two systems are designed to be coupled to each other: one target system regulated by one control system. The coupling of systems in the real world is more variegated and complex.

 

One entity may be regulated by several controllers. E.g. A child may be monitored and directed by both its teachers and its parents. Homeostasis in an oyster is maintained by distributed ganglia; there is no central nervous system.

 

Several controllers may cooperate to regulate one entity. The state variables of the human body are maintained by relatively independent controllers. E.g. The human blood system is regulated by two controllers – the kidney and the liver – which do not share information. The kidneys regulate the amount of water in the body and balance the concentration of mineral ions in the blood. The liver regulates the amount of glucose in the blood, and is in turn regulated by hormones, such as insulin and glucagon.
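
That division of labour can be sketched as two controllers reading and writing disjoint variables of one shared state, exchanging no information with each other (the numbers are invented, not physiology):

# Two controllers regulate disjoint variables of one shared "blood" state,
# without sharing any information with each other.

blood = {"water": 0.4, "glucose": 8.5}    # invented values

def kidney(state):                        # regulates water only
    if state["water"] < 0.5:
        state["water"] += 0.05            # retain water
    elif state["water"] > 0.6:
        state["water"] -= 0.05            # excrete water

def liver(state):                         # regulates glucose only
    if state["glucose"] > 6.0:
        state["glucose"] -= 1.0           # store glucose as glycogen
    elif state["glucose"] < 4.0:
        state["glucose"] += 1.0           # release glucose

for _ in range(3):
    kidney(blood)
    liver(blood)
    print(blood)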

 

Several controllers may compete to regulate one entity. E.g. A child might be monitored and directed differently by its teachers and parents, who work in competition to different ends. This seems to be an example of dysfunctionality, but consider competition between controllers in a different example. Two businesses (or business divisions) may compete to monitor and direct the purchasing behavior of one customer. In this second example, competition might work to the benefit of the customer.

 

One actor may play a role in several systems. A subsystem can be contained tightly in a wider system. E.g. the kidney and the liver are firmly contained within a body, and a cell contained in the kidney cannot also act in the liver. By contrast, a subsystem may participate only loosely in a wider system. E.g. a human actor can act in many different social networks. Each person must trade off their own goals against the goals of the social networks they belong to.

 

In short, the real world is a mess of more and less tightly coupled systems.

"Managers do not solve problems, they manage messes." ~ Russell L. Ackoff.

 

The situations above can make it difficult to apply cybernetics to sociology and management science.

Conclusions and remarks

Ashby’s law tells us something useful: a system can be controlled by a regulator only if the regulator recognizes enough variety in the system’s state. But the law can be misinterpreted.

 

The law is often expressed as: “only variety can absorb variety”. Ashby expressed it more clearly as "The larger the variety of actions available to a control system, the larger the variety of perturbations it is able to compensate".


Ashby’s variety is not an absolute measure of a thing; it applies only to the narrow selection of variables a controller seeks to control. “Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made. What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.”



 

Ashby’s law works best where two systems are coupled one-to-one. The coupling of systems in the real world is often many-to-many, and so more complex.

 

The law tells us that, given the infinite variety of our environment, we would need an infinite variety of responses to control it. There is only so much one control system can do; as new variables emerge, it will have to be revised, or other control systems introduced.

 

In short, Ashby’s law can be misinterpreted, and there is more to know.

 

·       The law does not say variety is all a controller needs

·       The law does not mean a controller knows all the state variables of the thing controlled

·       The law does not imply a controller should maximize its own variety

·       One real world entity may be regulated by several controllers

·       Several controllers may cooperate to regulate one entity

·       Several controllers may compete to regulate one entity

·       The real world can be seen as a mess of more and less tightly coupled systems.

 

EA and BA embrace some cybernetic ideas. They define a business in terms of separately describable systems, and couple those systems more or less tightly, using the best design pattern for the case.

 

Often, a system is required to monitor and direct the state of a target actor or activity. The more variegated the actions to be directed, the more complex the system needs to be. And enterprise architects should seek to ensure:

·       the system knows enough about the state of any target it monitors and directs

·       the system detects events that reveal significant state changes in a target - in an acceptably reliable and timely fashion

·       the system responds to events by sending appropriate directives - in an acceptably reliable and timely fashion

·       the system is coupled to its target tightly enough

·       inter-system messaging is efficient and effective enough.

·       other (competing or complementary) controllers directing the target are recognised

·       the system recognises exceptions, when actors do not respond as desired to a directive

·       the system manages exceptions - in an acceptably reliable and timely fashion.

   

In conclusion

There is some scientism and pseudo-science in social systems thinking discussions.

“There are limits to what science and the scientific method can achieve. In particular, studying society generally shows how difficult it is to control. This truth may disappoint those who want to build a science to shape society.

But the scientific method is not a recipe that can be mechanically applied to all situations.”

Hayek