Ashby’s ideas about variety

Copyright 2017 Graham Berrisford. One of more than 100 papers on the “System Theory” page at http://avancier.website . Last updated 02/05/2019 13:21

 

This is one of many companion papers that analyse some systems thinkers’ ideas.

·         Read Ashby’s ideas for an introduction to his cybernetics, much of which is integral to general system theory.

·         Read Ackoff’s ideas on the application of general system theory (after von Bertalanffy and others) to management science.

·         Read Ashby’s ideas about variety (this paper) on his measure of complexity and his law of requisite variety.

·         Read Beer’s ideas on the application of cybernetics (after Ashby and others) to management science.

·         Read Von Foerster’s ideas for a discussion of ideas attributed to Heinz von Foerster, including his second-order cybernetics.

 

Further reading on the “System Theory” page at http://avancier.website includes:

Boulding’s ideas, Checkland’s ideas, Luhmann’s ideas, Marx and Engels’ ideas, Maturana’s ideas and Snowden’s ideas.

 

This paper is about the applicability of hard science ideas in sociology and management science.

Beer’s ideas serve to illustrate some points made in this paper.

Contents

Cybernetics

Ashby’s “Variety”

Variety revisited

Beer’s application of Ashby’s ideas

Conclusions and remarks

References and reading

 

Cybernetics

First, a few general system theory ideas.

·         Environment: the world outside the system of interest.

·         Boundary: a line (physical or logical) that separates a system from its environment, and encapsulates a system as an input-process-output “black box”.

·         Hierarchy: a system is composed from interacting subsystems; systems are recursively composable and decomposable.

·         State: the current structure or variables of a system, which changes over time.

 

A system can be seen as a black box that consumes inputs from its environment and produces outputs.

Inputs stimulate internal processes or behaviors that change the system’s internal state and/or produce outputs.

In most systems of interest to us, multiple subsystems or actors interact to complete the behaviors.

These and other general system theory ideas are taken for granted in most of today’s enterprise and software architecture methods.

They weren’t so widely recognised in the first half of the 20th century.

 

Cybernetics emerged out of efforts in the 1940s to understand the role of information in system control.

Norbert Wiener introduced cybernetics as the science of biological and mechanical control systems.

He discussed how a controller directs a target system to maintain some state variable(s) in a desired range.

E.g. A thermostat directs the actions of a heating system.
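
As a minimal sketch of this control loop (my illustration; the function name and thresholds are assumptions, not Wiener’s), the thermostat’s rule can be written in a few lines of Python.

    # A minimal sketch of a controller directing a target (illustrative values).
    def thermostat_action(temperature, low=18.0, high=22.0):
        """Return the directive that keeps temperature within [low, high]."""
        if temperature < low:
            return "switch heater on"    # state variable below desired range
        if temperature > high:
            return "switch heater off"   # state variable above desired range
        return "do nothing"              # state variable within desired range

    print(thermostat_action(16.5))       # -> switch heater on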

 

W. Ross Ashby (1903-1972) was a psychologist and systems theorist.

He popularised the usage of the term 'cybernetics' to refer to self-regulating (rather than self-organising) systems.

“Cybernetics is a ‘theory of machines’.”

“Our starting point is the idea, much more than a century old, that a machine, in given conditions and at a given internal state, always goes to a particular state.”

·         “A variable is a measurable quantity that has a value.”

·         “The state of the system is the set of values that the variables have.”

·         “A system is any set of variables which he [the observer] selects from those available on the real machine.” (“Introduction to Cybernetics”, Ashby, 1956)
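
Ashby’s determinate machine can be sketched as a transition function from (state, condition) to the next state; the toy turnstile below is my illustration, not Ashby’s example.

    # Toy sketch of a determinate machine (my example, not Ashby's):
    # in given conditions and at a given internal state, it always goes
    # to one particular next state.
    transitions = {
        ("locked",   "coin"): "unlocked",
        ("locked",   "push"): "locked",
        ("unlocked", "coin"): "unlocked",
        ("unlocked", "push"): "locked",
    }

    def next_state(state, condition):
        return transitions[(state, condition)]

    print(next_state("locked", "coin"))   # -> unlocked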

 

In “Design for a Brain” (1952), Ashby addressed biological (rather than mechanical or electronic) homeostatic machines.

He presented the brain as a regulator that maintains a body’s state variables in the ranges suited to life.

This pairing distils the general idea.

Generic system description: A collection of actors that interact in regular behaviors that maintain system state and/or consume/deliver inputs/outputs from/to the wider environment.

Ashby’s design for a brain: A collection of brain cells that interact in processes to maintain body state variables by receiving/sending information from/to bodily organs/sensors/motors.

 

Ashby and other general system theorists focused attention on a system’s behaviors rather than its actors.

“Cybernetics deals with all forms of behaviour in so far as they are regular, or determinate, or reproducible.”

“[It] treats, not things but ways of behaving. It does not ask ‘what is this thing?’ but ‘what does it do?’

It is thus essentially functional and behaviouristic.” (Ashby 1956)

Ashby’s “Variety”

Some systems thinkers have leaned on Ashby’s ideas related to variety, a few of which are summarised in this section.

 

Variety as a measure of complexity

“A system's variety V measures the number of possible states it can exhibit.”

Ashby equated the complexity of a system with its variety.

Variety is not an absolute measure of a thing; it relates to a controller’s interest in that thing as a system.

The possible states a target can have are relative to the interest the controller has.

Different controllers - with different interests - perceive a target as having different varieties.

 

The law of requisite variety: “only variety can absorb variety”

"The larger the variety of actions available to a control system, the larger the variety of perturbations it is able to compensate".

Perturbations are changes in the values of a target system’s variables that need to be regulated by the controller.
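
One toy way to read the law (my sketch, not Ashby’s formalism): regulation succeeds only if, for every perturbation the target can suffer, the controller has an action that compensates for it.

    # Toy sketch of requisite variety (my illustration, not Ashby's formalism).
    # Each perturbation shifts an essential variable by some amount; control
    # succeeds only if some available action cancels each possible shift.
    def can_regulate(perturbations, actions):
        return all(-p in actions for p in perturbations)

    print(can_regulate({-2, -1, +2}, {+2, +1}))      # False: too few actions
    print(can_regulate({-2, -1, +2}, {+2, +1, -2}))  # True: requisite variety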

 

In the context of a homeostatic system this means the following.

·         The purpose of a controller is to monitor and regulate one or more essential state variables of a target system.

·         To succeed, the controller must be able to recognise when a target state variable value strays outside a desired range.

·         The more ways a target system can deviate from its desired state, the more control actions its controller(s) will need.

 

Accordingly, Conant and Ashby (1970) stated the so-called “Good Regulator theorem”:

"every Good Regulator of a System Must be a Model of that System".

The principle was at work in biology and sociology long before the theorem was articulated.

Both animals and businesses are connected to their wider environment by input-output feedback loops.

They remember the state of actors and activities that they monitor, inform and direct.

They update their memories in response to inputs revealing state changes in those actors and activities.

·         The brain of an animal maintains mental models of things (food, friends, enemies etc.) it monitors and directs.

·         The information systems of a business maintain documented models of things (customers, suppliers, etc.) it monitors and directs.

These memories must model or represent reality well enough, if animals and businesses are to act effectively and survive.

 

Managing complexity (variety)

Where a controller has insufficient variety, design options include:

·         Amplifying (increasing) the variety in the control or management system

·         Attenuating (reducing) the variety in the target or operational system.

 

Amplifying and attenuating variety were major themes in Beer's work in management (the profession of control, as he called it).

Note that there are other design options for managing complexity - mentioned in the conclusions of this paper.

Variety revisited

Ashby equated the complexity of a system with its variety.

His law of requisite variety tells us something useful; but it can be misinterpreted and there is more to know.

 

Variety is not an absolute measure of a thing; it relates to a controller’s interest in that thing as a system

A binary variable (like a “bit” in computer science or a “truth value” in mathematical logic) has one of two values.

E.g. An actor might be described by two such variables: financial status = rich or poor, and political status = communist or capitalist.

A system with N such variables can have 2^N possible states; so the actor described above can have four (2^2) possible states.

Imagine the combinatorial explosion of actor varieties where an actor is described by many variables that have many possible values.

And now imagine a society of many actors – the value for its variety is huge and well-nigh incalculable.
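
To make the arithmetic concrete, here is a small Python sketch (my illustration, not Ashby’s notation) of how variety grows with the number of variables and their values.

    # Variety as state count: the product of each variable's number of values.
    import math

    def variety(value_counts):
        return math.prod(value_counts)

    print(variety([2, 2]))       # 4: the two-variable actor described above
    print(variety([2] * 30))     # 1073741824: thirty binary variables
    print(variety([10] * 100))   # 10**100: a modest model of a small society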

 

However, Ashby saw every system as a “soft system” in the sense that it is a selective perspective of the real world.

“every material object contains no less than an infinity of variables and therefore of possible systems.

Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” (Ashby 1956)

 

The possible states a target can have are relative to the interest the controller has

E.g. A simple bi-metal strip thermostat has two control actions – switch a heater on or off.

It also moves a pointer along a temperature scale drawn in Celsius from 0 to 100.

It is interested in three states on the scale – the pointer is above the highest desired temperature, below the lowest, or in between.

The target’s state variety is only three - it does not become more complex when the scale is drawn in Fahrenheit.
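
In code (an illustrative sketch, with assumed thresholds), the controller collapses the continuous scale to the three states it is interested in; redrawing the scale in Fahrenheit changes the thresholds but not the variety.

    # Illustrative sketch: the controller's view of the target has variety 3.
    def perceived_state(reading, low, high):
        if reading < low:
            return "below desired range"
        if reading > high:
            return "above desired range"
        return "within desired range"

    print(perceived_state(25.0, low=18.0, high=22.0))   # Celsius thresholds
    print(perceived_state(77.0, low=64.4, high=71.6))   # same states in Fahrenheit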

 

Different controllers - with different interests - perceive a target as having different varieties.

E.g. One observer of a lake might be interested in maintaining its water level within certain bounds.

In this case, the variable values of interest are the upper and lower water levels, and the flow rates of water in and out.

Another observer of the same lake might be interested in maintaining the populations of many biological species in its ecology.

In this case, the essential variable values of interest could be the upper and lower numbers for each population.

 

So, variety is not an absolute measure of a thing’s complexity.

An observer must identify not only how many state variables must be regulated to meet some goal(s) “already given”,

but also what changes in variable values require controlling actions to be taken.

Only then can their view of the system’s complexity be measured.

 

By the way, variety is a measure of observable state complexity rather than internal behavior complexity.

A measure of the latter could be: the number of input event/state combinations * the average procedural complexity of the rules applied to them.

And there are scores of other system complexity measures.

 

Certainly, the variety of states to be regulated is a measure of significance to the designer of a controller.

In some systems thinking discussion, the term variety is used without reference to any agreed definition of variables.

 

Three things the law does not say

 

The law does not say variety is all a controller needs

The law says having enough variety is a necessary precondition to control selected variables in a target system.

It does not say having enough variety is a sufficient precondition.

A controller must also be coupled to its target system tightly enough.

And not overridden by any competing controller.

And not overridden by a target that is self-aware and self-determining.

 

The law does not mean a controller knows all the state variables of the thing controlled

A lean controller knows a minimal amount of what is going on in an entity it controls.

E.g. A thermostat knows nothing but the temperature of the target environment.

It ignores countless other variables that might be measured in “the real machine” it controls.

E.g. The scoreboard of a tennis match models only those variables that measure success in the sport.

The umpire ignores other variables in “the real machine”, like the heights of the players and lengths of their rallies.

Every system describer must “pick out and study the facts that are relevant” to stakeholders’ interests in the target system.

 

The law does not imply a controller should maximise its variety

A controller is interested in state variations that influence its control actions.

The Bank of England may appear to be interested in only two variables: the interest rate and the money supply.

But it gathers a lot of data with a view to deciding how to manipulate those two variables.

 

On the other hand, Ashby emphasised the need to be selective: “pick out and study the facts that are relevant to some main interest.”

It is inefficient to capture data that is currently irrelevant and unused.

E.g. Suppose a citizen is described by two variables: financial status = rich or poor, and political status = communist or capitalist.

And the government monitors every state change, so their record of the citizen has a variety of four – meaning four possible states.

What is the government’s interest here?

If the citizen has entered the “rich communist” state, the government sends a message: “You must either give away your money or become a capitalist”.

So, their main interest in the citizen is actually limited to two states rather than four, and they are recording a little more variety than they need.

When redundant data is collected, it often turns out later that the unused data is low quality.
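
A sketch of the point (the function name and logic are my illustration): given the stated interest, the government’s record need only distinguish two states, not four.

    # Illustrative sketch: record only the variety the controller acts on.
    def must_send_letter(financial_status, political_status):
        # The only state the government acts on is "rich communist".
        return financial_status == "rich" and political_status == "communist"

    # This boolean has a variety of two; recording all four states captures
    # variety the controller never uses.
    print(must_send_letter("rich", "communist"))   # True: send the message
    print(must_send_letter("poor", "communist"))   # False: no action needed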

 

Three other things to know

 

One real world entity may be regulated by several controllers

Discussions of variety typically focus on a target system and its regulation by one controller.

In the real world, a thing may be monitored and directed by several controllers. E.g.

·         A child may be regulated by its teachers and its parents.

·         Homeostasis in an oyster is maintained by distributed ganglia; there is no central nervous system.

·         The state variables of the human body are maintained by relatively independent controllers.

 

The human blood system is regulated by two controllers - the kidney and the liver.

·         The kidneys regulate the amount of water in the body and balance the concentration of mineral ions in the blood.

·         The liver regulates the amount of glucose in the blood, and is in turn regulated by hormones, such as insulin and glucagon.

These two blood system controllers do not share information.

 

Controllers may compete to regulate one entity.

A child might be monitored and directed differently by its teachers and parents – working in competition, to different ends.

This seems to be an example of dysfunctionality, but consider competition between controllers in a different example.

Two businesses (or even divisions of one business) may compete to monitor and direct the purchasing behavior of one customer.

In this second example, competition might work to the benefit of the customer.

 

The real world can be seen as a mess of more or less tightly coupled systems

"Managers do not solve problems, they manage messes." ~ Russell L. Ackoff

"There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion." ~ Donella Meadows

The boundaries we draw can divide the world into distinct systems, nested systems and overlapping systems.

 

Ashby illuminated thinking about the world in terms of distinct systems that communicate by input/output data flows.

 “Two or more whole machines can be coupled to form one machine;

and any one machine can be regarded as formed by the coupling of its parts, which can themselves be thought of as small, sub-machines.” (1956)

Today, we discuss how distinct systems may be coupled, loosely or tightly, in various kinds of relationship, for example:

·         Cooperation aka symbiosis

·         Competition aka conflict

·         Control aka regulation

 

Systems can also be nested; that is, related by composition or containment.

A subsystem can be contained either tightly or loosely by a wider system.

E.g. the kidney and the liver are firmly contained within a body; and a cell contained in the kidney cannot also act in the liver.

By contrast, a human actor can act in many different social networks.

Each person must trade off between goals of their own and the goals of social networks they belong to.

 

Some points in the section above can make it difficult to apply cybernetics to sociology and management science.

“There are limits to what science and the scientific method can achieve.

In particular, studying society generally shows how difficult it is to control.

This truth may disappoint those who want to build a science to shape society.

But the scientific method is not a recipe that can be mechanically applied to all situations.” (From a distillation of Hayek’s “The Pretence of Knowledge” speech.)

Beer’s application of Ashby’s ideas

This section is the briefest of summaries.

For a longer version read Beer’s ideas on the application of cybernetics (after Ashby and others) to management science.

 

In 1972 (the year Ashby died) Beer’s “Brain of the Firm” was published.

The book title echoes Ashby’s “Design for a Brain” 20 years earlier; and it is easy to see why.

In Ashby’s “Design for a Brain”, the human is a homeostatic machine that maintains essential state variables; the brain-to-body relationship is seen as a regulatory system with feedback loops.

In Beer’s “Brain of the Firm”, a business is a homeostatic machine that maintains essential state variables; the management-to-worker relationship is seen as a regulatory system with feedback loops.

 

Project Cybersyn

On taking office in 1970, the new Chilean president, Allende, faced a problem.

“How was he to nationalize hundreds of companies, reorient their production toward social needs, and replace the price system with central planning,

all while fostering the worker participation that he had promised?” Ref 3.

Beer was hired to help, and named his project Cybersyn, short for “cybernetics synergy”.

 

In “Brain of the Firm”, Beer had introduced his Viable System Model for business management; it was supposedly based on the human central nervous system.

You can read the Wikipedia entry on project Cybersyn for a summary of how Beer designed a central nervous system for the Chilean economy.

The aim was to monitor and direct the actions of actors in Chile’s nationalised businesses.

There would be four levels of control: Total, Sector, Branch and Firm.

So-called “algedonic alerts” would be sent upwards when a resource or performance measure strayed outside a defined range, typically after a timeout.

The higher levels would respond by cascading directions downwards to restore the state of business operations to a homeostatic norm.
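
A speculative Python sketch of that alert-and-timeout logic follows; the names, thresholds and escalation rule are my assumptions, not Cybersyn’s actual software.

    # Speculative sketch of an "algedonic alert" (my assumed logic, not
    # Cybersyn's actual implementation).
    def algedonic_alert(readings, low, high, timeout):
        """Escalate if a measure stays outside [low, high] for `timeout` readings."""
        run = 0
        for value in readings:
            run = run + 1 if not (low <= value <= high) else 0
            if run >= timeout:
                return "escalate upwards"   # Firm -> Branch -> Sector -> Total
        return "no alert"

    print(algedonic_alert([95, 80, 79, 78], low=85, high=110, timeout=3))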

 

Managers in an operations room at any level would:

·         be notified when a variable in a lower-level system moves out of range for an unacceptable time

·         read the report, make a plan, then

·         cascade advice and directives using telex messages.

 

Six months after its launch in 1973, the project ended when Allende was overthrown and Chilean politics swung away from central planning.

The Nobel prize-winning economist Hayek knew Beer, but they never agreed about planning.

In 1974, Hayek gave a famous prize acceptance speech called “The Pretence of Knowledge”.

He coined the term “scientistic” meaning “a mechanical and uncritical application of habits of thought to fields different from those in which they have been formed.”

A distillation of Hayek’s “Pretence of Knowledge” speech includes this expression of Hayek’s view.

“Fooled into believing that the measurable variables are the most critical, economists propose “solutions” that actually worsen the problem.”

Did Hayek have in mind the application of cybernetics to the requirement for worker participation in planning?

Beer learned from the project.

 

Beer’s biology-sociology analogy

Over the following years, Beer polished his ideas about applying cybernetics to business management, and gathering feedback from workers.

He refreshed and detailed his “Viable System Model”.

Find “Diagnosing the system for organisations” (1985) on the internet and look at Figure 37, or else the exemplar here.

You’ll see the VSM is much more complex than shown below.

Still, in essence, the VSM is a structural view of a business (and/or unit of a business) that divides it into 5 subsystems, each with its own functions.

 

Beer followed in the long tradition of social systems thinkers who have drawn a biology-sociology analogy.

He looked for inspiration in the structure of the human central nervous system (CNS).

This list is my naïve attempt to present a CNS-VSM mapping.

·         Higher brain (voluntary movement, speech and cognition) maps to System 5, which makes policy decisions to steer the whole organization and balances demands from different units.

·         Mid brain (involuntary movement; eye, auditory and visual processing) maps to System 4, which looks out to the environment and monitors how the organization must change to remain viable.

·         Base brain (basic functions like breathing and sleeping) maps to System 3, which establishes the rules, resources, rights and responsibilities of System 1 and interfaces with Systems 4/5.

·         Nervous system (fight, flight and freeze; resting, breeding and digestion) maps to System 2, the information systems that enable primary activities to communicate and enable System 3 to monitor and co-ordinate those activities.

·         Body (organs, sensors and motors) maps to System 1, the primary/core business activities (each of which may, recursively, be described as a viable system).

 

Given what I understand of both, the CNS-VSM mapping is a loose analogy rather than an isomorphism.

Even at the high level of abstraction in the table above, some mappings (e.g. from base brain to system 3) look odd.

The paper on Beer’s ideas draws some CNS-VSM contrasts that merit consideration.

 

Of course, the business functions that Beer positioned in the structure of his VSM can be found in businesses.

And information does flow into, up, down, around and out of a business.

However, the biology-sociology analogy seems to me a device for teaching and promotion.

Like many analogies, the more you think about it the less convincing it is.

Conclusions and remarks

Beer, Churchman, Checkland and others have given us models for thinking about a business or a system.

They are useful tools, at least when interpreted by a skilled management consultant or systems analyst.

This paper is neither for nor against the use of any such model.

It is about the applicability of hard science ideas in sociology and management science.

It makes some general points about those ideas and their limits.

 

On the application of general system theory and cybernetics to enterprise architecture

Most systems of interest to us are describable in the same general way.

That is, in terms of actors that interact in regular behaviors that maintain the system’s state and/or consume/deliver inputs/outputs from/to the wider environment.

The following general system concepts appear also in enterprise and software architecture methods.

·         Environment: the world outside the system of interest.

·         Boundary: a line (physical or logical) that separates a system from its environment, and encapsulates a system as an input-process-output “black box”.

·         Interface: a description of inputs and outputs that cross the system boundary.

·         Hierarchy: a system is composed from interacting subsystems; systems are recursively composable and decomposable.

·         Emergence of properties, at a higher level of composition, from coupling of lower-level subsystems.

·         Coupling of systems by input/output information.

·         State: the current structure or variables of a system, which changes over time.

·         Deterministic processing of a system’s inputs with respect to its memory/state data.

 

General system theory, enterprise and software architecture all embrace some cybernetics ideas.

Enterprise architects strive to ensure:

·         a business is optimally decomposed into separately describable and manageable systems

·         those systems are coupled where necessary, more or less tightly, using the best design pattern for the case.

 

Often, a system is required to monitor and direct the state of a target actor or activity.

The more variegated the actions to be directed, the more complex the system needs to be.

And enterprise architects should seek to ensure:

·         the system knows enough about the state of any target it monitors and directs

·         the system detects events that reveal significant state changes in a target - in an acceptably reliable and timely fashion

·         the system responds to events by sending appropriate directives - in an acceptably reliable and timely fashion

·         the system is coupled to its target tightly enough

·         inter-system messaging is efficient and effective enough.

·         other (competing or complementary) controllers directing the target are recognised

·         the system recognises exceptions, when actors do not respond as desired to a directive

·         the system manages exceptions - in an acceptably reliable and timely fashion.

 

On the biology-sociology analogy

One version of this analogy is widely accepted and used.

The behaviors of animals and businesses can be described in terms of three functions.

·         Sensing: collecting input from the external environment.

·         Integrating: processing and interpreting the input – sometimes with reference to the system’s state/memory.

·         Responding: acting appropriately to the input, changing state and/or producing output.

 

Other biology-sociology analogies have not been so widely accepted.

E.g. Beer’s analogy between his “viable system model” and a central nervous system is questionable.

The systems thinkers Ackoff and Bausch have deprecated drawing a biology-sociology analogy.

One reason: human societies are not homeostats; rather than stay stable, they continually evolve.

Another is that human actors are free to act as they choose, contrary to any system they might be expected to play a role in.

 

On the scope of cybernetics

Cybernetics applies to systems that sense/consume inputs and respond by changing state and/or producing outputs.

Ashby urged us to recognise every “real machine” is infinitely more complex than any system abstracted from it.

Likewise, every real world business (or other social network) is infinitely more complex than any system abstracted from it.

A lot of information, some essential to business survival, flows up, down and around a business.

Much of that information is ad hoc or informal, and stimulates some irregular, unpredictable or unrepeatable behavior.

Those information flows and behaviors are beyond the scope of cybernetics as Ashby defined it.

 

On assessing complexity

Some say thinking about a system’s structures is inadequate because what we need to understand, monitor and control is behaviors.

Beer said thinking about the four Ms (Men, Machines, Materials and Money) is inadequate because managers need to think about managing complexity.

How to objectively assess the relative complexity of two business systems?

You might try to do this systematically, as follows.

1.      Choose your measure of complexity (Ashby’s or other)

2.      Identify the system elements to be described (roles, actors, processes, variables, whatever)

3.      Describe the two real world businesses in terms of those elements

4.      Demonstrate your two descriptions have been made to the same level of abstraction.

5.      Test that the two real world business systems behave according to your descriptions.

6.      Then apply the complexity measure to each business system and compare them.

 

The process is impractical for all but trivial systems and complexity measures.

So Beer said a subjective assessment of relative complexity is valid.

OK, but that is to shift away from the objective science Ashby sought to establish.

 

On managing complexity (variety)

Ashby’s law of requisite variety is that “only variety can absorb variety”.

The implication being that when a controller has insufficient variety, design options include:

·         Amplifying (increasing) the variety in the control or management system

·         Attenuating (reducing) the variety in the target or operational system.

 

Note that other design options for managing complexity include:

·         Improve information flow qualities (speed, throughput, integrity etc.)

·         Tighten the control-target coupling and/or remove any competition between controllers

·         Decentralise control: divide regulation responsibilities between maximally autonomous controllers (e.g. kidney and liver)

·         Decentralise what is managed: divide the target system into maximally autonomous subsystems (e.g. “microservices”).

 

On decentralisation of what is managed

A supply chain or manufacturing business is very different from an information processing or knowledge worker business.

Every human organisation has to find its own balance between hierarchy and anarchy.

 

A current fashion (in “agile architecture”) is to dismember a system and decouple its subsystems (“microservices”) as far as possible.

The team responsible for one subsystem is encouraged to develop it independently.

Teams are encouraged to be self-organising; actors are encouraged to be self-determining.

They are allowed to choose their own actions to meet given aims; perhaps even choose their own aims.

They may even make and deploy their own control or coordination system, perhaps in cooperation with neighbours.

 

On self-organisation

For a coherent discussion of self-organising systems we need to distinguish:

·         Self-organising: redesign by self-aware actors who change the variables or rules of a system they play roles in.

from

·         Self-sustaining: in which autopoietic processes make and maintain the structures that perform the processes.

·         Self-assembly: as in the pre-determined growth of a crystal in a liquid, or an embryo into an adult.

·         Self-regulation: as in the maintenance of homeostasis during the life of an entity.

·         Self-determination: as in self-aware actors choosing what to do regardless of any rules.

 

Ashby (and Maturana) rejected the idea of a “self-organising system” as undermining the concept of a system.

We need to distinguish inter-generational evolution from chaotic ad hoc change.

For sure, the actors who play roles in a system may agree to change the variables or rules of that system.

Whenever actors discuss and agree changes, they act in a higher or meta system.

And once the change is made, the actors (still members of the same social network) now act in a new system or system generation.

If actors continually change the properties and functions of the organisation they work in, then the concept of a system is lost.

 

On social networks v. social systems

There is big difference between:

·         A social system in which regular activities are performed by a group of actors

·         A social network in which a group of actors perform whatever activities they choose.

 

The universe is divisible into infinite systems and networks: some distinct, some nested and some overlapping.

The boundaries of social networks and systems are drawn by those interested in them.

The fuzziness of network and system boundaries is a challenge for any sociological study.

If the actors in a social network continually (rather than generationally) change the aims, roles and rules of the network they belong to – the concept of a system is lost.

 

A real-world business may be seen as one social network that realises many socio-technical systems.

Designing a system using cybernetics is one thing, managing the social network another.

Directing activities on the basis of data collected and analysed is certainly one part of managing a business.

Motivating and helping people to reach aims is another part (along with rules that discourage aberrant behaviors).

 

In conclusion

There is some scientism and pseudo-science in social systems thinking discussion.

To repeat from earlier.

“There are limits to what science and the scientific method can achieve.

In particular, studying society generally shows how difficult it is to control.

This truth may disappoint those who want to build a science to shape society.

But the scientific method is not a recipe that can be mechanically applied to all situations.” (From a distillation of Hayek’s “Pretence of Knowledge” speech.)

References and reading

A variety of sources were referred to in the course of writing this paper; they include Ashby’s “Design for a Brain” (1952) and “Introduction to Cybernetics” (1956).

Unless otherwise stated, quotes are from “Introduction to Cybernetics” (1956), W. Ross Ashby.

Some speak of “complex adaptive systems”, where the meaning of all three terms is debatable.

Read Complex adaptive systems for more on that.

 

 

All free-to-read materials at http://avancier.website are paid for out of income from Avancier’s training courses and methods licences.

If you find the web site helpful, please spread the word and link to avancier.website in whichever social media you use.