Ashby’s ideas

Copyright 2017 Graham Berrisford. One of about 300 papers at http://avancier.website. Last updated 13/08/2017 23:56

 

Understanding Ashby’s ideas helps you to understand much else in the field of systems thinking.

Unless otherwise stated, quotes below are from Ashby’s Introduction to Cybernetics (1956).

This paper serves as a preface to other papers including:

·         Introducing general system theory

·         Marxism, systems thinking and enterprise architecture

·         Introducing systems thinkers

·         Systems thinking approaches

·         Beer’s ideas

Contents

Preface

Abstraction

System change

Cybernetics (control systems and target systems)

Variety (a candidate measure of complexity)

The law of requisite variety (“variety absorbs variety”)

Ashby’s law of requisite variety - revisited

 

Preface

Heraclitus of Ephesus was a Greek philosopher known for his doctrine of change being central to the universe.

Plato quoted him as saying “Everything changes and nothing stands still.”

Many systems thinkers have addressed system change by borrowing or adapting the biological ideas of homeostasis and evolution.

Given the importance of these two ideas, it is strongly recommended that you start by reading the short paper The origins of systems thinking.

Abstraction

W. Ross Ashby (1903-1972) was a psychologist and systems theorist.

“Despite being widely influential within cybernetics, systems theory… Ashby is not as well known as many of the notable scientists his work influenced.

W Ross Ashby was one of the original members of the Ratio Club, who met to discuss issues from 1949 to 1958.” Wikipedia 2017

 

Ashby was keen that we separate logical system descriptions from the physical entities that realise them.

“Cybernetics depends in no essential way on the laws of physics.”

“In this discussion, questions of energy play almost no part; the energy is simply taken for granted.”

System theory:

                          System descriptions
            <form>                         <idealise>
Systems thinkers   <observe and envisage>   Real world entities

 

No process repeats exactly in the material world; any process we describe in a system is an abstraction from reality.

Even a computer program is different each time it runs, at the physical level of matter and energy, memory spaces and electrical phenomena.

 

Abstraction of processes to describe the behaviors of a system

Ashby emphasised that the real world systems of interest (to general system theory) are ones that experience or perform processes.

System theory:

                          Processes
            <form>                         <idealise>
Systems thinkers   <observe and envisage>   Regular behaviors

 

"Cybernetics does not ask "what is this thing?" but ''what does it do?" It is thus essentially functional and behavioristic.”

“[It] deals with all forms of behavior in so far as they are regular, or determinate, or reproducible.”

 

Systems in which regular behaviors can be observed include:

·         A biological organism in which cells of various types play roles in processes, which sustain the organism’s body or help it reproduce.

·         A society in which actors play roles in the performance of processes, which sustain the group or serve other purposes.

·         A software system in which objects instantiate classes in the performance of operations, which update memories and send messages.

·         A business in which changes in one stock level lead to changes in another stock level - as in “System Dynamics”.

 

Abstraction of variables to describe the state of a system

Ashby emphasised that real world entities are systems only to the extent that selected variables are describable and observable.

System theory:

                          Variable types
            <form>                         <idealise>
Systems thinkers   <observe and envisage>   Variable values

 

“At this point we must be clear about how a "system" is to be defined.

Our first impulse is to point at [a real world entity repeating a behavior] and to say "the system is that thing there".

This method, however, has a fundamental disadvantage: every material object contains no less than an infinity of variables and therefore of possible systems.

Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.”

 

In short

A system is a view of a reality, or a role that you can observe or envisage real-world actors playing.

To apply system theory is to form an abstract system description that hides the infinite complexity of real-world entities you observe or envisage.

You describe the state of a real-world system in terms of variables whose values can be measured (e.g. the positions of the planets).

You model whatever regular processes can change the values of those variables (e.g. the orbits of the planets).

You describe a process in a way that enables real world behaviors to be tested as matching your description.
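
This idea can be sketched in code. The following minimal Python illustration is mine, not Ashby’s: a “system” is a chosen set of state variables plus a process that changes their values in a regular, testable way (the orbital figures are rounded assumptions).

# A system as a chosen set of state variables plus a process
# (a transformation) that changes their values.
state = {"angle_degrees": 0.0}  # one selected variable: a planet's orbital position

def orbit(state, days, degrees_per_day=360 / 365.25):
    """The process: a regular, repeatable change to the state variable."""
    state["angle_degrees"] = (state["angle_degrees"] + degrees_per_day * days) % 360
    return state

# The description is testable: predicted values can be compared with observations.
print(orbit(state, days=365.25))  # back to ~0 degrees after one full orbit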

System change

“The word "change" if applied to such a machine can refer to two very different things.

·         change from state to state, which is the machine's behaviour, and which occurs under its own internal drive, and

·         change from transformation to transformation, which is a change of its way of behaving, and occurs at the whim of the experimenter or some other outside factor.

The distinction is fundamental and must on no account be slighted.”

 

In other words, one should on no account confuse:

·         system state change (variable value change)

·         system generation change (variable type or behaviour type change); the sketch below illustrates the difference.
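
A minimal Python sketch of the distinction (my illustration, not Ashby’s): applying a transformation changes the state; replacing the transformation changes the way of behaving.

state = 1

def double(x):        # one way of behaving (one transformation)
    return 2 * x

def increment(x):     # another way of behaving
    return x + 1

transformation = double
state = transformation(state)  # state change: 1 -> 2, the machine's own behaviour

transformation = increment     # generation change: imposed from outside the machine
state = transformation(state)  # state change under the new behaviour: 2 -> 3
print(state)                   # 3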

 

Ashby (and other early system theorists) focused heavily on homeostatic systems.

In early discussions, the term adaptation meant system state change.

So, we need another term for system generation change – and evolution is a natural fit.

For more on different kinds of system change, read The origins of systems thinking.

 

Cybernetics (control systems and target systems)

Just as we abstract a system description from a reality, so a mechanical control system models some variable(s) detectable in a target system.

Control system: a model of the target system’s variables (e.g. a thermostat)

Target system: the variables and behavior being controlled (e.g. heating system on/off)

 

Ashby furthered the ideas of general system theory through discussion of control systems.

“Ashby popularised the usage of the term ‘cybernetics’ to refer to self-regulating systems.

The book dealt primarily with homeostatic processes within living organisms, rather than in an engineering or electronic context.” Wikipedia 2017

In “Design for a Brain” (1952) Ashby viewed the human body as a self-regulating system.

The graphic below separates a control system (the brain) from its target system (the remainder of the body).

Control system and target system:

                          “Design for a brain”
            <wrote>                        <realised in>
Ashby            <envisaged>                Brains (the control system)

                                            which <monitor> body state variables
                                            and <control> muscles and organs (the target system)

 

Note that information flows are central to cybernetics.

Ashby saw the brain as a regulator that maintains a body’s state variables in the ranges suited to life.

He presented the brain-body relationship as an information feedback loop.

A brain holds (or has access to) an abstract model of the body’s current state.

The brain receives information from sensors, and sends instructions to motors and organs.

The aim is homeostasis – to maintain the state of the body – and so help it achieve other desired outcomes.

 

Ashby eschewed discussion of consciousness; his basic idea might be distilled into one sentence thus.

A brain is a collection of brain cells that interact to maintain body state variables by sending/receiving information to/from bodily sensors and motors.

 

Ashby picked out variables relevant to his interest in the brain as a control system, putting aside other things you might consider important to being human.

His Design for a Brain book holds an abstract description of a brain, which in turn holds an abstract description of a body’s physical variables.

Much as an engineer’s specification holds an abstract description of a control system, which in turn holds an abstract description of the physical variables it controls in a target system.

 

Cybernetics has influenced systems thinking in general, and business system thinking in particular.

A business system is connected to its wider environment by feedback loops.

It receives information about the state of entities and activities in its environment, and records that information in memory.

The state information it receives and stores must model reality well enough, else the system will fail.

It outputs information to inform and direct entities and activities as need be.

 

This paper goes on to introduce two more of Ashby’s ideas, ones that were later used by Stafford Beer.

Read Introducing General System Theory for other system theory terms and concepts in the context of early sources.

Variety (a candidate measure of complexity)

Ashby said that: “a system is any set of variables which he [the observer] selects”. 

“A system's variety V measures the number of possible states it can exhibit, and corresponds to the number of independent binary variables.

But in general, the variables used to describe a system are neither binary nor independent.”

 

In short, complexity = variety = the number of possible states a system can exhibit.
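
For independent variables, variety can be computed as the product of the number of values each variable can take; its base-2 logarithm gives the equivalent number of independent binary variables. A small Python sketch, with made-up variable sizes:

import math

variable_sizes = [2, 2, 10]   # e.g. two binary variables and one ten-valued variable

variety = math.prod(variable_sizes)   # 40 possible states
bits = math.log2(variety)             # ~5.32 equivalent binary variables

print(variety, round(bits, 2))        # 40 5.32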

There are several difficulties with this definition of complexity.

 

First, the measure is incalculably large for any system with a non-trivial state.

 

Second, many other measures have been proposed: e.g. McCabe’s cyclomatic complexity (a measure of procedural complexity).

Here, we might propose: complexity = the number of event/state combinations * the average procedural complexity of the rules applied to them.
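
A worked example of that proposed measure (the numbers are hypothetical, for illustration only):

events = 5          # event types the system responds to
states = 4          # states the system can be in when an event arrives
avg_mccabe = 3.0    # assumed average cyclomatic complexity of each rule

complexity = (events * states) * avg_mccabe
print(complexity)   # 60.0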

There is no agreement as to which complexity measure is right or best.

 

Third, does the measure apply to the control system, the target system, or the real world entity of which only selected variables are controlled?

“every real machine embodies no less than an infinite number of variables, all but a few of which must of necessity be ignored.” (Ashby in “Design for a Brain”)

Consider the complexity of a tennis match in the real world.

Consider the thought processes of its players, and the molecular structures and movements in the players, court surface and tennis balls.

Obviously, you can never measure the complexity of a real-world entity or behavior per se.

You can only measure it with respect to your chosen description of its elements (roles, actors, processes, variables, whatever).

And then, only measure it at the level of abstraction at which you choose to describe those elements and their inter-relationships.

 

From the viewpoint of a describer, a system is only as complex as its description.

From the viewpoint of a control system, a target system is only as complex as those variables the control system monitors and controls.

Read “Complexity” for more on that topic.

The law of requisite variety (“variety absorbs variety”)

Brains and businesses do successfully apply the principles of general system theory and cybernetics.

A brain maintains mental models of things (food, friends, enemies, etc.) it perceives to be out there.

A business (in its information systems) maintains documented models of actors and activities it monitors and directs through information feedback loops.

 

“Ashby formulated his Law of Requisite Variety, stating that ‘variety absorbs variety’ defines the minimum number of states necessary for a controller to control a system of a given number of states.

This law can be applied for example to the number of bits necessary in a digital computer to produce a required description or model.

In response, Conant (1970) produced his so-called ‘Good Regulator theorem’, stating that ‘every Good Regulator of a System Must be a Model of that System’.” Wikipedia 2017

 

The Good Regulator theorem was proved by biological evolution long before Conant articulated it.

Animal brains maintain mental models of things (food, friends, enemies etc.) they care about in their environment.

These mental models must be accurate enough to enable the animals to monitor and manipulate those things. (Read Knowledge and Truth for more on this theme).

 

Ashby's law applies to how a control system controls selected or essential variables of a target system.

"The larger the variety of actions available to a control system, the larger the variety of perturbations [in values of target system variables] it is able to compensate"..

 

The law defines the minimum number of states necessary for a control system to control a target system with a given number of states.

It is interpreted here as meaning:

·         A control system’s information state models only those variables in the target system’s concrete state that are monitored and controlled.

·         For a homeostatic system to be stable, the number of states of the control system must be at least equal to the number of states in the target system.

·         The more ways that a homeostatic system can deviate from its ideal state, the more control actions a control system will need.

 

The law says that if a controller does not have enough variety to control the variables that define the target system, then it cannot control them.

It does not say that having enough variety is sufficient to ensure the controller can control those variables.
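
A toy Python illustration of both points (my sketch, not Ashby’s formulation): the outcome is disturbance plus action, and the regulator tries to hold it at zero. With fewer actions than disturbances, some perturbations leak through to the outcome.

disturbances = [-2, -1, 1, 2]   # 4 possible perturbations of the target variable

def best_outcomes(actions):
    """Outcomes when the regulator picks its best available action each time."""
    return {min((d + a for a in actions), key=abs) for d in disturbances}

print(best_outcomes([-2, -1, 1, 2]))  # {0}: four actions absorb four disturbances
print(best_outcomes([-1, 1]))         # {-1, 0, 1}: too little variety, so some
                                      # disturbances reach the essential variable

Note that four actions only make control possible; a regulator that chose its actions badly would still fail, which is the “necessary but not sufficient” point above.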

 

Ashby’s law does not mean (as some seem to think) that the control system must be as complex as the concrete reality of the entity controlled.

A control system is usually much simpler than any real world entity it controls.

For example, a thermostat models only as much variety (colder or hotter than a given temperature) as the behavior (heating system on or off) it controls.

To the control system, the target system is no more complex than the variables it controls.
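
A minimal thermostat sketch in Python (an illustration, not any real device’s logic): the controller handles just one bit of input variety and one bit of output variety.

SET_POINT = 20.0   # degrees Celsius (an assumed set point)

def thermostat(measured_temperature):
    """All the variety the controller handles: 2 input states, 2 actions."""
    return "heater_on" if measured_temperature < SET_POINT else "heater_off"

for temperature in [18.5, 19.9, 20.0, 23.1]:
    print(temperature, "->", thermostat(temperature))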

Control system and target system:

                          Control system designs
            <create and use>               <realised by>
Engineers        <observe and envisage>    Control systems

                                            which <monitor and control> heating systems (the target system)

 

For example, a scoreboard models only a few essential variables of a tennis match.

To umpires, what matters are the rules that determine the score, not the length or complexity of the rallies.

Control system and target system:

                          Laws of tennis
            <create and use>               <realised by>
LTA              <observe and envisage>    Umpires

                                            who <monitor and control> real tennis matches (the target system)

 

A corollary to Ashby’s law of requisite variety

Ashby wrote:

“Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.”

 

The implication is that the system describer must be experienced, expert and trained enough to “pick out and study the facts that are relevant”.

System architects must know what is architecturally significant to their and stakeholders’ interests in the target system.

 

“Maximising internal variety”

"Since the variety of perturbations a [control] system can potentially be confronted with is unlimited, we should always try maximize its internal variety (or diversity),

so as to be optimally prepared for any foreseeable or unforeseeable contingency." Principia Cybernetica.

This might be interpreted by a business today as filling up a “data warehouse” or collecting lots of “big data”.

There is considerable experience of this leading to data quality issues, as discussed in the paper Beer’s ideas.

Ashby’s law of requisite variety - revisited

It turns out that, in nature and in business, control systems may be distributed rather than centralised (as Ashby and Beer presumed).

I gather the human body does not maintain homeostasis purely by top-down command and control from the higher brain.

Instead, its many variables are maintained by several control systems, operating in parallel, which are distributed and not in direct communication with each other.

An oyster manages to maintain homeostasis without any central brain or nervous system.

In large businesses, there is rarely if ever an overarching control body that monitors and directs all business processes.

There are in practice several (hierarchical and/or parallel) bodies that may compete for resources, and even have conflicting goals.

 

To different controllers, a real world entity is at once several target systems: a different one for each controller.

But suppose two control systems either compete or complement each other in their efforts to control the same target system state?

This feels to me like the very stuff of relationships between people in a social system!

 

Ashby’s law might be interpreted for a business thus:

·         A business system’s information state models only those variables of actors and activities that the business monitors and directs.

·         For directions to be effective, the information state must model those variables accurately enough.

·         The more variegated the actors and activities to be monitored and directed, the more complex a business system must be.

 

To direct an actor or activity in its environment, a brain or a business must be able to:

·         maintain or obtain a model of the state of that actor or activity (the model must be accurate enough).

·         gather inputs: detect events that reveal a significant state change (in an acceptably reliable and timely fashion).

·         produce outputs: respond to events or state changes by sending directives to actors to perform activities (in an acceptably reliable and timely fashion).

·         adapt if the actor does not respond to a directive as expected, in an acceptably reliable and timely fashion.

 

To generalise, a control system (as sketched after this list):

·         must know just enough about the state of the target system it monitors and directs

·         must detect events that reveal significant state changes in the target system - in an acceptably reliable and timely fashion.

·         must respond to those events by sending appropriate directives to the target system - in an acceptably reliable and timely fashion.
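
A minimal Python sketch of such a monitor-and-direct loop (the names, numbers and fake sensor are my assumptions, not from the paper):

import random   # used only to fake sensor readings in this sketch

model = {"temperature": 20.0}    # the control system's model of the target's state
TARGET, TOLERANCE = 20.0, 0.5

def sense():
    """Detect events that reveal the target system's state (faked here)."""
    return {"temperature": TARGET + random.uniform(-2.0, 2.0)}

def direct(observed):
    """Respond with a directive that pushes the state back toward the target."""
    if observed["temperature"] < TARGET - TOLERANCE:
        return "heat"
    if observed["temperature"] > TARGET + TOLERANCE:
        return "cool"
    return "do_nothing"

for _ in range(3):               # three turns of the monitor-and-direct loop
    observed = sense()
    model.update(observed)       # keep the model accurate enough to act on
    print(round(model["temperature"], 1), "->", direct(observed))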

 

A control system can expect a target system to respond appropriately provided that:

·         monitor and control signals cannot go missing or be corrupted

·         time/speed, capacity/throughput, availability and any other non-functional requirements are met

·         the target system has no other (competing/interfering) control system

·         the target system is not capable of self-determination, of choosing an inappropriate response to a control message.

 

 

All free-to-read materials at http://avancier.website are paid for out of income from Avancier’s training courses and methods licences.

If you find the web site helpful, please spread the word and link to avancier.website in whichever social media you use.