Some commentary on Ashby’s cybernetics

Copyright 2019 Graham Berrisford. A chapter in the book https://bit.ly/2yXGImr. Updated 04/05/2021 18:14

 

Reading online? If your screen is wide, shrink its width for readability.

 

This chapter analyses and comments on some terms and concepts you can find in the first two of Ashby’s three works below. The third is discussed in a later chapter.

 

·       “Design for a Brain” (1954; first published 1952)

·       “An Introduction to Cybernetics” (1957)

·       “Principles of the self-organizing system” (1962).

 

Since Ashby expressed his ideas somewhat differently in different publications, even in different editions of one publication, I have tried in what follows to stick to the terms used in the particular editions you can find using the links above.

Contents

The terminology problem

First book: “Design for a brain”

Second book: “An introduction to cybernetics”

What was new?

Conclusions and remarks

Appendices

 

Introduction

From 1949 to 1958, the Ratio Club, a cross-disciplinary group in the UK founded by the neurologist John Bates, met to discuss cybernetics. Members included psychologists, neurobiologists, engineers, physicists, and mathematicians; many went on to become prominent scientists.

 

Though he is not as well-known as many of the scientists his work influenced, a prominent member of the Ratio Club was the psychologist W. Ross Ashby, who generalized cybernetics, and defined a system in two ways.

 

System as a set of state variables

Ashby defined a system as a set of state variables selected from observation of some real-world entity, machine or situation. He represented the state of a physical entity by the values of state variables: for example, we represent the state of a tennis match by game, set and match scores, and may represent the state of a predator-prey system by predator and prey population numbers.

 

Ashby didn’t exclude the appearance in messages of qualitative information (as a location might be named in his gale warning broadcast). However, the focus of cybernetics is mostly on quantitative variables, like the levels of a stock, population or resource. Given that a quantity changes over time, the variable’s line of behavior may be shown on a graph of quantity against time, as in figure 7/1/1 of Ashby’s “Design for a Brain” (1954).

 

System as a set of rule-bound activities

In “Introduction to Cybernetics”, Ashby defined systems by the way they behave over time, as a set of regular or repeatable state changes, rather than what they are made of. This shift in perspective, from the physical structure of a system to its logical behavior, is central to the cybernetic view of the world. When a cybernetician calls a system complex, the reference is to its lines of behaviour rather than its material structure.

 

Ashby was particularly interested in machines and organisms that change from state to state under their own internal drive, without input.

 

(By contrast, enterprise architects are mostly concerned with systems in which actors are prompted to act when inputs arrive, or internal states reach a threshold value. Activities advance the internal state of the business and/or produce outputs that advance the state of external actors. Despite their different interests, there is a considerable overlap of ideas, explored in other chapters.)

 

Regulation, feedback and homeostasis

"Self-regulating mechanisms have existed since antiquity, and the idea of feedback had started to enter economic theory in Britain by the 18th century, but it did not have a name.... In 1868, James Clerk Maxwell wrote a famous paper, "On governors", that is widely considered a classic in feedback control theory. This was a landmark paper on control theory and the mathematics of feedback."  (Wikipedia)

 

The physiologist Claude Bernard (1813-1878) wrote on homeostatic feedback loops that maintain the essential variables of the body within critical limits. Later, Walter Cannon wrote “Physiological Regulation of Normal States” (1926).

 

If a regulator is to monitor and direct the state of a target then it must know or remember the state of that target, and update that knowledge. Cybernetics explains how a regulator does this by means of a feedback loop that connects the regulator and target subsystems such that the output from one is input to the other.

 

Subsystem      Feedback loop                         Subsystem

Regulator      ← state information / direction →     Target

Thermostat     ← too hot / switch off →              Heater

Thermostat     ← too cold / switch on →              Heater

 

A controller/regulator typically contains three kinds of component:

·       receptors that sense changes in the state of a target

·       a control center that determines the regulator’s responses to those changes

·       effectors that act to change the state of a target.
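
To make the loop concrete, here is a minimal sketch in Python of the thermostat/heater example, split into the three regulator components named above. It is my own construction, not Ashby’s: the set point, the 0.5-degree step size and all function names are assumptions for illustration.

```python
# A minimal sketch of the thermostat feedback loop, with the regulator
# split into receptor, control center and effector.

def receptor(room_temp):
    """Sense the target's state variable (the room temperature)."""
    return room_temp

def control_center(sensed_temp, set_point=20.0):
    """Decide the response: 'switch on' if too cold, 'switch off' if too hot."""
    return "switch on" if sensed_temp < set_point else "switch off"

def effector(direction):
    """Act on the target: turn the heater on or off."""
    return direction == "switch on"

def step(room_temp):
    """One pass around the feedback loop, then the target's state drifts:
    the heater warms the room; otherwise the room cools."""
    direction = control_center(receptor(room_temp))
    heater_on = effector(direction)
    room_temp += 0.5 if heater_on else -0.5
    return room_temp, heater_on

temp, heater = 15.0, False
for _ in range(20):
    temp, heater = step(temp)
print(round(temp, 1))  # settles at the 20.0 set point, then oscillates around it
```

The point of the sketch is only the shape of the loop: output from the target (its temperature) is input to the regulator, and output from the regulator (a direction) is input to the target.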

 

In his “Design for a Brain” Ashby presented the brain as a regulator that monitors and maintains the state of the body. This table distils the general idea.

 

Generic activity system                  Ashby’s design for a brain

Actors                                   Brain cells
interact in orderly activities to        interact in processes to
maintain system state and/or             maintain body state variables by
consume/deliver inputs/outputs           receiving/sending information
from/to the wider environment.           from/to bodily organs/sensors/motors.

 

Progressive entities or processes

Feedback is important in machines, animals, social entities and business operations. A business is coupled to its environment by producing outputs that affect external entities, and receiving inputs, some of which are responses to previous outputs. However, business activities are usually more progressive than homeostatic.

 

By way of exemplifying progressive state changes, in his “Introduction to Cybernetics”, Ashby discussed an abstract system realized in nature. In the table below, after Tinbergen, the columns show the roles of two sticklebacks in their mating ritual; the rows show the succession of activities/states of the system/process as it progresses to a conclusion.

 

The stickleback mating ritual: the abstract system

The female’s role is to                         The male’s role is to

present swollen abdomen, special movements      present a red color and a zigzag dance
swim towards the male                           turn around and swim rapidly to the nest
follow the male to the nest                     point its head into the nest entrance
enter the nest                                  quiver in reaction to the female entering the nest
spawn fresh eggs in the nest                    fertilize the eggs

 

The sticklebacks communicate by sending/receiving information in the form of visual signals, which are stimuli (akin to data flows or messages in business operations) that inform and direct activity. The sticklebacks may be called actors or active structures; the nest and eggs may be called passive structures.
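
The ritual can be sketched as a progressive, signal-driven process: each activity is both an output (signal) of one actor and the stimulus for the other’s next activity. The code below is my own illustration; the activity strings paraphrase the table and `run_ritual` is a hypothetical helper, not anything from Ashby or Tinbergen.

```python
# A sketch of the stickleback ritual as a progressive system:
# the system advances state by state until it reaches a conclusion.

RITUAL = [
    ("female", "present swollen abdomen"),
    ("male",   "present red color, zigzag dance"),
    ("female", "swim towards the male"),
    ("male",   "swim rapidly to the nest"),
    ("female", "follow the male to the nest"),
    ("male",   "point head into the nest entrance"),
    ("female", "enter the nest"),
    ("male",   "quiver"),
    ("female", "spawn eggs in the nest"),
    ("male",   "fertilize the eggs"),
]

def run_ritual():
    """Advance the system step by step; each actor's activity is the
    visual signal that triggers the other actor's next activity."""
    state = 0
    trace = []
    while state < len(RITUAL):
        actor, activity = RITUAL[state]
        trace.append(f"{actor}: {activity}")
        state += 1   # the signal advances the system to its next state
    return trace

print(len(run_ritual()))  # 10 activities to completion
```

Unlike the homeostatic thermostat, this process is progressive: it does not return to an earlier state but runs forward to an end state.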

Ashby’s terminology

Ashby’s work had a profound influence on many prominent scientists, not to mention lesser mortals like me. However, his writing is a tough read. He used the terms system and machine variously, with and without quotes, with and without prefixed adjectives (real, whole, regular, dynamic, independent etc.). Between books, even within a book, he did not use these terms entirely consistently (he had no word processor to “find next” and “change all”).

 

His real system and real machine usually refer to a whole physical entity with infinite potentially definable variables (and sometimes infinite possible inputs) which “stands apart from any observer”. It could be an electrical transducer, an animal, or even a society.

 

His system (without the prefix “real”) is identified with an observer’s selection of variables. In his first book, the selection is arbitrary. In his second book, the selection relates to some “main interest that is already given”. Perversely however, Ashby also sometimes wrote of an animal or a society as being a system, regardless of any observer or selection of variables.

 

In his first book, Ashby narrowed the scope of “system” by saying in chapter 2 that

“from now on we shall be concerned mostly with regular systems. We assume that preliminary investigations have been completed and that we have found a system, based on the real 'machine', that (1) includes the variables in which we are specially interested, and (2) includes sufficient other variables to render the whole system regular.”

 

Later, in chapters 2/17 and 14/3, he narrowed “system” further to mean a selection of variables that are observably related, and that change independently of other variables.

 

His term 'machine' with quotes often seems to mean the same as real 'machine'. By contrast, the term machine without quotes sometimes means whatever physical entity realizes a system of variables that have been selected a) arbitrarily, or b) as related to some given interest, or c) as related and changing independently of others.

 

(For examples of Ashby using these terms in different ways, see 2/4, 2/5, 2/14, 2/17, 14/3 and 21/6 in “Design for a Brain”, and 2/1, 3/1, 3/11, 4/3, 6/1 and 6/14 in “Introduction to Cybernetics”.)

 

Abstracting systems from physical entities

Ashby made clear that a dynamic system (a set of variables and rules) is always abstracted from the infinite complexity of any physical entity or real 'machine' in which that system may be observed. In the table below, his real 'machine' is divided into a physical entity and the physical activity system that entity performs, when it realizes a particular abstract system.

 

Ashby’s terms                                                               Stickleback mating ritual

System            Abstract system (roles, rules, variables)                 The ritual typified above

Real ‘machine’    Physical activity system (activities, variable values)    An instance of the ritual
                  Physical entity (actors in the material world)            Sticklebacks, nest and eggs

 

Generally, the relationship between abstract systems and physical entities is many to many. So, the mating ritual in the table above can be realized by many stickleback pairs; and conversely, those sticklebacks may play roles in other systems. Generally, there is more to a physical entity than any abstract system it realizes; and specifically, there is more to know about the sticklebacks than their mating ritual.

 


First book: “Design for a brain”

This section comments on Ashby’s first book, in which a dynamic system is a set of variables, a variable is a measurable quantity that has a value at each moment in time, and the state of the dynamic system is the set of values that all its variables have at a time.

On abstracting dynamic systems from a real ‘machine’

"A system... is independent of the concrete substance of the elements (e.g. particles, cells, transistors, people, etc).” Principia Cybernetica Web

 

19/2 “Our starting point is the idea, much more than a century old, that a machine… in given conditions and at a given internal state, always goes to a particular state.”

 

Chapters/sections

2/1. In the previous chapter we have repeatedly used the concepts of a system, of parts in a whole, of the system's behaviour, and of its changes of behaviour. These concepts are fundamental and must be properly defined. Accurate definition at this stage is of the highest importance, for any vagueness here will infect all the subsequent discussion; and as we shall have to enter the realm where the physical and the psychological meet, a realm where the experience of centuries has found innumerable possibilities of confusion, we shall have to proceed with unusual caution.

The term system appears more than one thousand times. It is usually used loosely, meaning any arbitrarily selected set of state variables. The term dynamic system appears only twenty or thirty times, sometimes but not always appearing to imply an inter-related set of variables.

We start by assuming that we have before us some dynamic system, i.e. something that may change with time. We wish to study it. It will be referred to as the 'machine ', but the word must be understood in the widest possible sense, for no restriction is implied other than that it should be objective.

I don’t read this starting assumption as the position we will end up in, for two reasons. First, from that starting point, Ashby wants us to investigate our assumption, using the method he set out in 2/17, and to separate independent systems as he set out in chapter 14/3.

 

Second, if this were the only time he wrote ‘machine’ rather than machine, I would read the two as synonyms. But he goes on to use the term ‘machine’ in quotes more than thirty times, and I assume he does so to imply something different from machine without quotes.

 

The term machine appears more than one hundred times. Most appearances are without quotes, but about a third are in quotes, as in the terms real ‘machine’ or ‘machine’. It seems to me the quotes around ‘machine’ tend to imply uncertainty about whether the physical entity being studied realizes zero, one or many independent dynamic systems. For example, see the ‘machine’ that is composed of two independent pendulums in the quote at 2/17 below.

 

2/3. The first step is to record the behaviours of the machine's individual parts. To do this we identify any number of suitable variables. A variable is a measurable quantity which at every instant has a definite numerical value. A ' grandfather ' clock, for instance, might provide the following variables : —the angular deviation of the pendulum from the vertical ; the angular velocity with which the pendulum is moving ; the angular position of a particular cog-wheel ; the height of a driving weight ; the reading of the minute-hand on the scale ; and the length of the pendulum. If there is any doubt whether a particular quantity may be admitted as a ‘variable ' I shall use the criterion whether it can be represented by a pointer on a dial. I shall, in fact, assume that such representation is always used: that the experimenter is observing not the parts of the real 'machine' directly but the dials on which the variables are displayed, as an engineer watches a control panel.

2/4. A system is any arbitrarily selected set of variables. It is a list nominated by the experimenter, and is quite different from the real 'machine'.

 

2/5. It will be appreciated that every real 'machine' embodies no less than an infinite number of variables, most of which must of necessity be ignored.

 

2/14 If we restrict our attention to the variables, we find that as every real ‘machine ' provides an infinity of variables, and as from them we can form another infinity of combinations, we need some test to distinguish the natural system from the arbitrary.…From now on we shall be concerned mostly with regular systems. We assume that preliminary investigations have been completed and that we have found a system, based on the real ‘machine ', that (1) includes the variables in which we are specially interested, and (2) includes sufficient other variables to render the whole system regular.

 

2/17 To conclude, here is an example to illustrate this chapter's method. Suppose someone constructed two simple pendulums, hung them so that they swung independently, and from this ‘machine’ brought to an observation panel the following six variables… He starts by selecting a system [a subset of the six variables] at random, constructs its field and deduces whether it is absolute. He then tries another system. It is clear that he will eventually be able to state, without using borrowed knowledge, that just three systems are absolute: (v, w, x, y), (v, x) and (w, y). He will add that z is unpredictable. He has in fact identified the natural relations existing in the 'machine'. He will also, at the end of his investigation, be able to write down the differential equations governing the systems. Later, by using the method of S. 14/6, he will be able to deduce that the four-variable system really consists of two independent parts.

 

Ashby’s experimenter observes 6 variables output from one ‘machine’ and their changes over time. He starts by selecting a system at random, by selecting an arbitrary subset of the 6 variables. By the end of his investigation, he has identified the natural relations existing in the 'machine' and can write down the differential equations governing several independent dynamic systems.

 

Above, the term ‘machine’ (in quotes) suggests to me a candidate machine rather than a machine corresponding to one independent dynamic system. Later, in chapter 14, Ashby addresses the independence of several systems in one ‘machine’.

 

14/3. “we need a definition of ' independence '. Given a system that includes two variables A and B, and two lines of behaviour whose initial states differ only in the values of B, A is independent of B if A's behaviours on the two lines are identical.

 

14/3 A system R is independent of a system S if every variable in R is independent of every variable in S, all possible pairs being considered.

 

5/14 Now the shape and features of any field depend ultimately on the real physical and chemical construction of the ' machine ' from which the variables are abstracted.

 

7/1 Sometimes physical entities cannot readily be allotted their type. Thus, a steady musical note may be considered either as unvarying in intensity, and therefore a null-function, or as represented by particles of air which move continuously, and therefore a full-function. In all such cases the confusion is at once removed if one ceases to think of the real physical object with its manifold properties, and selects that variable in which one happens to be interested.

 

7/9 If, still dealing with the same real ‘machine', we ignore S, and repeatedly form the field of the system composed of A and B, S being free to take sometimes one value and sometimes the other, we shall find that we get sometimes a field like I in Figure 7/8/2, and sometimes a field like II, the one or the other appearing according to the value that S happens to have at the time.

 

9/2 Consider the internal linkages in this system. We can sufficiently specify what is happening by using six variables, or sets of variables: those shown in the box-diagram below. By considering the known actions of part on part in the real system we can construct the diagram of immediate effects.…It will be noted that although action 3 has no direct connection, either visually in the real apparatus or functionally in the diagram of immediate effects, with the site of the changes at 6, yet the latter become adapted to the nature of the action.

 

11/8 Our present difficulties are, in fact, largely due to this assumption. By modifying it we shall not only lessen the difficulties but we shall obtain a model more like the real brain.

 

14/2. This attempted criterion obtained its data by a direct examination of the real 'machine '. The examination not only failed in its object but violated the rule of S. 2/8. What we need is a test that uses only information obtained by primary operations.

 

14/6. We can now see that the method for testing an immediate effect, described in S. 4/12, is simply a test for independence applied when all the variables but two are held constant. The relation can be illustrated by an example. Suppose three real machines are linked so that their diagram of immediate effects is z —> y —> x. The system's responses to tests for independence will show that y is independent of x, and that z is independent of both.

 

14/15 The property has nothing to do with energy or its conservation ; nor does it attempt to dogmatise about what real ‘machines' can or cannot do ; it simply says that if B and C remain constant and A changes from inactive to active, then the system cannot be absolute—in other words, it is not completely isolated.

 

21/4 Usually the selection of variables to form an absolute system is rigorously determined by the real, natural relationships existing in the real 'machine ', and the observer has no power to alter them without making alterations in the ' machine ' itself.

 

21/6. The simple physical act of joining two machines has, of course, a counterpart in the equations, shown more simply in the canonical than in the group equations. One could, of course, simply write down equations in all the variables and then simply let some parameter a have one value when the parts are joined and another when they are separated. This method, however, gives no insight into the real events in ' joining ' two systems.

On coupling machines into a wider system

“3/9 The organism as a machine… An organism and its environment may be represented by a set of variables that forms a state-determined system.”

 

“3/12 As the organism and its environment are to be treated as a system.”

 

In other words, when coupled together, a ‘real machine’ and its environment may undergo a succession of state changes that correspond to changes in the values of state variables defined in one abstract system. Later, Ashby discussed coupling two machines so as to realize one abstract system.

On two kinds of adaptation

“5/3 A behavior is adaptive if it maintains the essential variables within physiological limits.”

 

Above, Ashby wrote of homeostatic adaptation, but he was quick to note an ambiguity.

 

“5/7 Before proceeding further it must be noted that adaptation is commonly used in two senses, which refer to different processes. The distinction may best be illustrated by the inborn homeostatic mechanisms – the reaction to cold by shivering for instance… [Historically] the first change involved the development of the mechanism itself [a mutation that creates an animal with a new way of behaving]; the second change occurs when the mechanism is stimulated into showing its properties [changing the state of an animal].”

 

In other words, the two different processes are:

·       homeostatic adaptation by state-to-state changes within the life of an entity, which maintains its state in a range suitable for life

·       progressive evolution/mutation that replaces one entity with another, which is somewhat better adapted to changes in its environment.

On types of function (or line of behavior)

7/1 Ashby was interested in variables that show some constancy over time.

 

 

(A) The full-function has no finite interval of constancy; many common physical variables are of this type: the height of the barometer, for instance.

(B) The part-function has finite intervals of change and finite intervals of constancy.

(C) The step-function has finite intervals of constancy separated by instantaneous jumps.

(D) the null-function, which shows no change over the whole period of observation.

The four types obviously include all the possibilities, except for mixed forms.
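
These four types can be roughly operationalized for a variable’s line of behavior sampled at discrete instants. The classifier below is my own crude approximation, not Ashby’s definition: in particular, it treats an isolated change between samples as an instantaneous jump, so a run of consecutive changes marks a part-function rather than a step-function.

```python
# A rough classifier for Ashby's four function types, applied to a
# line of behaviour sampled at discrete instants.

def classify(values, eps=1e-9):
    """Return 'null', 'full', 'step' or 'part' for a sampled variable."""
    changing = [abs(b - a) > eps for a, b in zip(values, values[1:])]
    if not any(changing):
        return "null-function"    # no change over the whole observation
    if all(changing):
        return "full-function"    # no finite interval of constancy
    # both constancy and change occur; if every change is an isolated
    # jump (no two consecutive changing intervals), call it a step-function
    isolated = all(not (c1 and c2) for c1, c2 in zip(changing, changing[1:]))
    return "step-function" if isolated else "part-function"

print(classify([5, 5, 5, 5]))        # null-function
print(classify([1, 2, 4, 8]))        # full-function
print(classify([0, 0, 0, 3, 3, 3]))  # step-function
print(classify([0, 0, 1, 2, 3, 3]))  # part-function
```

The classification matters for Ashby’s conclusion quoted below, which ties adaptability to the presence of many step-functions and part-functions.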

 

In his “Summary and Preface”, Ashby reported an interesting conclusion relating two of the function types to adaptability.

 

“I have deduced that any system which shows adaptation must (1) contain many variables that behave as step-functions, (2) contain many that behave as part-functions, and (3) be assembled largely at random, so that its details are determined not individually but statistically. The last requirement may seem surprising: man-made machines are usually built to an exact specification, so we might expect a machine assembled at random to be wholly chaotic. But it appears that this is not so. Such a system has a fundamental tendency, shown most clearly when its variables are numerous, to so arrange its internal pattern of action that, in relation to its environment, it becomes stable. If the system were inert this would mean little; but in a system as active and complex as the brain, it implies that the system will be self-preserving through active and complex behaviour.

I hope to show that the essential difference between the brain and any machine yet made is that the brain makes extensive use of a principle hitherto little used in machines. I hope to show that by the use of this principle a machine's behaviour may be made as adaptive as we please, and that the principle may be capable of explaining even the adaptiveness of Man.”

Ashby explored how the continuous can be regarded as discrete.

 

7/3. “Few variables other than the atomic can change instantaneously; a more minute examination shows that the change is really continuous: the fusing of an electric wire, the closing of a switch, and the snapping of a piece of elastic. But if the event occurs in a system whose changes are appreciable only over some longer time, it may be treated without serious error as if it occurred instantaneously.”

On progressive systems with many input types

Ashby was much concerned with entities like biological organisms that proceed from state to state under their own drive. However, he wrote also of how a machine’s behavior can be influenced by parameters or other input stimuli.

 

“9/17 In a living organism, the reacting part has, in effect, three types of input:

·       sensory input from the environment,

·       the values of its parameters in S, and

·       those that were determined genetically.”

 

As I interpret Ashby, the sensory input is analogous to messages received by actors in a social activity system that stimulate some activity (e.g. a poker player’s call for another card). And his parameters are directions (natural/inherited or learnt) that determine what rules will be applied to sensory inputs.

Remarks

For the purposes of this book, note two important points. First, Ashby abstracted a dynamic system (an ideal machine defined in terms of variables and rules) from its realization by physical or material entities. Second, he distinguished state-to-state changes within a machine’s normal way of behaving, from changes to the way of behaving itself.

 

Beware: there are times when anyone (you, me, even Ashby) loses sight of these distinctions. We speak of a material entity as being a system, rather than realizing one. We may think of a parameter that sets or changes an entity’s way of behaving as though it were an event, or an input that advances the state of the entity in its normal way of behaving.

 

A scenario (a simplified version of Ashby’s)

Before us is a glass box that we start by assuming to be a ‘machine’. Inside the box we can see one light bulb, and two identical pendulums. We have access to three input devices - one switch and two pull strings. The switch turns the light on or off. Pulling one string starts or increases the amplitude of one pendulum. Pulling the other string starts or increases the amplitude of the other pendulum.

 

A question

Ashby consistently presented a dynamic system as an abstract description of selected variables and behavior observable in a real machine. So how to apply the terms system, dynamic system, machine, ‘machine’ and real ‘machine’ to the scenario above?

 

My answer

The glass box is one real ‘machine’, meaning it is a physical entity, a material structure whose state changes over time. It may realize infinitely many describable systems, but we can’t and don’t want to describe them all. Suppose we select three variables that we (observers) can readily see changing state:

·       Light status (on or off)

·       Pendulum 1 amplitude

·       Pendulum 2 amplitude

Now we can say the glass box is one real ‘machine’ that realizes the one system defined by these three variables. However…

 

14/3 A system R is independent of a system S if every variable in R is independent of every variable in S, all possible pairs being considered.

 

Following Ashby’s method, we experiment with the three input devices (the one switch and two pull strings). We see that each input changes the value of one state variable, but has no effect on the other two variables’ lines of behavior. Now we may say there is one light machine that realizes one dynamic system, and two pendulum machines that each realize a separate, independent dynamic system.
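
The experiment can be simulated in code. The sketch below is my own; the input model (the switch toggles the light each step, pulling a string adds to a pendulum’s amplitude) is an assumption made to keep the example small. Perturbing one input at a time and comparing lines of behavior against a baseline shows that each input affects exactly one of the three variables.

```python
# A sketch of the glass-box experiment: three state variables
# (light, pendulum 1 amplitude, pendulum 2 amplitude) and three inputs.

def run_box(flip_switch=False, pull_string_1=False, pull_string_2=False, steps=3):
    """Record the line of behaviour of the three selected variables."""
    light, amp1, amp2 = False, 0.0, 0.0
    trace = []
    for _ in range(steps):
        if flip_switch:
            light = not light   # the switch turns the light on or off
        if pull_string_1:
            amp1 += 1.0         # pulling string 1 raises pendulum 1's amplitude
        if pull_string_2:
            amp2 += 1.0         # pulling string 2 raises pendulum 2's amplitude
        trace.append((light, amp1, amp2))
    return trace

baseline = run_box()
for name, kwargs in [("switch",   {"flip_switch": True}),
                     ("string 1", {"pull_string_1": True}),
                     ("string 2", {"pull_string_2": True})]:
    perturbed = run_box(**kwargs)
    affected = [i for i in range(3)
                if any(b[i] != p[i] for b, p in zip(baseline, perturbed))]
    print(name, affected)   # each input affects exactly one variable's index
```

Since no input perturbs more than one variable, the three selected variables fall apart into independent systems, as in Ashby’s 2/17 pendulum example.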

 

In short, we started with the assumption there was one machine realizing one dynamic system. And after following Ashby’s investigation method we end up with four different concepts:

·       1 real ‘machine’ - a physical entity that changes state

·       1 system - an arbitrary selection of the variables available on the real machine (the starting point of the investigation method at 2/17)

·       2 independent dynamic systems – each a subset of the selected variables that stands separately as an independent system (as defined in 14/3)

·       3 machines - each a realization of one dynamic system.

 

How to interpret Ashby? Did he have something like these four concepts in mind, but use his terms somewhat inconsistently? See the introduction to this chapter.

 

Other questions that trouble me

Are the three inputs better called parameters, which determine which of different dynamic systems a machine realizes?  Or input events, which merely trigger a state change in the course of a machine’s realization of one dynamic system?

 

Isn’t there a fourth input acting on the pendulums, namely gravity, which we could change by taking the glass box to the Moon?

Second book: “An introduction to cybernetics”

This section comments on chapters in Ashby’s “Introduction to Cybernetics”.

Preface

Ashby started by describing cybernetics as a theory of complex regulating systems.

Preface: “deals with mechanism and information as they are used in biological systems for regulation and control, both in [physiology and psychology]… It lays the foundation for a general theory of complex regulating systems… Thus, on the one hand it provides an explanation of the outstanding powers of regulation possessed by the brain, and on the other hand it provides the principles by which a designer may build machines of like power.”

 

Ashby’s ambition

Ashby believed his principles to be applicable to large and complex entities.

 

“Preface: [this book] introduces the principles that must be followed when the system is so large and complex (e.g. brain or society) that it can be treated only statistically.”

 

The words large and complex surely refer to the size and complexity of a real ‘machine’, since the systems we abstract from a large and complex machine may be of any size or complexity (small or large, simple or complex).

 

What does large mean? Many actors, many variables, many inputs?  In some cases, many actors interact to affect only a few selected variables. (Consider the interactions between a virus and human populations, and the few variables reported in Worldometer). In other cases, an entity maintains many variables, and is affected by many parameters.

Chapter 1: What is new?

 

Abstraction of systems from real ‘machines’

1/3. Cybernetics stands to the ‘real machine’—electronic, mechanical, neural, or economic—much as geometry stands to a real object in our terrestrial space.

 

Thus, Ashby distinguishes the abstract systems he discusses from any physical real ‘machines’ that realize them, much as one may distinguish the abstract type “triangle” from its instantiation on many pool tables. He returned to this several times; we’ll return to it also.

 

Application to social entities

1/6 “cybernetics is likely to reveal a great number of interesting and suggestive parallelisms between machine and brain and society. And it can provide the common language by which discoveries in one branch can readily be made use of in the others.”

 

Clearly, Ashby hoped his principles for the regulation of a target by a controller could be applied to large and complex entities like a human society or economy. He returned to this several times; we’ll return to it also.

Chapter 2: Change

2/1. “The most fundamental concept in cybernetics is that of ‘difference’, either that two things are recognisably different, or that one thing has changed with time… We assume change occurs by a measurable jump.”

 

Ashby defined a vocabulary for cybernetics. Here are my interpretations of some terms relating to the occurrence of a state change.

 

·       State variable: an entity’s property, whose value can be changed.

·       Transition: a change from one state variable value to another.

·       Operator: a factor that causes a transition (be it a continuous force, process or condition, or else a discrete input or event).

·       Operand: an entity’s state variable value prior to a transition.

·       Transform: an entity’s state variable value after a transition.

·       Transformation: a set of transitions caused by an operator acting on a set of operands.

 

Transformations

A transformation is the set of transitions caused by an operator. It is an abstract model of a “way of behaving”. An open transformation produces transforms that do not appear among the operands. The open transformation below is composed of eight transitions.

 

Operand:    A   B   C   D   E   F   G   H

Transform:  F   H   K   L   G   J   E   M

 

By contrast, this closed transformation produces only transforms found in the operands.

 

Operand:    A   B   C   D   E   F   G   H

Transform:  H   G   F   E   D   C   B   A

 

This repeated transformation returns the operands to the initial state, and further repetition would cycle between the two states.

 

Operand:              A   B   C   D   E   F   G   H

Transform/Operand:    H   G   F   E   D   C   B   A

Transform:            A   B   C   D   E   F   G   H

 

“A transformation may be applied more than once, generating a series of changes analogous to the series of changes that a dynamic system goes through when active.”

 

A multi-valued transformation, given an operand, produces one of two or more alternative transforms. By contrast, a single-valued transformation converts each operand to exactly one transform (as in the examples above).
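Ashby’s transformation tables lend themselves to a direct sketch in code. Below is a minimal Python illustration (my own notation, not Ashby’s) that models the tables above as mappings, tests whether a transformation is closed, and applies a closed transformation repeatedly to trace a line of behaviour.

```python
# Illustrative sketch: a transformation as a mapping from operands to
# transforms, using the example tables above.

open_t = {'A': 'F', 'B': 'H', 'C': 'K', 'D': 'L',
          'E': 'G', 'F': 'J', 'G': 'E', 'H': 'M'}

closed_t = {'A': 'H', 'B': 'G', 'C': 'F', 'D': 'E',
            'E': 'D', 'F': 'C', 'G': 'B', 'H': 'A'}

def is_closed(t):
    """Closed: every transform also appears among the operands."""
    return set(t.values()) <= set(t.keys())

def apply_repeatedly(t, state, n):
    """Apply a closed transformation n times, tracing the line of behaviour."""
    trace = [state]
    for _ in range(n):
        state = t[state]
        trace.append(state)
    return trace

print(is_closed(open_t))                   # False: K, L, J, M are not operands
print(is_closed(closed_t))                 # True
print(apply_repeatedly(closed_t, 'A', 2))  # ['A', 'H', 'A'] - a two-state cycle
```

Note that because a dict maps each key to exactly one value, this sketch can only represent single-valued transformations; a multi-valued transformation would need a set of alternatives per operand.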

 

Continuous and discrete change

In chapter 2/2, Ashby defines a change from state to state as a transformation that determines, for each state variable value (each operand) in the system of interest, what the next value will be (its transform).

 

Ashby also explains why he regards it as sufficient to model all state-to-state transformations as discrete, even if they are continuous in the real world.

 

“Often a change occurs continuously, that is, by infinitesimal steps, as when the earth moves through space, or a sunbather’s skin darkens under exposure. The consideration of steps that are infinitesimal, however, raises a number of purely mathematical difficulties, so we shall avoid their consideration entirely. Instead, we shall assume in all cases that the changes occur by finite steps in time and that any difference is also finite. We shall assume that the change occurs by a measurable jump, as the money in a bank account changes by at least a penny. Though this supposition may seem artificial in a world in which continuity is common, it has great advantages in an Introduction and is not as artificial as it seems. When the differences are finite, all the important questions, as we shall see later, can be decided by simple counting, so that it is easy to be quite sure whether we are right or not. Were we to consider continuous changes we would often have to compare infinitesimal against infinitesimal, or to consider what we would have after adding together an infinite number of infinitesimals—questions by no means easy to answer.

 

As a simple trick, the discrete can often be carried over into the continuous, in a way suitable for practical purposes, by making a graph of the discrete, with the values shown as separate points. It is then easy to see the form that the changes will take if the points were to become infinitely numerous and close together.

 

In fact, however, by keeping the discussion to the case of the finite difference we lose nothing. For having established with certainty what happens when the differences have a particular size we can consider the case when they are rather smaller. When this case is known with certainty we can consider what happens when they are smaller still. We can progress in this way, each step being well established, until we perceive the trend; then we can say what is the limit as the difference tends to zero.

 

This, in fact, is the method that the mathematician always does use if he wants to be really sure of what happens when the changes are continuous. Thus, consideration of the case in which all differences are finite loses nothing, it gives a clear and simple foundation; and it can always be converted to the continuous form if that is desired.”

 

Having proposed that continuous internal state change be modelled as discrete step changes, Ashby leaves open the question of whether continuous inputs can or should be modelled as discrete events, a question we’ll return to shortly.
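Ashby’s method of establishing the finite case and then shrinking the step until the trend is clear can be illustrated with a small sketch (my own example, with an assumed growth rate): a quantity growing by measurable jumps tends toward the continuous exponential as the jumps get smaller.

```python
import math

# Illustrative sketch of Ashby's argument: establish what happens with a
# finite step, then consider smaller and smaller steps and perceive the
# trend toward the continuous limit.

def grow(rate, t, steps):
    """Grow 1.0 at the given rate over time t, in a finite number of jumps."""
    x = 1.0
    dt = t / steps
    for _ in range(steps):
        x += x * rate * dt   # one finite, measurable jump
    return x

for steps in (1, 10, 100, 1000):
    print(steps, grow(0.05, 10, steps))

print(math.exp(0.05 * 10))   # the limit as the difference tends to zero
```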

Chapter 3: Determinate machines

 

A real machine as performer of a logical transformation

In chapter 3/1, Ashby defined a real physical machine as one that behaves in the same way as an abstract logical transformation.

 

“It should be noticed that the definition refers to a way of behaving, not to a material thing. We are concerned in this book with those aspects of systems that are determinate—that follow regular and reproducible courses. It is the determinateness that we shall study, not the material substance.”

 

I interpret this as: we are concerned with processes that follow regular and reproducible courses, rather than with the material or biological nature of the actors that perform those processes. In other words, a real or physical machine has a characteristic way of behaving; it changes from state to state as defined by an abstract transformation.

 

“Every machine or dynamic system has many distinguishable states. If it is a determinate machine, fixing its circumstances and the state it is at will determine, i.e. make unique the state it next moves to… By relating machine and transformation we enter the discipline that relates the behaviours of real physical systems to the properties of symbolic expressions, written with pen on paper.”

 

I interpret this as: by relating a physical entity to a state change we start down a road that may relate the behaviour of that physical entity to the types and rules of an abstract system.

But only if that state change occurs in a regular and reproducible way; it is part of a pattern we can model.

 

Ashby defined a determinate machine in a way that is very restrictive. His determinate machine behaves as a closed single-valued transformation. In other words, given an operand, Ashby’s determinate machine can neither choose between two or more optional transforms, nor produce one not seen in the operands.

 

This might characterise a closed mechanical or biological machine working under its own internal drive. But surely real software, psychological and social machines can choose between alternative actions/transitions, and still embody one system? Moreover, psychological and social entities can produce transforms never previously seen or envisaged.

 

If a ‘real machine’ behaves according to a transformation, then Ashby says at 3/4:

·       “The transformation is the canonical representation of the machine”

·       “The machine is said to embody the transformation.”

 

Above, it appears the mapping is of one machine to one transformation. Later, one machine may perform a set of (alternative) transformations.

 

Might the same ‘real machine’ also be able to behave in other ways not defined by the transformation? That certainly applies to human and social entities, who can do many things beyond any transformations we can model.

 

Coupling of a ‘real machine’ to its environment

3/8. Given an organism, its environment is defined as those variables whose changes affect the organism, and those variables which are changed by the organism's behavior.

 

In other words, there is a feedback loop between an organism and its environment. The organism affects the environment, and the environment affects the organism. Together they are coupled in a ‘real machine’, definable in an abstract system.

 

Abstraction of systems from ‘real machines’

3/11 “At this point we must be clear about how a “system” is to be defined. Our first impulse is to point at a pendulum and to say “the system is that thing there”. This method, however, has a fundamental disadvantage: every material object contains no less than an infinity of variables and therefore of possible systems.

 

The real pendulum, for instance, has not only length and position; it has also mass, temperature, electric conductivity, crystalline structure, chemical impurities, some radio-activity, velocity, reflecting power, tensile strength, a surface film of moisture, bacterial contamination, an optical absorption, elasticity, shape, specific gravity, and so on and on…

 

… Any suggestion that we should study “all” the facts is unrealistic, and actually the attempt is never made. What is try [sic] is that we should pick out and study the facts that are relevant to some main interest that is already given.”

Chapter 4 The machine with input

 

A real machine as performer of alternative transformations

In chapter 4/1, Ashby defined a machine with input as one that can behave in different ways, depending on the value of an input parameter. To begin with, he gave three examples.

 

·       Dismantling a toy into its parts and building a different toy out of them

·       Pulling a lever on a machine to change its way of behaving

·       Setting a parameter value to 1, 2 or 3, so a machine will perform transformation R1, R2 or R3

 

The first example seems different from the other two, but let that pass. In the latter two examples, the input is a parameter that determines which of several pre-defined ways of behaving the machine will proceed to follow.

 

4/1 “A transformation corresponds to a machine with a characteristic way of behaving; so the set of three—R1, R2, and R3— if embodied in the same physical body, would have to correspond to a machine with three ways of behaving. Can a machine have three ways of behaving? It can, for the conditions under which it works can be altered.”

 

“Many a machine has a switch or lever on it that can be set at any one of three positions, and the setting determines which of three ways of behaving will occur. Thus, if a, etc., specify the machine’s states, and R1 corresponds to the switch being in position 1, and R2 corresponds to the switch being in position 2, then the change of R’s subscript from 1 to 2 corresponds precisely with the change of the switch from position 1 to position 2; and it corresponds to the machine’s change from one way of behaving to another.

 

State changes and mutations

4/1 “It will be seen that the word “change” if applied to such a machine can refer to two very different things. There is the change from state to state, from a to b say, which is the machine’s behaviour, and which occurs under its own internal drive, and there is the change from transformation to transformation, from R1 to R2 say, which is a change of its way of behaving, and which occurs at the whim of the experimenter or some other outside factor.

 

The distinction is fundamental and must on no account be slighted.”

 

In my words, “change” can mean:

·       A state change: a change from state to state, a transformation that is the system’s way of behaving, which occurs under its own internal drive

·       A mutation (my word): a change that replaces one transformation or way of behaving by another, which occurs in response to some external factor.
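The distinction between the two kinds of change can be sketched in code (an illustrative toy of my own, with made-up transformation tables R1 and R2, not Ashby’s): the switch position is the input parameter, and it selects which closed single-valued transformation the machine follows.

```python
# Illustrative sketch: a "machine with input". The switch position selects
# which way of behaving (which transformation) the machine follows.

R = {
    1: {'a': 'b', 'b': 'c', 'c': 'a'},   # way of behaving R1
    2: {'a': 'c', 'b': 'a', 'c': 'b'},   # way of behaving R2
}

class MachineWithInput:
    def __init__(self, state, switch):
        self.state = state
        self.switch = switch

    def step(self):
        """A state change: occurs under the machine's own internal drive."""
        self.state = R[self.switch][self.state]

    def set_switch(self, position):
        """A 'mutation': an outside factor replaces one way of behaving by another."""
        self.switch = position

m = MachineWithInput('a', 1)
m.step(); m.step()    # a -> b -> c under R1
m.set_switch(2)       # the experimenter alters the conditions
m.step()              # c -> b under R2
print(m.state)        # 'b'
```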

 

Parameters versus inputs

A transducer is a device that converts energy from one form to another; typically, it converts electrical signals to and from other physical quantities (force, light, motion, etc.).

 

“4/4. Input and output. The word “transducer” is used by the physicist, and especially by the electrical engineer, to describe any determinate physical system that has certain defined places of input, at which the experimenter may enforce changes that affect its behaviour, and certain defined places of output, at which he observes changes of certain variables, either directly or through suitable instruments….

 

I interpret this as: Ashby’s machine with input or transducer is a physical entity that has switches and dials (or other I/O devices) by which an experimenter can modify its behavior and observe the results.

 

With an electrical system, the input is usually obvious and restricted to a few terminals. In biological systems, however, the number of parameters is commonly very large and the whole set of them is by no means obvious. It is, in fact, co-extensive with the set of “all variables whose change directly affects the organism.”

 

All the variables that affect the organism?

 

“the reader must therefore be prepared to interpret the word “input” to mean either the few parameters appropriate to a simple mechanism or the many parameters appropriate to the free-living organism in a complex environment.”

 

It seems to me Ashby declared the transferability of his cybernetic principles:

 

1.     from a “determinate machine” (chapter 3) that performs a “single-valued closed transformation”,

2.     to a “machine with input” (chapter 4) that can be configured by an input parameter to behave (autonomously) this way or that,

3.     to a “black box” machine (chapter 6) with a continuous (say, mechanical or electrical) input stream that can be configured, and also coupled by input and output to other machines in a wider system (be it closed or not),

4.     to a “free-living organism” that has evolved so as to respond to a variety of discrete input events (more akin to a software system),

5.     to a human social entity, in which anything is possible.

 

It seems to me there is some discontinuity in the steps from 3 to 4, and 4 to 5.

 

“A ‘real machine’ whose behaviour can be represented by such a set of closed single-valued transformations will be called a transducer or a machine with input (according to the convenience of the context). The set of transformations is its canonical representation. The parameter, as something that can vary, is its input.”

 

At least three kinds of input seem to be in play here. Consider three cases:

 

·       Case 1: We use parameter values to switch a machine from behaving one way to another – we set a clock to run forwards or backwards.

·       Case 2: We use parameter values to determine the point from which a machine starts or continues – we set the hands of a clock to summer time or winter time.

·       Case 3: We respond to events as we live our life - we hear a spoken message, see a light switched on or off, smell some rotting meat, feel the sting of a nettle, feel a blast of cold air.

 

In chapter 3/4, it appears one machine realizes one transformation. In chapter 4/4, one machine realizes several transformations. Are those transformations (case 1) alternative systems, which a machine can realize depending on the value of an input parameter?  Or (case 3) successive transformations that one system must make when stimulated by different input events?

 

In case 3, does Ashby’s “free-living organism” realize one system or many? Given that each event type triggers a different transformation, does he see each transformation as a different system?

 

Coupling

Ashby spoke of a coupling of two “machines with input” in one closed system. But he also spoke of coupling machines in one machine, and spoke of a system as having inputs.

 

4/6. A fundamental property of machines is that they can be coupled. Two or more whole machines can be coupled to form one machine; and any one machine can be regarded as formed by the coupling of its parts, which can themselves be thought of as small, sub-, machines.

 

Just as, in the material world, smaller machines may be seen as submachines of a larger machine, so in an abstract system description, smaller systems may be seen as subsystems of a larger system. (And in a system dynamics model, each stock can be seen as a subsystem of the whole system.)

 

“The coupling is of profound importance in science, for when the experimenter runs an experiment he is coupling himself temporarily to the system that he is studying.”

 

Surely, Ashby means the machine rather than the system?

 

“To what does this process, the joining of machine to machine or of part to part, correspond in the symbolic form of transformations? Of what does the operation of “coupling” consist? … we want a way of coupling that does no violence to each machine’s inner working, so that after the coupling each machine is still the same machine that it was before.

 

Encapsulation

For this to be so, the coupling must be arranged so that, in principle, each machine affects the other only by affecting its conditions, i.e. by affecting its input. Thus, if the machines are to retain their individual natures after being coupled to form a whole, the coupling must be between the (given) inputs and outputs, other parts being left alone no matter how readily accessible they may be.

 

This is of course a basic principle of software system design (be it called modular, object-oriented, component-based, service-oriented, or whatever the latest fashion is). It applies whether the inputs and outputs are continuous electrical signals or discrete messages.
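As a sketch of that principle (my own toy example, not from Ashby): two machines are coupled only through their inputs and outputs, each machine’s inner working left untouched, and the behaviour of the coupled whole then unfolds step by step.

```python
# Illustrative sketch: two machines coupled only via inputs and outputs.
# Neither function's inner table is touched by the coupling.

def step_m1(state, inp):
    """M1: a two-state machine driven by its own step plus its input."""
    return (state + 1 + inp) % 2

def step_m2(state, inp):
    """M2: a three-state counter driven by its input."""
    return (state + inp) % 3

s1, s2 = 0, 0
for _ in range(6):
    out1 = s1          # M1's output is fed to M2's input
    out2 = s2 % 2      # M2's output is fed to M1's input
    s1, s2 = step_m1(s1, out2), step_m2(s2, out1)

print(s1, s2)   # this particular coupling happens to settle at (0, 1)
```

The behaviour of the coupled whole (here, settling to a resting state) is determined only once the details of the coupling are fixed, which anticipates the point Ashby makes at 4/10.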

 

Emergence

The properties and outcomes of a wider system emerge from the coupling of its subsystems.

4/10 “That a whole machine should be built of parts of given behavior is not sufficient to determine its behavior as a whole. Only when the details of coupling are added does the whole's behavior become determinate.”

Chapter 5: Stability

“In the cases considered so far, the equilibrium or stability has been examined only at the particular state or states concerned. A “disturbance” is simply that which displaces, that which moves a system from one state to another. So, if defined accurately, it will be represented by a transformation having the system’s states as operands.”

 

Is that disturbance an operator in the sense of chapter 2?

Chapter 6: The black box (parameters and inputs)

In chapter 6, Ashby turned to the definition of a machine with input and output.

 

6/1. The methods developed in the previous chapters now enable us to undertake a study of the Problem of the Black Box; and the study will provide an excellent example of the use of the methods.

 

The Problem of the Black Box arose in electrical engineering. The engineer is given a sealed box that has terminals for input, to which he may bring any voltages, shocks, or other disturbances he pleases, and terminals for output, from which he may observe what he can. He is to deduce what he can of its contents.

 

Having proposed that continuous internal state change be modelled as discrete changes, Ashby here seems to countenance both possibilities for inputs. The box may receive a continuous electrical signal, and also receive discrete inputs by way of “disturbances”.

 

“Though the problem arose in purely electrical form, its range of application is far wider. The clinician studying a patient with brain damage and aphasia may be trying, by means of tests given and speech observed, to deduce something of the mechanisms that are involved. And the psychologist who is studying a rat in a maze may act on the rat with various stimuli and may observe the rat’s various behaviours; and by putting the facts together he may try to deduce something about the neuronic mechanism that he cannot observe. I need not give further examples as they are to be found everywhere (S.6/17).”

 

What kind of stimuli are envisaged here? Parameters that are input at the start of an experiment (the system), after which the rat pursues a particular way of behaving under its own drive? Or discrete events during one experiment (one system), to which the rat must respond?

 

“Black Box theory is, however, even wider in application than these professional studies. The child who tries to open a door has to manipulate the handle (the input) so as to produce the desired movement at the latch (the output); and he has to learn how to control the one by the other without being able to see the internal mechanism that links them. In our daily lives we are confronted at every turn with systems whose internal mechanisms are not fully open to inspection, and which must be treated by the methods appropriate to the Black Box”.

 

For sure, in our daily lives, and in EA and BA, we may both observe and participate in countless potentially describable open systems, which are driven to behave as they do by a stream of discrete events.

 

“Every real system has an indefinitely large number of possible inputs—of possible means by which the experimenter may exert some action on the Box. Equally, it has an indefinitely large number of possible outputs—of ways by which it may affect the experimenter, perhaps through recording instruments. If the investigation is to be orderly, the set of inputs to be used and of outputs to be observed must be decided on, at least provisionally.”
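Ashby’s protocol can be sketched as code (a toy of my own devising; the hidden mechanism merely stands in for whatever is sealed in the box): the experimenter decides on a set of inputs in advance, acts on the box, and records the input-output protocol from which the box’s behaviour is to be deduced.

```python
# Illustrative sketch of the Black Box protocol. The mechanism inside is
# hidden; only the input terminal and the output dial are available.

def make_black_box():
    state = {'x': 0}            # not visible to the experimenter
    def poke(inp):              # a terminal for input...
        state['x'] = (state['x'] + inp * inp) % 5
        return state['x']       # ...and a dial for output
    return poke

box = make_black_box()
protocol_inputs = [1, 2, 1, 3, 1]            # decided on, at least provisionally
log = [(i, box(i)) for i in protocol_inputs]
print(log)   # the record from which the box's contents are to be deduced
```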

 

No system has an infinitely large range of input event types, nor can it be ready, in advance, to switch to any other way of behaving. It can, however, be replaced (ad infinitum) by a different system that behaves in a different way.

 

“Let us assume, then, that this has been done. The situation that we (author and reader) are considering can be made clearer by the introduction of two harmless conventions. Let it be assumed that the inputs, whatever their real nature, are replaced by, or represented by, a set of levers or pointers […] as to what is meant by the input “being in a certain state”—it is the state that would be shown on a snapshot of the controls. Also let us assume that the output consists of a set of dials, attached to the Box and affected by the mechanism inside, so that the pointers on the dials show, by their positions at any particular moment, the state of the output. We now see the experimenter much like the engineer in a ship, who sits before a set of levers and telegraphs by which he may act on the engines, and who can observe the results on a row of dials. The representation, though it may seem unnatural, is in fact, of course, capable of representing the great majority of natural systems, even if biological or economic.”

 

Again, Ashby hoped his principles for the regulation of a target by a regulator could be applied to large and complex entities like a human society or economy.

What was new?

Let us revisit what Ashby said was new about cybernetics.

System as observer’s abstraction

“It is important to stress Ashby defined a system not as something that exists in nature. A system consisted of a set of variables chosen for attention and relationships between these variables, established by observation, experimentation, or design.” (Ashby’s student Krippendorff, writing in 2009 on Ashby’s information theory.)

 

In “Design for a Brain” (1954)

2/4. “A system is any arbitrarily selected set of variables. It is a list nominated by the experimenter, and is quite different from the real ‘machine’.”

 

2/5. “It will be appreciated that every real ‘machine’ embodies no less than an infinite number of variables, most of which must of necessity be ignored. Faced with this infinite number of variables, the experimenter must, and of course does, select a definite number for examination—in other words, he defines his system.”

 

In “Introduction to Cybernetics”

3/11 “At this point we must be clear about how a “system” is to be defined. Our first impulse is to point at a pendulum and to say “the system is that thing there”. This method, however, has a fundamental disadvantage: every material object contains no less than an infinity of variables and therefore of possible systems. … Any suggestion that we should study “all” the facts is unrealistic, and actually the attempt is never made. What is try [sic] is that we should pick out and study the facts that are relevant to some main interest that is already given.”

 

Nevertheless, Ashby wrote of “the real system” nine times and of “the physical system” twice, and often referred to a real-world entity (be it an electrical transducer or an organism) as a system.

 

Below, his real system is an entity that in nature consumes and produces countless inputs and outputs, most of which must be ignored when defining a system.

6/2 “Every real system has an indefinitely large number of possible inputs—of possible means by which the experimenter may exert some action on the Box. Equally, it has an indefinitely large number of possible outputs—of ways by which it may affect the experimenter, perhaps through recording instruments. If the investigation is to be orderly, the set of inputs to be used and of outputs to be observed must be decided on, at least provisionally.”

 

Again below, his “whole system” is something that exists in nature. It is “the thing as it is in itself” which stands “apart from any observer”. 

6/14 “Faced with such a system [meaning, when observing a large real-world entity] the observer must be cautious in referring to “the system”, for the term will probably be ambiguous, perhaps highly so. “The system” may refer to the whole system quite apart from any observer to study it— the thing as it is in itself; or it may refer to the set of variables (or states) with which some given observer is concerned.

 

Remembering Krippendorff’s point that Ashby “defined a system not as something that exists in nature”, an entity in reality only becomes a real or physical system (a specified machine) when and in so far as it realizes an abstract system defined by an observer. E.g. a card school only becomes a real or physical card game system (a specified machine) when and in so far as it plays a game of cards.

Abstraction of systems from real machines

Ashby makes clear in both books that observers can abstract N disparate systems from observation of one ‘real machine’. He uses the word machine not only to mean a ‘real machine’, but also, usually implicitly if not explicitly, with regard to that machine’s realization of one dynamic system (be that one or N transformations, however you read him on that). To observers, where one ‘real machine’ realizes two different systems, it acts, to all intents and purposes, as two different machines.

 

1/3. Cybernetics stands to the ‘real machine’—electronic, mechanical, neural, or economic—much as geometry stands to a real object in our terrestrial space.

 

Thus, Ashby distinguishes the abstract systems he discusses from any physical ‘real machines’ that realize them, much as one may distinguish the abstract type “triangle” from its instantiation on many pool tables. He returned several times to the relationship between a ‘real machine’ and one or more abstract transformations.

 

3/4 Where a ‘real machine’ and a transformation are so related, the transformation is the canonical representation of the machine, and the machine is said to embody the transformation.

 

In chapter 3, it appears Ashby is mapping a machine to one transformation. In chapter 4, he says a machine, given an input, may perform one of different (alternative) transformations and/or repeated transformations.

 

4/1 A ‘real machine’ whose behaviour can be represented by such a set of closed single-valued transformations will be called a transducer or a machine with input. The set of transformations is its canonical representation. The parameter, as something that can vary, is its input.

 

This input parameter is a switch or lever that determines which of several possible ways of behaving the machine will proceed to follow, rather than an input that stimulates an action in the course of that behavior.

 

6/14: “There comes a stage, however, as the system becomes larger and larger, when the reception of all the information is impossible by reason of its sheer bulk. When this occurs, what is he to do? The answer is clear: he must give up any ambition to know the whole system.”

 

It seems to me Ashby might here be making the unreasonable presumption that every large entity of study can rightly be called, or modelled as, one independent dynamic system.


I’d put it thus: can we scale up from a transducer with one main purpose, and a few switches and dials, to a human society or business? Faced with such a large physical entity, with countless variables, inputs and outputs, it is impossible in practice to model the whole of the entity as one coherent system. Moreover, it is impossible in theory, because the entity has multiple purposes, it is distributed, and its actors pull in different directions and participate in other entities. And many of its activities are not “regular and reproducible”; they are determined by actors in an ad hoc way.

 

6/14: “Faced with [a large] system, the observer must be cautious in referring to “the system”, for the term will probably be ambiguous, perhaps highly so. “The system” may refer to the whole system quite apart from any observer to study it— the thing as it is in itself; or it may refer to the set of variables (or states) with which some given observer is concerned. Though the former sounds more imposing philosophically, the practical worker inevitably finds the second more important. Then the second meaning can itself be ambiguous if the particular observer is not specified, for the system may be any one of the many sub-machines provided by homomorphism. Why all these meanings should be distinguished is because different sub-machines can have different properties; so that although both sub-machines may be abstracted from the same real “thing”, a statement that is true of one may be false of another.”

 

I interpret this as: faced with a large physical entity, the observer must beware calling it “the system”, for the term will probably be ambiguous. “The system” may refer to the whole physical entity quite apart from any observer, to the entity as it is in itself; or it may refer to its realization of the one abstract system with which some given observer is concerned. The practical worker inevitably finds the second more important. This second meaning can itself be ambiguous if the particular observer is not specified, for the abstract system may be any one of the many systems abstractable by observers from the same physical entity. Why all these meanings should be distinguished is because different systems can have different properties; so that although two observers may abstract two systems from the same physical entity, a statement that is true of one may be false of another. To observers, when the one physical entity realizes different systems, it acts, to all intents and purposes, as different systems. The observer of one system may be unaware and careless of the other.
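The idea that different observers abstract different sub-machines from the same thing, and that a statement true of one may be false of another, can be sketched as code (an illustrative toy of my own; lumping states together is one simple way to form the coarser machines Ashby attributes to homomorphism).

```python
# Illustrative sketch: two observers lump the states of the same
# fine-grained transformation differently. One lumping yields a consistent
# sub-machine; the other yields none at all.

fine = {0: 1, 1: 2, 2: 3, 3: 0}    # the finest-grained transformation

def parity(s):                     # observer A lumps states by parity
    return s % 2

def half(s):                       # observer B lumps neighbouring states
    return s // 2

def induced(t, lump):
    """The coarse transformation an observer sees, if it is well defined."""
    table = {}
    for s, nxt in t.items():
        if table.setdefault(lump(s), lump(nxt)) != lump(nxt):
            return None            # lumping inconsistent: no sub-machine here
    return table

print(induced(fine, parity))   # {0: 1, 1: 0} - a well-defined two-state machine
print(induced(fine, half))     # None - this observer's lumping fails
```

So “the machine is determinate” is true of observer A’s abstraction and not even well formed for observer B’s, though both looked at the same “thing”.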

 

The abstraction of systems from real things or entities is a theme Ashby returned to time and again. But I sometimes struggle with his wording. Read the quote below. For me, to represent two different aspects of one entity as systems is to define two different systems, potentially in conflict. But to define one system at two different levels of granularity is to define homomorphic representations of the same system.

 

6/14 It follows that there can be no such thing as the (unique) behaviour of a very large system, apart from a given observer. For there can legitimately be as many sub-machines as observers, and therefore as many behaviours, which may actually be so different as to be incompatible if they occurred in one system….
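To make this concrete, here is a toy sketch in Python. The entity, its variables and their values are all hypothetical; the point is only to show how two observers, selecting different variables from the same entity, abstract two systems of which contradictory statements can be true.

```python
# Toy sketch (hypothetical entity and variables): one entity, two observers.
# Observer A attends only to temperature; observer B only to pressure.

entity_history = [
    {"temperature": 20, "pressure": 100},
    {"temperature": 21, "pressure": 90},
    {"temperature": 22, "pressure": 80},
]

def abstract_system(history, variables):
    """An observer's system: the entity's states projected onto chosen variables."""
    return [{v: state[v] for v in variables} for state in history]

system_a = abstract_system(entity_history, ["temperature"])
system_b = abstract_system(entity_history, ["pressure"])

# The statement "the variable's value is rising" is true of system A
# but false of system B, though both are abstracted from one entity.
rising_a = all(s2["temperature"] > s1["temperature"]
               for s1, s2 in zip(system_a, system_a[1:]))
rising_b = all(s2["pressure"] > s1["pressure"]
               for s1, s2 in zip(system_b, system_b[1:]))
```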

 

The point of view taken here is that science (as represented by the observer’s discoveries) is not immediately concerned with discovering what the system “really” is, but with co-ordinating the various observers’ discoveries, each of which is only a portion, or an aspect, of the whole truth. Were the engineer to treat bridgebuilding by a consideration of every atom he would find the task impossible by its very size. He therefore ignores the fact that his girders and blocks are really composite, made of atoms, and treats them as his units. As it happens, the nature of girders permits this simplification, and the engineer’s work becomes a practical possibility. It will be seen therefore that the method of studying very large systems by studying only carefully selected aspects of them is simply what is always done in practice. Here we intend to follow the process more rigorously and consciously.

 

At first, I interpret this as saying “We cannot abstract one (unique) system from the behavior of a very large and complex entity (be it an organism or a business), since we always select some aspect of it, some variables of interest to us.” Here, however, I think Ashby means to emphasise that whatever system we do abstract, it is always an abstraction from the finest-grained structures in space and behaviors over time. We always ignore the internal structure and behavior of whatever we regard as atomic actors and activities.

 

Below, Ashby says that at any one time, a machine exhibits one value for each of its (potentially many) state variables. (We shall see it is challenging to scale up this idea to a large and distributed business with hundreds of discrete databases.)

 

7/23. Set and machine. We must now be clear about how a set of states can be associated with a machine, for no ‘real machine’ can, at one time, be in more than one state.

 

Later, Ashby explains that the behavior of a ‘real machine’ may not be predictable.

 

12/9. Whether a given ‘real machine’ appears Markovian or determinate will sometimes depend on how much of the machine is observable (S.3/11); and sometimes a ‘real machine’ may be such that an apparently small change of the range of observation may be sufficient to change the appearances from that of one class [Markovian] to the other [determinate].
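This point can be illustrated with a toy machine (my own hypothetical example, not Ashby’s). The machine below is fully determinate over two variables; but an observer who sees only the `visible` variable records transitions that appear Markovian, because from visible state 0 the machine sometimes seems to go to 0 and sometimes to 1.

```python
# Sketch: a determinate machine over (visible, hidden) appears Markovian
# (stochastic) to an observer whose range of observation excludes `hidden`.
from collections import defaultdict

def step(visible, hidden):
    """Fully determinate: the next state depends on both variables."""
    return (visible + hidden) % 2, (hidden + 1) % 2

# Record the transitions of the visible variable alone.
observed = defaultdict(set)
visible, hidden = 0, 0
for _ in range(8):
    nxt_visible, nxt_hidden = step(visible, hidden)
    observed[visible].add(nxt_visible)
    visible, hidden = nxt_visible, nxt_hidden

# observed[0] contains both 0 and 1: to the restricted observer, the
# same (visible) state has more than one successor, as in a Markov chain.
```

Enlarging the range of observation to include `hidden` restores the appearance of determinateness, which is Ashby’s point about how small a change of observation can switch a machine from one class to the other.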

Application to social entities

1/6 “cybernetics is likely to reveal a great number of interesting and suggestive parallelisms between machine and brain and society. And it can provide the common language by which discoveries in one branch can readily be made use of in the others.”

 

Clearly, Ashby hoped his principles for the regulation of a target by a controller could be applied to large and complex entities like a human society or economy. Ashby returned several times to the idea of applying cybernetics to social entities.

 

4/16. “Up till now, the systems considered have all seemed fairly simple, and it has been assumed that at all times we have understood them in all detail. Cybernetics, however, looks forward to being able to handle systems of vastly greater complexity: computing machines, nervous systems, societies.”

 

12/23. “this chapter has treated only of systems that were sufficiently small and manageable to be understood. What happens, he may ask, when regulation and control are attempted in systems of biological size and complexity? What happens, for instance, when regulation and control are attempted in the brain or in a human society? Discussion of this question will occupy the remaining chapters.”

 

13/10. “Here we shall be thinking not so much of the engineer at his bench as of the brain that, if it is to achieve regulation in its learned reactions, must somehow cause the development of regulatory machinery within the nervous material available; or of the sociologist who wants a regulatory organisation to bring harmony into society.”

 

14/6. “is there not a possibility that we can use our present powers of regulation to form a more highly developed regulator, of much more than human capacity, that can regulate the various ills that occur in society, which, in relation to us, is a very large system?”

 

Remember, while it is true that social entities are larger (in space) than human beings, the systems we abstract from them may be of any size, smaller or larger.

Conclusions and remarks

In short, in cybernetics, a system is a highly selective abstraction from what happens in a physical entity, be it an electrical transducer, an organism or a social entity.  Several systems may be abstracted from one entity. One system may be realized by many entities.

 

A reader tells me most of the predictive power in Ashby’s systems theory comes from the use of linear time-invariant models of systems. (Linearity means the input-to-output relationship is described by differential equations employing only linear operators. Time invariance means that whether we apply an input to the system now or T seconds later, the output will be identical except for a time delay of T seconds.)
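The time-invariance property can be demonstrated in a few lines. Below is a minimal sketch of a discrete-time LTI system, modelled as convolution with an impulse response; the impulse response and input values are hypothetical. Delaying the input by T samples delays the output by exactly T samples, unchanged.

```python
# Sketch: a discrete-time linear time-invariant (LTI) system as
# convolution with an impulse response h (values are illustrative).

def lti_response(x, h):
    """Direct convolution: y[n] = sum over k of h[k] * x[n-k]."""
    y = [0.0] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(h)):
            if 0 <= n - k < len(x):
                y[n] += h[k] * x[n - k]
    return y

h = [1.0, 0.5, 0.25]        # hypothetical impulse response
x = [1.0, 2.0, 3.0]         # hypothetical input signal
T = 2
x_delayed = [0.0] * T + x   # the same input, applied T steps later

y = lti_response(x, h)
y_delayed = lti_response(x_delayed, h)
# y_delayed is exactly y, delayed by T samples.
```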

 

However, Ashby’s discussion included examples of progressive systems or processes like the stickleback mating ritual, and a gale warning broadcast. And his ambition extended to the regulation of human society. He declared the transferability of his cybernetic principles:

 

1.     from a “determinate machine” (chapter 3) that performs a “single-value closed transformation”,

2.     to a “machine with input” (chapter 4) that can be configured by an input parameter to behave (autonomously) this way or that,

3.     to a “black box” machine (chapter 6) with a continuous (say, mechanical or electrical) input stream that can be configured, and can also be coupled by input and output to other machines in a wider system (be it closed or not),

4.     to a “free-living organism” that has evolved so as to respond to a wide variety of discrete input events (more akin to a software system),

5.     to a human social entity, in which anything is possible.
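Steps 1 and 2 above can be sketched in Ashby’s own terms. A “determinate machine” corresponds to a single-valued closed transformation: every state maps to exactly one next state within the same set. A “machine with input” is a family of such transformations, one selected by each value of the input parameter. The states and parameter names below are hypothetical.

```python
# 1. Determinate machine: a single-valued closed transformation.
#    "Closed" means every resulting state is itself an operand state.
T = {"a": "b", "b": "c", "c": "c"}

def run(transform, state, steps):
    """Iterate the transformation from a given initial state."""
    for _ in range(steps):
        state = transform[state]
    return state

# 2. Machine with input: the input parameter selects which
#    transformation acts on the states.
machine_with_input = {
    "p1": {"a": "b", "b": "c", "c": "c"},
    "p2": {"a": "a", "b": "a", "c": "b"},
}

final = run(T, "a", 5)                         # settles at equilibrium "c"
other = run(machine_with_input["p2"], "c", 5)  # under p2, settles at "a"
```

Note that "c" is an equilibrium of T (it maps to itself), so the line of behavior from any initial state ends there.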

 

It seems to me there is both some overlap of ideas and some discontinuity in the steps from 3 to 4 to 5. At step 4, one may think about systems more along the lines of event-driven state machines. At step 5, we need to distinguish a social entity from the various state machines its actors may play roles in.

 

It is unclear to me how far Ashby really thought his cybernetic principles for electrical and biological machines are applicable to event-driven psychological entities and business activity systems. However, the next chapter abstracts from Ashby’s cybernetics the 10 principles that seem to me most relevant to EA and BA.

Appendices

Appendix 1: on thermodynamics, entropy and energy

1/2 “Cybernetics started by being closely associated in many ways with physics, but it depends in no essential way on the laws of physics or on the properties of matter.”

 

Social systems thinking is at some remove from cybernetics, which is at some remove from thermodynamics. In his Introduction to Cybernetics, Ashby wrote:

 

1/5 “In this discussion, questions of energy play almost no part—the energy is simply taken for granted.” “Even whether the system is closed to energy or open is often irrelevant.”

 

4/15. Materiality: “cybernetics is not bound to the properties found in terrestrial matter, nor does it draw its laws from them.” “What is important is the extent to which the observed behaviour is regular and reproducible.”

 

7/24. Decay of variety: “any system, left to itself, runs to some equilibrium”. “Sometimes the second law of thermodynamics is appealed to, but this is often irrelevant to the systems discussed here.”
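The decay of variety is easy to demonstrate with a toy transformation (my hypothetical example). Start several replicates of a machine in different states and iterate the same single-valued transformation on all of them: the number of distinct states occupied (the variety) can only fall or stay level, never rise.

```python
# Sketch: iterating a single-valued transformation cannot increase the
# variety (count of distinct states) of a set of systems.

transform = {0: 1, 1: 2, 2: 2, 3: 2, 4: 0}   # hypothetical machine

states = {0, 1, 2, 3, 4}   # five replicates, all in different states
variety = [len(states)]
for _ in range(4):
    states = {transform[s] for s in states}
    variety.append(len(states))

# variety shrinks step by step until all replicates sit at equilibrium.
```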

 

9/11. Entropy: “Shannon has devised a measure for … entropy—that has proved of fundamental importance …” “The word “entropy” will be used in this book solely as it is used by Shannon; any broader concept being referred to as “variety” or in some other way.”

 

Note that Ashby distinguished entropy of communication theory from entropy of thermodynamics.

 

A whole is in a disordered state if there is no correlation between its parts. That is to say, knowing the state of one part gives no information about other parts. One source defines entropy as the number of ways such uncorrelated parts of a whole can be configured. The higher the number of parts, the higher the entropy of the whole, since there are more ways in which to arrange the parts. However, if you define one configuration, then organize the parts to match it, you are imposing order on the whole (which requires energy). Designing a system for adaptability may involve complexification - increasing the number of ways parts can be configured - and increasing the variety of system states.
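Shannon’s entropy, the sense in which Ashby uses the word, can be computed directly. The sketch below (with illustrative probability values) shows that entropy is maximal when all outcomes are equally likely (maximal disorder) and zero when one outcome is certain (a fully ordered situation).

```python
# Sketch: Shannon entropy H(p) = -sum of p_i * log2(p_i) over outcomes,
# measured in bits. The probability distributions here are illustrative.
import math

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = shannon_entropy([0.25, 0.25, 0.25, 0.25])  # maximal disorder
certain = shannon_entropy([1.0, 0.0, 0.0, 0.0])      # complete order
skewed = shannon_entropy([0.7, 0.1, 0.1, 0.1])       # somewhere between
```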

Appendix 2: spurious criticisms of cybernetics

Some myths about Ashby’s cybernetics have grown out of misreading what von Bertalanffy wrote in his 1968 book on general system theory. He deprecated some 19th century thinking, which some call reductionist. He wrote: “notions of interaction and of organization [of parts] were only space-fillers or did not appear at all.” He wrote: “notions of teleology and directiveness appeared to be outside the scope of science”. He advocated what he called “perspectivism”.

 

Is cybernetics "reductionist"? No. Cybernetics is holistic, about how parts interact to produce the "emergent properties" of a whole.

 

Does cybernetics address only linear systems? No. Cybernetics addresses systems with non-linear, complex and chaotic behaviors.

 

Was classical cybernetics replaced by second order cybernetics? No. Second order cybernetics is a different thing.

 

Does cybernetics ignore goals and goal-directedness? No. Ashby assumed both that system describers have goals, and that a system can be observed as steering itself toward a goal. His systems are models made by observers with an interest in, or goal for, a system. When disturbed, Ashby’s system tends to return to a homeostatic norm or attractor state. Some describe this phenomenon in terms of the system having a goal, and striving to move towards it.

 

Even von Bertalanffy said

“Cybernetics proved its impact in basic sciences… bringing teleological phenomena (previously tabooed) into the range of scientifically legitimate problems.” 1968 Page 23.

 

“Teleological behavior directed towards a characteristic final state or goal is not off limits for natural science and [not] an anthropomorphic misconception of processes which are undirected and accidental. Rather it is a form of behavior which can well be defined in scientific terms and for which the necessary conditions and possible mechanisms can be indicated.” 1968 Page 45.

 

Does cybernetics presume a relativist or perspectivist position? No. Bertalanffy did write “All scientific constructs are models representing certain aspects or perspectives of reality.” And he is said to have advocated what he called “perspectivism”. But that does not mean he denied the possibility of objective knowledge. He merely encouraged systems thinkers to describe systems from different viewpoints, and at different levels of abstraction (physical, biological and social). Surely, like Ashby, he presumed that a system model can represent something of reality, and that a model can be shared and verified logically and/or empirically?