Ashby’s ideas – an introduction to cybernetics

Copyright 2017 Graham Berrisford. One of more than 100 papers on the “System Theory” page at Last updated 25/09/2019 08:25


Ashby established the essential ideas and principles of classical cybernetics on a sound scientific basis.

He achieved his aim to introduce cybernetics with reference to models expressed in a mathematical form.

The aim here is to engage people who want to (or should want to) understand and apply the ideas and principles of cybernetics.

And apply them to systems (be they software or social) that are modelled using domain-specific languages rather than mathematics.

This paper includes many quotes from Ashby’s Introduction to Cybernetics (1956) and his later treatise on self-organisation (1962).

This forms the platform from which I later argue “second order cybernetics” ought to be replaced by a “third order cybernetics” that adheres more closely to Ashby's ideas.


Systems in general – recap

The emergence of cybernetics

On abstract and concrete systems - recap

On system rightness and wrongness

System dynamics

On change: all system dynamics can be modelled as discrete

Cybernetics depends on the communication of information

Communication depends on encoding and decoding

Systems are determinate (rule-bound)

Machines can process inputs

Systems can be coupled in wider systems

Systems can change in different ways

“Self-organisation” requires a wider or higher system

Control systems need the requisite variety

On evolution, intelligence and learning

Conclusions and remarks

Further reading

Footnotes


Systems in general – recap

Willard Gibbs (1839 – 1903) was a scientist, instrumental in the development of chemistry into a science.

He defined a system as “a portion of the ... universe which we choose to separate in thought from the rest of the universe."

Here, we use the word “entity” to mean “an observable or conceivable part of the world”.

It could be a planet, a hurricane, a group of people, or a performance of a symphony.


In his work on cybernetics, Ashby urged us not to confuse an entity with any abstract system that the entity realises.

To equate entities with systems (one to one) is the most common mistake you will find in “systems thinking” discussions.

A concrete entity is a system only when and in so far as it realises a testable system description.


Some define a system as "parts related in an organised whole", which may be true but is too vacuous to be of much use.

That definition includes passive structures and taxonomies, like the Linnaean system for classifying organisms.

Here, the term “system” has the more interesting and useful meaning that emerged in the 20th century.

In most modern systems thinking, the “parts” of a system are actors or components that interact in activities.


Generally, a system can be described as actors interacting in activities to advance the system’s state and/or transform inputs into outputs.

·       The actors are structures (in space) that perform activities - in roles and processes that are describable and testable.

·       The activities are behaviors (over time) that change the state of the system or something in its environment -  governed by rules that are describable and testable.

·       The state is describable as a set of state variables - each with a range of values.

·       An open system is connected to its wider environment - by inputs and outputs that are describable and testable.


These concepts can be seen in writings of Ashby, Forrester and Checkland.

In Ashby’s cybernetics, a system is modelled as processes that advance a set of state variables.

In Forrester’s System Dynamics, a system is modelled as inter-stock flows that advance a set of stocks (variable populations).

In Checkland’s soft systems method, a system is modelled as actors who perform processes that transform inputs into outputs for customers.
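The Ashby-style view can be sketched in code. A minimal, illustrative sketch (the names `System`, `deposit` and `balance` are mine, not from Ashby, Forrester or Checkland): a system as a set of state variables advanced by rule-bound activities.

```python
# Illustrative sketch: a system as state variables advanced by activities.
# All names here are hypothetical examples, not from the cited authors.

class System:
    def __init__(self, state):
        self.state = dict(state)        # state variables, each with a value

    def apply(self, activity, **inputs):
        activity(self.state, **inputs)  # an activity changes state variable values


def deposit(state, amount):             # an activity: a rule-bound state change
    state["balance"] += amount


account = System({"balance": 0})
account.apply(deposit, amount=10)
print(account.state["balance"])         # -> 10
```

The point of the sketch is only that the state (the variable values) is distinct from the rules (the activities) that advance it.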

The emergence of cybernetics

Cybernetics emerged in the 1940s out of studying the role of information in system control.

Wiener introduced cybernetics as the science of biological and mechanical machines.

He discussed how a controller directs a target system to maintain some state variable(s) in a desired range.

E.g. A thermostat directs the actions of a heating system.


Cybernetics ideas were soon adopted by psychologists and sociologists.

W. Ross Ashby (1903-1972) was a psychologist and systems theorist.

“Despite being widely influential within cybernetics, systems theory… Ashby is not as well-known as many of the notable scientists his work influenced.” (Wikipedia, 2017)


Ashby popularised the usage of the term 'cybernetics' in the study of self-regulating (rather than self-organising) systems.

“Our starting point is the idea, much more than a century old, that a machine, in given conditions and at a given internal state, always goes to a particular state.” (“Design for a Brain” 19/2).

“A variable is a measurable quantity which at every instant has a definite numerical value.” (“Design for a Brain” 2/3).

“The state of the system is the set of values that the variables have.”

“A system is any set of variables which he [observer] selects from those available on the real machine.”


Information feedback loops

Cybernetics is the science of how a system (be it biological or mechanical) can be controlled.

It addresses how a control system (via an input-output feedback loop) can control at least some activities in a target system.

Information is encoded in flows or messages that pass between systems.

A control system:

·       receives messages that describe the state of a target system

·       responds by sending messages to direct activities in the target system.
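The thermostat example can be sketched as such a loop. A hedged sketch (the temperatures, step sizes and function names are illustrative assumptions, not from Wiener): the controller reads a message describing the target's state and responds with a directive.

```python
# Illustrative sketch of a control loop: thermostat (controller) and room (target).
# All numbers and names are hypothetical.

def thermostat(temperature, target=20.0):
    """Control system: receives a state message, returns a directive."""
    return "heat_on" if temperature < target else "heat_off"

def room_step(temperature, directive):
    """Target system: its state changes under the controller's directive."""
    return temperature + (0.5 if directive == "heat_on" else -0.1)

temp = 18.0
for _ in range(10):                  # the information feedback loop
    directive = thermostat(temp)     # message about state -> directive
    temp = room_step(temp, directive)

print(round(temp, 1))                # settles near the target value
```

The loop holds the state variable near the desired range; that regulation, not the physics of heating, is what cybernetics studies.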


Information feedback loops are found in both organic and mechanical systems:

·       A missile guidance system senses spatial information and sends messages to direct the missile.

·       A brain holds a model of things in its environment, which an organism uses to manipulate those things.

·       A business database holds a model of business entities and events, which people use to monitor and direct those entities and events.

·       A software system holds a model of entities and events that it monitors and directs in its environment.


Since the 19th century, many authors have been particularly interested in homeostatic systems.

E.g. In “Design for a Brain” (1952), Ashby discussed biological organisms as homeostatic systems.

He presented the brain as a regulator that maintains each of a body’s state variables in the range suited to life.

This table distils the general idea.


Generic system: Actors interact in orderly activities to maintain system state and/or consume/deliver inputs/outputs from/to the wider environment.

Ashby’s design for a brain: Brain cells interact in processes to maintain body state variables by receiving/sending information from/to bodily organs/sensors/motors.


However, homeostatic entities and processes are only a subset of systems in general.

In his more general work, “Introduction to Cybernetics” (1956), Ashby defined a system as a set of regular or repeatable behaviors.


On abstract and concrete systems – recap

A concrete entity is a system only when and in so far as it realises a testable system description.

We do commonly abuse the term “system”.

We point to an entity (e.g. a business organisation or a biological organism) and casually call it "a system".

But with no explicit or implicit reference to a particular system description, this is to say nothing.

Idly calling an entity (or process, problem or situation) a system is meaningless, because one entity can realise countless systems.


As Ackoff, Ashby, Checkland and other systems theorists have emphasised, a system is a perspective of a reality.

“Different observers of the same [concrete] phenomena may conceptualise them into different [abstract] systems.” Ackoff 1971


Ashby urged us not confuse a concrete entity with an abstract system that the entity realises. 

“At this point we must be clear about how a "system" is to be defined.

Our first impulse is to point at [some real entity or machine] and to say "the system is that thing there".

This method, however, has a fundamental disadvantage: every material object contains no less than an infinity of variables and therefore of possible systems.

Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” (1956, 3/11)


Ashby elaborated later on the same theme.

“Important matter of principle in the study of the very large system.

“The observer must be cautious in referring to “the system”, for the term will probably be ambiguous, perhaps highly so.

“The system” may refer to:

·       the whole system quite apart from any observer to study it— the thing as it is in itself.

·       the set of variables (or states) with which some given observer is concerned.


Though the former sounds more imposing… the practical worker inevitably finds the second more important.

The second meaning can be ambiguous if the particular observer is not specified…

[Since different systems] may be abstracted from the same real “thing”, a statement that is true of one may be false of another.


It follows that there can be no such thing as the (unique) behavior of a very large system, apart from a given observer.

For there can legitimately be as many [systems] as observers, which may... be so different as to be incompatible.


Science is not immediately concerned with discovering what the system “really” is.

But [instead] with coordinating the various observers’ discoveries, each of which is only a portion, or an aspect, of the whole truth.

It will be seen therefore that the method of studying very large systems by studying only carefully selected aspects of them is simply what is always done in practice.” (1956, 6/14)


Any substantial entity (like IBM) can be described as manifesting (instantiating, realising) countless different systems.

So, we must distinguish abstract systems from concrete systems.


System theorists distinguish what some call “soft systems” and Ackoff called “abstract systems” from their realizations.

An abstract system (e.g. the rules of Poker) is a theory of, or insight into, how some part of the world works.

A concrete system (e.g. a game of Poker) is a real-world application, or empirical example, of such a theory.

Science requires us to show the latter conforms (well enough) to the former.


The basis of system theory

Abstract systems (descriptions)

<create and use>                              <represent>

System thinkers   <observe and envisage>  Concrete systems (realities)


These papers take this triangular, scientific, view of system theory as axiomatic.

·       An abstract system (e.g. the normal regular heart beat) is a description or model of how some part of the world behaves, or should behave.

·       A concrete system (e.g. your own heart) is a realisation by a real-world entity that conforms well enough to an abstract system.


Abstract systems are realised by concrete systems; and conversely, concrete systems realise abstract systems.

An abstract system does not have to be a perfect model of what it describes; it only has to be accurate enough to be useful.

It is a type that hides the infinite complexity of real-world actors and activities that instantiate (or realise) the type.


Abstract system description (theoretical system): a set of roles and rules (the logic or laws actors follow).

Concrete system realisation (empirical system): actors playing the roles and acting according to the rules.


This table contains some examples.


Abstract system description:    “Solar system”      Laws of tennis    The score of a symphony           The US constitution
Concrete system realisation:    Planets in orbits   A tennis match    A performance of that symphony    US governments


Note that systems thinking hides the full complexity of real-world entities that realise systems.

In discussion and testing of the stickleback mating ritual, no attention is paid to any individual stickleback, or its complex internal biochemistry.


Abstract system description (a set of roles and rules, the logic or laws actors follow): the stickleback mating ritual.

Concrete system realisation (actors playing the roles and acting according to the rules): countless pairs of sticklebacks.


Note that the relationship between physical entities and abstract systems is many-to-many.

·       One abstract system (e.g. the game of poker) may be realised by countless physical entities (countless card schools).

·       One physical entity (e.g. a card school) may realise countless abstract systems (e.g. poker, whist etc).
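A loose programming analogy (mine, not the author's) may help: an abstract system is like a type or interface, a concrete system like an entity that satisfies it; one description has many realisations, and one entity may realise many descriptions.

```python
# Illustrative analogy only: abstract systems as types, concrete systems as
# entities that realise them. Class and attribute names are hypothetical.

class Poker:                       # abstract system: roles and rules of poker
    def realised_by(self, group):
        return "poker" in group.games

class Whist:                       # a different abstract system
    def realised_by(self, group):
        return "whist" in group.games

class CardSchool:                  # a concrete entity
    def __init__(self, games):
        self.games = games

school = CardSchool(games={"poker", "whist"})            # one entity...
print(Poker().realised_by(school), Whist().realised_by(school))  # -> True True
```

The same `CardSchool` entity realises two abstract systems, and each abstract system could be realised by countless other card schools.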


 “Since different systems may be abstracted from the same real thing, a statement that is true of one may be false of another.

… there can be no such thing as the unique behavior of a very large system, apart from a given observer.

… there can be as many systems as observers… some so different as to be incompatible.

… studying only carefully selected aspects of things is simply what is always done in practice.” (Ashby 1956, 6/14)


Unfortunately, most of us use the term system for both abstract systems (types), and concrete/real world entities or social networks that instantiate them.

And so, in systems thinking discussions, we often confuse them.

But remember, an abstract system may be realised by several real-world entities - each of which might also do other things.

And one entity might realise other abstract systems – each defined by taking a different perspective of the entity.

Reconciling different abstract or soft systems is a theme of Checkland’s "soft systems methodology”.


On system rightness and wrongness

We model system features that are "relevant to some main interest already given".

Suppose the main interest is the perspective taken by an umpire of a tennis match.

What facts are relevant to the umpire?

·       The state – the current values of the variables – as displayed on the score board.

·       The inputs - the movements of the players and the ball (describable as discrete events) that change the variable values.

·       The rules - the laws of tennis by which the inputs advance the state variables.


The laws of tennis are an abstract description of how concrete tennis matches proceed, or should proceed.

Any particular game that conforms well enough to the laws of tennis may be called a tennis match.


Tennis match as a system

The laws of tennis

<maintain>                 <represent>

The LTA <observe and envisage> Tennis matches


You can think of a tennis match as what Ashby called “the real machine” in motion.
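The umpire's view above (state variables, discrete input events, rules) can be sketched as a toy state machine. A greatly simplified, illustrative sketch: it scores points within a single game only; sets, tie-breaks and faults are ignored, and all names are mine.

```python
# Illustrative sketch of the umpire's perspective: discrete point events
# advance state variables under the rules. Grossly simplified.

def score_point(state, winner):
    """Advance the game state when `winner` ('A' or 'B') wins a point."""
    state[winner] += 1
    a, b = state["A"], state["B"]
    if max(a, b) >= 4 and abs(a - b) >= 2:   # rule: 4+ points and a 2-point lead
        state["game"] = "A" if a > b else "B"

state = {"A": 0, "B": 0, "game": None}       # the umpire's state variables
for winner in ["A", "A", "B", "A", "A"]:     # inputs: discrete point events
    score_point(state, winner)

print(state["game"])                          # -> A
```

Only the facts relevant to the umpire's "main interest" appear in the state; the players' footwork and physiology are hidden, just as Ashby prescribes.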


The relationship of abstract system to concrete system may be strong or weak.

The terms right and wrong can be interpreted in various ways.


A model may be called wrong because it misrepresents some aspect of a reality.

E.g. a causal loop diagram might be wrong because it suggests an increase in GDP leads to an increase in population growth.


A model may be called right because it accurately represents some aspects of a reality.

·       A data model representing a database structure.

·       A flowchart representing the logic of a program.

·       This flowchart representing cell respiration.

·       This flowchart representing the Krebs cycle.


A model may be called right but not useful because it omits some feature that matters to its purpose.

E.g. imagine if this NOAA model of wave height along the USA west coast showed where, but not when.


A model may be called right, even though it is instantiated with a degree of inaccuracy.

E.g. The process defined in a musical score is performed somewhat differently each time it is played.


A model may be called right, even though its concrete realisations differ.

E.g. The musical score above.

Also, a computer program is the same each time it runs at the level of abstraction we are interested in.

But it is different at the level of memory spaces and electrical phenomena.


Finally, a model may be seen as right and useful but wrongly instantiated to a greater or lesser degree.

Consider the rules of the card game poker, which define the roles of the players.

Actors, playing their roles in a particular game of poker, may deliberately disobey the rules.

Still, any instance of a card game that conforms well enough to those rules may be called a game of poker.


To confirm the accuracy of a model, we need some social, logical or empirical verification.

Or else, faith in the mechanisms via which the model is realised.

System dynamics

Some define a system simply as a structure in which things are related to each other.

But (outside of quantum physics) that means every describable thing is a system, and the term system adds no value.

Also, things may be related only by their relationship to something else.

Are the employees of IBM to be called a system purely because they are all paid by one employer?


“The primacy of behavior”

The first question in systems thinking was expressed by Seidl thus:


“The first decision… is what to treat as the basic elements of the social system. The sociological tradition suggests two alternatives: either persons or actions.” (Seidl 2001)


Do you see a system as a set of actors performing actions? Or a set of actions performed by actors?

Ashby and other system theorists focused attention on a system’s actions before its actors.

A system is defined by some orderly behavior(s) that we model with some particular interest in mind.


“Cybernetics deals with all forms of behavior in so far as they are regular, or determinate, or reproducible.”

“[It] treats, not things but ways of behaving. It does not ask "what is this thing?" but ''what does it do?"

It is thus essentially functional and behavioristic.” (Ashby 1956, 1/2)


Donella Meadows, a champion of Forrester’s System Dynamics, also characterised a system by its behaviors.


“A set of elements or parts that is coherently organized and interconnected in a pattern or structure

that produces a characteristic set of behaviors, often classified as its function or purpose."


In short, a system is a set of actions or behaviors performed by actors or structures.

The behaviors change the values of state variables.



                  Solar system         Ashby’s cybernetics       Meadows’ System Dynamics
Structures        -                    Roles played by actors    Stocks or populations
Behaviors         -                    Rules for activities      Flows between stocks
State             Positions in space   State variables           Stock volumes


The primacy of behavior is a principle: it means that structures need be considered only in so far as they perform required behaviors.


System dynamics

Roles, rules and variable types

<define>                          <represent>

Systems thinkers   <observe and envisage>   Regular behaviors


Beware however that the term “behavior” is used with two meanings in systems thinking discussions.


The meaning of “behavior”

In cybernetics, the behaviors are processes that change state variable values.

In Forrester’s System Dynamics, the behaviors are inter-stock flows that change stock populations.


Some use the term behavior differently - to mean the trajectory that a state variable’s values take over time.

That trajectory can be regular (linear or cyclical) or irregular/chaotic.

Even if the state change trajectory is irregular, it is still an inexorable side effect of regular behaviors.

On change: all system dynamics can be modelled as discrete

In a real machine, the inputs, outputs and state changes may be all continuous, all discrete, or a mixture.

For examples of continuous change, think of the orbits of the planets, or the hands of a clock.

However, to describe such a machine in a mathematical model, Ashby divided continuous inputs into discrete events.


“Often a change occurs continuously, that is, by infinitesimal steps,

as when the earth moves through space, or a sunbather's skin darkens under exposure.

The consideration of steps that are infinitesimal, however, raises a number of purely mathematical difficulties,

so we shall avoid their consideration entirely.

Instead, we shall assume in all cases that the changes occur by finite steps in time and that any difference is also finite.

We shall assume that the change occurs by a measurable jump, as the money in a bank account changes by at least a penny…

Consideration of the case in which all differences are finite loses nothing.

It gives a clear and simple foundation; and it can always be converted to the continuous form if that is desired.” (1956, 2/1)


In short, to model a physical system, Ashby converted any continuous dynamics to discrete dynamics.

Elsewhere, I have demonstrated that a System Dynamics model (a la Forrester) can be abstracted from a discrete event-driven model (a la Ashby).
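Ashby's finite-step assumption is routine in simulation. A hedged sketch (growth rate and numbers are illustrative assumptions): continuous change dx/dt = r·x modelled by finite steps, which converges to the continuous form as the steps shrink, just as Ashby claims.

```python
# Illustrative sketch: a continuous change modelled by finite steps (Euler's
# method), compared with the continuous (closed-form) result.

import math

r, x0, T = 0.1, 100.0, 10.0          # hypothetical growth rate, start value, duration

def discrete_trajectory(steps):
    """Advance the state variable by finite, measurable jumps."""
    x, dt = x0, T / steps
    for _ in range(steps):
        x = x + r * x * dt           # one finite step
    return x

continuous = x0 * math.exp(r * T)    # the continuous-form result
print(round(discrete_trajectory(10), 1),
      round(discrete_trajectory(10000), 1),
      round(continuous, 1))          # -> 259.4 271.8 271.8
```

With coarse steps the discrete model diverges from the continuous one; with fine enough steps "consideration of the case in which all differences are finite loses nothing".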

Cybernetics depends on the communication of information

“Cybernetics was defined by Wiener as "the science of control and communication, in the animal and the machine" (1956, 1/1)

Wiener discussed how a controller directs a target system to maintain some state variable(s) in a desired range,

and how information feedback loops connect control and target systems.


A system is a transient island of order carved out of the universe.

The universe is an ever-unfolding process that will end in chaos.

To hold off that chaos, the system must draw energy from its environment.

However, cybernetics is focused on flows of information rather than energy.


“Cybernetics started by being closely associated in many ways with physics,

but it depends in no essential way on the laws of physics or on the properties of matter.” (1956, 1/2)

“In this discussion, questions of energy play almost no part; the energy is simply taken for granted.” (1956. 1/5)


Cybernetics is about how behavior is affected by information state and flow (rather than physical state or flow).

Communication depends on encoding and decoding

Cybernetics is the science of how systems communicate, and are controlled by, information flows.

First, for any act of communication to occur, there must be some of what Ashby called variety.

To have some variety a structure must have more than one possible state.


Information can be created or found in any structure that has more than one possible state.

E.g. If your office door is always open, you cannot use it to convey a message.

If it can be left open or closed, you can use that structural variety to convey the message that you are open to visitors or not.


For an act of communication to succeed, two roles must be played.

·       One actor must encode some information or meaning in a data structure or message.

·       Another must decode the same information or meaning from that data structure or message – using the same code.


Ashby wrote of “the ubiquity of coding” in the communication of information.


“Let us consider, in some detail, the comparatively simple sequence of events that occurs when a “Gale warning” is broadcast.

It starts as some patterned process in the nerve cells of the meteorologist, and then becomes

·       a pattern of muscle-movements as she writes or types it, thereby making it

·       a pattern of ink marks on paper. From here it becomes

·       a pattern of light and dark on the announcer’s retina, then

·       a pattern of retinal excitation, then

·       a pattern of nerve impulses in the optic nerve, and so on through her nervous system. It emerges, while she is reading the warning, as

·       a pattern of lip and tongue movements, and then travels as

·       a pattern of waves in the air. Reaching the microphone it becomes

·       a pattern of variations of electrical potential, and then goes through further changes as it is amplified, modulated, and broadcast. Now it is

·       a pattern of waves in the ether, and next

·       a pattern in the receiving set. Back again to

·       a pattern of waves in the air, it then becomes

·       a pattern of vibrations traversing the listener’s ear-drums, ossicles, cochlea, and then becomes

·       a pattern of nerve-impulses moving up the auditory nerve.


… this very brief account mentions no less than sixteen major transformations

through all of which something has been preserved,

though the superficial appearances have changed almost out of recognition.” (1956, 8/2)


The thing preserved in every physical form is the information/meaning that the meteorologist intended to broadcast.

All that remains is for the receivers to decode the message - using the same code the sender used to encode it.

Systems are determinate (rule-bound)

“Our starting point is the idea, much more than a century old, that a machine, in given conditions and at a given internal state, always goes to a particular state.” (“Design for a Brain” 19/2).

“A variable is a measurable quantity which at every instant has a definite numerical value.” (“Design for a Brain” 2/3).

“The state of the system is the set of values that the variables have.”

“A system is any set of variables which he [observer] selects from those available on the real machine.”


Ashby illuminated systems thinking by viewing the world in terms of machines describable as systems that maintain state variables.

In chapter 3, he discussed how a determinate machine may be defined.

“Every real determinate machine or dynamic system corresponds to a closed, single-valued transformation.” (1956, 3)


Ashby’s ideas here include:

·       A transformation is a state change.

·       Single-valued means the system cannot be in two states at once - the set of state variables will have only one set of values at a time.

·       Determinate means each state-to-state transformation is determined by rules.

·       Closed is a constraint we need not discuss here.


To show the relevance of his ideas to the natural world, Ashby offered the following example (after Tinbergen).

The rows of the table below show the successive states of the stickleback mating system.

The columns show stickleback roles (the columns could be swim lanes in a process flow diagram).

The pair of sticklebacks communicate by sending information in the form of visual signals to each other.


Stickleback mating system

The female’s role is to:                              The male’s role is to:

present a swollen abdomen and special movements       present a red colour and a zigzag dance
swim towards the male                                 turn around and swim rapidly to the nest
follow the male to the nest                           point its head into the nest entrance
enter the nest                                        quiver in reaction to the female being in the nest
spawn fresh eggs in the nest                          fertilise the eggs


The table above is an abstract system, a description of roles that typify actors.

Reading left to right, top to bottom, each visual signal (information flow) triggers the partner to act in response.

A concrete system would be any pair of sticklebacks that realises the two abstract roles.

Notice the system includes both active structures (sticklebacks) and passive structures (nest and eggs).
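The ritual can be sketched as a determinate sequence: each signal (information flow) from one role triggers the partner's next action. The action strings come from the table above; the control flow is my paraphrase, not Tinbergen's or Ashby's.

```python
# Illustrative sketch: the stickleback ritual as a determinate sequence of
# role/action steps, each triggered by the visual signal of the previous one.

RITUAL = [
    ("female", "present a swollen abdomen and special movements"),
    ("male",   "present a red colour and a zigzag dance"),
    ("female", "swim towards the male"),
    ("male",   "turn around and swim rapidly to the nest"),
    ("female", "follow the male to the nest"),
    ("male",   "point its head into the nest entrance"),
    ("female", "enter the nest"),
    ("male",   "quiver in reaction to the female being in the nest"),
    ("female", "spawn fresh eggs in the nest"),
    ("male",   "fertilise the eggs"),
]

def run_ritual():
    """Play the abstract roles; any pair of sticklebacks realising these
    steps is a concrete system."""
    return [f"{role}: {action}" for role, action in RITUAL]

print(len(run_ritual()))   # -> 10
```

Any concrete pair that performs these steps well enough realises the abstract system; no individual stickleback's biochemistry appears in the model.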

Machines can process inputs

Ashby extended his initial notion of a determinate system to include inputs as well as state variables.


The machine with input or finite state machine

Ashby equated the organisation of a system to the transformation rules of a machine with finite states.


“What matters is the regularity of the behavior….

A machine behaves [such] that its internal state, and the state of its surroundings, defines, uniquely, the next state it will go to.

When the variables are continuous it corresponds to the definition of a dynamic system by giving a set of ordinary differential equations with time as the independent variable.

The fundamental nature of such a representation has been recognised by many.

So arises the modern definition…

The machine with input (Ashby 1958) or the finite state automaton (Jeffrey 1959) is today defined by

·       a set of S internal states,

·       a set of I input or surrounding states and

·       a mapping (say f) of the product set I*S into S.


Here, we have the very essence of the “machine”; all known types of machine are to be found here.

Change f and the organisation changes.” (1962)


The number of possible states could be practically if not theoretically infinite.

What matters is the regularity of the behavior - the determinate nature of state-to-state transformations

Change a transformation, a function, and you have a different machine/system.
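Ashby's "machine with input" translates directly into code: a set S of internal states, a set I of input states, and a mapping f of I*S into S. The turnstile states and inputs below are an illustrative assumption, not Ashby's example.

```python
# Illustrative sketch of Ashby's machine with input / finite state automaton.
# States, inputs and the mapping f are hypothetical (a coin-operated turnstile).

S = {"locked", "unlocked"}           # internal states
I = {"coin", "push"}                 # input (surrounding) states

f = {                                # the mapping f: I * S -> S
    ("coin", "locked"):   "unlocked",
    ("push", "locked"):   "locked",
    ("coin", "unlocked"): "unlocked",
    ("push", "unlocked"): "locked",
}

def run(state, inputs):
    """The next state is determined, uniquely, by input and present state."""
    for i in inputs:
        state = f[(i, state)]
    return state

print(run("locked", ["coin", "push", "push"]))   # -> locked
```

Change any entry of `f` and, in Ashby's terms, the organisation changes: you have a different machine.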


The transducer or determinate system

Ashby wrote that his machine with input is identical with the “transducer” of Shannon.


“A system whose next state is determined by its present state and the present values of its parameters.

Since we can predict its next state—we say it is “determinate”.” (1956)


Change the determinate rules and you change the machine/system.


The Markovian Machine

In chapter 12, Ashby slightly relaxed his definition of a machine as a system whose behavior is law-abiding or repetitive.

He said it is sufficiently law-abiding or repetitive for us to be able to make some prediction about what the system will do.

He allowed there can be a less constrained, but still rule-bound, system.

We cannot predict its next state, but we can predict that, in repeated conditions, the frequencies of the various states will be constant.


“We can therefore consider a new class of absolute system.

It is one whose states change with time not by a single-valued transformation but by a matrix of transition probabilities.

For it to remain the same absolute system the values of the probabilities must be unchanging.” (1956, 12/8)


In short, Ashby’s machines include both determinate machines and Markovian machines.

All his systems are predictable to some extent – using determinate and/or probability rules.

Remember, if you add a new variable or function/rule/probability, you make a new machine/system/organisation.
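A Markovian machine can be sketched as a fixed matrix of transition probabilities. The two-state weather example and its probabilities are illustrative assumptions, not Ashby's.

```python
# Illustrative sketch of a Markovian machine: state changes by a matrix of
# transition probabilities, which must themselves be unchanging.

import random

P = {                                 # transition probabilities (rows sum to 1)
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Next state is drawn from the fixed probability row for the present state."""
    states, probs = zip(*P[state].items())
    return rng.choices(states, weights=probs)[0]

rng = random.Random(0)                # fixed seed, so the run is repeatable
state = "sunny"
counts = {"sunny": 0, "rainy": 0}
for _ in range(10000):
    state = step(state, rng)
    counts[state] += 1

print(counts)                         # long-run frequencies approach a constant ratio
```

We cannot predict the next state, but in repeated conditions the frequencies of the states settle toward constant proportions, exactly the weaker regularity Ashby allows.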

Systems can be coupled in wider systems

 “coupling is of profound importance in science” (1956, 4/6)


In a coupling, one system’s output is input to another system.


“Two or more whole machines can be coupled to form one machine;

and any one machine can be regarded as formed by the coupling of its parts,

which can themselves be thought of as small, sub-machines.” (1956, 4/6)


Via coupling, one (control) system can direct the behavior of another (target) system, and change its state change trajectory.

You may see the relationship from control system to target system as being from a “higher” system to a “lower” system.

The terms “higher” and “lower” are subjective, reflecting how we think of the brain and the body, or a hierarchical human organisation.


Coupling two or more systems makes a wider system (or ecology).

Given a mesh of coupled systems, many possible system boundaries may be drawn.


Properties of a wider system are said to “emerge” from the coupling of the parts.


“That a whole machine should be built of parts of given behavior is not sufficient to determine its behavior as a whole.

Only when the details of coupling are added does the whole's behavior become determinate.” (1956, 4/10)


An experimenter can couple to a system in two ways:

·       as a provider of inputs (which change the values of the system’s state variables)

·       as an observer of its outputs.


“when the experimenter runs an experiment he is coupling himself temporarily to the system that he is studying.” (1956, 4/6)


Coupling systems does not change those systems.


“… coupling does no violence to each machine's inner working,

so that after the coupling each machine is still the same machine that it was before.

For this to be so, the coupling must be arranged so that, in principle, each machine affects the other only by affecting its conditions, i.e. by affecting its input.” (1956, 4/6)
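Coupling is easy to sketch: one machine's output becomes another's input, and neither machine's inner working is touched. The sensor/heater functions and numbers below are illustrative assumptions.

```python
# Illustrative sketch of coupling: machine 1's output is machine 2's input;
# each machine affects the other only via its input. Names are hypothetical.

def sensor(temperature):
    """Machine 1: its output describes the target's state."""
    return "too cold" if temperature < 20 else "ok"

def heater(reading):
    """Machine 2: its input is machine 1's output; its output is heat."""
    return 1.0 if reading == "too cold" else 0.0

def coupled_step(temperature):
    """The whole: its behavior is determinate only once the coupling is fixed."""
    return temperature + heater(sensor(temperature))

t = 17.0
for _ in range(5):
    t = coupled_step(t)
print(t)                     # -> 20.0
```

Neither `sensor` nor `heater` was changed to build the whole; the emergent regulation comes from the details of the coupling alone.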


But what about reengineering - changing the machine’s inner workings - changing its state variable types or rules?

This kind of change is of particular interest in social and business systems thinking.

Systems can change in different ways

Ashby scoped a system as a set of variables (think, a set of stocks in System Dynamics).

This implies two very different ways to change the system.


·       Change the variables’ values (think, stock volumes) and you change the system’s state.

·       Change the variables (think, stock types) and you change the system itself.


Ashby’s system includes not only variables but also the rules applied to those variables.

Rules are applied when an input/event is detected, or a particular state is reached (or a flow passes between stocks).

So, changing the rules is another way to change the system itself – to change the transformations it can make.


E.g. Consider system state changes and mutations in playing a game of cards.

There are state changes: In playing their cards, according to defined rules, players win “tricks” which advance the state of the game.

There can be mutations: The players can stop the game, agree a change to the rules, then play on; this is a creative act that changes the game itself.
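The card-game example can be sketched to separate the two kinds of change: changing variable values changes the state; changing the rules changes the system itself. The rule details below are illustrative assumptions.

```python
# Illustrative sketch: state change vs system change (mutation).
# The game structure and rule names are hypothetical.

game = {
    "state": {"tricks_A": 0, "tricks_B": 0},   # state variables (values change)
    "rules": {"tricks_to_win": 7},             # the rules (changing them mutates the system)
}

def play_trick(game, winner):
    """State change: playing a trick advances the state of the game."""
    game["state"][f"tricks_{winner}"] += 1

def change_rules(game, tricks_to_win):
    """Mutation: agreeing new rules makes it a different game."""
    game["rules"]["tricks_to_win"] = tricks_to_win

play_trick(game, "A")         # the state advances under the current rules
change_rules(game, 5)         # the system itself is changed
print(game["state"]["tricks_A"], game["rules"]["tricks_to_win"])   # -> 1 5
```

In Ashby's terms, the first function is the machine's own behavior; the second replaces the transformation, and so replaces the machine.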


Ashby urged us to distinguish state change and transformation change in this observation.


“The word "change" if applied to [an entity repeating a behavior] can refer to two very different things.

·       change from state to state, which is the machine's behavior, and which occurs under its own internal drive, and