Interpreting cybernetics for business
Copyright 2019 Graham Berrisford. A chapter in the book https://bit.ly/2yXGImr. Updated 17/06/2021 18:41
Reading online? If your screen is wide, shrink its width for readability.
This chapter informs readers who want to apply Ashby’s cybernetic principles to business systems, and to understand the management cybernetics of Clemson and Beer. It analyses and comments on some terms and concepts you can find in Ashby’s works below.
· “Design for a Brain” (1954, first published 1952)
· “Introduction to Cybernetics” (1957)
· “Principles of the self-organizing system” (1962).
Ashby’s cybernetics was developed to deal with particular kinds of problem. His clearest examples were electrical and mechanical devices that consume some continuous input (an electric current, a push/pull force) and run under their own steam after being configured by some input parameters. However, in his Introduction to Cybernetics, Ashby declared the transferability of his cybernetic principles from
1. a “determinate machine” (chapter 3) that performs a “single-valued closed transformation”, to
2. a “machine with input” (chapter 4) that can be configured by an input parameter to behave (autonomously) in different ways, to
3. a “black box” machine (chapter 6) with a continuous input stream that can be coupled by input and output to other machines in a wider system, to
4. a “free-living organism” that has evolved so as to respond appropriately to a variety of discrete input events, to
5. a human social entity, in which (comparatively) anything is possible.
In the steps from 3 to 4 to 5, there is some overlap of ideas, but also some discontinuity. At step 4, we find systems that are driven more by input events than by internal conditions. At step 5, we find a social entity that is distinguishable from the various activity systems its actors may play roles in.
It is unclear to me how far Ashby really thought his cybernetic principles applicable to event-driven psychological and social behavior, or to business activity systems. However, Clemson defined 22 principles for management cybernetics (to be discussed). And, somewhat more modestly, this chapter abstracts 10 principles that seem most relevant to business. Some may be applied directly by business managers or enterprise architects; others need some reinterpretation (or “de-ontologizing”, as Luhmann said when applying the biological idea of autopoiesis to sociology).
"A system... is independent of the concrete substance of the elements (e.g. particles, cells, transistors, people, etc).” Principia Cybernetica Web
Ashby’s student Krippendorff wrote: “What we know of a system always is an ‘observer’s digest’…”
Idea 1: Activity systems are realized by physical entities
“It is important to stress Ashby defined a system not as something that exists in nature. A system consisted of a set of variables chosen for attention and relationships between these variables, established by observation, experimentation, or design.” Ashby’s student Krippendorff, writing in 2009 on Ashby’s Information Theory
A business activity system is a highly selective abstraction from what happens in a physical entity.
Abstracting a business activity system from a physical entity

Abstract system (roles, rules and variables) | The “billing” process
Physical system (activities that advance variable values) | Billing process instances
Physical entity (actors that play roles in the activities) | Customers, suppliers and their banks
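To make the three levels concrete, here is a minimal Python sketch (the names and data structures are mine, for illustration only): an abstract system is a description in terms of roles, rules and variables; a physical system is an instance of that description in operation; a physical entity supplies the actors that play the roles.

```python
# An abstract system: roles, rules and variables, chosen for attention.
billing_system = {
    "roles": ["biller", "payer"],
    "variables": ["amount_due"],
    # Rule: a "pay" activity reduces amount_due by the amount paid.
}

# A physical entity: actors able to play roles in the activities.
actors = {"biller": "Supplier Ltd", "payer": "Customer A"}

# A physical system: one instance of the billing process in operation.
state = {"amount_due": 100}

def pay(state, amount):
    """An activity that advances the variable 'amount_due'."""
    state["amount_due"] -= amount
    return state

print(actors["payer"], "pays 40 ->", pay(state, 40))  # amount_due: 60
```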
SEE “ABSTRACTING SYSTEMS” CHAPTER
Idea 2: A system can be coupled to its environment by information feedback loops
3/8. Given an organism, its environment is defined as those variables whose changes affect the organism, and those variables which are changed by the organism's behavior.
3/11. The organism affects the environment, and the environment affects the organism: such a system is said to have feedback.
4/6. “When the experimenter runs an experiment, he is coupling himself temporarily to the system that he is studying.”
Feedback loops appear in machines, organisms and businesses. Within a business, smaller business systems are coupled in wider business systems to produce effects or results the smaller systems cannot produce on their own. As a whole, the business affects some things in its environment, and things in that environment affect the business. Together, the business and its environment are coupled in a wider system.
Feedback loop (a system in its business environment)

Regulator | ← state information → | Target
Consumes inputs, produces outputs, maintains system state | ← inputs, outputs → | External entities and events
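The loop can be sketched in Python (a thermostat-style toy of my own devising, not an example from Ashby). State information flows from the target to the regulator; compensating actions flow back.

```python
# A regulator coupled to a target by a feedback loop.
# The regulator senses the target's state and acts to keep it near a goal.

target_temperature = 15.0    # a state variable of the target (e.g. a room)
GOAL, TOLERANCE = 20.0, 0.5

def sense():
    """State information flows from the target to the regulator."""
    return target_temperature

def act(adjustment):
    """A regulating action flows from the regulator back to the target."""
    global target_temperature
    target_temperature += adjustment

for step in range(8):
    deviation = GOAL - sense()
    if abs(deviation) > TOLERANCE:   # perturbation detected
        act(deviation * 0.5)         # compensating action
    print(f"step {step}: temperature = {target_temperature:.2f}")
```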
SEE “SYSTEM DYNAMICS” CHAPTER for discussion of feedback loops
Idea 3: Most systems of interest can be modelled using discrete dynamics
2/1. “The most fundamental concept in cybernetics is that of ‘difference’, either that two things are recognisably different, or that one thing has changed with time… We assume change occurs by a measurable jump.”
In business, as in Ashby’s cybernetics, state changes are usually divided into discrete steps. To describe what happens in the continuous flow of time, we divide change into discrete events and step changes in a system’s state.
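For illustration (an example of mine, not the book’s), a bank account modelled with discrete dynamics changes state only at discrete events, by measurable jumps:

```python
# Discrete dynamics: state changes by measurable jumps at discrete events.
balance = 100                       # a state variable

events = [("deposit", 50), ("withdraw", 30), ("deposit", 20)]

for kind, amount in events:
    if kind == "deposit":
        balance += amount           # a step change in state
    elif kind == "withdraw":
        balance -= amount
    print(f"after {kind} of {amount}: balance = {balance}")
```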
SEE “DYNAMICS” CHAPTER for discussion of continuous and discrete dynamics.
Idea 4: System state changes are constrained by rules
“2/3. … if the concept of “change” is to be useful it must be enlarged to the case in which the operator can act on more than one operand, inducing a characteristic transition in each… Such a set of transitions, on a set of operands, is a transformation.”
Ashby’s transition is a change of one state variable from one value to another. The whole state of a system advances in discrete steps, whether in response to input events and/or some kind of internal drive. Ashby’s transformation is the set of rules for how all the state variables change in the course of one such step.
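A single-valued closed transformation can be sketched in Python as a mapping in which each state has exactly one successor (a toy of mine, not Ashby’s notation):

```python
# Ashby's single-valued closed transformation: each operand (state) has
# exactly one transition, and every result is itself an operand.
transformation = {"a": "b", "b": "c", "c": "a"}

state = "a"
for step in range(5):
    state = transformation[state]   # one transition per discrete step
    print(f"step {step + 1}: state = {state}")
# The system cycles a -> b -> c -> a ... under its own internal drive.
```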
In business, the state of the business and whatever it monitors and directs is recorded in the state variables of information systems. The current state of the business is represented in the values those variables have right now. The primary concern is regular business operations in which the range of possible activities is defined. If no action is detected when expected, then a time-out event may trigger some compensating action.
This graphic distils some features of business activity systems.
Environment: Suppliers → Inputs
Open system:
· Actors <perform> Activities
· Inputs + States <determine> Activities
· Activities <consume> Inputs
· Activities <change> States
· Activities <produce> Outputs
Environment: Outputs → Consumers
Does following rules mean an activity system behaves in a “linear” way? No, a rule-bound system may have surprisingly complex or chaotic lines of behavior over time. In enterprise architecture, however, the lines of behavior are not often of interest.
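To illustrate the middle claim (my choice of example), the logistic map is a system bound by one simple rule whose line of behavior over time is nonetheless chaotic:

```python
# A rule-bound system with a chaotic line of behavior: the logistic map.
# One fixed rule, yet successive states never settle into a simple pattern.
x = 0.2                        # initial state
r = 3.9                        # parameter value in the chaotic regime

for step in range(10):
    x = r * x * (1 - x)        # the one and only rule
    print(f"step {step + 1}: x = {x:.4f}")
```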
Idea 5: System state change can appear probabilistic
12/9. Whether a given ‘real machine’ appears Markovian or determinate will sometimes depend on how much of the machine is observable; and sometimes a ‘real machine’ may be such that an apparently small change of the range of observation may be sufficient to change the appearances from that of one class [Markovian] to the other [determinate].
In other words, Ashby’s interest in systems was wider than purely determinate ones. When observing a business activity system in operation, we can classify actors’ responses to stimuli into three kinds.
Causality | We can predict
Deterministic | exactly which action an actor will perform in response to an event.
Probabilistic | how likely an actor is to choose activity type A over activity type B.
Possibilistic | that the actor will choose from the known range of activity types.
Even possibilistic causality is “regular” in the sense that an actor is constrained to choose between actions in a defined range. If an agent invents a new action, that is to act outside any defined system. If actors or agents are free to choose between possible activities, then by monitoring the choices they make, you might be able to attach probabilities to the possibilities.
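A sketch of that last point (my own, with made-up numbers): by monitoring possibilistic choices, an observer can attach probabilities to the possibilities.

```python
import random
from collections import Counter

# Possibilistic: the actor chooses from a known range of activity types.
possible_activities = ["A", "B"]

# Monitor many choices to attach probabilities to the possibilities.
observed = Counter(random.choice(possible_activities) for _ in range(1000))

for activity, count in observed.items():
    print(f"P({activity}) is roughly {count / 1000:.2f}")
```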
SEE “XXX” CHAPTER for discussion of causality
Idea 6: A regulator needs a model of its environment
The Conant-Ashby theorem, or “good regulator” theorem, was conceived by Roger C. Conant and W. Ross Ashby and is central to cybernetics.
Abstract "The design of a complex regulator often includes the making of a model of the system to be regulated. The making of such a model has hitherto been regarded as optional, as merely one of many possible ways. In this paper a theorem is presented which shows, under very broad conditions, that any regulator that is maximally both successful and simple must be isomorphic with the system being regulated. (The exact assumptions are given.) Making a model is thus necessary.
The theorem has the interesting corollary that the living brain, so far as it is to be successful and efficient as a regulator for survival, must proceed, in learning, by the formation of a model (or models) of its environment."
https://www.tandfonline.com/doi/abs/10.1080/00207727008920220
Evidently, to function and respond to changes, an animal must “know” what is going on in its world. It needs a model of entities and events in its environment if it is to find food and mates, and avoid enemies. A brain holds a model of things in its environment, which an animal uses to manipulate those things. A missile guidance system senses spatial information, and sends messages to direct the missile. In short, every good regulator of a system must be (or have) a model of that system. The richer the model, the more adaptive the animal, machine or business can be to changes in its environment.
So, a regulator can be an animal, machine or business that has a model, or has access to a model, of what it needs to monitor and control. Read this triangle from left to right: regulators <have and use> models, which <represent> targets.
The good regulator

Regulators <have and use> Models
Models <represent> Targets
Regulators <monitor and regulate> Targets
In business, a database holds a model of business entities and events, which people use to monitor and direct those entities and events.
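A minimal sketch of the idea (the stock-control scenario and names are hypothetical, mine rather than the paper’s): the regulator’s model maps each observable target state to a suitable response, and states outside the model leave the regulator blind.

```python
# A regulator holds (or has access to) a model of its target, and uses
# the model to choose a regulating action for each observed state.
model = {
    "stock low":  "reorder stock",
    "stock ok":   "do nothing",
    "stock high": "run promotion",
}

def regulate(observed_state):
    """Choose an action by looking the observed state up in the model."""
    return model.get(observed_state, "investigate: state not in model")

print(regulate("stock low"))     # -> reorder stock
print(regulate("stock frozen"))  # outside the model: regulation fails
```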
Idea 7: Regulators must recognize the variety they control
“A system’s variety V measures the number of possible states it can exhibit.” Ashby 1956
To succeed in turning a light on or off, you must know its current state. More generally, regulators must recognize the variety in the states they seek to control. The wider the variety of states to be controlled, the wider the variety of states the regulator must recognize, and the wider the variety of actions it must be able to perform. This is not a physical law like Newton’s laws of motion, which may be supported or disproven by experiment. It is an information law, an expression of what is mathematically inevitable.
Ashby’s law of requisite variety is commonly expressed as: “only variety can absorb variety”. He expressed it rather more clearly as follows.
“The larger the variety of actions available to a control system, the larger the variety of perturbations it is able to compensate”. Ashby 1956
Perturbations are changes to a target system’s variables that move their values outside of a normal or desirable range. The more ways a target can deviate from its desired state, the more control actions its controller(s) will need.
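A toy illustration of the law (mine, not Ashby’s): a regulator holds outcomes to the desired state only if it has a distinct compensating action for each kind of perturbation.

```python
# Requisite variety: the regulator needs a compensating action for each
# kind of perturbation, or some disturbances go uncorrected.
perturbations = ["too hot", "too cold", "too humid"]

actions = {
    "too hot":  "cool",
    "too cold": "heat",
    # No action for "too humid": regulator variety < disturbance variety.
}

for p in perturbations:
    outcome = "compensated" if p in actions else "UNREGULATED"
    print(f"{p}: {outcome}")
```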
This law has some application to business architecture. Business activity systems must remember enough about entities and events of interest (both inside and outside the system of interest) to direct them as need be. However, a regulator models only those variables of the target system it seeks to control. A government institution or a business (like IBM) is a social entity definable in terms of countless variables. There is no prospect of typifying the whole of IBM, all describable variables, in one abstract system.
There are limits to how far a target can be controlled by one regulator. Different regulators (with different interests) may see it as different systems with different varieties. And a regulator can’t fully control variables also controlled by other forces. A regulator designed to monitor and direct some selected variables of IBM may find they are buffeted by other factors, including other regulators and market forces.
SEE “ASHBY’S LAW” CHAPTER for discussion of the two ideas above.
Idea 8: Mutation differs from state change
Ashby’s “Design for a brain”
5/7 “It must be noted that adaptation is commonly used in two senses, which refer to different processes. The distinction may best be illustrated by the inborn homeostatic mechanisms – the reaction to cold by shivering for instance… [Historically] the first change involved the development of the mechanism itself [by a mutation]; the second change occurs when the mechanism is stimulated into showing its properties [changing the state of an animal].”
Ashby’s “Introduction to Cybernetics"
4/1 “It will be seen that the word “change” if applied to such a machine can refer to two very different things. There is the change from state to state, which is the machine’s behaviour, and which occurs under its own internal drive, and there is the change from transformation to transformation, which is a change of its way of behaving, and which occurs at the whim of the experimenter or some other outside factor. The distinction is fundamental and must on no account be slighted.”
We expect a business to change in both ways. It will well-nigh continually update its information systems as entities of interest change. Now and then, it will change or even replace its business activity systems (their roles, rules and variables). The migration from a baseline system to a target system may be seen as rolling out the next generation of the same system, or as a different system altogether.
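The two kinds of change can be sketched (my illustration) using Ashby’s “machine with input”: change from state to state under one fixed transformation, versus change from one transformation to another.

```python
# Two kinds of change (Ashby 4/1):
# 1. change from state to state, under one fixed transformation;
# 2. change from transformation to transformation: a new way of behaving.

transformations = {
    "baseline": {"a": "b", "b": "a"},   # one way of behaving
    "target":   {"a": "a", "b": "a"},   # a different way of behaving
}

parameter = "baseline"                  # set by some outside factor
state = "a"

for step in range(4):
    if step == 2:
        parameter = "target"            # change of kind 2: new transformation
    state = transformations[parameter][state]   # change of kind 1
    print(f"step {step + 1}: behaving as '{parameter}', state = {state}")
```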
Idea 9: Self-organization implies a higher entity or system
“The use of the phrase [self-organization] tends to perpetuate a fundamentally confused and inconsistent way of looking at the subject” Ashby 1962
· Self-stabilizing: “any deterministic dynamic system automatically evolves towards a state of equilibrium that can be described in terms of an attractor in a basin of surrounding states.” “Once there, the further evolution of the system is constrained to remain in the attractor.”
· Self-connecting: “Changes from parts separated to parts joined.” “Perfectly straightforward.”
· Self-improving: “Changing from a bad way of behaving to a good.” “No machine can be self-organizing in this sense.” “The appearance of being self-organizing can be given only by the machine S being coupled to another machine x. Then the part S can be self-organizing within the whole S+x.”
Enterprise architecture is very much about the last of these, designing and planning migrations from baseline systems to target systems.
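Returning to the first case: a self-stabilizing system can be sketched (my example) as a deterministic rule that evolves towards an attractor and then stays there.

```python
# Self-stabilizing: a deterministic dynamic system evolves towards an
# equilibrium (an attractor) and then remains there.
x = 10.0                      # start somewhere in the basin of attraction

for step in range(10):
    x = x / 2 + 1             # a fixed rule; the attractor is x = 2
    print(f"step {step + 1}: x = {x:.4f}")
# Once at x = 2, further evolution is constrained to remain in the attractor.
```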
SEE “SYSTEM CHANGE” CHAPTER for discussion of the two ideas above.
Idea 10: Coding is ubiquitous in thought and communication
8/1. … Our aim will be to work towards an understanding good enough to serve as a basis for considering the extremely complex codings used by the brain.
8/2. Ubiquity of coding. To get a picture of the amount of coding that goes on during the ordinary interaction between organism and environment, let us consider, in some detail, the comparatively simple sequence of events that occurs when a “Gale warning” is broadcast.
Ashby’s cybernetics is concerned with the processes of creating and using information, rather than its physical form. Actors and systems exchange information by encoding and decoding data structures in messages. In a system of communicating actors, meaning is found in both the action of a sender when encoding a data structure and the response of a receiver when decoding a data structure. There is no meaning in a data structure on its own. Meaning appears only in moments when actors create and use data structures, with reference to a code that maps data to meanings. So, to succeed in communicating, actors must share the same code or language.
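A sketch of the point (mine; the code table is invented): meaning appears only when sender and receiver map data to meanings using the same code.

```python
# Communication succeeds only when sender and receiver share a code
# that maps data structures to meanings.
senders_code   = {"gale warning": "G1"}   # meaning -> data
receivers_code = {"G1": "gale warning"}   # data -> meaning (shared code)
strangers_code = {"G1": "all clear"}      # a different code!

message = senders_code["gale warning"]    # encode: the data structure "G1"

print(receivers_code[message])            # decodes to "gale warning"
print(strangers_code[message])            # decodes to "all clear" (!)
# "G1" has no meaning on its own; meaning appears only in the acts of
# encoding and decoding, relative to a code.
```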
In a society of birds, the meaning of the message in an alarm call is unambiguous, because the code is inherited. In human society, the meaning of every word (say, “policy”) is ambiguous, because it is learnt.
In business, where mistaken interpretation of messages is a very serious problem, the meaning of data in memories and messages may be defined in some kind of metadata or domain-specific language.
SEE PART 4