Interpreting cybernetics for EA

Copyright 2019 Graham Berrisford. A chapter in the book https://bit.ly/2yXGImr. Updated 10/05/2021 14:45

 

Reading online? If your screen is wide, shrink its width for readability.

 

This chapter informs readers who want to apply Ashby’s cybernetic principles to EA and BA, and to understand the management cybernetics of Clemson and Beer. It analyses and comments on some terms and concepts you can find in Ashby’s works below.

 

·       “Design for a Brain” (1954, but originally 1952)

·       “Introduction to Cybernetics” (1957)

·       “Principles of the self-organizing system” (1962).

Contents

Introduction

On abstraction

On dynamics

On regulation and control

On system change

On encoding of information

Conclusions and remarks

 

Introduction

Ashby’s cybernetics was developed to deal with particular kinds of problem. His clearest examples were electrical and mechanical devices that consume some continuous input (an electric current, a push/pull force), and run under their own steam, after being configured by some input parameters. However, in his Introduction to Cybernetics, Ashby declared the transferability of his cybernetic principles:

 

1.     from a “determinate machine” (chapter 3) that performs a “single-valued closed transformation”, to

2.     a “machine with input” (chapter 4) that can be configured by an input parameter to behave (autonomously) in different ways, to

3.     a “black box” machine (chapter 6) with a continuous input stream that can be coupled by input and output to other machines in a wider system, to

4.     a “free-living organism” that has evolved so as to respond appropriately to a variety of discrete input events, to

5.     a human social entity, in which (comparatively) anything is possible.

 

In the steps from 3 to 4 to 5, there is both some overlap of ideas and some discontinuity. At step 4, we find systems that are driven more by input events than by internal conditions. At step 5, we find a social entity that is distinguishable from the various activity systems its actors may play roles in.

 

It is unclear to me how far Ashby really thought his cybernetic principles are applicable to event-driven psychological and social behavior and business activity systems. However, Clemson defined 22 principles for management cybernetics (to be discussed). And, somewhat more modestly, this chapter abstracts 10 principles that seem most relevant to EA and BA. Some may be applied directly to EA and BA; others need some reinterpretation (or “de-ontologizing” as Luhmann said when applying the biological idea of autopoiesis to sociology).

On abstraction

"A system... is independent of the concrete substance of the elements (e.g. particles, cells, transistors, people, etc).” Principia Cybernetica Web

 

Idea 1: Activity systems are realized by physical entities

"It is important to stress Ashby defined a system not as something that exists in nature. A system consisted of a set of variables chosen for attention and relationships between these variables, established by observation, experimentation, or design." Ashby’s student Krippendorff writing in 2009 on Ashby’s Information Theory

 

In EA, a business activity system is a highly selective abstraction from what happens in a physical entity. 

 

Abstracting a business activity system from a physical entity

Abstract system (roles, rules and variables): the “billing” process
Physical system (activities that advance variable values): billing process instances
Physical entity (actors that play roles in the activities): customers, suppliers and their banks
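
To make the distinction concrete, here is a minimal Python sketch (the names and states are invented for illustration, not taken from the book or any EA standard): the abstract system is a set of roles, rules and variables; the physical system is a process instance in which variable values are advanced; the physical entities are whichever actors play the roles.

# Abstract system: the roles, rules and variables of a "billing" process type.
billing_roles = ("biller", "payer")
billing_rules = {"unpaid": "invoiced", "invoiced": "paid"}    # allowed state transitions

# Physical system: a process instance in which activities advance variable values.
class BillingInstance:
    def __init__(self, order_id):
        self.order_id = order_id
        self.state = "unpaid"                      # a state variable chosen for attention

    def advance(self):
        self.state = billing_rules[self.state]     # activity constrained by the abstract rules

# Physical entities: the actors (customers, suppliers, banks) play the roles;
# the abstract system says nothing about who or what they concretely are.
instance = BillingInstance("order-42")
instance.advance()                                 # unpaid -> invoiced
instance.advance()                                 # invoiced -> paid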

 

SEE “ABSTRACTING SYSTEMS” CHAPTER

On dynamics

 

Idea 2: A system can be coupled to its environment by information feedback loops

3/8. Given an organism, its environment is defined as those variables whose changes affect the organism, and those variables which are changed by the organism's behavior.

 

In EA, there is a feedback loop in which a business affects some parts of its environment, and conversely the environment affects the business. Together, they are coupled in a wider system.

 

Business activity system          Feedback loop                 Business environment
Regulator                         ← state information           Target
                                  direction →
Consumes inputs,                  ← inputs                      External entities
produces outputs,                 → outputs                     and events
maintains system state

 

Business systems can be coupled in a wider business system to produce effects or results that none of them can produce on its own.
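
A minimal sketch of such a feedback loop, assuming a single monitored variable and a simple corrective rule (the variable names and numbers are invented for illustration):

# A regulator coupled to its environment by a feedback loop:
# state information flows in, directions (control actions) flow out.
target_level = 100          # the desired value of one monitored state variable
stock = 80                  # the environment's actual state

for week in range(5):
    stock -= 20                                    # disturbance from external events
    state_information = stock                      # the regulator senses the target's state
    direction = target_level - state_information   # the regulator chooses a corrective action
    stock += direction                             # the environment responds to the direction
    print(week, stock)                             # held at the target level despite disturbances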

 

SEE “SYSTEM DYNAMICS” CHAPTER for discussion of feedback loops

 

Idea 3: Most systems of interest can be modelled using discrete dynamics

2/1. “The most fundamental concept in cybernetics is that of ‘difference’, either that two things are recognisably different, or that one thing has changed with time… We assume change occurs by a measurable jump.”

 

In EA, as in Ashby’s cybernetics, changes are usually divided into discrete units. To describe the continuous space of the universe, we divide it into discrete entities and systems. To describe the continuous flow of time, we divide changes into discrete events and step changes in a system’s state – which is the norm in business activity systems.
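
A small sketch of discrete dynamics, assuming a hypothetical account balance advanced by discrete events (the event list is invented for illustration):

# Continuous change modelled as discrete step changes in state.
balance = 0                                        # one state variable of the system
events = [("deposit", 100), ("withdraw", 30), ("deposit", 5)]

for kind, amount in events:                        # time divided into discrete events
    balance += amount if kind == "deposit" else -amount
    print(kind, balance)                           # the state advances by a measurable jump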

 

SEE “DYNAMICS” CHAPTER for discussion of continuous and discrete dynamics.

 

Idea 4: System state changes are constrained by rules

“2/3. … if the concept of “change” is to be useful it must be enlarged to the case in which the operator can act on more than one operand, inducing a characteristic transition in each… Such a set of transitions, on a set of operands, is a transformation.”

 

Ashby’s transition is a change of one state variable from one value to another. The whole state of a system advances in discrete steps, whether in response to input events and/or some kind of internal drive. Ashby’s transformation is the set of rules for how all the state variables change in the course of a transition.
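
A transformation can be sketched as a lookup table in which each operand (a current state value) has exactly one transition; the state names below are invented for illustration.

# Ashby's transformation as a table of transitions: a single-valued, closed mapping,
# since every result is itself an operand of the table.
transformation = {"ordered": "picked", "picked": "shipped", "shipped": "shipped"}

state = "ordered"
for step in range(3):
    state = transformation[state]                  # one transition per step
    print(step, state)                             # picked, shipped, shipped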

 

In EA, the state of a business, and of whatever it monitors and directs, is recorded in the state variables of information systems. The current state is the values those variables have right now.

 

Idea 5: System state change can appear probabilistic

12/9. Whether a given ‘real machine’ appears Markovian or determinate will sometimes depend on how much of the machine is observable; and sometimes a ‘real machine’ may be such that an apparently small change of the range of observation may be sufficient to change the appearances from that of one class [Markovian] to the other [determinate].

 

In other words, Ashby’s interest in systems extended beyond purely determinate ones. In EA, when observing an activity system, we can classify actors’ responses to stimuli into three kinds.

 

Causality          We can predict…
Deterministic      exactly which action an actor will perform in response to an event.
Probabilistic      how likely an actor is to choose activity type A over activity type B.
Possibilistic      only that the actor will choose from the known range of activity types.

 

Even possibilistic causality is “regular” in the sense that an actor is constrained to choose between actions in a defined range. If an actor invents a new action, it acts outside any defined system. If actors are free to choose between possible activities, then by monitoring the choices they make, you may be able to attach probabilities to the possibilities.
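
The three kinds of causality might be sketched as three ways of selecting an actor’s response; the activity names and probabilities below are invented for illustration.

import random

activities = ["approve", "reject"]                 # the known range of activity types

def deterministic(event):                          # exactly predictable from the event
    return "approve" if event == "valid order" else "reject"

def probabilistic():                               # predictable only as a likelihood
    return random.choices(activities, weights=[0.8, 0.2])[0]

def possibilistic():                               # only the range of options is predictable
    return random.choice(activities)

print(deterministic("valid order"), probabilistic(), possibilistic())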

 

SEE “XXX” CHAPTER for discussion of causality

On regulation and control

 

Idea 6: A regulator needs a model of its environment

The Conant-Ashby theorem, or “good regulator” theorem, was conceived by Roger C. Conant and W. Ross Ashby and is central to cybernetics.

 

Abstract "The design of a complex regulator often includes the making of a model of the system to be regulated. The making of such a model has hitherto been regarded as optional, as merely one of many possible ways. In this paper a theorem is presented which shows, under very broad conditions, that any regulator that is maximally both successful and simple must be isomorphic with the system being regulated. (The exact assumptions are given.) Making a model is thus necessary.

 

The theorem has the interesting corollary that the living brain, so far as it is to be successful and efficient as a regulator for survival, must proceed, in learning, by the formation of a model (or models) of its environment."

https://www.tandfonline.com/doi/abs/10.1080/00207727008920220

 

Evidently, to function and respond to changes, an animal must “know” what is going on in its world. It needs a model of the entities and events in its environment if it is to find food and mates, and avoid enemies. A brain holds a model of things in its environment, which an animal uses to manipulate those things. A missile guidance system senses spatial information, and sends messages to direct the missile. In short, every good regulator of a system must be (or have) a model of that system. The richer the model, the more adaptive the animal, machine or business can be to changes in its environment.

 

In EA, a business database holds a model of business entities and events, which people use to monitor and direct those entities and events. (Note that a stateless system can import its model before it processes an input, then put the model away again.)
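
A minimal sketch of a regulator that works through a model of the entities it monitors and directs (a hypothetical stock-control example, not from Ashby; the names are invented):

# The regulator's model of its environment: one record per entity it monitors.
model = {"widget": {"stock": 12, "reorder_level": 20},
         "gadget": {"stock": 45, "reorder_level": 20}}

def on_sale(item, quantity):
    model[item]["stock"] -= quantity               # keep the model in step with the world

def reorder_decisions():
    # The regulator decides by consulting its model, not the world directly.
    return [item for item, record in model.items()
            if record["stock"] < record["reorder_level"]]

on_sale("widget", 5)
print(reorder_decisions())                         # ['widget']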

 

Idea 7: Regulators must recognize the variety they control

“A system's variety V measures the number of possible states it can exhibit.” Ashby 1956

 

To succeed in turning a light on or off, you must know its current state. More generally, regulators must recognize the variety in the states they seek to control. The wider the variety of states to be controlled, the wider the variety of states the regulator must recognize, and the wider the variety of actions it must be able to perform. This is not a physical law like Newton’s laws of motion, which may be supported or disproven by experiment. It is an information law, and an expression of what is mathematically inevitable.

 

Ashby’s law of requisite variety is commonly expressed as “only variety can absorb variety”. He expressed it rather more clearly as follows.

"The larger the variety of actions available to a control system, the larger the variety of perturbations it is able to compensate".  Ashby 1956

 

Perturbations are changes to a target system’s variables that move their values outside of a normal or desirable range. The more ways a target can deviate from its desired state, the more control actions its controller(s) will need.
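
As a toy illustration of the arithmetic behind the law (the perturbation and action names are invented), if the target can be disturbed in more distinct ways than the regulator has distinct responses, the regulator cannot hold the outcome to a single desired state:

import math

# Requisite variety as simple counting: the variety of outcomes cannot be reduced
# below (variety of perturbations) / (variety of control actions).
perturbations = ["demand spike", "supplier failure", "price rise", "staff shortage"]
actions = ["raise price", "use backup supplier"]

best_outcome_variety = math.ceil(len(perturbations) / len(actions))
print(best_outcome_variety)   # 2: at best two distinct outcomes remain, so not full control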

 

In EA, this law has some application to business architecture. However, a regulator models only those variables of the target system it seeks to control. A government institution or a business (like IBM) is a social entity definable in terms of countless variables. There is no prospect of typifying the whole of IBM (all its describable variables) in one abstract system.

 

There are limits to how far a target can be controlled by one regulator. Different regulators, with different interests, may see it as different systems with different varieties. And a regulator can’t fully control variables that are also controlled by other forces. A regulator designed to monitor and direct some selected variables of IBM may find they are buffeted by other factors, including other regulators and market forces.

 

SEE “ASHBY’S LAW” CHAPTER for discussion of the two ideas above.

On system change

2/1. “The most fundamental concept in cybernetics is that of ‘difference’… We assume change occurs by a measurable jump.”

 

Idea 8: Mutation differs from state change

Ashby’s “Design for a brain”

5/7 “It must be noted that ‘adaptation’ is commonly used in two senses, which refer to different processes. The distinction may best be illustrated by the inborn homeostatic mechanisms – the reaction to cold by shivering for instance… [Historically] the first change involved the development of the mechanism itself [by a mutation]; the second change occurs when the mechanism is stimulated into showing its properties [changing the state of an animal].”

 

Ashby’s “Introduction to Cybernetics"

4/1 It will be seen that the word “change” if applied to such a machine can refer to two very different things. There is the change from state to state, which is the machine’s behaviour, and which occurs under its own internal drive, and there is the change from transformation to transformation, which is a change of its way of behaving, and which occurs at the whim of the experimenter or some other outside factor. The distinction is fundamental and must on no account be slighted.”

 

In EA, we expect a business to update its information systems when entities of interest change. And any business activity system (its roles, rules and variables) can be changed into a new and different system. The result of migrating from a baseline system to a target system may be seen as the next generation of the same system, or as a different system altogether.
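
The two senses of change can be sketched as changing a machine’s state under one set of rules, versus changing the rules themselves; the states and rules below are invented for illustration.

# 1. Change of state: the machine moves from state to state under one transformation.
baseline_rules = {"received": "reviewed", "reviewed": "approved", "approved": "approved"}
state = "received"
state = baseline_rules[state]          # state change: received -> reviewed

# 2. Change of transformation: an outside factor changes the way of behaving,
#    e.g. a planned migration from a baseline design to a target design.
target_rules = dict(baseline_rules, reviewed="audited", audited="approved")
state = target_rules[state]            # the same state now behaves differently: reviewed -> audited
print(state)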

 

Idea 9: Self-organization implies a higher entity or system

“The use of the phrase [self-organization] tends to perpetuate a fundamentally confused and inconsistent way of looking at the subject” Ashby 1962

 

Ashby’s view changed over the period from 1947 to 1962. Over that period, he wrote of three kinds of self-organization. The first two may be classified as kinds of state change; the third is a kind of system mutation.

 

·       Self-stabilizing: “any deterministic dynamic system automatically evolves towards a state of equilibrium that can be described in terms of an attractor in a basin of surrounding states.”  “Once there, the further evolution of the system is constrained to remain in the attractor.”

·       Self-connecting: “Changes from parts separated to parts joined” “Self-connecting” “Perfectly straightforward”.

·       Self-improving: “Changing from a bad way of behaving to a good.” “No machine can be self-organizing in this sense.” “The appearance of being self-organizing can be given only by the machine S being coupled to another machine x. Then the part S can be self-organizing within the whole S+x.”

 

EA is very much about the last of these, designing and planning migrations from baseline systems to target systems.
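
Ashby’s third sense can be sketched as follows (a hypothetical example, not Ashby’s): machine S cannot rewrite its own rules, but the wider whole S+x can, where x is a coupled change agent such as an architect or change programme.

class S:                                           # machine S cannot rewrite its own rules
    def __init__(self):
        self.rules = {"bad input": "fail"}         # a "bad way of behaving"
    def behave(self, event):
        return self.rules.get(event, "ignore")

def x(machine):                                    # another machine (or change agent) coupled to S
    machine.rules["bad input"] = "recover"         # migrates S to a better way of behaving

s = S()
print(s.behave("bad input"))                       # fail
x(s)                                               # within the whole S+x, S appears to self-organize
print(s.behave("bad input"))                       # recover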

 

SEE “SYSTEM CHANGE” CHAPTER for discussion of the two ideas above.

On encoding of information

 

Idea 10: Coding is ubiquitous in thought and communication

8/1. … Our aim will be to work towards an understanding good enough to serve as a basis for considering the extremely complex codings used by the brain.

8/2. Ubiquity of coding. To get a picture of the amount of coding that goes on during the ordinary interaction between organism and environment, let us consider, in some detail, the comparatively simple sequence of events that occurs when a “Gale warning” is broadcast.

 

Ashby’s cybernetics is concerned with the processes of creating and using information, rather than its physical form. Actors and systems exchange information by encoding and decoding data structures in messages. In a system of communicating actors, meaning is found in both the action of a sender when encoding a data structure and the response of a receiver when decoding a data structure. There is no meaning in a data structure on its own. Meaning appears only in moments when actors create and use data structures, with reference to a code that maps data to meanings. So, to succeed in communicating, actors must share the same code or language.
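
A minimal sketch of coding in communication (the code table is invented): a data structure has no meaning on its own; meaning appears only when sender and receiver apply a shared code to it.

# Meaning arises only when actors encode and decode data with reference to a code.
shared_code = {"GW": "gale warning", "AC": "all clear"}

def encode(meaning):                               # the sender's action: meaning -> data
    return {m: d for d, m in shared_code.items()}[meaning]

def decode(data):                                  # the receiver's action: data -> meaning
    return shared_code[data]

message = encode("gale warning")                   # "GW", the data structure in the message
print(decode(message))                             # "gale warning", only because the code is shared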

 

In a society of birds, the meaning of the message in an alarm call is unambiguous, because the code is inherited. In human society, the meaning of every word (say, "policy") is ambiguous, because it is learnt.

 

In EA, where misinterpretation of messages is a serious problem, the meaning of data in memories and messages may be defined in some kind of metadata or domain-specific language.

 

SEE PART 4

Conclusions and remarks

This chapter has discussed ten ideas and principles which (at least by way of metaphor) can be applied to enterprise and business architecture.