Clemson’s ideas

Copyright 2020 Graham Berrisford. A chapter in “the book” at https://bit.ly/2yXGImr. Last updated 21/04/2021 14:44

 

This chapter reviews the 22 laws, principles and theorems presented in “What is Management Cybernetics?” (1984) by Barry Clemson and Allenna Leonard, with reference to Ashby’s “Introduction to Cybernetics”. It seems to me that some of the ideas stand up; some are applicable to business architecture in EA; some are of questionable relevance; and some don’t stand up.

 

Reading online? If your screen is wide, shrink its width for readability.

Introduction

“In my 1984 introductory book on management cybernetics I said that most books with the words “science” and “management” in the title are neither scientific nor good management.”

 

That may still be true. And since Ackoff observed that business managers manage “messes” rather than systems, this chapter puts quotation marks around the word “organization”.

 

“I claimed that management cybernetics, on the other hand, is good science and teaches us how to co-operate with the natural order of things rather than continuously bloodying our heads on stone walls. In that 1984 book, Allenna Leonard and I listed 22 laws, principles, and theorems of management cybernetics.

 

This short chapter briefly reviews each of the 22 items in the list.

 

Ern Reynolds then suggested a very useful exercise: paste the name of your organization in place of the word “system” in the 22 laws, principles and theorems. You will then have a useful cybernetic description of your organization.”

 

The equation, business “organization” = system, is contrary to Ashby’s cybernetics (and to soft systems methodology), both of which are based on the idea that observers may identify different systems in the same “organization”.

1. System Holism 

“A) A system has holistic properties possessed by none of its parts.”

 

Yes, commonly, parts interact to produce effects they cannot produce on their own. However, consider a system that is homogeneous, in which several uniform parts each do the same thing. Remove one rider from a tandem, and the remaining rider can still propel the bicycle. Close half of McDonald’s outlets, and the remainder carry on selling burgers exactly as before. Divide a school of fish into two halves; each half behaves the same as the whole.

 

“B) Each of the system parts has properties not possessed by the system as a whole.”

 

Yes, generally. However, you can encapsulate a part and present its interface within the interface of the whole. E.g. all the useful properties of a spell checker are available via the interface of your word processor. The functions of the part are a subset of the functions of the whole.

2. Darkness 

“No system can be known completely.”

 

This is axiomatic to all description of reality, not particular to system theory. Even a grain of sand is beyond our full comprehension. In cybernetics, an entity is only a physical system in so far as it realizes an abstract system - a description of regular behavior relevant to “some main interest that is already given” as Ashby put it.

 

How an abstract activity system is realized | Orchestral music | Game of poker
Abstract system: a description of roles for actors, rules for processes and variable types | A musical score | The rules of the game
Physical system: a performance of defined activities, which gives values to variables | A performance of the score | A game of poker
Physical entity: one or more physical actors able to perform the activities | An orchestra | A card school

 

Every human “organization” is infinitely complex and not completely knowable. We can never model every aspect of an orchestra, card school or business; we can model, as a system, only some aspects of its structure or behavior that are “regular” as Ashby called them.

 

Again, if Clemson says a business = a system, that is contrary to Ashby’s point that, in cybernetics, different observers may identify different systems in the same “organization”. Moreover, human actors will sometimes act in self-determined ways; they will sometimes recognise, interpret and respond to events or conditions in ways not so far modelled. And so, a business “organization” will always extend beyond the scope of any cybernetic system we might model.

 

SEE THE “WHAT EVERY SYSTEM THINKER SHOULD KNOW” CHAPTER

3. Eighty-Twenty 

“In any large, complex system, eighty percent of the output will be produced by only twenty percent of the system.”

 

This rule of thumb is a variation of the Pareto principle (or 80/20 rule). It may be applied to many system features. E.g. in modelling business processes, it is said that 80% of the complexity is found in the 20% of cases that are exceptions.
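As a simple illustration of the rule of thumb (the per-unit output figures below are hypothetical, not from Clemson), this sketch measures what share of total output the top 20% of a system’s units produce:

```python
# A hedged illustration of the 80/20 rule of thumb; the per-unit output
# figures below are hypothetical, not from Clemson.

def top_share(outputs, fraction=0.2):
    """Share of total output produced by the top `fraction` of units."""
    ranked = sorted(outputs, reverse=True)
    k = max(1, round(len(ranked) * fraction))
    return sum(ranked[:k]) / sum(ranked)

# Ten units of a hypothetical system; a few dominate the output.
outputs = [400, 250, 150, 60, 40, 30, 25, 20, 15, 10]
print(top_share(outputs))  # 0.65: the top 2 of 10 units produce 65% of the output
```

The exact split is never 80/20 in real data; the point is only that output is typically concentrated in a small fraction of the system.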

4. Complementarity Law

“Any two different perspectives (or models) about a system will reveal truths about that system that are neither entirely independent nor entirely compatible.”

 

In cybernetics, what Clemson here calls two models are two abstract systems, and what Clemson here calls the system is the entity or “organization” that realizes those two systems.

 

Not entirely independent? Probably, since any two systems realized by one entity likely depend either on each other or on some common “infrastructure”. E.g. they may be independent in so far as they maintain unrelated variables (say, body temperature and blood sugar level), yet both depend on some other function (say, circulation of the blood).

 

Not entirely compatible? There is no obvious reason to assume this. The maintenance of body temperature is compatible with the maintenance of blood sugar levels. But it is true that we may discover activity systems in one business that are in competition or conflict, and that is an issue enterprise architects should address.

5. Hierarchy 

“Complex natural phenomena are organized in hierarchies with each level made up of several integral systems.”

 

“Complex” is a troublesome word. For sure, every natural phenomenon (waterfall, hurricane, oyster, sponge, maize plant, fungus) is infinitely complex in reality. However, its complexity can only be assessed with reference to a description or model of it. We can and do build simple models of complex realities, and that is a basis of systems thinking.

 

“Hierarchy” is another troublesome word. People do construct hierarchies to make sense of the messy network of things that exist in reality. This table (bottom to top) presents a history of the universe, from the big bang to human civilisation, as a hierarchy.

 

THINKING LEVEL | Elements or actors | Interact by
Human civilisation | Human organizations | Information encoded in writings
Human sociology | Humans in groups | Information encoded in speech
Social psychology | Animals in groups | Information encoded in signals
Psychology | Animals with memories | Sense, thought, response
Biology | Living organisms | Sense, response, reproduction
Organic chemistry | Carbon-based molecules | Organic reactions
Inorganic chemistry | Molecules | Inorganic reactions
Physics | Matter and energy | Forces

 

At each level of thinking, other hierarchies may be defined. In physics, there is a hierarchy that descends from galaxies through solar systems down to atomic particles. In biology, we see a body as decomposable successively into organs, cells, organelles, and organic chemicals.

 

We do find it convenient to construct models in which hierarchies are imposed over atomic parts, but perhaps those hierarchies exist only in our models, rather than in reality? And which is the top? Do cells serve the interests of the body? Or does a body serve the interests of its cells? Do genes enable an organism to live? Or does an organism live to reproduce its genes? The answer depends on your perspective.

 

By the way, at first glance, the hierarchical decomposition of a body through organs and cells to organelles seems analogous to the decomposition of a business through divisions and departments to units. However, the analogy between biological organisms and business “organizations” is a very weak one.

 

Cells in a body are | Employees of a business are
mechanical actors | actors with free will or “choice” (Ackoff)
unable to choose or change their mechanisms | able to choose and change their mechanisms
not viable outside their body | able to work for other businesses
dedicated to their role, cannot be repurposed | not entirely devoted to one role, can be assigned to another
unable to invent responses to stimuli | able to invent responses to stimuli
physically contained inside one body | logically grouped by managers
in physical contact with each other | connected remotely by logical communication

 

SEE THE “WHAT EVERY SYSTEM THINKER SHOULD KNOW” CHAPTER

6. Gödel’s Incompleteness Theorem

“All consistent axiomatic foundations of number theory include undecidable propositions.”

 

As I understand it, one implication of this is that we cannot devise a hierarchical ontology in which every word is defined in terms of other (higher) words. There will always be a few “root” words at the top of the hierarchy, whose meaning has to be taken as axiomatic.

7. Entropy – Second Law of Thermodynamics

“In any closed system the differences in energy can only stay the same or decrease over time; or, in any closed system the amount of order (or organization) can never increase and must eventually decrease.”

 

This seems of limited application to business architecture. Social systems thinking is at some remove from cybernetics, which is at some remove from thermodynamics. Ashby played down the role of thermodynamics in cybernetics, since our main interest is in open systems where a supply of energy, sufficient to maintain the order of the system, is available. In his Introduction to Cybernetics, he wrote:

 

1/5 “In this discussion, questions of energy play almost no part—the energy is simply taken for granted.” "Even whether the system is closed to energy or open is often irrelevant”.

4/15. Materiality: “cybernetics is not bound to the properties found in terrestrial matter, nor does it draw its laws from them.” “What is important is the extent to which the observed behaviour is regular and reproducible.”

7/24. Decay of variety: “any system, left to itself, runs to some equilibrium”. “Sometimes the second law of thermodynamics is appealed to, but this is often irrelevant to the systems discussed here."

9/11. Entropy: “Shannon has devised a measure for … entropy—that has proved of fundamental importance …” “The word “entropy” will be used in this book solely as it is used by Shannon; any broader concept being referred to as “variety” or in some other way.”

 

Note that Ashby distinguished entropy of communication theory from entropy of thermodynamics.

 

Aside: a whole is in a disordered state if there is no correlation between its parts. That is to say, knowing the state of one part gives no information about other parts. One source defines entropy as the number of ways such uncorrelated parts of a whole can be configured. The higher the number of parts, the higher the entropy of the whole, since there are more ways in which to arrange the parts. However, if you define one configuration, then organize the parts to match it, you are imposing order on the whole (which requires energy). Designing a system for adaptability may involve complexification - increasing the number of ways parts can be configured - and increasing the variety of system states.
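The counting in this aside can be sketched as follows, assuming the simplest case: n uncorrelated parts, each with s equally likely states (the numbers are illustrative):

```python
import math

# A sketch of the aside's counting, assuming n uncorrelated parts, each
# with s equally likely states. The whole can then be configured in s**n
# ways, and Shannon's entropy (in bits) is log2 of that count, so it
# grows with the number of parts.

def configurations(n_parts, states_per_part):
    return states_per_part ** n_parts

def entropy_bits(n_parts, states_per_part):
    return math.log2(configurations(n_parts, states_per_part))

print(configurations(3, 2))  # 8 ways to arrange 3 two-state parts
print(entropy_bits(3, 2))    # 3.0 bits
print(entropy_bits(10, 2))   # 10.0 bits: more parts, higher entropy
```

Imposing one chosen configuration reduces the effective number of arrangements to 1, i.e. zero bits of entropy - which is the sense in which organizing the parts imposes order.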

Clemson’s three redundancy principles

 

8. Redundancy of Information Theorem

Errors in information transmission can be protected against (to any level of confidence required) by increasing the redundancy in the messages.

 

That is helpful, but is not enough to protect against errors in information transmission.

SEE THE “INFORMATION AND COMMUNICATION” CHAPTER
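As a minimal sketch of the theorem (not Clemson’s own example), a repetition code adds redundancy by sending each bit several times; a majority vote at the receiver then absorbs occasional flips:

```python
from collections import Counter

# A minimal sketch of the redundancy-of-information theorem: each bit is
# sent n times, and the receiver takes a majority vote per group. Larger
# n buys more confidence, at the cost of more redundant transmission.

def encode(bits, n=3):
    return [b for b in bits for _ in range(n)]

def decode(received, n=3):
    return [Counter(received[i:i + n]).most_common(1)[0][0]
            for i in range(0, len(received), n)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
sent = encode(message, n=3)
sent[0] ^= 1                         # the channel flips one symbol
print(decode(sent, n=3) == message)  # True: the majority vote recovers the message
```

Raising n drives the per-bit error probability as low as required, which is the “to any level of confidence required” clause in Clemson’s statement.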

 

9. Redundancy of Resources

Maintenance of stability under conditions of disturbance requires redundancy of critical resources.

 

Duplication and redundancy of components are how system designers improve the throughput, reliability, recoverability and scalability of a system.

 

Aside: The terms robustness and resilience (or, some say, antifragility) are used variously with reference to how a system is designed, or evolves, to survive in the face of a changing environment or disruptive input. The terms are perhaps most simply distinguished as follows. A robust system handles disruptive or unwelcome events and conditions (think homeostasis, or immunity to infection). A resilient system mutates to handle new events and conditions (think evolution). See point 21.

 

10. Redundancy of Potential Command

In any complex decision network, the potential to act effectively is conferred by an adequate concatenation of information.

 

For sure, our interest is in social systems in which actors act in response to information that has been received in messages and remembered in memories. However, I don’t know what a “decision network” is or what “concatenation of” implies.

11. Relaxation time 

“System stability is possible only if the system’s relaxation time is shorter than the mean time between disturbances.”

 

Assuming that an isolated, many-part system sooner or later reaches an equilibrium state, irrespective of its initial state, “relaxation time” is the time the system takes to reach that state.

 

This is of limited application to business architecture. Businesses often advance state variables progressively rather than homeostatically (see point 15). Also, they can process many inputs in parallel, by receiving them into discrete subsystems, which are coordinated later, if need be.
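A toy simulation illustrates the condition (the decay rate, disturbance size and intervals are hypothetical): a variable decays toward equilibrium after each disturbance, and if disturbances arrive faster than the variable can relax, it never settles:

```python
# A toy simulation of the relaxation-time condition; the decay rate,
# disturbance size and intervals are hypothetical. A variable decays
# toward equilibrium (0) after each disturbance; stability requires the
# relaxation time to be shorter than the gap between disturbances.

def settles(decay_per_step, disturbance, interval, steps, tolerance=0.01):
    """True if the variable ends within `tolerance` of equilibrium."""
    x = 0.0
    for t in range(steps):
        if t % interval == 0:
            x += disturbance      # a disturbance knocks the variable off 0
        x *= decay_per_step       # relaxation toward equilibrium
    return abs(x) < tolerance

print(settles(0.5, 1.0, interval=20, steps=100))  # True: time enough to relax
print(settles(0.5, 1.0, interval=2, steps=100))   # False: it never settles
```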

12 and 13. Circular Causality (recap from previous chapter)

Two systems can be coupled as subsystems of a wider system, such that the output from one is input to the other.

 

12. “Given positive feedback (i.e., a two-part system in which each part stimulates any initial change in the other), radically different end states are possible from the same initial conditions.

13. Given negative feedback (i.e., a two-part system in which each part tends to offset any change in the other), the equilibrial state is invariant over a wide range of initial conditions.”

 

These are the basic ideas of system dynamics. Mutually-reinforcing positive feedback will drive two coupled systems forward to new states. Mutually-constraining negative feedback will tend to keep two coupled systems in a stable state. Such feedback is important in the cybernetic control of one system by another. Controllers (regulators) are coupled to targets by information flows in a feedback loop.
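The two regimes can be sketched with one-variable loops (the gains, target value and step counts below are hypothetical, chosen only to show the contrast):

```python
# A sketch of points 12 and 13 with one-variable feedback loops; the
# gains, target and step counts are hypothetical.

def positive_feedback(state, gain=0.1, steps=50):
    for _ in range(steps):
        state += gain * state             # each change stimulates further change
    return state

def negative_feedback(state, target=20.0, gain=0.5, steps=50):
    for _ in range(steps):
        state += gain * (target - state)  # deviations from the target are offset
    return state

# Positive feedback: a small initial state runs away to a radically new state.
print(positive_feedback(1.0) > 100)  # True

# Negative feedback: widely different initial states reach the same equilibrium.
print(round(negative_feedback(0.0), 6), round(negative_feedback(100.0), 6))
```

This is the sense in which the equilibrial state under negative feedback is “invariant over a wide range of initial conditions”.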

 

 

Feedback loop

Controller  ← state information ←  Target
Controller  → direction →  Target

 

Feedback is important to the success of business operations. A business is coupled to its environment by producing outputs that affect external entities, and receiving inputs, some of which are responses to previous outputs.

14. Feedback dominance theorem

For high gain amplifiers, the feedback dominates the output over wide variations in input.

 

No comment.

15. Homeostasis 

“A system survives only so long as all essential variables are maintained within their physiological limits (e.g. blood sugar).”

 

The idea of homeostasis is of limited application to business architecture. Clearly, a business cannot survive unless its human actors are fed, watered and maintained in comfort by HVAC systems. And computers need an electricity supply. But most business management methodology takes these support functions for granted.

 

The core or primary functions of a business “organization” may be more progressive than homeostatic, meaning they continually advance the state of the business. They progressively increase expenditure and income variables. They work to complete end-to-end processes that terminate in delivering a result of value (or fail), and projects that stop when a change has been made (or time or money variables have been exhausted).

16. Steady State 

“If a system is in a state of equilibrium (a steady state), then all sub-systems must be in equilibrium. If all sub-systems are in a state of equilibrium, then the system must be in equilibrium.”

 

Hmm… That may be true of some systems. However, the universe is not a stable system. It is an ever-unfolding process in which new systems evolve and old systems die. The world’s biosphere might be seen as a stable system at a high level of abstraction (say, in terms of its total biomass), but its subsystems are not. Species and organisms come and go. Arguably, the biosphere is stable despite or because of volatility in its subsystems.

 

Similarly, a nation’s economy may be stable in terms of (say) interest rates and inflation, despite or because there are many business failures and start-ups. And a business entity may survive despite or because it is divided into units that are allowed to succeed or fail on their own.

17. Requisite Variety Law (repeat from earlier chapter)

“The control achievable by a given regulatory sub-system over a given system is limited by 1) the variety of the regulator, and 2) the channel capacity between the regulator and the system.”

 

Ashby equated the complexity of a system to its variety. Regulators must be able to recognize the variety they seek to control. You cannot usefully turn on a light unless you know a room is dark (or vice-versa).

 

The wider the variety of states you wish to control, the wider the variety of states you must recognize, and the wider the variety of actions you must be able to perform. This is not a physical law like Newton’s laws of motion, which may be supported or disproven by experiment. It is an information law, and an expression of what is mathematically inevitable.
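The mathematical inevitability can be sketched as a counting argument (the disturbance and response counts below are illustrative): with D possible disturbances and R distinct regulator responses, the outcomes can at best be confined to ceil(D / R) states:

```python
import math

# A sketch of the counting argument behind the law of requisite variety;
# the disturbance and response counts are illustrative. With D possible
# disturbances and R distinct responses, the best achievable outcome
# variety is ceil(D / R): only variety in the regulator can absorb
# variety in the disturbances.

def best_outcome_variety(disturbances, responses):
    return math.ceil(disturbances / responses)

print(best_outcome_variety(100, 10))   # 10: outcomes cannot be narrowed further
print(best_outcome_variety(100, 100))  # 1: one outcome needs matching variety
```

Holding the target to a single desired outcome therefore requires the regulator’s variety to match the disturbances’ variety, which is Ashby’s point.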

 

This law has some application to business architecture. However, there are limits to how far a target can be controlled by a regulator.

 

A regulator can control only selected variables. A government institution or a business (like IBM) is a social entity definable in terms of countless variables. Different regulators - with different interests - may see it as different systems with different varieties. There is no prospect of typifying the whole of IBM, with all its describable variables, in one abstract system.

 

A regulator can’t fully control variables also controlled by other forces. A regulator designed to monitor and direct some selected variables of IBM may find them buffeted by other factors - including other regulators and market forces.

 

SEE THE “ASHBY’S LAW” CHAPTER

18. Conant-Ashby theorem (repeat from earlier chapter)

Every good regulator of a system must be a model of that system. (Conant and Ashby, International Journal of Systems Science, 1970)

 

This principle is a foundation for much in this book. Here, a regulator can be an animal, machine or business that has a model, or has access to a model, of what it needs to monitor and control. So, read this triangle from left to right: regulators <have and use> models, which <represent> targets.

 

The good regulator

                         Models
      <have and use>            <represent>
Regulators   <monitor and regulate>   Targets

 

Evidently, to function and respond to changes, an animal must “know” what is going on in its world. It needs a model of entities and events in its environment if it is to find food and mates, and avoid enemies.

 

Organic, mechanical, business and software entities may all be connected by feedback loops to the variables they monitor and direct. A brain holds a model of things in its environment, which an organism uses to manipulate those things. A missile guidance system senses spatial information, and sends messages to direct the missile. A business database holds a model of business entities and events, which people use to monitor and direct those entities and events.

 

The richer the model, the more adaptive the animal, machine or business can be to changes in its environment.

 

Note that a regulator models only those variables of the target system it seeks to control, not necessarily all the variables that the target maintains. Note also that in a discrete event-driven system, a stateless regulator can import its model before it processes an input, then put the model away again.
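That last point can be sketched as follows; the dict store and the thermostat-style rule are hypothetical stand-ins for a real database and regulator:

```python
# A sketch of a stateless, event-driven regulator: it holds no state of
# its own, but imports its model of the target before processing each
# input, and puts the model away again afterwards. The dict store and
# thermostat rule are hypothetical stand-ins.

model_store = {"temperature": 18.0, "setpoint": 21.0}  # stands in for a database

def handle_event(reading):
    model = dict(model_store)        # import the model for this one event
    model["temperature"] = reading   # update the model from the input
    # Decide using only the model, never state held by the regulator itself.
    action = "heat on" if model["temperature"] < model["setpoint"] else "heat off"
    model_store.update(model)        # put the model away again
    return action

print(handle_event(18.5))  # heat on
print(handle_event(22.0))  # heat off
```

Because all state lives in the store, any number of identical regulator instances can share the work, which is one practical payoff of the stateless design.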

19. Self-Organizing Systems

“Complex systems organize themselves; the characteristic structural and behavioral patterns in a complex system are primarily a result of the interactions among the system parts.”

 

It is unclear to me whether Clemson means that business processes are, or should be, left to the actors who play roles in them. While some patterns may emerge from the interactions of independent agents, most business architecture methods presume that core business processes are designed by a higher-level entity or meta system, and that actors are then employed to play roles in those processes.

 

By 1962, Ashby had concluded: “The use of the phrase [self-organization] tends to perpetuate a fundamentally confused and inconsistent way of looking at the subject”, where the subject is how a system may evolve. And “No machine can be self-organizing in this sense.” His treatise on self-organization can be seen as overcoming the limitations of basic cybernetics. He stated that no machine is capable of changing itself; however, another (let us call it “higher”) machine can do that. It turns out that Ashby’s approach to self-organization sits well alongside some sociological approaches to the topic, and merits deeper exploration.

 

SEE THE “SYSTEM CHANGE” CHAPTER

 

Generally, the term “complex system” is ill-defined. Here, it probably refers to a real-world “organization” in which human actors are more or less free to determine their actions, rather than to any particular activity system.

 

SEE THE “COMPLEXITY SCIENCE” CHAPTER

20. Basins of Stability

“Complex systems have basins of stability separated by thresholds of instability. A system “parked” on a ridge will “roll downhill”.”

 

In cybernetics, a variable, whatever its initial value, may be led by events to settle on a particular value, which appears to be its goal; or it may be attracted to the nearest of several possible stable values. It is unclear to me how this idea relates to business architecture, except by drawing a dubious analogy with the obvious fact that a business “organization” may go through periods of stability and periods of volatility.
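The “rolling downhill” image can be sketched with a toy double-well landscape (entirely illustrative): a variable settles on the nearer of two stable values, separated by an unstable ridge:

```python
# A toy "basins of stability" landscape (entirely illustrative): a
# variable rolls downhill on V(x) = (x**2 - 1)**2 / 4, which has stable
# values at -1 and +1 separated by an unstable ridge at x = 0.

def settle(x, step_size=0.1, steps=500):
    for _ in range(steps):
        x -= step_size * (x**3 - x)   # move downhill along -dV/dx
    return x

print(round(settle(0.4), 6))    # settles at 1.0, the nearer stable value
print(round(settle(-0.4), 6))   # same landscape, other basin: -1.0
```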

21. Viability

“Viability is a function of the balance maintained along two dimensions: 1) autonomy of sub-systems versus integration of the system as a whole, and 2) stability versus adaptation.”

 

Wrt 1, in enterprise architecture there are several design pattern pairs, which some systems thinkers presume to be related to the contrast Clemson draws.

 

More autonomy? | More integration?
Network connectivity | Hierarchical connectivity
Point-to-point communication | Hub-and-spoke communication
Peer-to-peer cooperation | Client-server cooperation
Self-organization of activities | Coordination of activities

 

Each pattern has different pros and cons. In activity system design, the designer’s role is to understand the trade-offs and fit the pattern to the situation. And to prioritise what is practical over what some might promote as an ideal.

 

However, the viability of a business depends on many other factors, such as resources, competition and legality.

 

Wrt 2, what does adaptation mean? Ashby wrote (5/7) that the word is commonly used in two senses which refer to different processes.  He urged us to distinguish system state change from system mutation, as in this observation.

“The word "change" if applied to [an entity repeating a behavior] can refer to two very different things.

·       change from state to state, which is the machine's behavior, and which occurs under its own internal drive, and

·       change from transformation to transformation, which is a change of its way of behaving, and occurs at the whim of the experimenter or some other outside factor.

The distinction is fundamental and must on no account be slighted.” (1956, 4/1)

 

By adaptation, Clemson surely means the latter - system mutation. For sure, a business must balance continuity of its activity systems against the need to change them now and then. An activity system cannot change “continually”, but it can change in discrete steps.

 

SEE THE “WHAT EVERY SYSTEM THINKER SHOULD KNOW” CHAPTER

22. Recursive System Theorem

“If a viable system contains a viable system, then the organizational structure must be recursive; or, in a recursive organizational structure, any viable system contains, and is contained in, a viable system.”

 

I am not sure what this means. A moon rocket is a viable system, but I wouldn’t say its structure is recursive, and I don’t think a business is well called recursive either. Moreover, note that the hierarchies in our models of biological organisms and business organizations are not “fractal”, since a) the decomposition is not infinite and b) structures are different at each level of decomposition.

Conclusions and remarks

This chapter has reviewed the 22 laws, principles and theorems presented in “What is Management Cybernetics?” (1984) by Barry Clemson and Allenna Leonard, with reference to Ashby’s “Introduction to Cybernetics”. It seems to me that some of the ideas stand up; some are applicable to business architecture in EA; some are of questionable relevance; and some don’t stand up.

 

Several points, notably 2, 5, 18 and 21, are explored in other chapters of this book.