Beer’s ideas (formerly more about Ashby’s ideas)

Copyright 2017 Graham Berrisford. One of about 300 papers at http://avancier.website. Last updated 25/08/2017 18:59

 

Forgive me, reader, but this paper grew too long. Some content has been moved out into other papers, notably Ashby’s ideas.

Contents

Ashby’s ideas

Beer’s ideas

Relative complexity

The Viable System Model (VSM)

Beer’s management control options: amplification and attenuation

Conclusions and remarks

Ashby’s law of requisite variety - revisited

Beer’s management control options - revisited

Beer’s thought experiment and big data

Further reading

 

Ashby’s ideas

This section is a minimal reminder of ideas that Beer took from Ashby as a starting point.

Read Ashby’s ideas for more.

 

W. Ross Ashby (1903-1972) was a psychologist and systems theorist.

“Despite being widely influential within cybernetics, systems theory… Ashby is not as well known as many of the notable scientists his work influenced.

W Ross Ashby was one of the original members of the Ratio Club, who met to discuss issues from 1949 to 1958.” Wikipedia in 2017

Unless otherwise stated, quotes below are from Ashby’s Introduction to Cybernetics (1956).

 

Abstraction

Ashby was keen that we separate logical system descriptions from the physical entities that realise them.

“Cybernetics depends in no essential way on the laws of physics.”

The system theory triangle: systems thinkers <form> system descriptions; system descriptions <idealise> real world entities; and systems thinkers <observe and envisage> real world entities.

 

“Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.”

 

A system is a view of a reality; or a role you can observe or envisage real world actors as playing.

To apply system theory is to form an abstract system description that hides the infinite complexity of real-world entities you observe or envisage.

You describe the state of a real-world system in terms of variables whose values can be measured (e.g. the positions of the planets).

You model whatever regular processes can change the values of those variables (e.g. the orbits of the planets).

You describe a process in a way that enables real world behaviors to be tested as matching your description.
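For illustration only (not Ashby’s own notation), such a description can be sketched in code: one measured state variable and one regular process that changes its value. The variable name and the daily step below are my own assumptions, chosen to echo the planetary example above.

# A minimal sketch of an abstract system description (illustrative only):
# one measurable state variable and one regular process that changes its value.
state = {"planet_angle_degrees": 0.0}           # the measured state variable

def orbit_step(state, degrees_per_day=360 / 365.25):
    """The regular process: advance the planet's position by one day."""
    new_angle = (state["planet_angle_degrees"] + degrees_per_day) % 360
    return {"planet_angle_degrees": new_angle}

for day in range(365):                          # run the described process
    state = orbit_step(state)
print(state)                                    # compare prediction with observation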

 

Cybernetics (control systems and target systems)

“Ashby popularised the usage of the term 'cybernetics' to refer to self-regulating systems…

The book [Design for a Brain] dealt primarily with homeostatic processes within living organisms, rather than in an engineering or electronic context.” Wikipedia 2017

 

Information flows are central to cybernetics.

E.g. Ashby saw the brain as a regulator that maintains a body’s state variables in the ranges suited to life.

He presented the brain-body relationship as an information feedback loop.

A brain holds (or has access to) an abstract model of the body’s current state.

The brain receives information from sensors, and sends instructions to motors and organs.

The aim is homeostasis – to maintain the state of the body – and so help it achieve other desired outcomes.
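A minimal sketch of such a feedback loop (my illustration, with invented numbers): the controller holds the desired range for one body variable, reads a sensor, and issues a corrective instruction.

# A minimal sketch of a homeostatic feedback loop (illustrative values only).
DESIRED_RANGE = (36.5, 37.5)                   # desired core temperature, degrees C

def control_action(sensed_value):
    low, high = DESIRED_RANGE
    if sensed_value < low:
        return "generate heat (e.g. shiver)"   # instruction to motors/organs
    if sensed_value > high:
        return "shed heat (e.g. sweat)"
    return "no action"

for reading in [36.2, 37.0, 38.1]:             # information received from sensors
    print(reading, "->", control_action(reading))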

 

Variety (a candidate measure of system complexity)

Ashby said that: “a system is any set of variables which he [the observer] selects”. 

“A system's variety V measures the number of possible states it can exhibit, and corresponds to the number of independent binary variables.

But in general, the variables used to describe a system are neither binary nor independent.”

 

In short, complexity = variety = the number of possible states a system can exhibit.
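As a worked illustration (mine, not Ashby’s): if the variables are independent, the number of possible states is the product of the number of values each variable can take; for n independent binary variables that is 2 to the power n, or n when variety is counted in bits.

import math

# Variety as the number of possible states of a set of described variables,
# assuming (for simplicity) that the variables are independent.
def variety(values_per_variable):
    states = math.prod(values_per_variable)
    return states, math.log2(states)           # total states, and the same in bits

print(variety([2, 2, 2]))                      # 3 binary variables -> (8, 3.0)
print(variety([10, 4, 3]))                     # non-binary variables -> (120, ~6.9)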

Several difficulties with this definition of complexity are discussed in Ashby’s ideas.

 

Law of requisite variety (“variety absorbs variety”)

Ashby's law of requisite variety applies to how a control system controls selected or essential variables of a target system.

“The larger the variety of actions available to a control system, the larger the variety of perturbations [in values of target system variables] it is able to compensate.”

 

Ashby’s law defines the minimum number of states necessary for a control system to control a target system with a given number of states.

It is interpreted here as meaning:

·         A control system’s information state models only those variables in the target system’s concrete state that are monitored and controlled.

·         For a homeostatic system to be stable, the number of states of the control system must be at least equal to the number of states in the target system.

·         The more ways that a homeostatic system can deviate from its ideal state, the more control actions a control system will need.

 

The law says that if a controller does not have enough variety to control the variables that define the target system, then it cannot control them.

It does not say that having enough variety is sufficient to ensure the controller can control those variables.
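A minimal sketch of the point (invented numbers, not Ashby’s notation): a controller that can select among fewer responses than there are distinct disturbances must leave some disturbances uncompensated, and having enough responses is necessary but not sufficient.

# A minimal sketch of requisite variety (illustrative only).
disturbances = ["d1", "d2", "d3", "d4", "d5"]  # 5 ways the target can deviate
responses = ["r1", "r2", "r3"]                 # only 3 control actions available

compensated = dict(zip(disturbances, responses))   # pair off as far as possible
uncorrected = disturbances[len(responses):]        # the rest cannot be compensated

print("compensated:", compensated)
print("uncorrected:", uncorrected)             # d4 and d5 remain uncontrolled
# Note: even 5 responses would only suffice if they were also the *right* responses.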

 

Maximising internal variety

"Since the variety of perturbations a [control] system can potentially be confronted with is unlimited, we should always try to maximize its internal variety (or diversity),

so as to be optimally prepared for any foreseeable or unforeseeable contingency." Principia Cybernetica.

This might be interpreted by a business today as filling up a “data warehouse” or collecting lots of “big data”.

There is considerable experience of this leading to data quality issues, as is noted in the conclusions below.

Beer’s ideas

Stafford Beer (1926-2002) was a theorist, consultant and professor at the Manchester Business School.

He regarded Ashby as a grandfather of cybernetics (I believe Ashby was a godfather to one of Beer’s children).

He respected Ashby’s ideas, but was focused more on what might be called “management science”.

So, he set out to apply Ashby’s ideas to business systems.

His book title “Brain of the Firm” (1972) may well be a deliberate echo of Ashby’s “Design for a Brain” 20 years earlier.

Relative complexity

Ashby might have said that thinking about structural resources (Men, Materials, Machines and Money) is inadequate, because it is regular behaviors we need to monitor and control.

Beer said that thinking about the four Ms is inadequate because we need to think about managing complexity.

Knowing that Ashby’s measure of complexity is incalculable for all business systems of interest, Beer said that relative statements are still valid.

How to assess relative complexity? How to objectively compare the relative complexity of two real entities, machines, societies or businesses?

 

You could do as follows.

1.      Choose your measure of complexity

2.      Identify the elements to be described (roles, actors, processes, variables, whatever)

3.      Describe two real world entities in terms of those elements

4.      Demonstrate your two descriptions have been made to the same level of abstraction.

5.      Demonstrate by testing that the two real world entities behave according to your two descriptions.

6.      Then apply the complexity measure and compare (a sketch follows this list).
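For illustration only (my sketch of step 6, not an established method), taking “number of possible states” as the chosen measure and two invented descriptions made from the same kinds of element:

import math

# A sketch of step 6: apply the chosen complexity measure to two descriptions.
# Each description lists its variables and how many values each can take.
description_a = {"order_status": 5, "stock_level": 100, "staff_on_shift": 8}
description_b = {"valve_position": 3, "boiler_temperature": 200}

def possible_states(description):
    return math.prod(description.values())    # assumes independent variables

print("A:", possible_states(description_a), "possible states")
print("B:", possible_states(description_b), "possible states")
# The comparison is only as objective as the two descriptions themselves.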

 

However, the process looks fanciful and impractical, leaving us with complexity as a subjective assessment.

The Viable System Model (VSM)

For general system theorists, the “organisation” of a system is how actors cooperate by playing describable roles in describable processes.

For some more sociological systems thinkers, the “organisation” of a business is its management (or command and control) structure.

 

In “Diagnosing the system for organisations” (1985) Beer detailed his “Viable System Model”.

He said the VSM was inspired by the structure of the human central nervous system.

In fact, it does not resemble the known structure or workings of the human brain or nervous system.

And it cannot be a model of viable systems in general, since many viable systems have nothing like a central nervous system (e.g. the solar system, a tree, a bee hive, an oyster).

The VSM is a tool for diagnosing human organization design issues, and generating change proposals.

It is a reference model or design pattern for the organisation of a business.

 

Beer’s writing is not easy to follow and the VSM is complicated.

For a picture of the VSM, find “Diagnosing the system for organisations” on the internet and look at Figure 37.

Beer wrote: “There is no 'correct' interpretation of the VSM. We have spoken instead of more or less useful interpretations.”

Interpreting the VSM is a job in itself, which some management consultants enjoy doing.

Here is a possible interpretation.

The Viable System Model – one interpretation:

System 5: makes policy decisions to steer the whole organization and balance demands from different units. (Business executive)

System 4: looks out to the environment and monitors how the organization must change to remain viable. (Business strategy and planning)

System 3: establishes the rules, resources, rights and responsibilities of System 1 and interfaces with Systems 4/5. (Enterprise architecture?)

System 2: information systems that enable primary activities to communicate and System 3 to monitor and co-ordinate those activities. (IT operations)

System 1: the primary/core business activities (recursively, each is a viable system). (Business operations)
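One way to see the recursion in System 1 is as a data structure. This is a minimal sketch of my own reading (not Beer’s formalism); the field names and example units are invented.

from dataclasses import dataclass, field
from typing import List

# A minimal sketch of the VSM's recursive shape (one reading, not Beer's own):
# a viable system has management functions (Systems 2-5) and a System 1 made of
# operational units, each of which is itself a viable system.
@dataclass
class ViableSystem:
    name: str
    system5_policy: str = ""
    system4_intelligence: str = ""
    system3_control: str = ""
    system2_coordination: str = ""
    system1_operations: List["ViableSystem"] = field(default_factory=list)

firm = ViableSystem(
    name="The firm",
    system5_policy="executive board",
    system4_intelligence="strategy and planning",
    system3_control="resource bargaining and audit",
    system2_coordination="shared information systems",
    system1_operations=[ViableSystem("Product line A"), ViableSystem("Product line B")],
)
print(firm.system1_operations[0].name)          # each unit can be elaborated in turn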

 

Note that Beer (being a steel industry man) hugely underestimated the extent to which core business activities would become IT operations.

 

Quantified evidence for the effectiveness of Beer’s VSM is sparse.

“Few organizations have adopted the VSM as their formal organizational structure.

But many consultants have used it to diagnose the way an organization is operating and where improvements are needed.” (Stuart Umpleby).

 

This paper is not about using the VSM; it is about four core concepts that underpin it.

Beer inherited the first two concepts below from Ashby and added two more.

·         Variety: the number of possible states a system can exhibit (one way to assess its complexity).

·         The law of requisite variety: “only variety can absorb variety”

·         Attenuator: a device that reduces variety.

·         Amplifier: a device that increases variety.

Beer’s management control options: amplification and attenuation

Remember, Beer set out to apply Ashby’s ideas to business systems.

What if a control system fails to control key state variables of a target system?

What if managers fail to control key variables of a business, such as production rate, profit, etc.?

What if a government fails to maintain the happiness of its population? (No joke: follow the link at the end of this paper.)

Beer proposed two options for managers (sketched in code after this list):

·         amplify (increase) variety in the control system, or

·         attenuate (reduce) variety in the target system.
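A minimal sketch contrasting the two options (invented numbers, my illustration): when the controller’s variety falls short of the target’s, either add control actions or constrain the target.

# A minimal sketch of Beer's two options (illustrative numbers only).
target_states = 12        # ways the target system can deviate
control_actions = 8       # distinct corrective actions available to the controller

def requisite_variety_met(target, controller):
    return controller >= target

print(requisite_variety_met(target_states, control_actions))       # False: control fails

amplified_actions = control_actions + 5        # option 1: amplify controller variety
print(requisite_variety_met(target_states, amplified_actions))     # True

attenuated_states = 7                          # option 2: attenuate target variety,
print(requisite_variety_met(attenuated_states, control_actions))   # e.g. by standardising processes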

 

Other options are discussed later under “Conclusions and remarks”.

Conclusions and remarks

It turns out that, in nature and in business, control systems may be distributed rather than centralised (as Ashby and Beer presumed).

I gather the human body does not maintain homeostasis purely by top-down command and control from the higher brain.

Instead, its many variables are maintained by several control systems, operating in parallel, which are distributed and not in direct communication with each other.

An oyster manages to maintain homeostasis without having any central brain and nervous system.

In large businesses, there is rarely if ever an overarching control body that monitors and directs all business processes.

There are in practice several (hierarchical and/or parallel) bodies that may compete for resources, and even have conflicting goals.

 

To different controllers, one real world entity appears as several target systems at once, a different one to each controller.

But what if two control systems compete or complement each other in seeking to control the same target system state?

This feels to me like the very stuff of relationships between people in a social system!

Ashby’s law of requisite variety – revisited

Go to Ashby’s ideas to read how these four sentences are continued:

·         Ashby’s law might be interpreted for a business thus…

·         To direct an actor or activity in its environment, a brain or a business must be able to…

·         To generalise, a control system must…

·         A control system can expect a target system to respond appropriately provided that…

Beer’s management control options - revisited

What if it turns out that a target system cannot or does not always respond appropriately to a control system?

First, the control system can be designed to allow for these cases.

A common practice is for the control system to pass responsibility over to some kind of exception handling process.
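A minimal sketch of that practice (illustrative only, with invented names): the controller checks whether its corrective action had the intended effect and, if not, passes the case to an exception handler.

# A minimal sketch of escalation to an exception handling process (illustrative only).
def handle_exception(variable, value):
    print(f"escalated: {variable}={value} could not be brought into range")

def control(variable, value, desired_range, apply_correction):
    low, high = desired_range
    if low <= value <= high:
        return value                            # nothing to do
    corrected = apply_correction(value)         # the normal control action
    if not (low <= corrected <= high):          # target did not respond appropriately
        handle_exception(variable, corrected)   # pass responsibility on
    return corrected

# e.g. a correction too weak to bring the variable back into range
control("order backlog", 150, (0, 100), apply_correction=lambda v: v - 10)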

 

There are many other design options to consider, starting with Beer's two basic options:

·         attenuate (reduce) variety in the target system

·         amplify (increase) variety in the control system.

 

You might consider also:

·         improving the information flow quality (speed, throughput, integrity etc.)

·         adding another control system in parallel to the existing one.

 

And if the actors in the system are self-determining (can choose their own response to a stimulus) two more options could be:

·         empower actors in the target system to determine their own actions, in the light of given goals

·         empower actors in the target system to find and deploy their own control systems.

 

I am not pretending these to be original ideas; Barry Clemson tells me you can find some of them in Beer’s writings.

Nor am I pretending that this is an exhaustive list of options.

Beer’s thought experiment and big data

Science demands that theories lead to predictions that are testable in experiments.

System design creates system descriptions whose behaviors should be testable in built systems.

If the experiments or system tests fail, then the theory or system design should be discarded or corrected.

 

In the 1970s, Ackoff and Beer predicted the imminent collapse of government institutions, if not the collapse of society as a whole.

What has happened in the 50 years since, alongside an ever-increasing population?

The United Nations reports huge advances in the health, life expectancy, education and welfare of people across the globe.

How did those institutions avoid their predicted fate, and succeed so well?

 

Beer’s concern was not the difficulty of recruiting or motivating staff to implement political decisions, or to question them.

He was more concerned with the variety of the real world actors and activities that institutions seek to monitor and direct. 

To address this variety he proposed automating feedback loops to collect more information - “big data” we might call it now.

 

For example:

“Beer built a device that would enable the country’s citizens, from their living rooms, to move a pointer on a voltmeter-like dial that indicated moods ranging from extreme unhappiness to complete bliss.

The plan was to connect these devices to a network—it would ride on the existing TV networks—so that the total national happiness at any moment in time could be determined.

The algedonic meter, as the device was called (from the Greek algos, “pain,” and hedone, “pleasure”), would measure only raw pleasure-or-pain reactions to show whether government policies were working.”

See the New Yorker reference under Further reading below.

 

Was Beer’s advice sound?

 

Data quality issues

In effect, Beer favored “maximising the internal variety” of the control system, in case that information might be useful.

Consider for example, the information gathering practices of the former East German government.

And the business practice of dumping data in a “data warehouse” or collecting lots of “big data”.

There is considerable experience of businesses having data quality problems, especially when collecting more data than they need.

When they try to use that data, they find the data is out of date, not quite what they want, or inaccurate.

The feedback loop might be characterised as garbage in, garbage out.

 

The best control systems are lean.

A lean control system is one that knows a minimal amount of what is going on in the controlled entity.

E.g. a thermostat knows nothing of a heating system except the temperature of the environment.

Lean non-intrusive government is generally favoured over the practices of the former East German government.

And in business, management by exception is common.

Managers often ask us to minimise the variety they monitor by reporting only “traffic light” status information to them.
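A minimal sketch of that attenuation (thresholds invented for illustration): many possible measured values are reduced to three reportable states before they reach the manager.

# A minimal sketch of variety attenuation by "traffic light" reporting.
def traffic_light(percent_of_target):
    if percent_of_target >= 95:
        return "green"
    if percent_of_target >= 80:
        return "amber"
    return "red"

weekly_measures = {"orders shipped": 97, "invoices paid": 83, "defect rate": 62}
report = {name: traffic_light(value) for name, value in weekly_measures.items()}
print(report)                                  # the manager sees 3 states per measure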

 

Humans are not machines

Beer promoted the “participative democracy” that many social system thinkers have long treated as a vision or mission statement.

Even so, his structured approach to management still looks like the top-down command and control you might find in government-run or state-controlled industry.

Collecting big data and directing activities based on analysis of that data was the basis of Beer’s “Project Cybersyn” in Chile.

That project didn’t end well; see further reading below.

 

Social entities and social systems are not the same thing, but may be aligned

Social entities behave in ways beyond description using the terms of general system theory and cybernetics.

Collecting data and directing activities based on analysis of that data is one part of managing business.

Motivating and helping people to reach goals is another part (along with rules that discourage aberrant behaviors).

A real-world business is a mix of social entities and social systems, which may be aligned.

Read Social cells for a longer discussion.

Further reading

I have been told that Beer attempted to model parts of the Chilean economy (and his VSM) using Forrester’s System Dynamics.

It is one thing to envisage a large and complex social entity as a system, and model your theory; quite another to prove that model is right, or useful in practice.

You will surely enjoy reading this article about Stafford Beer http://www.newyorker.com/magazine/2014/10/13/planning-machine

“The Planning Machine: Project Cybersyn and the origins of the Big Data nation.” By Evgeny Morozov, a Critic at Large, October 13, 2014 Issue of the New Yorker.

 

Noting that Beer was a socialist of a kind, you might be interested also to read Marxism and System Theory.

 

Sources read in the course of writing this paper include:

·         http://ototsky.com/khipu/lib/beer_diagnozingthesystem_en.pdf

·         http://digitalcommons.colby.edu/cgi/viewcontent.cgi?article=2829&context=cq

·         http://www.hbcse.tifr.res.in/jrmcont/notespart1/node9.html (this link appears broken)

 

A list of Beer’s works, extracted from “Diagnosing the system”

·         S. Beer (1959) Cybernetics and Management. English Universities Press.

·         S. Beer (1960) Towards the cybernetic factory. In Principles of Self Organization. Symposium of 1960, Pergamon Press, Oxford.

·         S. Beer (1965) The world, the flesh and the metal. Nature 205 (No. 4968), 223-231.

·         S. Beer (1966) Decision and Control. Wiley, Chichester.

·         S. Beer (1972) Brain of the Firm. Allen Lane, Penguin, Harmondsworth.

·         S. Beer (1975) Platform for Change. Wiley, Chichester.

·         S. Beer (1979) The Heart of Enterprise. Wiley, Chichester.

·         S. Beer (1981) Brain of the Firm, 2nd edn. Wiley, Chichester.

·         S. Beer (1983) A reply to Ulrich's "Critique of pure cybernetic reason: the Chilean experience with cybernetics". J. Appl. Systems Analysis 10.

 

All free-to-read materials at http://avancier.website are paid for out of income from Avancier’s training courses and methods licences.

If you find the web site helpful, please spread the word and link to avancier.website in whichever social media you use.