Beer’s ideas – and scientism
Copyright 2017 Graham Berrisford. One of several hundred papers at http://avancier.website. Last updated 06/01/2019 11:56
General system theory incorporates classical cybernetics - the science of biological and technological control systems.
Ashby (1956) introduced cybernetics as seeing the world in terms of systems connected by information feedback loops.
In many spheres and applications, these ideas have succeeded brilliantly.
This paper discusses Beer’s use of Ashby’s cybernetic ideas – notably his “law of requisite variety”.
By the way, this is among the most popular papers on Avancier.website.
It seems to be the most popular of the c100 papers on this system theory page: https://bit.ly/2yXGImr.
I don’t understand why, since Beer’s ideas and approach were so questionable – as discussed below.
Stafford Beer (1926- 2002) was a theorist, consultant and professor at the Manchester Business School.
Beer saw Ashby as a grandfather of cybernetics.
He was a disciple of Ashby’s cybernetics for self-regulating homeostatic systems.
They knew each other, and I believe Ashby was a godfather to one of Beer’s children.
This section is edited from “Ashby’s ideas”.
W. Ross Ashby (1903-1972) was a psychologist and systems theorist.
“Despite being widely influential within cybernetics, systems theory… Ashby is not as well known as many of the notable scientists his work influenced.”
“Ashby popularised the usage of the term 'cybernetics' to refer to self-regulating systems” Wikipedia 2017
Understanding Ashby’s ideas helps you to understand much else in the field of systems thinking.
The eight ideas below are applied today in methods and modelling languages for enterprise, business and software architecture.
· Systems are abstractions
· Systems are deterministic
· Cybernetics is behavioristic
· System mutation differs from system state change
· Cybernetics is about information rather than energy flow
· The brain can be modelled as a control system
· Variety is a measure of complexity
· Variety absorbs variety - the law of requisite variety
There follow brief notes on four of these ideas.
Systems are abstractions
For some, understanding cybernetics requires making a paradigm shift as radical as is needed to understand Darwin’s evolution theory.
People point at a machine or a business in the real world (like IBM) and say "the system is that thing there".
Classical cybernetics is about systems that exhibit regular or repeatable behaviors.
The concrete thing that is IBM can be (can manifest, instantiate, realise) countless different systems.
“A system is any set of variables which he [the observer] selects”. Ashby
A concrete system contains actors and their actions on selected objects/variables.
An abstract system contains roles and rules that actors and their activities adhere to in acting on those objects/variables.
E.g. The abstract roles and rules of the stickleback mating ritual are realised by countless pairs of sticklebacks.
An abstract system hides or ignores the infinite complexity of any real world actors and activities that realise the system.
E.g. In describing and testing the mating ritual, no attention is paid to the complexity of a stickleback’s internal biochemistry.
So, to apply Ashby’s system theory is to apply the scientific method.
You observe or envisage a system in the real world – an empirical system – an instance.
You model it in an abstract system description - a theoretical system – a type.
You model the system’s state as variables whose values can be measured (e.g. the positions of the planets).
You model the behavior as processes (e.g. the orbits of the planets) that maintain or advance variable values.
The model is a type; it hides the infinite complexity of real-world actors and activities that instantiate (or realise) the model.
Ashby’s view of systems can be drawn as a triangle of three relationships:
Systems thinkers <create and use> abstract system descriptions.
Abstract system descriptions <typify and symbolise> concrete system realisations.
Systems thinkers <observe & envisage> concrete system realisations.
If and when the concrete system runs in reality, you can test that real-world entity against what the abstract system predicts.
System mutation differs from system state change
Ashby insisted we should on no account confuse two kinds of change.
· System state change: a change to the value of at least one state variable. E.g. homeostatic regulation of values to stay within a desired range.
· System mutation: a change to the type of at least one variable or behavior. E.g. re-organization, changing the variables or the rules that update their values.
For more on different kinds of system change, read System stability and change.
Variety is a measure of complexity
“A system's variety V measures the number of possible states it can exhibit, and corresponds to the number of independent binary variables.
But in general, the variables used to describe a system are neither binary nor independent.”
In short, complexity = variety = the number of possible states a system can exhibit.
There are many difficulties with this definition of complexity.
From the viewpoint of a describer, a system is only as complex as its description.
From the viewpoint of a control system, a target system is only as complex as those variables the control system monitors and controls.
Read “Complex adaptive systems” for more on that topic.
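The state-counting notion of variety can be illustrated with a short calculation. Here is a minimal sketch (the system and its variables are invented for illustration): variety is the product of each variable’s domain size, and its base-2 logarithm gives the equivalent number of independent binary variables.

```python
from math import prod, log2

def variety(domains):
    """Number of possible states = product of each variable's domain size."""
    return prod(len(d) for d in domains.values())

# A toy system described by three variables (names are illustrative)
system = {
    "valve": ["open", "closed"],        # binary
    "pump":  ["off", "low", "high"],    # not binary
    "alarm": ["quiet", "ringing"],
}

v = variety(system)       # 2 * 3 * 2 = 12 possible states
bits = log2(v)            # equivalent number of independent binary variables
print(v, round(bits, 2))  # 12 3.58
```

Note that this treats the variables as independent; as the quote above says, in general they are not, so the product is an upper bound on the states a real system can actually exhibit.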
Variety absorbs variety - the law of requisite variety
Ashby's law of requisite variety applies to how a control system controls selected or essential variables of a target system.
"The larger the variety of actions available to a control system, the larger the variety of perturbations [in values of target system variables] it is able to compensate".
The law defines the minimum number of states necessary for a control system to control a target system with a given number of states.
It is interpreted here as meaning:
· A control system’s information state models only those variables in the target system’s concrete state that are monitored and controlled.
· For a homeostatic system to be stable, the number of states of the control system must be at least equal to the number of states in the target system.
· The more ways that a homeostatic system can deviate from its ideal state, the more control actions a control system will need.
· Variety: the number of possible states a system can exhibit
· Attenuator: a device that reduces variety.
· Amplifier: a device that increases variety.
The law says having enough variety is a necessary precondition to control selected variables in a target system.
The law does not say variety is all you need
The law does not say having enough variety is a sufficient precondition to control selected variables.
The law does not mean maximising variety
The law does not say having more than enough variety is a good idea.
Ashby emphasised the need to be selective.
“we should pick out and study the facts that are relevant to some main interest that is already given.”
Which is contrary to the following “maximize internal variety” principle.
"Since the variety of perturbations a [control] system can potentially be confronted with is unlimited, we should always try maximize its internal variety (or diversity),
so as to be optimally prepared for any foreseeable or unforeseeable contingency." Principia Cybernetica.
This suggests redundant design effort and inefficient system operation, and may lead to data quality issues.
One real world entity may be subject to many control systems
Ashby’s view of a human being had something of Cartesian dualism about it.
He treated the body as the target system and the brain/central nervous system as the control system.
Today, psychobiology tends to the view that mental states and activities are bodily states.
This view, that the mind is inseparable from the body, is called “cognitive embodiment”.
And it appears we do not maintain homeostasis purely by control from the higher brain.
Rather, our many state variables are maintained by different control systems, which operate in parallel.
These control systems are distributed through the body and not in direct communication with each other.
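That picture of parallel, non-communicating controllers can be sketched in a few lines (the variables, set points and gain below are invented for illustration):

```python
# Sketch: several independent controllers, each nudging one state variable
# toward its own set point, with no communication between them.

state = {"temperature": 39.5, "glucose": 80.0, "ph": 7.6}
set_points = {"temperature": 37.0, "glucose": 90.0, "ph": 7.4}

def step(state):
    """Each controller acts in parallel on its own variable only."""
    return {
        var: value + 0.5 * (set_points[var] - value)  # proportional correction
        for var, value in state.items()
    }

for _ in range(10):
    state = step(state)

# Each variable converges on its own set point, with no central coordinator
print({k: round(v, 2) for k, v in state.items()})
```

No controller here knows about the others, yet the whole maintains homeostasis; that is the point being made about the oyster.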
The universe is a mess of more or less related systems
In nature and in business, control systems may be distributed, and act independently of each other.
An oyster manages to maintain homeostasis this way, without a central brain and nervous system.
In a large business, parallel divisions may compete for customers, or resources, or have conflicting goals.
From the perspective of two different control systems, one real world entity can be two different target systems.
Two control systems may simultaneously compete with and complement each other in controlling the state of one real world entity.
Surely, this is the very stuff of relationships between people in a social group?
The wider universe is divisible into infinite systems, separate, nested and overlapping.
Two systems may be described as:
· Control (regulatory) and target systems
· Cooperating (symbiotic) systems
· Competing systems
But which labels you apply to which system, may depend on your perspective.
E.g. doesn’t a heating system control the behavior of its thermostat?
Let us return to Stafford Beer, the disciple of Ashby’s cybernetics introduced above.
In cybernetic regulation, a control system directs a target system so as to maintain its state variables in a desired range.
A thermostat (control system) directs the actions of a heating system (target system).
In Ashby’s view, the brain (control system) directs the actions of a body (target system).
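The control/target relationship above can be sketched as a minimal thermostat loop (the temperatures and the crude room model below are invented for illustration):

```python
# Minimal sketch of cybernetic regulation: a thermostat (control system)
# switches a heater (target system) to hold temperature in a desired range.

def thermostat(temp, heater_on, low=19.0, high=21.0):
    """Return the heater command given the monitored variable."""
    if temp < low:
        return True       # too cold: switch heating on
    if temp > high:
        return False      # too warm: switch heating off
    return heater_on      # in range: leave the heater as it is

temp, heater_on = 15.0, False
for _ in range(30):
    heater_on = thermostat(temp, heater_on)
    temp += 0.8 if heater_on else -0.3   # crude model of the room

print(18.0 <= temp <= 22.0)  # True -- temperature oscillates near the 19-21 band
```

Note how little the thermostat knows: one variable, two actions. That observation matters later in the discussion of lean control systems.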
Beer took a cybernetic view of management science.
His book “Brain of the Firm” (1972) introduced the Viable System Model for business management.
The book title echoes Ashby’s “Design for a Brain” 20 years earlier.
Why did Beer think to apply cybernetics to business management? Because the analogy is so persuasive.
Bureaucracy is about information flows in management hierarchies.
Cybernetics is about information flows in control/target system hierarchies.
Ashby wrote of the brain and the body – and information feedback loops.
Beer wrote of the Brain of the Firm – and information feedback loops.
Beer saw corporations as homeostatic systems - full of feedback loops between the company and its suppliers, between workers and management.
And if we can make homeostatic corporations, why not homeostatic governments?
Beer urged politicians and economists to employ cybernetics.
On taking office in 1970, Chile’s president Salvador Allende nationalized the country’s key industries.
He promised worker participation in the planning process; Beer was hired to help.
The project’s name, Cybersyn, was short for “cybernetics synergy”.
A transient success
In 1972, the project’s equipment was used successfully in an impromptu way.
A strike by truck drivers (fearful of nationalization) threatened to paralyze Chile.
The project’s telex machines were deployed to help industries coordinate the sharing of fuel.
Today, a web site could centralise information about fuel resources.
Anybody with a fuel stock can post a message.
Anybody with a fuel need can sort the messages by stock location and stock volume.
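The sorting described above is trivial to implement. A sketch (the posts below are invented data):

```python
# Sketch of the coordination idea: posts about fuel stocks, which anyone
# with a fuel need can sort by stock location and stock volume.

posts = [
    {"location": "Santiago",   "litres": 1200},
    {"location": "Valparaiso", "litres": 300},
    {"location": "Santiago",   "litres": 450},
    {"location": "Concepcion", "litres": 800},
]

# Group by location, largest stocks first within each location
by_stock = sorted(posts, key=lambda p: (p["location"], -p["litres"]))

for p in by_stock:
    print(p["location"], p["litres"])
```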
Left to themselves, people will create and operate countless such collaboration systems.
Each is “central” to the parties it connects.
But each monitors and directs only a tiny, tiny fraction of what is happening in the world.
The vision of a homeostatic control system for business as usual
Successes claimed for the project seem to lie in the provision of central IT resources for businesses.
Factory "control rooms" sent data to the centre, which transformed the data into management information.
The aim was to help each factory set production goals, optimise resource use and make investment decisions.
In that regard, it was an ordinary IT project rather than an application of cybernetics.
Beer designed a brain and central nervous system for the Chilean economy.
There would be four levels of control - firm, branch, sector, total.
If a variable moved out of range, for an unacceptable time, the higher level would be notified.
Managers in the higher level operations room would read the data, make a plan, then cascade advice and directives using telex messages.
In this way, to a degree, the system was to monitor and direct the actions of actors in Chile’s nationalised businesses.
(Read the Wikipedia entry on Cybersyn for detail.)
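The escalation rule described above – a variable out of range for an unacceptable time triggers notification of the next level up – can be sketched as follows (the variable, thresholds and tolerance are invented for illustration, not drawn from Cybersyn’s actual design):

```python
# Sketch of Cybersyn-style exception reporting: a variable out of its
# acceptable range for too long is escalated to the next control level.

LEVELS = ["firm", "branch", "sector", "total"]

def escalate(readings, low, high, tolerance):
    """Count consecutive out-of-range readings; report when tolerance is exceeded."""
    run = 0
    for day, value in enumerate(readings):
        run = run + 1 if not (low <= value <= high) else 0
        if run > tolerance:
            return f"day {day}: notify {LEVELS[1]} level"  # one level up from 'firm'
    return "no exception: handled at firm level"

daily_output = [98, 102, 95, 60, 58, 55, 57, 101]  # units produced per day
print(escalate(daily_output, low=90, high=110, tolerance=2))
# day 5: notify branch level
```

A brief dip is absorbed locally; only a sustained deviation travels up the hierarchy – the essence of management by exception.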
In February 1973, Project Cybersyn delivered the first operational version of the system.
It is unclear to me how completely the homeostatic control system design was implemented.
If it was, then it didn’t just establish a management hierarchy, it prescribed its bureaucratic operation.
The vision of a forecasting system for central planning
There was an element of inter-business collaboration.
One Cybersyn message warned a cement factory manager of a shortage of coal at the local coal mine.
(Unfortunately, the message arrived too late, after the manager had detected the problem and resolved it locally.
Such delays undermined the enthusiasm of factory managers to supply data to the center.)
The vision went beyond enabling and coordinating regular business operations.
Beer argued that “information is a national resource” and anticipated what we might now call big data.
“At the center of Project Cybersyn was the Operations Room, where cybernetically sound decisions about the economy were to be made.
Those seated in the op room would review critical highlights—helpfully summarized with up and down arrows—from a real-time feed of factory data from around the country.” (Ref. 3)
Data gathered from operations was to be input into economic simulation software.
So government planners could forecast the possible outcomes of different economic decisions.
Data would be collected from citizens as well as from businesses.
Project Cyberfolk was to track the real-time happiness of the Chilean nation in response to decisions made in the Operations Room.
“Beer built a device that would enable the country’s citizens, from their living rooms, to move a pointer on a voltmeter-like dial that indicated moods ranging from extreme unhappiness to complete bliss.... so that the total national happiness at any moment in time could be determined... to show whether government policies were working.” (Ref. 3)
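Aggregating such dial readings is simple arithmetic. A sketch (the readings are invented, and the real Cyberfolk device was analogue hardware):

```python
# Sketch of the Cyberfolk idea: aggregate raw dial readings
# (0 = extreme unhappiness, 1 = complete bliss) into one national mood figure.

dial_readings = [0.2, 0.9, 0.5, 0.7, 0.4, 0.6]

national_mood = sum(dial_readings) / len(dial_readings)
print(round(national_mood, 2))  # 0.55
```

The single averaged number is an extreme attenuation of variety: millions of mental states reduced to one dial position – which is precisely what the critics of the scheme objected to.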
The vision was worker participation in the planning of nationalised businesses.
The aim was to distribute control; the system centralized control to a degree.
The aim was to involve workers in planning by applying cybernetics. Was the aim achieved?
Who defined the target system?
Project analysts modelled the processes workers performed and resources they used.
One of the participating engineers described the factory modelling process as “fairly technocratic” and “top down”;
it did not involve “speaking to the guy who was actually working on the mill or the spinning machine.” (Ref. 3)
Who monitored workers in the performance of defined processes?
In normal operations, factory managers did this.
Exception conditions would be reported to a higher level Ops Room.
Who planned what to do when an exception condition arose?
Those in Ops Room would decide what to do and cascade directions by Telex.
Who made plans about what to manufacture, factory expansion or closure, the national economy?
Government planners would do this based on collecting data and using forecasting software.
Inevitably, those at higher levels in a human hierarchy or organisation delegate some responsibility to those at lower levels.
And it is said that the system in Chile delegated more responsibility to factory managers than the system in the Soviet Union.
Still, both were top-down control systems.
Workers were expected to perform processes and use resources in ways that had been planned and modelled.
Exception conditions would be reported upwards; corrective directives would be cascaded downwards.
Bureaucratic feedback loops led from bottom up reports to top-down directions.
Did the application of cybernetic principles to the central control system create a participatory democracy?
“Frustrated with the growing bureaucratization of Project Cybersyn, Beer considered resigning.
“If we wanted a new system of government, then it seems that we are not going to get it,” he wrote to his Chilean colleagues that spring.
“The team is falling apart, and descending to personal recrimination.”
Confined to the language of cybernetics, Beer didn’t know what to do. (Ref. 3).
The project met its end in September 1973, when Allende was overthrown and Chilean politics swung away from central planning.
In a New Yorker article (ref. 3), Evgeny Morozov concluded the project was utopian and scientistic.
Scientism (after Hayek) means “a mechanical and uncritical application of habits of thought to fields different from those in which they have been formed.”
Presumably, Morozov was referring to the application of cybernetics to the requirement for worker participation in planning.
After Project Cybersyn, Beer retreated from the world for a while.
He polished his ideas about applying cybernetics to a business, and gathering information feedback from workers.
In “Diagnosing the system for organisations” (1985) Beer refreshed and detailed his “Viable System Model”.
He said the VSM was inspired by the structure of the human central nervous system.
Beer’s writing is not easy to follow and the VSM is complicated.
For a picture of the VSM, find “Diagnosing the system for organisations” on the internet and look at Figure 37.
Beer wrote: “There is no 'correct' interpretation of the VSM. We have spoken instead of more or less useful interpretations.”
Interpreting the VSM is a job in itself, which some management consultants enjoy doing – and find fruitful.
Here is a possible interpretation.
The five systems in Beer’s Viable System Model
· System 5: makes policy decisions to steer the whole organization and balance demands from different units.
· System 4: looks out to the environment and monitors how the organization must change to remain viable (business strategy and planning).
· System 3: establishes the rules, resources, rights and responsibilities of System 1, and interfaces with Systems 4/5.
· System 2: information systems that enable primary activities to communicate, and enable System 3 to monitor and co-ordinate those activities.
· System 1: the primary/core business activities (recursively, each is itself a viable system).
Today, many core business activities (1) are IT operations (2).
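One way to make the “recursively, each is a viable system” point concrete is a sketch of nested viable systems (the class and example names below are this paper’s illustration, not Beer’s notation):

```python
# Sketch of one reading of the VSM's recursion: each primary activity
# (System 1 unit) is itself a viable system with its own Systems 1-5.

from dataclasses import dataclass, field

@dataclass
class ViableSystem:
    name: str
    primary_activities: list = field(default_factory=list)  # System 1 units

    def depth(self):
        """How many levels of recursion the model describes."""
        if not self.primary_activities:
            return 1
        return 1 + max(unit.depth() for unit in self.primary_activities)

mill = ViableSystem("spinning mill")
factory = ViableSystem("textile factory", [mill])
sector = ViableSystem("textile sector", [factory])

print(sector.depth())  # 3
```

The recursion mirrors Cybersyn’s firm/branch/sector/total levels: the same five-system pattern was meant to repeat at every level.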
The VSM doesn’t resemble the known structure or workings of the human brain or nervous system.
Nor can it be a model of every viable system, since many viable systems have nothing like a central nervous system (e.g. the solar system, a tree, a bee hive, an oyster).
And there is scant evidence of any business operating in a way that closely matches the VSM.
The VSM is primarily a tool for diagnosing human organization design issues, and generating change proposals.
“Few organizations have adopted the VSM as their formal organizational structure.
But many consultants have used it to diagnose the way an organization is operating and where improvements are needed.” (Stuart Umpleby).
This paper is not about using the VSM.
It is about Beer’s use of four of Ashby’s ideas:
· Variety: the number of possible states a system can exhibit (one way to assess its complexity).
· The law of requisite variety: “only variety can absorb variety”.
· Attenuator: a device that reduces variety.
· Amplifier: a device that increases variety.
In a phrase of the time, the four Ms were four structural resources: Men, Materials, Machines and Money.
Ashby might have said thinking about structures is inadequate because it is behaviors we need to understand, monitor and control.
Beer said that thinking about structures is inadequate because managers need to think about managing complexity.
How to objectively assess the relative complexity of two business systems?
You might try to do this systematically, as follows.
1. Choose your measure of complexity
2. Identify the system elements to be described (roles, actors, processes, variables, whatever)
3. Describe the two real world business systems in terms of those elements
4. Demonstrate your two descriptions have been made to the same level of abstraction.
5. Demonstrate by testing that the two real world business systems behave according to your descriptions.
6. Then apply the complexity measure to each business system and compare them.
The process is impractical for all but trivial systems or simplistic complexity measures.
Today, there is no widely agreed measure of complexity.
But even back in the day, Beer knew Ashby’s measure of complexity is incalculable for any business system of interest.
So he said a subjective assessment of relative complexity is valid.
What if a control system fails to control key state variables of a target system?
What if a manager fails to control key variables of a business system, such as production rate or profit?
What if a government fails to maintain the happiness of its population? (No joke: follow the link at the end of this paper.)
Beer proposed managers should use Ashby’s ideas to:
· Amplify/increase variety in the control or management system, or
· Attenuate/reduce variety in the target or operational system.
You might consider other design options such as:
· improving the quality of information flows (their speed, throughput, integrity etc.)
· adding another control system in parallel to the existing one.
And suppose the actors in the system are self-determining – that is, they can choose their own response to a stimulus.
Then two more options could be:
· empower actors in the target system to determine their own actions, in the light of given goals
· empower actors in the target system to find and deploy their own control systems.
What if it turns out that a target system cannot or does not always respond appropriately to a control system?
The control system can detect this and pass responsibility over to some kind of exception handling processor.
I am not pretending these are original ideas; Barry Clemson tells me you can find some of them in Beer’s writings.
Nor am I claiming that this is an exhaustive list of options.
No doubt the VSM is a useful tool when interpreted by a skilled management consultant. But is it science or poetry?
And is Beer’s reliance on Ashby’s law of requisite variety scientific or scientistic?
If I recall correctly, Beer spoke of the imminent collapse of government institutions, if not western civilization,
on the grounds that governments did not have enough variety – not enough data about an economy or a society to control it.
Participation means more than supplying required information
Beer wanted to promote the “participative democracy” that many social system thinkers have long treated as a vision or mission statement.
Even so, his structured approach to management still looks like a bureaucratic monitor and control system.
Workers are expected to perform processes and use resources in the ways that have been modelled and planned.
Any significant deviations from that are reported upwards, and corrective directives are cascaded downwards.
The best control systems are lean
A lean control system is one that knows a minimal amount of what is going on in the controlled entity.
E.g. A thermostat knows nothing of a heating system bar the temperature of the environment.
Lean non-intrusive government is generally favoured over the practices of the former East German government.
And in business, management by exception is common.
Managers often ask us to minimise the variety they monitor by reporting only “traffic light” status information to them.
That idea is embedded in Beer’s idea of bottom-up reporting through multi-level control systems.
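Traffic-light reporting is variety attenuation in Ashby’s sense: many possible raw states are reduced to three per variable. A sketch (the variables and thresholds are invented for illustration):

```python
# Sketch of management by exception as variety attenuation: many raw
# readings are reduced to one "traffic light" status per variable.

def traffic_light(value, low, high):
    if not (low <= value <= high):
        return "red"                       # out of acceptable range
    margin = 0.1 * (high - low)
    if value < low + margin or value > high - margin:
        return "amber"                     # in range, but near a boundary
    return "green"

raw = {"output": 60, "defects": 3, "stock": 140}   # many possible states
limits = {"output": (90, 110), "defects": (0, 5), "stock": (100, 200)}

report = {var: traffic_light(raw[var], *limits[var]) for var in raw}
print(report)  # {'output': 'red', 'defects': 'green', 'stock': 'green'}
```

The manager sees three coloured lights instead of the full state space; only the red one demands attention.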
Data quality issues
Remember the dubious “maximize internal variety” principle?
Beer favored maximising the internal variety of the control system, in case that information might be useful.
Consider for example the information gathering practices of the former East German government.
Or the business practice of dumping data in a “data warehouse” or collecting lots of “big data”.
There is considerable experience that collecting more data than is needed leads to data quality problems.
When people come to use the data, they find it is out of date, not quite what they want, or inaccurate.
As the saying goes: garbage in, garbage out.
Social entities and social systems are not the same thing
There is a fundamental difference between:
· A social system in which an identifiable group of actors perform repeatable activities
· A social entity in which an identifiable group of actors perform whatever activities they choose.
Social entities behave in ways beyond what can be described using general system theory and cybernetics.
Collecting data and directing activities based on analysis of that data is one part of managing business.
Motivating and helping people to reach goals is another part (along with rules that discourage aberrant behaviors).
A real-world business is a mix of social entities and social systems.
Read Social cells for a longer discussion.
All free-to-read materials at http://avancier.website are paid for out of income from Avancier’s training courses and methods licences.
If you find the web site helpful, please spread the word and link to avancier.website in whichever social media you use.
Ref 1: “Diagnosing the system for organisations” http://ototsky.com/khipu/lib/beer_diagnozingthesystem_en.pdf
This provides a list of Beer’s works (1959 to 1983)
This is another source read in the course of writing this paper
Ref 3: Evgeny Morozov, in a New Yorker article http://www.newyorker.com/magazine/2014/10/13/planning-machine
This is an enjoyable article about Project Cybersyn.
And just in case the link to the New Yorker breaks one day, here is a copy.
The Planning Machine: Project Cybersyn and the origins of the Big Data nation
By Evgeny Morozov © 2018 Condé Nast. All rights reserved.
In Allende’s Chile, a futuristic op room was to bring socialism into the computer age.
In June, 1972, Ángel Parra, Chile’s leading folksinger, wrote a song titled “Litany for a Computer and a Baby About to Be Born.” Computers are like children, he sang, and Chilean bureaucrats must not abandon them. The song was prompted by a visit to Santiago from a British consultant who, with his ample beard and burly physique, reminded Parra of Santa Claus—a Santa bearing a “hidden gift, cybernetics.”
The consultant, Stafford Beer, had been brought in by Chile’s top planners to help guide the country down what Salvador Allende, its democratically elected Marxist leader, was calling “the Chilean road to socialism.” Beer was a leading theorist of cybernetics—a discipline born of midcentury efforts to understand the role of communication in controlling social, biological, and technical systems. Chile’s government had a lot to control: Allende, who took office in November of 1970, had swiftly nationalized the country’s key industries, and he promised “worker participation” in the planning process. Beer’s mission was to deliver a hypermodern information system that would make this possible, and so bring socialism into the computer age. The system he devised had a gleaming, sci-fi name: Project Cybersyn.
Beer was an unlikely savior for socialism. He had served as an executive with United Steel and worked as a development director for the International Publishing Corporation (then one of the largest media companies in the world), and he ran a lucrative consulting practice. He had a lavish life style, complete with a Rolls-Royce and a grand house in Surrey, which was fitted out with a remote-controlled waterfall in the dining room and a glass mosaic with a pattern based on the Fibonacci series. To convince workers that cybernetics in the service of the command economy could offer the best of socialism, a certain amount of reassurance was in order. In addition to folk music, there were plans for cybernetic-themed murals in the factories, and for instructional cartoons and movies. Mistrust remained. “Chile Run by Computer,” a January, 1973, headline in the Observer announced, shaping the reception of Beer’s plan in Britain.
At the center of Project Cybersyn (for “cybernetics synergy”) was the Operations Room, where cybernetically sound decisions about the economy were to be made. Those seated in the op room would review critical highlights—helpfully summarized with up and down arrows—from a real-time feed of factory data from around the country. The prototype op room was built in downtown Santiago, in the interior courtyard of a building occupied by the national telecom company. It was a hexagonal space, thirty-three feet in diameter, accommodating seven white fibreglass swivel chairs with orange cushions and, on the walls, futuristic screens. Tables and paper were banned. Beer was building the future, and it had to look like the future.
That was a challenge: the Chilean government was running low on cash and supplies; the United States, dismayed by Allende’s nationalization campaign, was doing its best to cut Chile off. And so a certain amount of improvisation was necessary. Four screens could show hundreds of pictures and figures at the touch of a button, delivering historical and statistical information about production—the Datafeed—but the screen displays had to be drawn (and redrawn) by hand, a job performed by four young female graphic designers. Given Beer’s plans to build an entire “factory to turn out operations rooms”—every state-run industrial concern was to have one—Project Cybersyn could at least provide graphic designers with full employment.
Beer, who was fond of cigars and whiskey, made sure that an ashtray and a small holder for a glass were built into one of the armrests for each chair. (Sometimes, it seemed, the task of managing the economy went better with a buzz on.) The other armrest featured rows of buttons for navigating the screens. In addition to the Datafeed, there was a screen that simulated the future state of the Chilean economy under various conditions. Before you set prices, established production quotas, or shifted petroleum allocations, you could see how your decision would play out.
One wall was reserved for Project Cyberfolk, an ambitious effort to track the real-time happiness of the entire Chilean nation in response to decisions made in the op room. Beer built a device that would enable the country’s citizens, from their living rooms, to move a pointer on a voltmeter-like dial that indicated moods ranging from extreme unhappiness to complete bliss. The plan was to connect these devices to a network—it would ride on the existing TV networks—so that the total national happiness at any moment in time could be determined. The algedonic meter, as the device was called (from the Greek algos, “pain,” and hedone, “pleasure”), would measure only raw pleasure-or-pain reactions to show whether government policies were working.
Project Cybersyn can also be viewed as a dispatch from the future. These days, business publications and technology conferences endlessly celebrate real-time dynamic planning, the widespread deployment of tiny but powerful sensors, and, above all, Big Data—an infinitely elastic concept that, according to some inexorable but yet unnamed law of technological progress, packs twice as much ambiguity in the same two words as it did the year before. In many respects, Beer’s cybernetic dream has finally come true: the virtue of collecting and analyzing information in real time is an article of faith shared by corporations and governments alike.
Beer was invited to Chile by a twenty-eight-year-old technocrat named Fernando Flores, whom Allende had appointed to the state development agency. The agency, a stronghold of Chilean technocracy, was given the task of administering the newly nationalized enterprises. Flores was undeterred by Beer’s lack of socialist credentials. He saw that there was a larger intellectual affinity between socialism and cybernetics; in fact, both East Germany and the Soviet Union considered, though never actually built, projects similar to Cybersyn.
As Eden Medina shows in “Cybernetic Revolutionaries,” her entertaining history of Project Cybersyn, Beer set out to solve an acute dilemma that Allende faced. How was he to nationalize hundreds of companies, reorient their production toward social needs, and replace the price system with central planning, all while fostering the worker participation that he had promised? Beer realized that the planning problems of business managers—how much inventory to hold, what production targets to adopt, how to redeploy idle equipment—were similar to those of central planners. Computers that merely enabled factory automation were of little use; what Beer called the “cussedness of things” required human involvement. It’s here that computers could help—flagging problems in need of immediate attention, say, or helping to simulate the long-term consequences of each decision. By analyzing troves of enterprise data, computers could warn managers of any “incipient instability.” In short, management cybernetics would allow for the reëngineering of socialism—the command-line economy.
To take advantage of automated computer analysis, managers would need to get a clear view of daily life inside their own firm. First, they would have to locate critical bottlenecks. They needed to know that if trucks arrived late at Plant A, then Plant B wouldn’t finish the product by its deadline. Why would the trucks be late? Well, the drivers might be on strike, or lousy weather might have closed the roads. Workers, not managers, would have the most intimate knowledge of these things.
When Beer was a steel-industry executive, he would assemble experts—anthropologists, biologists, logicians—and dispatch them to extract such tacit knowledge from the shop floor. The goal was to produce a list of relevant indicators (like total gasoline reserves or delivery delays) that could be monitored so that managers would be able to head off problems early. In Chile, Beer intended to replicate the modelling process: officials would draw up the list of key production indicators after consulting with workers and managers. “The on-line control computer ought to be sensorily coupled to events in real time,” Beer argued in a 1964 lecture that presaged the arrival of smart, net-connected devices—the so-called Internet of Things. Given early notice, the workers could probably solve most of their own problems. Everyone would gain from computers: workers would enjoy more autonomy while managers would find the time for long-term planning. For Allende, this was good socialism. For Beer, this was good cybernetics.
Cybernetics was born in the mid-nineteen-forties, as scholars in various disciplines began noticing that social, natural, and mechanical systems exhibit similar patterns of self-regulation. Norbert Wiener’s classic “Cybernetics; or, Control and Communication in the Animal and the Machine” (1948) discussed human behavior by drawing on his close observation of technologies like the radar and the thermostat. The latter is remarkable for how little it needs to know in order to do its job. It doesn’t care whether what’s making the room so hot is your brand-new plasma TV or the weather outside. It just needs to compare its actual output (the temperature right now) with its predefined output (the desired temperature) and readjust its input (whatever mechanism is producing heat or cold).
Wiener held that a patient suffering from purpose tremor—spilling a glass of water before raising it to his lips—was akin to a malfunctioning thermostat. Both rely on “negative feedback”—“negative” because it tends to oppose what the system is doing. In a way, our bodies are feedback machines: we maintain our body temperature without a specially programmed response for “condition: bathhouse” or “condition: tundra.” The tendency to self-adjust is known as homeostasis, and it’s ubiquitous in both the natural and the mechanical worlds. For Beer, in fact, corporations are homeostats. They have a clear goal—survival—and are full of feedback loops: between the company and its suppliers or between workers and management. And if we can make homeostatic corporations, why not homeostatic governments?
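Wiener’s thermostat comparison can be sketched as a minimal negative-feedback loop. (This is a toy illustration; the gain constant and the linear temperature model are assumptions of the sketch, not anything Wiener or Beer specified.)

```python
def thermostat_step(current_temp, desired_temp, gain=0.5):
    """One negative-feedback step: the correction opposes the error."""
    error = current_temp - desired_temp   # actual output vs. predefined output
    correction = -gain * error            # "negative" feedback: oppose the deviation
    return current_temp + correction

# The room converges on the setpoint regardless of what caused the deviation.
temp = 30.0   # a hot room -- plasma TV or weather, the thermostat doesn't care
for _ in range(10):
    temp = thermostat_step(temp, desired_temp=20.0)
print(round(temp, 2))  # homes in on 20.0
```

The point of the example is Wiener’s: the controller needs no causal model of the disturbance, only the gap between actual and desired output.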
Yet central planning had been powerfully criticized for being unresponsive to shifting realities, notably by the free-market champion Friedrich Hayek. The efforts of socialist planners, he argued, were bound to fail, because they could not do what the free market’s price system could: aggregate the poorly codified knowledge that implicitly guides the behavior of market participants. Beer and Hayek knew each other; as Beer noted in his diary, Hayek even complimented him on his vision for the cybernetic factory, after Beer presented it at a 1960 conference in Illinois. (Hayek, too, ended up in Chile, advising Augusto Pinochet.) But they never agreed about planning. Beer believed that technology could help integrate workers’ informal knowledge into the national planning process while lessening information overload.
Project Cybersyn, to be sure, lacked the gizmos available to contemporary organizations. When Beer landed in Santiago, he had access only to two mainframe computers, which the government badly needed for other tasks. Beer chose the “cloud” model: one central computer, analyzing reports sent by telex machines installed at state-run factories, could inform the firm of emerging problems and, if nothing was done, alert agency officials.
But computer analysis of factories was only as good as the underlying formal model of how they actually work. Hermann Schwember, a senior member of Cybersyn, described the process in a 1977 essay. The modelling team dispatched to a canning plant, for example, would start with a list of technical questions. What supplies—tin cans, sugar, fruit—were critical to its over-all activity? Were there statistics—say, the amount of peeled fruit, the number of cans in the factory line—that offered an accurate snapshot of the state of production? Were there any machines that might automatically provide the indicators sought by the team (the counter of the sealing unit, perhaps)? The answers would yield a flowchart that started with suppliers and ended with customers.
Suppose that the state planners wanted the plant to expand its cooking capacity by twenty per cent. The modelling would determine whether the target was plausible. Say the existing boiler was used at ninety per cent of capacity, and increasing the amount of canned fruit would mean exceeding that capacity by fifty per cent. With these figures, you could generate a statistical profile for the boiler you’d need. Unrealistic production goals, overused resources, and unwise investment decisions could be dealt with quickly. “It is perfectly possible . . . to capture data at source in real time, and to process them instantly,” Beer later noted. “But we do not have the machinery for such instant data capture, nor do we have the sophisticated computer programs that would know what to do with such a plethora of information if we had it.”
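The feasibility check Schwember describes amounts to simple arithmetic. (The assumption below, that boiler load scales linearly with output, is mine, made purely for illustration; it is not a claim about the actual Cybersyn model.)

```python
def required_boiler_capacity(current_utilization, output_increase):
    """Projected boiler utilization if load scales linearly with output."""
    return current_utilization * (1.0 + output_increase)

# A boiler already at 90% of capacity cannot absorb a 20% output increase:
projected = required_boiler_capacity(0.90, 0.20)
print(f"projected load: {projected:.0%}")  # 108% -- a bigger boiler is needed
```

This is the kind of back-of-the-envelope check that lets planners flag unrealistic production goals before committing to them.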
Today, sensor-equipped boilers and tin cans report their data automatically, and in real time. And, just as Beer thought, data about our past behaviors can yield useful predictions. Amazon recently obtained a patent for “anticipatory shipping”—a technology for shipping products before orders have even been placed. Walmart has long known that sales of strawberry Pop-Tarts tend to skyrocket before hurricanes; in the spirit of computer-aided homeostasis, the company knows that it’s better to restock its shelves than to ask why.
Governments, with oceans of information at their disposal, are following suit. That’s evident from an essay on the “data-driven city,” by Michael Flowers, the former chief analytics officer of New York City, which appears in “Beyond Transparency: Open Data and the Future of Civic Innovation,” a recent collection of essays (published, tellingly, by the Code for America Press), edited by Brett Goldstein with Lauren Dyson. Flowers suggests that real-time data analysis is allowing city agencies to operate in a cybernetic manner. Consider the allocation of building inspectors in a city like New York. If the city authorities know which buildings have caught fire in the past and if they have a deep profile for each such building—if, for example, they know that such buildings usually feature illegal conversions, and their owners are behind on paying property taxes or have a history of mortgage foreclosures—they can predict which buildings are likely to catch fire in the future and decide where inspectors should go first. The appeal of this approach to bureaucrats is fairly obvious: like Beer’s central planners, they can be effective while remaining ignorant of the causal mechanisms at play. “I am not interested in causation except as it speaks to action,” Flowers told Kenneth Cukier and Viktor Mayer-Schönberger, the authors of “Big Data” (Houghton Mifflin), another recent book on the subject. “Causation is for other people, and frankly it is very dicey when you start talking about causation. . . . You know, we have real problems to solve.”
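The inspector-allocation approach Flowers describes boils down to ranking buildings by a risk score built from correlated features. (The features and weights below are illustrative inventions, not New York City’s actual model.)

```python
# Toy risk score: weight correlates of past fires, then rank buildings.
WEIGHTS = {"illegal_conversion": 3.0, "tax_arrears": 2.0, "foreclosure_history": 1.5}

def risk_score(building):
    """Sum the weights of whichever risk features a building exhibits."""
    return sum(w for feature, w in WEIGHTS.items() if building.get(feature))

buildings = [
    {"id": "A", "illegal_conversion": True, "tax_arrears": True},
    {"id": "B", "foreclosure_history": True},
    {"id": "C"},
]
# Inspectors visit the highest-scoring buildings first -- no causal model required.
queue = sorted(buildings, key=risk_score, reverse=True)
print([b["id"] for b in queue])  # ['A', 'B', 'C']
```

As Flowers says, correlation is enough to act on; the scheme never asks why illegal conversions and fires go together.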
In another contribution to “Beyond Transparency,” the technology publisher and investor Tim O’Reilly, one of Silicon Valley’s in-house intellectuals, celebrates a new mode of governance that he calls “algorithmic regulation.” The aim is to replace rigid rules issued by out-of-touch politicians with fluid and personalized feedback loops generated by gadget-wielding customers. Reputation becomes the new regulation: why pass laws banning taxi-drivers from dumping sandwich wrappers on the back seat if the market can quickly punish such behavior with a one-star rating? It’s a far cry from Beer’s socialist utopia, but it relies on the same cybernetic principle: collect as much relevant data from as many sources as possible, analyze them in real time, and make an optimal decision based on the current circumstances rather than on some idealized projection. All that’s needed is a set of fibreglass swivel chairs.
Chilean politics, as it happened, was anything but homeostatic. Cybernetic synergy was a safe subject for the relatively calm first year of Allende’s rule: the economy was growing, social programs were expanding, real wages were improving. But the calm didn’t last. Allende, frustrated by the intransigence of his parliamentary opposition, began to rule by executive decree, prompting the opposition to question the constitutionality of his actions. Workers, too, began to cause trouble, demanding wage increases that the government couldn’t deliver. Washington, concerned that the Chilean road to socialism might have already been found, was also meddling in the country’s politics, trying to thwart some of the announced reforms.
In October, 1972, a nationwide strike by truck drivers, who were fearful of nationalization, threatened to paralyze the country. Fernando Flores had the idea of deploying Cybersyn’s telex machines to outmaneuver the strikers, encouraging industries to coördinate the sharing of fuel. Most workers declined to back the strike and sided with Allende, who also invited the military to join the cabinet. Flores was appointed Minister of Economics, the strike petered out, and it seemed that Project Cybersyn would win the day.
On December 30, 1972, Allende visited the Operations Room, sat in one of the swivel chairs, and pushed a button or two. It was hot, and the buttons didn’t show the right slides. Undaunted, the President told the team to keep working. And they did, readying the system for its official launch, in February, 1973. By then, however, long-term planning was becoming something of a luxury. One of Cybersyn’s directors remarked at the time that “every day more people wanted to work on the project,” but, for all this manpower, the system still failed to work in a timely manner. In one instance, a cement-factory manager discovered that an impending coal shortage might halt production at his enterprise, so he travelled to the coal mine to solve the problem in person. Several days later, a notice from Project Cybersyn arrived to warn him of a potential coal shortage—a problem that he had already tackled. With such delays, factories didn’t have much incentive to report their data.
One of the participating engineers described the factory modelling process as “fairly technocratic” and “top down”—it did not involve “speaking to the guy who was actually working on the mill or the spinning machine.” Frustrated with the growing bureaucratization of Project Cybersyn, Beer considered resigning. “If we wanted a new system of government, then it seems that we are not going to get it,” he wrote to his Chilean colleagues that spring. “The team is falling apart, and descending to personal recrimination.” Confined to the language of cybernetics, Beer didn’t know what to do. “I can see no way of practical change that does not very quickly damage the Chilean bureaucracy beyond repair,” he wrote.
It was Allende’s regime itself that was soon damaged beyond repair. Pinochet had no need for real-time centralized planning; the market was to replace it. When Allende’s regime was overthrown, on September 11, 1973, Project Cybersyn met its end as well. Beer happened to be out of the country, but others weren’t so lucky. Allende ended up dead, Flores in prison, other Cybersyn managers in hiding. The Operations Room didn’t survive, either. In a fit of what we might now call PowerPoint rage, a member of the Chilean military stabbed its slides with a knife.
Today, one is as likely to hear about Project Cybersyn’s aesthetics as about its politics. The resemblance that the Operations Room—with its all-white, utilitarian surfaces and oversized buttons—bears to the Apple aesthetic is not entirely accidental. The room was designed by Gui Bonsiepe, an innovative designer who studied and taught at the famed Ulm School of Design, in Germany; the industrial design associated with the Ulm School inspired Steve Jobs and the Apple designer Jonathan Ive.
But Cybersyn anticipated more than tech’s form factors. It’s suggestive that Nest—the much admired smart thermostat, which senses whether you’re home and lets you adjust temperatures remotely—now belongs to Google, not Apple. Created by engineers who once worked on the iPod, it has a slick design, but most of its functionality (like its ability to learn and adjust to your favorite temperature by observing your behavior) comes from analyzing data, Google’s bread and butter. The proliferation of sensors with Internet connectivity provides a homeostatic solution to countless predicaments. Google Now, the popular smartphone app, can perpetually monitor us and (like Big Mother, rather than like Big Brother) nudge us to do the right thing—exercise, say, or take the umbrella.
Companies like Uber, meanwhile, insure that the market reaches a homeostatic equilibrium by monitoring supply and demand for transportation. Google recently acquired the manufacturer of a high-tech spoon—the rare gadget that is both smart and useful—to compensate for the purpose tremors that captivated Norbert Wiener. (There is also a smart fork that vibrates when you are eating too fast; “smart” is no guarantee against “dumb.”) The ubiquity of sensors in our cities can shift behavior: a new smart parking system in Madrid charges different rates depending on the year and the make of the car, punishing drivers of old, pollution-prone models. Helsinki’s transportation board has released an Uber-like app, which, instead of dispatching an individual car, coördinates multiple requests for nearby destinations, pools passengers, and allows them to share a much cheaper ride on a minibus.
Such experiments, however, would be impossible without access to the underlying data, and companies like Uber typically want to grab and hold as much data as they can. When, in 1975, Beer argued that “information is a national resource,” he was ahead of his time in treating the question of ownership—just who gets to own the means of data production, not to mention the data?—as a political issue that cannot be reduced to its technological dimensions.
Uber says that it can monitor its supply-and-demand curves in real time. Instead of sticking to fixed rates for car rides, it can charge a floating rate depending on market conditions when an order is placed. As Uber’s C.E.O. told Wired last December, “We are not setting the price. The market is setting the price. We have algorithms to determine what that market is.” It’s a marvellous case study in Cybersyn capitalism. And it explains why Uber’s prices tend to skyrocket in inclement weather. (The company recently agreed to cap these hikes in American cities during emergencies.) Uber maintains that surge pricing allows it to get more drivers onto the road in dismal weather conditions. This claim would be stronger if there were a way to confirm its truth by reviewing the data. But at Uber, as at so many tech companies, what happens in the op room stays in the op room.
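A floating rate of the kind Uber’s C.E.O. describes can be sketched as a demand-to-supply ratio with an optional emergency cap. (Uber’s actual algorithm is proprietary; this rule and its parameters are purely illustrative.)

```python
def surge_multiplier(demand, supply, cap=None):
    """Toy floating-rate rule: price rises with the demand/supply ratio."""
    multiplier = max(1.0, demand / max(supply, 1))  # never below the base rate
    return min(multiplier, cap) if cap is not None else multiplier

print(surge_multiplier(300, 100))           # 3.0 when demand outstrips supply
print(surge_multiplier(300, 100, cap=2.0))  # 2.0 under an emergency cap
```

Even this toy version shows why prices skyrocket in bad weather, and why a regulator’s cap is a one-line change to the op room’s code rather than a change to the market’s logic.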
Stafford Beer was deeply shaken by the 1973 coup, and dedicated his immediate post-Cybersyn life to helping his exiled Chilean colleagues. He separated from his wife, sold the fancy house in Surrey, and retired to a secluded cottage in rural Wales, with no running water and, for a long time, no phone line. He let his once carefully trimmed beard grow to Tolstoyan proportions. A Chilean scientist later claimed that Beer came to Chile a businessman and left a hippie. He gained a passionate following in some surprising circles. In November, 1975, Brian Eno struck up a correspondence with him. Eno got Beer’s books into the hands of his fellow-musicians David Byrne and David Bowie; Bowie put Beer’s “Brain of the Firm” on a list of his favorite books.
Isolated in his cottage, Beer did yoga, painted, wrote poetry, and, occasionally, consulted for clients like Warburtons, a popular British bakery. Management cybernetics flourished nonetheless: Malik, a respected consulting firm in Switzerland, has been applying Beer’s ideas for decades. In his later years, Beer tried to re-create Cybersyn in other countries—Uruguay, Venezuela, Canada—but was invariably foiled by local bureaucrats. In 1980, he wrote to Robert Mugabe, of Zimbabwe, to gauge his interest in creating “a national information network (operating with decentralized nodes using cheap microcomputers) to make the country more governable in every modality.” Mugabe, apparently, had no use for algedonic meters.
Fernando Flores moved in the opposite direction. In 1976, an Amnesty International campaign secured his release from prison, and he ended up in California, at Berkeley, studying the ideas of Martin Heidegger and J. L. Austin and writing a doctoral thesis on business communications in the office of the future. In California, Flores reinvented himself as a business consultant and a technology entrepreneur. (In the early nineteen-eighties, Werner Erhard, the founder of est, was among his backers.) Flores reëntered Chilean politics and was elected a senator in 2001. Toying with the idea of running for President, he eventually launched his own party and found common ground with the right.
Before designing Project Cybersyn, Beer used to complain that technology “seems to be leading humanity by the nose.” After his experience in Chile, he decided that something else was to blame. If Silicon Valley, rather than Santiago, has proved to be the capital of management cybernetics, Beer wasn’t wrong to think that Big Data and distributed sensors could be enlisted for a very different social mission. While cybernetic feedback loops do allow us to use scarce resources more effectively, the easy availability of fancy thermostats shouldn’t prevent us from asking if the walls of our houses are too flimsy or if the windows are broken. A bit of causal thinking can go a long way. For all its utopianism and scientism, its algedonic meters and hand-drawn graphs, Project Cybersyn got some aspects of its politics right: it started with the needs of the citizens and went from there. The problem with today’s digital utopianism is that it typically starts with a PowerPoint slide in a venture capitalist’s pitch deck. As citizens in an era of Datafeed, we still haven’t figured out how to manage our way to happiness. But there’s a lot of money to be made in selling us the dials.