Classic activity system thinking

Copyright 2017 Graham Berrisford. A chapter in “the book”. Last updated 25/08/2021 14:44


Reading online? If your screen is wide, shrink its width for readability.


If you want to get to grips with systems thinking, you’ll need to understand the core ideas that have emerged over the last century, not merely hear the words and interpret them as you like. This chapter is one of two that take you on a canter through a history of systems thinkers and their ideas. It introduces some classic systems thinking schools and outlines their ideas about modelling activity systems.


Introduction

Cybernetics

System dynamics

Soft systems thinking

General system theory

Conclusions and remarks



Introduction

Some ideas that prefigure modern systems thinking emerged centuries ago. Notable authors have included:

·       Isaac Newton (1642-1726) who described the world as a system of objects that interact by forces, according to the laws of motion

·       Adam Smith (1723-1790) who wrote on the specialization of businesses, and on cooperation and competition between them (cf. autonomous agents)

·       Charles Darwin (1809-1882) who wrote on the evolution of a species by reproduction with modification

·       Claude Bernard (1813-1878) who wrote on homeostatic feedback loops; and

·       Vilfredo Pareto (1848-1923) who is famous for the Pareto principle.


If the term system is to be useful, more than a vacuous noise word, more than a synonym for entity, then it must have some more particular meaning. The general concept of a system became a focus of attention after the second world war. The list below distils three views of a system, which came to prominence from the 1950s to the 1970s.


·       In Ashby's cybernetics, a system is a set of state variables, governed by rules that determine how those variables change over time.

·       In Forrester's system dynamics, a system is a set of stocks (quantitative variables) that each grow and shrink, and cause related stocks to grow and shrink.

·       In Checkland's soft systems methodology, a system is a set of interrelated business activities, performed by actors, that transform inputs into outputs of value to customers.


Every activity system in the real world is “soft” in the sense that its boundary is determined by its observer/describer/designer. In operation, it is dynamic in the sense that it changes from state to state and/or produces outputs from inputs in a way that is regular or repeatable enough to be modelled by an observer/describer/designer.


·       In Boulding's discussion of how general system theory might be applied to management science, a system is a population of individuals (actors) with their own private memories, who communicate using messages.


This chapter goes on to outline key ideas in the four approaches mentioned above.


Cybernetics

Cybernetics emerged out of efforts to understand the role of information in controlling the state of a mechanical or biological system. It is largely about the storage and transmission of information, in memories and messages, to describe and direct the state of things.


Cybernetics is independent of any physical form or medium for a memory or a message. Actors can encode information or meaning in any structure of matter or energy that they can later decode. Information can be transmitted using smells, sounds, gestures, words, pictures, dials, lines drawn in the sand, electrical pulses or radio waves.


Well-known cyberneticians include:

·       Norbert Wiener (1894-1964) who defined cybernetics as "the science of control and communication, in the animal and the machine"

·       W. Ross Ashby (1903-1972) known for his law of requisite variety

·       Alan Turing (1912-1954) known for finite state machines and artificial intelligence.


Cybernetics was discussed and promoted by two influential groups.

·       1941 to 1960: The Macy Conferences - cross-disciplinary meetings in New York, with a mandate to aid medical research. Topics included connective tissues, metabolism, the blood, the liver and renal function. Also, infancy, childhood, aging, nerve impulses, and consciousness.

·       1949 to 1958: The Ratio Club - a cross-disciplinary group in the UK, founded by the neurologist John Bates to discuss cybernetics. Members included psychologists, neurobiologists, engineers, physicists, and mathematicians. Many went on to become prominent scientists.


Though he is not as well-known as many of the scientists his work influenced, a prominent member of the Ratio Club was the psychologist W. Ross Ashby, who generalized cybernetics, and defined a system in two ways.


System as a set of state variables

Ashby defined a system as a set of state variables selected from observation of some real-world entity, machine or situation. He represented the state of a physical entity by the values of state variables, as, for example, we represent the state of a tennis match by game, set and match scores, or may represent the state of a predator-prey system by predator and prey population numbers.


Ashby didn’t exclude the appearance in messages of qualitative information (as a location might be named in his gale warning broadcast). However, the focus of cybernetics is mostly on quantitative variables, like the levels of a stock, population or resource. Given that a quantity changes over time, the variable’s line of behavior may be shown on a graph of quantity against time.


Core concept: Line of behavior: the trajectory of how a state variable’s value changes over time. The shape of the line is an inexorable result of the system following its rules. It may hover around a homeostatic norm, or dwell for a while in one stable state (an attractor), then move to another stable state, or zoom up or down, or change in some other non-linear way as shown in figure 7/1/1 in Ashby’s “Design for a Brain” (1954).

[figure not reproduced here]



The progress of a system with two or three variables can be represented on a graph as a two or three-dimensional shape.
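Ashby’s idea of a system as state variables governed by a rule can be sketched in code. The two-variable rule below is invented for illustration (it is not one of Ashby’s own examples); iterating it traces a line of behavior that settles on an attractor.

```python
# A system, in Ashby's sense: a set of state variables plus a rule
# (a transformation) that determines how the state changes over time.

def transform(state):
    """Hypothetical rule: each variable's next value depends on both."""
    x, y = state
    return (0.9 * x + 0.1 * y,   # x drifts toward y
            0.1 * x + 0.9 * y)   # y drifts toward x

def line_of_behavior(state, steps):
    """Trace the trajectory of the state variables over time."""
    trajectory = [state]
    for _ in range(steps):
        state = transform(state)
        trajectory.append(state)
    return trajectory

# Starting from (10, 0), both variables converge on a stable state
# (an attractor) at (5, 5) -- one of the line shapes Ashby describes.
for t, (x, y) in enumerate(line_of_behavior((10.0, 0.0), 50)):
    if t % 10 == 0:
        print(f"t={t:3d}  x={x:6.3f}  y={y:6.3f}")
```

Plotting x and y against t would show two lines of behavior converging on the attractor; plotting x against y would show the two-dimensional trajectory mentioned above.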


System as a set of rule-bound activities

Ashby defined systems by the way they behave over time, as a set of regular or repeatable state changes, rather than by what they are made of. This shift in perspective, from the physical structure of a system to its logical behavior, is central to the cybernetic view of the world. When a cybernetician calls a system complex, the reference is to its lines of behavior rather than its material structure.


Ashby was particularly interested in machines and organisms that change from state to state under their own internal drive, without input. By contrast, enterprise architects are mostly concerned with systems in which actors are prompted to act when inputs arrive, or internal states reach a threshold value. Activities advance the internal state of the business and/or produce outputs that advance the state of external actors. Despite their different interests, there is a considerable overlap of ideas, explored in other chapters.


The good regulator

The “good regulator” theorem, conceived by Roger C. Conant and W. Ross Ashby, is central to cybernetics. In short, it states that every good regulator of a system must have a model of that system.


From the paper’s abstract: “The design of a complex regulator often includes the making of a model of the system to be regulated. The making of such a model has hitherto been regarded as optional, as merely one of many possible ways. In this paper a theorem is presented which shows, under very broad conditions, that any regulator that is maximally both successful and simple must be isomorphic with the system being regulated. Making a model is thus necessary.”


A regulator can be any animal, machine or business that has a model, or has access to a model, of what it needs to monitor and control. So, read this triangle from left to right: regulators <have and use> models, which <represent> targets.


The good regulator

                     Models
      <have and use>       <represent>
Regulators  <monitor and regulate>  Targets


Ashby wrote that a regulator’s model of a target must be isomorphic with the target that is regulated, meaning, the elements and relationships in the model must be correlatable with elements and relationships in the reality. This isomorphism is logical rather than physical.


The abstract continues: “The theorem has the interesting corollary that the living brain, so far as it is to be successful and efficient as a regulator for survival, must proceed, in learning, by the formation of a model (or models) of its environment.”


Evidently, to function and respond to changes, an animal must “know” what is going on in its world. It needs a model of the entities and events in its environment if it is to find food and mates, and avoid enemies. The richer the model, the more adaptive the animal can be to changes in its environment. Similarly, a business needs to know the state of things it seeks to monitor or direct.


The question is not whether an animal or a business has a model; it is how complete and accurate that model is. The answers might be both “very incomplete and somewhat inaccurate” and “remarkably complete and accurate enough”. Thinking about these answers leads inexorably to the view of description and reality that is outlined in the second half of this book.


Feedback and homeostasis

"Self-regulating mechanisms have existed since antiquity, and the idea of feedback had started to enter economic theory in Britain by the 18th century, but it did not have a name.... In 1868, James Clerk Maxwell wrote a famous paper, "On governors", that is widely considered a classic in feedback control theory. This was a landmark paper on control theory and the mathematics of feedback."  (Wikipedia)


The biologist Claude Bernard (1813-1878) wrote on homeostatic feedback loops that maintain the essential variables of the body within critical limits. Later Walter Cannon wrote “Physiological Regulation of Normal States” (1926).


If a regulator is to monitor and direct the state of a target then it must know or remember the state of that target, and update that knowledge. Cybernetics explains how a regulator does this by means of a feedback loop that connects the regulator and target subsystems such that the output from one is input to the other.



Feedback loop

Regulator                                     Target
    ← state information (too cold or hot)
    → switch on or off



A controller/regulator typically contains three kinds of component:

·       receptors that sense changes in the state of a target

·       a control center that determines the regulator’s responses to those changes

·       effectors that act to change the state of a target.
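The three components can be sketched as a minimal thermostat loop. This is a toy model: the set point, temperature steps and function names are invented for illustration, and the control center is reduced to a simple threshold rule.

```python
# Minimal regulator-and-target feedback loop: a thermostat (regulator)
# keeps a room (target) near a set point.

def sense(room_temp):                      # receptor: senses target state
    return room_temp

def decide(temp, set_point=20.0):          # control center: determines response
    if temp < set_point - 1.0:
        return "heat on"
    if temp > set_point + 1.0:
        return "heat off"
    return "no change"

def act(room_temp, decision, heating):     # effector: changes target state
    if decision == "heat on":
        heating = True
    elif decision == "heat off":
        heating = False
    # the room warms when heated, otherwise cools toward a colder outdoors
    room_temp += 0.5 if heating else -0.3
    return room_temp, heating

temp, heating = 15.0, False
for step in range(60):
    temp, heating = act(temp, decide(sense(temp)), heating)
print(f"after 60 steps: {temp:.1f} C")  # hovers around the 20 C set point
```

The loop is closed: the regulator’s output (switching the heater) changes the target’s state, and the target’s state flows back in as the regulator’s next input.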


In his “Design for a Brain” Ashby presented the brain as a regulator that monitors and maintains the state of the body.

·       sensors detect changes in bodily state variables

·       the brain determines the responses to those changes

·       motor neurons, via glands and muscles, increase or decrease state variable values



Feedback loop

Sensors, Brain,                               Bodily
Muscles, Glands                               state variables
    ← state information (variable values)
    → increase or decrease variable values


This table compares Ashby’s model with general activity system thinking.


Generic activity system           | Ashby’s design for a brain
Actors                            | Brain cells
interact in orderly activities to | interact in processes to
maintain system state and/or      | maintain body state variables by
consume/deliver inputs/outputs    | receiving/sending information
from/to the wider environment     | from/to sensors, muscles, glands


Feedback loops are found in mechanical, organic and business systems. A missile guidance system senses spatial information, and sends messages to direct the missile. An animal brain senses things in its environment, and uses its mental model of the world to respond to or manipulate those things.


A business is coupled to its environment by producing outputs that affect external entities, and receiving inputs, some of which are responses to previous outputs. A business information system maintains a model of the entities and events that it monitors and directs in its environment. In EA, given that the focus is on information flows rather than material flows, the relationship of a business to the external actors it monitors and serves or directs may be seen as a regulator-to-target feedback loop.


Business activity system                          Business environment

Consumes inputs          ← inputs (state information)     External actors
Produces outputs         → outputs                        Environment state
Maintains system state


Core concept: Feedback loop: the circular fashion in which output flows influence future input flows.


Progressive entities or processes

Feedback is important in machines, animals, social entities and business operations. A business is coupled to its environment by producing outputs that affect external entities, and receiving inputs, some of which are responses to previous outputs. However, business activities are usually more progressive than homeostatic.


As an example of progressive state changes, in his “Introduction to Cybernetics”, Ashby discussed an abstract system realized in nature. In the table below, after Tinbergen, the columns show the roles of two sticklebacks in their mating ritual; the rows show the succession of activities/states of the system/process as it progresses to a conclusion.


The stickleback mating ritual: the abstract system

The female’s role is to                    | The male’s role is to
present swollen abdomen, special movements | present a red color and a zigzag dance
swim towards the male                      | turn around and swim rapidly to the nest
follow the male to the nest                | point its head into the nest entrance
enter the nest                             | quiver in reaction to the female entering the nest
spawn fresh eggs in the nest               | fertilize the eggs


The sticklebacks communicate by sending/receiving information in the form of visual signals, which are stimuli (akin to data flows or messages in business operations) that inform and direct activity. The sticklebacks may be called actors or active structures; the nest and eggs may be called passive structures.


Abstracting systems from physical entities

Ashby made clear that a system (a set of variables and rules) is abstracted from the infinite complexity of any physical entity or real 'machine' in which that system may be observed. The table below divides his machine into a physical entity and the physical activity system it performs, when it realizes a particular abstract system.


Stickleback mating ritual

Abstract system (roles, rules, variables)              | The ritual typified above
Physical activity system (activities, variable values) | An instance of the ritual
Physical entity (actors in the material world)         | Sticklebacks, nest and eggs


Generally, the relationship between abstract systems and physical entities is many to many. So, the mating ritual in the table above can be realized by many stickleback pairs; and conversely, those sticklebacks may play roles in other systems. Also, there is more to a physical entity than any abstract system it realizes; there is more to know about the sticklebacks than their mating ritual.
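This many-to-many relationship can be illustrated in code: one abstract system (the ritual, transcribed as a sequence of role/action pairs) can be realized by any number of stickleback pairs. The actor names are invented placeholders.

```python
# Abstract system: roles and rule-bound activities, independent of
# which physical actors realize them. (Steps transcribed from
# Tinbergen's account of the stickleback mating ritual.)
RITUAL = [
    ("female", "present swollen abdomen"),
    ("male",   "present red color and zigzag dance"),
    ("female", "swim towards the male"),
    ("male",   "turn and swim rapidly to the nest"),
    ("female", "follow the male to the nest"),
    ("male",   "point head into the nest entrance"),
    ("female", "enter the nest"),
    ("male",   "quiver"),
    ("female", "spawn fresh eggs in the nest"),
    ("male",   "fertilize the eggs"),
]

def realize(actors):
    """One physical activity system: named actors playing the roles."""
    return [f"{actors[role]} ({role}): {action}" for role, action in RITUAL]

# The same abstract system realized by two different stickleback pairs.
pair_a = realize({"female": "fish-1", "male": "fish-2"})
pair_b = realize({"female": "fish-3", "male": "fish-4"})
print(pair_a[0])   # fish-1 (female): present swollen abdomen
```

Conversely, nothing stops `fish-1` from playing a role in some other abstract system; and there is far more to any real fish than the role it plays here.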

Aside on Ashby’s cybernetic terminology

Ashby’s work had a profound influence on many prominent scientists. However, his writing is a tough read. He used the terms system and machine variously, with and without quotes, with and without prefixed adjectives (real, whole, regular, dynamic, independent etc.). Between books, even within a book, he did not use these terms entirely consistently (and he had no “find next” and “change all” function to help him fix that).


Ashby’s real system and real machine usually refer to a whole physical entity with infinite potentially definable variables (and sometimes infinite possible inputs) which “stands apart from any observer”. It could be an electrical transducer, an animal, or even a society.


His system (without the prefix “real”) is defined as an observer’s selection of variables. In his first book, the selection is arbitrary. In his second book, the selection relates to some “main interest that is already given”. Perversely however, Ashby also sometimes wrote of an animal or a society as being a system, regardless of any observer or selected variables.


In his first book, Ashby narrowed the scope of “system” by saying in chapter 2 that

“from now on we shall be concerned mostly with regular systems. We assume that preliminary investigations have been completed and that we have found a system, based on the real 'machine', that (1) includes the variables in which we are specially interested, and (2) includes sufficient other variables to render the whole system regular.”


Later, in chapters 2/17 and 14/3, he narrowed “system” further to mean a selection of variables that are inter-related, and change independently of other variables.


His term 'machine' with quotes often seems the same as real 'machine'. By contrast, the term machine without quotes sometimes seems limited to what a physical entity does by way of realizing a system of variables that have been selected (whether arbitrarily, or as related to some given interest, or as related to each other and changing independently of others).


(For examples of Ashby using these terms in different ways, see 2/4, 2/5, 2/14, 2/17, 14/3 and 21/6 in “Design for a Brain”, and see 2/1, 3/1, 3/11, 4/3, 6/1 and 6/14 in “Introduction to Cybernetics”.)



System dynamics

Jay Forrester (1918 to 2016), sometime a professor at the MIT Sloan School of Management, defined the system dynamics method. Donella H. Meadows (1941 to 2001), whose interests included resource use, environmental conservation and sustainability, used and promoted the method (more on Meadows’s ideas later).


Much as Ashby defined a system as a set of variables chosen for attention, and relationships between those variables, Forrester defined a system as a set of “stocks” chosen for attention, and the “flows” between those stocks, be they observed or envisaged.


A stock can represent any quantifiable population or resource. E.g. wolf and sheep populations, happiness level, sunlight level, and variables that affect climate change such as CO2 level. A stock level, a variable quantity representing the amount of the stock, is increased and decreased by inter-stock flows. A flow, connecting two stocks, represents how increasing one stock increases or decreases another stock.


A negative or balancing feedback loop, where growth in one stock tends to reduce another, tends to restore a system to a stable state, regardless of the initial conditions.


Feedback loop

Growth in Homes for Sale  ←→  Growth in House Prices


A positive or reinforcing feedback loop drives the two coupled populations or resources to extreme levels (barring some natural limit or control from another direction).


[diagram: a reinforcing feedback loop in which growth or shrinkage of the Polar ice cap and a second stock amplify each other]






In a network of feedback loops, radically different end states are possible from very similar initial conditions. This is a defining feature of “chaos theory”.


Typically, the modeller begins by drawing a causal loop diagram that names stocks (in nodes) and connects them by flows (arrows). To show whether a flow increases or decreases a stock, the modeller annotates the flow arrow with a + or – sign. For examples, see this introductory book.


A causal loop diagram is only a sketch of a system's dynamics. The modeller can go on to complete the system dynamics model by defining the mathematical rules for how flows modify stock quantities. When completed, the model can be animated. You can give the state variables some initial values, then set the system in motion. Its state will change in discrete time-steps, and the result can be reported on a graph showing stock level changes over time (lines of behavior).
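In miniature, such a model might look like the sketch below: a single stock with a reinforcing inflow (births) and a balancing outflow (deaths), animated in discrete time-steps, its line of behavior recorded at each step. The rates and capacity are invented for illustration, not taken from Forrester.

```python
# A tiny stock-and-flow model, animated in discrete time-steps.
# One stock (a population); births form a reinforcing loop,
# deaths form a balancing loop as the stock nears its capacity.

def step(pop, r=0.1, capacity=1000.0):
    births = r * pop                       # reinforcing flow: more pop, more births
    deaths = r * pop * (pop / capacity)    # balancing flow: crowding raises deaths
    return pop + births - deaths

pop, history = 10.0, []
for t in range(150):
    history.append(pop)                    # record the line of behavior
    pop = step(pop)
print(f"final population ~ {pop:.0f}")     # levels off near the capacity
```

Plotting `history` against time would show the classic S-shaped line of behavior: near-exponential growth while the reinforcing loop dominates, flattening as the balancing loop takes over.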



Soft systems thinking

The term “soft system” is used with at least three meanings. Soft system 1: An observer’s perspective of a business system or situation. Soft system 2: A system in which the observer participates. Soft system 3: A human activity system in which people undertake activities to meet aims.


The distinction between hard and soft systems is debatable. The first two meanings are no different from the view of a system taken in cybernetics and system dynamics. Ashby and Forrester both saw a system as “soft” in the sense that it is an observer’s perspective or model of some physical phenomena. And both presumed an observer may set the parameters for a system to begin operation, and monitor its state changes over time.


Where the interest is particularly in the regular operations of a business, all four soft systems thinkers below looked at the business as an “organization” in which some (not all) human activities are regular and repeatable enough to be modelled.


Churchman, one of the first soft systems thinkers, said "a thing is what it does". He outlined these considerations for designing a system managed by people:

·       “The total system objectives and performance measures;

·       the system’s environment: the fixed constraints;

·       the resources of the system;

·       the components of the system, their activities, goals and measures of performance; and,

·       the management of the system.”


Like others, Churchman sought to integrate system theory into “management science”, and, replacing the word “business” by “system”, he conflated the ideas of a social entity and a social activity system.


Peter Checkland (born 1930) promoted a “soft systems methodology” for business analysis and design. Checkland regarded a business as an input-to-output transformation system. His “business activity model” shows a network of activities that transform inputs into outputs. So, whatever “soft system” means, the end result of his soft systems method is the definition of a human activity system. The same general principles apply to this as to other kinds of activity system models.


1.     Regular activities transform inputs into outputs wanted by customers

2.     Feedback loops connect a business to its environment thus: a) a business detects changes in the state of its environment b) it determines responses and c) it directs entities to perform activities.

3.     Observers may draw a business activity model, and read the current state of a system in its data store(s).


Using Checkland’s soft systems method, one draws a business activity model to show how regular or repeated activities complete a “transformation” that human actors are employed to make. Of the six letters in his CATWOE acronym for describing a system of interest, five can be found in the extended SIPOC diagram drawn below.







Suppliers → Inputs → Processes → Outputs → Customers

Owner and Actors




Different modellers, speaking to different stakeholders, may come up with different CATWOE sentences, different systems of interest. Reconciling different perspectives is part of what must be done, even in the hardest of engineering projects. Checkland noted the distinction between hard and soft system approaches is slippery. He wrote that people get it one day, and lose it the next. And sometimes he said the term “soft” was intended to characterize his method – not the system of interest.


Among other “soft systems thinkers”, Russell L Ackoff (1919-2009) spoke of human organizations as purposeful systems, and Stafford Beer (1926-2002) wrote on “management cybernetics”. You’ll meet Checkland, Ackoff and Beer again in part two of this book – on systems thinking in management science.

General system theory

“There exist models, principles, and laws that apply to generalized systems or their subclasses, irrespective of their particular kind.” (Bertalanffy)


The 1954 meeting of the American Association for the Advancement of Science in California was notable. Some at that meeting conceived a society for the development of general system theory. The founding members of the International Society for the Systems Sciences (1955 to date) included:

·       Ludwig von Bertalanffy (1901-1972) who promoted the cross-science notion of a system

·       Anatol Rapoport (1911-2007) who wrote on game theory and social network analysis

·       Kenneth Boulding (1910-1993) who applied general system theory to “management science”.


Ludwig von Bertalanffy was a biologist who promoted the idea of a cross-science general system theory from the 1940s. He looked for patterns and principles applicable to several sciences and domains of knowledge. Many have been outlined earlier in this chapter. Two more branches of general system theory should be mentioned here.


Anatol Rapoport was a mathematical psychologist and biomathematician who made many contributions. In researching cybernetic theory, he pioneered the modelling of parasitism and symbiosis. This gave a conceptual basis for his work on conflict and cooperation in social groups.


When actors interact in a system, they may cooperate, as in a football team or a business system. But they don’t necessarily help each other. They may compete, as in a game or a market; or hurt each other, as in a boxing match or a war. Cooperation, conflict and conflict resolution is a focus of bio-mathematics and game theory.


Game theory: cooperation and conflict resolution. In the 1980s, Rapoport won a computer tournament designed to further understanding of how cooperation could emerge through evolution. He was recognized for his contribution, through nuclear conflict restraint, to world peace.
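The tournament referred to is commonly identified with Robert Axelrod’s iterated prisoner’s dilemma experiments (c. 1980), which Rapoport won by submitting the simple “tit for tat” strategy: cooperate first, then copy the opponent’s previous move. A minimal sketch, using the standard tournament payoffs:

```python
# Iterated prisoner's dilemma: "C" = cooperate, "D" = defect.
# Standard payoffs: mutual cooperation 3 each; mutual defection 1 each;
# a lone defector gets 5 while the cooperator gets 0.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    """Cooperate first, then mirror the opponent's last move."""
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    score_a = score_b = 0
    hist_a, hist_b = [], []   # each side sees only the other's past moves
    for _ in range(rounds):
        a, b = strategy_a(hist_b), strategy_b(hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))      # (30, 30): sustained cooperation
print(play(tit_for_tat, always_defect))    # (9, 14): punished, but not exploited for long
```

The point of the tournament result was that a cooperative but retaliatory strategy can outscore purely exploitative ones over repeated encounters, which bears directly on how cooperation can emerge among self-interested actors.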


Social network analysis. Rapoport showed that one can measure flows through large networks. This enables learning about the speed of distribution of resources, including information, through a society, and about what speeds or impedes these flows.


Kenneth Boulding is perhaps best known for his article in volume 2 of the journal “Management Science”, 1956 in which he proposed a hierarchical classification of system types that places socio-cultural systems near the top. For more about this questionable hierarchy, read the chapter on hierarchical abstraction.


In the same article, Boulding made a less noticed but arguably more fundamental point. He questioned whether the elements of a social system are actors, or the roles they play. This brings us to the fundamental dichotomy in systems thinking between social entities and the activity systems they employ and participate in.

Conclusions and remarks

This chapter is one of two that take you on a canter through a history of systems thinkers and their ideas. For a distillation of ideas attributed to the nineteenth century thinkers mentioned at the start, you could read my sketchy notes here on thinkers who foreshadowed system theory.


Hard and soft systems thinking

Midgley (2000) presented hard, soft and critical systems thinking approaches as though they were three phases in a historical progression. Yet the hard/soft distinction has been interpreted in several ways. And even mechanical and information system engineers are taught “soft” and “critical” systems thinking techniques such as:


·           Analyse current situations, understand assumptions

·           Look at the big picture, overarching drivers, goals and principles

·           Engage in critical thinking

·           Identify stakeholders, their concerns and viewpoints

·           Unfold multiple perspectives, promote shared understanding

·           Monitor and manage changes to requirements, time, cost, resources etc.

·           Outline solutions or changes and how to make them.


The traditional way of presenting the history of systems thinking in three or four waves diverts attention from how Forrester's "system dynamics" may be grouped with classical 1950s cybernetics (as a quantitative approach, featuring feedback loops, non-linear lines of behavior, attractors etc.) and with 1970s soft systems (as being about activity systems definable as a collection of parts that interact in an orderly way to produce a set of results or effects that no part can produce on its own). All three (system dynamics, cybernetics and soft systems methodology) are about activity systems that are


·       soft, in the sense that each is an observer’s perspective of an entity or phenomenon, described in the light of some given interests

·       dynamic, in the sense that each changes from state to state over time and/or produces outputs from inputs

·       stable, in the sense that each has a “way of behaving” regular and repeatable enough to be modelled.


Bertalanffy, Rapoport and Boulding, who founded what became the Society for General Systems Research, grew disenchanted with its diversion from hard sciences towards “management science”. The quote below reveals that Rapoport and Bertalanffy tried to restrain this, and that Boulding actively joined them.


On the History of Ludwig von Bertalanffy’s ‘General Systemology’, and on its Relationship to Cybernetics – Part II.


“Bertalanffy, Rapoport and Boulding … had to take into account the trends that asserted themselves… Rapoport’s resignation in 1977 corroborates this interpretation. He took his decision with regard to the predominance, in the “Society for General Systems Research”, of instrumental interests oriented toward “managerial” applications… Rapoport and Bertalanffy tried to restrain this evolution.. and Boulding also actively joined them.”

“Two main directions can be observed in systems technology… The first domains of an application, namely computer science and technologies of automation and physical control… connected to the history of early cybernetics. … Their importance gradually decreased on behalf of another systems technology mainly represented by operations research and management sciences, oriented toward “software” (or “soft systems”), i.e. the multitude of organizational issues arising in contemporary society.”

“In the 1970s, notably with the impetus of Peter Checkland, the dichotomy between “hard” and “soft systems” took a polemical sense in referring to differences in the approach of human systems. Ultimately, it referred to a split between an “objectivist” approach of these systems that focused on their functional efficiency and bore an instrumental view of human, and an “interpretative” approach that claimed to take into account the role of individuals and values”


The dichotomy in this work is not between hard and soft systems thinking; it is between “activity systems thinking” and “social entity thinking”. It is impossible to unify them. Social entity thinking is not an evolution of activity systems thinking; they are not competitors; they are different perspectives with different traditions. Both are needed, but until systems thinkers distinguish social entity thinking from activity systems thinking, the field will remain confused and confusing.