Von Foerster’s ideas

On our knowledge of reality and second order cybernetics

https://bit.ly/2CxPlqs

Copyright 2017 Graham Berrisford. One of more than 100 papers on the “System Theory” page at http://avancier.website. Last updated 22/09/2019 13:05

 

Thinkers like Ackoff, Beer and von Foerster were wise men with good advice to offer people.

Arguably, however, they and other thinkers in the 1970s undermined the concept of a system.

Many today refer to ideas found in von Foerster’s second order cybernetics.

The first part of this paper argues that some of these ideas are truisms (axiomatic in modern science) yet can also mislead.

The second part argues that other ideas are questionable or mistaken.

 

“Thanks for sharing this invaluable knowledge.” LinkedIn message

Contents

PREFACE

PART ONE: Axiomatic ideas - related to knowledge and truth

Perception

Knowledge is a biological phenomenon

Description is not reality

Each individual constructs his or her own model of reality

Pause for thought

On subjectivity and objectivity

PART TWO: Questionable ideas

Circularity

Relativism and perspectivism

The hermeneutic principle of communication?

Do we need a theory of the observer?

Self-organising system?

CONCLUSIONS AND REMARKS

Further reading

 

PREFACE

 

System theory - recap

Here, the word “entity” means “an observable or conceivable part of the world”.

It could be a planet, a hurricane, a group of people, or a performance of a symphony.

In his work on cybernetics, Ashby urged us not to confuse an entity with any abstract system that the entity realises.

 

“At this point we must be clear about how a "system" is to be defined.

Our first impulse is to point at [some real-world entity] and to say "the system is that thing there".

This method, however, has a fundamental disadvantage: every material object contains no less than an infinity of variables and therefore of possible systems.

Any suggestion that we should study "all" the facts is unrealistic, and actually the attempt is never made.

What is necessary is that we should pick out and study the facts that are relevant to some main interest that is already given.” (Ashby 1956)

 

Ashby, Ackoff, Checkland and other systems thinkers emphasise that a system is a perspective of a reality.

They distinguish abstract and concrete systems.

 

The basis of system theory

Abstract systems (descriptions)

<create and use>                              <represent>

System thinkers   <observe and envisage>  Concrete systems (realities)

 

A concrete system (in reality) is any entity that conforms well enough to an abstract system (in description).

An abstract system is a description or model of how some entity behaves, or should behave.

An abstract system does not have to be a perfect model of the entity described; it only has to be accurate enough to be useful.

 

General system theory is general in the sense that it applies across the sciences.

Systems thinking can and should be an application of the scientific method.

An abstract system is a theory.

We can and should test that a real-world entity empirically exemplifies that theory - to the degree of accuracy we need for practical use.

And we should steer people away from the anti-science “relativism” that many social systems thinkers fall into.
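
To make that testing idea concrete, here is a minimal sketch in Python (the dispatch-time example, the limit, the required fit and the sample observations are all invented for illustration, not taken from this paper): an abstract system is stated as an expected regularity, and the concrete entity is judged to realise it when its observed behaviour fits well enough.

# A hypothetical abstract system (a theory about an entity's behaviour):
# "orders are dispatched within 24 hours of receipt".
DISPATCH_LIMIT_HOURS = 24
REQUIRED_FIT = 0.95   # "well enough": the accuracy we need for practical use (invented)

# Hypothetical observations of the real-world entity (dispatch delays, in hours).
observed_delays = [5, 12, 23, 8, 30, 16, 22, 7, 11, 19]

def realises(observations, limit, required_fit):
    """True if the concrete entity conforms well enough to the abstract system."""
    within_limit = sum(1 for delay in observations if delay <= limit)
    return within_limit / len(observations) >= required_fit

print(realises(observed_delays, DISPATCH_LIMIT_HOURS, REQUIRED_FIT))  # False: 9/10 falls short of 0.95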

 

Second order cybernetics

“Second-order cybernetics” was developed around 1970 by thinkers including Heinz von Foerster and Margaret Mead.

Since then, a schism has opened up between classical and second-order cybernetics.

Much modern systems thinking discussion seems confused in the way that Ashby warned us of.

It confuses real-world entities with systems they realise.

Often there is no reference to the need to verify that an entity actually does behave in accord with a system description.

PART ONE: Axiomatic ideas - related to knowledge and truth

Heinz von Foerster (1911 to 2002) was an Austrian American scientist who combined physics with philosophy.

He is widely credited as the originator of “second-order cybernetics.”

Followers of this school of systems thinking are drawn to these ideas:

 

“Knowledge is a biological phenomenon” (Maturana, 1970).

However, knowledge is not restricted to one biological entity, since knowledge can be shared by social animals.

 

“All experience is subjective” (Gregory Bateson).

"Objectivity is the delusion that observations could be made without an observer." (von Foerster).

However, sharing and testing the knowledge we acquire from experience can assure us it is also objective to the degree we need.

 

“Each individual constructs his or her own reality" (von Foerster, 1973).

However, we can verify the knowledge that we construct - socially, logically and empirically.

 

These ideas are explored in the following sections.

Perception

A sensor is a (biological or technological) machine that can represent some features of a reality.

A sensation or perception is a representation (model or image) of those features. 

It is not the actual features; it may be fuzzy, incomplete and malleable.

Nevertheless, the accuracy of the model can be tested by using it.

 

At the most external point of a sensor (e.g. the eye's lens), there may be no or minimal filtering or censorship.

As the sensation progresses through the nervous system, processes can be applied to it.

E.g. The retina of a cat's eye is especially sensitive to thin wiggly lines - like mouse tails.

Further into the brain, cognitive processes may reshape the incoming message.

According to a talk by the neuroscientist Anil Seth, research suggests the brain combines:

·       Observation: sensing information input from what is out there.

·       Envisaging: making a best guess as to what has been sensed, with reference to what is expected.

 

That does not mean what you perceive and remember is purely an invention - or does not represent knowledge of the external world.

It only means your brain (given the time and resources at its disposal) makes the best bet it can as to what your senses tell you about the world.

If animals could not compare new and old perceptions, and match them correctly enough, then they could not learn to recognise and manipulate entities in the world.

 

What we expect to see is not purely fanciful - invented out of nothing.

It is what a mix of inheritance and experience predicts is likely to be true.

Thus, the brain optimises its matching of perception and experience.

Else, it would have the hopeless task of analysing each perception from scratch.
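
As a loose analogy only (a toy sketch in Python, not a claim about how neurons work; the numbers and the weighting are invented), the "best bet" can be pictured as a trust-weighted blend of what experience predicts and what the senses report.

# Toy "predictive perception": blend an expectation with a noisy sensed value.
prior_estimate = 10.0   # what inheritance and experience predict (e.g. a distance in metres)
prior_trust = 0.4       # how much the expectation is trusted (0 to 1), invented for illustration
sensed_value = 12.0     # what the senses currently report

# The percept is pulled toward the sensed value but anchored by expectation,
# neither pure invention nor a raw copy of the input.
percept = prior_estimate + (1.0 - prior_trust) * (sensed_value - prior_estimate)

print(percept)  # 11.2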

 

A perception can only model a thing - otherwise it would be the thing.

That does not mean (as Seth implies) that the thing does not exist, or that perception is hallucination.

The existence of humankind depends on the presumption that:

·       things exist out there

·       our perceptions and memories of those things are useful models of them and

·       we can share those models by translating them into and out of messages.

 

We may sometimes hallucinate - perceive something where there is nothing.

A mental model may be a poor representation, and it may fade to nothing.

But still, our brains are designed or evolved to perceive what does exist out there.

And our survival depends on being able to do this reasonably well, most of the time.

 

To know a thing is to have access in our thoughts to a useful model or representation of it.

We can never know – perfectly - what a thing is; that is not even a meaningful suggestion.

We can only know a thing as it is represented in some kind of model, description or theory.

 

Moreover, we can share our knowledge with others.

And we can test that things do turn out in the way our knowledge leads us to predict.

To deny that sharing and testing help to confirm our knowledge of the world would be to deny the history of mankind.

Knowledge is a biological phenomenon

“Knowledge is a biological phenomenon” (Maturana, 1970)

Millions of years ago, animals evolved to conceptualise things they perceived in the world.

To remember a thing, they encoded a representation of it (or some of its features) in a neural memory.

To recall a thing, they decoded that memory.

Thus, knowledge evolved in organisms to help them survive and thrive.

 

(Today, machines created by biological actors can remember and communicate knowledge.)

 

Knowledge is a model or representation of a reality (or some of its features) created by an actor.

It is not the reality itself.

The only test it has to pass is that it represents a reality well enough to prove useful in observing or manipulating that reality.

When it passes that test, we see it as knowledge.

When it fails, we see it as useless information.

 

This paper defines knowledge as information that proves accurate enough when used.

A cat holds information that represents a mouse's features; that information proves to be knowledge when the cat spots a mouse.

A honey bee communicates information of where pollen can be found; that information is knowledge when other honey bees use it to find pollen. 

I give you information about Newton's laws of motion; that information is knowledge whenever you use those laws effectively.

 

This table goes further; it distinguishes knowledge from information and information from data.

 

WKID: meaning

Wisdom: the ability to respond effectively to knowledge in new situations

Knowledge: information that is accurate enough to be useful

Information: any meaning created or found in a structure by an actor

Data: a structure of matter/energy in which information has been created or found

 

Data

You may read the time of day from the direction of a shadow on a sundial.

You may tell people whether you are open to visitors or not by leaving your office door open or closed.

A honey bee can tell other bees about pollen locations using dance movements.

The direction of the shadow, the position of the door, the shape of the dance – each is used as a data structure.

 

You can create or find information or meaning in any structure or motion that is variable - has a variety of values.

The spoken word gives humans the ability - with almost no effort - to form an unlimited variety of data structures.

And the written word gives us the ability to preserve those structures in shared memory spaces.

 

Information

There is no information or meaning in a data structure (shadow, door, dance movement or words) on its own.

Meaning only exists in the process of creating or using a data structure.

Meaning is encoded in a data structure when it is created and decoded when it is read.

An encoding or decoding process requires the use of a code known to the actor.

Encoding and decoding processes are ubiquitous; they appear in all forms of communication.

 

Communication

For an act of communication to succeed in conveying information, two roles must be played.

·       One actor (a sender) must encode some information or meaning in a data structure or message.

·       Another actor (a receiver) must decode the same information or meaning from that data structure or message.

 

Consider how one bird (acting as a receiver) understands the alarm call made by another bird (acting as a sender).

The sender and receiver must share the same language or code for encoding and decoding the message.

But, prior to exchanging that message, they may be entirely unknown to each other.

 

Consider the transmission of an SOS message, which conveys the idea that help is needed.

It is broadcast by a sender to any and every actor able to receive it.

It is understood only by receivers who can decode the message, using the language it was created in.

It might be a fake, intended to waste the time of its receivers – which is to say it conveys misinformation rather than knowledge.
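
A minimal sketch of those two roles in Python (the dot-dash patterns for S and O are the standard Morse code; the rest is invented for illustration): the sender encodes meaning into a data structure, and only a receiver holding the same code can decode it.

# A shared code, known to both sender and receiver.
MORSE = {"S": "...", "O": "---"}
DECODE = {pattern: letter for letter, pattern in MORSE.items()}

def encode(text):
    """Sender's role: turn meaning (letters) into a data structure (dots and dashes)."""
    return " ".join(MORSE[letter] for letter in text)

def decode(signal):
    """Receiver's role: recover the meaning - possible only if the same code is known."""
    return "".join(DECODE[pattern] for pattern in signal.split(" "))

message = encode("SOS")
print(message)          # ... --- ...
print(decode(message))  # SOS; to a receiver without the code, the signal is only noise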

 

One message can be and often is interpreted differently by different receivers (using different codes, in different states, using different rules).

However, social communication would never have evolved if senders did not manage to share knowledge with receivers often enough.

That sharing of knowledge takes place is evident from empirical observations - regardless of how it works.

Worry not about how weakly the model in a message represents a reality, or how different the internal models of a sender and receiver may be.

Consider only this - the evidence is that we can and do share knowledge.

E.g. You tell me a train is coming and I then step off the railway track.

That evidence indicates we share a considerable amount of knowledge about the world.

 

Humankind brought four innovations to communication.

1.     Words: an infinitely flexible box of sounds for communicating, which cost almost nothing to create and use.

2.     Oral speech: speaking and hearing words, using sound waves to symbolise meaning in messages.

3.     Writing: recording words in persistent memory structures.

4.     Domain-specific languages: languages in which the ambiguities of words in natural language are minimised.

 

The fact that social actors share knowledge is demonstrated whenever they cooperate successfully.

In a human social network, information about the world is represented:

·       Internally (and mysteriously) in the neurons of individual actors, and

·       Externally, in messages actors exchange and in memories/records they share access to.

 

To overcome the limitations of natural language, people use controlled vocabularies in which words have universally agreed meanings (like SOS).

They also "talk around" a message, expressing it several ways, to ensure its meaning is conveyed.

The stronger their social relationship, the more likely they will do this long enough to understand each other.

 

Knowledge

Speakers create meanings in messages; hearers find meanings in messages.

Communication succeeds when created and found meanings are the same.

But there is no information/meaning in a message on its own.

Information/meaning exists only in the act of creating or using a message.

And knowledge only exists where the information is useful.

 

If the information created or found in a message or memory is accurate enough to be useful, then it may be called knowledge.

E.g. the knowledge of where to find some pollen can be communicated by one honey bee to another.

E.g. knowledge of Newton’s laws of motion is useful, even though Einstein showed them to be only approximations.

 

(What a message sender considers true and useful knowledge, a message receiver may consider false, and vice versa.

E.g. I feel the swimming pool is warm; I tell you that and you “take me at my word”.

You dive in, but find the swimming pool is colder than you expected, and complain that I lied.)

 

Wisdom

Wisdom is the ability to respond effectively to knowledge in new situations.

The application of wisdom to knowledge implies a higher level of intelligence than communication alone.

Description is not reality

 “Knowledge "fits" but does not "match" the world of experience” (Glasersfeld, 1987).

“We cannot know the essences of things in themselves; all we can know is what we know as abstracting nervous systems.” Alfred Korzybski

 

This is axiomatic: a description is not the reality it describes.

However, no biologist would accept the view expressed by one system thinker that "internal cognitions do not reflect any external reality".

First, the neural systems of animals evolved to represent things in their environment (food, friends and enemies) in bio-chemical memories.

This helps individual animals survive and thrive by recognizing and manipulating things in their environment.

Social animals evolved further to share knowledge of things in their world, by translating internal representations into external messages.

For example, birds make alarm calls, and honey bees tell other honey bees where to find some pollen.

Human animals evolved further to communicate information about the world (descriptions, directions and decisions) by using words.

 

Discussion of the difference between description and reality goes back to the ancient Greeks.

This triangular graphic separates descriptions from the realities they describe, symbolise or represent.

 

Abstraction of description from reality

Descriptions

<create and use>          <represent>

Describers  <observe & envisage> Realities

 

A description is created by a process of encoding, and used by a process of decoding.

A description can never be more than a very selective perspective or model of a reality (else it would be the reality).

Recursively, descriptions of realities are also real.

Descriptions are abstractions that appear in the physical forms of memories and messages – and can be described.

Each individual constructs his or her own model of reality

“Each individual constructs his or her own reality" (von Foerster, 1973).

"The environment as we perceive it is our invention." (von Foerster, 2007).

 

It is axiomatic in biology and psychology that your brain is unique to you.

However, it is also axiomatic that one animal can communicate some knowledge about the world to another.

E.g. I tell you there is an apple in the blue box behind that door; you open the door, open the blue box and eat the apple; QED, I transferred some knowledge to you.

In basic cybernetics also, it is presumed that two systems can exchange some knowledge about the state of the world.

Yet von Foerster’s aphorisms (above) lead some to deny that knowledge can be shared.

His second order cybernetics leads people to make other assertions that range from questionable to nonsense.

 

Remembering knowledge

Contrary to the initial hopes of cyberneticians (Wiener and McCulloch, Ashby and Turing), it has been clear since c1960 that the brain does not work like a computer.

It does not store static, persistent data structures; its mental images are incomplete, fuzzy and malleable.

Experiments show that people find it difficult to apply the rules of logic.

Nevertheless, the brain is not empty; it does remember some features of entities and events it has perceived.

It evidently does process representations of those features when it remembers and recalls them.

You do know a dollar bill is rectangular, and green, and has a one on it.

You remember enough to recognise a dollar bill when you look at one – though, yes, you might well be fooled by a fake dollar bill.

How information is stored in a brain is a mystery, but if no information were stored, how could you describe a friend to somebody else?

 

Sharing knowledge

In biology, there is no need for a mind-body separation.

Even a single-celled organism has some knowledge of its environment.

But it is convenient here to speak of mental models.

Obviously, a mental model is unique to the mind that holds it.

No two of us share the same biochemical mental model of the world.

But for a systems thinker to say “no one shares the same knowledge of the world” is clearly untrue, since it denies the success of social animal species.

 

“We cannot transcend ourselves as organisms that abstract” Alfred Korzybski

The evidence suggests we can and do transcend ourselves as individual organisms when we successfully cooperate socially.

All social animals depend on being able to communicate descriptions that usage proves to be accurate.

Demonstrably, animals do manage to share “facts” that we abstract from observations of reality.

Humans use domain-specific languages, the rules of logic and the scientific method to transcend our subjective experience.

 

Examples of knowledge that is shared

Many have used Newton's laws of motion to calculate a force.

The accuracy of the calculation is revealed in its successful use, every day, all over the world.

 

You ask someone to call you at 11.00 hours; they call you at the appointed time.

Demonstrably, you share an understanding of the abstraction labelled “the time of day”.

 

Your fishmonger advertises cod steaks; you buy some and eat them.

Demonstrably, you share an understanding of what the abstraction labelled “cod steak” means.

 

A honey bee can encode its mental model of a pollen location in a dance.

Another bee can decode the dance into a mental model of where that pollen is.

To find the pollen, the second bee must share the mental model of the first.

This shows that both mental models represent the same facts – the distance and direction of the pollen source.

The facts recorded in these mental models are objective and accurate enough for us to call them “true”.

Suppose you see one honey bee find the pollen described to it by another honey bee.

You (3rd party observer or experimenter) have evidence that they have shared an objective description of the world.

That is the very definition of objective (not limited to one intelligent entity - and confirmed by empirical evidence).

Moreover, in an example of cross-species communication, scientists can read the dance of a honey bee and find the pollen themselves!
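
A toy sketch of that encode/decode round trip in Python (real waggle dances encode direction relative to the sun and distance by waggle duration; the one-second-per-kilometre factor and the sample location are invented for illustration).

SECONDS_PER_KM = 1.0   # invented conversion factor for this toy model

def encode_dance(model):
    """Sender bee: translate its mental model of the pollen location into a dance."""
    return {"waggle_seconds": model["distance_km"] * SECONDS_PER_KM,
            "angle_deg": model["direction_deg"]}

def decode_dance(dance):
    """Receiver bee: translate the dance back into its own mental model."""
    return {"distance_km": dance["waggle_seconds"] / SECONDS_PER_KM,
            "direction_deg": dance["angle_deg"]}

sender_model = {"distance_km": 2.0, "direction_deg": 40.0}
receiver_model = decode_dance(encode_dance(sender_model))

# Two physically distinct models, yet they represent the same facts.
print(receiver_model == sender_model)  # True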

 

We (you and I) can both read and remember this sentence.

Our two mental models of the sentence are different and yet the same.

They are different – our mental models are bio-chemically distinct and different.

They are the same - we can both recall and recite the sentence accurately.

The objectivity of our mental models is found not in their biochemistry.

It is found in communication that shows we can both recall and recite the sentence.

A third party can test that what we recite is objectively accurate.

 

Designators

Designators that biological organisms create to identify or typify real-world objects are physical, but not rigid.

There is no absolute truth in any verbal or non-verbal representation of a reality.

There are, however, domain-specific languages in which people agree to share a vocabulary.

And there are degrees of truth or usefulness in statements formed using those languages.

And there is DNA – the unique identifier of an organism.

Pause for thought

The ideas explored above are often discussed in the context of second order cybernetics.

They might be extended thus.

·       Knowledge is a biological phenomenon - there was no description before life. 

·       Knowledge "fits" but does not "match" the world of experience - a description is a reality, but not the reality it describes. 

·       Each individual constructs his or her own reality - our mental models are bio-chemically distinct and different.

 

The trouble is that these quotes can lead some towards a kind of "relativism" that undermines science.

Yet they are perfectly compatible with Ashby’s first order cybernetics, and with a Darwinian analysis of how animal intelligence evolved.

On subjectivity and objectivity

"All experience is subjective." Gregory Bateson.

For sure, but readers of this quote may mistakenly conclude that everything we believe, since it is subjective, is equally valid.

How do we find food and eat it if we have no objective-enough knowledge of the world?

 

"Objectivity is the delusion that observations could be made without an observer." Heinz von Foerster

For sure, there could be no observations before observers, and no description before life.

However, readers of this quote may mistakenly conclude there can be no objectivity in a description of reality.

How many subjective interpretations can different observers make of Newton's F = ma or Einstein's E = mc²?

 

Treating objectivism/subjectivism as a dichotomy is problematic - it hinders more than it helps.

Science does not divide views into 100% subjective or 100% objective or true.

Subjective does not mean wrong; it means personal, perhaps influenced by personal feelings, tastes, or opinions.

Objective does not mean infallible; it means not restricted to one individual and/or verifiable in some way, at least to a degree.

 

By social, logical and empirical verification we increase the objectivity of what may start out as a personal or subjective description.

Some descriptions (e.g. Newton’s laws of motion) have proved so objective – to have such a high degree of truth – that we trust our lives to them.

Physicists may describe light as a stream of particles or of waves.

They do not say either description is “true”; they say only that each model can be useful.

 

Evidently, animals can perceive and know some things with a sufficient degree of truth for practical uses.

This was the motor for the evolution of animal memory and social communication.

To deny that would be to deny the survival and flourishing of life on earth.

It was also the motor for the evolution of science.

PART TWO: Questionable ideas

Von Foerster was something of a dilettante (“I don't know where my expertise is; my expertise is no disciplines”).

He enjoyed thinking about all things circular, recursive and self-referential and enjoyed provoking people with thoughts about them.

Circularity

"Should one name one central concept, a first principle, of cybernetics, it would be circularity." ~ Heinz von Foerster

OK, but there are very different kinds of circularity, for example:

·       the circularity in a feedback loop between two cooperating machines

·       the circularity between a machine and its designer, or a system and its observer.

 

Second order cybernetics is said to be the circular or recursive application of cybernetics to itself.

It shifts attention from observed systems to the observers of systems.

It allows systems actors to be system thinkers, who can study and reorganise the system they play roles in.

But to discuss only observers and the systems they observe is relatively naive.

Discussed properly, classical cybernetics (after Ashby) is more sophisticated.

An observer does not observe a system directly.

The observer observes a real-world entity from which countless different, possibly conflicting systems may be abstracted.

Relativism and perspectivism

Relativism is the idea that knowledge and truth exist only in relation to a particular culture, society, or historical context.

For sure, people perceive the world a little differently from each other.

And people see the world somewhat differently from how birds, bats and bees see it.

But more importantly, their conceptualisations can be and are shaped by testing them against reality.

 

Historical figures including Protagoras, Nietzsche and von Foerster have subscribed to a kind of relativism or perspectivism that can be misleading.

Friedrich Nietzsche (1844 to 1900) was a philosopher whose metaphysical ideas influenced many Western intellectuals.

“Nietzsche claimed the death of God would eventually lead to the loss of any universal perspective on things, along with any coherent sense of objective truth.

Nietzsche rejected the idea of objective reality, arguing that knowledge is contingent and conditional, relative to various fluid perspectives or interests.

This leads to constant reassessment of rules (i.e., those of philosophy, the scientific method, etc.) according to the circumstances of individual perspectives.

This view has acquired the name perspectivism.” Wikipedia December 2018

 

Protagoras, Nietzsche and von Foerster have a lot to answer for, as discussed in Postmodern Attacks on Science and Reality.

Some Marxists and postmodernists interpret perspectivism as meaning all descriptions of the world are subjective, and perhaps, therefore, equally valid.

At the extreme, this leads to the view that the “dialectic” is more important than evidence.

That any persuasively argued or widely believed assertion carries the same weight as science.

Or even that any personal opinion is as true as the facts the world’s best scientists agree.

 

Scientists are aware that our sensory tools, perceptions, memories and communications are subjective and imperfect.

That doesn’t mean science is unreliable and should be discarded; the reverse is the case.

The scientific method is the best tool we have to transcend our limitations as individual observers.

It involves testing of results against predictions, logical analysis and peer group review.

That is how we incrementally improve our confidence that a model or theory is valid.

The hermeneutic principle of communication?

“Information” is a relative concept that assumes meaning only when related to the cognitive structure of the observer of this utterance (the “recipient”). Von Foerster (1981).

 

Here, von Foerster endorsed what some call the hermeneutic principle of communication.

The Hermeneutic principle: "The hearer, not the speaker, determines the meaning of a message."

This is misleading.

A message has a meaning when related to the cognitive structure of its sender - as well as its recipient.

The principle makes innocent speakers guilty of causing offence where none was intended.

The intention of a speaker does matter, practically, logically and morally.

 

Ashby emphasised that the meaning of a message depends on what the recipient knows of the sender.

In his example, two soldiers are taken prisoner, one by country A and one by country B.

Their wives each receive the same brief message “I am well”.

However, the same message (or signal) conveys different meanings to each.

This is because country A allows its prisoner a choice of three messages: I am well, I am slightly ill and I am seriously ill,

whereas country B allows its prisoner only one message: I am well (meaning no more than “I am alive”).
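
A hedged aside, not in Ashby's text: in information-theory terms, and assuming the allowed messages are equally likely, the information a message carries grows with the number of messages that could have been sent (log2 of the count), which is why the same words tell one wife more than the other.

import math

# Country A allows three possible messages; country B allows only one.
bits_from_A = math.log2(3)   # about 1.58 bits: the message selects one of three possible states
bits_from_B = math.log2(1)   # 0 bits: "I am well" was the only message that could be sent

print(round(bits_from_A, 2), bits_from_B)  # 1.58 0.0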

 

(Aside: every business depends on information/data processing systems in which it is assumed that data transmission is perfect.

And that message senders and receivers use the same language to encode and decode messages.

So, in enterprise architecture and IT, the terms information and data are usually interchangeable.

And a so-called “information model” is a somewhat informal and abstract “data model”.)

Do we need a theory of the observer?

… the classical concept of an “ultimate science”… contains contradictions.

To remove these one had to account for an “observer” (that is at least for one subject):

(i)              Observations are not absolute but relative to an observer’s point of view (i.e., his coordinate system: Einstein);

(ii)            Observations affect the observed so as to obliterate the observer’s hope for prediction (i.e., his uncertainty is absolute: Heisenberg)

After this, we are now in the possession of the truism that a description (of the universe) implies one who describes it (observes it).

What we need now is the description of the “describer” or, in other words, we need a theory of the observer.” von Foerster (1981).

 

It is indeed a truism that a description is a product of a describer.

But we can understand a description of an observed thing with no knowledge of the describer.

All we need to know is the language they used to form the description.

E.g. To understand and apply the laws of motion, we need no knowledge of Newton.

To watch a tennis match and understand the scoreboard, we need no knowledge of the umpire.

And in observing how pricing affects supply and demand, the economist Hayek did not need to exclude himself as a buyer or seller in the market.

 

It is not true that a describer necessarily affects what they describe.

Von Foerster’s reference to Heisenberg’s uncertainty principle applies to observations of micro-scale sub-atomic particles.

It is not generally applicable to macro-scale objects and social phenomena.

 

It is true that observers can become involved in, and affect, the social phenomena they observe and describe.

This is rightly a concern to sociologists (e.g. Mead) in studying the behaviours of tribal societies.

And the actors who play roles in a system may observe and redefine the roles and rules of that system.

But no “theory of the observer” is needed if we distinguish actors from roles they play.

We can and should separate the actor’s role in a system from the same actor’s role as an observer of the same system.

Self-organising system?

"First-order cybernetics is the science of observed systems” von Foerster

This is a curious starting point for discussion, since classical cybernetics is also about observed systems.

Ashby suggested that infinitely many systems may be abstracted from a concrete entity.

Every abstraction requires an abstracter – the system observer or definer.

 

“Something strange evolved among the philosophers, the epistemologists, and the theoreticians.

They began to see themselves more and more as being included in a larger circularity; maybe within the circularity of their family;

or that of their society and culture; or even being included in a circularity of cosmic proportions!” von Foerster in "Ethics and Second-Order Cybernetics"

 

Von Foerster is credited with initiating second-order cybernetics, said to be the recursive application of cybernetics to itself.

He asked: “Am I a part of the system, or am I apart from the system?”

This is typical of his playful phrasing of aphorisms and questions.

How would Ashby have answered the question? Here is a reply based on Ashby’s writings:

 

“Heinz, cybernetics does not ask "what is this thing?" but ''what does it do?" It is thus essentially functional and behaviouristic.

You must distinguish yourself (a concrete entity) from the actions you perform in various systems.

The part you play in any social or socio-technical system is the behaviour you contribute to it, rather than you as a person.”

 

E.g. You are only a “part” of a traffic system in so far as you play a role in it, say as the driver of a motor car.

Being self-aware and having free will, you can ignore or break the rules of that role: jump a red light, or close your eyes for a nap and crash the car.

You have the same freedom of choice in every social or socio-technical system you play a role in.

Moreover, you can both play a role in a traffic system, and play a different role as an observer and legislator who changes the rules of that system.

CONCLUSIONS AND REMARKS

 

The need to seek and accept objectivity

Some of the ideas expressed by second order cyberneticians are axiomatic.

They underpin not only Ashby’s classical cybernetics but all modern science.

But to suggest we know nothing of the world is misleading.

That cognitions can represent realities is essential to how animals, on their own and in social species, deal with things in their world.

To deny that would be to deny the evidence that scientific experiments reveal, or even deny there is a world out there at all.

 

And to suggest that all theories, being subjective, are equally valid would undermine science.

It encourages people towards an anti-science relativism, in which nothing is regarded as true or objective knowledge.

By social, logical and empirical verification we do demonstrably shift our descriptions from the subjective towards what may well be called objective.

 

State change v system mutation

Unfortunately, second order cybernetics can mislead systems thinkers about classical cybernetics.

Classical cybernetics emerged out of thinking about biological and mechanical control systems.

A system responds in a regular way to changes in its environment or another (coupled) system.

The term adaptation usually means system state change - changing state variable values in response to events.

It often means homeostasis - maintaining given variable values in a desirable range - but not all systems are homeostatic.

The trajectory of a system’s state change (be it oscillating, linear, curved or jagged) is an inexorable result of the system’s rules.
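
As a rough illustration of state change in this classical sense (a toy sketch in Python; the thermostat, its set point and its rule are invented for illustration): the state variables change in response to events, while the system's rules stay fixed.

class Thermostat:
    """Toy homeostat: holds temperature near a set point by a fixed rule."""

    def __init__(self, set_point=21.0, temperature=18.0):
        self.set_point = set_point
        self.temperature = temperature   # state variable
        self.heater_on = False           # state variable

    def step(self, outside_effect):
        """One state change: variable values change; the rule itself never does."""
        self.heater_on = self.temperature < self.set_point
        self.temperature += (1.0 if self.heater_on else 0.0) + outside_effect

t = Thermostat()
for effect in [-0.5, -0.5, -0.5, -0.5]:          # a cold environment
    t.step(effect)
    print(round(t.temperature, 1), t.heater_on)  # state values change; the system's laws do not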

 

Second-order cybernetics is often applied to thinking about social organisations.

Here, the term adaptation often means system mutation or evolution – changing the system’s state variables, roles and rules.

This changes the very nature of the system; it changes its laws.

The trouble is that continual adaptation or reorganisation of a system undermines the general concept of a system – which is regularity.

Consequently, second-order cybernetics tends to undermine more general system theory.

If we don't distinguish an ever-evolving social network from the various modellable systems it may realise, the concept of a system evaporates.

 

Self-organisation

What might be called self-organisation appears in many different forms.

There are self-regulating, self-sustaining, and self-stabilising systems.

 

However, Ashby and Maturana rejected the notion of a “self-organising system”.

It undermines the very idea of a system in classical cybernetics and more general system theory.

For sure, the actors who play roles in a system may also observe it, and agree to change its variables, roles or rules.

But whenever actors discuss and agree such a change, they are (for that time) acting in a higher level or meta system.

And once the change is made, the actors (still members of the same social network) now act in new roles in a new system (or system generation).

 

See “Self-organising systems” for further discussion.

Further reading

von Foerster (1981) "Notes on an epistemology for living things", in Observing Systems, The Systems Inquiry Series, Intersystems Publications, pp. 258-271.

von Foerster (2007) "Understanding Understanding: Essays on Cybernetics and Cognition", p. 212, Springer Science & Business Media.

von Foerster in "Ethics and Second-Order Cybernetics"

 

This is one of many companion papers that analyse some systems thinkers’ ideas.

·       Read Ashby’s ideas for an introduction to his ideas about cybernetics, mostly integral to general system theory.

·       Read Ackoff’s ideas on the application of general system theory (after von Bertalanffy and others) to management science.

·       Read Ashby’s ideas about variety on his measure of complexity and law of requisite variety.

·       Read Beer’s ideas on the application of cybernetics (after Ashby and others) to management science.

·       Read Von Foerster’s ideas on ideas attributed to Heinz von Foerster and his second order cybernetics.

 

Further reading on the “System Theory” page at http://avancier.website includes:

Boulding’s ideas, Checkland’s ideas, Luhmann’s ideas, Marx and Engels’ ideas, Maturana’s ideas and Snowden’s ideas.