The Physics of Systems
Copyright Graham Berrisford 2018. Last updated 12/02/2019 23:06
One of a hundred papers on the System Theory page at http://avancier.website.
Find that page for a link to the next System Theory for Architects Tutorial in London.
The systems of interest here are islands of orderly behavior in the ever-unfolding process of the universe.
This paper is about the physics of systems, which barely concerns us most of the time.
In systems thinking, we normally take the physics for granted.
Nevertheless, before we delve into the evolution and philosophy of system theory, here are a few thoughts relevant to it.
To begin with, there was a lot of energy.
The temperatures were incredibly high, much higher than the interior of the sun.
In physics, energy is the quantitative property that must be transferred to an object in order to perform work on, or to heat, the object.
Energy is a conserved quantity; the law of conservation of energy states that energy can be converted in form, but not created or destroyed. (Wikipedia 12/02/2019)
Very soon after the big bang, some of the energy was converted into particles of matter called “quarks”.
What follows is edited from a discussion with Murray Gell-Mann.
After a tiny fraction of a second, quarks fall apart into other particles, such as the omega-minus:
· the omega-minus can decay into a neutral pion and a xi-minus,
· the pion decays into photons,
· the xi-minus decays into a negative pion and a lambda,
· the lambda decays into a negative pion and a proton.
Thus, energy was transformed into various kinds of matter.
Today, quarks don’t normally exist in isolation.
They occur when cosmic rays (which are mostly protons) strike atomic nuclei in the earth’s atmosphere, and they can be produced in particle collisions in an accelerator.
At much lower temperatures, the matter around us is composed of atoms, made of neutrons, protons and electrons.
You can’t normally see quarks directly, because they are permanently trapped inside other particles like neutrons and protons.
How to describe a quark?
In classical physics you could think of a quark as a point.
In quantum mechanics a quark is not exactly a point; it’s quite a flexible object.
Sometimes it behaves like a point, but it can be smeared out a little.
Sometimes it behaves like a wave.
Notice that physicists have two ways of looking at matter: as particles or as waves.
Neither description is “right”; but both are true to the extent that they are useful for particular purposes.
The systems described by system theorists are ways of looking at the world, abstract descriptions of reality.
A system does not have to be a perfect model of what is described; it only has to be true to the extent that it is useful.
Physicists consider our world to be embedded in a four-dimensional space-time continuum.
It is called a “continuum” because it is assumed that space and time can be subdivided without any limit to size or duration.
In normal life, we divide the universe for the purposes of description.
We don’t attempt to describe the whole universe, over all time.
Nor do we attempt to define atomic particles, or nanosecond processes.
Somewhere between the two, we divide the world into discrete entities and events.
Sometimes the divisions correspond to phase (solid, liquid, gas) boundaries.
Sometimes the divisions are biological or logical boundaries.
The external-internal duality
Having divided the universe into discrete entities, we can look at each entity from the outside.
Suppose we encapsulate part of space in an imaginary three-dimensional box.
Over time, we can observe what goes into the box and what comes out of it.
We can describe the box as transforming input matter and energy into output matter and energy.
We can also look inside the box and describe what it contains and/or how it works.
This in-out duality is an important part of systems thinking.
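The two views of a divided-off entity can be sketched in code. This is an illustrative sketch only, not from the paper; the `Kettle` example and its numbers are assumptions, chosen to show the same system described as a black box (what crosses the boundary) and as a white box (internal state and mechanism).

```python
class Kettle:
    """White-box view: internal state and mechanism are visible."""
    def __init__(self):
        self.water_temp = 20.0  # internal state, hidden from the black-box view

    def apply_energy(self, joules_in):
        # Internal working: input energy raises the water temperature,
        # and a fraction is lost across the boundary as heat.
        self.water_temp += joules_in / 100.0
        heat_out = joules_in * 0.1
        return heat_out

def observe_black_box(system, joules_in):
    """Black-box view: record only what goes into the box and what comes out."""
    heat_out = system.apply_energy(joules_in)
    return {"in": joules_in, "out": heat_out}

kettle = Kettle()
print(observe_black_box(kettle, 500.0))  # {'in': 500.0, 'out': 50.0}
```

The black-box observer can describe the transformation of inputs to outputs without ever knowing that `water_temp` exists; the white-box description adds how the box works inside.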
The structure-behavior duality
We look at matter as composed of discrete entities or structures that exist in space and move through time.
We think of energy also as either stored in a structure, or flowing between structures in various ways (e.g. light, heat, force and sound).
This structure-motion duality appears in many pairs of words we use to describe the world.
This duality is another important part of systems thinking.
· Structure: what the world is made of.
· Behavior: what happens, what is done.
The system quadrant
A system can be described from various viewpoints; the quadrant crosses the external-internal duality with the structure-behavior duality.
Life on earth began at least 3,500 million years ago, possibly earlier.
It evolved to the point that humans (at least physicists) started to understand energy, matter, space and time.
We all presume time inexorably moves forwards.
Physicists can see no reason why time cannot flow backwards as well.
Time is change, or the processes of change.
Suppose you stood outside time, looked at the world, and saw people frozen in motion.
You would assume, for them, time had stopped.
The people are neither getting older nor younger.
If time moves forward, the processes of physics, chemistry and biology run in one direction.
According to thermodynamics, disorder across the universe increases.
You lay down memories and grow older.
If time moves backward, the processes of physics, chemistry and biology run in the opposite direction.
According to thermodynamics, order across the universe increases.
You forget things and grow younger.
So, the only time direction we can appreciate is the forward one.
In describing systems, we take it for granted that processes run forward from start to end.
An asteroid is formed in space, then flies around for eons until it crashes into the moon.
The current structure (or state) of the system can be seen as a kind of memory of all that has happened to it.
The asteroids are recorded as craters on the moon’s surface.
Recording information in memory is fundamental to business and software systems.
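The idea of current state as a memory of everything that has happened can be sketched as a fold over a history of events. This is an illustrative sketch, not from the paper; the event and state shapes are assumptions, echoing the asteroid-and-crater example.

```python
# State as memory: the current structure is the accumulation of all past events,
# as the moon's cratered surface records past asteroid impacts.

def apply_event(state, event):
    # Each impact leaves a permanent trace in the structure.
    return {"craters": state["craters"] + [event["location"]]}

initial_state = {"craters": []}
events = [{"location": (10, 42)}, {"location": (7, 3)}]

state = initial_state
for e in events:
    state = apply_event(state, e)

print(state)  # {'craters': [(10, 42), (7, 3)]}
```

Business and software systems record information in just this way: the database (state) is the residue of all the transactions (events) applied to it so far.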
Thermodynamics is the branch of physics about energy and entropy.
The opposite of entropy is called negative entropy or “negentropy”.
Some systems thinkers speak of systems maintaining negentropy.
This sounds a little pretentious, so in place of entropy and negentropy, we may instead say order and disorder.
As time moves forward, disorder across the universe increases.
As time moves backward, order across the universe increases.
Generally, a system must consume energy to create or maintain order in its structures.
Biological systems find energy in sunlight and food.
Many man-made systems find their energy in oil or electricity.
Software systems consume electricity of course.
Erwin Schrödinger (1887–1961) discussed the thermodynamic processes by which organisms maintain themselves in an orderly state.
Ludwig von Bertalanffy (1901-1972) considered an organism as a thermodynamic system in which homeostatic processes keep entropy at bay.
“By importing complex molecules high in free energy, an organism can maintain its state, avoid increasing entropy…." Bertalanffy
The increase of order inside an organism is more than paid for by an increase in disorder outside by the loss of heat.
Orderliness is an essential characteristic of a system; every system is orderly in some way.
In cybernetics, a system is an island of orderly behavior in the ever-unfolding process of the universe.
The system must consume energy to create and maintain the order of its state.
In designing mechanical systems, energy can be a limiting cost.
In designing information systems, the usual assumption is that the energy requirement is trivial.
So, thermodynamics is tangential to most practical applications of general system theory.
“Cybernetics depends in no essential way on the laws of physics.”
“In this discussion, questions of energy play almost no part; the energy is simply taken for granted.” Ashby
While homeostasis was a focus of many early system theorists, it is not a property of all systems.
For some scientists, negentropy = order = information.
“Living matter evades the decay to thermodynamical equilibrium by homeostatically maintaining negative entropy (today this quantity is called information) in an open system.” Cornell University web site.
In 2009, Mahulikar & Herwig re-defined the negative entropy (negentropy) of a dynamically ordered sub-system.
Negentropy = the entropy deficit of an ordered system relative to its surrounding chaos.
Negentropy might be equated with “free energy” in physics or with “order”; some equate it with "information".
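One common formalisation (not given in this paper, but consistent with the Mahulikar & Herwig idea of an entropy deficit) defines negentropy as the gap between a system's actual entropy and the entropy it would have at equilibrium with its surroundings:

```latex
J = S_{\max} - S
```

where $S$ is the entropy of the ordered sub-system and $S_{\max}$ its entropy at equilibrium with the surrounding chaos; $J \geq 0$, and a larger $J$ means a more ordered system.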
Hmm… in cybernetics and systems thinking "information" usually has a more specific meaning.
Information is the meaning created or found by an actor in a message or memory.
To encode meaning in a message or memory is to create a very specific kind of order.
To decode meaning from a message or memory is to transform order from one form to another.
To do either, encode or decode, requires energy.
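Encoding and decoding can be sketched concretely. This is an illustrative sketch, not from the paper; the code book is an arbitrary assumption, chosen only to show that encoding imposes a very specific order on symbols and decoding transforms that order back.

```python
# Encoding meaning creates a specific kind of order (a chosen bit sequence);
# decoding transforms that order back into another form (characters).

CODE = {"A": "01", "B": "10", "C": "11"}  # an assumed, arbitrary code book
DECODE = {bits: ch for ch, bits in CODE.items()}

def encode(message):
    # Create order: map each character to its chosen two-bit code.
    return "".join(CODE[ch] for ch in message)

def decode(bits):
    # Transform order from one form to another: read the bits back two at a time.
    return "".join(DECODE[bits[i:i + 2]] for i in range(0, len(bits), 2))

print(encode("CAB"))          # 110110
print(decode(encode("CAB")))  # CAB
```

In a physical implementation, both steps cost energy: some device must set bits into the encoded order, and some device must read and transform them.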
Usually, we take it for granted that information processing consumes relatively little energy compared with mining, machines and material processing.
However, the energy needed to do information processing is becoming an issue.
The larger and more complex the system, the more energy it needs.
Systems in competition tend to optimise their use of energy.
This “optimal use of energy” principle means that systems which consume less energy are more likely to survive.
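A toy simulation can make the principle concrete. This sketch is not from the paper or the Cornell source; the systems, budget and numbers are assumptions, chosen only to show that when energy is limited, designs that need more than they can obtain are non-randomly eliminated.

```python
import random

random.seed(1)

# Each competing system has an energy requirement per cycle (arbitrary units).
systems = {f"sys{i}": random.uniform(1.0, 10.0) for i in range(8)}

def run_cycle(systems, energy_available):
    # Equal shares of a fixed energy budget; systems whose requirement
    # exceeds their share cannot maintain their order and are eliminated.
    share = energy_available / len(systems)
    return {name: need for name, need in systems.items() if need <= share}

for _ in range(5):
    if systems:
        systems = run_cycle(systems, energy_available=30.0)

print(sorted(systems))  # the survivors are the least demanding designs
```

Each elimination raises the survivors' share, so the population settles on the designs whose energy use fits the budget, which is the selection effect the principle describes.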
“Nature's many complex systems--physical, biological, and cultural--are islands of low-entropy order within increasingly disordered seas of surrounding, high-entropy chaos.
Energy is a principal facilitator of the rising complexity of all such systems in the expanding Universe, including galaxies, stars, planets, life, society, and machines.
Energy flows are as centrally important to life and society as they are to stars and galaxies.
Operationally, those systems able to utilize optimal amounts of energy tend to survive and those that cannot are non-randomly eliminated.” Cornell University web site.
This “optimal use of energy” principle has been at work in the evolution of biological systems.
But where minimising energy consumption is of little or no advantage, evolution proceeds in a suboptimal way.
Where resources are cheap, systems tend to sub-optimise their use of energy.
Human society uses much energy keeping the internal temperature of buildings in a comfortable range.
The highest energy consumption per head is found in countries that are:
· too cold: Iceland, Canada
· too hot: Trinidad and Tobago, Qatar, Kuwait, Brunei Darussalam, United Arab Emirates, Bahrain, Oman, or
· too rich to care about the cost: Luxembourg, and the United States.
But not, notice, countries in Africa or South America, because another factor is money.
The energy consumed by software is becoming a problem for society – not least in global warming.
The fear is that many modern software systems are vastly over-complex and suboptimal, because we have been careless in their design and simply given them as much memory space and electricity as they need.
Beware the spiral to inefficiency.
Inefficiency arises from:
· having only one design option
· freely expanding resources
· preventing change to a design
· long generation/change cycles
· pricing based on desire to have the system.
Optimisation arises through:
· having competing designs
· enabling change to a design
· short generation/change cycles
· pricing based on cost of making the system.
Kenneth Lloyd has pointed me to a paper in the Journal of Artificial Intelligence Research.
The paper asks: how to discover and improve solutions to complex problems?
Experiments with robots suggest that complexifying evolution discovers significantly more sophisticated strategies than evolution of networks with fixed structure.
The experimental results suggested three trends:
(1) as evolution progresses, the complexity of solutions increases,
(2) evolution uses complexification to elaborate on existing strategies, and
(3) complexifying coevolution is significantly more successful in finding highly sophisticated strategies than non-complexifying coevolution.
The suggestion is that, to discover and improve complex solutions, evolution should be allowed to complexify as well as optimize.
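The contrast between fixed-structure and complexifying search can be sketched in a toy hill-climber. This is not the JAIR paper's experiment; the target vector, mutation rates and fitness function are assumptions, chosen only to show that a search allowed to add structure can represent solutions a fixed-structure search cannot.

```python
import random

random.seed(0)
TARGET = [3.0, 1.0, 4.0]  # an assumed target the solution must approximate

def fitness(genome):
    # Higher is better: negative squared distance to the target,
    # with short genomes padded by zeros (missing structure).
    padded = genome + [0.0] * (len(TARGET) - len(genome))
    return -sum((a - b) ** 2 for a, b in zip(padded, TARGET))

def evolve(genome, steps, complexify):
    for _ in range(steps):
        candidate = list(genome)
        if complexify and len(candidate) < len(TARGET) and random.random() < 0.1:
            candidate.append(0.0)  # structural mutation: elaborate the solution
        i = random.randrange(len(candidate))
        candidate[i] += random.gauss(0.0, 0.5)  # parameter mutation: optimise
        if fitness(candidate) > fitness(genome):
            genome = candidate
    return genome

fixed = evolve([0.0], steps=2000, complexify=False)  # structure frozen at size 1
grown = evolve([0.0], steps=2000, complexify=True)   # structure allowed to grow
print(len(fixed), len(grown), round(fitness(fixed), 2), round(fitness(grown), 2))
```

The fixed-structure search can at best tune its single parameter, so its fitness is bounded away from the optimum; the complexifying search can first elaborate its structure and then optimise it, illustrating the paper's suggestion that evolution should be allowed to complexify as well as optimize.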
Complexification - incremental extension and elaboration of a system – has pros and cons.
Through successive releases, it seems, a software application grows larger than its additional features justify.
How to constrain complexification?
One way is to pitch a variety of different complexifications against each other in a competition for resources.
Enterprise architecture encourages standardising and integrating business roles and processes.
The intention is to reduce the variety of behaviours and remove duplications between systems.
In the baseline state, the enterprise has many small and disintegrated systems.
In the target state, the enterprise has fewer and more integrated systems.
This reduces the overall size and complexity of the smaller, silo business systems.
At the same time, it increases the overall size and complexity of the business-as-a-system.
And standardisation increases the population of actors who play roles in that one system.
This has pros and cons.
A limiting factor is the ability of humans to manage the business-as-a-system.