The physics of systems
Copyright Graham Berrisford 2018. Last updated 07/12/2019 11:36
One of a hundred papers on the System Theory page at http://avancier.website.
“Cybernetics depends in no essential way on the laws of physics.”
“In this discussion, questions of energy play almost no part; the energy is simply taken for granted.” Ashby
Nevertheless, though it barely concerns us most of the time, this paper is about the physics of systems.
Contents
Classical mechanics and systems
How does this relate to general system theory?
Chunking space and time
Spacetime is a model that fuses the three dimensions of space and the dimension of time into a four-dimensional continuum.
The term “continuum” implies each dimension can be subdivided with no limit to size or duration.
Nevertheless, classical mechanics does divide the universe into discrete, measurable chunks.
In computational physics/engineering the accuracy of solutions is improved by finer-grained chunking of space/time.
Sometimes, if the chunk size is incompatible with one set of physical laws, another set may be applied.
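For example, a minimal finite-difference sketch (my illustration, not from the paper; the names are hypothetical) shows how finer time-chunking improves accuracy when simulating free fall:

```python
# Illustrative sketch: Euler time-stepping of a falling object.
# The chunk size is the time step dt.
def simulate_fall(t_end, dt, g=9.81):
    pos, vel, t = 0.0, 0.0, 0.0
    while t < t_end - 1e-12:       # march time forward in chunks of dt
        pos += vel * dt            # update position from current velocity
        vel += g * dt              # update velocity from gravity
        t += dt
    return pos

exact = 0.5 * 9.81 * 2.0 ** 2      # closed-form distance after 2 seconds
coarse = abs(simulate_fall(2.0, 0.1) - exact)
fine = abs(simulate_fall(2.0, 0.001) - exact)
# the finer-grained chunking yields the smaller error
```

Halving dt roughly halves the error of this first-order scheme, which is why computational physics refines the mesh or time step until results converge.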
The object-motion duality
Physical reality is mysterious beyond human understanding.
Physicists observe and envisage that reality.
They tell us they describe it best in the mathematical equations of quantum mechanics.
Abstracting from the mathematics to ordinary words, they describe it in terms of particles or waves.
Quantum mechanics embraces classical mechanics as a special case - accurate enough in our everyday experience.
Classical (Newtonian) mechanics describes reality in terms of objects that occupy space and move through time.
It begins with the quantitative variables of an object such as its mass, velocity and acceleration, and forces acting on it.
Classical mechanics | Objects and motions <think in terms of> <represent> Physicists <observe and envisage> Physical realities |
We instinctively divide space and time into discrete entities and events
We usually divide space into entities that have physical phase boundaries (solid/liquid/gas).
And divide time into events that represent noticeable state changes, as when an object moves from place to place.
The structure-behavior duality in system description
The object-motion duality appears, in many ways, in the language we use to describe the world.
System describers typically classify system elements into structures and behaviors.
Structures in space (what the world is made of) | Behaviors over time (what happens, what is done)
Actors | Activities
Components | Co-operations
Entities | Events
Forms | Functions
Items | Interactions
Memories | Messages
Objects | Operations
Performers | Processes
Roles | Rules
Stores (Stocks) | Streams (Flows)
Structures occupy addressable places in space.
Some structures are active, they act; others are passive, they are acted on.
Behaviors take place over time, and move or otherwise change structures.
The in-out duality in system description
Having boxed the universe into discrete objects, we can look at each object in two ways.
From the outside, we can see it as consuming and/or producing matter and energy.
Looking inside the box, we can describe what it contains and/or how it works.
The two dualities above are important in systems thinking.
System thinkers typically encapsulate a system behind an in-out boundary, and perceive and describe systems in terms of actors (structures) that occupy space and activities (behaviors) that occur over time.
This table maps one duality to the other.
| Behavioral perspective | Structural perspective |
External perspective | Discrete events | System interface |
Internal perspective | Activities | Actors |
Here is a different version of that same quadrant used in our system architect training.
| Behavioural view | Structural view |
External view | Service contract: an end-to-end process defined as external entities see it. | Interface definition: a declaration of available and accessible behaviours |
Internal view | Process: a sequence of activities performed by one or more components. | Component: a subsystem capable of performing one or more behaviours |
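To make the quadrant concrete, here is a minimal sketch (my own illustration; the class and method names are hypothetical). The public method is the declared interface (external view); the hidden store and process steps are the component internals (internal view):

```python
# Sketch: one object seen two ways - its interface and its internals.
class OrderSystem:
    """External view: the declared, accessible behaviours."""

    def __init__(self):
        self._orders = []             # internal component: a store

    def place_order(self, item):      # the service contract callers see
        self._orders.append(item)     # internal process step
        return len(self._orders)      # output crossing the boundary

system = OrderSystem()
receipt = system.place_order("widget")
# callers see only place_order; the store behind it stays hidden
```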
Quantum mechanics has superseded classical mechanics as the foundation of physics.
It is needed to explain and predict microscopic processes at the molecular, atomic, and sub-atomic level.
It has a wider scope, and encompasses classical mechanics as a sub-discipline which applies under certain restricted circumstances.
Scientists believe our universe started with a big bang about 14,000 million years ago.
To begin with there was a lot of energy.
The temperatures were incredibly high, much higher than the interior of the sun.
Very soon after the big bang, some of the energy was converted into particles of matter called “quarks”.
After a tiny fraction of a second, quarks fell apart into other particles, such as the omega-minus:
· the omega-minus can decay into a neutral pion and a xi-minus,
· the pion decays into photons,
· the xi-minus decays into a negative pion and a lambda.
· the lambda decays into a negative pion and a proton.
Thus, energy was transformed into various kinds of matter.
Today, quarks don’t normally exist in isolation.
They occur when cosmic rays (which are mostly protons) strike atomic nuclei in the earth’s atmosphere, and they can be produced in particle collisions in an accelerator.
Today, at much lower temperatures, the matter around us is composed of atoms, made of neutrons, protons and electrons.
You can’t normally see quarks directly, because they are permanently trapped inside other particles like neutrons and protons.
How to describe a quark?
In classical physics you could think of a quark as a point.
In quantum mechanics a quark is not exactly a point; it’s quite a flexible object.
Sometimes it behaves like a point, but it can be smeared out a little.
Sometimes it behaves like a wave.
(Edited from a discussion with Murray Gell-Mann.)
The Schrödinger equation represents the possibilities of an electron's characteristics as a wave of chance.
The particle is everywhere at a bunch of speeds … some more likely than others.
A split second after the particle starts moving, you can be fairly sure it's still near its starting point.
Over time, the range of its possible positions and speeds expands.
However, Schrödinger's equation is reversible, which means a 'smeared' particle can localise back into a small region of space over the same time period.
Notice there are two ways of looking at subatomic entities, as particles or waves.
Quantum mechanics | Particles or waves? <think in terms of> <represent> Physicists <observe and envisage> Quantum entities |
Neither description is “right”; but both are true to the extent that they are useful for particular purposes.
Ashby, Ackoff, Checkland and other systems thinkers distinguish abstract systems from concrete systems.
This triangular graphic shows the abstraction of system description from the physical world.
System theory | Abstract systems (descriptions) <think in terms of> <represent> System theorists <observe and envisage> Concrete systems (realisations) |
These papers take this triangular view of system theory as axiomatic.
An abstract system is a description of how some part of the world behaves, or should behave.
Ashby spoke of what is called a Discrete Event-Driven System (DEDS).
In his kind of system, there are:
· An abstract system of state variables whose values are changed by events.
· A concrete system of physical entities that realises the abstract system.
Cybernetics | Event and state variable types <think in terms of> <represent> Cyberneticians <observe and envisage> Physical entities changing |
The abstract system does not have to be a perfect model of what is described; it only has to be accurate enough to be useful.
An exception is the code of a software system, which is expected to be a perfect representation of the run-time system.
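Ashby's two-part scheme can be sketched in code (my illustration; the thermostat-style variables and event names are hypothetical, not Ashby's). The dictionary of state variables is the abstract system; a running program holding those values would be one concrete realisation:

```python
# Sketch of Ashby's scheme: state variables changed by discrete events.
class Thermostat:
    def __init__(self):
        # the abstract system: named state variables
        self.state = {"temperature": 20, "heater_on": False}

    def handle(self, event):
        # discrete events change the values of the state variables
        if event == "temp_drop":
            self.state["temperature"] -= 1
        elif event == "temp_rise":
            self.state["temperature"] += 1
        self.state["heater_on"] = self.state["temperature"] < 18

t = Thermostat()
for event in ["temp_drop"] * 4:   # four discrete events arrive
    t.handle(event)
# temperature has fallen to 16, so the heater variable is now True
```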
Scientists discuss thermodynamics with reference to a system and its surroundings.
First law of thermodynamics: Energy cannot be created or destroyed inside an isolated system.
Energy is the capacity for doing work; it may exist in various forms, and be transferred from one body to another.
It can be input or output.
Second law of thermodynamics: The entropy of an isolated system always increases.
Entropy is the measure of a system’s thermal energy per unit of temperature that is not available for doing useful work.
Energy transforms and spreads out from areas where it's most intense.
The second law of thermodynamics is a principle more than a rule.
In classical physics, it explains why the balls on a pool table don’t reform the starting triangle.
If you saw pool balls reform their starting triangle, it would be a sobering experience.
It is so incredibly unlikely you would be shocked.
You'd probably need to stare at billions of pool tables forever to see it happen once.
In quantum mechanics, such strange things do happen.
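The second law can be illustrated with simple bookkeeping (my sketch; the figures are arbitrary). Using dS = Q/T for each body, heat flowing spontaneously from hot to cold always increases the total entropy of the isolated pair:

```python
# Sketch: entropy bookkeeping for heat Q flowing from hot to cold.
def entropy_change(q, t_hot, t_cold):
    ds_hot = -q / t_hot       # the hot body loses entropy
    ds_cold = q / t_cold      # the cold body gains more entropy
    return ds_hot + ds_cold   # net change for the isolated pair

total = entropy_change(q=1000.0, t_hot=500.0, t_cold=300.0)
# total is positive: spontaneous heat flow increases overall entropy
```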
Negentropy as a kind of order – created by consuming energy
The opposite of entropy is called negative entropy or “negentropy”.
Systems thinkers speak of systems maintaining negentropy - or maintaining order.
To maintain the order inside a system requires the input of energy from the rest of the universe.
The maintenance of order inside the system is more than paid for by an increase in disorder outside - by the output of heat.
Erwin Schrödinger (1887–1961) discussed the thermodynamic processes by which organisms maintain themselves in an orderly state.
Ludwig von Bertalanffy (1901-1972) considered an organism as a thermodynamic system in which
homeostatic processes keep entropy at bay.
“By importing complex molecules high in free energy, an organism can maintain its state, avoid increasing entropy…." Bertalanffy
“Nature's many complex systems - physical, biological, and cultural - are islands of low-entropy order within increasingly disordered seas of surrounding, high-entropy chaos.” Cornell University web site.
Simply put, the plants in the biosphere maintain its order by consuming high-level energy from the sun.
Each animal maintains its internal order by consuming high-level energy in the form of complex molecules.
It increases the disorder of its surroundings by producing lower-level waste products (simpler molecules) and heat energy.
However, thermodynamics is irrelevant to most systems thinking, certainly at the level of social systems.
In Ashby's cybernetics and Forrester's System Dynamics, the
provision of sufficient energy to drive the system is taken for granted.
Introduction to cybernetics, 1956:
Page 3: "In this discussion, questions of energy play almost no part—the energy is simply taken for granted. Even whether the system is closed to energy or open is often irrelevant."
Page 136: "Sometimes the second law of thermodynamics is appealed to, but this is often irrelevant to the systems discussed here."
Information as a kind of order – created by coding and decoding
There is an analogy to be drawn.
In thermodynamics, think of energy as either stored in a structure, or flowing between structures (e.g. light, heat, force and sound).
In cybernetics, think of information as either stored in a memory structure, or flowing between structures in messages.
But it seems to me some confusion arises from the ambiguity
of "information".
In the domain of information theory, information exchange
implies a sender and a receiver who code and decode a structure with two or
more states.
In the domain of thermodynamics, negentropy = order =
information, which I'd rather call "information potential".
“Living matter evades the decay to
thermodynamical equilibrium by homeostatically maintaining negative entropy
(today this quantity is called information) in an open system.” Cornell
University web site.
In 2009, Mahulikar & Herwig re-defined the negentropy of a dynamically ordered sub-system.
Negentropy = the entropy deficit of an ordered system relative to its surrounding chaos.
“Negentropy might be equated with “free energy” in physics or with “order”; some equate it with "information".”
In
cybernetics and systems thinking "information" usually has a more
specific meaning.
Information is the meaning created or found by an actor in a message or memory – using energy.
To encode meaning in a message or memory is
to create a very specific kind of order.
To decode meaning from a message or memory is to transform order from one form into another.
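Shannon's information theory quantifies this specific kind of order. A minimal sketch (my own illustration) computes the entropy of a coded message, H = −Σ p·log₂p bits per symbol; an orderly, repetitive message carries fewer bits per symbol than a varied one:

```python
# Sketch: Shannon entropy of a coded message, in bits per symbol.
from collections import Counter
from math import log2

def entropy_bits(message):
    counts = Counter(message)
    n = len(message)
    # H = -sum over symbols of p * log2(p)
    return -sum((c / n) * log2(c / n) for c in counts.values())

uniform = entropy_bits("abcdabcd")  # four equally likely symbols -> 2.0 bits
skewed = entropy_bits("aaaaaaab")   # highly ordered, well under 1 bit
```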
Remember the second law above: The entropy of an isolated system always increases.
Energy transforms and spreads out from areas where it's most intense.
This is a one-directional law; whereas most other laws of physics can be reversed and still make sense.
Time is change, or the processes of change.
We all presume time inexorably moves in one direction - forwards.
Suppose you stood outside time, looked at the world, and saw people frozen in motion.
You would assume, for them, time had stopped.
The people are growing neither older nor younger.
Physicists say time can flow backwards as well as forwards.
“Physicists Just
Reversed Time on The Smallest Scale by Using a Quantum Computer” https://www.sciencealert.com/physicists-successfully-put-time-into-reverse-on-the-smallest-scale
Surely these physicists did not reverse time; they only simulated reversing time using a computer?
If they did reverse time outside of the computer, they would not only undo the experiment, but forget they intended to do it!
At the macro scale at which our brains work, the second law of thermodynamics applies.
As time moves forward, the processes of physics, chemistry and biology run in one direction.
You create memories by using energy to create orderly patterns in your brain – and grow older.
As time moves backward, the processes of physics, chemistry and biology run in the opposite direction.
You forget things and grow younger.
So, the only time direction you can appreciate is the forward one.
In describing systems, we take it for granted that processes run forward from start to end.
An asteroid is formed in space, then flies around for eons until it crashes into the moon.
The current structure (or state) of the system can be seen as a memory of all that has happened to it.
And to create or maintain order in its internal structure, a system must consume energy.
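The idea that state is a memory of history can be sketched as a fold over an event log (my illustration; the account-style events are hypothetical). Replaying the events forward, from start to end, reconstructs the current structure:

```python
# Sketch: current state as a fold over everything that has happened.
from functools import reduce

def apply_event(state, event):
    kind, amount = event
    if kind == "deposit":
        return state + amount
    if kind == "withdraw":
        return state - amount
    return state                      # unknown events leave state unchanged

history = [("deposit", 100), ("withdraw", 30), ("deposit", 5)]
state = reduce(apply_event, history, 0)   # replay the history forward
# state == 75: the structure encodes the whole history of events
```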
Energy is the quantitative property that must be transferred to an object in order to perform work on, or to heat, the object.
Energy is a conserved quantity; the law of conservation of energy states that energy can be converted in form, but not created or destroyed. (Wikipedia 12/02/2019)
Biological systems find energy in sunlight and food.
Many man-made systems find their energy in oil or electricity.
Software systems consume electricity of course.
To encode or decode information requires energy.
Usually, we take it for granted that information processing consumes relatively little energy compared with mining, machinery and material processing.
However, the energy needed to do information processing is becoming an issue.
Principle: systems in competition tend to optimise their use of energy
This “optimal use of energy” principle means that systems which consume less energy are more likely to survive.
“Nature's many complex systems--physical, biological, and cultural--are islands of low-entropy order within increasingly disordered seas of surrounding, high-entropy chaos.
Energy is a principal facilitator of the rising complexity of all such systems in the expanding Universe, including galaxies, stars, planets, life, society, and machines.
Energy flows are as centrally important to life and society as they are to stars and galaxies.
Operationally, those systems able to utilize optimal amounts of energy tend to survive and those that cannot are non-randomly eliminated.” Cornell University web site.
This “optimal use of energy” principle has been at work in the evolution of biological systems.
But where minimising energy consumption is of little or no advantage, evolution proceeds in a suboptimal way.
Principle: where resources are cheap, systems tend to sub-optimise use of energy
Human society uses much energy in keeping the internal temperature of buildings in a comfortable range.
The highest energy consumption per head is found in countries that are too cold (Iceland, Canada).
And too hot (Trinidad and Tobago, Qatar, Kuwait, Brunei Darussalam, United Arab Emirates, Bahrain, Oman).
But not in countries in Africa or South America, because another factor is money.
The highest energy consumption per head is also found in countries too rich to care about the cost (Luxembourg and the United States).
The energy consumed by software is becoming a problem for society – not least in global warming.
The fear is that many modern software systems are vastly over-complex and suboptimal.
Because we have been careless in their design, and given them as much memory space and electricity as their designs demand.
Beware the spiral to inefficiency
Inefficiency arises from | Optimisation arises through
Having only one design option | Having competing designs
Freely expanding resources | Limiting resources
Preventing change to a design | Enabling change to a design
Long generation/change cycles | Short generation/change cycles
Pricing based on desire to have the system | Pricing based on cost of making the system
Kenneth Lloyd has pointed me to a paper in the Journal of Artificial Intelligence Research.
The paper asks: how to discover and improve solutions to complex problems?
Experiments with robots suggest that complexifying evolution discovers significantly more sophisticated strategies than evolution of networks with fixed structure.
The experimental results suggested three trends:
(1) As evolution progresses, complexity of solutions increases,
(2) Evolution uses complexification to elaborate on existing strategies
(3) Complexifying coevolution is significantly more successful in finding highly sophisticated strategies than non-complexifying coevolution.
The suggestion is that, to discover and improve complex solutions, evolution should be allowed to complexify as well as optimize.
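A toy sketch of that suggestion (my own, far simpler than the paper's neuro-evolution experiments; all names are hypothetical): a hill-climber whose genome may grow can eventually match a long target, while a fixed-structure genome can only optimise within its fixed size:

```python
# Sketch: complexifying vs fixed-structure hill-climbing evolution.
import random

def fitness(genome, target):
    # one point per position where the genome matches the target sequence
    return sum(1 for g, t in zip(genome, target) if g == t)

def evolve(target, start_len, complexify, steps=2000, seed=0):
    rng = random.Random(seed)
    genome = [rng.choice("ab") for _ in range(start_len)]
    for _ in range(steps):
        child = list(genome)
        child[rng.randrange(len(child))] = rng.choice("ab")  # point mutation
        if complexify and rng.random() < 0.1:
            child.append(rng.choice("ab"))   # structural mutation: grow
        if fitness(child, target) >= fitness(genome, target):
            genome = child                   # keep equal-or-better children
    return fitness(genome, target)

target = list("abbababba" * 4)               # a 36-symbol target
fixed = evolve(target, 5, complexify=False)  # optimise only
grown = evolve(target, 5, complexify=True)   # optimise and complexify
# the complexifying run elaborates beyond its starting structure
```

The fixed run can never score above its genome length of 5, however long it optimises; the complexifying run grows its structure and elaborates on what it has.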
Suppose EA leads me to integrate my CRM and Billing systems.
Think thermodynamics: the integration increases the order and complexity of the enterprise-as-a-digital-system.
Think trade-offs: it reduces the energy spent in the human activity system, but increases the energy required in the computer activity system, and the maintenance thereof.
Enterprise architecture encourages standardising and integrating business roles and processes.
The intention is to reduce the variety of behaviours and remove duplications between systems.
In the baseline state, the enterprise has many small and disintegrated systems.
In the target state, the enterprise has fewer and more integrated systems.
This reduces the overall size and complexity of the smaller, siloed business systems.
At the same time, it increases the overall size and complexity of the business-as-a-system.
And standardisation increases the population of actors who play roles in that one system.
This has pros and cons.
A limiting factor is the ability of humans to manage the business-as-a-system.
Complexification - incremental extension and elaboration of a system – has pros and cons.
Through successive releases, it seems a software application grows larger than its additional features justify.
How to constrain complexification?
One way is to pitch a variety of different complexifications against each other in a competition for resources.