Other information theories
This page is published under the terms of the licence summarized in the footnote.
All free-to-read materials on the Avancier web site are paid for out of income from Avancier’s training courses and methods licences.
If you find the web site helpful, please spread the word and link to avancier.website in whichever social media you use.
Our information theory is about the creation, use and meanings of messages in social, business and software systems.
This paper mentions other information theories.
Contents
The scope of “information” in this work
Information in Quantum Electro Dynamics (QED)
Information in biology
Information in ISO 2382-1 (1993)
The scope of “information” in this work
This work may be seen as philosophical methodology based on a scientific view of the world, especially on biological evolution and the science of digital information systems.
For us, information appears in the act of informing – in the communication of knowledge from senders to receivers.
The elements of information are descriptive or directive qualities (including quantities).
Our focus is on information created and used in social, business and software systems (which all depend on communication).
And we use the term in relation to messages sent or stored with an intended purpose.
Business systems integration (a focus of enterprise architects) depends on information flowing in messages and stored in memories.
In today’s “information age”, businesses look to capitalise on information they glean from messages transmitted and stored in huge quantities.
So, our information theory is about the creation, use and meanings of messages in social, business and software systems.
It takes a "survival of the fittest" view of how information processing emerged.
It views business systems as formalised social systems, in which actors communicate to cooperate in activities.
However, scientists have developed a broader view of information.
Information Philosophy (I-Phi) <http://www.informationphilosopher.com> is a philosophical method grounded in science, especially in modern physics, biology, neuroscience, and the science of information.
Here, the definition of information is very broad.
“The simple definition of information is the act of informing - the communication of knowledge from a sender to a receiver that informs (literally shapes) the receiver.
By information we mean a quantity that can be understood mathematically and physically.
It corresponds to the common-sense meaning of information, in the sense of communicating or informing.
It is like the information stored in books and computers.
But it also measures the information in any physical object, like a snow crystal or a star like our sun, as well as the information in biological systems, including the genetic code, the cell structure, and the developmental learning of the phenotype.
Although some commentators would like to limit the term "information" to messages sent with an intended purpose, physical scientists have long included the structure in physical objects as something that can be measured by an observer and thus is also information.
Information philosophy recognizes material objects as "information structures," from which the pure information can be abstracted as meaningful knowledge.
Information in physical systems was connected to a measure of the structural order in a system as early as the nineteenth century.
Ludwig Boltzmann described an increase in the thermodynamic entropy as "lost information."
In 1929, Leo Szilard calculated the mean value of the quantity of entropy produced by a 1-bit ("yes/no") measurement as S = k log 2, where k is Boltzmann's constant.
Ludwig von Bertalanffy, Norbert Wiener, Claude Shannon, John von Neumann, and others all had similar views of the connection between physical entropy and abstract "bits" of information.”
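Szilard’s figure in the quote above can be checked in a few lines of Python (a sketch only; the “log” in the quote is the natural logarithm, and the value of Boltzmann’s constant used here is the exact SI value):

```python
import math

# Szilard (1929): the entropy produced by a one-bit ("yes/no")
# measurement is S = k log 2, where k is Boltzmann's constant
# and log is the natural logarithm.
k = 1.380649e-23          # Boltzmann's constant, J/K (exact SI value)
S = k * math.log(2)
print(S)                  # roughly 9.57e-24 J/K
```

This tiny quantity is the thermodynamic cost Szilard associated with acquiring one bit of information.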
Claude Shannon developed “information theory” about the limits on signal processing operations such as compression, storage and communication.
Shannon wrote: "The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point."
Shannon’s concern was maintaining signal quality in communication; his theory is about the physical content of a message.
Its application in technical communication mechanisms, including the internet, is largely taken for granted by enterprise, system and software architects.
Shannon also wrote: "Frequently the messages have meaning."
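Shannon’s measure quantifies information as the average “surprise” per symbol, independent of any meaning. A minimal Python sketch (the function name is illustrative, not from any source quoted here):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Bits per symbol: H = sum over symbols of p * log2(1/p)."""
    counts = Counter(message)
    n = len(message)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# Two equally frequent symbols carry 1 bit each; a constant message
# carries none, whatever it may "mean" to a human reader.
print(shannon_entropy("abab"))  # 1.0
print(shannon_entropy("aaaa"))  # 0.0
```

Note that the measure is blind to semantics: it treats “abab” and any other two-symbol message alike, which is exactly the limitation discussed below.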
Any variety in an energy flow or structure of matter is a potential message or signal.
But a signal that is meaningless to any sender or receiver is not a useful concept here.
Some systems thinkers refer to Shannon’s theory in discussion of linguistics and human perception.
But as long ago as 1956, Boulding observed that Shannon did not address the meaning of communication in social systems.
Information in Quantum Electro Dynamics (QED)
Thanks to Ron Segal for this link: http://www.dummies.com/how-to/content/string-theory-and-quantum-electrodynamics.html
"as [two electrons] get near to each other... the two particles communicate their electromagnetic information by emitting and absorbing a photon. A photon that acts in this manner is called a virtual photon or a messenger photon, because it’s created solely for the purpose of exchanging this information."
Electrons exchange information? Electrons have purposes?
This is a very different domain of knowledge – surely too remote from social and business systems to be drawn into the same conversation.
Information in biology
Today, information processing is seen as essential to all living systems addressed in biology.
“Living is information processing: from molecules to global systems.”
“the whole of life can be viewed… as an integrated information processing system” (source lost?)
The quoted text below is from a paper by Sara Imari Walker and Paul C.W. Davies: “The Algorithmic Origins of Life”, Journal of the Royal Society Interface, 2013, DOI: 10.1098/rsif.2012.0869 (open access), summarised at http://www.kurzweilai.net/an-information-processing-approach-to-the-origin-of-life.
The authors shift attention from the “hardware” — the chemical basis of life — to the “software” — its information content.
They suggest that the crucial distinction between non-life and life is the way living organisms manage the information flowing through the system.
“We propose that the transition from non-life to life is unique and definable,” said Davies.
“We suggest that life may be characterized by its distinctive and active use of information, thus providing a roadmap to identify rigorous criteria for the emergence of life.
“This is in sharp contrast to a century of thought in which the transition to life has been cast as a problem of chemistry.
“Chemical based approaches,” Walker said, “have stalled at a very early stage of chemical complexity — very far from anything we would consider ‘alive.’
“More seriously they suffer from conceptual shortcomings in that they fail to distinguish between chemistry and biology.
“To a physicist or chemist life seems like ‘magic matter,’” Davies explained.
“It behaves in extraordinary ways that are unmatched in any other complex physical or chemical system.
“Such lifelike properties include autonomy, adaptability and goal-oriented behavior — the ability to harness chemical reactions to enact a pre-programmed agenda, rather than being a slave to those reactions.
“We believe the transition in the informational architecture of chemical networks is akin to a phase transition in physics.
“We place special emphasis on the top-down information flow in which the system as a whole gains causal purchase over its components.
“This approach will reveal how the logical organization of biological replicators differs crucially from trivial replication associated with crystals (non-life).
“By addressing the causal role of information directly, many of the baffling qualities of life are explained.”
Non-local biological functions
“The most important features of biological information (i.e. functionality) are decisively nonlocal,” the authors say.
“Biologically functional information is therefore not an additional quality, like electric charge, painted onto matter and passed on like a token.
“It is of course instantiated in biochemical structures, but one cannot point to any specific structure in isolation and say “Aha! Biological information is here!”
“In all of these cases where appeal is made to an informational narrative, we encounter context- (state-) dependent causation.
“In this respect, biological systems are quite unlike traditional mechanical systems evolving according to fixed laws of physics.
“In biological causation, subject to informational control and feedback, the dynamical rules will generally change with time in a manner that is both a function of the current state and the history of the organism (suggesting perhaps that even the concept of evolution itself may be in need of revision).”
Life in non-organic substrates?
“Purely analog life-forms could have existed in the past but are not likely to survive over geological timescales without acquiring explicitly digitized informational protocols.
“Therefore life-forms that ‘go digital’ may be the only systems that survive in the long-run and are thus the only remaining product of the processes that led to life.
“As such, the onset of Darwinian evolution in a chemical system was likely not the critical step in the emergence of life. …
“Instead, the emergence of life was likely marked by a transition in information processing capabilities.
“This transition should be marked by a reversal in the causal flow of information from bottom-up only to a situation characterized by bi-directional causality.
“Characterizing the emergence of life as a shift in causal structure due to information gaining causal efficacy over matter marks the origin of life as a unique transition in the physical realm.”
Hallmarks of life
The authors suggest these specific hallmarks of life:
· Global organization
· Information as a causal agency
· Top-down causation
· Analog and digital information processing
· Laws and states co-evolve
· Logical structure of a universal constructor
· Dual hardware and software roles of genetic material
· Non-trivial replication
· Physical separation of instructions (algorithms) from the mechanism that implements them.
Information in ISO 2382-1 (1993)
The standard defines data as the representation of information.
That is aligned with the information theory above; data is a form of matter and/or energy that encodes meaningful information.
(Though saying datum = fact confuses things, since facts sound like meaningful information.)
The standard seems at first to align information with meaning.
That too is aligned with the information theory above.
The trouble is, the ISO standard goes on to scramble the idea.
It defines information as knowledge or viewpoint, which is ambiguous and confusing.
If knowledge is the meaning given to some data by an actor - OK.
If knowledge only has meaning in a given context - then it has several possible meanings.
Is knowledge an item of information that can convey several meanings?
Or is it an item of data that can convey several meanings?
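The context-dependence discussed above can be illustrated with a short, hypothetical Python sketch: the same four bytes (one item of data) yield different information depending on the interpretation the receiver applies.

```python
import struct

# One item of data...
data = b"2016"

# ...three different interpretations, hence three different "facts".
as_text = data.decode("ascii")        # the character string "2016"
as_int = int.from_bytes(data, "big")  # a single 32-bit integer
as_pair = struct.unpack(">HH", data)  # a pair of 16-bit integers

print(as_text, as_int, as_pair)
```

No interpretation is “the” meaning of the bytes; each is the meaning obtained by a receiver applying a particular context, which is the point made about views and stakeholders below.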
Then, a viewpoint does not integrate data; what integrates data in ISO 42010 is a view.
A view is an aggregate of data elements - a compound data structure.
It may be designed to address the concerns of a stakeholder.
But the meaning obtained from the view is whatever meaning that stakeholder obtains.
Footnote: Creative Commons Attribution-No Derivative Works Licence 2.0 1/6/2016
Attribution: You may copy, distribute and display this copyrighted work only if you clearly credit “Avancier Limited: http://avancier.website” before the start and include this footnote at the end.
No Derivative Works: You may copy, distribute, display only complete and verbatim copies of this page, not derivative works based upon it.
For more information about the licence, see http://creativecommons.org