The Information Philosopher - dedicated to the new information philosophy
The fundamental question of information philosophy is cosmological and ultimately metaphysical. What is the process that creates information structures in the universe?
Given the second law of thermodynamics, which says that any system will over time approach a thermodynamic equilibrium of maximum disorder or entropy, in which all information is lost, and given the best current model for the origin of the universe, which says everything began in a state of equilibrium some 13.75 billion years ago, how can it be that living beings are creating and communicating new information every day? Why are we not still in that state of equilibrium?

The question may be cosmological and metaphysical, but the answer is eminently practical and physical. It is found in the interaction between quantum mechanics and thermodynamics.
When information is stored in any structure, two physical processes must occur.
The first is the mysterious collapse of a quantum-mechanical wave function, which happens in any measurement process. Such quantum events involve irreducible indeterminacy and chance, but less often noted is the fact that quantum physics is directly responsible for the extraordinary temporal stability of most information structures.
The second is a local decrease in the entropy (which appears to violate the second law) corresponding to the increase in information. Entropy greater than the information increase must be transferred away, ultimately to the cosmic background, to satisfy the second law.
The discovery of a two-part cosmic creation process casts light on some classical problems in philosophy and physics, because it is the same process that creates new biological species and explains the freedom and creativity of the human mind.
The cosmic creation process generates the conditions without which there could be nothing of value in the universe, nothing to be known, and nothing to do the knowing. It may help to explain Leibniz's great question "Why is there something rather than nothing?" and the neoplatonic philosopher Porphyry's fateful question on the ontological status of universals, "Do the genera and species subsist in themselves, or do they exist only in the mind?"
In less than two decades of the mid-twentieth century, the word information was transformed from a synonym for knowledge into a mathematical, physical, and biological quantity that can be measured and studied scientifically.
In 1929, Leo Szilard connected an increase in thermodynamic (Boltzmann) entropy with any increase in information that results from a measurement, solving the problem of "Maxwell's Demon," the thought experiment suggested by James Clerk Maxwell, in which a reduction in entropy is possible when an intelligent being interacts with a thermodynamic system.
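Szilard's connection can be stated quantitatively: acquiring (or, in Landauer's later formulation, erasing) one bit of information carries a minimum entropy cost of k ln 2, and it is this cost, paid by the demon itself, that preserves the second law. A minimal sketch of the arithmetic (the function names are illustrative, not from the text):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def min_entropy_per_bit():
    """Szilard/Landauer bound: minimum entropy cost of one bit, in J/K."""
    return K_B * math.log(2)

def min_energy_per_bit(temperature_kelvin):
    """Minimum heat dissipated to erase one bit at temperature T, in joules."""
    return min_entropy_per_bit() * temperature_kelvin

# At room temperature (300 K), erasing one bit dissipates at least ~2.87e-21 J,
# far more than the demon could hope to extract from its one-molecule engine.
print(min_entropy_per_bit())
print(min_energy_per_bit(300.0))
```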
In the early 1940s, Alan Turing, Claude Shannon, John von Neumann, and others invented digital computers that could run a stored program to manipulate stored data.
Then in the late 1940s, the problem of communicating digital data signals in the presence of noise was first explored by Shannon, who developed the modern mathematical theory of the communication of information. Norbert Wiener wrote in his 1948 book Cybernetics that "information is the negative of the quantity usually defined as entropy," and in 1949 Leon Brillouin coined the term "negentropy."
Finally, in the early 1950s, inheritable characteristics were shown by Francis Crick, James Watson, and George Gamow to be transmitted from generation to generation in a digital code.
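Shannon's measure, mentioned above, quantifies the information of a message source as its expected "surprise," in bits per symbol. A minimal sketch (the probability values are illustrative, not from the text):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # a fair coin: 1 bit per toss
print(shannon_entropy([0.9, 0.1]))  # a biased source carries less information
print(shannon_entropy([1.0]))       # a certain outcome carries no information
```

Wiener's remark that information is "the negative of the quantity usually defined as entropy" reflects the sign difference: Boltzmann entropy counts the disorder a system has, while Shannon information counts the uncertainty a message removes.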
Information is neither matter nor energy, but it needs matter for its embodiment and energy for its communication.
Immaterial information is perhaps as close as a physical scientist can get to the idea of a soul or spirit that departs the body at death. When a living being dies, it is the maintenance of biological information that ceases. The matter remains.
Biological systems are different from purely physical systems primarily because they create, store, and communicate information. Living things store information in a memory of the past that they use to shape their future. Fundamental physical objects like atoms have no history.
And when human beings export some of their personal information to make it a part of human culture, that information moves closer to becoming immortal.
Human beings differ from other animals in their extraordinary ability to communicate information and store it in external artifacts. In the last decade the amount of external information per person has grown to exceed an individual's purely biological information.
Since the 1950s, the science of human behavior has changed dramatically from a "black box" model of a mind, one that started out as a "blank slate" conditioned by environmental stimuli. The new mind model contains many "functions" implemented with stored programs, all of them information structures in the brain. The new cognitive science likens the brain to a computer, with some programs and data inherited and others developed as appropriate reactions to experience.
But the brain may be regarded less as an algorithmic computer than as a primitive experience recorder and reproducer. Information about an experience - the sights, sounds, smells, touch, and taste - is recorded along with the emotions - feelings of pleasure, pain, hopes, and fears - that accompany the experience. When confronted with similar experiences later, the brain can reproduce information about the original experience (an instant replay) to guide current actions.
Information is constant in a deterministic universe. There is "nothing new under the sun." The creation of new information is not possible without the random chance and uncertainty of quantum mechanics, plus the extraordinary temporal stability of quantum mechanical structures.
It is of the deepest philosophical significance that information is based on the mathematics of probability. If all outcomes were certain, there would be no "surprises" in the universe. Information would be conserved and a universal constant, as some mathematicians mistakenly believe. Information philosophy requires the ontological uncertainty and probabilistic outcomes of modern quantum physics to produce new information.
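The tie between probability and information can be made concrete with Shannon's surprisal, −log₂ p: an outcome that was certain in advance carries zero information, while improbable outcomes carry many bits. A small illustration (the example probabilities are mine, not from the text):

```python
import math

def surprisal_bits(p):
    """Information gained, in bits, when an outcome of probability p occurs."""
    return -math.log2(p)

print(surprisal_bits(1.0))    # 0.0 bits: a certain event is no "surprise"
print(surprisal_bits(0.5))    # 1.0 bit: a fair coin toss
print(surprisal_bits(1/256))  # 8.0 bits: rarer outcomes are more informative
```

In a strictly deterministic universe every outcome would have probability 1, so by this measure no event could ever add a single bit.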
But at the same time, without the extraordinary stability of quantized information structures over cosmological time scales, life and the universe we know would not be possible. Quantum mechanics reveals the architecture of the universe to be discrete rather than continuous, to be digital rather than analog.
Moreover, the "correspondence principle" of quantum mechanics and the "law of large numbers" of statistics ensures that macroscopic objects can normally average out microscopic uncertainties and probabilities to provide the "adequate determinism" that shows up in all our "Laws of Nature."
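The averaging that the correspondence principle and the law of large numbers describe can be illustrated numerically: the spread of the mean of N independent random events shrinks as 1/√N, so an object of ~10²³ particles behaves, for all practical purposes, deterministically. A simulation sketch (the coin-flip model and function names are illustrative):

```python
import random
import statistics

def mean_of_flips(n):
    """Average of n fair coin flips (0 or 1); expected value 0.5."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

def spread(n, trials=500):
    """Standard deviation of the sample mean across many repeated trials."""
    return statistics.pstdev(mean_of_flips(n) for _ in range(trials))

# Fluctuations shrink roughly as 1/sqrt(n): 100x more flips
# cuts the spread of the average by about 10x.
for n in (100, 10_000):
    print(n, spread(n))
```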
Information philosophy explores some classical problems in philosophy with deeper and more fundamental insights than is possible with the logic and language approach of modern analytic philosophy.
By exploring the origins of structure in the universe, information philosophy transcends humanity and even life itself, though it is not a mystical metaphysical transcendence.
Information philosophy uncovers the providential creative process working in the universe to which we owe our existence, and therefore perhaps our reverence. It locates the fundamental source of all values not in humanity ("man the measure"), not in bioethics ("life the ultimate good"), but in the origin and evolution of the cosmos.
Information philosophy is an idealistic philosophy, a process philosophy, and a systematic philosophy, the first in many decades. It provides important new insights into the Kantian transcendental problems of epistemology, ethics, freedom of the will, God, and immortality, as well as the mind-body problem, consciousness, and the problem of evil.
In physics, information philosophy provides new insights into the problem of measurement; the paradox of Schrödinger's Cat; the two paradoxes of microscopic reversibility and macroscopic recurrence that Josef Loschmidt and Ernst Zermelo used to criticize Ludwig Boltzmann's explanation of the entropy increase required by the second law of thermodynamics; and, finally, the entanglement and nonlocality phenomena that are the basis for modern quantum cryptography and quantum computing.