Information & Energy

Last year, Caleb Scharf published a book, The Ascent of Information: Books, Bits, Genes, Machines, and Life's Unending Algorithm, that perfectly illustrates an important contemporary conundrum: specialized technologies quickly derived from new scientific insights lack much of the underlying science's larger understanding. This creates significant issues concerning technology's greater societal impacts. The questions these developments raise about all recent technologies remain essentially unexamined. That isn't surprising; the same can still be said of centuries-old industrial technology.
The book valuably brings together the various disciplines involved in the creation of the new world of information technology. However, the author's attempt to tie them all together under one grand information theory is flawed and unhelpful to any better understanding. There's an old joke, born of Einstein's later-life efforts to unite the various theories of 20th-century physics: "Old physicists never die, they just try to create a unified field theory." Such is this book's great fault, but it still offers important specific perspectives.
What is information? That's a very difficult question to answer. Maybe the best answer is to borrow Supreme Court Justice Potter Stewart's reply on defining pornography: "I know it when I see it." In many ways, information can only be defined within the context of how it is used, though defining it through specific context clouds the ability to gain greater meaning.
Maybe defining information is a quandary best approached with the great physicist Niels Bohr's idea of complementarity. This idea is essential for any "whole understanding" of quanta and, Bohr argued, of life in general. Any single characteristic of an object can only truly be understood in relation to every other characteristic, even when they are in direct contradiction. For example, quanta such as light and electrons, to be fully understood, must be looked at as both particles and waves. How you look defines what you find.

Or moving above the quantum world to human life, complementarity is valuable in understanding war. The only way to understand a war's beginnings, movements, and consequences is to understand the histories, thinking, and actions of all opposing sides.
With information, the harder and more specific any given applied definition, the more it in many ways excludes larger meaning. It can be argued, especially after reading Scharf's book, that it is best to keep any general definition of information soft. Hard definitions are all tied to applied context – color, shape, quantity, a bear, a mouse, a word, a number – all are information specific to a context. All need larger context to be truly useful. Massive amounts of specialized information devoid of larger context are one of the great challenges facing science, technological development, and society as a whole.
The most valuable part of this book is its tying together of energy and information. Scharf is not the first to do this. The whole school of mid-20th-century Information Theory developed by Claude Shannon, John von Neumann, and others uses mathematical concepts developed by physics to look at information as physics looks at energy. This way of manipulating information is the foundation of our present information technologies. It is undeniably powerful, but its specificity – specialized manipulation of innumerable calculable binary choices contextualized by the use of algorithms – prejudices any greater information environment. As a whole, life is neither constrained by binary choices nor defined by algorithms.
In a physicist's understanding of information, there's an intriguing and essential view of the anti-entropy value of information. Scharf writes, "Informational entropy and physical entropy are two inextricably linked sides to the same story." Entropy outside its very specific thermodynamic meaning is a soft and often greatly misused concept. In thermodynamics, which birthed the concept, it is very specifically defined through the Second Law of Thermodynamics. Britannica defines it as "the measure of a system's thermal energy per unit temperature that is unavailable for doing useful work."
Without going into a long discourse trying to define entropy: thermodynamically, if you place together two objects of different temperatures, they will eventually equalize to the same temperature, heat from the hotter object transferring to the cooler one. In this sense, the initial arrangement of separate hot and cold objects is more ordered; the equalized end state is less ordered. Generalized, entropy increases as a more ordered system becomes less ordered. Importantly, the process doesn't run in reverse. The less the order, the greater the entropy.
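To put a number on the direction of that flow, here is the standard back-of-the-envelope calculation (my sketch, not the book's): when a small quantity of heat Q passes from a hot object at temperature T_h to a cold one at T_c, the total entropy change is

```latex
% Entropy change when heat Q flows from a hot object at T_h
% to a cold object at T_c (temperatures in kelvin):
\Delta S_{\mathrm{total}} = -\frac{Q}{T_h} + \frac{Q}{T_c}
  = Q\left(\frac{1}{T_c} - \frac{1}{T_h}\right) > 0
  \qquad \text{since } T_h > T_c .
```

Run the same transfer in reverse and the sign goes negative, which the Second Law forbids for an isolated system; that is why the equalization never undoes itself on its own.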
In the 19th century, physicist James Clerk Maxwell devised a thought experiment in which an anti-entropy process seemed possible. The idea was a tiny being controlling a door in the middle of a box of equally dispersed moving particles of various velocities; this doorkeeper came to be known as Maxwell's demon. By opening the door with precise timing, the demon could gradually gather all the fastest-moving particles on one side of the box, raising the temperature on that side, thus seemingly violating the Second Law and reducing entropy.
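To see how selective door-keeping sorts the particles, here is a toy simulation sketch in Python (my own illustration, not Scharf's; the particle count and speed threshold are arbitrary):

```python
# Toy sketch of Maxwell's demon: particles with random speeds start evenly
# mixed between the two halves of a box. Whenever a fast particle is on the
# left or a slow one is on the right, the "demon" lets it through the door,
# slowly sorting fast particles to the right and slow ones to the left.
import random
import statistics

random.seed(0)

THRESHOLD = 1.0  # speed separating "fast" from "slow" particles (arbitrary units)
particles = [{"side": random.choice("LR"), "speed": random.expovariate(1.0)}
             for _ in range(1000)]

def avg_speed(side):
    return statistics.mean(p["speed"] for p in particles if p["side"] == side)

print("before sorting: left %.2f  right %.2f" % (avg_speed("L"), avg_speed("R")))

# The demon watches particles arrive at the door and opens it selectively.
for _ in range(20000):
    p = random.choice(particles)      # a particle happens to reach the door
    if p["side"] == "L" and p["speed"] > THRESHOLD:
        p["side"] = "R"               # admit a fast particle to the right half
    elif p["side"] == "R" and p["speed"] <= THRESHOLD:
        p["side"] = "L"               # admit a slow particle to the left half

print("after sorting:  left %.2f  right %.2f" % (avg_speed("L"), avg_speed("R")))
# The right half ends up with a higher average speed (a higher "temperature"),
# the apparent Second Law violation that Szilard later traced to the
# information the demon gathers about each particle.
```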

A half-century later, the Hungarian physicist Leo Szilard wrote a paper entitled "On the reduction of entropy in a thermodynamic system by the intervention of an intelligent being." Szilard figured out that Maxwell's demon was really introducing information into the system. Applied information can reduce entropy, creating greater order. Some would argue this is, in a sense, a definition of life: across its existence, all life creates new order by reordering energy.
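Szilard's analysis attached a number to that insight: at temperature T, one bit of information corresponds to at most k·T·ln 2 of extractable work, and acquiring or erasing that bit costs at least as much. A minimal numeric sketch (my addition, assuming room temperature):

```python
# The energy scale Szilard's analysis attaches to a single bit of information:
# at temperature T, one bit is worth at most k_B * T * ln(2) of work.
from math import log

K_B = 1.380649e-23   # Boltzmann constant, joules per kelvin
T_ROOM = 300.0       # roughly room temperature, kelvin

energy_per_bit = K_B * T_ROOM * log(2)
print(f"{energy_per_bit:.2e} J per bit")   # about 2.87e-21 joules

# Rough comparison: real chips spend many orders of magnitude more energy
# than this thermodynamic floor on every bit they flip, part of why data
# centers are such energy sinks.
```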

The concept of entropy is fundamental to Claude Shannon's Information Theory, which is basically about figuring out what amount of data is necessary for a message to be transferred in an orderly way, that is, communicated not as jumbled data but as understandable information. With our present information technologies, this is accomplished by the information equivalent of brute force, all of it requiring ever vaster amounts of energy.
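Shannon's entropy puts a number on that minimum: the average bits per symbol a source requires sets the floor below which its messages cannot be losslessly transferred. A minimal sketch of the calculation in Python (my illustration, not the book's):

```python
# Shannon entropy: the minimum average number of bits per symbol needed to
# encode messages from a source with the given symbol frequencies.
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A highly repetitive ("ordered") message needs far fewer bits per symbol
# than random-looking text.
print(shannon_entropy("aaaaaaaab"))                                    # ~0.5 bits/symbol
print(shannon_entropy("the quick brown fox jumps over the lazy dog"))  # ~4.4 bits/symbol
```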
Our present information technologies are a massive energy sink. Scharf writes, “Our computational data centers alone are already using an amount of energy comparable to the entire planet's output in the early 1800s.”
He adds alarmingly, “A study in 2016 by the Semiconductor Industry Association in the US produced a road map assessment of where things looked to be headed. The bleak prediction: by sometime around 2040 the world's computer chips will demand more electricity than is expected to be produced globally.”
This is both unsustainable development and an unattainable end. However, it makes very clear why, in an era of tightening and more expensive fossil fuel supplies combined with the increasing impact of humanity's past and present energy use on the greater planetary ecology, the tech industry is a great nuclear power advocate. Nuclear power has plenty of problems, but even setting those aside, at current growth rates the energy usage of information technologies alone would be impossible to meet with any feasible addition of nuclear generation.
Nor will we be capable of simply plugging into various renewable generation sources to meet such future demand. As Scharf writes, "The issue is not that we necessarily run out of electricity, but that infrastructure for generating ever more power consumes ever more resources (in rare earths or lithium for batteries and their manufacture, copper cabling, or sheer land area)." The land area issue is a red herring pushed by the fossil fuel industry; renewables can be sited across already inhabited areas and integrated in ways that are relatively unobtrusive. But that still leaves the very important resource questions.
Years ago, working on energy issues, I was talking with a friend and colleague, the physicist Rich Ferguson. I was then, and remain, a great solar advocate, but I said something flippant. Rich looked at me, smiled, and said, "There's no such thing as free energy." This is true both locally and universally. We have the ability, and the necessity, to change our present energy sources, design more efficient energy systems, and be much more energy wise. This is very difficult to do when there's entrenched power, and entrenched economic and social habits reliant on present usage. That includes exorbitant energy waste and, maybe most difficult to change, the idea that this waste is wealth.
In this regard, information technologies create vast amounts of information and energy waste. Scharf writes, "Every idea is a burden in some way or other, whether in biological metabolism or technological energy demands, but not every idea is equal in its value or utility." This might be the book's most valuable insight.
Spend any time on the internet and you're instantly deluged by vast amounts of information waste, seeking to sell more waste. The production, communication, and intellectual processing of this information all take time and energy. We presently value information in the context of the industrial values of more – more energy, more resources, more stuff; no matter what it is, more is better. Quantitative value quashes qualitative. A revaluation of industrial values is required. Cutting information waste reduces energy waste. But flipping this, we can also use information to cut energy waste: our entrenched industrial infrastructure and processes can be redesigned to be more energy and resource efficient.
Our current energy technologies operate entirely on quantitative value. One different understanding of our present information technologies' energy waste, and there need to be many, lies in a better understanding of ourselves and the ecological systems from which we evolved. For instance, Scharf points out, "Biological brains are approximately a billion times more efficient than current standard microprocessor capabilities." Biological brain functions are not binary, though it's fashionable to try to understand and model them as such, and no doubt some specific knowledge can be gained in doing so.
However, Scharf writes, “Humans themselves are multicellular things, composed of perhaps 30 trillion or more cells in total, together with at least as many single-celled microbial passengers.” These systems are not binary or run by algorithms. They are not centrally controlled. We are amazingly complex, incredibly distributed systems. There is no CEO cell or President microbe anywhere in our bodies. There is great energy value to this decentralized information and communication architecture, value we little understand or appreciate. In developing future information systems and networks, indeed all physical infrastructure, and social and political organization, we need to explore the workings and design of distributed network architectures.
It is increasingly clear that this specialized, physics-defined way of treating information, this narrow perspective, must be broadened with more inclusive, complex perspectives, particularly regarding the interaction of information and energy in biological organisms and their relation to the greater ecological systems from which they evolved. Just as Bohr's complementarity showed for classical and quantum physics that the latter doesn't overthrow the former, that both become necessary for complete understanding, so too a more complex understanding of information is necessary.
This is where Scharf's book falls most seriously short, and it does so because he and many of those he chronicles want to create a unified information theory. Unfortunately, the more he attempts to introduce a unified general definition, the less informative the book becomes, especially due to the analogies and metaphors he uses. He unhelpfully employs a long list of ideas we understand specifically or in many cases misunderstand, such as "viruses cheat," "genes choose," and "life is an algorithm." Most distressing is the idea that the world is one big computer and that "Life on Earth is nothing more than a four-billion-year-old catalytic chemical computation." (his italics)
Analogies and metaphors are essential in helping explain many ideas in science: they use the known to help conceptualize the previously unknown. When science comes up with truly new concepts, those are the most difficult to convey, because what we already know limits our initial understanding of anything truly new. History is full of new technologies being used to define the universe – clocks, steam engines, and now, most unfortunately, computers.
It's inadvisable to attempt a general information theory at all, and especially inadvisable to do so by taking very specific scientific understandings and their technological progeny and trying to generalize them into laws of the universe. More important, and a great deal more valuable, is understanding that all technology has developed in very specific contexts, its very being excluding wider, existentially necessary perspectives. At this point, doubling down on only what we know and what has already been engineered isn't going to do anybody any good.