Most people have a basic understanding of the difference between analog and digital, but the cause and origin of the distinction are rarely discussed. Is nature analog or digital, continuous or discrete? Does the brain operate on analog or digital signals? The issue matters in disciplines ranging from philosophy to artificial intelligence, and it even appears to have a cultural dimension. Approach the distinction from a philosophical and aesthetic perspective and we may conclude that the West is analog and the East (more specifically East Asia) is digital.
The analog-digital dichotomy was first discussed during the legendary Macy Conferences on Cybernetics held between 1946 and 1953. The conferences brought together many of the leading thinkers of the era, including the people who set the digital revolution in motion. The analog-digital debate was the most contentious issue at the conferences. The discussions centered on the nature of the human brain and why it was important to computing.
Ralph Waldo Gerard, a neurophysiologist and behavioral scientist, claimed that the brain’s operations are “much more analog than digital.” He called into question the digital logic-based model developed in 1943 by neuroscientist Warren S. McCulloch and logician Walter Pitts, which tried to explain how the brain could produce highly complex patterns from many simple, interconnected cells (neurons).
After a contentious debate, famed social scientist Gregory Bateson called for clarification of the distinction between analog and digital to remove ambiguities from the debate. Nobody came up with a satisfying explanation and the issue was shelved as “old business unresolved.” In the following years, the digital approach to computing won the day, but for practical rather than philosophical reasons. Analog computers rely on the continuous variation of voltage, while digital computers deal only with discrete, unambiguous current – either on or off. Digital systems proved to be not only more stable, they were also easier to program.
Seventy years after the Macy Conferences, digital technology is ubiquitous, but the analog-digital dichotomy still causes confusion. The digital revolution led to the popular assumption that digital has replaced analog. We speak of “digital music,” but there is no such thing as digital music. There is only digitally stored music. Sound is an analog wave. When we digitize audio, we “sample” the analog sound wave 44,100 times a second. We give each sample a binary number and write the resulting binary strings to a storage medium. For playback, we use a digital-to-analog converter to reconstitute the wave in order to make it audible. Our ear is an analog organ that only responds to waves.
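The sampling step described above can be sketched in a few lines of code. This is a minimal illustration, not production audio code: it assumes a pure 440 Hz sine tone as the “analog” source and quantizes each sample to a 16-bit integer, the resolution used on audio CDs.

```python
import math

SAMPLE_RATE = 44_100   # CD-quality: samples taken per second
BIT_DEPTH = 16         # each sample is stored as a 16-bit integer

def digitize(duration_s, freq_hz=440.0):
    """Sample a continuous tone and quantize each sample to 16 bits."""
    max_amp = 2 ** (BIT_DEPTH - 1) - 1          # 32767 for 16-bit audio
    n_samples = int(SAMPLE_RATE * duration_s)
    samples = []
    for n in range(n_samples):
        t = n / SAMPLE_RATE                      # instant of the n-th sample
        analog_value = math.sin(2 * math.pi * freq_hz * t)   # continuous wave
        samples.append(round(analog_value * max_amp))        # discrete number
    return samples

one_ms = digitize(0.001)
print(len(one_ms))   # 44 discrete numbers stand in for one millisecond of sound
```

A digital-to-analog converter performs the reverse mapping, turning the stored integers back into a continuous voltage that drives a loudspeaker.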
(Copyright © 2016 MathsIsFun.com)
Artificial intelligence is part of the reason for a renewed interest in the analog-digital dichotomy. Physicist and mathematician Freeman Dyson addressed the issue in his 2001 lecture “Is Life Analog or Digital?” Dyson stressed the difficulty of understanding brain functions like memory. He wrote:
“It seems likely that memories are recorded in variations of the strengths of synapses connecting the billions of neurons in the brain with one another. But we do not know how the strengths of synapses are varied. It could well turn out that the processing of information in our brains is partly digital and partly analog. If we are partly analog, the downloading of human consciousness into a digital computer may involve a certain loss of our finer feelings and qualities.”
The latter point may very well be an elegant understatement. In biology and other intricate processes, let alone the human brain, even a small loss of information can be decisive. Professor Dyson points to a third possibility: The processing of information in our brains is done with quantum processes, and the brain is the biological equivalent of a quantum computer. He adds this is merely speculation, noting that we have no evidence that anything resembling a quantum computer exists in our brains. “Whether a universal quantum computer can efficiently simulate a physical system is an unresolved problem in physics.”
Quantum computing is said to hold the promise of virtually limitless computing power that will enable us to emulate the behavior of cell chemistry in minutes. Quantum computing uses quantum analog wave functions and is therefore analog, just like chemical and biological processes. Given sufficient computational speed, we could create an electric liver or immune system and test drugs in minutes rather than years, and do so without killing laboratory animals. But quantum computing faces major challenges, among them exacting demands on stability and temperature.
In a 2018 TED Talk, “Analog Supercomputers: From Quantum Atom to Living Body,” Dartmouth professor Rahul Sarpeshkar argued that “collective analog computing” can be a viable alternative to quantum computing. Sarpeshkar pointed out “that the brain computes in a hybrid (analog-binary) fashion and that an under-appreciated and important reason for the efficiency of the human brain … is the hybrid and distributed nature of its architecture.” He went on to say:
“So fortunately for us, electronics and chemistry are very deeply linked and it’s no surprise because chemistry is about the motion of electron rearrangements and atoms and molecules, and electronics, why it’s called electronics, is about electrons moving from one place to another, long-range, in fact, atomic electronics is chemistry, which means that if electronics and chemistry are deeply linked, we have a way in principle of taking electronic circuits and mapping them to circuits themselves, with DNA and proteins and RNA, the molecules in cells that do the computation, and we can go the other way, we can build an analog supercomputer to try and model the cell.”
Time will tell if quantum computing or collective analog computing ultimately wins the day, but we are still left with the same question: What is the nature of the analog-digital distinction beyond computing? Science knows how to harness the analog-digital pair, just as it knows how to harness the wave-particle duality. Is the distinction inherent in nature or is it a human construct? We may have to turn to the humanities – philosophy, epistemology, ontology, and aesthetics – to get a better idea of the issue, what it means and what it can tell us about human perceptions.
The word “analog” is defined as something that is similar or comparable to something else; it comes from the Greek words ana (up to) and logos (ratio or proportion). Greek sculpture was “analogous” to the human body. In 1946, the word entered computer language as an adjective to describe a signal that is continuous in amplitude. Digital comes from the Latin word digitus (finger), as fingers are typically used for discrete counting; Gottfried Leibniz, inventor of the binary code, called the underlying principle binary. But these definitions hardly convey the complexity of the analog-digital dichotomy.
In her 1997 essay “Being Analog,” media scholar Carol Wilder wrote: “It has become apparent that analog/digital carry both precise meanings at the level of physiological, chemical, and electrical processes and broadly metaphorical meanings when applied to human communication and behavior.” Wilder asked colleagues and associates to expand upon the standard definitions of analog and digital with actual examples. The answers she got ranged from the whimsical to the profound, but they reflected the wide range of meanings people associate with analog and digital.
(Copyright © 1997 Carol Wilder/Being Analog)
Apart from its technical aspect, the analog-digital distinction also has a philosophical and aesthetic dimension. Leibniz credited the Chinese with having invented the first binary code. He referred to the 64 hexagrams that form the basis of the I Ching. The hexagrams are binary symbols the Chinese used to “quantize” nature in the binary opposites of yin and yang: dark/light, passive/active, death/life, contracting/expanding, space/time, human/nature. The aim of identifying these opposites was to reconcile them so that humans could “insert” themselves into the binary universe with the least amount of friction.
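The binary character of the hexagrams is easy to make concrete. Each hexagram is a stack of six lines, each either yin (broken) or yang (solid); reading the lines as bits yields exactly 2^6 = 64 distinct symbols. The sketch below is an illustration of that counting argument, not a reconstruction of the traditional King Wen ordering, which does not follow binary sequence.

```python
def hexagram_value(lines):
    """Map six yin/yang lines (0 = yin, 1 = yang, bottom first) to 0-63."""
    value = 0
    for bit in reversed(lines):   # treat the top line as the most significant bit
        value = value * 2 + bit
    return value

# The all-yang hexagram (traditionally Qian) and the all-yin hexagram (Kun)
# sit at the two extremes of the 64-value range:
print(hexagram_value([1, 1, 1, 1, 1, 1]))  # 63
print(hexagram_value([0, 0, 0, 0, 0, 0]))  # 0
```

Leibniz noticed precisely this correspondence: six two-valued lines behave like a six-digit binary number.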
The I Ching not only forms the basis of East Asia’s worldview, but it also shaped its culture, everything from its art and architecture to its social structure, its philosophy, and its aesthetic sensibilities. Distinguishing analog and digital is an art rather than a science, but if we apply Wilder’s method of classifying analog and digital to East and West, we may conclude that the distinction found expression in their respective cultural development.
(Table by author)
If the ancient Chinese sages who conceived the yin-yang code had known about the analog-digital dichotomy, they might have classified it as one more yin-yang opposite that must be reconciled. The red thread running through the writings of Confucius, Lao Tzu, and the other interpreters of the I Ching is the need to reconcile opposites, in the broadest sense of the word. One should be neither materialistic (yin) nor spiritual (yang) but both; neither realistic nor idealistic but both; neither conservative nor progressive but both. Had they lived today, the ancient sages might have added: One should be neither Eastern (yin) nor Western (yang); one should be both.