Tackling the Analog-Digital Divide as AI's Greatest Challenge

Discussion

Jan Krikke:

The digital revolution heralded the virtual end of analog computers. The use of analog systems has since been confined to highly specialized areas like inertial guidance systems. But analog is making a comeback. Digital processors have become smaller and hotter, and the limits of ever-faster speed and performance are coming into sight. Moreover, digital computers have limits when it comes to processing the wave-like, continuous analog phenomena we find in biology, most crucially in the brain. Scientists are now trying to find ways to combine analog and digital in hybrid chips and other technologies.

Over the past 40 years, digitization has become the norm. Nearly all media, the Internet, and nearly all industrial processes today are digital, courtesy of high-powered processors and sophisticated algorithms. The process of digitizing is as simple as it is ingenious. A textbook example is digital music. The sound wave is sampled more than 40,000 times a second, each sample is assigned a binary number, and the numbers are written to a storage device. An audio system reads the binary strings and converts them back into an analog sound wave. A sampling rate of 40,000 samples per second is high enough to largely obscure the information lost in the analog-digital conversion, but the result is still discrete rather than continuous.
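A minimal Python sketch of this analog-to-digital round trip, assuming CD-style parameters (a 440 Hz test tone, 44.1 kHz sampling, 16-bit quantization; none of these figures come from the article itself):

```python
import numpy as np

SAMPLE_RATE = 44_100   # samples per second
BIT_DEPTH = 16         # bits per sample
FREQ = 440.0           # illustrative "continuous" test tone, in Hz

# 1. Sample: evaluate the continuous wave at discrete instants.
t = np.arange(0, 0.01, 1 / SAMPLE_RATE)        # 10 ms of audio
analog = np.sin(2 * np.pi * FREQ * t)          # idealized analog signal

# 2. Quantize: map each sample to one of 2**16 binary levels.
levels = 2 ** (BIT_DEPTH - 1) - 1
digital = np.round(analog * levels).astype(np.int16)

# 3. Reconstruct: scale the integers back to an analog-like waveform.
reconstructed = digital / levels

# The round trip is close, but the quantization error never vanishes:
# the stored signal is discrete in both time and amplitude.
print("max round-trip error:", np.max(np.abs(analog - reconstructed)))
```

However fine the quantization grid, the stored signal remains a finite list of integers; the error printed above can be made small, but never zero.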

In biology and other complex processes, this missing information can be decisive. The human brain, perhaps the most complex organ in any living organism, is still not fully understood. The brain works with voltages, which are analog, while its neurons fire on the binary principle of on and off, firing and non-firing. The noted physicist and mathematician Freeman Dyson, in his 2014 lecture 'Are Brains Analogue or Digital?', explained the complexities of understanding brain functions like memory: "It seems likely that memories are recorded in variations of the strengths of synapses connecting the billions of neurons in the brain with one another. But we do not know how the strengths of synapses are varied. It could well turn out that the processing of information in our brains is partly digital and partly analog. If we are partly analog, the downloading of a human consciousness into a digital computer may involve a certain loss of our finer feelings and qualities."

Professor Dyson points to a third possibility: the processing of information in our brains may be done with quantum processes, making the brain the biological equivalent of a quantum computer. He adds that this is speculation: "Quantum computers are possible in theory and are theoretically more powerful than digital computers. But we don't yet know how to build a quantum computer, and we have no evidence that anything resembling a quantum computer exists in our brains. Whether a universal quantum computer can efficiently simulate a physical system is an unresolved problem in physics."

Converging analog and digital

The intricacies of the brain exist at the cellular and quantum levels and are governed by variations in electrical voltage. We can safely assume that variations in one part of the brain instantaneously impact other parts of the brain, as well as the rest of the body. Digitizing such quantum-level variations, no matter how high the sampling rate, results in a loss of information. Moreover, converting digitized data back to analog output requires massively complex equations performed in real time, a special challenge in simulating biological processes. Analog computing, by contrast, directly solves the ordinary differential equations that lie at the heart of continuous (wave-like) processes.
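To make the contrast concrete, here is a sketch of the digital side: a damped harmonic oscillator, a stock example of the kind of ordinary differential equation an analog computer solves continuously with integrator circuits, approximated step by step in Python. The equation, coefficients, and step size are illustrative choices, not drawn from the article:

```python
# Digital integration of a damped harmonic oscillator:
#   x''(t) = -k*x(t) - c*x'(t)
# An analog computer would wire this up with integrators and solve it
# continuously; a digital computer must discretize time into steps,
# which is where the "missing information" between samples creeps in.

k, c = 4.0, 0.5          # spring constant and damping (illustrative)
dt = 0.001               # time step: smaller is closer to continuous
x, v = 1.0, 0.0          # initial position and velocity

for step in range(10_000):           # simulate 10 seconds
    a = -k * x - c * v               # acceleration from the ODE
    v += a * dt                      # Euler update of velocity
    x += v * dt                      # Euler update of position

print(f"x(10s) ~ {x:.6f}")           # approximate, step-size dependent
```

Halving `dt` roughly halves the discretization error, but the digital answer is always a staircase approximation of the continuous trajectory.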

Scientists are attacking the problem in various ways. Yipeng Huang, a computer architecture engineer at Columbia University, developed a chip architecture conceived as a digital host with an analog accelerator. Computations that are done more efficiently through analog means are handed off by the host to the analog accelerator. In other words, the chip interleaves analog and digital processing within a single problem, applying each method to what it does best: the high-speed analog accelerator skips the initial iterations, and the incremental digital approach then zeroes in on the most accurate answer.
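The division of labor can be sketched in software. In the toy below, a deliberately low-precision "analog" stage (mocked here as a crude rounded approximation) seeds a digital Newton iteration that refines the answer. The problem, function names, and the way the analog stage is faked are assumptions for illustration only, not Huang's actual design:

```python
# Hybrid scheme in miniature: a fast, imprecise "analog" stage produces
# a rough solution, and a precise digital stage refines it. We solve
# f(x) = x**2 - 2 = 0 (i.e., compute sqrt(2)) as a stand-in problem.

def analog_accelerator_guess(target: float) -> float:
    """Mock of the analog stage: fast but only ~2 digits of accuracy,
    like reading a voltage off a real analog circuit."""
    guess = target / 2 + 0.5           # crude closed-form approximation
    return round(guess, 2)             # analog readout has limited precision

def digital_refine(target: float, x: float, iters: int = 5) -> float:
    """Digital stage: Newton's method, each iteration roughly doubling
    the number of correct digits of the seed it was handed."""
    for _ in range(iters):
        x = x - (x * x - target) / (2 * x)
    return x

seed = analog_accelerator_guess(2.0)   # lets the digital stage skip early iterations
root = digital_refine(2.0, seed)
print(f"analog seed: {seed}, digital answer: {root:.12f}")
```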

In 2016, researchers at MIT's Computer Science and Artificial Intelligence Laboratory and Dartmouth College presented a paper on a new analog compiler that could help enable the simulation of whole organs and even organisms. The compiler takes differential equations as input and translates them into voltages and current flows across an analog chip. The researchers used an analog chip design from electronics engineer Rahul Sarpeshkar to test their compiler on five sets of differential equations commonly used in biological research. As Sarpeshkar said in an MIT news release: "With a few transistors, cytomorphic (cell-resembling) analog circuits can solve complicated differential equations — including the effects of noise — that would take millions of digital transistors and millions of digital clock cycles."
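What such a compiler does can be suggested with a toy version: take the coefficients of a simple linear ODE and emit the integrator, gain, and summer blocks an analog computer would be patched together from. The class names and netlist format below are invented for this sketch; the actual MIT/Dartmouth compiler targets Sarpeshkar's cytomorphic hardware, not this block model:

```python
# Toy "analog compiler": turn dx/dt = a*x + b into a netlist of the
# blocks a classical analog computer is built from. Block names and
# netlist format are invented for this sketch.

from dataclasses import dataclass

@dataclass
class Block:
    kind: str        # "integrator", "gain", "summer", or "constant"
    param: float     # gain value or constant level
    inputs: list     # names of upstream signals

def compile_linear_ode(a: float, b: float) -> dict:
    """Compile dx/dt = a*x + b into interconnected analog blocks.
    The integrator's output IS the solution x(t): on a real analog
    chip it would appear as a continuously varying voltage."""
    return {
        "gain_a": Block("gain", a, inputs=["x"]),
        "bias_b": Block("constant", b, inputs=[]),
        "sum":    Block("summer", 1.0, inputs=["gain_a", "bias_b"]),
        "x":      Block("integrator", 1.0, inputs=["sum"]),
    }

netlist = compile_linear_ode(a=-0.3, b=1.0)   # simple decay to steady state
for name, blk in netlist.items():
    print(f"{name}: {blk.kind}({blk.param}) <- {blk.inputs}")
```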

Current attempts to integrate analog and digital processes are primarily aimed at improving and speeding up the complex equations that analog processes require. The issue does not seem to concern AI experts and the Singularity community, who believe science will develop, within a few decades, non-biological intelligence matching the range and subtlety of human intelligence. They point to the exponential growth in the power of computers and believe that reverse-engineering the human brain is possible with sufficient computing power. Only time will tell whether ever-increasing binary computing speed can sufficiently compensate for the "missing information" of analog processes in creating artificial human-like intelligence.

Purpose

The debate on the analog-digital divide has a surprisingly long history, not only at the technical level but also at the intuitive and theoretical level. It has been addressed by Immanuel Kant, Søren Kierkegaard, Ludwig Wittgenstein and Gregory Bateson. Carol Wilder, professor of Media Studies at The New School in New York, discussed the aesthetic dimension of the analog-digital issue in her paper 'Being Analog'. She wrote: "It has become apparent that analog/digital carry both precise meanings at the level of physiological, chemical, and electrical processes and broadly metaphorical meanings when applied to human communication and behavior." Wilder conducted an informal survey asking students which phenomena they identified as analog and which as digital. The answers ranged from the whimsical to the profound, but they reflect popular thought about the issue (see sources below).

The analog-digital debate serves to remind us that predictions about the birth of the first humanoid intelligence are premature. Assuming such an intelligence is even desirable, relying on computing alone may not be the best strategy. AI could do well to adopt the method Norbert Wiener used when he developed cybernetics: an interdisciplinary approach. It could bring together experts from physics, biology and macrophysics as well as from the humanities: psychology, philosophy, epistemology, and specialists in meditation and other spiritual practices. A meeting of such minds could not only help separate fact from fiction but also give the field direction and purpose.

More information

Sources:

Is Life Analog or Digital? - Freeman Dyson [1]

Back to Analog Computing - Columbia University [2]

Being Analog - Carol Wilder [3]