My personal term for the imprecise, messy, fuzzy state of the world was “confused.” But then, in 1980, I acquired an Ohio Scientific desktop computer and found fast, lasting relief. All of its operations were built on a foundation of binary arithmetic, in which a 1 was always exactly a 1 and a 0 was a true 0, with no fractional quibbles. The 1 of existence, and the 0 of nothingness! I fell in love with the purity of the digital and learned to write code, which became a permanent refuge from fuzzy math.
Of course, numerical values still had to be stored in fallible physical components, but margins of error took care of that. In a modern 5-volt digital chip, 1.5 volts or less would represent the digit 0, while 3.5 volts or more would represent the digit 1. Components on a properly designed motherboard would stay within those limits, so there should have been no misunderstandings.
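To make that margin-of-error idea concrete, here is a minimal sketch in Python of how a receiver could classify a measured voltage as a 0, a 1, or an invalid in-between value. The thresholds come from the 5-volt example above; the function itself and its handling of the forbidden middle band are my own illustration, not the logic of any particular chip.

```python
def decode_bit(voltage):
    """Classify a measured voltage on a 5-volt digital line.

    Thresholds follow the example in the text: 1.5 volts or less reads
    as 0, 3.5 volts or more reads as 1. Anything in between falls in the
    forbidden zone that a properly designed board should never produce.
    """
    if voltage <= 1.5:
        return 0
    if voltage >= 3.5:
        return 1
    return None  # indeterminate: outside the guaranteed noise margins

# Slightly noisy but in-spec signals still decode unambiguously.
for v in (0.3, 1.2, 3.8, 4.9):
    print(f"{v:.1f} V -> {decode_bit(v)}")
```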
So when Bernd Ulmann predicted that analog computers would make a comeback, I wasn’t just skeptical. I found the idea a little…disturbing.
Hoping for a reality check, I consulted with Lyle Bickley, a founding member of the Computer History Museum in Mountain View, California. Having served for years as an expert witness in patent lawsuits, Bickley maintains an encyclopedic knowledge of everything that has been done and is still done in computing.
“A lot of companies in Silicon Valley have secret analog chip projects,” he told me.
Really? But why?
“Because they consume so little energy.”
Bickley explained that when, for example, brute-force natural language AI systems distill millions of words from the internet, the process is incredibly energy-intensive. The human brain runs on a small amount of electricity, he said, about 20 watts. (That’s about the same as a light bulb.) “Yet if we try to do the same thing with digital computers, it takes megawatts.” For this type of application, digital “won’t work. It’s not a smart way to do it.”
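For a rough sense of the scale Bickley is describing, here is a back-of-envelope comparison; the 1-megawatt figure is my placeholder for “megawatts,” not a number he gave.

```python
brain_watts = 20              # Bickley's figure for the human brain
datacenter_watts = 1_000_000  # "megawatts" -- 1 MW assumed for illustration

# Same task, digital versus brain: ratio of power drawn.
print(f"{datacenter_watts / brain_watts:,.0f}x more power")  # 50,000x more power
```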
Bickley said he would be violating confidentiality if he gave me details, so I went looking for startups. Quickly, I found a company in the San Francisco Bay Area called Mythic, which claimed to market the “industry’s first analog AI matrix processor”.
Mike Henry co-founded Mythic at the University of Michigan in 2013. He’s an energetic guy with a tidy haircut and a crisply pressed shirt, like a former IBM salesman. He expanded on Bickley’s point, citing the brain-like neural network that powers GPT-3. “It has 175 billion synapses,” Henry said, comparing the processing elements to the connections between neurons in the brain. “So every time you run this model to do one thing, you have to load 175 billion values. Very large data center systems can barely keep up.”
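To see why loading 175 billion values strains even very large systems, here is a rough calculation of the memory traffic involved; the 2-bytes-per-value and 2-terabytes-per-second figures are my own assumptions, not numbers from Henry or Mythic.

```python
params = 175e9          # parameter count Henry cites for GPT-3
bytes_per_value = 2     # assume 16-bit weights
model_bytes = params * bytes_per_value   # ~350 GB of values to load

bandwidth = 2e12        # assume 2 TB/s of aggregate memory bandwidth
seconds_per_pass = model_bytes / bandwidth

print(f"Weights to move per run: {model_bytes / 1e9:.0f} GB")
print(f"Time just to stream them once: {seconds_per_pass * 1000:.0f} ms")
```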
That’s because, Henry said, they’re digital. Modern AI systems use a type of memory called static RAM, or SRAM, which requires constant power to hold data. Its circuitry must remain powered even when it isn’t performing a task. Engineers have done a lot to improve the efficiency of SRAM, but there is a limit. “Tricks like lowering the supply voltage are running out,” Henry said.
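One way to see why lowering the supply voltage helps, and why the trick eventually runs out: the switching power of digital logic scales roughly with the square of the supply voltage, but the voltage can’t drop much further without the transistors failing to switch reliably, while leakage (the power SRAM burns even at rest) remains. Here is a sketch of that textbook relationship, with placeholder numbers of my own.

```python
def dynamic_power(capacitance, v_supply, freq_hz, activity=0.1):
    """Classic CMOS switching-power estimate: P = activity * C * V^2 * f."""
    return activity * capacitance * v_supply ** 2 * freq_hz

# Halving the supply voltage cuts switching power to roughly a quarter...
for v in (1.0, 0.7, 0.5):
    p = dynamic_power(capacitance=1e-9, v_supply=v, freq_hz=1e9)
    print(f"{v:.1f} V -> {p * 1000:.0f} mW")
# ...but below the transistors' threshold voltage the gates stop switching
# reliably, and leakage -- the power SRAM pays even when idle -- takes over.
```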