Tuesday, December 24, 2019

The limits of computing

There is a lot of talk nowadays about the Singularity, the scary rate of development of intelligent machines, and the future of artificial intelligence. I'd like to discuss these concepts from a physics perspective. Some of the interesting questions to me are: how close to the fundamental limit of computation are modern computers? How close is the human brain? If we extrapolate a "Moore's Law" type curve based on energy efficiency arguments, when does it intersect human capabilities?

It is well known that the fundamental thermodynamic limit on the energy required to erase a single bit of information is kB T ln 2. This is known as the Landauer limit. Actually, I lied. Some theorists argue that the Landauer limit can be circumvented, for example by using quantum spin instead of energy as a store of information, or by using reversible computing. Nonetheless, let's assume that the Landauer limit holds true, and see what this means for the future of computation. As we'll see, theoretical arguments about beating it are analogous to discussing what we're going to do with all the excess clean energy. Spoiler alert: we are nowhere near an excess of clean energy and we are nowhere near the Landauer limit.
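Just to make that number concrete, here's a quick Python check of the formula (a sketch; 293 K is the room temperature value used below):

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K
    T = 293             # room temperature, K

    # Landauer limit: minimum energy to erase one bit
    E = k_B * T * math.log(2)
    print(f"{E:.1e} J/bit")  # -> 2.8e-21 J/bit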

So first of all: how close are modern computers to the Landauer limit? In other words, how many joules does one bit operation require, and how close is this to the Landauer limit? Well, the most energy-efficient modern computers achieve about 6x10^9 FLOPS (Floating Point Operations per Second) per watt. Taking the reciprocal, that's 1.7x10^-10 Joules per FLOP. Since a floating point number has 32 bits, let's assume that works out to 1.7x10^-10/32 = 5.2x10^-12 Joules per bit in a modern computer. By comparison, the Landauer limit at room temperature (293 K) is 2.8x10^-21 Joules per bit. In other words, modern computers use 5.2x10^-12/2.8x10^-21 = 1.8 billion times more energy than the Landauer limit. There's a lot of room at the bottom!!! Even the most energy efficient modern computer is horribly inefficient at doing math. An energy analogy would be as if driving 100 kilometers in a car required the entire world's energy supply.

Redoing the above calculation in terms of FLOPS/W at the Landauer limit, we get 2.8x10^-21 J/bit x 32 bits/FLOP = 9.0x10^-20 Joules per FLOP, which works out to about 1.1x10^19 FLOPS/W.
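Here's the whole chain of arithmetic from the last two paragraphs as a runnable Python sketch (the 6x10^9 FLOPS/W figure and the 32 bits/FLOP assumption are the ones quoted above):

    import math

    k_B = 1.380649e-23                 # Boltzmann constant, J/K
    T = 293                            # room temperature, K
    E_landauer = k_B * T * math.log(2) # ~2.8e-21 J/bit

    flops_per_watt = 6e9               # efficient modern computer (figure quoted above)
    j_per_flop = 1 / flops_per_watt    # ~1.7e-10 J/FLOP
    j_per_bit = j_per_flop / 32        # assume 32 bit operations per FLOP

    print(f"modern computer: {j_per_bit:.1e} J/bit")                 # -> 5.2e-12
    print(f"ratio to Landauer: {j_per_bit / E_landauer:.1e}")        # -> ~1.8e9
    print(f"Landauer-limited: {1 / (E_landauer * 32):.1e} FLOPS/W")  # -> ~1.1e19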

OK, so computers are pretty inefficient. What about the human brain? Well, first of all: how many operations does an average brain do? From what I have been able to find, somewhere between 10^12 and 10^28 FLOPS. Yes, you read that right! https://aiimpacts.org/brain-performance-in-flops/ The estimates for how many computations the human brain can do vary by 16 orders of magnitude. That's insane. 16 orders of magnitude is the greatest uncertainty I have ever seen in an estimate. I'm not even going to try to make an analogy for that one because it's too much uncertainty to comprehend. I think this brings up an interesting challenge for scientists around the world: come up with a better way to measure human brain performance!

Despite the craziness above, we can put an upper bound on performance: the Landauer limit! And we know that the brain uses around 20 W. So the upper bound on the human brain's computational rate (assuming no reversible computing or other fanciness) is 20 W/2.8x10^-21 J/bit = 7.1x10^21 bits/s. Assuming 32 bits/FLOP, that's 2.2x10^20 FLOPS. Breaking news: scientists discover that the human brain has exceeded the Landauer limit! Just kidding. A more likely explanation is that whoever estimated 10^28 FLOPS made some bad assumptions that put them 8 orders of magnitude above the Landauer limit! Or maybe the human brain inherently does reversible computing or something. My guess is that it doesn't. So the human brain is somewhere between 8 orders of magnitude less efficient than the Landauer limit and 8 orders of magnitude more efficient than the Landauer limit. lolz.
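A minimal sketch of that upper bound, using the 20 W power budget and 32 bits/FLOP assumptions stated above:

    import math

    k_B = 1.380649e-23
    T = 293                             # K (body temperature ~310 K would only shift this slightly)
    E_landauer = k_B * T * math.log(2)  # J per bit erased

    brain_power = 20.0                  # W, rough power budget of the brain
    bits_per_s = brain_power / E_landauer
    print(f"upper bound: {bits_per_s:.1e} bits/s")      # -> ~7.1e21
    print(f"upper bound: {bits_per_s / 32:.1e} FLOPS")  # -> ~2.2e20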

OK back to computers. Koomey's Law suggests that the number of computations per joule of energy doubles every 2.6 years (equivalently, the energy per computation halves every 2.6 years). If that's true, and given that modern computers use 1.8 billion times the energy of the Landauer limit, we'll hit the Landauer limit in t = 2.6 x ln(1.8x10^9)/ln(2) ≈ 80 years.
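And the same extrapolation in code (assuming the 2.6-year doubling and the 1.8x10^9 ratio from above):

    import math

    ratio_to_landauer = 1.8e9  # modern computers vs. the Landauer limit (from above)
    doubling_time = 2.6        # years per 2x efficiency gain (Koomey's Law)

    # number of doublings needed, times years per doubling
    years = doubling_time * math.log2(ratio_to_landauer)
    print(f"{years:.0f} years")  # -> 80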

The energy singularity (when computers can do more work per joule than the human brain) will happen around 2045 if the human brain is 8 orders of magnitude less efficient than the Landauer limit, or around 2100 if it's at the Landauer limit, assuming that computers continue to double in energy efficiency every 2.6 years. I predict that they won't: traditional silicon computers will not keep doubling every 2.6 years, and we'll need a dramatically improved computer architecture. That kind of dramatic redesign will slow the scaling to a doubling every 8 years or so, which will push the singularity way back in time.

I represented this on a graph:



The biggest sources of uncertainty are: how many equivalent computational cycles the human brain performs, and how quickly FLOPS/W will scale as the limits of traditional silicon are reached.

A common criticism of this type of thinking is: "but the human brain doesn't work like a computer! It uses neurons, is good at parallel computing, and doesn't use 1's and 0's." Response: actually, the brain is like a computer! A biochemical state, like "is this ion channel open?", can be thought of as a bit of information. Similarly, a relevant protein expression level or hormone level can be quantized. All the brain chemistry put together can be thought of as a computing machine, and it can be quantified in terms of FLOPS! Just because computers today do very different things than the brain does not mean they obey different physics; it just means their hardware architecture is different!

This is a first draft, so please let me know what you think (any feedback, any typos, etc.) and I'll update it!
