January 29, 2015

Moore's Law and the second half of the chessboard

Photo by Tristan Martin using CC license
The wheat and chessboard problem, as told in the story of the emperor and the inventor, provides an
interesting and useful way to think about everything from computing power to advanced sensors. In the legend, which dates back around a thousand years, an Indian emperor asks a mathematician to come up with a new board game. The emperor is so pleased with the invention, the game of chess, that he grants the mathematician a reward. The mathematician asks for a single grain of wheat on the first square, two on the second, four on the third and so on until the board has been filled. The emperor, not grasping the nature of exponential growth, is perplexed and even offended by such a seemingly modest request. However, when the treasurer calculates the amount of wheat owed to the mathematician, he soon informs the emperor of the result: there is not enough wheat in the whole world, let alone the kingdom, to keep the promise made to the inventor. The emperor, feeling betrayed and humiliated, has the mischievous mathematician executed.

So how much wheat does it take to fill all 64 squares? It takes 2^64 − 1 = 18,446,744,073,709,551,615 grains of wheat! The human brain is so used to the linear that it's nearly impossible to grasp this intuitively. Something similar has been observed in computer technology. Moore's Law*, simply put, is the observation that overall processing power for computers doubles every 18 to 24 months. If you've ever seen a photo of an early computer next to a modern smartphone, you know what Moore's Law means. In fact, Google did some interesting calculations and came up with this:
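The arithmetic is easy to check; a minimal Python sketch summing the doubling series:

```python
# Each square k (0-indexed) holds 2**k grains; the board has 64 squares.
total_grains = sum(2 ** square for square in range(64))

# The geometric series 1 + 2 + 4 + ... + 2**63 collapses to 2**64 - 1.
print(total_grains)  # 18446744073709551615
assert total_grains == 2 ** 64 - 1
```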
It takes about the same amount of computing to answer one Google Search query as all the computing done — in flight and on the ground — for the entire Apollo program! 
Some smartphones, for example the Samsung Galaxy S5, are more powerful, in terms of sheer computing power, than Deep Blue, the supercomputer that beat Garry Kasparov in chess back in 1997. Moreover, memory capacity, sensors, LEDs and DNA sequencing seem to follow the same rule of thumb*. The result is affordable handheld supercomputers such as your smartphone. Smartphone sensors are so good these days that a space start-up in the US builds small satellites out of smartphone components and launches them into space at a fraction of the cost of mainstream satellites.

The chessboard example comes in handy when thinking about the future of computing power, DNA sequencing and sensor technology. In the chessboard story, the first half of the chessboard, the first 32 squares, contains roughly 4 billion grains of wheat. By the 33rd square the total is up to more than 8 billion grains, and by the time you reach the 40th square, the emperor would have to part with over a trillion grains of wheat (that's 1,000,000,000,000). Erik Brynjolfsson and Andrew McAfee wrote about this in their book Race Against the Machine:
...constant doubling, reflecting exponential growth, is deceptive because it is initially unremarkable. Exponential increases initially look a lot like standard linear ones, but they’re not. As time goes by—as we move into the second half of the chessboard—exponential growth confounds our intuition and expectations. It accelerates far past linear growth, yielding Everest-sized piles of rice and computers that can accomplish previously impossible tasks.
So where are we in the history of business use of computers? Are we in the second half of the chessboard yet? This is an impossible question to answer precisely, of course, but a reasonable estimate yields an intriguing conclusion. The U.S. Bureau of Economic Analysis added “Information Technology” as a category of business investment in 1958, so let’s use that as our starting year. And let’s take the standard 18 months as the Moore’s Law doubling period. Thirty-two doublings then take us to 2006 and to the second half of the chessboard. Advances like the Google autonomous car, Watson the Jeopardy! champion supercomputer, and high-quality instantaneous machine translation, then, can be seen as the first examples of the kinds of digital innovations we’ll see as we move further into the second half—into the phase where exponential growth yields jaw-dropping results.
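The back-of-the-envelope dating in the quote is simple to reproduce:

```python
# 32 doublings at the 18-month (1.5-year) Moore's Law pace, starting in 1958
start_year = 1958
doublings = 32
second_half_year = start_year + doublings * 1.5
print(second_half_year)  # 2006.0
```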
It is the second half of the chessboard where things get interesting. 
Image from Wikipedia
While the number of grains on the first half of the chessboard is large, the amount on the second half is vastly (2^32 > 4 billion times) larger. - from Wikipedia
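The first-half and second-half totals above follow from the fact that the first n squares hold 2^n − 1 grains in total; a quick Python sketch, for illustration:

```python
# Cumulative grains after n squares: 1 + 2 + 4 + ... + 2**(n-1) = 2**n - 1
def grains_through(n):
    return 2 ** n - 1

print(grains_through(32))  # 4294967295     (~4.3 billion: the whole first half)
print(grains_through(33))  # 8589934591     (~8.6 billion: one more square doubles the pile)
print(grains_through(40))  # 1099511627775  (just over a trillion)
```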
It is the second half where businesses that once didn't use computing power to a large extent begin to realize the economic benefits of doing so. Law firms, for example, are using computers in the discovery process to scan tens of thousands of documents for important connections and buried evidence. It's been calculated that a computer, at today's speeds, can do the work of 500 legal professionals when looking for information. As far as their accuracy goes, Race Against the Machine has this:
Herr … used e-discovery software to reanalyze work his company’s lawyers did in the 1980s and ’90s. His human colleagues had been only 60 percent accurate, he found. “Think about how much money had been spent to be slightly better than a coin toss,” he said.
In a previous post, I wrote about the job market implications of automation (and Moore's Law). I have a post lined up about what Moore's Law is doing to space exploration. Stay tuned.

*In reality, Moore's Law is more like a rule of thumb. There is no inherent reason why processing speed should double every 18 months. However, companies such as Intel and its competitors in the semiconductor industry have set it as a long-term goal, and so far they've been able to hit their targets. Some have pointed to the limits of silicon-based processors, but so far these have been circumvented by developing 3D chips. Other solutions are available as well.
