"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years."
Ignore the nerdy wordiness: Moore basically surmised that the number of components in an integrated circuit would double each year, thus lowering costs and boosting processing speeds at an exponential rate. It was a terrific stab at predicting the future of a then-infant industry, but it didn't prove completely accurate. Moore tweaked his prediction in 1975, changing the rate to a doubling every two years instead of every year. This updated forecast, now known as "Moore's Law," has stood up remarkably well against the test of time, but as John Markoff notes in the New York Times, the trend may be coming to an end.
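To see how dramatic a doubling every two years actually is, here's a minimal sketch of the projection. The 1971 baseline (the Intel 4004's roughly 2,300 transistors) is just a convenient illustrative starting point, not something Moore's paper specifies:

```python
# Illustrative projection of Moore's 1975 revision: component counts
# doubling every two years. Baseline chosen for illustration only:
# the Intel 4004 (1971), with roughly 2,300 transistors.

BASELINE_YEAR = 1971
BASELINE_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> int:
    """Project transistor count assuming a doubling every two years."""
    doublings = (year - BASELINE_YEAR) // DOUBLING_PERIOD_YEARS
    return BASELINE_TRANSISTORS * 2 ** doublings

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001, 2011):
        print(year, projected_transistors(year))
```

Twenty years of doublings turns thousands of transistors into millions; that relentless exponential is the whole reason a slowdown is newsworthy.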
The problem is not that [researchers] cannot squeeze more transistors onto the chips -- they surely can -- but instead, like a city that cannot provide electricity for its entire streetlight system, that all those transistors could require too much power to run economically. They could overheat, too.
I can imagine the talk in the chip-maker meeting rooms. "OK, guys. We can't just add transistors anymore. It's time to think outside the wafer."
If chip-makers want to continue boosting speeds at the rates we're all used to, this is just what they will have to do. They already adapted once when they hit a minor speed bump about six years ago. Chip-makers used to simply raise processor clock speeds, but that approach hit a ceiling around 3-4 gigahertz because the chips overheated. They got past it by adding more cores and improving architecture. The resulting number disparity still prompts the technologically ignorant to wonder why the heck a 3.6 gigahertz Pentium 4 is slower than a 1.7 gigahertz Core i7. ("Don't worry, Dad, just buy it for me.")
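A toy model makes the Pentium 4 vs. Core i7 puzzle less mysterious: effective throughput depends on clock speed, work done per clock cycle, and core count, not clock speed alone. The per-cycle figures below are invented purely for illustration, not measured values for any real chip:

```python
# Toy model: why raw gigahertz stopped being a useful cross-generation
# benchmark. Throughput ~ clock speed * work per cycle * cores.
# The work-per-cycle numbers here are hypothetical, chosen only to
# illustrate the idea that architecture and core count matter.

def relative_throughput(ghz: float, work_per_cycle: float, cores: int) -> float:
    """Crude relative throughput: cycles/sec * work/cycle * cores."""
    return ghz * work_per_cycle * cores

# Hypothetical high-clock single-core chip vs. a lower-clock quad-core
# chip with a more efficient architecture.
old_chip = relative_throughput(ghz=3.6, work_per_cycle=0.5, cores=1)
new_chip = relative_throughput(ghz=1.7, work_per_cycle=1.5, cores=4)
```

Under these made-up numbers the lower-clocked chip comes out well ahead, which is the point Dad keeps missing at the electronics store.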
Still, let's say that Moore's Law does come to a halt and the processor-speed apocalypse does occur; will it really be that bad? Most of the activities the average person performs on their computing devices are not processor intensive (email, internet, music, videos, etc.), so the end of Moore's Law would not inhibit those functions. A slower rate of consumer technology upgrades would also assuage those easily afflicted with buyer's remorse, and it would mean that last year's technology might have a better resale value.
Lastly, slower speed increases do not necessarily equate to a downturn for the economy, as some fear. Let us not forget the emerging markets in China and India; they're going to want their iPads and iPhones regardless of how much faster the next generation is than the previous one. And let's be honest, we'll want them, too.