I’ve mentioned Moore’s Law in passing a few times before. While many in the technology industry see the concept only on its most direct level – that of semiconductor scaling (the ability of the semiconductor industry, so far, to double transistor density every two or so years) – I believe this fails to capture its true essence. It’s not so much a law about a specific technology (which will eventually run out of steam when it hits a fundamental physical limit) as an “economic law” about an industry’s learning curve and R&D cycle relative to cost per “feature”.
Almost all industries experience a learning curve of some sort. Take the automotive industry: despite all of its inefficiencies, the cost of driving one mile has declined over the years thanks to improvements in engine technology, parts manufacturing, general production efficiency, and supply chain management. But very few industries have a learning curve that operates at the same speed (how rapidly an industry improves its economic performance) and steepness (how much efficiency improves given a certain amount of “industry experience”) as the technology industry, which can rely not only on learning curves but on disruptive technological change.
One of the best illustrations I’ve seen of this is a recent post on MacStories comparing a 2000 iMac and Apple’s new iPhone 4:
| | 2000 iMac | 2010 iPhone 4 |
| --- | --- | --- |
| Processor | 500 MHz PowerPC G3 CPU | 1 GHz ARM A4 CPU |
| Graphics | ATI Rage 128 Pro (8 million triangles) | PowerVR SGX 535 (28 million triangles) |
| Storage | 30 GB Hard Drive | 32 GB NAND Flash |
| Weight | 34.7 pounds | 4.8 ounces |
Although the comparisons are not necessarily apples-to-apples, they give a sense of the speed at which Moore’s Law progresses. Amazing, no?
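As a rough sanity check on that pace, here’s a tiny sketch of the arithmetic behind the canonical “doubling every two or so years” formulation, applied to the ten-year gap between the two machines (the two-year doubling period is the usual rule-of-thumb assumption, not a figure from the post):

```python
# Sketch: the improvement factor Moore's Law predicts over a span of years,
# assuming a doubling period of ~2 years (the rule-of-thumb figure).

def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    """Expected improvement factor after `years` of doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# 2000 iMac -> 2010 iPhone 4 is a 10-year gap: 2^(10/2) = 2^5
print(moores_law_factor(10))  # 32.0
```

A 32x improvement in a decade is in the same ballpark as the table above – roughly equal storage and more graphics throughput in a device over a hundred times lighter.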