Moore's Law is an empirical observation stating, in effect, that at the current rate of technological development and advances in the semiconductor industry, the complexity of integrated circuits doubles roughly every 18 months.

It is attributed to Gordon E. Moore (a co-founder of Intel, not to be confused with the philosopher G. E. Moore), who outlined his "law" in 1965. His original empirical observation was that the number of components on the semiconductor chips with the lowest per-component cost doubled roughly every 12 months, and he conjectured that the trend would hold for at least ten years. In 1975, Moore revised his estimate of the expected doubling time, arguing that it was slowing to about two years (see the external link below).

Although Moore never claimed that his observation was a formal law, it has become something of a self-fulfilling prophecy, driving both marketing and engineering departments crazy every 18 months.

Since then, the "law" has been formulated in numerous versions, the most popular concerning the doubling of the number of transistors on integrated circuits (a rough measure of computer processing power). By the end of the 1970s, Moore's law had become known as the limit for the number of transistors on the most complex chips.

Another version of Moore's law claims that RAM storage capacity increases at the same rate as processing power. However, memory speeds have not increased as fast as CPU speeds in recent years, leading to a heavy reliance on caching in current computer systems.

Recent computer industry technology 'roadmaps' predict (as of 2001) that Moore's law will continue for several chip generations. Depending on the doubling time used in the calculations, this could mean up to a hundred-fold increase in transistor counts on a chip within a decade. The semiconductor industry technology roadmap uses a three-year doubling time for microprocessors, leading to roughly a ten-fold increase per decade; the arithmetic behind both figures is sketched below.
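As a minimal sketch of that arithmetic (in Python; the function name is ours for illustration, not from any roadmap), the implied growth over a period is simply two raised to the number of doublings that fit into it:

```python
def growth_factor(years: float, doubling_time_years: float) -> float:
    """Multiplicative growth after `years`, given a doubling time."""
    return 2.0 ** (years / doubling_time_years)

# 18-month (1.5-year) doubling over a decade: 2**(10/1.5), about 102x
print(round(growth_factor(10, 1.5)))
# Three-year doubling over a decade: 2**(10/3), about 10x
print(round(growth_factor(10, 3.0)))
```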

Since such rapid exponential improvement could put 100 GHz personal computers in every home and 20 GHz devices in every pocket, some commentators have speculated that sooner or later computers will meet or exceed any conceivable need for computation. This is only true for some problems: there are others where exponential increases in processing power are matched or exceeded by exponential increases in complexity as the problem size grows, as the sketch below illustrates. See computational complexity theory and the complexity classes P and NP for a (somewhat theoretical) discussion of such problems, which occur very commonly in applications such as scheduling.
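A minimal illustration, assuming a hypothetical brute-force algorithm that needs 2^n steps for a problem of size n: each doubling of hardware speed extends the largest solvable problem size by just one unit, so even a hundred-fold speedup buys only about seven more units.

```python
import math

# For a 2**n-step brute-force algorithm, a machine `speedup` times
# faster extends the largest feasible problem size by log2(speedup):
# an additive constant, not a multiple.
def extra_problem_size(speedup: float) -> float:
    return math.log2(speedup)

print(extra_problem_size(100))    # ~6.6 more units of problem size
print(extra_problem_size(10**6))  # ~19.9, even for a million-fold speedup
```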

Although Moore's law has, since the 1970s, been defined in terms of the number of transistors on a chip, it is also commonly invoked to describe the rapid, continuing advance in computing power per dollar of cost.

A similar progression has held for hard disk storage per dollar of cost; in fact, the rate of progress in disk storage over the past ten years or so has been faster than for semiconductors, although, largely because of production cost issues, hard drive performance increases have lagged significantly.

Extrapolation based on Moore's Law has led futurists such as Vernor Vinge and Bruce Sterling to speculate about a technological singularity. Historical analysis of Moore's law has shown, however, that its interpretations have changed qualitatively over the years and that it has not described developments in semiconductor technology very accurately. For a detailed study of Moore's law and its historical evolution, see 'The Lives and Death of Moore's Law', First Monday, November 2002, at http://firstmonday.org/issues/issue7_11/tuomi/.

The effect of Moore's Law on computer component suppliers is profound. A typical major design project (such as an all-new CPU or hard drive) takes two to five years to reach production-ready status; in consequence, component manufacturers face enormous timescale pressures, and just a few weeks' delay in a major project can spell the difference between great success and massive losses, or even bankruptcy. Expressed as "a doubling every 18 months", Moore's Law suggests phenomenal progress over a span of years. Expressed on a shorter timescale, however, it equates to an average performance improvement in the industry as a whole of roughly 1% a week. For a manufacturer competing in the cut-throat CPU, hard drive or RAM markets, a new product that is expected to take three years to develop and arrives just two or three months late is 10 to 15% slower or larger than the directly competing products, and is usually unsellable. That arithmetic is sketched below.
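A minimal sketch of that arithmetic, assuming 18 months is about 78 weeks and that improvement compounds week over week:

```python
WEEKS_PER_DOUBLING = 78  # 18 months at roughly 4.33 weeks per month

# Compounded weekly improvement implied by doubling every 78 weeks
weekly_rate = 2 ** (1 / WEEKS_PER_DOUBLING) - 1
print(f"{weekly_rate:.2%} per week")  # ~0.89%

# Relative shortfall of a product launching late into a market that
# kept improving at the Moore's-Law pace in the meantime
for weeks_late in (9, 13):  # roughly two and three months
    shortfall = 1 - 2 ** (-weeks_late / WEEKS_PER_DOUBLING)
    print(f"{weeks_late} weeks late: ~{shortfall:.0%} behind")
# ~8% and ~11%: the same ballpark as the 10-15% figure above
```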

Note that not all aspects of computing technology develop in capacity and speed according to Moore's Law. Random access memory speeds and hard drive seek times improve, at best, by a few percentage points per year.

Another, sometimes misunderstood, point is that exponentially improved hardware does not necessarily imply exponentially improved software to go with it. The productivity of software developers most assuredly does not increase exponentially with the improvement in hardware; by most measures, it has increased only slowly and fitfully over the decades.

See also: CPU monthly (for a month-by-month display of top processors from Intel and AMD)

External links