Moore's law


Moore's law is the prediction that the transistor density of integrated circuits will double every two years, with prices declining at the same time[1]. The trend, first described in 1965, has held remarkably true to date, and experts predict it may continue until roughly 2020, tapering off as switching element sizes approach the molecular level. Moore's law is not really a law but a rule of thumb: a practical way of thinking about the pace of progress in semiconductor manufacturing.
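
As a rough numerical sketch of what the doubling rule implies (the starting point of 2,300 transistors, the count of the 1971 Intel 4004, and the two-year period are supplied here for illustration):

```python
# A sketch of the doubling rule. The 1971 Intel 4004 (2,300
# transistors) and the two-year period are illustrative choices.
def transistors(years_elapsed, initial=2300, doubling_period=2.0):
    """Project a transistor count under a fixed doubling period."""
    return initial * 2 ** (years_elapsed / doubling_period)

# Ten doublings in twenty years give a 1,024x increase:
print(round(transistors(20)))  # 2300 * 2**10 = 2,355,200
```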

Another way to think about it is that density is inversely proportional to the distance signals must traverse: packing circuit elements closer together shortens signal paths. Moore's Law therefore addresses not only the computational capability of more circuit elements, but also workarounds to speed-of-light limitations.

186,300 miles per second. It's not just a good idea. It's the Law. — Seen on a T-shirt at an IETF meeting
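
To see why signal distance matters, consider how far light travels in one clock cycle; the 3 GHz clock rate below is a hypothetical figure, and real on-chip signals propagate slower than light in vacuum, so the practical budget is tighter:

```python
# How far a signal can possibly travel in one clock cycle.
c = 299_792_458          # speed of light in vacuum, m/s
clock_hz = 3e9           # assumed clock rate, for illustration

print(f"{c / clock_hz * 100:.1f} cm per cycle")  # about 10.0 cm
```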

Moore's law is named for Gordon Moore, a co-founder of Intel, who wrote about it in "Cramming more components onto integrated circuits", Electronics Magazine, 19 April 1965[1]:

The complexity for minimum component costs has increased at a rate of roughly a factor of two per year ... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer

Although named for him, Gordon Moore may not have invented Moore's law; instead, he may have heard Douglas Engelbart, a co-inventor of the computer mouse, discuss the projected downscaling of integrated circuit size in a 1960 lecture[2]. Moore's observation was named a 'law' by the Caltech professor and VLSI pioneer Carver Mead[1].

In 1975, Moore revised his projection to a doubling only every two years. He is adamant that he never said "every 18 months", but that is how the law is often quoted. The SEMATECH roadmap follows a 24-month cycle. In April 2005, Intel offered $10,000 for a copy of the original issue of Electronics Magazine.[3]
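
The difference between the misquoted 18-month period and Moore's 24-month period compounds quickly, as a short calculation shows:

```python
# Growth over one decade under the two quoted doubling periods.
for months in (18, 24):
    growth = 2 ** (120 / months)   # 120 months = 10 years
    print(f"{months}-month doubling: about {growth:,.0f}x per decade")
# 18 months -> about 102x; 24 months -> 32x
```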

It has become common practice to cite Moore's Law as a predictor of the rapid advance in computing power per unit cost across a variety of computer-related technologies, such as hard disk storage cost per unit of information, even though such advances may have little to do with transistor technology. Moore's Law has become shorthand for the expectation that capabilities grow rapidly while costs fall.

Processor speed not always the limiting factor

An important new chapter for Moore's law began in 2003. Microprocessor clock rates stopped climbing, and processor designers began to keep Moore's law alive by adding extra cores, bringing multicore microprocessors to personal computers.[4]
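
Extra cores, unlike extra clock speed, do not speed up existing programs automatically; software has to distribute work across them explicitly. A minimal sketch in Python (the workload function is a placeholder for any CPU-bound task):

```python
from multiprocessing import Pool

def work(n):
    return sum(i * i for i in range(n))  # stand-in for real work

if __name__ == "__main__":
    jobs = [10**6] * 8
    with Pool() as pool:                # one worker per core by default
        results = pool.map(work, jobs)  # jobs run in parallel
    print(sum(results))
```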

Power consumption

The problem of absolute power consumption surfaced at about the same time. The power consumption of computers is now a non-trivial fraction of a modern nation's total power output (about 2% in 2005). The current limitations on Moore's law are not due to hardware engineering; rather, the burden is on programmers to make use of multiple cores, and on society to pay for the extra electrical load. In response, metrics such as flops/watt are being used to help keep computation affordable. Servers alone consumed about 1.2% of total US electrical output in 2005 (Koomey, 2008, http://dl.klima2008.net/ccsl/koomey_long.pdf).
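
A small illustration of the flops/watt metric, using hypothetical figures rather than measurements of any real system:

```python
# Hypothetical systems: the metric rewards work per joule,
# not just work per second.
systems = {
    "fast but hungry": {"gflops": 100.0, "watts": 250.0},
    "slower, frugal":  {"gflops": 80.0,  "watts": 120.0},
}
for name, s in systems.items():
    print(f"{name}: {s['gflops'] / s['watts']:.2f} GFLOPS per watt")
# The slower system does less work per second but more per joule.
```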

Processor faster; software slower

There is a joke about "Gates' Law", a sort of inverse of Moore's Law:

“The speed of software halves every 18 months.” This oft-cited law is an ironic comment on the tendency of software bloat to outpace the every-18-month doubling in hardware capacity per dollar predicted by Moore's Law. The reference is to Bill Gates; Microsoft is widely considered among the worst if not the worst of the perpetrators of bloat. — Jargon File (http://www.catb.org/~esr/jargon/html/G/Gatess-Law.html)

Memory

Some applications are memory-limited rather than processor-limited. Memory depends on many of the same semiconductor technologies as processors, but a given computer may not be able to accept more physical memory. Desktop computers running Microsoft operating systems often benefit more from added memory than from a faster replacement processor, although the situation can be confusing if a different processor is needed to accept more memory.

In the early 1990s, the crisis in Internet routing, which was worked around with Classless Inter-Domain Routing (CIDR), was a practical consequence of the Cisco AGS, then the most common Internet core router, holding a maximum of 16 megabytes of memory. Until CIDR techniques went into operational use, the number of routes was doubling every five months, threatening to overflow those routers' routing tables.
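
A back-of-the-envelope projection shows how quickly a five-month doubling exhausts a fixed 16-megabyte budget; the starting table size and per-route memory cost below are assumptions for illustration:

```python
# Routes doubling every five months against a fixed 16 MB ceiling.
routes = 20_000             # assumed starting table size
bytes_per_route = 256       # assumed memory cost per entry
limit = 16 * 2**20          # 16 megabytes

months = 0
while routes * bytes_per_route < limit:
    routes *= 2             # one doubling every five months
    months += 5
print(f"ceiling exceeded after about {months} months")
```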

Networks

Another set of laws deals with the growth of connectivity in computer networks. Sarnoff's Law values a one-to-many radio or television broadcast network in proportion to its audience size, while Metcalfe's Law values an any-to-any network, such as an internet, in proportion to the square of its number of participants.
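
In their usual formulations, Sarnoff's value grows linearly with audience size n, while Metcalfe's counts the n(n-1)/2 possible pairwise connections, which grows as n squared. A small comparison (units are arbitrary):

```python
# Sarnoff: value ~ n.  Metcalfe: value ~ n*(n-1)/2, i.e. ~n**2.
for n in (10, 100, 1000):
    sarnoff = n
    metcalfe = n * (n - 1) // 2
    print(f"n={n:>4}: Sarnoff {sarnoff:>4}, Metcalfe {metcalfe:>7}")
```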

References

  1. Excerpts from A Conversation with Gordon Moore: Moore's Law (PDF), p. 1. Intel Corporation (2005). ftp://download.intel.com/museum/Moores_Law/Video-Transcripts/Excepts_A_Conversation_with_Gordon_Moore.pdf. Retrieved May 2, 2006.
  2. New York Times article, April 17, 2005.
  3. Michael Kanellos, "$10,000 reward for Moore's Law original", CNET News.com, April 12, 2005. http://news.zdnet.co.uk/0,39020330,39194694,00.htm. Retrieved June 24, 2006.
  4. "Multi-Core", Dr. Dobb's Journal. http://www.ddj.com/cpp/184401916.