Hi there spacebloom! Great question!
Moore's law stems from an observation by Gordon Moore (who co-founded
Intel) that transistor density would double every year. He later
revised the estimate, and the figure most commonly quoted today is a
doubling every 18 months (see here for more:
http://www.webopedia.com/TERM/M/Moores_Law.html )
Most people use the law to suggest that computer *performance* will
double every 18 months. So working on that basis:
The fastest single "computer" is probably NEC's Earth Simulator, no
slouch with a peak performance of 40 teraflops:
http://www.nec.co.jp/press/en/0203/0801.html
Assuming Moore's revised 18 month law, it will have the opportunity to
double in speed 163 times before 2248 rolls around (that's roughly 245
years, divided by 1.5 years per doubling), so if all goes well, it
will be breezing along at a mere
467,680,523,945,888,933,825,179,146,469,210,566,289,898,413,752,320
teraflops (about 4.7 x 10^50).
If we consider a boring old Pentium 4, 3 GHz desktop computer, then
2248 would see it running a little faster, at
35,076,039,295,941,670,036,888,435,985,190,792,471,742,381,031,424 GHz
(about 3.5 x 10^49).
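If you'd like to check the sums yourself, here's a quick Python sketch
of the arithmetic above (nothing fancy - the starting figures are just
the ones quoted earlier):

# Back-of-the-envelope Moore's law projection (18-month doublings).
years_remaining = 2248 - 2003          # this answer was written in 2003
doublings = years_remaining * 2 // 3   # one doubling every 1.5 years -> 163

earth_simulator_tflops = 40            # NEC Earth Simulator peak, in teraflops
p4_clock_ghz = 3                       # Pentium 4 desktop clock, in GHz

print(doublings)                                # 163
print(earth_simulator_tflops * 2**doublings)    # roughly 4.7 x 10^50 teraflops
print(p4_clock_ghz * 2**doublings)              # roughly 3.5 x 10^49 GHz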
Unfortunately, Moore's law will probably only hold for a little while
longer, so we're unlikely to see these rather fast speeds - Intel
suggests that it can keep going until the end of this decade:
http://www.intel.com/research/silicon/mooreslaw.htm
I hope this answers your question! If anything's unclear, then please
don't hesitate to request clarification before rating this answer.
All the best,
--seizer
Search strategy:
Moore's Law
fastest computer
Clarification of Answer by seizer-ga on 08 May 2003 03:34 PDT
Sorry for any confusion!
Teraflops measures trillions of floating point operations per second.
It's a solid metric, because it tells you something meaningful about
how much useful work a computer can actually do each second.
Gigahertz merely indicates how many times per second the computer's
internal clock ticks - in the Intel P4's case, 3 billion times per
second. Despite this, the P4 can't reach 3 billion floating point
operations per second (which would be 3 gigaflops, by the way),
because a floating point operation typically takes more than one
clock tick. We don't really know the P4's exact flops rating, because
Intel prefers not to publish one (see this Usenet posting for a
technical explanation of why:
http://groups.google.co.uk/groups?selm=37CED591.CA341215%40mailbox.intel.com
)
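To make the clock-ticks-versus-flops distinction concrete, here's a
toy calculation (the cycles-per-operation figure is purely
illustrative, not a real Intel number):

# Toy illustration: clock speed alone doesn't give you a flops rating.
clock_hz = 3e9           # a 3 GHz clock ticks 3 billion times per second
cycles_per_flop = 5      # hypothetical cost of one floating point operation

flops = clock_hz / cycles_per_flop
print(flops / 1e9)       # 0.6 gigaflops - well below the 3 GHz headline figure

In reality the cost varies from instruction to instruction and from
workload to workload, which is part of why a single flops number is so
hard to pin down.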
Nevertheless, I thought I'd quickly indicate what Moore's law would
take the gigahertz rating to, because it's a measurement that more
people are familiar with!
Hope this helps!
Regards,
--seizer