Hi there spacebloom! Great question!
Moore's law stems from an observation by Gordon Moore (who co-founded
Intel) that transistor density would double every year; he later
amended this, and the figure usually quoted today is a doubling every
18 months. Most people use the law to suggest that computer
*performance* will double every 18 months. So, working on that basis:
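In case you'd like to play with the numbers yourself, here's a tiny Python sketch of that doubling rule (the function name and the 1.5-year default are just my own illustration, not anything official):

```python
def projected_speed(base_speed, start_year, target_year, doubling_years=1.5):
    """Project performance assuming it doubles every `doubling_years` years."""
    doublings = (target_year - start_year) / doubling_years
    return base_speed * 2 ** doublings

# 2003 to 2248 gives (2248 - 2003) / 1.5, i.e. about 163 doublings
print((2248 - 2003) / 1.5)
```

That's where the "163 doublings" figure below comes from.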
The fastest single "computer" is probably NEC's Earth Simulator, no
slouch with a peak performance of around 40 teraflops.
Assuming Moore's revised 18-month law, it will have the opportunity to
double in speed 163 times before 2248 rolls around, so if all goes
well, it will be breezing along at a mere 40 * 2^163 teraflops, or
roughly 4.7 * 10^62 flops.
If we consider a boring old 3 GHz Pentium 4 desktop computer, then
2248 would see it running a little faster too, at about 3 * 2^163 GHz,
or roughly 3.5 * 10^58 Hz.
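If it helps, here's the back-of-envelope arithmetic for both machines in Python (the 2^163 factor comes from the 163 doublings mentioned above; treat the results as rough projections, nothing more):

```python
DOUBLINGS = int((2248 - 2003) / 1.5)  # 163 doublings between 2003 and 2248

earth_simulator = 40e12 * 2 ** DOUBLINGS  # Earth Simulator: 40 teraflops today
pentium4_clock = 3e9 * 2 ** DOUBLINGS     # Pentium 4: a 3 GHz clock today

print(f"Earth Simulator in 2248: about {earth_simulator:.1e} flops")
print(f"Pentium 4 clock in 2248: about {pentium4_clock:.1e} Hz")
```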
Unfortunately, Moore's law will probably only hold for a little while
longer, so we're unlikely to see these rather fast speeds - Intel
suggests that it can keep going until the end of this decade.
I hope this answers your question! If anything's unclear, then please
don't hesitate to request clarification before rating this answer.
All the best,
Clarification of Answer by
08 May 2003 03:34 PDT
I'm sorry to confuse!
Teraflops is a measurement that indicates trillions of floating point
operations per second. It's a solid measurement, and you can say
something meaningful about a computer's speed using it.
Gigahertz merely indicates how many times per second the computer's
internal clock ticks - in the Intel P4's case, 3 billion times per
second. Despite this, the P4 can't sustain 3 billion floating point
operations per second (which would be 3 gigaflops, by the way),
because a floating point operation generally takes more than one clock
tick. We don't really know what the P4's flops rating is, because
Intel prefers not to publish one (see this Usenet posting for a
technical explanation).
Nevertheless, I thought I'd quickly indicate what Moore's law would
take the gigahertz rating to, because it's a measurement that more
people are familiar with!
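To make the gigahertz-versus-flops distinction concrete, here's a toy Python calculation (the three-cycles-per-flop figure is purely hypothetical, for illustration only - it's not a measured P4 number):

```python
clock_hz = 3e9       # a 3 GHz Pentium 4 ticks 3 billion times per second
cycles_per_flop = 3  # hypothetical figure: more than one tick per operation

flops = clock_hz / cycles_per_flop
print(f"{flops / 1e9:.0f} gigaflops")  # 1 gigaflops at this made-up rate
```

The point is just that the flops number falls below the clock rate whenever each operation needs more than one tick.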
Hope this helps!