
Measuring computing speed

Posted: 2003-12-09 06:36am
by His Divine Shadow
I'd like to know some units for measuring computer speed

Like, what is the base unit? An operation? A computation? How many FLOPS would that equal, and so on?

Posted: 2003-12-09 01:52pm
by phongn
There are a few different types.
  1. Clock speed is one way to measure things, but it isn't a very accurate measurement. For example, a Palomino-core Athlon XP running at 1.53GHz will outperform a Willamette-core Pentium 4 running at 1.6GHz.
  2. Instructions Per Second (usually measured in the millions, i.e. MIPS). This isn't very accurate either, because of differing architectures. Two different processors may have different MIPS ratings yet complete a task in the same amount of time.
  3. FLOPS is another measurement, often used in the supercomputing world. It's more useful, but doesn't measure integer performance at all.
  4. Benchmarking is another method, and there are scores of different ways to do it. However, it is difficult not to have a sort of bias when doing it (due to unoptimized code, for example).
In short, there is no one magic number that can quantify computer performance.
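To make the FLOPS idea concrete, here's a toy sketch in Python (my own illustration, not how real supercomputer benchmarks like LINPACK actually work): count floating-point operations, time them, and divide. Interpreter overhead dominates here, so the printed figure vastly understates the hardware's true peak, which is exactly the kind of benchmarking bias mentioned above.

```python
import time

def estimate_flops(n=1_000_000):
    """Crude FLOPS estimate: time n iterations of a multiply-add loop.

    Each iteration does two floating-point operations (one multiply,
    one add), so the estimate is 2*n / elapsed seconds. Python's loop
    overhead swamps the arithmetic, so this only illustrates the idea
    of "operations per second", not the hardware's real capability.
    """
    x = 1.0
    start = time.perf_counter()
    for _ in range(n):
        x = x * 1.0000001 + 0.0000001  # 2 floating-point ops
    elapsed = time.perf_counter() - start
    return (2 * n) / elapsed

if __name__ == "__main__":
    print(f"~{estimate_flops():.2e} FLOPS (interpreter-limited)")
```

A compiled C version of the same loop would report a number orders of magnitude higher on the same CPU, which is why FLOPS figures are only comparable when measured the same way.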

Posted: 2003-12-09 02:45pm
by General Zod
It's mainly a combination of everything that you have set up in your system.

The processor might seem like the biggest part of it, but it also depends on what you're using the machine for and on the other components in the PC. A lot of the time the motherboard's chipset will affect performance, and depending on your graphics or sound card your CPU might do better or worse. There's no real single factor you can use to measure a computer's performance.

Posted: 2003-12-09 02:53pm
by His Divine Shadow
Hmmm, well, I was under the impression there was some correlation. I must have misunderstood.