Comparing 2 processors
Posted: 2006-04-04 09:45am
by 2000AD
I've been looking at the properties of 2 computers:
1 PC is new and has an AMD Athlon 64 3200+, beneath which it says "2.01 GHz"
The other is a 3-year-old rig and has an AMD Athlon XP 2600+ which says "2.13 GHz"
Shouldn't the newer PC be running faster?
The reason I ask is that I bought the older one close to 3 years ago, and the new one is one my brother got custom built by his friend; according to my bro it should be top of the range.
Posted: 2006-04-04 10:19am
by Uraniun235
Clockspeed doesn't necessarily translate to performance.
For example, that A64 3200+ running at 2 GHz will absolutely kick the shit out of a Pentium 4 running at 2 GHz.
Similarly, the A64 3200+ running at 2 GHz will outperform the Athlon XP running at 2.13 GHz.
Re: Comparing 2 processors
Posted: 2006-04-04 10:39am
by Miles Teg
2000AD wrote:I've been looking at the properties of 2 computers:
1 PC is new and has an AMD Athlon 64 3200+, beneath which it says "2.01 GHz"
The other is a 3-year-old rig and has an AMD Athlon XP 2600+ which says "2.13 GHz"
Shouldn't the newer PC be running faster?
The reason I ask is that I bought the older one close to 3 years ago, and the new one is one my brother got custom built by his friend; according to my bro it should be top of the range.
Clockspeed (MHz/GHz) is only one small, essentially useless point of comparison between different processors. Think of it this way (if you know anything about engines): it's like a gasoline engine running at 8000 RPM putting out the same horsepower and torque as a diesel engine running at only 1200 RPM. Essentially, the newer A64s do more work per cycle than the XPs, and FAR more work per cycle than a P4.
This is the reason AMD started using a rating number instead of a clock value on their processors. The + numbers are supposedly calibrated against the original 1 GHz Athlon (some say the P4, but I've heard both). In other words, the A64 3200+ performs as a theoretical 3.2 GHz Athlon would, while the XP 2600+ performs like an Athlon running at 2.6 GHz. How true that is, who knows.
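To put rough numbers on that, here's a quick sketch that just reads the + rating as an equivalent-Athlon clock, per the scheme above (illustrative only; the ratings are marketing numbers, not measurements):

    # Quick sketch: read the + rating as "equivalent classic-Athlon GHz" and compare.
    # Illustrative only; the ratings are marketing numbers, not measured performance.
    chips = {
        "Athlon XP 2600+": {"actual_ghz": 2.13, "rated_equiv_ghz": 2.6},
        "Athlon 64 3200+": {"actual_ghz": 2.01, "rated_equiv_ghz": 3.2},
    }
    for name, c in chips.items():
        print(f'{name}: {c["actual_ghz"]} GHz actual, performs like a ~{c["rated_equiv_ghz"]} GHz Athlon')
    # The newer chip has the lower raw clock but the clearly higher rated performance.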
Edit: As far as the 3200+ being "top of the range", it's not. It sits at the bottom of AMD's current offerings (other than their value chips) and has actually been discontinued. The real top of the range is the FX line of processors. Just FYI.
Miles Teg
Re: Comparing 2 processors
Posted: 2006-04-04 01:16pm
by Praxis
2000AD wrote:I've been looking at the properties of 2 computers:
1 PC is new and has an AMD Athlon 64 3200+, beneath which it says "2.01 GHz"
The other is a 3-year-old rig and has an AMD Athlon XP 2600+ which says "2.13 GHz"
Shouldn't the newer PC be running faster?
The reason I ask is that I bought the older one close to 3 years ago, and the new one is one my brother got custom built by his friend; according to my bro it should be top of the range.
This is somewhat simplified, but...
Clockspeed (GHz ~ one billion clock cycles per second) is not a measurement of performance.
If you have two processors of the same architecture with all the same attributes that do the same things every clock cycle (say, a 2 GHz Pentium 4 and a 2.2 GHz Pentium 4), THEN clock speed is a good way to compare them.
If you're comparing two processors that do different amounts of work per clock cycle and have different attributes, then the clock speed is virtually useless for comparing the two.
That's the same reason the Xbox 360 isn't more powerful than every computer on the market (despite having three 3.2 GHz processors, they're not very fast 3.2 GHz processors).
The Athlon 64 is 'only' 2.01 GHz, but it gets so much done every clock cycle that it would outperform a 3 GHz Pentium 4 easily, and absolutely owns the Athlon XP you quote.
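A rough way to picture that in code (the work-per-cycle figures below are invented purely to show the relationship, not measured values):

    # Rough sketch: useful work per second ~ clock frequency x work done per clock cycle.
    # The work-per-cycle figures are invented for illustration, not real measurements.
    def relative_perf(clock_ghz, work_per_cycle):
        return clock_ghz * work_per_cycle

    print(relative_perf(3.0, 0.7))  # hypothetical 3 GHz Pentium 4  -> 2.1
    print(relative_perf(2.0, 1.2))  # hypothetical 2 GHz Athlon 64  -> 2.4, ahead despite the lower clock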
Posted: 2006-04-04 03:29pm
by Ypoknons
A 3200+ is hardly top of the range these days, but it is decent.
Posted: 2006-04-04 06:30pm
by Neko_Oni
I've got that processor (the A64 3200+) and it seems perfectly fine, at least for a casual gamer like me.
Posted: 2006-04-04 08:55pm
by Uraniun235
A 3200+ is more than sufficient even for high-end games.
Posted: 2006-04-05 02:16am
by Ace Pace
Uraniun235 wrote:A 3200+ is more than sufficient even for high-end games.
Let's re-aim that: the basic dual core (the X2 3800+?) is the low end now; dual cores offer such usability improvements that I can't see anyone today passing up on them.
Posted: 2006-04-05 03:00am
by Uraniun235
Ace Pace wrote:Uraniun235 wrote:A 3200+ is more than sufficient even for high-end games.
Let's re-aim that: the basic dual core (the X2 3800+?) is the low end now; dual cores offer such usability improvements that I can't see anyone today passing up on them.
The important thing is that a single-core 3200+ still achieves good framerates even with the latest games. Windows being a little snappier on the desktop isn't really worth nearly doubling the price of the CPU to me; that money can be much better spent on beefing up the video card.
Posted: 2006-04-05 03:12am
by Xon
Uraniun235 wrote:A 3200+ is more than sufficient even for high-end games.
Oblivion says no to that
Posted: 2006-04-05 03:33am
by Arthur_Tuxedo
Oblivion is more GPU bound than CPU bound. If you have a good graphics card, a 3200+ will not limit you in Oblivion. You might run into some CPU bottlenecking if you've got a 7900 or X1900, but then you're talking about a system where the GPU is three times as expensive as the CPU.
Posted: 2006-04-05 04:02am
by Stark
Yeah, I use a 3200/64 with Oblivion and it's fine apart from GPU stuff like AA and HDR.
EDIT - In what universe is an AU$440 3800/64 low-end? $100 chips are low end (lolz Celeron), $440 is mid-high.
Posted: 2006-04-05 12:53pm
by Uraniun235
Xon wrote:Uraniun235 wrote:A 3200+ is more than sufficient even for high-end games.
Oblivion says no to that
Care to post some benchmarks to back up that claim?
Posted: 2006-04-05 01:09pm
by Xon
Uraniun235 wrote:Care to post some benchmarks to back up that claim?
linky wrote:
Fair to say, but the reason you won't see a GT as an AGP card is because AGP can barely feed the fewer pipelines of the GS.
I was theorizing that Bethesda actually has some engine problems. The game is not optimized well. I proved my hunch this morning. My intention was to compare SD to HD resolutions, so I ran the game at both 640x480 and 1280x720. The HD vs. SD results are irrelevant; what surprised me the most was that even at 640x480, the stuttering frame-drop issues are still present. In essence, there is definitely a texture load/cache issue that needs further work. When I tweaked iPreloadSizeLimit it really helped with it, but I got CTDs so went back to the default.
Found you!
Yeh I agree with your comments. I have noticed that CPU has a lot to do with framerate once you have a decent card like a 6800GT or higher. I can play like crazy with the grass, objects and shadows, but I still get low frames. HDR chews a lot and the lighting uses heaps of horsepower. Preload size limit made a difference but I still get the occasional hitch. To be honest though I still think it's better than the FEAR engine.
(bolding mine)
Posted: 2006-04-05 01:18pm
by Uraniun235
The first quote doesn't even attempt to demonstrate why that particular instance was CPU bound.
My intention was to compare SD to HD resolutions, so I ran the game at both 640x480 and 1280x720. The HD vs. SD results are irrelevant; what surprised me the most was that even at 640x480, the stuttering frame-drop issues are still present. In essence, there is definitely a texture load/cache issue that needs further work.
In the second quote, you dismiss the very thing you set out to compare, and instead make a comment about texture loading - how do you know that's CPU bound and not GPU bound or even memory bound?
And I asked for benchmarks, as in testing the same aspect of the game with different sets of hardware at various resolutions... as in actual useful data. When two people here say that Oblivion is more GPU bound from their experiences and you've got two guys saying Oblivion is more CPU bound... well, I think that pretty well demonstrates the weakness of anecdotal evidence, don't you?
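For what it's worth, the simplest controlled test is a resolution-scaling run: same scene, same settings, only the resolution changes. If the average framerate barely moves as the resolution climbs, the CPU is the limit; if it falls off sharply, the GPU is. A rough sketch of that check (the sample figures are placeholders, not real benchmark data):

    # Rough sketch: infer the bottleneck from how average fps scales with resolution.
    # The sample figures below are placeholders for illustration, not real benchmark data.
    def bottleneck(fps_by_resolution, tolerance=0.10):
        # Roughly flat fps across resolutions -> CPU-bound; a steady drop -> GPU-bound.
        values = list(fps_by_resolution.values())
        spread = (max(values) - min(values)) / max(values)
        return "CPU-bound" if spread < tolerance else "GPU-bound"

    print(bottleneck({"640x480": 41.0, "1280x720": 40.0, "1600x1200": 39.5}))  # -> CPU-bound
    print(bottleneck({"640x480": 60.0, "1280x720": 44.0, "1600x1200": 28.0}))  # -> GPU-bound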
Posted: 2006-04-05 01:19pm
by Uraniun235
Also: Bethesda making an inefficient, badly optimized engine?
Posted: 2006-04-05 01:21pm
by Xon
Uraniun235 wrote:Also: Bethesda making an inefficient, badly optimized engine?
You got that in one