Comparing 2 processors
Moderator: Thanas
- 2000AD
- Emperor's Hand
- Posts: 6666
- Joined: 2002-07-03 06:32pm
- Location: Leeds, wishing i was still in Newcastle
Comparing 2 processors
I've been looking at the properties of 2 computers:
1 PC is new and has an AMD Athlon 64 3200+, beneath which it says "2.01 GHz"
The other is a 3 year old rig and has an AMD Athlon XP 2600+ which says "2.13 GHz"
Shouldn't the newer PC be running faster?
The reason I ask is that the older one is one that I bought close to 3 years ago, and the new one is one my brother got custom built by his friend, and according to my brother it should be top of the range.
Ph34r teh eyebrow!!11!Writers Guild Sluggite Pawn of Chaos WYGIWYGAINGW so now i have to put ACPATHNTDWATGODW in my sig EBC-Honorary Geordie
Hammerman! Hammer!
- Uraniun235
- Emperor's Hand
- Posts: 13772
- Joined: 2002-09-12 12:47am
- Location: OREGON
- Contact:
Re: Comparing 2 processors
2000AD wrote:I've been looking at the properties of 2 computers:
1 PC is new and has an AMD Athlon 64 3200+, beneath which it says "2.01 GHz"
The other is a 3 year old rig and has an AMD Athlon XP 2600+ which says "2.13 GHz"
Shouldn't the newer PC be running faster?
The reason I ask is that the older one is one that I bought close to 3 years ago, and the new one is one my brother got custom built by his friend, and according to my brother it should be top of the range.
Clock speed (MHz/GHz) is only one small, essentially useless comparator between different processors. Think of it this way (if you know anything about engines): it's like a gasoline engine running at 8000 RPM that only puts out the same horsepower and torque as a diesel engine running at 1200 RPM. Essentially, the newer A64s do more work per cycle than the XPs, and FAR more work per cycle than a P4.
This is the reason AMD started using a rating number instead of a clock value on their processors. The + numbers are supposedly calibrated against the original Athlon 1 GHz processor (some say the P4, but I've heard both). In other words, the A64 3200+ performs as a theoretical Athlon at 3.2 GHz would, while the XP 2600+ performs like an Athlon running at 2.6 GHz. How much of that is true, who knows.
Edit: As far as the 3200+ being "top of the range," it's not. It's at the bottom of AMD's offerings (other than their value chips), and it's actually been discontinued. The real "top of the range" is the FX line of processors. Just FYI.
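A quick back-of-the-envelope on those rating numbers (just a sketch; it assumes the "+" rating really is calibrated in Athlon-equivalent MHz, which as noted above is debatable):

```python
# Sketch: treat the AMD "+" model number as Athlon-equivalent MHz and compare
# it against the chip's actual clock. Clock figures are the ones from this
# thread; the calibration assumption is the debatable part.
chips = {
    "Athlon 64 3200+": {"rating_mhz": 3200, "real_ghz": 2.01},
    "Athlon XP 2600+": {"rating_mhz": 2600, "real_ghz": 2.13},
}

for name, chip in chips.items():
    equiv_ghz = chip["rating_mhz"] / 1000          # theoretical "old Athlon" clock
    work_per_cycle = equiv_ghz / chip["real_ghz"]  # implied work per cycle vs. that Athlon
    print(f"{name}: rated like a {equiv_ghz:.1f} GHz Athlon at only "
          f"{chip['real_ghz']} GHz real clock -> ~{work_per_cycle:.2f}x work per cycle")
```

By that rough yardstick the A64 gets about 1.6x as much done per cycle as the original Athlon, versus about 1.2x for the XP, which is exactly why the lower-clocked chip wins.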
Miles Teg
Now I am become death -- the shatterer of worlds...
-- Oppenheimer 1945
Re: Comparing 2 processors
2000AD wrote:I've been looking at the properties of 2 computers:
1 PC is new and has an AMD Athlon 64 3200+, beneath which it says "2.01 GHz"
The other is a 3 year old rig and has an AMD Athlon XP 2600+ which says "2.13 GHz"
Shouldn't the newer PC be running faster?
The reason I ask is that the older one is one that I bought close to 3 years ago, and the new one is one my brother got custom built by his friend, and according to my brother it should be top of the range.
This is somewhat simplified, but...
Clock speed (GHz = one billion clock cycles per second) is not a measurement of performance.
If you have two processors of the same architecture with all the same attributes that do the same things every clock cycle (say, a 2 GHz Pentium 4 and a 2.2 GHz Pentium 4), THEN clock speed is a good way to compare them.
If you're comparing two processors that do different amounts of work per clock cycle and have different attributes, then clock speed is virtually useless for comparing the two.
That's the same reason the Xbox 360 isn't automatically more powerful than every computer on the market (despite having three 3.2 GHz processors; they're not very fast 3.2 GHz processors).
The Athlon 64 is 'only' 2.01 GHz, but it gets so much done every clock cycle that it would outperform a 3 GHz Pentium 4 easily, and absolutely owns the Athlon XP you quote.
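To put toy numbers on it (the IPC figures below are invented purely for illustration, not measured):

```python
# Toy model: performance ~ clock speed (GHz) x instructions per cycle (IPC).
# The IPC values are made up to illustrate the argument, not real measurements.
def relative_performance(clock_ghz: float, ipc: float) -> float:
    return clock_ghz * ipc  # roughly, billions of instructions per second

athlon_xp = relative_performance(2.13, ipc=1.0)   # baseline
athlon_64 = relative_performance(2.01, ipc=1.4)   # more work per cycle
pentium_4 = relative_performance(3.00, ipc=0.85)  # high clock, less work per cycle

print(f"Athlon XP 2600+ : {athlon_xp:.2f}")  # 2.13
print(f"Athlon 64 3200+ : {athlon_64:.2f}")  # 2.81 - fastest despite the lowest clock
print(f"Pentium 4 3 GHz : {pentium_4:.2f}")  # 2.55
```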
- Uraniun235
- Emperor's Hand
- Posts: 13772
- Joined: 2002-09-12 12:47am
- Location: OREGON
- Contact:
Uraniun235 wrote:A 3200+ is more than sufficient even for high-end games.
Ace Pace wrote:Let's reframe that: the basic dual core (X2 3800+?) is the low end now; dual cores offer such usability improvements that I can't see anyone today passing up on them.
The important thing is that a single-core 3200+ still achieves good framerates even with the latest games. Windows being a little snappier on the desktop isn't really worth nearly doubling the price of the CPU to me; that money can be much better spent on beefing up the video card.
Uraniun235 wrote:A 3200+ is more than sufficient even for high-end games.
Oblivion says no to that.
"Okay, I'll have the truth with a side order of clarity." ~ Dr. Daniel Jackson.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
- Arthur_Tuxedo
- Sith Acolyte
- Posts: 5637
- Joined: 2002-07-23 03:28am
- Location: San Francisco, California
Oblivion is more GPU bound than CPU bound. If you have a good graphics card, a 3200+ will not limit you in Oblivion. You might run into some bottlenecking if you've got a 7900 or X1900, but then you're talking about a system where the GPU is 3 times as expensive as the CPU.
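The quick sanity check for which side is the bottleneck is to re-run the same scene at several resolutions: GPU load scales with pixel count while CPU load mostly doesn't, so a framerate that barely moves means the CPU is the wall. A minimal sketch of that logic (all FPS numbers invented):

```python
# Classic bottleneck test: identical scene, several resolutions. GPU work scales
# with pixel count; CPU work (AI, physics, draw-call setup) mostly does not.
# The sample FPS readings below are invented for illustration.

def diagnose_bottleneck(fps_by_resolution: dict, tolerance: float = 0.10) -> str:
    """Flat FPS across resolutions (within tolerance) suggests a CPU limit."""
    fps = list(fps_by_resolution.values())
    spread = (max(fps) - min(fps)) / max(fps)
    return "CPU bound" if spread < tolerance else "GPU bound"

print(diagnose_bottleneck({"640x480": 62, "1280x720": 41, "1600x1200": 24}))  # GPU bound
print(diagnose_bottleneck({"640x480": 28, "1280x720": 27, "1600x1200": 26}))  # CPU bound
```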
"I'm so fast that last night I turned off the light switch in my hotel room and was in bed before the room was dark." - Muhammad Ali
"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
- Uraniun235
- Emperor's Hand
- Posts: 13772
- Joined: 2002-09-12 12:47am
- Location: OREGON
- Contact:
Uraniun235 wrote:Care to post some benchmarks to back up that claim?
Oblivion CPU bound at times and other findings wrote: First of all I have run into at least one instance, in the city, where the game is CPU bound. This was at 27fps. Interesting or not, the game does have its CPU bound moments.
(bolding mine)
Found you!
linky wrote:Fair to say, but the reason you won't see a GT as an AGP card is because AGP can barely feed the fewer pipelines of the GS.
I was theorizing that Bethesda actually has some engine problems. The game is not optimized well, and I proved my hunch this morning. My intention was to compare SD to HD resolutions, so I ran the game at both 640x480 and 1280x720. The HD vs. SD results are irrelevant; what surprised me the most was that even at 640x480, the stuttering frame-drop issues are still present. In essence, there is definitely a texture load/cache issue that needs further work. Tweaking iPreloadSizeLimit really helped with it, but I got CTDs, so I went back to the default.
Yeah, I agree with your comments. I have noticed that the CPU has a lot to do with framerate once you have a decent card like a 6800 GT or higher. I can play like crazy with the grass, objects and shadows, but I still get low frames. HDR chews a lot, and the lighting uses heaps of horsepower. The preload size limit made a difference, but I still get the occasional hitch. To be honest, though, I still think it's better than the FEAR engine.
"Okay, I'll have the truth with a side order of clarity." ~ Dr. Daniel Jackson.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
- Uraniun235
- Emperor's Hand
- Posts: 13772
- Joined: 2002-09-12 12:47am
- Location: OREGON
- Contact:
The first quote doesn't even attempt to demonstrate why that particular instance was CPU bound.
And I asked for benchmarks, as in testing the same aspect of the game with different sets of hardware at various resolutions... as in actual useful data. When two people here say that Oblivion is more GPU bound from their experiences and you've got two guys saying Oblivion is more CPU bound... well, I think that pretty well demonstrates the weakness of anecdotal evidence, don't you?
My intention was to compare SD to HD resolutions, so I ran the game at both 640x480 and 1280x720. The HD vs. SD results are irrelevant; what surprised me the most was that even at 640x480, the stuttering frame-drop issues are still present. In essence, there is definitely a texture load/cache issue that needs further work.
In the second quote, you dismiss the very thing you set out to compare, and instead make a comment about texture loading - how do you know that's CPU bound and not GPU bound or even memory bound?
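For the record, this is the shape of data I mean by "benchmarks": same scene, same GPU, different CPUs, and see whether the framerate actually moves. A sketch (every number below is invented, and the hardware names are just placeholders):

```python
# Sketch of a proper benchmark comparison: hold the GPU and scene constant,
# swap the CPU, and check whether FPS scales. All figures are invented.
results = [
    {"cpu": "Athlon 64 3200+", "gpu": "7900 GT", "scene": "city",       "fps": 27},
    {"cpu": "Athlon 64 FX-60", "gpu": "7900 GT", "scene": "city",       "fps": 41},
    {"cpu": "Athlon 64 3200+", "gpu": "7900 GT", "scene": "wilderness", "fps": 35},
    {"cpu": "Athlon 64 FX-60", "gpu": "7900 GT", "scene": "wilderness", "fps": 37},
]

for scene in sorted({r["scene"] for r in results}):
    fps = [r["fps"] for r in results if r["scene"] == scene]
    scaling = max(fps) / min(fps)
    verdict = "CPU limited" if scaling > 1.15 else "not CPU limited"
    print(f"{scene}: {min(fps)} -> {max(fps)} fps ({scaling:.2f}x from the faster CPU), {verdict}")
```

If the city scene scales with the CPU and the wilderness scene doesn't, you've actually learned where the bottleneck is, which is more than any number of one-off anecdotes will tell you.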