What the fuck is wrong with videocard names?
Moderator: Thanas
- Darth Wong
- Sith Lord
- Posts: 70028
- Joined: 2002-07-03 12:25am
- Location: Toronto, Canada
- Contact:
Seriously, shouldn't it be reasonable to expect that if the number is higher, then the performance should be higher? If I'm going through a computer store and I see a Radeon X1300 Pro next to a Radeon X800 GTO, wouldn't it be perfectly reasonable of me to assume that the X1300 Pro is significantly faster, rather than being significantly slower?
What the fuck is up with these model numbers? NVidia does the same thing. A 6200 might be slower than a 5500, for example.
"It's not evil for God to do it. Or for someone to do it at God's command."- Jonathan Boyd on baby-killing
"you guys are fascinated with the use of those "rules of logic" to the extent that you don't really want to discussus anything."- GC
"I do not believe Russian Roulette is a stupid act" - Embracer of Darkness
"Viagra commercials appear to save lives" - tharkûn on US health care.
http://www.stardestroyer.net/Mike/RantMode/Blurbs.html
- Fingolfin_Noldor
- Emperor's Hand
- Posts: 11834
- Joined: 2006-05-15 10:36am
- Location: At the Helm of the HAB Star Dreadnaught Star Fist
The two companies follow a silly convention where the first digit signifies the series number. Then they make things worse by tacking on a grab-bag of letter suffixes to signify how fast the card is.
The Radeon 9000 series was particularly bad. They had a bizarre lineup of 9100, 9200, 9500, 9600 (the 9500 is actually faster), 9700, 9800. They have improved somewhat, but now there's a shitload of XTX, XT, Pro, GT, GTX and what bloody not. Typically, now, the two companies maintain three categories of performance per series. It's X1300 and 7300 for entry level for ATI and Nvidia, X1600 and 7600 for midrange, and X1800/X1900 and 7800/7900 for high end. Note though that there was a huge jump in performance between X1800 and X1900, and similarly between 7800 and 7900.
It takes time to work one's way around. Try the online computer stores... they might be easier to navigate.
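The pattern described above (leading digit = generation, hundreds digit = market tier, letter suffix = speed grade) can be captured in a toy decoder. The tier and suffix tables below are illustrative guesses based on this thread, not any official mapping:

```python
# Toy decoder for the circa-2006 ATI/Nvidia naming pattern: leading digit(s)
# = generation, hundreds digit = market tier, suffix = speed grade.
# TIERS and SUFFIXES are illustrative assumptions, not official data.

TIERS = {"3": "entry-level", "6": "midrange", "8": "high-end", "9": "high-end"}
SUFFIXES = ["SE", "LE", "Pro", "GT", "GTO", "XL", "GTX", "XT", "XTX", "Ultra"]

def decode(name: str):
    """Split e.g. 'X1600 Pro' or '7900 GTX' into (generation, tier, suffix)."""
    parts = name.lstrip("X").split()            # drop ATI's leading 'X'
    digits = parts[0]
    suffix = parts[1] if len(parts) > 1 else ""
    generation = digits[:-3]                    # digit(s) before the last three
    tier = TIERS.get(digits[-3], "?")           # hundreds digit: market segment
    return generation, tier, suffix

def speed_rank(suffix: str) -> int:
    """Rough ordering of the letter soup; -1 if unknown."""
    return SUFFIXES.index(suffix) if suffix in SUFFIXES else -1

print(decode("X1600 Pro"))                  # ('1', 'midrange', 'Pro')
print(speed_rank("GTX") > speed_rank("GT")) # True
```

Of course, the whole complaint in this thread is that real products (X800 GTO vs. X1300 Pro) break exactly this kind of tidy scheme.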
- Nephtys
- Sith Acolyte
- Posts: 6227
- Joined: 2005-04-02 10:54pm
- Location: South Cali... where life is cheap!
For the Radeons, I think the new set's been like so...
Last Generation Low/Med/High End: X300/X600/X800.
This Gen Low/Med/High End: X1300, X1600, X1800.
As with NVidia...
Last Gen Med/High End: 6600GT/6800GT
New set: 7600GT / 7800GT.
The first number is now a generation designator, it seems. I'm relatively new to this whole deal too.
- FSTargetDrone
- Emperor's Hand
- Posts: 7878
- Joined: 2004-04-10 06:10pm
- Location: Drone HQ, Pennsylvania, USA
From my brief stint at ATI, I noticed that the internal naming conventions for GPUs actually made sense compared to the marketing nonsense. Part of it is probably because there are a number of different cards aimed at different markets, but apparently for business reasons they decided against differentiating this in any meaningful way.
Whereas the 9800 was a full-featured gaming card for the PC, something like the 9200 was aimed at small size, low power consumption and everything else that was desired in a mobile chipset. It was ridiculous; even fellow employees had trouble recalling the relative differences between model lines.
Now that the megahertz myth is pretty much debunked, aren't CPUs starting to adopt a confusing nomenclature as well? I recall some laughter about Pentium Extremes or whatnot.
They fucking had it with Geforce 4 and pissed it away.
It was fucking perfect. Want the expensive performance beast? Fine, it's a Geforce X GTX. Want the durable performer? Get the Geforce X GT. And want the poor man's card? Get that $99.99 Geforce MX card. And of course the rich man's card, which cost twice as much as the GTX for 6% more performance? Why, get the Geforce X GTX ULTRA; it's even got ULTRA in its name!
It was @#$@ perfect! And then two generations later we had GL's, GFTX's, super duper fucking MX2's and all sorts of other shit.
"A cult is a religion with no political power." -Tom Wolfe
Pardon me for sounding like a dick, but I'm playing the tiniest violin in the world right now-Dalton
- Fingolfin_Noldor
- Emperor's Hand
- Posts: 11834
- Joined: 2006-05-15 10:36am
- Location: At the Helm of the HAB Star Dreadnaught Star Fist
Part of the problem was that they had a propensity for designing two sets of graphics cores at the same time. So after the X1800/7800 came out, they threw in the X1900/7900.
Then there's the fact that they bin the cores, deactivating the pipelines/shaders and whatever other features on chips that didn't come out of the fab at full quality.
Next, they make adjustments to the clock speed and the memory speed. Then they divide up the whole lot and give the variants different names.
Then they made things worse with further changes to the existing cores and so on, and now you even have the 7950 GX2 (which is actually two boards smacked together) and the X1950 XTX.
Fingolfin_Noldor wrote: ... 9200, 9500, 9600 (9500 is actually faster), 9700, 9800.

While I detest all the marketing bullshit surrounding graphics cards (the below-9500 cards and the Geforce 4 MX cause particular ire, since they are technically part of older generations while having names that put them in current generations; cue the howling of uninformed customers on various games' forums, who feel betrayed by the developer), the 9600 is faster than the 9500 in today's games, but at the time of its introduction it was slower. This is because the 9500 is, IIRC, slightly faster clocked and thus had a performance advantage in the DX7 games that were the standard of the day. Today, however, DX9 games that use shaders are the standard, and the 9500's shader unit is somewhat crippled by a very low instructions-per-pass ability (obscured by the drivers, but it exists), while the 9600's is much better. I can't recall the exact statistic, but I believe the difference was 20-30% in favor of the 9600 in shader-intensive games, while the 9500 enjoyed a 10% lead in non-shader games.
- Ace Pace
- Hardware Lover
- Posts: 8456
- Joined: 2002-07-07 03:04am
- Location: Wasting time instead of money
- Contact:
The reason for the 9500 being faster was both a faster clock and the fact that it was basically a cut-down 9700 core: it could use all of that core's features, and it could VERY easily be soft-modded into a 9700 Pro.
Therefore, the 9500 gained a reputation as a very, very good card, while in fact, unmodded, it was roughly equivalent to the 9600.
After the 9500, neither nVidia nor ATi ever let a soft-moddable card go into the market. Until the same thing happened to ATi with the X800 GTO² (don't ask), which could apparently be modded into an X800 XT without touching hardware.
Brotherhood of the Bear | HAB | Mess | SDnet archivist |
- Arthur_Tuxedo
- Sith Acolyte
- Posts: 5637
- Joined: 2002-07-23 03:28am
- Location: San Francisco, California
It's really quite ridiculous, and I'm willing to bet that if someone bothered to do a study, they'd find that it hurts sales. Any intelligent person can come up with much better naming systems than the drivel we've seen in the last few years.
"I'm so fast that last night I turned off the light switch in my hotel room and was in bed before the room was dark." - Muhammad Ali
"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
- General Zod
- Never Shuts Up
- Posts: 29211
- Joined: 2003-11-18 03:08pm
- Location: The Clearance Rack
- Contact:
It's about as bad as the latest processor naming conventions. You'll get chips rated at 3000, 4000, 2400, etc. that give you virtually no frakking clue as to their actual clock speed. A 3000 can be anywhere from 2.5 GHz to 3+ GHz, and so on, which makes it infuriating to tell what a chip is at a glance. It would be nice if there were a universal naming convention for chipsets in general (graphics, CPUs, etc.) that was easy to decipher.
"It's you Americans. There's something about nipples you hate. If this were Germany, we'd be romping around naked on the stage here."
- Count Dooku
- Jedi Knight
- Posts: 577
- Joined: 2006-01-18 11:37pm
- Location: California
- Uraniun235
- Emperor's Hand
- Posts: 13772
- Joined: 2002-09-12 12:47am
- Location: OREGON
- Contact:
General Zod wrote: It's about as bad as the latest processor naming conventions. You'll get chips rated at 3000, 4000, 2400, etc. that gives you virtually no frakking clue as to their actual clock speed. A 3000 can be anywhere from 2.5 ghz to 3+ ghz, and so on which makes it infuriating to tell what it is at a glance. It would be nice if there were a universal naming convention for chipsets in general (graphics, cpus, etc) that was easy to decipher.

Why do you need to know the clockspeed? I'm pretty sure the AMD system is actually one of the sanest; a 3200+ is pretty well definitely going to be superior to a 3000+.
Things get a bit dodgier when you try and compare gaming performance between A64 and X2 chips, but even within the X2 lineup bigger = better.
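The reason a model number can beat a raw clock figure is that performance is roughly instructions-per-clock times clock speed, and IPC differs wildly between architectures. A back-of-the-envelope sketch (the IPC figures are made-up illustrative numbers, not measurements):

```python
# Rough model: performance ~ IPC (instructions per clock) * clock speed.
# The IPC values below are invented for illustration, not benchmarks.

chips = {
    "Pentium 4 @ 2.4 GHz": (0.9, 2.4e9),   # (assumed IPC, clock in Hz)
    "Athlon 64 @ 2.0 GHz": (1.4, 2.0e9),
}

for name, (ipc, hz) in chips.items():
    print(f"{name}: ~{ipc * hz / 1e9:.2f} billion instructions/s")

# The lower-clocked chip comes out ahead, which is why comparing raw
# clock speeds across architectures is misleading.
```

This is exactly the gap a per-vendor rating like "3200+" papers over: it ranks chips within one family without inviting a bogus cross-architecture MHz comparison.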
"There is no "taboo" on using nuclear weapons." -Julhelm
What is Project Zohar?
"On a serious note (well not really) I did sometimes jump in and rate nBSG episodes a '5' before the episode even aired or I saw it." - RogueIce explaining that episode ratings on SDN tv show threads are bunk
- General Zod
- Never Shuts Up
- Posts: 29211
- Joined: 2003-11-18 03:08pm
- Location: The Clearance Rack
- Contact:
Uraniun235 wrote: Why do you need to know the clockspeed? I'm pretty sure the AMD system is actually one of the sanest; a 3200+ is pretty well definitely going to be superior to a 3000+. Things get a bit dodgier when you try and compare gaming performance between A64 and X2 chips, but even within the X2 lineup bigger = better.

It's an old habit. I'm used to rating computer performance, and whether or not I want to go for the chip, based on that. Ever since they switched to this new system it's been confusing the hell out of me.
"It's you Americans. There's something about nipples you hate. If this were Germany, we'd be romping around naked on the stage here."
- RedImperator
- Roosevelt Republican
- Posts: 16465
- Joined: 2002-07-11 07:59pm
- Location: Delaware
- Contact:
FSTargetDrone wrote: It's like names given to automobiles. Fancy looking labels that are meaningless.

Yeah, but at least car names are memorable. "Corvette" and "Camry" are meaningless, but I have a general idea what the difference is between them, and the technical data is easier to remember when it's attached to a name instead of a series of letters and numbers.
Any city gets what it admires, will pay for, and, ultimately, deserves…We want and deserve tin-can architecture in a tinhorn culture. And we will probably be judged not by the monuments we build but by those we have destroyed.--Ada Louise Huxtable, "Farewell to Penn Station", New York Times editorial, 30 October 1963
X-Ray Blues
- AniThyng
- Sith Devotee
- Posts: 2777
- Joined: 2003-09-08 12:47pm
- Location: Took an arrow in the knee.
- Contact:
General Zod wrote: It's an old habbit. I'm used to rating computer performance and whether or not I want to go for the chip based on that. Ever since they switched to this new system it's been confusing the hell out of me.

But you realise, of course, that it's equally meaningless to pit a 1.66 GHz Athlon XP against a 1.66GHz P4, or against a 1.66GHz Core 2 Duo...
Edit to save a post: it goes beyond obscure video card series number/letter conventions. Take the 512MB X1300: the uninformed will see 512 MB of video RAM and think it's great, when of course an X1300 is barely capable of using 64MB properly, never mind 512.
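The 512MB point comes down to memory bandwidth: a low-end card's narrow, slow bus can't feed that much RAM usefully. The standard peak-bandwidth formula makes it concrete (the bus widths and clocks below are rough assumptions for a low-end vs. high-end card of the era, not exact X1300 specs):

```python
# Peak memory bandwidth = bus width (bytes) * memory clock * transfers/clock.
# The example figures are rough assumptions for cards of this era.

def bandwidth_gb_s(bus_bits: int, mem_clock_mhz: float, ddr_rate: int = 2) -> float:
    """Peak bandwidth in GB/s for a DDR memory interface."""
    return (bus_bits / 8) * mem_clock_mhz * 1e6 * ddr_rate / 1e9

low_end = bandwidth_gb_s(64, 400)     # narrow 64-bit bus, slow memory
high_end = bandwidth_gb_s(256, 750)   # wide 256-bit bus, fast memory

print(f"low-end:  {low_end:.1f} GB/s")   # ~6.4 GB/s
print(f"high-end: {high_end:.1f} GB/s")  # ~48.0 GB/s
```

At a few GB/s, streaming 512 MB of textures every frame would saturate the bus long before the extra capacity paid off, which is why the big RAM number on a budget card is mostly a sticker.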
Last edited by AniThyng on 2006-10-11 10:09pm, edited 1 time in total.
I do know how to spell
AniThyng is merely the name I gave to what became my favourite Baldur's Gate II mage character
- Arthur_Tuxedo
- Sith Acolyte
- Posts: 5637
- Joined: 2002-07-23 03:28am
- Location: San Francisco, California
Not equally meaningless, much more meaningless. CPU naming may have its foibles, but it's a shining beacon of righteousness compared to GPU naming.
"I'm so fast that last night I turned off the light switch in my hotel room and was in bed before the room was dark." - Muhammad Ali
"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
- General Zod
- Never Shuts Up
- Posts: 29211
- Joined: 2003-11-18 03:08pm
- Location: The Clearance Rack
- Contact:
AniThyng wrote: But you realise of course, that it's equally meaningless to pit a 1.66 GHz Athlon XP against a 1.66GHz P4, or against a 1.66GHz Core 2 Duo...

Obviously. But when you're familiar with how the chips work, knowing the clockspeed and bus gives you a fairly good indication of how they'll stack up against other chips, as does the core design (single core, dual, etc.). But some meaningless number that simply suggests a range is worthless.
"It's you Americans. There's something about nipples you hate. If this were Germany, we'd be romping around naked on the stage here."
- Uraniun235
- Emperor's Hand
- Posts: 13772
- Joined: 2002-09-12 12:47am
- Location: OREGON
- Contact:
General Zod wrote: Obviously. But when you're familiar with how the chips work, knowing the clockspeed and bus gives you a fairly good indication of how they'll stack up against other chips, as well as the core design (single core, dual, etc). But some meaningless number that simply suggests a range is worthless.

I don't think the product name necessarily needs to include the clockspeed though, because that can invite things like someone seeing a 2.4GHz P4 and a 2.0 GHz A64 and thinking the P4 is the superior chip, when in fact the A64 will blow it away.
Honestly, if you know enough to be able to recognize that comparing clockspeeds between two different architectures is next to meaningless, you ought to be looking up benchmarks for the chips on the internet so you can get an even better sense of differences in chip performance beyond "well this chip is 200MHz faster".
I do think Intel really stumbled with the whole Core Duo -> Core 2 Duo thing. That's just crazy.
"There is no "taboo" on using nuclear weapons." -Julhelm
What is Project Zohar?
"On a serious note (well not really) I did sometimes jump in and rate nBSG episodes a '5' before the episode even aired or I saw it." - RogueIce explaining that episode ratings on SDN tv show threads are bunk
Stark wrote: For a moment I thought this thread would be about the retarded third-party named cards, or the whole 'poorly rendered titties' boxart.

I'm glad someone else brought this up. What the hell is with the "cover art" of graphics cards? Most of it looks worse than the graphics the card inside could render in real-time.
- Darth Quorthon
- Jedi Knight
- Posts: 580
- Joined: 2005-09-25 12:04am
- Location: California
Uraniun235 wrote: Why do you need to know the clockspeed? I'm pretty sure the AMD system is actually one of the sanest; a 3200+ is pretty well definitely going to be superior to a 3000+.

That reminds me of a conversation I had with a guy at my work. He was looking to buy a new computer, and his choices were between a single-core P4 at 3.6GHz and a dual-core at 3.2GHz. I told him to go for dual-core, and he said, "but it's 400MHz slower". It's fun explaining to a not-so-computer-savvy fellow that the clockspeed wars are over. I've often privately wondered if, in the past, system builders were touting clockspeed over actual performance to take buyers for a ride. I remember a couple of years ago, when the gulf between Intel and AMD clockspeeds was nearly 1 GHz, Intel said that AMD couldn't "keep up", and a guy from AMD said, "I'm not interested in clockspeeds, I'm interested in performance".
How did we ever make decisions on what hardware to buy before all of these benchmarking sites appeared?
"For the first few weeks of rehearsal, we tend to sound like a really, really bad Rush tribute band." -Alex Lifeson
"See, we plan ahead, that way we don't do anything right now." - Valentine McKee
"Next time you're gonna be a bit higher!" -General from Birani
"A cynic is a man who, when he smells flowers, looks around for a coffin." - H. L. Mencken
He who creates shields by fire - Rotting Christ, Lex Talionis