What the fuck is wrong with videocard names?

GEC: Discuss gaming, computers and electronics and venture into the bizarre world of STGODs.

Moderator: Thanas

Darth Wong
Sith Lord
Posts: 70028
Joined: 2002-07-03 12:25am
Location: Toronto, Canada

Post by Darth Wong »

Seriously, shouldn't it be reasonable to expect that if the number is higher, then the performance should be higher? If I'm going through a computer store and I see a Radeon X1300 Pro next to a Radeon X800 GTO, wouldn't it be perfectly reasonable of me to assume that the X1300 Pro is significantly faster, rather than being significantly slower?

What the fuck is up with these model numbers? NVidia does the same thing. A 6200 might be slower than a 5500, for example.
"It's not evil for God to do it. Or for someone to do it at God's command."- Jonathan Boyd on baby-killing

"you guys are fascinated with the use of those "rules of logic" to the extent that you don't really want to discussus anything."- GC

"I do not believe Russian Roulette is a stupid act" - Embracer of Darkness

"Viagra commercials appear to save lives" - tharkûn on US health care.

http://www.stardestroyer.net/Mike/RantMode/Blurbs.html
Fingolfin_Noldor
Emperor's Hand
Posts: 11834
Joined: 2006-05-15 10:36am
Location: At the Helm of the HAB Star Dreadnaught Star Fist

Post by Fingolfin_Noldor »

The two companies follow a silly convention where the first digit signifies the series number. Then they make things worse by tacking on a variety of letter suffixes to signify how fast the card is.

The Radeon 9000 series was particularly bad. They had a bizarre lineup of 9100, 9200, 9500, 9600 (the 9500 is actually faster), 9700 and 9800. They have improved somewhat, but now there's a shitload of XTX, XT, Pro, GT, GTX and what bloody not. Typically, the two companies now maintain three categories of performance per series: the X1300 and 7300 at entry level for ATI and Nvidia respectively, the X1600 and 7600 at midrange, and the X1800/X1900 and 7800/7900 at the high end. Note though that there was a huge jump in performance between the X1800 and X1900, and similarly between the 7800 and 7900.

It takes time to work one's way around it. Try the online computer stores; they might be easier to navigate.
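As an aside, the scheme described above can be sketched as a toy decoder (the digit-to-tier mapping here is just an assumed reading of this post, not anything official from ATI or Nvidia):

```python
# Toy decoder: leading digit = generation/series, hundreds digit = performance tier.
# The mapping is illustrative, inferred from the post above -- not a vendor convention.
TIER_NAMES = {3: "entry", 6: "midrange", 8: "high end", 9: "high end"}

def decode(model: str) -> tuple[int, str]:
    number = int("".join(c for c in model if c.isdigit()))  # "X1300" -> 1300
    generation = number // 1000                             # 1 for X1300, 7 for 7600
    tier = TIER_NAMES.get(number % 1000 // 100, "unknown")
    return generation, tier

# The bigger number is not the faster card:
print(decode("X1300"))  # (1, 'entry')    -- newer series, bottom tier
print(decode("X800"))   # (0, 'high end') -- older naming scheme, top tier
```

Which is exactly the original complaint: a numeric sort puts the X1300 ahead of the X800 even though it sits a tier below it.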
Nephtys
Sith Acolyte
Posts: 6227
Joined: 2005-04-02 10:54pm
Location: South Cali... where life is cheap!

Post by Nephtys »

For the Radeons, I think the new set's been like so...

Last Generation Low/Med/High End: X300/X600/X800.
This Gen Low/Med/High End: X1300, X1600, X1800.

As with NVidia...

Last Gen Med/High End: 6600GT/6800GT
New set: 7600GT/7800GT.

The first number is now a generation designator, it seems. I'm relatively new to this whole deal too. :P
FSTargetDrone
Emperor's Hand
Posts: 7878
Joined: 2004-04-10 06:10pm
Location: Drone HQ, Pennsylvania, USA

Post by FSTargetDrone »

It's like the names given to automobiles: fancy-looking labels that are meaningless.
Elessar
Padawan Learner
Posts: 281
Joined: 2004-10-06 02:56pm
Location: Toronto, ON

Post by Elessar »

From my brief stint at ATI, I noticed that the internal naming conventions for GPUs actually made sense compared to the marketing nonsense. Part of it is probably because there are a number of different cards aimed at different markets, but apparently, for business reasons, they decided against differentiating them in any meaningful way.

Whereas the 9800 was a full-featured gaming card for the PC, something like the 9200 was aimed at small size, low power consumption and everything else desirable in a mobile chipset. It was ridiculous; fellow employees had trouble recalling the relative differences between the model lines.

Now that the megahertz myth has pretty much been debunked, aren't CPUs starting to adopt a confusing nomenclature as well? I recall some laughter about Pentium Extremes or whatnot.
Mr Bean
Lord of Irony
Posts: 22464
Joined: 2002-07-04 08:36am

Post by Mr Bean »

They fucking had it with the Geforce 4 and pissed it away.

It was fucking perfect. Want the expensive performance beast? Fine, it's a Geforce X GTX. Want the durable performer? Get the Geforce X GT. And want the poor man's card? Get that $99.99 Geforce MX card. And of course the rich man's card, which cost twice as much as the GTX for 6% more performance? Why, get the Geforce X GTX ULTRA; it's even got ULTRA in its name!

It was @#$@ perfect! And then two generations later we had GLs, GFTXs, super duper fucking MX2s and all sorts of other shit.

"A cult is a religion with no political power." -Tom Wolfe
Pardon me for sounding like a dick, but I'm playing the tiniest violin in the world right now-Dalton
Fingolfin_Noldor
Emperor's Hand
Posts: 11834
Joined: 2006-05-15 10:36am
Location: At the Helm of the HAB Star Dreadnaught Star Fist

Post by Fingolfin_Noldor »

Part of the problem was that they had a propensity for designing two sets of graphics cores at the same time. So after the X1800/7800 came out, they threw in the X1900/7900.

Then comes the fact that they bin the cores, deactivating pipelines/shaders and whatever other features on the chips that didn't come out of fabrication at full quality.

Next, they made adjustments to the clock speed and the memory speed. So then they divide the whole lot up and give the variants different names.

Then they made things worse with new revisions of the existing cores and so on, and now you even have the 7950 GTX (which is actually two boards smacked together) and the X1950 GTX.
Ace Pace
Hardware Lover
Posts: 8456
Joined: 2002-07-07 03:04am
Location: Wasting time instead of money

Post by Ace Pace »

You also have many brand names making up their own letter designations to 'separate' their cards from the masses. Never mind that brand name alone is apparently enough to put makers like BFG ahead of the pack, regardless of whether they use OCed cores or not.
Brotherhood of the Bear | HAB | Mess | SDnet archivist |
Beowulf
The Patrician
Posts: 10621
Joined: 2002-07-04 01:18am
Location: 32ULV

Post by Beowulf »

The two-GPU card is the 7950GX2, not to be confused with the 7950GT, which is actually slower than the 7900GTX.
"preemptive killing of cops might not be such a bad idea from a personal saftey[sic] standpoint..." --Keevan Colton
"There's a word for bias you can't see: Yours." -- William Saletan
Netko
Jedi Council Member
Posts: 1925
Joined: 2005-03-30 06:14am

Post by Netko »

Fingolfin_Noldor wrote:... 9200, 9500, 9600 (9500 is actually faster), 9700, 9800.
While I detest all the marketing bullshit surrounding graphics cards (the below-9500 cards and the Geforce 4 MX cause particular ire, since they are technically part of older generations while carrying names that place them in current ones; cue the howling of uninformed customers on various games' forums, where they feel betrayed by the developer), the 9600 is faster than the 9500 in today's games, but at the time of its introduction it was slower. This is because the 9500 is, IIRC, slightly higher clocked, and thus had a performance advantage in the DX7 games that were the standard of the day. Today, however, DX9 games that use shaders are the standard, and the 9500's shader unit is somewhat crippled by a very low instructions-per-pass ability, which the drivers obscure but which is there all the same, while the 9600's is much better. I can't recall the exact figures, but I believe the difference was 20-30% in favour of the 9600 in shader-intensive games, while the 9500 enjoyed a 10% lead in non-shader games.
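That crossover can be illustrated with a toy bottleneck model. All the throughput numbers below are invented purely for illustration; the point is only that a card with a higher clock but a weaker shader unit wins light-shader workloads and loses heavy ones:

```python
# Toy model: frame time = non-shader work / fill rate + shader work / shader rate.
# All rates are made-up units, chosen only to show the crossover effect.
def fps(fill_rate: float, shader_rate: float, shader_fraction: float) -> float:
    frame_time = (1 - shader_fraction) / fill_rate + shader_fraction / shader_rate
    return 1 / frame_time

r9500 = dict(fill_rate=1.1, shader_rate=0.7)  # higher clock, weak shader unit (assumed)
r9600 = dict(fill_rate=1.0, shader_rate=1.0)  # lower clock, stronger shader unit (assumed)

print(fps(**r9500, shader_fraction=0.1))  # DX7-style game: the 9500 comes out ahead
print(fps(**r9600, shader_fraction=0.1))
print(fps(**r9500, shader_fraction=0.8))  # shader-heavy DX9 game: the 9600 comes out ahead
print(fps(**r9600, shader_fraction=0.8))
```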
Ypoknons
Jedi Knight
Posts: 999
Joined: 2003-05-13 06:02am
Location: Manhattan (school year), Hong Kong (vacations)

Post by Ypoknons »

I've always loved Anandtech's video card guides. The video card branding would probably get an F in marketing these days...
Ace Pace
Hardware Lover
Posts: 8456
Joined: 2002-07-07 03:04am
Location: Wasting time instead of money

Post by Ace Pace »

The reason the 9500 was faster was both a higher clock and the fact that it was basically a cut-down 9700 core: it could use all of that core's features and be VERY easily soft-modded into a 9700 Pro.

Therefore, the 9500 gained a reputation as a very, very good card, while in fact, unmodded, it was roughly equivalent to the 9600.

After the 9500, neither nVidia nor ATi ever let a soft-moddable card go into the market. Until the same thing happened to ATi with the X800GT0^2 (don't ask), which could apparently be modded to an X800XT without touching the hardware.
Brotherhood of the Bear | HAB | Mess | SDnet archivist |
Arthur_Tuxedo
Sith Acolyte
Posts: 5637
Joined: 2002-07-23 03:28am
Location: San Francisco, California

Post by Arthur_Tuxedo »

It's really quite ridiculous, and I'm willing to bet that if someone bothered to do a study, they'd find that it hurts sales. Any intelligent person can come up with much better naming systems than the drivel we've seen in the last few years.
"I'm so fast that last night I turned off the light switch in my hotel room and was in bed before the room was dark." - Muhammad Ali

"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
General Zod
Never Shuts Up
Posts: 29211
Joined: 2003-11-18 03:08pm
Location: The Clearance Rack

Post by General Zod »

It's about as bad as the latest processor naming conventions. You'll get chips rated at 3000, 4000, 2400, etc., which give you virtually no frakking clue as to their actual clock speed. A 3000 can be anywhere from 2.5 GHz to 3+ GHz, which makes it infuriating to tell what a chip is at a glance. It would be nice if there were a universal naming convention for chips in general (graphics, CPUs, etc.) that was easy to decipher.
"It's you Americans. There's something about nipples you hate. If this were Germany, we'd be romping around naked on the stage here."
Count Dooku
Jedi Knight
Posts: 577
Joined: 2006-01-18 11:37pm
Location: California

Post by Count Dooku »

I bought my GPU and CPU based on several benchmark scores. To be completely honest, I don't have one company that I favor over another.
"Religion is regarded by the common people as true, by the wise as false, and by the rulers as useful." (Seneca the Younger, 5 BC - 65 AD)
Uraniun235
Emperor's Hand
Posts: 13772
Joined: 2002-09-12 12:47am
Location: OREGON

Post by Uraniun235 »

General Zod wrote:It's about as bad as the latest processor naming conventions. You'll get chips rated at 3000, 4000, 2400, etc., which give you virtually no frakking clue as to their actual clock speed. A 3000 can be anywhere from 2.5 GHz to 3+ GHz, which makes it infuriating to tell what a chip is at a glance. It would be nice if there were a universal naming convention for chips in general (graphics, CPUs, etc.) that was easy to decipher.
Why do you need to know the clockspeed? I'm pretty sure the AMD system is actually one of the sanest; a 3200+ is pretty well definitely going to be superior to a 3000+.

Things get a bit dodgier when you try and compare gaming performance between A64 and X2 chips, but even within the X2 lineup bigger = better.
"There is no "taboo" on using nuclear weapons." -Julhelm
What is Project Zohar?
"On a serious note (well not really) I did sometimes jump in and rate nBSG episodes a '5' before the episode even aired or I saw it." - RogueIce explaining that episode ratings on SDN tv show threads are bunk
General Zod
Never Shuts Up
Posts: 29211
Joined: 2003-11-18 03:08pm
Location: The Clearance Rack

Post by General Zod »

Uraniun235 wrote:
General Zod wrote:It's about as bad as the latest processor naming conventions. You'll get chips rated at 3000, 4000, 2400, etc., which give you virtually no frakking clue as to their actual clock speed. A 3000 can be anywhere from 2.5 GHz to 3+ GHz, which makes it infuriating to tell what a chip is at a glance. It would be nice if there were a universal naming convention for chips in general (graphics, CPUs, etc.) that was easy to decipher.
Why do you need to know the clockspeed? I'm pretty sure the AMD system is actually one of the sanest; a 3200+ is pretty well definitely going to be superior to a 3000+.

Things get a bit dodgier when you try and compare gaming performance between A64 and X2 chips, but even within the X2 lineup bigger = better.
It's an old habit. I'm used to rating computer performance, and deciding whether or not I want a chip, based on that. Ever since they switched to this new system it's been confusing the hell out of me.
"It's you Americans. There's something about nipples you hate. If this were Germany, we'd be romping around naked on the stage here."
RedImperator
Roosevelt Republican
Posts: 16465
Joined: 2002-07-11 07:59pm
Location: Delaware

Post by RedImperator »

FSTargetDrone wrote:It's like names given to automobiles. Fancy looking labels that are meaningless.
Yeah, but at least car names are memorable. "Corvette" and "Camry" are meaningless, but I have a general idea of the difference between them, and the technical data is easier to remember when it's attached to a name instead of a string of letters and numbers.
Any city gets what it admires, will pay for, and, ultimately, deserves…We want and deserve tin-can architecture in a tinhorn culture. And we will probably be judged not by the monuments we build but by those we have destroyed.--Ada Louise Huxtable, "Farewell to Penn Station", New York Times editorial, 30 October 1963
X-Ray Blues
AniThyng
Sith Devotee
Posts: 2777
Joined: 2003-09-08 12:47pm
Location: Took an arrow in the knee.

Post by AniThyng »

General Zod wrote: It's an old habit. I'm used to rating computer performance, and deciding whether or not I want a chip, based on that. Ever since they switched to this new system it's been confusing the hell out of me.
But you realise of course, that it's equally meaningless to pit a 1.66 GHz Athlon XP against a 1.66GHz P4, or against a 1.66GHz Core 2 Duo...

Edit to save a post: It goes beyond obscure video card series number/letter conventions. Take the 512MB X1300: the uninformed will see 512MB of video RAM and think it's great, when of course an X1300 is barely capable of using 64MB properly, never mind 512.
Last edited by AniThyng on 2006-10-11 10:09pm, edited 1 time in total.
I do know how to spell
AniThyng is merely the name I gave to what became my favourite Baldur's Gate II mage character :P
Arthur_Tuxedo
Sith Acolyte
Posts: 5637
Joined: 2002-07-23 03:28am
Location: San Francisco, California

Post by Arthur_Tuxedo »

Not equally meaningless, much more meaningless. CPU naming may have its foibles, but it's a shining beacon of righteousness compared to GPU naming.
"I'm so fast that last night I turned off the light switch in my hotel room and was in bed before the room was dark." - Muhammad Ali

"Dating is not supposed to be easy. It's supposed to be a heart-pounding, stomach-wrenching, gut-churning exercise in pitting your fear of rejection and public humiliation against your desire to find a mate. Enjoy." - Darth Wong
General Zod
Never Shuts Up
Posts: 29211
Joined: 2003-11-18 03:08pm
Location: The Clearance Rack

Post by General Zod »

AniThyng wrote:
But you realise of course, that it's equally meaningless to pit a 1.66 GHz Athlon XP against a 1.66GHz P4, or against a 1.66GHz Core 2 Duo...
Obviously. But when you're familiar with how the chips work, knowing the clock speed and bus, along with the core design (single core, dual, etc.), gives you a fairly good indication of how a chip will stack up against others. But some meaningless number that simply suggests a range is worthless.
"It's you Americans. There's something about nipples you hate. If this were Germany, we'd be romping around naked on the stage here."
Stark
Emperor's Hand
Posts: 36169
Joined: 2002-07-03 09:56pm
Location: Brisbane, Australia

Post by Stark »

For a moment I thought this thread would be about the retarded third-party card names, or the whole 'poorly rendered titties' box art. :)

I mean, who doesn't want a Sapphire Xtreme Masochistic with some circa-2001 CG tits on the box?
Uraniun235
Emperor's Hand
Posts: 13772
Joined: 2002-09-12 12:47am
Location: OREGON

Post by Uraniun235 »

General Zod wrote:
AniThyng wrote:
But you realise of course, that it's equally meaningless to pit a 1.66 GHz Athlon XP against a 1.66GHz P4, or against a 1.66GHz Core 2 Duo...
Obviously. But when you're familiar with how the chips work, knowing the clock speed and bus, along with the core design (single core, dual, etc.), gives you a fairly good indication of how a chip will stack up against others. But some meaningless number that simply suggests a range is worthless.
I don't think the product name necessarily needs to include the clock speed, though, because that can invite things like someone seeing a 2.4GHz P4 and a 2.0GHz A64 and thinking the P4 is the superior chip, when in fact the A64 will blow it away.

Honestly, if you know enough to be able to recognize that comparing clockspeeds between two different architectures is next to meaningless, you ought to be looking up benchmarks for the chips on the internet so you can get an even better sense of differences in chip performance beyond "well this chip is 200MHz faster".
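That P4-vs-A64 trap comes down to one line: effective speed is roughly work per cycle times clock. A quick sketch, where the IPC figures are invented purely for illustration and not real measurements:

```python
# Back-of-envelope: effective throughput ~ IPC * clock. The IPC numbers are made up
# to illustrate why raw GHz comparisons across architectures mislead.
chips = {
    "P4 2.4GHz":  {"ipc": 1.0, "ghz": 2.4},  # long pipeline, less work per cycle (assumed)
    "A64 2.0GHz": {"ipc": 1.5, "ghz": 2.0},  # shorter pipeline, more work per cycle (assumed)
}

for name, c in chips.items():
    print(f"{name}: {c['ipc'] * c['ghz']:.1f} work units per ns")
# Under these assumed numbers the nominally "slower" 2.0GHz chip comes out ahead (3.0 vs 2.4).
```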

I do think Intel really stumbled with the whole Core Duo -> Core 2 Duo thing. That's just crazy.
"There is no "taboo" on using nuclear weapons." -Julhelm
What is Project Zohar?
"On a serious note (well not really) I did sometimes jump in and rate nBSG episodes a '5' before the episode even aired or I saw it." - RogueIce explaining that episode ratings on SDN tv show threads are bunk
Neko_Oni
Padawan Learner
Posts: 389
Joined: 2002-09-11 09:15am
Location: Tokyo, Japan.

Post by Neko_Oni »

Stark wrote:For a moment I thought this thread would be about the retarded third-party named cards, or the whole 'poorly rendered titties' boxart.
I'm glad someone else brought this up. What the hell is with the "cover art" on graphics cards? Most of it looks worse than the graphics the card inside could render in real time.
Darth Quorthon
Jedi Knight
Posts: 580
Joined: 2005-09-25 12:04am
Location: California

Post by Darth Quorthon »

Uraniun235 wrote:Why do you need to know the clockspeed? I'm pretty sure the AMD system is actually one of the sanest; a 3200+ is pretty well definitely going to be superior to a 3000+.
That reminds me of a conversation I had with a guy at work. He was looking to buy a new computer, and his choice was between a single-core P4 at 3.6GHz and a dual-core at 3.2GHz. I told him to go for the dual-core, and he said, "but it's 400MHz slower". It's fun explaining to a not-so-computer-savvy fellow that the clockspeed wars are over. I've often privately wondered whether system builders in the past touted clockspeed over actual performance to take buyers for a ride. I remember a couple of years ago, when the gulf between Intel and AMD clockspeeds was nearly 1GHz, Intel said that AMD couldn't "keep up", and a guy from AMD said, "I'm not interested in clockspeeds, I'm interested in performance".

How did we ever make decisions on what hardware to buy before all of these benchmarking sites appeared? :)
"For the first few weeks of rehearsal, we tend to sound like a really, really bad Rush tribute band." -Alex Lifeson

"See, we plan ahead, that way we don't do anything right now." - Valentine McKee

"Next time you're gonna be a bit higher!" -General from Birani

"A cynic is a man who, when he smells flowers, looks around for a coffin." - H. L. Mencken

He who creates shields by fire - Rotting Christ, Lex Talionis