A bad day for nVidia.
Posted: 2003-09-11 09:38am
by phongn
Valve has just announced initial benchmark results for the NV35 and R350 lines of GPUs for their current build of HL2. They spent
five times as much effort optimizing for the special nVidia pathway as with the generic DirectX 9 path (that ATI uses).
And nVidia is, well, crushed. In DX9 mode, the GeForce FX 5900 Ultra gets about 31 FPS, while the Radeon 9800 Pro gets ~61 FPS. Running the special nVidia pathway instead, the 5900 Ultra manages about 48 FPS.
TechReport
Re: A bad day for nVidia.
Posted: 2003-09-11 09:53am
by MKSheppard
phongn wrote:Valve has just announced initial benchmark results for the NV35 and R350 lines of GPUs for their current build of HL2. They spent five times as much effort optimizing for the special nVidia pathway as with the generic DirectX 9 path (that ATI uses).
So HL2 will run a bit slower on NVidia cards, as opposed to ATI... that happens with random applications... I'll be sticking with NVidia as opposed to bugfest ATI... and my GOD, Gabe Newell needs to fucking lose weight!
Posted: 2003-09-11 09:56am
by MKSheppard
ooooh
"One of the most fascinating things he managed to listen in on was the story of Microsoft's attempt to flex its muscles with DirectX 9. It seems that a story we published a fair while ago uncovered a goodly amount of the truth.
When Microsoft was first putting together the specifications for DirectX 9 they gathered various companies together to help build the API. After a very short while, Intel and Nvidia both walked away from the whole thing and now we know why.
It seems that Microsoft wanted both companies to agree to reveal any patents they had relating to the technology the Vole was building. Once revealed, the Vole expected the companies to hand the patents over for good. Intel and Nvidia walked away. Only recently has it signed an altered agreement. That is why the GeForce FX does not follow the DirectX 9 specifications to the letter."
Posted: 2003-09-11 09:59am
by phongn
Actually, no, this is indicative of a severe problem with the NV30/NV35 series - it has very poor DirectX 9 performance. Not every game company has the resources that Valve does - not everyone can optimize for nVidia's cards and for DX9 at once. In fact, the NV25 (i.e. the GF4 Ti) running in DX8 mode beats out the NV30 running in DX9 mode, and it also beats the 5200U and 5600U running in DX8/8.1 mode. Something is wrong with the NV30.
FWIW, Catalyst 3.7 on my R8500LE has been rock-solid stable and most reports I have show general ATI stability as of late.
Posted: 2003-09-11 10:01am
by phongn
MKSheppard wrote:When Microsoft was first putting together the specifications for DirectX 9 they gathered various companies together to help build the API. After a very short while, Intel and Nvidia both walked away from the whole thing and now we know why.
It seems that Microsoft wanted both companies to agree to reveal any patents they had relating to the technology the Vole was building. Once revealed, the Vole expected the companies to hand the patents over for good. Intel and Nvidia walked away. Only recently has it signed an altered agreement. That is why the GeForce FX does not follow the DirectX 9 specifications to the letter.
I heard something else: that nVidia and MS made up shortly afterwards, and that based on the close relationship between the two on the NV2A XBox part, nVidia assumed that their upcoming NV30 would dictate the overall design of DX9. It was quite a shock to them when they found out otherwise.
Posted: 2003-09-11 10:21am
by phongn
This article may explain the poor DX9 performance, especially when Pixel Shaders 1.4 or 2.0 are utilized.
Re: A bad day for nVidia.
Posted: 2003-09-11 11:43am
by Companion Cube
MKSheppard wrote:....and my GOD, Gabe Newell
needs to fucking lose weight!
He's a genius, he's allowed to be plump.
Posted: 2003-09-11 12:23pm
by Slartibartfast
DirectX is shit. Go OpenGL.
(In fact, lately I haven't been able to run HL on DirectX correctly, but OpenGL works fine - and it's a fresh OS install.)
Posted: 2003-09-11 02:24pm
by Shinova
Don't you need DirectX period to run any of the later games, whether you use OpenGL for the video or not?
Posted: 2003-09-11 02:28pm
by Crazy_Vasey
Shinova wrote:Don't you need DirectX period to run any of the later games, whether you use OpenGL for the video or not?
Yeah, it's pretty much the best way to get user input, do sound, etc on windows platforms. At least for games.
Anyway it's not just with DX9 that GeforceFX has problems. I remember reading an article from Carmack that effectively said the FX sucked ass in OpenGL, compared to ATI anyway, without using card specific extensions. Not good.
Posted: 2003-09-11 03:18pm
by Slartibartfast
Shinova wrote:Don't you need DirectX period to run any of the later games, whether you use OpenGL for the video or not?
DirectX is an easier, hardware-independent way to access the card's functions, and the same goes for OpenGL. DirectX has borrowed a lot from OpenGL. Things like the Quake 1/2/3 engines use OpenGL. A good hint is whether a game has been ported to Linux - if so, it's generally not DirectX-dependent. Then there are the more obvious ones, like Half-Life, that let you choose in the Video Options.
You could say that both DirectX and OpenGL are "drivers" of some sort. The answer to your question would be YES - a programmer can choose whether to use DX, OpenGL, the native functions of each card (unlikely), or no acceleration at all. And whichever library the programmer chooses, you need to have it installed. That's why I said "go OpenGL" - people should stop using DX crap to make games and use OpenGL instead. Sound and music can perfectly well be handled without a "game library" at all, and there are alternatives... besides, we're only talking about the 3D aspect of games.
Posted: 2003-09-11 03:45pm
by Shinova
Does this nvidia problem only apply for FXs or for Ti's also?
Posted: 2003-09-11 04:29pm
by phongn
The problem is specific to the GeForce FX line (i.e. NV30/NV35). The GF4 Ti (NV25) series is unaffected, since it is not a DirectX 9 part and at any rate uses a different architecture.
Valve chose DirectX 9 over the proposed OpenGL 2 due to feature set and a few other reasons.
Posted: 2003-09-11 04:33pm
by MKSheppard
Crazy_Vasey wrote:I remember reading an article from Carmack that effectively said the FX sucked ass in OpenGL, compared to ATI anyway, without using card specific extensions. Not good.
Tell that to me: Trainz runs great in OpenGL, and looks great, but when I try to run it in Direct3D, I get all kinds of nasty clipping problems and transparency bleedthroughs on the edges of the polygons.
Posted: 2003-09-11 04:44pm
by phongn
That's probably because Trainz isn't well coded for DirectX. Furthermore, it's unlikely to be as demanding as Doom 3 will be (which pushes the envelope for OpenGL) and finally, I thought you had a GF4?
Posted: 2003-09-11 04:46pm
by Crazy_Vasey
MKSheppard wrote:Crazy_Vasey wrote:I remember reading an article from Carmack that effectively said the FX sucked ass in OpenGL, compared to ATI anyway, without using card specific extensions. Not good.
Tell that to me: Trainz runs great in OpenGL, and looks great, but when I try to run it in Direct3D, I get all kinds of nasty clipping problems and transparency bleedthroughs on the edges of the polygons.
I would have thought it obvious that I was talking about speed and not correctness, given the topic of this thread. Nvidia has zero problems with making it look right, as far as I know, but it seems to have issues with making it run at a decent speed without card-specific hacks, which sucks majorly. It's pretty shoddy really for something that took as long as the GeForce FX did to actually get to market.
Posted: 2003-09-11 05:30pm
by phongn
Actually, Valve has implied that the Detonator 50 series of drivers is doing some trickery that's quality-related, and they aren't happy about it. In fact, they're warning the community not to bench using them, since it appears that nVidia is leaking betas of it.
Re: A bad day for nVidia.
Posted: 2003-09-11 07:11pm
by Kamakazie Sith
phongn wrote:Valve has just announced initial benchmark results for the NV35 and R350 lines of GPUs for their current build of HL2. They spent
five times as much effort optimizing for the special nVidia pathway as with the generic DirectX 9 path (that ATI uses).
And nVidia is, well, crushed. In DX9 mode, the GeForce FX 5900 Ultra gets about 31 FPS, while the Radeon 9800 Pro gets ~61 FPS. Running the special nVidia pathway instead, the 5900 Ultra manages about 48 FPS.
TechReport
Yup, that's why I bought myself the Radeon 9800 Pro All In Wonder....it's sweet
*Actually I bought it like three weeks ago......
Posted: 2003-09-11 08:19pm
by phongn
At any rate, regardless of the quality issues at hand, the Detonator 50 series - at least the early versions - seems to be delivering a good boost. OTOH, there are still dark rumours that it is disabling the fog table.
Posted: 2003-09-12 01:28am
by Darth Wong
If only ATI had better drivers, especially on Linux. Their hardware has been setting the pace for a while now.
Posted: 2003-09-12 01:58am
by Hamel
Posted: 2003-09-12 02:00am
by MKSheppard
phongn wrote:Furthermore, it's unlikely to be as demanding as Doom 3 will be (which pushes the envelope for OpenGL)
Uhm, LOL, you have to see some of the stuff Trainz can do - it can make even a monster rig choke and scream for its digital silicon mother.
and finally, I thought you had a GF4?
Burned out; got an FX to replace it.
Posted: 2003-09-12 05:42am
by Crazy_Vasey
Darth Wong wrote:If only ATI had better drivers, especially on Linux. Their hardware has been setting the pace for a while now.
I have to agree on the Linux drivers. My one attempt to use the RPMs on their site resulted in X not even starting on the next boot, and me having to edit the X config file with vi to get the old, working XFree86 drivers back.
Posted: 2003-09-12 02:20pm
by phongn
MKSheppard wrote:phongn wrote:Furthermore, it's unlikely to be as demanding as Doom 3 will be (which pushes the envelope for OpenGL)
Uhm, LOL, you have to see some of the stuff Trainz can do - it can make even a monster rig choke and scream for its digital silicon mother.
Never mind, that does look damn good, though I'm wondering what pixel shaders it uses (the new performance killer, versus detailed models and textures and such).