Page 2 of 2
Posted: 2004-04-16 02:47pm
by Ace Pace
phongn wrote:Ace Pace wrote:Is it clocked slower? All reviews I've read show that the only difference is 4 pipes.
I'm not sure. The non-Ultra version will also have a much smaller cooling fan (one slot rather than two) as well.
I need that thing dammit

Wait for ATI's offerings and then determine your best deal.
Of course I will; my current card is ATI, and I'm not going to drop them over 3%.
Still, I DO want that monster.
Too bad I have the money, but not the PC to deal with that thing.
Beyond3D.com has something interesting: the driver specifies two different configurations, one for the Ultra and the other for the non-Ultra, and the weird thing is that the card is using the non-Ultra configuration.
The boards that reviewers have received were slightly delayed and have had manual revisions, which we assume are related. A close look at the back of the board reveals an additional wire and a coil at the area of the fan power supply from the board and we are beginning to suspect there have been some fairly last minute specification changes to the boards. The boards we had previously seen at CeBit, and probably the ones that shipped to developers and been used at GDC, had core speeds of 475MHz, yet the final Ultra specification, we are assured, is 400MHz. Indeed, looking at the driver strings we can see the following references:
NVIDIA_NV40.DEV_0040.1 = "NVIDIA GeForce 6800 Ultra"
NVIDIA_NV40.DEV_0041.1 = "NVIDIA GeForce 6800"
However, checking the chipset device IDs on the system reveals the board to have a device ID of "0041", which corresponds to the non-Ultra variant in these particular drivers. We are wondering if there have been some last-minute heat and/or related power concerns that have forced a change in specification and larger cooling solutions.
Maybe I'm not getting this, but from what I can tell, the Ultra version ran too hot, so they sent out regular versions instead.
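Roughly, the check Beyond3D describes boils down to matching the board's PCI device ID against the name table in the driver's INF strings. A toy Python sketch of that lookup, just to make it concrete (the two driver strings and the "0041" ID are taken from the quote above; the parsing code itself is purely illustrative):

```python
import re

# The two INF-style strings quoted from the driver above.
driver_strings = """
NVIDIA_NV40.DEV_0040.1 = "NVIDIA GeForce 6800 Ultra"
NVIDIA_NV40.DEV_0041.1 = "NVIDIA GeForce 6800"
"""

def parse_device_table(text):
    """Build a {device_id: product_name} table from INF-style driver strings."""
    table = {}
    for match in re.finditer(r'DEV_([0-9A-Fa-f]{4})\.\d+\s*=\s*"([^"]+)"', text):
        table[match.group(1).upper()] = match.group(2)
    return table

table = parse_device_table(driver_strings)
observed_id = "0041"  # the device ID Beyond3D reports reading off the review board
print(observed_id, "->", table.get(observed_id, "unknown"))
# 0041 -> NVIDIA GeForce 6800 (i.e. the non-Ultra entry)
```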
Posted: 2004-04-16 03:36pm
by Shinova
Just saw something funny regarding this
http://mysite.freeserve.com/smallsheep/ ... st_rig.jpg
A power supply dedicated for NVidia.

Posted: 2004-04-16 04:46pm
by phongn
Someone photoshopped the "Best played with nVidia" to say "Best played with Antec" on it

Posted: 2004-04-16 04:58pm
by Comosicus
The only thing I care about here is that the prices of the other cards will come down. I can't afford such a monster, not in the near future.
Posted: 2004-04-17 02:08am
by Arthur_Tuxedo
I hope the NU version's performance is almost as good, because at $300, that's actually pretty tempting.
Posted: 2004-04-17 10:31am
by The Kernel
Arthur_Tuxedo wrote:I hope the NU version's performance is almost as good, because at $300, that's actually pretty tempting.
Even with only 12 pipelines, it should still have the awesome shader performance, so figure that the performance will be within 25% of the Ultra.
Posted: 2004-04-18 05:17am
by Ypoknons
It seems like, for once, even Gainward, Chaintech, and Asus are going to use the reference cooler, just changing the sticker on it. Man... I thought my 200W PSU would be just fine...
By the way, is there a mid-range card to go with the NV40? Or are they just going to use a GDDR3 FX5700 for now? If an NV40 is way too hot for my badly cooled Dell and Hong Kong summers, then perhaps I should just get something cooler.
Posted: 2004-04-18 05:48am
by Shinova
The Kernel wrote:Arthur_Tuxedo wrote:I hope the NU version's performance is almost as good, because at $300, that's actually pretty tempting.
Even with only 12 pipelines, it should still have the awesome shader performance, so figure that the performance will be within 25% of the Ultra.
25% as in it's around 75% of the Ultra's performance, or as in it's only 25% of the Ultra's performance?
Posted: 2004-04-18 08:31am
by Ace Pace
Shinova wrote:The Kernel wrote:
Even with only 12 pipelines, it should still have the awesome shader performance, so figure that the performance will be within 25% of the Ultra.
25% as in it's around 75% of the Ultra's performance, or as in it's only 25% of the Ultra's performance?
75% I assume.
Edit: However, what's going on with that little driver mystery (see my posts above)?
Posted: 2004-04-18 06:20pm
by Arthur_Tuxedo
I'm guessing performance of the NU is 65-75% of the Ultra, which is a bigger step down than I would normally be comfortable with, but it will still smoke a 9800XT, and for $100 less.
I'll probably end up getting a 6800 ultra in about a year when it's fallen down to like $300, because right now I'm in no financial position to even consider a new computer.
Posted: 2004-04-18 08:02pm
by Shadowhawk
Anyone have any idea what ATI's answer to the nv40 is gonna be like?
I'm planning on doing a full-system upgrade soon, and I'm considering dropping the money on the 6800U (I haven't had a top-of-the-line card since I had a Diamond Monster3d), but I'd like to get some comparisons to ATI's upcoming hardware (the x800?).
Posted: 2004-04-18 09:02pm
by phongn
Shadowhawk wrote:Anyone have any idea what ATI's answer to the nv40 is gonna be like?
I'm planning on doing a full-system upgrade soon, and I'm considering dropping the money on the 6800U (I haven't had a top-of-the-line card since I had a Diamond Monster3d), but I'd like to get some comparisons to ATI's upcoming hardware (the x800?).
I linked to another message board in this thread somewhere with the specs for the R420. It looks to be quite impressive.
Posted: 2004-04-18 09:09pm
by The Kernel
phongn wrote:Shadowhawk wrote:Anyone have any idea what ATI's answer to the nv40 is gonna be like?
I'm planning on doing a full-system upgrade soon, and I'm considering dropping the money on the 6800U (I haven't had a top-of-the-line card since I had a Diamond Monster3d), but I'd like to get some comparisons to ATI's upcoming hardware (the x800?).
I linked to another message board in this thread somewhere with the specs for the R420. It looks to be quite impressive.
Impressive, but unlikely to reach the immense shader performance of NV40. Granted, R420/423 will have a higher clockrate, but the shaders are virtually unchanged from the R3xx chips, which means that even at 600MHz, NV40 should have a performance edge, especially since NV40 is running on an early driver, while R4xx will probably see far less improvement from driver maturation (its architecture being based on R3xx is the primary reason).
I haven't had a chance to see the results of the R4xx chips yet, but I talked to a friend at nVidia a few days ago and she was pretty confident in nVidia's ability to smoke ATI's R4xx, especially in their mainstream solution (GeForce 6800 non-ultra).
Posted: 2004-04-18 09:12pm
by The Kernel
Arthur_Tuxedo wrote:I'm guessing performance of the NU is 65-75% of the Ultra, which is a bigger step down than I would normally be comfortable with, but it will still smoke a 9800XT, and for $100 less.
I'll probably end up getting a 6800 ultra in about a year when it's fallen down to like $300, because right now I'm in no financial position to even consider a new computer.
The 6800 non-ultra should only see a performance decline of 25% tops, more likely 20% on a clock for clock basis. So unless the 6800 non-ultra debuts at a significantly lower clockrate (unlikely since it is a lower transistor chip) it should kick the crap out of an R4xx 12-pipe solution on shader intensive games.
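Just to show where that ~25% figure comes from: if fill-rate-bound performance scaled linearly with pipeline count at the same clock, 12 pipes against 16 works out to exactly a 25% drop. A quick back-of-envelope sketch (the 400MHz reference clock is the Ultra spec quoted earlier in the thread; the 350MHz non-Ultra clock is just a made-up example):

```python
def relative_throughput(pipes, clock_mhz, ref_pipes=16, ref_clock_mhz=400):
    """Theoretical pixel throughput relative to a reference configuration."""
    return (pipes * clock_mhz) / (ref_pipes * ref_clock_mhz)

print(relative_throughput(12, 400))  # 0.75 -> a 25% drop at the same clock
print(relative_throughput(12, 350))  # ~0.66 if the non-Ultra also clocks lower (made-up clock)
```

Real games won't scale this cleanly, of course, since shader and memory limits get in the way.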
Posted: 2004-04-19 12:50am
by Darth Wong
Am I the only one who's tired of NVidia's brute-force approach? I don't personally like the idea of a video card that generates half of your entire computer's heat load and eats power as if it's a separate machine.
Posted: 2004-04-19 01:26am
by Arthur_Tuxedo
The Kernel wrote:The 6800 non-ultra should only see a performance decline of 25% tops, more likely 20% on a clock for clock basis. So unless the 6800 non-ultra debuts at a significantly lower clockrate (unlikely since it is a lower transistor chip) it should kick the crap out of an R4xx 12-pipe solution on shader intensive games.
I was including clockspeed differences when I said that.
Darth Wong wrote:Am I the only one who's tired of NVidia's brute-force approach? I don't personally like the idea of a video card that generates half of your entire computer's heat load and eats power as if it's a separate machine.
I'll say one thing. Given the soaring popularity of small form factor cases, in which a 230W power supply is about the limit of what you can cram in, they'll have to produce something that eats less power or lose out.
Posted: 2004-04-19 03:05am
by The Kernel
Darth Wong wrote:Am I the only one who's tired of NVidia's brute-force approach? I don't personally like the idea of a video card that generates half of your entire computer's heat load and eats power as if it's a separate machine.
Mike, do you realize that NV35 consumed less power than RV3xx at full load? Do you also realize that NV40 only consumes 15W-20W more than NV35 at full load, and less at idle? Don't be fooled by the giant coolers; nVidia cards aren't super power guzzlers. They don't even require 480W power supplies--nVidia only quoted that figure because PSU manufacturers usually don't put enough power rails on sub-480W supplies to feed the card's dual power sockets.
The real fault here lies not with nVidia but with the AGP spec, which doesn't provide enough power for modern graphics cards. PCI Express should alleviate this somewhat, but we'll need a second revision and a dedicated slot design before real progress is made. Either that, or go with 3dfx's external power supply.
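To put rough numbers on the AGP point: if the slot itself only supplies on the order of 40W and a high-end board pulls around 100W under load (both ballpark guesses on my part, not official figures), the rest has to come from auxiliary connectors, which is exactly why the Ultra carries two Molex plugs. A hedged sketch of that arithmetic:

```python
from math import ceil

SLOT_POWER_W = 40       # rough AGP slot power budget (assumed ballpark, not the official spec number)
AUX_CONNECTOR_W = 40    # rough budget per auxiliary Molex connector (assumed ballpark)

def aux_connectors_needed(board_power_w):
    """Very rough count of auxiliary power connectors a card would need."""
    shortfall = max(0, board_power_w - SLOT_POWER_W)
    return ceil(shortfall / AUX_CONNECTOR_W)

print(aux_connectors_needed(100))  # ~100W board -> 2 connectors
print(aux_connectors_needed(60))   # more modest card -> 1 connector
```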
Posted: 2004-04-19 03:06am
by The Kernel
Arthur_Tuxedo wrote:
I'll say one thing. Given the soaring popularity of small form factor cases for which 230W power supplies are about what the limit of what you can cram in there, they'll have to produce something that eats less or lose out.
Actually, the new Shuttles have 250W supplies, which more than adequately feed an NV35, so the 6800 non-ultra should have no problems, especially since Shuttle is planning to move to a 300W design.
Posted: 2004-04-19 03:23am
by Shinova
How big are these "shuttle" things?
Posted: 2004-04-19 05:38am
by Crayz9000
Shuttle is a computer equipment manufacturer. They build motherboards, power supplies (I think), and other assorted equipment.
Posted: 2004-04-19 05:47am
by The Kernel
Shinova wrote:How big are these "shuttle" things?
Here's one example of them:
You can get a pretty good estimate of the size by looking at the USB ports and at the 5.25" drive bay at the top.
Posted: 2004-04-19 09:12am
by Ace Pace
Darth Wong wrote:Am I the only one who's tired of NVidia's brute-force approach? I don't personally like the idea of a video card that generates half of your entire computer's heat load and eats power as if it's a separate machine.
Look at the specs before saying something. The 6800 has a lower clock rate than the earlier generation and is still better. Is that brute force?
Approximate information about the X800 (a quick sanity check of these numbers follows the list):
http://www.anandtech.com/video/showdoc.html?i=1966
0.13-micron low-k manufacturing process
160M transistors
~500MHz core clock
8 pipe design
6 vertex engines
Improvements to all of the basic architectural features (shader engines, AA, etc...)
256MB 256-bit GDDR3 (~1GHz data rate)
Single slot design
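And the promised sanity check on those rumoured numbers, just theoretical peaks worked out from the specs listed above (real performance obviously depends on far more than this):

```python
# Numbers taken from the rumoured spec list above.
bus_width_bits = 256
data_rate_ghz = 1.0     # ~1GHz effective GDDR3 data rate
pipes = 8
core_clock_mhz = 500    # ~500MHz core clock

memory_bandwidth_gb_s = bus_width_bits / 8 * data_rate_ghz  # 32.0 GB/s
pixel_fill_mpix_s = pipes * core_clock_mhz                  # 4000 Mpixels/s

print(f"Theoretical memory bandwidth: {memory_bandwidth_gb_s:.1f} GB/s")
print(f"Theoretical pixel fill rate:  {pixel_fill_mpix_s} Mpixels/s")
```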
Posted: 2004-04-19 09:14am
by The Kernel
The Inquirer has some new info on ATI's next-gen part:
The Inquirer wrote:WISE BIRDS tell us here in Vienna that ATI and at least some of their knowledgeable partners don't feel so bad about NV40 Ultra. Even though the NV 40 Ultra, Geforce 6800 Ultra is a very fast card that outperforms everything else on planet for the time being, the ATI next generation chip might end up even faster.
The canaries are singing that 12 pipelines, 475MHz/950MHz card with 96 bit precision and PS 2.0 shader only will end up faster then 16 pipelines, 400MHz /1100MHz cards with 128 bit precision and PS 3.0 shader model.
ATI knew for some time it would lose out feature wise, but stays committed to bring the faster technology, without some of the extra features.
The R400 card - now renamed to R500 - has support for all these nice features but got postponed but that is the chip with 128 bit precision and PS 3.0 support from ATI. Until that chip dawns, later this year at the earliest, ATI stays PS 2.0 and 96 bit precise only.
But Nvidia seems to be keeping some of its powder dry too. We hear about some faster NV40 chips that Nvidia is saving for later. µ
Now, I would take this info with a grain of salt (especially given that no actual numbers are provided), but it does present an interesting perspective. In any case, we'll know soon enough.
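For what it's worth, the raw pipe arithmetic doesn't obviously favour the Inquirer's claim: at the quoted clocks the 16-pipe 6800 Ultra still has the higher theoretical fill rate, so a faster 12-pipe ATI card would have to win on memory, efficiency, or something else. A quick comparison using only the numbers from the quote:

```python
# Theoretical pixel fill rates (pipes * core clock), per the Inquirer's figures.
cards = {
    "rumoured ATI part (12 pipes @ 475MHz)": 12 * 475,
    "GeForce 6800 Ultra (16 pipes @ 400MHz)": 16 * 400,
}

for name, mpix in cards.items():
    print(f"{name}: {mpix} Mpixels/s theoretical fill rate")
# 5700 vs 6400 -> the 16-pipe part still leads by roughly 12% on paper
```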
Posted: 2004-04-19 09:18am
by Ace Pace
See my post above; it seems like ATI wants to sit this round out and save a killer card for the next one.
Posted: 2004-04-19 09:21am
by The Kernel
Ace Pace wrote:See my post above; it seems like ATI wants to sit this round out and save a killer card for the next one.
That info is quite outdated; the X800 will actually be a TWELVE-pipe solution, with the X800XT model possessing the full sixteen pipes. Interestingly, though, its shaders apparently haven't changed much, so the NV40 should still have the more powerful shader performance. Of course, with so much disinformation out there, I doubt we'll know anything for certain until we see the benchmarks.