Gigabyte announces Quad SLI Mobo.

Posted: 2005-05-26 04:16pm
by Beowulf
Tom's Hardware wrote:The board, currently named "GA-8N-SLI Quad", shows four PCI Express slots that can be occupied by SLI-compatible graphics cards. According to sources, Gigabyte found a way to combine two nForce4 SLI chipsets on one platform. Interestingly, the board integrates two different versions of the chipset: the nForce4 SLI Intel Edition (Crush 19) serves as the Northbridge, while the version for AMD processors is used as the Southbridge.
So the ultimate gaming machine is going to have 4 6800 Ultras? You know, that's as much VRAM as I have system RAM in my computer (more if you spring for the 512MB models).
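Quick sanity check on that, assuming the standard 256MB per card: 4 x 256MB = 1GB of VRAM, and 4 x 512MB = 2GB if you go for the big models.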

Yikes.

Posted: 2005-05-26 04:30pm
by SPOOFE
In twenty years, when you'll be able to go to Wal-Mart and pick up a computer module the size of a deck of cards capable of rendering movie-quality images in realtime for fifty bucks, I'm gonna think back to this day and age and just la-a-a-augh...

Posted: 2005-05-26 04:32pm
by Einhander Sn0m4n
Name it the 'Quad Damage' just in time for Quake 4 and watch the sales jump as Quakers buy it up.

If it doesn't suck.

Posted: 2005-05-28 10:46am
by Ace Pace
Looking at Gigabyte's dual 6600GT, it's not going to suck, but I have to wonder why they didn't just use one of the nForce4 server chipsets; those have a ton of PCI-E lanes.

Posted: 2005-05-28 12:05pm
by Dahak
The question is whether they'll deliver your very own power plant for this feat, because you'll be needing it...

Posted: 2005-05-28 12:13pm
by Ace Pace
Dahak wrote:The question is whether they'll deliver your very own power plant for this feat, because you'll be needing it...
If it's 4 6600GTs, you can live fine; it won't be much more than a pair of Ultras.

Now if it's 4 6800GTs... I think I want that Enermax 600W.
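Rough numbers, using typical load figures from reviews of the era (call it ~50W per 6600GT and ~100W per 6800 Ultra; ballpark assumptions, not spec-sheet values): 4 x 50W = 200W and 2 x 100W = 200W, so the quad-6600GT setup really does land about even with a pair of Ultras.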

Posted: 2005-05-28 12:22pm
by Dahak
Ace Pace wrote:
Dahak wrote:The question is whether they'll deliver your very own power plant for this feat, because you'll be needing it...
If it's 4 6600GTs, you can live fine; it won't be much more than a pair of Ultras.

Now if it's 4 6800GTs... I think I want that Enermax 600W.
Given that nVidia advises at least a 480W power supply for a single 6800 Ultra, even 600W might be a bit on the low side :)
(I know they can run with smaller power supplies, but it can get a bit iffy...)

Posted: 2005-05-28 12:36pm
by Beowulf
Dahak wrote:
Ace Pace wrote:
Dahak wrote:The question is whether they'll deliver your very own power plant for this feat, because you'll be needing it...
If it's 4 6600GTs, you can live fine; it won't be much more than a pair of Ultras.

Now if it's 4 6800GTs... I think I want that Enermax 600W.
Given that nVidia advises at least a 480W power supply for a single 6800 Ultra, even 600W might be a bit on the low side :)
(I know they can run with smaller power supplies, but it can get a bit iffy...)
I've heard of people running 6800GTs on power supplies rated for only 240W, fairly successfully. I'm running an ATi AIW X800 off a 240W supply myself. It's not the wattage, it's the amperage on the 12V line.
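To put a number on that (assuming a card pulls roughly 100W under load, nearly all of it from the 12V rail, which is about right for the era's high-end parts): 100W / 12V ≈ 8.3A. The headline wattage is spread across the 3.3V, 5V, and 12V rails, so a good 240W unit can have a beefier 12V rail than a cheap 400W one.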

Posted: 2005-05-28 03:16pm
by Ace Pace
A quality 350W is good for a very high-end system; the problem is the word "quality".

Posted: 2005-05-30 07:37am
by Ace Pace
Reading more, I wonder if it's really Quad SLI, or just the ability to have 4 SLI-capable graphics cards together, or maybe hooking them up 2 and 2.

Posted: 2005-05-30 02:20pm
by SPOOFE
Ace Pace wrote:I wonder if it's really Quad SLI, or just the ability to have 4 SLI-capable graphics cards together, or maybe hooking them up 2 and 2.
You mean, 4 video cards total divided among 2 displays, instead of all 4 cards rendering for one display?

Posted: 2005-05-30 03:01pm
by Ace Pace
SPOOFE wrote:
Ace Pace wrote:I wonder if it's really Quad SLI, or just the ability to have 4 SLI-capable graphics cards together, or maybe hooking them up 2 and 2.
You mean, 4 video cards total divided among 2 displays, instead of all 4 cards rendering for one display?
Yes, because 4 cards at once would force a driver rewrite (AFAIK the drivers are built for 2 cards splitting the screen), and I don't believe nVidia wants to do that, not when the current drivers took years to develop.
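For illustration, here's a toy sketch of what "splitting the screen" means (plain Python, purely hypothetical; it has nothing to do with nVidia's actual driver code). The 2-card case is a fixed top/bottom split; generalizing to N cards, and then load-balancing the bands every frame, is the kind of work a driver rewrite would involve:

    def split_scanlines(height, num_gpus):
        # Divide the screen's scanlines into one contiguous band per GPU.
        # A real SFR driver would also rebalance band sizes every frame
        # based on how long each GPU took on the previous one.
        base, extra = divmod(height, num_gpus)
        bands, start = [], 0
        for gpu in range(num_gpus):
            rows = base + (1 if gpu < extra else 0)
            bands.append((gpu, start, start + rows))  # (gpu, first row, one past last row)
            start += rows
        return bands

    print(split_scanlines(768, 2))  # [(0, 0, 384), (1, 384, 768)]: the familiar 2-way split
    print(split_scanlines(768, 4))  # four 192-row bands, one per card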