
AMD Releases Final "R600" Specs

Posted: 2007-02-17 02:55am
by Ace Pace
Holy.fucking.shit
Six weeks from now, the world will get the first retail Radeon X2900 XTX

Late yesterday DailyTech was briefed on the final details for the upcoming R600 retail specifications, just in time for everyone to go on vacation for Chinese New Year.

AMD's guidance claims R600 will feature 700 million transistors. By comparison, the Radeon X1900 series R580 GPU incorporated 384 million transistors into its design; the half-generation before that, R520, only featured 320 million.

As disclosed by DailyTech earlier this year, the GPU features a full 512-bit memory interface with support for GDDR3 and GDDR4. R580 was similar in this regard, as it also supported GDDR3 and GDDR4.

On March 30, 2007, AMD will initially debut the R600 as the ATI Radeon X2900 XTX in two separate configurations: one for OEMs and another for retail. The OEM version is the full-length 12" card that will appear in high-end systems.

ATI guidance claims the X2900 XTX retail card comes as a two-slot, 9.5" design with a vapor chamber cooler. Vapor chambers are already found on high-end CPU coolers, so it would be no surprise to see such cooling on a high-end GPU as well. The OEM version of the card is a 12" layout and features a quiet fan cooler.

1GB of GDDR4 memory is the reference configuration for Radeon X2900 XTX. Memory on the reference X2900 XTX cards was supplied by Samsung.

Approximately one month later, the company will launch the GDDR3 version of the card. This card, dubbed the Radeon X2900 XT, features 512MB of GDDR3 and lower clock frequencies than the X2900 XTX. The X2900 XT is also one of the first Radeons to feature heatpipes on the reference design.

AMD anticipates the target driver for X2900 XT to be Catalyst 8.36. A WHQL release of the X2900 XTX driver will appear around the Ides of March.

Radeon X2900 will feature native CrossFire support via an internal bridge interface -- there is no longer a need for the external cable found on the Radeon X1000 series CrossFire. There is no Master card, as was the case with other high-end CrossFire setups. Any Radeon X2900 can act as the Master card.

A much anticipated feature, native HDMI, will appear on all three versions of Radeon X2900.

One 6-pin and one 8-pin (2x4) VGA power connector are featured on Radeon X2900, but both connectors are also backwards compatible with 6-pin power supply cables.

AMD claims the R600 target schedule will be a hard launch -- availability is expected to be immediate. Board partners will be able to demonstrate R600 at CeBIT 2007 (March 15 - 21), but the only available cards will be reference designs.

Why was there such a discrepancy with the board layouts and designs up until now? An ATI insider, who wished to remain nameless, states: "The original Quad-Stealth design is what we built the R600 on: GDDR4, full-length and dual-slot cooling. As the silicon was further revised, [ATI] took up several alternative designs, which eventually brought GDDR3 and heatpipes into the specification. The release cards demonstrate the versatility of R600 in each of these unique setups."

Final clock frequencies will likely remain estimates until later this month.

Posted: 2007-02-17 09:12am
by InnocentBystander
So uhm.. this will be more powerful than the 8800gtx, yes?

Posted: 2007-02-17 09:17am
by Ace Pace
InnocentBystander wrote:So uhm.. this will be more powerful than the 8800gtx, yes?
Possibly, and possibly mixed with higher power consumption, considering the coolers on this thing.

Posted: 2007-02-17 09:39am
by InnocentBystander
So, how powerful of a PSU do you think this thing demands?

Posted: 2007-02-17 11:00am
by Arthur_Tuxedo
Hard to say whether it will be more powerful or not. The 8800 GTX and GTS were pretty damn impressive, a much larger leap over the previous gen than usual.

Posted: 2007-02-17 01:38pm
by Count Dooku
Arthur_Tuxedo wrote:Hard to say whether it will be more powerful or not. The 8800 GTX and GTS were pretty damn impressive, a much larger leap over the previous gen than usual.
That's for darn sure. I wasn't convinced they were going to be worth a darn until I actually had a chance to see a GTX in action. Oblivion at 1900x1200, outdoors, without a hitch. I might have to pick one up this Christmas.

Posted: 2007-02-17 02:20pm
by Arrow
The rumor mill already has Nvidia getting ready to release the 8900 series to counter the R600. Supposedly, the 8900s will be built on an 80nm process (as opposed to the 8800's 90nm) for reduced heat and power, while running a couple hundred megahertz faster. Another rumor has the 8900 getting 25% more shaders. And it looks like there will be an 8950GX2 for those who want SLI on a single card, or quad SLI. The supposed launch date is on or just after the R600 launch.

On the flipside, most of the information looks like it's from Chinese sites, so take it with massive quantities of salt.

Also, rumor has it that the R600 is only 10% faster than the 8800 GTX, while needing a lot more power.

It's going to be a memorable launch for AMD. I just don't know if it's going to be the good kind or the bad kind.

Posted: 2007-02-17 02:25pm
by Ace Pace
At least it will be a hard launch, something ATi never really managed.

Posted: 2007-02-17 03:50pm
by InnocentBystander
Have any of these cards been benched on dx10 games yet?

Posted: 2007-02-17 04:00pm
by Ace Pace
InnocentBystander wrote:Have any of these cards been benched on dx10 games yet?
What DX10 games? With what drivers? :lol:

Posted: 2007-02-17 04:07pm
by InnocentBystander
Well, I've seen in-game footage of Crysis, so somewhere, somehow, there must be DirectX 10-capable drivers... right? Right?

Posted: 2007-02-17 04:09pm
by Ace Pace
InnocentBystander wrote:Well, I've seen in-game footage of Crysis, so somewhere, somehow, there must be DirectX 10-capable drivers... right? Right?
From what I understand, they either use an MS-developed software renderer or use tricks that fake a DX10 card with a DX9 card.

And DX10 drivers exist, it's just that... well, there has been a class action suit filed against nVidia for reasons Arrow can talk about.

Posted: 2007-02-17 05:45pm
by Arrow
Ace Pace wrote:
InnocentBystander wrote:Well, I've seen in-game footage of Crysis, so somewhere, somehow, there must be DirectX 10-capable drivers... right? Right?
From what I understand, they either use an MS-developed software renderer or use tricks that fake a DX10 card with a DX9 card.

And DX10 drivers exist, it's just that... well, there has been a class action suit filed against nVidia for reasons Arrow can talk about.
DX10 drivers do exist. The Nvidia 100.64 driver will run a DX10 tech demo (which I sent Ace a screenshot of from my system). IIRC, DX10 Crysis was running at the 8800 launch event, but that was an early driver released to developers only.

We won't have any real DX10 benchmarks until both AMD and Nvidia get all their shit together, and the Company of Heroes and Supreme Commander DX10 patches come out (I find it funny as hell that the first games to use DX10 are going to be RTSes).

Posted: 2007-02-17 10:14pm
by Medic
And once AMD and nVidia trade more blows, my 7950 GPU will be even more technically obsolete (though it's still obviously useful). My head's spinning. Wasn't it just August '05, a year and a half ago, that the then-much-heralded 7800 came out? :shock:

Posted: 2007-02-18 03:46pm
by InnocentBystander
I was hoping my 7800gt would last two years, but it's looking like it's hardly going to get past a year and a half. More and more, it's looking like expensive graphics cards are just a bad investment.

Posted: 2007-02-18 04:35pm
by Arthur_Tuxedo
When you buy any shiny piece of computer tech, you have to be prepared for the eventuality that it will be rendered painfully obsolete when it seems like you just bought the damn thing. But really, I've upgraded many times before, and a new video card may be three times as fast as what you've got, yet the real difference in how it looks on the screen just isn't that much unless you've got something that's completely outdated or the game is a real hog.

Posted: 2007-02-18 04:48pm
by Uraniun235
InnocentBystander wrote:I was hoping my 7800gt would last two years, but it's looking like it's hardly going to get past a year and a half. More and more, it's looking like expensive graphics cards are just a bad investment.
*shrug* I'm pretty happy with my 7800GT's performance in Supreme Commander. I can't crank up all the settings, but that's to be expected. Are you sure it's not your CPU that's limiting you?

If you want to always be able to run the latest games at the highest settings, then yeah, you're going to need to upgrade more often.

Posted: 2007-02-18 05:14pm
by InnocentBystander
A new CPU isn't going to get me DX10 support, unfortunately.

However, I suspect SC will run a lot better (it runs okay, just not with lots of players) once I can make use of my X2 3800+'s second core. Though I would like to run it on two monitors...

Posted: 2007-02-18 09:59pm
by Arthur_Tuxedo
DX10 support, pah! There aren't even working DX10 drivers, much less games with DX10, much less games that are DX10 exclusive. DX10 is just like any other graphics innovation before it. By the time shit really supports it, the first products to have it are completely obsolete anyway.

Posted: 2007-02-18 11:04pm
by InnocentBystander
I thought Crysis and Alan Wake were DX10 exclusives?

Posted: 2007-02-18 11:13pm
by Fingolfin_Noldor
InnocentBystander wrote:I thought Crysis and Alan Wake were DX10 exclusives?
And supposedly out by the end of the year too. Just in time for the prices for the latest generation of cards to stabilize a little.

Posted: 2007-02-18 11:20pm
by InnocentBystander
The end of the year? I thought it was sooner than that? Maybe my 7800gt will last two years...

Posted: 2007-02-18 11:27pm
by Fingolfin_Noldor
InnocentBystander wrote:The end of the year? I thought it was sooner than that? Maybe my 7800gt will last two years...
Er... Alan Wake is a bit of an unknown after some digging. Crysis is due by Q3 2007.

Posted: 2007-02-18 11:48pm
by Arrow
Crysis and Alan Wake will both have DX9 support (and most of the screenshots and videos are DX9). But Crysis, Alan Wake, Hellgate and a whole bunch of other titles will be pushing DX10. The first game to use DX10 will most likely be Company of Heroes, unless GPG beats Relic and gets a Supreme Commander DX10 patch out first.

Alan Wake is going to be out some time in 2008.