
R600 keeps being delayed

Posted: 2007-01-04 12:48pm
by Ace Pace
The Inq
THE FIRST AMD/ATI graphics chip just cannot enter the world without much screaming and wriggling.

In fact, it's looking more like Nvidia is going to get to market first with an 80 nanometre version of the G80 graphics chip, rather than ATI's 720 million transistor monster.

The last revision of the prototype chip - upon which a certain "pre-review" is based - also suffers from problems serious enough to warrant another re-spin, sources tell me. This re-spin puts the launch on hold for another couple of weeks, and now R600 is looking like an early March launch, probably the week before SnowBIT in Hangover. However, AMD/ATI is making severe changes to the whole line-up, and we can say that this launch, when it happens, will be very, very shocking for the 3D industry.

I am not talking about pseudo-penismarks here.

We learned about the performance and feature characteristics and they're looking pretty impressive in more than one way, although you can expect that Dr. Spinola's engine is getting into defensive mode, saying people "don't understand what the R600 is".

However, engineer teams are working around the clock to make this launch as good as possible.

One small note on the tech demos. Prepare for your mandibles to drop, because Adrienne should look like a high-school attempt at 3D graphics compared to ATI's thing. µ

Posted: 2007-01-04 12:57pm
by InnocentBystander
Uhm, who or what is Adrienne?

Posted: 2007-01-04 01:16pm
by Ace Pace
InnocentBystander wrote:Uhm, who or what is Adrienne?
The nVidia playboy model.

Posted: 2007-01-04 02:04pm
by Zac Naloen
When are we going to start seeing the ATI tech demos then? That doesn't actually say..

Posted: 2007-01-04 02:28pm
by InnocentBystander
Ace Pace wrote:
InnocentBystander wrote:Uhm, who or what is Adrienne?
The nVidia playboy model.
So... the chip inside the new 8800s?

Posted: 2007-01-04 02:33pm
by Zac Naloen
InnocentBystander wrote:
Ace Pace wrote:
InnocentBystander wrote:Uhm, who or what is Adrienne?
The nVidia playboy model.
So... the chip inside the new 8800s?
I think it's the demonstration they use to show off what the Chip can do.

Posted: 2007-01-04 03:11pm
by Netko
They basically took a real-life model and redid her in 3D. It's kinda creepy how close they got to photorealism - forget Dr. Aki from Final Fantasy, this was better. Still not quite there, but very, very close - at least very close to a TV picture of a woman (when they zoom in you see that the detail still looks very artificial).

Link to a relatively high res interview with the model also showing the rendering

I'm hoping that AMD/ATI's response will at least match up to that, since otherwise we're going to have a monopoly on performance cards, which isn't good. It's worrying to me because, supposedly, nVidia was working on the G80 chip for several years, much longer than any of its other chips, and these rumors spilling out about ATI having to tweak stuff sound very much like what happened with ATI's 9500+ series of graphics cards and nVidia's 5k series, except with the roles reversed.

Posted: 2007-01-04 07:00pm
by Arthur_Tuxedo
This is eerily reminiscent of the lead-up to the GeForce FX series, which simply couldn't hold a candle to ATI's 9-series. nVidia spent 4 years developing the 8-series. If it did indeed catch ATI flat-footed, there's not a whole hell of a lot they can do about it in the short term. They might be delaying the thing to try and squeeze some last-minute performance out of it.

Posted: 2007-01-04 07:36pm
by InnocentBystander
I really hope it does well; the 8 series will never come down in price if ATI doesn't come out with some competition.

Posted: 2007-01-04 09:22pm
by Fingolfin_Noldor
Given the delays involved with the x1800, we shouldn't be surprised to see it happen again.

Posted: 2007-01-04 11:32pm
by Alan Bolte
I must admit, she sits decidedly in the uncanny valley. I'll be more interested when they can get past it.

Posted: 2007-01-05 09:37am
by InnocentBystander
I'm not that impressed with it, to be honest. It's good, but the skin looks very flat and fake. Now the smoke demo, that was pretty impressive.

Posted: 2007-01-05 10:41am
by salm
Indeed, the skin shader does look very flat. Same for the bathing suit.

Posted: 2007-01-05 11:08am
by Netko
Also, while that level of quality is now possible, it's unlikely to be used in actual productions, in my estimation, for at least a generation or two, until games appear with such an expensive process budgeted from the planning stage. Just listen to the interview about how the model was done - you basically need a live model to make the 3D one, which is very costly to do in any significant amount. And unless you use that level of quality, or close to it, for every NPC, your graphical quality is uneven, which is very noticeable.

For instance, HL2 actually scaled back the quality of its main protagonists, since the developers couldn't match it on the NPCs and the main characters stood out too much when the maximum quality was applied.

What is stunning, however, is comparing what the first real 3D cards (the 3dfx Voodoo) were capable of with today's cards. In a decade we went from relatively basic, blocky 3D with blurry textures to something relatively close to photorealism (I agree we aren't there yet, but it's getting pretty good). I wonder where real-time computer graphics will be in a decade...

Posted: 2007-01-05 12:09pm
by InnocentBystander
Graphics continue to improve, but it's game AI and object interaction (you know... realish physics) that now have to start catching up. And you can never have enough processing power when it comes to AI.

Posted: 2007-01-05 12:49pm
by MKSheppard
mmar wrote:Just listen to the interview how the model was done - you basically need a live model to make the 3d one - very costly to do in any significant amount.
Uhm why? Not everyone is a supermodel. Just get game development staff, some random people off the street, and use them as bases for your NPCs.

Posted: 2007-01-05 01:45pm
by Arrow
MKSheppard wrote:
mmar wrote:Just listen to the interview how the model was done - you basically need a live model to make the 3d one - very costly to do in any significant amount.
Uhm why? Not everyone is a supermodel. Just get game development staff, some random people off the street, and use them as bases for your NPCs.
That's basically what Bioware is doing for Mass Effect. However, it still takes a shit load of time and effort for the modelers to create NPCs out of them.

Posted: 2007-01-05 02:10pm
by Ace Pace
MKSheppard wrote:
mmar wrote:Just listen to the interview how the model was done - you basically need a live model to make the 3d one - very costly to do in any significant amount.
Uhm why? Not everyone is a supermodel. Just get game development staff, some random people off the street, and use them as bases for your NPCs.
Takes a LONG time to do a good texture job. A long time.

Posted: 2007-01-05 03:08pm
by Beowulf
MKSheppard wrote:
mmar wrote:Just listen to the interview how the model was done - you basically need a live model to make the 3d one - very costly to do in any significant amount.
Uhm why? Not everyone is a supermodel. Just get game development staff, some random people off the street, and use them as bases for your NPCs.
It's not the cost of the live model, it's the cost of the modelers' and texturers' time.

Posted: 2007-01-05 06:14pm
by Netko
Yeah, sorry, that's what I meant. You need to recreate the live model from the raw data, and even if you get much better geometry data and have high-quality shots of the person to base textures on, it's still a very labor-intensive job, far in excess of what is needed to make more cartoonish characters (even if the design phase is probably more involved for the latter).

This is essentially why development costs are skyrocketing. It isn't so much that the tech is that much harder to work with - in a lot of cases it's the opposite - it's the fact that art assets need to be much more detailed, and there also need to be many more of them (total clone armies are no longer tolerated, for the most part), which drives up the cost of development immensely.