More PS3 published

GEC: Discuss gaming, computers and electronics and venture into the bizarre world of STGODs.

Moderator: Thanas

The Kernel
Emperor's Hand
Posts: 7438
Joined: 2003-09-17 02:31am
Location: Kweh?!

Post by The Kernel »

HyperionX wrote: Why exactly? $400, when adjusted for inflation, is less than $300 was 10 years ago. It's also way less than a comparably powerful PC at launch too, so I see no reason why the enthusiasts won't pay that much.
And gas costs less today when you adjust for inflation compared to the late 70's, but so what? It's what people are willing to pay, years of market research has shown that $299 is the sweet spot for a new console.

It's just like gasoline: consumers will have no problem paying $1.99 for a gallon of gas, but once it breaks the $2 barrier, people start getting very pissy.
US launch is slated for sometime in Spring 2006 AFAIK. Even a year late is still only like 5-7 million units.
Microsoft could easily ship twice that many in a year. Just look at how many consoles the PS2 shipped during a year of worldwide sales.
The CPU can still run T&L on top of the GPU. Nothing to stop the devs from doing so.
Except that it's exponentially slower to do so. They would also have to send the graphics data back and forth across the system bus, cutting into available bandwidth. Don't you realize that this was exactly the problem with the PS2?
Also it is needed to run things like physics, audio, video, etc. Anyways, stop with trying to shift the burden of proof to me: exactly what will you do with integer power in a console? Outside of AI and game logic there's basically nothing usable.
You still haven't answered the question about what you are going to do with all of the FP units. Audio doesn't need that kind of power, 90-95% of the graphics work is done on the GPU, and about the only thing you can use it for is physics, which I've already conceded is about the only possible use for Cell.
You could turn on 4xAA for the PS3 and still come out ahead performance-wise.
1) You don't know that considering you lack full performance specs on RSX AND the Xenon GPU.

2) That doesn't mean that developers will. Like I already said, they didn't do it on the Xbox despite having only a minimal performance hit at 2x. With the Xbox 360, the devs don't have a choice in the matter.
Self-serving doesn't mean they're wrong. We do know the specs, and spec-wise the RSX has more pixel shading power than the Xenos has total shading power (24 pipelines w/ 2 ALUs each vs. 48 ALUs total for the Xenos).
I think you should check those numbers; according to you, they have an equal number of ALUs.

Besides, there are many aspects that lead to overall graphics performance. You cannot simplify them by comparing ALU's, bandwidth or any other single factor, ESPECIALLY when dealing with a brand new design like the unified shaders in the Xenon GPU.

Not that this really matters though, I never said the Xenon GPU was more powerful than the RSX. What I said was, it's going to be too close to make much of a difference. The NV2x and ATI's Flipper were much further apart in performance, yet the Gamecube managed to go toe-to-toe with the Xbox graphics wise in a number of cases.

The biggest thing Sony has going against them is the price of their box, and that isn't going to change as long as they insist on this proprietary design bullshit as well as including a Blu-Ray player. There will be plenty of people willing to pay $400 for a PS3, but not as many as are willing to pay $300, and marketing research on the industry has shown that at $300, you can sell out an entire 18-month run of a popular console before lowering the price. At $400, Sony is taking a big risk; I won't buy one at that price at launch and neither will a lot of other people.
Master of Ossus
Darkest Knight
Posts: 18213
Joined: 2002-07-11 01:35am
Location: California

Post by Master of Ossus »

The problem with conceding a large "head-start" to Microsoft is that the two companies are fighting for network externalities. A customer will be much more likely to buy a system that lets him play against his friend than to buy another system and ask his friend to buy that one, too. Since both consoles (and the Revolution, for that matter) seem to be looking at creating networks of users, even a slight head start may be insurmountable.
"Sometimes I think you WANT us to fail." "Shut up, just shut up!" -Two Guys from Kabul

Latinum Star Recipient; Hacker's Cross Award Winner

"one soler flar can vapririze the planit or malt the nickl in lass than millasacit" -Bagara1000

"Happiness is just a Flaming Moe away."
HyperionX
Village Idiot
Posts: 390
Joined: 2004-09-29 10:27pm
Location: InDoORS

Post by HyperionX »

The Kernel wrote:And gas costs less today when you adjust for inflation compared to the late 70's, but so what? It's what people are willing to pay, years of market research has shown that $299 is the sweet spot for a new console.

It's just like gasoline: consumers will have no problem paying $1.99 for a gallon of gas, but once it breaks the $2 barrier, people start getting very pissy.
Well, gas usage hasn't dropped like a rock, or much at all for that matter. I'd like to see this market research, because I seriously doubt how big of an effect it'll have. For one thing, Sony certainly doesn't think it's a big deal.
Microsoft could easily ship twice that many in a year. Just look at how many consoles the PS2 shipped during a year of worldwide sales.
Even the PS2 sold only 10 million its first year.
Except that it's exponentially slower to do so. They would also have to send the graphics data back and forth across the system bus, cutting into available bandwidth. Don't you realize that this was exactly the problem with the PS2?
Do you really know what you're talking about here? Transform and lighting are nothing more than matrix and vector related operations, something that a SPE is well designed for. Not as efficient as a vertex shader perhaps, but way faster (3.2Ghz vs. ~550Mhz). Overall the PS3's CPU could do a fuckload of vertex shading. You also are woefully unaware of the PS3's design: the CPU-to-GPU bus is 20GB/s read and 15GB/s write (from the GPU's point of view). That's fucking huge, and there's very little chance of starvation.
You still haven't answered the question about what you are going to do with all of the FP units. Audio doesn't need that kind of power, 90-95% of the graphics work is done on the GPU, and about the only thing you can use it for is physics, which I've already conceded is about the only possible use for Cell.
Physics is pretty damn demanding actually if you want to do it well. There's also a number of non-obvious uses like a particle system or procedurally generating data. You can also use the SPEs to assist the GPU with vertex shading and maybe pixel shading. I also heard you can do SSAA via the CPU. Also, you're still dodging: there's still no point to having a lot of integer power, so FP is basically it.
1) You don't know that considering you lack full performance specs on RSX AND the Xenon GPU.
Basically an argument from ignorance. We know enough that it's very unlikely that the X360's GPU will outpower the PS3's GPU in most cases.
2) That doesn't mean that developers will. Like I already said, they didn't do it on the Xbox despite having only a minimal performance hit at 2x. With the Xbox 360, the devs don't have a choice in the matter.
True. But the point stands, which is that the eDRAM of the Xenos isn't much of a boost relative to the PS3's GPU.
I think you should check those numbers; according to you, they have an equal number of ALUs.
They have a higher clockspeed. Plus they are dedicated ALUs with a lot of "tweak" type of functionality which means individually they are going to be more powerful. I probably should've elaborated that.
Besides, there are many aspects that lead to overall graphics performance. You cannot simplify them by comparing ALU's, bandwidth or any other single factor, ESPECIALLY when dealing with a brand new design like the unified shaders in the Xenon GPU.
True, but we can estimate. Also brand new designs are not miracle workers, so if the gap is large enough I don't see a new design changing anything.
Not that this really matters though, I never said the Xenon GPU was more powerful than the RSX. What I said was, it's going to be too close to make much of a difference. The NV2x and ATI's Flipper were much further apart in performance, yet the Gamecube managed to go toe-to-toe with the Xbox graphics wise in a number of cases.
Well, the ultimate result is highly subjective. But I am saying that there's simply no way that the X360 is equivalent to the PS3 in power. IMO that means the PS3 will overall look better. And the NV2A and the Flipper were not that far apart in performance (both were 4-pipeline GPUs, but the NV2A's were more powerful and faster), but in most cases the Xbox pulled away.
The biggest thing Sony has going against them is the price of their box, and that isn't going to change as long as they insist on this proprietary design bullshit as well as including a Blu-Ray player. There will be plenty of people willing to pay $400 for a PS3, but not as many as are willing to pay $300, and marketing research on the industry has shown that at $300, you can sell out an entire 18-month run of a popular console before lowering the price. At $400, Sony is taking a big risk; I won't buy one at that price at launch and neither will a lot of other people.
Well, neither will the vast majority of console buyers. Only something like 15 million or so bought the PS2 at $300. The rest got it for $200 or less. Also, since most of the parts for the PS3 are made in-house, the PS3 clearly has a better cost structure as the generation progresses. The higher cost issue is only short term.
"Hey, genius, evolution isn't science. That's why its called a theory." -A Fundie named HeroofPellinor
"If it was a proven fact, there wouldn't be any controversy. That's why its called a 'Theory'"-CaptainChewbacca
SPOOFE
Sith Devotee
Posts: 3174
Joined: 2002-07-03 07:34pm
Location: Woodland Hills, CA

Post by SPOOFE »

So HyperionX will ignore what "analysts" say and demand "market research", but bases his own arguments on what "most people [he] met" say?

I think that says it all right there.
The Great and Malignant
Xon
Sith Acolyte
Posts: 6206
Joined: 2002-07-16 06:12am
Location: Western Australia

Post by Xon »

HyperionX wrote:Overall the PS3's CPU could do a fuckload of vertex shading.
At the expense of everything else. There is a reason they added a conventional graphics chip rather than do everything on the CPU.
You also are woefully unaware of the PS3's design: the CPU-to-GPU bus is 20GB/s read and 15GB/s write (from the GPU's point of view). That's fucking huge, and there's very little chance of starvation.
Under that model, the CPU needs to do lots of memory reads/writes with respect to main memory, and then push the results to the GPU. Since the GPU<->CPU link shares its bandwidth with the CPU<->main memory traffic, the GPU<->CPU bandwidth doesn't matter one bit if the CPU's outbound bandwidth is being consumed by main memory writes.
SPOOFE wrote:So HyperionX will ignore what "analysts" say and demand "market research", but bases his own arguments on what "most people [he] met" say?

I think that says it all right there.
Well normally there is a reason someone gets a "Village Idiot" title, and it is often the result of incredibly boneheaded debating.
"Okay, I'll have the truth with a side order of clarity." ~ Dr. Daniel Jackson.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
HyperionX
Village Idiot
Posts: 390
Joined: 2004-09-29 10:27pm
Location: InDoORS

Post by HyperionX »

SPOOFE wrote:So HyperionX will ignore what "analysts" say and demand "market research", but bases his own arguments on what "most people [he] met" say?

I think that says it all right there.
I've never claimed that the opinions of the people I've met matter at all. I was wondering what "market research" states that anything over $300 is too much, seeing how $400 is actually less than $300 was 10 or so years ago when adjusted for inflation. The Kernel also never actually provided this market research, merely stated it to be so.
"Hey, genius, evolution isn't science. That's why its called a theory." -A Fundie named HeroofPellinor
"If it was a proven fact, there wouldn't be any controversy. That's why its called a 'Theory'"-CaptainChewbacca
HyperionX
Village Idiot
Posts: 390
Joined: 2004-09-29 10:27pm
Location: InDoORS

Post by HyperionX »

ggs wrote:
HyperionX wrote:Overall the PS3's CPU could do a fuckload of vertex shading.
At the expense of everything else. There is a reason they added a conventional graphics chip rather than do everything on the CPU.
Well, that's what the SPEs are for (you have 7 of them). Each one can do an FMADD per cycle at 3.2GHz, whereas each of the RSX vertex shaders can also do one FMADD per cycle but only at 550MHz. In total, the RSX vertex units can do 4.4 billion FMADDs per second, but Cell's SPEs can do 22.4 billion. Although there's a lot more to it than this, FMADDs are the most important aspect of vertex shading and the Cell has them in droves.
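The throughput claim above is just units times clock. A quick sketch to sanity-check the arithmetic (assuming, as the post does, one FMADD per unit per cycle, 7 usable SPEs at 3.2 GHz, and 8 RSX vertex units at 550 MHz; real hardware issues wider SIMD ops, so these are relative figures, not absolute peaks):

```python
# Back-of-the-envelope FMADD throughput from the figures quoted above.
# One FMADD per unit per cycle is an assumption taken from the post.

def fmadds_per_second(units: int, clock_hz: float) -> float:
    """Peak FMADDs/s for `units` units each issuing one FMADD per cycle."""
    return units * clock_hz

cell_spe = fmadds_per_second(7, 3.2e9)    # 7 usable SPEs at 3.2 GHz
rsx_vertex = fmadds_per_second(8, 550e6)  # assumed 8 vertex units at 550 MHz

print(f"Cell SPEs:  {cell_spe / 1e9:.1f} billion FMADDs/s")   # 22.4
print(f"RSX vertex: {rsx_vertex / 1e9:.1f} billion FMADDs/s") # 4.4
```

This reproduces the 22.4 vs. 4.4 billion numbers in the post, a roughly 5x gap in favor of the SPEs on this one metric.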
You also are woefully unaware of the PS3's design: the CPU-to-GPU bus is 20GB/s read and 15GB/s write (from the GPU's point of view). That's fucking huge, and there's very little chance of starvation.
Under that model, the CPU needs to do lots of memory reads/writes with respect to main memory, and then push the results to the GPU. Since the GPU<->CPU link shares its bandwidth with the CPU<->main memory traffic, the GPU<->CPU bandwidth doesn't matter one bit if the CPU's outbound bandwidth is being consumed by main memory writes.
Cell's main memory bandwidth is 25.6 GB/s, a figure that dwarfs your average PC. This shouldn't be a problem. Also the CPU can read from VRAM too if that's not enough.
SPOOFE wrote:So HyperionX will ignore what "analysts" say and demand "market research", but bases his own arguments on what "most people [he] met" say?

I think that says it all right there.
Well normally there is a reason someone gets a "Village Idiot" title, and it is often the result of incredibly boneheaded debating.
I don't think either one of you is paying enough attention in this debate.
"Hey, genius, evolution isn't science. That's why its called a theory." -A Fundie named HeroofPellinor
"If it was a proven fact, there wouldn't be any controversy. That's why its called a 'Theory'"-CaptainChewbacca
The Kernel
Emperor's Hand
Posts: 7438
Joined: 2003-09-17 02:31am
Location: Kweh?!

Post by The Kernel »

HyperionX wrote: Well, gas usage hasn't dropped like a rock, or much at all for that matter. I'd like to see this market research, because I seriously doubt how big of an effect it'll have. For one thing, Sony certainly doesn't think it's a big deal.
Go look up the consoles from the 1990's, particularly the Sega Saturn which retailed at $399 at launch. After the demise of the expensive series of consoles during the 90's, the Playstation was launched at $299. That was found to be the sweet spot for a new console, which is why every new console since then has launched at or below this point.

The point about the Saturn cannot be overstated. Sega was a major player prior to the Saturn (they were #2 in the US and Japan), so they had plenty of brand recognition, but consumers balked at the $399 price.
Even the PS2 sold only 10 million its first year.
You didn't read my previous post, did you? Sony only sold 10 million units the first year because they did not do a simultaneous worldwide launch, and because they did not have sufficient parts available during the first year, mostly due to yield problems.

The Xbox 360 isn't having such problems; TSMC is already getting good yields out of the Xbox 360 hardware, and Microsoft is going to do a worldwide simultaneous launch (or at least, near enough to one).
Do you really know what you're talking about here? Transform and lighting are nothing more than matrix and vector related operations, something that a SPE is well designed for. Not as efficient as a vertex shader perhaps, but way faster (3.2Ghz vs. ~550Mhz). Overall the PS3's CPU could do a fuckload of vertex shading. You also are woefully unaware of the PS3's design: the CPU-to-GPU bus is 20GB/s read and 15GB/s write (from the GPU's point of view). That's fucking huge, and there's very little chance of starvation.
I'm perfectly aware of the amount of system bandwidth the PS3 has thank you. I'm also aware that doing T&L calculations on the CPU is stupid because it will put unneeded stress on the bus, be done slower than the GPU can do it, and leave the GPU waiting for geometry data. Don't you know why Sony ditched the idea of a dumb renderer in the first place? :roll:
Physics is pretty damn demanding actually if you want to do it well.
Which I've already conceded to. I still don't know how much impact it will have though, physics programming is in its infancy.
There's also a number of non-obvious uses like a particle system or procedurally generating data. You can also use the SPEs in assisting the GPU like vertex shading and maybe pixel shading. I also heard you can do SSAA via the CPU.
All of which the GPU can do better. And if you think that developers are going to use SSAA when they won't even use MSAA, you are kidding yourself.
Also you're still dodging: There's still no point to having a lot of integer power, so FP is basically it.
Don't strawman my position. I never claimed that integer power is super important to a console, I claimed that the Cell design has more FP power than any console needs or can realistically use for anything. Cell was designed with a dumb renderer in mind, now that Sony isn't using one, all that power will either have to be used for other things (dubious) or not be used at all.
Basically an argument from ignorance. We know enough that it's very unlikely that the X360's GPU will outpower the PS3's GPU in most cases.
No we don't, dumbass. We don't know the performance characteristics of a unified shader architecture (since there hasn't been one) AND we don't know what effect the integrated DRAM will have (since no GPU has had one).
True. But the points stand, which is that the eDRAM of the Xenos isn't much of a boost relative to the PS3's GPU.
And you know this how?
They have a higher clockspeed. Plus they are dedicated ALUs with a lot of "tweak" type of functionality which means individually they are going to be more powerful. I probably should've elaborated that.
I wouldn't count on any clockspeed numbers from Sony until they ship, especially since they are already having manufacturing troubles.
True, but we can estimate. Also brand new designs are not miracle workers, so if the gap is large enough I don't see a new design changing anything.
How exactly is the gap large when we don't have final clockspeed numbers OR any idea about how the unified shader architecture will perform in the real world?
Well, the ultimate result is highly subjective. But I am saying that there's simply no way that the X360 is equivalent to the PS3 in power. IMO that means the PS3 will overall look better. And the NV2A and the Flipper were not that far apart in performance (both were 4-pipeline GPUs, but the NV2A's were more powerful and faster), but in most cases the Xbox pulled away.
Are you kidding? Flipper didn't have programmable shaders, ran at a paltry 162 MHz, and didn't have near the memory bandwidth of the Xbox IGP.
Well neither will the vast majority of console buyers will either. Only something like 15 million or so bought the PS2 at $300 dollars. The rest got it for $200 or less. Also, since most of the part for the PS3 are made in-house the PS3 clearly has a better cost structure as the generation progresses. The higher cost issue is only short term.
That depends on R&D, the nVidia licensing structure, the premium of the Blu-Ray drive and the cost of the Nagasaki fab.

R&D for Cell is obviously higher, especially since Sony shouldered a great deal of it, hoping for cross-application uses (remains to be seen and means nothing to the PS3 costs). The premium for Blu-Ray is obviously significant (this has been stated by numerous electronics manufacturers) and the costs of the Nagasaki fab are considerable, especially since Sony is going to be using it purely for PS3/PSP manufacturing. Not to mention that the most expensive part (the GPU) is going to be manufactured at Fishkill, not Nagasaki.
Last edited by The Kernel on 2005-08-06 12:03am, edited 1 time in total.
The Kernel
Emperor's Hand
Posts: 7438
Joined: 2003-09-17 02:31am
Location: Kweh?!

Post by The Kernel »

HyperionX wrote:
SPOOFE wrote:So HyperionX will ignore what "analysts" say and demand "market research", but bases his own arguments on what "most people [he] met" say?

I think that says it all right there.
I've never claimed that the opinions of the people I've met matter at all. I was wondering what "market research" states that anything over $300 is too much, seeing how $400 is actually less than $300 was 10 or so years ago when adjusted for inflation. The Kernel also never actually provided this market research, merely stated it to be so.
FYI, the game market is immune to inflation from the consumer perspective. Why do you think games are only now passing the $50 barrier (which they've sold at since the 80's)?
The Kernel
Emperor's Hand
Posts: 7438
Joined: 2003-09-17 02:31am
Location: Kweh?!

Post by The Kernel »

HyperionX wrote: Well, that's what the SPEs are for (you have 7 of them). Each one can do an FMADD per cycle at 3.2GHz, whereas each of the RSX vertex shaders can also do one FMADD per cycle but only at 550MHz. In total, the RSX vertex units can do 4.4 billion FMADDs per second, but Cell's SPEs can do 22.4 billion. Although there's a lot more to it than this, FMADDs are the most important aspect of vertex shading and the Cell has them in droves.
Let's assume for one second that you are right, and that the Cell SPEs can perform geometry calculations at acceptable levels of performance without negatively impacting system performance. My question is: so what? You really think that having more geometry processing power is going to seriously affect graphics performance? A properly balanced GPU has enough geometry processing to feed its rendering pipelines; geometry hasn't been a bottleneck for years.

So given that, how do you expect the RSX to keep up with all of the extra geometry processing? Hint: it can't.
Medic
Sith Devotee
Posts: 2632
Joined: 2004-12-31 01:51pm
Location: Deep South

Post by Medic »

What I'm seeing right now between HyperionX and The Kernel is that the X360 is going to have near parity with the PS3. Slightly less or more powerful, with advantages going to both systems.

At the end of the day, the release dates and reasonable price estimates are still set in stone until and unless Sony or Microsoft announce something drastic.

And what about this? A console with a 10-year life span? I understand the obsession over cost easily enough with the PS2, but that was against the Dreamcast, and Microsoft at the time wasn't an alternative. "By all accounts," the article says, the PS3 is more powerful than the Xbox 360 too. :roll: They don't seem to be afraid to call it another way. As to the raw power of the Cell? Well, it certainly won't make developing games easier. But Sony obviously doesn't care about that -- look at the PS2.

Knowing the later release date and an exorbitant cost around $450, Sony's banking big time on brand loyalty, obsession, and the desire to own more than just a gaming system.
HyperionX
Village Idiot
Posts: 390
Joined: 2004-09-29 10:27pm
Location: InDoORS

Post by HyperionX »

The Kernel wrote:
HyperionX wrote: Well, gas usage hasn't dropped like a rock, or much at all for that matter. I'd like to see this market research, because I seriously doubt how big of an effect it'll have. For one thing, Sony certainly doesn't think it's a big deal.
Go look up the consoles from the 1990's, particularly the Sega Saturn which retailed at $399 at launch. After the demise of the expensive series of consoles during the 90's, the Playstation was launched at $299. That was found to be the sweet spot for a new console, which is why every new console since then has launched at or below this point.

The point about the Saturn cannot be overstated. Sega was a major player prior to the Saturn (they were #2 in the US and Japan), so they had plenty of brand recognition, but consumers balked at the $399 price.
The PS2 launched at around 40,000 yen in Japan I think, which is close to $400. Anyways, $399 was a lot more money in 1994-95 than it is today. The Saturn is a bad example; it failed for reasons beyond just price: it was also hard to program for, was another in a series of failed Sega add-ons (32X, Sega CD), and lacked games. On top of that, the PS3 has "hype," and an enormous amount of it. That alone let the PS2 steamroll the Dreamcast, and I predict a smaller version of it for the PS3.

Also, I've checked the inflation calculator: $400 is like $321 in 1995, so I was a little off. But it should still be close enough.
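The adjustment is a simple CPI ratio. A sketch using approximate US CPI-U annual averages (the 152.4 and 195.3 figures are assumptions on my part; different calculators use slightly different series, which is why this lands near, not exactly on, the $321 quoted above):

```python
# Deflate a 2005 price into 1995 dollars using approximate CPI-U annual averages.
CPI = {1995: 152.4, 2005: 195.3}  # assumed approximate US CPI-U values

def convert_dollars(amount: float, from_year: int, to_year: int) -> float:
    """Convert `amount` from `from_year` dollars into `to_year` dollars."""
    return amount * CPI[to_year] / CPI[from_year]

print(round(convert_dollars(400, 2005, 1995)))  # roughly 312, in the ballpark of $321
```

Either way, a $400 launch price in 2005 is comparable in real terms to a low-$300s price in 1995, which is the point being argued.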

PS: For those who value market research: http://biz.yahoo.com/bw/050719/195743.html?.v=1
Even the PS2 sold only 10 million its first year.
You didn't read my previous post, did you? Sony only sold 10 million units the first year because they did not do a simultaneous worldwide launch, and because they did not have sufficient parts available during the first year, mostly due to yield problems.

I don't recall you ever saying that in a previous post in this thread. No matter; AFAIK Sony has stated that the PS3 will launch in the US and Japan by Spring 2006, so the gap is not going to be that big.
The Xbox 360 isn't having such problems; TSMC is already getting good yields out of the Xbox 360 hardware, and Microsoft is going to do a worldwide simultaneous launch (or at least, near enough to one).
Interestingly enough, just today someone came out with sales predictions here. Xbox1 also used TSMC to make the GPU, and launched with basically the same number of units (1.5 million or so). Somehow there seems to be some absolute limit to how many you can make at launch.
Do you really know what you're talking about here? Transform and lighting are nothing more than matrix and vector related operations, something that a SPE is well designed for. Not as efficient as a vertex shader perhaps, but way faster (3.2Ghz vs. ~550Mhz). Overall the PS3's CPU could do a fuckload of vertex shading. You also are woefully unaware of the PS3's design: the CPU-to-GPU bus is 20GB/s read and 15GB/s write (from the GPU's point of view). That's fucking huge, and there's very little chance of starvation.
I'm perfectly aware of the amount of system bandwidth the PS3 has thank you. I'm also aware that doing T&L calculations on the CPU is stupid because it will put unneeded stress on the bus, be done slower than the GPU can do it, and leave the GPU waiting for geometry data. Don't you know why Sony ditched the idea of a dumb renderer in the first place? :roll:
It'll be very unlikely that you would even need the full Cell to do vertex shading, but you could do so every time you run into a VS-limited scenario. Think of it as a crazy version of unified shaders. It's not such a dumb idea. And on the last point, I think they dropped the dumb rasterizer (and the Cell-based GPU) due to a lack of pixel shading power, not because of vertex shading.
Physics is pretty damn demanding actually if you want to do it well.
Which I've already conceded to. I still don't know how much impact it will have though, physics programming is in its infancy.
Somewhat true. It remains to be seen just how useful adding lots of extra physics will be. But the possibility is there, and I don't think MS and Sony would have added that much FP power to their CPUs if there wasn't a point to it.
There's also a number of non-obvious uses like a particle system or procedurally generating data. You can also use the SPEs in assisting the GPU like vertex shading and maybe pixel shading. I also heard you can do SSAA via the CPU.
All of which the GPU can do better. And if you think that developers are going to use SSAA when they won't even use MSAA, you are kidding yourself.
Not for a particle system or generating procedural data. SSAA is crazy though, but doable for anyone crazy enough. In fact, some demos Sony's shown were done entirely on the CPU. These are merely the possibilities that exist. Also, I see no reason why there won't be MSAA for the PS3. It's used all the time in PC games and there's no reason why they can't do it on the PS3.
Also you're still dodging: There's still no point to having a lot of integer power, so FP is basically it.
Don't strawman my position. I never claimed that integer power is super important to a console, I claimed that the Cell design has more FP power than any console needs or can realistically use for anything. Cell was designed with a dumb renderer in mind, now that Sony isn't using one, all that power will either have to be used for other things (dubious) or not be used at all.
So it's your position that the CPU isn't very important at all? :D Like I said, I don't think MS or Sony would put so much FP power in their CPUs if there wasn't any use for it.
Basically an argument from ignorance. We know enough that it's very unlikely that the X360's GPU will outpower the PS3's GPU in most cases.
No we don't dumbass. We don't know the performance characteristics of a unified shader architecture (since there hasn't been one) AND we don't know what affect the integrated DRAM will have (since no GPU has had one).
If all the unified shaders do is add efficiency and all the eDRAM does is give free 4xAA, then you can estimate that they won't save you much. Read any graphics card review. At 1024x768 the cost of 4xAA isn't a lot for a 7800GTX.
True. But the point stands, which is that the eDRAM of the Xenos isn't much of a boost relative to the PS3's GPU.
And you know this how?
Read above.
They have a higher clockspeed. Plus they are dedicated ALUs with a lot of "tweak" type of functionality which means individually they are going to be more powerful. I probably should've elaborated that.
I wouldn't count on any clockspeed numbers from Sony until they ship, especially since they are already having manufacturing troubles.
What manufacturing troubles?
True, but we can estimate. Also brand new designs are not miracle workers, so if the gap is large enough I don't see a new design changing anything.
How exactly is the gap large when we don't have final clockspeed numbers OR any idea about how the unified shader architecture will perform in the real world?
I'm assuming that final clockspeeds aren't going to change much, which given the relatively unaggressive clockspeeds is not an unreasonable assumption. As for the unified shaders, they are not going to be magic. The RSX will have more pixel shading power than the Xenos will have total, so the only way the Xenos can overpower the RSX is in a vertex-limited scenario, something that will never be an issue seeing how the Cell has vertex shading power to burn.
Well, the ultimate result is highly subjective. But I am saying that there's simply no way that the X360 is equivalent to the PS3 in power. IMO that means the PS3 will overall look better. And the NV2A and the Flipper were not that far apart in performance (both were 4-pipeline GPUs, but the NV2A's were more powerful and faster), but in most cases the Xbox pulled away.
Are you kidding? Flipper didn't have programmable shaders, ran at a paltry 162 MHz, and didn't have near the memory bandwidth of the Xbox IGP.
Flipper had programmable shaders actually. 162MHz isn't that bad versus the 233MHz of the Xbox. Ironically enough, Flipper actually has eDRAM for a framebuffer just like the Xenos, and the X360 also has less memory bandwidth (22.4GB/s versus 48GB/s for the PS3), just like the Gamecube. In fact I expect X360 vs. PS3 graphics to show the same difference as GC vs. Xbox.
Well, neither will the vast majority of console buyers. Only something like 15 million or so bought the PS2 at $300. The rest got it for $200 or less. Also, since most of the parts for the PS3 are made in-house, the PS3 clearly has a better cost structure as the generation progresses. The higher cost issue is only short term.
That depends on R&D, the nVidia licensing structure, the premium of the Blu-Ray drive and the cost of the Nagasaki fab.

R&D for Cell is obviously higher, especially since Sony shouldered a great deal of it, hoping for cross-application uses (which remain to be seen and mean nothing to PS3 costs). The premium for Blu-Ray is obviously significant (this has been stated by numerous electronics manufacturers) and the costs of the Nagasaki fab are considerable, especially since Sony is going to be using it purely for PS3/PSP manufacturing. Not to mention that the most expensive part (the GPU) is going to be manufactured at Fishkill, not Nagasaki.
There's another fab at Oita actually. Anyways, both Cell and Blu-Ray are in-house technology, which means little if any licensing costs. The RSX licensing deal is $5 per chip last I heard. And it is not going to be made at Fishkill but in-house at Nagasaki or Oita. I don't know where you heard that, but ultimately every major IC will be made in-house. With die shrinking, everything will drop in cost very quickly. The only truly expensive unit will be the Blu-Ray drive, which will also drop to DVD-drive cost levels in the next few years. Since virtually nothing in the X360 is made in-house, its marginal costs are eventually going to be greater due to licensing fees and other costs.

All R&D costs are fixed costs, so if PS3 sells 100 million and R&D is 2 billion, R&D will be only $20 per PS3. If Cell is used for something else that will also reduce per unit costs.
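The fixed-cost arithmetic here is easy to sanity-check. A quick sketch using the hypothetical round numbers from this post ($2 billion R&D, 100 million consoles), not Sony's actual figures:

```python
# Hypothetical figures from the post above, not Sony's actual numbers.
rnd_cost = 2_000_000_000    # total fixed R&D outlay, in dollars
units_sold = 100_000_000    # assumed lifetime console sales

rnd_per_unit = rnd_cost / units_sold
print(f"R&D share per console: ${rnd_per_unit:.2f}")  # $20.00

# If Cell is also sold into other markets, the same fixed cost is
# spread over more units, so the per-console share falls further.
extra_cell_units = 50_000_000    # hypothetical non-PS3 Cell volume
combined = rnd_cost / (units_sold + extra_cell_units)
print(f"with extra Cell volume: ${combined:.2f}")  # $13.33
```

The point being made: R&D is a fixed cost, so the per-unit burden shrinks with every additional unit shipped.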

Just look at PStwo: It's probably the cheapest console on the market right now, and I expect the same to be true for the PSthree as well.
FYI, the game market is immune to inflation from the consumer perspective. Why do you think games are only now passing the $50 barrier (which they've sold at since the 80's)?
Some N64 games did go for $70. Regardless, the consumer cannot be immune from inflation indefinitely. At some point consoles will go for more than $300 and games for more than $50.
Let's assume for one second that you are right, and that the Cell SPE's can perform geometry calculations at acceptable levels of performance without negatively impacting system performance. My question is: so what? You really think that having more geometry processing power is going to seriously affect graphics performance? A properly balanced GPU has enough geometry processing power to feed its rendering pipelines; geometry hasn't been a bottleneck for years.

So given that, how do you expect the RSX to keep up with all of the extra geometry processing? Hint: it can't.
While you will not run into a geometry limited scenario very often it can happen, such as a scenario when you use lots of micropolygons. My point is that you can do that without murdering the CPU.
"Hey, genius, evolution isn't science. That's why its called a theory." -A Fundie named HeroofPellinor
"If it was a proven fact, there wouldn't be any controversy. That's why its called a 'Theory'"-CaptainChewbacca[img=left]http://www.jasoncoleman.net/wp-images/b ... irefox.png[/img][img=left]http://img296.imageshack.us/img296/4226 ... ll42ew.png[/img]
User avatar
SPOOFE
Sith Devotee
Posts: 3174
Joined: 2002-07-03 07:34pm
Location: Woodland Hills, CA
Contact:

Post by SPOOFE »

I've never claimed what other people I've met and their opinions matter at all.
"Another day and another "Sony is doomed" tirade (actually this isn't very common, most people I met agree that Sony will win again)."

Bolding mine. You're welcome.
I was wondering what "market research" states that anything over $300 is too much, seeing how $400 is actually less than $300 from 10 or so years ago when adjusted for inflation.
I guess you don't realize that people live NOW, and are buying products NOW, not ten years ago. $400 is more than $300 NOW, no matter what stupid way you try to spin it. People don't take inflation into account when making their purchases, because it doesn't matter what something cost ten years ago. All that matters is how much it costs NOW.
The Great and Malignant
User avatar
Uraniun235
Emperor's Hand
Posts: 13772
Joined: 2002-09-12 12:47am
Location: OREGON
Contact:

Post by Uraniun235 »

The Kernel wrote:FYI, the game market is immune to inflation from the consumer perspective. Why do you think games are only now passing the $50 barrier (which they've sold at since the 80's)?
This is complete bullshit. There have been quite a few games sold at above $50 for the Super Nintendo and Nintendo 64. These tended to be cartridges with additional chips like the SuperFX, or bigger ROM chips.
SPOOFE wrote:Enthusiasts, obviously, won't blink an eye at spending $3000 on a gaming rig. If you think that person is "average", I weep for your intellect.
I think that person is "retarded". :wink: Unless said "enthusiast" is a lazy son of a bitch who can't be arsed to build his own computer, and exclusively buys overpriced Alienware crap, a powerful and competitive gaming PC can be built from scratch for just about $1000. A lot of gamers, myself included, will incrementally upgrade their PCs, replacing only those parts that become bottlenecks.
User avatar
HyperionX
Village Idiot
Posts: 390
Joined: 2004-09-29 10:27pm
Location: InDoORS

Post by HyperionX »

SPOOFE wrote:
I've never claimed what other people I've met and their opinions matter at all.
"Another day and another "Sony is doomed" tirade (actually this isn't very common, most people I met agree that Sony will win again)."

Bolding mine. You're welcome.
I still never claimed that their opinions matter. I'm merely pointing out that most people I've met had agreed that PS3 will be the ultimate victor again.
I was wondering what "market research" states that anything over $300 is too much, seeing how $400 is actually less than $300 from 10 or so years ago when adjusted for inflation.
I guess you don't realize that people live NOW, and are buying products NOW, not ten years ago. $400 is more than $300 NOW, no matter what stupid way you try to spin it. People don't take inflation into account when making their purchases, because it doesn't matter what something cost ten years ago. All that matters is how much it costs NOW.
The $199 price tag of the Gamecube and the Dreamcast didn't save them against the $299 PS2 either. So clearly it doesn't matter NOW.
"Hey, genius, evolution isn't science. That's why its called a theory." -A Fundie named HeroofPellinor
"If it was a proven fact, there wouldn't be any controversy. That's why its called a 'Theory'"-CaptainChewbacca[img=left]http://www.jasoncoleman.net/wp-images/b ... irefox.png[/img][img=left]http://img296.imageshack.us/img296/4226 ... ll42ew.png[/img]
User avatar
The Kernel
Emperor's Hand
Posts: 7438
Joined: 2003-09-17 02:31am
Location: Kweh?!

Post by The Kernel »

HyperionX wrote: PS2 launched at around 40,000 yen in Japan I think, which is close to $400. Anyways, $399 was a lot more money in 1994-95 than it is today. Saturn is a bad example, it failed for reasons more than just price: It was also hard to program for, was another in a series of failed Sega addons (Sega 32x, CD), and lacked games. On top of that, PS3 has "hype," and an enormous amount of it. That alone steamrolled the Dreamcast with the PS2, and I predict a smaller version of it for PS3.
The Dreamcast failed because Sega lost a lot of mindshare after the Saturn flop, true, but the Saturn itself lost when going head to head with the Playstation because of the price difference, especially at first when the Saturn had the superior games library.

Also I obviously wasn't talking about the Japanese launch, the Xbox 360 is unlikely to sell huge there anyway. The Japanese will never be keen on an American console, they have a certain amount of racism in their culture which makes an American console manufacturer making serious inroads difficult.
Also, I've checked the inflation calculator, and $400 is like $321 in 1995, so I was a little off. But it should still be close enough.
We're obviously not going to agree on this. I believe that $400 is too much for consumers to pay, and there is historical precedent backing this up. I know inflation tends to cancel out the changes, but the minds of consumers move much slower than inflation.
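For reference, the inflation adjustment being argued over can be reproduced with approximate BLS CPI-U annual averages (the exact figure depends on which index and months the calculator used, so treat these values as illustrative):

```python
# Approximate BLS CPI-U annual averages; illustrative values.
CPI_1995 = 152.4
CPI_2005 = 195.3

price_2005 = 400.0
price_in_1995_dollars = price_2005 * CPI_1995 / CPI_2005
print(f"$400 in 2005 is roughly ${price_in_1995_dollars:.0f} in 1995 dollars")
```

This lands around $312, in the same ballpark as the $321 figure quoted above.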
PS: For those who value market research: http://biz.yahoo.com/bw/050719/195743.html?.v=1
I believe that is a fairly accurate assessment, although I think the X360 might sell a little better than expected. The PS3 is still going to outsell the X360 based purely on Japanese sales alone.
I don't recall you ever saying that in a previous post this thread. No matter, AFAIK Sony has stated that PS3 will launch in US and Japan by Spring 2006, so the gap is not going to be that big.
They've said that they will launch by Spring, not US launch by Spring. US and Japanese Playstation launches tend to be about six months apart.
Interestingly enough just today someone came out with sales predictions here. Xbox1 also used TSMC to make the GPU, and launched with basically the same amount of units (1.5 million or so). Somehow there seems to be some absolute limit to how many you can make at launch.
The Xbox 1 had supply problems at launch due to the manufacturing of the sound chip and the fact that they cut the final hardware designs closer to the edge than they did on the X360. Also, Flextronics had some manufacturing setbacks, while we haven't heard about similar problems with this launch.
While it's very unlikely that you would ever need the full Cell to do vertex shading, you could do so every time you run into a VS limited scenario. Think of it as a crazy version of unified shaders. It's not such a dumb idea.
I've never heard of any game that was VS limited, every game I've ever seen is either PS limited, CPU limited, or bandwidth limited. Geometry processing has always been extremely high in modern GPU's since it is so easy to do for relatively low transistor counts.
And for the last point I think they dropped the dumb rasterizer (and the Cell based GPU) due to a lack of pixel shading power, not because of vertex shading.
No, it's because it's a stupid idea. It stresses system bandwidth heavily and no general purpose processor (no matter how specialized) can max the shader power of a purpose built GPU.
Somewhat true. It remains to be seen just how useful adding lots of extra physics will be. But the possibility is there, and I don't think MS and Sony would have added that much FP power to their CPUs if there wasn't a point to it.
True, physics programming will probably be the thing that consumes the FP resources on the CPU's of the next batch of consoles. The problem is, physics is still so new that developers haven't scratched the surface on what is possible.

The real problem from this is that if the X360 has a good market share, there will be little incentive for developers to put serious physics engines into games that will be cross platform. Obviously first party Sony titles like Grand Turismo will potentially get a huge boost, but otherwise it's unlikely that developers will spend the time and money on developing physics that is integral to the game purely for the PS3.
Not for a particle system, nor for generating procedural data. SSAA is crazy though, but doable for anyone crazy enough. In fact some demos Sony's shown were done entirely on the CPU. These are merely the possibilities that exist. Also, I see no reason why there won't be MSAA for the PS3. It's used all the time in PC games and there's no reason why they can't use it on the PS3.
It's possible, just unlikely IMO. The Xbox and Gamecube could have done AA for very cheap performance hits, yet it was never done.
So it's your position that the CPU isn't very important at all?:D Like I said, I don't think MS or Sony would put so much FP power in their CPUs if there wasn't any use for it.
I didn't say there wasn't any use for it, I said that its use is overstated; the GPU is still the biggest factor in game performance assuming sufficient CPU power for setup, AI, etc.
If all the unified shaders do is add efficiency and all the eDRAM does is give free 4xAA, then you can estimate that they won't save you much. Read any graphics card review. At 1024x768 the cost of 4xAA isn't a lot for a 7800GTX.
That's not the only thing that the integrated cache on the X360 GPU does, it also eliminates the need for processing color and z data and removes the bandwidth needed to handle it thanks to the integrated logic. This is extremely important; color and z data take up an enormous amount of usable bandwidth and processing and the inclusion of it for free effectively negates the PS3 bandwidth advantage.
What manufacturing troubles?
Sony has already had to disable one of the SPE's on the Cell to improve yields. That indicates manufacturing problems.
I'm assuming that final clockspeeds aren't going to change much, which given the relatively unaggressive clockspeeds is not an unreasonable assumption. As for the unified shaders, they are not going to be magic. The RSX will have more pixel shading power than the Xenos will have total, so the only way the Xenos can overpower the RSX is in a vertex limited scenario, something that will never be an issue seeing how the Cell has vertex shading power to burn.
Best case scenario, the RSX has a small advantage in pixel shading power due to increased clock rates. The RSX still has to deal with color and z data processing, and does not have free AA. Altogether, I'm going to say that the difference is mostly a wash, I don't see anything that establishes clear superiority one way or the other.
Flipper had programmable shaders actually. 162MHz isn't that bad versus the 233MHz of the Xbox. Ironically enough, Flipper actually has eDRAM for a framebuffer just like the Xenos, and the X360 also has less memory bandwidth (22.4GB/s versus 48GB/s for the PS3), just like the Gamecube. In fact I expect X360 vs. PS3 graphics to show the same difference as GC vs. Xbox.
Wrong, Flipper did not have programmable shaders, it was a fixed function GPU.

And it did NOT have integrated DRAM (not to mention the integrated DRAM logic), it had off chip 1T SRAM. The X360 solution puts the integrated cache ON CHIP, and it has integrated logic for handling AA, color and z data.
There's another fab at Oita actually. Anyways, both Cell and Blu-Ray are in-house technology, which means little if any licensing costs. The RSX licensing deal is $5 per chip last I heard. And it is not going to be made at Fishkill but in-house at Nagasaki or Oita. I don't know where you heard that, but ultimately every major IC will be made in-house. With die shrinking, everything will drop in cost very quickly.
IBM has already stated that they will be manufacturing the RSX, with only some of the production being done (overflow presumably) at Nagasaki. Oita will only be manufacturing Cell, and probably only for off applications (it's a joint Toshiba/Sony fab).
The only truly expensive unit will be the Blu-Ray drive, which will also drop to DVD-drive cost levels in the next few years. Since virtually nothing in the X360 is made in-house, its marginal costs are eventually going to be greater due to licensing fees and other costs.
Possibly, it depends on volume. TSMC manufactures enough that they keep their fabs operating at near 100% capacity while Sony is not assured to be able to manufacture enough to amortize the costs of the Nagasaki fab.
All R&D costs are fixed costs, so if PS3 sells 100 million and R&D is 2 billion, R&D will be only $20 per PS3. If Cell is used for something else that will also reduce per unit costs.
Sony claimed that they would be able to amortize the EE costs too with wide application sales, but this didn't pan out.

Also, Sony already has billions of sunk costs into PS3 R&D on the GS2 which they didn't end up using. The RSX was a last minute solution.
Just look at PStwo: It's probably the cheapest console on the market right now, and I expect the same to be true for the PSthree as well.
You're talking years down the line, the PS3/X360 war will be settled far before then.
Some N64 games did go for $70.
Because the costs of manufacturing the carts were so high on the higher RAM parts. And they usually only did this with the most popular games that people would buy anyway.
Regardless, the consumer cannot be immune from inflation indefinitely. At some point consoles will go for more than $300 and games for more than $50.
Games are already slated to go up to $60 in the next gen of consoles. This can probably be done without a hit on sales, but paying an extra $100+ on a console may mean a hit on sales.
While you will not run into a geometry limited scenario very often it can happen, such as a scenario when you use lots of micropolygons. My point is that you can do that without murdering the CPU.
This type of design could be used on the PC as well for extra geometry power (Intel used a CPU-based vertex shader for their EG solutions), but I challenge you to find a single game where this was actually implemented. Face it, geometry simply isn't a bottleneck except in specially designed tech demos.
User avatar
The Kernel
Emperor's Hand
Posts: 7438
Joined: 2003-09-17 02:31am
Location: Kweh?!

Post by The Kernel »

Uraniun235 wrote: This is complete bullshit. There have been quite a few games sold at above $50 for the Super Nintendo and Nintendo 64. These tended to be cartridges with additional chips like the SuperFX, or bigger ROM chips.
All of those games needed to have higher costs because of, as you said, bigger ROM chips and things like geometry chips. However, this does not disprove the point since most of these games were of the highly popular variety (Mario, Starfox, Chrono Trigger), and we don't have data to suggest how this impacted potential sales.

If it is so easy to raise prices on games, how come $50 has been the established limit since the 80's with few exceptions? Only recently has this started to trend upward.
User avatar
The Kernel
Emperor's Hand
Posts: 7438
Joined: 2003-09-17 02:31am
Location: Kweh?!

Post by The Kernel »

HyperionX wrote: The $199 price tag of the Gamecube and the Dreamcast didn't save them against the $299 PS2 either. So clearly it doesn't matter NOW.
Red herring, I told you before that $299 is the sweet spot, anything lower and you see diminishing returns, anything higher and you start to get sticker shock. What about this is so hard to understand?
User avatar
Vendetta
Emperor's Hand
Posts: 10895
Joined: 2002-07-07 04:57pm
Location: Sheffield, UK

Post by Vendetta »

The Kernel wrote:Red herring, I told you before that $299 is the sweet spot, anything lower and you see diminishing returns, anything higher and you start to get sticker shock. What about this is so hard to understand?
There is a growing generation though that will pay big prices for fancy tech on release.

The analogy I've seen made is the iPod. It's one of the most expensive devices in its market, and it still slaps the competition.

People who paid $400 for an iPod will think little about paying $400 for a PS3.
User avatar
The Kernel
Emperor's Hand
Posts: 7438
Joined: 2003-09-17 02:31am
Location: Kweh?!

Post by The Kernel »

Vendetta wrote:
The Kernel wrote:Red herring, I told you before that $299 is the sweet spot, anything lower and you see diminishing returns, anything higher and you start to get sticker shock. What about this is so hard to understand?
There is a growing generation though that will pay big prices for fancy tech on release.

The analogy I've seen made is the iPod. It's one of the most expensive devices in its market, and it still slaps the competition.

People who paid $400 for an iPod will think little about paying $400 for a PS3.
Absolutely, there are millions of people that are willing to pay $400 for the PS3. There are also millions of people that are willing to pay $300 for a new console, but not $400. In this group, Sony is going to lose business to the X360.

EDIT: Also, remember that the people who are willing to pay top dollar for a console (at least in the US) are in Microsoft's age group, not Sony's. Sony sells the best relative to the Xbox to younger people, and younger people are less likely to have as much disposable income.
User avatar
HyperionX
Village Idiot
Posts: 390
Joined: 2004-09-29 10:27pm
Location: InDoORS

Post by HyperionX »

The Kernel wrote:The Dreamcast failed because Sega lost a lot of mindshare after the Saturn flop, true, but the Saturn itself lost when going head to head with the Playstation because of the price difference, especially at first when the Saturn had the superior games library.

Also I obviously wasn't talking about the Japanese launch, the Xbox 360 is unlikely to sell huge there anyway. The Japanese will never be keen on an American console, they have a certain amount of racism in their culture which makes an American console manufacturer making serious inroads difficult.
Saturn was also crazily difficult to program for, and I don't think it had better games relative to the PSX at any time, or at least not in the US. Anyways, I think we can agree that the PS series has huge mindshare and hype. The failings of previous consoles do not necessarily apply to the PS3.
Also, I've checked the inflation calculator, and $400 is like $321 in 1995, so I was a little off. But it should still be close enough.
We're obviously not going to agree on this. I believe that $400 is too much for consumers to pay, and there is historical precedent backing this up. I know inflation tends to cancel out the changes, but the minds of consumers move much slower than inflation.
PS: For those who value market research: http://biz.yahoo.com/bw/050719/195743.html?.v=1
I believe that is a fairly accurate assessment, although I think the X360 might sell a little better than expected. The PS3 is still going to outsell the X360 based purely on Japanese sales alone.
We'll know eventually, but I think we are now in agreement that the PS3 isn't going to be doomed because of its price.
I don't recall you ever saying that in a previous post this thread. No matter, AFAIK Sony has stated that PS3 will launch in US and Japan by Spring 2006, so the gap is not going to be that big.
They've said that they will launch by Spring, not US launch by Spring. US and Japanese Playstation launches tend to be about six months apart.
The info I was able to find was thinner than I thought, but I did find an interview from Kaz Hirai saying that they may launch outside of Japan first, or even do a world-wide launch, here. So it may be the case that they launch in Japan first and the US later on in the Fall, but we don't know for sure. Still, the Xbox cannot possibly build more than a 5 million unit lead in the US alone no matter the case. If PS3 sales are anything like PS2 sales, that's an easily beatable gap. Japan's still a lost cause.
Interestingly enough just today someone came out with sales predictions here. Xbox1 also used TSMC to make the GPU, and launched with basically the same amount of units (1.5 million or so). Somehow there seems to be some absolute limit to how many you can make at launch.
The Xbox 1 had supply problems at launch due to the manufacturing of the sound chip and the fact that they cut the final hardware designs closer to the edge than they did on the X360. Also, Flextronics had some manufacturing setbacks, while we haven't heard about similar problems with this launch.
Well I never heard of any manufacturing problems for Xbox1. Still, that guy thinks that X360 won't be selling millions within 2 months.
While it's very unlikely that you would ever need the full Cell to do vertex shading, you could do so every time you run into a VS limited scenario. Think of it as a crazy version of unified shaders. It's not such a dumb idea.
I've never heard of any game that was VS limited, every game I've ever seen is either PS limited, CPU limited, or bandwidth limited. Geometry processing has always been extremely high in modern GPU's since it is so easy to do for relatively low transistor counts.
Yes, but it can still happen, and in those situations vertex shading becomes just another one of the CPU's abilities.
And for the last point I think they dropped the dumb rasterizer (and the Cell based GPU) due to a lack of pixel shading power, not because of vertex shading.
No, it's because it's a stupid idea. It stresses system bandwidth heavily and no general purpose processor (no matter how specialized) can max the shader power of a purpose built GPU.
Which is pixel shading power in particular. Anyways, the end result is the same: a true GPU is a good idea.
True, physics programming will probably be the thing that consumes the FP resources on the CPU's of the next batch of consoles. The problem is, physics is still so new that developers haven't scratched the surface on what is possible.

The real problem from this is that if the X360 has a good market share, there will be little incentive for developers to put serious physics engines into games that will be cross platform. Obviously first party Sony titles like Grand Turismo will potentially get a huge boost, but otherwise it's unlikely that developers will spend the time and money on developing physics that is integral to the game purely for the PS3.
Looks like X360's going to hold back the industry :P.
It's possible, just unlikely IMO. The Xbox and Gamecube could have done AA for very cheap performance hits, yet it was never done.
Actually, neither the GC nor the Xbox had the bandwidth to do AA without major performance hits. This will not be the case for the PS3.
I didn't say there wasn't any use for it, I said that its use is overstated; the GPU is still the biggest factor in game performance assuming sufficient CPU power for setup, AI, etc.
For now, and for PC gaming. Once an abundance of FP power is available for things like physics, particle systems, etc., the CPU will become more useful. The GPU will still be very important, but the CPU will make more of a difference.
That's not the only thing that the integrated cache on the X360 GPU does, it also eliminates the need for processing color and z data and removes the bandwidth needed to handle it thanks to the integrated logic. This is extremely important; color and z data take up an enormous amount of usable bandwidth and processing and the inclusion of it for free effectively negates the PS3 bandwidth advantage.
:wtf: You do realize that increased color and z bandwidth were the main costs of 4xAA in the first place? Again, a relatively minor cost at 1024x768.
Sony has already had to disable one of the SPE's on the Cell to improve yields. That indicates manufacturing problems.
Actually this indicates your lack of understanding of chip fabrication: when you make chips, you make them a whole wafer at a time, and each wafer is going to have a number of defects on it created during fabrication. Each defect will ruin whatever functional block it lands on.

For example, if there is a 40% chance of a defect existing on a certain kind of chip, then 40% of those chips will have a non-functional block. If you need every block to work, then 40% of your chips are useless. However, if you can tolerate one defect per chip, then suddenly your defect rate drops from 40% to 16%, a 24-point yield improvement. This is the idea behind disabling one SPE per Cell: it improves yields and decreases costs at only a 12.5% decrease in performance. It is not a sign of manufacturing problems, but a common cost-saving tactic. In fact, GPU makers do this all the time. This is why the X800XT has all 16 pipelines while the X800Pro only has 12, the GF6800GT/Ultra have 16 pipelines while the GF6800 has 12, etc.
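The salvage-yield argument above can be sketched with a quick Monte Carlo. This is an illustration under an assumed model (eight independent SPEs per die, with a per-block defect probability calibrated so roughly 40% of chips have at least one bad block), not actual Cell yield data:

```python
import random

random.seed(42)

BLOCKS = 8  # SPEs per Cell die (illustrative)
# Calibrate the per-block defect probability so that roughly 40% of
# chips carry at least one defective block: 1 - (1 - p)**8 = 0.40.
p_block = 1 - 0.60 ** (1 / BLOCKS)

def yields(chips=200_000):
    perfect = usable = 0
    for _ in range(chips):
        defects = sum(random.random() < p_block for _ in range(BLOCKS))
        perfect += (defects == 0)   # every block works
        usable += (defects <= 1)    # at most one SPE must be disabled
    return perfect / chips, usable / chips

perfect_yield, usable_yield = yields()
print(f"all 8 SPEs good:        {perfect_yield:.1%}")  # ~60%
print(f"tolerating one bad SPE: {usable_yield:.1%}")   # ~92%
```

Under this independence assumption the salvaged yield comes out even better than the 16%-failure figure quoted above; the exact numbers depend on the defect model, but the qualitative point stands either way: tolerating one dead block recovers most of the otherwise-scrapped dies.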
Best case scenario, the RSX has a small advantage in pixel shading power due to increased clock rates. The RSX still has to deal with color and z data processing, and does not have free AA. Altogether, I'm going to say that the difference is mostly a wash, I don't see anything that establishes clear superiority one way or the other.
The extra color and z bandwidth was always a part of AA. However, at modest resolutions these do not incur a serious cost. The RSX's advantage in pixel shading power is a lot more than you think, because the Xenos' unified shaders mean that pixel shading units are shared with vertex shading units. The RSX's pixel shading power is greater than this total. Also, it is fully dedicated pixel shading power, with a bunch of mini-ALUs, a texture processor, and some other minor functional units attached, and is fully designed to handle pixel shading. Versus a unified shader, it is easy to see that this pure pixel shading pipeline is more powerful in absolute terms than its equivalent in the unified shader.

According to this one article (originally in Japanese, read at your own risk), the RSX's full power is around 400 GFLOPs, versus 240 GFLOPs for Xenos. Obviously there's going to be some inaccuracy here, but you get the picture. The RSX is in most cases the more powerful GPU, and the only issue is how much. Unified shaders are not miracle workers.
Wrong, Flipper did not have programmable shaders, it was a fixed function GPU.
Ok I admit, Flipper was not a particularly programmable GPU, but it did have some degree of programmability, as described here. Though I'm not quite certain if that's enough to make it really programmable, so I may be wrong here.
And it did NOT have integrated DRAM (not to mention the integrated DRAM logic), it had off chip 1T SRAM. The X360 solution puts the integrated cache ON CHIP, and it has integrated logic for handling AA, color and z data.
:lol: I think you have it backwards. The eDRAM was on-chip for Flipper, but the eDRAM for Xenos is on a daughter die, not integrated with the GPU logic. [url=http://www.beyond3d.com/articles/xenos/images/c1.jpg]This[/url] is what it looks like. As for Flipper, this will explain things for you.
IBM has already stated that they will be manufacturing the RSX, with only some of the production being done (overflow presumably) at Nagasaki. Oita will only be manufacturing Cell, and probably only for off applications (it's a joint Toshiba/Sony fab).
No, IBM has never stated that they will make the RSX. Press releases by Sony have already stated that the RSX will be made at Nagasaki or Oita.
Possibly, it depends on volume. TSMC manufactures enough that they keep their fabs operating at near 100% capacity while Sony is not assured to be able to manufacture enough to amortize the costs of the Nagasaki fab.
Chips from TSMC also come with a profit margin in the cost. As long as Nagasaki or Oita doesn't run too far short, a dedicated in-house fab will usually have lower average costs. Sony's fabs will also most likely be much more aggressive at die shrinks than TSMC will be.
Sony claimed that they would be able to amortize the EE costs too with wide application sales, but this didn't pan out.

Also, Sony already has billions of sunk costs into PS3 R&D on the GS2 which they didn't end up using. The RSX was a last minute solution.
EE costs were absorbed by 90 million PS2s. And there was never such a thing as a GS2 as you described it. The GPU for the PS3 was originally supposed to be Cell-based. There could not be billions in R&D for it, or much at all for that matter.
You're talking years down the line, the PS3/X360 war will be settled far before then.
Irrelevant, the costs are divided among all PS3s. Sony can lose money with every PS3 before the PSthree but as long as the PSthree makes money it counts in total.
Because the costs of manufacturing the carts were so high on the higher RAM parts. And they usually only did this with the most popular games that people would buy anyway.
Then I guess the PS3's higher costs don't matter in the end after all, since it's going to be so popular :P.
Games are already slated to go up to $60 in the next gen of consoles. This can probably be done without a hit on sales, but paying an extra $100+ on a console may mean a hit on sales.
Possibly, but relative to previous generations and inflation this isn't a whole lot more.
This type of design could be used on the PC as well for extra geometry power (Intel used a CPU-based vertex shader for their EG solutions), but I challenge you to find a single game where this was actually implemented. Face it, geometry simply isn't a bottleneck except in specially designed tech demos.
Micropolygons can be used for rendering particles. While I do not know of any games that do that, you must admit that it can happen in a game environment and not only in some tech demo.
"Hey, genius, evolution isn't science. That's why its called a theory." -A Fundie named HeroofPellinor
"If it was a proven fact, there wouldn't be any controversy. That's why its called a 'Theory'"-CaptainChewbacca
User avatar
The Kernel
Emperor's Hand
Posts: 7438
Joined: 2003-09-17 02:31am
Location: Kweh?!

Post by The Kernel »

HyperionX wrote: Saturn was also crazily difficult to program for, and I don't think it had better games relative to the PSX at any time, or at least not in the US. Anyways, I think we can agree that the PS series has huge mindshare and hype. The failings of previous consoles do not necessarily apply to the PS3.
No, but think of it this way: in the US the Xbox and the Playstation brands are about equal in popularity as of right now. If a person sees a PS3 for $400 and an X360 for $300, which are they going to chose? Especially if this is a parent buying a console for a kid, they are probably going to go with the cheapest option as long as it will make the kid happy.

And before you say "Well then I guess the Revolution will sell the best" remember what I said about a sweet spot and that while most kids are going to settle for an Xbox over a PS3, they won't settle for a Nintendo box because they can't play their choice of games on it.
We'll know eventually, but I think we're now in agreement that the PS3 isn't going to be doomed because of its price.
No, not doomed. Handicapped is the word I'd choose.
The info I was able to find was thinner than I thought, but I did find an interview with Kaz Hirai saying that they may launch outside of Japan first or even do a world-wide launch here. So it may be the case that they launch in Japan first and in the US later on in the Fall, but we don't know for sure. Still, the Xbox cannot possibly build more than a 5 million unit lead in the US alone no matter the case. If PS3 sales are anything like PS2 sales, that's an easily beatable gap. Japan's still a lost cause.
Actually, considering that the Xbox/PS2 gap has been shrinking in the US, I'd say Microsoft has a good chance at pulling off a sales victory in the US and possibly in Europe.

The point is not the units shipped at first, it's the mindshare that the X360 creates.
Well, I never heard of any manufacturing problems for the Xbox 1. Still, that guy thinks the X360 won't be selling millions within 2 months.
They have a possible year head start on the PS3 in the US, possibly more in Europe. That's quite a lead.
Yes, but it can still happen. In those situations it can be another one of the CPU's abilities.
Sure it could happen, but since it doesn't, I don't see why you think this is relevant. That's like saying that the console with the most memory is going to be the higher performer because games could sap all available memory.
Which is pixel shading power in particular. Anyways, the end result's the same: a true GPU is a good idea.
Indeed, it's damn time Sony figured this out.
Looks like X360's going to hold back the industry :P.
Assuming devs are interested in making physics a big part of game design. This is a big assumption given the cost, and the fact that advanced physics has never really played a major role in a game before.
Actually, neither the GC nor the Xbox had the bandwidth to do AA without major performance hits. This will not be the case for the PS3.
The Xbox did. It had 6.4 GB/s of memory bandwidth, which was enough for the GeForce 3 to do it with a "relatively painless" performance hit, at least if you go by your 10-15% assumption (at least at 2x).
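For reference, the 6.4 GB/s figure follows from simple bus arithmetic. A quick sketch, assuming the Xbox's commonly cited 128-bit DDR memory bus at 200 MHz (those bus parameters are my assumption for illustration, not stated in this thread):

```python
# Peak bus bandwidth: width in bytes x clock rate x transfers per clock.
def bus_bandwidth_gb_s(width_bits: int, clock_mhz: float, transfers_per_clock: int) -> float:
    return (width_bits / 8) * (clock_mhz * 1e6) * transfers_per_clock / 1e9

# 128-bit DDR (2 transfers/clock) at 200 MHz -> 6.4 GB/s
print(bus_bandwidth_gb_s(128, 200, 2))
```

This is peak theoretical bandwidth; real-world throughput is lower once refresh, arbitration, and access patterns are accounted for.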
For now, and for PC gaming. Once an abundance of FP power is available for things like physics, particle systems, etc., the CPU will become more useful. The GPU will still be very important, but the CPU will make more of a difference.
I'll believe it when I see it.
:wtf: You do realize that increased color and z bandwidth were the main costs of 4xAA in the first place? Again, a relatively minor cost at 1024x768.
Color and z bandwidth are not just used in AA.
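To put rough numbers on the 4xAA bandwidth question, here is a back-of-the-envelope estimate of framebuffer traffic at 1024x768. The byte sizes, frame rate, and overdraw factor are illustrative assumptions of mine, not figures from this thread:

```python
# Rough framebuffer traffic estimate: each covered sample costs a color
# write plus a z read/write, multiplied by overdraw and frame rate.
def fb_traffic_gb_s(width, height, samples, fps, overdraw,
                    color_bytes=4, z_bytes=4):
    samples_touched = width * height * samples * overdraw
    bytes_per_frame = samples_touched * (color_bytes + z_bytes)
    return bytes_per_frame * fps / 1e9

print(fb_traffic_gb_s(1024, 768, 4, 60, 3))  # with 4xAA: ~4.5 GB/s
print(fb_traffic_gb_s(1024, 768, 1, 60, 3))  # without AA: ~1.1 GB/s
```

Under these assumptions, 4xAA multiplies color and z traffic fourfold, which is a large fraction of a 6.4 GB/s bus but a small fraction of a dedicated 256 GB/s eDRAM link; that is the shape of the argument on both sides here.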
Actually this indicates your lack of understanding of chip fabrication: when you make chips, you make them a whole wafer at a time, and each wafer is going to have a number of defects on it created during fabrication. Each defect will ruin whatever functional block it is located on.

For example, if there is a 40% chance of a defect existing on a certain kind of chip, then 40% of those chips will have a non-functional block. If you need every block to work, then 40% of your chips are useless. However, if you can tolerate one defect per chip, then suddenly your defect rate drops from 40% to 16%, a 24-point yield improvement. This is the idea behind disabling one SPE per Cell. It improves yields and decreases costs at only a 12.5% decrease in performance. It is not a sign of manufacturing problems, but a common cost-saving tactic. In fact, GPU makers do this all the time. This is why the X800XT has all 16 pipelines while the X800 Pro only has 12, the GF6800GT/Ultra have 16 pipelines while the GF6800 has 12, etc.
I'm well aware of this; I said that Cell's yields were the reason for disabling an SPE. But the fact that they originally planned on 8 SPEs indicates yields are not where they should be.
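The redundancy argument above can be made concrete with a small model. This is only a sketch under assumed numbers: defects hitting the 8 SPE blocks independently, calibrated so that 40% of dice have at least one bad block (the binomial model and the parameters are mine, not IBM's):

```python
from math import comb

def usable_fraction(p_block_bad: float, blocks: int, spares: int) -> float:
    """Fraction of dice usable when up to `spares` blocks may be defective
    (binomial model: defects hit blocks independently)."""
    return sum(
        comb(blocks, bad) * p_block_bad**bad * (1 - p_block_bad)**(blocks - bad)
        for bad in range(spares + 1)
    )

# Calibrate so 40% of dice have at least one bad SPE: (1 - p)^8 = 0.6
p = 1 - 0.6 ** (1 / 8)

print(usable_fraction(p, 8, 0))  # no spares: only 60% of dice usable
print(usable_fraction(p, 8, 1))  # tolerating one bad SPE salvages most of the rest
```

Under this model the usable fraction jumps from 60% to roughly 92%; the post's simpler 40%-to-16% figure squares the whole-chip defect probability instead, but the direction of the argument is the same.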
The extra color and z bandwidth was always part of AA. However, at modest resolutions it does not incur a serious cost. The RSX's advantage in pixel shading power is a lot more than you think, because the Xenos' unified shaders mean that pixel shading units are shared with vertex shading units. The RSX's pixel shading power is greater than this combined total. It is also fully dedicated pixel shading hardware, with a bunch of mini-ALUs, a texture processor, and some other minor functional units attached, designed entirely to handle pixel shading. Compared with a unified shader, it is easy to see that this pure pixel shading pipeline is more powerful in absolute terms than its equivalent in the unified design.
But with the Xenos it's easier to get total performance out of the available units: you don't have to worry about bottlenecked pixel or vertex shaders, you can assign them as you see fit. As long as the programmers for the X360 know how to adapt to the architecture of the Xenos, its total performance might well be higher.
Ok I admit, Flipper was not a particularly programmable GPU, but it did have some degree of programmability, as described here. Though I'm not quite certain if that's enough to make it really programmable, so I may be wrong here.
It has a fixed T&L pipeline with no programmable shaders. That's not a programmable GPU.
:lol: I think you have it backwards. The eDRAM was on-chip for the Flipper
Incorrect. Here is the 1T SRAM for Flipper: (EDIT: Correction below in my next post)

Image

Distinctly off chip.
but the eDRAM for Xenos is on a daughter die not integrated with the GPU logic.
I said it is on-chip, not on-die. And it has a much wider/faster interface than Flipper had to its 1T SRAM.
No, IBM has never stated that they will make the RSX. Press releases by Sony have already stated that the RSX will be made at Nagasaki or Oita.
Ahh, okay, conceded.
Chips from TSMC also come with a profit margin built into the cost. As long as Nagasaki or Oita doesn't run too far short of capacity, a dedicated in-house fab will usually have lower average costs. Sony's fabs will also most likely be much more aggressive at die shrinks than TSMC will be.
Assuming they can fill their production capabilities. But it's a gamble.
EE costs were absorbed by 90 million PS2s. And there was never such a thing as a GS2 as you described it. The GPU for the PS3 was originally supposed to be Cell-based. There could not have been billions in R&D for it, or much at all for that matter.
The GPU for the PS3 was nearly finished and it was a dumb renderer design. These costs are part of the PS3 R&D costs.
Irrelevant, the costs are divided among all PS3s. Sony can lose money with every PS3 before the PSthree but as long as the PSthree makes money it counts in total.
You are missing the point. Sony is selling at $400; if that hurts their sales drastically, then it won't matter how cheap they can get them years later.
Then I guess the PS3's higher costs don't matter in the end after all, since it's going to be so popular :P.
We are talking about games, not consoles.
Possibly, but relative to previous generations and inflation this isn't a whole lot more.
We'll see.
Micropolygons can be used for rendering particles. While I do not know of any games that do that, you must admit that it can happen in a game environment and not only in some tech demo.
We'll see.
Last edited by The Kernel on 2005-08-07 05:02am, edited 2 times in total.
User avatar
The Kernel
Emperor's Hand
Posts: 7438
Joined: 2003-09-17 02:31am
Location: Kweh?!

Post by The Kernel »

Wait, I'm going to correct myself here. Flipper does indeed have 3MB of on-die 1T SRAM, but it only has a 96-bit wide interface to it, offering a total of 7.8 GB/s of bandwidth, whereas the Xenos has a 256 GB/s interface to the on-chip eDRAM, along with the logic.
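The 7.8 GB/s figure is consistent with simple bus arithmetic if you assume the 96-bit path runs at Flipper's roughly 162 MHz clock with an effective four transfers per cycle; both of those operating assumptions are mine for illustration, not from the post:

```python
# Peak bus bandwidth: width in bytes x clock rate x transfers per clock.
def bus_bandwidth_gb_s(width_bits: int, clock_mhz: float, transfers_per_clock: int) -> float:
    return (width_bits / 8) * (clock_mhz * 1e6) * transfers_per_clock / 1e9

# 96-bit at ~162 MHz, 4 effective transfers/clock -> about 7.8 GB/s
print(bus_bandwidth_gb_s(96, 162, 4))
```

The same arithmetic makes the scale of the Xenos comparison obvious: a 256 GB/s on-chip eDRAM link is more than thirty times this figure.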
User avatar
Xisiqomelir
Jedi Council Member
Posts: 1757
Joined: 2003-01-16 09:27am
Location: Valuetown
Contact:

Post by Xisiqomelir »

The Kernel wrote:If they lose FF as an exclusive...
....they'll still have Tekken 6. Which will guarantee them my 4 bills.
User avatar
SirNitram
Rest in Peace, Black Mage
Posts: 28367
Joined: 2002-07-03 04:48pm
Location: Somewhere between nowhere and everywhere

Post by SirNitram »

Lots of big risks in the next-gen. Sony's gambling that the number of people they lose because 'Well, I can just get the new X-Box now' will be outnumbered by the people they gain because 'Well, it's been six months/a year/whatever since I bought my new X-Box, and I've got money for a second system'. Sony's also gambling on the 'I want it all and I want it in one box' mentality of consumers. It'll let them positively dominate Japan and quite a few other places, but we'll see how far it gets them.

Of course, Microsoft's up for some potential falls as well. Much like a famous quote about no one ever needing more than 256k of memory, they're gambling that this next generation of games won't need Blu-Ray/HD-DVD grade storage. I've found programmers will always find more room... They're also betting against backwards compatibility, which is odd. The X-Box 360 marketspeak talks about playing some babble which adds up to 'Well, we should be able to emulate our big hits'. This goes up against the PS3, which'll be able to run PS2 and at least a good chunk of PS1 titles, and the Revolution, which will be capable of running games back into the 8-bit era. Halo and Halo 2 were great, but there had better be an awesome starting library to catch up there.

The last big gamble comes from this announcement from Sony that the specs aren't set in stone. Sounds to me like they saw the X-Box 360, saw that, yea, it can compete, and decided to bump the date back a bit so they can improve the performance. If this is so, they're going to pull on Microsoft what they did on the industry this cycle: come in late, but with superior hardware, and gain lots of support a little later. This'll probably work even better for Sony than it did for Microsoft; unlike at the X-Box's release, Sony has brand recognition and loyalty as a gaming console maker, while the X-Box's gains were entirely from shiny hardware.

I do wonder if the game developers that leapt to the X-Box because it was, and I quote one developer here, the 'high-end content machine', will bail on Microsoft if the PS3 comes out late with superior specs and its vastly superior storage.
Manic Progressive: A liberal who violently swings from anger at politicos to despondency over them.

Out Of Context theatre: Ron Paul has repeatedly said he's not a racist. - Destructinator XIII on why Ron Paul isn't racist.

Shadowy Overlord - BMs/Black Mage Monkey - BOTM/Jetfire - Cybertron's Finest/General Miscreant/ASVS/Supermoderator Emeritus

Debator Classification: Trollhunter
Post Reply