HyperionX wrote:
PS2 launched at around 40,000 yen in Japan I think, which is close to $400. Anyway, $399 was a lot more money in 1994-95 than it is today. Saturn is a bad example; it failed for more reasons than just price: it was also hard to program for, was another in a series of failed Sega add-ons (32X, Sega CD), and lacked games. On top of that, PS3 has "hype," and an enormous amount of it. That alone let the PS2 steamroll the Dreamcast, and I predict a smaller version of the same for PS3.
True, the Dreamcast failed because Sega lost a lot of mindshare after the Saturn flop, but the Saturn itself lost its head-to-head with the PlayStation because of the price difference, especially early on, when the Saturn had the superior games library.
Also, I obviously wasn't talking about the Japanese launch; the Xbox 360 is unlikely to sell huge there anyway. The Japanese market will never be keen on an American console; there's a strong home-market bias in Japanese consumer culture that makes it difficult for an American console manufacturer to make serious inroads.
Also, I've checked an inflation calculator: $400 today is about $321 in 1995 dollars, so I was a little off. But it should still be close enough.
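For anyone who wants to redo the math, the conversion is just a ratio of CPI values. A quick sketch (the CPI figures below are approximate annual averages from memory; plug in the official BLS numbers to be exact):

[code]
# Back-of-envelope CPI adjustment. The CPI-U annual averages here are
# approximate, from memory, not official figures.
CPI = {1995: 152.4, 2005: 195.3}

def in_year(dollars: float, from_year: int, to_year: int) -> float:
    """Convert a dollar amount from one year's dollars to another's."""
    return dollars * CPI[to_year] / CPI[from_year]

print(round(in_year(400, 2005, 1995)))  # ~312, same ballpark as the $321 above
[/code]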
We're obviously not going to agree on this. I believe that $400 is too much for consumers to pay, and there is historical precedent backing this up. I know inflation tends to cancel out the changes, but consumers' price expectations move much more slowly than inflation.
I believe that is a fairly accurate assessment, although I think the X360 might sell a little better than expected. The PS3 is still going to outsell the X360 on the strength of Japanese sales alone.
I don't recall you ever saying that in a previous post in this thread. No matter; AFAIK Sony has stated that the PS3 will launch in the US and Japan by Spring 2006, so the gap is not going to be that big.
They've said that they will launch by Spring, not that the US launch will be by Spring. US and Japanese PlayStation launches tend to be about six months apart.
Interestingly enough, just today someone came out with sales predictions here. The Xbox 1 also used TSMC to make its GPU, and it launched with basically the same number of units (1.5 million or so). Somehow there seems to be some absolute limit to how many you can make at launch.
The Xbox 1 had supply problems at launch due to the manufacturing of the sound chip and the fact that they cut the final hardware design closer to the edge than they did on the X360. Flextronics also had manufacturing setbacks, and we haven't heard of similar problems with this launch.
It's very unlikely that you would even need the full Cell to do vertex shading, but you could fall back on it whenever you run into a VS-limited scenario. Think of it as a crazy version of unified shaders. It's not such a dumb idea.
I've never heard of any game that was VS limited; every game I've ever seen is either PS limited, CPU limited, or bandwidth limited. Geometry throughput has always been extremely high in modern GPUs, since it's cheap to provide at relatively low transistor counts.
And for the last point I think they dropped the dumb rasterizer (and the Cell based GPU) due to a lack of pixel shading power, not because of vertex shading.
No, it's because it's a stupid idea. It stresses system bandwidth heavily, and no general-purpose processor (no matter how specialized) can match the shader power of a purpose-built GPU.
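To put the objection concretely: a "software vertex shader" is just a transform loop on the CPU, and the problem is what comes after the loop. A minimal sketch (plain Python with NumPy for readability; how a real Cell SPE implementation would actually look is my assumption, not anything Sony has published):

[code]
import numpy as np

def transform_vertices(positions: np.ndarray, mvp: np.ndarray) -> np.ndarray:
    """Apply a 4x4 model-view-projection matrix to an (N, 3) vertex array.

    This is the core of any software vertex shader: pure streaming FP
    math, which is why a CPU with FP power to burn can take it over.
    The catch is that every transformed vertex must then be shipped to
    the GPU over the system bus, which is exactly the bandwidth problem.
    """
    n = positions.shape[0]
    homo = np.hstack([positions, np.ones((n, 1))])   # (N, 4) homogeneous coords
    clip = homo @ mvp.T                              # one mat-vec per vertex
    return clip[:, :3] / clip[:, 3:4]                # perspective divide

# Usage: push a million vertices through an identity MVP.
verts = np.random.rand(1_000_000, 3).astype(np.float32)
out = transform_vertices(verts, np.eye(4, dtype=np.float32))
[/code]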
Somewhat true. It remains to be seen just how useful adding lots of extra physics will be. But the possibility is there, and I don't think MS and Sony would both have put that much FP power in their CPUs if there weren't a point to it.
True, physics programming will probably be the thing that consumes the FP resources on the CPUs of the next batch of consoles. The problem is, physics is still so new that developers haven't scratched the surface of what's possible.
The real problem is that if the X360 has a good market share, there will be little incentive for developers to put serious physics engines into cross-platform games. Obviously first-party Sony titles like Gran Turismo could get a huge boost, but otherwise it's unlikely that developers will spend the time and money developing physics that's integral to the game purely for the PS3.
Not for a particle system, nor for generating procedural data. SSAA is crazy, though doable for anyone crazy enough; in fact, some demos Sony has shown were done entirely on the CPU. These are merely the possibilities that exist. Also, I see no reason why there won't be MSAA on the PS3; it's used all the time in PC games, and there's no reason it can't be used on the PS3.
It's possible, just unlikely IMO. The Xbox and Gamecube could have done AA for a very small performance hit, yet it was never done.
So it's your position that the CPU isn't very important at all? :D Like I said, I don't think MS or Sony would put so much FP power in their CPUs if there weren't any use for it.
I didn't say there wasn't any use for it, I said that its use is overstated; the GPU is still the biggest factor in game performance assuming sufficient CPU power for setup, AI, etc.
If all the unified shaders do is add efficiency, and all the eDRAM does is give you free 4xAA, then you can estimate that they won't buy you much. Read any graphics card review: at 1024x768 the cost of 4xAA isn't a lot for a 7800GTX.
That's not the only thing the integrated cache on the X360 GPU does. Its integrated logic handles color and z processing on-die and removes the memory bandwidth needed for that traffic. This is extremely important: color and z traffic eats an enormous amount of usable bandwidth and processing, and getting it for free effectively negates the PS3's bandwidth advantage.
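Here's a rough back-of-envelope to show the scale. The workload numbers (resolution, overdraw, bytes touched per sample) are my own plausible guesses, not published figures:

[code]
# Back-of-envelope framebuffer traffic estimate. All workload assumptions
# below are illustrative guesses, not any console maker's published specs.
width, height    = 1280, 720       # 720p target
fps              = 60
overdraw         = 3               # average times each pixel is touched
aa_samples       = 4               # 4x multisampling
bytes_per_sample = 4 + 4 + 4       # color write + z read + z write

pixels  = width * height
traffic = pixels * aa_samples * bytes_per_sample * overdraw * fps
print(f"{traffic / 1e9:.1f} GB/s of color+z traffic")   # ~8 GB/s

# Against a 22.4 GB/s shared bus that's over a third of the total.
# Moving it onto eDRAM with its own logic frees all of it.
[/code]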
What manufacturing troubles?
Sony has already had to disable one of the SPEs on the Cell to improve yields. That indicates manufacturing problems.
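The logic behind disabling an SPE is simple yield math: allowing one dead SPE salvages dies that would otherwise be scrapped. A textbook Poisson-defect sketch (the defect density and die areas below are made-up illustrative numbers, not Sony's):

[code]
import math

# Poisson yield model: P(region is defect-free) = exp(-defect_density * area).
# Defect density and areas are illustrative guesses, not real process data.
D         = 0.5     # defects per cm^2
spe_area  = 0.15    # cm^2 per SPE (8 SPEs on the die)
core_area = 1.0     # cm^2 for the PPE, cache, interconnect, I/O

p_spe  = math.exp(-D * spe_area)    # one SPE comes out clean
p_core = math.exp(-D * core_area)   # the non-redundant logic is clean

yield_8_of_8 = p_core * p_spe**8
# Salvage dies with exactly one bad SPE (8 ways to choose which one):
yield_7_of_8 = p_core * (p_spe**8 + 8 * p_spe**7 * (1 - p_spe))

print(f"all 8 SPEs good: {yield_8_of_8:.1%}")   # ~33%
print(f"7-of-8 allowed:  {yield_7_of_8:.1%}")   # ~54%
[/code]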
I'm assuming that final clock speeds aren't going to change much, which, given the relatively unaggressive clock speeds, is not an unreasonable assumption. As for the unified shaders, they are not going to be magic. The RSX will have more pixel shading power than the Xenos will have in total, so the only way the Xenos can overpower the RSX is in a vertex-limited scenario, something that will never be an issue seeing how the Cell has vertex shading power to burn.
Best-case scenario, the RSX has a small advantage in pixel shading power due to its higher clock rate. But the RSX still has to spend bandwidth on color and z processing, and it does not get free AA. Altogether, I'd say the difference is mostly a wash; I don't see anything that establishes clear superiority one way or the other.
Flipper had programmable shaders, actually. 162 MHz isn't that bad versus the 233 MHz of the Xbox. Ironically enough, Flipper actually has eDRAM for a framebuffer just like the Xenos, and the X360 also has less memory bandwidth (22.4 GB/s versus 48 GB/s for the PS3), just like the Gamecube. In fact, I expect X360 vs. PS3 graphics to show the same kind of difference as GC vs. Xbox.
Wrong: Flipper did not have programmable shaders; it was a fixed-function GPU.
And it did NOT have integrated DRAM (let alone integrated DRAM logic); it had off-chip 1T-SRAM. The X360 solution puts the cache ON CHIP, with integrated logic for handling AA, color, and z data.
There's another fab at Oita, actually. Anyway, both Cell and Blu-ray are in-house technology, which means little if any licensing cost. The RSX licensing deal is $5 per chip, last I heard. And it is not going to be made at Fishkill but in-house at Nagasaki or Oita; I don't know where you heard that, but ultimately every major IC will be made in-house. With die shrinks, costs will drop very quickly.
IBM has already stated that they will be manufacturing the RSX, with only some of the production (overflow, presumably) being done at Nagasaki. Oita will only be manufacturing Cell, and probably only for other applications (it's a joint Toshiba/Sony fab).
The only truly expensive unit will be the Blu-ray drive, which will also drop to DVD-drive cost levels in the next few years. Since virtually nothing in the X360 is made in-house, its marginal costs will eventually be greater due to licensing fees and other costs.
Possibly; it depends on volume. TSMC manufactures enough to keep its fabs operating at near 100% capacity, while Sony is not assured of being able to manufacture enough to amortize the costs of the Nagasaki fab.
All R&D costs are fixed costs, so if the PS3 sells 100 million units and R&D is $2 billion, R&D will be only $20 per PS3. If Cell is used for other things, that will reduce per-unit costs further.
Sony claimed they would be able to amortize the EE's costs the same way, through sales into a wide range of applications, but that didn't pan out.
Also, Sony already has billions in sunk PS3 R&D costs on the GS2, which they didn't end up using; the RSX was a last-minute solution.
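For what it's worth, the amortization arithmetic in the quote is right as far as it goes; the catch is the volume assumption. A quick sketch using the quote's own numbers (the $2 billion figure is from the post above, not a confirmed figure):

[code]
def rnd_per_unit(rnd_total: float, units_sold: float) -> float:
    """Fixed R&D cost amortized over units sold."""
    return rnd_total / units_sold

RND = 2e9   # the $2 billion figure from the post above
for units in (20e6, 50e6, 100e6):
    print(f"{units / 1e6:>5.0f}M units -> ${rnd_per_unit(RND, units):.0f} per console")

# 20M units -> $100, 50M -> $40, 100M -> $20: the math only works out
# to $20 if the console actually hits PS2-class volumes.
[/code]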
Just look at PStwo: It's probably the cheapest console on the market right now, and I expect the same to be true for the PSthree as well.
You're talking years down the line; the PS3/X360 war will be settled long before then.
Some N64 games did go for $70.
Because cart manufacturing costs were so high for the higher-capacity parts. And they usually only did this with the most popular games, the ones people would buy anyway.
Regardless, the consumer cannot be immune from inflation indefinitely. At some point consoles will go for more than $300 and games for more than $50.
Games are already slated to go up to $60 in the next generation of consoles. That can probably be done without hurting sales, but asking an extra $100+ for the console itself may well hurt them.
While you won't run into a geometry-limited scenario very often, it can happen, for example when you use lots of micropolygons. My point is that you can handle that without murdering the CPU.
This type of design could be used on the PC as well for extra geometry power (Intel used a CPU-based vertex shader for its EG solutions), but I challenge you to find a single game where it was actually implemented. Face it: geometry simply isn't a bottleneck except in specially designed tech demos.