The Kernel wrote: Did I say resolution? Who said anything about resolution?
The Kernel's Left Hand isn't telling the Right Hand what it wrote: The kicker here is the higher resolution of the PC games. This was an unfortunate side effect of both the slow adoption of HDTV and the lack of speedy memory chips during the Xbox's launch (for those who don't know, faster memory is key to running at higher resolutions). The next-gen consoles will probably be pushing around 100 GB/s of memory bandwidth, so making all games 1080i native won't be a problem. Five years may seem like a long time for a single hardware platform, but I am constantly amazed by the amount of power developers have pushed from the Xbox given its relatively antiquated architecture. Have you seen Sudeki yet? Best damn graphics I've ever seen.
There, you said it. Now, on to the next part.
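(Quick aside, since we're throwing bandwidth numbers around: the 1080i claim is easy to sanity-check with napkin math. Here's a rough sketch; the overdraw and read/write multipliers are my own assumptions, not anything from the quoted post.)

```python
# Napkin math on the "100 GB/s makes 1080i native trivial" claim.
# The overdraw and read/write factors below are guesses, not quoted figures.

width, height = 1920, 1080      # full 1080-line frame
bytes_per_pixel = 4             # 32-bit color
fields_per_second = 60          # 1080i draws 60 half-height fields per second
overdraw = 3                    # assume each pixel gets touched ~3 times
read_write = 2                  # rough: a z/color read plus a write per touch

pixels_per_field = width * (height // 2)
traffic = pixels_per_field * bytes_per_pixel * fields_per_second * overdraw * read_write

print(f"Framebuffer traffic: ~{traffic / 1e9:.1f} GB/s")  # ~1.5 GB/s
```

Even with generous overdraw, the framebuffer itself only eats a GB or two per second; the rest of that 100 GB/s would go to textures and geometry, which is exactly why bandwidth, not the display resolution, is the real constraint.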
Anyways, as time rolls on, I think the PC will become more and more of a niche market. It won't die anytime soon, and AAA titles will always sell well enough to keep the key developers making games, but the PC is going to lose a lot of its strengths versus consoles. I only wonder if console control systems will continue to evolve as well.
There are several strengths that the PC will never lose to consoles. Consoles are specialized, and they're hurt severely by it. It's hard to justify a console purchase for anything but gaming. You can purchase a gaming PC and use it for a host of different tasks that are completely separate from gaming.
You think that a GeForce FX 5200 or Radeon 9200 will do you any better? These chips have been castrated beyond belief, and even at 800x600 they simply can't maintain a decent framerate in games like Doom III with all the effects turned on (this I can prove, btw).
Spec-tac-ular, I don't care. Graphics cards are only one piece of the puzzle, and if DOOM 3's latest leaked Alpha or Beta doesn't run smoothly on your machine, it doesn't fucking matter, because it's not the final fucking released code! Of course a fucking leaked Alpha isn't going to be as smooth as the final product, and that's just as true with consoles as PCs. If you got your hands on the original alpha of FF7 and tried to run it on your PS1, it would have looked like shit and been buggy as hell.
Meanwhile, on the real side of things, let's look at why you need to occasionally upgrade your hardware for PC games... could it be that PC games are continually improving by leaps and bounds? Why, yes it could! With consoles, you get several years where the only improvements come from optimizing code and making better use of the fixed hardware. In PC gaming, the ever-growing power available, combined with that same continued code optimization (albeit at a somewhat slower rate), creates a massive leap in graphics compared to consoles. The consoles are still struggling just to come close to PC graphics and speed.
So if a game comes out that pushes the bleeding edge of graphics, like Unreal or Morrowind, then yes, your system may not run it very well (or even at all) unless you upgrade. However, you can still play all the other games out there, the vast majority of which don't need insanely high-spec systems to run. If you want, you can upgrade your system, but nobody is forcing you. Until you can't run 50-80% of the new games, upgrading is not required. You can still run most games quite nicely on a GeForce2; I know I could.
Meanwhile, if a console game comes out that lags like a bitch and makes your system chug, you're shit out of luck. You can't hope for a patch that improves the code, and you can't put in more memory or a new graphics card; it's just going to lag, and there's nothing you can do about it. Maybe in three or four years the next-generation console will come out and run the game just fine, but by then there will be new games out, and that one won't be worth playing anymore.
Then there's backward compatibility. Despite the problems people have getting some older games to run on newer operating systems, the fact of the matter is that PCs are the ultimate universal gaming platform. Between dual-booting and emulation, you can play just about any game ever made, PC or console, on just one computer. For most people, obsolete consoles get left behind on the shelf, gathering dust as the new consoles come in. Old PC gear can be put to uses other than gaming once it's past its prime. You can donate old systems to your local public schools or other charities, you can salvage perfectly good parts for new systems, you can use it as a webserver or game server, or you can wipe it and give it to the kids so they don't fuck up your personal PC. And a modern, brand-spanking-new PC is just as capable of playing Morrowind as Daggerfall. It can play Wing Commander: Prophecy and Wing Commander 1. Not only that, but third-party programs exist that let PCs play newer (disc-based) console games without needing a console. Imagine what could be done if Sony, Microsoft, and Nintendo got off their asses and provided full PC support! They wouldn't have to spend as much money selling those pesky standalone systems at a loss (though they still could), and thus they'd make MORE MONEY from what brings in the big bucks: the individual games themselves.
Also, consider this: you said that more blockbuster games come out per year for consoles than for PCs. I'm going to ask you to look at that claim harder. How long are these games? What is the gameplay like? How is the replayability? How well does a game have to sell to count as a blockbuster? Are you considering the quality of the game, or just the units it moved? Deer Hunter and Millionaire were very, very popular PC games, but most gamers considered them to be shit.
How about you tell us which games you feel were console blockbusters in the past year, and I'll come up with the games I feel were PC blockbusters over the same period. Then we can compare the quality of the PC blockbusters to the console blockbusters.